Incrementalism

One of the hardest-won lessons of my career is the power of incrementalism. I am, at my core, an idealist. When I look at a product or code base I tend to see the ways it deviates from my ideals, rather than the ways it is useful or does match my ideals. That is a nice way of saying I tend to come off as negative and cynical when discussing… well, almost anything.

This idealism is a powerful motivator to produce elegant code and good products. I suspect it is one of the reasons I have become a good software developer. It does have its dark side, though. Overcoming some of the weaknesses of my idealism is an ongoing theme in my life. Most importantly, idealism makes developing a sense of “good enough” very difficult. Paying attention to good enough is one of the most important skills any engineer can develop, because otherwise you constantly produce solutions in search of a problem. An almost sure sign of an underdeveloped sense of good enough is the big rewrite. Early in my career I was involved in a couple of big rewrites. Both were successful from a technical perspective, but both were complete failures from a business perspective. They took too long and did not provide any immediate benefits to the customer.

In both cases I was instrumental in convincing the business to do the big rewrite.

In the intervening years I have come to realize that, to use a football analogy, real progress is made 3 yards and a cloud of dust at a time. If you want a successful product or a nice code base, you will get it one small improvement at a time. The bigger the change you are contemplating, the more uncomfortable you should be. If it is going to take more than a few man-weeks of effort, there is almost certainly a better, more incremental way to achieve the goal.

I am not saying you should not rewrite the products you work on. Quite the contrary, actually. If you are not in the middle of completely rewriting your code base, you are Doing It Wrong. But that rewrite should be broken into many small improvements, and it should never end. Those improvements will, over time, constitute a complete rewrite in practical terms. However, the business and the customers will continue to get value out of the improvements as they are made, rather than having to wait many months. The developers benefit too, because the changes get released and battle-tested in smaller, easier-to-debug pieces.

While I think all projects should be engaged in this continuous, incremental rewrite, every rewrite needs a strategic vision. You need to know where you want to be in 1-3 years. Without such a vision you won’t know which incremental improvements to make. Once your team has a shared vision for where the product is headed, you can make surprisingly rapid progress toward those goals without disrupting the day-to-day business. Be prepared for this strategic vision to change over time. As you gain more information about the domain and the customers, it is inevitable that your thinking will evolve. This is a key benefit of this model: you are continually able to make course corrections because you are always getting new information by putting improvements in front of customers and receiving feedback with very little delay.

Bookmarks and URI based versioning

Threads about how to version hypermedia (or REST) APIs are legion. I have certainly made my opinion known in the past. That being said, the most common approach in the wild is to put a version number in the URIs of the resources that make up the API. For example, http://api.example.com/v1/products/42.

That approach has the advantage of being simple and easy to understand. Its main downside is that it makes it difficult for existing clients to switch to a newer version of the API if one becomes available. The difficulty arises because most existing clients will have bookmarked certain resources that they need to accomplish their goals. Such bookmarks complicate the upgrade quite significantly. Clients who want to use an upgraded API must either rewrite those bookmarks based on some out-of-band knowledge, support both the old and new versions of the API, or force their users to start over from scratch.

None of these are good options. The simplest and most attractive approach is the first. However, forcing clients to mangle saved URIs reduces the freedom of the server to evolve. The translation between the two versions of the API has to be obvious and simple, which means you are going to have to preserve key parts of the URI structure in the new scheme. You cannot switch from a numeric surrogate key to a slug to improve your SEO. Likewise, you cannot move from a slug to a numeric surrogate key to prevent name collisions. You never know when the upgrade script will be executed; it could be years from now, so you will also need to maintain those URIs forever. And because some clients have probably bookmarked resources that you do not think of as entry points, you will need to be this careful for every resource in your system.
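
To make the first option concrete, here is the sort of bookmark rewriting a client would have to do. The helper name is mine, and the mapping it encodes is exactly the out-of-band knowledge that ties the server’s hands:

# Hypothetical client-side upgrade of a stored v1 bookmark. This only works
# if the v2 URI is a predictable, purely textual function of the v1 URI.
# The moment the server switches from numeric ids to slugs (or vice versa)
# this mapping breaks and the bookmark is effectively lost.
def upgrade_bookmark(v1_uri):
    return v1_uri.replace("/v1/", "/v2/", 1)

assert upgrade_bookmark("http://api.example.com/v1/products/42") == \
    "http://api.example.com/v2/products/42"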

The second option, forcing clients to support both versions of the API, is even worse than the first. It means that once a particular instance of a client has used the API, it is permanently locked into that version of the API. This is horrible because it means that early users cannot take advantage of new functionality in the API. It also means that deprecated versions of the API must be maintained much longer than would otherwise be necessary.

The third option, forcing users to start over from scratch, is what client writers must do if they want to use functionality that is not available in the obsolete version and there is no clear upgrade path between API versions. This is not much work for the client or server implementers, but it seriously sucks for the users. Any configuration, and maybe even previous work, is lost, and they are forced to recreate it.

A way forward

Given that this style of versioning is the most common, we need a solution, and the link header provides one. We can introduce a link relation that connects the old and new versions of logically equivalent resources. When introducing a breaking API change, the server bumps the API version and changes the URIs in any way it likes; for example, the new URI might be http://example.com/v2/products/super-widget. In the old version of the API, a link header is added to responses to indicate the equivalent resource in the new API, using a link relation such as http://example.com/v2/rels/v2-equivalent.

>>>
GET /v1/products/42 HTTP/1.1
...

<<<
HTTP/1.1 200 OK
link: <http://example.com/v2/products/super-widget>; rel="alternate http://example.com/v2/rels/v2-equivalent"
...
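
Producing that header on the server side takes very little code. Here is a minimal sketch, assuming a Flask application; the route, the id-to-slug table, and the representation are illustrative assumptions, not any particular product’s API:

from flask import Flask, jsonify

app = Flask(__name__)

V2_EQUIV_REL = "http://example.com/v2/rels/v2-equivalent"

# Hypothetical mapping from the v1 numeric keys to the slugs the v2 API uses.
V2_SLUGS = {42: "super-widget"}

@app.route("/v1/products/<int:product_id>")
def v1_product(product_id):
    # Serve the unchanged v1 representation.
    resp = jsonify({"id": product_id, "name": "Super Widget"})
    slug = V2_SLUGS.get(product_id)
    if slug is not None:
        # Advertise the logically equivalent v2 resource; old clients ignore it.
        v2_uri = "http://example.com/v2/products/%s" % slug
        resp.headers["Link"] = '<%s>; rel="alternate %s"' % (v2_uri, V2_EQUIV_REL)
    return resp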

Older clients will happily ignore this addition and continue to work correctly. Newer clients will check every response involving a stored URI for the presence of such a link and treat it as a redirect. That is, they will follow the link and use the most modern variant they support.
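
The client side is similarly small. A sketch using Python and the requests library (the function name and the bookmark handling are my assumptions, and a real client would want error handling):

import requests

V2_EQUIV_REL = "http://example.com/v2/rels/v2-equivalent"

def fetch_preferring_newest(bookmark_uri):
    # Fetch a bookmarked resource; if the response advertises an equivalent
    # in a newer API version, treat that link as a redirect and use it.
    resp = requests.get(bookmark_uri)
    for link in resp.links.values():
        # A single link may carry several space separated rel values, e.g.
        # 'alternate http://example.com/v2/rels/v2-equivalent'.
        if V2_EQUIV_REL in link.get("rel", "").split():
            return link["url"], requests.get(link["url"])
    return bookmark_uri, resp

# The caller stores the returned URI as its new bookmark.
uri, response = fetch_preferring_newest("http://api.example.com/v1/products/42")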

If you are really bad at API design you can stack these links. For example, the v1 variants might have links to both the v2 and v3 variants. Chaining might also work, but it would require clients to at least be aware of the intermediate versions’ upgrade link relations so that they could follow the chain to the version they prefer.
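
Stacked, the old variant’s response might carry one link per newer version; the v3 URIs and rel below are invented for the example:

<<<
HTTP/1.1 200 OK
link: <http://example.com/v2/products/super-widget>; rel="alternate http://example.com/v2/rels/v2-equivalent"
link: <http://example.com/v3/products/super-widget>; rel="alternate http://example.com/v3/rels/v3-equivalent"
...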

You could also add such links to the obsolescent variant’s body. That would be almost equivalent, except that it requires clients to be able to parse older responses just enough to find the link. Using the HTTP link header field nicely removes that requirement by moving the link from the arbitrarily formatted body into the HTTP headers, which every reasonable HTTP client supports.
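
For comparison, an in-body version of the same link might look something like this (the representation format is entirely hypothetical); the client has to understand enough of the old format just to locate it:

{
  "id": 42,
  "name": "Super Widget",
  "links": [
    {
      "rel": "alternate http://example.com/v2/rels/v2-equivalent",
      "href": "http://example.com/v2/products/super-widget"
    }
  ]
}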

Using URIs to version APIs may not be the cleanest way to implement versioning, but the power of hypermedia allows us to work around its most obvious deficiencies. That is good news, given the prevalence of this approach to versioning.

And a lot of that performance, Prasad said, came from removing unnecessary design wankery (our verbiage, not his) — the rounded corners, the omnipresent gradients. By making things simple, clean, modern, flat, and even print magazine-like, the LinkedIn app only got faster and better on the performance side, as well.

Venture Beat
