Now that I've grabbed you with a clickbait title, let me amend: interface changes come to the table with a very large inherent negative value attached. Unless the value offered by the change is extremely high, the cost of breaking users' familiarity and use cases will vastly outweigh the benefits.

I'm talking specifically about user interfaces here, not APIs, though I do think the same principle applies. APIs tend to be well defined with simple semantics, so dealing with changes is easier and the initial cost is lower. Still, with a big enough overhaul, little enough benefit, or poor enough documentation, an API can easily fall foul of this as well. Consider Python 3, for instance, which after seven years still has remarkably poor adoption.
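To make the Python 3 example concrete: even trivial code silently changed meaning across the 2-to-3 boundary. A small illustration (this snippet runs under Python 3; the comments note what Python 2 did with the same expressions):

```python
# Two of the most common Python 2 -> 3 breaking changes.

# 1. Integer division: `/` became true division.
print(7 / 2)   # Python 3 prints 3.5; Python 2 printed 3
print(7 // 2)  # floor division in both: 3

# 2. Text vs. bytes: str is now Unicode text, not a byte string.
s = "café"
print(len(s))                   # Python 3: 4 characters
print(len(s.encode("utf-8")))   # 5 bytes; Python 2's len() counted bytes
```

Neither change is unreasonable in isolation, but together with dozens of others they meant that working code stopped working, which is exactly the workflow-breaking cost described above.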

Every interface change inherently has the following negative properties:

  • It breaks user familiarity. Something in their interface has changed, and now they need to figure out what it means and how to deal with it. Sometimes the impact is obviously minimal—most people will immediately figure out a change in icon style. (Not everyone will, though, so keep in mind that even updating your “save” icon from a 5.25″ to a 3.5″ floppy disk will still confuse some of your users.)
  • Relatedly, any significant change will break user workflows. In typical style, there is of course an xkcd comic about this, but Randall Munroe is uncharacteristically attacking a reasonable point with hyperbole. Who are you to say that the way a user uses your software is invalid? The vast majority of software in the world is extremely fragile and difficult to use, and most users—who are not as stupid as we like to assume—have figured out that once they find a workflow by which they can convince your software to do something useful for them, they should very carefully continue to do exactly that, lest something break. When your update breaks their workflow, they have to do a bunch of work to figure out a new one, and they start to distrust your updates, which you really don't want. Making the spacebar no longer overheat the processor may be a reasonable change with only unreasonable workflows built around it, but rearranging all the controls in an office suite successfully pissed off a great many users, even highly technical ones.

If your interface changes willy-nilly, users will very quickly grow suspicious of updates. Users who refuse to update are obviously a huge problem—consider all the trouble caused by IE6 or Windows XP persisting long past their expiration dates—and excessive change only grows that demographic. Interfaces should remain stable to the maximum extent possible. Changing things because you think they're disorganized (Windows XP's poorly received categorized control panel layout), because you want to try something new (Office's poorly received ribbon interface), or because you have a pathological inability to leave something that works alone (basically everything Google does) is not sufficient justification!

This is not a strictly academic concern, by the way. This post was inspired by conversations with users of Apple's iOS who upgraded from version 6 to version 7, were grossly inconvenienced by the redesign and changes, and then refused to upgrade any further. (There are others who upgraded and lost data due to a flawed upgrade process, but that's a separate, albeit related, issue.) Unstable interfaces are breeding the next generation of millstones: software that is outdated and unpatched but widely used.