(4 December 2011)
Jonathan Zittrain explains that the personal computer is dead
because our computing environments are becoming walled gardens. And he thinks this is bad.
Personally I don't think this is the end of the world that he claims it is.
First, you have to understand my place in the computing hierarchy. I'm not a developer. I'm not an end-user (or at least, not for the purposes of this discussion). I'm in between. I'm one of the vast army of unpaid computer support staff that family and friends reach out to when something goes wrong with their computer.
I've been asked all kinds of things. Like why Facebook claims my mother's browser is out of date when she's playing Scrabble. Like why this computer is slow. Why that program doesn't work.
For decades we've enjoyed a simple way for people to create software and share or sell it to others. People bought general-purpose computers—PCs, including those that say Mac. Those computers came with operating systems that took care of the basics. Anyone could write and run software for an operating system, and up popped an endless assortment of spreadsheets, word processors, instant messengers, Web browsers, e-mail, and games.
That's right, anyone could sit in their metaphorical garage and bang out a computer program that could be given out, sold to, or otherwise inflicted on the unsuspecting masses. This is not necessarily a bad thing in and of itself.
What makes it a bad thing is that the vast majority of those programs inflicted on the masses aren't written very well. They're buggy, they die, they crash, they corrupt their own data as well as other data.(*)
If one is to assume that this state of being is unacceptable, there are really only two logical evolutionary paths from here:
1) Open Source. The software is written and given away, and if it doesn't work to your liking you can always fix it.
This is like the local home-improvement store. You can buy all kinds of pre-made components to put in your house, or you can buy yourself the raw metal and lumber and tools to make your own components.
Most people don't do the latter because A) they don't have time, and B) they realize that they lack the skills required to end up with a passable result.
In the Open Source world, there are pre-made components, and if you want, there are compilers and source code and frameworks to make the software do what you want it to do. But the same reasoning explains why, in the larger community of computer users, almost nobody bothers.
2) The Walled Garden. This is the App Store, or an otherwise curated selection of available options.
With this curation comes implied endorsement. Most people have experience with the wild world of software for their platforms. They've downloaded, they've installed, they've enjoyed the crashes and data corruption and awful interfaces and the lack of functionality. Sturgeon's Law explains this when it says "90% of everything is crap."
Curation will never be the total cure for this. I am sure there are some stunningly bad apps in the App Store. But if curation can change that ratio -- even a bit -- then the majority of end-users are going to go for it.
That software ranged from the sublime to the ridiculous to the dangerous—and there was no referee except the user's good taste and sense, with a little help from nearby nerds or antivirus software. (This worked so long as the antivirus software was not itself malware, a phenomenon that turned out to be distressingly common.)
There are three important concepts in this quote.
First, his backhanded reference to "nearby nerds" is an admission that the current end-user computing ecosystem requires an awful lot of support. Pretty much everyone knows someone they can call when the computer misbehaves. Some of them get called more than others. But there is an awful lot of free support going on to keep computers humming on end users' desks.
The second is the need for anti-malware software. The fact that computers need software to protect their users from their own programs is amazing. It is like requiring life jackets in cars because sometimes the bridges people drive over collapse into rivers.
Now, writing software is hard. Writing operating system software is even harder. And we have been doing it for far less time than we have been building bridges. We are learning, and things are getting better. But the fact that this is an accepted state of affairs still boggles the mind.(**)
The third is the implication that the end user needed to be educated in order to avoid some of these pitfalls.
Let me tell you what end users think about these things: they don't.
Well, sometimes they do. When they have some crash or virus or data corruption, and their "local nerd" is asking leading questions that suggest they shouldn't have been doing what they were doing, end users are not feeling their freedom.
End users want to be just that: users. They don't want to tweak their anti-virus or manage their fragmentation rate or deal with DLLs any more than they want to gap the spark plugs in their cars or calibrate the focus of their microwave's emitters.
They want to put in their bread, press a button, and get their toast. Similarly, they want to put in their desires, and get their information out.
End users want their computers to be like toasters, when really they are more like cars. The problem is that end users need to be educated that their computers require ongoing preventative maintenance and routine care in use. Just as you need oil changes and shouldn't drive at maximum acceleration and braking all the time, end users need to know that their anti-virus requires care, and that just because you can install something doesn't mean you should.
So when Apple comes along and says (or implies) that the apps in the App Store will Just Work, end users sign up for that.
And more importantly: they will pay a premium to do it.
The fact that this premium is "anything more than free" in no way belittles the accomplishment. People who won't pay their "local nerd" to fix their mess actually are willing to spend $2 on an app. It is a huge step up the monetization-of-value ladder, and opens the door to monetizing other improvements in value.
Apple's Line Of Business
Apple isn't in the business of promoting freedom for software engineers. Apple is in the business of separating happy users from their money. And with a little bit of research, Apple has determined that a happy end user is a user who is more likely to keep buying more of the same.
So if Apple can get together a collection of apps to sell, it is in Apple's interest that these apps do more or less what they say they'll do, and not do things which are generally considered bad. This will improve the user experience, and improve Apple's bottom line.
Apple understands something that very few other vendors are willing to act on: the experience gained through third-party additions directly reflects on the primary vendor.
To wit: if you have Windows programs that crash, you are most likely to blame Windows for the crashing, even if the reason is that some free set of mouse pointer icons came with spyware that hooked a DLL in an incompatible way that made other, totally unrelated software die.
Again, curation is not going to be the total cure for this. But if it can remove the most blatant examples of this, the experience for the end user will continue to improve.
Peter Parker's Uncle Ben famously said: With great power comes great responsibility.
Software engineers have had great power over the last 20 years. The fact that users were essentially forced to buy the same general-purpose computing devices that software programmers used to write software opened up a vast market to them. Some of them showed responsibility. Others have not.
The end result is that the squandering of this power (or shirking of this responsibility) by some will lead to reduced markets for the vast majority. If curation extends to general computing platforms (i.e. the Mac App Store), the OS vendors will be able to become gatekeepers to their user communities. And yes, while the end goal of this gatekeeping is monetizing their user communities, the immediate goal is happier end users.
It will also affect the hobbyist software engineer. If general-purpose computing platforms no longer have a critical mass, they will become rarer and inevitably more expensive. This is similar to the hobbyist auto mechanic, who finds that the electronics in today's cars tend to inhibit much of the playing around that they might be inclined to do. There are more hobbyist-friendly cars to play with, but they tend to be older, pre-computer cars that come with their own special problems, such as more polluting engines.
In many ways it is a more mature version of the Internet at large. In the beginning, making content and making that content available to the masses was very hard and expensive. Now it is so cheap that any moron like me can have a whole whack of blogs. The problem is that the ratio of noise to quality is still 9-to-1 or worse. Consumers of information don't have the time, energy, or inclination to sift through web sites like mine to find actual quality. So they turn to Google or other social aggregators like Digg or Slashdot. It doesn't impede my ability (or, evidently, my inclination) to continue writing. It just doesn't do anything to help an audience find me(***).
Similarly, software engineers will always have the ability to write software. The fact that they don't have cheap, ready access to a mass market of potential users/consumers/customers doesn't change that. It does mean that the threshold that one has to pass in order to access that market is increased. But OS vendors really are not required to provide general access to their customer base.
Apple has successfully monetized the dissatisfaction that most end users feel with their platforms. Thanks to the mess that general computing has been over the last 20 years, Apple has brought together a product that promises to be better -- not perfect, just better -- than what end users have been used to.
And frankly, software engineers have nothing but their own exploitation of their "freedom" to blame.
(*) = Well except for your software, I'm sure your code should be etched on copper and displayed prominently in the Louvre. But most of your competition can barely code their way out of a paper bag.
(**) = As is the idea that the manufacture of such software should be a protected industry, and that Microsoft should be prevented from including such software in their own Operating System bundle, but this is a rant for another time.
(***) = Of course, since I'm doing this on my own time, I'm doing it for my own entertainment. I really don't care about building a community of consumers -- I have no delusions about which side of Sturgeon's Law my content falls on.