We must be getting better: The Simplicity Era of TiVo, Apple, Google (and Rails)
Have you ever had a conversation with a fellow developer that went something like the following:
Q: Why is this so complicated?
A: Because I didn't have enough time to make it simple.
To someone outside the field it might seem odd, but to a programmer it's pretty straightforward: simplicity is hard to do the first time around. During development you have to make guesses, well above and beyond what the requirements say, about how your code and end product will be used. This leads to over-complication, because without knowing the exact use cases you can't add the right abstractions (and the wrong abstractions often do worse things to complexity than none at all). Writing a blog post is the same way - editing and reworking tend to chop and simplify rather than add.
As Alan Perlis said: "Simplicity does not precede complexity, but follows it."
In the pre-DVD days, when people actually recorded shows onto these huge VHS tape things (I know, crazy, right?), a common joke was how hard it was to set a VCR to record a show airing at a future date. Entire sitcom episodes were written around messing up the recording of some important show or event.
Technology was something to be afraid of, something that only the geeks and 7-year-olds could figure out. For all intents and purposes it looked like we were heading towards a bureaucracy-led meltdown of society as things got more and more complicated, to the point where nothing worked anymore. Terry Gilliam captured this idea perfectly in his fantastically over-the-top movie Brazil. In the movie, one justification for the film's military state is a series of constant terrorist explosions, but while it's never outright explained, those explosions are arguably just caused by a society overtaken by bad duct-work. I couldn't find the intro clip with the malfunctioning apartment, so here's a younger De Niro as a rogue heating engineer:
It's understandable, however, that things have to get worse before they get better. Every advance forward is preceded by a minor step back as the kinks in a new technology get worked out. The first TV tuner card I bought in the mid-'90s was a complete disaster - my computer didn't have enough power to run it properly, and my dreams of coding while watching the game dissolved into frustration. A cheap $30 portable TV would have gotten the job done ten times better.
Instead I futzed around with getting drivers to work on Windows 95 and tried to use a UI that should have been taken out back and shot. But because idiots like me kept buying these crappy cards and validating the market, people kept developing the technology until someone came up with TiVo.
At that point not only could my parents record TV shows, they could do it without reading a manual. TiVo was a hundred times more powerful than a VCR's timer, yet it simplified the interface to the point where you could use it without even glancing at the instructions.
A couple of years earlier, Google launched its search engine and overwhelmed people by underwhelming them. I still remember the first time I saw the Google home page with none of that "portal" junk that was popular at the time - just search. It was an epiphany. Turns out that's what people wanted. Yes, Google eventually launched iGoogle, but it kept it off by default, and that's the way it stays for most users.
When simplicity gets the job done, people like it and they stick with it. As the saying goes: KISS.
Which brings us to Apple. When the iPod launched, it really only had one thing going for it: simplicity. A simple, elegant interface and a simple, elegant way to get music onto it via iTunes. It lacked compatibility with other software and had less storage than competitors, but since you could use it without having to "learn" anything, it quickly won people over (I'm air-quoting "learn" because yes, you did learn to use it, but it wasn't a struggle). Apple followed with the iPhone, using the same formula of making things that were hard to figure out on other phones easy, and then took it to a whole new level with the iPad. Say what you will about the missing features or the restrictions (and I have), the most common review of the iPad by a member of the tech press goes something like this: "My [Dad/Mom/Husband/Wife/Kid] picked it up and just started using it, and I couldn't get it back from them."
Companies had been trying and failing to make a "simple" computer for years (anyone remember Microsoft Bob?). But once the technology caught up to Steve Jobs's obsessive vision and delivered thin screens, SSDs, multi-touch and fast enough processors, it became physically possible to build a simple, reasonably-general-purpose computer that you could just pick up and use.
To get back to programming, the 90s poster children for complexity are C++ and Java. If C was the two steps forward of a previous era - a simple, well-defined language that forms the backbone of GNU and Linux - C++ was the step backwards as we hurtled ourselves into a new OOP phase. C++ held the promise of the next great thing but was hindered by the complexity of not knowing which parts were needed. I'm not implying C++ was a failure, just that its high level of complexity was a result of treading into unknown waters, and so people coped. To quote Joshua Bloch from Peter Seibel's Coders at Work:
I think C++ was pushed well beyond its complexity threshold and yet there are a lot of people programming it. But what you do is you force people to subset it. So almost every shop that I know of that uses C++ says, “Yes, we’re using C++ but we’re not doing multiple-implementation inheritance and we’re not using operator overloading.” There are just a bunch of features that you’re not going to use because the complexity of the resulting code is too high. And I don't think it's good when you have to start doing that. You lose this programmer portability where everyone can read everyone else's code, which I think is such a good thing.
The next step in the same space was Java, and the primary thing Java did was simplify coding by removing features from C++. Memory leaks? Gone, thanks to garbage collection. Multiple inheritance? No go, but you can have a simplified version called interfaces. Operator overloading? Gone. People figured out which parts of OOP were necessary to get the job done and dropped the rest.
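As a rough sketch of that trade-off (the class and method names here are mine, purely illustrative): a Java class can promise many interfaces, but each method has exactly one implementation, so there's no diamond-inheritance ambiguity to reason about.

```java
// Illustrative sketch: Java keeps multiple *interface* inheritance
// (contracts only) while banning multiple implementation inheritance.
interface Flyer   { String fly(); }
interface Swimmer { String swim(); }

// Duck fulfills both contracts but extends at most one concrete class,
// so every method body has a single, unambiguous home.
class Duck implements Flyer, Swimmer {
    public String fly()  { return "flap"; }
    public String swim() { return "paddle"; }
}
```

You lose some of C++'s expressive power, but everyone reading the code knows exactly where behavior lives.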
Unfortunately Java focused on simplifying the language and forgot about simplifying the development ecosystem (although they did nail deployment with write once, run anywhere). While the Java language is simplified, the steps to set up and build a Java project are anything but - and anyone who argues otherwise should be required to write an Ant build.xml from scratch.
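To make that concrete, here's roughly the minimum build.xml for compiling and jarring a trivial project - a sketch with made-up names and paths, not anything canonical:

```xml
<!-- Sketch of a minimal Ant build file: even "compile and jar one
     project" takes this much ceremony. Names here are illustrative. -->
<project name="hello" default="jar" basedir=".">
  <property name="src"   location="src"/>
  <property name="build" location="build"/>

  <target name="compile">
    <mkdir dir="${build}"/>
    <javac srcdir="${src}" destdir="${build}" includeantruntime="false"/>
  </target>

  <target name="jar" depends="compile">
    <jar destfile="hello.jar" basedir="${build}"/>
  </target>

  <target name="clean">
    <delete dir="${build}"/>
  </target>
</project>
```

And that's before classpaths, tests, or dependencies enter the picture.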
What if we wanted something simpler than Java? What could be simpler than the following "Hello World!" program (written in PHP):
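Here it is, in its entirety:

```php
Hello World!
```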
(As linked from this Reddit post: technically, even without any <?php .. ?> tags, it's still a valid .php file.)
PHP filled the need for simplicity and quickly became the de facto open-source web development language, but as projects grew and things got complicated, the same impulse to mitigate complexity with OOP pushed people towards other solutions. As the saying attributed to Einstein goes, "make everything as simple as possible, but not simpler." Out of that need for more expressiveness, while still keeping the simplicity, came David Heinemeier Hansson's decision to try Ruby and create what would eventually become Rails.
Rails was simple because it was opinionated. It came with a set of beliefs about how you should go about your business. No programming framework had ever expressed this so openly; essentially: "we know we can't make everyone happy and keep it simple, so instead let's just make most people happy." (Caveat: I work on a Rails CMS, so I'm more than a little biased.)
Simplicity and a strong opinion usually go hand in hand. When we know what we're doing, we're usually willing to make the hard, initially unpopular decisions for our users about what they don't need. When things get muddled, we have to go down multiple tracks because we don't know the destination.
So many things have cropped up recently that seem just, well, simpler than what came before them that I'm starting to conclude we must be getting better at this whole computer thing: we're finally comfortable with what we don't need in our software.
| Complicated Version | Simple Version |
| --- | --- |
| Distributed file systems and complex CDNs | Amazon's S3 & CloudFront |
| Oracle, MySQL and other RDBMSs | NoSQL stores like MongoDB, Riak and Cassandra |
| Complicated caching and proxying | Memcached, HAProxy |
| SOAP, CORBA, XML-RPC | REST |
I once assumed that the graph of technological complexity was rising infinitely up and to the right, but now I think there's hope. Computers arrived and took over so quickly, in the grand scheme of things, that it just took us a few decades to figure out how to simplify and still end up where we needed to be. That's not to say everything is getting simpler right away: once a system is in production it's going to be installed and supported for a long time, and unfortunately this generation's simplicity is necessarily built on the back of the last generation's complexity. But I think we're headed in the right direction.
So here's to the Simplicity Era, may we stay simple and happy with our two steps forward until we find the next big step back to take.
[ As an aside, I've intentionally mostly ignored Microsoft in this discussion - from a programming standpoint they exist in a world adjacent to the one I've been paying attention to for the past 10 years, but I think their (now-waning) might made them somewhat oblivious to the forces of simplicity - and no, I don't think Visual Basic is simple, Basic or not. Now with Windows, I told them they needed to make it simpler, and so Windows 7, that was my idea. ]
Let me know your thoughts on any other technologies or trends that are getting the simplification bug, or feel free to burst my tiny, happy bubble and share your irrefutable evidence that we're headed into a Brazil-like complexity meltdown. Do you think we are entering a new Era of Simplicity or is it just a temporary trend, like drop shadows, rounded corners and Lady Gaga?