How Epic Fail Can Lead to Spectacular Success
How does innovation happen? The popular conception, arising from our cultural memory of Thomas Edison and the powerful metaphor of his light bulb, is that innovation is a sudden spark of genius.
But that’s rarely true. The road to innovation is littered with tiny incremental improvements, outright failure, and, as Edison famously said, “99% perspiration.”
Today’s captivating techno gadgets that seem like overnight successes did not spring, like Athena from Zeus, fully realized from the mind of someone like Steve Jobs. In fact, the Information Superhighway is strewn with product road kill, whole companies wrecked, and entire industries that crashed and burned.
Some of these spectacular failures are easily recalled. Windows Vista, anyone?
Others, not so much. Anyone remember the CueCat?
In 2000 I got my CueCat in the mail because I subscribed to the print version of Wired. Digital Convergence Corporation (DCC) purchased Wired’s mailing list, no doubt at significant venture-funded expense, to mail free CueCats to likely adopters. I was instructed to plug the CueCat into my PC’s PS/2 port (no, not the USB port!), install software from the CD that came with it, REGISTER, and then presto! I would be able to scan barcodes on advertisements in magazines like Wired, and “automagically” see my browser redirected to a corresponding URL – FOR ANOTHER FREAKING AD.
My reaction: why the HELL would anyone do this? It was what we now call an epic fail. Needless to say it was widely ridiculed, and DCC – along with tens of millions in venture funding and over 100 patents – evaporated into the void.
Fast-forward a decade and barcode scanning is a huge activity among smartphone users. Didn’t the CueCat prove people can’t be bothered to scan barcodes? Why would something that failed so spectacularly come back so strong?
It wasn’t the activity that failed, it was DCC’s concept. They assumed they could hook people with what turned out to be a false convenience. The huge barriers to adopting the ecosystem far outweighed the burden of typing in a few keystrokes in a browser. And the supposed benefit – advertising – was something most people considered an annoyance. It failed on almost every level.
Contrast this with bar-code scanning apps on smartphones, like the Amazon.com app or Google Shopper. What’s different?
1. Convenience
• The CueCat was hugely inconvenient.
• Smartphone apps are easy to use, and run on a device with much greater built-in capabilities than the CueCat.
2. Ubiquity
• The CueCat could only be used when you happened to be reading a magazine within 3 feet of your PC. (Which if you were like me was… um… Oh yeah: NEVER.)
• Anytime you encounter a barcode, you probably also have your smartphone. Plus, these apps can scan many different types of barcodes and even recognize products by cover art. All in seconds, right where you’re standing.
3. User benefit
• The CueCat’s principal purpose was to serve ads. Very few people actively seek out ads.
• Barcode-scanning apps conspicuously don’t serve ads. Instead, they let you instantly order the item you’re holding at a better price, or learn far more about a product than you can from the packaging, including reviews that tell you whether it’s a good buy.
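Under the hood, both generations of scanner do the same trivial thing: turn a scanned code into a string of digits, then map it to a URL. Here’s a minimal sketch of that idea. The check-digit math is the standard EAN-13 rule used on retail barcodes; the lookup URL, however, is a made-up placeholder, not any real service’s API.

```python
def ean13_is_valid(code: str) -> bool:
    """Validate an EAN-13 barcode string.

    Digits are weighted 1, 3, 1, 3, ... left to right; a code is valid
    when the weighted sum of all 13 digits is a multiple of 10.
    """
    if len(code) != 13 or not code.isdigit():
        return False
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(code))
    return total % 10 == 0


def lookup_url(code: str) -> str:
    """Map a scanned barcode to a product-lookup URL.

    The URL pattern here is a hypothetical placeholder for illustration.
    """
    if not ean13_is_valid(code):
        raise ValueError(f"not a valid EAN-13 code: {code!r}")
    return f"https://example.com/products/{code}"


print(lookup_url("4006381333931"))  # a known-valid EAN-13
```

The CueCat and a smartphone app differ not in this step, which is trivial, but in everything around it: the hardware, the setup cost, and where the resulting URL points.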
So what are some other spectacular tech failures, and how did they lead to the great successes of today?
The GUI and the Xerox PARC
Way before Microsoft Windows and even before the Apple Macintosh, the first computer to come with a Graphical User Interface (GUI) was made by, wait for it… Xerox. Yes, Xerox. In 1973! If you’re a “digital native,” chances are you’re younger than the GUI itself.*
Researchers at the Xerox Palo Alto Research Center (PARC for short) created a workstation called the Alto, which was basically the first “personal” computer. Before that, computers were shared, meaning you had to book time to use them. The Alto had the graphics and mouse interface we’re familiar with today, and it could print to laser printers, also developed at the PARC. Altos at the PARC were even connected to the ARPANET, the fledgling network of military and educational computers that would grow into the Internet. (At the time, much of its traffic originated within the PARC itself.)
So here’s the “duh” moment: why isn’t Xerox the world’s most valuable company today? Well, the Xerox leadership back east thought they had no commercial potential. So Xerox only made a few thousand Altos for internal use at the PARC.
To be fair, few saw this potential until Apple introduced the Apple II in 1977. By the time the Xerox Star (which I confess I’m old enough to have used) came out in 1981, at $20,000 it had no chance against rival PCs from IBM and Apple that didn’t have GUIs but cost a tenth as much. It also didn’t help that in 1979 Xerox had given Steve Jobs and his team from Apple tours of the PARC; they adapted the GUI concept to create the Lisa (itself a failure at nearly $10,000) and ultimately, in 1984, the Macintosh.
Steve Jobs DID see the potential of the GUI, and now Apple, not Xerox, is among the world’s most valuable companies. Today, some form of GUI powers just about every digital experience we have. Without the GUI, no Mac, no Windows, no Web, no Facebook, no Twitter, no YouTube, no smartphones, no tablets. Even flat-screen HDTVs run a form of GUI.
The Apple Newton
In 1987, fresh off the success of the Macintosh and with Jobs in exile at NeXT (another brilliant failure that eventually gave us Mac OS X and, in part, Pixar), Apple started developing what it considered the next generation of personal computer: the Personal Digital Assistant (PDA). When Apple finally brought the Newton MessagePad to market in late 1993, it was widely ridiculed, and with good reason. It wasn’t in color, its battery life was pathetic, and its handwriting recognition… Well, let’s just say that “feature” made lots of money for the creators of Doonesbury and The Simpsons.
Instead of waiting too long like Xerox did, Apple rushed the Newton to market with too much power at too high a price, before many of its features were ready. It wasn’t long before cheaper PDAs like the Palm Pilot, with less power but better battery life and much better handwriting recognition, succeeded where the Newton had failed. Eventually it occurred to companies like Palm to combine a PDA with a cell phone, which they did with the Treo, arguably the first viable smartphone.
But it took Steve Jobs’ return to Apple for the smartphone era to really take off. His innovation this time? Combine a smartphone with a touch-screen. No more clumsy stylus and balky handwriting recognition. Boom – the iPhone.
Windows XP Tablet PC Edition
Speaking of touch-screens, they power the sexiest devices we covet today: smartphones and tablets. The idea of a tablet PC had been around for a while, but beginning in 2001 Microsoft made a huge marketing push to drive Tablet PC adoption. Even bigger than the marketing push was the effort and expense of getting PC makers to build the device itself: a laptop with a screen that could fold back over the keyboard and respond to stylus input.
The market responded with a predictable yawn. In contrast to the Alto and the Newton, Tablet PCs were too incremental. People were happy paying about a thousand dollars for a standard laptop. Why would they pay almost twice that for a slower laptop, with less battery life, with a touchscreen they couldn’t use at the same time as the keyboard, and with a stylus they were likely to lose? Handwriting recognition was better, but still awkward and not as fast as typing. Worse, the Tablet PC Edition operating system just felt… kludgy. Some complained it was the same old Windows XP with some stylus-input hacks glued on.
Why did it fail? Was the public not ready to migrate to touch as an input method? Not exactly. Most of the problem was that Tablet PCs were poorly implemented. The GUI, designed for keyboard- and mouse-driven PCs since the days of the Alto, was force-fitted to accommodate a new kind of input. Worse, stylus-plus-handwriting recognition was simply inefficient as an input method. And the hardware wasn’t available to fully realize the concept.
Beginning with the iPhone, Apple got several things right with touch screens that the Tablet PC makers didn’t. First, the OS was built for touch input from the ground up. Second, Apple completely abandoned the idea of handwriting recognition and stylus input, instead creating an on-screen keyboard you could type directly into, which was much faster. Third, and most significantly, the hardware (high-sensitivity touch screens) caught up. Ten years later, iPads are selling so well Apple can’t make them fast enough.
How can failure lead to success?
So were all these dramatic failures just a waste of time? Not at all. None of these were complete failures; some of them were even great ideas. (Well, maybe not the CueCat.) Most of them had kernels of future success that just needed refinement, fresh eyes, or time for the hardware to catch up. Yes, there were big winners and big losers, but isn’t that the nature of competition?
Often it’s these near misses that make greater later success possible. They pave the way for future generations by establishing what works and what doesn’t. The pain of failure, usually measured in lost millions, is a strong motivator, and forces designers to start over and rethink from the ground up. If the history of innovation didn’t include stark failure, we wouldn’t have dramatic breakthroughs, just slightly better versions of the same old thing.
Let’s go all the way back to the Xerox PARC. Many of the innovations that changed our lives and shaped the current world were created there. In a sense, it was a dramatic success. And it still exists to this day, providing top-class R&D for companies, universities and governments. The only thing that failed the PARC was Xerox, the company whose name it bore.
Then again, perhaps that’s not surprising, considering Xerox’s first successful product routinely burst into flames. http://boingboing.net/2011/04/21/xeroxs-first-success.html
*(In fact, the GUI was first prototyped in the mid-1960s by Doug Engelbart at the Stanford Research Institute (SRI), who showed it to the public during his legendary “Mother of All Demos” in December 1968.)
What great technology flops and subsequent successes can you think of? Share your stories in the comments!