March 20, 2012 · Ryan Block
I love the story of the light bulb. We often credit Edison for its invention, but few know that his famous practical, inexpensive incandescent bulb was essentially an iteration of another light bulb invented a year earlier by a British scientist by the name of Joseph Swan. And even fewer people may realize that by the time Edison “invented” them in the late 1870s, electrically powered light bulbs had been in slow, steady development for decades.
Better still: only a few months after Edison received his patent, he’d already moved on to the next iteration, which increased the bulb’s life a thousand-fold. The story of Edison and his light bulb isn’t just a story of invention; it’s about the invariable trajectory of progress.
Go to your closet and dig up one of those gadgets you’ve got stored away from, say, ten or fifteen years ago. Dust it off, give it a look, and try not to chuckle—go ahead, I dare you to not be even the slightest bit amused by this antique. Remember how amazing that thing was when you first got it? I bet at the time you probably weren’t even thinking about how in just a few years it would be a barely recognizable relic from a bygone era, laughable at best.
But at what point was it, exactly, that we evolved from using those primitive things to equipping ourselves with the latest mind-blowing technologies? Weren’t there at least a dozen or more highly memorable inflection points along the way, where technology leapt ahead and the curtain was drawn back just far enough for us to glimpse the future? Well, as it turns out, in the future (and make no mistake, we’re officially living in it) invention isn’t all that much different from the past: constant, incessant iteration, often so gradual we don’t even notice it.
If you want to find ground zero for industry-wide technological innovation, you might look to CES. The largest industry trade show in the country, CES has long been the ultimate clearing house for the tech industry’s advances, neatly packaged and presented at the beginning of each new year—a beacon for the changes to come. So why is it that just about everyone who follows technology will tell you that, paradoxically, nothing much actually gets announced there?
Not literally, of course; my company, gdgt, tracked nearly a thousand consumer electronics product launches in over forty of the top verticals in early January, and I’ve heard estimates that more than 20,000 new products were launched at CES 2012. But I’d be willing to wager most people would be hard-pressed to name any of them. As it turns out, we’ve learned to expect incredible new advances in technology so constantly that we’re often unable to tell when we’ve actually turned a corner.
I’m reminded of a quote by Marie Curie: “One never notices what has been done; one can only see what remains to be done.” Like any good scientist in her lab, we build, iterate, experiment, ship, and start over on thousands of devices and applications every year; what failed miserably last time might, with some slight tweaks, be a game-changer this time around. Or it might fail again, even harder.
Granted, there’s one industry behemoth whose launches are as much about the vision its products represent (and, ultimately, portend) as they are about the products themselves. But even Apple’s indirect impact is in many ways greater than its market share would indicate: it sets the trends that the rest of the technology industry force-multiplies as it races to compete.
The next time you buy a new device, consider the endless iterations, bit by bit, component by component; the day-in and day-out innovations, from protocols to minute standards—no matter how behind-the-scenes—that serve as its foundation and enable its existence. And then think about the home it will inevitably find in your closet when you trade up to the next even more amazing thing (which you haven’t yet imagined could even exist) at a moment you may not even realize is actually a hidden inflection point.
This article is commissioned by Qualcomm Incorporated. The views expressed are the author’s own.