Classical PCs are on a downward slope, tablets never quite became The Next Big Thing, and smartphones are approaching saturation in developed markets. Is this the end of the go-go days? Where will we find new growth paths, new forms of personal computing?
Having devised a means of communicating through ideograms, alphabets, and numerals, Homo Faber, the tool-making species, invented a succession of accessories that would help us, as individuals, remember and process our symbols: The printing press; the abacus; and now that we’re (possibly) approaching the end of the Holocene, the personal computer.
You could say that the PC revolution began in 1968 when the term “personal computer” was first used to describe the Hewlett-Packard 9100 desktop calculator (“You Say Calculator, I Say Computer”—see the genial HP computer museum article). The first microprocessors gave PCs a strong kick, Moore’s Law fueled growth, and the internet explosion finished the job. By the early 1990s, PCs had become universal office and home fixtures.
But now, as the revolution approaches its 50th year, sales of conventional PCs are in decline: According to IDC, worldwide PC shipments are expected to fall by 8.7% in 2015 and not stabilize until 2017. (IDC’s guarded optimism about 2017 is laudable—or maybe it’s just self-interest: Don’t tell your clients they’ll be fighting for smaller and smaller scraps.)
A few years ago, tablets were lauded as The Next Big Thing that would replace the PC, but they never made good on the promise. In the quarter ending in June 2015, iPad sales were down 23% with revenue of $4.6 billion. (The Mac was +9% with $6 billion—we’ll get back to tablets and hybrid PC/tablets a little later.)
Smartphones—our most personal computers—may have contributed to the demise of the PC by being “good enough” for the modern imperative of personal computer users: going online. According to Pew Research, 64% of American adults own a smartphone and, more strikingly [emphasis mine]: “7% of Americans own a smartphone but have neither traditional broadband service at home, nor easily available alternatives for going online other than their cell phone.”
The US is hardly top-of-the-heap when it comes to smartphone penetration: We rank 13th, behind countries such as Denmark, Finland, and Sweden. The smartphone-as-internet-portal is even more pervasive—and more important—in countries such as Spain, where 80% of users access the internet through mobile devices, to say nothing of regions such as Africa, where inexpensive smartphones are engines of commerce and healthcare.
After years of unprecedented growth, the PC market is approaching the shoulder of the Diffusion of Innovations curve.
Are PC makers doomed to fight for scraps, oxymoronic “incremental innovations”? Or will new avenues for growth emerge, new technologies, new jobs to be done?
We can look for answers at the topmost layer of abstraction by imagining new forms of personal computing. Or we can turn to lower levels—to the sub-plots—the evolution of existing forms of desktops, laptops, tablets, smartphones, and their ecosystem companions.
Let’s start by Googling “the future of personal computing.” We see plenty of “new form” abstractions: Dreams of new ways to interact through facial recognition, hand movements, and voice intonation; increasingly ubiquitous connections between computerized objects that extend to our bodies; more sensors, more intelligence, a virtual (or, at least, “augmented”) reality:
If you look at the world 10 years from now, everyone’s wearing something that looks like sunglasses, but has a super high-res 8K display for each eye that gives you a seamless combination of reality and computer-generated images.
(“Epic Games’ CEO Tim Sweeney: Virtual Reality Is the Future and ‘We Are 100 Percent In’”; found at re/code)
The track record of such predictions is an invitation to modesty. The decades-old IP On Everything prediction needs more time; the promise of handwriting recognition has outlived the need; the term “virtual reality” has been kicking around for twenty years with little practical evidence beyond highly specialized applications; Microsoft’s Kinect, once heralded as the future of gestural interaction, has been unbundled from the Xbox game console because it didn’t meet expectations.
In the meantime, no one saw the smartphone revolution coming, the one that destroyed incumbents and gave rise to new services such as Uber.
We will only see the path after the fact.
The subplot—the evolution of existing personal computers—is easier on the mind.
Let’s start with the iPad. After its meteoric rise, many of us anticipated that the iPad would displace the conventional PC, but we soon fell back to Earth. My thesis, explained in a note titled “The iPad Is A Tease,” is that the iPad couldn’t perform as many jobs as we thought it would. If you want to work with a complex document, for example, you still need a PC.
Now, we have the recently announced iPad Pro, a consumer/creator hybrid that includes a stylus, split screen, and access to the previously hidden file system. After years of deriding the “toaster-fridge” approach to tablets, Tim Cook has called the iPad Pro “the clearest expression of our vision of the future of personal computing.” Let’s take him at his word. This gets us to a scenario where, over time, the iPad morphs into a laptop running iOS, powered by an Apple AX processor.
What about the Mac, then? With its full control of the operating system, software tools (LLVM, Clang, Swift), and App Store, couldn’t Apple create an AX-based version of OS X? In Theory—the land where everything works as planned—yes. In Reality, this would mean carrying forward layers upon layers of software silt that have accumulated inside the noble but ancient OS X.
Whether through Apple’s long-term vision, or the growing realization of an opportunity, iOS has become the OS in Apple’s future. iOS has already shipped on more than one billion devices; where Macintosh unit sales are measured in millions per quarter, iOS devices are measured in multiples of tens of millions. Built to fit the constraints of the first iPhone’s limited processing power, iOS is still much smaller than OS X: 1.3 GB for the latest release, versus 8.41 GB for my MacBook’s System Folder. iOS has a lot of room to grow into a fuller, richer OS, unencumbered by past sins.
If we accept the scenario of an iPad evolution into an iOS-based laptop, or even desktop, what happens to the Mac as we know it today?
Picture (no pun intended) digital cameras. With its ubiquity, connectivity, performance, and photo editing software, the smartphone has swallowed the point-and-shoot market, but it’s not a replacement for the pricey DSLR that’s beloved by the hobbyist and essential for commercial jobs such as sports, product, or food photography.
By analogy, even if an iOS-based laptop comes to serve many needs, there are jobs where a 27” iMac, with its 5K display, 4 GHz Intel processor, 64 GB of RAM, and terabytes of disk storage, is irreplaceable—and will stay so for some time. The two will co-exist just like smartphones and DSLRs.
Undoubtedly, the speculation I’m making about the future of iOS has been entertained elsewhere. In particular, Android engineers must be entertaining similar thoughts regarding the future of their operating system.
Microsoft is in a different game. With no more than 3% market share, the Windows Phone OS doesn’t enjoy the support of a lively ecosystem. Growing it to assume the role of a laptop- or desktop-class OS is technically feasible… it’s a mere matter of software. But unlike Android and iOS, it doesn’t have a broad base of hardware to run on. Microsoft would need to evangelize OEMs on a new software/hardware combo, one that would compete with existing Windows devices. Not likely.
Actually, one can’t help but wonder how long Microsoft will keep pouring money into Windows Phone devices… or into Windows hybrids and laptops, for that matter. At $1 billion or more per quarter, these devices can’t be bringing much to the bottom line. Microsoft is generating nice numbers in other parts of its business. One day, the company will have to more fully dedicate itself to what it does best.
Personal computing isn’t about to get boring.
This post originally appeared at Monday Note.