Maturity doesn’t have to be boring. The 40-year-old Apple doesn’t lack the challenges it needs, both external and internal, to stay interesting.
I’ll begin with a bit of the personal history that colors my views of Apple.
I was born a geek. In 1955, I salivated while looking at the first OC 71 transistor in the prefect of discipline’s office at the very Breton boarding school where I was sent to quell my agitation. In June 1968, many clandestine radios and other electronics projects later, I got the biggest break of my business life: HP put an end to years of psychological moratorium by taking me off the streets and giving me the dream job of launching their first desktop computer on the French market. I saw HP’s rise to dominance in two personal computing genres: pre-microprocessor desktop machines (the 16-bit 9800 series), and mobile, pocketable devices such as the HP-65. I loved it.
In late 1980, I got another significant break, another dream job: starting Apple France. After HP, having scrubbed the toilets at ailing French subsidiaries of American tech companies, I was ready for a clean slate.
At Apple, my feelings about personal computers organized themselves—I found better ways to explain and describe the source of our interest, and the roles these special machines played in our lives.
Simplifying, but without distorting the key concept: humankind needed a more flexible means of expression than hieroglyphs, mere pictures on a cave wall, and so invented alphabets and numerals, symbols that have no intrinsic meaning. Combined into sentences, phrases, and formulae, these symbols gave us tremendous power to think, persuade, seduce, and calculate. The same set of symbols could be used in sacred texts, Elizabethan poetry, Marcus Aurelius’ Meditations, Wall Street pitches, and general relativity.
But our invention was too much for our central nervous system: We had trouble memorizing long strings of symbols; few people could do long division in their heads, let alone extract cube roots.
Luckily, we are Homo faber, the tool-making species, and thus began a long procession of computing, storage, and communication devices, from the abacus to electro-mechanical machines, and on to big, expensive computers called mainframes. Electronics moved from tubes to transistors to integrated circuits, propelled by our unquenchable thirst for symbol manipulation. In the early 70s, 8-bit microprocessors appeared and the personal computer revolution started.
Apple, born on Apr. 1, 1976, wasn’t the first personal computer company—there was a plethora of early entrants such as Ohio Scientific, Victor, Commodore, Eagle, Tandy, Altair… Apple “just” managed to create a clean, simple design, thanks to Steve Wozniak, ex-Intel Mike Markkula, and a tireless, inspired, and inspiring promoter, Steve Jobs.
When I joined Apple, I saw the Apple symbol for what it represented: A machine that extended the reach of your mind and your body. A device that powered five key activities: think, organize, communicate, learn, and play. This was a truly personal computer that you could lift with your arms, your credit card, and your own brains.
In the past 40 years our personal computers have, of course, become immensely more powerful and convenient, but the roots of our interest in personal computing haven’t changed. And, against many odds, Apple has become a giant, world-spanning, immensely rich company.
In retrospect, we can see three Apple eras.
Apple 1.0 was a turbulent period: the rise of the Apple brand; its loss to the IBM PC and Microsoft; the hope and trouble with the Macintosh; Jobs forced out, followed by a succession of “professional” CEOs and progressively deteriorating finances.
In retrospect, Jobs’ departure from Apple was one of the best things that happened to him and the company he co-founded. If he hadn’t left for an outside tour, with the Pixar success and the NeXT technical achievements and business challenges, he wouldn’t have been able to return and jump-start the company’s next phase.
Apple 2.0 began in late 1996 when Jobs managed what turned out to be a reverse acquisition of Apple. We owe much gratitude to then-CEO Gil Amelio, who unwittingly saved the company by hiring Steve to “advise” him. Jobs’ advice? Show Amelio the door and install himself as “interim” CEO. Jobs then made an historic deal with Bill Gates that gave him time to let his team of NeXT engineers completely rebuild the Mac OS on a modern Unix foundation. Steve also rummaged through the company and found Jony Ive, who gave us the colorful iMacs, the first of a series of admired designs.
What followed is recognized as the most striking turnaround story in any industry, one that has been misunderstood and pronounced doomed at almost every turn. The list of Jobs’ “mistakes” includes killing the Macintosh clone program by canceling Mac OS licenses; getting rid of floppies and, later, CD/DVD-ROMs (mostly); entering the crowded MP3 player field; introducing iTunes and the micropayment system; the overpriced, underpowered $500 iPhone; the stylus-free iPad (ahem)…
We’ve seen the punishment for these mistakes: Apple sells approximately $250 billion worth of iPhones every year; that’s six phones manufactured and delivered every second, to more than 130 countries.
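To put the per-second figure in perspective, here’s a quick back-of-envelope conversion (my arithmetic, not an Apple disclosure): six phones per second works out to roughly 190 million units a year.

```python
# Back-of-envelope: convert "six iPhones per second" into annual units.
# Illustrative arithmetic only, not Apple's reported figures.
seconds_per_year = 60 * 60 * 24 * 365   # 31,536,000 seconds in a year
units_per_year = 6 * seconds_per_year   # at six phones per second
print(f"{units_per_year:,} iPhones per year")  # 189,216,000
```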
Despite its enormous size and influence, Apple’s business remains simple to understand. The company makes personal computers, as illustrated in this telling line-up:
[Image: Personal computers, small, medium, and large.]
Everything else Apple does—from iTunes to iCloud storage, apps, and accessories—has one and only one raison d’être: Push up the volumes and margins of the company’s personal computers.
Steve Jobs left us in early October 2011, “Too soon,” as I wrote in my heartfelt homage to the once unmanageable co-founder who turned into a manager extraordinaire, captain of industry, and editor-in-chief of a team of designers, engineers, supply-chain managers, and finance experts.
We’re now in the Apple 3.0 era, under Tim Cook’s leadership.
As I’ve written several times here, I admire Tim Cook’s calm determination. Jobs once enjoined his successors to avoid the “What would Steve have done” approach and, instead, make truly autonomous decisions. Indeed, in its fifth year under Cook, Apple continues to Think Different—and differently from what Steve Thought. For example, Apple now pays dividends, actively supports civil rights causes, maximizes recycling, and makes extensive use of renewable energies—activities that weren’t as much in focus in the Apple 2.0 era.
With predictable and dismaying regularity, however, observers question Apple 3.0’s future.
And, certainly, Cook’s Apple faces questions old and new.
The iPhone now carries about two-thirds of Apple’s revenue and probably more of its profits. To many, this creates a weakness: If Apple stays at the high end of the smartphone market, the company will be exposed to disruption from low-end Android clones, will see its margins erode, and, in the end, will be relegated to insignificance.
It’s a well-known and well-worn theory…and it needs a closer look.
Turning our gaze to the “full-size” personal computer market, what do we see? It’s in decline. The not-so-new Windows 10 hasn’t breathed life into the mid- to low-end of the segment. By contrast, the high-priced Macintosh line keeps gaining market share. I’d like to hear genuine experts, not clickbait netwalkers, explain why the approach that works so well for the Mac won’t work for the iPhone.
Another question: After a stellar beginning, the iPad seems to have stalled. How will Apple reverse its downtrending tablet sales? Horace Dediu once observed that the best way to predict Apple’s business is to listen to what Cook and other execs say. For the iPad, they keep saying “It truly is the future of personal computing.” They mean (my words, not theirs): We’ll make the iPad a truly hybrid tablet-laptop, a toaster-fridge device, like Microsoft’s Surface Pro machines, only better. Only idiots never change their minds. Jobs changed direction when it suited him; why not continue the tradition and borrow from the borrowers?
The iPad question is more than just a matter of convertible hardware and a once-mocked stylus—pardon—the “completely familiar, entirely revolutionary” Pencil. A more important issue is iOS vs. OS X. To put it succinctly, OS X has had a glorious past, but iOS is the future. There were 250M iOS devices sold last year vs. 25M OS X Macs…where would you put your best engineers? iPads will someday cannibalize MacBooks and iMacs. Not a real problem: the money stays in the family.
I won’t dwell too long on the Apple Watch. So far, pardon the pun, it cleans its competition’s clock. It may not sell as much as optimistic seers said it would, but it has done better than dismissive pessimists predicted. Right now it’s not a true, autonomous personal computer—it needs an iPhone—but some day it will gain more computing power, more sensors, more apps. When, how, and how fast, I don’t know. But it will.
Then there’s Apple 3.0 as a services company. Reading financial statements, we see that Apple has achieved significant growth in its services business. This is good and confirms that Apple is getting better at the game, but there are glitches. iTunes is still an abomination before the Lord, the App Stores are poorly curated, web versions of iWork apps pale in comparison to Microsoft’s Office Cloud implementation, and more. This is no ground for despair, just for worries about islets of mediocrity silently metastasizing inside the spaceship.
Looking further into the future, the putative Apple Car could inaugurate the Apple 4.0 era. Monday Note readers know I’ve gone back and forth between skepticism, based on technical and cultural challenges, and a more optimistic view of Apple’s possible contribution to a new genre of automobiles.
How far away is this future? Two weeks ago, at the SXSW (South By Southwest) festival, Chris Urmson, Google’s director of self-driving cars, gave a sobering yet helpful talk about his project’s future. Lee Gomes analyzes Urmson’s presentation in an IEEE Spectrum article [as always, edits and emphasis mine]:
“Not only might it take much longer to arrive than the company has ever indicated—as long as 30 years, said Urmson—but the early commercial versions might well be limited to certain geographies and weather conditions. Self-driving cars are much easier to engineer for sunny weather and wide-open roads, and Urmson suggested the cars might be sold for those markets first.”
Gomes then quotes Urmson:
“How quickly can we get this into people’s hands? If you read the papers, you see maybe it’s three years, maybe it’s thirty years. And I am here to tell you that honestly, it’s a bit of both.”
From a kremlinology angle, it’s more than mildly interesting to watch Alphabet Inc. prune and shape its projects, “liberating” the Boston Dynamics robotics team it once acquired, and telling us a real-world autonomous car is harder to make than we were once led to believe. I’m surprised at how little play this talk from Alphabet/Google got in the media.
From the Apple angle, things become clearer: the putative Apple Car, when/if it ever materializes, will be a more conventional electric vehicle, complete with Apple style and software smarts.
These are but the most visible open questions. The history of tech companies shows how change often comes from unexpected places, and how sudden it can be. Think Nokia, Microsoft, and IBM, to name a few. A page from my own history: I saw how less powerful but more flexible (and less expensive) 8-bit “microcomputers,” as they were first called, completely disrupted HP’s desktop computer business.
Apple won’t become boring with age. The company is just as exciting—and occasionally as unexpected—as it was 40 years ago. Of course, I owe Apple an unending debt: This is the company that made my life exciting and rewarding, and brought me to Silicon Valley.
This post originally appeared at Monday Note.