“Just you wait,” Intel kept telling us, year after year. “Yes, our legacy x86 architecture, dominant in the PC world, hasn’t yet won a place in smartphones and tablets, but our company’s superior manufacturing technology will inevitably lead to victory…”
Last month, Intel finally threw in the towel and tossed thousands of people to the curb. What happened?
Intel has had its share of failures: The sluggish iAPX 432; the eight-year BiiN misadventure with Siemens (from Wikipedia: “…the company name was considered to be an acronym for Billions Invested In Nothing”); the grand Itanium, quickly nicknamed Itanic, a 64-bit architecture that was (sort of) co-developed with HP in a convoluted arrangement that seemed aimed at snuffing the Palo Alto company’s competing Precision Architecture development.
Such failures, quickly swept under the rug, are expected and forgivable—the artifacts of a prosperous company that has the resilience and might to absorb a few missteps.
Less pardonable is Intel’s great sin of omission.
In 2005, fresh from moving the Macintosh to the x86 processor family, Steve Jobs asked Intel to fabricate the processor that would inspirit the future iPhone. The catch: This wouldn’t be an Intel design, but a smaller, less expensive variant of the emerging ARM architecture, with the moderate power appetite required for a phone.
Intel said no. Paul Otellini, Intel’s CEO at the time, manfully takes responsibility for passing up the opportunity [as always, edits and emphasis mine]:
I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.
…and, perhaps more importantly:
The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut. My gut told me to say yes.
The smartphone train left the station and Intel has never caught up.
This seductively simple summary fails to render a complicated story that started well before the “iPhone miss” and leaves at least one question still unanswered.
We’ll start with the Microsoft connection.
Consider two identical processors—same computing performance, power consumption, size, and manufacturing cost. One difference, however: processor I runs Windows, processor II doesn’t. Which one gets a higher market price? x86 processors fetch a significant premium because they—and only they—run Windows. I took a walk through Intel’s SEC filings over the past 11 years: The gross margin stays above 60% most of the time, peaking at 67.7%. These quasi-software margin numbers are the result of Intel’s symbiosis with Microsoft, a relationship that coined its own name: Wintel.
Even though the relationship is inarguably the source of the company’s fortune, Intel’s leaders resent the Wintel yoke. Their attempts to find another line of business of equivalent earnings power have led to failed server farms (before Amazon’s AWS), modems, even “Smart Toys” with Mattel: “The feedback we received from industry experts and families on our first Intel Play QX3 Computer Microscope product was tremendous,” said Jeff Abbate, director of the Intel® Play Smart Toy Lab.
Resentful as they were of Microsoft, Intel elders developed a culture based around their dependency on the software company. Manufacturing process became a weapon of choice against competitors: Intel’s semiconductor factories (fabs) could crank out a seemingly unlimited number of ever-more powerful chips, which meant more powerful Windows-laden PCs, which led to quicker PC replacement sales, which led to Wintel-sized profits that fueled the creation of bigger, more productive fabs (and those became increasingly expensive: The aborted Fab 42 in Chandler, Arizona—once lauded by President Obama as an exemplar of US manufacturing prowess—would have cost $5 billion).
The Wintel virtuous cycle was running at full-speed when Steve Jobs made his visit to Intel. Despite Otellini’s gut feelings, the iPhone was an unproven product in a misunderstood market. It didn’t fit Intel’s financial model—which reminds us that a spreadsheet isn’t a window through which to view the future.
The putative Apple business didn’t fit the Intel culture in other ways. Intel styled itself as a designer of microprocessors, not a mere fabricator of someone else’s design. By controlling the design (and thus the cost and timetable) for x86 processors, Intel maintained power over PC manufacturers: Here’s what you’re going to get, how much you’re going to pay, and when you’ll get your shipment.
Intel’s design power translated into pricing power. As a mere fabricator, Intel would have to take a step down the power and pricing ladder. This was incompatible with Intel’s culture, which, by then, was thoroughly hooked on the Wintel margins drug.
(This same cultural incompatibility may explain why, in 2006, shortly after turning down the iPhone, Intel sold its XScale ARM business to Marvell. Unlike the “Any Color As Long As It’s Black” x86, an ecosystem of building blocks and processor extensions had quickly grown up around the ARM architecture, thus allowing device manufacturers to custom tailor ARM chips and get the best fit for their products. Bad cultural fit, out goes XScale.)
But Intel had a justification, a story that it kept telling the world and, more perniciously, itself:
Just you wait. Yes, today’s x86 are too big, consume too much power, and cost more than our ARM competitors, but tomorrow… Tomorrow, our proven manufacturing technology will nullify ARM’s advantage and bring the full computing power and immense software heritage of the x86 to emerging mobile applications.
Year after year (after year), Intel has repeated the promise. There are some variations in the story, such as the prospect of the 3D transistor, but mobile device manufacturers don’t seem to be listening. As a result, Intel has had to resort to buying affection:
“Contra revenue,” as Intel calls it, sounds innocuous enough. One of those murky financial terms that are ignored by people outside of the investment community…Intel is, in essence, paying tablet makers to adopt its Bay Trail Atom chips because it needs to catch up.
As ARM devices have become faster and more powerful, with better add-on units such as graphics and communication processors, Intel’s “Just you wait” story has become less believable. But let’s not dwell on the reason why so many outside Intel have parroted the party line. Let’s ask another question, instead: When it became obvious that the iPhone was a success, and particularly when Apple announced the A4 chip, why didn’t Paul Otellini jump in his car and head over to Cupertino’s 1 Infinite Loop? Why didn’t he make an offer that Jobs (and then Tim Cook) couldn’t refuse, such as replacing frenemy Samsung as Apple’s Ax mobile processor manufacturer?
I’ve heard tentative explanations that center on financial self-preservation, that retooling its fabs for the Ax would have been too costly for Intel. I’m unconvinced… Self-deception is probably a more apt word. That’s what cultures do, living right below the surface of what we persist in calling our consciousness, invisibly shaping our perceptions, creating our own reality distortion fields.
Last month, Intel’s self-deception finally became untenable. CEO Brian Krzanich, who succeeded Paul Otellini in 2013, essentially called it quits on the company’s mobile x86 efforts, offering a born-again strategy focusing on the Cloud, the Internet of Things, and 5G wireless technology…
Because Intel’s management believed their “Just you wait” story too deeply and for too long, 11% of Intel’s workforce—12,000 people—are in the process of losing their jobs. It’s what Intel gingerly calls a “restructuring initiative to accelerate transformation.”
When Intel announced layoffs of about 3% last year, CEO Brian Krzanich had this to say: “This is the way a meritocracy works.” One wonders: To whom do those words apply when contemplating Intel’s failure in the mobile world?
Postscript: I recommend Inside Intel by Tim Jackson, a book that shows the company at its most combative and paranoid, and discusses its use of “extra-legal” tactics. Surprisingly, Intel didn’t sue the book’s author.
This post originally appeared at Monday Note.