Four years later: Why did Apple drop PowerPC?
Updated at 5:25 p.m. PDT: adding Windows discussion.
It's been four years this month since Apple announced it would drop the PowerPC architecture and switch to Intel's x86 design. One person involved in the back-and-forth between Apple and IBM at the time provides some insight into why it happened.
When Apple made the watershed announcement in June 2005 ending its longstanding relationship with IBM and Motorola, Apple CEO Steve Jobs attributed the switch to a superior Intel roadmap.
"Looking ahead Intel has the strongest processor roadmap by far," Jobs said in a statement at the time. "It's been ten years since our transition to the PowerPC, and we think Intel's technology will help us create the best personal computers for the next ten years."
One oft-cited reason was that Apple didn't believe it could get the requisite performance per watt from processors being supplied by IBM and Freescale--formerly Motorola's chipmaking arm. Translation: Apple was worried about IBM's and Motorola's ability to deliver competitive processors for laptops. (Update: Another reason often put forward is that Apple simply wanted to be able to run Windows.)
A former IBM executive, who worked at IBM at the time and was involved in discussions with Apple, offered his perspective in a conversation we had during dinner at a recent technology conference. Let me emphasize that this is one person's opinion, not necessarily the gospel truth. I will not publish his name or title.
While he acknowledged the public reasons Apple put forward, he said there was more to it--not surprisingly--than that. The upshot: Apple wanted better pricing, according to this person.
Apple was paying a premium for IBM silicon, he said, creating a Catch-22. IBM had to charge more because it didn't have Intel's economies of scale, but Apple didn't want to pay more, even though it supposedly derived more performance from an inherently superior RISC design as manifested in the PowerPC architecture.
Here's what Jobs said in 2003: "The PowerPC G5 changes all the rules. This 64-bit race car is the heart of our new Power Mac G5, now the world's fastest desktop computer," Jobs said in a statement. "IBM offers the most advanced processor design and manufacturing expertise on earth, and this is just the beginning of a long and productive relationship." (Sounds suspiciously similar to what Jobs said about Intel after Apple made the switch.)
Despite the praise heaped on IBM's technology in 2003, Apple believed, by 2005, that IBM couldn't compete on cost, according to this person.
For IBM, the business with Apple was a financial sinkhole because the company had to invest a lot of money in chipsets, compilers, and other supporting technologies but could only take about 5 percent of the overall PC processor market, he said. So, in the end, it was impossible to make money.
Why 5 percent? Apple insisted on double sourcing (IBM and Motorola). So, from the start, this left IBM with about half the market it could have had. This, he said, was an enormous financial burden. Paraphrasing the ex-IBMer: Intel was a single company with the lion's share of the market, while two companies--IBM and Motorola--had to divvy up a much smaller share of the market, each still investing tremendous amounts of money. And Apple played one against the other, according to this person.
IBM had been concentrating on delivering high-performance, single-core PowerPC processors, this person said. (Presumably by ratcheting up the gigahertz rating on single processors. The goal was to exceed 3GHz.) But when Intel, as part of the discussions with Apple, showed a dual-core (multi-core) processor roadmap, Apple reconsidered this strategy, according to this person. (Though IBM did deliver multi-core PowerPC designs for the Mac as shown in the graphic.)
Interestingly, IBM had hoped to amortize the cost of PowerPC on Cell, the PowerPC-based chip design now used in the Sony PlayStation 3, some IBM servers, and IBM's Roadrunner supercomputer. Big Blue was hoping to move Apple to Cell and then get the economies of scale there, according to this person.
Can parallels be drawn with Advanced Micro Devices and its struggles to compete with Intel over the last few years? Possibly. Very few chipmakers have the multibillion-dollar coffers to fund the R&D and manufacturing necessary to be a leader in a major chip market, let alone stay competitive. Witness AMD last year going to the brink and then saving itself by spinning off its manufacturing operations.
And Apple chose Intel, not AMD, in 2005 and has stayed with that single source for its Mac line since.