The rise and fall of AMD: A company on the ropes

The conclusion of our two-part series on AMD. Part one covered AMD’s attempts to transform itself from a second-source supplier of Intel designs into a chipmaker powerhouse in its own right.
Athlon 64, and AMD’s competitive peak
Overall, the Opteron’s K8 architecture was similar to the K7’s, but with two key differences. The first was that the CPU incorporated the system’s memory controller into the chip itself, which greatly reduced memory latency (albeit at the cost of some flexibility; new CPUs had to be introduced to take advantage of things like dual-channel memory and faster memory types like DDR2). This showed that AMD saw the benefits of incorporating more capability into the CPU itself, an instinct that would inform the later purchase of GPU maker ATI Technologies.
The second difference, and the K8’s biggest benefit for servers, was its 64-bit extensions. The extensions enabled AMD’s chips to run 64-bit operating systems that could address more than 4GB of memory at a time, without sacrificing compatibility or speed when running then-standard 32-bit operating systems and applications. These extensions would go on to become the industry standard, beating out Intel’s alternative 64-bit Itanium architecture; Intel even licensed the AMD64 extensions for its own compatible x86-64 implementation. (Itanium could only run x86 code with an immense performance penalty.)
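If you're wondering where that 4GB figure comes from, it's simple address arithmetic: a 32-bit pointer can only distinguish 2^32 distinct byte addresses. Here's a minimal C sketch of the math (illustrative only; this is just the powers of two involved, not anything AMD-specific):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* A 32-bit pointer can name 2^32 distinct byte addresses:
           4,294,967,296 bytes, i.e. exactly 4 GiB. */
        uint64_t limit_32bit = (uint64_t)1 << 32;
        printf("32-bit address space: %llu bytes (%llu GiB)\n",
               (unsigned long long)limit_32bit,
               (unsigned long long)(limit_32bit >> 30));

        /* AMD64 widens pointers to 64 bits. Even the 48 virtual address
           bits that early AMD64 chips actually implemented cover 256 TiB. */
        uint64_t limit_48bit = (uint64_t)1 << 48;
        printf("48-bit address space: %llu TiB\n",
               (unsigned long long)(limit_48bit >> 40));
        return 0;
    }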
The K8 architecture was successful on the desktop in the form of the Athlon 64 lineup, but it was the Opteron server variants that brought AMD real success in the high-margin market. By the time Intel introduced dual-core Xeons based on its Core architecture in September of 2006, AMD had snapped up an estimated 25 percent of the server market. AMD continued to iterate successfully on K8 for a few years, making several architecture tweaks and manufacturing process upgrades and even helping to usher in the multicore era of computing with the dual-core Athlon 64 X2.
The Opteron CPU, and the K8 architecture upon which it was based, helped AMD break into some new and lucrative markets. (Photo: Flickr user tamasrepus)
Despite these technical successes, AMD’s financial situation had become precarious. Processor unit sales were falling, and margins on most chips dropped quickly after 2000. AMD also struggled with excess inventory; in the second half of 2002, the company announced that it had “to limit shipments and to accept receipt of product returns from certain customers” because the chips it made weren’t selling fast enough. AMD posted net losses of $61 million in 2001, $1.3 billion in 2002, and $274 million in 2003.
What was sucking away the company’s money? It was those darned fabs, just as Raza had feared. In its 2001 10-K, AMD estimated that “construction and facilitation costs of Dresden Fab 30 will be approximately $2.3 billion when the facility is fully equipped by the end of 2003.” There was also an additional $410 million for AMD Saxony, the joint venture and wholly owned subsidiary that managed the Dresden fab.
By the following year, AMD had upped its estimated cost to fund Dresden to $2.5 billion, adding that it had invested $1.8 billion by the end of 2001. The estimated costs continued to rise, per the 2003 10-K: “We currently estimate that the construction and facilitation costs of Fab 30 will be $2.6 billion when it is fully equipped by the end of 2005. As of December 29, 2002, we had invested $2.1 billion in AMD Saxony.” That same year, AMD plowed ahead with a new Dresden fab (“Fab 36”), investing $440 million in it by the end of the year.
The money for these huge investments depended on AMD’s ability to sell chips, and selling chips was made far easier by the company’s competitive edge over Intel. Unluckily for AMD, Intel didn’t take this challenge lying down.
Full Story: The rise and fall of AMD: A company on the ropes | Ars Technica.
