The first Radeon was superior to Nvidia’s GeForce2 in almost every way, but it set the tone for how AMD would fare against the jolly green giant for the next 25 years.

Futureproof features mean squat here and now.

It only seems like yesterday that I got my hands on ATI Technologies’ first and much-anticipated Radeon graphics card, but time has a funny way of playing tricks on your memory: it was actually 25 years ago, to this very week, that I managed to easily pick one up at launch (how times have changed). The Radeon DDR joined my reasonably sized collection of graphics cards and, on paper, it was the best of them all.

But what transpired over the following months not only put me off Radeon cards for a good while, until the 9700 appeared in 2002, but it also set the tone as to how ATI and eventually AMD would fare against Nvidia in the graphics card market.

But let’s start with why I was so looking forward to the first Radeon. At that time, I had around 12 graphics cards spread across a number of PCs, most of which were used for testing. I can’t remember everything I had but I do recall an Nvidia Riva TNT2 Ultra and a GeForce 256, a Matrox Millennium G200, and a 3dfx Voodoo 3 3000.

The Radeon DDR made them all look rubbish, though. Its GPU had two pixel pipelines, each sporting three texture units, and with a clock speed of 166 MHz, it could churn out over 300 million pixels and almost 1,000 million texels per second. The GeForce 256 couldn’t get anywhere near those figures.

Admittedly, neither could the Radeon itself, as it just didn’t have enough memory bandwidth to support those throughputs in actual games, but then no graphics card did at the time. ATI was determined to make its graphics processor more futureproof than Nvidia’s, and while both companies offered hardware acceleration for vertex transform and lighting (aka hardware T&L), ATI also added vertex skinning and keyframe interpolation into the pipeline.

A photo of an ATI Radeon DDR 64 MB graphics card

The 64 MB version of the Radeon DDR (Image credit: Future, Anandtech)

Just a few weeks later, Nvidia launched its NV15 processor and, within a few days, a GeForce2 GTS arrived on my doorstep. Essentially a pumped-up GeForce 256, the new Nvidia chip sported a much higher clock speed than its predecessor, along with twin texture units in each pixel pipeline. But that was it, and the Radeon DDR still looked to be the better product (more features, better tech) even though it offered less peak performance.

Coding legend John Carmack agreed at the time: “The Radeon has the best feature set available, with several advantages over GeForce…On paper, it is better than GeForce in almost every way.” When it came to actually using the two cards in games, though, the raw grunt of the GeForce2 GTS put it well ahead of the Radeon DDR, except in high-resolution 32-bit colour gaming.

That’s partly because no game was using the extra features sported by the Radeon but it was mostly because the GeForce2’s four pixel pipelines with dual texture units (aka TMUs) were better suited for the type of rendering used back then. ATI’s triple TMU approach was ideal for any game that did lots of rendering passes to generate the graphics but none were doing this, and it would be years before games did. By then, of course, there would be a newer and faster GPU on the market.
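As a back-of-the-envelope check, the peak throughput figures quoted above fall out of a simple multiplication: core clock times pixel pipelines gives the pixel fill rate, and multiplying by the texture units per pipeline gives the texel rate. Here's a quick sketch of that sum; note the GeForce2 GTS's 200 MHz core clock is my recollection of the period spec rather than something stated in this piece, so treat it as illustrative:

```python
def fill_rates(clock_mhz: int, pipelines: int, tmus_per_pipe: int) -> tuple[int, int]:
    """Return (pixel fill rate, texel fill rate) in millions per second."""
    pixels = clock_mhz * pipelines        # each pipeline emits one pixel per clock
    texels = pixels * tmus_per_pipe       # each pixel can sample one texture per TMU
    return pixels, texels

# Radeon DDR: 166 MHz, 2 pipelines, 3 TMUs each (as described above)
radeon = fill_rates(166, 2, 3)    # (332, 996) -- ~1,000 Mtexels/s on paper

# GeForce2 GTS: 4 pipelines, 2 TMUs each; 200 MHz clock is an assumption
geforce2 = fill_rates(200, 4, 2)  # (800, 1600)
```

The numbers show why single-pass games favoured the GeForce2: its four pipelines more than double the Radeon's pixel rate, while the Radeon's third TMU only pays off when a game actually samples three textures per pixel.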

Something else that Nvidia did better than ATI was driver support, specifically stable drivers. I spent more time dealing with crashes, rendering bugs, twitchy behaviour when playing in 16-bit mode, and so on than I ever did with any of my Nvidia cards. The Radeon DDR was also my first introduction to the world of ‘coil whine’ as that particular card squealed and squeaked constantly. I spent many weeks conducting back-and-forth tests with ATI to try and narrow down the source of the issue but to no avail.

An AMD Radeon 9800 Pro graphics card.

ATI’s halcyon days: the Radeon 9800 Pro (Image credit: Future, Anandtech)

Drivers would continue to be ATI’s bugbear for years to come, even after AMD acquired the company in 2006. By then, the brief glory of the Radeon 9700 Pro and 9800 Pro (and the abject misery of the GeForce FX) was gone, despite the likes of the Radeon X1900 and X1800. Nvidia’s graphics cards weren’t necessarily any faster, nor did they sport any better features, but the drivers were considerably better, and that made the difference in games.

Over the years, AMD would try again with all kinds of ‘futureproof’ GPU designs, such as the compute-focused design of the Radeon R9 Fury and its on-package High Bandwidth Memory (HBM) chips. Where Team Red always seemed to be looking ahead, Nvidia was focused on the here and now, and it wasn’t until 2019 that AMD finally stopped trying to over-engineer its GPU designs and stuck to making them good at gaming, in the form of RDNA and the Radeon RX 5000-series.

But even now, AMD just can’t seem to help itself. Take the chiplet approach of the last-gen RX 7900 XTX: it didn’t make a jot of difference to AMD’s profit margins or its discrete GPU market share, or even make the graphics cards any faster or cheaper. With the recent RX 9070 lineup, though, Team Red has finally produced a graphics card that PC gamers need now. Even its driver team is on the ball, and I can’t recall when I last said that about AMD.

Arguably, Nvidia seems to have gone down an old Team Red road of late, what with iffy GeForce drivers and graphics cards that are barely any faster than the previous generation but are replete with features for the future (e.g. neural rendering). But when you effectively control the GPU market, you can afford to have a few hiccups; AMD, on the other hand, has to play it safe.

I just wish it had done so years ago when it was battling toe-to-toe with Nvidia at retailers. The Radeon RX 9070 XT is the best card that it’s released for a long time and although its price is all over the place at the moment, the fundamental product is really good (and far better than I expected it to be). Just imagine how different things would be today if AMD had used the same mindset behind the design of the 9070 for everything between the X1800-series and the first RDNA card.

Will Radeon still be a household name in another 25 years? I might not be around to find out, but I wouldn’t bet against it: despite all its trials and tribulations, it’s still here and still going strong.
