Hardware sites are buzzing with news of the release (finally!) of ATI/AMD's new HD2900XT card. It sports some pretty awesome technology, and even has a GPU with an insane 700 million transistors. It does pretty well in 3DMark06. But uh, how does it do in games?

I'm of the view that benchmarking programs like 3DMark are a total joke. They are a poor indicator of real-world performance, and the recent batch of HD2900XT reviews seems to validate that. While the card competes with or even surpasses nVidia's flagship 8800GTX in some 3DMark06 tests, it's just not up to snuff in the real-world performance department.

Perhaps the most concise summary of the new card's performance comes from the chaps over at [H]ardOCP, who, unlike everyone else on the planet, refuse to use synthetic benchmarks for hardware comparisons. They always run their cards through tough real-world performance tests. This is what they had to say about the HD2900XT:

“A day late and a dollar short.” Cliché but accurate. The Radeon HD 2900 XT is late to the party and unfortunately is bringing with it performance that cannot compete. The GeForce 8800 GTS 640 MB is $50 cheaper, performs better, and draws a lot less power than the 2900 XT.

This is as good as it is going to get for a while from ATI. The GeForce 8800 GTX will still dominate at the high end of the video card market. Of course we do not know about DX10 games yet, and there is no way to make any predictions how that comparison will turn out. As it stands right now the Radeon HD 2900 XT, in our opinion, is a flop.

The fact that most tests show the card struggling to compete with a stock 8800GTS demonstrates that, despite packing a significant amount of advanced technology, the card suffers from some inherent design flaws. I for one speculate that ATI delayed the card because it realized performance was not meeting expectations, and spent the time tweaking things and figuring out a competitive price point.

The ridiculous part of this is that there is already talk among enthusiast communities of "wait for the R700" or "wait for DirectX 10 games" or "wait for better drivers" from folks clearly flabbergasted by the unexpectedly midrange performance of this card. Well folks, the R700 doesn't exist, DirectX 10 games are months away, and the HD2900XT is launching no fewer than nine months after its original intended release date, so ATI has certainly had plenty of time to polish its drivers.

On a personal note, I have conflicted feelings about the performance of this card. On one hand, I'm glad I didn't wait around for six months: I purchased my 8800GTS back in December, overclocked the tar out of it, and it's been giving me absolutely stellar performance for the last few months. On the other hand, we're faced with the simple and unfortunate reality that ATI's flop means less competition in the GPU marketplace, and that, in the long haul, is never good for gamers. We've already seen the 8800GTX launch at a staggering $599, and recently nVidia launched a slightly overclocked version of the card, tagged the 8800 Ultra, for an astonishing $829. In other words, nVidia wants nearly $300 for a slight bump in clock frequencies, one that can easily be achieved on a stock 8800GTX (indeed, many factory-overclocked 8800GTX cards outperform the 8800 Ultra).

Right now, ATI has conceded the high end to nVidia and released the HD2900XT at a rather reasonable $399. So based on price point alone, it may not be a total flop. But there is one very interesting point to consider: some benchmarks have shown that at the super-high 2560×1600 resolution (like you see on 30" LCD monitors) and/or with extreme antialiasing modes such as 16xAA, the HD2900XT actually performs reasonably well against nVidia's flagship 8800GTX, probably because of its insanely high memory bandwidth. So given the card's $150 price advantage over the 8800GTX, perhaps there is indeed a place for it at the high end, amongst an extreme niche of ultra-high-end enthusiast gamers. For the rest of us, however, ATI has failed to provide a compelling competitor to nVidia's popular and impressive 8-series cards.


1 Comment

16 years ago

I’m starting to get a bit worried about the inflationary tendencies I’m seeing in hardware prices. With the latest round of consoles (excluding the Wii, which is in a very weird position somewhere between generations) starting out at $100 more than the last generation and topping out at a good $300 more, and the latest “budget” graphics cards weighing in at the price point of the last series’ top-of-the-line… Well, it’s just not striking me as a good time to be a low-income gamer. I do understand that all of these things far outperform the hardware they’re replacing, but that…