Tuesday, April 24th 2007

ATI Radeon HD 2900 XT Performance Benchmarks

The title speaks for itself. DailyTech has managed to run some benchmarks with the ATI Radeon HD 2900 XT 512MB graphics card. The tests were conducted on an Intel D975XBX2 "BadAxe2" motherboard with an Intel Core 2 Extreme QX6700 and 2x1GB of DDR2-800 memory. The operating system on the test system was Windows XP, with a fresh install before benchmarking each card. Testing of the AMD ATI Radeon HD 2900 XT was performed using the 8.361 Catalyst RC4 drivers, while the GeForce 8800 GTS used ForceWare 158.19 drivers. All game tests were run at maximum detail settings at a resolution of 1280x1024.

The ATI Radeon HD 2900 XT is expected to be widely available in mid-May, with a suggested retail price of $499.

Source: DailyTech

94 Comments on ATI Radeon HD 2900 XT Performance Benchmarks

#1
DrunkenMafia
Man that's weird, I thought in that other review the 2900XT was just beating the 8800GTX???? That one has the XTX getting pwned by the 8800GTX... who to believe? I can tell you now that I don't reckon ATI would make everyone wait all this time for a card that is not as fast as the opposition's card, unless they are trying to go bust... Maybe that's what they are doing... Oh well, I will believe it when I actually see it. How long now, 3 weeks??
#2
Zalmann
by: DrunkenMafia
Man that's weird, I thought in that other review the 2900XT was just beating the 8800GTX???? That one has the XTX getting pwned by the 8800GTX... who to believe? I can tell you now that I don't reckon ATI would make everyone wait all this time for a card that is not as fast as the opposition's card, unless they are trying to go bust... Maybe that's what they are doing... Oh well, I will believe it when I actually see it. How long now, 3 weeks??
I think they were quoting the 8800GTS and not the GTX. Still, it doesn't add up, as these new benchmarks state different numbers from those originally posted.

The HD 2900XT benchmarks used release candidate drivers. These new benchmarks use the drivers supplied to their partners, which are pretty much the final release.
#3
DrunkenMafia
Are you seriously telling me that the flagship ATI card that ISN'T even OUT yet is not as good as a 9-month-old Nvidia one...... oooooh, that would be a bad bad bad move on ATI's part... heh. That would be like the 8800GTX coming out and being slower than an X1950... that didn't happen...

I couldn't care less about optimised drivers either; that card SHOULD tromp any of the existing cards on the market... maybe the ATI supernerd techies just aren't as good as the Nvidia supernerd techies...


BTW, "supernerd techies" is by no means an actual REAL-life position... hahaha
#4
Zalmann
by: DrunkenMafia
Are you seriously telling me that the flagship ATI card that ISN'T even OUT yet is not as good as a 9-month-old Nvidia one......
In a nutshell, yes, that is what the article is stating.
#5
DrunkenMafia
Whoa, that sux bigtime... do I believe these benches though...
#6
Zalmann
I have a theory: maybe this is the cause of the delays in getting the R600 out to the public. It was probably to buy time for them to look at how to extract more performance out of the GPU. This is only my (conspiracy) theory though; whether or not it's true is another thing.
#7
Casheti
If this whole ATi fiasco is true, then I'm pissed :banghead:

But seeing as I dislike AMD, and like ATi, I can comfort myself by simply saying this is AMD's fault :)
#8
newbielives
Looks like this may all be true, since they've been hiding their benchmarks for so long instead of bragging about them like they should.
#9
Tatty_One
Senior Moderator
by: Zalmann
I have a theory: maybe this is the cause of the delays in getting the R600 out to the public. It was probably to buy time for them to look at how to extract more performance out of the GPU. This is only my (conspiracy) theory though; whether or not it's true is another thing.
I think you may well have a very valid point there!
#10
SpoonMuffin
From what I read the XTX wasn't going to be available till later. Oh well, we will see the real numbers once the cards are in real test labs and the NDAs are gone.
#11
wazzledoozle
I don't care about DX10 cards' performance in DX9, I care about DX10. For all we know, the R600 could be utter shit in current games but own it up in Crysis etc.

Radeon 9700 anyone?
#12
theonetruewill
I'm confused about the X2900XTX's performance, but actually more concerned with the 8800GTX's supposed results.

The Oblivion fps is just too high IMO at 1920x1200 - 98.4?! Surely it can't be that high. Same with FEAR's result at this resolution.
#13
SpoonMuffin
by: wazzledoozle
I don't care about DX10 cards' performance in DX9, I care about DX10. For all we know, the R600 could be utter shit in current games but own it up in Crysis etc.

Radeon 9700 anyone?
Then again wazzle, once the 9700 got driver updates it started pulling ahead of the FX 59*0 cards. Hell, the 9600 256MB I used to own was faster than the 5800 Ultra I had, A LOT faster, even in DX7 and DX8 games at decent resolutions (1152 range). And in DX9, well, we all know how the FX line did/does in DX9 games.

Don't take any current results as gospel; it's like anything else, the proof is in the pudding, as my grandfather used to say.

I will stick with my current card till I see a real valid reason to upgrade, especially if I can add a 2nd card to run as a PPU in this system.

I avoid first-gen cards these days, because the FX line was first gen and sucked ass, and the GF6 line was better BUT still had problems due to poor driver support. I went from a 9600 to a 9800SE 256-bit hard-modded to Pro at higher-than-XT clocks (my buddy still uses the card, he's now the 4th owner, it went from friend to friend to friend lol).
Then I got an X800 Pro VIVO that I flashed to XT PE the day I got it, this after I tried a 6800GT and sold it due to driver problems (3 of us all had the same problems with the 6800GT/Ultra cards' drivers...). Thankfully the 6800GT was selling for VERY nice prices at the time, so selling it got me more than enough to get the X800 Pro VIVO and a good add-on cooler :)
Before anybody says that the X800 was a first-gen card, it wasn't; it was/is effectively an evolution of the 9700 core, a lot more powerful sure, but still very similar.

I avoided the X1800 line because, why upgrade to something that's going to give you no performance advantage over what you have?
I got a nice price for a laptop I sold to a buddy; as payment I had him order my parts for this system at Newegg. It worked out really well IMHO, I paid 278 bucks for an X1900XTX and 28 bucks for a VF900-Cu :)
#14
SK-1
by: theonetruewill
I'm confused about the X2900XTX's performance, but actually more concerned with the 8800GTX's supposed results.

The Oblivion fps is just too high IMO at 1920x1200 - 98.4?! Surely it can't be that high. Same with FEAR's result at this resolution.
I never saw that... very astute of you to notice. I am not sure about these benchmarks at all now. A few posts back someone mentioned that IF the R600 was really kickin' it, then they would be letting loose a LOT of benchies... I do tend to believe this.
#15
SpoonMuffin
True, FEAR is CPU-limited in most cases, not GPU, and Oblivion is horribly unoptimized for video and CPU. Somebody needs to recode the damn thing so it runs properly on PC hardware... *grumbles about how poorly it runs on even the best systems*
#16
Kasparz
If these benchmarks are right (I know they're not right ;) ), then explain this: the 2900XT is a lot faster than the 8800GTS, yet the 2900XT is a lot slower than the 8800GTX. That would make the 8800GTS half as fast as the 8800GTX, but it isn't. Second, GDDR4 has weaker timings than GDDR3, but that doesn't hurt performance much. The 2900XTX has much faster RAM, a slightly faster GPU clock, and an extra 512MB of memory, so it should be a lot faster than the 2900XT, yet in some benchmarks the XTX is slower than the XT. Yeah, buy me a Barbie...
So there are two possible reasons for this: either the cards are defective (remember the 8800GTX's wrong resistor value?) and didn't switch to 3D mode, or there were some driver problems, or Nvidia paid DailyTech to do all this.
#17
Casheti
by: SpoonMuffin
True, FEAR is CPU-limited in most cases, not GPU, and Oblivion is horribly unoptimized for video and CPU. Somebody needs to recode the damn thing so it runs properly on PC hardware... *grumbles about how poorly it runs on even the best systems*
A friend with an E6600 and 7950GT SLI still lags sometimes in Oblivion at 1280x1024 :wtf:
#18
WarEagleAU
Bird of Prey
by: Kasparz
If these benchmarks are right (I know they're not right ;) ), then explain this: the 2900XT is a lot faster than the 8800GTS, yet the 2900XT is a lot slower than the 8800GTX. That would make the 8800GTS half as fast as the 8800GTX, but it isn't. Second, GDDR4 has weaker timings than GDDR3, but that doesn't hurt performance much. The 2900XTX has much faster RAM, a slightly faster GPU clock, and an extra 512MB of memory, so it should be a lot faster than the 2900XT, yet in some benchmarks the XTX is slower than the XT. Yeah, buy me a Barbie...
So there are two possible reasons for this: either the cards are defective (remember the 8800GTX's wrong resistor value?) and didn't switch to 3D mode, or there were some driver problems, or Nvidia paid DailyTech to do all this.
Nice theory there, Kasparz. I'm not sure what's going on, but it could be the reason ATI was holding off on the GDDR4 models of this card (not the X1950XTX, which uses GDDR4 too, if I'm not mistaken). There are inconsistencies between the 8800GTS, GTX and HD 2900XT numbers here and those in the first post of the HD 2900XT vs. the GTS. As was said, that worries me as well. We can banter back and forth on the numbers, but if they are true, it seems ATI can't exactly catch Nvidia with this generation, though perhaps with the HD 3000XTX (just throwing this out there; I would have gone with HD 2800XT and XTX first, but who knows why they chose these monikers).
#19
Tatty_One
Senior Moderator
by: theonetruewill
I'm confused about the X2900XTX's performance, but actually more concerned with the 8800GTX's supposed results.

The Oblivion fps is just too high IMO at 1920x1200 - 98.4?! Surely it can't be that high. Same with FEAR's result at this resolution.
Why would you be concerned about the 8800GTX's benches and performance in those tests? You quote, for example, its scores at 1920x1200 in FEAR. Are you aware that a 1950XTX CrossFire setup in FEAR cannot compete with a single 8800GTX at stock if you use "average framerate achieved"? I mention this purely as a real-world comparison, as there is a guy on another forum I visit who has owned both and actually did a comparison and detailed it in a thread. He got an average of 89FPS and a MAX of 184 at that specific resolution... it is here, POST 25:

http://www.hardforum.com/showthread.php?t=1118553&page=2

I could not find a bench at that particular resolution for Oblivion specifically for the 8800GTX, but I found one at 1600x1200 and it was running with AA/AF and HDR enabled and achieving 123FPS!! Now I am not saying this release is accurate at all, and TBH I have some doubts over the comparison of the two cards in question, but I have little doubt that on the right rig the 8800GTX can achieve around what they said it can in this thread starter.