Wednesday, October 16th 2013

Radeon R9 290X Pitted Against GeForce GTX TITAN in Early Review

Here are results from the first formal review of the Radeon R9 290X, AMD's next-generation flagship single-GPU graphics card. Posted by Chinese publication PCOnline.com.cn, it pits the R9 290X against the GeForce GTX TITAN and the GeForce GTX 780. An out-of-place fourth member of the comparison is the $299 Radeon R9 280X. The tests present some extremely interesting results. Overall, the Radeon R9 290X is faster than the GeForce GTX 780, and trades blows with, or in some cases surpasses, the GeForce GTX TITAN. The R9 290X performs extremely well in 3DMark Fire Strike, and beats both NVIDIA cards in Metro: Last Light. In other tests, it's halfway between the GTX 780 and GTX TITAN, leaning closer to the latter in some. Power consumption, on the other hand, could either dampen the deal or be a downright dealbreaker. We'll leave you with the results.
More results follow.

Source: PCOnline.com.cn

121 Comments on Radeon R9 290X Pitted Against GeForce GTX TITAN in Early Review

#76
Am*
HumanSmokeFirstly, Titan (and the identical power delivery reference GTX 780) seems to be a reasonable seller for Nvidia...So by your reckoning Nvidia shot themselves in the foot by selling a load of high-revenue GPUs in a package that keeps warranty failure rates low for eight months...
Let's get a few things straight: first of all, the Titan was a terrible seller for Nvidia as a GeForce card, and anyone who told you otherwise is delusional, to say the least. If you read that from Nvidia, "reasonable" means a piss-poor seller where it mattered, because had it sold well, Nvidia would be screaming it from the rooftops, and AMD wouldn't think twice about doing the same thing if they could. This joke of a card had a small surge of pre-order sales in the first month from several pro-market customers who wanted to test the waters with lots of cheap double-precision/CUDA cards, where the Titan was the cheapest option (and since GK104 is a complete joke in this area), which Nvidia could've milked for FAR more money than they did.

Secondly, the Titan has been an expensive dust collector for all the biggest e-tailers in our country since the first few weeks of release, and I can count on one hand the number of people in the UK that I know, or have seen on forums and gaming communities, who bought one (even where everyone is most likely to brag about their setups, I can literally point out most of the Titan owners). What all the Titan owners/ex-owners complained about was the underpowered stock design and the lack of 3rd-party alternatives to fix the problems. So yeah, they lost plenty of potential sales, whether you like it or not.
HumanSmokeSecondly, Nvidia have had two salvage parts collecting revenue and dominating the review benchmarks for the same length of time. Bonus point question: When was the last time Nvidia released a Quadro or Tesla card that didn't have a GeForce analogue of equal or higher shader count?

* Answer: Never
Congrats on stating the obvious, Sherlock. Did you see me mention anywhere that it won't be released? No... the question is how late it will be, and whether anybody who wants it will be left by the time it actually comes out.
1d10tMight have something to add: no matter how fast your monitor, Windows only sees it at 60Hz. It's in the panel, not in the OS or graphics card.
FYI, I have a 240Hz panel and yet still have severe judder :shadedshu
I think you're getting confused here. The fastest consumer panel you can get is 144Hz, and Windows detects every last hertz perfectly. Any panels above that (especially the TVs) are pure marketing bullshit: they are 60Hz, low-quality CCFL panels with frame-repeating image controllers that "smudge" the image from one frame to the next, which is why you have severe "juddering". People really need to stop falling for the marketing bullshit from manufacturers. You cannot get a true 240Hz display, in the same way no TN monitor can display more than 256K colours, even though most manufacturers will flat-out lie and say their TN panels can display 16.7M colours via more image trickery (dithering).
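
For anyone who wants to check the colour maths: the native colour count is just two raised to the total bits per pixel (bits per channel times three channels). A minimal sketch in plain Python, purely illustrative:

def native_colours(bits_per_channel):
    # Total colours = 2^(bits per channel * 3 channels: R, G, B)
    return 2 ** (bits_per_channel * 3)

print(native_colours(6))  # 262,144 (~256K): a true 6-bit TN panel
print(native_colours(8))  # 16,777,216 (~16.7M): a true 8-bit panel
# 6-bit panels only hit the "16.7M" spec via FRC/temporal dithering,
# flickering between adjacent shades to fake the in-between colours.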
#77
Casecutter
CasecutterFirst no one buys this level of card for 1920x1080p.
Man, I didn't mean to ruffle so many feathers, although I shouldn't have used the words "no one".

First, I didn't say multi-panel, and sure, there will be instances with specific panel types, and for competitive play you'd need something... Über. Still, even Far Cry 3 and Crysis 3, if you juggle settings a little, you can make work on an R9 280X. And those are perhaps the two exceptions, because even Metro: Last Light on an R9 280X is 60 FPS at 1920x1080. If those are what you intend to play, there are always exceptions/concessions for some, but the large bulk of enthusiasts aren't looking at special circumstances, just normal 1920x1080. Perhaps a 27" they got a year or two back, hoping to hold out another generation for 4K to become what some might see as "affordable". Till then, I'd say many have nestled in with what they have.

You could spend $650-1,000 on the graphics card and then what... multi-panel on the cheap, perhaps? Because thinking "it" will give you a path to 4K later isn't a good avenue. The smart move for the average person (for that money) is to step up from an older 24" 1080p TN and last-generation cards (6950/560 Ti): get a decent 2560x1440 monitor plus, say, $310 for an R9 280X, and have a fairly enjoyable enthusiast experience. At least one not all that much different from those who dropped $650-1,000 on just the graphics card.
#78
HumanSmoke
Am*
HumanSmokeFirstly, Titan (and the identical power delivery reference GTX 780) seems to be a reasonable seller for Nvidia...So by your reckoning Nvidia shot themselves in the foot by selling a load of high-revenue GPUs in a package that keeps warranty failure rates low for eight months...
Yup, that's some foot shooting right there.
Let's get a few things straight: first of all, the Titan was a terrible seller for Nvidia as a GeForce card, and anyone who told you otherwise is delusional, to say the least.
Comprehension fail, or trolling?
I believe we were talking about GK110, that is, the Titan and the 780... but of course, your argument stands up fairly well if you ignore the largest-selling parts :banghead:

You also seem to be making a fundamental error about what the Titan in particular was supposed to be. For the consumer, the card was supposed to represent the (fleeting) pinnacle of single-GPU performance, with the slight incentive of Nvidia not diluting FP64 performance. For Nvidia, it represented PR. Every GPU review since the Titan's launch (whether of an Nvidia or AMD card) has featured the card at or near the top of every performance metric. Eight months of constant PR and advertising that hasn't cost Nvidia an additional penny.

If sales of the Titan (or the 780, for that matter) were paramount, then you can bet that Nvidia wouldn't have priced it as they have, in exactly the same way that AMD priced a supply-constrained HD 7990 at $999. That hypothesis is all the more credible when the main revenue stream for GK110, the Tesla K20, is known to be supply constrained itself.

You're living in cloud cuckoo land if you believe that taking the muzzle off the AIBs for vendor specials and lowering prices would have any significant impact on the overall balance sheet. The market for $500+ graphics cards is negligible in the greater scheme of things. Now subtract the percentage of people who, if presented with a $500 card, wouldn't also pay for a $650 (or more) board. Now subtract the percentage of people who would spend the same amount of cash on two lower-specced cards offering better overall performance.

Basically, you're putting the gaming aspect under the microscope and not really looking at the big picture... the other alternative is that some random internet poster knows more about strategic marketing than the company with the sixteenth-highest semiconductor revenue in the world :slap:
#79
Blín D'ñero
Benchmarks @ 3840×2160 – AMD Radeon R9 290X Versus NVIDIA GeForce GTX 780:

LegitReviews
In BioShock Infinite with the Ultra preset, the AMD Radeon R9 290X ran at an average of 44.22 FPS and the NVIDIA GeForce GTX 780 was at 39.63 FPS. This shows a significant 11.6% performance advantage for the new AMD Radeon R9 290X with the Hawaii GPU.

Tomb Raider showed the AMD Radeon R9 290X averaging 43.0 FPS, with the NVIDIA GeForce GTX 780 at 40.8 FPS. This would make the AMD Radeon R9 290X roughly 5.4% faster than the NVIDIA GeForce GTX 780 in Tomb Raider.

These Legit benchmark numbers should help shed light on a number of things. For one, the AMD Radeon R9 290X appears to perform better than an NVIDIA GeForce GTX 780 at 4K resolutions, but pretty much everyone assumed, or hoped, that would be the case. It should be noted that these results are also on beta drivers, and we expect AMD to extract more performance from the brand-new 28 nm Hawaii GPU as time goes on.

When it comes to pricing, the NVIDIA GeForce GTX 780 retails right now for around $659.99 online, so where will the AMD Radeon R9 290X be priced? AMD has traditionally priced their cards very competitively against NVIDIA, so will we see the AMD Radeon R9 290X in the $629-$649 price range?
Here are the specs of the system AMD is running in Montreal:
Intel Core i7-3960X @ 3.3 GHz
MSI X79A-GD65
16GB DDR3-1600
Windows 7 64 w/ SP1
NVIDIA driver: 331.40
Future build of AMD CATALYST 13.11 beta (to be posted with launch)
PC Game Settings:
BioShock Infinite: 3840×2160 Ultra preset
Tomb Raider: 3840×2160 normal preset with TressFX on

AMD is also letting sites publish their own 4K benchmarks from BioShock Infinite and Tomb Raider today, so expect to see some sites posting numbers from their own unique test setups. We are here in Montreal and far from our test system, so expect to see our benchmark results when the full review goes live.
Source: LegitReviews
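
As a quick sanity check on the quoted percentages: the advantage is just the FPS ratio minus one. A minimal sketch in plain Python, purely illustrative:

def advantage_pct(fps_fast, fps_slow):
    # Relative performance advantage of the faster card over the slower one
    return (fps_fast / fps_slow - 1) * 100

print(round(advantage_pct(44.22, 39.63), 1))  # 11.6 -> BioShock Infinite
print(round(advantage_pct(43.0, 40.8), 1))    # 5.4  -> Tomb Raider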

TomsHardware: First Official AMD Radeon R9 290X Benchmarks
[...]
There's not much more I can say at this point, except that I have several cards tested at 3840x2160, and R9 290X doesn’t just do well against the GeForce GTX 780…
[...]
Hmmm... "R9 290X Quiet Mode"... Promising! :D
#81
Blín D'ñero
Who forgets to mention the "Quiet Mode"...:rolleyes:
#82
springs113
Blín D'ñeroWho forgets to mention the "Quiet Mode"...:rolleyes:
lmfao
#83
HumanSmoke
Blín D'ñeroWho forgets to mention the "Quiet Mode"...:rolleyes:
Bwahahahahahaha.
Let me guess....you'll be petitioning W1zzard to measure sound and power consumption using "Quiet Mode", and to measure gaming benchmarks using "Noisy As Fuck Mode".
#84
Casecutter
HumanSmokeIf sales of the Titan (or the 780, for that matter) were paramount, then you can bet that Nvidia wouldn't have priced it as they have
Not arguing, just stating the obvious marketing being employed.

A movie theater sells a small popcorn for $3 and a large for $7. Shown just those two, people will more often buy the $3 size, having trouble rationalizing the higher price.

Then the theater introduces a medium priced close to the large. Folks rationalize that the large is just 50 cents more and they get more, so the theater actually starts selling more larges, while mediums and smalls aren't nearly as popular.

More choice provokes more thought and changes how the brain rationalizes things. In this case it works the other way around: Nvidia releases the Titan and sure, everyone salivates over what they'd like, but for a large part of the market it's hard to justify; then add a $650 part and... set the hook.
#85
Blín D'ñero
HumanSmokeBwahahahahahaha.
Let me guess....you'll be petitioning W1zzard to measure sound and power consumption using "Quiet Mode", and to measure gaming benchmarks using "Noisy As Fuck Mode".
:wtf: No. The AnandTech preview makes no mention of Quiet Mode (whether it's used or not), nor of the test system. Still, you're eating that picture up??? I guess you're just happy the 780 doesn't look too bad in it. :rolleyes:
#86
HumanSmoke
Blín D'ñero:wtf: No. The AnandTech preview makes no mention of Quiet Mode (whether it's used or not), nor of the test system. Still, you're eating that picture up??? I guess you're just happy the 780 doesn't look too bad in it. :rolleyes:
Y'know, if I wanted "Quiet Mode", I'd just set a fan profile in AB or Precision and allow the card to throttle rather than increase fan noise. I guess some people need to be spoon-fed this advanced thinking as a "feature".
CasecutterNot arguing, just stating the obvious marketing being employed.
Bingo. Elementary sales technique. I doubt Nvidia wanted to sell the Titan in quantity, especially not at Wal-Mart prices, when the Tesla variety brings in the revenue. A second-tier salvage part (GTX 780) would have limited appeal as a pro part, since the loss of shaders isn't going to be mitigated by any meaningful lowering of power consumption.
#87
1d10t
Am*I think you're getting confused here. The fastest consumer panel you can get is 144Hz, and Windows detects every last hertz perfectly. Any panels above that (especially the TVs) are pure marketing bullshit: they are 60Hz, low-quality CCFL panels with frame-repeating image controllers that "smudge" the image from one frame to the next, which is why you have severe "juddering". People really need to stop falling for the marketing bullshit from manufacturers. You cannot get a true 240Hz display, in the same way no TN monitor can display more than 256K colours, even though most manufacturers will flat-out lie and say their TN panels can display 16.7M colours via more image trickery (dithering).
Later I learned that 240Hz only quadruples a single 60Hz frame using MEMC methods while dimming the backlight at lightning speed; it made me outraged. Sure, my panel had all the goodies: IPS, 2 ms GTG, 5 ms MPRT, 240Hz 3D MMR, spread nicely across a 42" diagonal while sitting at the ridiculous one-grand mark. But finding out Windows only sees it at 60Hz, even after I did some firmware flashing, defined a manual DDC structure, and distributed a custom EDID... I even fried my BCM3556 board. All that pain and sacrifice for the glory of 3D was in vain, thanks to 4K monitors :shadedshu
#88
okidna
1d10tThey're even boasting their i3+650Ti will decimate my FX8350+CF 7970 :wtf:
1d10tFYI, I have a 240Hz panel and yet still have severe judder :shadedshu
1d10tIt's nice to know you have a better life than mine, sir; may God always bless your family and guide you in your hardest times :toast:
Although I'm in my early 30s, still at a shitty job on minimum wage and barely making a living, I still believe God will have pity on me so I can date someone and make my own family someday :)
Coming from the same country as you, I wouldn't say having an "FX8350+CF 7970", a "240Hz panel", and all that fancy water-cooling stuff are indicators of "barely making a living"... unless you're some kind of robot who doesn't eat at all and puts your entire paycheck into PC hardware :laugh:.

Seriously, be thankful. There are a lot of people out there in our country who aren't as lucky as you.

Anyway, from Montreal : imgur.com/a/MEXNo
#89
the54thvoid
Intoxicated Moderator
okidnaAnyway, from Montreal : imgur.com/a/MEXNo
Non-troll question: that's an X79 mobo, so there's a high chance the 780 is running at PCI-E 2.0. If the 290X is running at PCI-E 3.0, would the 4K resolution have an effect on FPS?
#91
SIGSEGV
okidnaComing from the same country as you, I wouldn't say having an "FX8350+CF 7970", a "240Hz panel", and all that fancy water-cooling stuff are indicators of "barely making a living"... unless you're some kind of robot who doesn't eat at all and puts your entire paycheck into PC hardware :laugh:.

Seriously, be thankful. There are a lot of people out there in our country who aren't as lucky as you.

Anyway, from Montreal : imgur.com/a/MEXNo
Who knows.
I believe he put 90% of his wages for months, maybe years, into buying shiny hardware... :) I say that because the same thing happened to me... :laugh:
bencrutzC'mon man, he's exaggerating. He's a manager at a large-scale IT company :D He wouldn't have much time to post here otherwise :shadedshu
Wow, that's great...

-----


According to videocardz.com: videocardz.com/46929/official-amd-radeon-r9-290x-2160p-performance-17-games
#92
bencrutz
okidnaComing from the same country as you, I wouldn't say having an "FX8350+CF 7970", a "240Hz panel", and all that fancy water-cooling stuff are indicators of "barely making a living"... unless you're some kind of robot who doesn't eat at all and puts your entire paycheck into PC hardware :laugh:.

Seriously, be thankful. There are a lot of people out there in our country who aren't as lucky as you.

Anyway, from Montreal : imgur.com/a/MEXNo
C'mon man, he's exaggerating. He's a manager at a large-scale IT company :D He wouldn't have much time to post here otherwise :shadedshu
the54thvoidNon-troll question: that's an X79 mobo, so there's a high chance the 780 is running at PCI-E 2.0. If the 290X is running at PCI-E 3.0, would the 4K resolution have an effect on FPS?
errr, except from the picture it's obvious that the 780 & 290X were benched using the very same mobo :confused:
#93
okidna
the54thvoidNon-troll question: that's an X79 mobo, so there's a high chance the 780 is running at PCI-E 2.0. If the 290X is running at PCI-E 3.0, would the 4K resolution have an effect on FPS?
That's a valid concern.

I read this thread @ the EVGA forums: forums.evga.com/tm.aspx?m=2032492
It seems that 331.40 BETA provides PCI-E 3.0 support for the Titan and 780 on the X79 platform (but with an Ivy Bridge-E processor; don't know about SB-E).
#94
Xzibit
the54thvoidNon-troll question: that's an X79 mobo, so there's a high chance the 780 is running at PCI-E 2.0. If the 290X is running at PCI-E 3.0, would the 4K resolution have an effect on FPS?
How do you account for the reviewer setups from Tom's Hardware, AnandTech & eTeknix that used their own hardware?
#95
the54thvoid
Intoxicated Moderator
okidnaThat's a valid concern.

I read this thread @ the EVGA forums: forums.evga.com/tm.aspx?m=2032492
It seems that 331.40 BETA provides PCI-E 3.0 support for the Titan and 780 on the X79 platform (but with an Ivy Bridge-E processor; don't know about SB-E).
Yeah, but I'm using those drivers and I'm not seeing PCI-E 3.0, unfortunately.

As for Xzibit's point: yes, that's valid. If an Ivy Bridge setup is used, it negates any issues :-)

Like I said, not trolling, but AMD's setup isn't 'potentially' using equal specs on both cards. If other reviews use Ivy, then all's cool (if the lane bandwidth is even a problem in the first place!)
#96
TheHunter
What's with these crappy 4K benchmarks? I don't care for that resolution, even less for that lousy FPS. I mean, I would never play at 24-50 FPS...

Give us some 1920x1200 benchmarks.
#97
okidna
the54thvoidYeah, but I'm using those drivers and I'm not seeing PCI-E 3.0, unfortunately.

As for Xzibit's point: yes, that's valid. If an Ivy Bridge setup is used, it negates any issues :-)

Like I said, not trolling, but AMD's setup isn't 'potentially' using equal specs on both cards. If other reviews use Ivy, then all's cool (if the lane bandwidth is even a problem in the first place!)
Ah, my bad, didn't realize you have SB-E :D
#98
1d10t
okidnaSeriously, be thankful. There are a lot of people out there in our country who aren't as lucky as you.

Anyway, from Montreal : imgur.com/a/MEXNo
Not that I'm complaining :D
BiggieShadyValid concern. PCIE 2 is barely enough. 4k requires just a little less than 16 GBps : web.forret.com/tools/video_fps.asp?width=3840&height=2160&fps=60&space=rgba&depth=8

www.rtcmagazine.com/files/images/3097/RTC02-TCTW-PCISIG-Table1_large.jpg
This.
The 290X implements a "new" CrossFire method via the PCIe bus link. Although it looks good on paper, my major concern is the 38 PCIe 2.0 lanes on my 990FX board. Two of these cards could consume 32 lanes, leaving 6. It's still unclear to me whether AMD will do full duplex or needs another lane for CrossFiring. On a side note, AMD could utilize the IOMMU, which is available across all 900-series boards, perhaps by creating virtual sideband addressing.
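
To put rough numbers on that bandwidth concern, a back-of-the-envelope sketch in plain Python, purely illustrative (it assumes uncompressed RGBA8 frames and ignores PCIe protocol overhead):

width, height, fps, bytes_per_px = 3840, 2160, 60, 4  # 4K, RGBA, 8 bits per channel
frame_mb = width * height * bytes_per_px / 1e6                # ~33.2 MB per frame
stream_gbit = width * height * bytes_per_px * fps * 8 / 1e9   # ~15.9 Gbit/s (~2 GB/s)

pcie2_x16_gbs = 16 * 0.5    # PCIe 2.0: ~500 MB/s per lane -> ~8 GB/s each way
pcie3_x16_gbs = 16 * 0.985  # PCIe 3.0: ~985 MB/s per lane -> ~15.75 GB/s each way

print(round(frame_mb, 1), round(stream_gbit, 1), pcie2_x16_gbs, round(pcie3_x16_gbs, 2))

By that arithmetic, the ~16 figure from the linked calculator is gigabits per second (about 2 GB/s), which even a PCIe 2.0 x16 link has raw headroom for; whether the extra inter-card traffic of the new PCIe-based CrossFire hurts real-game FPS is exactly what the reviewer-setup question above should settle.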
SIGSEGVWho knows.
I believe he put 90% of his wages for months, maybe years, into buying shiny hardware... :) I say that because the same thing happened to me... :laugh:
We're in the same boat here... the key is the two S's: saving and starving :laugh:
bencrutzC'mon man, he's exaggerating. He's a manager at a large-scale IT company :D He wouldn't have much time to post here otherwise :shadedshu
How dare you spread FUD :p
Btw, how are you, old friend? Has your shoulder recovered?
#99
bencrutz
1d10tHow dare you spread FUD :p
Btw, how are you, old friend? Has your shoulder recovered?
:D

Not 100%, but okay. Better be, coz I spent pretty much 3 Titans' worth on it :shadedshu and now I'm broke :roll:
#100
NeoXF
TheHunterWhat's with these crappy 4K benchmarks? I don't care for that resolution, even less for that lousy FPS. I mean, I would never play at 24-50 FPS...

Give us some 1920x1200 benchmarks.
LOLWUT :confused:


Anyway, here's a compiled comparison of another NDA-breaking run...