Wednesday, October 16th 2013

Radeon R9 290X Pitted Against GeForce GTX TITAN in Early Review

Here are results from the first formal review of the Radeon R9 290X, AMD's next-generation flagship single-GPU graphics card. Posted by Chinese publication PCOnline.com.cn, the review pits the R9 290X against the GeForce GTX TITAN and GeForce GTX 780. An out-of-place fourth member of the comparison is the $299 Radeon R9 280X. The tests present some extremely interesting results. Overall, the Radeon R9 290X is faster than the GeForce GTX 780, and trades blows with, or in some cases surpasses, the GeForce GTX TITAN. The R9 290X performs extremely well in 3DMark: FireStrike, and beats both NVIDIA cards in Metro: Last Light. In other tests, it's halfway between the GTX 780 and GTX TITAN, leaning closer to the latter in some. Power consumption, on the other hand, could either dampen the deal or be a downright dealbreaker. We'll leave you with the results.
More results follow.

Source: PCOnline.com.cn
Add your own comment

121 Comments on Radeon R9 290X Pitted Against GeForce GTX TITAN in Early Review

#1
Ghost
by: sweet
Another victim of the scheme pulled by nVidia's dynamic boost :ohwell:
Oh right lol. Titan @ 836~900 MHz. Still should be faster clock-to-clock.
Posted on Reply
#2
Am*
by: the54thvoid
I have read (up to this morning at least) every page of the GTX Titan owners thread at OCN. Everybody accepts that the Titan is hamstrung by BIOS limits but even without that, if temps are controlled it hits about 1097-1137MHz before throttling. FTR, a Titan at 1137MHz is f*cking fast.
It makes me wonder how good GK110 really could have been with all 2880 cores enabled and with beefier VRMs from, say Gigabyte or MSI. Nvidia really shot themselves in the foot by not allowing custom designs -- the problem is Titan hits its board power limits long before it gets anywhere near 'unsafe' temperatures. Maybe Nvidia will release the full GK110 shortly after the R290X? Will be interesting to see what it will be capable of if that is indeed the case.

by: the54thvoid

And I couldn't agree more with you. If you want massive frame rates on BF4 at 1440p res or higher, it looks like 290X IS the way forward. But bear in mind my 1136MHz Titan averages 60fps at Ultra settings (at 1440p).
Thanks for this. With my GTX 460 OC'd to hell and back and on its last legs, crawling through BF4 at medium-ish settings at 40-60FPS @ 1080p, your post really makes me look forward to my next upgrade. I finally hope to use my 120Hz monitor to its full extent soon. :rockout:
Posted on Reply
#3
sweet
by: Ghost
Oh right lol. Titan @ 836~900 MHz. Still should be faster clock-to-clock.
The scheme of nVidia is much more sophisticated. Even though the boost clock says 900 MHz, every stock Titan will run at 993~1016 MHz because of the Kepler boost. nVidia just knows how to cheat with benchmarks.
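For what "faster clock-to-clock" would actually mean here, it's just the benchmark score normalized by the clock the card really sustained rather than the advertised boost. A minimal sketch; every number below is a hypothetical placeholder, not a figure from the review:

```python
# Clock-for-clock comparison: divide each card's benchmark score by the
# core clock it actually sustained during the run, not the rated boost.
# All figures below are hypothetical placeholders, not review data.

def score_per_mhz(score: float, sustained_mhz: float) -> float:
    """Benchmark score delivered per MHz of sustained core clock."""
    return score / sustained_mhz

# Titan rated at ~900 MHz boost but actually sustaining ~1006 MHz:
titan = score_per_mhz(score=100.0, sustained_mhz=1006.0)
r9_290x = score_per_mhz(score=102.0, sustained_mhz=1000.0)

# Whichever ratio is higher does more work per clock cycle.
print(f"Titan:   {titan:.4f} pts/MHz")
print(f"R9 290X: {r9_290x:.4f} pts/MHz")
```

The point of dividing by the *sustained* clock is exactly the Kepler-boost complaint above: comparing against the rated 900 MHz would flatter the Titan's per-clock efficiency.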
Posted on Reply
#4
Casecutter
First, no one buys this level of card for 1920x1080. These results could come from the reviewers not using actual/final release drivers for the R9 290X. So most of those numbers are... not worth speculating over.

Don't put too much into the Furmark consumption test; unless there's a score showing the amount of "work" that produced it, it's hardly significant or earth-shattering. The one thing it might indicate is that the cooler has some headroom.

But these numbers do hold to what I've been saying: it can soundly beat a 780, while it will spar with Titan depending on the title. Metro was one Nvidia had owned, but not so much anymore.

If AMD holds to what they've indicated and done with the re-badge prices, they'll have a win!
We wait...
Posted on Reply
#5
1d10t
by: dom99
No, I've seen these before and I think it's some bloke in China who claims to have a card and benched it.
thanks for pointing out, lad :toast:

by: Dj-ElectriC
BTW, I feel that test results will be more in favor of the R9 290X at 2560x1440, that's where it belongs
AMD should make this statement for their card...TO THE EYEFINITY AND BEYOND :laugh:

by: Novulux
http://videocardz.com/images/2013/10/AMD-R9-290X-Performance.jpg

http://videocardz.com/46785/amd-radeon-r9-290x-performance-charts-emerge
And another graph with questionable results? The R9-290 might be an awesome card if this is true...
Jeez... those graphs look very tempting. Bummer, I don't know whom I should trust now. I need salvation...


by: thematrix606
Most gamers use 1080p; of course these cards will be used for 1080p. Why in hell would you think otherwise?
Have you never played beyond 60Hz on a monitor? And they barely even reach 60FPS in Crysis 3 and other games!
Most of the gamers will choose a mainstream card. Why in hell would you think otherwise?
Only enthusiasts and score-bitching worshipers will look otherwise.

by: RCoon
...Also the fact that in the steam survey, NVidia cards took the highest share of video cards in systems, should tell you enough about how their Q4 results will turn out...
Most Steam users are pre-built PCs, average Joes and regular Janes who bought a marketing-induced product. They only know three things:
- anything that costs more is better.
- any product that hoards the market is always faster.
- and the worst... buying the product that had a commercial opening scene in a game will make your game stable.
Let me guess, Intel's Havok and nVidia's old motto "The Way It's Meant To Be Played" are the definite winners here. Heck, some of my colleagues believe AMD processors and graphics don't do gaming. They even boast that their i3+650Ti will decimate my FX8350+CF 7970 :wtf:
Posted on Reply
#6
erocker
by: sweet
The scheme of nVidia is much more sophisticated. Even that the boost clock says 900 MHz, every stock Titan will run at 993~1016 MHz, because of the Kepler boost. nVidia just knows how to cheat with benchmark.
Doesn't a Titan run those boost clocks with games too? I'm having difficulty understanding where the cheating comes in to play.
Posted on Reply
#7
the54thvoid
by: erocker
Doesn't a Titan run those boost clocks with games too? I'm having difficulty understanding where the cheating comes in to play.
Was about to post same. Boost applies in all scenarios. There is no cheating at all, just misguided sentiment.
Posted on Reply
#8
Am*
by: Casecutter
First no one buys this level of card for 1920x1080p.
How do you work that out when current games like Far Cry 3, Crysis 3, Metro LL etc under the highest settings, cannot exceed 60FPS @1080p average on the most expensive single-GPU cards? Or how about the fact that next gen consoles will struggle to run launch titles natively @1080p (BF4 will run @720p)? You'll be surprised to know that the vast majority buying these cards will be gaming @1080p, whether it is across 1 or 3 panels. I am using a 27" 2560x1440 IPS panel right alongside my 1080p panel -- guess what, I still game on the TN panel due to the higher framerate, better response time and less input lag. It definitely makes more sense to game @1080p, at least competitively, since IPS benefits mostly do not apply much to gaming (viewing angles don't matter since I sit right in front of the monitor, and neither does the superior colour accuracy, since most games these days tend to have a really limited colour palette anyway -- BF3 has a blue tint to everything, BF4 seems to have a grey tint etc).
Posted on Reply
#9
ManosHandsOfFate
by: sweet
There is no doubt that this card is above Titan clock to clock. However, this beast will consume a bunch of wattage and that tiny fan is not really reliable. The card is capable at the top, but it is not perfect. Hope that the custom versions will be available soon.
What are you talking about? There is 100% doubt as to whether the 290X is faster than the Titan, and that's what EVERYONE is trying to figure out. So for you to say there is none just sets the whole conversation back.
Posted on Reply
#10
Rei86
Damn that power graph. I'm a power user so I don't really care how much power it'll suck up, but it still makes me rethink the PSU I'm running.

Still starving for a review of these things.
Posted on Reply
#11
theoneandonlymrk
by: ManosHandsOfFate
What are you talking about? There is 100% doubt as to whether the 290X is faster than the Titan, and that's what EVERYONE is trying to figure out. So for you to say there is none just sets the whole conversation back.
Obviously there are plenty who doubt the R9 290X is faster; this is yet another example of something you can throw Steam stats at ;) (not me)... most have Nvidia, apparently :)

Even when it's out, and Mantle's out, you're still going to be able to find a fair few who would knock it, even if it were 50% faster.

Stop trying to figure out something that's mathematically provable and proven; it's about "application" anyway, and more importantly Wizzard's application of it (R9 290X) that really matters :D
Posted on Reply
#12
HumanSmoke
by: Am*
It makes me wonder how good GK110 really could have been with all 2880 cores enabled and with beefier VRMs from, say Gigabyte or MSI. Nvidia really shot themselves in the foot by not allowing custom designs
Firstly, Titan (and the identical power delivery reference GTX 780) seems to be a reasonable seller for Nvidia...So by your reckoning Nvidia shot themselves in the foot by selling a load of high-revenue GPUs in a package that keeps warranty failure rates low for eight months...

Yup, that's some foot shooting right there.

Secondly, Nvidia have had two salvage parts collecting revenue and dominating the review benchmarks for the same length of time. Bonus point question: When was the last time Nvidia released a Quadro or Tesla card that didn't have a GeForce analogue of equal or higher shader count?*

Quadro K6000 (2880 core) released four days ago
Tesla K40 (2880 core) imminent

* Answer: Never
Posted on Reply
#13
TheHunter
by: Assimilator
WTF is the point of measuring power consumption in FurMark? Give us the power consumption in the tested applications/games alongside the frame rates so we can draw a useful conclusion about the power vs FPS numbers.

I'm also skeptical about Hawaii's performance in general. It still seems to be slower than the GK110 clock-for-clock, so there's nothing stopping nVIDIA from releasing a "780 Ultra" or somesuch with 1GHz core clock, which will then blow R290/X out of the water.
Yeah, I bet they tested an ES, probably with improper driver/BIOS TDP protection for such apps (OCCT, Afterburner, Furmark)..

I mean, if you removed that on GK110, it would be the same 400-450W for sure.
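The power-vs-FPS conclusion the quote above asks for reduces to a simple efficiency ratio: frames delivered per watt drawn in the same workload. A minimal sketch; all numbers are hypothetical, since per-game wattage is exactly what this review doesn't publish:

```python
# Perf-per-watt: FPS delivered per watt drawn in the same workload.
# Every figure below is a made-up placeholder, not review data.

def fps_per_watt(fps: float, watts: float) -> float:
    """Frames per second per watt; higher means more efficient."""
    return fps / watts

cards = {
    "GTX 780":   fps_per_watt(fps=55.0, watts=250.0),
    "GTX TITAN": fps_per_watt(fps=60.0, watts=265.0),
    "R9 290X":   fps_per_watt(fps=62.0, watts=300.0),
}

# Rank the cards from most to least efficient.
for name, eff in sorted(cards.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {eff:.3f} fps/W")
```

This is why a raw FurMark wattage number on its own says little: without the frame rate (or other "work") produced at that draw, the ratio can't be computed.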
Posted on Reply
#14
arbiter
So it's only as fast as a stock-clocked Titan in most things? So what would the results be if it were tried against a Titan OC'ed to 1000-1100 MHz, which seems to be where a lot of people are able to OC Titans to with little to no problem?
Posted on Reply
#15
Blín D'ñero
by: arbiter
[...] So what would the results be if it were tried against a Titan OC'ed to 1000-1100 MHz, which seems to be where a lot of people are able to OC Titans to with little to no problem?
The result: Titan would be way overpriced...
Titan: $999 (Newegg), whereas R9 290X: $600~650 (expected)
Posted on Reply
#16
ManosHandsOfFate
by: Blín D'ñero
The result: Titan would be way overpriced...
Titan: $999 (Newegg), whereas R9 290X: $600~650 (expected)
Probably closer to $700, if not more... :(
Posted on Reply
#17
Casecutter
by: Am*
It definitely makes more sense to game @1080p, at least competitively
Competitively? Okay, that's a point.

by: ManosHandsOfFate
Probably closer to $700, if not more...
There's no way AMD can think $700; they have to move enough to get an ROI on a new part, it's not like these are geldings just needing a home. AMD needs to ensure they can recoup engineering and set-up costs while splitting that over enough wafers. I don't see this as a boutique product; I think they see it as a full-production offering, no different than Tahiti was two years ago.

I'm wanting a $550 MSRP.
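The ROI point above is plain fixed-cost amortization: one-off engineering and set-up ("NRE") spread over the size of the production run. A toy sketch with invented numbers (none of these figures are from AMD):

```python
# Amortizing one-off engineering/set-up ("NRE") cost over units sold.
# Every figure here is invented purely for illustration.

def unit_cost(nre: float, units: int, per_unit_build: float) -> float:
    """Total cost per card once NRE is spread over the production run."""
    return nre / units + per_unit_build

# A boutique run carries far more NRE per card than full production.
boutique = unit_cost(nre=50e6, units=50_000, per_unit_build=300.0)
full_run = unit_cost(nre=50e6, units=500_000, per_unit_build=300.0)

print(f"boutique run:    ${boutique:.0f}/card")   # $1300/card
print(f"full production: ${full_run:.0f}/card")   # $400/card
```

That spread is the argument for a volume part at a mainstream-flagship price rather than a $700+ boutique one: the bigger the run, the less each card has to carry.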
Posted on Reply
#18
SIGSEGV
by: 1d10t

Only enthusiasts and score-bitching worshipers...

Heck, some of my colleagues believe AMD processors and graphics don't do gaming. They even boast that their i3+650Ti will decimate my FX8350+CF 7970 :wtf:
LOL
so true...

Even my father would prefer to choose an Intel Celeron (I'm not sure if it's dual-core) and its GPU rather than an AMD A8-4 series (4 cores) and its iGPU.

@Casecutter: count me in, I would be happy to get them in CFX and put them under water.
Posted on Reply
#19
xorbe
Tell me about power while playing a game, not Furmark ... who knows which is throttling more.
Posted on Reply
#20
thematrix606
by: Casecutter
First no one buys this level of card for 1920x1080p.
And yet again, another 60Hz monitor owner. Please go back to your cave. :banghead::banghead::banghead:

by: 1d10t
thanks for pointing out, lad :toast:
Most of the gamers will choose a mainstream card. Why in hell would you think otherwise?
Only enthusiasts and score-bitching worshipers will look otherwise.
So a gamer with enough money to spend on nice hardware (anyone with a job, really) is an enthusiast? Wow.
Posted on Reply
#21
HammerON
The Watchful Moderator
by: thematrix606
And yet again, another 60Hz monitor owner. Please go back to your cave. :banghead::banghead::banghead:



So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.
And what monitor(s) do you own?
I will stick with my "lackluster" 60Hz 30" Dell and wait for the 4K monitors to come down in price. Does this mean I am in a cave as well? I absolutely love my monitor and see no reason at this point to go to 120Hz.
Posted on Reply
#22
1d10t
by: SIGSEGV
LOL
so true...
Even my father would prefer to choose an Intel Celeron (I'm not sure if it's dual-core) and its GPU rather than an AMD A8-4 series (4 cores) and its iGPU.
I know you won't... even you show up in the AMD Lounge... hahaha :laugh:
Not even a year and I already miss everyone from the last gath @ Bandung :toast:

by: thematrix606
So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.
Why would someone spend $600 or $1000 on a stupid card that cannot even attract a woman with nice boobs? Yup... that's an enthusiast.

by: HammerON
And what monitor(s) do you own?
I will stick with my "lackluster" 60Hz 30" Dell and wait for the 4K monitors to come down in price. Does this mean I am in a cave as well? I absolutely love my monitor and see no reason at this point to go to 120Hz.
Might have something to add: no matter how fast your monitor, Windows only sees them at 60Hz. It's in the panel, not in the OS or the graphics card.
FYI, I have a 240Hz panel and yet still get severe judder :shadedshu
Posted on Reply
#23
the54thvoid
by: 1d10t
Why would someone spend $600 or $1000 on a stupid card that cannot even attract a woman with nice boobs? Yup... that's an enthusiast.
Ah, the naivety of youth. I enjoy having both. ;)
Posted on Reply
#24
Ahhzz
by: tigger
I love the smell of new hardware in the morning.

Whether these are better than Nvidia cards or not, I still love it when new GPUs come out; all the speculation and "discussion" is interesting and sometimes fun to read.

Personally for me, a 280x or a 7970.
I'm with you on that. A 280x is looking more and more like the bee's knees.
Posted on Reply
#25
1d10t
by: the54thvoid
Ah, the naivety of youth. I enjoy having both. ;)
It's nice to know you have a better life than mine, sir; may God always bless your family and guide you in your hardest times :toast:
Although in my early 30's I'm still at a shitty job on minimum wages and barely making a living, I still believe God will have pity on me so I can date someone and make my own family someday :)
Posted on Reply
Add your own comment