
Radeon R9 290X Pitted Against GeForce GTX TITAN in Early Review

Well, the 24th of October, as I posted last night here and here (and nobody reacted :laugh:): that is what the Chinese reviewer said. :)

Dates quoted on a couple of UK sites for retail availability say the 25th of October, so that would tie in with the predicted 24th NDA lift.
 
Another victim of the scheme pulled by nVidia's dynamic boost :ohwell:

Oh right lol. Titan @ 836~900 MHz. Still should be faster clock-to-clock.
 
I have read (up to this morning at least) every page of the GTX Titan owners thread at OCN. Everybody accepts that the Titan is hamstrung by BIOS limits, but even without that, if temps are controlled it hits about 1097-1137 MHz before throttling. FTR, a Titan at 1137 MHz is f*cking fast.

It makes me wonder how good GK110 really could have been with all 2880 cores enabled and with beefier VRMs from, say, Gigabyte or MSI. Nvidia really shot themselves in the foot by not allowing custom designs -- the problem is Titan hits its board power limits long before it gets anywhere near 'unsafe' temperatures. Maybe Nvidia will release the full GK110 shortly after the R9 290X? It will be interesting to see what it's capable of if that is indeed the case.
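Just as a rough back-of-envelope illustration (my own assumed numbers, nothing official): FP32 throughput scales with shader count times clock, so the extra 192 cores alone are only worth about 7%; the real gains would have to come from the clock headroom that beefier VRMs would allow. A minimal sketch:

```python
# Rough, hypothetical estimate of GK110 FP32 throughput: cores * clock * 2 FLOPs/cycle.
# The clock figures below are illustrative assumptions, not official boost behaviour.

def gflops(cores, clock_mhz):
    """Theoretical single-precision GFLOPS (2 FLOPs per core per cycle)."""
    return cores * clock_mhz * 2 / 1000.0

titan_stock   = gflops(2688, 876)    # GTX Titan at its rated boost clock
full_gk110    = gflops(2880, 876)    # hypothetical fully enabled chip, same clock
full_gk110_oc = gflops(2880, 1100)   # hypothetical full chip with custom-board clocks

print(f"Titan (2688 @ 876 MHz):   {titan_stock:7.0f} GFLOPS")
print(f"Full GK110 (2880 @ 876):  {full_gk110:7.0f} GFLOPS  (+{full_gk110 / titan_stock - 1:.1%})")
print(f"Full GK110 (2880 @ 1100): {full_gk110_oc:7.0f} GFLOPS  (+{full_gk110_oc / titan_stock - 1:.1%})")
```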

And I couldn't agree more with you. If you want massive frame rates on BF4 at 1440p res or higher, it looks like 290X IS the way forward. But bear in mind my 1136MHz Titan averages 60fps at Ultra settings (at 1440p).

Thanks for this. With my GTX 460 OC'd to hell and back and on its last legs, crawling through BF4 at medium-ish settings at 40-60FPS @ 1080p, your post really makes me look forward to my next upgrade. I finally hope to use my 120Hz monitor to its full extent soon. :rockout:
 
Oh right lol. Titan @ 836~900 MHz. Still should be faster clock-to-clock.

nVidia's scheme is much more sophisticated. Even though the boost clock is listed as 900 MHz, every stock Titan will actually run at 993~1016 MHz because of Kepler boost. nVidia just knows how to cheat with benchmarks.
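For what it's worth, it's easy to check what clock a card actually settles at during a run -- here's a minimal sketch that just polls nvidia-smi once a second (assuming a driver new enough to expose the --query-gpu interface) and reports the observed graphics clock:

```python
# Minimal sketch: poll nvidia-smi while a benchmark runs and report the observed GPU clock.
# Assumes nvidia-smi with --query-gpu support; older drivers may not expose this interface.
import subprocess
import time

def sample_clock_mhz():
    """Read the current graphics clock in MHz of the first GPU via nvidia-smi."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=clocks.gr",
        "--format=csv,noheader,nounits",
    ])
    return int(out.decode().strip().splitlines()[0])

samples = []
for _ in range(60):          # sample for ~60 seconds while the benchmark is running
    samples.append(sample_clock_mhz())
    time.sleep(1)

print(f"min {min(samples)} MHz, max {max(samples)} MHz, "
      f"avg {sum(samples) / len(samples):.0f} MHz")
```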
 
First, no one buys this level of card for 1920x1080p. You can also see these numbers didn't come from the actual/final release drivers that reviewers of the R9 290X will get. So most of those numbers are... not worth speculating over.

Don't put too much into the FurMark consumption test; unless there's a score showing the amount of "work" that produced it, it's hardly significant or earth-shattering. The one thing it might indicate is that the cooler has some headroom.

But these numbers do hold to what I've been saying: it can soundly beat a 780, and it will spar with Titan depending on the title. Metro was one Nvidia had owned, but not so much anymore.

If AMD holds to what they've indicated and done with the re-badge prices, they'll have a win!
We wait...
 
No, I've seen these before, and I think it's some bloke in China who claims to have a card and benched it.

Thanks for pointing that out, lad :toast:

BTW, I feel the test results will be more in favor of the R9 290X at 2560x1440; that's where it belongs.

AMD should make this statement for their card...TO THE EYEFINITY AND BEYOND :laugh:

http://videocardz.com/images/2013/10/AMD-R9-290X-Performance.jpg

http://videocardz.com/46785/amd-radeon-r9-290x-performance-charts-emerge
And another graph that has questionable results? R9-290 might be an awesome card if this is true...

Jeez... those graphs look very tempting. Bummer, I don't know whom I should trust now. I need salvation...


Most gamers use 1080p, so of course these cards will be used at 1080p. Why in hell would you think otherwise?
Have you never played beyond 60Hz on a monitor? And these cards barely even reach 60FPS in Crysis 3 and other games!

Most gamers will choose a mainstream card. Why in hell would you think otherwise?
Only enthusiasts and benchmark-score worshippers will look otherwise.

...Also, the fact that NVIDIA cards took the highest share of video cards in the Steam survey should tell you enough about how their Q4 results will turn out...

Most Steam users are on pre-built PCs -- average Joes and regular Janes who bought a marketing-driven product. They only know three things:
- anything that costs more is better.
- any product that hoards the market is always faster.
- and the worst... buying the product whose splash screen appears in a game's opening will make your game stable.
Let me guess, Intel's Havok and nVidia's old motto The Way It's Meant To Be Played are the definite winners here. Heck, some of my colleagues believe AMD processors and graphics don't do gaming. They even boast that their i3+650Ti will decimate my FX8350+CF 7970 :wtf:
 
nVidia's scheme is much more sophisticated. Even though the boost clock is listed as 900 MHz, every stock Titan will actually run at 993~1016 MHz because of Kepler boost. nVidia just knows how to cheat with benchmarks.

Doesn't a Titan run those boost clocks with games too? I'm having difficulty understanding where the cheating comes into play.
 
Doesn't a Titan run those boost clocks with games too? I'm having difficulty understanding where the cheating comes into play.

Was about to post the same. Boost applies in all scenarios. There's no cheating at all, just misguided sentiment.
 
First, no one buys this level of card for 1920x1080p.

How do you work that out when current games like Far Cry 3, Crysis 3, Metro LL etc., under the highest settings, cannot exceed a 60FPS average @1080p on the most expensive single-GPU cards? Or how about the fact that next gen consoles will struggle to run launch titles natively @1080p (BF4 will run @720p)? You'll be surprised to know that the vast majority buying these cards will be gaming @1080p, whether it is across 1 or 3 panels. I am using a 27" 2560x1440 IPS panel right alongside my 1080p panel -- guess what, I still game on the TN panel due to the higher framerate, better response time and less input lag. It definitely makes more sense to game @1080p, at least competitively, since IPS benefits mostly do not apply to gaming (viewing angles don't matter since I sit right in front of the monitor, and neither does the superior colour accuracy, since most games these days tend to have a really limited colour palette anyway -- BF3 has a blue tint to everything, BF4 seems to have a grey tint etc).
 
There is no doubt that this card is above Titan clock-for-clock. However, this beast will consume a bunch of wattage, and that tiny fan is not really reliable. The card is capable at the top end, but it is not perfect. I hope the custom versions will be available soon.

What are you talking about? There is 100% doubt as to whether the 290X is faster than the Titan, and that's what EVERYONE is trying to figure out. So for you to say there is none just sets the whole conversation back.
 
Damn, that power graph. I'm a power user, so I don't really care how much power it'll suck up, but it still makes me rethink the PSU I'm running.

Still starving for a review of these things.
 
What are you talking about? There is 100% doubt as to whether the 290X is faster than the Titan, and that's what EVERYONE is trying to figure out. So for you to say there is none just sets the whole conversation back.

Obviously there are plenty who doubt the R9 290X is faster; this is yet another example of something you can throw Steam stats at ;) (not me)... most have Nvidia, apparently :)

Even when it's out, and Mantle's out, you're still going to be able to find a fair few who would knock it, even if it were 50% faster.

Stop trying to figure out something that's mathematically provable and will be proven; it's about "application" anyway, and more importantly W1zzard's application of it (the R9 290X) that really matters :D
 
It makes me wonder how good GK110 really could have been with all 2880 cores enabled and with beefier VRMs from, say, Gigabyte or MSI. Nvidia really shot themselves in the foot by not allowing custom designs
Firstly, Titan (and the reference GTX 780, which has identical power delivery) seems to be a reasonable seller for Nvidia... So by your reckoning, Nvidia shot themselves in the foot by selling a load of high-revenue GPUs for eight months in a package that keeps warranty failure rates low...

Yup, that's some foot shooting right there.

Secondly, Nvidia have had two salvage parts collecting revenue and dominating the review benchmarks for the same length of time. Bonus point question: When was the last time Nvidia released a Quadro or Tesla card that didn't have a GeForce analogue of equal or higher shader count?*

Quadro K6000 (2880 core) released four days ago
Tesla K40 (2880 core) imminent

* Answer: Never
 
WTF is the point of measuring power consumption in FurMark? Give us the power consumption in the tested applications/games alongside the frame rates so we can draw a useful conclusion about the power vs FPS numbers.

I'm also skeptical about Hawaii's performance in general. It still seems to be slower than the GK110 clock-for-clock, so there's nothing stopping nVIDIA from releasing a "780 Ultra" or somesuch with 1GHz core clock, which will then blow R290/X out of the water.
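To illustrate what a useful comparison would look like, here's a tiny sketch of a frames-per-watt calculation; every number in it is a made-up placeholder, not a measured result for either card:

```python
# Sketch of a perf-per-watt comparison; the FPS and wattage figures below are
# placeholder values only, not measured results for either card.
results = {
    # card: (average FPS in some game, average board power in watts during that run)
    "GTX Titan": (60.0, 250.0),
    "R9 290X":   (63.0, 290.0),
}

for card, (fps, watts) in results.items():
    print(f"{card:10s}: {fps:5.1f} FPS / {watts:5.1f} W = {fps / watts:.3f} FPS per watt")
```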

Yeah, I bet they tested an ES, probably with improper driver/BIOS TDP protection for such apps (OCCT, Afterburner, FurMark)...

I mean, if you removed that protection on GK110, it would be the same 400-450W for sure.
 
So it's only as fast as a stock-clocked Titan in most things? So what would the results be if it were tried against a Titan OC'ed to 1000-1100 MHz, which seems to be where a lot of people are able to push Titans with little to no problem?
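For a rough upper bound you can just scale a stock result by the clock ratio -- purely a back-of-envelope assumption (games never scale perfectly linearly with core clock, and memory bandwidth doesn't change), with example numbers only:

```python
# Optimistic upper-bound estimate for an overclocked Titan: assume FPS scales
# linearly with core clock. Real scaling will be lower (memory bandwidth,
# CPU limits, etc.). The 60 FPS baseline is just an example figure.
stock_boost_mhz = 876      # Titan's rated boost clock
oc_mhz = 1100              # the kind of overclock people report
baseline_fps = 60.0        # example stock-clock result in some game

estimated_fps = baseline_fps * oc_mhz / stock_boost_mhz
print(f"~{estimated_fps:.0f} FPS at {oc_mhz} MHz (linear-scaling upper bound)")
```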
 
[...] So what would the results be if it were tried against a Titan OC'ed to 1000-1100 MHz, which seems to be where a lot of people are able to push Titans with little to no problem?
The result: Titan would be way overpriced...
Titan: $999 (Newegg), whereas R9 290X: $600~650 (expected)
 
It definitely makes more sense to game @1080p, at least competitively
Competitively? Okay, that's a point.

Probably closer to $700, if not more...
There's no way AMD can think $700; they have to move enough units to get an ROI on a new part -- it's not like these are geldings just needing a home. AMD needs to ensure they can recoup engineering and set-up costs while splitting them over enough wafers. I don't see this as a boutique product; I think they see it as a full-production offering, no different than Tahiti was two years ago.

I'm wanting a $550 MSRP.
 
Only enthusiasts and benchmark-score worshippers...

Heck, some of my colleagues believe AMD processors and graphics don't do gaming. They even boast that their i3+650Ti will decimate my FX8350+CF 7970 :wtf:

LOL
so true...

Even my father would prefer an Intel Celeron (I'm not sure if it's dual-core) and its GPU rather than an AMD A8-4 series (4 cores) and its iGPU.

@casecutter: count me in, I would be happy to get them in CFX and put them under water.
 
Tell me about power while playing a game, not Furmark ... who knows which is throttling more.
 
First, no one buys this level of card for 1920x1080p.

And yet again, another 60Hz monitor owner. Please go back to your cave. :banghead::banghead::banghead:

Thanks for pointing that out, lad :toast:
Most gamers will choose a mainstream card. Why in hell would you think otherwise?
Only enthusiasts and benchmark-score worshippers will look otherwise.

So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.
 
And yet again, another 60Hz monitor owner. Please go back to your cave. :banghead::banghead::banghead:



So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.

And what monitor(s) do you own?
I will stick with my "lackluster" 60Hz 30" Dell and wait for the 4K monitors to come down in price. Does this mean I am in a cave as well? I absolutely love my monitor and see no reason at this point to go to 120Hz.
 
LOL
so true...
Even my father would prefer an Intel Celeron (I'm not sure if it's dual-core) and its GPU rather than an AMD A8-4 series (4 cores) and its iGPU.

I know you won't... even you show up in the AMD Lounge... hahaha :laugh:
Not even a year on and I already miss everyone from the last gathering @ Bandung :toast:

So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.

Why would someone spend $600 or $1000 on a stupid card that cannot even attract a woman with nice boobs? Yup... that's an enthusiast.

And what monitor(s) do you own?
I will stick with my "lackluster" 60Hz 30" Dell and wait for the 4K monitors to come down in price. Does this mean I am in a cave as well? I absolutely love my monitor and see no reason at this point to go to 120Hz.

Might have something to add: no matter how fast your monitor is, Windows only sees it at 60Hz. It's in the panel, not in the OS or graphics card.
FYI, I have a 240Hz panel and yet I still get severe judder :shadedshu
 
Why would someone spend $600 or $1000 on a stupid card that cannot even attract a woman with nice boobs? Yup... that's an enthusiast.

Ah, the naivety of youth. I enjoy having both. ;)
 
I love the smell of new hardware in the morning.

Whether these are better than Nvidia cards or not, I still love it when new GPUs come out; all the speculation and "discussion" is interesting and sometimes fun to read.

Personally, for me, it's a 280X or a 7970.

I'm with you on that. A 280x is looking more and more like the bee's knees.
 