Monday, September 23rd 2013

Radeon R9 290X Pictured, Tested, Beats Titan

Here are the first pictures of AMD's next-generation flagship graphics card, the Radeon R9 290X. If the naming caught you off-guard, our older article on AMD's new nomenclature could help. Pictured below is the AMD reference-design board of the R9 290X. It's big, and doesn't have too much going on with its design. At least it doesn't look Fisher-Price like its predecessor. This reference-design card is all that you'll be able to buy initially; non-reference-design cards could launch much later.

With its cooler taken apart, the PCB is signature AMD: you find digital-PWM voltage regulation, Volterra and CPL (Cooper Bussmann) chippery, and, well, the more obvious components, the GPU and memory. The GPU, which many sources indicate is built on the existing 28 nm silicon fab process, looks significantly bigger than "Tahiti." The chip is surrounded by not twelve, but sixteen memory chips, which could indicate a 512-bit wide memory interface. At 6.00 GHz, we're talking about 384 GB/s of memory bandwidth. Other rumored specifications include 2,816 stream processors, four independent tessellation units, 176 TMUs, and anywhere between 32 and 64 ROPs. There's talk of DirectX 11.2 support.
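The 384 GB/s figure follows directly from the rumored bus width and memory clock; a quick sanity check (a minimal sketch, nothing here beyond the numbers in the paragraph above):

```python
# Memory bandwidth = bus width (in bytes) x effective data rate.
# GDDR5 at a 6.00 GHz effective data rate on a 512-bit bus:
bus_width_bits = 512
effective_rate_gtps = 6.0  # giga-transfers per second (effective)

bandwidth_gbps = (bus_width_bits / 8) * effective_rate_gtps
print(f"{bandwidth_gbps:.0f} GB/s")  # 384 GB/s

# For comparison, Tahiti's 384-bit bus at the same 6.00 GHz:
tahiti_gbps = (384 / 8) * effective_rate_gtps
print(f"{tahiti_gbps:.0f} GB/s")  # 288 GB/s
```

That's a straight 33% bandwidth uplift over "Tahiti" from the wider bus alone, before any memory-clock changes.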
It gets better: the source also put out benchmark figures.

The R9 290X is significantly faster than NVIDIA's GeForce TITAN graphics card in the two games it was tested on, Aliens vs. Predators 3 and Battlefield 3. It all boils down to pricing. AMD could cash in on its performance premium by overpricing the card, much like it did with the HD 7990 "Malta," or it could torch NVIDIA's high-end lineup by competitively pricing the card.
Source: DG's Nerdy Story

142 Comments on Radeon R9 290X Pictured, Tested, Beats Titan

#102
TheoneandonlyMrK
Hayder_Master: i feel high power usage here :rolleyes:
I couldn't care less so long as my PSU can handle it and it OCs like a beast (TPU not EPU), but you're probably wrong anyway if rumours are to be believed and this is "as" or more efficient than prior alternate-vendor cards.

oh hang on, you were just trollin' weren't you, helpful comment dude :shadedshu
Posted on Reply
#103
Casecutter
buildzoid: The yields for silicon scale according to transistor count much more than the size of the silicon, and the GPU will have around 40% more silicon, so the yields will be a bit more than 40% lower. That's why a full GK110 costs so much more than a GK104.
Looking to learn here...

So the number of transistors within the die (chip) has more effect on what each chip costs than the number of dies they harvest from each wafer? I always thought whatever they (AMD/TSMC) feel can be created within the die is free (within limits), unless such an increase also increases the amount of discarded area between those dies; but I always figured that waste is basically accounted for by the 18% growth from each harvested candidate.

From what you're saying, a die with fewer transistors would be less costly to produce, even though each physical harvested die is much larger?
Posted on Reply
#104
N3M3515
the54thvoid: I'm not wrong in my suppositions.

Core size is far less relevant than what the die space comprises. It's still reckoned to be one of AMD's biggest dies. So 28 nm or not, the efficiency of the architecture is paramount.

Bioshock is also an AMD sponsored title - should play better on their hardware, not worse.

There is no point in AMD releasing a card that cannot comfortably surpass the competitor's flagship. The early benches also show that the Titan beats the unnamed Radeon card in every synthetic benchmark (and I know we all know benchmarks mean nothing).

I guess we all need to see what happens on Wednesday. I won't be surprised if it beats Nvidia's greatest in most metrics. But again, clocks and maximum overhead play a huge role.

Hell, an overclocked, well-cooled 780 beats a stock Titan.

If I'm wrong and it doesn't hump Titan, I really hope it has a lot of headroom (some sources say it doesn't). It will be a bit dull if the two cards are close.
According to those benches, next-gen AMD is about 38% faster than the 7970 GHz Ed. That's not bad considering the 6970 was about 20% faster than the 5870, which happens to be the same case since both were on the same node. What's the big tragedy then?
Posted on Reply
#105
EarthDog
84% of all statistics are made up on the spot.
Posted on Reply
#106
TheoneandonlyMrK
Casecutter: Looking to learn here...

So the number of transistors within the die (chip) has more effect on what each chip costs than the number of dies they harvest from each wafer? I always thought whatever they (AMD/TSMC) feel can be created within the die is free (within limits), unless such an increase also increases the amount of discarded area between those dies; but I always figured that waste is basically accounted for by the 18% growth from each harvested candidate.

From what you're saying, a die with fewer transistors would be less costly to produce, even though each physical harvested die is much larger?
As you increase individual die size, fewer can be made per wafer.
As you increase transistor density, the chance of a failing transistor increases, as imperfections cause sections to fail completely or not perform up to speed.
So density/complexity and die size x node size/maturity = price per chip to AMD.
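That relation can be sketched with the standard dies-per-wafer approximation and a Poisson defect-yield model. To be clear, the defect density, wafer cost, and die areas below are illustrative assumptions, not TSMC or AMD figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common gross-die approximation for a circular wafer:
    wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defect_density_per_cm2=0.2):
    """Poisson defect model: yield falls off exponentially with die area,
    which is why bigger/denser dies cost disproportionately more."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)

WAFER_COST = 5000  # assumed 28 nm wafer price in USD, purely illustrative

for name, area in [("~365 mm^2 die (Tahiti-class)", 365),
                   ("~440 mm^2 die (bigger chip)", 440)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area)
    print(f"{name}: {gross} gross dies, {good:.0f} good, "
          f"${WAFER_COST / good:.0f} per good die")
```

The bigger die loses twice: fewer candidates per wafer, and a lower fraction of them survive, so cost per good die climbs faster than die area does.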
Posted on Reply
#107
razaron
erocker: I didn't like Tahiti XT's price either, but I don't know, maybe we were spoiled a little bit. I remember X850 XTs, 6800 Ultras and the like going for $500-600, and then competition got fierce for a while in the pricing department.

I am hoping this happens again and Nvidia's current pricing is just the high point in a succession of peaks and valleys. Fingers crossed AMD doesn't base pricing on that structure.
I was bored so I factored in inflation. $500 in 2005 (6800 Ultra) is worth $598.76 in 2013. According to this, anyway.
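razaron's figure is in the right ballpark against approximate US CPI-U annual averages (the CPI values below are rough assumptions, so the result differs slightly from whichever online calculator he used):

```python
# Approximate CPI-U annual averages (assumed values, rounded):
cpi = {2005: 195.3, 2013: 233.0}

price_2005 = 500  # 6800 Ultra launch price, USD
price_2013 = price_2005 * cpi[2013] / cpi[2005]
print(f"${price_2013:.2f}")  # roughly $596-$599 depending on CPI source
```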
Posted on Reply
#108
erocker
*
razaron: I was bored so I factored in inflation. $500 in 2005 (6800 Ultra) is worth $598.76 in 2013. According to this, anyway.
Seems pretty spot-on. :)
Posted on Reply
#109
Casecutter
theoneandonlymrk: As you increase individual die size, fewer can be made per wafer.
As you increase transistor density, the chance of a failing transistor increases, as imperfections cause sections to fail completely or not perform up to speed.
So density/complexity and die size x node size/maturity = price per chip to AMD.
Agree completely with that way of stating it... What AMD has on its side at this time is maturity of process, something that was sorely missing when TSMC and AMD went with what amounted to "risk production" in late 2011 and into 2012 to get Tahiti. I believe those same issues side-lined the GK100, while we didn't see a GK110 until Q4 2012 as a Tesla K20 part, and Titan took until February 21, 2013.

I would agree that if they tried this revision of the architecture on a die shrink (20 nm) it would be too much; they get it worked out here, then basically spin it down to 20 nm with other optimizations through "Pirate Islands," and they minimize risk. Not like AMD did debuting a whole new GCN architecture on a die shrink. I think the price increase for the harvested chips will be consistent if not improved. And that's why I maintain the rationale that AMD will harvest 3 true derivatives, not just an XT/LE, but more like what Nvidia did with the GK104.
Posted on Reply
#110
the54thvoid
Intoxicated Moderator
EarthDog: 84% of all statistics are made up on the spot.
And only 20% of those are accurate.
Posted on Reply
#111
PopcornMachine
the54thvoid: And only 20% of those are accurate.
Seriously beat me to it. Dang it! :mad:
Posted on Reply
#112
erocker
*
the54thvoid: And only 20% of those are accurate.
Five out of ten dentists agree. :D
Posted on Reply
#113
Lou007
Allow me to put it into perspective for you: in New Zealand a Titan starts at $1890, as does a GTX 690, and a GTX 780 will go for about $1350. A Radeon 7990 will set you back a cool $1190. Now with that said, can you honestly say a Titan is worth that premium? If the R9 290X is priced at the 7990 price point, which if I'm not mistaken is what the suit from AMD said would happen, then why would you even bother with NVIDIA products?
Posted on Reply
#114
TheoneandonlyMrK
Lou007: Allow me to put it into perspective for you: in New Zealand a Titan starts at $1890, as does a GTX 690, and a GTX 780 will go for about $1350. A Radeon 7990 will set you back a cool $1190. Now with that said, can you honestly say a Titan is worth that premium?
Surely you see Nvidia were and are in a hard place with Titan: they left DP compute performance in, so they could only price it so low without Quadro buyers feeling ripped off, yet their only plan now involves a whole new process paradigm shift, i.e. TSV-connected eDRAM on top of a typically very hot GPU. Danger, Will Robinson.
I am very eager to see how 2014 pans out all in.

I'm still on for one of these, but it will have to stretch the 7970 a fair bit, as at 1080p I can get away with a GHz 7970, and value before e-peen counts to me.
Posted on Reply
#115
Xzibit
Matt Skynner (Corporate Vice President & GM): They're coming in Q4. I can't reveal a price point but we're looking at more traditional enthusiast GPU price points. We're not targeting a $999 single-GPU solution like our competition because we believe not a lot of people have that $999. We normally address what we call the ultra-enthusiast segment with a dual-GPU offering like the 7990. So this next-generation line is targeting more of the enthusiast market versus the ultra-enthusiast one.
I know companies don't always follow common sense but here is a thought.

The 7990 is selling for as low as $599-$649 now. So taking that statement, I'd be surprised if it was priced higher than their dual-GPU solution.

Less than 22hrs to go

Posted on Reply
#116
Fluffmeister
Lou007: Allow me to put it into perspective for you: in New Zealand a Titan starts at $1890, as does a GTX 690, and a GTX 780 will go for about $1350. A Radeon 7990 will set you back a cool $1190. Now with that said, can you honestly say a Titan is worth that premium? If the R9 290X is priced at the 7990 price point, which if I'm not mistaken is what the suit from AMD said would happen, then why would you even bother with NVIDIA products?
Sucks to live in middle earth then, move closer to civilization.
Posted on Reply
#117
ensabrenoir
Fluffmeister: Sucks to live in Middle-earth then, move closer to civilization.
15 yards for unnecessary roughness!!!:roll:


Posted on Reply
#118
Steevo
Tonight at midnight Eastern or midnight Oceanic time? Is W1zz in Hawaii, or has he had the card and is posting a review?


When will I get GTA5 on PC with one of these?
Posted on Reply
#119
The Von Matrices
I hope this card does away with the biggest thing I hated about Tahiti - the recessed die. Tahiti is the only GPU with the die below the level of the shim, making all but 79xx-specific coolers incompatible with it. You can't use a universal cooler on a 79xx card, and you can't use a 79xx cooler on any other card. I'm using copper shims with my MCW82's, but it's far from an optimal solution for thermal conductivity. For the good of making coolers more compatible, I hope that the Tahiti recessed core was a one time thing.
Posted on Reply
#120
Xzibit
Steevo: Tonight at midnight Eastern or midnight Oceanic time? Is W1zz in Hawaii, or has he had the card and is posting a review?


When will I get GTA5 on PC with one of these?
It's at 9:00 am Hawaii local time.

Here is the webcast livestream link. It should have a countdown timer for your time zone.
AMD Webcasts Product Showcase at GPU '14

In case you miss it, it will be posted on AMD's YouTube channel afterwards.
Posted on Reply
#124
KranK_
Hmm?

Where are the benchmarks of the Titan beating the R9 290X? The Titan beats it in quite a few benchmarks. If this is all ATI has... they are in trouble. :twitch:
Posted on Reply
#125
erocker
*
KranK_: Where are the benchmarks of the Titan beating the R9 290X? The Titan beats it in quite a few benchmarks. If this is all ATI has... they are in trouble. :twitch:
How so? We're working on leaked and rumored benchmarks that seem to be saying just the opposite.
Posted on Reply