
Radeon R9 290X Priced at $729.99 on Newegg.com

I think that anything over $650 for the retail card is just greed on AMD's side, unless the 290X really stands out vs the 780.
It's greed on either side, and "Greed is Good" for the shareholders. Get too greedy and folks can simply not purchase, which is bad for shareholders. That's how it works; I'm not saying it's right, it's just the mentality of companies beholden to shareholders. Run the price up as high as your marketing feels it will bear for the volume you have to sell.
 
Funny. Peeps were under the impression it would cost $599, not $729.

So they are actually releasing a Hawaii GPU for $599, just not Hawaii Pro :laugh: The R9 290X will probably come at a premium, at $699
 
It's greed on either side, and "Greed is Good" for the shareholders. Get too greedy and folks can simply not purchase, which is bad for shareholders. That's how it works; I'm not saying it's right, it's just the mentality of companies beholden to shareholders. Run the price up as high as your marketing feels it will bear for the volume you have to sell.

At this price point, 1 sale = 3 sales with a healthy profit. You could argue that there are more 'parts' that make up a higher-end card, but those are cheap, and so is 'better' silicon. Add on some $20 bling and you're sustainable.
 
At this price point, 1 sale = 3 sales with a healthy profit. You could argue that there are more 'parts' that make up a higher-end card, but those are cheap, and so is 'better' silicon. Add on some $20 bling and you're sustainable.
True enough, to an extent. The markup on higher-priced cards is proportionally higher than on 2-3 lower-tier cards.
2-3 lower-tier cards might sell for substantially less than one high-priced card, but their component fit-out (2-3x the PCBs, I/O, GDDR5 chips, power delivery and logic components, cooling, packaging and shipping) will still be proportionally higher, and if the larger die yields well you also have a likely minimal overall fabrication cost penalty, if any.
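The margin argument above can be sketched with a toy cost model. Every figure here is made up purely for illustration (die and fit-out costs are hypothetical, not real BOM numbers); the point is only that one flagship sale can net roughly what three lower-tier sales do once each card's fixed fit-out is paid for:

```python
def margin(price, die_cost, fixed_parts_cost=60.0):
    """Gross margin per card: sale price minus the die cost and the
    per-card fixed fit-out (PCB, memory, VRMs, cooler, box, shipping).
    All inputs are hypothetical, illustrative figures."""
    return price - (die_cost + fixed_parts_cost)

# One big-die flagship vs. three smaller-die cards: the fixed fit-out
# is paid three times over on the low-tier side, eating the revenue gap.
high_end = margin(729.99, die_cost=120.0)
low_tier = 3 * margin(299.99, die_cost=60.0)
print(high_end, low_tier)  # roughly comparable totals
```

With these made-up numbers the single flagship actually edges out three lower-tier sales, which is the "1 sale = 3 sales" intuition in the post.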
 
Just too expensive :banghead:
 
To AMD: get Mantle into UE3.5, UE4, Unity3D, Source (1 & "2"), and maybe whatever Ubisoft's using, and you've got the performance lock-down.

Frostbite 3 & whatever Eidos is using are a great start.


And here's a little joke: if I've got a quad-core @ 4 GHz, 8 GB of RAM & an R7 260X GPU... will I be able to run any XBO-era game up to its EOL @ 900p or so? :p
 
This is fake news.
It says 9999.99 now, and anyone can modify that price.

What I did find, though, is that if you type "R9 290X" into Amazon, you get a wonderful Sapphire Radeon HD 7950 3GB GDDR5.
 
I think that anything over $650 for the retail card is just greed on AMD's side, unless the 290X really stands out vs the 780.

Especially when you consider that the R9 280X (7970) can be had for $300. Two of those cards would surely and handily beat the R9 290X/780/TITAN. At $500 the single-card solution makes sense, and at $600 you're just paying for the simplicity of one card, but at $700 :wtf:.
 
This is fake news.
It says 9999.99 now, and anyone can modify that price.

What I did find, though, is that if you type "R9 290X" into Amazon, you get a wonderful Sapphire Radeon HD 7950 3GB GDDR5.

LOL, of course it is. I just found it funny to find it in the content of their website at the time.
 
Youch, that's too hefty for my tastes; 2x 280s or 270s will suffice.
 
I think I'll stick with my 7870 XT for a while. I'll probably pick up a 280X or two at some point, but not any time soon.

Any word of a GTX 770 price cut?
 
Saw this posted @XS.

http://www.xtremesystems.org/forums...ands-details&p=5209737&viewfull=1#post5209737

2600K @ 5 GHz
[Image: 3dmarkocjvswi.png]
 
^ Looks like about the same graphics score a Titan gets... Not that I'm fully convinced of the validity of it or anything.
 
^ Looks like about the same graphics score a Titan gets... Not that I'm fully convinced of the validity of it or anything.

Not quite; it's short by about 700 points in the graphics department, but you have to consider that these are probably beta drivers they're using for that run. You can even see the discrepancies in frame pacing compared to Titan; the lines are much flatter when you compare the two. Chances are the driver used for that run is not optimized for the 290X. That is, like you said, if this is a real benchmark.

This is what a single Titan scores:

[Image: Singletitan.jpg]
 
Not at all surprised at the pricing.
AMD isn't here to run a charity.
 
Hope AMD reduces the price soon; a lot of people must be disappointed at the launch price.
 
Not quite; it's short by about 700 points in the graphics department, but you have to consider that these are probably beta drivers they're using for that run. You can even see the discrepancies in frame pacing compared to Titan; the lines are much flatter when you compare the two. Chances are the driver used for that run is not optimized for the 290X. That is, like you said, if this is a real benchmark.

This is what a single Titan scores:

http://img.techpowerup.org/131006/Singletitan.jpg

Um, no. Well, maybe your WC/OC'd ones. :D
But we're talking about stock here... well, with each brand's implementation of turbo/self-OC, I guess...

[Image: 931600.jpg]

[Image: firestrike-extreme.png]


Either way, I expect AT LEAST a little improvement with newer drivers, maybe not how the R7000s improved from early 2012 till now, but still...
 
Um, no. Well, maybe your WC/OC'd ones. :D
But we're talking about stock here... well, with each brand's implementation of turbo/self-OC, I guess...

Either way, I expect AT LEAST a little improvement with newer drivers, maybe not how the R7000s improved from early 2012 till now, but still...

GTX Titan, at 993 core, memory at stock, 1.15 V (custom BIOS with boost eliminated; this is about the standard stock boost speed, with the fan on higher settings to keep thermals down). Almost every Titan boosts to 993 as its default top speed and stays there. Reviewers who don't touch fan profiles will eventually get lower scores.

Titan at stock (boost) is only 9% faster than the reported 290X run.
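The 9% figure is just the relative score gap. With a hypothetical 290X graphics score of ~7800 (not from the thread; chosen only so that a ~700-point lead works out to roughly 9%), the arithmetic looks like:

```python
# Hypothetical Firestrike graphics scores, purely illustrative:
r9_290x = 7800
titan = r9_290x + 700  # the thread says "short by about 700 points"

# Relative lead of the Titan over the reported 290X run
lead = (titan - r9_290x) / r9_290x
print(f"Titan leads by {lead:.1%}")  # ~9.0%
```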

Note: GPU-Z reported 1006 MHz but AB reported 993 MHz.

[Image: Untitled.png]


And this is overclocked... :laugh:

25% faster than stock. :p

[Image: Untitled126.png]


I should point out that in the initial leaks of 290X performance, it was noted the AMD card lost in every benchmark, even though it won in most of the games. It might not be very well optimised for benchmarks at this stage, so I wouldn't be concerned with them at all. After all, benchmarks are just for fun, and you don't buy a gfx card for that (well, most folk don't). It's the games performance that counts, and I think the 290X will work out very well.
 
I see two different Titans; base clock difference.
I see comprehension fail... of the heavily overclocked variety.

the54thvoid has already explained that his Titan is running at 993 MHz (AB reported).

A stock Titan, while nominally 834 MHz with an 876 MHz boost, will in stock trim and with adequate cooling actually run at pretty much its highest speed bin in 3DMark Firestrike.

The highest speed bin for a stock Titan without voltage modification is in fact 992 MHz.
[Image: Untitled.jpg]

As you can plainly see from the reported clock speeds on this page, a stock Titan will indeed peg the maximum boost state in games, let alone in an intensive graphics benchmark. Since Firestrike pushes GPU usage close to 100%, it soon becomes apparent that the Titan will sit at the highest frequency it can unless it is thermally limited.
 
Um, no. Well, maybe your WC/OC'd ones. :D
But we're talking about stock here... well, with each brand's implementation of turbo/self-OC, I guess...

http://hwbot.org/image/931600.jpg
http://legitreviews.com/images/reviews/2144/firestrike-extreme.png

Either way, I expect AT LEAST a little improvement with newer drivers, maybe not how the R7000s improved from early 2012 till now, but still...

Yes, I must admit that this is the SC model, which runs a little faster out of the box than the base model :) I didn't OC the card for that run per se, but my WC provides higher thermal headroom for more turbo boost. However, the CPU was running at 4.3 GHz compared to that 2600K at 5 GHz, and the physics score was actually lower than that supposed 290X run. It's probably not a perfect comparison, but it should give you a ballpark figure for reference :)

But like you said, chances are that run, if real, was made using early drivers; just check the frame-pacing graph, it's all over the place compared to more recent AMD drivers.

Thing is, if AMD has its way, that probably won't even matter, as developers will most likely choose to support Mantle if the architecture is shared by all next-gen consoles and AMD PC cards. Like I said, it could represent a paradigm shift in PC video performance, so people should look at this card as a future-proof investment. In my case, I already have a rig waiting for these cards once they are released, but at $729 a pop for this limited edition, I'd rather wait for the vanilla version; I don't really care about BF4 Premium. If AMD releases the non-limited edition of this card at $599, it'll have a winner on its hands :)
 
Thing is, if AMD has its way, that probably won't even matter, as developers will most likely choose to support Mantle if the architecture is shared by all next-gen consoles and AMD PC cards...
Game developers won't "choose" Mantle; they will code for it if AMD throws money their way, in the same way that game developers have always added features.
Not so sure about the common-API reasoning either. Sony has its own console APIs (two, plus a wrapper, I believe; neither of which is Mantle), and I don't see Microsoft ditching D3D any time soon. Mantle is pretty much PC-, architecture- (and, at this stage, game engine-) specific, and once the hyperbole is stripped away it looks like a welcome addition to PC gaming, but not the second coming of Christ. The fact that Steam looks linked to Nvidia and Intel won't help propagate Mantle in its current guise.
AMD claims that they can do 9x the draw calls with Mantle. Before everyone gets too excited: we will not see a 9x improvement in overall performance with every application. A single HD 7790 running Mantle is not going to power 3x 1080p monitors in Eyefinity faster than an HD 7970 or GTX 780 (in Surround) running DirectX. Mantle shifts the bottleneck elsewhere.

What kind of improvements can we expect? Performance will increase when things are not shader-bound. We could probably see an improvement in geometry handling (higher-polygon scenes). Other than that, though, do not expect miracles. In applications that require multiple shading passes, complex lighting, tessellation and deformation, we might see improvements in the single-digit percentages, perhaps as high as the low teens.
[Source]
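The quoted caveat is essentially Amdahl's law: only the draw-call portion of frame time gets the 9x speedup, so the overall gain is capped by how large that portion is. A minimal sketch (the 10% draw-call share is an assumed figure, not from the thread):

```python
def overall_speedup(draw_call_fraction, api_speedup=9.0):
    """Amdahl's-law estimate: only the draw-call share of the frame
    gets the api_speedup boost; shading, lighting, tessellation and
    the rest of the frame are untouched."""
    rest = 1.0 - draw_call_fraction
    return 1.0 / (rest + draw_call_fraction / api_speedup)

# If 10% of CPU frame time goes to draw-call submission, a 9x faster
# submission path buys only about 10% overall:
print(round(overall_speedup(0.10), 3))  # 1.098
```

This matches the quote's "single-digit percentages, perhaps as high as the low teens" for workloads that are mostly shader-bound; only a heavily draw-call-bound scene would see gains anywhere near 9x.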
 
Game developers won't "choose" Mantle; they will code for it if AMD throws money their way, in the same way that game developers have always added features.
Not so sure about the common-API reasoning either. Sony has its own console APIs (two, plus a wrapper, I believe; neither of which is Mantle), and I don't see Microsoft ditching D3D any time soon. Mantle is pretty much PC-, architecture- (and, at this stage, game engine-) specific, and once the hyperbole is stripped away it looks like a welcome addition to PC gaming, but not the second coming of Christ. The fact that Steam looks linked to Nvidia and Intel won't help propagate Mantle in its current guise.

[Source]

I don't think anyone in their right mind is expecting a 9x performance leap (particularly on older software and cards) out of this Mantle initiative; however, I think it's reasonable to expect a few developers to jump ship to AMD if they play their cards just the right way.

There's a precedent: remember back in 2004 when some HL2 benchmarks leaked, running much faster on ATI hardware than on Nvidia's cards? When the game was finally released, the Source engine made full use of the R300's potential, and even mid-level cards like the Radeon 9500 would outperform the green team's flagship at the time (the FX 5800) by quite a measurable margin.

Sure, the particulars of that debacle were very different from what we have today, but the truth is, AMD is in an enviable position ahead of the release of these new cards. As sole GPU provider for both major next-gen consoles, they'll use that leverage to help developers get their rendering engines much "closer to the metal" on most gaming platforms that share this architecture, and to reduce costs by making it much easier to port games between the Xbone, PS4 and PC without loss of performance in the process.

Sure, Mantle is a PC initiative, and it may be in the interest of MS and Sony not to provide the necessary tools to make this porting so easy, but developers working on GCN-based GPUs will find it much easier to program for these particular hardware targets and to make such optimizations available when releasing a particular game on multiple platforms.

Look, I might be playing devil's advocate here, but the potential is there, and AMD has made everyone fully aware on multiple occasions that they intend to take advantage of their position, so it's not a stretch to think we may see history repeat itself, like the glory days of the 9700 Pro.
 
Look, I might be playing devil's advocate here, but the potential is there, and AMD has made everyone fully aware on multiple occasions that they intend to take advantage of their position, so it's not a stretch to think we may see history repeat itself, like the glory days of the 9700 Pro.
Well, ideally you'd hope that the Mantle initiative gives MS some cause for thought about overhauling DirectX, but knowing how agile MS is, I wouldn't hold my breath.
A lot will depend on how committed (i.e. how much cash) AMD is to making it work. Activision would seem a natural bandwagon jumper, so I could see them adopting it, but I can't really see much benefit in easing the workload of an engine that produces stratospheric framerates in any case.
Ideally, the API needs to raise in-game image quality substantially over DX/OGL to make the software anything other than a bullet point... if it doesn't, then it becomes just another feature like TressFX or PhysX, and of course if it does, then we come full circle back to the advent of the gaming GPU, because dollars to donuts the other two vendors will follow suit... the good old, bad old days of Glide, Matrox Simple Interface, RRedline, SGL, etc.
 