Thursday, January 29th 2015

AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price

AMD has decided to cash in on the GeForce GTX 970 memory controversy with a bold move and a cheap (albeit accurate) shot. The company is having its add-in board (AIB) partners lower the pricing of its Radeon R9 290X graphics card, which offers comparable levels of performance to the GTX 970, to as low as US $299.

And then there's a gentle reminder from AMD to graphics card buyers with $300-ish in their pockets: with AMD, "4 GB means 4 GB." AMD also emphasizes that the R9 290 and R9 290X can fill their 4 GB of video memory to the last bit, and feature a 512-bit wide memory interface that churns out 320 GB/s of memory bandwidth at reference clocks, something the GTX 970 can't achieve even with its fancy texture compression mojo.
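As a quick sanity check, that 320 GB/s figure follows directly from bus width and memory data rate. Here is a minimal sketch of the arithmetic in Python, assuming the cards' reference memory specs (5 Gbps effective GDDR5 on the R9 290X, 7 Gbps on the GTX 970), which are not stated in the article itself:

# Theoretical peak bandwidth = (bus width in bytes) x (effective memory data rate)
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(f"R9 290X: {bandwidth_gbs(512, 5.0):.0f} GB/s")  # -> 320 GB/s at reference clocks
print(f"GTX 970: {bandwidth_gbs(256, 7.0):.0f} GB/s")  # -> 224 GB/s advertised peak

The GTX 970 figure is its advertised peak; the controversy is that the card's last 0.5 GB sits behind a slower path than the first 3.5 GB, so that peak doesn't apply uniformly across the full 4 GB.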

181 Comments on AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price

#76
Pumper
NC37: Who cares? DX12 is coming. Within not too long, all these cards will be worthless in value as everyone will shift to 12. Even if games don't support it yet, the desire to get on board with a new standard will be enough. Specially with M$ finally getting off their butt thanks to AMD Mantle showing them their butt.
lol, what are you talking about? Every single DX11 card will be able to use DX12 as it's just a software update.
Posted on Reply
#77
xfia
Yeah, DX12 is going to work for the Jaguar APUs in the XB1 and PS4... exactly what that hardware needs.
Posted on Reply
#78
Big_Vulture
Who is switching to the twice-as-power-hungry, slower Radeon? That's stupid :roll:. Maybe $250 would be acceptable for the 290X.
Posted on Reply
#79
AsRock
TPU addict
NC37: Who cares? DX12 is coming. Within not too long, all these cards will be worthless in value as everyone will shift to 12. Even if games don't support it yet, the desire to get on board with a new standard will be enough. Specially with M$ finally getting off their butt thanks to AMD Mantle showing them their butt.
Some fully support DX12, as I understand it.

www.amd.com/en-us/press-releases/Pages/amd-demonstrates-2014mar20.aspx
Big_Vulture: Who is switching to the twice-as-power-hungry, slower Radeon? That's stupid :roll:. Maybe $250 would be acceptable for the 290X.
What's that bird that shoves its head in the sand? Oh yes, an ostrich.
Posted on Reply
#80
buggalugs
Big_Vulture: Who is switching to the twice-as-power-hungry, slower Radeon? That's stupid :roll:. Maybe $250 would be acceptable for the 290X.
It isn't slower. I get 980-level performance with my 290X. I get around 15,000 in 3DMark 11 with a 290X and a 4790K, and it's completely silent; temps are around 69 degrees at full load... If any of you guys have a 970, run 3DMark 11 and post your score.

I'm kind of glad I bought my 290X when I did last year, long before the 970/980 came out, because all these cards will be worthless once DX12 cards come out very soon. At least I got a good year out of it. If you just bought a 970/980 when we are so close to Windows 10 and DX12, it's going to be a very poor investment for you. DX12 is a game changer.
Posted on Reply
#81
Prima.Vera
buggalugs: I'm kind of glad I bought my 290X when I did last year, long before the 970/980 came out, because all these cards will be worthless once DX12 cards come out very soon. At least I got a good year out of it. If you just bought a 970/980 when we are so close to Windows 10 and DX12, it's going to be a very poor investment for you. DX12 is a game changer.
Neh, I don't think so, tbh. I hope D3D12 won't be a "game changer" the way D3D11 was over D3D9.1 ;)
Posted on Reply
#82
Xzibit
Most of the game engines would need a heavy rewrite, and developers are too lazy for that. Twelve years later and we are still getting DX9 game engines. Not to mention how long it's taken to transition to 64-bit. Plus they aren't going to spend the time and money, because bigger texture packs and DLC are all the rage nowadays.

We'll see DX9 game engines being ported over to DX12, and we'll be lucky to get newer DX10 engines ported by 2016. Maybe Microsoft will pump money into a game to have DX12 native, but it will just be a showcase product. At the earliest, maybe we'll start seeing a handful of games by the 2016 holiday season.

Oops, forgot: DX12 is on the XB1, so we are still going to get half-assed ports. They are just going to take less effort to port.

If you get past all that, you still have to deal with Patch-apalooza.
Posted on Reply
#83
GreiverBlade
NC37: Who cares? DX12 is coming. Within not too long, all these cards will be worthless in value as everyone will shift to 12.
Sweet utopia... [sarcasm]because DX11 was so widely adopted on launch day that all DX9/10 cards were instantly rendered worthless/obsolete[/sarcasm]... not to mention that not all DX12 features are mandatory; any DX11/11.2 GPU will still be able to run games and be used in W10.

But that's beside the point of the thread, let's stick to it.
Posted on Reply
#84
Recus
btarunr: That doesn't mean you can't return a product that doesn't work as advertised. Take them to consumer court, get back not just your money but also legal fees incurred.
So why can't you refund games on Steam? Double standards?
Posted on Reply
#85
btarunr
Editor & Senior Moderator
Recus: So why can't you refund games on Steam? Double standards?
Because you're not buying games, you're buying a licence to play that game, which is subject to Steam's and the game dev's EULAs.
Posted on Reply
#86
64K
btarunr: Because you're not buying games, you're buying a licence to play that game, which is subject to Steam's and the game dev's EULAs.
True. You don't own the games you have on Steam. They can lock you out of your account and games if they want to. It's in the EULA.
Posted on Reply
#87
Recus
btarunr: Because you're not buying games, you're buying a licence to play that game, which is subject to Steam's and the game dev's EULAs.
But the game is stored on your PC, so technically you bought it. Also, physical copies have to be activated through Steam.
Posted on Reply
#88
64K
Recus: But the game is stored on your PC, so technically you bought it. Also, physical copies have to be activated through Steam.
Yes, it's stored on your drive but to be able to play most games on Steam you have to be able to log on to Steam to start the game. When you click on the game icon that's what happens. If your account is locked (usually for using hacks in MP or a hack to get free games from Steam) then the game being stored on your drive is useless.
Posted on Reply
#89
lukart
Right now 290s are a steal. At this price, whoever goes for the 970 is a 101% NVIDIA fanboy. I mean, the 290X obviously has more muscle than the 970, a full 4 GB :D better 4K gameplay... any reason left for buying the 970?
Posted on Reply
#90
TRWOV
lukart: Right now 290s are a steal. At this price, whoever goes for the 970 is a 101% NVIDIA fanboy. I mean, the 290X obviously has more muscle than the 970, a full 4 GB :D better 4K gameplay... any reason left for buying the 970?
Thermals, loudness and power requirements are some. Granted, all three are easily dismissible, though. AIB designs take care of the first two, and as for the third, most people already have >500 W PSUs. Power bills would be a factor only if you kept your PC on 24/7 folding or something. Also PhysX, but that's more of a footnote than a full reason (haven't seen many new games using PhysX).
Posted on Reply
#91
rruff
TRWOV: Thermals, loudness and power requirements are some.
QC, reliability, drivers, features.

For power consumption, the 290X uses ~8 W more at idle, 43 W more with multi-monitor, 67 W more playing a Blu-ray, and 155 W more at max. Figure ~$1/yr per watt for 24/7 operation. It can be a lot depending on what you are doing with it.

www.techpowerup.com/reviews/ASUS/GTX_970_STRIX_OC/23.html
Posted on Reply
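For what it's worth, the "~$1/yr per watt for 24/7 operation" rule of thumb quoted above is easy to verify. A minimal sketch, assuming an electricity rate of about US $0.11/kWh, an assumed figure rather than one from the post:

# Annual cost of drawing a constant extra load, 24/7
HOURS_PER_YEAR = 24 * 365           # 8760 h
RATE_USD_PER_KWH = 0.11             # assumed average electricity rate

def yearly_cost_usd(extra_watts: float) -> float:
    return extra_watts / 1000 * HOURS_PER_YEAR * RATE_USD_PER_KWH

print(yearly_cost_usd(1))    # ~0.96 -> roughly $1 per year per watt
print(yearly_cost_usd(155))  # ~149  -> worst case: full-load delta, 24/7, all year

A card only draws its peak delta under full load, so typical gaming use lands far below that 24/7 worst case.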
#92
Sony Xperia S
rruff: QC, reliability, drivers, features.
You would believe your rich imagination because you work at nvidia and are very deep into their world?.... :meh: :rolleyes:
Posted on Reply
#93
AsRock
TPU addict
Big_Vulture: Who is switching to the twice-as-power-hungry, slower Radeon? That's stupid :roll:. Maybe $250 would be acceptable for the 290X.
TRWOV: Thermals, loudness and power requirements are some. Granted, all three are easily dismissible, though. AIB designs take care of the first two, and as for the third, most people already have >500 W PSUs. Power bills would be a factor only if you kept your PC on 24/7 folding or something. Also PhysX, but that's more of a footnote than a full reason (haven't seen many new games using PhysX).
Power usage might drop even more if you use Vsync.
Posted on Reply
#94
Hayder_Master
Bwahahaha, funniest video I've ever seen about the 970.
Posted on Reply
#95
LiveOrDie
Who cares? The card still runs the same; the 290X is a hair dryer, end of story.
Posted on Reply
#96
AsRock
TPU addict
Live OR Die: Who cares? The card still runs the same; the 290X is a hair dryer, end of story.
WOW, only half-truths now, you're starting to sound like NVIDIA.
Posted on Reply
#97
rruff
Sony Xperia S: You would believe your rich imagination because you work at nvidia and are very deep into their world?.... :meh: :rolleyes:
I don't work at Nvidia. And I have no dog in this hunt.

If you want some ideas, look at reviews from owners on both sides and see what they say. Nvidia cards have fewer issues than AMD's.
Posted on Reply
#99
GhostRyder
Big_Vulture: Who is switching to the twice-as-power-hungry, slower Radeon? That's stupid :roll:. Maybe $250 would be acceptable for the 290X.
The 290X is more powerful than a GTX 970, so that point is invalid... but then again, that comment is obvious troll bait...
rruff: QC, reliability, drivers, features.

For power consumption, the 290X uses ~8 W more at idle, 43 W more with multi-monitor, 67 W more playing a Blu-ray, and 155 W more at max. Figure ~$1/yr per watt for 24/7 operation. It can be a lot depending on what you are doing with it.

www.techpowerup.com/reviews/ASUS/GTX_970_STRIX_OC/23.html
QC is about the same, dude, reliability included; the differences posted are normally very small, except in cases involving massive problems with certain specific cards, which is not something seen very often. Drivers are just as fine no matter which brand you're using, so that argument is completely irrelevant, and the same goes for features, because both sides have a counter to each feature within reason.

On top of that, power consumption has been proven time and again to be a moot point except in niche situations. In most cases you would have to run a card under load for a huge number of hours through the year for the power difference to really show up on your bill, and even then it would normally take years just to add up to a reasonable price difference between the cards, especially when one card is cheaper than the other. Not to mention you have to account for people who use Vsync or similar, which takes a lot of stress off the GPU and lowers power usage as well. The only major concern would be the PSU, and a ~500 W unit is generally what a gamer buys and will run the card, so it's still a moot point.

Anyway, it's funny that AMD is doing this to cash in on people returning the card, with that kind of joke ad. Either way, I am sure they are going to get some sales at that price, since these are still some of the best high-resolution-performing GPUs out there at the moment. With prices this good on high-end gaming cards, more people can join the fray and get some serious gaming hardware for a good price.
Cybrnook2002: Lightning at Newegg is $299 after rebate, that's an awesome deal for a mean overclocker:

www.newegg.com/Product/Product.aspx?Item=N82E16814127787
Dang, now I wish I wanted/needed one of those variants.
Posted on Reply
#100
Sony Xperia S
rruff: Nvidia cards have fewer issues than AMD's.
That's good for the stupid, because then they never have to consider that the problem might be with them or their system. :laugh: :rolleyes:
Posted on Reply