
Should I stay with GTX970 or switch to a R9 290x?

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,874 (3.07/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000\Paradigm P Studio 20, Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
Keep the 970 and add another if you ever decide to upgrade your monitor.

He'd be better off replacing that PSU with a quality one than getting a second card.
 
Joined
Jan 18, 2012
Messages
751 (0.17/day)
System Name My PC
Processor i7 4790k @4.4ghz
Motherboard Gigabyte z97m-d3h
Cooling Corsair H105
Memory 4x4GB Corsair Dominator Platinum 2133-9-11-11-31-1T
Video Card(s) GTX970 Strix OC
Storage Samsung 840Pro 512GB
Display(s) Asus ROG SWIFT
Case Lian Li 359
Audio Device(s) Denon DA-300USB / Denon AH-D5000
Power Supply Corsair AX860
Mouse Roccat Kone Pure Optical
Keyboard Corsair K70
Software Win10 64-bit home
The whole thing with the GTX970 is that "nVidia lied" and not that "the card is crap". It took months for super-geeks to actually figure out that the card had a problem, and from what I understand they randomly found that out. I am sure that in the near future there will be an official or unofficial method of dealing with the problem (e.g. limit the vRAM the system can see to 3.5GB).

Apart from that, as everyone said before, it's nearly impossible to reach 3.5GB on a single 1050p/1080p monitor.

Very important about Skyrim: modding Skyrim with tons of texture mods makes the game crash on many systems. I personally can't get the game to run with all the mods without crashing or performing badly, and I have a GTX Titan with 6GB of VRAM, so don't judge your card by Skyrim; it's just wrong. In fact, anything near 4GB of VRAM use in Skyrim makes the game behave terribly on my card.
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
Haha, this thread is funny, especially seeing as how the worst advice is from a mod. Yeah, SLI that POS and have the possibility for the worst SLI experience ever, haha.

Yeah, that's a hyper cache and is better than 3.5GB, haha. If the judges are NV fanboys they have nothing to worry about, it seems.
 
Joined
Dec 23, 2012
Messages
1,705 (0.41/day)
Location
Somewhere Over There!
System Name Gen2
Processor Ryzen R9 5950X
Motherboard Asus ROG Crosshair Viii Hero Wifi
Cooling Lian Li 360 Galahad
Memory G.Skill Trident Z RGB 64gb @ 3600 Mhz CL14-13-13-24 1T @ 1.45V
Video Card(s) Sapphire RX 6900 XT Nitro+
Storage Seagate 520 1TB + Samsung 970 Evo Plus 1TB + lots of HDD's
Display(s) Samsung Odyssey G7
Case Lian Li PC-O11D XL White
Audio Device(s) Onboard
Power Supply Super Flower Leadex SE Platinum 1000W
Mouse Xenics Titan GX Air Wireless
Keyboard Kemove Snowfox 61
Software Main: Gentoo+Arch + Windows 11
Benchmark Scores Have tried but can't beat the leaders :)
Like others mentioned, stay with the 970 if your monitor is only up to 1080p. Heck, even my 750 Ti is already good enough for me @ 1080p. I am not an MSAA or ultra-detail junkie. High settings are good enough, as I don't see much difference going from high to ultra.
 
Joined
Oct 24, 2009
Messages
430 (0.08/day)
Location
Belgium
System Name Illidan
Processor AMD Ryzen 9 5900X
Motherboard Gigabyte B550 Aorus Pro V2
Cooling Scythe Mugen 4
Memory G.Skill Trident Z 32GB DDR4 3000MHz 14CL
Video Card(s) AMD Radeon RX 6900 XT
Storage Crucial P1 1TB + Sandisk Ultra II 960GB + Samsung EVO Plus 970 2TB + F3 1TB + Toshiba X300 4TB
Display(s) Iiyama G-MASTER G4380UHSU-B1
Case Corsair 750D Airflow
Audio Device(s) Sony WH1000-XM4
Power Supply Seasonic Focus PX-850
Mouse Logitech G604
Keyboard Corsair Vengeance K70 (Cherry MX Red)
Software Windows 11 Pro
Haha, this thread is funny, especially seeing as how the worst advice is from a mod. Yeah, SLI that POS and have the possibility for the worst SLI experience ever, haha.

Yeah, that's a hyper cache and is better than 3.5GB, haha. If the judges are NV fanboys they have nothing to worry about, it seems.

When will you stop bashing Nvidia? Or stop filling entire threads with fanboy comments? You don't even own a GTX970. And by the looks of it, your Radeon card would probably struggle as well when the frame buffer is using 4GB at high resolutions (especially in the future), because you will be GPU limited. Take that from a computer engineer. There is no reason not to buy a GTX970 for 1080p (except budget-wise), and it will be future-proof (as is an R9 card) at that resolution.

I'd take my 'POS' GTX970 any day over Radeon cards. They've let me down several times as well, and far worse than Nvidia has now.
 
Joined
May 13, 2008
Messages
669 (0.11/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
The 970 should generally be fine for his needs.

In his situation, were size/energy not a concern, I would pick the 290x. Not because of the extra 512MB alone, but because it is a consistent experience with more absolute GPGPU capability. The 970 is not a bad card, and in many situations they are overall similar, but yes, the issue remains: accessing the last 0.5GB (which I really do hate repeating) does indeed cause hitching/stutter/latency/whatever you want to call it (maybe I need to make a video or something, but proof already exists many other places). Will he often, if ever, experience that at 1680x1050 or even 1080p? I don't know. Generally, I would not think so, but I also do not partake in Skyrim, let alone modded with extreme textures (when you're a Dark Souls kind of guy, Skyrim is kind of sacrilege, and FWIW those games will run decently on almost anything, even modded to a fairly significant degree, because of their last-gen console roots). The only reservation (for typical usage) I would give is using it over 1080p (or rather, expecting 1440p to not cause at least *some* issues associated with it), but that is not the case here.

I typically play games at 1440p with mine, because 1440p->4K scaled looks pretty darn good to me (it meets my diminishing-returns quota) on my TV, whereas 1080p->4K doesn't especially. I have learned to live with the less-than-optimal frame rate and the occasional hiccup associated with the buffer setup in some situations, many of which I do believe would not be there if the 970's memory system were different. It kind of sucks, but again, it was, and is, still the best card for me (and many others) at this point (for me the issue is HDMI 2.0 and the price difference for a 980).

I respect that others have different opinions or ways of seeing things, but that has been my experience.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
It took months for super-geeks to actually figure out that the card had a problem, and from what I understand they randomly found that out.

From what I know, they only figured it out because some of the monitoring tools that report memory usage were not showing more than 3.5GB in use when the same game with the same settings was using over 3.5GB on the GTX980. This led to the assumption that the card was actually only using 3.5GB. That was not the case.* It was just that the monitoring tools did not know how to read the usage of the extra 0.5GB, so they would only report 3.5GB used. That led to where we are now, and now we know the 0.5GB is used, it is just slower than the rest of the VRAM (but still way faster than accessing system RAM).

*Shadow of Mordor seems to be the only game that, for whatever reason, does not use the extra 0.5GB; instead, when it goes over 3.5GB it starts paging out to system RAM. That is why it stutters so badly with the HD texture pack. People have tried to point to SoM to show what happens when that extra 0.5GB is accessed, but that is not what happens when the extra 0.5GB is accessed; that is what happens when you start to page out to system RAM. And in my testing the stuttering is pretty much just as bad with the R9 290X 4GB (yes, I actually own them both). The HD texture pack for SoM says it requires 6GB of VRAM, and they mean it. You will get stuttering with less than that; the 4GB on the 290X does not noticeably help here.
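As an aside, for anyone who wants to see what the driver itself reports rather than trusting a game overlay, here is a minimal polling sketch using NVML's Python bindings (assuming pynvml is installed; the memory-info call is a real NVML API, but the loop, interval, and device index are just illustrative):

```python
# Minimal VRAM polling sketch using NVML's Python bindings (pynvml).
# Assumes "pip install pynvml"; the used/total figures come straight
# from the NVIDIA driver rather than a third-party overlay.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    peak_mb = 0.0
    for _ in range(60):  # sample once per second for a minute while gaming
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used/.total in bytes
        used_mb = mem.used / (1024 ** 2)
        peak_mb = max(peak_mb, used_mb)
        print(f"VRAM used: {used_mb:7.0f} MB / {mem.total / (1024 ** 2):.0f} MB")
        time.sleep(1)
    print(f"Peak VRAM usage: {peak_mb:.0f} MB")
finally:
    pynvml.nvmlShutdown()
```

On a 970 the total should read the full 4096MB either way; whether a given tool shows usage climbing past 3.5GB is exactly the reporting problem described above.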

I am sure that in the near future there will be an official or unofficial method of dealing with the problem (e.g. limit the vRAM the system can see to 3.5GB).

The thing is, the card is faster with the 0.5GB than if it just had 3.5GB. I don't understand why people think that the 0.5GB is actually hurting the card's performance. That 0.5GB is still way faster than paging out to system RAM, which is what happens when you run out of VRAM, and what causes the really noticeable stuttering. If anything, it is a buffer that helps prevent paging out to system RAM and the noticeable stuttering that comes with it. The only bad thing here was the marketing...

Yeah, SLI that POS and have the possibility for the worst SLI experience ever, haha.

And yet it is basically the best 4K option for the money.

You don't even own a GTX970.

I see so many people that don't even own the card talking like they know how it performs. Going on about how bad the stuttering is... they hear...

Take it from someone that has used both the 970 and the 290X, the 970 is the better card and the stuttering is not noticeable. The 290X was not smoother than the 970.

The whole thing with the GTX970 is that "nVidia lied" and not that "the card is crap".

That is the thing that gets me. I'm surprised they don't remember when AMD released a card, let it go through all the reviews, then put out a driver (and all future drivers) that reduced the performance of the card to stop it from dying, because the stock cooler was insufficient and cards were overheating to death...

Or when AMD released cards with 1GHz advertised core clocks that would actually drop down as low as 550MHz after only a few minutes of gaming because of thermal throttling...

But yeah, everyone thinks nVidia is so terrible for lying... they lied about how their 4GB card actually had 4GB of memory, and they said it has 64 ROPs when it actually has 64 ROPs... The only thing they really "lied" about was the amount of L2, which can basically be explained away as a misprint, and it was on a private spec sheet that wasn't even released to the public anyway, because the amount of L2 is not an advertised spec.
 
Joined
Nov 9, 2010
Messages
5,654 (1.15/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
Keep the 970 at anything up to 1080p. It will also go a bit beyond depending on the games.

On DX support, Maxwell is designed to support the upcoming virtual memory unification via CUDA 6 when it launches, and it sounds like that will come around the time DX12 launches. The next architecture though, Pascal, will have hardware-supported unified memory via NVLink.

Pascal will be out some time in 2016. I would go high-end Pascal before I'd SLI Maxwell, any day. Pascal will also have stacked DRAM, so they'll likely pack lots of memory into those cards. And they'll be small and very efficient, using a PCB that's only as long as a standard ink pen.
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
And yet it is basically the best 4K option for the money.
Not really, the 290X is a better value for 4K... Heck, at 4K the difference between the R9 290 and the GTX 970 is 1% according to TechPowerUp, while the 290 is significantly cheaper, so that would really be the best-value 4K setup.

Take it from someone that has used both the 970 and the 290X, the 970 is the better card and the stuttering is not noticeable. The 290X was not smoother than the 970.
I have used both as well, dude; the 970 is not a better card than the 290X. Really, neither of them is better, as they perform pretty much on par, with the 290X inching ahead as the resolution goes up...

That is the thing that gets me. I'm surprised they don't remember when AMD released a card, let it go through all the reviews, then put out a driver (and all future drivers) that reduced the performance of the card to stop it from dying, because the stock cooler was insufficient and cards were overheating to death...

Or when AMD released cards with 1GHz advertised core clocks that would actually drop down as low as 550MHz after only a few minutes of gaming because of thermal throttling...
Umm, no they don't... I own 3 of them, all reference coolers, and before I took the coolers off not one of them throttled down with the fans on auto in "Uber" mode, and I tested using Heaven, coin mining, and 3DMark before I was willing to take them apart for my system. "Quiet" mode caused throttling, but the fan was heavily limited... Also, the driver they released to counter any throttling fixed the fan speed variance, where one card's fan thought 55% was equal to, say, 2100 RPM versus another that thought 2400 RPM was 55% (random example). They made it all uniform by RPM instead, which resolved that issue.

When will you stop bashing Nvidia? Or stop filling entire threads with fanboy comments? You don't even own a GTX970. And by the looks of it, your Radeon card would probably struggle as well when the frame buffer is using 4GB at high resolutions (especially in the future), because you will be GPU limited. Take that from a computer engineer. There is no reason not to buy a GTX970 for 1080p (except budget-wise), and it will be future-proof (as is an R9 card) at that resolution.

I'd take my 'POS' GTX970 any day over Radeon cards. They've let me down several times as well, and far worse than Nvidia has now.
I could argue not to buy a GTX 970 at 1080p, but rather a cheaper card like the 960, 780, 770, R9 280X, 290, etc. All will give you 60 FPS at 1080p pretty easily with the eye candy turned up while costing less than a GTX 970; a 970 is more for 1440p, as it's really overkill for 1080p.

Either way, let the OP do his test and decide, since that is what he has settled on at this point. It's better for him to see it in person so he can decide for himself.
 
Joined
Oct 24, 2009
Messages
430 (0.08/day)
Location
Belgium
System Name Illidan
Processor AMD Ryzen 9 5900X
Motherboard Gigabyte B550 Aorus Pro V2
Cooling Scythe Mugen 4
Memory G.Skill Trident Z 32GB DDR4 3000MHz 14CL
Video Card(s) AMD Radeon RX 6900 XT
Storage Crucial P1 1TB + Sandisk Ultra II 960GB + Samsung EVO Plus 970 2TB + F3 1TB + Toshiba X300 4TB
Display(s) Iiyama G-MASTER G4380UHSU-B1
Case Corsair 750D Airflow
Audio Device(s) Sony WH1000-XM4
Power Supply Seasonic Focus PX-850
Mouse Logitech G604
Keyboard Corsair Vengeance K70 (Cherry MX Red)
Software Windows 11 Pro
I could argue not to buy a GTX 970 at 1080p, but rather a cheaper card like the 960, 780, 770, R9 280X, 290, etc. All will give you 60 FPS at 1080p pretty easily with the eye candy turned up while costing less than a GTX 970; a 970 is more for 1440p, as it's really overkill for 1080p.

Yes, you have a valid point there if you are on a budget. A bit of headroom is never bad, though. Plus, some features like DSR or triple monitors might be too taxing (now or in the near future). But that's not really 1080p :)
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Not really, the 290X is a better value for 4K... Heck, at 4K the difference between the R9 290 and the GTX 970 is 1% according to TechPowerUp, while the 290 is significantly cheaper, so that would really be the best-value 4K setup.

We were talking SLI. ;)

There isn't a single GPU out that is 4K capable, so talking about 4K results with single cards is pointless. You have to use at least 2 cards for adequate 4K gaming. And the 970 configuration is the best option. Price-wise the 970 and 290X are right about the same (the 290X being about $10 cheaper per card), performance-wise at 4K the two combinations are so close the difference wouldn't be noticeable, and the 970s consume half the power, put out half the heat, and are quieter. So performance-wise they are equal, cost-wise they are equal, but the 970 edges ahead on heat, power, and noise, which is why I said SLI 970s are the best combination for the money.

I have used both as well, dude; the 970 is not a better card than the 290X. Really, neither of them is better, as they perform pretty much on par, with the 290X inching ahead as the resolution goes up...

Umm, no they don't... I own 3 of them, all reference coolers, and before I took the coolers off not one of them throttled down with the fans on auto in "Uber" mode, and I tested using Heaven, coin mining, and 3DMark before I was willing to take them apart for my system. "Quiet" mode caused throttling, but the fan was heavily limited... Also, the driver they released to counter any throttling fixed the fan speed variance, where one card's fan thought 55% was equal to, say, 2100 RPM versus another that thought 2400 RPM was 55% (random example). They made it all uniform by RPM instead, which resolved that issue.

W1zzard seems to disagree with you. http://www.techpowerup.com/reviews/AMD/R9_290X/30.html

Even with the Uber BIOS, the card starts to throttle instantly. Not to mention the fan sounding like a jet engine. And this was on an open test bench. My reference 290Xs throttled like crazy in my 650D, and of the Sapphire Tri-X cards I had, the top one, with worse airflow, started to throttle too. And when that top card starts to throttle, the stuttering is way worse than going over the 3.5GB with a 970; heck, it is worse than paging out to system RAM, because it isn't a small frame hitch once in a while, it is a complete performance loss.

The driver I was talking about was not for the current generation. ;) How quickly people forget history...
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
We were talking SLI. ;)

There isn't a single GPU out that is 4K capable, so talking about 4K results with single cards is pointless. You have to use at least 2 cards for adequate 4K gaming. And the 970 configuration is the best option. Price-wise the 970 and 290X are right about the same (the 290X being about $10 cheaper per card), performance-wise at 4K the two combinations are so close the difference wouldn't be noticeable, and the 970s consume half the power, put out half the heat, and are quieter. So performance-wise they are equal, cost-wise they are equal, but the 970 edges ahead on heat, power, and noise, which is why I said SLI 970s are the best combination for the money.
I understood, but looking at single-card figures we can get a little taste of what to expect from a dual+ card configuration. Scaling is pretty close between the two and is game dependent, so it really just comes down to the choices. That said, the conclusion most people put across for 4K (right now) is to go for GTX 980s first, R9 290Xs second, and from that point it depends whether to go for 970s or R9 290s.
http://www.hardocp.com/article/2014...970_sli_4k_nv_surround_review/11#.VOqNRGc5CUk (1 Example from HARDOCP)

W1zzard seems to disagree with you. http://www.techpowerup.com/reviews/AMD/R9_290X/30.html

Even with the Uber BIOS, the card starts to throttle instantly. Not to mention the fan sounding like a jet engine. And this was on an open test bench. My reference 290Xs throttled like crazy in my 650D, and of the Sapphire Tri-X cards I had, the top one, with worse airflow, started to throttle too. And when that top card starts to throttle, the stuttering is way worse than going over the 3.5GB with a 970; heck, it is worse than paging out to system RAM, because it isn't a small frame hitch once in a while, it is a complete performance loss.

The driver I was talking about was not for the current generation. ;) How quickly people forget history...

They released the driver update pretty fast as I recall, within something like 2 weeks I think. Even so, I could run all three of mine, which start overclocked, without throttling under stress. My test bed (well, I'm not using it now, since I sold it) was in a Lanboy Air case, which had one 120mm side-intake fan, even in CFX (though I only tried two of them at a time). 60% fan speed dropped the temps down to the 80s for me on them, though again, that was in the Lanboy Air.

As far as your duo of Tri-X cards is concerned, well, they are top-down-cooler cards; you have to compensate in SLI/CFX by fixing the airflow to resolve the issue. My friend's 780 Ti rig has 2 Gigabyte Windforce cards and he had to do that as well, since it's the same scenario where heat gets pushed onto the top card and causes issues. But that's a different discussion altogether...

The driver I was talking about was not for the current generation. ;) How quickly people forget history...
Drivers are supposed to improve things over time; at release, some of the cards had issues due to the different stock fans, which all differed in some way until they were synchronized. Bugs are bugs, and it should not have been an issue at release, though driver updates are meant for that.

I don't think the GTX 970 is a bad choice for 1080p or even 1440p; heck, it's probably one of the best cards for 1440p if you want a nice single card and are concerned about your PSU or power. However, 4K is a different story, but that's not the OP's story, hence why I think it would be fruitless to give it up unless he finds it giving him problems when he gets it.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I think the funniest thing is that you link to an article to try to prove your point about 4K, and in that article he links to an article about 290X Crossfire where he talks about the cards throttling even in Uber mode... and heating his room up to uncomfortable levels...

Yeah, but they are great cards for Crossfire 4K...

Plus, in his apples-to-apples comparison in BF4 at 4K with 2xAA, the 970 SLI is a whole 3 FPS lower than the 290X Crossfire. Sorry, I'll turn off AA (which isn't really needed at 4K anyway), get over 60 FPS, and enjoy half the power consumption and heat output.

And when you start to take into account the fact that you basically can't overclock the 290X, because it just throttles anyway, reversing your overclock (and don't even think about upping the voltage on air), the 970s look even better.

Drivers are supposed to improve things over time; at release, some of the cards had issues due to the different stock fans, which all differed in some way until they were synchronized. Bugs are bugs, and it should not have been an issue at release, though driver updates are meant for that.
It still sounds like you are trying to address a driver issue with the 290 cards; I'm not talking about a 290 driver update. AMD, in the past, released a card to market (NOT the 290 series), let it go through reviews, and then, a few months after it was on the market, released a driver that slowed performance because their reference cooler wasn't good enough. It had nothing to do with the fan either; it was the VRM section, which had no airflow from the fan.
 
Joined
Mar 4, 2005
Messages
3,612 (0.52/day)
System Name TheReactor / HTPC
Processor AMD 7800x3d 5050Mhz / Intel 10700kf (5.1ghz All Core)
Motherboard ASrock x670e Taichi / ROG Strix z490-e gaming
Cooling HeatKiller VI CPU/GPU Block -2xBlackIce GTX 360 Radiators - Swiftech MCP655 Pump
Memory 32GB G.Skill 6000Mhz DDR5 / 32GB G.Skill 3400Mhz DDR4
Video Card(s) Nvidia 3090ti / Nvidia 2080ti
Storage Crucial T700 2TB Gen 5 / Samsung Evo 2Tb
Display(s) Acer Predator xb271hu - 2560x1440 @144hz
Case Corsair 550
Audio Device(s) on board
Power Supply Antec Quattro 1000W
Mouse Logitech G502
Keyboard Corsair Gaming k70
Software Windows 10 Pro 64bit
If you are not going to be upgrading your monitor anytime soon, then there is little point in spending more for a slightly better card. If you are saving money, then save up for a better monitor and worry about a video card at that time. Heck, by the time you decide to upgrade to a 1440p, 1600p, or 4K monitor, the next-gen video cards might be out.
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
I think the funniest thing is that you link to an article to try to prove your point about 4K, and in that article he links to an article about 290X Crossfire where he talks about the cards throttling even in Uber mode... and heating his room up to uncomfortable levels...

Yeah, but they are great cards for Crossfire 4K...

Plus, in his apples-to-apples comparison in BF4 at 4K with 2xAA, the 970 SLI is a whole 3 FPS lower than the 290X Crossfire. Sorry, I'll turn off AA (which isn't really needed at 4K anyway), get over 60 FPS, and enjoy half the power consumption and heat output.
Yeah, but plenty of tests for other games were showing higher, which was my point that it's the higher-performing card; plus, HardOCP are the type of people who test with settings based on the performance they deem "playable" much (if not most) of the time. As for the heating issue, well, I am sorry to ask, but what high-end gaming computer does not heat up a room after gaming intensely for a while? Even when I forced myself to downgrade to two GTX 460s in the past, they heated up once I started playing games and brought my room's temps up. That is one of the many reasons that, since my 9800GX2s, I have built custom liquid loops to compensate. As for power, the only thing I will say on that is the differences are there, but power consumption is not going to make much of an impact on usage and whatnot unless you're stressing the machine very intensely. But I won't argue the lower-power point other than saying that...

And when you start to take into account the fact that you basically can't overclock the 290X, because it just throttles anyway, reversing your overclock (and don't even think about upping the voltage on air), the 970s look even better.
Well, even on the stock cooler I was able to maintain 1100MHz (up from 1030) on the core and 1400MHz on the RAM with a +20% voltage limit. But I did bump the fan speed to 65% to keep the core clocks stable while benching them, so I would expect that the Tri-X, Windforce, Gaming, or whatever coolers could easily maintain the clock increase. Heck, you can buy cards like the Lightning from MSI that start at 1090MHz. But to really get some serious clocking power you do need liquid.

It still sounds like you are trying to address a driver issue with the 290 cards; I'm not talking about a 290 driver update. AMD, in the past, released a card to market (NOT the 290 series), let it go through reviews, and then, a few months after it was on the market, released a driver that slowed performance because their reference cooler wasn't good enough. It had nothing to do with the fan either; it was the VRM section, which had no airflow from the fan.
My apologies, I misinterpreted what you were referring to; I was still thinking we were talking about the 290X. Are you referring to the Cypress (5870) and 4870 (R700) VRM throttling issues? It's been a while since I thought about those, since I never actually owned a Cypress chip or an R700 of my own (though I saw a few and played with them). If that's not what you mean, could you elaborate on which you're referring to (not to defend, but out of curiosity)?

Either way though, both cards have their markets and are great in their own respects. I still view the 290X as the better card from a pure performance perspective, but the 970 is a very nice single GPU that holds its own among the top GPUs right now. It's a good overclocker as well, which makes for an excellent-performing card, especially as a single GPU gaming at 1440p, or as two paired with something like the ROG Swift (1440p and 144Hz). I normally look at pure performance and balance it against the other parts of the build (RAM, for instance); that's a big reason why I had GTX 580s, a GTX 295, and many others in the past: I went for pure performance, not so much power consumption or other attributes. It's my way of looking at it, and I am a firm believer in buying the best performance for your money regardless of who made the GPU.
 
Joined
Feb 20, 2015
Messages
4 (0.00/day)
Processor Intel Core i5-4690K
Motherboard ASRock Z97 PRO4 Z97
Cooling BOX
Memory HyperX Savage 2x8GB 2400MHz DDR3 CL11
Video Card(s) ?
Storage SSD Intel 530 Series 120GB & HDD Samsung HD103UJ 1TB
Display(s) Samsung SyncMaster 2043NWX
Case Zalman Z1
Audio Device(s) Onboard Realtek ALC892
Power Supply Chieftec A80 CTG-650C
OP DELIVERED:
I'm really tired after assembling all this stuff. I'll try to test some games tomorrow. :D
I had to buy a DVI-to-D-SUB adapter and the screen looks strange...
Windows 10 detects my monitor as "Generic Non-PnP Monitor" - can this affect gameplay, FPS and other stuff like that? I already set the resolution to 1680x1050. :(
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
As for the heating issue, well, I am sorry to ask, but what high-end gaming computer does not heat up a room after gaming intensely for a while?

Mine doesn't. Even when I had the SLI 970s the temperature of my room didn't go up; same with my 670s. But man, my 470s turned my room into an oven, and the 290Xs were even worse! It is winter, and I had to open the window in my room to keep from sweating to death with the 290Xs (I used to do the same thing with my 470s).

Even when I forced myself to downgrade to two GTX 460s in the past, they heated up once I started playing games and brought my room's temps up.

Of course you did, because the 460s were power-hungry heat producers, like all Fermi cards were.

That is one of the many reasons that, since my 9800GX2s, I have built custom liquid loops to compensate.

Yeah, water cooling doesn't affect room temps one bit...

As for power, the only thing I will say on that is the differences are there, but power consumption is not going to make much of an impact on usage and whatnot unless you're stressing the machine very intensely. But I won't argue the lower-power point other than saying that...

But that is one of the things you have to consider when talking about the best bang for the buck with dual cards for 4K. Power consumption is a factor because it does increase the price. And no, I'm not going to try to say you pay more in power usage, because I've said plenty of times before that it really doesn't make a difference unless you pay some insanely high power rates. However, what does make a difference is the power supply you have to buy. You can comfortably run dual 970s on a decent 650W power supply. Crossfire 290Xs require at least a decent 850W, and even that would be less comfortable than the 650W with the 970s. So the two cards might cost $20 less, but the power supply you need will cost ~$40 more.
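To put rough numbers on that, here is a back-of-the-envelope sketch; the GPU TDPs are the commonly cited board-power figures, and the CPU and rest-of-system numbers are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope PSU sizing for the two dual-card setups.
# GPU TDPs are the commonly cited board-power figures (assumptions,
# not measured draw); CPU and rest-of-system numbers are illustrative.
def estimated_draw_w(gpu_tdp_w, num_gpus=2, cpu_tdp_w=90, rest_of_system_w=75):
    """Very rough worst-case system draw under combined CPU + GPU load."""
    return gpu_tdp_w * num_gpus + cpu_tdp_w + rest_of_system_w

for name, tdp in [("GTX 970 SLI", 145), ("R9 290X CrossFire", 290)]:
    draw = estimated_draw_w(tdp)
    # A common rule of thumb: keep sustained load under ~80% of the PSU rating.
    print(f"{name}: ~{draw} W load -> ~{draw / 0.8:.0f} W PSU recommended")

# Prints roughly:
#   GTX 970 SLI: ~455 W load -> ~569 W PSU recommended
#   R9 290X CrossFire: ~745 W load -> ~931 W PSU recommended
```

Crude as it is, that lines up with the 650W-vs-850W point above.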

My apologies I misinterpreted what you were referring to, I was still thinking we were referring to the 290X. Are you referring to Cypress (5870) and the 4870 (R700) VRM throttling issue? Its been awhile since I thought about that since I never actually owned a Cypress chip or a R700 on my own (Though I saw a few and played with them). If that's not what you mean could you elaborate which your referring to (Not to defend but out of curiosity).

I'm talking about the HD4850 specifically. After all the reviews had been done, and the card had received great praise, AMD released a driver that throttled the cards at load and reduced performance, because the VRMs were overheating and burning up with the stock cooler. This was more of a design flaw in the cooler than anything else. They tried for a single-slot card when it really should have been dual-slot, because the single-slot cooler provided no airflow over the VRMs. The problem was the driver reduced performance on all cards, including ones with proper aftermarket coolers, so it really did suck. IMO, it was way worse than what nVidia did; at least nVidia was honest about the performance.

OP DELIVERED:
I'm really tired after assembling all this stuff. I'll try to test some games tomorrow. :D
I had to buy a DVI-to-D-SUB adapter and the screen looks strange...
Windows 10 detects my monitor as "Generic Non-PnP Monitor" - can this affect gameplay, FPS and other stuff like that? I already set the resolution to 1680x1050. :(

Glad to hear you got it put together!

As long as you can set the resolution and refresh rate properly, the Generic PnP monitor thing shouldn't matter. It is probably just because the monitor is older and connected over VGA, and Win10 doesn't have a built-in driver for it. Monitor drivers aren't necessary in my experience unless you aren't getting the proper resolution and refresh rates.

Edit: I just looked up your monitor and Samsung hasn't put out drivers since Vista, so that is probably why it is showing up as a Generic monitor. It shouldn't affect anything though.
 
Joined
Jan 22, 2015
Messages
143 (0.04/day)
Location
Argentina
System Name Silent Potato
Processor i5 4590 @ 3.3 / 3.7 (Stock)
Motherboard Asus z97-C
Memory 2x4GB Gskill Sniper 1866
Video Card(s) Gigabyte GTX960 Windforce
Power Supply XFX PRO650W XXX Ed.
I'm talking about the HD4850 specifically. After all the reviews had been done, and the card had received great praise, AMD released a driver that throttled the cards at load and reduced performance, because the VRMs were overheating and burning up with the stock cooler. This was more of a design flaw in the cooler than anything else. They tried for a single-slot card when it really should have been dual-slot, because the single-slot cooler provided no airflow over the VRMs. The problem was the driver reduced performance on all cards, including ones with proper aftermarket coolers, so it really did suck. IMO, it was way worse than what nVidia did; at least nVidia was honest about the performance.

IIRC the 79xx series had the same issue with bad VRM cooling; it mostly happened on reference/aftermarket blower-type coolers. The 7950 was a matter of luck, as some were failing (driver crashes, BSODs, artifacts, etc.) 4-6 months in, and others 2 years after.

Seriously, just for the sake of it, look on Google for a couple of 7950s (HIS, Gigabyte, Sapphire, etc.) and on Newegg you will see that they all have a 15-20% 1-star rating with the issues stated above. I can understand DOA/faulty cards, but that high a rate of failure after such a short span?

Wish I'd known before buying my HIS 7950 Boost. After a year it's already RMA material (no more warranty).
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Mine doesn't. Even when I had the SLI 970s the temperature of my room didn't go up; same with my 670s. But man, my 470s turned my room into an oven, and the 290Xs were even worse! It is winter, and I had to open the window in my room to keep from sweating to death with the 290Xs (I used to do the same thing with my 470s).
Mine's not bad, honestly, though I do have the ceiling fan on at least low while playing BF4 or the like. But I never counted my trio of 290Xs as unbearably hot under load. Though maybe I am more used to it than others...

Of course you did, because the 460s were power-hungry heat producers, like all Fermi cards were.
Fermi was, but I never really counted the twin 460 SEs I had as power hungry. I guess 250 watts is a bit much (just randomly looked up a review), but I still stand by what I said.

Yeah, water cooling doesn't affect room temps one bit...
In theory the same amount of heat is being dissipated, so no, it would not, but there are other things to take into account.

But that is one of the things you have to consider when talking about the best bang for the buck with dual cards for 4K. Power consumption is a factor because it does increase the price. And no, I'm not going to try to say you pay more in power usage, because I've said plenty of times before that it really doesn't make a difference unless you pay some insanely high power rates. However, what does make a difference is the power supply you have to buy. You can comfortably run dual 970s on a decent 650W power supply. Crossfire 290Xs require at least a decent 850W, and even that would be less comfortable than the 650W with the 970s. So the two cards might cost $20 less, but the power supply you need will cost ~$40 more.
I'll give you that, though I would argue many people who look at this segment already buy/have that level of PSU; but you're right that it does add to the price.

I'm talking about the HD4850 specifically. After all the reviews had been done, and the card had received great praise, AMD released a driver that throttled the cards at load and reduced performance, because the VRMs were overheating and burning up with the stock cooler. This was more of a design flaw in the cooler than anything else. They tried for a single-slot card when it really should have been dual-slot, because the single-slot cooler provided no airflow over the VRMs. The problem was the driver reduced performance on all cards, including ones with proper aftermarket coolers, so it really did suck. IMO, it was way worse than what nVidia did; at least nVidia was honest about the performance.
Ah, the 4850. Well, yes, I completely agree, as it's a similar situation of lying/cheating from a company. I had heard about the VRM overheating issue but honestly never looked too deep into it beyond a few articles. Some were pointing out that it was more about programs like FurMark, which pushed the GPUs beyond their normal limits, but it did seem to be more than just that. As for which is worse, that's kind of hard to decide; I guess lying about the VRAM is a problem because it's going to shorten its life and harm those who bought for 4K, versus ruining performance down the road and heavily limiting the card's potential. I thought you could get around the VRM throttling, though, but even if that was possible it does not make it right.
OP DELIVERED:
I'm really tired after assembling all this stuff. I'll try to test some games tomorrow. :D
I had to buy a DVI-to-D-SUB adapter and the screen looks strange...
Windows 10 detects my monitor as "Generic Non-PnP Monitor" - can this affect gameplay, FPS and other stuff like that? I already set the resolution to 1680x1050. :(
It just means Windows could not detect everything about the monitor, which happens with older monitors. As long as the resolution is right, you are fine!

Wish I'd known before buying my HIS 7950 Boost. After a year it's already RMA material (no more warranty).
HIS has a two-year warranty; you should be able to RMA it and get a replacement.
 

HammerON

The Watchful Moderator
Staff member
Joined
Mar 2, 2009
Messages
8,397 (1.52/day)
Location
Up North
System Name Threadripper
Processor 3960X
Motherboard ASUS ROG Strix TRX40-XE
Cooling XSPC Raystorm Neo (sTR4) Water Block
Memory G. Skill Trident Z Neo 64 GB 3600
Video Card(s) PNY RTX 4090
Storage Samsung 960 Pro 512 GB + WD Black SN850 1TB
Display(s) Dell 32" Curved Gaming Monitor (S3220DGF)
Case Corsair 5000D Airflow
Audio Device(s) On-board
Power Supply EVGA SuperNOVA 1000 G5
Mouse Roccat Kone Pure
Keyboard Corsair K70
Software Win 10 Pro
Benchmark Scores Always changing~
Haha, this thread is funny, especially seeing as how the worst advice is from a mod. Yeah, SLI that POS and have the possibility for the worst SLI experience ever, haha.

Yeah, that's a hyper cache and is better than 3.5GB, haha. If the judges are NV fanboys they have nothing to worry about, it seems.
So if I read your cynical post correctly, then you feel the OP should go with the R9 290X? Wouldn't it have been easier to state that than attack another's opinion?
As a user of 2 GTX 780s for over two years playing games on a 30" Dell (2560x1600), I still stand by my statement. If the OP were to want to upgrade their monitor to 4K, then I would advise against 2 GTX 970s.
Thanks again for your insight though :)
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Mine's not bad, honestly, though I do have the ceiling fan on at least low while playing BF4 or the like. But I never counted my trio of 290Xs as unbearably hot under load. Though maybe I am more used to it than others...

When I had the 470s in my main rig for the longest time (come to think of it, that was the longest I've ever owned a GPU) I was the same way; I didn't really notice the heat build-up in the room. Or I compensated for it, with the ceiling fan on high and the window open in the winter. I even went as far as having a remote thermostat put in the room so the A/C would kick on in the summer based on the room's temperature, to always keep it at least under 80°F.

When I switched out the 470s for 670s, the wife actually asked if I had done something to my computer because the room was so much cooler... When I put the 290Xs in, she first complained about the noise, then after my first night of gaming complained about the heat again and asked if I had put my old graphics cards back in. :laugh: She was happy to hear I was just testing them and they weren't permanent.

In theory the same amount of heat is being dissipated, so no, it would not, but there are other things to take into account.

Other things such as?

I'll give you that, though I would argue many people who look at this segment already buy/have that level of PSU; but you're right that it does add to the price.

I disagree. Just look at the OP. He bought a 650W power supply; there are probably a lot of people out there with decent 650-750W power supplies that could SLI 970s without a problem but couldn't Crossfire 290Xs. Not that I would SLI 970s on the OP's Chieftec unit, but I also wouldn't run a single 290X on it either...

Ah, the 4850. Well, yes, I completely agree, as it's a similar situation of lying/cheating from a company. I had heard about the VRM overheating issue but honestly never looked too deep into it beyond a few articles. Some were pointing out that it was more about programs like FurMark, which pushed the GPUs beyond their normal limits, but it did seem to be more than just that. As for which is worse, that's kind of hard to decide; I guess lying about the VRAM is a problem because it's going to shorten its life and harm those who bought for 4K, versus ruining performance down the road and heavily limiting the card's potential. I thought you could get around the VRM throttling, though, but even if that was possible it does not make it right.

It started with Furmark and expanded to other things, including 3DMark (drastically lowering scores compared to reviews) and even some high-end games. I believe Crysis was greatly affected.

I don't think the nVidia issue is anywhere near as bad as what AMD did. The nVidia thing does not affect reviews. It doesn't change the performance of the card one bit. It doesn't shorten its life at all either. 4K is as high a resolution as these cards will ever see, and the reviews show they perform very well at that resolution. The performance we saw in the reviews has not changed; the only thing that has changed is the specs on paper. They haven't gone back after the fact and changed the card in any way; the cards the reviewers got are the cards the consumers get.

AMD, on the other hand, went back after the fact and made changes that actually do negatively affect performance. The performance we saw in reviews is not the performance consumers get. Changing actual performance is way worse than changing specs on paper.
 
Joined
Oct 29, 2009
Messages
2,669 (0.50/day)
System Name Old Gateway / Steam Deck OLED LE
Processor i5 4440 3.1ghz / Jupiter 4c 8t
Motherboard Gateway / Valve
Cooling Eh it doesn't thermal throttle
Memory 2x 8GB JEDEC 1600mhz DDR3 / 16gb DDR5 6400
Video Card(s) RX 560D 4GB / Navi II 8CU
Storage 240gb 2.5 SSD / 1TB nvme
Display(s) Dell @ 1280*1024 75hz / 800p OLED
Case Gateway / Valve LE
Audio Device(s) Gateway Diamond Audio EMC2.0-USB 5375U ($15 a long ass time ago), Valve
Power Supply 380w oem / 65w valve USB-C
Mouse Purple Walmart special, 1600dpi. Black desk mat
Keyboard SteelSeries Apex 100 / virtual
VR HMD Lmao
Software Windows 10 / Steam OS
Benchmark Scores It can run Crysis (Original), Doom 2016, and Halo MCC. SD LE 45fps
I'm running HL2 Cinematic Mod with max settings and HD characters at 2560x1440, and the most VRAM I've used has been around 3.2GB; I think at one point it even hit 3.4GB. It still ran fine, up until it crashed at the level where you fight the Striders before getting to the Citadel wall, but I think that was more a level-design issue. Either way, according to GPU-Z logs it never went over 3.5GB. I got pretty good FPS the whole time too, never under 50 FPS.
 
Joined
Jan 22, 2015
Messages
143 (0.04/day)
Location
Argentina
System Name Silent Potato
Processor i5 4590 @ 3.3 / 3.7 (Stock)
Motherboard Asus z97-C
Memory 2x4GB Gskill Sniper 1866
Video Card(s) Gigabyte GTX960 Windforce
Power Supply XFX PRO650W XXX Ed.
I'm running HL2 Cinematic Mod with max settings and HD characters at 2560x1440, and the most VRAM I've used has been around 3.2GB; I think at one point it even hit 3.4GB. It still ran fine, up until it crashed at the level where you fight the Striders before getting to the Citadel wall, but I think that was more a level-design issue. Either way, according to GPU-Z logs it never went over 3.5GB. I got pretty good FPS the whole time too, never under 50 FPS.

IIRC GPU-Z won't recognize the last 0.5GB. I saw someone's screenshot going way over it, and GPU-Z showed 3556MB usage or around that. Not your case, since it didn't go above 3.4GB, but keep that in mind.
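For what it's worth, if anyone wants to pull the peak out of a GPU-Z sensor log rather than eyeballing it, here is a quick sketch. It assumes the usual comma-separated log with a column whose header contains "Memory Used"; the exact header varies between GPU-Z versions, and per the above, readings near the cap may be off anyway:

```python
# Sketch: find the peak VRAM reading in a GPU-Z sensor log.
# Assumes a comma-separated log with a "Memory Used"-style column;
# header names vary between GPU-Z versions, so adjust if needed.
import csv

def peak_vram_mb(path="GPU-Z Sensor Log.txt"):
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        reader = csv.reader(f)
        header = [h.strip() for h in next(reader)]
        col = next(i for i, h in enumerate(header) if "Memory Used" in h)
        peak = 0.0
        for row in reader:
            try:
                peak = max(peak, float(row[col].strip()))
            except (ValueError, IndexError):
                continue  # skip repeated header lines or malformed rows
        return peak

if __name__ == "__main__":
    print(f"Peak VRAM logged: {peak_vram_mb():.0f} MB")
```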
 