
NVIDIA GeForce GTX 980 SLI

W1zzard

NVIDIA's $550 GeForce GTX 980 shook up the high-end graphics card market, and today, we are testing two of these cards in SLI. This killer combination will let you build an Ultra HD capable gaming system, or enjoy smooth fragging with G-SYNC Surround.

 
Nice review, and take this with a grain of salt; it's a question and nothing else.

Why use an AMD driver that is known to be slower?

In my tests of your 14.6 beta driver vs the latest 14.7 RC3, the RC3 is much faster :wtf:
 
Nice review, and take this with a grain of salt; it's a question and nothing else.

Why use an AMD driver that is known to be slower?

In my tests of your 14.6 beta driver vs the latest 14.7 RC3, the RC3 is much faster :wtf:

They've also not used the 344.16 WHQL driver that is specifically for the 970/980. They're using the buggier 344.07s that have caused some issues. Let's call it a balance :laugh:
 
They've also not used the 344.16 WHQL driver that is specifically for the 970/980. They're using the buggier 344.11s that have caused some issues. Let's call it a balance :laugh:
Yeah, I don't follow Nvidia drivers so I wouldn't know. I know a few of us asked wizz some time ago what his reason for the AMD driver choice was, but I can't remember.

I'm impressed with this new card, but would I spend over a grand compared to what I have? Nope! Maybe next time around, but hey, who knows ;)
 
Why use an AMD driver that is known to be slower?

In my tests of your 14.6 beta driver vs the latest 14.7 RC3, the RC3 is much faster :wtf:
This gets asked all the time, and W1z has proven time and time again that drivers don't make a noticeable performance difference. Even the ones that were claimed to give 25%+ increases only proved to give 1-5% in his real-world tests.
 
Why use an AMD driver that is known to be slower?
the underlying reason is that rebenching on a new driver takes ages .. like 2 weeks non-stop. i'm not sure if it's really a slow driver, we'll see at the next rebench, which will be sometime in october, once new games are out. and maybe amd manages to release a whql driver by then as well.

i would like to use the latest drivers for every review too, but it's simply not possible with that many games and cards. if we had only 4 cards to compare to, in 5 games, at 3 resolutions, maybe. but i doubt anyone would want that.
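For a rough sense of the scale being described here, a back-of-the-envelope sketch; the card, game, and per-run time counts below are illustrative assumptions, not TPU's actual test matrix.

```python
# Back-of-the-envelope estimate of a full chart rebench.
# All counts below are illustrative assumptions, not TPU's real test matrix.
cards = 20            # comparison cards kept in the charts
games = 18            # titles in the test suite
resolutions = 4       # e.g. 1600x900, 1920x1080, 2560x1600, 3840x2160
minutes_per_run = 5   # setup + one benchmark pass per data point

total_runs = cards * games * resolutions
total_hours = total_runs * minutes_per_run / 60
print(f"{total_runs} runs, roughly {total_hours:.0f} hours of pure benchmarking")
# 1440 runs, roughly 120 hours -- about two weeks of full working days,
# before driver installs, reboots, retests, and chart work are counted.
```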
 
Interesting performance-lead drop-off from 1600p to 5760x1080 and then 4K when compared with the R9 295X2. At 1600p it had a 16% lead, which drops to 9% and then 7% by the time you reach 4K. If there is a dual-GPU card, it might just go neck and neck at 4K by the time all is said and done.
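For readers following along, the "lead" percentages in posts like this fall out of the ratio of two averaged frame rates; a minimal sketch with made-up FPS numbers, not figures from the review:

```python
# How a "16% lead" is derived from two average frame rates.
# The FPS values are made-up placeholders, not data from the review.
def lead_percent(fps_a: float, fps_b: float) -> float:
    """Percentage by which configuration A leads configuration B."""
    return (fps_a / fps_b - 1.0) * 100.0

print(round(lead_percent(58.0, 50.0), 1))  # 16.0 -> "a 16% lead"
print(round(lead_percent(42.8, 40.0), 1))  # 7.0  -> the lead shrinks at 4K
```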
 
I can't decide between waiting for the gtx 980 ti or grabbing another gtx 980.
 
[Crysis 3, 3840x2160 chart from the review]
This seems to be very wrong (at the same time 5760x1080 scales quite nicely - 58% above a single card).
 
Nice review, and take this with a grain of salt; it's a question and nothing else.

Why use an AMD driver that is known to be slower?

In my tests of your 14.6 beta driver vs the latest 14.7 RC3, the RC3 is much faster :wtf:


It's a question of having high FPS with massive frame drops, or having lower FPS with fewer frame drops. You'll see this more when watching videos on YouTube: right-click on the video and enable "Stats for nerds." The last beta driver would drop a large number of frames for smoother video playback; RC3 drops significantly fewer. The old 13.8 beta would drop 1 to 2 frames over 5 minutes, but the playback wasn't as smooth as it is now.

The whole point of G-SYNC is to tackle the issue of frame drops, reduce tearing, and promote better playback quality. You get a similar situation with RadeonPro's Dynamic Framerate Lock and the upcoming FreeSync. For now, I guess the GTX 970 and 980 are really dependent on G-SYNC like a crutch to have fewer frame drops.

@W1zzard,
Nice work, but I think your reviews are getting less and less thorough and informative. I'd prefer to see benches from Guru3D or others.

GTX 970 SLI has massive frame drops, and in some games, it has some video blemishes. It's safe to assume that GTX 980 in SLI has the same issues. Most likely, it's a driver issue, or an issue with this new generation of Maxwell GPUs. I'll probably wait for Maxwell Titan, or the 2015 Maxwell refresh. Anxious to see R9-300s from AMD.
 
Going to wait for the 980 Ti version, then buy two and SLI them. AMD lost my business when they made the R9 295X2 without HDMI 2.0 support and without offering a Mini DisplayPort to HDMI 2.0 @ 60 Hz adapter. Keep an eye out; someone will pick up my old 295X2 fairly cheap :-)
 
I can't decide between waiting for the gtx 980 ti or grabbing another gtx 980.

The current iteration of the GTX 980 uses a fully enabled GM204 core, so I don't really see a 980 Ti in the cards. However, big Maxwell isn't out yet, and that's going to be a monster of a card, especially if TSMC is able to crank up their 20nm process.

In my personal experience, my 980s boost to 1400 MHz+ during regular use, and under water, clocks of 1.5 GHz or more might be sustainable without any throttling. My advice to you is to get a 980 and a water block to go with it; it doesn't get much better than that :)

Although, for its price, a pair of 970s can't be beat in terms of raw performance for your money. You can get two of those cards for the price of a single 980 and a waterblock, and I bet they would beat any hypothetical 980 Ti in terms of performance :)
 
Nice review. Would be nice to see a 780 ti SLI setup in there as well and maybe 780 SLI. The prices on the 780 ti have come down substantially now and I think I'll be ordering a second MSI 780 ti for SLI.
 
Interesting performance-lead drop-off from 1600p to 5760x1080 and then 4K when compared with the R9 295X2. At 1600p it had a 16% lead, which drops to 9% and then 7% by the time you reach 4K. If there is a dual-GPU card, it might just go neck and neck at 4K by the time all is said and done.
I really wouldn't even bother including 1600x900 in any comparative metric; you're obviously running into a CPU limitation on more than a few benches...
[Crysis 3, Assassin's Creed IV, and World of Warcraft 1600x900 charts]

...there's also some quirky behaviour in SLI scaling, or the lack of it. Surround has some degree of scaling...
[Crysis 3, 5760x1080 chart]
...4K has none...
[Crysis 3, 3840x2160 chart]
Wolfenstein will crater both SLI and CrossFireX from a value perspective, and AMD's CFX suffers with Diablo 3: RoS.
 
the underlying reason is that rebenching on a new driver takes ages .. like 2 weeks non-stop. i'm not sure if it's really a slow driver, we'll see at the next rebench, which will be sometime in october, once new games are out. and maybe amd manages to release a whql driver by then as well.

i would like to use the latest drivers for every review too, but it's simply not possible with that many games and cards. if we had only 4 cards to compare to, in 5 games, at 3 resolutions, maybe. but i doubt anyone would want that.

You have a misunderstanding here. You should focus on quality rather than quantity. There is not a single well-known review site that tests more than 8-10 games; the game count should come down to 8-10 games at most. Frankly, beyond that game count the quality of the review is affected.

You should clearly display the settings using a screenshot. There is no point in testing at a certain quality setting at the highest resolution (4K) when the fastest card is below 30 FPS, e.g. Crysis 3 at 4K. If you want to keep AA settings consistent across all resolutions, at least go down to 2x AA in this game. Given that you chose to go with no AA in Batman: Arkham Origins at 4K, you should be able to test different AA settings for the same game and focus on smooth playability, at least for the main card being reviewed. Another example: you chose no AA instead of 2x AA in Far Cry 3 and Tomb Raider at 4K. Again, in Watch Dogs higher AA levels are possible but not applied, especially given that the fastest cards are at 50+ FPS and close to 60 FPS.

Labeling of charts is inconsistent. E.g., Crysis 3 has no AA mentioned for 5760x1080 and 4K, but the performance indicates that AA is being applied at these resolutions.

Games like Wolfenstein: The New Order need to be removed from the test suite, as they are not popular and neither Nvidia's nor AMD's multi-GPU solutions work in them. Diablo 3 can be removed too, as it runs at very high FPS even on a single GPU and is not a good game to really stress these powerhouse GPUs. Ideally, pick games that are both popular and really stress the GPU, like Watch Dogs, BF4, and AC4. For multi-GPU, focus on just 2560x1600, 5760x1080, and 3840x2160.

There seems to be a lot of effort but less thought going into your reviews. Definitely use the latest available driver for the review, otherwise your review could be misleading. Both Nvidia and AMD introduce performance improvements and/or stability fixes in their latest drivers.
 
Power draw has to be lower because even if we were to shabbily multiply the power draw of a single GTX 980 by two (which is really the worst-case scenario), power draw would still be lower than with a single R9 295X2, by a staggering 100W.
Hold up a sec: two 980s would only draw 350 W, maybe up to 400 W. AMD said the TDP of the 295X2 was 500 W, but most reviewers saw more like 600 W in testing, and a 290X generally pulls 300 W. So really the power difference is 200-250 W.
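A quick sanity check of that arithmetic, using the round numbers thrown around in this thread rather than measured figures:

```python
# Sanity check of the power-draw comparison, using the round numbers
# from this thread rather than measured figures.
gtx980_sli = (350, 400)    # claimed draw range for two GTX 980s, in watts
r9_295x2 = (500, 600)      # claimed TDP vs. observed draw for the R9 295X2, in watts

smallest_gap = r9_295x2[0] - gtx980_sli[1]   # best case for the 295X2
largest_gap = r9_295x2[1] - gtx980_sli[0]    # worst case for the 295X2
print(f"Difference: {smallest_gap}-{largest_gap} W")   # Difference: 100-250 W
```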
 
Test the 290X or 295X2 with Mantle; it runs so much better than DX11 that I don't understand these benchmarks. Unless you want the results to show Nvidia on top.

Nobody with a 290X, or two of them, will use DX11 unless they want to record using two GPUs; XSplit works with one in borderless mode. Just compare the graphs in BF4 using "perfoverlay.drawgraph 1" and you will see which is better; it's night and day. The 290X's GCN 2.0 architecture is ideal for Mantle, especially in BF4. Other cards, especially GCN 1.0 or 1.1, might not work with Mantle so well, but the 290X is a beast with it.

Especially in multiplayer, Mantle is so much better because it can handle random data quicker, the CPU doesn't need to make as many draw calls to the GPU, and the GPU just does what it's meant to without distractions.

Singleplayer is more like a synthetic benchmark such as Fire Strike: there is no surprise about what is going to happen, all cards have been optimised for it, and it gives no real-world correlation to actual multiplayer performance, which makes up almost 99.9% of BF4 gameplay.

BF4 has had an issue with the last patch, released on July 8: Mantle CrossFire has a memory leak and degrades over time. It has been fixed in the CTE for a few months and will possibly be released next week for the full BF4 game when Final Stand comes out.

I have been playing vanilla BF4 on this last patch with CrossFire off, and it runs awesome; only CrossFire is affected, and this will be fixed soon. Mantle on the 290X with CrossFire has been playing amazingly since 14.3.

There was a Mantle CrossFire memory leak in the beginning, when Mantle first came out, if you Alt-Tabbed out of the game at the end of a round while it was loading, but DICE and AMD fixed it around 14.2/14.3, and it had been flawless until this last patch that DICE stuffed up. Looks like it will be fixed next week :)
 
Without more than four monitors (all lacking features) on the market, i.e. more UHD/4K, G-SYNC, or OpenSync-enabled monitors, none of us is going to be affected by this review, but it is very interesting (and maybe near-) future music.
 
Since 3-way 970 SLI is about the same cost ($990-$1050) as 2-way 980 SLI ($1098), I would be interested in seeing those benchmarks. So for ~$1100, which yields better performance, and by how much?
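One way to frame that comparison is cost per unit of performance. The sketch below uses the prices quoted above, while the relative-performance figures are placeholder assumptions, not benchmark results:

```python
# Cost-per-performance framing of 3x GTX 970 vs. 2x GTX 980.
# Prices come from the post above; the relative-performance numbers are
# placeholder assumptions, not benchmark results.
setups = {
    "3-way GTX 970 SLI": {"price": 1020, "perf": 2.4},  # assumed scaling vs. one 970
    "2-way GTX 980 SLI": {"price": 1098, "perf": 2.2},  # assumed scaling vs. one 970
}

for name, s in setups.items():
    print(f"{name}: ${s['price'] / s['perf']:.0f} per unit of performance")
```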
 
Obviously, still not a single card, SLI setup, or CrossFire setup can beat Crysis 3 at 4K resolution at a minimum of 60 FPS...
I am disappointed... I am sticking with 1080p!
 
Since 3-way 970 SLI is about the same cost ($990-$1050) as 2-way 980 SLI ($1098), I would be interested in seeing those benchmarks. So for ~$1100, which yields better performance, and by how much?
I would like to see this as well, for the same reasons :)
 
You have a misunderstanding here. You should focus on quality rather than quantity. There is not a single well-known review site that tests more than 8-10 games; the game count should come down to 8-10 games at most. Frankly, beyond that game count the quality of the review is affected.

You should clearly display the settings using a screenshot. There is no point in testing at a certain quality setting at the highest resolution (4K) when the fastest card is below 30 FPS, e.g. Crysis 3 at 4K. If you want to keep AA settings consistent across all resolutions, at least go down to 2x AA in this game. Given that you chose to go with no AA in Batman: Arkham Origins at 4K, you should be able to test different AA settings for the same game and focus on smooth playability, at least for the main card being reviewed. Another example: you chose no AA instead of 2x AA in Far Cry 3 and Tomb Raider at 4K. Again, in Watch Dogs higher AA levels are possible but not applied, especially given that the fastest cards are at 50+ FPS and close to 60 FPS.

Labeling of charts is inconsistent. E.g., Crysis 3 has no AA mentioned for 5760x1080 and 4K, but the performance indicates that AA is being applied at these resolutions.

Games like Wolfenstein: The New Order need to be removed from the test suite, as they are not popular and neither Nvidia's nor AMD's multi-GPU solutions work in them. Diablo 3 can be removed too, as it runs at very high FPS even on a single GPU and is not a good game to really stress these powerhouse GPUs. Ideally, pick games that are both popular and really stress the GPU, like Watch Dogs, BF4, and AC4. For multi-GPU, focus on just 2560x1600, 5760x1080, and 3840x2160.

There seems to be a lot of effort but less thought going into your reviews. Definitely use the latest available driver for the review, otherwise your review could be misleading. Both Nvidia and AMD introduce performance improvements and/or stability fixes in their latest drivers.

what utter shite.

how can any review using more titles and game engines be a bad thing?

if people wanted opinions and not apples vs apples results then we would all be watching youtube "reviews".

but i do look forward to seeing your reviews which follow those concepts in the future :)
 
Definitely use the latest available driver for the review, otherwise your review could be misleading. Both Nvidia and AMD introduce performance improvements and/or stability fixes in their latest drivers.

This is the only thing I'll respond to: he will either (as stated) have to reduce the number of games, reduce the number of cards, or have the cards on different drivers. The last one is a huge no-no; the others are what set this site apart. There are too many reviews out there that only do three cards and three games, and that is just annoying.

I mean look at this:

[Relative performance summary chart, 1920x1080]
It would be nice to have the same graph on a per-game basis, but still, this is massively useful.
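For anyone wondering how a summary chart like that could be broken out per game, here is a minimal sketch; the normalization scheme (average of per-game ratios against a baseline card) is a common approach assumed here, not necessarily TPU's exact method, and the FPS values are made up.

```python
# Sketch of a per-card "relative performance" summary: normalize each game's
# FPS to a baseline card, then average the ratios. Generic approach, assumed
# for illustration; the FPS values below are made up, not review data.
results = {
    "GTX 980 SLI": {"Crysis 3": 78, "BF4": 142, "Tomb Raider": 121},
    "R9 295X2":    {"Crysis 3": 74, "BF4": 150, "Tomb Raider": 115},
    "GTX 980":     {"Crysis 3": 45, "BF4": 82,  "Tomb Raider": 68},
}
baseline = "GTX 980"

for card, fps in results.items():
    ratios = [fps[game] / results[baseline][game] for game in fps]
    relative = sum(ratios) / len(ratios) * 100   # simple mean of per-game ratios
    print(f"{card}: {relative:.0f}%")
```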
 