
NVIDIA GeForce GTX 1080 8 GB

Exactly, I had plans to go GTX 1080 (from the Nano), but given the prices (in AU) and the not-so-great improvement in AMD-backed titles, I think I'm going to wait a bit. The same seems to apply when AMD hardware runs slower in NVIDIA-backed games. Really wish there was a better way to measure performance.

Basically the whole NVIDIA vs. AMD thing is going to pan out over which vendor is backing the most titles, heh...
Wait for the Vega 10 GPU, the first true 14 nm enthusiast GPU, coming to you in October. ;) Jokes aside, I think the Nano is still very good, but if you need something better, wait for Vega.
 
Nice card, very impressive stuff.

Only thing that kinda bothers me is that it's unnecessarily long.
 
I feel sorry for AMD; even their own fanboys compare this to the 980 Ti.
 
I feel sorry for AMD; even their own fanboys compare this to the 980 Ti.

With every generation we compare the new high end with last gen's high-end single-GPU card. It seems like a reasonable card, but at around 27% better performance, even less if you compare to last gen's non-reference/overclocked cards, it's not groundbreaking. To me, this is the bare minimum improvement to expect from a new process node. Personally, I would have expected at least a 50% improvement over the 980 Ti/Fury X. They obviously left plenty in reserve for a refresh or two and the Ti versions.

The kind of people interested in spending the money on this card are 980 Ti and Fury X owners, and there's not a huge incentive to upgrade. With the temp situation, even less so; I would be waiting for a non-reference version.

It looks like GPUs are going the same way as CPUs at smaller process nodes: super low voltage at stock, but with any overclocking the temps rise really fast.
 
With every generation we compare the new high end with last gen's high-end single-GPU card. It seems like a reasonable card, but at around 27% better performance, even less if you compare to last gen's non-reference/overclocked cards, it's not groundbreaking. To me, this is the bare minimum improvement to expect from a new process node. Personally, I would have expected at least a 50% improvement over the 980 Ti/Fury X. They obviously left plenty in reserve for a refresh or two and the Ti versions.
The problem with Pascal is that it's a compute GPU architecture, whereas Maxwell was a pure gaming architecture; that's why it's not such a big jump (resources wasted on transistors that aren't helpful for games).
 
The problem is this isn't high-end Pascal; the fact that it beats everything at the moment is just the inconvenient truth, and they are charging accordingly.

We all look forward to AMD's response.
 
As a negative, I would have listed the back plate.
Being two pieces doesn't give it the needed support; in fact, the removable section is dead weight which may add to the card's distortion as time goes on.
The high temperatures under load may be a reason for the power limit, hopefully something remedied by the partners as they release cards.
AMD, you almost had it right with the Fury X, but failed to implement HBM2 in time. :shadedshu:
 
As a negative, I would have listed the back plate.
Being two pieces doesn't give it the needed support; in fact, the removable section is dead weight which may add to the card's distortion as time goes on.
The high temperatures under load may be a reason for the power limit, hopefully something remedied by the partners as they release cards.
AMD, you almost had it right with the Fury X, but failed to implement HBM2 in time. :shadedshu:

Actually, that metal shroud is quite strong and fitted firmly to the PCB, just like the non-stealth-angled one on the 780 and Titan before it. It is quite enough support, trust me. On the Founders Edition, the back plate serves no purpose but looks.
 
Actually, that metal shroud is quite strong and fitted firmly to the PCB, just like the non-stealth-angled one on the 780 and Titan before it. It is quite enough support, trust me.
I thought I had read that the fan shroud was plastic; nice if it is an alloy of some kind.
 
NVIDIA driver BS so they could claim the 1080 is more than it really is? It's brilliant marketing that NVIDIA has employed many times over now. We all know that after some time, drivers pull back performance of the previous generation as the driver focus shifts to the newer generation, right?

I personally have noticed reduced performance on my 980; Dirt Rally used to give me an average of 45 FPS at my usual settings @ 2560x1600, but now it's 32-34 FPS. That's a considerable drop.

Dirt Rally has new "Daily" races every single day, with certain tracks repeated over and over again but using different cars. So I play it every day, and noticed the difference right away since I have the Steam overlay display FPS in all titles.

If this is factually correct and verified as such, why isn't the entire Internet boiling about it, and why aren't the lawsuits flying?
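
A rough, hypothetical sketch of how a claim like the Dirt Rally drop above could be turned into comparable numbers rather than an overlay impression: it assumes you already have a CSV of per-frame times in milliseconds from whatever capture tool you use; the file names and column name here are made up for illustration.

```python
# Hypothetical sketch: summarize a frame-time CSV so the same Daily race can be
# compared across driver versions. File names and the "frametime_ms" column are
# assumptions, not from any specific tool.
import csv
import statistics

def summarize_frametimes(path, column="frametime_ms"):
    """Compute average FPS and the 1% low from a CSV of frame times in milliseconds."""
    with open(path, newline="") as f:
        times_ms = [float(row[column]) for row in csv.DictReader(f)]
    times_ms.sort()
    avg_fps = 1000.0 / statistics.mean(times_ms)
    worst_1pct = times_ms[int(len(times_ms) * 0.99):]  # slowest 1% of frames
    low_1pct_fps = 1000.0 / statistics.mean(worst_1pct)
    return avg_fps, low_1pct_fps

if __name__ == "__main__":
    # Capture the same race on the old and new driver, then compare.
    for label, path in [("old driver", "dirt_rally_old.csv"), ("new driver", "dirt_rally_new.csv")]:
        avg, low = summarize_frametimes(path)
        print(f"{label}: avg {avg:.1f} FPS, 1% low {low:.1f} FPS")
```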
 
I thought I had read that the fan shroud was plastic; nice if it is an alloy of some kind.

Hmm, I will have to go re-read! :-)
If they cheaped out instead of using an alloy shroud like they used to, that would make the $100 premium even less explainable.
 
If this is factually correct and verified as such, why isn't the entire Internet boiling about it, and why aren't the lawsuits flying?
I think it is because they can justify the changes in drivers and refer to older cards as legacy, no longer supported in new drivers.
Users can revert to known working drivers, or "upgrade" their GPU, which is a win-win for the manufacturer.
 
If this is factually correct and verified as such, why isn't the entire Internet boiling about it, and why aren't the lawsuits flying?
I dunno? :p Because you can restore past performance by changing back to older drivers? I asked questions, and made one statement about one title's performance. It's not the first time new drivers released for upcoming titles have affected other things. o_O
 
I think the 295X2 is a lot more impressive, to be honest. That thing is managing well!

Any actual benchmarks with the R9 Pro Duo in the future? I expect it to be better than the 1080 if the 295X2 is holding up this well.

I think the 295X2 is a lot more impressive, to be honest. That thing is managing well!

Any actual benchmarks with the R9 Pro Duo in the future? I expect it to be better than the 1080 if the 295X2 is holding up this well.

And I'm not trying to take away from the 1080. It's certainly a huge performance upgrade. Let's see what AMD can do in response.
 
I dunno? :p Because you can restore past performance by changing back to older drivers? I asked questions, and made one statement about one title's performance. It's not the first time new drivers released for upcoming titles have affected other things. o_O

You said they've done it many times. Dropping old cards from new drivers is one thing, but if new drivers intentionally cripple cards that aren't even old... that's a different thing entirely.
 
Then compare it to R9 380.
After all, it also ends with 80.
Makes as much sense.

Price-wise, the 1080 currently sits between the 980 Ti and the Titan X; the 980 is a different tier entirely. (789 euros, FFS)
Nobody is forcing you to buy Founders Edition. Wait a few weeks and get a cheaper custom card. It's not like every 1080 will be priced at $699 and above.
 
Wait for the Vega 10 GPU, the first true 14 nm enthusiast GPU, coming to you in October. ;) Jokes aside, I think the Nano is still very good, but if you need something better, wait for Vega.

Vega is 5 months out at a minimum. The card now is the 1080. 5 months after Vega you get daddy Pascal (almost twice the chip), which may well be GP102, tweaked for a more gaming-centric purpose. The big Pascal chip, if it clocks similarly to the 1080, will obliterate anything before it.
Worse still from your standpoint, 5 months after Vega 10 you will also have the full Vega chip.
Saying to wait half a year for Vega 10 just ignores tech market conditions. If you wait like that, you may as well wait for the 'proper' Vega chip.

For now, the FE 1080 is, by reviewer conclusions, an incredible feat but too expensive. If you can get partner OC and power-relaxed cards for the same or a lower price, then you'll have a ridiculously fast piece of kit.
In fact, rather than wait for Vega 10, people ought to just wait till June for the partner cards. I'll wait for EK to release some custom partner blocks too.
 
Then comes the Pascal refresh at 14 nm: GTX 1085, 1075, etc.
 
I've always supported the underdog for the sake of competition, so that's AMD, but this is like OM(F)G.
Incredibly fast, incredibly efficient. Well done, NVIDIA. Too bad I still sit on a 1080p monitor ... and this doesn't make much sense for me, but maybe a 1060 ... will.
 
I have to say, at 1080p (which is what I use) it's basically 40% faster than the GTX 980. That's quite significant. Also the overclock scaling: in a given game, the OC gave the user 20 additional frames per second. That's a lot. It may vary depending on the game, but still.

Btw, the GPU Boost thingie is actually a legit cheat tool. Look at it from a different perspective: in the actual games people play, the card will always heat up so much that the clock drops quite a lot, meaning you're not really getting the performance you were advertised. In a card review, the benchmark shoots the clocks up, the card is benchmarked at those clocks, and it never really gets stressed enough for GPU Boost to drastically downclock it. And between different games, it has time to cool down enough to boost back up for the next testing cycle.

I think reviewers should pre-heat the graphics card to its normal operating temperature and then run the benchmark, to give users the actual performance they'll be getting from the card. This is especially important considering how far Pascal is pushing the clocks and how that affects performance. Sure, it looks great in a review and that 40% boost over the GTX 980 is amazing, but is it really 40% in actual gaming when you play for an hour or two and the card heats up properly? I think it's not. And that is important, imo.
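
For anyone who wants to check this on their own card, here is a minimal sketch of the "pre-heat, then measure" idea: log temperature and boost clock with nvidia-smi while a game or stress test runs, and only benchmark once the clock has settled. The warm-up duration and sampling interval are arbitrary assumptions, not a reviewer methodology.

```python
# Minimal sketch: watch GPU Boost clocks settle as the card heats up, using
# nvidia-smi query fields. Run a game/benchmark in the background first.
import subprocess
import time

def read_gpu_state():
    """Return (temperature_C, sm_clock_MHz) for GPU 0 via nvidia-smi."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=temperature.gpu,clocks.sm",
        "--format=csv,noheader,nounits",
        "-i", "0",
    ], text=True)
    temp, clock = out.strip().split(", ")
    return int(temp), int(clock)

def log_warmup(minutes=15, interval_s=10):
    """Sample temperature and boost clock until the warm-up window ends."""
    samples = []
    end = time.time() + minutes * 60
    while time.time() < end:
        temp, clock = read_gpu_state()
        samples.append((time.time(), temp, clock))
        print(f"temp={temp} C  sm_clock={clock} MHz")
        time.sleep(interval_s)
    return samples

if __name__ == "__main__":
    # Once the logged clock stops falling, re-run the benchmark for "hot" numbers.
    log_warmup()
```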
 
Great review, but I do find it funny that when AMD releases a GPU with 8 GB of VRAM it gets a con for "8 GB VRAM provides no tangible benefits," yet when NVIDIA does it, suddenly it's not a con anymore. In fact, there is practically no mention of the 8 GB of VRAM beyond the title, the intro, and a single bullet in the Pros list in the conclusion. I just find that interesting. Did 8 GB suddenly become worthwhile? Are these games utilizing more than 4 GB with the 1080 when they didn't with other NVIDIA cards? I might just be reading into it too much, but I always see cons for large amounts of VRAM, and I'm wondering why it's not a con anymore. It's very interesting timing...

AMD cards with 8 GB of VRAM were all using Hawaii, which doesn't have enough horsepower to drive the higher resolutions/VR that would actually use that extra VRAM. Pascal does, which is why the 8 GB is actually useful.
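
A back-of-the-envelope sketch of that point, with my own made-up numbers rather than anything from the review: render-target memory alone grows quickly with resolution and MSAA, so extra VRAM mostly pays off once the GPU can actually drive those resolutions.

```python
# Rough illustration only: approximate color-buffer memory at different resolutions.
def render_target_mib(width, height, bytes_per_pixel=4, msaa=1, buffers=3):
    """Approximate size of color render targets: pixels x bytes x samples x buffer count."""
    return width * height * bytes_per_pixel * msaa * buffers / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mib(w, h, msaa=4):.0f} MiB for color targets at 4x MSAA")
```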
 
Just to burst any theory bubbles: SMP can only work from the same viewpoint. Imagine, in a 3D shooter, being able to use the mouse to look anywhere you want, but not being able to walk somewhere else.
They are using SMP for simultaneous stereo in VR, and each eye has a different position. One eye has 4 projections from one point and the other eye has 4 projections from another position, for a total of 8 projections.
If they are really imposing restrictions on the projection origin, it's obvious from the VR example that it's not because it's required.
However, complete and total freedom with projections could mess up the ability to partially load the game world and to use level of detail and algorithms like frustum culling or occlusion culling, but IMO that's not a reason to disable it.
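
To make the "one position, several projections per eye" idea concrete, here is a small illustrative sketch; it is not NVIDIA's actual SMP API, just plain matrices. The yaw angles, FOV, and eye separation are made-up values.

```python
# Illustration of multi-projection stereo: each eye has one position and several
# view-projection matrices that differ only by a yaw rotation (4 per eye, 8 total).
import numpy as np

def perspective(fov_y_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = (2 * far * near) / (near - far)
    m[3, 2] = -1.0
    return m

def yaw_view(eye, yaw_deg):
    """View matrix looking along a yaw-rotated forward direction from a fixed eye position."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    rot = np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]], dtype=float)
    trans = np.eye(4)
    trans[:3, 3] = -np.asarray(eye)
    return rot @ trans

def stereo_multi_projection(ipd=0.064, yaws=(-30, -10, 10, 30)):
    """Build 4 view-projection matrices per eye, each eye from its own position."""
    proj = perspective(fov_y_deg=90, aspect=1.0, near=0.1, far=100.0)
    matrices = {}
    for name, x in (("left", -ipd / 2), ("right", ipd / 2)):
        eye = (x, 0.0, 0.0)
        matrices[name] = [proj @ yaw_view(eye, yaw) for yaw in yaws]
    return matrices

if __name__ == "__main__":
    vp = stereo_multi_projection()
    print(len(vp["left"]) + len(vp["right"]), "view-projection matrices")  # 8
```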
 
You said they've done it many times. Dropping old cards from new drivers is one thing, but if new drivers intentionally cripple cards that aren't even old... that's a different thing entirely.
It's not nefarious. They reach the pinnacle of performance for a design, and then focus on the new gen. So a driver optimized for a newer gen doesn't run the older cards as well as a driver from their own generation did, especially when there are significant changes in architecture. I may have given the wrong idea in my post, but that's my sarcasm, yet again. :p They intentionally focus on the new cards, and this affects the older cards. It only sucks if you play all games as they come out and are constantly changing drivers, but if you stick to older drivers and play the same games of the same generation as the cards, there aren't any problems. So I keep updating the cards I use, and deal with it; to me it's no different than having to buy a new console every time they release one.

As for NVIDIA not causing problems by doing so, and nobody complaining about it: there are countless articles and threads on countless sites complaining of newer drivers affecting performance in specific titles or causing outright stability issues.
 
NVIDIA has set the bar high. Here comes AMD...

I hope not but I am a realist. :cry:
 