
NVIDIA Unveils GeForce GTX 1070 with GDDR5X Memory

So, that's what Maxwell cards were, right? That's roughly where Vega's perf/W is.
While the "DOA space heater" line might have been pure flamebait, Maxwell is from 2014 on 28 nm, while Vega is from last year on 14 nm.
 
Thinking about space heaters: I would like a card that doesn't heat up my room when I game. Throttle it to 40°C, I say.
 
There is an alternative. It's called the faster Vega 56 and 64, the former of which has sunk to, and below, MSRP here in the UK. Vega 56 can be had for £350 + free games. It's faster than the 1070 in essentially everything, with the exception of some horrifically NVIDIA-favouring engines.

Too bad here in the US the Vega 56 is an insanely bad value. A good 1070, with an aftermarket cooler, is $335. A good Vega 56, with an aftermarket cooler, is $420. Not really worth it for an unnoticeable performance improvement.
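For what it's worth, the value argument is easy to quantify. A quick sketch in Python using the prices from this post; the performance edge is an assumed illustrative figure, not a measured one:

```python
# Price/performance check using the prices quoted above (USD).
gtx_1070 = 335  # good aftermarket 1070
vega_56  = 420  # good aftermarket Vega 56

premium = vega_56 / gtx_1070 - 1
print(f"Vega 56 price premium: {premium:.0%}")  # ~25%

# Assumed for illustration only: a mid-single-digit average performance
# edge for Vega 56 over the 1070 (it varies by title and review suite).
perf_edge = 0.05
print(f"Performance edge assumed: {perf_edge:.0%}")
# Paying ~25% more for ~5% more performance is the "bad value" argument.
```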
 
Oh, that you can buy today: it's called the GTX 1080...

 
So, that's what Maxwell cards were, right? That's roughly where Vega's perf/W is.

That delusional underdog hate is so pathetic.

Is this supposed to be a positive? AMD's latest and greatest Vega GPU barely competes with the 980 Ti from 2015 in both power and performance. The rumor mill's cards are being placed in the same playing field as the 1080, which uses the midrange GPU die (GP104). It's also not a positive to be as hot as your competitor's last-generation midrange die. If you count Volta, they are, what, two generations behind easily, and couldn't touch big Pascal, let alone big Volta.

AMD needs something here...
 
What? They were like that always. Rebadged GeForce 2 MX. Remilked the 8800 GTX for a decade in various forms.

Get a grip, people. This thing has been around since the Radeon VE/7000 and GeForce days.

What's the fuss about it? So what?

G80 was never rebranded. You're thinking of G92, which first appeared as the 8800 GT.
 
I'm losing hope for humanity. We get a decent bump in memory throughput, and people rant about it like it's a bad thing.

Assuming that all the leaks so far are correct and the GDDR5X is being clocked at the same speed as the GDDR5, then no, there isn't any more throughput here... at least, not at stock.

As for ranting, that's just @ArbitraryAffection being a bell-end.
 
G80 was never rebranded. You're thinking of G92, which first appeared as the 8800 GT.

The real story of G92 was the 8800 GTS 512, which ranked below the 8800 GTX/Ultra and then, in the next generation, was rebranded above it as the 9800 GTX/+ and the dozen other cards it became, lol.
 
If this inventory stuff was real, they would lower the prices. This is nothing more than free nGreedia PR.
 
If this inventory stuff was real, they would lower the prices. This is nothing more than free nGreedia PR.

Sales are down and manufacturing was spun up. I absolutely would try to sell at full price before dropping it.
 
But it's old hat now.
And what does that matter? At the 1070 tier and above, Pascal is still highly capable. Nothing wrong with it.

it's a good thing only Micron made them, though obviously not good for Micron.
Why? I'm quite sure that Nvidia and the AIBs already paid Micron for them. This is Nvidia's and the AIBs' way of trying not to take a bath on the overbuying they already did.

If this inventory stuff was real, they would lower the prices. This is nothing more than free nGreedia PR.
Oh look, another of our edgy and cool members using words like “Ngreedia”. :rolleyes: It’s not any more endearing than when other members use “M$“.
 
G80 was never rebranded. You're thinking of G92, which first appeared as the 8800 GT.

I said remilked, not rebadged. The architecture remained the same except for PureVideo; G92 was used for about 30 cards... with the last being the GTS 150.

Also... back in the early days, bloody nobody whined about cards transitioning between SDR, SGRAM, DDR and so on...

Now someone is trying to make a scene out of it... so what?
 
Assuming that all the leaks so far are correct and the GDDR5X is being clocked at the same speed as the GDDR5, then no, there isn't any more throughput here... at least, not at stock.
There is one thing that makes the difference: QDR.
GDDR5 only supports double data rate.
So, even if it's clocked the same, even if the headroom for theoretically crazy OC is locked behind a steel door, we still get the advantage of faster throughput.
 
I said remilked, not rebadged. The architecture remained the same except for PureVideo; G92 was used for about 30 cards... with the last being the GTS 150.

Also... back in the early days, bloody nobody whined about cards transitioning between SDR, SGRAM, DDR and so on...

Now someone is trying to make a scene out of it... so what?

The GT 240/GTS 250 came out after the 150, right? Those were G92 as well.

I want to say there was a GT 330 still based on G92 in OEM land as well.
 
There is one thing that makes the difference: QDR.
GDDR5 only supports double data rate.
So, even if it's clocked the same, even if the headroom for theoretically crazy OC is locked behind a steel door, we still get the advantage of faster throughput.

When he said "clocked the same", he meant effectively clocked the same. The throughput of the GDDR5X version is identical to that of the GDDR5 version.
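To put numbers on that: peak bandwidth is just the effective per-pin rate times the bus width. A minimal sketch in Python, assuming the leaked figures (8 Gbps effective on the 1070's 256-bit bus) are accurate:

```python
# Peak memory bandwidth = effective per-pin rate (Gbps) x bus width (bits) / 8.
def bandwidth_gb_s(effective_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_gbps * bus_width_bits / 8

# GTX 1070 with GDDR5: 8 Gbps effective on a 256-bit bus.
gddr5 = bandwidth_gb_s(8.0, 256)      # 256.0 GB/s

# The GDDR5X variant, per the leaks, runs the same 8 Gbps effective rate
# (QDR at a lower base clock instead of GDDR5's DDR at a higher one),
# so the throughput comes out identical at stock.
gddr5x = bandwidth_gb_s(8.0, 256)     # 256.0 GB/s

# GDDR5X only pays off if the effective rate goes up, e.g. the 10 Gbps
# it ships at on the GTX 1080.
headroom = bandwidth_gb_s(10.0, 256)  # 320.0 GB/s

print(gddr5, gddr5x, headroom)
```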
 
AMD's latest and greatest Vega GPU barely ~~competes with~~ beats the 980 Ti from 2015 in both power and performance.
Yeah. You know, those space heaters, the Maxwell cards.
 
The GT 240/GTS 250 came out after the 150, right? Those were G92 as well.

I want to say there was a GT 330 still based on G92 in OEM land as well.

The 9800 GTX+/GTS 250 used the 55 nm G92b, a die shrink of the 65 nm G92. But that is ancient history which has nothing to do with the GTX 1070 GDDR5X. /OT
 
Really, Nvidia? First overpriced 2000-series cards, and now they keep making soup from old meat.

Nvidia is not what they used to be, or else they are stuck with a lot of cards after mining ended its reign of terror.

On the contrary. nVidia is being EXACTLY what they've always been:
 
What? They were like that always. Rebadged GeForce 2 MX.
They made the GeForce 4 MX in several models; nobody saw a problem with that back then. And at least it had some features from the GF 4 Ti lineup.

Remilked the 8800 GTX for a decade in various forms.
No, they didn't. The only rebadge was the 8800 Ultra, which was the same card with a different cooler and higher clocks.

It was the 8800 GTS 512 that had too many rebrands (9800 GTX, 9800 GTX+, GTS 150 OEM, GTS 250), and you couldn't tell if they had the original 65 nm or the die-shrunk 55 nm chip.
 
The real story of G92 was the 8800 GTS 512, which ranked below the 8800 GTX/Ultra and then, in the next generation, was rebranded above it as the 9800 GTX/+ and the dozen other cards it became, lol.

I had a G80 8800 GTS, which I promptly returned to CompUSA and picked up an 8800 GT to replace it. The 8800 GT, despite having 112 shaders to the GTX's 128 shaders, regularly outperformed the GTX due to its much higher clock speeds.

That was the crux of why the 8800 GT was so amazing. For $300 it performed the same as or better than the $599-$649 8800 GTX.
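The clocks really do account for most of it. A rough back-of-the-envelope in Python, assuming the reference shader clocks (1500 MHz for the 8800 GT, 1350 MHz for the 8800 GTX):

```python
# Raw shader throughput scales with shader count x shader clock on G80/G92.
def shader_rate(shaders: int, shader_clock_mhz: int) -> float:
    """Relative shader throughput in shader-GHz."""
    return shaders * shader_clock_mhz / 1000

gt  = shader_rate(112, 1500)  # 8800 GT  -> 168.0
gtx = shader_rate(128, 1350)  # 8800 GTX -> 172.8

print(f"8800 GT has ~{gt / gtx:.0%} of the 8800 GTX's raw shader rate")
# ~97% of the shader throughput for half the price; factory-overclocked
# GTs closed (or flipped) the remaining gap in many titles.
```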
 