Wednesday, December 5th 2018

NVIDIA Unveils GeForce GTX 1070 with GDDR5X Memory

It looks like NVIDIA bought itself a mountain of unsold GDDR5X memory chips, and is now refreshing its own mountain of unsold GP104 inventory to make the products more presentable to consumers in the wake of its RTX 20-series and its real-time ray-tracing lure. First it was the GP104-based GTX 1060 6 GB with GDDR5X memory, and now the significantly faster GeForce GTX 1070 is receiving the newer memory, with otherwise unchanged specifications. ZOTAC is among the first NVIDIA add-in card partners ready with one such card, the GTX 1070 AMP Extreme Core GDDR5X (model: ZT-P10700Q-10P).

Much like the GTX 1060 6 GB GDDR5X, this otherwise factory-overclocked ZOTAC card sticks to a memory clock speed of 8.00 GHz, despite using GDDR5X memory chips that are rated for 10 Gbps. It features 8 GB of it across the chip's full 256-bit memory bus width. ZOTAC has factory-overclocked the GPU to tick at 1607 MHz, with 1797 MHz GPU Boost, which is below the clock speeds of the GDDR5-based AMP Extreme SKU; that card has not just a higher 1805 MHz GPU Boost frequency, but also memory overclocked to 8.20 GHz. Out of the box, this card's performance shouldn't be distinguishable from that of the GDDR5-based AMP Core, but the memory alone should serve up significant overclocking headroom.
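For reference, the headline numbers translate directly into peak theoretical bandwidth: bus width in bytes times the effective per-pin data rate. A minimal sketch using the figures quoted above (the helper name is mine, and the 10 Gbps line assumes the chips could actually be pushed to their rating):

```python
# Peak memory bandwidth = (bus width in bytes) * (effective per-pin data rate)
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1070 at the stock 8 Gbps effective rate over a 256-bit bus
print(bandwidth_gb_s(256, 8.0))    # 256.0 (GB/s), same as the GDDR5 card
# If the 10 Gbps-rated GDDR5X chips could be overclocked to their rating:
print(bandwidth_gb_s(256, 10.0))   # 320.0 (GB/s)
```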

78 Comments on NVIDIA Unveils GeForce GTX 1070 with GDDR5X Memory

#51
megamanxtreme
Thinking about the space heater talk, I would like a card that doesn't heat up my room when I game. Throttle it to 40°C, I say.
Posted on Reply
#52
newtekie1
Semi-Retired Folder
ArbitraryAffection said:
There is an alternative. It's called the faster Vega 56 and 64. The former of which has sunk to, and below, MSRP here in the UK. Vega 56 can be had for £350 + free games. It's faster than 1070 in essentially everything with an exception to some horrifically NVIDIA favouring engines.
Too bad here in the US Vega 56 is an insanely bad value. A good 1070, with an aftermarket cooler, is $335. A good Vega 56, with an aftermarket cooler, is $420. Not really worth it for an unnoticeable performance improvement.
Posted on Reply
#53
jabbadap
newtekie1 said:
Now I'm just waiting for the GTX1070 Ti w/ GDDR5X. That I'll definitely buy!
Oh that you can buy today, it's called gtx1080...
Posted on Reply
#54
newtekie1
Semi-Retired Folder
jabbadap said:
Oh that you can buy today, it's called gtx1080...
Posted on Reply
#55
Dave65
megamanxtreme said:
The first thing I would be looking for is price.
$150 above the 2070, because Nvidia.:shadedshu::shadedshu:
Posted on Reply
#56
cdawall
where the hell are my stars
medi01 said:
So, that's what Maxwell cards were, right, as that's where perf/w of Vega roughly is?

That delusional underdog hate is so pathetic.
Is this supposed to be a positive? AMD's latest and greatest Vega GPU barely competes with the 980 Ti from 2015 in both power and performance. The rumor mill's cards are being placed in the same playing field as the midrange-GPU-die-equipped 1080 (GP104). It's also not a positive to be as hot as your competitor's last-generation midrange die. If you count Volta they are, what, two generations behind easily, and couldn't touch big Pascal, let alone big Volta.

AMD needs something here...
Posted on Reply
#57
Slizzo
Ferrum Master said:
What? They were like that always. Rebadged Geforce 2 MX. Remilked GTX8800 for a decade in various forms.

Get a grip people. This thing has been always there since Radeon VE or 7000 and Geforce days.

What's the fuss about it? So what?
G80 was never rebranded. You're thinking of G92, which first appeared as the 8800 GT.
Posted on Reply
#58
Assimilator
silentbogo said:
I'm losing my hope for humanity. We get a decent bump in memory throughput, and people rant about it like it's a bad thing.
Assuming that all the leaks so far are correct and the GDDR5X is being clocked at the same speed as the GDDR5, then no, there isn't any more throughput here... at least, not at stock.

As for ranting, that's just @ArbitraryAffection being a bell-end.
Posted on Reply
#59
cdawall
where the hell are my stars
Slizzo said:
G80 was never rebranded. You're thinking of G92, which first appeared as the 8800 GT.
The real story of G92 was the 8800 GTS 512, which was ranked below the 8800 GTX/Ultra and then, in the next generation, was rebranded above it as the 9800 GTX/+ and the other dozen cards it became lol
Posted on Reply
#60
stimpy88
If this inventory stuff was real, they would lower the prices. This is nothing more than free nGreedia PR.
Posted on Reply
#61
cdawall
where the hell are my stars
stimpy88 said:
If this inventory stuff was real, they would lower the prices. This is nothing more than free nGreedia PR.
Sales are down and manufacturing was spun up. I absolutely would try to sell at full price before dropping it.
Posted on Reply
#62
rtwjunkie
PC Gaming Enthusiast
ArbitraryAffection said:
But it's old hat now.
And what does that matter? At 1070 model and above, Pascal is still highly capable. Nothing wrong with it.

R0H1T said:
it's a good thing only Micron made them, though obviously not good for Micron.
Why? I’m quite sure that Nvidia and the AIBs already paid Micron for them. This is Nvidia's and the AIBs' way of trying not to take a bath on the overbuying they already did.

stimpy88 said:
If this inventory stuff was real, they would lower the prices. This is nothing more than free nGreedia PR.
Oh look, another of our edgy and cool members using words like “Ngreedia”. :rolleyes: It’s not any more endearing than when other members use “M$”.
Posted on Reply
#63
Ferrum Master
Slizzo said:
GT80 was never rebranded. You're thinking of G92, which first appeared as a GTX 8800GT.
I said remilked, not rebadged. The arch remained the same except for PureVideo; G92 was used for about 30 cards... with the last being the GTS 150.

Also... in the early days, bloody nobody whined about cards transitioning between SDR, SGRAM, DDR and so on...

Now someone is trying to make a scene out of it... so what?
Posted on Reply
#64
silentbogo
Assimilator said:
Assuming that all the leaks so far are correct and the GDDR5X is being clocked at the same speed as the GDDR5, then no, there isn't any more throughput here... at least, not at stock.
There is one thing that makes the difference: QDR.
GDDR5 only supports double data rate.
So, even if it's clocked the same, even if the headroom for a theoretically crazy OC is locked behind a steel door, we still get the advantage of faster throughput.
Posted on Reply
#65
cdawall
where the hell are my stars
Ferrum Master said:
I said remilked, not rebadged. The arch remained the same except for PureVideo; G92 was used for about 30 cards... with the last being the GTS 150.

Also... in the early days, bloody nobody whined about cards transitioning between SDR, SGRAM, DDR and so on...

Now someone is trying to make a scene out of it... so what?
GT240/GTS250 came out after the 150 right? Those were G92 as well.

I want to say there was like a gt330 still based on G92 in oem land as well
Posted on Reply
#66
newtekie1
Semi-Retired Folder
silentbogo said:
There is one thing that makes the difference: QDR.
GDDR5 only supports double data rate.
So, even if it's clocked the same, even if there headroom for theoretically crazy OC is locked behind the steel door, we still get the advantage of faster throughput.
When he said "clocked the same" he meant effectively clocked the same. The throughput of the GDDR5X version is identical to that of the GDDR5 version.
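The arithmetic behind that point: what matters is the effective per-pin data rate, i.e. interface clock times transfers per clock. GDDR5X's QDR mode moves twice as many bits per clock as GDDR5's DDR, so it reaches the same effective rate at half the interface clock, and at equal effective rates the bandwidth is identical. A quick sketch (the function name and WCK clock figures are illustrative, not taken from any datasheet):

```python
# Effective per-pin data rate = interface (WCK) clock * transfers per clock.
# GDDR5 moves 2 bits per pin per WCK cycle (DDR); GDDR5X in QDR mode moves 4.
def effective_rate_gbps(wck_ghz: float, transfers_per_clock: int) -> float:
    """Per-pin data rate in Gbps."""
    return wck_ghz * transfers_per_clock

gddr5  = effective_rate_gbps(4.0, 2)  # 4 GHz WCK, DDR -> 8 Gbps
gddr5x = effective_rate_gbps(2.0, 4)  # 2 GHz WCK, QDR -> 8 Gbps
assert gddr5 == gddr5x == 8.0  # same effective rate, identical bandwidth
```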
Posted on Reply
#67
medi01
cdawall said:
AMD latest and greatest vega gpu barely ~~competes with~~ beats the 980ti from 2015 in both power and performance.
Yeah. You know, those space heaters, the Maxwell cards.
Posted on Reply
#68
jabbadap
cdawall said:
GT240/GTS250 came out after the 150 right? Those were G92 as well.

I want to say there was like a gt330 still based on G92 in oem land as well
The 9800 GTX+/GTS 250 were the 55 nm G92b, die-shrunk from the 65 nm G92. But that is ancient history, which has nothing to do with the GTX 1070 GDDR5X. /OT
Posted on Reply
#69
anubis44
Tomgang said:
Really, Nvidia? First overpriced 2000-series cards, and now they keep making soup from old meat.

Nvidia is not what they used to be, or else they are stuck with a lot of cards after mining ended its reign of terror.
On the contrary, nVidia is being EXACTLY what they've always been:

https://www.youtube.com/watch?v=b_x1JGG4JC8

https://www.youtube.com/watch?v=H0L3OTZ13Os

https://www.youtube.com/watch?v=dE-YM_3YBm0
Posted on Reply
#73
Chloe Price
Ferrum Master said:
What? They were like that always. Rebadged Geforce 2 MX.
They made the GeForce 4 MX in several models, and nobody saw a problem with that back then. And at least it had some features from the GeForce 4 Ti lineup.
Remilked GTX8800 for a decade in various forms.
No, they didn't. The only rebadge was the 8800 Ultra, which was the same card with a different cooler and higher clocks.

It was the 8800 GTS 512 that had too many rebrands (9800 GTX, 9800 GTX+, GTS 150 OEM, GTS 250; and you couldn't tell if they had the original 65 nm or the die-shrunk 55 nm chip).
Posted on Reply
#74
Slizzo
cdawall said:
The real story of G92 was the 8800GTS 512 which was ranked below the 8800GTX/ultra and then in the next generation was rebranded above it as the 9800GTX/+ and the other dozen cards it became lol
I had a G80 8800 GTS, which I promptly returned to CompUSA, and picked up an 8800 GT to replace it. The 8800 GT, despite having 112 shaders to the GTX's 128, regularly outperformed the GTX due to its much higher clock speeds.

That was the crux of why the 8800 GT was so amazing: for $300 it was performing the same as, or better than, the $599-$649 8800 GTX.
Posted on Reply
#75
newtekie1
Semi-Retired Folder
Slizzo said:
I had a G80 8800 GTS, which I promptly returned to CompUSA, and picked up an 8800 GT to replace it. The 8800 GT, despite having 112 shaders to the GTX's 128, regularly outperformed the GTX due to its much higher clock speeds.

That was the crux of why the 8800 GT was so amazing: for $300 it was performing the same as, or better than, the $599-$649 8800 GTX.
Yeah, I'm pretty sure that is why they significantly upped the clock speeds on the 9800 GTX (not to mention actually using faster RAM too).

Chloe Price said:
It was 8800 GTS 512 which had too many rebrands (9800 GTX, 9800 GTX+, GTS 150 OEM, GTS 250 - and you couldn't tell if they had the original 65nm or die-shrink 55nm chip)
The + in the 9800 GTX+ was there specifically to designate the G92b 55 nm version. The GTS 150 was always G92 and the GTS 250 was always G92b. So, yes, you could always tell whether a card had the 65 nm or the 55 nm chip.
Posted on Reply