Wednesday, December 5th 2018

NVIDIA Unveils GeForce GTX 1070 with GDDR5X Memory

It looks like NVIDIA bought itself a mountain of unsold GDDR5X memory chips, and is now refreshing its own mountain of unsold GP104 inventory to make products more presentable to consumers in the wake of its RTX 20-series and real-time ray-tracing lure. First it was the GP104-based GTX 1060 6 GB with GDDR5X memory, and now it's the significantly faster GeForce GTX 1070 that is receiving the newer memory, with otherwise unchanged specifications. ZOTAC is among the first NVIDIA add-in card partners ready with one such card, the GTX 1070 AMP Extreme Core GDDR5X (model: ZT-P10700Q-10P).

Much like the GTX 1060 6 GB GDDR5X, this otherwise factory-overclocked ZOTAC card sticks to a memory clock speed of 8.00 GHz (effective), despite using GDDR5X memory chips rated for 10 Gbps. The card features 8 GB of it across the chip's full 256-bit memory bus. The GPU is factory-overclocked by ZOTAC to tick at 1607 MHz, with 1797 MHz GPU Boost, which is below the clock speeds of the GDDR5-based AMP Extreme SKU, which has not just a higher 1805 MHz GPU Boost frequency, but also memory overclocked to 8.20 GHz. Out of the box, this card's performance shouldn't be distinguishable from that of the GDDR5 AMP Core, but the memory alone should serve up significant overclocking headroom.
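For a quick sanity check of those numbers, here is a back-of-the-envelope sketch (assuming only the standard peak-bandwidth formula; the 8 Gbps stock rate, the 10 Gbps chip rating, and the 256-bit bus are the figures quoted above):

```python
# Peak theoretical memory bandwidth from the specs quoted above.
# bandwidth (GB/s) = effective per-pin data rate (Gbps) * bus width (bits) / 8

def memory_bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return effective_gbps * bus_width_bits / 8

# At stock, the GDDR5X card runs 8 Gbps effective on a 256-bit bus...
print(memory_bandwidth_gbs(8.0, 256))   # 256.0 GB/s, same as the GDDR5 card

# ...while the 10 Gbps chip rating hints at what an overclock could reach.
print(memory_bandwidth_gbs(10.0, 256))  # 320.0 GB/s, GTX 1080 territory
```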

78 Comments on NVIDIA Unveils GeForce GTX 1070 with GDDR5X Memory

#51
newtekie1
Semi-Retired Folder
ArbitraryAffection: There is an alternative. It's called the faster Vega 56 and 64, the former of which has sunk to, and below, MSRP here in the UK. Vega 56 can be had for £350 plus free games. It's faster than the 1070 in essentially everything, with the exception of some horrifically NVIDIA-favouring engines.
Too bad that here in the US, Vega 56 is an insanely bad value. A good 1070 with an aftermarket cooler is $335; a good Vega 56 with an aftermarket cooler is $420. Not really worth it for an unnoticeable performance improvement.
Posted on Reply
#52
jabbadap
newtekie1: Now I'm just waiting for the GTX 1070 Ti w/ GDDR5X. That I'll definitely buy!
Oh, you can buy that today; it's called the GTX 1080...
Posted on Reply
#53
newtekie1
Semi-Retired Folder
jabbadap: Oh, you can buy that today; it's called the GTX 1080...
Posted on Reply
#54
Dave65
megamanxtreme: The first thing I would be looking for is price.
150 above the 2070, because Nvidia. :shadedshu: :shadedshu:
Posted on Reply
#55
cdawall
where the hell are my stars
medi01: So, that's what Maxwell cards were, right, as that's where the perf/W of Vega roughly is?

That delusional underdog hate is so pathetic.
Is this supposed to be a positive? AMD's latest and greatest Vega GPU barely competes with the 980 Ti from 2015 in both power and performance. The rumor mill's cards are being placed in the same playing field as the 1080, which is equipped with the midrange GP104 die. It's also not a positive to be as hot as your competitor's last-generation midrange die. If you count Volta, they are easily two generations behind, and couldn't touch big Pascal, let alone big Volta.

AMD needs something here...
Posted on Reply
#56
Slizzo
Ferrum Master: What? They were like that always. Rebadged GeForce 2 MX. Remilked the 8800 GTX for a decade in various forms.

Get a grip, people. This thing has been there ever since the Radeon VE/7000 and GeForce days.

What's the fuss about it? So what?
G80 was never rebranded. You're thinking of G92, which first appeared as the 8800 GT.
Posted on Reply
#57
Assimilator
silentbogo: I'm losing my hope for humanity. We get a decent bump in memory throughput, and people rant about it like it's a bad thing.
Assuming that all the leaks so far are correct and the GDDR5X is being clocked at the same speed as the GDDR5, then no, there isn't any more throughput here... at least, not at stock.

As for ranting, that's just @ArbitraryAffection being a bell-end.
Posted on Reply
#58
cdawall
where the hell are my stars
Slizzo: G80 was never rebranded. You're thinking of G92, which first appeared as the 8800 GT.
The real story of G92 was the 8800 GTS 512, which was ranked below the 8800 GTX/Ultra, and then in the next generation was rebranded above it as the 9800 GTX/+, plus the other dozen cards it became, lol
Posted on Reply
#59
stimpy88
If this inventory stuff were real, they would lower the prices. This is nothing more than free nGreedia PR.
Posted on Reply
#60
cdawall
where the hell are my stars
stimpy88: If this inventory stuff were real, they would lower the prices. This is nothing more than free nGreedia PR.
Sales are down and manufacturing was spun up. I absolutely would try to sell at a fuller price before dropping it.
Posted on Reply
#61
rtwjunkie
PC Gaming Enthusiast
ArbitraryAffection: But it's old hat now.
And what does that matter? At the 1070 model and above, Pascal is still highly capable. Nothing wrong with it.
R0H1T: It's a good thing only Micron made them, though obviously not good for Micron.
Why? I'm quite sure that Nvidia and the AIBs already paid Micron for them. This is Nvidia's and the AIBs' way of trying not to take a bath on the overbuying they already did.
stimpy88: If this inventory stuff were real, they would lower the prices. This is nothing more than free nGreedia PR.
Oh look, another of our edgy and cool members using words like "Ngreedia". :rolleyes: It's not any more endearing than when other members use "M$".
Posted on Reply
#62
Ferrum Master
Slizzo: G80 was never rebranded. You're thinking of G92, which first appeared as the 8800 GT.
I said remilked, not rebadged. The arch remained the same except for PureVideo; G92 was used for about 30 cards, with the last being the GTS 150.

Also... back in the early days, bloody nobody whined about cards transitioning between SDR, SGRAM, DDR and so on...

Now someone is trying to make a scene out of it... so what?
Posted on Reply
#63
silentbogo
Assimilator: Assuming that all the leaks so far are correct and the GDDR5X is being clocked at the same speed as the GDDR5, then no, there isn't any more throughput here... at least, not at stock.
There is one thing that makes the difference: QDR.
GDDR5 only supports double data rate.
So, even if it's clocked the same, and even if the headroom for a theoretically crazy OC is locked behind a steel door, we still get the advantage of faster throughput.
Posted on Reply
#64
cdawall
where the hell are my stars
Ferrum Master: I said remilked, not rebadged. The arch remained the same except for PureVideo; G92 was used for about 30 cards, with the last being the GTS 150.

Also... back in the early days, bloody nobody whined about cards transitioning between SDR, SGRAM, DDR and so on...

Now someone is trying to make a scene out of it... so what?
The GT240/GTS250 came out after the 150, right? Those were G92 as well.

I want to say there was a GT 330 still based on G92 in OEM land as well.
Posted on Reply
#65
newtekie1
Semi-Retired Folder
silentbogo: There is one thing that makes the difference: QDR.
GDDR5 only supports double data rate.
So, even if it's clocked the same, and even if the headroom for a theoretically crazy OC is locked behind a steel door, we still get the advantage of faster throughput.
When he said "clocked the same" he meant effectively clocked the same. The throughput of the GDDR5X version is identical to that of the GDDR5 version.
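A minimal sketch of the arithmetic behind that (assuming the standard per-pin rates of 4 bits per command clock for GDDR5 and 8 for GDDR5X in QDR mode; the real CK/WCK clock domains are simplified here):

```python
# Same effective data rate => same throughput, regardless of DDR vs QDR.
# What QDR changes is the command clock needed to hit that rate.

BITS_PER_CK = {"GDDR5": 4, "GDDR5X (QDR)": 8}  # bits per pin per command clock

def command_clock_mhz(effective_gbps: float, mem: str) -> float:
    """Command clock (MHz) needed for a given effective per-pin rate."""
    return effective_gbps * 1000 / BITS_PER_CK[mem]

def bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth depends only on the effective rate and bus width."""
    return effective_gbps * bus_width_bits / 8

for mem in BITS_PER_CK:
    print(mem, command_clock_mhz(8.0, mem), "MHz ->",
          bandwidth_gbs(8.0, 256), "GB/s")
# GDDR5        2000.0 MHz -> 256.0 GB/s
# GDDR5X (QDR) 1000.0 MHz -> 256.0 GB/s
# Identical 256 GB/s at stock; the 10 Gbps-rated GDDR5X chips just idle at a
# low command clock, which is where the overclocking headroom comes from.
```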
Posted on Reply
#66
medi01
cdawall: AMD's latest and greatest Vega GPU ~~barely competes with~~ beats the 980 Ti from 2015 in both power and performance.
Yeah. You know, those space heaters, the Maxwell cards.
Posted on Reply
#67
jabbadap
cdawall: The GT240/GTS250 came out after the 150, right? Those were G92 as well.

I want to say there was a GT 330 still based on G92 in OEM land as well.
The 9800 GTX+/GTS 250 used the 55 nm G92b, a die shrink of the 65 nm G92. But that is ancient history, which has nothing to do with the GTX 1070 GDDR5X. /OT
Posted on Reply
#68
anubis44
Tomgang: Really, Nvidia? First overpriced 2000-series cards, and now they keep making soup from old meat.

Nvidia is not what they used to be, or else they are stuck with a lot of cards now that mining has ended its terror rounds.
On the contrary. nVidia is being EXACTLY what they've always been:
Posted on Reply
#72
Keullo-e
S.T.A.R.S.
Ferrum Master: What? They were like that always. Rebadged GeForce 2 MX.
They made the GeForce 4 MX in several models, and I didn't see a problem with that back then. And at least it had some features from the GF 4 Ti lineup.
Ferrum Master: Remilked the 8800 GTX for a decade in various forms.
No, they didn't. The only rebadge was the 8800 Ultra, which was the same card with a different cooler and higher clocks.

It was the 8800 GTS 512 which had too many rebrands (9800 GTX, 9800 GTX+, GTS 150 OEM, GTS 250), and you couldn't tell if they had the original 65 nm or the die-shrunk 55 nm chip.
Posted on Reply
#73
Slizzo
cdawall: The real story of G92 was the 8800 GTS 512, which was ranked below the 8800 GTX/Ultra, and then in the next generation was rebranded above it as the 9800 GTX/+, plus the other dozen cards it became, lol
I had a G80 8800 GTS, which I promptly returned to CompUSA, picking up an 8800 GT to replace it. The 8800 GT, despite having 112 shaders to the GTX's 128, regularly outperformed the GTX due to its much higher clock speeds.

That was the crux of why the 8800 GT was so amazing: for $300 it performed the same as, or better than, the $599-$649 8800 GTX.
Posted on Reply
#74
newtekie1
Semi-Retired Folder
Slizzo: I had a G80 8800 GTS, which I promptly returned to CompUSA, picking up an 8800 GT to replace it. The 8800 GT, despite having 112 shaders to the GTX's 128, regularly outperformed the GTX due to its much higher clock speeds.

That was the crux of why the 8800 GT was so amazing: for $300 it performed the same as, or better than, the $599-$649 8800 GTX.
Yeah, I'm pretty sure that's why they significantly upped the clock speeds on the 9800 GTX (not to mention actually using faster RAM, too).
Chloe Price: It was the 8800 GTS 512 which had too many rebrands (9800 GTX, 9800 GTX+, GTS 150 OEM, GTS 250), and you couldn't tell if they had the original 65 nm or the die-shrunk 55 nm chip.
The + in the 9800 GTX+ was there specifically to designate the 55 nm G92b version. The GTS 150 was always G92 and the GTS 250 was always G92b. So yes, you could always tell whether a card had the 65 nm or the 55 nm chip.
Posted on Reply
#75
webdigo
So I wonder if the new GTX 1070 will outperform the old GTX 1070 Ti.
Posted on Reply