Tuesday, December 5th 2017

NVIDIA Unveils GeForce GTX 1070 with GDDR5X Memory

It looks like NVIDIA bought itself a mountain of unsold GDDR5X memory chips, and is now refreshing its own mountain of unsold GP104 inventory to make products more presentable to consumers in the wake of its RTX 20-series and real-time ray-tracing lure. First it was the GP104-based GTX 1060 6 GB with GDDR5X memory, and now it's the significantly faster GeForce GTX 1070 that receives the newer memory, with otherwise unchanged specifications. ZOTAC is among the first NVIDIA add-in card partners ready with one such card, the GTX 1070 AMP Extreme Core GDDR5X (model: ZT-P10700Q-10P).

Much like the GTX 1060 6 GB GDDR5X, this otherwise factory-overclocked ZOTAC card sticks to a memory clock speed of 8.00 GHz, despite using GDDR5X memory chips that are rated for 10 Gbps. It features 8 GB of memory across the chip's full 256-bit bus width. The GPU is factory-overclocked by ZOTAC to tick at 1607 MHz, with 1797 MHz GPU Boost, which is below the clock speeds of the GDDR5-based AMP Extreme SKU; that card has not just a higher 1805 MHz GPU Boost frequency, but also memory overclocked to 8.20 GHz. Out of the box, this card's performance shouldn't be distinguishable from that of the GDDR5 AMP Core, but the memory alone should serve up significant overclocking headroom.
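As a quick back-of-the-envelope illustration (ours, not ZOTAC's or NVIDIA's), that overclocking headroom is easy to quantify: peak memory bandwidth is simply the per-pin data rate multiplied by the bus width.

```python
# Peak memory bandwidth: per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.
BUS_WIDTH_BITS = 256  # GP104's full memory bus, as on this card

def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak memory bandwidth in GB/s for a given per-pin data rate."""
    return data_rate_gbps * bus_width_bits / 8

shipping = bandwidth_gbs(8.0)             # as configured out of the box
rated = bandwidth_gbs(10.0)               # what the chips are rated for
headroom = (rated - shipping) / shipping  # fraction of bandwidth left on the table

print(f"{shipping:.0f} GB/s shipping, {rated:.0f} GB/s at rated speed "
      f"({headroom:.0%} headroom)")
```

In other words, the card ships at 256 GB/s, while the 10 Gbps-rated chips could in principle deliver 320 GB/s, a 25% bandwidth uplift from memory overclocking alone.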

78 Comments on NVIDIA Unveils GeForce GTX 1070 with GDDR5X Memory

#1
cucker tarlson
at least they're not naming it gtx 1080 1920 cuda :roll:
#4
megamanxtreme
The first thing I would be looking for is price.
#5
Tomgang
Really, Nvidia? First overpriced 2000-series cards, and now they keep making soup from old meat.

Nvidia is not what they used to be, or else they're stuck with a lot of cards after mining ended its terror rounds.
#7
Ferrum Master
Tomgang said:
Really, Nvidia? First overpriced 2000-series cards, and now they keep making soup from old meat.

Nvidia is not what they used to be, or else they're stuck with a lot of cards after mining ended its terror rounds.
What? They were always like that. Rebadged the GeForce 2 MX. Re-milked the 8800 GTX for a decade in various forms.

Get a grip, people. This has been going on since the Radeon VE/7000 and GeForce days.

What's the fuss about it? So what?
#9
ArbitraryAffection
Poopscal with faster memory is still Poopscal. Except the memory isn't even faster out of the box. Huh.
#10
Frick
Fishfaced Nincompoop
ArbitraryAffection said:
Poopscal with faster memory is still Poopscal. Except the memory isn't even faster out of the box. Huh.
Nothing wrong with Pascal.
#11
the54thvoid
Frick said:
Nothing wrong with Pascal.
Yup. Been happy with mine since day one.

And as for using better memory on an old card, can someone explain why that is a bad thing? Look, here is item 'A' which we now sell with a better component. As long as price doesn't inflate.
#12
ArbitraryAffection
Frick said:
Nothing wrong with Pascal.
But it's old hat now.


*inb4 obligatory but but but RX 590 is old hat, too! comeback*

the54thvoid said:
Yup. Been happy with mine since day one.

And as for using better memory on an old card, can someone explain why that is a bad thing? Look, here is item 'A' which we now sell with a better component. As long as price doesn't inflate.
Because they obviously can't offer Turding, uh, I mean Turing for a reasonable price?
#13
Honest Abe
ArbitraryAffection said:
But it's old hat now.


*inb4 obligatory but but but RX 590 is old hat, too! comeback*
"Old hat"? Not quite. Pascal isn't obsolete yet.
#14
ensabrenoir
.....I think it's pretty smart (from a business standpoint). It's all gonna come down to price, though (used 1070s & 1080s going for $280-400 in my area). Keeping 20-series prices high with promises of the future while refreshing a proven (and still viable) product for those not willing to pony up for the newer cards, all the while eliminating back stock. Until AMD can offer an alternative.....it's gonna work.
#15
ArbitraryAffection
Honest Abe said:
"Old hat"? Not quite. Pascal isn't obsolete yet.
Heya. Did you make an account just to say that? :)

ensabrenoir said:
.....I think it's pretty smart (from a business standpoint). It's all gonna come down to price, though (used 1070s & 1080s going for $280-400 in my area). Keeping 20-series prices high with promises of the future while refreshing a proven (and still viable) product for those not willing to pony up for the newer cards, all the while eliminating back stock. Until AMD can offer an alternative.....it's gonna work.
There is an alternative. It's called the faster Vega 56 and 64, the former of which has sunk to, and below, MSRP here in the UK. Vega 56 can be had for £350 + free games. It's faster than the 1070 in essentially everything, with the exception of some horrifically NVIDIA-favouring engines.
#16
Vayra86
ArbitraryAffection said:
But it's old hat now.


*inb4 obligatory but but but RX 590 is old hat, too! comeback*


Because they obviously can't offer Turding, uh, I mean Turing for a reasonable price?
Actually, Turing made sure Pascal will be relevant for a LONG time. The only thing Turing really offers is the 2080 Ti, which is out of reach for most, so apart from some weak RTRT, there is nothing to see here.
#17
ArbitraryAffection
Vayra86 said:
Actually, Turing made sure Pascal will be relevant for a LONG time. The only thing Turing really offers is the 2080 Ti, which is out of reach for most, so apart from some weak RTRT, there is nothing to see here.
You're right, but I was just poking fun at it :p
#18
R0H1T
Like I said previously ~ Nvidia is helping offload the remaining stock of GDDR5x & it's a good thing only Micron made them, though obviously not good for Micron.
#19
I No
ArbitraryAffection said:

There is an alternative. It's called the faster Vega 56 and 64.
Hardly, it just comes down to what's available for the money, and don't dare bring FreeSync into it. While being cheaper, FreeSync panels have close to terrible QC, so there's no guarantee that a cheap $200 panel, per se, would perform the same as a $600 panel.

ArbitraryAffection said:

some horrifically NVIDIA-favouring engines.
Same can be said about Vega. Outliers aside, both cards are capable of delivering; you can't go wrong with either of them.
#20
the54thvoid
ArbitraryAffection said:
But it's old hat now.


*inb4 obligatory but but but RX 590 is old hat, too! comeback*
Judging by your small font disclaimer, you are aware of your hypocrisy?

Anyway, enough said.
#21
ensabrenoir
ArbitraryAffection said:
Heya. Did you make an account just to say that? :)


There is an alternative. It's called the faster Vega 56 and 64, the former of which has sunk to, and below, MSRP here in the UK. Vega 56 can be had for £350 + free games. It's faster than the 1070 in essentially everything, with the exception of some horrifically NVIDIA-favouring engines.
.....oh yeah, forgot about those....like a lot of people, because availability was just so horrible (in my neck of the woods). I actually saw a Vega Frontier for sale on Craigslist and I honestly had to Google it to figure out where it fell and when it was released. Granted, this is just my opinion and experience, but I'm confident I'm not alone. Brand identity does factor into buying.
#22
Vayra86
Another issue with Vega is that a lot of its AIB versions are hit or miss. If you don't have one of the great ones, you have a shitty one. And guess what, the nice ones do cost more. Then factor in the power draw gap across a few years of light gaming and *poof* price difference gone.
#23
The Quim Reaper
but the memory alone should serve up a significant overclocking headroom.
All I can say to this is: so what? Memory overclocking really only gives tiny, marginal performance gains compared to core overclocking.
#24
Assimilator
And finally the GTX 1060 GDDR5X shenanigans make sense.

It's simple: NVIDIA has stopped selling GTX 1080 GPUs to board partners, because big green wants to push Turing as hard as they can and GTX 1080 performs too close to RTX 2070 for their liking. One problem: board partners are sitting on large stocks of GTX 1080 PCBs and GTX 1080 GDDR5X memory because they wanted to make GTX 1080s (due to both ordinary demand and the crypto boom that flopped), and now NVIDIA isn't letting them, and they're understandably unhappy.

Solution: rework existing, lower-performance SKUs to use the GTX 1080 PCBs, GPUs and memory and hence deplete partners' inventories. Thus the GTX 1060 GDDR5X, which uses a cut-down GTX 1080 GPU, and now this "GTX 1070 GDDR5X". This is also why neither of these "new" SKUs take advantage of the extra speed of the GDDR5X they're using: NVIDIA doesn't want them to perform better than the GDDR5 originals (which would potentially pressure Turing, as well as make everyone who owns the GDDR5 versions of these cards unhappy).

It's a pretty ingenious solution to an (admittedly self-created) problem, and it should be a win for consumers in the know: unless NVIDIA pulls some shenanigans around overclocking, the GTX 1060 GDDR5X and "GTX 1070 GDDR5X" will easily hit 10 Gbps on their memory, which will offer a tangible, although not huge, performance gain. And the use of GTX 1080 boards, with their beefed-up power circuitry, should also allow more stable and higher core overclocks.

At the end of the day, though, board partners are still going to be losing money selling what are essentially GTX 1080s at GTX 1060/1070 prices, and I assume NVIDIA will be picking up the tab for that, so I expect to see a substantial "inventory writedown" or similar charge in NVIDIA's next quarterly results report.

Which brings up another interesting question: are the prices of Turing so high only because NVIDIA is greedy, or was this writedown cost already factored into them from the start?
#25
R0H1T
Source for the 1080 boards being used in GTX 1060 or 1070 :confused: