Wednesday, May 23rd 2018

NVIDIA GeForce "Volta" Graphics Cards to Feature GDDR6 Memory According to SK Hynix Deal

NVIDIA's upcoming GeForce GTX graphics cards based on the "Volta" architecture could feature GDDR6 memory, according to a supply deal SK Hynix struck with NVIDIA, which sent the Korean memory manufacturer's stock price surging by 6 percent. It's not known whether GDDR6 will be deployed on all SKUs, or whether, like GDDR5X, it will be exclusive to a handful of high-end SKUs. The latest version of SK Hynix's memory catalogue points to an 8 Gb (1 GB) GDDR6 memory chip supporting speeds of up to 14 Gbps at 1.35 V, and up to 12 Gbps at 1.25 V.

Considering NVIDIA already got GDDR5X to run at 11 Gbps, it could choose the faster option. Memory capacity remains a cause for concern: if 8 Gb is the densest chip SK Hynix offers, then the fabled "GV104" (the GP104 successor), which will likely feature a 256-bit wide memory interface, can only carry up to 8 GB of memory, barring the unlikely (and costly) option of piggy-backing chips to reach 16 GB.
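For reference, the bandwidth and capacity figures above follow from simple arithmetic; here is a quick sketch (the 256-bit, one-chip-per-channel layout is the article's speculation, not a confirmed spec):

```python
# Peak memory bandwidth: per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# A 256-bit bus is 8 channels of 32 bits, one 8 Gb (1 GB) chip per channel
chips = 256 // 32
capacity_gb = chips * 1

print(peak_bandwidth_gb_s(14, 256))  # 448.0 GB/s at the 14 Gbps top bin
print(peak_bandwidth_gb_s(12, 256))  # 384.0 GB/s at the 12 Gbps, 1.25 V bin
print(capacity_gb)                   # 8 GB without piggy-backed chips
```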
Source: Appuals

55 Comments on NVIDIA GeForce "Volta" Graphics Cards to Feature GDDR6 Memory According to SK Hynix Deal

#26
iO
Let's hope not. Being based on Volta's year-old architecture and not using Samsung's superior 16 Gb 18 Gbps chips could make this a rather disappointing card...
Posted on Reply
#27
Kaapstad
jabbadapWell, Titan V has one HBM2 stack disabled, which cuts its memory bandwidth to 3/4 of the maximum available. But is Titan V more ROP-limited or bandwidth-limited?

By raw bandwidth, a next-gen Ti/Titan with a 384-bit GDDR6 bus will have more (16*384/8 = 768 GB/s vs. Titan V's 653 GB/s). But on the brink of 7 nm, I wonder what NVIDIA's plans are. I.e., do they make only GV104 on 12 nm and release it now (summer), then shrink it to 7 nm, call it Turing/Ampere, and do a full release from low end to high end (GX108, GX107, GX106, GX104, GX102, X = A or T) 6-9 months later?
Titan V has massive memory bandwidth when overclocked; no other GPU comes close.
Posted on Reply
#28
bug
iOLet's hope not. Being based on Volta's year-old architecture and not using Samsung's superior 16 Gb 18 Gbps chips could make this a rather disappointing card...
Bwahahahaha.
Posted on Reply
#29
Kaapstad
iOLet's hope not. Being based on Volta's year-old architecture and not using Samsung's superior 16 Gb 18 Gbps chips could make this a rather disappointing card...
Volta, like for like (1080 vs. 2080), is good for about a 35% performance increase. Not boring at all.
Posted on Reply
#30
jabbadap
KaapstadTitan V has massive memory bandwidth when overclocked; no other GPU comes close.
Well, the Quadro GV100 or GP100 might want to have a word about that. But yeah, you're quite right: closest to the consumer space, Titan V is the one to beat on raw bandwidth. Some Titan Xp cards overclock their memory to 12.5 Gbps, which translates to 12.5*384/8 = 600 GB/s, and even overclocked they can't match the stock bandwidth of Titan V.
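The figures traded back and forth in this thread all come from the same data-rate times bus-width arithmetic; a quick sketch (assuming Titan V's stock ~1.7 Gbps/pin on its 3072-bit HBM2 bus, which yields the 653 GB/s figure quoted above):

```python
# GB/s = effective data rate per pin (Gbps) * bus width (bits) / 8 bits per byte
bw = lambda gbps, bus_bits: gbps * bus_bits / 8

print(bw(16, 384))    # 768.0 GB/s: hypothetical 384-bit GDDR6 card at 16 Gbps
print(bw(12.5, 384))  # 600.0 GB/s: Titan Xp with memory overclocked to 12.5 Gbps
print(bw(1.7, 3072))  # ~652.8 GB/s: Titan V stock (3 active HBM2 stacks, 3072-bit)
```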
Posted on Reply
#31
iO
KaapstadVolta, like for like (1080 vs. 2080), is good for about a 35% performance increase. Not boring at all.
So basically an overclocked 1080 Ti with less VRAM and lower bandwidth, but at the same price because of expensive GDDR6.
Not boring, but also not really exciting...

But I hope Volta was just an intermediate step, and that Turing or whatever comes next has received some architectural and memory-compression improvements in the year of development since GV100 launched.
Posted on Reply
#32
T4C Fantasy
CPU & GPU DB Maintainer
MT66I don't think Pascal was after Maxwell on an older roadmap either. It's subject to change, apparently.
They did; Pascal released in 2016, and they showed Pascal and Volta on slides in 2015.
Posted on Reply
#33
efikkan
ImsochoboI do not expect NVIDIA to improve massively, but respectable improvements are much awaited.
I expect AMD to improve massively in gaming performance (like it could get worse? :) Vega 56 is the only great card in their entire lineup in terms of hardware vs. cost vs. performance (at MSRP!!!))
We all wish AMD will catch up, but first they will need a new architecture. At the moment the gap between them and the competition is increasing.
iOLet's hope not. Being based on Volta's year-old architecture and not using Samsung's superior 16 Gb 18 Gbps chips could make this a rather disappointing card...
So why is Volta disappointing? It will be by far the fastest architecture ever. You should care less about fancy specifications and more about actual performance.

It's not unlikely that "Volta" will offer a performance increase of over 30%, which isn't bad at all. But the interesting chip will be the big one, "GV102", which probably won't launch anytime soon.

The exciting part will be the potential additional features "Volta" may bring to the consumer space, such as RTX, fp16, etc.
the54thvoidIf anything, the higher resolution should hurt a memory-starved chip, just like how HBM helped Fury X at higher resolutions while it stumbled at 1080p.
Claiming Titan V is throttled by its memory requires some significant analytical proof (given it uses HBM2).
I just want to add that increasing the resolution matters relatively little to memory consumption and bandwidth usage today (if other parameters are unchanged). It mattered when GPUs had ~1 GB of memory, but not today, when even a 4K framebuffer with 8x AA would only consume about 256 MB.
In most cases the computational load grows faster than the memory bandwidth and capacity requirements. And as we've seen over the last few generations, NVIDIA has yet to release a GPU actually bottlenecked by memory, despite what some in the public believe.
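The ~256 MB figure checks out as back-of-the-envelope arithmetic, assuming a 32-bit RGBA8 color buffer and one stored color value per MSAA sample:

```python
# Rough size of a 4K color framebuffer with 8x MSAA
width, height = 3840, 2160
bytes_per_pixel = 4        # RGBA8: one byte per channel
msaa_samples = 8
size_mib = width * height * bytes_per_pixel * msaa_samples / 2**20
print(round(size_mib))     # ~253 MiB, i.e. roughly the ~256 MB cited above
```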
bugAnd it's only going to get funnier when, because of HDR, we move to 10 bits per channel. 25% more bandwidth required right off the bat.
Well, probably not.

Fragment processing is done internally at fp32 in GPUs, and many games already use HDR internally, tone-mapping it down to 8 bits per channel in post-processing.

So it really comes down to whether games will use higher-resolution assets (textures) because they display HDR; some games already use HDR textures.
Posted on Reply
#34
Caring1
UpgrayeddWouldn't that be like saying I forgot GeForce?
Ah yes, Mr. GeForce, up there with all those other great scientific names …. :rolleyes:
Posted on Reply
#35
Upgrayedd
Caring1Ah yes, Mr GeForce, up there with all those other scientific great names …. :rolleyes:
I was going by market segmentation. I mentioned strictly architectures, then got told I forgot "Tesla", the brand, which isn't an arch.
It was Ms. GeForce; do your research.
Posted on Reply
#36
T4C Fantasy
CPU & GPU DB Maintainer
UpgrayeddI was going by market segmentation. I mentioned strictly architectures, then got told I forgot "Tesla", the brand, which isn't an arch.
It was Ms. GeForce; do your research.
Tesla was an architecture, though, and it has two versions.

Curie was an architecture too, before Tesla.
Posted on Reply
#38
T4C Fantasy
CPU & GPU DB Maintainer
MT66goo.gl/images/NRfohq
And in 2015, a year before Pascal released, they included it.

If Turing were real, it would be there already.

Your slide is from 2013; it was updated with Pascal and no Volta in 2015, and later versions show both.
Posted on Reply
#39
Upgrayedd
T4C FantasyTesla was an architecture, though, and it has two versions.

Curie was an architecture too, before Tesla.
Sooo I guess it would be like changing the name GeForce to Fermi? Now that I know this, hearing Tesla and Volta in reference to the same card just sounds weird.
Posted on Reply
#40
T4C Fantasy
CPU & GPU DB Maintainer
Fahrenheit - NV4
Celsius - NV10+
Kelvin - NV20+
Rankine - NV30+
Curie - NV40/G70+
Tesla - G80+
Tesla 2.0 - GT200+
Fermi - GF100+
Fermi 2.0 - GF110+
Kepler - GK100+
Kepler 2.0 - GK200+
Maxwell - GM100+
Maxwell 2.0 - GM200+
Pascal - GP100+
Volta - GV100+
Ampere - Fake
Turing - Rumor
Posted on Reply
#41
B-Real
dwadeRIP AMD
Is that why AMD closed the gap to NVIDIA in the latest quarterly market-share figures?
Posted on Reply
#42
zmeul
that title tho .. geez H christ, TPU!

going from "to feature" in the title to "could feature"
Posted on Reply
#43
bug
zmeulthat title tho .. geez H christ, TPU!

going from "to feature" in the title to "could feature"
The title says "...to feature... according to". I don't see a contradiction.
Posted on Reply
#44
jabbadap
T4C FantasyFahrenheit - NV4
Celsius - NV10+
Kelvin - NV20+
Rankine - NV30+
Curie - NV40/G70+
Tesla - G80+
Tesla 2.0 - GT200+
Fermi - GF100+
Fermi 2.0 - GF110+
Kepler - GK100+
Kepler 2.0 - GK200+
Maxwell - GM100+
Maxwell 2.0 - GM200+
Pascal - GP100+
Volta - GV100+
Ampere - Fake
Turing - Rumor
How about Einstein? It was rumored to come after Maxwell even before Volta appeared on roadmaps (the Echelon system).
Posted on Reply
#45
Unregistered
M2BIt's gonna be very disappointing if the new flagship uses only 8 GB of memory at 12 Gbps,

but 12 GB/16 GB at 14 Gbps should be fine.
Agreed. I don't care how fast the memory or GPU is, if it's only 8GB of memory, I'll stick with my 1080Ti cards for now before upgrading. I'm already seeing upwards of 9GB of utilization in some of my games, so no way I can downgrade to 8GB from 11GB.
Posted on Edit | Reply
#46
John Naylor
Razrback16Agreed. I don't care how fast the memory or GPU is, if it's only 8GB of memory, I'll stick with my 1080Ti cards for now before upgrading. I'm already seeing upwards of 9GB of utilization in some of my games, so no way I can downgrade to 8GB from 11GB.
How's that? If you mean you're using a memory utility that measures VRAM allocation, you are not **using** that much VRAM. Let me start with an analogy: if you have a credit card with $500 charged against your $5,000 limit and you ask your bank for a car loan, your credit report will show a liability of $5,000 ... but only the $500 is actually being **used**. Same thing here. When you install a game, the install routine will look and say ... oh, he's got 11 GB, let's allocate 7 GB ... that doesn't mean it's actually **using** 7 GB.

People have been misinterpreting these utilities for ages and, if you try real hard, you can create issues, but they just don't pop up under normal usage except on rare occasions ... and when they do, it doesn't matter. What benefit is a 31% increase in fps when it takes you from an unplayable 13 fps to an unplayable 17 fps? In each case when that happens, the GPU can't handle the resolution / settings anyway. Putting more VRAM in is like putting a 2,000 pound chain on a 1,000 pound hoist.

600 Series - www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

"So, what can we glean from all of that? For one thing, with any single monitor the 2GB video card is plenty - even on the most demanding games and benchmarks out today. When you scale up to three high-res screens the games we tested were all still fine on 2GB, though maxing out some games’ settings at that resolution really needs more than a single GPU to keep smooth frame rates. "

700 series -

The original website is gone and you have to put up with the non-English narration, but the data is there. They tested the 2 GB and 4 GB 770s in some 45 games at up to 5760 x 1080. Again, they found no significant difference in performance, with the 2 GB card often beating the 4 GB one. And again, in any game where the 4 GB card showed a substantial gain over the 2 GB model, the settings and resolution rendered the game unplayable anyway. The icing on the cake was Max Payne 2, where the game would not install at 5760 x 1080 on the 2 GB card ... it did install on the 4 GB card, and when they then went back and tried the 2 GB card again, once the install routine was "tricked", the game ran with no significant decrease in fps and no decrease in playability or image quality.

900 series - www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x

"[Insert any VRAM utilization tool here] doesn't actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly, whether it's GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."

We began this article with a simple question: "Is 4GB of RAM enough for a high-end GPU?" .... Based on our results, I would say that the answer is yes — but the situation is more complex than we first envisioned. ...

First, there’s the fact that out of the fifteen games we tested, only four of could be forced to consume more than the 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this. ..... provoking this problem in current-generation titles required us to use settings that rendered the games unplayable any current GPU

For the 1000 series, I had one user give me the link below to prove that 6 GB was faster than 3 GB:

www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/26.html

The problem is that the 6 GB 1060 has 10% more shaders, so it has a faster GPU. If VRAM were in fact a contributor to performance, we could show it by looking at how much of a performance decrease the 3 GB model incurred going from 1080p to 1440p. Problem is, there is none. The 6 GB model is 6% faster at 1080p and still 6% faster at 1440p. Of course it will be an issue at 2160p ... but as before, the GPU itself is not suited for 2160p, so what's the point?

Yes ... certain games, poor console ports for example, can have issues with inadequate VRAM ... and yes, if you try hard enough, you can create issues. But the fact remains, this will not be the case in the majority of cases. Card offerings with multiple VRAM choices just have not historically shown wide variations ... exceptions exist, but they're rare. So while I'm all for having as much memory as possible paired with any GPU, having more than the card can actually benefit from doesn't serve a purpose. When doing builds we recommend the following:

Minimum for Budget Limited Builds / Recommended Minimum

1080p - 3 GB / 4 GB
1440p - 6 GB / 8 GB
2160p - 12 GB / 16 GB

Yes, no gaming-oriented card exists past the Ti's 11 GB, which is why I don't consider 4K ready for prime time. But logic dictates that if 11 GB is OK for 2160p (which has the same pixel count as four 1080p screens), then 1/4 of 11 GB should be fine for 1080p.
Posted on Reply
#47
bug
And to prove the above, just look at TPU's assessment of various games' performance: quite often, games that allocate a lot of VRAM upfront run just as well on cards with less VRAM.

Now, I'm not saying that when paying top $$$ it's OK to get a card that only just gets by. Just that the inference "less VRAM => sucky gaming" is wrong.
Posted on Reply
#48
efikkan
Razrback16Agreed. I don't care how fast the memory or GPU is, if it's only 8GB of memory, I'll stick with my 1080Ti cards for now before upgrading. I'm already seeing upwards of 9GB of utilization in some of my games, so no way I can downgrade to 8GB from 11GB.
"GTX 1180" will most likely not be a significant upgrade over GTX 1080 Ti, so I see no reason to upgrade.
And as some have already mentioned, allocated and needed memory are not the same thing. I'm pretty confident that if Nvidia decide to launch "GTX 1180" with just 8 GB, it will still be a very balanced card.
Look at the past:
GTX 580 1.5 GB
GTX 680 2 GB
GTX 780 3 GB
GTX 980 4 GB
GTX 1080 8 GB
I think GTX 1080 is ahead of the memory requirements.
Posted on Reply
#49
Totally
bugThe title says "...to feature... according to". I don't see a contradiction.
"According to SK Hynix deal, Nvidia Volta to feature GDDR6"
"According to SK Hynix deal, Nvidia Volta could feature GDDR6"

Those are not equivalent statements. One is indefinite, the other is not.
Posted on Reply
#50
bug
Totally"According to SK Hynix deal, Nvidia Volta to feature GDDR6"
"According to SK Hynix deal, Nvidia Volta could feature GDDR6"

Those are not equivalent statements. One is indefinite, the other is not.
It's still "according to Hynix". But ok, you've spotted another worldwide conspiracy to make Nvidia look better than it should. PM me your address so I can send you some candy.
Posted on Reply