Wednesday, May 10th 2017

NVIDIA Announces Its Volta-based Tesla V100

Today at its GTC keynote, NVIDIA CEO Jensen Huang took the wraps off some of the features of the upcoming V100 accelerator, the Volta-based accelerator for the professional market that will likely pave the way to the company's next-generation 2000-series GeForce graphics cards. If NVIDIA continues with its product segmentation and naming scheme for the next-generation Volta architecture, we can expect to see this processor in the company's next-generation GTX 2080 Ti. Covering all the nitty-gritty details (like the new Tensor processing approach) in this piece would be impossible, but there are some things we already know from this presentation.

This chip is a beast of a processor: it packs 21 billion transistors (up from the 15.3 billion found on the P100); it's built on TSMC's 12 nm FF process (evolving from Pascal's 16 nm FF); and it measures a staggering 815 mm² (up from the P100's 610 mm²). This is such a considerable leap in die area that we can only speculate on what yields will be like for this monstrous chip, especially considering the novelty of the 12 nm process it's going to leverage. But the most interesting detail from a gaming perspective is the 5,120 CUDA cores powering the V100, out of a possible 5,376 in the whole chip design, which NVIDIA will likely reserve for a Titan Xv. These are divided into 84 Volta Streaming Multiprocessor units, each carrying 64 CUDA cores (84 × 64 = 5,376, from which NVIDIA is disabling 4 Streaming Multiprocessors, most likely for yields, which accounts for the announced 5,120). Even in this cut-down configuration, we're looking at a staggering ~43% higher pure CUDA core count than the P100's. The new V100 will offer up to 15 FP32 TFLOPS, and will still leverage a 16 GB HBM2 implementation delivering up to 900 GB/s of bandwidth (up from the P100's 721 GB/s). No details on clock speed or TDP as of yet, but we already have enough details to enable a lengthy discussion... Wouldn't you agree?
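The arithmetic behind those headline numbers can be checked quickly. The sketch below uses only figures from the announcement plus the P100's known 3,584 FP32 cores; the clock it derives is an inference back-calculated from NVIDIA's 15 TFLOPS claim, not an announced spec:

```python
# Back-of-the-envelope check of the announced V100 figures
SM_TOTAL = 84          # Volta SMs in the full GV100 design
CORES_PER_SM = 64      # FP32 CUDA cores per SM
SM_ENABLED = 80        # 4 SMs disabled, most likely for yields

full_cores = SM_TOTAL * CORES_PER_SM        # 84 * 64 = 5376
enabled_cores = SM_ENABLED * CORES_PER_SM   # 80 * 64 = 5120

# Core-count uplift over the P100 (3584 FP32 cores): ~43%
P100_CORES = 3584
uplift = enabled_cores / P100_CORES - 1

# FP32 TFLOPS = cores * 2 ops/cycle (FMA) * clock.
# Solving the ~15 TFLOPS claim for clock gives roughly 1.46 GHz
# (an inference from the claim, not an announced spec).
clock_ghz = 15e12 / (enabled_cores * 2 * 1e9)

print(full_cores, enabled_cores, f"{uplift:.1%}", f"{clock_ghz:.2f} GHz")
```

The same formula applied to the P100 (3,584 cores at its ~1.48 GHz boost clock) lands near its rated ~10.6 FP32 TFLOPS, which is a decent sanity check on the method.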

103 Comments on NVIDIA Announces Its Volta-based Tesla V100

#51
RejZoR
m0nt34870 vs GTX 280: the 4870 was $200 cheaper with a much smaller die, and was under 15% behind in performance. The 4870 was the first card to use GDDR5.
5870 (as fast as dual-GPU cards like the 4870 X2 and GTX 295) vs GTX 285: it was more power efficient. Fermi arrived late, hot, and power hungry, although faster. The 5800 series was the first to support Eyefinity, or multi-monitor gaming.
R9 290/x vs 780/Titan

Pretty much the same through the midrange as well. The masses will buy nVidia regardless of what AMD makes. Nvidia knows this, which is why they are selling midrange GPUs at higher prices. Like the 1080 launching at around $700, wasn't it?



To be fair, there was poor OpenGL performance way back in the early 2000s. It was one of the reasons the FX series did better in Doom 3 vs the Radeon 9700/9800 series. This hasn't been the case for a while, except maybe on Linux, where the open-source driver is really turning things around and is now much faster in most OpenGL scenarios than the proprietary driver.
I've played basically all OpenGL games on Radeons and never even cared that it's OpenGL and that I was supposed to get inferior performance. And you probably know by now that I love playing games at max quality...
Posted on Reply
#52
m0nt3
RejZoRI've played basically all OpenGL games on Radeons and never even cared that it's OpenGL and that I was supposed to get inferior performance. And you probably know by now that I love playing games at max quality...
I didn't mean to give the impression that OpenGL games were unplayable. I beat Doom 3 on a 9800 non-Pro at max settings, minus AA; its performance relative to the FX 5900 just wasn't what it was in DX9.
Posted on Reply
#53
Rockarola
I realize that computers work in binary, but I didn't know that it works the same way 40cm from the monitor!
Any post mentioning AMD/nVidia will look the same...fanboys screaming at each other, trolls crap posting just to get a rise and the few intelligent posts drowned out by the noise.
WHAT THE FUCK HAPPENED TO OBJECTIVITY???
/rant over
Posted on Reply
#54
ppn
the next titan X is probably going to be
GV102 5120 Cuda out of 5376
GDDR6 384 bit 12 GB HYNIX
610 sq.mm.
Posted on Reply
#55
GorbazTheDragon
Just some theory on why AMD doesn't sell too well.

AMD basically dropped all their market share in the mobile market, on both the CPU and GPU side. For many people, I think the logical choice after buying an Intel-NV laptop is an Intel-NV desktop (not to mention that no one in their right mind would have bought FX after Haswell if it was for gaming).

AMD CPUs haven't had the best rep over the last 5 years or so. Probably a lot of people (again, those who are not familiar with computers but want to game on PC) look at that and think the same applies to Radeon.
Posted on Reply
#56
Caring1
GorbazTheDragonJust some theory on why AMD doesn't sell too well.

AMD basically dropped all their market share in the mobile market, on both the CPU and GPU side. For many people, I think the logical choice after buying an Intel-NV laptop is an Intel-NV desktop (not to mention that no one in their right mind would have bought FX after Haswell if it was for gaming).

AMD CPUs haven't had the best rep over the last 5 years or so. Probably a lot of people (again, those who are not familiar with computers but want to game on PC) look at that and think the same applies to Radeon.
Hate to burst your bubble, but "those that are not familiar with computers" look at price and appearance when buying; they generally get sold the cheaper systems the salespeople want to clear from stock, and usually AMD products fill that niche.
Posted on Reply
#57
Fluffmeister
So Vega isn't even the fastest unreleased GPU anymore? Fuck sake, life is cruel.
Posted on Reply
#58
Caring1
FluffmeisterSo Vega isn't even the fastest unreleased GPU anymore? Fuck sake, life is cruel.
Theoretically it is, as this Volta-based Tesla V100 isn't a GPU as such.
Posted on Reply
#59
GorbazTheDragon
Caring1Hate to burst your bubble, but "those that are not familiar with computers" look at price and appearance when buying, they generally get sold cheaper systems the salespeople want to clear from stock, usually AMD products fill that niche.
I'd say it varies a lot; you have to remember that there is a large influx of PC gamers who want to build their own rig (the PCMR subreddit, for example, has a lot of traction). And I think a lot of people looking for real gaming rigs are buying prebuilts from sites where you can configure the system. I think if you asked them, they would tell you they sell more Intel than AMD systems; not entirely sure on the GPU side, though...
Posted on Reply
#60
Fluffmeister
Caring1Theoretically it is as this Volta based Tesla V100 isn't a GPU as such.
This place has gone mad.
Posted on Reply
#61
TheoneandonlyMrK
HoodIt's common knowledge that AMD/ATI/Radeon software and driver support sucks, for all their hardware, including CPUs and chipsets. I guess you never got the memo...oh that's right, you're the one who ignores everything negative about AMD and makes up stats that "prove" your point. Please don't stop, you are providing comic relief with your delusions. Continue obsessing over trivial BS,
so we can keep laughing...
You're confused; AMD recently updated drivers and OC software for FX and Radeon, and FX needs no CPU fix via AGESA because it hasn't got built-in killer faults ;) but, as we all know, they do update when needed. Again, BS.

Crossfire 480s with waterblocks do 4K @ 60 Hz fine on all the old stuff, and I have been using them for a year, for 565
Posted on Reply
#62
Caring1
FluffmeisterThis place has gone mad.
If you are referring to me, it's never been proven :D
Posted on Reply
#63
R4E3960FURYX
oxidizedVega still has to come out, chill your beans. And also, even if they meant to take on Volta, we all know they just misspelled Pascal ;) now let's be quiet and just wait for them.
I think RX Vega may no longer compete with the GTX 1080 Ti, since SK Hynix, the HBM2 supplier, does not meet the 409.6 GB/s target speed with only 1.6 Gbps modules.
RX Vega's core config isn't much different from my R9 Fury X's. I estimate a 15% clock-for-clock gain over the R9 Fury X GPU at most.

Compare that to Volta GV100's 5,000+ CUDA cores, 32 MB of SM memory (the high-bandwidth cache of Volta's CUDA cores), and 16 GB HBM2 @ 900 MHz.

RX Vega with 8 GB HBM2 is dead, overkilled by Volta GV100.

336 texture units, 128 ROPs, a 1,455 MHz clock, 5,120 CUDA cores. OMG!

Maybe it's an advanced T-800 chip from The Terminator.

Posted on Reply
#64
GorbazTheDragon
The other thing you need to remember with this chip is that it has many non-FP32 cores. Expect a 5K-CUDA-core Volta for gaming to be significantly smaller.
Posted on Reply
#65
alucasa
FluffmeisterThis place has gone mad.
Looks the same to me. It hasn't changed since I joined. The fanboyism has gotten worse, but not by much.
Posted on Reply
#66
RejZoR
Caring1Hate to burst your bubble, but "those that are not familiar with computers" look at price and appearance when buying, they generally get sold cheaper systems the salespeople want to clear from stock, usually AMD products fill that niche.
I've seen how that worked for normies. They were getting "recommendations" from experts, and the experts said "buy Pentium 4" even though the Athlon XP was the shit at the time. So I've kinda seen how that went. Intel was shoehorned everywhere "just because Intel". Literally.

EDIT:
Also idiots comparing an out of this world expensive Volta compute unit with a consumer gaming GPU. I don't think I'm allowed to express the words I'd like to express right now, right here...
Posted on Reply
#67
Kanan
Tech Enthusiast & Gamer
RejZoRI've seen how that worked for normies. They were getting "recommendations" from experts. And experts said "buy Pentium 4" even though Athlon XP was the shit at the time. So, I've kinda seen how that went. Intel was shoehorned everywhere "just because Intel". Literally.
Indeed, but those times are over. Ryzen has a lot of momentum, and Intel's good reputation is long gone, after heavily milking its users and chaotic things like toothpaste TIM on CPU dies, potentially to hamper overclocking and decrease the longevity of its own CPUs, forcing users to buy new stuff earlier (after seeing that Sandy Bridge was "too good"). I think AMD, at least the CPU part of the company, is way more mature than in the Athlon XP days; AMD isn't so small anymore, and the internet, along with the changed behaviour of people (almost everyone is an internet person nowadays, not just nerds and typical PC users like before), helps them big time. They are doing a way better job at PR today than 5, 10 or more years ago. I'm quite satisfied with it, and I'm a rather critical person who always criticized them for poor PR.
EDIT:
Also idiots comparing an out of this world expensive Volta compute unit with a consumer gaming GPU. I don't think I'm allowed to express the words I'd like to express right now, right here...
Right, it's annoying as fuck. Who's to blame? The news editor, but I think it was intended. I've already made multiple posts to explain that it's not a GeForce GTX GPU for consumers and never will be, and that the actual Titan "Xv" will be much smaller, probably under 500 mm² again.
Posted on Reply
#68
alucasa
Let's vote to ban the news editors.
Posted on Reply
#69
oxidized
m0nt3The GTX 280 was a bit better than the 4870 in performance, but not $200 better, which is the point; nvidia still outsold AMD, even with the GTX 400 series. All through the GTX 200-500 series AMD was more power efficient, which was a reflection of their small-die strategy, yet they still maintained performance very close to nvidia's. Most temp problems are with reference coolers. As an owner of a 4870, I don't recall going much past 80°C, even in Crossfire during BFBC2. That nvidia had the better product over the last 10 years is just not true, as I have been trying to demonstrate. In 2007 with the 8800 series, yes, hands down, no denying it; the 2900 series was not great. But from the GTX 200 to 500 series, as I stated above, it was just not so, although the 6900 series was a bit sub-par in my opinion, as it certainly lacked in AA performance compared to nvidia, as well as in tessellation. That all changed with GCN. The 7000 series was faster than anything at the time, until Kepler came out (which was the beginning of nvidia selling midrange GPUs as higher-end GPUs). After some driver updates, the 7000 series was back on par with Kepler. IMO the R9 290/290X was a big upset for nvidia, besting their $1000 card in several scenarios; unfortunately, it was plagued by the bitcoin-mining fad. A lot of market share was already lost before this point in AMD's case. They have had a bad stigma that they have not been able to overcome. Maxwell was a home run for nvidia, but was behind AMD in one aspect, asynchronous compute, which has really helped the AMD cards shine with DX12/Vulkan; it has been in place since the first gen of GCN and still isn't in Pascal, which is nothing more than a die-shrunk Maxwell plus more CUDA cores and higher clocks. So nvidia hands down for the past 10 years? No way.
AMD has been very close, and the "technologically superior" comments were made because they were achieving performance very close to nvidia's with a smaller die and lower power for most of those ten years. I am not making the case that AMD lost drastic market share because of some consumer conspiracy; I am simply pointing out that it is NOT because they were technologically inferior. However, I believe at this point Vega could be 15% faster than the Titan Xp and AMD's market share would still not grow drastically.

So please share: how did AMD do so badly? Where they failed was marketing and getting game devs on board, which they are working on correcting now. nvidia won early on, IMO, because of their The Way It's Meant to be Played marketing strategy and later on GimpWorks. You could see this loading at the start of many games back in the day. I remember seeing it at the beginning of UT2003/4 as the Skaarj busted through it; there was a mod to change it to an ATi logo.
Let alone the costs, that's the only big issue with nvidia. The Fermi refresh, aka the GTX 5xx, was better than its counterpart; you yourself said even the GTX 4xx was faster, but with all those big issues. Well, the 5xx was even faster and with none of those problems; actually, I think the 5xx was one of the most successful recent "architectures". Again, about power efficiency, I don't remember back then, but I pretty surely remember "faster" and "cooler". My old GTX 580 with reference PCB/cooler reaches 85°C now; I'm pretty sure it was under 80°C back then, even at full load. So let's recap: the 8800 series was great; the 9800 was just a rebrand, so just faster, with all the 8800 series' pros; the GTX 2xx was a better product than its counterpart, despite maybe efficiency (?); the GTX 4xx was aids; the GTX 5xx was pretty good, as it came with GTX 4xx-or-better performance but ran cooler and less power hungry, and everything else. After that, with the GTX 6xx, 7xx, and 9xx, nvidia always had the upper hand, at least for the first years after each product came out. After that, Rx 2xx/3xx "fine wine" kicked in, and those cards managed to age better than their nvidia counterparts, eventually reaching or slightly topping them, all this at least 1 year later in some cases, 2 or more in others, not to mention the poor efficiency and temperatures on those cards, which were pretty much always worse than the nvidia counterparts'. Also, now that I remember, my brother used to have an XFX 5850 in a PC I built back then, and I remember it being a pretty decent card, but it was very power hungry and the temps were not that good, just OK.
ATi/AMD has been behind for the last 10 years. I agree that they were always pretty close, incredibly so actually, having far, far less funding than nvidia; that's admirable, and they surely worked harder than nvidia, no doubt, but they were behind. Except for the GTX 4xx, in the rest of the scenarios nvidia just managed to collect more points in terms of pros vs. cons. So if a certain ATi/AMD card was more power efficient than its nvidia counterpart, it was at the same time slower, had higher temps, and had not-so-good driver support. That's why I used the term "overall": we cannot just take one pro as an example and make it matter more than the others. If an X card was faster than a Y card, but the Y card was more efficient, cheaper, ran at lower temps, and had better drivers, the Y card would be the better card overall, and this repeats with all the other pros; as long as one card has more pros than the other, that's the better product.
You wanna talk about dies? OK, let's do it. Let's talk about Polaris vs Pascal GP106. Let's see: the 480 is slower, hotter, FAR less power efficient, maybe has better drivers, is cheaper (maybe, because initially, here in Europe, I can assure you it was far easier to find a 1060 at a decent price; the 480 had insane prices and far less availability), and had less availability at launch and for the 2-3 months after launch. So which one is the better product? You can't make only price matter. If the 1060 costs more, how much more remains to be seen, but still, let's say it costs more, because we're not only talking about Europe here. What's the problem? It's the better product in almost everything. Oh, I forgot to mention that Polaris has slightly better performance in DX12 and Vulkan (maybe, because DOOM's shift in framerates isn't only related to Vulkan). The GTX 1060 is still the winner overall (I ordered an XFX RX 480 GTR Black Edition a week ago, and it should ship either today or tomorrow, so no, I'm not the fanboy you think I am).
OK, so I shared how AMD did worse in the last 10 years (no blame, given the funds). They also failed miserably in marketing, yeah. Which they are correcting now? Really? "Poor Volta-ge", anyone? "It has a brain", "it has a soul"? Really, correcting? No. Gimping cards is still another one of those legends I'd really like to see with my own eyes, because it sounds so much like BS, but no, hey, BS only comes out of nvidia's mouth.
"You could see this loading at the start of many games back in the day. I remember seeing it at the beginning of UT2003/4 as the Skaarj busted through it; there was a mod to change it to an ATi logo"
Basically ATi/AMD in a nutshell.
Posted on Reply
#70
Nokiron
RejZoRPeople keep yodeling this "bad software support" and "bad drivers" and "awful OpenGL support/performance" and in decade of owning ATi/AMD cards, I hardly ever experienced any issues, stability or performance. This too is one of the reasons why AMD just can't get any traction. People spreading misinformation and just plain lies. For reasons unknown to any logic.
What? I'm talking business and enterprise (which is the market that matters). AMD did NOT have good software support there, whether you like it or not.
Posted on Reply
#71
R4E3960FURYX
ppnthe next titan X is probably going to be
GV102 5120 Cuda out of 5376
GDDR6 384 bit 12 GB HYNIX
610 sq.mm.
The king of 2018 GPUs.
Posted on Reply
#72
medi01
12nm is a marketing term Intel would laugh at.

Clock speed is slower (expected).
800 mm² is monstrous, and so will be the price of that thing.

Some say it means that consumer products are around the corner. If so, bad for AMD, although it won't be as big a jump as some expect.
R4E3960FURYXEstimate 15% gain clock for clock from R9 Fury X GPU max.
And they use nVidia marketing slides to somehow get to a 12.5 TFLOPS figure (up from 8.6) with a mere 15% clock bump and the same number of shaders.
Genius.

You should apply for a job at wccftech dude, you got talent.
NokironIm talking business and enterprise (which is the market that matters)
Market that matters to... whom?
All but gaming market share is laughable.

So are 20-80k cars Tesla makes annually, compared to it.
Posted on Reply
#73
GorbazTheDragon
medi01And they use nVidia marketing slides to somehow get to a 12.5 TFLOPS figure (up from 8.6) with a mere 15% clock bump and the same number of shaders.
Genius.

You should apply for a job at wccftech dude, you got talent.



Market that matters to... whom?
All but gaming market share is laughable.

So are 20-80k cars Tesla makes annually, compared to it.
The strength of Volta will probably show in areas outside FP32.

And I'd encourage you to check your facts when it comes to which markets NVIDIA's chips are sold into... Last I heard, they were producing huge volumes of chips for supercomputers and processing clusters for commercial/government-funded operations. From what I've heard from friends in astronomy, they are also interested in moving to GPGPU, which will be done with NV chips. The market is most definitely there, and it is also set to grow.
Posted on Reply
#74
Nokiron
medi01Market that matters to... whom?
All but gaming market share is laughable.

So are 20-80k cars Tesla makes annually, compared to it.
For AMD. That's where the money is.
Posted on Reply
#75
medi01
GorbazTheDragonI'd encourage you to check your facts
GorbazTheDragonI've heard from friends
Ironic.
NokironFor AMD. That's where the money is.
It's a quickly growing market that is still dwarfed by gaming GPU market.
Posted on Reply