
NVIDIA Announces Its Volta-based Tesla V100

Raevenlord

News Editor
Today at its GTC keynote, NVIDIA CEO Jensen Huang took the wraps off some of the features of the company's upcoming V100 accelerator, the Volta-based chip for the professional market that will likely pave the way to NVIDIA's next-generation 2000-series GeForce graphics cards. If NVIDIA carries on with its usual product segmentation and naming scheme for the next-generation Volta architecture, we can expect to see this processor power the company's next-generation GTX 2080 Ti. Running through all the nitty-gritty details (like the new Tensor Core approach) in this piece would be impossible, but there are some things we already know from the presentation.
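The keynote didn't cover how developers will actually program those Tensor Cores, so the following is a sketch only: the units perform small mixed-precision matrix multiply-accumulates (D = A×B + C, with FP16 inputs and FP32 accumulation), and the warp-level wmma interface shown below, including the 16×16×16 tile size, is drawn from CUDA's mma.h header for Volta-class hardware rather than from today's announcement.

#include <cuda_fp16.h>
#include <mma.h>
using namespace nvcuda;

// One warp computes a 16x16 tile of D = A*B + C:
// FP16 inputs, FP32 accumulation -- the operation Tensor Cores accelerate.
// Compile with: nvcc -arch=sm_70 (Volta)
__global__ void tensor_tile(const half *a, const half *b,
                            const float *c, float *d)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc_frag;

    wmma::load_matrix_sync(a_frag, a, 16);                        // load A tile
    wmma::load_matrix_sync(b_frag, b, 16);                        // load B tile
    wmma::load_matrix_sync(acc_frag, c, 16, wmma::mem_row_major); // load C tile
    wmma::mma_sync(acc_frag, a_frag, b_frag, acc_frag);           // Tensor Core MMA
    wmma::store_matrix_sync(d, acc_frag, 16, wmma::mem_row_major);
}

A full matrix multiply would tile the inputs across warps and loop this operation over the K dimension; the point here is simply that each Tensor Core op replaces dozens of individual FMA instructions.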

This chip is a beast of a processor: it packs 21 billion transistors (up from the 15.3 billion found on the P100); it's built on TSMC's 12 nm FF process (evolving from Pascal's 16 nm FF); and it measures a staggering 815 mm² (up from the P100's 610 mm²). This is such a considerable leap in die area that we can only speculate on how yields will fare for this monstrous chip, especially considering the novelty of the 12 nm process it's going to leverage. The most interesting detail from a gaming perspective, though, is the 5,120 CUDA cores powering the V100, out of a possible 5,376 in the full chip design, which NVIDIA will likely reserve for a Titan Xv. These are divided into 84 Volta Streaming Multiprocessors (SMs), each carrying 64 CUDA cores (84 × 64 = 5,376, from which NVIDIA is disabling 4 SMs, most likely for yields, which accounts for the announced 5,120). Even in this cut-down configuration, we're looking at a staggering 42% higher pure CUDA core count than the P100's 3,584. The new V100 will offer up to 15 TFLOPS of FP32 compute, and will still leverage a 16 GB HBM2 implementation delivering up to 900 GB/s of bandwidth (up from the P100's 721 GB/s). No details on clock speed or TDP as of yet, but we already have enough to fuel a lengthy discussion... Wouldn't you agree?
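As a quick sanity check on those numbers: assuming the usual two FP32 operations (one fused multiply-add) per CUDA core per clock, the announced 15 TFLOPS over 5,120 cores implies a boost clock of roughly 1.46 GHz. A few lines of host code make the arithmetic explicit:

#include <cstdio>

// Back-of-the-envelope: the boost clock implied by the announced figures,
// assuming 2 FP32 operations (one fused multiply-add) per CUDA core per clock.
int main()
{
    const double tflops = 15.0;  // announced FP32 throughput
    const double cores  = 5120;  // enabled CUDA cores
    const double ghz = tflops * 1e12 / (cores * 2.0) / 1e9;
    printf("Implied boost clock: %.2f GHz\n", ghz);  // prints ~1.46 GHz
    return 0;
}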



View at TechPowerUp Main Site
 
Looks very promising for scientific research. From a gaming perspective, it should be pretty amazing as well.

Somehow I feel like NVIDIA has already maxed out the possible efficiency optimizations of the Maxwell/Pascal CUDA design, so they are back to the MOAR CORES and higher MHz direction. With so many CUDA units available, I am pretty sure async compute will be Volta's advantage. It should perform pretty well in Vulkan and DX12.

Poor VEGA.
 
Doesn't look to be that great a graphics card considering its specs @ 21 billion transistors, but Google might like it.
Is it me, or have they done little but rearrange the graphics components, add a shit-ton of AI processors, and call it done?
I am now more interested in lower-tier Volta, because this one's for eggheads :) and not so much gamers. What the heck is small Volta going to look like, given big Volta's trying to go all Cyberdyne on us?
I don't believe this is going to be easy to market to Joe Public though, because any record for the dearest consumer card is going to get obliterated when this hits the shops, Christmas 2018 ;).
 
Uh oh, this should be a BIG warning sign for AMD.
Nvidia has given us a glimpse of Volta; meanwhile, AMD is still teasing us with Vega :shadedshu:
 
Doesn't look to be that great a graphics card considering its specs @ 21 billion transistors, but Google might like it.
Is it me, or have they done little but rearrange the graphics components, add a shit-ton of AI processors, and call it done?
I am now more interested in lower-tier Volta, because this one's for eggheads :) and not so much gamers. What the heck is small Volta going to look like, given big Volta's trying to go all Cyberdyne on us?
I don't believe this is going to be easy to market to Joe Public though, because any record for the dearest consumer card is going to get obliterated when this hits the shops, Christmas 2018 ;).
What? The specs look fantastic!

You never saw the GP100 in consumer cards, and you will never see this one either. Expect a completely different card with no FP16, FP64, or Tensor capabilities.
 
Only 15 TFLOPS? That's only 3 TFLOPS more than Vega should have. That lower clock speed hurts.
 
Only 15 TFLOPS? That's only 3 TFLOPS more than Vega should have. That lower clock speed hurts.
Well, that's not how you measure performance when selecting processing power for datacenters. Unless Vega has something like the Tensor Cores, it won't even be competition.

It also has the capability of executing INT32 and FP32 instructions simultaneously. I know the devs at my work are already frothing at the mouth just reading about it.
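For anyone curious what that dual-issue actually looks like, here is a minimal hypothetical kernel (the function and its math are illustrative, not from NVIDIA's material): the INT32 index arithmetic and the FP32 multiply-add form independent instruction streams that Volta's separate integer and floating-point pipelines can execute concurrently, where earlier architectures shared issue slots between them.

#include <cuda_runtime.h>

// Hypothetical example: FP32 math interleaved with INT32 index arithmetic.
// On Volta, the separate INT32 and FP32 pipelines can issue these
// independent instruction streams concurrently instead of serializing them.
__global__ void scale_gather(const float *in, float *out,
                             const int *idx, int n, float k)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // INT32 pipeline
    if (i < n) {
        int j = idx[i] * 2 + 1;      // more INT32: indirect index math
        out[i] = in[j] * k + 0.5f;   // FP32 pipeline: fused multiply-add
    }
}

// Example launch:
// scale_gather<<<(n + 255) / 256, 256>>>(d_in, d_out, d_idx, n, 2.0f);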
 
Correct me if I am wrong, but ~6% CUDA core performance increase...? :wtf:
Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing :wtf:

And I'm not sure where you got the 6% from.
 
And such a massive HPC chip is already pushing 1.4+ GHz.

Strip away the tech not relevant to gaming, and the GV104 and GV102 chips are gonna absolutely fly.
 
Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing :wtf:

And I'm not sure where you got the 6% from.


Let's not forget price gouging/fixing...
 
nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.
 
nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.

To be fair, they did announce recently that Vega was going to be called RX.... Vega.
 
nvidia gives us Volta specs. AMD gives us a Vega logo. Yeah Intel should've been the one to buy ATI.

Baseless comment.
 
Let's not forget price gouging/fixing...
Perhaps AMD should wake up and make a decent GPU then? Nobody can really blame Nvidia for making some additional cash and capitalizing on AMD's inability to perform.
 
Lol at the yields of a die that size. The consumer card will be missing at least 1,300 cores.
 
Perhaps AMD should wake up and make a decent GPU then? Nobody can really blame Nvidia for making some additional cash and capitalizing on AMD's inability to perform.

AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all; in fact, they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job". ¯\_(ツ)_/¯
 
AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all; in fact, they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job". ¯\_(ツ)_/¯

When exactly did that happen? All I remember are slower and far less efficient cards; in what way exactly were those technologically superior? Maybe I'm not counting another aspect.
 
Yeah, a measly 6% more performance coupled with a measly 42% more CUDA cores amounts to about nothing :wtf:

And I'm not sure where you got the 6% from.

The chip is 33% bigger and carries 33% more cores. Blow Pascal up by 33% and you will have a very, very similar chip that uses more power.
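Since percentages keep flying around in this thread, the quick arithmetic, using the article's V100 figures and NVIDIA's published Tesla P100 (SXM2) specs as the baseline, works out as follows:

#include <cstdio>

// The percentages being thrown around, computed from the article's V100
// numbers against the Tesla P100 (SXM2) baseline: 3,584 cores, 610 mm^2,
// 10.6 FP32 TFLOPS (the P100 figures are from NVIDIA's spec sheet,
// not from this article).
int main()
{
    printf("Cores: +%.0f%% (5,120 vs 3,584)\n", (5120.0 / 3584.0 - 1) * 100);  // ~+43%
    printf("Die:   +%.0f%% (815 vs 610 mm^2)\n", (815.0 / 610.0 - 1) * 100);   // ~+34%
    printf("FP32:  +%.0f%% (15 vs 10.6 TFLOPS)\n", (15.0 / 10.6 - 1) * 100);   // ~+42%
    return 0;
}

So per-core FP32 throughput is essentially flat; the headline gain comes from more cores on a bigger die, not from faster ones.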
 
Let's not forget price gouging/fixing...

Because pricing is totally what we were talking about.

AMD made several excellent GPUs that were either better priced, technologically superior, or just plain superior as a whole. And their market share didn't change at all; in fact, they kept on losing it. Maybe, instead of blaming AMD, users should blame themselves? Ever thought of it that way? I've seen people constantly sticking with Intel or NVIDIA literally "just because". And then they go 180° and piss on AMD for "not doing a better job". ¯\_(ツ)_/¯

Users can't be expected to sift through a lineup to identify which cards are current generation and which are actually worth buying. Ever thought about it that way?
But the simple truth is users are users, and when competing for them, it's the companies that are expected to bend over backwards. Crying like a baby that your good product doesn't sell is not a market strategy.
 
What a freakin' massive chip! 815 mm² :kookoo:
Would be nice if the 2070 came close to 1080 Ti levels...

Edit: Cannot wait for Vega, though. Really wanna see what AMD has to offer as well.
 