
NVIDIA GTC 2019 Kicks Off Later Today, New GPU Architecture Tease Expected

btarunr

Editor & Senior Moderator
NVIDIA will kick off the 2019 GPU Technology Conference later today, at 2 PM Pacific time. The company is expected to either tease or unveil a new graphics architecture succeeding "Volta" and "Turing." Not much is known about this architecture, but it's highly likely to be NVIDIA's first designed for the 7 nm silicon fabrication process. This unveiling could be the earliest stage of the architecture's launch cycle, which could see market availability only by late-2019 or mid-2020, if not later, given that the company's RTX 20-series and GTX 16-series have only recently been unveiled. NVIDIA could leverage 7 nm to increase transistor densities and bring its RTX technology to even more affordable price points.



View at TechPowerUp Main Site
 
They need more RTX games; it's been just two, and one is an MP shooter. This technology will be dead if this continues.
 
So RTX 2000 series obsolete in less than a year?
 
It's a trend, like new mobile phones every 1-2 years - milking customers more frequently.
 
So RTX 2000 series obsolete in less than a year?

They rolled out Turing on TSMC 12nm, which is more like 16nm++, last October, and basically we knew that 7nm mass production would be available from 2019 Q1, more or less. I personally did not see the point of launching the new RTX series on this node. I think they did it because they knew they could and AMD had nothing to counter with, since they are all-in on 7nm.

After the Turing launch we've seen some performance gains, but no better performance per dollar and meh RTX performance. I've even told everyone who isn't building a completely new rig, or is at least sitting on a GTX 1070 or higher-performance GPU, not to bother buying Turing - unless you're a super enthusiast, but even then, the RTX 2080 Ti at that price... yikes.

So if they announce an RTX 3000 series on 7nm Ampere today, with a boatload more CUDA cores and 2nd-generation RT logic, it will be funny to read and see the reactions throughout the tech press and tech forums.
 
So RTX 2000 series obsolete in less than a year?
Like I expected. It was known that 7nm production would ramp for a wide range of products from many companies around mid-year. The RTX 2000 series was a simple placeholder on improved 12nm to milk as many buyers as possible; with faster 7nm we probably wouldn't even have seen this RTX 2000 on 12nm. Delays of 7nm and some AMD firecrackers forced them to release new GPUs. Now we will see a real new GPU generation - or bullshit: the same shrunk GPU with a new name.
 
Like I expected. It was known that 7nm production would ramp for a wide range of products from many companies around mid-year.
An AMD presentation from December 2017 noted that a 250mm² chip at 7nm costs twice as much as a 250mm² chip at 12nm. Hopefully this has improved by now.
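As a back-of-envelope illustration (the ~2x transistor-density gain for 7nm over 12nm below is an assumption for the sake of the example, not a figure from the slide): if a same-size die costs twice as much but carries roughly twice the transistors, the cost per transistor barely moves - the win is density and power, not price.

Code:
# Back-of-envelope only: 2x die cost from the AMD slide, with an *assumed*
# ~2x transistor-density gain for 7nm over 12nm (illustrative figure).
cost_12nm, cost_7nm = 1.0, 2.0        # normalized cost of a 250 mm^2 die
density_12nm, density_7nm = 1.0, 2.0  # normalized transistors per mm^2

print("cost per transistor, 12nm:", cost_12nm / density_12nm)  # 1.0
print("cost per transistor, 7nm: ", cost_7nm / density_7nm)    # 1.0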
I personally did not see the point of launching the new RTX series on this node. I think they did it because they knew they could and AMD had nothing to counter with, since they are all-in on 7nm.
As with any new technology, they need to get around the chicken-or-egg problem: RTRT games/software/APIs on one side, and hardware to support RTRT on the other. Whether the new hardware implementation is technically good or bad, the first generation usually does not succeed because the games are not there, and the games will not be there if there is no hardware.
 
I do wonder, how many of those leather jackets does Huang have? I imagine a wardrobe that contains only that item.
 
I'm expecting good things from Nvidia on 7nm. Hopefully the prices will be more reasonable as well.

I wish we could squeeze some info out of Intel as to where they are heading, too. AFAIK they are still planning a gaming GPU launch sometime next year.

I do wonder, how many of those leather jackets does Huang have? I imagine a wardrobe that contains only that item.
There can be only one. :)
 
I like the word "affordable" from NVIDIA, LUL
 
An AMD presentation from December 2017 noted that a 250mm² chip at 7nm costs twice as much as a 250mm² chip at 12nm. Hopefully this has improved by now.
As with any new technology, they need to get around the chicken-or-egg problem: RTRT games/software/APIs on one side, and hardware to support RTRT on the other. Whether the new hardware implementation is technically good or bad, the first generation usually does not succeed because the games are not there, and the games will not be there if there is no hardware.

I don't think that's totally correct. Shrinks bring more yield: you have more chips on one wafer, with a lower count of defects affecting the chips. This means it is cheaper, or is supposed to be cheaper.
 
I don't think that's totally correct. Shrinks bring more yield: you have more chips on one wafer, with a lower count of defects affecting the chips. This means it is cheaper, or is supposed to be cheaper.
Shrinks are done via improvements in the optics, substances, etc. These cost a fortune and usually aren't very reliable in the beginning. Hence the much higher cost of the same chip size on a newer process - a chip of the same size that sells for the same amount of money as the previous generation.
 
They rolled out Turing on TSMC 12nm, which is more like 16nm++, last October, and basically we knew that 7nm mass production would be available from 2019 Q1, more or less. I personally did not see the point of launching the new RTX series on this node. I think they did it because they knew they could and AMD had nothing to counter with, since they are all-in on 7nm.

After the Turing launch we've seen some performance gains, but no better performance per dollar and meh RTX performance. I've even told everyone who isn't building a completely new rig, or is at least sitting on a GTX 1070 or higher-performance GPU, not to bother buying Turing - unless you're a super enthusiast, but even then, the RTX 2080 Ti at that price... yikes.

So if they announce an RTX 3000 series on 7nm Ampere today, with a boatload more CUDA cores and 2nd-generation RT logic, it will be funny to read and see the reactions throughout the tech press and tech forums.

I can guarantee you with absolute confidence there will be no RTX 3000 series talk at this event.
 
I don't think that's totally correct. Shrinks bring more yield: you have more chips on one wafer, with a lower count of defects affecting the chips. This means it is cheaper, or is supposed to be cheaper.
A shrink, in the early part of a smaller process's life, brings lower yields. Factual data on yields is hard to come by, but usually in the first year of mass production yields are noticeably worse than on an old, well-matured process.

In addition to that, the cost of using a smaller process has been increasing over the last few generations. A few process steps ago, producing a chip on a new, smaller process cost close to the same as on the old one, automatically bringing better cost efficiency (and mostly lower prices to consumers along with it). This was not exactly the case with the shrink from 22nm to 16nm, and the cost difference between 16/14/12nm and 7nm is even worse. A smaller process is still worth it for its performance and especially power efficiency, but not necessarily cost.

Also, yields and manufacturing costs do not rise linearly with die size. AMD's slide was for 250 mm². The current 7nm flagship GPU - Vega 20 on the Radeon VII and MI cards - is a little over 30% larger than that example, at 331 mm². There is a reason it competes in price with TU104, which is 545 mm² at 12nm.
Edit: Just to be clear, the intention was not to compare Radeon VII and RTX 2080 or start a discussion on that. Both GPUs in them - Vega 20 and TU104 - have 13.x billion transistors and about the same compute performance. They are as good a comparison of 12nm vs 7nm as we are going to get right now.
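To put some rough numbers behind that (purely illustrative - foundries don't publish defect densities, so the 0.2/cm² "mature 12nm" and 0.5/cm² "early 7nm" values below are assumptions): a standard gross-dies-per-wafer estimate combined with a simple Poisson yield model shows how a 331 mm² die on an immature process can end up with roughly as many good dies per wafer as a 545 mm² die on a mature one, before even accounting for the higher 7nm wafer price.

Code:
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Standard first-order estimate of gross dies per wafer (ignores scribe lines).
    radius = wafer_diameter_mm / 2
    return (math.pi * radius**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    # Simple Poisson yield model: Y = exp(-D * A), with A converted to cm^2.
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

# Assumed (illustrative) defect densities -- not published figures.
examples = [("TU104 @ 12nm, mature", 545, 0.2),
            ("Vega 20 @ 7nm, early", 331, 0.5)]

for name, area, d0 in examples:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area, d0)
    print(f"{name}: ~{gross:.0f} gross dies, ~{good:.0f} good dies per 300 mm wafer")

With these assumed numbers both chips land at roughly 34 good dies per wafer, which is only meant to illustrate why the two can end up in the same price bracket despite the big difference in die size.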
 
I don't think that's totally correct. Shrinks bring more yield: you have more chips on one wafer, with a lower count of defects affecting the chips. This means it is cheaper, or is supposed to be cheaper.
In the past that might have been true. The reason NVIDIA did not wait for 7nm and used the "refined" 16nm process instead is the cost issue. In fact, they already talked about this more extensively during the 28nm generation. Going forward, shrinking does not guarantee lower cost, because the process itself has become very expensive due to how hard it is to get right.
 
I'll stick with my 1070 Tis for now, thanks - not overly impressed by the RTX series when factoring in the cost. Not enough games to sway me, either...
 
A shrink, in the early part of a smaller process's life, brings lower yields. Factual data on yields is hard to come by, but usually in the first year of mass production yields are noticeably worse than on an old, well-matured process.
In addition to that, the cost of using a smaller process has been increasing over the last few generations. A few process steps ago, producing a chip on a new, smaller process cost close to the same as on the old one, automatically bringing better cost efficiency (and mostly lower prices to consumers along with it). This was not exactly the case with the shrink from 22nm to 16nm, and the cost difference between 16/14/12nm and 7nm is even worse. A smaller process is still worth it for its performance and especially power efficiency, but not necessarily cost.

That's what I've heard as well. The GPUs are getting more and more expensive to manufacture as the process node gets smaller.
 
Bringing RTX to an even more affordable price point.... lol
 
I do wonder, how many of those leather jackets does Huang have? I imagine a wardrobe that contains only that item.
I like to think he has them labelled Monday, Tuesday and so on. So at least 7?
 
I don't think that's totally correct. Shrinks bring more yield: you have more chips on one wafer, with a lower count of defects affecting the chips. This means it is cheaper, or is supposed to be cheaper.

No no. Smaller dies bring more yield. Shrinks do not necessarily: every time you go smaller, the margin for error decreases and the chance of errors increases, because there are more masking/lithography steps and accuracy needs to be higher.

Now, take a long look at Turing die sizes ;)
 
Not much is known about this architecture, but it's highly likely
Haha.

PS
Teasing the upcoming teasing.

That's what I've heard as well. The GPUs are getting more and more expensive to manufacture as the process node gets smaller.
And in parallel, income skyrockets, curiously.
 
In the past that might have been true. The reason NVIDIA did not wait for 7nm and used the "refined" 16nm process instead is the cost issue. In fact, they already talked about this more extensively during the 28nm generation. Going forward, shrinking does not guarantee lower cost, because the process itself has become very expensive due to how hard it is to get right.
No no. Smaller dies bring more yield. Shrinks do not necessarily: every time you go smaller, the margin for error decreases and the chance of errors increases, because there are more masking/lithography steps and accuracy needs to be higher.

Now, take a long look at Turing die sizes ;)

I hear what you guys are saying. Anyway, if you take the Turing die at its current size and move it from 12 to 7nm, considering it has the same amount of cores, shaders, etc., the die will be smaller and you fit more of them on one wafer. That's kind of how I look at it. What it means for me is that you get the same performance (because it is the same chip), but it uses less power and is smaller thanks to the shrink.
Isn't it going that way?

Just to add, on Turing die size: yes, I get it, but isn't it also faster than the 1080 Ti, for example? Not sure about the difference in die size between the two.
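For a rough sense of scale (the 0.55 area-scaling factor below is an assumption for illustration, not an official figure; SRAM and analog/IO scale worse than logic, so real dies shrink less than ideal scaling suggests):

Code:
# Quick shrink arithmetic with an assumed (illustrative) area-scaling factor.
tu102_12nm_mm2 = 754        # TU102 (RTX 2080 Ti) die size on TSMC 12nm
gp102_16nm_mm2 = 471        # GP102 (GTX 1080 Ti) die size on TSMC 16nm, for reference

assumed_7nm_scaling = 0.55  # assume the die shrinks to ~55% of its area on 7nm

print(f"TU102 shrunk to 7nm: ~{tu102_12nm_mm2 * assumed_7nm_scaling:.0f} mm^2")  # ~415 mm^2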
 
They will launch a high-end HPC/DL AI part first regardless; Radeon VII competes with a 1080 Ti, so they have bigger fish to fry.
 