
NVIDIA's Next-Generation Ampere GPUs to be 50% Faster than Turing at Half the Power

Nvidia did it before; they only got lazy lately.

Geforce 6800 Ultra - four times as fast as Geforce FX 5950 Ultra
Geforce 7800 GTX - twice as fast as 6800 Ultra
Geforce 8800 GTX - twice as fast as 7900 GTX
GTX 280 - twice as fast as 8800 Ultra
GTX 480 - twice as fast as GTX 280
GTX 680 - twice as fast as GTX 580
And then Nvidia got lazy.
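Taking those claimed per-generation jumps at face value (rough forum figures, not measured benchmarks), the compounding is easy to sanity-check:

```python
# Claimed speedup of each card over its predecessor, straight from
# the list above (illustrative figures, not benchmark data).
claims = [
    ("6800 Ultra", 4.0),  # vs Geforce FX 5950 Ultra
    ("7800 GTX", 2.0),
    ("8800 GTX", 2.0),
    ("GTX 280", 2.0),
    ("GTX 480", 2.0),
    ("GTX 680", 2.0),
]

total = 1.0
for card, factor in claims:
    total *= factor
    print(f"{card}: {total:g}x the FX 5950 Ultra")
```

By those claims, a GTX 680 would be 128x an FX 5950 Ultra, which shows how quickly repeated doubling compounds.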
They were obviously milking, but it's not hard to see that Nvidia will push their GPUs even harder than before now that Intel is jumping in and the AI revolution is right around the corner. Everyone knows the GPU is simply the future for next-generation computing. Tomorrow's landscape will revolve heavily around the GPU, and the CPU will only become less and less relevant, especially when we're talking about AI-enhanced software. Intel will most likely put tons of resources into their GPU development, and Nvidia would be foolish to hold back even a little. Jensen Huang is too good of a leader to let that happen, though.
 
As we all know, the 7 nm process introduced something new in the form of hot spots, or heat concentration: this can be seen in big dies, such as the Vega VII GPU, as well as in smaller dies, such as Zen 2 chiplets.

How will nVidia tackle this problem?

- lower speeds?
- stronger coolers?
- other?
 
As we all know, the 7 nm process introduced something new in the form of hot spots, or heat concentration: this can be seen in big dies, such as the Vega VII GPU, as well as in smaller dies, such as Zen 2 chiplets.

How will nVidia tackle this problem?

- lower speeds?
- stronger coolers?
- other?
how about they just say "110 degrees is fine" and ignore it completely?
 
Maybe ... it's a possibility ...
It's also possible that results from completely different dies on Samsung's 7 nm EUV will vary from AMD's TSMC 7 nm.
 
It's also possible that results from completely different dies on Samsung's 7 nm EUV will vary from AMD's TSMC 7 nm.

nVidia is jumping directly to 7 nm EUV? I thought they were going for "normal" 7 nm.

EDIT

Also: OP states Ampere will use TSMC's 7 nm and not Samsung's.
 
nVidia is jumping directly to 7 nm EUV? I thought they were going for "normal" 7 nm.

EDIT

Also: OP states Ampere will use TSMC's 7 nm and not Samsung's.

In the words of the Nvidia Korea chief, shared by The Korea Herald, "It is meaningful that Samsung Electronics' 7-nanometer process would be used in manufacturing our next-generation GPU,"
 
If true then I'm readying my poor wallet already.

For gaming it means nothing - all that RTX fad - but for rendering engines utilizing RTX, Turing leaves old generations in the dust. I'm struggling to contain my urges, and this rumor has set me straight. Must not buy. I'll not look at 2080 Tis anymore. ;)

The current Turing architecture on the Pro side has some merits (but at an inflated monopoly price, of course), but overall, if somebody buys Turing solely for gaming, it's an utter waste of money.
 
Tomorrow's landscape will revolve heavily around the GPU, and the CPU will only become less and less relevant, especially when we're talking about AI-enhanced software.
Except it's really the opposite :rolleyes:
The future is PoP, MCM, EMIB & 3D stacking ~

The future, as Intel has realized, is not with CPU- or GPU-only solutions. They're just following what, you could say, AMD showed with Zen. And if Nvidia don't get their act together they'll be caught napping; Nvidia's lead is certainly not infallible with the kind of hardware AMD and Intel have on the horizon. Ironically, their biggest strength in this field is software, i.e. CUDA.
 

Then it's not going up one node but two instead and this changes things dramatically on a couple of fronts:

- the density will be higher than "normal" 7 nm, meaning even more heat concentration and / or hot spots, unless Samsung's 7 nm EUV is less dense than TSMC's "normal" 7 nm
- possibly a much higher efficiency jump VS current Turing cards, which could actually enable 50% more performance while @ 50% less power
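If that rumor were taken literally, the perf-per-watt math works out to a 3x jump, not just 50% (quick arithmetic on the rumored figures, which may of course be marketing numbers):

```python
# Rumored figures: Ampere at 1.5x Turing's performance while
# drawing 0.5x the power (normalized to Turing = 1.0).
turing_perf, turing_power = 1.0, 1.0
ampere_perf = 1.5 * turing_perf
ampere_power = 0.5 * turing_power

gain = (ampere_perf / ampere_power) / (turing_perf / turing_power)
print(f"Perf/W improvement: {gain:.1f}x")  # prints "Perf/W improvement: 3.0x"
```

That would be an unusually large efficiency jump for a single generation, which is one reason to treat the rumor with some skepticism.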
 
Then it's not going up one node but two instead and this changes things dramatically on a couple of fronts:

- the density will be higher than "normal" 7 nm, meaning even more heat concentration and / or hot spots, unless Samsung's 7 nm EUV is less dense than TSMC's 7 nm EUV
- possibly a much higher efficiency jump VS current Turing cards, which could actually enable 50% more performance while @ 50% less power
From what I can infer from having a quick look at various news pieces, including Samsung's forums, 7 nm EUV will be used "substantially", while TSMC's 7 nm will probably be used too. Samsung has two 7 nm EUV iterations: low power, which has been in mass production already, and high performance, which is not in mass production yet.

It is confusing, but it seems like a portion of Ampere will be made on Samsung's high-performance 7 nm EUV. We'll have to wait and see.

Except it's really the opposite :rolleyes:
The future is PoP, MCM, EMIB & 3D stacking
why can't gpus do the same ?
 
why can't gpus do the same ?
Nvidia are trying that; whether they get there is another matter. We haven't seen anything on the scale of Zen 2 or Foveros from Nvidia, ever.
Yes, there's always a first time, but as of now AMD & Intel are way ahead in this field. Also, Nvidia still lags massively in the CPU dept; that isn't changing anytime soon.
 
rdoa2 is doa
it's gonna be feliz navi-dead for amd when ampere launches :laugh:

nah, but seriously, I'm confused by this TSMC 7 nm mention.
Would not be surprised if the smaller A107/108 dies went to TSMC; they segmented their production of big and small Pascals between TSMC and GloFo too, IIRC. I bet RDNA2 desktop cards would not like to see 3050 and 3060 production get in their way, either.
 
probably himself.
The 5500 XT 8 GB costs exactly the same as the 1660 Super in reality, while it's a tier down in performance.

[TPU relative performance chart, 1920x1080]


btw I bet that amd shirt is a loan :laugh:

Didn't this sad case of an eternal student comment on me a few months back? He missed the ball then as he does now.

Stop giving it airtime... it's really a new level of sad

it's gonna be feliz navi-dead for amd when ampere launches :laugh:

nah, but seriously, I'm confused by this TSMC 7 nm mention.
Would not be surprised if the smaller A107/108 dies went to TSMC; they segmented their production of big and small Pascals between TSMC and GloFo too, IIRC. I bet RDNA2 desktop cards would not like to see 3050 and 3060 production get in their way, either.

Smaller dies may even just remain on DUV, as 1st-gen Navi is.

The 3070 should be a decent entry-level 4K card - think GTA V @ 60 fps @ 4K at fairly high settings, minus high-end AA, which is less necessary at 4K anyway.

The 3070 even?! That will be a repeat of how the 1070 is now not really sufficient for 1440p, then. There are no cards for a specific res. Already we recommend the 2070S (1080 Ti) for smooth high/ultra gaming at that res...

4K will be a struggle for the next decade, make no mistake.
 
Stop giving it airtime... it's really a new level of sad
this is shameless promotion from "gurustud" ( :rolleyes: ),
a guy YouTubing his commentary on TPU posts. How brave of him to avoid getting into a discussion; I bet that would go well for him.
 
The 3070 even?! That will be a repeat of how the 1070 is now not really sufficient for 1440p, then. There are no cards for a specific res. Already we recommend the 2070S (1080 Ti) for smooth high/ultra gaming at that res...

4K will be a struggle for the next decade, make no mistake.
Yes, it's important to remember that it's a moving target; new games also get more demanding over time.
Personally, I'll favor 1440p 144 Hz over 4K 60 Hz any day, and 4K 144 Hz is out of reach for a card in the upper mid-range for now.
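The raw pixel-throughput arithmetic backs that up: 1440p at 144 Hz actually pushes slightly more pixels per second than 4K at 60 Hz, while 4K at 144 Hz is more than double either (pure fill-rate math; real per-pixel cost varies by game and settings):

```python
# Raw pixel throughput for common resolution/refresh combos.
# This ignores that per-pixel shading cost differs between games.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

qhd_144 = pixels_per_second(2560, 1440, 144)  # 1440p @ 144 Hz
uhd_60  = pixels_per_second(3840, 2160, 60)   # 4K @ 60 Hz
uhd_144 = pixels_per_second(3840, 2160, 144)  # 4K @ 144 Hz

for label, pps in [("1440p144", qhd_144), ("4K60", uhd_60), ("4K144", uhd_144)]:
    print(f"{label}: {pps / 1e9:.2f} Gpix/s")
```

So a card that just manages 4K 60 is roughly in the same throughput class as one doing 1440p 144, all else being equal.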
 
The problem with 4K is the expense. When a new high end GPU comes out it does pretty well at 4K but then a couple years later it's not adequate for some of the new games coming out (sometimes simply due to poor optimization) and you have to upgrade to the new high end GPU. So the 4K proposition is very expensive. It's there for those that want it badly enough but certainly not for mainstream gamers. I don't think Ampere will change that even with a 50% increase in performance. A couple of years after the 3080 Ti comes out you will be looking at a 4080 Ti to keep up.

I'm quite happy with 1440p 60 FPS average but even my 980 Ti is inadequate for this resolution with newer games.
 
Then it's not going up one node but two instead and this changes things dramatically on a couple of fronts:

- the density will be higher than "normal" 7 nm, meaning even more heat concentration and / or hot spots, unless Samsung's 7 nm EUV is less dense than TSMC's "normal" 7 nm
- possibly a much higher efficiency jump VS current Turing cards, which could actually enable 50% more performance while @ 50% less power
While I don't disagree with what you're saying entirely, claiming that 7nm EUV is a full node up from 7nm DUV is... dubious. It is definitely an improved node, but not by that much.
 
While I don't disagree with what you're saying entirely, claiming that 7nm EUV is a full node up from 7nm DUV is... dubious. It is definitely an improved node, but not by that much.

You're right.

That said, it still brings substantial efficiency (or performance) gains over the base node.
 
Anyone want to buy two 2080 ti's? Fully water cooled, fully sexual.

Just asking... :roll: You can never stay ahead of that techno curve baby
 
Anyone want to buy two 2080 ti's? Fully water cooled, fully sexual.

Just asking... :roll: You can never stay ahead of that techno curve baby
So you're planning to go 6-9 months without a GPU? ;)
 
YouTube will put out (just about) anything people... just remember that.

Anything. Even avrona has a YT channel... I'm not surprised to see some other neophyte at it... is anyone?

Double standard my ass... get your head out of the sand.
 
The problem with 4K is the expense. When a new high end GPU comes out it does pretty well at 4K but then a couple years later it's not adequate for some of the new games coming out (sometimes simply due to poor optimization) and you have to upgrade to the new high end GPU. So the 4K proposition is very expensive. It's there for those that want it badly enough but certainly not for mainstream gamers. I don't think Ampere will change that even with a 50% increase in performance. A couple of years after the 3080 Ti comes out you will be looking at a 4080 Ti to keep up.

I'm quite happy with 1440p 60 FPS average but even my 980 Ti is inadequate for this resolution with newer games.

This is why I stick to 1080p. I did have a 1070, but I sold it when I was laid off; it was overkill at the time, but it would have aged more gracefully. A friend of mine did the same thing: he bought a GTX 980 Ti for 1080p and he's still happy.
 
oh no, where's my Strix 2080 Ti gonna head to? Sub-$400 after the new gen?
 