
NVIDIA AD103 and AD104 Chips Powering RTX 4080 Series Detailed

No, it won't be. You can't throw out the MCD area just because AMD moved the MCDs off the main die...
Nvidia also has Tensor cores and twice the number of NVENC/NVDEC units.

In other words, ~80% more cores than Ampere (CUDA, RT, Tensor), plus the added per-core performance of a new generation. They can increase the count now because the manufacturing-node disadvantage is no longer there.
The RTX 4090 promises to be a beast.
 
It's not irrelevant.
Realising an item's real worth vs. what the price tag says is part of what makes you an informed consumer. Just because something costs $900 doesn't mean it's actually worth $900, even if you can afford it.
So you will settle for a lesser product even if you can afford the one that is best for you (and costs more), just because it is less of a value compared to a past product?

If so, you don't really need to buy the product in the first place.
 
So you shop by performance per Watt or per $, not by die size.
The same way you don't shop by memory bus width and, in most situations, not by memory size.
No, it's not really like that. If you have some experience with GPU generations and buying the 'best one', also in terms of performance/$, you know it's far too simplistic to just look at performance/$ for one isolated product at one moment in time.

GPUs last. And if you buy the right one, they can last very long. And when they last very long, the cost metric works out differently. The longer you can use a GPU, the more value you can extract from it. So damn right things like VRAM capacity matter. If your MO is buying small generational upgrades of, say, 30~40% each time, or skipping just one gen to get perhaps 50% on the same tier, you've not selected a price point; you just want every upgrade you can get your hands on, and it's not going to be cost-effective. This is the very reason I stuck with my GTX 1080 for so long. It still games fine, and I don't miss much, honestly. But the more important reasoning here is that any upgrade won't gain me anything meaningful, and the tier I usually buy into, to make sure I HAVE a GPU that can last 5 years plus, is far too expensive. Let's face it: a 10GB 3080 at a higher price than what I came from is simply not a great deal, no matter how much core perf it has over the old card. It's just missing resources. And a cheaper 12GB with much lower core power? Same issue: what's the point?! Gain 12 FPS in games you could already play?

There is a sweet spot within the high end of every GPU stack where you get something that 'exceeds the norm' for quite a while. Some call it future-proofing, but I prefer to call it the PC sweet spot, and it relates closely to where consoles are at that time. It applies to CPU, GPU, RAM and storage. Going to the very top end of available performance is extremely expensive and not cost-effective, and because consoles aren't there yet, it won't pay off either, because that level isn't being optimised for. But staying right under it, not buying at launch but afterwards, and taking a careful look at market developments, is extremely cost-effective. You'll have something that no software can reasonably bring to its knees in the first few years, and after that you'll still have 'enough' for quite a long time as the mainstream/median performance level in the market starts catching up.

So you can be damn sure I shop by memory size/bus/feature set and numerous other aspects of a GPU. When all those stars align relative to what I'm upgrading from, and there is a gain of well over 50% at a good price point (75~100% is better, and that time is close now for me), that's when I know I can get a new deal that will run anything for a long time and then some, while not getting ripped off.

The math is beautiful. In 2017 I paid €420 for a GTX 1080. I might soon sell it for €150-ish. I might buy a €550-600 replacement that's twice as fast. That's about €50/year for high-end gaming. Before mining, I made a similar move going from a 780 Ti to a 1080.
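For anyone who wants to sanity-check that kind of upgrade math, here's a minimal Python sketch; the GTX 1080 figures are the ones from this post, while the resale assumption for the next card is purely illustrative:

```python
def net_cost_per_year(purchase_price, resale_price, years_owned):
    """Effective yearly cost of a GPU: what you paid minus what you
    recoup at resale, spread over the years you actually used it."""
    return (purchase_price - resale_price) / years_owned

# Figures from the post: GTX 1080 bought in 2017 for ~420 EUR,
# sold roughly 5.5 years later for ~150 EUR.
print(round(net_cost_per_year(420, 150, 5.5)))       # ~49 EUR/year

# Hypothetical next card: 550-600 EUR, kept for a similar stretch,
# assuming (purely as an illustration) it resells for about a third.
print(round(net_cost_per_year(575, 575 / 3, 5.5)))   # ~70 EUR/year
```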
 
So you will settle for a lesser product even if you can afford the one that is best for you (and costs more), just because it is less of a value compared to a past product?
I'm not comparing it to a past product. I'm comparing it to its actual worth. If it doesn't match its price tag, I won't buy it. Even if I had a million £, I wouldn't buy a Lamborghini because it offers terrible value.

If so, you don't really need to buy the product in the first place.
That's right. Gaming is a hobby, not a necessity. There are still hundreds of old classics that I can play just fine with my 6500 XT. I'll want an upgrade eventually, but it won't have to be a ridiculously overpriced halo product from Nvidia.

No, it's not really like that. If you have some experience with GPU generations and buying the 'best one', also in terms of performance/$, you know it's far too simplistic to just look at performance/$ for one isolated product at one moment in time.

GPUs last. And if you buy the right one, they can last very long. And when they last very long, the cost metric works out differently. The longer you can use a GPU, the more value you can extract from it. So damn right things like VRAM capacity matter. If your MO is buying small generational upgrades of, say, 30~40% each time, or skipping just one gen to get perhaps 50% on the same tier, you've not selected a price point; you just want every upgrade you can get your hands on, and it's not going to be cost-effective. This is the very reason I stuck with my GTX 1080 for so long. It still games fine, and I don't miss much, honestly. But the more important reasoning here is that any upgrade won't gain me anything meaningful, and the tier I usually buy into, to make sure I HAVE a GPU that can last 5 years plus, is far too expensive. Let's face it: a 10GB 3080 at a higher price than what I came from is simply not a great deal, no matter how much core perf it has over the old card. It's just missing resources. And a cheaper 12GB with much lower core power? Same issue: what's the point?! Gain 12 FPS in games you could already play?

There is a sweet spot within the high end of every GPU stack where you get something that 'exceeds the norm' for quite a while. Some call it future-proofing, but I prefer to call it the PC sweet spot, and it relates closely to where consoles are at that time. It applies to CPU, GPU, RAM and storage. Going to the very top end of available performance is extremely expensive and not cost-effective, and because consoles aren't there yet, it won't pay off either, because that level isn't being optimised for. But staying right under it, not buying at launch but afterwards, and taking a careful look at market developments, is extremely cost-effective. You'll have something that no software can reasonably bring to its knees in the first few years, and after that you'll still have 'enough' for quite a long time as the mainstream/median performance level in the market starts catching up.

So you can be damn sure I shop by memory size/bus/feature set and numerous other aspects of a GPU. When all those stars align relative to what I'm upgrading from, and there is a gain of well over 50% at a good price point (75~100% is better, and that time is close now for me), that's when I know I can get a new deal that will run anything for a long time and then some, while not getting ripped off.

The math is beautiful. In 2017 I paid €420 for a GTX 1080. I might soon sell it for €150-ish. I might buy a €550-600 replacement that's twice as fast. That's about €50/year for high-end gaming. Before mining, I made a similar move going from a 780 Ti to a 1080.
That's a good way of thinking about it. :)

My thinking is that there are three categories:
1. Halo products - They offer terrible value for money and depreciate quickly. In GPUs, avoid at all costs. In CPUs, only buy if your budget allows it. CPUs keep their usage value for a bit longer, as generational upgrades aren't so significant. Unfortunately, Nvidia is positioning a bigger and bigger chunk of their product stack into this category. Spending nearly £1,000 on a graphics card that you'll swap for something else only a couple of years later, when DLSS 4 comes out and only runs on the next high-end thing, is terrible value whichever way you look at it.
2. Mid-range - It gives you nearly the same experience as the top-end, but at a significantly lower price, and lower depreciation over time. The best value for money is here.
3. Low-end - Low price, low performance, and almost no resale value. This is where I shop when I'm curious about something new, but don't want to touch my savings for something unknown.

Edit: It's not just about price-to-performance, either. I couldn't care less if my games run at 60 or 100 fps, so from my point of view, the only difference between a mid-range and a high-end graphics card (besides price and power consumption / cooling requirements) is how well future games will run on it - which Nvidia kills by making new DLSS iterations run only on the newest generation. So effectively, the high end offers worse value than ever before.
 
The idea of a mid-range is broken with the 40 series, since the 4090 provides the best value. The 4080 12 GB is less than half the card and costs more than half the price.
Take into account that Nvidia also skipped 7 nm; we are getting what would have been the 2024 cards ahead of time, so the die is more tightly packed than usual. The 4080 16 GB would be something like 600 mm² if it weren't packed so densely.
AD104 is not your regular 4070. To compete with the 3080 Ti, a 4070 would have needed only 25.6 billion transistors, a 256-bit bus and less L2.
Because of this, the whole lineup is messed up now, except the 4090, which provides incredible value. But for how long depends on how soon Nvidia moves to N3 or 3N.
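To illustrate the "less than half the card for more than half the money" point, here's a rough Python sketch using the announced launch MSRPs and CUDA core counts; treat shader count as a very crude proxy for performance, since it ignores clocks, bandwidth and cache:

```python
# Announced launch MSRPs (USD) and CUDA core counts; shader count is only
# a crude stand-in for performance.
cards = {
    "RTX 4090":       (1599, 16384),
    "RTX 4080 16 GB": (1199, 9728),
    "RTX 4080 12 GB": (899, 7680),
}

msrp_4090, cores_4090 = cards["RTX 4090"]
for name, (msrp, cores) in cards.items():
    core_share = cores / cores_4090    # fraction of the 4090's shaders
    price_share = msrp / msrp_4090     # fraction of the 4090's price
    print(f"{name}: {core_share:.0%} of the cores at {price_share:.0%} "
          f"of the price, {cores / msrp:.1f} cores per dollar")
```

On those numbers the 4080 12 GB gets roughly 47% of the 4090's shaders for about 56% of its price, which is exactly the asymmetry being described here.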
 
The idea of a mid-range is broken with the 40 series, since the 4090 provides the best value. The 4080 12 GB is less than half the card and costs more than half the price.
Take into account that Nvidia also skipped 7 nm; we are getting what would have been the 2024 cards ahead of time, so the die is more tightly packed than usual. The 4080 16 GB would be something like 600 mm² if it weren't packed so densely.
AD104 is not your regular 4070. To compete with the 3080 Ti, a 4070 would have needed only 25.6 billion transistors, a 256-bit bus and less L2.
Because of this, the whole lineup is messed up now, except the 4090, which provides incredible value. But for how long depends on how soon Nvidia moves to N3 or 3N.

Yeah, what has changed is that today the low end and mid-range are charged a premium tax, while the halo enthusiast parts are left without it.

In a normal world, the 4080 12 GB should cost $599, while the 4090 should cost $1,999.
What we see is the opposite: the halo is cheaper, while the mid-range is more expensive.
 
Yeah, what has changed is that today the low end and mid-range are charged a premium tax, while the halo enthusiast parts are left without it.

In a normal world, the 4080 12 GB should cost $599, while the 4090 should cost $1,999.
What we see is the opposite: the halo is cheaper, while the mid-range is more expensive.
What I see is the mid-range moving up into high-end and halo categories, while the low-end gets basically no attention.

What I mean is, x90 used to be halo tier, x80 high-end, x70 and x60 mid-range, x50 entry-level gamer and x30 and x10 low-end, but now x90 and x80 are halo products, x70 is high-end, x60 and x50 are mid-range, and the low-end basically ceased to exist. AMD is only a tiny bit better in this regard. They actually have entry-level gaming cards with the 6400 and 6500 XT.

This is very strange in a time when chip manufacturing costs more than ever and people have less and less money for hobbies.
 
I'm not comparing it to a past product. I'm comparing it to its actual worth. If it doesn't match its price tag, I won't buy it. Even if I had a million £, I wouldn't buy a Lamborghini because it offers terrible value.


That's right. Gaming is a hobby, not a necessity. There are still hundreds of old classics that I can play just fine with my 6500 XT. I'll want an upgrade eventually, but it won't have to be a ridiculously overpriced halo product from Nvidia.


That's a good way of thinking about it. :)

My thinking is that there are three categories:
1. Halo products - They offer terrible value for money and depreciate quickly. In GPUs, avoid at all costs. In CPUs, only buy if your budget allows it. CPUs keep their usage value for a bit longer, as generational upgrades aren't so significant. Unfortunately, Nvidia is positioning a bigger and bigger chunk of their product stack into this category. Spending nearly £1,000 on a graphics card that you'll swap for something else only a couple of years later, when DLSS 4 comes out and only runs on the next high-end thing, is terrible value whichever way you look at it.
2. Mid-range - It gives you nearly the same experience as the top-end, but at a significantly lower price, and lower depreciation over time. The best value for money is here.
3. Low-end - Low price, low performance, and almost no resale value. This is where I shop when I'm curious about something new, but don't want to touch my savings for something unknown.

Edit: It's not just about price-to-performance, either. I couldn't care less if my games run at 60 or 100 fps, so from my point of view, the only difference between a mid-range and a high-end graphics card (besides price and power consumption / cooling requirements) is how well future games will run on it - which Nvidia kills by making new DLSS iterations run only on the newest generation. So effectively, the high end offers worse value than ever before.
Your point 1 conclusion hits the nail on the head there: Nvidia is pushing a larger part of the stack into halo territory, and ironically that's caused not by monster specs but by monstrous MSRPs without much to show for them. It won't last.
 
Your point 1 conclusion hits the nail on the head there: Nvidia is pushing a larger part of the stack into halo territory, and ironically that's caused not by monster specs but by monstrous MSRPs without much to show for them. It won't last.
The specs are monstrous, too, considering that you only need that category for 4K. Even my 6500 XT can play everything at 1080p. If I end up buying a 4070, 4060 or 7700 XT, I'll be sorted for a good few years.
 
A 12 GB part cannot be "halo", because today's games and future games need more VRAM allocation.
But that's just allocation, it's not usage, and you will never notice!!! /s - quote: Nvidia Ampere early adopters
The specs are monstrous, too, considering that you only need that category for 4K. Even my 6500 XT can play everything at 1080p. If I end up buying a 4070, 4060 or 7700 XT, I'll be sorted for a good few years.
4K on a 12 GB part? Lmao, that will last all of 12 months at best. All I see is monstrous shader counts that say nothing, alongside way too little bandwidth.
 
A 12 GB part cannot be "halo", because today's games and future games need more VRAM allocation.
Allocation and usage are different things. Most modern games allocate as much VRAM as they can without using all of it.

4K on a 12 GB part? Lmao, that will last all of 12 months at best.
Depends on the game, I guess. The GPU resources are there, nonetheless. 12 months sounds about as long as Nvidia wants it to last. By that time, the 4080 Super 24 GB will be out.
 
Allocation and usage are different things. Most modern games allocate as much VRAM as they can without using all of it.


Depends on the game, I guess. The GPU resources are there, nonetheless. 12 months sounds about as long as Nvidia wants it to last. By that time, the 4080 Super 24 GB will be out.
Full allocation plus low bandwidth relative to core perf = stutter heaven. We've been here a few times in Nvidia history...

AMD also tried a top-end product with low VRAM but very high relative bandwidth, btw... the Fury X with a measly 4 GB. We have never seen a GPU fall off in performance over time faster than the Fury X. It lost against 6 GB cards with lower bandwidth every step of the way... losing its 1440p & 4K lead by the time Pascal released; the 980 Ti stayed relevant while the Fury was relegated to mid-range.

VRAM matters; it's the most effective tool for planned obsolescence.
 
Full allocation plus low bandwidth relative to core perf = stutter heaven. We've been here a few times in Nvidia history...
That will be magically solved by DLSS 4.0. ;) Oh wait... you'll need a 50-series card for that. :slap:

Edit: This is probably why they never released the 3070 Ti 16 GB. It would have made the entire 40-series pointless.
 
With 13.5GB right :D
That's the Nvidia recipe recently...
1. Pee in your pants seeing how fast the new generation is.
2. Fork up some money, or take out a loan to buy a shiny new x90 card.
3. Wait for a year until the Ti / Super version is out with better efficiency and more VRAM and your halo card isn't worth crap anymore.
4. Start again.
 
The die size differences (12-12.5% for AD102/Navi31 and 8-9% for AD103/Navi32) are based on the figures that leakers claimed for AMD.
The performance/W is just my estimation (the 4090 will be at most ~10% less efficient if compared at the same TBP).
AMD fans saying otherwise just aren't doing AMD a favour, because expecting anything more will only lead to disappointment.
Even what I'm saying is probably too much, because if you take a highly OC'd Navi31 flagship partner card like the PowerColor Red Devil, ASUS Strix or ASRock Formula with a TBP close to 450 W, what I just said implies the Navi31 flagship would be at 100% performance and the 4090 at 90%, which probably isn't going to happen...
Your time to shine is just around the corner. We'll see.
 
We'll see how much performance is raised; more and more I suspect that the large L2 cache might be a tile-architecture element rather than an Infinity Cache copy, so there might be some impressive jumps.
If so, then Gigapixel's stuff is finally getting full use (anyone remember that name?)...
 
To those of you who don't understand why having two different cards being called the same (4080) is bad:


TLDR: It's 1. intentionally misleading customers, and 2. the way Nvidia plans to get away with selling a card for 900 bucks that realistically should launch around 600.
 
...are the gaming tech review sites going to persist in playing along with Nvidia's sham naming of this card, or will they have the integrity to call it out for what it actually is?
I mean, it's what they named it, man. It'd be a sham to name it anything else. We leave the opinions to the end user.

No they don't
Why would everyone naming it whatever they feel like be better and create less confusion? Do tell.
 
I mean, it's what they named it, man. It'd be a sham to name it anything else. We leave the opinions to the end user.


Why would everyone naming it whatever they feel like be better and create less confusion? Do tell.
In all seriousness, I wouldn't expect them to call it a 4070, and at no point did I suggest otherwise.

It should be pointed out as THE slightly gimped 4080, or the shit one.
 
In all seriousness, I wouldn't expect them to call it a 4070, and at no point did I suggest otherwise.

It should be pointed out as THE slightly gimped 4080, or the shit one.
But it's not. It's called a 4080, same as the 16 GB version. But it's not the same.

The average Joe will walk into a store and think "hey, this 4080 is cheaper than that one. Cool, I don't need 16 GB VRAM anyway" only to find out later that his card is a lot slower than what he expected it to be. Or maybe he doesn't even realise it straight away. I'm not sure which scenario is more sad.

It's not like the RX 480, which you could buy with either 4 or 8 GB VRAM on the same GPU, even though the small difference in price suggests so.

I agree with JayzTwoCents: it should be legally mandated to include specs on the box.
 
But it's not. It's called a 4080, same as the 16 GB version. But it's not the same.

The average Joe will walk into a store and think "hey, this 4080 is cheaper than that one. Cool, I don't need 16 GB VRAM anyway" only to find out later that his card is a lot slower than what he expected it to be. Or maybe he doesn't even realise it straight away. I'm not sure which scenario is more sad.

It's not like the RX 480, which you could buy with either 4 or 8 GB VRAM on the same GPU, even though the small difference in price suggests so.

I agree with JayzTwoCents: it should be legally mandated to include specs on the box.
I meant in the press; we can't rename it, as others have said, but we can make it known that we don't like it and recognise the BS it represents.

The 12 GB has no Founders Edition, so EVGA were on the right track IMHO.

MSRP means nothing, so clearly the 12 GB cards are going to end up at or above the cost of Nvidia's 16 GB card.

I wouldn't touch Nvidia with yours this time out, but many couldn't care less about ethics; a shame, as this shit's not new.

I agree with your statements though so no argument here.
 
I meant in the press; we can't rename it, as others have said, but we can make it known that we don't like it and recognise the BS it represents.

The 12 GB has no Founders Edition, so EVGA were on the right track IMHO.

MSRP means nothing, so clearly the 12 GB cards are going to end up at or above the cost of Nvidia's 16 GB card.

I wouldn't touch Nvidia with yours this time out, but many couldn't care less about ethics; a shame, as this shit's not new.

I agree with your statements though so no argument here.
I see what you mean. If I were press, I'd start my review by stating that the 4080 12 GB is only a 4080 in name, and that I do not recommend anyone buy one at MSRP. It's essentially a scam.

I agree with you too - I'll give nvidia a hard pass on the 40-series as well. In fact, I'm tempted to build an all-AMD machine again.

Edit: The really sad thing is what JayzTwoCents said in his video. The card may be awesome, but the whole nvidia experience gets soured by their shady practices around naming and pricing.
 
Skipping this gen. Both CPU and GPU from this year are minuscule upgrades from last gen. Waiting on 15th gen and RTX 5000 series.
 