Tuesday, May 9th 2023

NVIDIA GeForce RTX 4070 Variant Could be Refreshed With AD103 GPU

Hardware tipster kopite7kimi has learned from insider sources that a variant of NVIDIA's GeForce RTX 4070 graphics card could be lined up with a different GPU - the larger AD103 instead of the currently used AD104-250-A1. The Ada Lovelace architecture is a staple across the RTX 40-series of graphics cards, but a fully unlocked AD103 is not yet attached to any product on the market - it would be a strange move for NVIDIA to refresh or expand the mid-range RTX 4070 lineup with a much larger GPU, albeit in a reduced form. A cut-down variant of the AD103 is currently housed within NVIDIA's GeForce RTX 4080 graphics card - its AD103-300-A1 GPU has 9728 CUDA cores, with Team Green's engineers choosing to disable 5% of the full die's capabilities.

The hardware boffins will need to do a lot of pruning if the larger GPU ends up on the rumored RTX 4070 refresh - the SKU's 5,888 CUDA core count would require a roughly 42% reduction relative to the full AD103. It is somewhat curious that the RTX 4070 Ti has not been mentioned by the tipster - you would think that the more powerful card (than the standard 4070) would be the logical and immediate candidate for this type of treatment. In theory, NVIDIA could be salvaging dies that do not meet RTX 4080-level standards and repurposing them for step-down card models.
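The cut-down percentages are easy to sanity-check with a quick sketch (core counts taken from TPU's GPU database):

```python
# Quick sanity check of the cut-down percentages quoted above.
# Core counts per TPU's GPU database.
AD103_FULL = 10240   # fully enabled AD103
RTX_4080 = 9728      # AD103-300-A1 as shipped in the RTX 4080
RTX_4070 = 5888      # current RTX 4070 spec (AD104-250-A1)

def cut_down(enabled: int, full: int) -> float:
    """Percentage of the full die's CUDA cores that is disabled."""
    return (1 - enabled / full) * 100

print(f"RTX 4080 on AD103: {cut_down(RTX_4080, AD103_FULL):.1f}% disabled")  # 5.0%
print(f"RTX 4070 on AD103: {cut_down(RTX_4070, AD103_FULL):.1f}% disabled")  # 42.5%
```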

According to TPU's GPU database the NVIDIA AD103: "uses the Ada Lovelace architecture and is made using a 5 nm production process at TSMC. With a die size of 379 mm² and a transistor count of 45,900 million it is a large chip. AD103 supports DirectX 12 Ultimate (Feature Level 12_2). For GPU compute applications, OpenCL version 3.0 and CUDA 8.9 can be used. Additionally, the DirectX 12 Ultimate capability guarantees support for hardware-ray tracing, variable-rate shading and more, in upcoming video games. It features 10240 shading units, 320 texture mapping units and 112 ROPs. Also included are 320 tensor cores which help improve the speed of machine learning applications. The GPU also contains 80 ray tracing acceleration cores."
Further reading: Ada Architecture Whitepaper
Sources: kopite7kimi Tweet, VideoCardz, Tweak Town
Add your own comment

25 Comments on NVIDIA GeForce RTX 4070 Variant Could be Refreshed With AD103 GPU

#1
Quitessa
Gotta simply be a way to scavenge bad dies - especially ones with duff memory controllers that can't hit the full 256-bit of the 4080, so cutting it back to the 192-bit of the 4070 (60 really), or ones with defects in other areas that would prevent selling it as any sort of variant of the 4080 (should be 70).
Posted on Reply
#2
Paranoir andando
15 years of Nvidia graphics cards:



The name of the cards means nothing

Nvidia wants to continue with the crypto prices:
Real4050--> 450€ , Real4060--> 600€ , Real4060 Ti--> 800€ , Real4070--> 1100€ ,

if you don't want to pay it, they turn around, change the names, and they try it again:
Now Real4060Ti is called 4070 Ti, buy it, it only costs 800€
Now Real4050 is called 4060 Ti, buy it, it only costs 450€
Posted on Reply
#3
rainzor
Would be interesting if it had the same core count and full 256bit bus with 16GB of 16Gbps G6 memory.
Also, there's a mistake in that green table in the article - Ada is using a 5nm process, not 4.
Paranoir andandoThe name of the cards means nothing

Nvidia wants to continue with the crypto prices:
Real4050--> 450€ , Real4060--> 600€ , Real4060 Ti--> 800€ , Real4070--> 1100€ ,

if you don't want to pay it, they turn around, change the names, and they try it again:
Now Real4060Ti is called 4070 Ti, buy it, it only costs 800€
Now Real4050 is called 4060 Ti, buy it, it only costs 450€
So what you're saying is nvidia is competing against 7900xtx with 70 series and against 7900xt with 60 series? Cool.
Posted on Reply
#4
oxrufiioxo
rainzorSo what you're saying is nvidia is competing against 7900xtx with 70 series and against 7900xt with 60 series? Cool.
If someone wants to call the 4080 a 70 tier product fine but that makes the AMD ones look even worse and shows how uncompetitive they really are.
Posted on Reply
#5
kondamin
Paranoir andando15 years of Nvidia graphics cards:



The name of the cards means nothing

Nvidia wants to continue with the crypto prices:
Real4050--> 450€ , Real4060--> 600€ , Real4060 Ti--> 800€ , Real4070--> 1100€ ,

if you don't want to pay it, they turn around, change the names, and they try it again:
Now Real4060Ti is called 4070 Ti, buy it, it only costs 800€
Now Real4050 is called 4060 Ti, buy it, it only costs 450€
Gonna need some more inflation before I'm going to spend anywhere near €1,100 for a graphics card
Posted on Reply
#6
JimmyDoogs
This is so they can make a 16GB variant right?
Posted on Reply
#7
oxrufiioxo
JimmyDoogsThis is so they can make a 16GB variant right?
Probably 8GB for all them Green Kool-aid lovers.
Posted on Reply
#8
Mr. Perfect
rainzorWould be interesting if it had the same core count and full 256bit bus with 16GB of 16Gbps G6 memory.
That would make sense to me, just call it a 4070 Super.
Posted on Reply
#9
tvshacker
Mr. PerfectThat would make sense to me, just call it a 4070 Super.
Super or Ultra would be the logical "monikers", but let's not forget we're talking about Nvidia, who tried to sell the 4070TI as the "4080 12G", so we're probably going to get a "4070TI 16G"
Posted on Reply
#10
sethmatrix7
oxrufiioxoIf someone wants to call the 4080 a 70 tier product fine but that makes the AMD ones look even worse and shows how uncompetitive they really are.
Idk why anyone would buy a 4080. Why pay $200+ dollars more for the same performance? Sure looks pretty competitive...

Posted on Reply
#11
oxrufiioxo
sethmatrix7Idk why anyone would buy a 4080. Why pay $200+ dollars more for the same performance?

I mean, if you want to play CP2077 or Witcher 3 NG and care about upscaling, the 4080 is the better option; if all you care about is raster performance, sure, the 7900XTX is fine. I would personally lean towards the 4080, just because the models of the 7900XTX I like aren't that much cheaper than the FE 4080, which is generally pretty great.

I also hate FSR though.

I think they are both meh products but I can see why people would choose one over the other.

They are not far enough apart in price for me to feel the 7900XTX is worth buying over it. Now, if AMD's slides at the RDNA3 launch were accurate, then sure, but they were off by like 20%.
Posted on Reply
#12
sethmatrix7
oxrufiioxoI mean, if you want to play CP2077 or Witcher 3 NG and care about upscaling, the 4080 is the better option; if all you care about is raster performance, sure, the 7900XTX is fine. I would personally lean towards the 4080, just because the models of the 7900XTX I like aren't that much cheaper than the FE 4080, which is generally pretty great.

I also hate FSR though.

I think they are both meh products but I can see why people would choose one over the other.

They are not far enough apart in price for me to feel the 7900XTX is worth buying over it. Now, if AMD's slides at the RDNA3 launch were accurate, then sure, but they were off by like 20%.
So, someone with a niche usage scenario who doesn't care about money.

As an aside- marketing and stupid consumers have done a number on this enthusiast hobby.
Posted on Reply
#13
oxrufiioxo
sethmatrix7So, someone with a niche usage scenario who doesn't care about money.

As an aside- marketing and stupid consumers have done a number on this enthusiast hobby.
I mean they both looked so bad to me I grabbed a 4090 instead so I guess the marketing worked on me... Release crap 1000 usd cards to upsell the 1600 usd one......

Are you trying to say that if someone likes the 4080 better and can afford it, they shouldn't buy it? Number 1, it's their money; number 2, people place different values on their own hobbies. I only game 4 hours a week, but I still don't mind spending quite a bit on a GPU.

And while I would give you that in the $300-600 range, where people don't typically have as much disposable income and every dollar counts, someone who can afford a $1,000 GPU can likely afford a $1,200 one, and if the $1,200 one offers them features they prefer, that is just the way the cookie crumbles.

AMD needs to do a lot better in a lot of areas before I consider them again. They improved quite a bit with the 6000 series, but I feel the 7000 series in general is still a step back.... FSR still isn't very good, and RT performance in RT-heavy games is terrible to the point that you can't even use it with their GPUs.

Again if someone only cares about Raster then sure more power to them.
Posted on Reply
#14
THU31
If they had any sense in them, they'd release a 4070 SUPER near the end of the year, with 16 GB and a 256-bit bus.

But this is probably just about salvage.
Posted on Reply
#15
oxrufiioxo
THU31If they had any sense in them, they'd release a 4070 SUPER near the end of the year, with 16 GB and a 256-bit bus.

But this is probably just about salvage.
I think that only happens if, A: sales are as bad as they seem, with pretty much every GPU below MSRP at this point, or B: the 50 series is delayed due to issues with 3nm, making a 20-series-SUPER-like refresh viable.

Even if that does happen, I just expect slightly more compelling tier-for-tier cards at the original MSRPs, at best.
Posted on Reply
#16
Paranoir andando
tvshackerSuper or Ultra would be the logical "monikers", but let's not forget we're talking about Nvidia who tried to sell the 4070TI as "4080 12G" so we're gonna probably going to get a "4070TI 16G"
No no no no. They tried to sell the 4060-4060 Ti as the "4080 12G", and finally they decided to name it the 4070 Ti - don't be fooled. See the chart I posted above.
Posted on Reply
#17
ValenOne
sethmatrix7Idk why anyone would buy a 4080. Why pay $200+ dollars more for the same performance? Sure looks pretty competitive...

My main reasons for the Gigabyte RTX 4080 Gaming OC (55 TFLOPS @ 2835 MHz) are 16 GB VRAM, productivity raytracing, and cooling. The RTX 4080 runs cooler than my old RTX 3080 Ti.
For similar reasons, my other gaming workstation PC has an ASUS TUF 4090 OC 24 GB (89.9 TFLOPS @ 2745 MHz).


From www.cgdirector.com/rtx-4080-review-content-creation/

For games RT...

From www.techpowerup.com/review/amd-radeon-rx-7900-xtx/34.html

For high-end NVIDIA customers with an RT focus, the RX 7900 XTX is a side grade from the RTX 3080 Ti / RTX 3090 / RTX 3090 Ti, but with improved raster and higher VRAM.

RX 7900 XT 20 GB is a good upgrade from RTX 3070 Ti 8 GB level GPUs.
Posted on Reply
#18
Minus Infinity
The 4070 was always meant to use AD103 up until 3-4 months before launch, if you can believe certain leaks. But it only makes sense if the dies that don't make the cut for the 4080 are cheaper than good AD104 dies. Honestly, for $799 the 4070 Ti should have always been on AD103 with 16GB and 15% fewer cores than the 4080, which frankly should have been 20GB and $999.
Posted on Reply
#19
N/A
If it's the exact same configuration, it wouldn't matter. The current lineup needs GDDR7; the 4090 is barely 2x faster than a 4070 with 3x the core count - yeah, core count means nothing if underutilised. In the next generation, Nvidia needs to do just that and call it a day.
Posted on Reply
#20
oxrufiioxo
N/AIf it's the exact same configuration, it wouldn't matter. The current lineup needs GDDR7; the 4090 is barely 2x faster than a 4070 with 3x the core count - yeah, core count means nothing if underutilised. In the next generation, Nvidia needs to do just that and call it a day.
If you look at how other flagships scaled vs the 70-tier product, it's honestly about right.

Going from a 1070 to a 1080ti was an increase of 87% in cuda cores for a 50% increase in performance.

Going from a 2070 to a 2080ti was an increase of 88% in cuda cores for a 56% increase in performance.

Going from a 3070 to a 3090 is 80% more cuda cores for about 40% more performance, which goes up to around 50% in RT, assuming the 3070 doesn't run out of vram.

4070 to 4090 is a 178% increase in cuda cores for 84% more performance, but around 100% in RT.

This was always a diminishing-returns sorta thing due to cache/ROPs/TMUs, and now RT/Tensor cores as well.

Also, let's be real: if a developer targeted a 4090 and only a 4090 while making a game, that is probably the only scenario where you would get a comparable increase. A game that runs on a 4090 also has to run on a 1650, lol.
Posted on Reply
#21
ValenOne
N/AIf it's the exact same configuration, it wouldn't matter. The current lineup needs GDDR7; the 4090 is barely 2x faster than a 4070 with 3x the core count - yeah, core count means nothing if underutilised. In the next generation, Nvidia needs to do just that and call it a day.
The 4090's 72 MB L2 cache and 1008 GB/s memory bandwidth are 2X the 4070's 36 MB L2 cache and 504.2 GB/s memory bandwidth.

The problem is GDDRx memory manufacturers, not NVIDIA.
Posted on Reply
#22
Beermotor
oxrufiioxoI mean they both looked so bad to me I grabbed a 4090 instead so I guess the marketing worked on me... Release crap 1000 usd cards to upsell the 1600 usd one......

Are you trying to say that if someone likes the 4080 better and can afford it, they shouldn't buy it? Number 1, it's their money; number 2, people place different values on their own hobbies. I only game 4 hours a week, but I still don't mind spending quite a bit on a GPU.

And while I would give you that in the $300-600 range, where people don't typically have as much disposable income and every dollar counts, someone who can afford a $1,000 GPU can likely afford a $1,200 one, and if the $1,200 one offers them features they prefer, that is just the way the cookie crumbles.

AMD needs to do a lot better in a lot of areas before I consider them again. They improved quite a bit with the 6000 series, but I feel the 7000 series in general is still a step back.... FSR still isn't very good, and RT performance in RT-heavy games is terrible to the point that you can't even use it with their GPUs.

Again if someone only cares about Raster then sure more power to them.
I got a 4080 because I couldn't find a 4090 or a XTX in stock when I had a $500 hardware purchase credit burning a hole in my pocket. $800 out of pocket is what I think the MSRP of the 4080 should have been in the first place so it didn't hurt so bad.

That and it kind of balances out to the positive because when you're not in a game the thing pulls maybe 14 watts and sits at maybe 40c without the fans running. As I'm writing this it is bouncing between 14 and 20 watts and is dead silent.

With that said, I'm going to have to remind you of three things:
  • that there are posts on this board with people praising the 3090ti's RT performance not 9 months ago
  • The 7900XTX has RT performance on par with a 3090ti.
  • The 4070ti has RT performance on par with the 3090ti.
That's objectively not terrible unless you're claiming the 4070ti RT is "terrible to the point you can't even use it."

You've very clearly never owned an RT-capable AMD GPU, or you'd know how disingenuous and dishonest you sound.
Posted on Reply
#23
oxrufiioxo
BeermotorI got a 4080 because I couldn't find a 4090 or a XTX in stock when I had a $500 hardware purchase credit burning a hole in my pocket. $800 out of pocket is what I think the MSRP of the 4080 should have been in the first place so it didn't hurt so bad.

That and it kind of balances out to the positive because when you're not in a game the thing pulls maybe 14 watts and sits at maybe 40c without the fans running. As I'm writing this it is bouncing between 14 and 20 watts and is dead silent.

With that said, I'm going to have to remind you of three things:
  • that there are posts on this board with people praising the 3090ti's RT performance not 9 months ago
  • The 7900XTX has RT performance on par with a 3090ti.
  • The 4070ti has RT performance on par with the 3090ti.
That's objectively not terrible unless you're claiming the 4070ti RT is "terrible to the point you can't even use it."

You've very clearly never owned an RT-capable AMD GPU, or you'd know how disingenuous and dishonest you sound.
I was never super high on the 3090ti, and honestly I almost did the same as you and grabbed a 4080 FE while waiting for a 4090 to come in stock. While I agree the 7900XTX doesn't have terrible RT performance, it's still more than a generation behind what I ended up purchasing and what I was targeting from a performance standpoint. I don't really think catching up to a last-gen flagship on a 2-year-old-plus arch is all that impressive, and in general it's at best 20% better in RT vs the 2-year-old 3080ti in my secondary PC.

Even though I'm not super high on the 4080 due to its asking price, I would pick it over the 7900XTX.

And for multiple reasons, I think the 4070ti is a pretty terrible card - it really has no redeeming quality other than efficiency. That's just my opinion; if others think it's the best thing since sliced bread, good for them.



At the end of the day if someone looks at any of these cards and decides they are best for them more power to them and honestly I hope they serve them well for years to come.
Posted on Reply
#24
N/A
oxrufiioxo4070 to 4090 is a 178% increase in cuda cores for 84% more performance, but around 100% in RT.
Comparing the 4070 to the 4090 is like comparing a 1660 Ti to a 2080 Ti ~~ +180% more cores, +100% bandwidth, and 120% faster as an average of those two components.
But there is a third one: the 1660 had 48 ROPs, and the equivalent would be the 4070 having 96, while it only has 64.

Now, with the 4090 being exactly 2x faster than the 4070 in 4K, and considering the 4070 should be losing 10% of efficiency at 4K, this is a terrible result for the 4090.

A 4090 with 1008 GB/s is the equivalent of a 4070 with 336 GB/s, in the sense that it would be a disaster. Therefore the 4090 should have 1512 GB/s, and this is where GDDR7 comes into play.
Only then can we expect it to be 2.5x faster - losing some efficiency, but not as bad as the 1080 Ti and 2080 Ti, which only had 38% more bandwidth and ROPs than the 70-class, so +55% perf. made sense.
Posted on Reply
#25
oxrufiioxo
N/AComparing the 4070 to the 4090 is like comparing a 1660 Ti to a 2080 Ti ~~ +180% more cores, +100% bandwidth, and 120% faster as an average of those two components.
But there is a third one: the 1660 had 48 ROPs, and the equivalent would be the 4070 having 96, while it only has 64.

Now, with the 4090 being exactly 2x faster than the 4070 in 4K, and considering the 4070 should be losing 10% of efficiency at 4K, this is a terrible result for the 4090.

A 4090 with 1008 GB/s is the equivalent of a 4070 with 336 GB/s, in the sense that it would be a disaster. Therefore the 4090 should have 1512 GB/s, and this is where GDDR7 comes into play.
Only then can we expect it to be 2.5x faster - losing some efficiency, but not as bad as the 1080 Ti and 2080 Ti, which only had 38% more bandwidth and ROPs than the 70-class, so +55% perf. made sense.
It definitely would be interesting. My 4090 overclocks to 1152 GB/s, which is about 14% more bandwidth vs stock, for a whopping 1-2% increase in performance. Hopefully GDDR7 brings better latency etc. as well.
I've never had a flagship scale linearly with bandwidth - there were always diminishing returns, and the massive L2 cache is likely offsetting this a bit.

I, for one, am OK with 100% more performance in RT-heavy games; it's more than what we got last generation for about the same price increase. My guess is, if the 4090 was 150% faster than the 4070, it would be a lot more expensive.

The 2080ti was 328% more expensive than the 1660ti; the 4090 is only 166% more expensive than the 4070, so it's not a very good comparison from a cost-increase perspective. Also, the TPU database only shows the 2080ti as 92% faster than the 1660ti.
Posted on Reply