
NVIDIA "Ada Lovelace" Architecture Designed for N5, GeForce Returns to TSMC

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
42,866 (8.03/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
NVIDIA's upcoming "Ada Lovelace" architecture, spanning both compute and graphics, is reportedly being designed for TSMC's 5-nanometer silicon fabrication node. This marks NVIDIA's return to the Taiwanese foundry after its brief excursion to Samsung with the 8 nm "Ampere" graphics architecture; "Ampere" compute dies continue to be built on TSMC's 7 nm node. NVIDIA is looking to double the compute performance of its next-generation GPUs, with throughput approaching 70 TFLOP/s, thanks to a near-doubling of CUDA cores generation-over-generation. These chips are also expected to run at clock speeds above 2 GHz. Expect "Ada Lovelace" no earlier than 2022, as TSMC's N5 node matures.
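As a sanity check, peak FP32 throughput is simply cores × 2 FLOPs per clock (one fused multiply-add) × clock speed. A quick back-of-the-envelope in Python, using the rumored ~18,432-core flagship configuration (an assumption from leaks, not an NVIDIA-confirmed figure):

```python
# Back-of-the-envelope FP32 throughput estimate.
# Core counts and clocks for "Ada Lovelace" are rumored, not official.
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOP/s: each CUDA core retires one FMA (2 FLOPs) per clock."""
    return cuda_cores * 2 * clock_ghz / 1000

# Rumored "Ada Lovelace" flagship: ~18,432 cores at just under 2 GHz.
print(fp32_tflops(18432, 1.9))   # ~70 TFLOP/s
# Compare GA102 (RTX 3090): 10,496 cores at ~1.7 GHz boost.
print(fp32_tflops(10496, 1.7))   # ~35.7 TFLOP/s
```

The rumored figure is internally consistent: doubling cores at slightly higher clocks lands almost exactly at the claimed 70 TFLOP/s.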



 
Joined
Jul 1, 2011
Messages
274 (0.07/day)
System Name Matar Extreme PC.
Processor Intel Core i9-9900KF @5GHZ
Motherboard Asrock Pro4 Z370 With Wi-Fi & Bluetooth
Cooling CoolerMaster ML240L AIO
Memory 4x8 32GB DDR4 3200MHZ G.SKILL
Video Card(s) Nvidia RTX 2070 Super OC Edition
Storage 256GB Nvme for OS + 1TB 2.5" SSD & 2.5" 256GB SSD
Display(s) Lenovo 34" Ultra Wide 3440x1440 144hz 1ms G-Sync
Case NZXT H510
Audio Device(s) Internal
Power Supply Cooler Master 850 Watts , Gold
Mouse Logitech M500 Laser Mouse
Keyboard ASUS keyboard, & Xbox ONE Wireless Controller
Software Windows 10 Home
Benchmark Scores https://www.youtube.com/user/matttttar/videos
I bought an RTX 2070 Super this year and it's rocking at 3440x1440, so I'm not upgrading for a while. I even tried an RTX 3060 but wasn't happy with it.
 
Joined
Mar 28, 2020
Messages
1,242 (1.59/day)
If this is true, then I think Nvidia is truly worried about AMD's progress in the GPU space. The reason I say that is because Nvidia's "gaming" GPUs were not manufactured on near-cutting-edge nodes while Nvidia was dominating the high-end GPU space over the last few years. When AMD introduced their first TSMC N7 GPU, Turing launched on TSMC 12 nm (basically a 16 nm), and then Nvidia slowly moved to Samsung 8 nm (essentially a 10 nm) even though AMD had already been using N7 for a year or two. So now, with competition heating up, continuing to go for cheaper nodes is not going to do them any favors.

I bought an RTX 2070 Super this year and it's rocking at 3440x1440, so I'm not upgrading for a while. I even tried an RTX 3060 but wasn't happy with it.
The RTX 2070 Super is faster than an RTX 3060 for sure. The only benefit of going with the RTX 3060 is the 50% increase in VRAM, which may be more beneficial in the long run.
 
Joined
Jul 1, 2011
Messages
274 (0.07/day)
If this is true, then I think Nvidia is truly worried about AMD's progress in the GPU space. The reason I say that is because Nvidia's "gaming" GPUs were not manufactured on near-cutting-edge nodes while Nvidia was dominating the high-end GPU space over the last few years. When AMD introduced their first TSMC N7 GPU, Turing launched on TSMC 12 nm (basically a 16 nm), and then Nvidia slowly moved to Samsung 8 nm (essentially a 10 nm) even though AMD had already been using N7 for a year or two. So now, with competition heating up, continuing to go for cheaper nodes is not going to do them any favors.


The RTX 2070 Super is faster than an RTX 3060 for sure. The only benefit of going with the RTX 3060 is the 50% increase in VRAM, which may be more beneficial in the long run.
Oh yes, I know the 2070 Super is faster; I just gave the 3060 a try for testing, not because I was going to replace my 2070 Super with it.
 
Joined
May 31, 2016
Messages
3,376 (1.55/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Corsair AXi 760W / Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Is that supposed to be the multi-chip design?
 
Joined
Nov 11, 2016
Messages
2,191 (1.09/day)
System Name The de-ploughminator
Processor I7 9900K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 3090 + Bitspower WB
Storage Plextor 512GB nvme SSD
Display(s) LG OLED CX48"
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software Win10
Oh yes, I know the 2070 Super is faster; I just gave the 3060 a try for testing, not because I was going to replace my 2070 Super with it.

I would prefer the 3060 over the 2070 Super because of HDMI 2.1, which makes the 3060 very suitable for an HTPC; it goes along well with an OLED TV too :D
 
Joined
Apr 16, 2013
Messages
373 (0.11/day)
Location
Bulgaria
System Name Black Knight | White Queen
Processor Intel Core i9-10940X | Intel Core i7-5775C
Motherboard ASUS ROG Rampage VI Extreme Encore X299G | ASUS Sabertooth Z97 Mark S (White)
Cooling Noctua NH-D15 chromax.black | Xigmatek Dark Knight SD-1283 Night Hawk (White)
Memory G.SKILL Trident Z RGB 4x8GB DDR4 3600MHz CL16 | Corsair Vengeance LP 4x4GB DDR3L 1600MHz CL9 (White)
Video Card(s) KFA2/Galax GeForce GTX 1080 Ti Hall of Fame Edition | Intel Iris Pro 6200
Storage Samsung 980 Pro 1TB, 850 Pro 256GB, 840 Pro 256GB, WD Red 2/3/6TB, WD VelociRaptor 300GB/600GB/1TB
Display(s) Dell Alienware AW2721D 240Hz, ASUS ROG Strix XG279Q 170Hz, ASUS PA246Q 60Hz| Samsung JU7500 48'' TV
Audio Device(s) ASUS Xonar Essence STX | Realtek ALC1150
Power Supply Enermax Revolution 1250W 85+ | Super Flower Leadex Gold 650W (White)
Mouse Razer Basilisk Ultimate, Razer Naga Trinity | Razer Mamba 16000
Keyboard Razer Blackwidow Chroma V2 (Orange switch) | Razer Ornata Chroma
Software Windows 10 Pro 64bit
Ampere is such a flop with Samsung's 8nm.
 
Joined
Aug 21, 2013
Messages
1,122 (0.35/day)
Ampere is such a flop with Samsung's 8nm.
Agreed. Though there are people who argue that it's not that much worse than TSMC's 7 nm. That argument only looks at density, though, and ignores power characteristics, output quantity, and yields. It does not help matters that Micron's G6X is also very power hungry for a small bump in effective speed over standard 16 Gbps G6 (18 Gbps G6 has existed since Turing).
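The bandwidth gap G6X buys really is modest: bandwidth is just per-pin data rate × bus width ÷ 8. A rough comparison on a 384-bit bus (the RTX 3090's configuration; the 18 Gbps G6 part is the hypothetical swap-in the post alludes to):

```python
# Memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

# 19.5 Gbps GDDR6X vs a hypothetical 18 Gbps GDDR6 part on the same 384-bit bus:
print(bandwidth_gbs(19.5, 384))  # 936.0 GB/s
print(bandwidth_gbs(18.0, 384))  # 864.0 GB/s -- only ~8% less
```

So the effective-speed "bump" works out to single-digit percentages for a noticeable increase in memory power draw.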

I hope Lovelace, or whatever it ends up being called, uses TSMC once again, and that Micron fixes G6X's power draw or Samsung comes out with 20 Gbps G6 to replace G6X. Turing was an insult, with nonexistent (RT) and half-baked (DLSS 1.0) features and a high price. Ampere is just expensive to produce, hot, low-yielding, and power hungry. Samsung's 8 nm process was never meant to produce such large chips; even in smartphones, Samsung's 8 nm always lost to TSMC.
The only reason Ampere is half decent is Nvidia's architecture and the monstrous cooling solutions from Nvidia and AIBs keeping it in check.

If we were not in the middle of a global pandemic, a supply shortage, and a mining boom, the low (at least lower than Turing) MSRPs would have made Ampere tolerable. But it is not as great as Maxwell or Pascal were, especially the 1080 Ti at launch: $700 was a steal for it, and even years later Nvidia could only produce a 2080 Ti that was slightly faster. Only with Ampere was the 1080 Ti finally beaten by midrange cards. Cards that cost more than $700...
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
6,379 (1.16/day)
System Name MightyX
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage Samsung 970 Evo m.2 NVME
Display(s) 21:9 3440x1440 144hz 1500R VA
Case Coolermaster NR200P
Power Supply Corsair SF750 Platinum
Mouse Zowie EC1-A
Keyboard Razer Blackwidow X Chroma
Software case populated with Artic P12's
Ampere is such a flop with Samsung's 8nm.
Desktop Ampere is not what it *could* have been on, for example, TSMC 7nm, but a flop?

*checks notes*

Sure doesn't seem that way.
 
Joined
May 3, 2018
Messages
688 (0.47/day)
Nope, Lovelace is supposed to be monolithic. They also have Hopper, which is MCM, but that is for data center and HPC customers.
That is not a given. There are leaks saying that if RDNA3 is good enough, Nvidia will skip Lovelace and go straight to Hopper for desktop. RDNA3 will be MCM on Big Navi at least, but Lovelace is just an evolution of Ampere. It is reportedly 60-80% faster than Ampere, while RDNA3 is at least 100% faster, and the biggest Navi 31 could be 200% faster, albeit at an obscene $2K price.
 
Joined
Nov 11, 2016
Messages
2,191 (1.09/day)
That is not a given. There are leaks saying that if RDNA3 is good enough, Nvidia will skip Lovelace and go straight to Hopper for desktop. RDNA3 will be MCM on Big Navi at least, but Lovelace is just an evolution of Ampere. It is reportedly 60-80% faster than Ampere, while RDNA3 is at least 100% faster, and the biggest Navi 31 could be 200% faster, albeit at an obscene $2K price.

I wouldn't bet on an MCM design for gaming at this early stage, SLI and Xfire died for a reason LOL.
 
Joined
May 3, 2018
Messages
688 (0.47/day)
I wouldn't bet on an MCM design for gaming at this early stage, SLI and Xfire died for a reason LOL.
I'm only talking MCM in the flagship, not the mainstream. They might have a 7950 XT, 7900 XT, and 7800 XT. The 7950 XT would be $2K and just for bragging rights. I doubt the 4090 would get near it if the specs are to be believed.
 
Joined
Aug 21, 2013
Messages
1,122 (0.35/day)
That is not a given. There are leaks saying that if RDNA3 is good enough, Nvidia will skip Lovelace and go straight to Hopper for desktop. RDNA3 will be MCM on Big Navi at least, but Lovelace is just an evolution of Ampere. It is reportedly 60-80% faster than Ampere, while RDNA3 is at least 100% faster, and the biggest Navi 31 could be 200% faster, albeit at an obscene $2K price.
We don't know. Nvidia is a black (green?) box when it comes to keeping these things close to its chest. The leaks about AMD and Intel products tend to be far more reliable.
I wouldn't bet on an MCM design for gaming at this early stage, SLI and Xfire died for a reason LOL.
MCM is invisible to the OS and games. It's a hardware solution that does not depend on OS or game developers optimizing for it; as far as they are concerned, they see one monolithic chip, and load balancing is done in hardware. At least that is what AMD's patents have shown so far. SLI and Crossfire being dead is a good thing. Nothing good ever came out of those.
 
Joined
Nov 11, 2016
Messages
2,191 (1.09/day)
MCM is invisible to the OS and games. It's a hardware solution that does not depend on OS or game developers optimizing for it; as far as they are concerned, they see one monolithic chip, and load balancing is done in hardware. At least that is what AMD's patents have shown so far. SLI and Crossfire being dead is a good thing. Nothing good ever came out of those.

If the MCM design leads to unwanted stuttering, I would rather stick to a huge monolithic chip.
Between 120 FPS with mad stuttering and a smooth 80 FPS I would pick the latter LOL. I play games, not benchmarks, which is the same reason I haven't gone back to SLI since I bought the first ever SLI GPU (7950GX2).
 
Joined
Aug 21, 2013
Messages
1,122 (0.35/day)
If the MCM design leads to unwanted stuttering, I would rather stick to a huge monolithic chip.
Between 120 FPS with mad stuttering and a smooth 80 FPS I would pick the latter LOL. I play games, not benchmarks, which is the same reason I haven't gone back to SLI since I bought the first ever SLI GPU (7950GX2).
Why would MCM lead to stuttering? MCM CPUs have been fine, for example. Monolithic chips are getting more and more expensive and have an effective size limit of around 800 mm². MCMs can scale higher: four 400 mm² chips, for example, though the first iterations use two, at least in gaming.
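The economics behind that argument can be sketched with a textbook Poisson defect-yield model; the defect density below is purely illustrative, not a real foundry number:

```python
import math

def poisson_yield(area_mm2: float, d0_per_mm2: float) -> float:
    """Fraction of defect-free dies under a simple Poisson defect model."""
    return math.exp(-area_mm2 * d0_per_mm2)

D0 = 0.001  # ~0.1 defects/cm^2 -- an illustrative mature-node figure
print(poisson_yield(800, D0))  # ~0.45: one 800 mm^2 monolithic die
print(poisson_yield(400, D0))  # ~0.67: a 400 mm^2 chiplet
# Chiplets are binned individually, so good 400 mm^2 dies can always be
# paired up -- far more usable silicon per wafer than 800 mm^2 monoliths.
```

In other words, cutting die area in half raises the fraction of good dies disproportionately, which is the core cost argument for MCM at the high end.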
 
Joined
Nov 11, 2016
Messages
2,191 (1.09/day)
Why would MCM lead to stuttering? MCM CPUs have been fine, for example. Monolithic chips are getting more and more expensive and have an effective size limit of around 800 mm². MCMs can scale higher: four 400 mm² chips, for example, though the first iterations use two, at least in gaming.

Well, MCM will have higher latency than monolithic, that's for sure.
The overhead associated with MCM for gaming is not yet known at this point. Nvidia and AMD probably thought about MCM a long time ago and just waited for the right kind of interconnect technology to make it possible.
While AMD is going to use a big pool of Infinity Cache, Nvidia will probably use networking tech from Mellanox, like the PAM4 signaling on GDDR6X. No one knows which interconnect will allow the better MCM design at this point, or whether MCM is suitable for gaming at all or just meant for workstation tasks.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
6,379 (1.16/day)
If the MCM design leads to unwanted stuttering, I would rather stick to a huge monolithic chip.
I guess I'd have to hope, and to an extent bank on, the idea that if they are going to do it, they've figured that out, because nobody wants a stuttery mess.
 
Joined
Aug 21, 2013
Messages
1,122 (0.35/day)
Well, MCM will have higher latency than monolithic, that's for sure.
The overhead associated with MCM for gaming is not yet known at this point. Nvidia and AMD probably thought about MCM a long time ago and just waited for the right kind of interconnect technology to make it possible.
While AMD is going to use a big pool of Infinity Cache, Nvidia will probably use networking tech from Mellanox, like the PAM4 signaling on GDDR6X. No one knows which interconnect will allow the better MCM design at this point, or whether MCM is suitable for gaming at all or just meant for workstation tasks.
40 ns vs 60 ns, monolithic vs MCM, at least on CPUs. On GPUs latency is far less of an issue: GDDR6 itself has much higher latency than DDR4, for example, but it is still used as system RAM in consoles. GPUs are more about bandwidth and throughput. If they are bringing out MCM GPUs, I'm assuming it's OK.
 

ppn

Joined
Aug 18, 2015
Messages
1,089 (0.44/day)
Looking at GA100's 65.6 MTr/mm² density on N7, this new N5 should land around 118 MTr/mm². Ampere on Samsung 8 nm sits at only 44. This means the maximum EUV die of 421 mm² can contain 50 billion transistors, and that is just mind-blowing.
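Those figures check out arithmetically, given GA100's published 54.2 billion transistors in 826 mm² and TSMC's quoted ~1.8x logic-density gain for N5 over N7 (a peak marketing figure; real chips typically land lower):

```python
# Sanity-checking the density arithmetic above.
ga100 = 54.2e9 / 826      # GA100: 54.2B transistors / 826 mm^2 -> ~65.6 MTr/mm^2 on N7
n5 = ga100 * 1.8          # TSMC's quoted ~1.8x logic density for N5 -> ~118 MTr/mm^2
max_die = 421 * n5 / 1e9  # transistors in a 421 mm^2 die, in billions
print(round(ga100 / 1e6, 1), round(n5 / 1e6, 1), round(max_die, 1))  # 65.6 118.1 49.7
```

So the "~50 billion transistors in 421 mm²" claim follows directly from the quoted density scaling.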
 
Joined
Nov 23, 2010
Messages
243 (0.06/day)
Would it make sense to stop working with Samsung after just one product launch? I would think that, given the supply constraints, Nvidia would continue to use both TSMC and Samsung. Samsung themselves are investing many billions to fix their manufacturing issues, so how much validity does this news item carry?
 
Joined
Dec 12, 2012
Messages
225 (0.07/day)
System Name THU
Processor Intel Core i7-9700KF (8C/8T, 4.5 GHz @ 1.18 V)
Motherboard ASUS PRIME Z370-A
Cooling SilentiumPC Fortis 3 HE1425 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 (dual rank, 16-18-18-38 2T @ 1.35 V)
Video Card(s) MSI GeForce RTX 3080 Gaming X 10 GB GDDR6X (1800/19000 @ 0.8 V)
Storage PNY XLR8 CS3030 500 GB + Corsair MP510 960 GB + 2 x Toshiba E300 3 TB
Display(s) LG OLED C8 55"
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair STRAFE RED
Software Windows 10 Home
Benchmark Scores Benchmarks in 2022?
Good. The Samsung 8N process is trash for big chips. Ampere efficiency is garbage without severe undervolting.

My 3080 can push over 270 W with ray tracing at just 1800 MHz and 0.8 V. That is crazy.

Regular games draw 200-230 W, Vsynced, rarely getting past 70-80% GPU usage.

At stock settings the clock can actually drop below 1800 MHz with ray tracing while drawing over 350 W. That is madness.
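That undervolting win is roughly what the first-order dynamic power model predicts, since switching power scales with f × V². A rough sketch, assuming a typical ~1.0 V stock operating voltage (Ampere's actual stock voltage/frequency curve varies per card and load):

```python
def dynamic_power_ratio(f1_mhz: float, v1: float, f2_mhz: float, v2: float) -> float:
    """First-order CMOS dynamic power scaling: P ~ f * V^2 (same chip, same workload)."""
    return (f2_mhz * v2 ** 2) / (f1_mhz * v1 ** 2)

# 1800 MHz at an assumed ~1.0 V stock vs the same 1800 MHz undervolted to 0.8 V:
ratio = dynamic_power_ratio(1800, 1.0, 1800, 0.8)
print(ratio)  # ~0.64 -> roughly a third less dynamic power at the same clock
# Static leakage doesn't scale with V^2 this cleanly, so real-world savings
# are somewhat smaller than this first-order estimate.
```

That quadratic voltage term is why dropping to 0.8 V buys so much more than any clock reduction would.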
 
Joined
Aug 21, 2013
Messages
1,122 (0.35/day)
Good. The Samsung 8N process is trash for big chips. Ampere efficiency is garbage without severe undervolting.

My 3080 can push over 270 W with ray tracing at just 1800 MHz and 0.8 V. That is crazy.

Regular games draw 200-230 W, Vsynced, rarely getting past 70-80% GPU usage.

At stock settings the clock can actually drop below 1800 MHz with ray tracing while drawing over 350 W. That is madness.
That's crazy. TSMC's 12 nm 2080 Ti with a 380 W limit BIOS can do 2050 MHz+ at 380 W. 1800 MHz stock at 350 W is just bad for an "8 nm" process.
 