Tuesday, December 22nd 2020

NVIDIA to Introduce an Architecture Named After Ada Lovelace, Hopper Delayed?

NVIDIA launched its GeForce RTX 3000 series of graphics cards, based on the Ampere architecture, three months ago. However, we are already getting information about the next generation that the company plans to introduce. Past rumors suggested that the architecture coming after Ampere would be called Hopper, and that Hopper would bring multi-chip module (MCM) packaging technology. However, thanks to @kopite7kimi on Twitter, a reliable source of information, we now have word that NVIDIA is reportedly working on a monolithic GPU architecture that the company internally refers to with "ADxxx" codenames.

The new monolithically-designed Lovelace architecture is going to make its debut on the 5 nm semiconductor manufacturing process, a whole year earlier than Hopper. It is unknown which foundry will manufacture the GPUs; however, both of NVIDIA's partners, TSMC and Samsung, are capable of manufacturing them. Hopper is expected to arrive sometime in 2023-2024 and utilize MCM technology, while the Lovelace architecture will appear in 2021-2022. We are not sure whether the Hopper architecture will be exclusive to data centers or extend to the gaming segment as well. The Ada Lovelace architecture is supposedly going to be a gaming GPU family. Ada Lovelace, a British mathematician, appeared on NVIDIA's 2018 GTC t-shirt known as "Company of Heroes", so NVIDIA may have already been using the ADxxx codenames internally for a long time now.
Sources: kopite7kimi, via VideoCardz

34 Comments on NVIDIA to Introduce an Architecture Named After Ada Lovelace, Hopper Delayed?

#1
kiddagoat
"Some time now". Those cards are literally three months old, if that. Why does this article come across as though those cards are already obsolete?
#2
fancucker
Another product to smash AMD. Even with a node disadvantage Ampere beats RDNA2 in raster. Can't wait for this.
kiddagoat
"Some time now". Those cards are literally three months old, if that. Why does this article come across as though those cards are already obsolete?
GPU development cycles are as long as 3 years
#3
piloponth
A 4080 Ti would be the perfect card for 4K gaming with ray tracing, if they don't mess it up with a stupid memory size/bus.
#4
Vayra86
piloponth
A 4080 Ti would be the perfect card for 4K gaming with ray tracing, if they don't mess it up with a stupid memory size/bus.
Dream on. If you game at 4K you'll be upgrading every gen or every other gen, because games will keep getting more demanding and sub-60 FPS is never far away.
#5
Night
Each time a new series is released I'm hoping it's going to be a breakthrough in performance, something worth getting right now, but all I'm seeing is incremental performance uplifts over the previous generation. I miss the early days of the HD and GTX releases. I remember when I had to get an ATI 9600 PRO to play Medal of Honor: Pacific Assault.
#6
Vayra86
Night
Each time a new series is released I'm hoping it's going to be a breakthrough in performance, something worth getting right now, but all I'm seeing is incremental performance uplifts over the previous generation. I miss the early days of the HD and GTX releases. I remember when I had to get an ATI 9600 PRO to play Medal of Honor: Pacific Assault.
Really? And when Pascal was out you bought an RX 480 of all cards, which was a complete standstill compared to whatever came before it. That gen did offer a 50% or greater perf uplift and a major efficiency boost; it was like the dawn of a new GPU era.

Not to mention that even a midrange GPU you'd buy now would easily triple your performance. I think nostalgia has clouded your view of reality a little bit. And current GPUs actually DO offer more than a perf uplift; they enable completely new lighting technology too.
#7
Night
Vayra86
Really? And when Pascal was out you bought an RX 480 of all cards, which was a complete standstill compared to whatever came before it. That gen did offer a 50% or greater perf uplift and a major efficiency boost; it was like the dawn of a new GPU era.

Not to mention that even a midrange GPU you'd buy now would easily triple your performance. I think nostalgia has clouded your view of reality a little bit. And current GPUs actually DO offer more than a perf uplift; they enable completely new lighting technology too.
GPUs before weren't released monthly, and you simply had to "get that card" to play the game at all. I bought the RX 480 because I needed a performance boost over my R9 270X at the time, while I was still gaming; now that's over. I would be irritated if I bought an expensive card and two months later there was a refresh of the same card with a ~10% perf. increase for the same price. That's why I wait a few gens for a real performance increase, and I doubt I'll change to another GPU any time soon.
#8
Vayra86
Night
GPUs before weren't released monthly, and you simply had to "get that card" to play the game at all. I bought the RX 480 because I needed a performance boost over my R9 270X at the time, while I was still gaming; now that's over. I would be irritated if I bought an expensive card and two months later there was a refresh of the same card with a ~10% perf. increase for the same price. That's why I wait a few gens for a real performance increase, and I doubt I'll change to another GPU any time soon.
Ah, gotcha. Well... generational releases have actually been pretty sparse, and they've always been spread out over several months. There's always been a GPU stack...

But I understand that at some point you just 'switch off' as you realize it's all more of the same. I sorta feel the same way, even with the RT that's in cards now. Yes, shiny eye candy... I suppose some sort of oversaturation of graphics occurs and it all starts looking samey.
#10
RedelZaVedno
Guys, what's wrong with the VR performance of the 3080 compared to the 6800 XT? I finally got a much-overpaid 3080 + HP Reverb G2 for use in DCS World, and to my horror noticed more lows (= stutters) compared to the 2080 Ti. I went to the DCS forum and found out that the 6800 XT obliterates the 3080 in this sim :( What's the problem, not enough VRAM? Should I sell the 3080 and wait for the 3080 Ti, or just buy a 6800 XT when available?
#11
Toothless
Tech, Games, and TPU!
RedelZaVedno
Guys, what's wrong with the VR performance of the 3080 compared to the 6800 XT? I finally got a much-overpaid 3080 + HP Reverb G2 for use in DCS World, and to my horror noticed more lows (= stutters) compared to the 2080 Ti. I went to the DCS forum and found out that the 6800 XT obliterates the 3080 in this sim :( What's the problem, not enough VRAM? Should I sell the 3080 and wait for the 3080 Ti, or just buy a 6800 XT when available?
Should probably make your own thread instead of hijacking a news thread.
#12
Steevo
fancucker
Another product to smash AMD. Even with a node disadvantage Ampere beats RDNA2 in raster. Can't wait for this.


GPU development cycles are as long as 3 years

Get 6-8% more performance from a 3090 for 50% more money?

Nah.
#13
theoneandonlymrk
I'm sure it'll be super.

My three-years-till-Hopper theory just got some wind behind its sails, too.
#14
zlobby
With nvidia and Lovelace I can think of only one thing - Linda Lovelace! :D
#15
Hardware Geek
Night
GPUs before weren't released monthly, and you simply had to "get that card" to play the game at all. I bought the RX 480 because I needed a performance boost over my R9 270X at the time, while I was still gaming; now that's over. I would be irritated if I bought an expensive card and two months later there was a refresh of the same card with a ~10% perf. increase for the same price. That's why I wait a few gens for a real performance increase, and I doubt I'll change to another GPU any time soon.
The main problem is that consumers have been conditioned to expect there will be a Ti, or Super, or XT, or some other such thing, and simply accept it. These companies know they can get top dollar from the early adopters, then release a new variant of the same product with a longer name and minor differences, and people will pay top dollar for that too, because they have to have the best and fastest of everything. I've gotten very tired of trying to keep up with the annual upgrade cycles; I no longer care if my rna3 gtx8600 super ti turbo edition is the fastest, and I simply enjoy what I have and upgrade when there is a compelling reason. I'm close to pulling the trigger on building a new machine, as my older i7 is getting a bit long in the tooth, but I'll be looking for good value instead of the fastest possible option for every component.
#16
Vayra86
zlobby
With nvidia and Lovelace I can think of only one thing - Linda Lovelace! :D
Finally a GOOD sticker for a GPU shroud. Bring it on, for nostalgia's sake.

They can even give it DDR3, it'll still sell.
#17
ZoneDymo
imagine supporting Nvidia after that whole Hardware Unboxed affair.
#18
Toothless
Tech, Games, and TPU!
ZoneDymo
imagine supporting Nvidia after that whole Hardware Unboxed affair.
Every company has done something bad or screwed up somehow. You've got AMD with Bulldozer/Piledriver, Intel with the security flaws, NVIDIA with bad PR. You might as well live on Mars and make your own companies.

Oh wait, Tesla is gonna get there, nevermind.
#19
theoneandonlymrk
Toothless
Every company has done something bad or screwed up somehow. You've got AMD with Bulldozer/Piledriver, Intel with the security flaws, NVIDIA with bad PR. You might as well live on Mars and make your own companies.

Oh wait, Tesla is gonna get there, nevermind.
There's always going to be debate on that. I totally disagree with you, but by the same token partially agree, in that your examples aren't comparable: tech fails versus trying to write your own review aren't the same.
And yes, you damn well can vote with your wallet.

But I agree companies are companies and any can mess up. I wouldn't reward those personally, but then that's changeable, and I wouldn't hold grudges for years. I would retain the memory, though, and I remember a hell of a lot more business shenanigans from Intel, NVIDIA and Apple than I do from AMD, since the eighties anyway.
#20
bug
Hardware Geek
The main problem is that consumers have been conditioned to expect there will be a Ti, or Super, or XT, or some other such thing, and simply accept it. These companies know they can get top dollar from the early adopters, then release a new variant of the same product with a longer name and minor differences, and people will pay top dollar for that too, because they have to have the best and fastest of everything. I've gotten very tired of trying to keep up with the annual upgrade cycles; I no longer care if my rna3 gtx8600 super ti turbo edition is the fastest, and I simply enjoy what I have and upgrade when there is a compelling reason. I'm close to pulling the trigger on building a new machine, as my older i7 is getting a bit long in the tooth, but I'll be looking for good value instead of the fastest possible option for every component.
Well, before these refreshes were a thing, NVIDIA would release the x60 cards first, tune the manufacturing process, and release the high-end card later. The public didn't like that either.
And you just can't get big chips with good yields out the door from the beginning. AMD sidestepped this problem for a while with their smaller chips. It remains to be seen what they'll do now, with RDNA2 returning them to the big-chip arena.
#21
Toothless
Tech, Games, and TPU!
theoneandonlymrk
There's always going to be debate on that. I totally disagree with you, but by the same token partially agree, in that your examples aren't comparable: tech fails versus trying to write your own review aren't the same.
And yes, you damn well can vote with your wallet.

But I agree companies are companies and any can mess up. I wouldn't reward those personally, but then that's changeable, and I wouldn't hold grudges for years. I would retain the memory, though, and I remember a hell of a lot more business shenanigans from Intel, NVIDIA and Apple than I do from AMD, since the eighties anyway.
I wouldn't call Intel's response to the flaws a fail when they were trying to cover it up. If we went by that, then MSI should be close to out of business, no? AMD didn't fail so much as "falsely advertised" (depending on who you ask) their chips as eight cores. Everything has two sides to the story, of course, and while I want to vote with my wallet, it's been empty. ;)
#22
Minus Infinity
zlobby
With nvidia and Lovelace I can think of only one thing - Linda Lovelace! :D
This information comes from our inside guy only known as agent deepthroat!
#23
Hardware Geek
bug
Well, before these refreshes were a thing, NVIDIA would release the x60 cards first, tune the manufacturing process, and release the high-end card later. The public didn't like that either.
And you just can't get big chips with good yields out the door from the beginning. AMD sidestepped this problem for a while with their smaller chips. It remains to be seen what they'll do now, with RDNA2 returning them to the big-chip arena.
That's a legitimate point, but it seems like they do it much more often now, with far smaller performance increases between each variant. I'm sure better binning of the chips has an impact, but it's getting kind of ridiculous. Look at the XT series on Zen 2: a 100 MHz increase when you're already running at 4.6 GHz is around a 2% difference. That's not an upgrade, that's a margin of error. I fully expect my next upgrade to be a Zen 3 APU, but that kind of marketing stunt is off-putting.
#24
300BaudBob
Now if I could just write software for these cards using Ada.
#25
R-T-B
zlobby
With nvidia and Lovelace I can think of only one thing - Linda Lovelace! :D
Go directly to horny jail.