
NVIDIA to Introduce an Architecture Named After Ada Lovelace, Hopper Delayed?

She was forced, Linda. Ada? Was she forced to anything?
 
nvidia lovelace 5nm
amd rdna3 5nm


I really hope so; then only the engineering on both sides will show clearly which one has the fastest and most efficient GPUs. I guess they'll be out in Q1-Q2 2022.

A few things are sure.

First, AMD can't do the same thing it did with the RX 6000 series, and by that I mean taking the 'old' RX 5000 series GPUs and squeezing them into one GPU, i.e. 2x 5700 XT; 3x 5700 XT isn't possible.
So AMD must create a brand new design. Let's see their skills.

Nvidia... well, Lovelace is something new that nobody knows about except Nvidia's engineers and leadership, but what is sure is that it's a 5 nm die. Whether it's Samsung or TSMC we will see, but not so long ago we heard that Samsung and Nvidia made a new deal. So does that mean Samsung's engineers have convinced Nvidia's people that their process is better? Or is it cheaper? Or both?

We will see.

I hope we see AMD RDNA3 and Nvidia Lovelace in early 2022, Q1-Q2. I can't believe it would be any earlier.

Before that we'll see Nvidia Super models and of course the RTX 3080 Ti and other variants from Nvidia as well. How about RTX 3070 SLI? I say: no.

And also, where is Intel's GPU? It would be nice to see a third GPU join the battle line.


I'm just wondering: sure, both of these will be tremendously fast, but we can forget about them coming back under 300 W. The question is, where do we need so much power, because only a few percent of gamers use 4K monitors.

Hmm, soon a single GPU will draw as much power as a whole PC rig used to... not good.
 
That's a legitimate point, but it seems like they do it much more often with far smaller performance increases between each variant. I'm sure better binning of the chips has an impact on it, but it's getting kind of ridiculous. Look at the XT series on Zen 2: a 100 MHz increase when you're already running at 4.6 GHz is around a 2% difference. That's not an upgrade, that's a margin of error. And I fully expect my next upgrade to be a Zen 3 APU, but that kind of marketing stunt is off-putting.
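To put a rough number on the point above, here is a minimal sketch of that percentage calculation; the 4.6 GHz base clock and +100 MHz bump are just the figures quoted in the post, not official specs:

# Minimal sketch: how much a small clock bump is worth, in relative terms.
# The 4.6 GHz base and +100 MHz bump are the figures quoted above, not official specs.

base_clock_ghz = 4.6
bump_ghz = 0.1  # the "+100 MHz" XT-style refresh

uplift_pct = bump_ghz / base_clock_ghz * 100
print(f"+{bump_ghz * 1000:.0f} MHz on {base_clock_ghz} GHz is about {uplift_pct:.1f}% higher clock")
# Roughly 2.2%, i.e. within typical run-to-run benchmark variance.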
Yeah well, what can you do, marketing is like that.
I don't look at each and every generation as a must-have upgrade (because, like you noted, improvements can be pretty minute). I just look at it like "$200 bought you 100% performance last year, this year it buys you 105%".
 
In a little more than a year, Lisa Su's AMD has managed to wreck NV's lineup, forcing ridiculous last-minute measures, with laughable memory configurations and power consumption, new power sockets, and all that at unexpectedly low prices.

All the lame excuses now about "but node advantage" are just an attempt to address that acute green pain in the lower back of the respective green zealots.

While the node advantage is indeed there, AMD also happens to use SMALLER chips that consume LESS power, which is exactly what one would expect a node shrink to deliver. Compare transistor counts: AMD has matched the green chips in perf/transistor, beaten them in perf/watt, AND is doing it with slower VRAM.
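For anyone who wants to run that comparison themselves, here is a minimal sketch of the perf/transistor and perf/watt math; the card labels and numbers below are placeholder values for illustration, not measured data, so plug in real transistor counts, board power, and benchmark scores:

# Sketch of the perf/transistor and perf/watt comparison described above.
# All figures below are placeholders for illustration, not measured data.

def efficiency_metrics(perf_score, transistors_billion, board_power_w):
    """Return (performance per billion transistors, performance per watt)."""
    return perf_score / transistors_billion, perf_score / board_power_w

# Hypothetical cards: relative performance score, transistor count (billions), board power (W).
card_a = efficiency_metrics(perf_score=100, transistors_billion=27.0, board_power_w=300)
card_b = efficiency_metrics(perf_score=98, transistors_billion=28.0, board_power_w=320)

print(f"Card A: {card_a[0]:.2f} perf per billion transistors, {card_a[1]:.3f} perf per watt")
print(f"Card B: {card_b[0]:.2f} perf per billion transistors, {card_b[1]:.3f} perf per watt")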

What remains is the lame "but RT" argument: two years into this ungodly RT goodness, we have a "whopping" two dozen games, most of which exhibit barely any visual difference, with the most notable "feature" RT brings being a sharp drop in your framerate.

Oh, but to "compensate" there is "but if you run it at a lower resolution... from far above you wouldn't notice... especially if we apply that glorified TAA derivative we call 'DLSS 2'".


At this pace, RDNA3 will at the very least exchange punches with the most obnoxiously priced crap team green will roll out, but we can also get into this:

[attached image: 1609086515313.png]
 
I thought the architecture's name was Linda Lovelace.... That would have been deep.
Only for connoisseurs. ;)
 
Yeah well, what can you do, marketing is like that.
I don't look at each and every generation as a must-have upgrade (because, like you noted, improvements can be pretty minute). I just look at it like "$200 bought you 100% performance last year, this year it buys you 105%".
Agreed. I used to be on the new processor every gen bandwagon but when I was a kid, you could still push processors and increase performance by 50% or more. I didn't need a benchmark to "prove" it was faster and subsequent generations were still making 20-30% gen on gen gains. It's only now that processors are getting high core counts that I'm actually considering an upgrade, although I'll probably wait for zen 3+ with ddr5.
 
Agreed. I used to be on the new processor every gen bandwagon but when I was a kid, you could still push processors and increase performance by 50% or more. I didn't need a benchmark to "prove" it was faster and subsequent generations were still making 20-30% gen on gen gains. It's only now that processors are getting high core counts that I'm actually considering an upgrade, although I'll probably wait for zen 3+ with ddr5.
Yes, when I realized I needed benchmarks to see the improvements is when I also stopped fretting over each and every release. Great minds think alike, right? ;)
 
A Japanese firm has figured out how to replace silicon with antimony... thinner, with noticeably better charge retention. So we (not me, I'm too old) will see 1 nm in 10-20 years. Finer than brain material; coding will be available then... Rise of the Machines will happen. I always wanted an AI of my own.
 