
NVIDIA RTX 3080 Ti and GA102 "Ampere" Specs, Other Juicy Bits Revealed

Cost and scalability: Nvidia's Turing dies had to be effin' huge, so an older node was the cheap route. Clocks might play some role, but I think Turing is what it looks to be: a rough cross of Pascal and Volta with RT added, a stepping stone to the real deal in terms of architectural updates. It has likely also provided a wealth of info about what degree of RT performance devs will need going forward.

Note that right now some leaks also point at two nodes coming into use, TSMC 7nm and Samsung 8nm, with the better node reserved for the higher-tier products. I'm not sure how much sense that really makes, but we do know TSMC's 7nm yields are pretty good, which would lend credibility to making bigger dies on it.

About clocks: in the end, clocking is mostly about power/voltage curves and what you can push through on a given node. Nvidia already achieved much higher clocks with Pascal on a smaller node, and a big part of that was changes to power delivery. They will probably only improve on that further. Turing pushed clocks back a little, however minor, probably related to the introduction of RT/Tensor cores (power!). Also, TSMC's early 7nm was DUV, which wasn't especially suited to clocking high.

But Nvidia usually takes it easy and chooses the path of efficiency; this keeps some margin for overclocking, the opposite of AMD, which ships its chips almost on the edge. Otherwise, Nvidia would just show that Ampere isn't such an impressive architecture after all.
 
Thinking of a 3070 for my main system, and if they can get GTX 1660 Super or Ti level performance out of the 1650 successor with no power cable, I'd buy that as well for a 1080p mini system, but they will probably launch those at much later dates.
 
I don't understand what all this fuss about new video cards is, unless you are filthy rich. There are plenty of used/mined cards out there after the mining boom that will keep us going for the next 3 years. Just get a used GTX 1070 for $170-190 (1 year ago I got an Asus ROG Strix RGB for $185, in Europe) and you can play whatever for the next 3 years, or an RX 580 8 GB for $90 is good enough for most games. For me this next generation means nothing. Wake me up when we get 2080 Ti class performance at $250 in 3-5 years' time and maybe I will be tempted to upgrade.
 
I don't understand what all this fuss about new video cards is, unless you are filthy rich. There are plenty of used/mined cards out there after the mining boom that will keep us going for the next 3 years. Just get a used GTX 1070 for $170-190 (I got an Asus ROG Strix RGB for $185, in Europe) and you can play whatever for the next 3 years, or an RX 580 8 GB for $90 is good enough for most games. For me this next generation means nothing. Wake me up when we get 2080 Ti class performance at $250 in 3-5 years' time and maybe I will be tempted to upgrade.

Sleep tight, sweet prince.
 
NV making upscaling mandatory...

LoL

Though I always thought the 2000 series was a beta test for actual RT hardware.

So I wonder if NV is feeling the heat of a challenge from AMD. This should be an exciting time.
 
NV making upscaling mandatory...

LoL

Though I always thought the 2000 series was a beta test for actual RT hardware.

So I wonder if NV is feeling the heat of a challenge from AMD. This should be an exciting time.

Let's hope the 2080 Ti doesn't reach two years old before the challenge turns up.
 
Moore's Law is Dead is AdoredTV's brother in lies. Don't believe their lies.
You can't expect leaks about pre-production silicon to be 100% representative of the end product. AdoredTV got a ton of stuff right in his Zen 2 leaks, like the fact that there would be a separate IO die, and 16 cores on AM4 using 2 x 8-core chiplets.
 
You can't expect leaks about pre-production silicon to be 100% representative of the end product. AdoredTV got a ton of stuff right in his Zen 2 leaks, like the fact that there would be a separate IO die, and 16 cores on AM4 using 2 x 8-core chiplets.


And there is always the possibility of "roadmap subject to change". AMD decided there wasn't a need to be even more aggressive.
 
I don't understand what all this fuss about new video cards is, unless you are filthy rich. There are plenty of used/mined cards out there after the mining boom that will keep us going for the next 3 years. Just get a used GTX 1070 for $170-190 (1 year ago I got an Asus ROG Strix RGB for $185, in Europe) and you can play whatever for the next 3 years, or an RX 580 8 GB for $90 is good enough for most games. For me this next generation means nothing. Wake me up when we get 2080 Ti class performance at $250 in 3-5 years' time and maybe I will be tempted to upgrade.

You play on 1080p don't you?
 