Monday, May 11th 2020

NVIDIA RTX 3080 Ti and GA102 "Ampere" Specs, Other Juicy Bits Revealed

PC hardware-focused YouTube channel Moore's Law is Dead published a juicy tech-spec reveal of NVIDIA's next-generation "Ampere" based flagship consumer graphics card, the GeForce RTX 3080 Ti, citing correspondence with sources within NVIDIA. The report talks of big changes to NVIDIA's Founders Edition (reference) board design, as well as to what's on the silicon. To begin with, the RTX 3080 Ti reference-design card features a triple-fan cooling solution, unlike the dual-fan coolers of the RTX 20-series. This cooler is reportedly quieter than the RTX 2080 Ti FE cooling solution. The card pulls power from a pair of 8-pin PCIe power connectors. Display outputs include three DisplayPort connectors, and one each of HDMI and VirtualLink USB-C. The source confirms that "Ampere" will implement a PCI-Express gen 4.0 x16 host interface.

With "Ampere," NVIDIA is developing three tiers of high-end GPUs, with the "GA102" leading the pack and succeeding the "TU102," the "GA104" holding the upper-performance segment and succeeding today's "TU104," but a new silicon between the two, codenamed "GA103," with no predecessor from the current-generation. The "GA102" reportedly features 5,376 "Ampere" CUDA cores (up to 10% higher IPC than "Turing"). The silicon also taps into the rumored 7 nm-class silicon fabrication node to dial up GPU clock speeds well above 2.20 GHz even for the "GA102." Smaller chips in the series can boost beyond 2.50 GHz, according to the report. Even with the "GA102" being slightly cut-down for the RTX 3080 Ti, the silicon could end up with FP32 compute performance in excess of 21 TFLOPs. The card uses faster 18 Gbps GDDR6 memory, ending up with 863 GB/s of memory bandwidth that's 40% higher than that of the RTX 2080 Ti (if the memory bus width ends up 384-bit). Below are screengrabs from the Moore's Law is Dead video presentation, and not NVIDIA slides.
As for performance, the "GA102" based prototype is allegedly clocking 40 percent higher performance than the RTX 2080 Ti at 4K UHD resolution in poorly optimized games, 50 percent higher performance in optimized games, and up to 70 percent higher performance in the "best case scenario" (a game optimized for the "Ampere" architecture). We know from older leaks that by increasing the number of streaming multiprocessors, NVIDIA is doubling the ratio of RT cores to CUDA cores compared to "Turing," resulting in more RT cores per tier, and increased ray-tracing performance.

Each "Ampere" RT core is able to process 4x more intersections per unit clock-speed than "Turing." The tensor core count is also reportedly going to see an increase. The focus on ray-tracing and AI performance increase could give game developers the freedom to cram in more RTX effects per title, letting users disable what they want on older "Turing" cards. Performance limitations on "Turing" made developers choose from the RTX feature-set on what to implement. With "Ampere," NVIDIA could introduce DLSS 3.0, an updated image quality and performance enhancement. NVIDIA could resurrect a hybrid memory technology similar to AMD's HBCC, called NVCache, which spreads video memory across the video memory, the system memory, and flash-based storage.
Lastly, there's more clarity as to which silicon fabrication process NVIDIA could use. Apparently, NVIDIA will spread its product stack across two kinds of 7 nm-class nodes. The higher-end ASICs, such as the "GA102" and "GA103," could be built on an EUV-based node such as TSMC's N7+, while the smaller ASICs could be built on conventional DUV-based 7 nm-class nodes such as N7P or even N7.

Don't pull your wallets out just yet. The launch schedule points to May 2020 (GTC) being focused on HPC parts based on "Ampere," such as the Tesla A100 and DGX A100 system.

In September 2020, NVIDIA will hold a separate event specifically to launch the next-generation GeForce, very close to the release of "Cyberpunk 2077."
Source: Moore's Law is Dead (YouTube)

83 Comments on NVIDIA RTX 3080 Ti and GA102 "Ampere" Specs, Other Juicy Bits Revealed

#51
Unregistered
Gonna come down to pricing. I avoided Turing because of the insane pricing on it. Hopefully Ampere is more reasonable, and if not... thankfully there is a healthy second-hand market.
Posted on Reply
#52
ERazer
Bet another price increase. Truly, whoever grabbed a 1080 Ti when it first came out got the best bang for their buck.
Posted on Reply
#53
TheGreatPascalian
Nothing is revealed. This comes from a rumor channel with a history of being almost always wrong. This is the same guy who said we would get SMT-4 with Zen 3, until AMD proved him wrong with their own Zen 3 slide, lol! This guy just guesses specs and "the future" for clicks, claims he has sources, and that's it. He did that plenty in the past with Navi and got 99% of the specs wrong, to the point of deleting some of the videos out of humiliation. The fact that this site treats it as some "reveal" is just bad hype coming to fall flat on its face.
Posted on Reply
#54
Tomorrow
BArms: Finally HDMI 2.1 support. My current 4K TV is a 120 Hz panel but there's no way to feed it a 4K signal @ 120 Hz, and that was from 2018.
HDMI 2.1 is almost guaranteed at this point. I'm very sceptical of DP 2.0 though. Ampere would be the first product to include it. I fear it will still be DP 1.4.
M2B: My biggest concern is the amount of VRAM we'll be getting.
The 10 & 12 GB cards (3080/3080 Ti) are going to be fine, but if midrange cards such as the 3070 only get 8 GB it's going to be disappointing.
I don't think 8 GB is going to cut it long-term.
I hope Nvidia gives more options with more VRAM; I would pay an extra $100 for a 16 GB version of the 70-class card.
That's what the compression and NVCache are for, I believe.
ARF: Most probably fake news.

The RTX 2080 Ti doesn't need 3 power connectors.

2 x 8-pin is good for 75 + 2 x 150 watts.
If you actually watched the video, you would have picked up that this was about the ES (engineering sample) variants. Those always use more (of everything) than the retail cards.
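For reference, a quick sketch of that power budget using PCIe spec limits (these are connector ratings, not a measured or rumored TDP):

```python
# PCIe power budget from spec limits (connector ratings, not actual draw).
SLOT_W = 75        # a PCI-Express x16 slot supplies up to 75 W
EIGHT_PIN_W = 150  # each 8-pin PEG connector is rated for 150 W

total_w = SLOT_W + 2 * EIGHT_PIN_W
print(f"Max board power from slot + 2x 8-pin: {total_w} W")  # 375 W
```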
Otonel88: When I saw the triple-fan design and dual 8-pin connectors, I just thought of a high-TDP card, which contradicts the whole 7 nm design process, unless this is just rumors.
7 nm should be more efficient, bringing the TDP down, so why the double 8-pin and triple-fan design?
Nvidia wants to ensure it beats RDNA2. Triple fan may be more about better acoustics than about cooling a hot card.
techguymaxc: People who are knowledgeable about this industry over at the Beyond3D forums have pointed out that this compiled list of specs and features makes no sense, and that the YouTuber has no background in engineering or the industry.
It's no different than if someone from this forum made a video based on rumors.
Care to link a topic, and explain what about these specs doesn't make sense?
Turmania: AMD is at least a generation and a die shrink away from Nvidia. So RDNA2 can be good, but it will be nothing in comparison to Ampere.
RDNA2 is meant to go against Ampere, not Turing. The 2080 Ti will be a midrange card by the end of the year. Consoles will almost equal it, and RDNA2 and Ampere will surpass it by ~50%.
Th3pwn3r: Lol. And don't expect price drops, they have no reason for that. The gap between Nvidia and AMD on the high end just became that much bigger.
Also don't expect price increases. Unless AMD decides their top RDNA2 card costs $1200 now too.
TheGreatPascalian: Nothing is revealed. This comes from a rumor channel with a history of being almost always wrong. This is the same guy who said we would get SMT-4 with Zen 3, until AMD proved him wrong with their own Zen 3 slide, lol! This guy just guesses specs and "the future" for clicks, claims he has sources, and that's it. He did that plenty in the past with Navi and got 99% of the specs wrong, to the point of deleting some of the videos out of humiliation. The fact that this site treats it as some "reveal" is just bad hype coming to fall flat on its face.
Care to share what he got wrong?

Usually he speaks about things way in advance, and then months later Wccftech or some other site makes a story about "breaking news". He was pretty spot-on about the new console specs before others. When it comes to SMT4, that may still come, but plans change and it's likely pushed back. If the SMT4 thing is the only one you can think of, then I'd say that's quite good for a leak/speculation channel, to get just this one thing wrong.
Posted on Reply
#55
ARF
Tomorrow: RDNA2 is meant to go against Ampere, not Turing. The 2080 Ti will be a midrange card by the end of the year. Consoles will almost equal it, and RDNA2 and Ampere will surpass it by ~50%.

Also don't expect price increases. Unless AMD decides their top RDNA2 card costs $1200 now too.

Care to share what he got wrong?
Yeah, let's just hope that AMD's Navi 23, Navi 22 and Navi 21 work as intended, and actually get launched sometime in the (near) future.
Because the more time goes by, the more I begin to doubt that we will ever see a new generation :lol:
Posted on Reply
#56
phill
I wonder if it will cost as much as an RTX Titan for the next series of cards?? I would surely hope NOT....

Anyone got any guesses on the cost of one of these 3080 Tis?? Or have they mentioned a price??
Posted on Reply
#57
Caring1
phill: I wonder if it will cost as much as an RTX Titan for the next series of cards?? I would surely hope NOT....

Anyone got any guesses on the cost of one of these 3080 Tis?? Or have they mentioned a price??
Based on current prices in Australia, I'd guess around $1,800-$2,000 (or both kidneys).
Posted on Reply
#58
Tomorrow
phill: I wonder if it will cost as much as an RTX Titan for the next series of cards?? I would surely hope NOT....

Anyone got any guesses on the cost of one of these 3080 Tis?? Or have they mentioned a price??
TITAN: 2500
3080Ti: 999
3080: 700
3070: 500
3060: 350

Just my guess. Remember that unless AMD prices their cards very close to Nvidia's, Nvidia can't just ask whatever they want. They can't even tout RT as some exclusive feature. Nvidia overpriced Turing because they had no competition at the high end and could say "hey look, we are the only ones who have RT". With RDNA2 they don't have that advantage. I think the bigger problem could be AMD's pricing.
Posted on Reply
#59
Hyderz
T3RM1N4L D0GM4: The RTX 3060 seems like a decent upgrade from my actual GTX 760 and my 3440x1440 screen.... :respect:
Oh man, hats off to you. I have the same resolution monitor and my GTX 1070 Ti struggles to keep up with games like Star Wars at medium settings :(
Posted on Reply
#60
rtwjunkie
PC Gaming Enthusiast
QUANTUMPHYSICS: For the 1080 Ti to outperform the 2060 and 2070 is ridiculous. The 2060 should have been ahead of the 1080 Ti in every way.
Your expectation of something outside the norm is what is a bit ridiculous. 60-series cards have NEVER outperformed the 80 Tis from the previous generation. The 70s did once, sort of, and were about even another time. What IS normal with each new generation is one step up: for the new 80 to beat a previous-gen 80 Ti is normal and expected. But that's it.

When the new 70s can squeak by, we all celebrate that a low-high-end/high-midrange card can do it and what a great value that makes it. But it is not the normal way of it.
Posted on Reply
#61
djisas
rtwjunkie: Your expectation of something outside the norm is what is a bit ridiculous. 60-series cards have NEVER outperformed the 80 Tis from the previous generation. The 70s did once, sort of, and were about even another time. What IS normal with each new generation is one step up: for the new 80 to beat a previous-gen 80 Ti is normal and expected. But that's it.

When the new 70s can squeak by, we all celebrate that a low-high-end/high-midrange card can do it and what a great value that makes it. But it is not the normal way of it.
I miss the days when my choice of graphics cards was between the HD 4850 or HD 4870 and whatever Nvidia had to offer, another 2 or 3 models...
How many does Nvidia have right now?
Introducing models 5% apart so they can charge more for less...
Posted on Reply
#62
rtwjunkie
PC Gaming Enthusiast
djisas: I miss the days when my choice of graphics cards was between the HD 4850 or HD 4870 and whatever Nvidia had to offer, another 2 or 3 models...
How many does Nvidia have right now?
Introducing models 5% apart so they can charge more for less...
True enough! But...as long as people will buy no matter how many models are offered, they will make them.
Posted on Reply
#63
Chrispy_
dyonoctis: Looks good on paper... RDNA2 might have to be insanely good if we want competition to be healthy again.
"New control panel merged with GeForce Experience." Ho boy. Some people are going to be so mad about that.
The slides say that they'll be removing the always-online login requirement for GFE, so this should actually be fine.
Surely nobody actually likes the Nvidia control panel as it is right now? It's poorly laid out, skinned for Windows XP, and lacks features that AMD gives that are either nice to have or downright mandatory IMO.
Posted on Reply
#64
sergionography
This generation is going to be hella confusing for reviewers, especially with ray tracing becoming a standard. I wonder if they will continue showing RT-on and RT-off benchmarks. And if these Nvidia specs are true, then RDNA2 is going to have a tough time. 50% better efficiency than RDNA1 means we might get a high-end card with double the performance of the 5700 XT at around a 200-300 W TDP, which puts them about 30 to 40% ahead of the 2080 Ti. That keeps them on the same trend as before, where they can only compete with Nvidia's second-best-tier cards, and probably at higher TDP (1080, 2080, 3080). But that is only considering shader performance; ray tracing adds another layer of complexity in how much each architecture is penalized with RT on.
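Rough math behind that estimate (a sketch; the 2080 Ti's ~45% lead over the 5700 XT at 4K is an assumed figure based on typical review numbers):

```python
# Rough scaling math; the 1.45x figure is an assumption, not a benchmark result.
rdna2_vs_5700xt = 2.00      # rumored: double the 5700 XT's performance
r2080ti_vs_5700xt = 1.45    # 2080 Ti ~45% ahead of the 5700 XT at 4K (assumed)

lead = rdna2_vs_5700xt / r2080ti_vs_5700xt - 1
print(f"RDNA2 flagship vs 2080 Ti: +{lead:.0%}")  # ~ +38%
```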
Posted on Reply
#66
dyonoctis
Chrispy_: The slides say that they'll be removing the always-online login requirement for GFE, so this should actually be fine.
Surely nobody actually likes the Nvidia control panel as it is right now? It's poorly laid out, skinned for Windows XP, and lacks features that AMD gives that are either nice to have or downright mandatory IMO.
I remember the first time AMD showed its big redesign. Some started to point out how the "no frills" Nvidia control panel was the right way to do it. (Even though the Nvidia control panel is not exactly fast and nimble despite being stuck in the Win XP era.)
Posted on Reply
#67
bug
Chrispy_: The slides say that they'll be removing the always-online login requirement for GFE, so this should actually be fine.
It would be pretty crazy if what is essentially a driver required an internet connection, wouldn't it? I mean, stranger things have happened, but it would still be pretty crazy.
Chrispy_: Surely nobody actually likes the Nvidia control panel as it is right now? It's poorly laid out, skinned for Windows XP, and lacks features that AMD gives that are either nice to have or downright mandatory IMO.
I'm fine with the current Control Panel for two reasons:
- I've been using it since forever
- I open it so rarely. Basically never; the only times I used it were when I needed to change a game's profile. But I've stopped gaming.

I'm not opposed to change or added features, I'm just saying the current CP is not a sore spot for me.
Posted on Reply
#70
medi01
efikkan: Not very impressive, considering Turing has spent a lot of those transistors on additional features.
Define "a lot".
I was told it is around 8%, which makes it pretty much on par.
efikkan: Don't forget that Nvidia will have access to any new nodes that AMD has access to.
No doubt about that, except this gen we might see Huang eat crow for messing with TSMC.
efikkan: It's not going to be a cakewalk no matter how good RDNA 2 is.
The cakewalk NV had back in the Raja days was based on the total absence of AMD chips in major parts of the market, followed by very bad Vega metrics (big chip + expensive RAM and meh performance), which left the high end open to full-throttle milking by Huang.

Now that gamers are trained to pay obnoxious sums for GPUs, what we'll see with the next gen will hardly be a deathmatch. NV will milk a bit less; AMD will push its ever-rising margins closer to Intel/NV margins.

There will be no cutthroat competition.

A major technological gap between the two exists only in someone's imagination.
Posted on Reply
#71
Th3pwn3r
medi01: AMD will push its ever-rising margins closer to Intel/NV margins.

There will be no cutthroat competition.

A major technological gap between the two exists only in someone's imagination.
Lol, in our dreams.
Posted on Reply
#72
Dyatlov A
Will the current-gen RTX prices go down after tomorrow's announcement?
Posted on Reply
#73
medi01
Dyatlov A: Will the current-gen RTX prices go down after tomorrow's announcement?
Yeah, very funny.

PS
It's not a gaming card.
Posted on Reply
#74
bug
Dyatlov A: Will the current-gen RTX prices go down after tomorrow's announcement?
Tomorrow, only data center parts will be announced. So no.
Posted on Reply
#75
Dante Uchiha
Vayra86: Cost and scalability; Nvidia's Turing dies had to be effin' huge, so an older node was the cheap route. Clocks might play some role, but I think Turing is what it looks to be: a rough Pascal-x-Volta hybrid with RT, a stepping stone to the real deal in terms of architectural updates. It has likely provided a wealth of info about what degree of RT performance devs would need going forward, as well.

Note that right now some leaks also point at two nodes coming into use, 7 nm and Samsung 8 nm, with the better node being used for the higher-tier products. I'm not sure how much sense that really makes, but we do know TSMC's 7 nm yield is pretty good. That would lend credibility to making bigger dies on it.

About clocks: in the end, clocking is mostly about power/voltage curves and what you can push through on a certain node... Nvidia already achieved much higher clocks with Pascal on a smaller node, and a big part of that was power delivery changes. They will probably only improve on that further. Turing pushed clocks back a little bit, however minor, probably related to the introduction of RT/tensor cores (power!). Also, TSMC's early 7 nm was DUV, which wasn't specifically suited to clocking high.
But Nvidia usually takes it easy and chooses the path of efficiency; this helps keep some margin to overclock, the opposite of AMD, which puts its chips almost on the edge. That would just show that Ampere isn't such an impressive architecture after all.
Posted on Reply