Monday, January 20th 2020

Rumor: NVIDIA's Next Generation GeForce RTX 3080 and RTX 3070 "Ampere" Graphics Cards Detailed

NVIDIA's next generation of graphics cards, codenamed Ampere, is set to arrive sometime this year, presumably around GTC 2020, which takes place on March 22nd. Before NVIDIA CEO Jensen Huang officially reveals the specifications of these new GPUs, we have the latest round of rumors coming our way. According to VideoCardz, which cites multiple sources, the die configurations of the upcoming GeForce RTX 3070 and RTX 3080 have been detailed. Built on Samsung's 7 nm manufacturing process, this generation of NVIDIA GPUs should offer a big improvement over the previous one.

For starters, the two dies that have appeared carry the codenames GA103 and GA104, corresponding to the RTX 3080 and RTX 3070 respectively. Perhaps the biggest surprise is the Streaming Multiprocessor (SM) count. The smaller GA104 die has as many as 48 SMs, resulting in 3072 CUDA cores, while the bigger, oddly named GA103 die has as many as 60 SMs, for 3840 CUDA cores in total. These increases in SM count should result in a notable performance uplift across the board. Alongside the higher SM count, there is also a wider memory bus. The smaller GA104 die, which should end up in the RTX 3070, uses a 256-bit memory bus allowing for 8/16 GB of GDDR6 memory, while its bigger brother, the GA103, has a 320-bit bus that allows the card to be configured with either 10 or 20 GB of GDDR6 memory. In the images below you can check out the alleged diagrams for yourself and judge whether they look fake; either way, it is recommended to take this rumor with a grain of salt.
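The ratios implied by the rumored numbers can be sanity-checked with a quick sketch. Note the assumptions: 64 CUDA cores per SM (the Turing ratio, not confirmed for Ampere) and standard 32-bit GDDR6 channels with one 8 Gb or 16 Gb chip per channel.

```python
# Sanity-check the rumored Ampere die configurations.
# Assumes 64 CUDA cores per SM (Turing ratio) and 32-bit GDDR6
# channels -- both assumptions, not confirmed specs.

CORES_PER_SM = 64      # Turing-era ratio, assumed to carry over
CHANNEL_WIDTH = 32     # bits per GDDR6 memory channel

dies = {
    "GA103 (RTX 3080?)": {"sms": 60, "bus_bits": 320},
    "GA104 (RTX 3070?)": {"sms": 48, "bus_bits": 256},
}

for name, d in dies.items():
    cuda_cores = d["sms"] * CORES_PER_SM
    channels = d["bus_bits"] // CHANNEL_WIDTH
    # one 8 Gb (1 GB) or 16 Gb (2 GB) chip per 32-bit channel
    low, high = channels * 1, channels * 2
    print(f"{name}: {cuda_cores} CUDA cores, "
          f"{channels} channels, {low}/{high} GB GDDR6")
```

Under those assumptions the numbers line up exactly with the leak: 60 × 64 = 3840 and 48 × 64 = 3072 cores, with the 320-bit and 256-bit buses giving 10/20 GB and 8/16 GB respectively.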
Source: VideoCardz

173 Comments on Rumor: NVIDIA's Next Generation GeForce RTX 3080 and RTX 3070 "Ampere" Graphics Cards Detailed

#101
moproblems99
kapone32
Well I have to be honest, the only games I played in those days were Total War Medieval 2 and Total War Shogun 2, throw in some Torchlight and Titan Quest as well, all of which have full multi GPU support. Then I got into Just Cause 2, Batman Arkham and Deus Ex HR, which all support CrossFire. Then I discovered Sleeping Dogs and Shadow of Mordor. Then Rome 2 was launched and after that, Attila, which again fully supports CrossFire. Just Cause 3, and I will end it with Total War Warhammer 1 & 2, which both support multi GPU. Even though 3 Kingdoms does not support multi GPU, I fully expect Warhammer 3 to continue CrossFire support (as long as you can edit the script) as it will be an expansion of the existing game.
Here we are, the two extremes. I had a terrible time while you had a great time. We are but a small sample size, so let's assume the truth lies in the middle. More than 25% and less than 100% of games will work with mGPU (and the percentage among newer games is dropping). Of those games, most will not have linear scaling; some might scale by more than 50%, and many more by less than 50%. All of these amazing benefits for twice the money, twice the heat (up to), and twice the power (up to)!! So now you have to have a beefier PSU and better case cooling. And your top GPU is going to get cooked.

If you find that all of your games work with mGPU then it may make sense. But for the majority (by a margin), it doesn't make sense. You can spend 50% more on a GPU and jump up at least two tiers. You'll get a much more consistent experience with less of a draw on the system. And the best part: for the majority of games that don't have great scaling, you'll get the same performance or better.

Edit: I suppose this generation you would not have been able to jump two tiers. Perhaps the Turing generation added some fractional tenths of a percent to the "Should I bother with mGPU?" question.
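The "spend 50% more, jump a tier" argument can be sketched as a perf-per-dollar comparison. All prices and scaling factors below are illustrative assumptions, not benchmarks:

```python
# Rough perf-per-dollar comparison: two mid-range cards in mGPU
# vs one higher-tier card. Numbers are illustrative assumptions.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

single = perf_per_dollar(1.00, 500)     # baseline mid-range card
mgpu = perf_per_dollar(1.50, 1000)      # 2nd card, ~50% avg scaling
tier_up = perf_per_dollar(1.40, 750)    # +50% price, ~40% more perf

print(f"single:  {single:.5f} perf/$")
print(f"mGPU:    {mgpu:.5f} perf/$")
print(f"tier-up: {tier_up:.5f} perf/$")
# With these assumptions the tier-up card (0.00187 perf/$) beats
# the mGPU pair (0.00150 perf/$), and scaling below ~87% can never
# close that gap at double the price.
```

The exact break-even depends entirely on the scaling factor assumed; the point of the sketch is that sub-linear scaling at double the price loses to a single faster card.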
Posted on Reply
#102
SRB151
Let's face it. 7nm means smaller dies, much lower cost, better yields, lower power, and higher clocks. Nvidia has ALWAYS gotten more out of any silicon than anyone, so 100% performance increases, lower prices than RTX cards, more memory as the article says, and more efficiency. The only thing standing in their way of total domination will be Raja and his upcoming, infinite scaling XE cards. Great year for super graphics.
Posted on Reply
#103
moproblems99
SRB151
Let's face it. 7nm means smaller dies, much lower cost, better yields, lower power, and higher clocks. Nvidia has ALWAYS gotten more out of any silicon than anyone, so 100% performance increases, lower prices than RTX cards, more memory as the article says, and more efficiency. The only thing standing in their way of total domination will be Raja and his upcoming, infinite scaling XE cards. Great year for super graphics.
What did they put in my tea....

Our Savior.....Raja! Who everyone booed out of AMD?
Posted on Reply
#104
64K
moproblems99
So, adding 1 + 1, it really doesn't matter that it was a mid-range card, does it? In most cases, people are buying for performance, not chip size, transistor count, board label, or anything else. It is pretty much performance. Would anything have really changed if the board was labeled a 100 or 102?
It matters a lot and it's why people are paying $500 to $700 for a midrange Nvidia GPU today.
Posted on Reply
#105
moproblems99
64K
It matters a lot and it's why people are paying $500 to $700 for a midrange Nvidia GPU today.
Isn't it because the competition hasn't exactly put out an alternative? Now granted, when AMD had cards as good (or better), it isn't exactly like many people bought them.

Intel charged an arm and a leg because they had no one to force them to lower the price. Look what happened when Ryzen came out. Until AMD puts out some juice, prices will stay where they are. The board designation has nothing to do with where we are.
Posted on Reply
#106
64K
moproblems99
Isn't it because the competition hasn't exactly put out an alternative? Now granted, when AMD had cards as good (or better), it isn't exactly like many people bought them.

Intel charged an arm and a leg because they had no one to force them to lower the price. Look what happened when Ryzen came out. Until AMD puts out some juice, prices will stay where they are. The board designation has nothing to do with where we are.
Your assertion that it doesn't matter is wrong; it does matter that the GTX 680 was a midrange card to which Nvidia attached a $500 MSRP, the former price for a high-end GPU, and the x80 name, which had likewise been the high-end designation, just because the 680 was on par with the high-end 7970. There are still, to this very day, people who think the GTX 680 was a high-end GPU.
Posted on Reply
#107
moproblems99
64K
Your assertion that it doesn't matter is wrong; it does matter that the GTX 680 was a midrange card to which Nvidia attached a $500 MSRP, the former price for a high-end GPU, and the x80 name, which had likewise been the high-end designation, just because the 680 was on par with the high-end 7970. There are still, to this very day, people who think the GTX 680 was a high-end GPU.
It beat the competition's high end; why does it matter?
Posted on Reply
#108
64K
moproblems99
It beat the competition's high end; why does it matter?
Well, the RTX 2080 Ti beat the competition. Why does it matter that it was $1,200?

The RTX Titan beat the competition. Why does it matter that it was $2,500?

The Nvidia shenanigans began with the GTX 680, and people have just paid any damn thing since then.
Posted on Reply
#109
rtwjunkie
PC Gaming Enthusiast
moproblems99
It beat the competition's high end; why does it matter?
Because it fundamentally shifted prices artificially upward, for what appears to be forever. All because they passed off a high-midrange card as high end and charged accordingly.
Posted on Reply
#110
Khonjel
At this point it's more entertaining to watch how much Nvidia slam-dunks AMD rather than hoping AMD will ever put up a fight.
Posted on Reply
#111
ixi
EarthDog
All I am saying is that we will get A LOT more than 10-15% gains out of the box from the new generation when comparing apples to apples (tier to tier). The only differences we've seen the past couple of generations are clock swaps and in the case of the super cards, core/spec differences, in order to extract more performance out of these cards. But 10-15% is putting a 2080 Ti on water and overclocking it... next gen will show double that, at minimum.
Are you sure about clock speeds?

Compare the 980 Ti vs GTX 1080 Ti vs RTX 2080 :peace:. The difference between them is around 25-30%.
Posted on Reply
#112
cucker tarlson
If your mid-range beats the competition's high end, the problem is that the competition's high end is really mid-range.
Posted on Reply
#113
Krzych
Really hyped for this gen. Turing was "suspicious" right from the beginning and still turned out better than expected, to be honest, but this time I think we will get more Pascal-like improvements in both performance and power. Everything is pointing to that the same way everything was pointing to Turing being a smaller jump. Plus expected huge improvements in RT performance. New consoles, want it or not, will affect the market somewhat too, and prices should be much better. Maybe not going back to the old ways (like Maxwell), but some Pascal-level prices, with tricks like the RTX 3080 masquerading as the top-end card for a few months before the 3080 Ti arrives, are possible I think.

kapone32
Multi GPU is for anyone that wants it. I have not replaced my Vega 64s with 5700 XTs for exactly that reason. If multi GPU were really dead, motherboards would not be touting it and giving you 2-, 3- and 4-way SLI bridges. As much as people complain about it being troublesome, it is not as bad as people make it out to be. There are plenty of games that support multi GPU anyway. As an example, I get a 107 FPS average at 4K playing Jedi Fallen Order @ Ultra settings.
Exactly that. The only people who say things like "dead, not worth it, troublesome" are those who never wanted such a configuration in the first place, and they complain about something that doesn't even concern them just to reassure themselves. SLI is up there for anyone who wants it and is capable of managing it. It was never perfect or reliable as far as support in newly released games goes, and it was never targeted at making sense perf/money-wise across the board; it is strictly a pushing-the-limit kind of situation, in selected games. But I guess that is too hard to understand, since there always has to be some guy entering the thread and talking about dead multi GPU, no matter what thread it is. It is almost like they are actively looking for an opportunity :D

The rate of support and tweakability has been practically stable for years now; the only notable loss was Ubisoft dropping SLI from the Assassin's Creed series starting with Origins, a series that had SLI support since forever. But at the same time they dropped all the extra features SLI could be used for, and Odyssey is basically a non-scalable console port that cannot even take proper advantage of new GPUs, so it is a whole series going astray, not just an SLI-exclusive problem. Also, we are yet to see how universal this new SLI CFR mode is going to be once fully developed (and how tweakable); it will probably go official along with the Ampere launch. Even now it shows some notable gains in games with zero multi GPU support and no custom profiles available. So things are rather looking up compared to right now, if anything.
Posted on Reply
#114
Metroid
Finally: very high settings + constant 4K 60 FPS, and 144 Hz in old games for sure.
Posted on Reply
#115
Xajel
These seem good, as long as NV doesn't repeat the RTX 2000 launch prices again. The RTX 3070 should be $499 (like the current Super), and the RTX 3080 should be $649~$699...

And please, no exclusive Founders Edition at launch that costs $100 more.
Posted on Reply
#116
ratirt
I see a lot of people have their wishes in terms of price and/or performance for the new NV graphics. Let's hope at least one of those wishes comes true.
Hopefully the shrink will bring some more performance. Of the new arch, I'm not so sure what to expect; not much has been revealed.
My concern, on the other hand, is mostly the price. We will have to wait a bit longer to get all the answers.
Posted on Reply
#117
64K
The R&D costs for RTX GPUs have probably already been recouped from the early adopters. AMD will be stepping into the RTRT ring with their releases this year. Consoles will have a hardware solution to accelerate ray tracing this year. Intel will probably be stepping into the RTRT ring this year as well.

Prices for the 3070 and 3080 should be more down to earth, or Mr. Huang will have to make an announcement to the press similar to the one he made a while back concerning the RTX Turings: "Sales are failing to meet expectations."
Posted on Reply
#118
Super XP
SRB151
Let's face it. 7nm means smaller dies, much lower cost, better yields, lower power, and higher clocks. Nvidia has ALWAYS gotten more out of any silicon than anyone, so 100% performance increases, lower prices than RTX cards, more memory as the article says, and more efficiency. The only thing standing in their way of total domination will be Raja and his upcoming, infinite scaling XE cards. Great year for super graphics.
Interesting how people already count AMD out of the GPU business, without understanding the reasoning behind hanging onto GCN so long, or the reasoning behind the releases of Fury, Vega & Radeon VII. I could get into why AMD stuck to the low-to-medium range of GPUs, but I'm sure most on TPU already know why.

A quick word of advice: RDNA 2 will make Nvidia re-think its Ampere GPU launch.
Posted on Reply
#119
EarthDog
Super XP
A quick word of advice: RDNA 2 will make Nvidia re-think its Ampere GPU launch.
So confident, yet nothing behind it. o_O

Bookmarked... My guess (instead of sharing it as fact guised as "advice"): I don't think it will faze Nvidia much at all. If they are lucky, they will have a card that competes with the 2080 Ti... which Ampere will clearly beat by a significant amount.
Posted on Reply
#120
Super XP
EarthDog
So confident, yet nothing behind it. o_O

Bookmarked... My guess (instead of sharing it as fact guised as "advice"): I don't think it will faze Nvidia much at all. If they are lucky, they will have a card that competes with the 2080 Ti... which Ampere will clearly beat by a significant amount.
I was only referring to SRB151's confidence that AMD has nothing upcoming on the GPU side of things. His own comments are posted as facts without any facts lol

I never mentioned anything about RDNA2 performance. What I did say is that RDNA2 WILL make Nvidia re-think its Ampere launch, depending on who launches first.

Dissect that as you will, but it could be cost-to-performance, ya know. This ain't a Vega-like repeat, was my point.
Posted on Reply
#121
EarthDog
Super XP
I was only referring to SRB151's confidence that AMD has nothing upcoming on the GPU side of things. His own comments are posted as facts without any facts lol

I never mentioned anything about RDNA2 performance. What I did say is that RDNA2 WILL make Nvidia re-think its Ampere launch, depending on who launches first.

Dissect that as you will, but it could be cost-to-performance, ya know. This ain't a Vega-like repeat, was my point.
Yeah, lol, his post is ridiculous. :p

I suppose every launch will cause the other to 'think' about things, be it price or whatever, but I took your post to mean it will be some sort of game changer. I mean, you said it was 'advice' like it was a fact that RDNA2 will be good enough... only time will tell. The confidence exuded from both parties is premature. :)
Posted on Reply
#122
Super XP
EarthDog
Yeah, lol, his post is ridiculous. :p

I suppose every launch will cause the other to 'think' about things, be it price or whatever, but I took your post to mean it will be some sort of game changer. I mean, you said it was 'advice' like it was a fact that RDNA2 will be good enough... only time will tell. The confidence exuded from both parties is premature. :)
I think my stance and my optimism for RDNA2 come from AMD having struggled with both the GPU and CPU for Soooooo Longggg lol
And from AMD putting almost all its resources into building Zen. So I'm hopeful AMD pulls a rabbit out of RDNA2 and actually puts out competitive GPUs so we all benefit.
It's been far too long, and we need something from AMD that will hopefully turn heads towards them ;)

But I agree. Both are premature. Though Nvidia has had a better track record for obvious reasons lol
Posted on Reply
#123
EarthDog
Super XP
It's been far too long, and we need something from AMD that will hopefully turn heads towards them ;)
No doubt! But I have little faith (in the high-end) and hoping for a surprise.
Posted on Reply
#124
QUANTUMPHYSICS
If I do buy a 3080 Ti, I will sell my 2080 Ti on eBay and probably get around $700-$800 for it.
From now on I will only buy cards with a built-in AIO.

I am extremely impressed with the temperature of my FTW3 over my BLACK.

My Black will rise to 55 degrees after 2-3 hours of gaming with ray tracing on.

My FTW3 doesn't go past 40 degrees in the same time period.

I don't overclock (as a practice).
Posted on Reply
#125
Super XP
EarthDog
No doubt! But I have little faith (in the high-end) and hoping for a surprise.
Amen to that. Lol

QUANTUMPHYSICS
If I do buy a 3080 Ti, I will sell my 2080 Ti on eBay and probably get around $700-$800 for it.
From now on I will only buy cards with a built-in AIO.

I am extremely impressed with the temperature of my FTW3 over my BLACK.

My Black will rise to 55 degrees after 2-3 hours of gaming with ray tracing on.

My FTW3 doesn't go past 40 degrees in the same time period.

I don't overclock (as a practice).
A card like that doesn't need overclocking.
And when you sell it, be sure to mention that it was never overclocked. That might squeeze out more $$$ for you.
Posted on Reply