
Rumor: NVIDIA's Next Generation GeForce RTX 3080 and RTX 3070 "Ampere" Graphics Cards Detailed

Just clicked a few links, I see "half".
Are you joking with me? lol, all sources claim it's a brand spanking new design that will feature some exceptional things, including VRS and RT.
At the end of the day it's rumours and speculation, but note that a couple of those sources have been right before.
 
When is Nvidia expected to announce the new Ampere cards?
And AMD with Big Navi?
Maybe if they are close together it will create some kind of competition between them, which is always good for the consumer. (Look at the 5600 release)
 
When is Nvidia expected to announce the new Ampere cards?
And AMD with Big Navi?
Maybe if they are close together it will create some kind of competition between them, which is always good for the consumer. (Look at the 5600 release)
Couldn't Agree More, lol
 
Why not check the internet? There's a wealth of speculation and rumours for both Big Navi / RDNA2 and Ampere. lol

*snip*
From a link (or two) of yours...
AMD claims up to a 50 percent improvement in performance for the same power consumption.

And with Nvidia Ampere saying the same thing (50% performance for the same power)... if we assume both are true... that leaves the landscape in a similar position, no?

I wonder how AMD is going to accomplish this on the same node with just an arch tweak. Nvidia on the other hand has all of the natural improvements of a die shrink, PLUS their arch changes.
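A quick back-of-the-envelope check shows why: if both camps really do gain the same percentage, the relative gap stays exactly where it was. The baseline numbers below are purely illustrative, not leaked figures:

# Hypothetical baseline performance at equal power (arbitrary units, illustrative only)
amd_current = 100
nvidia_current = 130   # assumed gap, not a real benchmark result

# Both vendors claim ~50% more performance at the same power
amd_next = amd_current * 1.5        # 150
nvidia_next = nvidia_current * 1.5  # 195

# The ratio is unchanged: 195 / 150 == 130 / 100 == 1.3
print(nvidia_next / amd_next, nvidia_current / amd_current)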
 
I wonder how AMD is going to accomplish this on the same node with just an arch tweak.

Aren't they moving to EUV? That should give them something... or not. Or RDNA2 would have to be radically different, like Zen3 is purported to be. But that doesn't seem to make sense at this point. Or HBM, anyone, lol? Get those power savings somehow.

Nvidia on the other hand has all of the natural improvements of a die shrink, PLUS their arch changes.

They could be hedging on RTX gains this year seeing as they likely know the steep ass hill AMD has to climb.

Lots of ifs, lots of coulda, lots of we don't really know. Dash of salt.....voila!
 
From a link (or two) of yours...


And with Nvidia Ampere saying the same thing (50% performance for the same power)... if we assume both are true... that leaves the landscape in a similar position, no?

I wonder how AMD is going to accomplish this on the same node with just an arch tweak. Nvidia on the other hand has all of the natural improvements of a die shrink, PLUS their arch changes.
But you assume AMD is going to do nothing more than an arch tweak, when the majority of sources state RDNA2 is a new GPU architecture. But it's also something that AMD is keeping quiet, just like how they kept Zen quiet.
 
But you assume AMD is going to do nothing more than an arch tweak, when the majority of sources state RDNA2 is a new GPU architecture. But it's also something that AMD is keeping quiet, just like how they kept Zen quiet.
Lol, I like your positive outlook, even if it is inherently misplaced here. :)

Your links show 50% for AMD, and so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place if we follow the rumors, eh?
 
Lol, I like your positive outlook, even if it is inherently misplaced here. :)

Your links show 50% for AMD, and so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place if we follow the rumors, eh?
Agreed. :clap::peace:
 
Lol, I like your positive outlook, even if it is inherently misplaced here. :)

Your links show 50% for AMD, and so does Ampere (assuming both are true). So I ask again: regardless of 7nm+ and a new arch, we're in the same place if we follow the rumors, eh?
If we follow the rumors, Ampere won't be Ampere, Zen3 will suck and Cascade Lake will mop the floor with it, and the new NV graphics might not come out at all.
Well, God knows what is true or not, and if rumors are in play here, we users and customers know nothing about the actual product and will know nothing until it shows up. It is in the companies' best interest not to reveal anything serious or meaningful that would jeopardize the product in any way. The rumors are manufactured, not leaked, nowadays.
This new NV graphics will be good. Why wouldn't it be? RT is being pushed forward and it's been our "leather jacket dude's" dream, so now let's wait and see what the price will be, how this will play out against AMD in the future, and what the latter will show with RDNA2. BUT, if anyone tries to justify the price being jacked up again because there are more RT cores in the GPU, or because it is the high end of the highest end, then please leave the planet and never come back.
 
Ya, I remember years back, before I tried my first SLI setup (2x 780 Ti), I was scared to death about the stuttering people kept talking about. I've run 780 Ti SLI, Titan X (Maxwell) SLI, and now 1080 Ti SLI... I haven't had a lick of stuttering in any games I play. I mean zippo. I generally run vsync @ 60 FPS in 4K and my games have been butter smooth. If I ever feel like a game is running right around that 60 FPS limit for my cards and may fluctuate (which can cause input lag in those rare situations), then I switch out of true vsync and enable adaptive vsync at the driver level, and that will take care of any issues.

My experiences with multi gpu have been great. It's obviously not for everyone given that the scaling is never 1:1, and in some cases not even close, but if you have tons of cash and / or are just a hardware enthusiast that wants maximum image quality and / or framerates, it's something I'd recommend people try.



Ya I'd like to think they'll get the prices back down into the realm of reason, but I am skeptical with NVidia. :) I may need to just plan on buying 2nd hand Turing once the prices get down into reasonable territory.
The 1070 Ti was actually the best performance/$ card of the 10 series, with the 1080 Ti right behind. Take a look at this: https://docs.google.com/spreadsheets/d/1-JXBPMRZtUx0q0BMMa8Nzm8YZiyPGkjeEZfZ2DOr8y8/edit?usp=sharing This is my database for NVIDIA graphics cards, and I also have launch MSRPs and price/TFLOPS (for this context, it is directly relevant when comparing GPUs with the same architecture). I only use launch MSRPs, not adjusted MSRPs, like the GTX 1080 which launched at $699 but was later cut by, I think, $100 or $200; I don't remember the exact price cut.
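Roughly how the price/TFLOPS column is derived (launch MSRPs, FP32 TFLOPS = shaders x 2 x boost clock; the numbers below are the commonly listed specs, so treat it as a sketch rather than the spreadsheet itself):

# Rough price/TFLOPS comparison using launch MSRPs (Pascal cards, illustrative)
cards = {
    # name: (launch MSRP in USD, shader count, boost clock in GHz)
    "GTX 1070 Ti": (449, 2432, 1.683),
    "GTX 1080 Ti": (699, 3584, 1.582),
}

for name, (msrp, shaders, boost_ghz) in cards.items():
    tflops = shaders * 2 * boost_ghz / 1000  # FP32: 2 ops per shader per clock
    print(f"{name}: {tflops:.1f} TFLOPS, ${msrp / tflops:.0f} per TFLOPS")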

Would also be nice to hear some rumours about the improvements to RTX hardware and tensor cores too, shader numbers are not going to inform us much about the performance in isolation.

@Berfs1 I can't see any of that panning out; since when did Nvidia improve their best and then undercut it? So I think the prices we have now plus 50 dollars minimum, at least at launch, with Supers or something coming out later once stocks of the 2xxx series run out, makes sense tbf.
Yea, the pricing was a rumor, but I can believe that the performance will be substantially better. Also, since you mentioned the Super cards, here are a few facts that not a lot of people know about them: they have better price/performance, but they have lower performance/watt than the non-Supers. Not a lot of people touched on that, so I just wanted to put it out there that the performance/watt is actually worse on the Supers than on the non-Supers. Everyone wants the most performance with the lowest cost and lowest power; you can only get a good balance of two of those, and you will never get max performance at the lowest power.
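A minimal sketch of that trade-off, using the 2070 vs 2070 Super as an example (MSRPs and board power are the commonly cited launch figures; the relative performance number is a rough estimate, not measured data):

# Price/performance vs performance/watt (illustrative figures)
cards = {
    # name: (launch MSRP USD, board power W, relative performance)
    "RTX 2070":       (499, 175, 100),
    "RTX 2070 Super": (499, 215, 112),
}

for name, (price, tdp, perf) in cards.items():
    print(f"{name}: {perf / price:.3f} perf/$, {perf / tdp:.3f} perf/W")

# Same price, more performance -> better perf/$, but the extra power
# draw means worse perf/W, which is exactly the trade-off described above.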
 
That 320-bit bus looks weird: by allowing your xx80-grade card to have such high memory bandwidth, you start to cripple your xx80 Ti's performance at higher resolutions (unless it uses a 512-bit bus or HBM2).
Though I'd be happy to be wrong, as better-grade products for the same or a cheaper price are always welcome.

The last card with a 320-bit bus was an 8800 GTS; it's too odd a number to be the complete memory controller. I'd wager there is some disabled silicon and the controller is really 384-bit for the Ti.
 
As much as I want to buy the latest and greatest GPU each year... I think I am going to stop doing so until they make gaming great again!

All the game developers, but maybe a few, suck! Tired of quarter-baked games filled with bugs, or heavily monetized with microtransactions...
 
The last card with a 320-bit bus was an 8800 GTS; it's too odd a number to be the complete memory controller. I'd wager there is some disabled silicon and the controller is really 384-bit for the Ti.
Not the last card to have a 320-bit bus width; the GTX 470 had 320-bit, as did the GTX 560 Ti (448 core) and GTX 570, but yes, it is weird nonetheless. Not to mention the 3080 Ti may be 352- or 384-bit; the 1080 Ti and 2080 Ti had 352-bit, and the 780 Ti and 980 Ti had 384-bit, so who knows at this point lol. But I have a hunch the Titan Ampere may have 48GB VRAM and the 3080 Ti may have 20GB/24GB VRAM, since the 80 Ti cards have always had half or near half the VRAM of the Titan cards, with the one exception being the Titan Z (which was a dual-GPU card).
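The hunch is really just bus-width arithmetic: every 32 bits of bus is one memory chip, so capacity comes in multiples of the chip density. A quick sketch with common GDDR6 densities (1 GB and 2 GB per chip), which is of course speculative for unannounced cards:

# VRAM capacity follows from bus width: one 32-bit channel = one memory chip
def vram_options(bus_width_bits, chip_densities_gb=(1, 2)):
    chips = bus_width_bits // 32
    return {d: chips * d for d in chip_densities_gb}  # GB per density option

print(vram_options(352))  # 11 chips -> 11 GB or 22 GB
print(vram_options(384))  # 12 chips -> 12 GB or 24 GB
print(vram_options(320))  # 10 chips -> 10 GB or 20 GB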
 
Not the last card to have a 320-bit bus width; the GTX 470 had 320-bit, as did the GTX 560 Ti (448 core) and GTX 570, but yes, it is weird nonetheless. Not to mention the 3080 Ti may be 352- or 384-bit; the 1080 Ti and 2080 Ti had 352-bit, and the 780 Ti and 980 Ti had 384-bit, so who knows at this point lol. But I have a hunch the Titan Ampere may have 48GB VRAM and the 3080 Ti may have 20GB/24GB VRAM, since the 80 Ti cards have always had half or near half the VRAM of the Titan cards, with the one exception being the Titan Z (which was a dual-GPU card).
Most of these were just partially disabled memory controllers. GPUs don't have a single memory controller, but multiple 64-bit controllers, and even these can be partially disabled to get 32-bit increments.

This rumor does however contain two specific and odd details;
1) Different SM count per GPC (Nvidia usually has these the same within an architecture)
2) A separate die for a chip with a 320-bit memory controller
These are two pieces of information that are either completely true or completely wrong, and those under NDA would immediately know if this entire rumor is true or just BS.
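The 32-bit increments mentioned above are easy to see in a little sketch: a 320-bit bus normally falls out of a 384-bit die with part of a controller disabled, which is why a die that is natively 320-bit would be unusual (illustrative only):

# A die with N 64-bit memory controllers can be cut down in 32-bit steps
def possible_bus_widths(num_64bit_controllers):
    full = num_64bit_controllers * 64
    return list(range(32, full + 1, 32))

# e.g. a die with six 64-bit controllers (384-bit full bus) could ship
# as 384-, 352- or 320-bit depending on how much is disabled
print(possible_bus_widths(6)[-3:])  # [320, 352, 384]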

As much as I want to buy the latest and greatest GPU each year... I think I am going to stop doing so until they make gaming great again!

All the game developers, but maybe a few, suck! Tired of quarter-baked games filled with bugs, or heavily monetized with microtransactions...
Why upgrade so often? Upgrade when you need to.

But you're right about game developers; high quality work is more the exception than the norm. It's not Nvidia that's "killing" gaming, it's poor software.
 
Most of these were just partially disabled memory controllers. GPUs don't have a single memory controller, but multiple 64-bit controllers, and even these can be partially disabled to get 32-bit increments.

This rumor does however contain two specific and odd details;
1) Different SM count per GPC (Nvidia usually has these the same within an architecture)
2) A separate die for a chip with a 320-bit memory controller
These are two pieces of information that are either completely true or completely wrong, and those under NDA would immediately know if this entire rumor is true or just BS.


Why upgrade so often? Upgrade when you need to.

But you're right about game developers; high quality work is more the exception than the norm. It's not Nvidia that's "killing" gaming, it's poor software.
It's not the developers' fault; it's the gaming companies' boards of directors that conjure up techniques to squeeze as much money as possible into their pockets while greatly limiting development funding.

They want less money invested into proper game development while still trying to make huge profits.

Nowadays games are popped out too quickly with lots of issues and bugs. It's really too bad, and hopefully things change.
 
Why upgrade so often? Upgrade when you need to.

For no logical reason. I like playing with new gadgets. It's a disease... LOL. But now that they cost over $1200 each year, I'm going to stop wasting my money... especially since I can't find any games that suck me in anymore.
 
For no logical reason. I like playing with new gadgets. It's a disease... LOL. But now that they cost over $1200 each year, I'm going to stop wasting my money... especially since I can't find any games that suck me in anymore.

I would say the average gamer is on an RX 580 or GTX 1060, and this is the middle road they're going to target.
 
I would say the average gamer is on an RX 580 or GTX 1060, and this is the middle road they're going to target.
I have the following, as you can see in the drop-down menu for System Specs, and I have absolutely NO issues playing on Ultra High picture quality settings at 1440p. In games like DOOM, RAGE 2, the entire METRO series, Wolfenstein New Order, Old Blood, II The New Colossus, Youngblood, Resident Evil 2 Remake, Resident Evil 7 Biohazard, Mad Max, PREY, Dying Light, etc., my FPS ranges from 60 up to 100. Never have I experienced any slowdowns in a game, except in Dying Light a couple of times: when I am picking up stuff too quickly, the FPS drops to about 50, then goes back up to my average of 75 FPS for that game. DOOM is about 70-100 FPS depending on the area, and the same goes for the Wolfenstein games for the most part.

Specs:
* Sapphire Radeon RX 580 8GB Nitro+ SE
* AMD Ryzen 7 1700X @ stock
* G.Skill TridentZ 32GB (2 x 16GB) DDR4 3200
* Asus 27" (MG278Q) 144Hz WQHD 1440p

My next upgrade:
Radeon RX 6700XT 8GB
AMD Ryzen 7 4800X
Same Ram & Monitor.
X670 chipset mobo Socket AM4 etc., I can dream right? lol
End of my Ramblings....
 
It's not the developers' fault; it's the gaming companies' boards of directors that conjure up techniques to squeeze as much money as possible into their pockets while greatly limiting development funding.

They want less money invested into proper game development while still trying to make huge profits.

Nowadays games are popped out too quickly with lots of issues and bugs. It's really too bad, and hopefully things change.
Yes, agree. I should have emphasized that I meant the game development companies, not the poor individuals given the "impossible" task.

But as I've mentioned in other threads before, it's often not only about money spent, but rather about quick turnover. These companies' boards often want rapid product cycles rather than spending the time necessary to build a proper game engine first, then do full-scale content creation, and then finally proper testing before shipping.
 
Estimated or confirmed release date?
 
I wanna tell you something that nobody mentions about multi-card: input delay.
People have to understand that SLI "frames" are not really frames :x
Think about it: what is the point of having 2x the FPS but 3x the input delay?!

Yeah, it's true, all the games have been really bad the last 10 years :x
 
Well, the 3070 is a 2080S on steroids: 60% faster than the 2070, at 60% the chip size, at the same power, with higher clocks, and at a lower price... than the 2080S.

And how will the 3070 get over the same bandwidth limitations as the 2080S if it uses the same 256-bit GDDR6 memory? The 16 Gbps stuff is already quite expensive. It's been shown that even a standard 2080 performs the same as the 2080S when the memory is set to the same speed, despite the SM deficit.

This is why the 3080 is getting a 320-bit bus, otherwise it would perform the same as the 3070, while saving room for a 3080 Ti which will most likely get a full 384-bit bus.
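Rough bandwidth math behind that argument (the memory speeds for the unreleased cards are assumptions on my part; only the 2080 Super figure is a known spec):

# Memory bandwidth in GB/s = bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 15.5))  # 2080 Super: 496 GB/s
print(bandwidth_gbs(256, 16))    # hypothetical 256-bit 3070 with 16 Gbps GDDR6: 512 GB/s
print(bandwidth_gbs(320, 16))    # rumored 320-bit 3080: 640 GB/s
print(bandwidth_gbs(384, 16))    # a full 384-bit bus for a 3080 Ti: 768 GB/s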
 
Btw, when was Intel supposed to launch their new GPUs?
 
Yes, agree. I should have emphasized that I meant the game development companies, not the poor individuals given the "impossible" task.

But as I've mentioned in other threads before, it's often not only about money spent, but rather about quick turnover. These companies' boards often want rapid product cycles rather than spending the time necessary to build a proper game engine first, then do full-scale content creation, and then finally proper testing before shipping.
That is exactly what I meant to say lol,

Btw, when was Intel supposed to launch their new GPUs?
If Intel is serious about discrete graphics, I can see them matching both AMD & Nvidia in about 3 years' time. As for the release, I heard sometime in 2020, but it's going to barely do 1080p, last I read.
 