Tuesday, July 7th 2020

NVIDIA GeForce RTX 3070 and RTX 3070 Ti Rumored Specifications Appear

NVIDIA is slowly preparing to launch its next-generation Ampere graphics cards for consumers, after the A100 GPU arrived for data-center applications. The Ampere lineup is attracting more leaks and speculation every day, so we can assume the launch is near. In the most recent round of rumors, we have new information about the GPU SKU and memory of the upcoming GeForce RTX 3070 and RTX 3070 Ti. Thanks to Twitter user kopite7kimi, whose past speculation has been confirmed multiple times, we have information that the GeForce RTX 3070 and RTX 3070 Ti use a GA104 GPU SKU paired with GDDR6 memory. The catch is that the Ti version of the GPU will feature new GDDR6X memory, which is faster and can reportedly run at up to 21 Gbps.

The regular RTX 3070 is supposed to have 2944 CUDA cores on the GA104-300 GPU die, while its bigger brother, the RTX 3070 Ti, is designed with 3072 CUDA cores on the GA104-400 die. Paired with the new technologies the Ampere architecture brings and with the new GDDR6X memory, the GPUs are set to be very good performers. It is estimated that both cards will reach a memory bandwidth of around 512 GB/s. That is all we have so far. NVIDIA is reportedly in the Design Validation Test (DVT) phase with these cards and is preparing for mass production in August. Following that would be the official launch, which should happen before the end of this year, with some speculation pointing to September.
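As a rough sanity check, the 512 GB/s figure matches plain 16 Gbps GDDR6 on a 256-bit bus; the bus width is not part of the leak, so treat it as an assumption:

```python
# Memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
# The 256-bit bus width is an assumption; the leak does not state it.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(16, 256))  # 512.0 -- regular GDDR6 at 16 Gbps, matching the estimate
print(bandwidth_gb_s(21, 256))  # 672.0 -- what 21 Gbps GDDR6X would give on the same bus
```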
Sources: VideoCardz, TweakTown, kopite7kimi (Twitter)

106 Comments on NVIDIA GeForce RTX 3070 and RTX 3070 Ti Rumored Specifications Appear

#26
droopyRO
RedelZaVedno: RTX 2070 was 10% slower than 1080 Ti ... 14 -> 12 nm
You mean the 2070 Super.
#27
moproblems99
EarthDog: 24GB of VRAM sounds like a Titan variant...


please... PLEASE GPU makers, don't stuff us with more than 12GB of VRAM.................
I agree that it is probably a Titan. I think 16 GB should keep us satisfied for the 'life' of a current GPU, as I can see the new consoles creating a new generation of laziness with regard to memory and resource management.
#28
ratirt
I'm worried seeing this. It would definitely be another level of pricing, but not necessarily another level of gaming.
#29
Krzych
These leaks are just so useless... They just assume the same die naming scheme as Turing and proportional spec upgrade to what Turing had over Pascal. You could make this "leak" the day Turing launched...
#30
ppn
24GB sounds like a Dual GPU card, 3090, much like 490,590. And the buffer is not additive. 12GB+12GB in SLI.
#31
EarthDog
ppn: 24GB sounds like a Dual GPU card, 3090, much like 490,590. And the buffer is not additive. 12GB+12GB in SLI.
I highly doubt we'll see a dual GPU card these days....

the last dual GPU card I recall is AMD's 295x2 that was 500W, lolol.
#32
ARF
EarthDog: I highly doubt we'll see a dual GPU card these days....

the last dual GPU card I recall is AMD's 295x2 that was 500W, lolol.
Unless they start producing graphics cards with chiplets, for example 2, 4, or 6 small chiplets.
#33
BoboOOZ
Well, many rumors point to some nice fat TDPs for this generation, too. Maybe not 500 W, though :)
#34
EarthDog
ARF: Unless they start producing graphics cards with chiplets, for example 2, 4, or 6 small chiplets.
Correct. But that isn't happening now, and it will be a different implementation than the SLI/CF we know. We aren't getting any dual GPUs on a single PCB with these large chips.

EDIT: See your post below, lol.
#35
ARF
BoboOOZ: Well, many rumors point to some nice fat TDPs for this generation, too. Maybe not 500 W, though :)
The usual wccftech click-bait now states that the top die should be 627 mm^2 on Samsung's 8 nm node, which would equal roughly 895 mm^2 on TSMC's old 16/12 nm process.
That would make it around 15-20% larger than the old RTX 2080 Ti.
wccftech.com/nvidia-ampere-gpu-gaming-627mm-2-die/
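Rough math on that comparison, taking TU102 in the RTX 2080 Ti at its known 754 mm^2:

```python
# Compare the claimed 895 mm^2 "16/12 nm equivalent" area against TU102 (RTX 2080 Ti, 754 mm^2).
tu102_mm2 = 754
claimed_equivalent_mm2 = 895
increase = (claimed_equivalent_mm2 / tu102_mm2 - 1) * 100
print(f"~{increase:.1f}% larger")  # ~18.7%, within the quoted 15-20% range
```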
#36
ObiFrost
RedelZaVedno: Or Nvidia just don't care about PC gamers migrating to consoles?
Migrating where? To a 30 FPS, lowest-presets box? There are barely any games worth purchasing dedicated $400-500 hardware for. Consoles will be viable between 2021 and 2022, when we actually start receiving next-gen games instead of that cartoonish/cell-shaded crap for children. As far as I'm aware, more enthusiasts are pumped for GPU hardware, not console trash.
#37
Valantar
EarthDog: Correct. But that isn't happening now, and it will be a different implementation than the SLI/CF we know. We aren't getting any dual GPUs on a single PCB with these large chips.

EDIT: See your post below, lol.
Maybe Nvidia is moving to wafer-scale GPUs, ditching the PCB and just carving the whole card out of silicon? :D
ObiFrost: Migrating where? To a 30 FPS, lowest-presets box? There are barely any games worth purchasing dedicated $400-500 hardware for. Consoles will be viable between 2021 and 2022, when we actually start receiving next-gen games instead of that cartoonish/cell-shaded crap for children. As far as I'm aware, more enthusiasts are pumped for GPU hardware, not console trash.
I think you'd do well to take a look at Digital Foundry's analysis of the PS5 game showcase video. While there is some 30 fps footage there, there are definitely no low presets, no cel-shaded crap, and no current-gen looks. And while of course none of those games are final, most of them run on actual hardware, including examples of RTRT and detail levels not even remotely possible on current-gen consoles. There's little reason to suspect that high-budget launch titles won't be very impressive graphically on both consoles.
#38
blazed
xkm1948: Newest rumor from Chiphell claims the upcoming 3080 Ti (or the non-Titan flagship equivalent) will boast 24GB of VRAM.

translate.google.com/translate?sl=auto&tl=en&u=https%3A%2F%2Fwww.chiphell.com%2Fthread-2240189-1-1.html

Dare I say $1499 for the lowest-end 3080 Ti?
I actually expect the upcoming Nvidia cards to come in at a reasonable price. My premise is that not only will Nvidia have to compete with AMD as well as Intel in the graphics card space, but they will also have to compete with the new consoles as well. Rumor is that due to the pandemic, Xbox Series X and PS5 will be sold at a loss in order to move as many units as possible, with prices being suggested as low as $199 and $299 respectively, although obviously take that with a very large grain of salt.
#39
BoboOOZ
blazed: I actually expect the upcoming Nvidia cards to come in at a reasonable price. My premise is that not only will Nvidia have to compete with AMD as well as Intel in the graphics card space, but they will also have to compete with the new consoles as well. Rumor is that due to the pandemic, Xbox Series X and PS5 will be sold at a loss in order to move as many units as possible, with prices being suggested as low as $199 and $299 respectively, although obviously take that with a very large grain of salt.
All the rumors that I've heard point to $400-500 for the consoles, with only Microsoft's 1080p version coming in lower than that. $300 for a console sounds like wishful thinking, but if it were true, Nvidia could not compete with that. They won't give you a 12 TFLOP card for $300.
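For context, a rough FP32 estimate for the rumored 3070 spec next to the Xbox Series X GPU's 12.15 TFLOPS; the ~1.9 GHz boost clock here is purely an assumption, since the leak gives no clocks:

```python
# FP32 TFLOPS = 2 ops per FMA * shader count * clock (GHz) / 1000.
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

print(fp32_tflops(2944, 1.9))      # ~11.2 TFLOPS -- rumored RTX 3070, assumed 1.9 GHz boost
print(fp32_tflops(52 * 64, 1.825)) # ~12.1 TFLOPS -- Xbox Series X GPU (52 CUs at 1.825 GHz)
```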
#40
Valantar
blazed: I actually expect the upcoming Nvidia cards to come in at a reasonable price. My premise is that not only will Nvidia have to compete with AMD as well as Intel in the graphics card space, but they will also have to compete with the new consoles as well. Rumor is that due to the pandemic, Xbox Series X and PS5 will be sold at a loss in order to move as many units as possible, with prices being suggested as low as $199 and $299 respectively, although obviously take that with a very large grain of salt.
There is absolutely no chance whatsoever that those consoles will be that cheap. Zero. Selling at a loss is one thing; selling at that kind of loss is quite something else. The SoC inside each likely costs close to that much for AMD to produce, let alone what the console makers are paying for it. So if the PS5 (which should be the cheaper of the two to make) costs $400 to make (it's likely more than that) but is sold at $299 including distributor and retailer margins, Sony would be losing more than 150 million dollars per million consoles sold, assuming two levels of sales with 10% margins. Losses on that scale won't be made back by game licensing or subscription services. And remember, the PS4 has sold something like 50 million units. If they sold ten million PS5s in the first year, they would lose more than a billion and a half dollars. In a single year. Sony would buckle under those losses.
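A back-of-the-envelope version of that math, using the $400 unit cost and the two 10% margin levels described above:

```python
# Sony's take per unit after two 10% sales margins, versus an assumed $400 build cost.
retail_price = 299
unit_cost = 400                           # assumed manufacturing cost, per the estimate above
sony_revenue = retail_price * 0.9 * 0.9   # ~$242 after distributor and retailer margins
loss_per_unit = unit_cost - sony_revenue  # ~$158
print(loss_per_unit * 1_000_000)          # >$150M lost per million consoles sold
print(loss_per_unit * 10_000_000)         # ~$1.6B lost on ten million units in a year
```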

So no, those prices aren't happening.

On the other hand, if the rumored 20-CU "Xbox Series S/Lockhart" shows up, that might indeed hit $299.
#41
Tomorrow
Yep, 199 and 299 are very unrealistic. The fact that neither has shared prices, and that Sony specifically argued for "value", tells me that they will try to soften the blow and promote other aspects of the new consoles. If the prices were that low, they would heavily promote it. 449 is the absolute cheapest I can see them going, and that's for the PS5 Digital Edition. The regular PS5 and XSX will likely be 499.

Now the less powerful XSS could be 299 for 1080p gaming.
#42
RandallFlagg
ratirt: I'm worried seeing this. It would definitely be another level of pricing, but not necessarily another level of gaming.
Not understanding the angst here.
It's basically the same arch going to a smaller, more efficient, and probably faster process node.
So you get 2080 arch -> 3070, + faster memory, +more power efficient, + a few percent better IPC from the new node

I bet the 3070 is at least 110% the performance of a 2080.
#43
wolf
Performance Enthusiast
RedelZaVedno: NGreedia
moproblems99: Oh shit, here we go again. Want lower prices? Stop buying things. And convince everyone else to do the same.
When I see that word written, I know the person isn't worth listening to.

On the note of the 3070 and Ti, I'm hopeful that if this spec is roughly where we land, I can get a meaningful upgrade from my GTX 1080. A 2070S with the same 2560 CUDA count already handily outperforms my 1080; bump that up to ~3000 with the next-gen features and I can't see how it wouldn't at least outperform a 2080S. What I really want in addition is more VRAM; I won't spend big money on another 8GB card.
#44
moproblems99
wolf: When I see that word written, I know the person isn't worth listening to.
They are still worth listening to, provided they walk the walk and work for free as well.
#45
Minus Infinity
Leaks say clocks will be over 2 GHz, which, combined with IPC gains of at least 10% (it's a new architecture, not a tweak of Turing), makes it plausible that a 3070 Ti is as fast as a 2080 Ti and will smash it in RT. The 3070 should be faster than the 2080 for sure, and of course all Ampere cards will crush Turing in RT. Leaks also indicate no more than a 10% hit to frame rate with RT, so the 3060 will be viable for RT for sure.
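As a very rough scaling sketch (throughput taken as cores x clock x IPC, with the 2.0 GHz boost and +10% IPC being the leaked figures above and the 2080 Ti at its 1545 MHz reference boost), the rumored 3070 Ti does land roughly on par with a 2080 Ti:

```python
# Crude relative-throughput model: cores * clock (GHz) * IPC factor.
def rel_throughput(cores: int, clock_ghz: float, ipc: float = 1.0) -> float:
    return cores * clock_ghz * ipc

rtx_3070_ti = rel_throughput(3072, 2.0, 1.10)  # rumored spec, leaked >2 GHz clock and +10% IPC
rtx_2080_ti = rel_throughput(4352, 1.545)      # 4352 cores at the 1545 MHz reference boost
print(rtx_3070_ti / rtx_2080_ti)               # ~1.0, i.e. roughly on par
```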
#46
robb
RandallFlagg: Not understanding the angst here.
It's basically the same arch going to a smaller, more efficient, and probably faster process node.
So you get 2080 arch -> 3070, + faster memory, +more power efficient, + a few percent better IPC from the new node

I bet the 3070 is at least 110% the performance of a 2080.
what does IPC gain have to do with a node?
#47
RandallFlagg
robb: what does IPC gain have to do with a node?
From my limited understanding, changing to a smaller process node can (doesn't have to, can get worse) result in transistor switching occurring faster. That can and has resulted in a microcode instruction that used to take say 2 cycles to complete, to take 1 cycle.

www.maketecheasier.com/processors-process-size/

"Smaller processes also have a lower capacitance, allowing transistors to turn on and off more quickly while using less energy. And if you’re trying to make a better chip, that’s perfect. The faster a transistor can toggle on and off, the faster it can do work. "
#48
medi01
Say, if this stuff is to be released in time for CP2077, shouldn't RDNA2 cards already be in full-swing AIB production? Yet there have been no leaks so far.

Is it me, or is the gap between the 3070 Ti and 3080 rather large?
RandallFlagg: That can and has resulted in a microcode instruction that used to take say 2 cycles to complete, to take 1 cycle.
This makes no sense. You get faster cycles (higher clocks); you don't miraculously get circuits that can do something in 1 cycle if it took 2.
#49
robb
RandallFlagg: From my limited understanding, changing to a smaller process node can (doesn't have to, can get worse) result in transistor switching occurring faster. That can and has resulted in a microcode instruction that used to take say 2 cycles to complete, to take 1 cycle.

www.maketecheasier.com/processors-process-size/

"Smaller processes also have a lower capacitance, allowing transistors to turn on and off more quickly while using less energy. And if you’re trying to make a better chip, that’s perfect. The faster a transistor can toggle on and off, the faster it can do work. "
Let me just go ahead and clear this up for you. Process node has NOTHING to do with IPC.
#50
ratirt
RandallFlagg: Not understanding the angst here.
It's basically the same arch going to a smaller, more efficient, and probably faster process node.
So you get 2080 arch -> 3070, + faster memory, +more power efficient, + a few percent better IPC from the new node

I bet the 3070 is at least 110% the performance of a 2080.
You don't? I'm worried about the price, due to the high memory bandwidth and capacity and the GDDR6X. As for the comparison that the 3070 equals or beats 2080 performance, I wouldn't be so sure. You say it is the same arch with just a shrink, but basically we know nothing about these new NV graphics cards, so I'm not sure where you get that from. A node shrink gives efficiency or frequency, not IPC, btw. If it is the same arch as Turing, the IPC will be exactly the same.
Like I said, I'm worried about the price, because that will be higher for sure, while the performance may not exactly justify the price bump. Do you get it now?
Look at it as a price/performance ratio.