
NVIDIA Readies New GeForce RTX 30-series SKU Positioned Between RTX 3070 and RTX 3080

btarunr

Editor & Senior Moderator
Possibly unsure of the GeForce RTX 3070's ability to tackle AMD's Radeon RX 6000 series parts, NVIDIA is designing a new RTX 30-series SKU positioned between the RTX 3070 and RTX 3080. This is not a 16 GB variant of the RTX 3070, but rather a new SKU based on the 8 nm "GA102" silicon, according to kopite7kimi, a source with a reliable track record on NVIDIA leaks. The SKU carries the ASIC code "GA102-150-KD-A1." The silicon is configured with 7,424 CUDA cores across 58 streaming multiprocessors (29 TPCs), 232 tensor cores, 232 TMUs, 58 RT cores, and an unknown number of ROPs. According to kopite7kimi, the card is configured with a 320-bit-wide memory interface, although it's not known whether this is conventional GDDR6, like the RTX 3070 has, or the faster GDDR6X found on the RTX 3080.
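For what it's worth, the leaked unit counts are internally consistent with the standard Ampere SM layout. A quick sanity-check sketch, assuming the usual per-SM figures for Ampere (128 FP32 CUDA cores, 4 tensor cores, 4 TMUs, 1 RT core, and 2 SMs per TPC):

```python
# Sanity check of the leaked GA102-150 configuration against the
# standard Ampere SM layout (assumed per-SM figures, not from the leak).
CORES_PER_SM = 128   # FP32 CUDA cores per SM on Ampere
TENSOR_PER_SM = 4    # 3rd-gen tensor cores per SM
TMUS_PER_SM = 4      # texture units per SM
RT_PER_SM = 1        # 2nd-gen RT cores per SM
SMS_PER_TPC = 2      # SMs per texture processing cluster

sms = 58  # streaming multiprocessors in the leaked SKU

print("TPCs:        ", sms // SMS_PER_TPC)    # 29
print("CUDA cores:  ", sms * CORES_PER_SM)    # 7424
print("Tensor cores:", sms * TENSOR_PER_SM)   # 232
print("TMUs:        ", sms * TMUS_PER_SM)     # 232
print("RT cores:    ", sms * RT_PER_SM)       # 58
```

Every figure matches the leak, which at least suggests the numbers weren't invented independently of each other.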

NVIDIA recently "cancelled" a planned 16 GB variant of the RTX 3070 and a 20 GB variant of the RTX 3080, possibly as the company calibrates its response to the Radeon RX 6000 series. We theorize that doubling the memory may not have hit the desired cost-performance targets, and that the company believes the competitive outlook of the 10 GB RTX 3080 is secure. This explains the need for a SKU with performance halfway between the RTX 3070 and RTX 3080. As for pricing, with the RTX 3070 positioned at $500 and the RTX 3080 at $700, the new SKU could land somewhere in between. AMD's RDNA2-based Radeon RX 6000 series GPUs are expected to feature DirectX 12 Ultimate logo compliance, meaning there is a level playing field between AMD and NVIDIA in the performance segment.



 
Nvidia must be bloody terrified of what AMD has if it's moving GA102 down to 70 Ti levels.
 
So three Navi 21 tiers vs. three GA102 tiers; these two chips have terrible yields, don't they?
 
From the start of this generation, Nvidia seemed rushed, and when the prices were announced, I seriously questioned why they would reduce them so much generation over generation. It seems apparent (and frankly unsurprising) that Nvidia had a source close to or inside AMD that gave them enough details on Navi 2x to realize they had to launch first at competitive prices, or risk launching after AMD and then being in the awkward position of having to catch up to AMD with Ampere.
 

I am not saying you are wrong, but what was the point in launching 2 months earlier than AMD with extremely limited quantities and not leaving enough time for the board partners to test their cards properly? Nvidia just pissed off a lot of people and I can’t see why.

Edit: it must have been the end of the financial quarter. Now it makes more sense.
 
The Ti or Super editions that come out as a response to AMD's launch will only serve to prove how bad impatience is for those buying the launch editions.

If you were price-scalped for an inferior card that's had crash to desktop issues at launch and a shortage of VRAM compared to the competition then I hope the last two months of 3000-series were worth it for you. Early adopter tax was high this time around.
 
This is what the 3070 should have been from the beginning. The gap in specifications from the 3070 to the 3080 is way too wide. Shame Nvidia really dropped the ball this time around...
 
Here is a picture of my shocked face...:D

Nobody is (should be) surprised here. There is a 25% performance gap and $200 between them. Seems normal to me...

lol... wow. Shortage of VRAM, lol... a CTD issue resolved within a week by a driver.
 
I already said this: the 3070 is simply overpriced considering the gap between it and the 3080. Well, they are all overpriced, but that one in particular.
 
When the competition is launching cards with 16GB VRAM and your "flagship" has 10GB, yes.

Whether a shortage of VRAM materializes is going to be down to game devs in the future. I can't predict that with any accuracy and neither can you, but two games in the last year forced me to dial back settings because I was using a 6 GB card with insufficient VRAM, and I'm not even running at 4K. DirectStorage also means it should be easier than ever for devs to load and make use of even higher-resolution assets than we're currently used to seeing.

The best-case scenario for Nvidia is that game devs treat the consoles as the upper end of what they develop for, and 10 GB will be just about enough. Historical trends have repeatedly shown that game devs will not do that; they'll push whatever they can based on the hardware available at the time, and that'll be a bunch of 16 GB cards. I reckon by 2022, when these cards are getting longer in the tooth, the limiting factor for pushing graphics detail will be VRAM, not framerate.
 
"We theorize that doubling in memory amounts may not have hit the desired cost-performance targets; "

Called it.
 
How about the 3070, 3080, and 3090s? Any news on their availability yet?
 
I want a SKU for every $5 difference in price point. Thank you for paving the way, Nvidia.
 
God, please give Lisa Su the wisdom to price Jensen's Ampere out of the market. Here's the tip :):

36 CU (2.1 GHz) 8 GB Navi 22L = $249 -> RX 5700 (XT) replacement
40 CU (2.3 GHz) 10 GB Navi 22 = $299 -> RTX 3060 Ti/2080/1080 Ti competitor
52 CU (2.1 GHz) 10 GB Navi 21L = $399 -> 2080 Ti/3070 competitor
64 CU (2.1 GHz) 12 GB Navi 21 = $449 -> 2080 Ti/3070 beater
72 CU (2.2 GHz) 16 GB Navi 21XL = $599 -> 3080 competitor
80 CU (as fast as it gets, to hell with TDP) 16 GB Navi 21XT = $999 -> 3080 beater and maybe 3090 competitor
 
Another day, another RTX rumor. How about they actually start producing cards instead of producing rumors?
 
Bored to death with these BS rumors all over the damned place from both Red and Green. AMD should release their GPUs ASAP and let the people decide what's worth buying and who is king.
 
The memory 'issue' has been covered ad nauseam... 10 GB is plenty for 1440p over the life of the card, and for most titles at 4K. By the time it becomes a true concern, you'll want a different card anyway.

Y'all can cry over spilled milk. Me? I'm rooted in reality, so I'll pass on bitching about 10 GB. I won't be one of the... people... thinking 'zOMG, it haz s1xt33n GBs it haz 2 b betterz!'
 
From the start of this generation, Nvidia seemed rushed, and when the prices were announced, I seriously questioned why they would reduce them so much generation over generation.
Consoles. 2070S/2080-class GPU horsepower in a 500 $/€/£ package...
 
Hopefully nVidia will pay for milking people with their Turing rubbish.
 
So three Navi 21 tiers vs. three GA102 tiers; these two chips have terrible yields, don't they?
This is the beginning of the GPU cycle; what else would you expect? As more chips are produced, cut-down SKUs (probably released some time next year) will cover the rest of the quality stack as the product matures.

Consoles. 2070S/2080 class GPU horsepower in 500$€£ package...
That is key in this contracted world economy.
 
It was about time, the gap is huge.

I already said this, the 3070 is simply too overpriced considering the gap between it and 3080. Well, they are all overpriced but that one in particular.

I would say underperforming rather than overpriced: the price is right for a 3070, but the performance doesn't justify the 3070 name. Like I said before, it's more like a 3060 Ti.
 

Nvidia continues with the same tactic... they always create a new cut-down SKU when they realize they've lost. But it is not normal for Nvidia to sacrifice profit margins by using its biggest chip in an upper-mid-range GPU.

What will the leather jacket do when he realizes the 6080XT performs close to the 3080 while costing the same as the 3070? I wonder how much more they can cut into profit margins; I bet 10 GB of GDDR6X costs over $200, and with this horrible yield per wafer the chip should cost almost the same (just kidding). lol lol
 