NVIDIA Readies New GeForce RTX 30-series SKU Positioned Between RTX 3070 and RTX 3080

What people need to understand is that if a $600 3070 is really good, then a cheaper, weaker AMD card might be a better choice.
The 3070 should be $500.
 
Nvidia is one generation ahead of AMD. This article is full of fake news made up to fill a web post.

What people need to understand is that if a $600 3070 is really good, then a cheaper, weaker AMD card might be a better choice.
Since they're having trouble producing GPUs, they might as well produce an unprecedented number of news articles just to keep the hype train going. What I've seen of the 30 series so far isn't that good anyway (meh amounts of VRAM, huge power consumption, weak 1080p performance...).
 
Nvidia must be bloody terrified of what AMD has if it's moving GA102 down to 70 Ti levels.
When are the new AMD cards coming?
 
nVidia was planning to flood the market by the end of October / start of November with a few hundred thousand Ampere cards (~300K of the 3080 10GB, ~30K of the 3090 24GB), including a hefty number of RTX 3070 8GB cards as well.

There has been some talk for a while that they have now decided to supply half the originally planned number of 3080 10GB and 3070 8GB cards, but the same number of 3090 24GB.
They also canceled the 20GB options for the 3070/3080. By the way, the 3070 20GB was going to use GDDR6, so the claimed shortage of GDDR6X is not true.

This, if true, indicates that after the RDNA2 performance leaks/rumors, nVidia re-evaluated their segmentation and profit-margin plan. They saw that 20GB 3070/3080 variants are pointless, so they are bringing GA102 down to a new 3070 (Super/Ti, or something else), and this (sort of) confirms the halved supply numbers for the 3070 8GB and 3080 10GB.

At first they planned for more segmentation in the upper high-end tiers for bigger profit margins (I'm surprised!), but the RDNA2 leaks and rumors pushed them toward more segmentation in the middle-tier parts instead, to have more options there and to "play" with performance and price.

In a nutshell, their plan has backfired right in their face...
They could have settled for a profit of 10, but they wanted 15~20, and now they're going to get <10.
 
By the way, the 3070 20GB was going to use GDDR6

Rumors were 20GB GDDR6X for the 3080(Ti) and 16GB GDDR6 for the 3070(Ti).
 
Yes, I made a mistake about the amount of VRAM on the 3070. The rest still stands.
 
Seems like Nvidia is just having yield issues, and this is probably their only option / it's easier to make a slightly gimped 3080 with more RAM to fight the 6800 XT while fixing said issues.
 
You'd have to have a screw loose to buy a $500 graphics card with just 8GB of memory (3070) going into 2021.
Likewise a $700~800 card with only 10GB...
 
Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are...users droning on about vram...

Crying shame that the reviews are so good, but when you hit the forum it's polarizing, toxic fanboy driven drivel. :(
 
Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are...users droning on about vram...

Crying shame that the reviews are so good, but when you hit the forum it's polarizing, toxic fanboy driven drivel. :(
It's probably aimed at me, since you haven't quoted anyone.
Why did you retract your "Haha" emoji from @Shatun_Bear's latest post? ...after I did the same and left post #59.
Should I take this as an opportunity to tell on myself? I don't want to, to be honest.

First, I wasn't agreeing with him; I was trying to make the point that if this is true, then it must also apply to the 3080. You know... trying to start a conversation...
 
Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are...users droning on about vram...

Crying shame that the reviews are so good, but when you hit the forum it's polarizing, toxic fanboy driven drivel. :(

Yep, can't do much about it.
Just let them continue to spew their made-up rumor "facts" and other such drivel. Fortunately it's relatively contained to these news sections. Mostly.
 
Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are...users droning on about vram...

Crying shame that the reviews are so good, but when you hit the forum it's polarizing, toxic fanboy driven drivel. :(
As far as I see it, the reviews point out that the 3080/90 are better than the 20 series. It is true, yet you can't ignore certain factors: no availability in stores, high power consumption, mediocre amounts of VRAM, tiny performance gains in lower resolutions. Sure, if you want to play current titles in 4K, go ahead, buy a 3080, but if you...
  • want a future-proof card: wait and see how well AMD's offerings perform with 16 GB VRAM,
  • want to play current games at 1080p: keep what you have and be happy, or buy something second-hand.
8/10 GB of VRAM is enough now, but who knows what the future brings. The 3080/90 might run out of VRAM sooner than raw GPU power, and it wouldn't be the first instance. My brother has a 2 GB GTX 960 which could play Kingdom Come: Deliverance just fine, if it didn't drop into single-digit FPS whenever it's updating the framebuffer. Personally, I'd rather avoid a similar situation.

Disclaimer: all of my comment contains personal opinion. If it differs from anyone else's, so be it. No offense. :)
 
Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are...users droning on about vram...

Crying shame that the reviews are so good, but when you hit the forum it's polarizing, toxic fanboy driven drivel. :(
But if I point them out, the staff delete the comment, yet when people go into others' DMs personally attacking folks because of their "views", they don't do shit about it. It's hilarious.
 
Hypocrisy at the highest level: people who bought Navi are worrying about future-proofing when they bought something that was already obsolete the moment it was released.
 
I'm actually pretty happy with my 2080 Ti's performance: overclocked I get 7600 in Time Spy Extreme and 16K in Time Spy, which is about ~15% behind the 3080 at 1440p. All my games run in the 100-150 fps range, and 80-100 if I crank the details, so I think I'll just wait for the GPU wars to start before picking anything up. Gonna hold on to it and hope DLSS 2.0 gives me good performance in Cyberpunk.

At the end of the day, if your rig plays games the way you want, who cares how big your theoretical future ePeen is.
 
Hypocrisy at the highest level: people who bought Navi are worrying about future-proofing when they bought something that was already obsolete the moment it was released.
How was the RX 5700 XT (or any equivalent GPU) obsolete at the moment of release? Also, how is this connected to the article about a new nvidia SKU?
 
How was the RX 5700 XT (or any equivalent GPU) obsolete at the moment of release?

No DX12 Ultimate?
Inferior hardware encoder?
Sure, you can sugar-coat it with the price-to-performance thing, but there really is no feature that distinguishes the 5700 XT from a budget GPU. People who bought the 5700 XT are about to upgrade to Big Navi ASAP, so talking about future-proofing when Nvidia is involved is just pure hypocrisy.
 
No DX12 Ultimate?
Inferior hardware encoder?
Sure, you can sugar-coat it with the price-to-performance thing, but there really is no feature that distinguishes the 5700 XT from a budget GPU.
DX12 Ultimate was nowhere near a thing when the 5700XT was released. Hardware-accelerated Ray Tracing and DLSS were (and still are) an nvidia niche represented only in a handful of games. Also, how is the hardware encoder inferior?

Saying that Navi is inferior because it doesn't support hardware RT and DLSS is just as much BS as saying that RTX 20 series cards are inferior because they only communicate through PCI-E 3.0.

My point is: If AMD released the 5700XT right now, it would be a product of questionable value, but at the time of release, it was just as valid as nvidia's GTX 16 series (one of which I bought and used happily until a month ago).

Edit: I still don't see how it all connects to the main article.
 
DX12 Ultimate was nowhere near a thing when the 5700XT was released. Hardware-accelerated Ray Tracing and DLSS were (and still are) an nvidia niche represented only in a handful of games. Also, how is the hardware encoder inferior?

See? When you talk about RT and DLSS it's "only in a handful of games", yet they are obviously the future-proofing features of their time. Do you know how many games require more than 10GB of VRAM right now? Zero?
People who switch between price-to-performance and future-proofing as they see fit really shouldn't comment on either matter.

Btw DX12 Ultimate is more than RT.
 
See? When you talk about RT and DLSS it's "only in a handful of games", yet they are obviously the future-proofing features of their time. Do you know how many games require more than 10GB of VRAM right now? Zero?
People who switch between price-to-performance and future-proofing as they see fit really shouldn't comment on either matter.
I still can't say that RT will be dominantly present (not to mention required) in the near future, just like I can't see electric cars taking over the domestic vehicle market in the next few years. Sure, if you want the best experience, you'll buy an RT-enabled card, but I'm not sure you really need it to be future-proof. VRAM is a different story: you need it whether you use RT or not.
 
Hypocrisy at the highest level: people who bought Navi are worrying about future-proofing when they bought something that was already obsolete the moment it was released.

No, that's more the case for the RTX 2000 series, because people bought those cards "because ray tracing is the future", which is exactly why you now claim the 5700 XT was "obsolete the moment it released".
Yet everyone bought a 5700 because of its rasterization performance (the performance that matters today), saying that buying a card now for RT is stupid because it's not relevant yet except for some gimmicky effects in a handful of games with terrible performance, and that if you wanted proper RT support from both games and hardware you had to wait at least a generation, probably two.

And guess what, that is exactly where we are: only the 3000 series can now do at sort-of playable fps what was released during the 2000 series, and even now it's still quite gimmicky. If you really want RT, even this generation is not enough; you will have to wait until the next for actual full-on RT-based games (if even then... maybe even another generation is needed).
 
I still can't say that RT will be dominantly present (not to mention required) in the near future, just like I can't see electric cars taking over the domestic vehicle market in the next few years. Sure, if you want the best experience, you'll buy an RT-enabled card, but I'm not sure you really need it to be future-proof. VRAM is a different story: you need it whether you use RT or not.

Even worse argument there: I can play with 4GB of VRAM all the same, just lower the detail settings to Medium and voila. Not sure if you have tried, but RT reflections and transparent reflections are much more noticeable in-game than Ultra vs. Medium details. Furthermore, all RTX games will come with DLSS, so yeah, not sure what manual you read regarding VRAM for future-proofing.


No, that's more the case for the RTX 2000 series, because people bought those cards "because ray tracing is the future", which is exactly why you now claim the 5700 XT was "obsolete the moment it released".
Yet everyone bought a 5700 because of its rasterization performance (the performance that matters today), saying that buying a card now for RT is stupid because it's not relevant yet except for some gimmicky effects in a handful of games with terrible performance, and that if you wanted proper RT support from both games and hardware you had to wait at least a generation, probably two.

And guess what, that is exactly where we are: only the 3000 series can now do at sort-of playable fps what was released during the 2000 series, and even now it's still quite gimmicky. If you really want RT, even this generation is not enough; you will have to wait until the next for actual full-on RT-based games (if even then... maybe even another generation is needed).

Well, if you put it that way, then there is no need for future-proofing; every GPU is obsolete when its successor comes out anyway. So why bother with 8GB or 10GB of VRAM? Can't you play a game with anything less than Ultra High settings?
I would much prefer RT to Ultra High settings; at least it doesn't need 400% magnification to notice the difference.
 
Even worse argument there: I can play with 4GB of VRAM all the same, just lower the detail settings to Medium and voila. Not sure if you have tried, but RT reflections and transparent reflections are much more noticeable in-game than Ultra vs. Medium details. Furthermore, all RTX games will come with DLSS, so yeah, not sure what manual you read regarding VRAM for future-proofing.

Well, if you put it that way, then there is no need for future-proofing; every GPU is obsolete when its successor comes out anyway. So why bother with 8GB or 10GB of VRAM? Can't you play a game with anything less than Ultra High settings?
I would much prefer RT to Ultra High settings; at least it doesn't need 400% magnification to notice the difference.
It's not just the settings, but the general assets of the game, texture quality, etc. You can lower your settings, but you can't lower them indefinitely. The fact that you can still play games fine with 4 GB VRAM has nothing to do with it. I currently play The Witcher 3 on my GT 1030 2 GB (since I sold my GTX 1660 Ti) and only use about 1.5 GB with 1080p medium settings. Microsoft Flight Simulator would be a totally different story, I guess. You can't say that 4 GB is enough just because it's enough for you.

Besides, if you're happy to play at lower settings, then why is it an issue for you if a graphics card doesn't have hardware-accelerated RT? ;)

Not to mention, if you're lowering your settings right now, then what kind of future proofing are we talking about?
 
Well, if you put it that way, then there is no need for future-proofing; every GPU is obsolete when its successor comes out anyway. So why bother with 8GB or 10GB of VRAM? Can't you play a game with anything less than Ultra High settings?
I would much prefer RT to Ultra High settings; at least it doesn't need 400% magnification to notice the difference.

It's not so much that; it's that the first go at a new tech is never going to be worth it, because what you buy it for is so new it's not established yet, and by the time it is, we are two or so generations further along, which is what's needed to handle it properly anyway.
The RTX 2000 series was the first go at RT, and even now with the 3000 series out, RT is not really a thing yet. So if you bought the RTX 2000 series for RT, you were just being silly, yet that is exactly what you base your opinion on as to why the 5700 XT was obsolete on release.
 