
NVIDIA: GeForce RTX 3080 Reviews Delayed, RTX 3070 Availability Confirmed

Actually, you're wrong. Pascal was a tweaked (and extended) version of Maxwell, introduced because Volta was delayed (just look at the older roadmaps showing Kepler -> Maxwell -> Volta), but it's still a good design.
Interesting theory, but that's not what happened.

I would say the failure is on your side; you are the one trying to impart information here, but do go ahead and try again.
You can say whatever you wish...

Hey, @staff!

This basically means that you have the card already, and will be able to publish a review in a few days? YES/NO is enough...
They're bound by NDA; they can't confirm or deny that info.

Just lex being lex... somebody said a bad thing about his 2080 and that's how he gets. It's okay, he gets too excited about hardware. :nutkick:
And this is you being yourself. If by "lex being lex" you mean being objective and realistic, then yeah, I'm being me. Being excited by life is part of how one stays happy.

The user above stated that having just bought a 2080 Ti they feel "buyer's remorse". They fail to realize that the card they bought is leaps and bounds better than what they had previously. Couple that with the fact that they have the card in hand and can actually use it, and it means they will not have to wait for stock of the 30x0-series cards, which are about to be released and will instantly sell out for months to come. Choosing to be a cynic is just that: a choice. Choosing to be objective and positive is a far better option. Maybe you and a few others should try it some time instead of being, well, the way you are.
 
Interesting theory, but that's not what happened.
Not a theory, just the facts sir.
Volta was delayed, so Pascal was put in between as a refinement: a shrunk and extended Maxwell with some improvements from Volta. Volta then evolved into Turing, Pascal's successor in the consumer space.
[Attached images: NVIDIA GPU roadmap slides - GPURoadmap_678x452.jpg, GPU-Roadmap-GTC-2015-SGEMM-900x418.jpg]
 
Not a theory, just the facts sir.
Volta was delayed, so Pascal was put in between as a refinement: a shrunk and extended Maxwell with some improvements from Volta. Volta then evolved into Turing, Pascal's successor in the consumer space.
[Attached images: NVIDIA GPU roadmap slides - GPURoadmap_678x452.jpg, GPU-Roadmap-GTC-2015-SGEMM-900x418.jpg]
That's it? Pictures? LOL!
 
Dunno where you've been but Kepler smoked what AMD had running at the time. The fact that NVIDIA felt it OK to release their "midrange" GK104 to compete with AMD's top products should show you why.

"The Radeon R9 290 and R9 290X hit NVIDIA's high-end GPU lineup back in fall-2013. The $399 R9 290 was faster than the $999 GTX TITAN and the $650 GTX 780; while the R9 290X held out on its own until NVIDIA made a product intervention with the GTX 780 Ti and GTX TITAN Black to reclaim those two price points."

-W1zzard

 
I've been wondering if it would be worth it to get the RTX 3080 FE, but without knowing the noise, heat, and so on of that cooler design, it makes me want to wait.

Plus, I would love to see the price and performance of AMD this time around.

Avoid the 3080 no matter what; it uses an old 8nm process at 44 MTr/mm² density, instead of the 6nm 66 MTr/mm² that should be the norm. Even AMD doesn't use real 7nm; it is 41 MTr/mm². I have a suspicion 7nm+ is not any better, except for some clock and power improvements... Wait for Hopper.
 
I get a little excited every time I see nvidia and/or their logo in news titles. Does it make me a fanboy?
It makes you a Tech head like most people here.
 

OMFG I CANT WAIT TO READ THIS!!!!1

That first comment of the review always makes me laugh. :laugh:
 
Avoid the 3080 no matter what; it uses an old 8nm process at 44 MTr/mm² density, instead of the 6nm 66 MTr/mm² that should be the norm.
That is an uninformed statement. Samsung is the manufacturer for the 30x0-series dies. Even though it's 8nm, Samsung's process has differing characteristics from TSMC's and must therefore be judged on the merits of its performance, not the simple aspect of node size. And as performance numbers have yet to be released (completely ignoring leaks, which are without merit), that information is not specifically known.
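For what it's worth, here is a rough back-of-the-envelope check of the density figures being thrown around in this thread, using commonly cited die sizes and transistor counts (treat the numbers as approximate, not official):

```python
# Rough sanity check of the MTr/mm^2 figures quoted in this thread.
# Die areas (mm^2) and transistor counts (billions) are the commonly
# cited public numbers; treat them as approximate.
dies = {
    "TU102 (Turing, TSMC 12nm)":   (754, 18.6),
    "Navi 10 (RDNA, TSMC 7nm)":    (251, 10.3),
    "GA102 (Ampere, Samsung 8nm)": (628, 28.3),
}

for name, (area_mm2, transistors_bn) in dies.items():
    density = transistors_bn * 1000 / area_mm2  # million transistors per mm^2
    print(f"{name}: ~{density:.0f} MTr/mm^2")
```

That lands roughly on the ~41 and ~44-45 MTr/mm² figures quoted above; either way, density alone says little about clocks, power, or yields.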
 
RTX 3070 vs RTX 2080 Ti
SM: 46 vs 68 (+48% for the 2080 Ti)
Shaders: 5888 vs 4352 (+35% for the 3070)
TMUs: 184 vs 272 (+48% for the 2080 Ti)
ROPs: 64 vs 88 (+38% for the 2080 Ti)
Bandwidth: 448 GB/s vs 616 GB/s (+38% for the 2080 Ti)

Edit: If it really is as fast as or faster than the RTX 2080 Ti in standard rasterization, then good job Nvidia. The only downsides are 8GB vs 11GB VRAM and the relatively high TDP for a smaller process.
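For reference, a quick sketch of where percentages like the ones above come from; the specs are just the numbers from that list, and each percentage is the leading card's advantage over the other:

```python
# Percent advantage of whichever card leads each spec listed above.
specs = {
    "SM":               (46, 68),     # RTX 3070 vs RTX 2080 Ti
    "Shaders":          (5888, 4352),
    "TMUs":             (184, 272),
    "ROPs":             (64, 88),
    "Bandwidth (GB/s)": (448, 616),
}

for name, (rtx3070, rtx2080ti) in specs.items():
    lead = (max(rtx3070, rtx2080ti) / min(rtx3070, rtx2080ti) - 1) * 100
    leader = "RTX 3070" if rtx3070 > rtx2080ti else "RTX 2080 Ti"
    print(f"{name}: {leader} leads by ~{lead:.0f}%")
```

Raw unit counts only go so far, of course, since the shader architecture changed between the two generations.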

Keep in mind the shaders are a bit different architecturally this time around. It should still be an improvement, but not as directly comparable to the 2080 Ti. Basically, one 2080 Ti shader/CUDA core is not the same as one 3070 core, even at the same clock speeds.
 
"The Radeon R9 290 and R9 290X hit NVIDIA's high-end GPU lineup back in fall-2013. The $399 R9 290 was faster than the $999 GTX TITAN and the $650 GTX 780; while the R9 290X held out on its own until NVIDIA made a product intervention with the GTX 780 Ti and GTX TITAN Black to reclaim those two price points."

-W1zzard


That's more than a full year after the release of the GTX 680 on the GK104 Kepler die; the GTX 680 released in March of 2012. And as you state, a fuller GK110 (the GTX 780 Ti) launched 6 months after the GTX 780 did and performed better than the 290X.
 
Avoid 3080 no matter what, it uses old 8nm 44Mtr density, instead of 6nm 66Mtr/mm2 that should be the norm. Even AMD doesnt use real 7nm. it is 41Mtr/mm I have a suspicion 7+ is not any better, except yeah some clock and power improvements... Wait for Hopper.

The problem is the RTX 3070 ain't powerful enough for 4K@144Hz and the RTX 3090 is a rip-off, so Nvidia calculated it right.
 
The problem is the RTX 3070 ain't powerful enough for 4K@144Hz and the RTX 3090 is a rip-off, so Nvidia calculated it right.
I think it might be hard even for the 3090 to hold 144 FPS at 4K all the way through in certain games. Dips will happen for sure.
Wonder what caused the delay for the 3080.
 
Avoid the 3080 no matter what; it uses an old 8nm process at 44 MTr/mm² density, instead of the 6nm 66 MTr/mm² that should be the norm. Even AMD doesn't use real 7nm; it is 41 MTr/mm². I have a suspicion 7nm+ is not any better, except for some clock and power improvements... Wait for Hopper.

If you go by that logic to not upgrade, then you will likely never upgrade. For all we know, Hopper may not be using a "real" 5 or 7nm either, so will you skip that again? 8nm is a refined Samsung 10nm, so I am not expecting it to be as dense as the 6nm you mentioned. Moreover, people are looking for performance improvements, not how dense the chip is.

The problem is the RTX 3070 ain't powerful enough for 4K@144Hz and the RTX 3090 is a rip-off, so Nvidia calculated it right.
The mid-range xx70 models are generally not meant for 4K gaming in the first place. The same goes for AMD's side of things. High-FPS 4K is reserved for the top-end cards, and frankly, I don't think even the likes of the RTX 3090 can play all games at 4K@144Hz consistently. It may be possible with DLSS, but as I recall, Nvidia didn't introduce DLSS to drive very high FPS; most DLSS implementations target 60 FPS at up to 4K.
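To put 4K@144Hz in perspective, here is the simple frame-time budget math (nothing card-specific, just the refresh-rate arithmetic):

```python
# Frame-time budget: every frame, including the slowest one, must finish
# within this window or the output dips below the target refresh rate.
for hz in (60, 120, 144):
    budget_ms = 1000 / hz
    print(f"{hz} Hz target -> {budget_ms:.2f} ms per frame")
```

144 Hz leaves under 7 ms per frame, which is why even top-end cards will dip in heavier titles at 4K.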
 
That's more than a full year after the release of the GTX 680 on the GK104 Kepler die; the GTX 680 released in March of 2012. And as you state, a fuller GK110 (the GTX 780 Ti) launched 6 months after the GTX 780 did and performed better than the 290X.

Quite underwhelming considering the price.

AMD was able to keep refreshing the same old GCN GPUs for years because the Kepler portfolio was so bad lol.
 
I think it might be hard even for the 3090 to hold 144 FPS at 4K all the way through in certain games. Dips will happen for sure.
Wonder what caused the delay for the 3080.

Same here, because a delay is interesting no matter if the reason is good or bad; it's the why that matters.

The mid-range xx70 models are generally not meant for 4K gaming in the first place. The same goes for AMD's side of things. High-FPS 4K is reserved for the top-end cards, and frankly, I don't think even the likes of the RTX 3090 can play all games at 4K@144Hz consistently. It may be possible with DLSS, but as I recall, Nvidia didn't introduce DLSS to drive very high FPS; most DLSS implementations target 60 FPS at up to 4K.

You don't say. That's why the RTX 3080 is the better fit here: the RTX 3090 costs about 1900 USD from Nvidia's own site, while the RTX 3080 sits at a more reasonable 900 USD, give or take, depending on the AIB model. I know I'm not blowing 1900 USD on the FE, and even more for an AIB card, not gonna happen. This is also why I think it's sad that AMD ain't out with something yet.
 
Avoid the 3080 no matter what; it uses an old 8nm process at 44 MTr/mm² density, instead of the 6nm 66 MTr/mm² that should be the norm. Even AMD doesn't use real 7nm; it is 41 MTr/mm². I have a suspicion 7nm+ is not any better, except for some clock and power improvements... Wait for Hopper.
God... I waited 2 years for a replacement for my 1070, then another 2 years for this to come out. Now you expect me to wait another 2 years for Hopper?
Hopper will probably be good, sure, but that's at least 2 years away from now.
Samsung 8nm is still better than the TSMC 12nm and 16nm that Turing and Pascal were on, and that shows in how good Ampere is shaping up to be. I'll happily buy the 3080 on Thursday, and then I might upgrade when Hopper arrives if it's so damn good.
 
Quite underwhelming considering the price.

AMD was able to keep refreshing the same old GCN GPUs for years because the Kepler portfolio was so bad lol.

Kepler's first gen was indeed neck and neck between AMD and Nvidia.

Kepler's refresh started showing the cracks in GCN and forced AMD to deploy a 512-bit bus on those 'great' R9 290 / 290Xes. They also managed to get maybe two AIB products out with a card that didn't sound like an airplane while boiling you in your room. Realistically, the R9 290X was AMD's/GCN's last swan song, and after that they simply had nothing. Fury X was a dead end, plagued with issues, not the least of them being margin and supply, a problem they handily repeated with Vega, while the latter had no competitive edge whatsoever. HBM was very visibly the only escape AMD saw at the time to get 'moar cores' onto their slow-as-molasses memory subsystem. Meanwhile, Nvidia explored and deployed its first delta compression tech in Maxwell and could make do with half or less of the bus width using el cheapo GDDR5. Alongside GPU Boost, this obliterated AMD's entire lineup, which by then consisted of a failed Tonga attempt and 7970 rebrands, and soon after they declared they would 'focus on the midrange'. For all that focus, they also 'focused on consoles'... in practice, they just ran a skeleton crew on the whole division.

We know how that went. Their focus on the midrange was a repeatedly rebranded Polaris chip, a failed Vega chip, and the buggiest GPU ever released, Navi 10.

Time for some redactions in your history notebook, I'd say. I'm sure Navi 20 will do fantastic; after all, the track record seems good. :wtf:

"lex being lex" you mean being objective and realistic

Mhm. In the eye of the beholder ;) You've stacked several 'facts' here that are proven untrue and clear for all to see. But keep at it, it's entertaining.
 
I was hoping for some interesting reading today... guess it can wait.

What I don't understand is this:
"...the delay was in response to certain reviewers requesting more time from NVIDIA as COVID-19 impacted their sampling logistics. "

It's horrifically vague and doesn't make sense.
Did Nvidia fail to get samples out in a timely fashion? I've shipped things via USPS and UPS over the last six months, and things have been arriving on time to their destinations across the US... barring any kind of natural weather issues.
 
I've shipped things via USPS and UPS over the last six months, and things have been arriving on time to their destinations across the US

There are more countries in the world than just the US...
 
There are more countries in the world than just the US...
True, but we have equipment/parts for the machinery we use shipped in to us from Canada, Italy, Germany, and Mexico, from small parts to things that need to be shipped on container ships, and we haven't had any unforeseen downtime from parts due to late/delayed shipping... maybe we're just lucky.
 
8nm does the job of delivering a new product, but it looks like we have a GTX 780 Ti / 980 Ti scenario: both of those 600 mm² dies were replaced within a 10-12 month window by a smaller, faster card for half the price. So anywhere between 12 and 18 months from now, 8nm is obsolete. Why on earth would you pay $750 for that? It will be worth less than $250 in resale so soon that you wouldn't even have played 3 games on it.
 
You've stacked several 'facts' here that are proven untrue and clear for all to see. But keep at it, it's entertaining.
Yup, that's what happened... The word "concrete" is coming to mind... But yes, VERY entertaining indeed!
8nm does the job of delivering a new product, but it looks like we have a GTX 780 Ti / 980 Ti scenario: both of those 600 mm² dies were replaced within a 10-12 month window by a smaller, faster card for half the price. So anywhere between 12 and 18 months from now, 8nm is obsolete.
That's not going to happen.
 
8nm does the job of delivering a new product, but it looks like we have a GTX 780 Ti / 980 Ti scenario: both of those 600 mm² dies were replaced within a 10-12 month window by a smaller, faster card for half the price. So anywhere between 12 and 18 months from now, 8nm is obsolete. Why on earth would you pay $750 for that? It will be worth less than $250 in resale so soon that you wouldn't even have played 3 games on it.
You pay for the performance it delivers. And thus far the rumours are that it's faster than a 2080 Ti, at a much lower price.
So sure, by this time next year they'll have even better offerings, even more bang for your buck.
As for now, this product seems to deliver what it says it will; the upcoming reviews will of course prove this right or wrong, but I suspect it will hold true.
I couldn't care less how big or small the die is, as long as it performs.
 