Wednesday, September 2nd 2020

NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

NVIDIA just announced its new-generation GeForce "Ampere" graphics card series. The company is taking a top-to-bottom approach with this generation, much like "Turing," by launching its two top-end products first: the GeForce RTX 3090 24 GB and the GeForce RTX 3080 10 GB. Both cards are based on the 8 nm "GA102" silicon. Join us as we live-blog the pre-recorded stream by NVIDIA, hosted by CEO Jen-Hsun Huang.

Update 16:04 UTC: Fortnite gets RTX support. NVIDIA demoed an upcoming update to Fortnite that adds DLSS 2.0, ambient occlusion, and ray-traced shadows and reflections. Coming soon.
Update 16:06 UTC: NVIDIA Reflex technology works to reduce e-sports game latency. Without elaborating, NVIDIA spoke of a feature that works to reduce input and display latencies "by up to 50%". The first supported games will be Valorant, Apex Legends, Call of Duty Warzone, Destiny 2 and Fortnite—in September.
Update 16:07 UTC: Announcing NVIDIA G-SYNC eSports Displays—a 360 Hz IPS dual-driver panel that launches through various monitor partners this fall. The display has a built-in NVIDIA Reflex precision latency analyzer.
Update 16:07 UTC: NVIDIA Broadcast is a brand-new app, available in September, that serves as a turnkey solution for enhancing video and audio streams using the AI capabilities of GeForce RTX. It makes it easy to filter and improve your video, add AI-based backgrounds (static or animated), and it builds on RTX Voice to filter background noise out of your audio.
Update 16:10 UTC: Ansel evolves into Omniverse Machinima, an asset exchange that helps independent content creators use game assets to create movies. Think fan-fiction Star Trek episodes using Star Trek Online assets. Beta in October.
Update 16:15 UTC: Updates to the AI tensor cores and RT cores. In addition to greater numbers of RT and tensor cores, the 2nd-generation RT cores and 3rd-generation tensor cores offer higher IPC. Minimizing the performance impact of ray-tracing appears to be an engineering goal with Ampere.
Update 16:18 UTC: Ampere 2nd Gen RTX technology. Traditional shader throughput is up 2.7x, the ray-tracing units are 1.7x faster, and the tensor cores bring a 2.7x speedup.
Update 16:19 UTC: Here it is! Samsung 8 nm and Micron GDDR6X memory. The Samsung 8 nm announcement came out of nowhere, as we were widely expecting TSMC 7 nm. Apparently NVIDIA will use Samsung for its Ampere client-graphics silicon, and TSMC for the lower-volume A100 professional compute processors.
Update 16:20 UTC: Ampere has almost twice the performance per Watt compared to Turing!
Update 16:21 UTC: Marbles 2nd Gen demo is jaw-dropping! NVIDIA demonstrated it at 1440p 30 Hz, or 4x the workload of first-gen Marbles (720p 30 Hz).
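The 4x figure is straight pixel arithmetic: 2560 × 1440 = 3,686,400 pixels versus 1280 × 720 = 921,600 pixels, exactly a 4:1 ratio at the same 30 Hz.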
Update 16:23 UTC: Cyberpunk 2077 figures big in the next generation. NVIDIA is banking extensively on the game to highlight the advantages of Ampere. The 200 GB game could absorb gamers for weeks or months on end.
Update 16:24 UTC: New RTX IO technology accelerates the storage sub-system for gaming. It works in tandem with the new Microsoft DirectStorage technology, the Windows API counterpart of the Xbox Velocity Architecture, which can pull resources from disk directly into the GPU. Game engines must support the technology for it to work. The tech promises up to a 100x throughput increase and significant reductions in CPU utilization. It's timely, as PCIe Gen 4 SSDs are just arriving.
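For developers wondering what engine support means in practice, here is a rough sketch of how a batched, GPU-direct asset read could look. This is purely illustrative: the PC DirectStorage SDK is not yet public, so every type and function name below is an assumption modeled on what Microsoft has described, not a confirmed API.

    // Rough sketch of a batched, GPU-direct read (error handling omitted).
    // All DStorage* names are assumptions; treat this as annotated pseudocode.
    #include <cstdint>
    #include <d3d12.h>
    #include <dstorage.h>      // hypothetical DirectStorage header
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    void LoadAssetDirect(ID3D12Device* device, ID3D12Resource* gpuBuffer,
                         const wchar_t* path, uint32_t bytesOnDisk)
    {
        // Factory and a queue: requests are batched, then submitted in one go.
        ComPtr<IDStorageFactory> factory;
        DStorageGetFactory(IID_PPV_ARGS(&factory));

        DSTORAGE_QUEUE_DESC queueDesc{};
        queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
        queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
        queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
        queueDesc.Device     = device;
        ComPtr<IDStorageQueue> queue;
        factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

        ComPtr<IDStorageFile> file;
        factory->OpenFile(path, IID_PPV_ARGS(&file));

        // The destination is a GPU buffer: no staging copy through system RAM,
        // and (per the RTX IO pitch) decompression would happen on the GPU.
        DSTORAGE_REQUEST request{};
        request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
        request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
        request.Source.File.Source          = file.Get();
        request.Source.File.Offset          = 0;
        request.Source.File.Size            = bytesOnDisk;
        request.Destination.Buffer.Resource = gpuBuffer;
        request.Destination.Buffer.Offset   = 0;
        request.Destination.Buffer.Size     = bytesOnDisk;

        queue->EnqueueRequest(&request);
        queue->Submit();   // one submission can carry thousands of tiny requests
    }

The point of the model is that the destination of a read is a GPU buffer rather than system memory, and that thousands of small requests can ride behind a single submission, which is where the claimed reductions in CPU utilization would come from.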

Update 16:26 UTC: Here it is, the GeForce RTX 3080, 10 GB GDDR6X, running at 19 Gbps, 238 tensor TFLOPs, 58 RT TFLOPs, 18 power phases.
Update 16:29 UTC: Airflow cooler design, with 90 W more cooling performance than the Turing FE cooler.
Update 16:30 UTC: A performance leap at $700: 2x as fast as the RTX 2080, and up to 2x faster than the original RTX 2070. Available September 17.
Update 17:05 UTC: GDDR6X was purpose-developed by NVIDIA and Micron Technology, which could be the exclusive vendor of these chips to NVIDIA. The chips use the new PAM4 encoding scheme to significantly increase data rates over GDDR6. On the RTX 3090, the chips tick at 19.5 Gbps (data rate), for memory bandwidth approaching 940 GB/s.
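The bandwidth figure follows directly from data rate and bus width: PAM4 carries two bits per symbol (four voltage levels) instead of one, which is how the per-pin rate climbs without a proportional clock increase. On the RTX 3090's 384-bit bus that works out to 19.5 Gb/s × 384 pins ÷ 8 bits/byte = 936 GB/s, matching the "approaching 940 GB/s" figure; the RTX 3080's 19 Gbps chips on a 320-bit bus yield 760 GB/s by the same math.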
Update 16:31 UTC: RTX 3070, $500, faster than the RTX 2080 Ti and 60% faster than the RTX 2070, available in October. 20 shader TFLOPs, 40 RT TFLOPs, 163 tensor TFLOPs, 8 GB GDDR6.
Update 16:33 UTC: Call of Duty: Black Ops Cold War is RTX-on.

Update 16:35 UTC: The RTX 3090 is the new TITAN. Twice as fast as the RTX 2080 Ti, 24 GB GDDR6X. The giant Ampere. A BFGPU, $1500, available from September 24. It is designed to power 8K gaming at 60 fps, and it is up to 50% faster than the Titan RTX.

Update 16:43 UTC: Wow, I want one. On paper, the RTX 3090 is the kind of card I want to upgrade my monitor for. Not sure if a GPU ever had that impact.
Update 16:59 UTC: Insane CUDA core counts, 2-3x increase generation-over-generation. You won't believe these.
Update 17:01 UTC: The GeForce RTX 3090 in detail. Over ten thousand CUDA cores!
Update 17:02 UTC: GeForce RTX 3080 details. More insane specs.

Update 17:03 UTC: The GeForce RTX 3070 has more CUDA cores than a TITAN RTX. And it's $500. Really wish these cards had come out in March; 2020 would've been a lot better.
Here's a list of the top 10 Ampere features.

Update 19:22 UTC: For a limited time, gamers who purchase a new GeForce RTX 30 Series GPU or system will receive a PC digital download of Watch Dogs: Legion and a one-year subscription to the NVIDIA GeForce NOW cloud gaming service.

Update 19:47 UTC: All Ampere cards support HDMI 2.1. The increased bandwidth provided by HDMI 2.1 allows, for the first time, a single-cable connection to 8K HDR TVs for ultra-high-resolution gaming. AV1 video decode is also supported.
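A quick sanity check on why this needs HDMI 2.1: an 8K 60 Hz HDR signal at 10-bit RGB amounts to 7680 × 4320 pixels × 60 Hz × 30 bits ≈ 59.7 Gb/s before blanking overhead, far beyond HDMI 2.0's 18 Gb/s and above even HDMI 2.1's 48 Gb/s link rate, which is why single-cable 8K60 HDR also leans on Display Stream Compression (DSC).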

Update 20:06 UTC: Added the complete NVIDIA presentation slide deck at the end of this post.

Update Sep 2nd: We received the following info from NVIDIA regarding international pricing:
  • UK: RTX 3070: GBP 469, RTX 3080: GBP 649, RTX 3090: GBP 1399
  • Europe: RTX 3070: EUR 499, RTX 3080: EUR 699, RTX 3090: EUR 1499 (this might vary a bit depending on local VAT)
  • Australia: RTX 3070: AUD 809, RTX 3080: AUD 1139, RTX 3090: AUD 2429

502 Comments on NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

#476
BoboOOZ
biffzinker
I didn't get hung up on mixing the two, and I still don't care. At one time I had a K6-III 450 MHz with NVIDIA's Riva TNT.
Oooh, an old guy :p.

At that point, AMD wouldn't have a graphics division for more than another 5 years...
AlwaysHope
Ordered a factory-OC RX 5700 XT last week...
Without waiting for RDNA2 launch?
How much did you pay, out of curiosity?
Posted on Reply
#477
Rakhmaninov3
The Xpert 2000 Pro doesn't even need a fan. Why would I upgrade?
Posted on Reply
#478
BoboOOZ
Rakhmaninov3
The Xpert 2000 Pro doesn't even need a fan. Why would I upgrade?
I didn't get your reference at first ;) It's 2000, like the year... You might need to upgrade for multi-monitor support, though.
Posted on Reply
#479
medi01
dir_d
He said the 3080 is a 4K 60 fps card and way better than the 2080 Ti. That made me wonder a little, because I thought the 2080 Ti was a 4K 60 fps card. Am I wrong?
Consoles are at 2080/2080 Super level and will target 4K, but devs are more likely to target 30 fps than 60 fps. So you'll need something twice as fast as a 2080; the 3080 falls short, maybe the 3090.
Posted on Reply
#480
webdigo
Power consumption under gaming for the 3070 and 3080?????
Posted on Reply
#482
Valantar
Splinterdog
BFGPU
Nuff said.
BFPower draw?
Posted on Reply
#484
KevSmeg
NVIDIA knows how to piss off the customers who bought RTX 20-series cards this year LOL
Posted on Reply
#485
Chomiq
Valantar
BFPower draw?
According to them, 30 W over 3080.

But that depends on what and how they actually measure.
Posted on Reply
#487
Valantar
Chomiq
According to them, 30 W over 3080.

But that depends on what and how they actually measure.
Yes, but the 3080 is again 70+ W over the 2080. So definitely BFPD.
Posted on Reply
#488
mouacyk
Splinterdog
Erm, Quake/Doom reference?
NVIDIA would likely have you think differently. They will likely claim the BFGPU is part of their BFGD ecosystem, which never took off but is now, more than ever, ready to be powered by Ampere.
Posted on Reply
#491
AlwaysHope
BoboOOZ
Oooh, an old guy :p.

At that point, AMD wouldn't have a graphics division for more than another 5 years...


Without waiting for RDNA2 launch?
How much did you pay, out of curiosity?
Even when RDNA2 enters the retail channel, it will be high-end cards first, which will be overkill for my gaming needs.

Besides, just like with every VGA launch, there will be "teething" issues with drivers, etc...
Posted on Reply
#492
Valantar
mouacyk
NVidia would likely have you think differently. They will likely claim BFGPU is a part of their BFGD ecosytem, that did not take off but is now more than ever ready with Ampere to power it.
I don't think you can call that an ecosystem; there were a few displays, then none. And now they're pretty much obsolete, as any decent TV matches their specs and features at a quarter of the price.
Posted on Reply
#493
Icon Charlie
Honest RTX 3000 Series Announcement Parody.... Enjoy :)
Posted on Reply
#494
mechtech
Icon Charlie
So does my 1070. And you can even get 4K monitors in the $300+ range.

But there is a big difference between something running at 60 Hz and something running at 144 Hz+ with an IPS panel at 4K. I've seen the difference and it's nice... but not that nice when the monitor I'm looking at starts at $600; then add a card that can take full advantage of that monitor, and that is a lot of money.

That is why I purchased the Pixio 32-inch 1440p 165 Hz monitor for under $300, though it is a bit of overkill for me, as I'm so used to a 27-inch monitor.

Hardware Monitor did a review in late 2018.

But it was Level 1 Tech that sold me on the monitor.

And finally, the 27-inch brand-name monitor that I wanted cost more than this one.

Now back to the NVIDIA 3000 series of video cards. You know there is going to be limited supply of the 3000 series at launch... right? Yeah, I am hearing the same rumors about limited supply and, of course, price increases. If those rumors come true, it will be just like the supply-constrained 2000-series launch.

I'll just wait until the late October/holiday season to pick up any additional components as needed, and to see what AMD has to offer.

But I am interested to see the actual gaming performance over previous generations, as well as how much heat these cards generate.
My 4K screen was about $350 CAD, and I'm happy with it. When there is a 120 Hz model for the same price, that's when I will buy one. I refuse to pay a $300 premium for another 60 Hz.

As for the NV 3000 series, I have bigger fish to fry: I have to replace the shingles on my house. A GPU upgrade for me is at least 2 years away.
Posted on Reply
#495
BoboOOZ
Here's a video from Tom, bringing a bit of realism to the hype train.

To those who don't know him: he's not an AMD fanboi, he just tends to wear Intel, AMD, or NVIDIA T-shirts on occasion, depending on the content of the video.
Posted on Reply
#496
efikkan
BoboOOZ
Here's a video from Tom, bringing a bit of realism to the hype train.
Coming from the guy who cited "sources" claiming 4-5x performance gains in ray-tracing for Ampere.
Now he is claiming RDNA2 can compete with all tiers of NVIDIA's lineup, but only if AMD wants to (7:30). Last year he claimed AMD had "Big Navi" ready but didn't want to release it (despite no evidence pointing to an RDNA1 "Big Navi"). He is just rambling and speculating, and calling it leaks.

Ampere has in general not been overhyped; if anything, the specs and claimed performance seem to have caught most people by surprise. (As with any claims, these need to be confirmed by real reviews, of course.) The hype for RDNA2 is much higher, but if AMD is nearly as good as NVIDIA at concealing the real specifics from "leakers", then we might not know until it's unveiled whether "Big Navi" is a 2080 Ti-class card for cheap or a true high-end contender. This new trend of "leakers" spreading speculation may actually help hide the few bits of genuine leaks out there.
Posted on Reply
#497
theoneandonlymrk
efikkan
Coming from the guy who cited "sources" claiming 4-5x performance gains in ray-tracing for Ampere.
Now he is claiming RDNA2 can compete with all tiers of NVIDIA's lineup, but only if AMD wants to (7:30). Last year he claimed AMD had "Big Navi" ready but didn't want to release it (despite no evidence pointing to an RDNA1 "Big Navi"). He is just rambling and speculating, and calling it leaks.

Ampere has in general not been overhyped; if anything, the specs and claimed performance seem to have caught most people by surprise. (As with any claims, these need to be confirmed by real reviews, of course.) The hype for RDNA2 is much higher, but if AMD is nearly as good as NVIDIA at concealing the real specifics from "leakers", then we might not know until it's unveiled whether "Big Navi" is a 2080 Ti-class card for cheap or a true high-end contender. This new trend of "leakers" spreading speculation may actually help hide the few bits of genuine leaks out there.
NVIDIA did indeed hype their cards; by their own figures they struggle to prove their own claims. To be fair, a company shouldn't downplay its performance, but it's clear NVIDIA hyped this launch. Let's wait for reviews before getting too excited.
Posted on Reply
#498
BoboOOZ
efikkan
Ampere has in general not been overhyped, if anything the specs and claimed performance seems to have caught most by surprise. (As with any claims, these needs to be confirmed with real reviews of course.)
If you actually listen to what he's saying, he's saying exactly that: Ampere was not overhyped, and the performance is exactly what he leaked half a year ago. The hype for NVIDIA only started last week, because many people misinterpret the very high theoretical FP32 performance, and the performance comparisons based on a handful of cherry-picked games lead some to believe there's an 80-100% improvement over the last generation.
efikkan
The hype for RDNA2 is much higher, but if AMD is nearly as good as Nivida at concealing the real specifics from "leakers", then we might not know until it's unveiled whether "Big Navi" is a 2080 Ti class card for cheap, or if it's a true high-end contender. It may seem like this new trend of all these "leakers" spreading speculation may actually help hide the tiny bits of actual leaks out there.
I have no idea where the hype for Big Navi is; you must be spending your time on different fora than me.
For now, Tom (and others) haven't had a single solid leak that would show whether AMD is even planning to compete in the high end; it's all conjecture, deduction, and wishful thinking at this point.
Posted on Reply
#499
randompeep
RedelZaVedno
Just look at Microsoft FS 2020. The 2080 Ti manages only 31 fps (with dips to 21) over New York City at 4K/Ultra and wants to use 12.7 GB of VRAM. The 3080 will hopefully get us to 55 fps (with dips into the 40s). Having 16 GB would probably smooth out those dips further. Needless to say, the 3080 is still a godsend for flight simmers :)
Hey, it seems like the reviews are there: ~45 fps in M$ FS20 at 4K Ultra. Hope everyone has this clear: Microsoft made an experimental, unoptimized game just for the well-being of the GPU market. I'm not saying RTX 3000 is trash, but in some markets the 3080 AIB cards came in at 50-60% over the advertised FE price. That's quite a hike, and it makes the card not worth the money for the next 12 months in select regions. TBH I'm sitting it out, waiting for further second-hand-market discounts on the RX 570/GTX 1060 3 GB. The 3 GB cards are getting obsolete, so take my $60 if you sell one of those mosquitos.
Posted on Reply
#500
BoboOOZ
DuxCro
Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From the screens where they showed identical scenes side by side, I calculated an average 40% difference. So my previous math, which came to 30-35%, was wrong. Well, maybe it wasn't wrong: faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so there could be less of a difference at lower resolutions.
So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28% (250 W vs. 320 W), while the performance difference hovers around 40%. So how can NVIDIA claim they achieved 1.9x (+90%) performance per Watt? Were they referring to RTX performance?
So now we know: their claim is based exclusively on the performance of fully path-traced Quake and Minecraft.

And the difference you are seeing between the 2080 and the 3080 is so large because the settings are cherry-picked just so the VRAM requirement is higher than 8 GB but lower than 10 GB. Nice job, NVIDIA and DF!
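For what it's worth, the math in the quote is easy to check: +40% performance for +28% power is only 1.40 ÷ 1.28 ≈ 1.09x performance per Watt at full TDP. NVIDIA's 1.9x figure appears to come from an iso-performance comparison, i.e. how much less power Ampere needs to match a fixed Turing frame rate, which lands at a much more efficient point on the voltage/frequency curve, so the two numbers aren't measuring the same thing.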
Posted on Reply