
NVIDIA GeForce Ampere Architecture, Board Design, Gaming Tech & Software

W1zzard
Administrator, Staff member
NVIDIA's new Ampere architecture brings many interesting improvements, especially for raytracing and DLSS. In this special article, we go into all the technical details: how the shader counts are doubled and what makes the GeForce RTX 30 series so much faster. We also take a closer look at the designs of the RTX 3090, 3080, and 3070.

 
Intel.....something Big is coming on sept 2..
Nvidia....hold my beer......
 
@W1zzard Without saying yes or no (and thus without breaking an NDA or contract), are you currently in possession of any RTX 3080 or 3090 samples from NVIDIA?
 
The company also wants to democratize 3D animation film-making without making people learn 3D from scratch—the Omniverse Machinima is basically Ansel for moving images, and its possibilities for storytellers is endless. ............

I don't think the expression "democratize" belongs in that sentence. If NVIDIA thinks any regular Joe will suddenly start mimicking an engineer at 3D animation film-making just because of the card, that is simply not going to happen.

The industry of 3D animation relies on a small number of very talented people with serious expertise; it's a sport requiring 100% dedication to the job.
Anyway, I am also interested to see what NVIDIA has cooked up this time in terms of gaming performance; 3D animation film-making is not a solid motive for someone to spend that amount of cash.

On the other hand, it is the norm for game developers to own or use high-end graphics cards, because those are tools for their work.
Hobbyist gamers do not need the tools of 3D developers.
 
The industry of 3D animation relies on a small number of very talented people with serious expertise... Hobbyist gamers do not need the tools of 3D developers.

The industry of 3D animation... you mean like the industry of entertainment, music and video production... modding... coding for the lulz ;)

Professional work is being done by amateurs across the globe. The only difference with professionals is that they get a good paycheck for it.

We saw with the emergence of streaming and Shadowplay that suddenly the "Let's Play" was the new guaranteed source of YouTube income. This has only gotten bigger, and movies built in game engines aren't new either; they are really the modders' and 'tubers' territory. And let's face it... there are some pretty big crowds going for that content. What else is entertainment other than something that manages to draw the attention of a lot of people? 3D animation as it is done now is perhaps far more static than having the crowd get creative with it. Whole new formats can emerge.

Even just screenshots - we saw a new form of photography emerge with Nvidia Ansel, as well. People will elevate it to an art form.
 
Well, this only shows that Ampere is a better Turing; it doesn't seem to bring anything new to the table besides performance and power improvements. Don't get me wrong, that's amazing, especially if you consider the pricing, but for GPU devs this is quite boring.
 
:cry: I don't think I've seen any encoder improvements mentioned; I was hoping to see at least a 20% improvement in quality at the same or even lower bitrates.
 
Yeah really interested to see what people can create with Omniverse Machinima, certainly a fun tool to play around with either way.

Anyway, nice write-up; all those apparent leaks and the core counts were still wrong.
 
So NVIDIA again misleads the masses with DLSS 8K.

So it renders the game at 1440p and then upsamples?
DLSS looks blurry at 4K.

I wonder what the performance would be if it were set to high quality at 8K: no optimizations, no DLSS, no screen re-scaling, just a 'pure' mode for the driver.

I tried this at 4K and it proved the hypothesis. Basically, DLSS ruins the beautiful 4K images of games that you could run with textures at 4K (you have to make edits in .inf, .cfg and .xml files and know what you're doing), so I see DLSS as marketing FUD to prop up abysmal RTX performance, replete with image-quality compromises.

Where are the benchmarks at 4K with max eye candy: no DLSS, RTX on, no motion blur, no depth of field, and all post-processing turned on? Max visuals?

THAT is the real tell-all of performance.
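For reference on the "renders at 1440p and upsamples" point, here is a minimal Python sketch of how the DLSS internal render resolution relates to the output resolution. The per-axis scale factors are the commonly cited per-mode defaults and are an assumption here, since the exact values can vary per game and driver:

```python
# Rough sketch: DLSS internal render resolution vs. output resolution.
# The per-axis scale factors are the commonly cited per-mode defaults
# (assumptions -- actual values can vary per game and driver version).

DLSS_SCALE = {
    "Quality":           1 / 1.5,
    "Balanced":          1 / 1.72,
    "Performance":       1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal resolution DLSS renders at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    # "8K DLSS" in Ultra Performance mode works out to roughly 2560x1440
    # internally, upscaled to 7680x4320 -- the "renders at 1440p" case above.
    for mode in DLSS_SCALE:
        w, h = internal_resolution(7680, 4320, mode)
        print(f"8K output, {mode:>17}: renders at ~{w}x{h}")
```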
 
I don't think the expression "democratize" belongs in that sentence... 3D animation film-making is not a solid motive for someone to spend that amount of cash.
Listen to this guy. He's related to those that invented democracy ;)

And I love how Nvidia drew that airflow diagram as if the card is to be used in a CPU-less computer :D

All things considered, Ampere landed about where I predicted it would. Which makes me wish I'd predicted it would be cheaper. It's still good value for the money (fuggidabout the 3090), but a 3080 at $500 would have been dreamy.
 
Definitely interested in the reviews; they have clearly spent some time trying to flesh out the "what can we use graphics cards for" angle in addition to the usual "faster, faster" route.
 
Exciting times indeed.
Thanks for this synthesis.

I'm thinking about water cooling the thing. I've never been a fan of doing it for the CPU... after all, it's "only" 100 W. But I'd like to see those 320 W pushed straight out of the case. Could be a new adventure.
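On the 320 W point, here is a rough back-of-the-envelope sketch of how much case airflow it takes to carry that heat away at a given air-temperature rise. The density and heat-capacity values are standard room-temperature assumptions, nothing card-specific:

```python
# Back-of-the-envelope: airflow needed to remove a given heat load.
# Assumes room-temperature air: density ~1.2 kg/m^3, cp ~1005 J/(kg*K).
AIR_DENSITY = 1.2      # kg/m^3
AIR_CP      = 1005.0   # J/(kg*K)
M3S_TO_CFM  = 2118.88  # 1 m^3/s expressed in cubic feet per minute

def airflow_cfm(power_w: float, delta_t_k: float) -> float:
    """CFM of airflow needed to absorb power_w with a delta_t_k temperature rise."""
    mass_flow = power_w / (AIR_CP * delta_t_k)      # kg/s of air
    return (mass_flow / AIR_DENSITY) * M3S_TO_CFM   # m^3/s -> CFM

if __name__ == "__main__":
    # Exhausting 320 W with a 10 K air-temperature rise needs roughly 56 CFM.
    for dt in (5, 10, 15):
        print(f"320 W at a {dt} K rise: ~{airflow_cfm(320, dt):.0f} CFM")
```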
 
an uneventful, incremental upgrade to Turing, much like Pascal was to Maxwell

This is a really poor comparison that suggests a very short memory. Pascal was a similar performance jump over Maxwell as Ampere looks to be over Turing. The only difference between the two is that the big chip for Ampere is launching immediately, and not months later.

Here's a quick reminder from your own 1080 review:
Incredible performance, large performance jump

Ampere is a return to form for Nvidia. One bad launch (Turing) doesn't make this some sort of unicorn launch.
 
There will surely be a 3080ti down the road which almost matches the 3090... This is my prediction!!!! lol :toast:
 
So NVIDIA again misleads the masses with DLSS 8K... DLSS looks blurry at 4K... I see DLSS as marketing FUD to prop up abysmal RTX performance, replete with image-quality compromises.


What games are you trying for DLSS? If they're older games that haven't been updated and still use DLSS 1.0, it's indeed a blurry mess, but if they use DLSS 2.0 it's amazing, in some cases even better than native 4K and not blurry at all (refer to the many Digital Foundry videos).
 
I just realised the benchmarks were made with an Intel i9, which doesn't support PCIe 4.0.

I wonder how performance would scale with PCIe 4.0 and Ryzen, and whether it would make a difference.
 
I just realised the benchmarks were made with an Intel i9, which doesn't support PCIe 4.0.

I wonder how performance would scale with PCIe 4.0 and Ryzen, and whether it would make a difference.
I'm sure NVIDIA tested both but still chose to use Intel despite it lacking PCIe 4.0. You can't beat their overwhelming gaming headroom now that Ampere is delivering >120 FPS at 1440p.

Yes, that's right. If you own Zen 2 or slower, get ready to upgrade your CPU AGAIN.
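For context, here is a minimal sketch of the raw link bandwidth at stake. The transfer rates and 128b/130b encoding are the published PCIe figures; whether games come anywhere near saturating even the Gen 3 x16 link is exactly what a Ryzen/PCIe 4.0 comparison would have to show:

```python
# Theoretical per-direction bandwidth of a PCIe x16 link.
# Gen 3 runs at 8 GT/s per lane, Gen 4 at 16 GT/s, both with 128b/130b encoding.
ENCODING = 128 / 130  # 128b/130b line-coding efficiency

def x16_bandwidth_gb_s(transfer_rate_gt_s: float, lanes: int = 16) -> float:
    """Approximate usable bandwidth in GB/s (before protocol overhead)."""
    bits_per_second = transfer_rate_gt_s * 1e9 * ENCODING * lanes
    return bits_per_second / 8 / 1e9

if __name__ == "__main__":
    gen3 = x16_bandwidth_gb_s(8)   # ~15.8 GB/s
    gen4 = x16_bandwidth_gb_s(16)  # ~31.5 GB/s
    print(f"PCIe 3.0 x16: ~{gen3:.1f} GB/s, PCIe 4.0 x16: ~{gen4:.1f} GB/s")
```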
 
There will surely be a 3080ti down the road which almost matches the 3090... This is my prediction!!!! lol :toast:
Well, ofc there will be a 3080 Ti; there's a massive price gap between the 3080 and 3090. It will probably be revealed by the end of the year, for $899-$999.
 
I'd like to know how NVIDIA got 40+ fps in Control at 4K RTX settings with a 2080, because I can tell you now, with a setup the same as their test rig (9900K & 32 GB RAM), Control runs at less than 20 fps most of the time at 4K RTX settings, often crashing down to less than 10 fps in some areas... I didn't see 'DLSS On' anywhere in the slide they used, which showed the 3080 getting 80+ fps under the same settings.
 
When are the actual reviews coming out and these fluff articles ending? I want to see it in action.
 
Well, ofc there will be a 3080 Ti; there's a massive price gap between the 3080 and 3090. It will probably be revealed by the end of the year, for $899-$999.
They could very well fit two cards in there, the 3080 Super and possibly a 3080 Ti, unless NVIDIA's planning to drop it this gen.
 
I'm sure NVIDIA tested both but still chose to use Intel despite it lacking PCIe 4.0. You can't beat their overwhelming gaming headroom now that Ampere is delivering >120 FPS at 1440p.

Yes, that's right. If you own Zen 2 or slower, get ready to upgrade your CPU AGAIN.

By overwhelming gaming headroom, you mean the 2.5 fps by which the Core i-10000 series is faster at 4K?

Or the 5-15 fps by which it's faster at 720p?

Because I tell you, these GPUs were not made for playing at 720p. Low-res tests are nice for finding out which CPU would be fastest if the CPU were the bottleneck in gaming, but honestly, who plays at 720p? By now, even the haters should have gotten that it does not matter whether you play on Intel or AMD. Basically, Intel is for people who want the absolute maximum in gaming, and AMD is for people who want to spend a little less and get a decent gaming CPU that is even better for most work tasks.

I had Intel for more than a decade (C2D E6300, C2Q Q6600, i5-4690K) and was very happy with Core 2 and Core i, and now I am very happy with my 3900X. I am pretty sure it will not be the limiting factor when getting a 3080 and playing at 1440p and up...
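The low-resolution-testing argument boils down to a simple bottleneck model: the delivered frame rate is roughly the minimum of the CPU-limited and GPU-limited rates, so dropping the resolution inflates the GPU-limited rate until only the CPU side shows. A tiny sketch with made-up illustrative numbers:

```python
# Simple bottleneck model: delivered FPS ~= min(CPU-limited FPS, GPU-limited FPS).
# All frame-rate figures below are made-up illustrative numbers, not measurements.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

if __name__ == "__main__":
    cpu_a, cpu_b = 200.0, 185.0  # two hypothetical CPU-limited frame rates
    for res, gpu_fps in (("720p", 400.0), ("1440p", 160.0), ("4K", 80.0)):
        print(f"{res:>5}: CPU A -> {delivered_fps(cpu_a, gpu_fps):.0f} fps, "
              f"CPU B -> {delivered_fps(cpu_b, gpu_fps):.0f} fps")
    # At 720p the CPU gap is visible (200 vs 185 fps); at 1440p and 4K both
    # CPUs deliver the same GPU-limited frame rate.
```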
 