
NVIDIA GeForce Ampere Architecture, Board Design, Gaming Tech & Software

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
20,909 (3.50/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
NVIDIA's new Ampere architecture brings many interesting improvements, especially for raytracing and DLSS. In this special article we go into all the technical details: how the shader counts are doubled and what makes the GeForce RTX 30 series so much faster. We also take a closer look at the board designs of the RTX 3090, 3080 and 3070.

Show full review
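The FP32 doubling is easy to sanity-check against NVIDIA's public spec sheets. A rough back-of-the-envelope sketch (peak rate only; it assumes every core issues an FMA each clock, which real workloads won't sustain because half of Ampere's FP32 lanes are shared with INT32 work):

[CODE]
# Peak FP32 throughput: CUDA cores x 2 ops/clock (FMA) x boost clock.
# Core counts and boost clocks are from NVIDIA's published specs.
def peak_fp32_tflops(cuda_cores: int, boost_mhz: float) -> float:
    return cuda_cores * 2 * boost_mhz * 1e6 / 1e12

print(f"RTX 2080 Ti: {peak_fp32_tflops(4352, 1545):.1f} TFLOPS")  # ~13.4
print(f"RTX 3080:    {peak_fp32_tflops(8704, 1710):.1f} TFLOPS")  # ~29.8
[/CODE]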
 
Joined
Dec 29, 2012
Messages
590 (0.21/day)
System Name White Shark
Processor 7700K
Motherboard MSI Z270 gaming pro carbon
Cooling CM 212 EVO
Memory CORSAIR Vengeance LPX 16GB (2 x 8GB) 3000
Video Card(s) MSI RTX 2070 SUPER SAIYAN X TRI over 9000!!!
Storage Samsung 860 EVO 500 GB/ Seagate 1TBx2
Display(s) Asus MG278Q freesync 144hz 1440p
Case Rosewill Orbit Z1
Power Supply Thermaltake toughpower 750 gold
Mouse Logitech Proteus core
Keyboard Corsair K55
Software W10
Intel..... something big is coming on Sept 2...
Nvidia.... hold my beer......
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,337 (0.31/day)
System Name Eluktronics MAX-15 (2020) / AMD Custom Rig
Processor Intel® Core™ i7-9750H Processor (Coffee Lake) / AMD Ryzen 7 3800X
Motherboard Intel HM370 / ASUS TUF GAMING X570-PLUS (WI-FI)
Cooling Eluktronics twin-fan and triple-copper heatpipes cooling / EVGA CLC 280mm AIO Liquid Cooler
Memory Kingston HyperX 16GB (16GBx2) DDR4 2666MHz CL15 / G.SKILL TridentZ 32GB (8GBx4) F4-3200C16-8GTZR
Video Card(s) GeForce RTX 2070 Mobile 115W / Sapphire RX 5700 XT (Reference)
Storage Crucial P1 1TB + Kingston A2000 1TB NVMe / Samsung 970 EVO 250GB + Inland Premium 2TB NVMe
Display(s) LG LP156WFG-SPB2 15.6" 1080p 144 Hz panel / LG 27GL650F-B UltraGear 27" 1080p 144 Hz 1ms
Case TongFang GK5CP7Y chassis / NZXT H510i Matte White
Audio Device(s) Cooler Master MH670 / Kingston HyperX Cloud Flight S
Power Supply Chicony Power Technology A17-230P1A 230W / Corsair RMx Series RM750x 750W
Mouse Eluktronics Luminosa PMW3389 / SteelSeries Sensei Ten
Keyboard Premium Membrane RGB Backlit Keyboard plus 10-key numeric keypad / Cooler Master CK530
Software Windows 10 Home 64-bit 2004 / Windows 10 Pro 64-bit 2004
@W1zzard Without saying yes or no (and thus breaking an NDA or contract), are you currently in possession of any RTX 3080 or 3090 samples from NVIDIA?
 
Joined
Dec 24, 2008
Messages
1,457 (0.34/day)
Location
Volos, Greece
System Name ATLAS
Processor Q6600 QUAD
Motherboard ASUS P5QC
Cooling ProlimaTech Armageddon
Memory HYPER-X KHX1600C8D3T1K2 /4GX PC3-12800 1600MHz
Video Card(s) Sapphire HD 5770 VAPOR-X
Storage WD Raptors 73Gb - Raid1 10.000rpm
Display(s) DELL U2311H
Case HEC Compucase CI-6919 Full tower (2003) moded .. hec-group.com.tw
Audio Device(s) X-Fi Music + mods Audigy front Panel (full working)
Power Supply HIPER 4M780 PE 980W Peak
Mouse MX510
Keyboard Microsoft Digital media 3000
Software Win 7 Pro x64 ( Retail )
The company also wants to democratize 3D animation film-making without making people learn 3D from scratch—the Omniverse Machinima is basically Ansel for moving images, and its possibilities for storytellers are endless. ............

I don't think "democratize" can be used in this phrase. If NVIDIA thinks any regular Joe will be tempted to mimic a 3D-animation engineer just because of the card, that is simply not going to happen.

The 3D animation industry is based on very few talented people with a serious depth of knowledge; it is a sport requiring 100% dedication to the job.
Anyway, I am also interested to see what NVIDIA has cooked up this time in regard to gaming performance. 3D animation film-making is not a solid motive for someone to spend such an amount of cash.

On the other hand, it is the norm for game developers to own or use high-end graphics cards, because those are tools for their work.
Hobbyist gamers do not need the tools of 3D developers.
 
Joined
Sep 17, 2014
Messages
13,279 (6.04/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
The company also wants to democratize 3D animation film-making without making people learn 3D from scratch—the Omniverse Machinima is basically Ansel for moving images, and its possibilities for storytellers are endless. ............

I don't think "democratize" can be used in this phrase. If NVIDIA thinks any regular Joe will be tempted to mimic a 3D-animation engineer just because of the card, that is simply not going to happen.

The 3D animation industry is based on very few talented people with a serious depth of knowledge; it is a sport requiring 100% dedication to the job.
Anyway, I am also interested to see what NVIDIA has cooked up this time in regard to gaming performance. 3D animation film-making is not a solid motive for someone to spend such an amount of cash.

On the other hand, it is the norm for game developers to own or use high-end graphics cards, because those are tools for their work.
Hobbyist gamers do not need the tools of 3D developers.
The industry of 3D animation... you mean like the industry of entertainment, music and video production... modding... coding for the lulz ;)

Professional work is being done by amateurs across the globe. The only difference is that professionals make a good paycheck from it.

We saw with the emergence of streaming and ShadowPlay that suddenly the "Let's Play" was the new guaranteed source of YouTube income. This has only gotten bigger, and movies built in game engines aren't new either; they're really the modders' and 'tubers' territory. And let's face it... there are some pretty big crowds going for that content. What else is entertainment other than something that manages to draw the attention of a lot of people? 3D animation as it is done now is perhaps far more static than having the crowd get creative with it. Whole new formats can emerge.

Even just screenshots - we saw a new form of photography emerge with Nvidia Ansel, as well. People will elevate it to an art form.
 
Joined
Oct 2, 2015
Messages
2,647 (1.46/day)
Location
Argentina
System Name Ciel / Yukino
Processor AMD Ryzen R5 3400G / Intel Core i3 5005U
Motherboard Asus Prime B450M-A / HP 240 G5
Cooling AM3 Wraith + Spire v2 fan / Stock
Memory 2x 8GB Corsair Vengeance LPX DDR4 3200MHz / 2x 4GB Hynix + Kingston DDR3L 1600MHz
Video Card(s) AMD Radeon RX Vega 11 / Intel HD 5500
Storage SSD WD Green 240GB M.2 + HDD Toshiba 2TB / SSD Kingston A400 120GB SATA
Display(s) Samsung S22F350 @ 75Hz/ Integrated 1366x768 @ 94Hz
Case Generic / Stock
Audio Device(s) Realtek ALC892 / Realtek ALC282
Power Supply Sentey XPP 525W / Power Brick
Mouse Logitech G203 / Elan Touchpad
Keyboard Generic / Stock
Software Windows 10 x64
Well, this only shows that Ampere is a better Turing; it doesn't seem to bring anything new to the table besides performance and power improvements. Don't get me wrong, that's amazing, especially if you consider the pricing, but for GPU devs this is quite boring.
 
Joined
Feb 19, 2006
Messages
6,270 (1.18/day)
Location
New York
Processor INTEL CORE I9-9900K @ 5Ghz all core 4.7Ghz Cache @1.305 volts
Motherboard ASUS PRIME Z390-P ATX
Cooling CORSAIR HYDRO H150I PRO RGB 360MM 6x120mm fans push pull
Memory CRUCIAL BALLISTIX 3000Mhz 4x8 32gb @ 4000Mhz
Video Card(s) EVGA GEFORECE RTX 2080 SUPER XC HYBRID GAMING
Storage ADATA XPG SX8200 Pro 1TB 3D NAND NVMe,Intel 660p 1TB m.2 ,1TB WD Blue 3D NAND,500GB WD Blue 3D NAND,
Display(s) 50" Sharp Roku TV 8ms responce time and Philips 75Hz 328E9QJAB 32" curved
Case BLACK LIAN LI O11 DYNAMIC XL FULL-TOWER GAMING CASE,
Power Supply 1600 Watt
Software Windows 10
:cry: I don't think I saw any encoder improvements mentioned. I was hoping to see at least a 20% increase in quality at the same bitrates, or the same quality at even lower ones.
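For what it's worth, one way to put numbers on "quality at the same bitrate" is to encode the same source clip with each encoder at a fixed bitrate and score the results against the original, e.g. with ffmpeg's PSNR filter. A minimal sketch; the file names are hypothetical and ffmpeg must be on the PATH:

[CODE]
# Score an encode against its source with ffmpeg's PSNR filter; ffmpeg
# itself prints the PSNR summary line. File names are hypothetical.
import subprocess

def psnr(distorted: str, reference: str) -> None:
    subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", reference,
         "-lavfi", "psnr", "-f", "null", "-"],
        check=True,
    )

psnr("turing_nvenc_8mbps.mp4", "source.y4m")  # old encoder at 8 Mbps
psnr("ampere_nvenc_8mbps.mp4", "source.y4m")  # new encoder at 8 Mbps
[/CODE]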
 
Joined
Dec 22, 2011
Messages
3,271 (1.02/day)
Yeah really interested to see what people can create with Omniverse Machinima, certainly a fun tool to play around with either way.

Anyway, nice write-up. All those apparent leaks, and the core counts were still wrong.
 
Joined
Apr 28, 2011
Messages
69 (0.02/day)
System Name Venturi
Processor Dual Xeon 8180M 56 cores /112HT
Motherboard Asus c621e Sage
Cooling Air, noctua, heatsinks, noiseblocker fans
Memory ecc red 384GB 2666mhz
Video Card(s) 4x Titan V including Titan V CEO ed 32GB
Storage 960pro 2TB and 1x 850pro SSD RAID
Display(s) Asus ProArt 329 4k
Case TT miniITX P1
Audio Device(s) harmon Kardon speakers
Power Supply 1600 AXi digital (silent)
Mouse Z mouse
Keyboard Corsai K65
Software MS 2019 Data Center Server, Ubuntu
Benchmark Scores Cinebench 18708
So NVIDIA again misleads the masses with 8K DLSS.

So it renders the game at 1440p and then upsamples? DLSS looks blurry at 4K.

I wonder what the performance would be if it was set to high quality, 8K, no optimizations, no DLSS, no screen re-scaling, just a 'pure' mode for the driver.

I tried this at 4K and it proved the hypothesis. Basically, DLSS is ruining the beautiful 4K images of games you could run with 4K textures (you have to make changes and edits in .inf, .cfg and .xml files and know what you're doing), and as such I see DLSS as marketing FUD to prop up abysmal RTX performance, replete with image-quality compromises.

Where are the benchmarks at 4K, max eye candy, no DLSS, RTX on, no motion blur, no depth of field, and all post-processing turned on? Max visuals?

THAT is the real tell-all of performance.
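The 1440p figure checks out if you plug in the per-axis scale factors publicly documented for DLSS 2 (Quality 1.5x, Performance 2.0x, Ultra Performance 3.0x; treated here as assumptions). A quick sketch of the arithmetic:

[CODE]
# Internal render resolution for a given DLSS 2 mode.
# "8K DLSS" really is a 1440p render upscaled 3x per axis.
DLSS_SCALE = {"Quality": 1.5, "Performance": 2.0, "Ultra Performance": 3.0}

def render_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w / s), round(out_h / s)

w, h = render_resolution(7680, 4320, "Ultra Performance")
print(f"8K Ultra Performance renders at {w}x{h}")  # 2560x1440, 1/9 the pixels
[/CODE]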
 

bug

Joined
May 22, 2015
Messages
8,077 (4.14/day)
Processor Intel i5-6600k (AMD Ryzen5 3600 in a box, waiting for a mobo)
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V (@3200)
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
The company also wants to democratize 3D animation film-making without making people learn 3D from scratch—the Omniverse Machinima is basically Ansel for moving images, and its possibilities for storytellers are endless. ............

I don't think "democratize" can be used in this phrase. If NVIDIA thinks any regular Joe will be tempted to mimic a 3D-animation engineer just because of the card, that is simply not going to happen.

The 3D animation industry is based on very few talented people with a serious depth of knowledge; it is a sport requiring 100% dedication to the job.
Anyway, I am also interested to see what NVIDIA has cooked up this time in regard to gaming performance. 3D animation film-making is not a solid motive for someone to spend such an amount of cash.

On the other hand, it is the norm for game developers to own or use high-end graphics cards, because those are tools for their work.
Hobbyist gamers do not need the tools of 3D developers.
Listen to this guy. He's related to those that invented democracy ;)

And I love how Nvidia drew that airflow diagram as if the card is to be used in a CPU-less computer :D

All things considered, Ampere landed about where I predicted it would, which makes me wish I had predicted lower prices. It's still good value for the money (fuggidabout the 3090), but the 3080 at $500 would have been dreamy.
 

Fourstaff

Moderator
Staff member
Joined
Nov 29, 2009
Messages
9,629 (2.44/day)
Location
Home
System Name Orange!
Processor 3570K
Motherboard ASRock z77 Extreme4
Cooling Stock
Memory 2x4Gb 1600Mhz CL9 Corsair XMS3
Video Card(s) Sapphire Nitro+ RX 570
Storage Samsung 840 250Gb + Toshiba DT01ACA300 3Tb
Display(s) LG 22EA53VQ
Case NZXT Phantom 410 Black/Orange
Power Supply Corsair CXM500w
Definitely interested in the reviews. They have clearly spent some time trying to flesh out "what else can we use graphics cards for" in addition to the usual "faster, faster" route.
 

nikoya

New Member
Joined
Jul 8, 2019
Messages
21 (0.05/day)
Exciting times indeed.
Thanks for this synthesis.

I'm thinking about water-cooling the thing. I've never been a fan of doing it for the CPU... after all, it's "only" 100 W. Instead, I'd like to see those 320 W pushed straight out of the case. Could be a new adventure.
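Whether pushing 320 W straight out actually matters can be ball-parked with the usual airflow heat equation, delta-T = P / (m_dot * c_p). A toy estimate; the fan CFM figures below are made up for illustration:

[CODE]
# How much a radiator's exhaust air warms up when dumping a 320 W GPU
# load straight out of the case (steady state, dry air ~1.2 kg/m^3,
# c_p ~1005 J/(kg*K)). CFM values are hypothetical.
CFM_TO_M3S = 0.000471947  # 1 cubic foot per minute in m^3/s

def exhaust_delta_t(power_w, airflow_cfm, rho=1.2, cp=1005.0):
    m_dot = rho * airflow_cfm * CFM_TO_M3S  # kg/s of air through the rad
    return power_w / (m_dot * cp)

for cfm in (50, 100):
    print(f"{cfm} CFM: exhaust ~{exhaust_delta_t(320, cfm):.1f} K above intake")
[/CODE]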
 
Joined
Mar 23, 2012
Messages
528 (0.17/day)
Processor Intel Core i7-5930k @ 4.6 GHz
Motherboard ASUS X99-A
Cooling Corsair H110
Memory 32GB Corsair Vengeance LP DDR4-2666
Video Card(s) 2x Titan X @ 1450/8000
Storage WD Black 4TB + Agility 3 240GB + Seagate 600 240GB
Display(s) Sharp Aquos 4k LC-60UD27U
Case Corsair Obsidian 450D
Audio Device(s) Sound Blaster Z
Power Supply Corsair AX1200
an uneventful, incremental upgrade to Turing, much like Pascal was to Maxwell
This is a really poor comparison that suggests a very short memory. Pascal was a similar performance jump over Maxwell to what Ampere looks to be over Turing. The only difference between the two is that the big chip for Ampere is launching immediately, not months later.

Here's a quick reminder from your own 1080 review:
Incredible performance, large performance jump
Ampere is a return to form for Nvidia. One bad launch (Turing) doesn't make this some sort of unicorn launch.
 
Joined
May 12, 2006
Messages
1,201 (0.23/day)
Location
The Gulag Casino
System Name ROG By purecain
Processor AMD Ryzen 9 3900x @4.2-4.46
Motherboard ASUS Crosshair VIII Hero
Cooling Noctua NH U12A
Memory 32Gb G.Skill Trident RGB 4150@3600mhz
Video Card(s) Titan V HBM2 12GB
Storage 1TbSamsung EvoM.2/1TbSamsung SSD/120gbKingston ssd
Display(s) 55"Sony Bravia X855 10bit 4K+HDR@60fps 2K+HDR@120fps IPS panel
Case Thermaltake CoreX71 Limited Edition Etched Tempered Glass Door
Audio Device(s) On board/NIcomplete audio 6
Power Supply Seasonic FOCUS 1000w 80+
Mouse Logitech MX MASTER
Keyboard Corsair Strafe Keyboard
Software Windows10pro
Benchmark Scores [url=https://valid.x86.fr/gtle1y][img]https://valid.x86.fr/cache/banner/gtle1y-6.png[/img][/url]
There will surely be a 3080ti down the road which almost matches the 3090... This is my prediction!!!! lol :toast:
 

jamexman

New Member
Joined
Nov 19, 2019
Messages
7 (0.02/day)
So NVIDIA again misleads the masses with 8K DLSS.

So it renders the game at 1440p and then upsamples? DLSS looks blurry at 4K.

I wonder what the performance would be if it was set to high quality, 8K, no optimizations, no DLSS, no screen re-scaling, just a 'pure' mode for the driver.

I tried this at 4K and it proved the hypothesis. Basically, DLSS is ruining the beautiful 4K images of games you could run with 4K textures (you have to make changes and edits in .inf, .cfg and .xml files and know what you're doing), and as such I see DLSS as marketing FUD to prop up abysmal RTX performance, replete with image-quality compromises.

Where are the benchmarks at 4K, max eye candy, no DLSS, RTX on, no motion blur, no depth of field, and all post-processing turned on? Max visuals?

THAT is the real tell-all of performance.

What games are you trying DLSS with? If they're older, not-updated games using v1.0, it is indeed a blurry mess, but if they're using DLSS 2.0 it's amazing, even better than native 4K in some cases and not blurry at all (refer to the many Digital Foundry videos).
 
Joined
Nov 1, 2017
Messages
433 (0.41/day)
System Name Macbook Air Late-2013
Processor Intel Core i5 1.3Ghz
Memory 8GB 1600Mhz DDR3
Video Card(s) Intel HD Graphics 5000 1536MB
Storage 256GB SSD
Display(s) 13.3" 1440x900
I just realised the benchmarks have been made with an Intel i9, which doesn't support PCIe 4.0.

I wonder how the performance will scale with PCIe 4.0 and Ryzen, and whether it would make a difference.
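The raw link bandwidth at stake is easy to compute: Gen3 and Gen4 both use 128b/130b encoding, and Gen4 simply doubles the signalling rate. A quick sketch:

[CODE]
# Theoretical PCIe link bandwidth from the transfer rate (GT/s),
# the 128b/130b encoding overhead, and the lane count.
def pcie_gbps(gt_per_s, lanes=16):
    return gt_per_s * (128 / 130) / 8 * lanes  # GB/s for the full link

print(f"PCIe 3.0 x16: {pcie_gbps(8):.2f} GB/s")   # ~15.75
print(f"PCIe 4.0 x16: {pcie_gbps(16):.2f} GB/s")  # ~31.51
[/CODE]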
 
Joined
Jul 18, 2017
Messages
423 (0.36/day)
I just realised the benchmarks have been made with an Intel i9, which doesn't support PCIe 4.0.

I wonder how the performance will scale with PCIe 4.0 and Ryzen, and whether it would make a difference.
I'm sure NVIDIA tested both but still chose to use Intel despite it lacking PCIe 4.0. You can't beat Intel's overwhelming gaming headroom now that Ampere is delivering >120 FPS at 1440p.

Yes, that's right: if you own Zen 2 or slower, get ready to upgrade your CPU AGAIN.
 
Joined
May 17, 2016
Messages
196 (0.12/day)
System Name DUX
Processor Ryzen 5 3600
Motherboard MSI X570 Gaming Plus
Cooling Fortron Windale 6 blue LED
Memory Crucial Balistix Sport 3800MHz CL16 OC
Video Card(s) HD 4850 512MB GDDR3
Storage ADATA 512GB M.2
Case Zalman Z1 NEO
Audio Device(s) Kingston HyperX Cloud II
Power Supply Corsair TX850 Gold
There will surely be a 3080ti down the road which almost matches the 3090... This is my prediction!!!! lol :toast:
Well, of course there will be a 3080 Ti; there's a massive price gap between the 3080 and the 3090. It will probably be revealed by the end of the year, for $899-$999.
 
Joined
Dec 3, 2012
Messages
431 (0.15/day)
Processor Intel i9 9900K @5Ghz 1.32vlts
Motherboard Gigabyte Z390 Aorus Pro Wi-Fi
Cooling BeQuiet Dark Rock 4
Memory 32GB Corsair Vengeance Pro DDR4 3200Mhz (16-18-18-36)
Video Card(s) EVGA RTX 2080
Storage 512GB Gigabyte Aorus NVMe (Boot) 1Tb Crucial NVMe (Games)
Display(s) LG UK850 27in 4K Freesync/G-Sync/HDR 600
Case Fractal Design Meshify C Windowed (Dark Tint)
Audio Device(s) Corsair HS70 Special Edition Wireless Headphones & 7.1 Sound
Power Supply Corsair RMx 850w Gold
Mouse Corsair Glaive RGB
Keyboard HyperX Alloy Elite Mechanical RGB (Cherry Red)
Software Windows 10 Home
I'd like to know how NVIDIA got 40+ FPS in Control at 4K with RTX settings on a 2080, because I can tell you now that with a setup matching their test rig (9900K & 32 GB RAM), Control runs at less than 20 FPS most of the time at 4K with RTX settings, often crashing down to below 10 FPS in some areas. I didn't see 'DLSS On' anywhere in the slide they used showing the 3080 getting 80+ FPS under the same settings.
 

wolf32v

New Member
Joined
Aug 10, 2020
Messages
4 (0.09/day)
When are the actual reviews coming out and these fluff articles ending? I want to see it in action.
 
Joined
Apr 12, 2013
Messages
3,563 (1.31/day)
Well, of course there will be a 3080 Ti; there's a massive price gap between the 3080 and the 3090. It will probably be revealed by the end of the year, for $899-$999.
They could very well fit two cards in there, the 3080 Super and possibly the 3080 Ti, unless NVIDIA's planning to drop it this gen.
 
Joined
Dec 26, 2016
Messages
98 (0.07/day)
Processor Ryzen 3900x
Motherboard Aorus x570 ITX
Cooling NF-A15 on Macho Rev. B
Memory 32GB 3200MHz CL16
Video Card(s) EVGA GTX1060 6GB
Storage 2x 970 EVO Plus
Case Fractal Core 500
Power Supply Straight Power 11
Mouse G603
Keyboard PureWriter blue
Software yes, lots!
I'm sure NVIDIA tested both but still chose to use Intel despite it lacking PCIe 4.0. You can't beat Intel's overwhelming gaming headroom now that Ampere is delivering >120 FPS at 1440p.

Yes, that's right: if you own Zen 2 or slower, get ready to upgrade your CPU AGAIN.
By overwhelming gaming headroom you mean the 2.5 FPS that the 10000-series Core i is faster at 4K?

Or the 5-15 FPS that it's faster at 720p?

Because I tell you, these GPUs were not made for playing at 720p. Tests at low resolutions are nice for finding out which CPU would be fastest if the CPU were the bottleneck in gaming. But honestly, who plays at 720p? By now even the haters should have gotten that it does not matter whether you play on Intel or AMD. Basically, Intel is for people who want the absolute maximum in gaming, and AMD is for people who want to spend a little less and get a decent gaming CPU that is an even better CPU for most working tasks.

I had Intel for more than a decade (C2D E6300, C2Q Q6600, i5-4690K) and was very happy with Core 2 and Core i, and now I am very happy with my 3900X. And I am pretty sure it will not be the limiting factor when getting a 3080 and playing at 1440p+...
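A toy model shows why low-resolution tests isolate the CPU: per frame you pay roughly max(CPU time, GPU time), and only the GPU part scales with pixel count. A sketch with made-up millisecond figures:

[CODE]
# Toy bottleneck model: frame time = max(cpu_ms, gpu_ms), where GPU
# load is taken as proportional to pixels. All timings are invented
# purely for illustration.
def fps(cpu_ms, gpu_ms_at_4k, pixel_fraction):
    gpu_ms = gpu_ms_at_4k * pixel_fraction
    return 1000.0 / max(cpu_ms, gpu_ms)

for res, frac in (("4K", 1.0), ("1440p", 0.44), ("720p", 0.11)):
    fast = fps(cpu_ms=6.0, gpu_ms_at_4k=12.0, pixel_fraction=frac)
    slow = fps(cpu_ms=7.0, gpu_ms_at_4k=12.0, pixel_fraction=frac)
    print(f"{res}: faster CPU {fast:.0f} fps vs slower CPU {slow:.0f} fps")
# At 4K both CPUs give the same fps (GPU-bound); the gap only appears
# once the resolution drops and the CPU becomes the limiter.
[/CODE]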
 