
NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

Joined
Feb 20, 2019
Messages
7,526 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
VRAM is one of the most expensive parts of the GPU right now and, so far, the least important when it comes to performance. You can increase the VRAM, but then you'll have to increase the price considerably - just look at the 3090... I mean, these cards are for today. I doubt anybody can do anything about VRAM prices. Game devs really need to (and probably will) adapt to this reality, unless these prices change. I think you can find flaws with anything, if you want.
PS5 devs have specifically talked about targeting 12GB as the dynamic VRAM allocation of next-gen titles, something made possible without silly loading times by the new hybrid storage system PS5 has.

10GB cards will be inadequate, soon, I think - and it has already been mentioned in this thread that HZD on PC requires over 8GB. The 3070 is incapable of max settings on games that existed before it's even released!

The next gen consoles will have a huge impact on game developers, because the hardware is so close to PC hardware this time around. Expect every dev to build their engines for the consoles first and the PCMR will get ports.

I was expecting the 3080 to be 16GB and 3070 to be 12GB, to be honest.....
 
Joined
Mar 10, 2010
Messages
11,878 (2.28/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
PS5 devs have specifically talked about targeting 12GB as the dynamic VRAM allocation of next-gen titles, something made possible without silly loading times by the new hybrid storage system PS5 has.

10GB cards will be inadequate, soon, I think - and it has already been mentioned in this thread that HZD on PC requires over 8GB. The 3070 is incapable of max settings on games that existed before it's even released!

The next gen consoles will have a huge impact on game developers, because the hardware is so close to PC hardware this time around. Expect every dev to build their engines for the consoles first and the PCMR will get ports.
I see the memory allocation as Nvidia's way of keeping a GPU release a few months to a year from now relevant. They're already pushing the power envelope, so silicon and process optimization won't net much of a gain; more, higher-speed VRAM will be what sells cards in a year.
 
Joined
Dec 22, 2011
Messages
3,890 (0.85/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Well the 3070 is only 220W, that's less than the 5700XT, and that has a feature set that is well.... Lacking.
 
Joined
May 2, 2017
Messages
7,762 (2.99/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
PS5 devs have specifically talked about targeting 12GB as the dynamic VRAM allocation of next-gen titles, something made possible without silly loading times by the new hybrid storage system PS5 has.

10GB cards will be inadequate, soon, I think - and it has already been mentioned in this thread that HZD on PC requires over 8GB. The 3070 is incapable of max settings on games that existed before it's even released!

The next gen consoles will have a huge impact on game developers, because the hardware is so close to PC hardware this time around. Expect every dev to build their engines for the consoles first and the PCMR will get ports.

I was expecting the 3080 to be 16GB and 3070 to be 12GB, to be honest.....
The PS5 likely has the same split between the OS and software as the XSX, reserving 2.5GB for the system and leaving 13.5GB for software. This of course has to serve as both RAM and VRAM for the software, so games exceeding 10GB in VRAM alone are quite unlikely. Of course the PC typically supports higher detail levels, leading to higher VRAM usage, but new texture streaming techniques (and especially DirectStorage) are likely to dramatically reduce the amount of "let's keep it in VRAM in case we need it" data, which is the majority of current VRAM usage on both PCs and consoles. If developers start designing with NVMe as a baseline, VRAM utilization can drop very noticeably from this alone. Current games pre-load data based on HDD transfer rates and seek times, meaning data is loaded very aggressively and early, with the majority of it being flushed without ever seeing use.
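For anyone who wants the arithmetic spelled out, here is a trivial sketch of that budget. The 3.5 GB CPU-side share is purely an illustrative assumption, not a published figure:

```python
# Rough console memory-budget arithmetic, following the post above.
# The CPU-side share is a made-up illustrative number, not a spec.

TOTAL_UNIFIED_GB = 16.0   # shared GDDR6 pool on PS5 / Xbox Series X
OS_RESERVE_GB = 2.5       # XSX figure cited above; PS5 assumed similar
CPU_SIDE_GB = 3.5         # hypothetical share for game logic, audio, AI, etc.

game_budget = TOTAL_UNIFIED_GB - OS_RESERVE_GB   # memory left for the whole game
gpu_side_budget = game_budget - CPU_SIDE_GB      # what's left for "VRAM"-style data

print(f"Game budget: {game_budget:.1f} GB")                    # 13.5 GB
print(f"Plausible GPU-side budget: {gpu_side_budget:.1f} GB")  # 10.0 GB
```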
 
Joined
Jan 5, 2006
Messages
18,142 (2.69/day)
System Name AlderLake / Laptop
Processor Intel i7 12700K P-Cores @ 5Ghz / Intel i3 7100U
Motherboard Gigabyte Z690 Aorus Master / HP 83A3 (U3E1)
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans / Fan
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36 / 8GB DDR4 HyperX CL13
Video Card(s) MSI RTX 2070 Super Gaming X Trio / Intel HD620
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2 / Samsung 256GB M.2 SSD
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p / 14" 1080p IPS Glossy
Case Be quiet! Silent Base 600 - Window / HP Pavilion
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W / Powerbrick
Mouse Logitech MX Anywhere 2 Laser wireless / Logitech M330 wireless
Keyboard RAPOO E9270P Black 5GHz wireless / HP backlit
Software Windows 11 / Windows 10
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
Here it is, the GeForce RTX 3080, 10 GB GDDR6X, running at 19 Gbps, 238 tensor TFLOPs, 58 RT TFLOPs, 18 power phases.

Hmm, according to Tom's it's 68 RT and 272 tensor

[attached screenshot of the spec table]


 
Joined
Jun 25, 2014
Messages
158 (0.04/day)
System Name Ryzen shine, Mr Freeman
Processor 5900X
Motherboard ASUS X570 Dark Hero
Cooling Arctic Liquid Freezer II 360 ARGB
Memory 32GB TridentZ Neo 3600 CL14
Video Card(s) 3080TI FE with Alphacool Eiswolf AIO
Storage 2TB 970 EVO PLUS, 1TB 980
Display(s) LG OLED 55CX
Case O11D XL Black
Audio Device(s) Xonar Essence STU, Mackie MR5+MR10S, HD598
Power Supply Seasonic Prime Titanium 850W
Mouse GPW
Keyboard G815
Does anyone know how and when we can preorder FE cards?
 
Joined
Feb 20, 2019
Messages
7,526 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The PS5 likely has the same split between the OS and software as the XSX, reserving 2.5GB for the system and leaving 13.5GB for software. This of course has to serve as both RAM and VRAM for the software, so games exceeding 10GB in VRAM alone are quite unlikely. Of course the PC typically supports higher detail levels, leading to higher VRAM usage, but new texture streaming techniques (and especially DirectStorage) are likely to dramatically reduce the amount of "let's keep it in VRAM in case we need it" data, which is the majority of current VRAM usage on both PCs and consoles. If developers start designing with NVMe as a baseline, VRAM utilization can drop very noticeably from this alone. Current games pre-load data based on HDD transfer rates and seek times, meaning data is loaded very aggressively and early, with the majority of it being flushed without ever seeing use.
If they go down that route then we will all need NVMe storage for our games libraries. More likely is that the devs can't assume people have 3GB/s library drives and will opt to continue using GPU VRAM as storage.

This is one instance where I'd like to be wrong but the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator to get the largest customer base possible.
 
Joined
May 28, 2020
Messages
752 (0.51/day)
System Name Main PC
Processor AMD Ryzen 9 5950X
Motherboard ASUS X570 Crosshair VIII Hero (Wi-Fi)
Cooling EKWB X570 VIII Hero Monoblock, 2x XD5, Heatkiller IV SB block for chipset,Alphacool 3090 Strix block
Memory 4x16GB 3200-14-14-14-34 G.Skill Trident RGB (OC: 3600-14-14-14-28)
Video Card(s) ASUS RTX 3090 Strix OC
Storage 500GB+500GB SSD RAID0, Fusion IoDrive2 1.2TB, Huawei HSSD 2TB, 11TB on server used for steam
Display(s) Dell LG CX48 (custom res: 3840x1620@120Hz) + Acer XB271HU 2560x1440@144Hz
Case Corsair 1000D
Audio Device(s) Sennheiser HD599, Blue Yeti
Power Supply Corsair RM1000i
Mouse Logitech G502 Lightspeed
Keyboard Corsair Strafe RGB MK2
Software Windows 10 Pro 20H2
Does anyone know how and when we can preorder FE cards?
An NVIDIA employee stated in a Reddit Q&A yesterday that there will be no preorders; the cards just go on sale on the 17th/24th/in October.
 
Joined
Feb 20, 2019
Messages
7,526 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Does anyone know how and when we can preorder FE cards?
You can't, but you can sign up for when orders go live. In my experience with Pascal and Turing launches they are out of stock before the email arrives, so it's not much use.

https://www.nvidia.com/en-gb/geforce/buy/ (regional - you'll need to change en-gb to your country code).

Also, does anyone want to buy a 2080Ti for more than the cost of a 3080 and 3070 combined? Nvidia has you covered!

[attached screenshot of the 2080 Ti store listing]
 
Joined
May 2, 2017
Messages
7,762 (2.99/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Hmm, according to Tom's it's 68 RT and 272 tensor

View attachment 167503

Tom's is wrong. That is closer to the 3090's specs, though not quite.

If they go down that route then we will all need NVMe storage for our games libraries. More likely is that the devs can't assume people have 3GB/s library drives and will opt to continue using GPU VRAM as storage.

This is one instance where I'd like to be wrong but the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator to get the largest customer base possible.
Given that DirectStorage is on the XSX, the PS5 uses a similar system, and most high-budget games are developed for consoles (too), I would be very surprised if this didn't happen. I guess they might make some sort of legacy mode, though it would be far less effort for developers to aim for console specs as a minimum. Though to be frank, even aiming for SATA SSDs as a baseline would largely fix this, as seek times matter more for this than raw transfer rates.
 
Joined
Feb 20, 2019
Messages
7,526 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Though to be frank, even aiming for SATA SSDs as a baseline would largely fix this, as seek times matter more for this than raw transfer rates.
Yeah, like I said, it'd be nice if I'm wrong this time.
I don't fancy replacing my 2.5TB of library drives with NVMe.
 
Joined
Jan 5, 2006
Messages
18,142 (2.69/day)
System Name AlderLake / Laptop
Processor Intel i7 12700K P-Cores @ 5Ghz / Intel i3 7100U
Motherboard Gigabyte Z690 Aorus Master / HP 83A3 (U3E1)
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans / Fan
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36 / 8GB DDR4 HyperX CL13
Video Card(s) MSI RTX 2070 Super Gaming X Trio / Intel HD620
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2 / Samsung 256GB M.2 SSD
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p / 14" 1080p IPS Glossy
Case Be quiet! Silent Base 600 - Window / HP Pavilion
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W / Powerbrick
Mouse Logitech MX Anywhere 2 Laser wireless / Logitech M330 wireless
Keyboard RAPOO E9270P Black 5GHz wireless / HP backlit
Software Windows 11 / Windows 10
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
You can't, but you can sign up for when orders go live. In my experience with Pascal and Turing launches they are out of stock before the email arrives, so it's not much use.

https://www.nvidia.com/en-gb/geforce/buy/ (regional - you'll need to change en-gb to your country code).

Also, does anyone want to buy a 2080Ti for more than the cost of a 3080 and 3070 combined? Nvidia has you covered!

View attachment 167505

Did you check this?

o_O:roll:
 
Joined
May 15, 2020
Messages
697 (0.47/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
If they go down that route then we will all need NVMe storage for our games libraries. More likely is that the devs can't assume people have 3GB/s library drives and will opt to continue using GPU VRAM as storage.

This is one instance where I'd like to be wrong but the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator to get the largest customer base possible.
Shadowlands already lists an SSD among its minimum requirements. But from that to 3GB/s is another step up; I would imagine they would rather ask for more RAM/VRAM.
 
Joined
May 2, 2017
Messages
7,762 (2.99/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Shadowlands already lists an SSD among its minimum requirements. But from that to 3GB/s is another step up; I would imagine they would rather ask for more RAM/VRAM.
But if the only option for more VRAM is a $1499 GPU... then buying a $100 or $200 SSD is far easier, no?
Yeah, like I said, it'd be nice if I'm wrong this time.
I don't fancy replacing my 2.5TB of library drives with NVMe.
It really wouldn't be hard to make a flexible solution for this - just have game platforms (GOG, Steam, Epic, etc.) identify what types of storage you have in your system (Windows already does this for SSDs and HDDs at a system level, and it should be trivial to differentiate between SATA and NVMe too), add a tag to games requiring fast storage so the launcher knows, and allow the platform to shuffle games between drives as needed (obviously with user configuration options, like always keeping certain games on storage type X or Y, etc.).
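Just to show the detection step isn't rocket science, here's a minimal sketch of the idea in Python on Linux, where the kernel already exposes the relevant info under /sys/block (on Windows a launcher would query the storage APIs instead); the tier names and thresholds are just placeholders:

```python
# Minimal sketch: bucket each physical drive into a storage tier a launcher
# could match against a per-game "fast storage required" tag.
from pathlib import Path

def classify_block_devices():
    tiers = {}
    for dev in Path("/sys/block").iterdir():
        name = dev.name
        if name.startswith(("loop", "ram", "zram")):
            continue  # skip virtual block devices
        rotational = (dev / "queue" / "rotational").read_text().strip() == "1"
        if name.startswith("nvme"):
            tiers[name] = "fast"    # NVMe: OK for titles tagged as needing fast streaming
        elif not rotational:
            tiers[name] = "medium"  # SATA/USB SSD: fine for most games
        else:
            tiers[name] = "slow"    # spinning disk: keep streaming-heavy games off it
    return tiers

if __name__ == "__main__":
    for device, tier in sorted(classify_block_devices().items()):
        print(f"{device}: {tier}")
```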
 
Joined
Nov 5, 2014
Messages
714 (0.20/day)
:O

Not sure who I should give thanks to... NVIDIA for the effort or AMD for the compo.
Intel.

This price/performance combo is a straight-up attempt to knock AMD out of the high-end GPU market - something Nvidia hasn't been able to try previously, as the monopolies commission would have come calling.

If Nvidia can KO AMD hard enough, they will end up in a two-horse race with Intel, who will be in AMD's old spot of having the second-best CPUs and GPUs.
 
Joined
May 15, 2020
Messages
697 (0.47/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
But if the only option for more VRAM is a $1499 GPU... then buying a $100 or $200 SSD is far easier, no?
There'll be much more affordable cards (RDNA2 and then Nvidia) with more than 10GB in just a few months, I'm sure.

If Nvidia can KO AMD hard enough, they will end up in a two-horse race with Intel, who will be in AMD's old spot of having the second-best CPUs and GPUs.
I confess that the 2 shader operation trick reminds me of exactly what they pulled 20 years ago with the GeForce 2 GTS (the T stands for texel - it applied 2 textures per pixel per clock cycle) to eliminate 3DFX, and it worked just fine back then: with so much brute force, 3DFX was lost and soon disappeared.
 
Joined
Jul 19, 2016
Messages
477 (0.17/day)
I've always been a little bit puzzled by the supposed 'price gouging' Nvidia is doing. Yes, they're leading and command a bit of a premium. But there's almost always something on offer for that premium. And then there's always a bunch of GPUs below it that do get some sort of advancement in perf/dollar and absolute performance.

I mean... the 970 was super competitive on price too. The 660 Ti was the same back during Kepler, and the 670 was seen as the 'poor man's 680' but performed virtually the same. The 1070 dropped the 980 Ti price point down by a few hundred... and it's happening again with the x70 today. The price of an x70 has risen... but so has the feature set and the performance gap to the bottom end.

Even with the mining craze the midrange was populated and the price, while inflated, was not quite as volatile as others.



'The' leaks? The 12-pin was the only truly accurate one, man (alright, and the pictures, then). Nvidia played this well; you can rest assured everything we got was carefully orchestrated. And that includes the teasing of a 12-pin. Marketing gets a head start with these leaks - we also heard $1400-2000 worth of GPU, which obviously makes the announcement of the actual pricing even stronger.

Come on, Red Gaming Tech and MLID were correct with:

- them using Samsung's inferior 8nm process node
- the cards drawing huge power as a result - 320 and 380W is not normal. One even claimed the exact power draw, which was on the money
- the 3080 being what they are pushing hard, as its performance is much, much closer to the 3090 than the price suggests. This is to combat Navi
- performance numbers and their relative gaps all being spot on

So we knew a load about this release, and Nvidia were definitely more leaky here than with Turing or Pascal.
 
Joined
May 15, 2020
Messages
697 (0.47/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
So we knew a load about this release, and Nvidia were definitely more leaky here than with Turing or Pascal.
Tom from MLID said most of his RDNA2 info was coming from Nvidia sources too :kookoo:
 
Joined
Jul 9, 2015
Messages
3,413 (1.05/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Well the 3070 is only 220W, that's ...
That contradicts Huang's statements about 1.9x better perf/W (taking the 2080 Ti as a 270W card, the 3070 should have been ~145W).
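The arithmetic behind that objection, using the same figures (1.9x is NVIDIA's marketing claim, 270 W is the 2080 Ti number used above); the usual caveat is that such perf/W claims tend to be measured at a fixed performance point on the voltage/frequency curve rather than at each card's full power limit:

```python
# Back-of-envelope check of the perf/W claim, using the figures from the post above.
claimed_perf_per_watt_gain = 1.9   # NVIDIA's Ampere-vs-Turing marketing figure
rtx_2080_ti_power_w = 270          # 2080 Ti power figure used in the post

# If the 3070 merely matches 2080 Ti performance, 1.9x the efficiency for the
# same work would imply roughly:
implied_3070_power_w = rtx_2080_ti_power_w / claimed_perf_per_watt_gain
print(f"Implied 3070 power: ~{implied_3070_power_w:.0f} W")  # ~142 W, vs the announced 220 W
```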
 
Joined
Sep 17, 2014
Messages
21,325 (5.99/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Come on, Red Gaming Tech and MLID were correct with:

- them using Samsung's inferior 8nm process node
- the cards drawing huge power as a result - 320 and 380W is not normal. One even claimed the exact power draw, which was on the money
- the 3080 being what they are pushing hard, as its performance is much, much closer to the 3090 than the price suggests. This is to combat Navi
- performance numbers and their relative gaps all being spot on

So we knew a load about this release, and Nvidia were definitely more leaky here than with Turing or Pascal.

Quite true in fact, yeah. Still though, I'm pretty sure this was orchestrated leaking. The timing, the content... how do you sell 320W? By letting us ease into it... and then bringing a favorable price point.

These 'tubers are just free or nearly free marketing tools.
 
Joined
Jul 9, 2015
Messages
3,413 (1.05/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Soo, talking about transistor density, TSMC 7nm DUV vs Samsung 8nm:

5700 XT: 250mm2, 10.3 billion => 41 million transistors per mm2
3080: 627mm2, 28 billion => 44.6 million transistors per mm2

Remind me, who was saying that Samsung 8nm is faux and just a marketing name for 10nm?
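For reference, the quoted density figures work out as follows (same numbers as above, just computed):

```python
# Transistor density arithmetic from the post above.
chips = {
    # name: (die area in mm^2, transistor count in billions)
    "Navi 10 (5700 XT, TSMC 7nm DUV)": (250, 10.3),
    "GA102 (RTX 3080, Samsung 8nm)":   (627, 28.0),
}
for name, (area_mm2, transistors_bn) in chips.items():
    density = transistors_bn * 1000 / area_mm2  # million transistors per mm^2
    print(f"{name}: {density:.1f} MTr/mm^2")
# -> ~41.2 and ~44.7 MTr/mm^2 respectively
```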
 
Joined
Nov 11, 2016
Messages
3,147 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Comparing the RTX 3070 to the 2080 Super, which are more similar in specs (8GB on a 256-bit bus, 16Gbps VRAM), suggests that Ampere is around 35-40% more efficient than Turing.
I guess we lose a bit of efficiency going with Samsung 8N, but the lower prices justify all that.
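A rough reconstruction of where that 35-40% lands, assuming the 3070 ends up roughly 20-25% ahead of a 2080 Super (i.e. around 2080 Ti level, as NVIDIA claims) and taking the 2080 Super's 250 W board power; both numbers are assumptions until reviews land:

```python
# Rough reconstruction of the ~35-40% efficiency-gain estimate.
rtx_2080_super_w = 250   # Turing comparison point
rtx_3070_w = 220         # announced 3070 board power

for perf_ratio in (1.20, 1.25):  # assumed 3070 vs 2080 Super performance ratio
    perf_per_watt_gain = perf_ratio / (rtx_3070_w / rtx_2080_super_w)
    print(f"perf ratio {perf_ratio:.2f} -> {100 * (perf_per_watt_gain - 1):.0f}% better perf/W")
# -> about 36% and 42%
```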
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.38/day)
Comparing the RTX 3070 to the 2080 Super, which are more similar in specs (8GB on a 256-bit bus, 16Gbps VRAM), suggests that Ampere is around 35-40% more efficient than Turing.
I guess we lose a bit of efficiency going with Samsung 8N, but the lower prices justify all that.

More like the 3070 is similar to the 2070: ~450mm2, 256-bit. So Nvidia managed to squeeze in 6144 CUDA cores, 2.66x more compared to 2304, while bumping the memory speed only to 16. And the average of 1.14 and 2.66 is 1.9x, so it is 90% more efficient on average; of course, where pure computational power comes into play it is 2.66x - full die to full die, TU106 vs GA104 (3070 Ti).
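Spelling out where those two ratios come from, using the figures quoted above (the 1.14x is just the 16 vs 14 Gbps memory-speed ratio); whether a straight average of compute and bandwidth scaling is a meaningful efficiency metric is of course debatable:

```python
# Where the 2.66x and 1.14x figures come from, and their average.
full_tu106_cuda = 2304   # TU106 full die (RTX 2070)
full_ga104_cuda = 6144   # GA104 full die (RTX 3070 Ti), as quoted above
turing_mem_gbps = 14     # RTX 2070 memory speed
ampere_mem_gbps = 16     # memory speed quoted in the posts above

compute_ratio = full_ga104_cuda / full_tu106_cuda    # ~2.67x
bandwidth_ratio = ampere_mem_gbps / turing_mem_gbps  # ~1.14x
average = (compute_ratio + bandwidth_ratio) / 2
print(f"compute {compute_ratio:.2f}x, bandwidth {bandwidth_ratio:.2f}x, average {average:.2f}x")
# -> compute 2.67x, bandwidth 1.14x, average 1.90x
```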
 
Joined
Jun 10, 2014
Messages
2,907 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
You trust (and misread) the Steam hardware survey too much.
For actual sales, check reports from actual shops, e.g. Mindfactory.
Steam has a major market share among PC gamers, and is way more representative than a single shop. The only thing missing from the Steam hardware survey is people who buy graphics cards and don't game.

PS5 devs have specifically talked about targeting 12GB as the dynamic VRAM allocation of next-gen titles, something made possible without silly loading times by the new hybrid storage system PS5 has.

10GB cards will be inadequate, soon, I think - and it has already been mentioned in this thread that HZD on PC requires over 8GB. The 3070 is incapable of max settings on games that existed before it's even released!
Dynamic, as in the game is able to decide how much is system RAM and how much is VRAM.

By the time 8 GB is inadequate for gaming, the performance of the RTX 3070 will be too, and you will be buying an "RTX 6070"…

The next gen consoles will have a huge impact on game developers, because the hardware is so close to PC hardware this time around. Expect every dev to build their engines for the consoles first and the PCMR will get ports.
I think you are putting too much faith in game developers. Most of them just take an off-the-shelf game engine, load in some assets, do some scripting and call it a game. Most game studios don't write a single line of low-level engine code, and the extent of their "optimizations" is limited to adjusting assets to reach a desired frame rate.

If they go down that route then we will all need NVMe storage for our games libraries. More likely is that the devs can't assume people have 3GB/s library drives and will opt to continue using GPU VRAM as storage.
Not really. The difference between a "standard" 500 MB/s SSD and a 3 GB/s SSD will be loading times. For resource streaming, 500 MB/s is plenty.
Also, don't forget that these "cheap" NVMe QLC SSDs can't deliver 3 GB/s sustained, so if a game truly depended on this, you would need an SLC SSD or Optane.

This is one instance where I'd like to be wrong but the last 25 years of PC gaming has proven that devs always cater to the lowest common denominator to get the largest customer base possible.
Games in general aren't particularly good at utilizing the hardware we currently have, and the trend in game development has clearly been toward less performance optimization, so what makes you think this will change all of a sudden?
 