
Rumor: NVIDIA's Next Generation GeForce RTX 3080 and RTX 3070 "Ampere" Graphics Cards Detailed

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.43/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
HBM is by far the most powerful and efficient graphics memory out there right now. AMD has just been incapable of producing a good GPU for some time now.
For practical, consumer and gaming use, I see GDDR6 being used on GA102, 103, 104. GDDR6 has proven itself. HBM not so much. I’m with @EarthDog on this.
 
D

Deleted member 177333

Guest
Why? Dual card systems are basically dead these days. Yes, there is still support for SLI, but it's at the point where it's unneeded and troublesome.

I've had pretty good luck with multi gpu. I only have maybe 2-3 games that don't utilize it. I play exclusively at 4k so for me, multi gpu is pretty much a necessity if I want games to have nice steady framerates and be able to crank up the image quality settings.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
For practical, consumer and gaming use, I see GDDR6 being used on GA102, 103, 104. GDDR6 has proven itself. HBM not so much. I’m with @EarthDog on this.
This... still waiting to see its benefits at this (consumer) level. I get the super-high bandwidth, but clearly it isn't needed, and it seems (someone correct me on this if needed) that GDDR6 is cheaper to produce anyway? So... while it can be beneficial for high-bandwidth applications, we aren't seeing it needed for gaming or general consumer use.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
The people that are claiming that Ampere will only be a trivial increase in performance aren't paying attention to Nvidia's track record for performance increases with each successive generation.

Average performance increase over previous generation:

RTX 2080 Ti over GTX 1080 Ti 33% (low due to the introduction of RT Cores and Tensor cores on the 2080 Ti instead of just a lot more CUDA cores)
GTX 1080 Ti over GTX 980 Ti 75%
GTX 980 Ti over GTX 780 Ti 41%
GTX 780 Ti over GTX 580 204%
GTX 580 over GTX 285 70%
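
Compounding those rounded multipliers gives a sense of the cumulative jump; a quick sketch using only the percentages listed above:

```python
# Compound the generational gains listed above:
# a gain of n% over the previous flagship is a factor of (1 + n/100).
gains = [
    ("GTX 580 over GTX 285", 0.70),
    ("GTX 780 Ti over GTX 580", 2.04),
    ("GTX 980 Ti over GTX 780 Ti", 0.41),
    ("GTX 1080 Ti over GTX 980 Ti", 0.75),
    ("RTX 2080 Ti over GTX 1080 Ti", 0.33),
]

total = 1.0
for label, gain in gains:
    total *= 1.0 + gain
    print(f"{label}: x{1.0 + gain:.2f} (cumulative x{total:.1f})")

# Roughly a 17x jump from the GTX 285 to the RTX 2080 Ti by these numbers.
```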

[Attached: performance comparison charts]
 
Joined
Oct 31, 2013
Messages
186 (0.05/day)
320-bit memory bus. I really hope we get more details on how that bus is connected to which memory controllers. I just want to rule out a mess like the GTX 970 was.
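
For reference, peak theoretical bandwidth is just bus width times per-pin data rate; whether you actually get it depends on exactly that bus-to-controller wiring. A minimal sketch (the 14 Gbps GDDR6 speed is an assumption, not from the leak):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Rumored 320-bit bus, assuming 14 Gbps GDDR6 (the speed is a guess, not from the leak):
print(peak_bandwidth_gbs(320, 14.0))  # 560.0 GB/s

# Why the GTX 970 was a mess: 256-bit on paper, but only 3.5 GB sat on the full-speed
# partition (effectively 224-bit); the last 0.5 GB hung off a slow 32-bit path.
print(peak_bandwidth_gbs(224, 7.0))   # 196.0 GB/s for the fast 3.5 GB
print(peak_bandwidth_gbs(32, 7.0))    #  28.0 GB/s for the slow 0.5 GB
```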

@64K
You skipped a few generations. The GTX 480 and GTX 680 are missing. ;)
 
Joined
Aug 6, 2017
Messages
7,412 (3.05/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
RTX 2080 Ti over GTX 1080 Ti 33% (low due to the introduction of RT Cores and Tensor cores on the 2080 Ti instead of just a lot more CUDA cores)
I think it's a scaling problem
 
Joined
Jul 18, 2017
Messages
575 (0.24/day)
The people that are claiming that Ampere will only be a trivial increase in performance aren't paying attention to Nvidia's track record for performance increases with each successive generation.

Average performance increase over previous generation:

RTX 2080 Ti over GTX 1080 Ti 33% (low due to the introduction of RT Cores and Tensor cores on the 2080 Ti instead of just a lot more CUDA cores)
GTX 1080 Ti over GTX 980 Ti 75%
GTX 980 Ti over GTX 780 Ti 41%
GTX 780 Ti over GTX 580 204%
GTX 580 over GTX 285 70%
980 Ti to 1080 Ti is pretty much a good example of what to expect from Ampere. New node (7nm EUV this time, not just 7nm!) and new arch + not wanting Intel to ever catch up to them = yuuge increase!
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Sure there's plenty of research, but AMD can currently only compete with Nvidia at the low-to-mid tier. They are of course planning a high-end GPU, but how that performs remains to be seen. We can talk all day about the current die sizes and prices, which will most likely stay the same for Ampere regardless of what AMD does.

5800/5900 series
 
D

Deleted member 177333

Guest
Are you sure about that? There is absolutely no stuttering and the money I spent on my 2 Vega 64s was less than buying a brand new 2080TI here in Canada (Including the water blocks).

Ya I remember years back before I tried my first SLI setup (2x 780 Ti) I was scared to death about the stuttering people kept talking about. I've run 780 Ti SLI, Titan X (maxwell) SLI, and now 1080 Ti SLI...I haven't had a lick of stuttering on any games I play. I mean zippo. I generally run vsync @ 60fps in 4k and my games have been butter smooth. If I ever feel like a game is running right around that 60fps limit for my cards and may fluctuate (which can cause input lag in those rare situations), then I switch out of true vsync and enable adaptive vsync at the driver level and that will take care of any issues.

My experiences with multi gpu have been great. It's obviously not for everyone given that the scaling is never 1:1, and in some cases not even close, but if you have tons of cash and / or are just a hardware enthusiast that wants maximum image quality and / or framerates, it's something I'd recommend people try.
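
To put "never 1:1" in numbers: with alternate-frame rendering, a second card only adds some fraction of one card's output on top. A toy model, with made-up efficiency figures:

```python
def multi_gpu_fps(single_fps: float, num_gpus: int, efficiency: float) -> float:
    """Naive AFR estimate: each extra GPU adds only `efficiency` of one card's frames."""
    return single_fps * (1 + (num_gpus - 1) * efficiency)

# Illustrative numbers only (not measurements): one card doing 45 fps at 4K.
print(multi_gpu_fps(45, 2, 0.9))  # 85.5 fps in a title that scales well
print(multi_gpu_fps(45, 2, 0.4))  # 63.0 fps in one that scales poorly
print(multi_gpu_fps(45, 2, 0.0))  # 45.0 fps when the game ignores the second card
```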

'Cause the 20 series was extremely overpriced from the beginning and everyone knew that.

Ya I'd like to think they'll get the prices back down into the realm of reason, but I am skeptical with NVidia. :) I may need to just plan on buying 2nd hand Turing once the prices get down into reasonable territory.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
@64K
You skipped a few generations. The GTX 480 and GTX 680 are missing. ;)

I left out the GTX 480 (Fermi) because there weren't benches here to compare it directly to a GTX 780 Ti (Kepler), and in any case the GTX 580 (Fermi) was the same generation as the GTX 480, just with a vapor chamber for lower temps and a few more shaders.

The GTX 680 has nothing to do with the comparisons that I was making. It was a midrange Kepler. It wasn't the high end. That was some shenanigans that Nvidia pulled on the uninformed. The GTX 780 Ti which came later was the high end Kepler.
 
Joined
Jun 10, 2014
Messages
2,889 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I would advise against trying to estimate the performance when we don't know a single thing about its performance characteristics. Back when Turing's details were pretty much confirmed, most predicted only a ~10% performance increase, and there was an outcry from many claiming it would be a huge failure. Then it surprised "everyone" by offering significant performance gains anyway.

We still don't know what Nvidia's next gen even is at this point. In the worst case, we're looking at a shrink of Turing with some tweaks and higher clocks, but it could also be a major architectural improvement. While I'm not sure I believe the details of this "leak", I do think future improvements will come from architectural changes rather than just "doubling" of SMs every node shrink.
 
Joined
Mar 10, 2015
Messages
3,984 (1.21/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Multi-GPU is for anyone that wants it. I have not replaced my Vega 64s with 5700 XTs for exactly that reason. If multi-GPU were really dead, motherboard makers would not be touting it and bundling 2-, 3- and 4-way SLI bridges. As much as people complain about it being troublesome, it is not as bad as people make it out to be. There are plenty of games that support multi-GPU anyway. As an example, I get 107 FPS average at 4K playing Jedi: Fallen Order at Ultra settings.

Multi gpu is great for people that like to waste 50% of their money more than 50% of the time. I'll never do it again. Nothing like sitting there waiting for a patch or driver update to get the other card working. Meanwhile, I finished the game already.
 
Last edited:
Joined
Aug 6, 2017
Messages
7,412 (3.05/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
I think they'll wanna

1. Use the performance of the PS5 as a reference, so expect a $350-400 RTX 3060 card trading blows with the 2070S/2080 (OG).
2. Have a lot of SKU options from the very beginning to be able to respond to changing price points fluidly, hence the new GA103 die.
3. Jack up the prices in the ~2080 Super/2080 Ti (RTX 3070) territory and upwards. Expect $600 3070 cards.
 
Joined
Feb 18, 2005
Messages
5,239 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
LOL @ HBM. It's dead on consumer cards and it's not coming back, the only reason AMD ever used it was because Fiji was a disgusting power hog and if they'd added a GDDR controller and memory on top of that it would've been even more of a TDP turd. I assume they continued with it on Vega because by that time they were invested (and Vega was not without TDP and memory bandwidth issues of its own), but it's pretty significant that with Navi - their first mid- to high-end GPU in a while that doesn't suck power like a sponge - they've ditched HBM entirely.

HBM's higher bandwidth, lower power consumption and increased cost only make sense in compute environments where the memory subsystem is completely maxed out transferring data between the host system and other cards in that system.
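
The bandwidth side of that trade-off is easy to quantify: GDDR6 can match a couple of HBM2 stacks just by spreading more chips across a wider bus. A rough sketch with typical-of-the-era speeds (the exact figures are my assumptions, not from this thread):

```python
def device_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    # GB/s = (interface width in bits / 8) * per-pin data rate in Gbps
    return bus_width_bits / 8 * data_rate_gbps

# Typical-of-the-era figures (assumptions):
hbm2_stack = device_bandwidth_gbs(1024, 2.0)  # 256.0 GB/s from one 1024-bit HBM2 stack
gddr6_chip = device_bandwidth_gbs(32, 14.0)   #  56.0 GB/s from one 32-bit GDDR6 chip

print(2 * hbm2_stack)   # 512 GB/s: a two-stack card like Vega 64 (at 2.0 Gbps)
print(10 * gddr6_chip)  # 560 GB/s: ten chips on a 320-bit bus, as this rumor describes
```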
 
Joined
Mar 10, 2015
Messages
3,984 (1.21/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
The GTX 680 has nothing to do with the comparisons that I was making. It was a midrange Kepler. It wasn't the high end. That was some shenanigans that Nvidia pulled on the uninformed. The GTX 780 Ti which came later was the high end Kepler.

While generally I agree, I believe that NV mid-range 680 was pretty competitive against AMD's high-end...
 
Joined
May 3, 2018
Messages
2,232 (1.03/day)
10-15% over top-of-the-range Nvidia cards like the 2080 Super or 2080 Ti.
If they are running tests at the moment and the new Ampere generation has an improvement of 50% over the current high-end cards, then the GPUs to come in the next years could improve over the current generation as follows:
3080 (+15% improvement over 2080)
4080 (+30% improvement over 2080) ... and so on.

They will release the performance in batches over the next years' cards.

(I am just speculating on percentages, but I guess you got my point)

No, I disagree; the next cards after Ampere are a clean-sheet new architecture, called Hopper, named after computer scientist Grace Hopper. IMO 3xxx cards will see around 30% at a minimum compared to 2xxx cards, with the 3080 at least as fast as the 2080 Ti and the 3070 faster than the 2080 Super, but RT will be much faster for all cards. I honestly expect a 100% improvement; they have no choice if they want to make it a feature you want to enable, it's so lame right now. I'd expect 4xxx cards to be 70%+ faster than 2xxx cards and 200% faster in RT.

Hard to recall Nvidia ever doing a lame 10-15% improvement on a new(er) architecture.
 
Joined
Jun 2, 2017
Messages
7,785 (3.13/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Multi gpu is great for people that like to waste 50% of their money more than 50% of the time. I'll never do it again. Nothing like sitting there waiting for a patch or driver update to get the other card working. Meanwhile, I finished the game already.

Indeed, I have been running multi-GPU since the GTS 450 days. It is your opinion that it is no good, and maybe you had a bad experience. With my Vega 64 Crossfire I do not see a waste of money or time. I can't speak for Nvidia, but every time I update my GPU drivers both cards work perfectly. Maybe I am just lucky.
 
Joined
Sep 26, 2017
Messages
553 (0.23/day)
Location
Here
Processor Intel i9 11900K
Motherboard Z590 MSI ACE
Cooling Corsair H80i v2
Memory Ballistix Elite 4000 32GB 18-19-19-39
Video Card(s) EVGA 3090 XC3 ULTRA HYBRID
Storage 2x Seagate Barracuda 120 SSD 1 TB, XPG SX8200 PRO 1 TB
Display(s) Acer Predator Z321QU
Case Fractal Design Meshify C
Power Supply Asus ROG Strix 1000W
I'll stick with my original prediction and say the new video card will be tied to Cyberpunk 2077 :D....

[Attached: Cyberpunk 2077 image]
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
While generally I agree, I believe that NV mid-range 680 was pretty competitive against AMD's high-end...

It was. The GTX 680 was a little faster than the HD 7970 and about $50 cheaper, until AMD released the HD 7970 GHz Edition high-end GPU, which caught up with Nvidia's upper-midrange GTX 680.
 
Joined
Mar 10, 2015
Messages
3,984 (1.21/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
It was. The GTX 680 was a little faster than the HD 7970 and about $50 cheaper, until AMD released the HD 7970 GHz Edition high-end GPU, which caught up with Nvidia's upper-midrange GTX 680.

So, adding 1 + 1, it really doesn't matter that it was a mid-range card, does it? In most cases, people are buying for performance, not chip size, transistor count, board label, or anything else. It is pretty much performance. Would anything have really changed if the board was labeled a 100 or 102?

the only reason AMD ever used it was because Fiji was a disgusting power hog

This is exactly the only reason it ever made it to consumer gpus.

Indeed, I have been running multi-GPU since the GTS 450 days. It is your opinion that it is no good, and maybe you had a bad experience. With my Vega 64 Crossfire I do not see a waste of money or time. I can't speak for Nvidia, but every time I update my GPU drivers both cards work perfectly. Maybe I am just lucky.

Indeed. The one and only time I used XFire was in its heyday, the HD 6800 series. It was total trash. Never again.

Edit: That said. I am half tempted to pick up a second V56 and see what happens. For science.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Indeed, I have been running multi-GPU since the GTS 450 days. It is your opinion that it is no good, and maybe you had a bad experience. With my Vega 64 Crossfire I do not see a waste of money or time. I can't speak for Nvidia, but every time I update my GPU drivers both cards work perfectly. Maybe I am just lucky.
I wouldn't call you lucky, but I would say you are in the minority. :)

They just need to either REALLY make it work, so scaling is better and consistent across a lot more titles, or abandon it altogether.
 
Joined
Jun 2, 2017
Messages
7,785 (3.13/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
So, adding 1 + 1, it really doesn't matter that it was a mid-range card, does it? In most cases, people are buying for performance, not chip size, transistor count, board label, or anything else. It is pretty much performance. Would anything have really changed if the board was labeled a 100 or 102?



This is exactly the only reason it ever made it to consumer gpus.



Indeed. The one and only time I used XFire was in its heyday, the HD 6800 series. It was total trash. Never again.

Edit: That said. I am half tempted to pick up a second V56 and see what happens. For science.

Well, I have to be honest, the only games I played in those days were Total War: Medieval 2 and Total War: Shogun 2, throw in some Torchlight and Titan Quest as well, all of which have full multi-GPU support. Then I got into Just Cause 2, Batman: Arkham and Deus Ex: HR, which all support Crossfire. Then I discovered Sleeping Dogs and Shadow of Mordor. Then Rome 2 was launched and, after that, Attila, which again fully support Crossfire. Just Cause 3, and I will end it with Total War: Warhammer 1 & 2, which both support multi-GPU. Even though Three Kingdoms does not support multi-GPU, I fully expect Warhammer 3 to continue Crossfire support (as long as you can edit the script), as it will be an expansion of the existing game.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
(as long as you can edit the script)
Exactly the stuff most users simply do not want to deal with, and one of the hassles. Most people can't figure out how to install a second card, let alone rename .exe files to get things to work. People just want it to work... :)

Unless you are running 4K and don't want to drop $1K on a 60 fps capable card, I suppose it's viable... but the writing has been on the wall for years now; it is, and rightfully so, a dying breed.
 
Last edited:
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Like every generation, most likely, but not guaranteed, the performance for the 30xx series will be something like:
3070 = RTX 2080
3080 = RTX 2080 Ti
3080 Ti = Titan RTX

So shouldn't be any surprises here tbh...
 