
Which NVIDIA Ampere card would you buy?


  • RTX 3090

    Votes: 5,036 9.3%
  • RTX 3080

    Votes: 11,737 21.7%
  • RTX 3070

    Votes: 6,502 12.0%
  • Will skip this generation

    Votes: 3,868 7.2%
  • Will wait for RDNA2

    Votes: 22,325 41.3%
  • Using AMD

    Votes: 2,758 5.1%
  • Buying a next-gen console

    Votes: 1,810 3.3%

  • Total voters
    54,036
  • Poll closed.
Joined
Nov 3, 2011
Messages
690 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H115i Elite Capellix XT
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB and Toshiba N300 NAS 10TB HDD
Display(s) 2X LG 27UL600 27in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
NVIDIA just announced the GeForce RTX 3090 ($1500), RTX 3080 ($700) and RTX 3070 ($500).

Do you feel like Ampere is for you? If yes, which card are you most interested in? Or sticking with Turing? Or AMD?
I'd rather buy an RTX 3080 with 20 GB VRAM, i.e. an RTX 3080 Ti-type product slotted between the RTX 3090 ($1,500, 24 GB) and the RTX 3080 ($700, 10 GB). I'd be willing to pay $999 USD.

AMD will catch up to the 2080 Super +20% at most. They certainly didn't expect such a huge speed bump...
The Xbox Series X GPU (52 CU, at only a 1,825 MHz base clock) is already at RTX 2080 level in the Gears 5 benchmark.

Xbox Series X GPU has 56 CU in full "XT" configuration.

Xbox Series X GPU = dual shader engines with 56 CU or 28 DCU
RX 5700 XT = dual shader engines with 40 CU or 20 DCU

History
R9-270X = dual shader engines with 20 CU, 32 ROPs
R9-290X = quad shader engines with 44 CU, 64 ROPs

The R9-290X slightly exceeded 2x scaling over the R9-270X.

Based on that history, expect a quad-shader-engine Navi part in the 80 to 112 CU range.
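That extrapolation is just scaling arithmetic, which can be sketched as follows (the bounds are the post's own speculation, not confirmed specs):

```python
# Rough CU-count extrapolation for a hypothetical quad-shader-engine Navi part.
# Lower bound: double the RX 5700 XT's dual-engine 40 CUs.
# Upper bound: double the Xbox Series X's full 56-CU configuration.
rx5700xt_cus = 40
xsx_full_cus = 56

low_estimate = rx5700xt_cus * 2    # 80 CUs
high_estimate = xsx_full_cus * 2   # 112 CUs
print(low_estimate, high_estimate)  # 80 112
```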
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
The problem is that people who bought 20-series cards, especially 2080 Tis, paid too much for them; that's why they price them a little too high used. Nobody wants to lose $600-700 selling one second-hand. And if the 3070 brings the same or close to the performance of a 2080 Ti, that means the 2080 Ti's value is the same too.

So I'm not selling my 2080 Ti, and I'd advise everyone who bought a 20XX card not to rush to sell it!

You can keep it and wait for the 4000 series, or whatever graphics cards Nvidia releases next.
Why would anybody think they could recoup their money on a GPU? After a new gen comes out the best you can get for a brief period of time is 50%, falling off quickly thereafter. Sure, lots of people will ask near full price, until finally some poor sucker comes along and pays that price.
 
Last edited:
Joined
Feb 8, 2008
Messages
2,665 (0.45/day)
Location
Switzerland
Processor i9 9900KS ( 5 Ghz all the time )
Motherboard Asus Maximus XI Hero Z390
Cooling EK Velocity + EK D5 pump + Alphacool full copper silver 360mm radiator
Memory 16GB Corsair Dominator GT ROG Edition 3333 Mhz
Video Card(s) ASUS TUF RTX 3080 Ti 12GB OC
Storage M.2 Samsung NVMe 970 Evo Plus 250 GB + 1TB 970 Evo Plus
Display(s) Asus PG279 IPS 1440p 165Hz G-sync
Case Cooler Master H500
Power Supply Asus ROG Thor 850W
Mouse Razer Deathadder Chroma
Keyboard Rapoo
Software Win 10 64 Bit
Why would anybody think they could recoup their money on a GPU? After a new gen comes out the best you can get for a brief period of time is 50%, falling off quickly thereafter. Sure, lots of people will ask near full price, until finally some poor sucker comes along and pays that price.

Of course people want to recoup the highest amount of money from their GPU.

It's the second-hand market! The same principle applies to cars, bikes, etc.


What people are missing is this: if you own at least a 20XX Super, and especially a 2080 Ti, your graphics card is still one of the fastest around and will stay relevant for at least two years!
 
Joined
Sep 17, 2014
Messages
20,944 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Why would anybody think they could recoup their money on a GPU? After a new gen comes out the best you can get for a brief period of time is 50%, falling off quickly thereafter. Sure, lots of people will ask near full price, until finally some poor sucker comes along and pays that price.

I see GTX 1080 on sale now for about 300 EUR. I bought mine for 500 (well, 420 but that's a tax break you don't always have) new. I will buy a replacement for 500. Let's say it gets worse to 250, indeed 50%. But that isn't a bottom or anything... second hand needs to be combined with buying sub-top at the right moment. At that point you're looking at negligible TCO for any GPU.

Recoup entirely, no. Recoup most? Hell yeah. If you consider that this GPU was worth 3 years of gaming, that's 1095 days, at 250 total cost... just a little over 0,22 EUR per day :) And this just keeps on giving, too, because we can STILL enjoy all the little upgrades over time. Higher res, new tech, etc. etc.
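That cost-per-day figure checks out; a quick sketch of the arithmetic (the 500 EUR purchase and 250 EUR resale are the assumptions from the post):

```python
# Net cost per day of a GPU bought new and resold after three years.
purchase_price = 500    # EUR, paid new for the GTX 1080
resale_value = 250      # EUR, assumed second-hand price three years later
days_owned = 3 * 365    # 1095 days

cost_per_day = (purchase_price - resale_value) / days_owned
print(f"{cost_per_day:.2f} EUR/day")  # ~0.23 EUR/day
```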

One might even argue that, compared to a console, which will have close to zero resale value once it's past-gen hardware with no new releases (and its games eventually emulated on PC!), a GPU, with the backwards compatibility it offers, is virtually free on the bottom line, or even a money maker if you consider your entertainment spending over time. Not to mention its potential for content creation. It really, actually is a tool that can make you money, or that has 'output value' in other ways (crunching, folding). I even forgot about mining.

Well, it's more about accepting that people just have different needs.
Some people want the best performance for their dollar.
Some just want the best performance per watt (which would always belong to the most expensive GPU).
The majority will look for the right balance between the two metrics.

Let's just say you have a laptop with limited cooling potential; then perf/watt becomes the most limiting factor, and the more efficient GPU will give more FPS. You can look at 5600M vs 2060 laptops to see why efficiency matters.



HUH ?

Oh please.

Customers buy whatever product looks best at that specific point in time. Trying to attribute that to a few bar charts with minimal percentile gaps is a sign of madness in the eyes of the beholder. That's you in this case.

Nvidia sells because
- It leads in technological advances, mindshare, thought leadership. This is a big thing in any soft- and hardware environment as development is iterative. If you can think faster than the rest, you're smarter and you'll keep leading. Examples: Gsync and other advances in gaming/content delivery, new AA methods, GameWorks emergent tech in co-conspiracy with devs, a major investment in engineers and industry push, CUDA development, etc etc.

- It has a great marketing department

- It has maintained a steady product stack across many generations, this instills consumer faith. The company delivers. Every time. If it delays, there is no competitor that has a better offer.

Got it? Now please, cut this nonsense. Perf/watt is an important metric on an architectural level because it influences what can happen within a specified power budget. It's interesting for us geeks, not for the other 98% of consumers. In the end, all that really matters is what a GPU can do in practice. Perf/watt is great, but if you're stuck at 4 GB VRAM, well... yay. If the max TDP is 100 W... yay. And if you're just buying the top-end GPU, don't even get me started, man; you never cared about perf/watt at all, you only care about absolute perf.

Perf/watt arguments coming from a 2080 Ti owner are like the newborn Tesla driver who's suddenly all about the climate, except the real reason he bought the car is that he's fond of fast cars. And then speeds off in Ludicrous mode.
 
Last edited:
Joined
Jan 8, 2017
Messages
8,942 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I still see people from time to time trying to sell Maxwell Titan Xs for 500$. I also recently saw a used Titan V for 3000 euros.

You wouldn't believe how stubborn people are to let go of items that have clearly plummeted in value.
 
Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Oh please.

Customers buy whatever product looks best at that specific point in time. Trying to attribute that to a few bar charts with minimal percentile gaps is a sign of madness in the eyes of the beholder. That's you in this case.

Nvidia sells because
- It leads in technological advances, mindshare, thought leadership. This is a big thing in any soft- and hardware environment as development is iterative. If you can think faster than the rest, you're smarter and you'll keep leading. Examples: Gsync and other advances in gaming/content delivery, new AA methods, GameWorks emergent tech in co-conspiracy with devs, a major investment in engineers and industry push, CUDA development, etc etc.

- It has a great marketing department

- It has maintained a steady product stack across many generations, this instills consumer faith. The company delivers. Every time. If it delays, there is no competitor that has a better offer.

Got it? Now please, cut this nonsense. Perf/watt is an important metric on an architectural level because it influences what can happen within a specified power budget. It's interesting for us geeks, not for the other 98% of consumers. In the end, all that really matters is what a GPU can do in practice. Perf/watt is great, but if you're stuck at 4 GB VRAM, well... yay. If the max TDP is 100 W... yay. And if you're just buying the top-end GPU, don't even get me started, man; you never cared about perf/watt at all, you only care about absolute perf.

Perf/watt arguments coming from a 2080 Ti owner are like the newborn Tesla driver who's suddenly all about the climate, except the real reason he bought the car is that he's fond of fast cars. And then speeds off in Ludicrous mode.

Oh please,
Stop with the PCMR shaming; we're all PCMR fanatics :).

Well, this might surprise you, but absolute performance only matters when you're benching for a high score; most of the time, 2080 Ti owners aim for a balance between thermals, noise and performance when gaming. FYI, AIB 2080 Tis can draw 350 W too, but it makes little sense to use that much power for so little performance gain.



Would it surprise you that there are people who would run the 3080 at 250-280 W? From this chart, the gain from 250 W to 320 W is around 15 fps: a 16% perf gain for a 28% power increase. Yes, more FPS is better, but you're more likely to notice your room warming up after a while than a few dropped fps :).
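Those percentages can be sanity-checked in a couple of lines (the ~94 fps baseline at 250 W is an assumed figure for illustration; the chart itself isn't reproduced here):

```python
# Perf vs. power scaling for a hypothetical 3080 power-limit sweep.
power_low, power_high = 250, 320   # watts
fps_low = 94                        # assumed baseline fps at 250 W
fps_high = fps_low + 15             # ~15 fps more at 320 W

perf_gain = (fps_high - fps_low) / fps_low * 100            # ~16%
power_increase = (power_high - power_low) / power_low * 100  # 28%
print(f"+{perf_gain:.0f}% perf for +{power_increase:.0f}% power")
```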

Worrying about the maximum power a card can draw is just short-sighted, or just salty, especially when Ampere offers 25-30% higher efficiency than Turing.

Do you know why Vega is a meme?


A 7% perf/watt improvement over the Fury X :)
 
Joined
Sep 17, 2014
Messages
20,944 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Oh please,
Stop with the PCMR shaming; we're all PCMR fanatics :).

Well, this might surprise you, but absolute performance only matters when you're benching for a high score; most of the time, 2080 Ti owners aim for a balance between thermals, noise and performance when gaming. FYI, AIB 2080 Tis can draw 350 W too, but it makes little sense to use that much power for so little performance gain.



Would it surprise you that there are people who would run the 3080 at 250-280 W? From this chart, the gain from 250 W to 320 W is around 15 fps: a 16% perf gain for a 28% power increase. Yes, more FPS is better, but you're more likely to notice your room warming up after a while than a few dropped fps :).

Worrying about the maximum power a card can draw is just short-sighted, or just salty, especially when Ampere offers 25-30% higher efficiency than Turing.

Do you know why Vega is a meme?


A 7% perf/watt improvement over the Fury X :)

Okay, you repeated yourself once more, but your point is?

Everyone can run their GPU as they see fit, but you're still limited by its max TDP, and you will use that too when you need the performance. Undervolting is always an option, but to have it influence a purchase decision is another world entirely. Your world. People didn't undervolt Vega's because they loved to. They did it because the settings from the factory were usually pretty loose and horrible. They bought a Vega not because it would undervolt so well... they bought it because it had, at some point in time, a favorable perf/dollar. Despite its power usage.

Similarly, not a single soul in the world, well, maybe not counting you, bought a 2080 Ti for its great efficiency per watt; they bought it for absolute performance and for having the longest epeen for the shortest possible time in GPU history, as it now seems. Can you undervolt it? Yes, and if you don't need the last 5% you probably will. But it's an irrelevant metric here, and also with respect to a 3080 or 3090 with a much larger power budget.

Turing is also proving you wrong with the exact same perf/watt figures across the whole stack, by the way. And Ampere won't be much different. The scaling is the same or almost the same regardless of tier/SKU. They all clock about as high.

Well people are selling 2080 Ti for 500usd now, might as well grab one. At least the 11GB framebuffer is more enticing than 8GB and the 2080 Ti only use 40W more.
RTX and DLSS performance would likely to be the same between the two.

Unlikely given the updates to the SMs themselves. The core has changed quite a bit in favor of more RT perf. This is how they get to their 2080ti perf equivalent claims on the x70, too, most likely - at least in part.
 
Last edited:

phill

Moderator
Staff member
Joined
Jun 8, 2011
Messages
15,973 (3.39/day)
Location
Somerset, UK
System Name Not so complete or overkill - There are others!! Just no room to put! :D
Processor Ryzen Threadripper 3970X
Motherboard Asus Zenith 2 Extreme Alpha
Cooling Lots!! Dual GTX 560 rads with D5 pumps for each rad. One rad for each component
Memory Viper Steel 4 x 16GB DDR4 3600MHz not sure on the timings... Probably still at 2667!! :(
Video Card(s) Asus Strix 3090 with front and rear active full cover water blocks
Storage I'm bound to forget something here - 250GB OS, 2 x 1TB NVME, 2 x 1TB SSD, 4TB SSD, 2 x 8TB HD etc...
Display(s) 3 x Dell 27" S2721DGFA @ 7680 x 1440P @ 144Hz or 165Hz - working on it!!
Case The big Thermaltake that looks like a Case Mods
Audio Device(s) Onboard
Power Supply EVGA 1600W T2
Mouse Corsair thingy
Keyboard Razer something or other....
VR HMD No headset yet
Software Windows 11 OS... Not a fan!!
Benchmark Scores I've actually never benched it!! Too busy with WCG and FAH and not gaming! :( :( Not OC'd it!! :(
Calm it down guys, no need for any heated discussions... Just chat and chill :D

If I'm honest with the rigs at home, I could quite happily buy one of each of them and still have more than enough power for the crunching and folding work that I do. I would however like to upgrade my SLI 1080 Ti's as the hardware looks lost inside my case but I'm ok with that :) :laugh:

I will miss SLI, I've pretty much used it since the GTX 580's when I had two/three of those cards running 8064 x 1600... Ah memories :)

Still what I'm not forgetting either AMD's new offering. Should that be better/faster/more efficient and the like, I'd likely go down to that direction as well :) I love hardware, I'll buy whatever is best for me and not worry too much about anything else :)

As for buying any of the 2 series cards, only when everything is released then I might treat myself to a Kingpin 2080 Ti or something for the collection.. Either way, I'm definitely in no rush at all for any of the newer cards and I'm firmly waiting on reviews and aftermarket cards but sticking with the old faithful models, such as MSI's Gaming X, Asus ROG Strix and then my personal favourite, EVGA's SC or FTW cards as I've had no issues with them at all :)
 
Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Okay, you repeated yourself once more, but your point is?

Everyone can run their GPU as they see fit, but you're still limited by its max TDP, and you will use that too when you need the performance. Undervolting is always an option, but to have it influence a purchase decision is another world entirely. Your world. People didn't undervolt Vega's because they loved to. They did it because the settings from the factory were usually pretty loose and horrible. They bought a Vega not because it would undervolt so well... they bought it because it had, at some point in time, a favorable perf/dollar. Despite its power usage.

Similarly, not a single soul in the world, well, maybe not counting you, bought a 2080 Ti for its great efficiency per watt; they bought it for absolute performance and for having the longest epeen for the shortest possible time in GPU history, as it now seems. Can you undervolt it? Yes, and if you don't need the last 5% you probably will. But it's an irrelevant metric here, and also with respect to a 3080 or 3090 with a much larger power budget.

Turing is also proving you wrong with the exact same perf/watt figures across the whole stack, by the way. And Ampere won't be much different. The scaling is the same or almost the same regardless of tier/SKU. They all clock about as high.

Well, you are wrong on several points.
First, Nvidia's TDP is set in the BIOS. I could flash a 380 W TDP BIOS to my 2080 Ti if I wanted to, but there's no reason to, because efficiency falls off a cliff past roughly 260 W, as mirrored in Nvidia's perf/power chart.

Second, 24 months is actually the longest any GPU has held the performance crown (the 2080 Ti was released in Sept 2018). The previous record holder would be the 8800 GTX at 19 months (the 1080 Ti managed 18). I previously owned the Titan X Maxwell and the 1080 Ti too, so I'm just happy the 3090 will beat the 2080 Ti by a good amount.

Third, it seems almost 12% of voters in this forum are buying the RTX 3090; whether for the best performance or the highest efficiency, does it matter? Can't a Tesla offer both good performance and low fuel costs? (It does, by the way.)
 
Joined
Sep 17, 2014
Messages
20,944 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Well, you are wrong on several points.
First, Nvidia's TDP is set in the BIOS. I could flash a 380 W TDP BIOS to my 2080 Ti if I wanted to, but there's no reason to, because efficiency falls off a cliff past roughly 260 W, as mirrored in Nvidia's perf/power chart.

Second, 24 months is actually the longest any GPU has held the performance crown (the 2080 Ti was released in Sept 2018). The previous record holder would be the 8800 GTX at 19 months (the 1080 Ti managed 18). I previously owned the Titan X Maxwell and the 1080 Ti too, so I'm just happy the 3090 will beat the 2080 Ti by a good amount.

Third, it seems almost 12% of voters in this forum are buying the RTX 3090; whether for the best performance or the highest efficiency, does it matter?

I know you need to justify a 2080 Ti -> 3090 upgrade, but really, don't. It's fine. All is well. Great cards.
 
Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
I know you need to justify a 2080 Ti -> 3090 upgrade, but really, don't. It's fine. All is well. Great cards.

Yeah being salty is bad for your heart too, take care :)
 
Joined
Jun 21, 2018
Messages
5 (0.00/day)
Location
Sweden
System Name Main PC
Processor Ryzen 9 5900X @ 4.7GHz (all core)
Motherboard MSI MAG X570 Tomahawk
Cooling Noctua NH-D15
Memory G.Skill 64GB 3600MHz CL16
Video Card(s) Palit RTX 3080 12GB GameRock
Storage 2x PNY XLR8 NVMe 2TB SSD, Samsung 850 EVO 1TB, Samsung 860 EVO 500GB NVMe, 34TB HDD
Display(s) ASUS ROG Swift PG279Q / 27" / IPS / 1440p / 165 Hz G-SYNC
Case Fractal Design R5 w/ 5x Noctua A15
Power Supply Corsair RM850Wx
Mouse Logitech G Pro Wireless, Logitech G403 Wireless, Rival 650 Wireless, Rival 600, Rival 310
Keyboard Steelseries Apex Pro, Steelseries Apex 7 TKL, Corsair K70 Rapidfire (MX speed/silver)
Software Windows 10 Pro
Benchmark Scores Time Spy: 18 808 (GPU: 19 819/CPU: 14 591) http://www.3dmark.com/spy/26583284
3080 Ti. $500 less than the 3090, only a few hundred CUDA cores fewer, 4 GB less VRAM and a slightly narrower memory bus. Easy choice.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
To me it's not that simple, because Samsung's 7 nm EUV HP process has a density of 77 MTr/mm², up from 44.4 MTr/mm² on the custom 8 nm node. GA102 could be shrunk so much that the 628 mm² die would fit in just under 400 mm² and clock higher, though it would probably also drop to a 256-bit bus with 16 GB. $999 for something that can be shrunk within the next 18 months or less to half the power and 50% more performance is not worth it. I think something like the Maxwell situation is coming (when, in less than a year, something like the GTX 970 dethroned the GTX 780 Ti), i.e. 3090 -> 4070 in this case.
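The die-shrink claim follows from the density figures quoted above; a rough sketch, assuming ideal scaling (real shrinks do worse, since SRAM and analog blocks scale poorly):

```python
# Ideal area scaling of GA102 from 8 nm to 7 nm EUV, by transistor density.
density_8nm = 44.4       # MTr/mm^2, Samsung custom 8 nm (as quoted in the post)
density_7nm_euv = 77.0   # MTr/mm^2, Samsung 7 nm EUV HP (as quoted in the post)
die_area_ga102 = 628     # mm^2

shrunk_area = die_area_ga102 * density_8nm / density_7nm_euv
print(f"{shrunk_area:.0f} mm^2")  # ~362 mm^2, i.e. "just under 400 mm^2"
```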
 
Joined
Feb 8, 2012
Messages
3,013 (0.68/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
Interesting statistics... about 31% of the user base is willing to wait and see what AMD brings to the table. It means that AMD (with a properly power/performance-optimized new arch) could retake its former piece of the market in essentially one generation.
 
Joined
May 15, 2020
Messages
697 (0.48/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Interesting statistics... about 31% of the user base is willing to wait and see what AMD brings to the table. It means that AMD (with a properly power/performance-optimized new arch) could retake its former piece of the market in essentially one generation.
You say that because you assume that the population who voted in this poll is representative of the whole market.

Well, it isn't, there's no way there will be 12% of the market buying the 3090. The sample is pretty far from representative.
 
Joined
Feb 8, 2012
Messages
3,013 (0.68/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
You say that because you assume that the population who voted in this poll is representative of the whole market.

Well, it isn't, there's no way there will be 12% of the market buying the 3090. The sample is pretty far from representative.
Forget the whole market; the relevant market is people who can buy and run triple-A titles on their PCs. So I'd say people who visit TPU are a good representation of the relevant market for the game industry.
12% of those people would like to buy a 3090, not will buy... but 31% of them are waiting for AMD to show something... big difference.

When you have a duopoly of this kind, things get interesting close to product launches... look at what's happening on the CPU front with Intel and AMD, for example; people are expecting something similar on the GPU front from AMD.
 
Joined
Jul 10, 2011
Messages
788 (0.17/day)
Processor Intel
Motherboard MSI
Cooling Cooler Master
Memory Corsair
Video Card(s) Nvidia
Storage Samsung/Western Digital/ADATA
Display(s) Samsung
Case Thermaltake
Audio Device(s) On Board
Power Supply Seasonic
Mouse A4TECH
Keyboard UniKey
Software Windows 10 x64
Forget the whole market; the relevant market is people who can buy and run triple-A titles on their PCs. So I'd say people who visit TPU are a good representation of the relevant market for the game industry.
12% of those people would like to buy a 3090, not will buy... but 31% of them are waiting for AMD to show something... big difference.

When you have a duopoly of this kind, things get interesting close to product launches... look at what's happening on the CPU front with Intel and AMD, for example; people are expecting something similar on the GPU front from AMD.

5,980 will buy Ampere, 3,369 RDNA2.

Don't forget this legendary poll.
 
Joined
Sep 8, 2020
Messages
204 (0.15/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
I rarely play games now that I'm older, but I've been through a lot of hardware changes, and I've learned that high-end and midrange graphics cards are not worth it.
I see this "Nvidia beat the consoles" talk; they didn't beat anything. PS5 and presumably Xbox games will be made for RTX 2080-level hardware, while PC games will be made for GTX 1060/RX 480-580-level hardware, because that's what the majority of people have in their computers. Sure, there will be one or two showcase games that use an RTX 3080 to its full potential to boost sales, but it's not enough.
Hardware should not be this important; quality content should be. If Nvidia wants to sell GPUs, they should seriously sponsor a lot of game developers and bring some games to PC enthusiasts.
When a new console launches, they showcase games; when a GPU launches, they show you lots of graphs. I don't play graphs.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
The GTX 1060 is GTX 980-level performance, six years ancient now. That's preposterous; I need 4K120. At first I play just to benchmark my system, but I get addicted and end up sinking thousands of hours that cost me thousands of dollars. So the price is irrelevant; it costs much more than you think: your life. I'd rather not buy the card, but I'm too addicted to the games.
 
Joined
Sep 8, 2020
Messages
204 (0.15/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
People like you, for whom the price of hardware is irrelevant, are so few that Nvidia and AMD would go bankrupt relying on them; you don't even make up 1%. Nobody spends millions of dollars to make games for the 1%. That's why PC gaming is ridiculously expensive and, in my opinion, not worth it compared to consoles.
 
Joined
May 15, 2020
Messages
697 (0.48/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
People like you, for whom the price of hardware is irrelevant, are so few that Nvidia and AMD would go bankrupt relying on them; you don't even make up 1%. Nobody spends millions of dollars to make games for the 1%. That's why PC gaming is ridiculously expensive and, in my opinion, not worth it compared to consoles.
This is a tech site, most people coming here are first and foremost hardware geeks, not only gamers. It's a different subculture.
 
Joined
Jan 5, 2006
Messages
17,830 (2.67/day)
System Name AlderLake / Laptop
Processor Intel i7 12700K P-Cores @ 5Ghz / Intel i3 7100U
Motherboard Gigabyte Z690 Aorus Master / HP 83A3 (U3E1)
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans / Fan
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MHz CL36 / 8GB DDR4 HyperX CL13
Video Card(s) MSI RTX 2070 Super Gaming X Trio / Intel HD620
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2 / Samsung 256GB M.2 SSD
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p / 14" 1080p IPS Glossy
Case Be quiet! Silent Base 600 - Window / HP Pavilion
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W / Powerbrick
Mouse Logitech MX Anywhere 2 Laser wireless / Logitech M330 wireless
Keyboard RAPOO E9270P Black 5GHz wireless / HP backlit
Software Windows 11 / Windows 10
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock

ARF

Joined
Jan 28, 2020
Messages
3,957 (2.55/day)
Location
Ex-usa
The RTX 3080 looks only 65% faster than the RX 5700 XT in the AoS benchmark.

Navi 21 is expected to offer at least 100% performance improvement over Navi 10.


 
Joined
Feb 8, 2012
Messages
3,013 (0.68/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
Don't forget this legendary poll.
Funny stuff; that's why I'd like to think of this corner of the interwebz as reasonable... I realize how unreasonable it is to think that, as I couldn't type it with a straight face... but hey, relatively?
 