
Which NVIDIA Ampere card would you buy?

  • RTX 3090

    Votes: 5,036 9.3%
  • RTX 3080

    Votes: 11,737 21.7%
  • RTX 3070

    Votes: 6,502 12.0%
  • Will skip this generation

    Votes: 3,868 7.2%
  • Will wait for RDNA2

    Votes: 22,325 41.3%
  • Using AMD

    Votes: 2,758 5.1%
  • Buying a next-gen console

    Votes: 1,810 3.3%

  • Total voters
    54,036
  • Poll closed.
NVIDIA just announced the GeForce RTX 3090 ($1500), RTX 3080 ($700) and RTX 3070 ($500).

Do you feel like Ampere is for you? If yes, which card are you most interested in? Or sticking with Turing? Or AMD?
I'd rather buy an RTX 3080 with 20 GB VRAM, i.e. an RTX 3080 Ti-type product positioned between the RTX 3090 ($1,500, 24 GB) and the RTX 3080 ($700, 10 GB). I'd be willing to pay $999 for it.

AMD will catch up to RTX 2080 Super +20% at most. They certainly didn't expect such a huge speed bump...
The Xbox Series X GPU (52 CUs, 1,825 MHz base clock) is already at RTX 2080 level in the Gears 5 benchmark.

The Xbox Series X GPU has 56 CUs in its full "XT" configuration.

Xbox Series X GPU = dual shader engines with 56 CU or 28 DCU
RX 5700 XT = dual shader engines with 40 CU or 20 DCU

History
R9-270X = dual shader engines with 20 CU, 32 ROPS
R9-290X = quad shader engines with 44 CU, 64 ROPS

R9-290X slightly exceeded 2X scale from R9-270X

Based on that history, expect a quad-shader-engine Navi version in the 80 CU to 112 CU range (a rough scaling sketch follows below).
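As a rough sanity check, here is that scaling extrapolation in Python. The dual-engine CU counts are the ones listed above; the doubled figures are just a naive 2x assumption, not known specifications.

```python
# Naive shader-engine scaling sketch based on the CU counts in the post.
# Doubling a dual-engine design to quad engines is an assumption, not a spec.
dual_engine_cus = {
    "R9 270X": 20,
    "RX 5700 XT": 40,
    "Xbox Series X (full chip)": 56,
}

for name, cus in dual_engine_cus.items():
    print(f"{name}: {cus} CU (dual engine) -> ~{2 * cus} CU if doubled to quad engines")

# Doubling the 40-56 CU dual-engine designs gives the 80-112 CU range expected above.
```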
 
The problem is that people who bought 2000-series cards, especially 2080 Tis, paid too much for them, which is why they price them a little too high on the used market. Nobody wants to lose $600-700 selling them used. And if the 3070 brings the same or nearly the same performance as a 2080 Ti, that means the 2080 Ti should hold the same value too.

So I'm not selling my 2080 Ti, and I advise all of you who bought a 20-series card not to rush to sell it!

You can keep it and wait for the 4000 series, or whatever NVIDIA brings out next.
Why would anybody think they could recoup their money on a GPU? After a new gen comes out the best you can get for a brief period of time is 50%, falling off quickly thereafter. Sure, lots of people will ask near full price, until finally some poor sucker comes along and pays that price.
 
Of course people want to recoup as much money as possible from their GPU.

It's the second-hand market! The same principle applies to cars, bikes, etc.!

What people are missing is this: if you own at least a 20-series Super, and especially a 2080 Ti, your graphics card is still one of the fastest and will stay relevant for at least two years!
 
I see GTX 1080s on sale now for about 300 EUR. I bought mine new for 500 (well, 420, but that's a tax break you don't always get). I will buy a replacement for 500. Let's say the old card drops to 250, indeed 50%. But that isn't a floor or anything... buying second-hand needs to be combined with buying sub-top at the right moment. At that point you're looking at negligible TCO for any GPU.

Recoup entirely, no. Recoup most of it? Hell yeah. If you consider that this GPU was worth three years of gaming, that's 1,095 days at 250 EUR total cost... just a little over 0.22 EUR per day :) And it just keeps on giving, too, because we can STILL enjoy all the little upgrades over time: higher res, new tech, etc.
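For what it's worth, here is that cost-per-day math as a minimal Python sketch, using the figures from this post (500 EUR new, roughly 250 EUR resale, three years of use), which are assumptions rather than market data.

```python
# Back-of-the-envelope TCO math from the post above.
# All figures are assumptions taken from the post, not measured data.
purchase_price = 500.0    # EUR, GTX 1080 bought new
resale_value = 250.0      # EUR, rough second-hand value one generation later
days_of_use = 3 * 365     # three years of gaming = 1095 days

total_cost_of_ownership = purchase_price - resale_value
cost_per_day = total_cost_of_ownership / days_of_use

print(f"TCO: {total_cost_of_ownership:.0f} EUR over {days_of_use} days")
print(f"Cost per day: {cost_per_day:.3f} EUR")  # ~0.228 EUR, a little over 0.22 per day
```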

One might even argue that, compared to any console (which will have just about zero resale value once it's past-gen hardware with no new releases, and which gets emulated on PC over time), and given the backwards compatibility a PC offers, a GPU is virtually free on the bottom line, or even a money maker if you consider your entertainment spending over time. Not to mention its potential for content creation. It really is a tool that can make you money or that has output value in other ways (crunching, folding). I even forgot about mining.

Well, it's more about accepting that people just have different needs.
Some people want the best performance for their dollar.
Some just want the best performance per watt (which would always belong to the most expensive GPU).
The majority will look for the right balance between the two metrics.

Let's say you have a laptop with limited cooling headroom: perf/watt then becomes the limiting factor, and the more efficient GPU will give more FPS. Look at 5600M vs. 2060 laptops and you'll see why efficiency matters.



HUH ?
[Chart: performance per watt, 3840x2160]

Oh please.

Customers buy whatever product looks best at that specific point in time. Trying to attribute that to a few bar charts with minimal percentile gaps is a sign of madness in the eyes of the beholder. That's you in this case.

Nvidia sells because
- It leads in technological advances, mindshare, thought leadership. This is a big thing in any soft- and hardware environment as development is iterative. If you can think faster than the rest, you're smarter and you'll keep leading. Examples: Gsync and other advances in gaming/content delivery, new AA methods, GameWorks emergent tech in co-conspiracy with devs, a major investment in engineers and industry push, CUDA development, etc etc.

- It has a great marketing department

- It has maintained a steady product stack across many generations, this instills consumer faith. The company delivers. Every time. If it delays, there is no competitor that has a better offer.

Got it? Now please, cut this nonsense. Perf/watt is an important metric on an architectural level because it influences what can happen within a specified power budget. It's interesting for us geeks, not for the other 98% of consumers. In the end, all that really matters is what a GPU can do in practice. Perf/watt is great, but if you're stuck at 4 GB VRAM, well... yay. If the max TDP is 100 W... yay. And if you're just buying the top-end GPU, don't even get me started, man; you never cared about perf/watt at all, you only care about absolute performance.

Perf/watt arguments coming from a 2080 Ti owner are like the newly minted Tesla driver who's suddenly all about the climate, except the reason he bought it was just that he's fond of fast cars. And then he speeds off in Ludicrous mode.
 
I still see people from time to time trying to sell Maxwell Titan Xs for $500. I also recently saw a used Titan V for 3,000 euros.

You wouldn't believe how stubborn people are to let go of items that have clearly plummeted in value.
 

Oh please, stop with the PCMR shaming; we're all PCMR fanatics :).

Well, this might come as a surprise, but absolute performance only matters when you're benching for a high score. Most of the time, 2080 Ti owners aim for a balance between thermals, noise, and performance when gaming. FYI, an AIB 2080 Ti can draw 350 W too, but it makes little sense to use that much power for so little performance gain.

[Chart: RTX 3080 performance scaling vs. power limit]


Would it surprise you that there are people who would run the 3080 at 250-280 W? From this chart, the gain from 250 W to 320 W is around 15 FPS: a 16% performance gain for a 28% power increase. Yes, more FPS is better, but you'd be more likely to notice your room warming up after a while than a few FPS lost :).
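As a rough illustration of that trade-off, here is the percentage math in Python. The two FPS values are hypothetical, chosen only to reproduce the roughly 15 FPS / 16% gain read off the chart.

```python
# Power-scaling arithmetic from the post above.
# FPS values are hypothetical, picked to match the ~15 FPS gain cited.
power_low, power_high = 250, 320   # watts
fps_low, fps_high = 94, 109        # hypothetical frame rates at each power limit

power_increase = power_high / power_low - 1                            # ~0.28
perf_increase = fps_high / fps_low - 1                                 # ~0.16
efficiency_change = (fps_high / power_high) / (fps_low / power_low) - 1

print(f"Power: +{power_increase:.0%}, performance: +{perf_increase:.0%}")
print(f"Perf/watt change: {efficiency_change:+.0%}")  # roughly -9% at the higher limit
```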

Worrying about the maximum power a card can draw is just short-sighted, or just salty, especially when Ampere offers 25-30% higher efficiency than Turing.

Do you know why Vega is a meme?
[Chart: performance per watt, 3840x2160]


A 7% perf/watt improvement over the Fury X :)
 

Okay, you repeated yourself once more, but your point is?

Everyone can run their GPU as they see fit, but you're still limited by its max TDP, and you will use it too when you need the performance. Undervolting is always an option, but having it influence a purchase decision is another world entirely. Your world. People didn't undervolt Vegas because they loved to; they did it because the factory settings were usually pretty loose and horrible. They didn't buy a Vega because it would undervolt so well... they bought it because it had, at some point in time, a favorable perf/dollar, despite its power usage.

Similarly, not a single soul in the world, well, maybe not counting you, bought a 2080 Ti for its great efficiency per watt; they bought it for absolute performance, and for having the longest e-peen for the shortest possible time in GPU history, as it now seems. Can you undervolt it? Yes, and if you don't need the last 5% you probably will. But it's an irrelevant metric here, and also with respect to a 3080 or 3090 with a much larger power budget.

Turing also proves you wrong with essentially the same perf/watt figures across the whole stack, by the way. And Ampere won't be much different. The scaling is the same or nearly the same regardless of tier/SKU; they all clock about as high.

Well, people are selling 2080 Tis for $500 now; might as well grab one. At least the 11 GB framebuffer is more enticing than 8 GB, and the 2080 Ti only uses about 40 W more.
RTX and DLSS performance would likely be about the same between the two.

Unlikely given the updates to the SMs themselves. The core has changed quite a bit in favor of more RT perf. This is how they get to their 2080ti perf equivalent claims on the x70, too, most likely - at least in part.
 
Calm it down guys, no need for any heated discussions... Just chat and chill :D

If I'm honest with the rigs at home, I could quite happily buy one of each of them and still have more than enough power for the crunching and folding work that I do. I would however like to upgrade my SLI 1080 Ti's as the hardware looks lost inside my case but I'm ok with that :) :laugh:

I will miss SLI, I've pretty much used it since the GTX 580's when I had two/three of those cards running 8064 x 1600... Ah memories :)

Still, I'm not forgetting AMD's new offering either. Should that be better/faster/more efficient and the like, I'd likely go in that direction as well :) I love hardware; I'll buy whatever is best for me and not worry too much about anything else :)

As for buying any of the 20-series cards, only once everything is released might I treat myself to a Kingpin 2080 Ti or something for the collection. Either way, I'm definitely in no rush at all for any of the newer cards; I'm firmly waiting on reviews and aftermarket cards, but sticking with the old faithful models, such as MSI's Gaming X, Asus ROG Strix, and my personal favourite, EVGA's SC or FTW cards, as I've had no issues with them at all :)
 

Well, you are wrong on several points.
First, NVIDIA's TDP is set in the BIOS. I could flash a 380 W TDP BIOS to my 2080 Ti if I wanted to, but there's no reason to, because efficiency falls off a cliff past about 260 W, as mirrored in NVIDIA's perf/power chart.

Second, 24 months is actually the longest a GPU has held onto the performance crown (the 2080 Ti was released in September 2018). The previous record holder was the 8800 GTX at 19 months (the 1080 Ti managed 18 months). I previously owned the Titan X Maxwell and the 1080 Ti too, so I'm just happy the 3090 will beat the 2080 Ti by a good amount.

Third, it seems almost 12% of voters on this forum are buying the RTX 3090; whether for the best performance or the highest efficiency, does it matter? Can't a Tesla offer both good performance and low running costs? (Which it does, by the way.)
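A quick sanity check of that crown-tenure timeline; the launch months below are approximate and listed here as assumptions.

```python
# Approximate months each flagship held the performance crown.
# Launch/dethrone months are assumptions based on the dates cited in the post.
from datetime import date

def months_between(start: date, end: date) -> int:
    return (end.year - start.year) * 12 + (end.month - start.month)

reigns = [
    ("8800 GTX", date(2006, 11, 1), date(2008, 6, 1)),    # dethroned by GTX 280
    ("GTX 1080 Ti", date(2017, 3, 1), date(2018, 9, 1)),  # dethroned by RTX 2080 Ti
    ("RTX 2080 Ti", date(2018, 9, 1), date(2020, 9, 1)),  # dethroned by RTX 3090
]

for card, launch, dethroned in reigns:
    print(f"{card}: ~{months_between(launch, dethroned)} months on top")
```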
 

I know you need to justify 2080 Ti > 3090, but really, don't. It's fine. All is well. Great cards.
 
Yeah, being salty is bad for your heart too; take care :)
 
3080 Ti. $500 less than the 3090, only a few hundred CUDA cores fewer, 4 GB less VRAM, and a slightly narrower memory bus. Easy choice.
 
To me it's not that simple, because Samsung's 7nm EUV HP process has a density of 77 MTr/mm², up from 44.4 MTr/mm² on the 8nm Custom node. GA102 could be shrunk so much that the 628 mm² die would fit in just under 400 mm² and clock higher, though it would probably also drop to a 256-bit bus with 16 GB. $999 for something that can be shrunk within the next 18 months or less to half the power and 50% more performance is not worth it. I think something like Maxwell could happen here (which arrived in less than a year to dethrone the GTX 780 Ti with something like the GTX 970): 3090 -> 4070 in this case.
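Here is that die-shrink arithmetic as a rough sketch. The density figures and die size are the ones quoted above, and a real shrink would never scale this cleanly.

```python
# Ideal (purely density-driven) shrink of GA102, using the post's figures.
# Real designs never scale perfectly; this is only the naive upper bound.
density_8nm = 44.4   # MTr/mm^2, Samsung "8nm Custom"
density_7nm = 77.0   # MTr/mm^2, claimed Samsung 7nm EUV HP
ga102_area = 628.0   # mm^2 on the 8nm node

ideal_shrunk_area = ga102_area * density_8nm / density_7nm
print(f"Ideal shrunk GA102: ~{ideal_shrunk_area:.0f} mm^2")  # ~362 mm^2, under 400 mm^2
```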
 
Interesting statistics... about 31% of the user base is willing to wait and see what AMD brings to the table. It means that AMD (with a properly optimized new power/performance architecture) could retake its former share of the market in essentially one generation.
 
You say that because you assume that the population who voted in this poll is representative of the whole market.

Well, it isn't; there's no way 12% of the market will buy the 3090. The sample is pretty far from representative.
 
Forget the whole market; the relevant market is those who can buy and run triple-A titles on their PCs. So I'd say the people who visit TPU are a good representation of the relevant market for the games industry.
12% of those people would like to buy the 3090, not will buy... but 31% of them are waiting for AMD to show something... big difference.

When you have a duopoly of this kind, things get interesting close to product launches... for example, look at what's happening on the CPU front with Intel and AMD; people are expecting something similar on the GPU front from AMD.
 

5,980 will buy Ampere, 3,369 RDNA2.

Don't forget this legendary poll.
 
I rarely play games now that I'm older, but I've been through a lot of hardware changes and I've learned that high-end and mid-range graphics cards are not worth it.
I see this "NVIDIA beat the consoles" talk, but they didn't beat anything. PS5 and presumably Xbox games are being made for RTX 2080-level hardware, while PC games will be made for GTX 1060/RX 480-580-level hardware, because that's what the majority of people have in their computers. Sure, there will be one or two games that use the RTX 3080 to its full potential to showcase it and boost sales, but that's not enough.
Hardware shouldn't be that important; quality content should be. If NVIDIA wants to sell GPUs, they should seriously sponsor a lot of game developers and bring some games to PC enthusiasts.
When a new console launches, they showcase games; when a GPU launches, they show you lots of graphs. I don't play graphs.
 
The GTX 1060 is GTX 980-level, six-year-old performance now. This is preposterous. I need 4K120. At first I play just to benchmark my system, but I get addicted and end up wasting thousands of hours that cost me thousands of dollars, so the price of the card is irrelevant; it costs much more than you think: your life. I'd rather not buy the card at all, but I'm too addicted to the games.
 
People like you, for whom the price of hardware is irrelevant, are so few that NVIDIA and AMD would go bankrupt relying on them; you don't even make up 1%, and nobody spends millions of dollars to make games for the 1%. That's why PC gaming is ridiculously expensive and, in my opinion, not worth it compared to consoles.
 
This is a tech site, most people coming here are first and foremost hardware geeks, not only gamers. It's a different subculture.
 
RTX 3080 looks only 65% faster than the RX 5700 XT in the AoS benchmark.

Navi 21 is expected to offer at least 100% performance improvement over Navi 10.

[Image: Ashes of the Singularity benchmark chart]
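For what the implied comparison is worth, here is the arithmetic as a sketch; both inputs (the ~65% AoS lead and the rumored 100% Navi 21 uplift) are taken from this post, not from benchmark results.

```python
# Implied Navi 21 vs. RTX 3080 comparison, using only the post's two figures.
# Both inputs are rumors/claims from the post, not measured results.
rtx_3080_vs_navi10 = 1.65   # RTX 3080 relative to RX 5700 XT (Navi 10) in AoS
navi21_vs_navi10 = 2.00     # rumored >=100% uplift of Navi 21 over Navi 10

navi21_vs_3080 = navi21_vs_navi10 / rtx_3080_vs_navi10 - 1
print(f"Navi 21 vs. RTX 3080 under these assumptions: {navi21_vs_3080:+.0%}")  # about +21%
```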

 
Don't forget this legendary poll.
Funny stuff. That's why I'd like to think this corner of the interwebz is reasonable... I realize how unreasonable it is to think that, as I couldn't type it with a straight face... but hey, relatively speaking?
 