
What would you buy?

  • RX 6800 XT / $510: 4,297 votes (34.6%)
  • RTX 3080 10 GB / $550: 759 votes (6.1%)
  • RTX 4070 / $600: 2,139 votes (17.2%)
  • RX 6900 XT / $650: 474 votes (3.8%)
  • RX 6950 XT / $700: 776 votes (6.2%)
  • RTX 3090 / $780: 469 votes (3.8%)
  • RTX 4070 Ti / $800: 1,411 votes (11.4%)
  • RX 7900 XT / $800: 2,095 votes (16.9%)
  • Total voters: 12,420
  • Poll closed.
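
For anyone checking the math, the listed percentages line up with the vote counts; here's a quick Python sanity check (card names and counts taken straight from the poll above):

```python
# Recompute the poll percentages from the raw vote counts above.
votes = {
    "RX 6800 XT": 4297, "RTX 3080 10 GB": 759, "RTX 4070": 2139,
    "RX 6900 XT": 474, "RX 6950 XT": 776, "RTX 3090": 469,
    "RTX 4070 Ti": 1411, "RX 7900 XT": 2095,
}
total = sum(votes.values())  # 12,420 -- matches the "Total voters" figure
for card, n in votes.items():
    print(f"{card:>15}: {n:>5} votes ({100 * n / total:.1f}%)")
```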
I am waiting, and I hope the 7800 XT sees the light of day this year so we can see how it performs.

But I wouldn't buy any of these cards right now. I will hold on to my PowerColor Radeon RX 6800 XT Red Devil; it gives me the satisfaction at 1440p that the RX 6800, RTX 3070, and RTX 3090 couldn't.
From what I've heard, Navi 31 has an artifacting issue in the silicon design that has been patched/disabled; it's partly responsible for the lower-than-expected performance at launch, and also the reason AMD kept a separate branch of drivers for the first three months. They dropped everything else and tried to fix the issue, which is also why there were no driver updates for prior-gen cards between December and February.

Navi 32 and Navi 33 have likely been postponed while they sort out the issue. The downside is that the 7700 XT and 7800 XT might be a bit later than expected, but the upshot is that they ought to be faster than expected, considering the 7900 series is slowed down by the software mitigations for the silicon design flaw. The source is MLID, but he was quoting a discussion he had with one of AMD's manufacturing partners, so I don't think it's BS.
 
I would buy the RTX 4070 all day: smaller, cooler, and more efficient.
 
AMD has also been working on fixing the vapor chamber cooling solution on their reference cards, since they've been having issues with the 7900 XT/XTX.

But so far I haven't had a hiccup; even undervolted, my card has been running great. It's my third RX 6800 XT.
 
Honestly, the VRAM is the deciding factor for me at the moment. I've been stung by lack of VRAM twice with Nvidia, at both 6 GB and 8 GB capacities.
There's no point having RT and DLSS 3 if you have to turn graphics settings down to low/medium.
 
Well, Nvidia has been playing this game for years now, and their overpricing finally clicked for me when I sold my GTX 1080 Ti and got an RX 5700 XT: same performance for me at 4K in the game I played back then.

Later I got the reference RX 6800, then an RX 6800 XT, then an RTX 3090, but I felt something was missing. Then I got my third RX 6800 XT, and I was happy with my gaming again.

Having been everywhere between 8 and 24 GB of VRAM, I can see that 16 GB is fine for now, even at 1440p, and even with The Last of Us Part I, as poorly optimized as it is.
 
Looking at the 7900 XT, it's obvious that the 7800 XT would be at best 10% faster than the 6800 XT, and I highly doubt it will be any better in performance per dollar. After all, the 7900 XT itself should have been the 7800 XT in the first place.
 
Nvidia was caught by surprise by the VRAM requirements. If you remember, before RDNA2 and the new consoles launched, Nvidia didn't have 12 GB in the low end; they had the 3060 Ti with 8 GB of VRAM.
Then, when they found out what the consoles were going to look like, they went "fuu.....k" and immediately launched a 12 GB RTX 3060 and a 12 GB RTX 3080.
So, in a way, they are not in the loop on what the trend will be. Consoles set the trend, and whatever new technologies Nvidia invents are not really taken up seriously; if something is useful and cheap to implement, AMD just copies it and makes it free, especially for consoles.
While ray tracing may be the holy grail on PC, console makers don't even want to hear about it when they find out how much compute power it needs and how costly the hardware gets.
 
From what I've heard, Navi 31 has an artifacting issue in silicon design that is patched/disabled and partly responsible for the lower-than-expected performance at launch, and also the reason why AMD had a separate branch of drivers for the first 3 months. They dropped everything else and tried to fix the issue which is also why there were no driver updates for prior-gen cards between December and February.

Navi32 and Navi33 have likely been postponed while they're sorting out the issue so the downside is that 7700XT and 7800XT might be a bit later than expected, but the upshot is that they ought to be faster than expected, when you consider that the 7900-series are slowed down by the software mitigations for the silicon design flaw. Source is MLID but he was quoting a discussion he had with one of AMD's manufacturing partners so I don't think it's BS.
There's still a fair number of 6000-series cards in stores, so I don't think they're in a rush anyway.

If that's true, then where's the 16 GB 3070 Ti that Nvidia originally planned to release but cancelled?
 
4070 only, because AMD cards don't have the oomph in workstation apps. Not yet, at least.

And my dumb a$$ bought a 3070ti lol!!
 
Planned obsolescence? Rethinking how good a 16 GB card would be and cancelling it so as not to ruin future sales?
It's already been established that even a 3070 would be a brilliant card if it had 12-16 GB of VRAM.
You've got to understand them: high-class hookers are expensive, and that money has to come from somewhere. Just kidding :) ... or not.
 
This is why some of the hardest cards to get are the 12 GB 3060 and the 12 GB 3080, unless you want to pay more than you should.
 
Likewise, there were rumours and pre-production leaks of a 3080 20GB that never came to fruition.
 
I will be getting a 6950 XT OC Formula in a couple of days.
 
OOoof.
It's not quite too late to sell it, though. I managed to dump my MSI 3070 on eBay last month for just £10 less than I paid for it.
Among the informed, the 8 GB limitation is a big problem, but the 3070 Ti is worth good money to the ignorant masses who (if you're lucky) watch the first 90 seconds of a launch-day YouTube 'review' and call it a day.

I'm shoving 3060 12GB cards into midrange pro workstations, as I have been since they launched. They're not the quickest, but speed is irrelevant when the alternative is "it crashed due to lack of VRAM and failed to complete the job/render/simulation". We have a few 3090s and RTX 8000s that staff can submit jobs to, but now is the time to pick up a used 3090 for cheap, I think. I bought two last week for the render farm, and where I work, those 24 GB pay for themselves in days. YMMV based on your workload, obviously!
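
The "check before you dispatch" idea is simple enough to sketch. This is a minimal, hypothetical example, not our actual dispatcher: it assumes the nvidia-ml-py bindings, and the 20 GiB threshold and submit_job() call are made-up placeholders.

```python
import pynvml  # NVIDIA Management Library bindings (pip install nvidia-ml-py)

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the box
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # .total / .free / .used, in bytes
    free_gib = mem.free / 1024**3
    if free_gib >= 20:  # hypothetical: this job is known to peak near 20 GiB
        print(f"{free_gib:.1f} GiB free: safe to submit the render")
        # submit_job(...)  # placeholder for whatever your farm uses
    else:
        print(f"only {free_gib:.1f} GiB free: job would likely OOM, re-queue it")
finally:
    pynvml.nvmlShutdown()
```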
 
Personally, I bought the 4080 for my 1440p 165Hz monitor. And I would buy it again because only with this card can you achieve truly sufficient performance at 1440p with a high refresh rate monitor, as well as get all the latest technologies.
 
Anything but the 10GB 3080, doesn’t matter if nvidia or AMD.
 
I just upgraded from a 6900 XT to a 7900 XTX and it was a decent boost. I would not buy a 6000- or 3000-series GPU for a new build, but on a budget anything goes.
 
Funny, I run 1440p144 with a 6800 XT and have no issue whatsoever, despite not having "the latest technologies", whatever those are.

What do you do, pray tell, with those "latest technologies" anyway?
 
1. I had a 3080, which had roughly the same performance as the 6800 XT, and I can say for sure that this level of performance is not enough for 1440p at 165Hz.
2. Thanks to the latest technologies, I am now playing Cyberpunk with path tracing at a smooth 90+ FPS with great image quality. A month ago, thanks to these technologies, I played Hogwarts on ultra with RT at over 130 FPS without any signs of CPU bottlenecking. And earlier, thanks again to these latest technologies, I played A Plague Tale and the Spider-Man remaster on ultra with RT at over 130 and 140 FPS.
 
Just pulled the trigger on the 6950 XT OC Formula. It'll be here Friday! Yay me! First brand-new GPU since 2013!
 
I feel if you're going for 165Hz and have any budgetary considerations whatsoever, the graphics settings in game play a far bigger role than the GPU performance. GPU performance seems to scale incrementally per generation. Graphics settings seem to scale exponentially with performance in complex AAA games.

No, you can't play CP2077 at 1440p165 on a GTX 1030, but you also *can* achieve that res/fps on much cheaper hardware than even a 3080 if you tune the settings appropriately. I managed >120fps on a 3070 with sweet-spot graphics settings without issues.
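
To put rough numbers on that, the frame-time budget is just 1000 ms divided by the refresh target, and it shrinks fast as the target rises, which is why settings tuning buys you so much:

```python
# Frame-time budget at common refresh targets: budget_ms = 1000 / hz.
for hz in (60, 120, 144, 165):
    print(f"{hz:>3} Hz target -> {1000 / hz:.2f} ms per frame")
```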

Path tracing is cool, but it comes at a high price, and given how hideous the performance hit has been in prior path-traced titles (Quake II, Portal), it's not going to be anything other than a marketing stunt for most game devs for a few more generations.

As cool as all these new high-end cards and mods/showcases are, the reality is that 99.x% of games run flawlessly on the PS4 or a midrange PC from the PS4 era. My main workstation has an RTX 8000 in it right now (3090, basically) but I still play most of my games on the vastly inferior RX 6700 in the living room, because sofa. All I have to do to make games run on the 6700 as well as they run on the RTX 8000 is enable FSR balanced. That's it - a little bit of image softening and shimmer to get the exact same result at 1/12th the price. The gameplay, the plot, the acting, the art assets, and the texture quality are all the same regardless of which system I'm playing on.
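
If you're curious what "FSR Balanced" actually costs in pixels, the FSR 2 modes just render at a fixed fraction of the output resolution per axis; a quick sketch at 1440p (the scale factors below are AMD's published ones, though engines may round the exact pixel counts slightly differently):

```python
# Internal render resolution implied by each FSR 2 quality mode at 2560x1440.
FSR2_SCALES = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

for mode, scale in FSR2_SCALES.items():
    w, h = round(2560 / scale), round(1440 / scale)
    print(f"{mode:>17}: ~{w}x{h} -> upscaled to 2560x1440")
```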
 
As it is, this was a combined effort between me and two of my friends... it'll do me just fine... I'm happy with 60 FPS locked at 4K.
 
Seriously considering the 4070 if I can find an FE in stock. I really want the two-slot width; the next slot is taken by a 10 Gb Ethernet card.

But I'm afraid it won't do VR flight sims the way I would like; it might not be enough.
 
I know about graphics settings and how they affect FPS. However, I prefer to play with beautiful graphics at 120+ FPS without the horrible shimmering of FSR, and I am willing to pay for it. I understand that some people don't care about graphics, and others don't care about FPS, but I want both.
 