| System Name | My second and third PCs are Intel + Nvidia |
| --- | --- |
| Processor | AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode |
| Motherboard | MSi Pro B650M-A Wifi |
| Cooling | Noctua NH-U9S chromax.black push+pull |
| Memory | 2x 24 GB Corsair Vengeance DDR5-6000 CL36 |
| Video Card(s) | PowerColor Reaper Radeon RX 9070 XT |
| Storage | 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda |
| Display(s) | Dell S3422DWG 34" 1440 UW 144 Hz |
| Case | Corsair Crystal 280X |
| Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
| Power Supply | 750 W Seasonic Prime GX |
| Mouse | Logitech MX Master 2S |
| Keyboard | Logitech G413 SE |
| Software | Bazzite (Fedora Linux) KDE Plasma |
> I see your point, but I still don't think one should need a $400 GPU just to run the game at 1080p with RT off.

I don't get the hate. The first 'next gen' games are always like this: devs eventually learn to optimize better, and cards get more powerful. This has always happened in gaming.
Visuals are getting harder to improve meaningfully without hurting performance; that's how diminishing returns work. I also wouldn't expect AAA, story-based games to err on the side of performance rather than visuals.
On the bright side, as long as Low gives you 95% of the visual quality of Ultra at 3x the FPS, I'm fine with that.

On the other bright side, I remember when I bought my 6750 XT and some people suggested I upgrade my monitor because 1080p was so old school and the card could do so much more. I said "nah, it'll be fine". Who was right, then?
