
Convince me that modern game graphics are good.

Not gonna lie..

Modern graphics smoke my old Atari 2600 by a lot.

So there has been some improvement.
 
I think I recall reading that the reason a lot of games of that console era had those bad color palettes (among other things) was to hide the drawbacks they were dealing with.
What? You mean to tell me that they weren't just going for the post-eruption Pompeii theme?

Seriously, I know that was a thing for ancient consoles, but the PS3/X360 were capable of doing standard 32-bit colour buffers, no different from any modern game. The Modern Warfare-esque grayish tonemapping and the rusty look of Gears of War et al. (what I assume you're referring to) were artistic choices.

All of the current problems in game graphics have cascaded off of a push to make workflows shorter and development cheaper.
To be fair, this push has also resulted in the greatness of said pre-2018 games. The physically-based rendering model was adopted to spare artists from creating multiple versions of the same shader for every scene. Advances in motion capture made facial and body animation look awesome without having animators waste months refining keyframes. And let's not even get into the tons of middleware and tooling advancements. No one would have licensed the latest Maya if it didn't come with new stuff to make pushing those vertices faster and easier. Battlefield V wouldn't have been as easy to make if they hadn't relied on Quixel's photogrammetry magic. Game engines themselves were conceived for this reason; I doubt the IQ curve covers what one needs to be able to even fathom making Resident Evil 7 fully with only an assembler and a year's supply of Scottish spirits...
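Just to spell out what "one shading model instead of a bespoke shader per scene" buys you, here's a toy Python sketch of the idea: a single Cook-Torrance-style shade() function whose look comes entirely from the albedo/roughness/metallic values an artist authors. It's only an illustration of the workflow argument (scalar grey albedo, a single light, made-up inputs), not any actual engine's code:

```python
import math

def ggx_ndf(n_dot_h, roughness):
    # GGX/Trowbridge-Reitz normal distribution term
    a2 = (roughness * roughness) ** 2
    d = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

def smith_g1(n_dot_x, roughness):
    # Schlick-GGX geometry term for one direction (light or view)
    k = (roughness + 1.0) ** 2 / 8.0
    return n_dot_x / (n_dot_x * (1.0 - k) + k)

def fresnel_schlick(v_dot_h, f0):
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def shade(albedo, roughness, metallic, n_dot_l, n_dot_v, n_dot_h, v_dot_h):
    """One shading function for every material: the look comes from the
    albedo/roughness/metallic maps the artist paints, not from a shader
    rewritten for each scene."""
    f0 = 0.04 * (1.0 - metallic) + albedo * metallic  # dielectric vs. metal reflectance
    spec = (ggx_ndf(n_dot_h, roughness)
            * smith_g1(n_dot_l, roughness) * smith_g1(n_dot_v, roughness)
            * fresnel_schlick(v_dot_h, f0)
            / max(4.0 * n_dot_l * n_dot_v, 1e-4))
    diffuse = (1.0 - metallic) * albedo / math.pi
    return (diffuse + spec) * n_dot_l

# Same function, two very different materials:
print(shade(albedo=0.5, roughness=0.8, metallic=0.0,   # rough painted wall
            n_dot_l=0.7, n_dot_v=0.9, n_dot_h=0.95, v_dot_h=0.8))
print(shade(albedo=0.9, roughness=0.2, metallic=1.0,   # polished metal
            n_dot_l=0.7, n_dot_v=0.9, n_dot_h=0.95, v_dot_h=0.8))
```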
 
Game devs have gotten lazy
 
Path tracing and RT ambient occlusion are the future, and they look great in the games that support them. The issue is that it still takes a GPU that costs more than a PS5 Pro to run them at good speeds and resolutions (on top of AI upscaling and frame generation).
 
Path tracing and RT ambient occlusion are the future, and they look great in the games that support them. The issue is that it still takes a GPU that costs more than a PS5 Pro to run them at good speeds and resolutions (on top of AI upscaling and frame generation).
Not everyone's internet service is great either...
 
To be fair, this push had also resulted in the greatness of said pre-2018 games. Physically-based rendering model was adopted to spare artists creating multiple versions of the same shader for every scene. Advancement in motion capture made facial and body animation look awesome without having animators waste months refining keyframes. And let's not even get into the tons of middlewere and tooling advancement. No one would have licensed that latest Maya if it didn't come with new stuff to make pushing those vertices faster and easier. Battlfield V wouldn't be as easy to make if they didn't rely on Quixel's photogrammetry magic. Game engines themselves were conceived for this reason; I doubt the IQ curve covers what one needs to be able to even fathom making Resident Evil 7 fully with only an assembler and years' supply of Scottish spirits...
Fingers crossed they figure out again how to make games not look and run like absolute horse manure, then. I guess it's the beginning of a new development cycle for graphics and we're in the early-adopter period, the bottom of the barrel from which we can only go up.

It just feels so useless to me: such an immense amount of time that could have gone into actual content is instead spent on stuff around that content.
 
What? You mean to tell me that they weren't just going for the post-eruption Pompeii theme?

Seriously, I know that was a thing for ancient consoles, but the PS3/X360 were capable of doing standard 32-bit colour buffers, no different from any modern game. The Modern Warfare-esque grayish tonemapping and the rusty look of Gears of War et al. (what I assume you're referring to) were artistic choices.
*shrugs*

That would have been my initial presumption: that perhaps it was a conscious choice for the game(s) in question. The thing is, it was increasingly common during that era in particular, even for games where I wouldn't think it applied as merely an artistic choice. So yes, that's the joke: the hardware could have done a lot more colors, but a lot of the games leaned into a certain few limited color-filter looks. And it's not like games that could call for certain color palettes just sprang up at that time (they existed both before and after), so I'd attribute it to more than just artistic choice. For whatever reason, it was a trend of the time, and I read that there were a lot of technical reasons behind it, insofar as they were trying to make the games look better (while doing the opposite for some people), and some of them made sense to me.

Maybe I worded it wrong by mentioning the color palette first and foremost, because the bloom and blur in particular were perhaps the real front-runners, but the Brown and Yellow filters were definitely used in conjunction with those things a lot in that era. The mid-to-late 2000s and into the early 2010s were full of people wanting to get rid of that Brown filter look.

This comes to mind.


Certain color palettes to help establish a tone are fine. Games still do that today, and when it's done more subtly and where it makes sense, it works. It was definitely overdone during that time period, though.
 
Destroy All Humans - Remake vs. Original

After reading through all of this...I think what you are actually making a point of is that after-effects meant to make things "more real" are making graphics worse. I...well, I turn off the options if available. The option to add a smear of vaseline to the proverbial lens does make stuff appear more as it actually is...but nobody wants a billion dollars' worth of vaseline and no steps forward for artistic style.

That said, I think the Destroy All Humans remake shows the exact opposite. It took a basic style and did everything that it could to expand upon what at one point was a limitation of the consoles. I'd likewise tell you that the PS2, late-PS2, and current-generation Armored Core games have made excellent strides in the visual department not sucking...as well as the controls. See: the Armored Core controller grip. Note that Forbes is generally out of touch, but even they jumped on the bandwagon to run an article about the DualShock's infamous AC controller grip...which refused to use the analog sticks for aiming until late in the PS2 generation (Forbes).

I guess the real problem that I'd have in your shoes is how many games hide their lack of fun behind poor performance...and attribute that to the graphics rather than bad gameplay. I'm 100% with you on modern AAA games forgetting fun to pursue things like graphical fidelity (Heavy Rain), or even gameplay (Detroit: Become Human), but that's more a problem with an industry being led by the messaging rather than by what the consumers want. This is why graphical garbage like Pokemon on the Game Boy is still fundamentally the same fun core gameplay two decades later on the DS...but stuff like Suicide Squad dies due to being prettier but having poorer gameplay (despite both being a huge grind...I see you, 150+ Pokédex and level-100 grind, just like Suicide Squad missions).
 
*shrugs*

That would have been my initial presumption: that perhaps it was a conscious choice for the game(s) in question. The thing is, it was increasingly common during that era in particular, even for games where I wouldn't think it applied as merely an artistic choice. So yes, that's the joke: the hardware could have done a lot more colors, but a lot of the games leaned into a certain few limited color-filter looks. And it's not like games that could call for certain color palettes just sprang up at that time (they existed both before and after), so I'd attribute it to more than just artistic choice. For whatever reason, it was a trend of the time, and I read that there were a lot of technical reasons behind it, insofar as they were trying to make the games look better (while doing the opposite for some people), and some of them made sense to me.

Maybe I worded it wrong by mentioning the color palette first and foremost, because the bloom and blur in particular were perhaps the real front-runners, but the Brown and Yellow filters were definitely used in conjunction with those things a lot in that era. The mid-to-late 2000s and into the early 2010s were full of people wanting to get rid of that Brown filter look.
A big part of that, I think, was also the target demographic: the idea was that boys grow up playing games, and PlayStation proved that this was a huge market. Consoles have followed generational lifecycles for a while now. Dark and gritty was the name of the game. If it wasn't goth, it was some order of guns and explosions, sex and violence. Even a lot of fantasy gaming leaned into that. Those were the days of Carmageddon, Quake, and Unreal Tournament, where the soundtrack was metal mixed with progressive dance music. Tough was cool.
 
Not gonna lie..

Modern graphics smoke my old Atari 2600 by a lot.

So there has been some improvement.

E.T. is still the GOAT when it comes to graphics...
 
Dunno. I think modern graphics are pretty decent.

This was state of the art when I first started playing. (I still remember what this house 'looked like' to me back then.)

[attached screenshot]


More than a few hours/days/weeks/months playing this one too:

[attached screenshot]


</getoffmylawn>
 
I do this as a rule. Haven't used AA in any form since the days of CRTs.
Well, the GPU industry spent 20 years convincing us that higher resolution and more detail were why we should upgrade our graphics cards, and then game developers started throwing blur everywhere: Motion blur, chromatic aberration, TAA blur. Then you have to add the other "cinematic" effects into the mix, like outrageous amounts of bloom, vignette, lens flare, film grain, dirty lens, and depth of field, and you basically get utterly shit graphics that are striving to emulate a Michael Bay slow-mo explosion via shaky-cam as seen through the eyes of someone with severe cataracts.

Why spend all of this GPU power, VRAM, and SSD capacity on ultra-high-resolution, high-quality textures and anisotropic filtering to deliver the cleanest possible image if you're just going to use some shitty postprocessing to make it all a blurry mess? I had access to a blurry mess back in the 1990s via my CRT's sharpness knob at 512x384 resolution.
 
Motion blur, chromatic aberration, TAA blur.
All garbage. Honestly, what the hell? Why are they even options?

Why spend all of this GPU power, VRAM, and SSD capacity on ultra-high-resolution, high-quality textures and anisotropic filtering to deliver the cleanest possible image if you're just going to use some shitty postprocessing to make it all a blurry mess?
EXACTLY!! Hell yes! :rockout:
 
I do this as a rule. Haven't used AA in any form since the days of CRTs.
*panics*

I guess it depends on what resolution (or more accurately, what PPI) your display is, but I can hardly go without it.

TAA is acceptable if it isn't causing ghosting/after-images too badly, which is hit or miss from game to game.

FXAA is usually intolerable, but I'll take it in some cases if there are no other options.

Motion blur, depth of field, and most other stuff I typically turn off though.
 
Just give an older game a texture upgrade and run it at higher resolution and it looks great.
Gimme Descent Freespace and Freespace 2 Remastered/whatever and I'm sold. Unfortunately won't happen :(
 
Why?!?
I guess it depends on what resolution (or more accurately, what PPI) your display is, but I can hardly go without it.
Try it. If you're at or above 720p & below 1080p and no larger than 32", you'll likely notice a difference and whether or not it irritates you will be personal preference. I have a spare 32" TV that is 1366x768 and it looks fine to me.

If you're at 1080p or above, regardless of screen size, you are unlikely to see much of a difference unless you're 24" from your screen. 2160p (4K)? Forget about it; you'll never notice the difference in gameplay.
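If you want to put rough numbers on that, here's a quick back-of-the-envelope in Python. The ~60 pixels-per-degree line is the commonly quoted 20/20-vision threshold, and the screen sizes and viewing distances are just example assumptions, not measurements:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(panel_ppi, viewing_distance_in):
    # how many pixels fall within one degree of your field of view at that distance
    return panel_ppi * 2 * viewing_distance_in * math.tan(math.radians(0.5))

# ~60 px/degree is the usual 20/20-vision figure past which jaggies stop being resolvable
setups = [
    ("32in 1366x768 TV from ~6 ft",     ppi(1366, 768, 32),  72.0),
    ("24in 1080p monitor from 24 in",   ppi(1920, 1080, 24), 24.0),
    ("24in 1080p monitor from ~3.5 ft", ppi(1920, 1080, 24), 42.0),
    ("27in 4K monitor from 24 in",      ppi(3840, 2160, 27), 24.0),
]
for label, p, dist in setups:
    print(f"{label}: {pixels_per_degree(p, dist):.0f} px/degree")
```

By that maths, the 768p TV from the couch and the 4K monitor clear the ~60 mark, while a 1080p monitor at desk distance doesn't, which is roughly where personal tolerance for aliasing starts to matter.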
 
TAA is acceptable if it isn't causing ghosting/after-images too badly, which is hit or miss from game to game.

FXAA is usually intolerable, but I'll take it in some cases if there are no other options.
TAA can be decent if the developer has tuned it to deliver just enough frame blending to stabilise the image without losing too much detail. It's a sliding scale where you trade detail for image stability, and you really shouldn't use TAA any stronger than absolutely necessary. Sadly, no games really give you access to the tuning. If the developer has set TAA to sample the previous five frames, you're going to get five frames of smeary mess and no detail in motion.

FXAA isn't very good at all, but it gives a noticeably more stable image than disabling AA entirely, with minimal loss of sharpness at modern resolutions. It was a bit blurry when it launched as a feature back when 800x600 was the most common screen resolution, but at 1080p you have quadruple the pixels, which means that with FXAA you effectively get a quarter of the blurring that we used to have at 800x600.
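To put that sliding scale in toy-code terms, here's a rough Python sketch of TAA-style history blending; the blend weight stands in for the tuning knob that games sadly don't expose, and the numbers are made up for illustration rather than taken from any engine:

```python
def taa_resolve(current, history, history_weight):
    # Blend the newly rendered pixel with the accumulated history.
    # Higher history_weight = more temporal stability, more smearing of new detail.
    return history_weight * history + (1.0 - history_weight) * current

# A pixel on a thin, shimmering edge that flips between bright and dark every frame:
flickering_pixel = [1.0, 0.0, 1.0, 0.0, 1.0, 0.0]

for weight in (0.5, 0.9):
    history = flickering_pixel[0]
    resolved = [history]
    for current in flickering_pixel[1:]:
        history = taa_resolve(current, history, weight)
        resolved.append(round(history, 3))
    print(f"history_weight={weight}: {resolved}")
```

With the heavy weight the resolved value barely reacts to each new frame (stable, but fresh detail gets diluted away); with the lighter weight it stays sharper but keeps shimmering, which is exactly the trade-off above.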
 
Sorry, it was only meant as dramatic effect for how I can't tolerate going without it, haha.

I wasn't saying it's bad if it doesn't bother you. If anything, that's better because it's one less thing you have to worry about.
Try it. If you're at or above 720p & below 1080p and no larger than 32", you'll likely notice a difference and whether or not it irritates you will be personal preference. I have a spare 32" TV that is 1366x768 and it looks fine to me.
I'm currently on a 24", 1920x1200 monitor, so that's just under 95 PPI, and it's very obvious to me when there's no anti-aliasing. I sit over an arm's length away from my monitor.
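(For anyone wondering where the ~95 comes from, it's just diagonal pixels over diagonal inches:)

```python
import math

# pixel density of a 24" 1920x1200 panel
width_px, height_px, diagonal_in = 1920, 1200, 24.0
ppi = math.hypot(width_px, height_px) / diagonal_in
print(f"{ppi:.1f} PPI")  # ~94.3, i.e. just under 95
```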
TAA can be decent if the developer has tuned it to deliver just enough frame blending to stabilise the image without losing too much detail. It's a sliding scale where you trade detail for image stability, and you really shouldn't use TAA any stronger than absolutely necessary. Sadly, no games really give you access to the tuning. If the developer has set TAA to sample the previous five frames, you're going to get five frames of smeary mess and no detail in motion.

FXAA isn't very good at all, but it gives a noticeably more stable image than disabling AA entirely, with minimal loss of sharpness at modern resolutions. It was a bit blurry when it launched as a feature back when 800x600 was the most common screen resolution, but at 1080p you have quadruple the pixels, which means that with FXAA you effectively get a quarter of the blurring that we used to have at 800x600.
It truly is game-dependent. In most games I've played (which is far from an exhaustive list, mind you), TAA is very sharp but sometimes has ghosting/after-image artifacts with things in motion. FXAA is usually a blurry mess, but less severely so in some games than in others. I've noticed that in some games, one form is more tolerable than the other.
 

Convince me that modern game graphics are good.


[embedded YouTube thumbnail]


lol, next question?
 
Turn off motion blur and film grain, and the game instantly looks 10x better.

That goes for most games, and they're the first things I turn off.
 
Depends. The new Life is Strange, for example, looked stunning.
 
That goes for most games, and they're the first things I turn off.
I will give a hot take here. Film grain is a horrible effect pretty much universally, but motion blur can be a decent effect in certain games. I actually like having motion blur in a game like Helldivers 2. o_O
 