
Convince me that modern game graphics are good.

So I think most modern games' graphics are garbage and that old games often look a lot better (especially when I play older games at 5120x2880 super resolution).

This is how I imagine the average AAA game's graphics are made today (a bit of sarcasm included):

- Make a lot of high resolution textures.
- Keep spamming polygons and high-res textures until there are more jaggies than there are grains of sand on Earth.
- Realize that there are too many jaggies.
- Use the cheapest and lowest form of antialiasing you can find, like Call of Duty with its 5 AA options, each more useless than the previous one, either not reducing the jaggies enough or making everything blurry as hell.
- Realize that there are still too many jaggies and the game looks blurry even at 4K.
- Add even more useless and annoying post-processing effects, like motion blur, depth of field, film grain, aggressive bloom, ambient occlusion, etc.
- The game looks like an OIL PAINTING; it seems like someone smeared vaseline on your glasses.
- Additionally, make the art style as boring and realistic as you possibly can (I don't need it to be realistic; if I want to see something realistic I can go outside my house).
- GPU requirements get heavier and heavier, with a 4090 needed to play at 60 fps at 4K (:laugh::laugh::laugh:)
- ????
- Sell it at 80 euros.
- Profit!

Now, there are some things new games do better than old ones, like facial expressions, animations, and some details like characters' skin or hair, but I think 90% of everything else is worse...

What do you think?
Post screenshots of modern games that you think look GOOD and CLEAN. For example, I think Halo 3 (2007) at 5120x2880 looks better than Halo Infinite (2021) at 2560x1440 (both at high settings) in basically every way, while being a lot less demanding on both the CPU and GPU.
 
I don’t think that many sane people will disagree with the fact that older forward rendered games definitely looked at least “cleaner” than modern AAA fare does. Everything else is up to subjective preference, really, I do know people who enjoy the “filmic” look. I personally kinda hate it and consider it boring, but to each their own. That being said, I definitely consider the current trend of undersampling like half the things in the rendered frame and just relying on temporal AA of some sort to clean things up to be genuinely, objectively a horrible practice that is caused by developers overreaching with their graphical ambitions and having to scramble for performance. That, or being incompetent and/or lazy, but I wouldn’t want to appear crass.
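To make the undersample-then-accumulate point concrete, here is a minimal sketch of the basic idea behind a TAA-style resolve. It's purely illustrative C++ (the struct, function name and the 0.1 blend factor are my own assumptions, not any engine's actual code): each frame gets a small sub-pixel jitter, so a single frame is undersampled, and the resolve blends the new sample into a history buffer so the jittered samples average out over time.

```cpp
// Minimal sketch of jitter + temporal accumulation (TAA-style resolve).
// Illustrative only: names and the blend factor are assumptions, not a real engine's API.
#include <cstdio>

struct Color { float r, g, b; };

// Blend the new (jittered, undersampled) sample into the accumulated history.
// blend = 0.1 means ~90% of the pixel comes from previous frames.
Color resolveTAA(const Color& history, const Color& currentSample, float blend = 0.1f) {
    return { history.r + (currentSample.r - history.r) * blend,
             history.g + (currentSample.g - history.g) * blend,
             history.b + (currentSample.b - history.b) * blend };
}

int main() {
    // A static edge pixel whose jittered samples alternate between 0 and 1:
    Color history = {0.0f, 0.0f, 0.0f};
    for (int frame = 0; frame < 32; ++frame) {
        Color sample = (frame % 2 == 0) ? Color{1.0f, 1.0f, 1.0f} : Color{0.0f, 0.0f, 0.0f};
        history = resolveTAA(history, sample);
        if (frame % 8 == 7)
            std::printf("frame %2d: %.3f\n", frame, history.r);
    }
    // Converges toward ~0.5 (a smooth edge) as long as the scene stays still.
    return 0;
}
```

When nothing moves this converges to a nicely antialiased result; the problems people complain about start as soon as the history no longer matches the current frame.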
 
Starfield has some pretty great moments, but yeah, I basically agree with this. In the chase for fidelity they lose identity, and so all games from the same generation look the same.
 
So I think most modern games' graphics are garbage and that old games often look a lot better (especially when I play older games at 5120x2880 super resolution).
Ghost of Tsushima runs on my rig with no upscaling and no AA at 3440x1440, ultra everything, and looks razor sharp. A good example, I think. It can run without TAA and looks great. And I get > 100 FPS too; it is stutter-free and the input latency is extremely low.

It's a real joy playing games like that; just the base experience being so solid keeps you going, even if some parts are repetitive, simply because of how well it plays.
 
I don’t think that many sane people will disagree with the fact that older forward rendered games definitely looked at least “cleaner” than modern AAA fare does.
As far as photorealism is concerned - and discounting specific issues such as sampling noise - "better" is not "cleaner"; it's often quite the opposite.

keep spamming polygons and high res textures until there are more jaggies than there are grains of sand on Earth
Yes, a scene with one perfectly-aligned quad would have no jaggies, but I'd hardly call that better.

Disregarding the aesthetic component, a scene that has more triangles (for non-planar forms) and smaller MMUs (edit 1: minimum mappable unit. Basically the smallest detail you can show on a raster) is, objectively, "better." We can argue how much better, but not the direction of progress itself.

Edit 2: The above is, obviously, concerning graphics in itself. The argument on whether graphics matter in games is another topic.
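Edit 3: to put a rough number on the MMU idea, here is a back-of-the-envelope example using standard pinhole-projection math (the function name and the sample figures are just illustrative):

```cpp
// Rough estimate of how many pixels a feature of a given world size covers on screen.
// Standard pinhole-projection math; the numbers below are just an example.
#include <cmath>
#include <cstdio>

double featureHeightInPixels(double featureSize,  // world-space size (e.g. metres)
                             double distance,     // distance from the camera, same units
                             double vfovDegrees,  // vertical field of view
                             double imageHeight)  // vertical resolution in pixels
{
    const double kPi = 3.14159265358979323846;
    double vfov = vfovDegrees * kPi / 180.0;
    // The view frustum is 2 * d * tan(fov/2) tall at distance d.
    return featureSize * imageHeight / (2.0 * distance * std::tan(vfov / 2.0));
}

int main() {
    // A 10 cm detail seen from 50 m away with a 60-degree vertical FOV:
    std::printf("1080p: %.2f px\n", featureHeightInPixels(0.1, 50.0, 60.0, 1080.0));
    std::printf("2160p: %.2f px\n", featureHeightInPixels(0.1, 50.0, 60.0, 2160.0));
    // Anything that lands at or under a pixel can't be resolved cleanly; it either
    // vanishes or flickers, no matter how big the polygon and texture budget is.
    return 0;
}
```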
 
As far as photorealism is concerned - and discounting specific issues such as sampling noise - "better" is not "cleaner"; it's often quite the opposite.
Yup, but I was talking strictly from the point of visual clarity and readability. Photorealism isn't "clean", that's true. My personal opinion is just that games don't really need to strive for it and it's a fairly boring aesthetic to boot, but as I said - that's a subjective take and people will differ on it. I would take a Vanillaware hand-drawn style or a PoP 2008 painterly one over a hundred "photorealistic" AAA games any day, but it's up to taste.
 
I definitely consider the current trend of undersampling like half the things in the rendered frame and just relying on temporal AA of some sort to clean things up to be genuinely, objectively a horrible practice that is caused by developers overreaching with their graphical ambitions and having to scramble for performance

I think it's caused by developers wanting to make trailers and ads look good, but then the consoles just aren't powerful enough, so they have to undersample and use dynamic resolution all the time.
Disregarding the aesthetic component, a scene that has more triangles (for non-planar forms) and smaller MMUs (edit 1: minimum mappable unit. Basically the smallest detail you can show on a raster) is, objectively, "better." We can argue how much better, but not the direction of progress itself.
Increasing resolution would be the better option imo. Instead we have devs trying to make games as photorealistic as possible while the majority of people are still stuck at 1080p (I will never go back to that resolution unless it's on a small 12-14" screen), and we get this weird image quality where, yes, you can feel that the game is trying to have "lifelike" graphics, but the result is a mess...

And people aren't going above 1080p because game requirements keep increasing constantly and you need to upgrade your graphics card just to keep pace at 1080p. It's a shame.
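Just to put numbers on the resolution argument (plain arithmetic, nothing game-specific): rasterization cost scales roughly with pixel count, so every resolution step is already a sizeable jump in GPU work before any extra "photorealism" gets piled on top. A quick sketch:

```cpp
// Pixel counts for common resolutions and their cost relative to 1080p.
// Rasterization cost scales roughly (not exactly) with pixel count.
#include <cstdio>

int main() {
    struct Mode { const char* name; int w, h; };
    const Mode modes[] = {
        {"1920x1080 (1080p)",  1920, 1080},
        {"2560x1440 (1440p)",  2560, 1440},
        {"3840x2160 (4K)",     3840, 2160},
        {"5120x2880 (5K/DSR)", 5120, 2880},
    };
    const double base = 1920.0 * 1080.0;
    for (const Mode& m : modes) {
        double pixels = static_cast<double>(m.w) * m.h;
        std::printf("%-20s %9.0f pixels  (%.2fx the work of 1080p)\n",
                    m.name, pixels, pixels / base);
    }
    return 0;
}
```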
 
That's been on my mind for a while now, it's all true.
- Textures: Back when we had more limitations, they couldn't just import any random texture photo 1:1; they had to work on it to make it good enough, and I think that translated to a cleaner overall look on screen.
- Models: They just load the infinite-polygon models because the tech allows it (yay for UE5) and ignore all the other work related to them, like proper LODs, etc. The overall scene looks busy and cluttered with no distinct elements, and small objects or smaller parts of models don't render well in the distance.
- Lighting: They rely on a dynamic lighting system for everything instead of using more efficient baked lighting. Older games had to hand-craft most lighting to make it look good, so more thought went into it. Dynamic lighting is also expensive, so they use sampling or compromise other areas to save on performance.
- SAMPLING: supersampling, sampling for lights, shadows, rays, sampling for EVERYTHING... I think this is the biggest issue nowadays, because they are trying to use techniques that simply require far more resources than we currently have. Light and shadows are noisy, so they slap on a thick layer of denoisers, and that makes them look painterly (see the sketch after this post).

I have been really allergic to image instability in the past, like aliasing causing lines to flicker. I hoped my problem would be solved by moving to 4K and beyond once that seemed achievable... and now we have this state: shimmering, flickering, noisy visuals that also look blurry at the same time.
Games are trying to construct the presented image from a lot of missing information because of upscaling and low-resolution sampling, and then trying to make it presentable. Of course it looks bad.
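A toy illustration of the sampling point above (my own example, not any engine's code): a Monte Carlo estimate's noise only shrinks with the square root of the sample count, so at the one or two samples per pixel a real-time budget allows, raw lighting and shadows are very noisy and have to be cleaned up by a denoiser - which is exactly where the smeary, painterly look creeps in.

```cpp
// Toy illustration of sampling noise: the spread of a Monte Carlo estimate
// shrinks roughly as 1/sqrt(N) with the number of samples per pixel.
#include <cmath>
#include <cstdio>
#include <initializer_list>
#include <random>

// Estimate the average of a made-up "incoming light" function with N random samples.
double estimate(int numSamples, std::mt19937& rng) {
    std::uniform_real_distribution<double> dist(0.0, 1.0);
    double sum = 0.0;
    for (int i = 0; i < numSamples; ++i) {
        double x = dist(rng);
        sum += std::sin(3.0 * x) * std::sin(3.0 * x);  // arbitrary smooth integrand
    }
    return sum / numSamples;
}

int main() {
    std::mt19937 rng(42);
    for (int n : {1, 4, 16, 256}) {
        // Measure how much the estimate jumps around between "pixels".
        double mean = 0.0, sq = 0.0;
        const int trials = 10000;
        for (int t = 0; t < trials; ++t) {
            double e = estimate(n, rng);
            mean += e;
            sq += e * e;
        }
        mean /= trials;
        double stddev = std::sqrt(sq / trials - mean * mean);
        std::printf("%4d samples/pixel -> noise (std dev) ~ %.4f\n", n, stddev);
    }
    return 0;
}
```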
 
I admit, it has been a while since a game impressed me visually. I thought Cyberpunk looked awesome (properly patched and tuned). I really liked the look and feel of Doom 2016. I have always been on the performance side of the performance vs. image quality question. I don't like motion blur, depth of field, lens flares, etc. I have always thought that nice-looking textures don't need a lot of filters or effects.

But I was blown away by the light mapping in Doom 3. DirectX 9 was a huge step up. Unreal Tournament on an AGP Voodoo 5 running Glide was amazing. Final Fantasy 7 on PC running Direct3D instead of the crappy software rendering like the PlayStation was a whole new ball game. It seems there are fewer major leaps these days.
 
Can't and won't convince you, since for me it depends on various things, even on a game-to-game basis, and it's highly subjective.
Personally I like modern graphics like the ones in Horizon Zero Dawn/Forbidden West, Hellblade, Cyberpunk, Plague Tale Requiem, etc., but I also like anime/drawn-style games like Wuthering Waves, for example - I can't imagine any other type of graphics for such a game - and I also like the graphics in Borderlands 3 and feel they suit the game perfectly.

Though I will admit that I have a distaste for really dated graphics and for the retro-modern pixel-art style, which just doesn't click with me most of the time. That doesn't mean I won't give those games a chance gameplay-wise, but it's definitely not my kind of thing.
 
I have been really allergic to image instability in the past, like aliasing causing lines to flicker. I hoped my problem would be solved by moving to 4K and beyond once that seemed achievable... and now we have this state: shimmering, flickering, noisy visuals that also look blurry at the same time.
Me too - I could tolerate it, but it was a constant distraction; that's why I upgraded to 1440p. 4K is too expensive for me right now. So you're on 4K and still having problems with image instability, right? The best treatment for it is DLAA imo, but sadly it's not in every game... sometimes I have to use brute force (Nvidia DSR).
I thought Cyberpunk looked awesome (properly patched and tuned)
I'm looking to play Cyberpunk as soon as I upgrade my CPU. What do you mean by patched and tuned? Official updates or some kind of mod?
 
Sometimes I disable antialiasing completely because all of the other options are a blurry, smeary, ghost-trailing mess. TSR, TAA, FSR, DLSS - I hate it all, because it only looks good when the image is static, and I'm not playing Photo Simulator 9000; I need the image to look good when I'm actually playing, not standing around idle.

An image with no AA at all is pretty awful, but I'd rather deal with a bit of shimmer and jaggies than have everything blurred and smeared in motion. Thankfully many of the UE5 games today at least have FXAA, which sucks, but it's an improvement on no AA at all.
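(For context, the ghost-trailing has a fairly mechanical cause: the temporal options reuse a reprojected history buffer, and when movement makes that history stale, blending it in anyway leaves trails. The snippet below is a generic, illustrative C++ sketch of the usual mitigation - clamping the history to the current frame's local neighbourhood - not any specific game's or vendor's implementation. It also shows the trade-off: rejecting stale history throws away accumulated detail, so motion ends up noisier and softer than a still image.)

```cpp
// Illustrative sketch of TAA history clamping; not any particular engine's code.
#include <algorithm>
#include <cstdio>

struct Color { float r, g, b; };

Color clampColor(const Color& c, const Color& lo, const Color& hi) {
    return { std::clamp(c.r, lo.r, hi.r),
             std::clamp(c.g, lo.g, hi.g),
             std::clamp(c.b, lo.b, hi.b) };
}

// When the camera or an object moves, the reprojected history no longer matches the
// current pixel; blending it in anyway is what produces ghost trails. Clamping the
// history to the min/max of the current frame's 3x3 neighbourhood rejects stale
// colours, at the cost of throwing away accumulated detail.
Color resolve(const Color& history, const Color& current,
              const Color& neighborhoodMin, const Color& neighborhoodMax,
              float blend = 0.1f)
{
    Color clamped = clampColor(history, neighborhoodMin, neighborhoodMax);
    return { clamped.r + (current.r - clamped.r) * blend,
             clamped.g + (current.g - clamped.g) * blend,
             clamped.b + (current.b - clamped.b) * blend };
}

int main() {
    Color history = {1.0f, 0.2f, 0.2f};   // bright red object that has since moved away
    Color current = {0.1f, 0.1f, 0.1f};   // background now visible at this pixel
    Color nbMin   = {0.05f, 0.05f, 0.05f};
    Color nbMax   = {0.15f, 0.15f, 0.15f};
    Color out = resolve(history, current, nbMin, nbMax);
    std::printf("resolved: %.2f %.2f %.2f (stale red history rejected)\n",
                out.r, out.g, out.b);
    return 0;
}
```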

Watch this, if you haven't already:


 
I like the lighting in FM. When the sun is setting through the trees and across the track... very pretty.
 
Me too - I could tolerate it, but it was a constant distraction; that's why I upgraded to 1440p. 4K is too expensive for me right now. So you're on 4K and still having problems with image instability, right? The best treatment for it is DLAA imo, but sadly it's not in every game... sometimes I have to use brute force (Nvidia DSR).
Still on 1440p myself as well, because 4K performance hasn't been improving and I can't afford the GPU required for it.
But from what I see from others, the aliasing is better; however, the newly introduced problems are still there.

A good video that highlights the issues:
 
Imo games today are all too much the same and rely on the same tools and tricks that just result in the same fake or popular crap. Although they look impressive, I usually get the sense that you're looking at smoke and mirrors instead of something actually interactive. Instead of doing it the brute-force way - and what way is that?

Something like this.

These Unreal Engine 5 demos literally make Crysis look like PlayStation. Remember PlayStation? How it had that hard photo-realism look, like V-Rally, like photos stuck on the objects. Crysis took that to the next level, but I hated it, as it was just too hard to make out the scene without an unnatural amount of conscious processing; it was like everything in the jungle was 2D, really irritating. Crysis 3 was crap, Far Cry 6 was crap. And look at those new jungles in UE5 - it's kinda like real life, and watching that stuff in 4K is just so enjoyable. The fact is, many games don't look anything like these demos. However, lighting and water still have a long, long way to go: no game today can do water like those UE demos, and no game has yet mastered night-time driving lighting.

So the way I see it, games are very demanding for all the wrong reasons; they just don't look good despite being super demanding.
I don't want cinematic, or the popular mainstream stuff at 647 frames per second.
I'm horribly superficial: I just want to be able to sit back and drive a car in the wet, with pouring rain, gale winds, in pitch-black surroundings - no other BS fake lighting - and just let those powerful headlights cut through the darkness revealing immense detail, and be able to go, wow... just look at that... wow...
Hopefully by the time the PS6 arrives.....
 
So I think most modern games' graphics are garbage and that old games often look a lot better (especially when I play older games at 5120x2880 super resolution).
My beer-drinking neighbor, an airline pilot (B777 captain) who flies a "real" simulator, gets blown away by the graphics of MSFS 2020 and DCS. My sim rig is a 7950X, a Strix 4090 and a 49-inch Samsung G9.
 
The easiest example I can think of is Battlefield. Every version for years pushed the envelope on graphics, which included art style...until 2042.

I think BF:V had some of the best graphics I've ever seen in an FPS, and while I had issues with the map design from a functional perspective, the maps were beautiful to look at. The art style was there, they used photogrammetry to improve textures and visuals, the destruction was wonderful, and there was ray tracing; the only thing that sucked was that it launched with early DLSS that looked like Vaseline smeared on the screen. Turn that off and it looked outstanding.

Then came BF:2042, and oh boy. Bye bye photogrammetry, bye bye artistic design. Everything was sterile, bland and empty, and they had to drop ray tracing because they couldn't get it to work anymore and still perform well.

Not having ever spent time in a dev studio, I can only base guesses on what I've seen reported by various devs over the years, but it seems like most of the major studios started out with fewer people, people who really loved the games they created and spent time crafting their worlds. When the games went AAA and people started expecting yearly or bi-annual releases, the studios hired more people or brought in more studios, launches required more efficiency and expediency, and everybody rushed everything. The people who were likely slower to put out their truly creative work were pushed out one way or another (look at all the most influential and important devs that left DICE, for example, including the ones that developed the Frostbite engine). There's no time to optimize. They have to put as much sellable content into as little time as possible so the publishers and investors get the most profit possible. Everything is designed by committee. It's not just DICE though; this has happened everywhere.

Look at the difference:
[BFV screenshot]

[BF:2042 screenshot]


^This 2042 screenshot (pulled from someone on Reddit) is at 4K (somehow the BFV image I had got shrunk to 1600x900), and the BFV image still looks way better to me. So much more time was spent on art style, design, and rendering techniques.

I'd also say that there are plenty of devs at these studios who still care about the games, but the culture is different and the demands from management are different too. They only get the time they get, and the directives have changed.

There's another issue cropping up now where I'm uncertain what the long-term effects will be. UE5 is replacing everything. Yes, it can produce some fantastic visuals, but the main reason this is happening is that it's easier to use than other engines, and it's the efficiency with which content can be generated that is driving its domination. We've seen several instances where it also comes with stuttering or other problems, because even though it's faster to develop in, they're still not spending the time required to optimize, and there are still issues in the engine itself to overcome. Compound that with Nvidia and co. pushing DLSS/FSR/XeSS and we're in a situation where they're just pushing out unoptimized crap because "you can just turn on DLSS and frame gen and get more frames for free", but it comes at the cost of fidelity and/or latency. Most people call this laziness on the developers' part, but I'd call it cost cutting. I don't think the coders and artists want to cut these corners. I don't think they have a choice.
 
I've always tried to turn all that shit off and then crank the resolution to the maximum.
However, this strategy doesn't work on Red Dead 2, which looks really bad no matter what you do. You sort of have to mix and match filters to get it not to look blurry on PC. Really unfortunate.

I value high frame rate and clarity over "photorealism".

*Defining "shit" as the various filters: AA, AF, TAA, motion blur, etc.

Back in the mid-2000s (20 years ago) we had a real problem with the brown-blur aesthetic. Take a look at Far Cry 2 to see how obnoxious that was.
 
TAA can make this look like a static picture when it blurs so much detail out that movement seems to stop behind certain things. Fences and meshes are among the things TAA fails at badly; MSAA handles them really well, and so does SMAA.
Crossout had a pretty good video on the difference.
 
So I think most modern games' graphics are garbage and that old games often look a lot better (especially when I play older games at 5120x2880 super resolution).
I think some are good, but others I see as regressive.

As an example, I definitely prefer Vesperia's graphics to those of newer Tales games like Berseria and Zestiria.

Some things I feel are good but not an improvement over older games; I think the biggest example is ray tracing. RT on a low setting tends to look like trash, and RT on a high setting might look alright, but older pre-RT games can look just as good on their ultimate detail options. So I consider RT a trade-off: it helps Nvidia sell GPUs and makes it easier for devs to implement effects, but the downside is that consumers lose a ton of performance (and cash?) to get something they already had.

It's similar with engine and API modernisation. In all honesty I think DX9 is good enough visually and DX11 is definitely good enough, but when we talk about things like the move away from forward rendering, it seems the changes were made for the sake of developers, so they now have the APIs and engines they prefer to use (UE is very dominant now), while we lose good things like MSAA and SGSSAA and get trash like TAA instead. The new stuff does remove the CPU single-core bottlenecks that were notorious in DX9, but to me that's the only advantage. There are old DX9 games to this day that still struggle on CPUs many times more powerful than the ones around when those games were released; that's how bad the bottlenecking was. Interestingly though, whenever I have seen or played a game that supports both DX11 and DX12, DX11 ran better every time. That tells you something is wrong, either with the DX12 implementations we are seeing or with DX12 itself.

Tales of Zestiria really showcases how good SGSSAA is.
The PS4 version of the game is a jaggy, shimmery mess: low resolution combined with FXAA.
The PC version removes a lot of the jaggies but still has shimmering on edges. Add SGSSAA, however, and it turns into a work of art. It is magic. SGSSAA is far more impressive than RT will ever be, and it's a shame there was no attempt to make it mainstream (a brute-force sketch of the supersampling idea follows at the end of this post).

Some devs really struggle: there have been games released with performance issues, weak AA, poor shadows, etc., with the devs desperately trying to solve their problems with whatever cheap options they have. Then a modder comes along, maybe the SpecialK dev, and suddenly the game runs much better while adding mipmaps, enhanced textures, texture caching, SGSSAA, 8x shadow resolution and more, with the game's devs looking on in awe.

GTA5 with the best visual mods looks pretty amazing, but I bet that if there were a remaster with visuals on par, it would need way more resources to run as well as the modded original does.
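To make the SGSSAA point a bit more concrete, here is a brute-force sketch of plain ordered-grid supersampling, its simpler relative: shade the scene at n x n samples per pixel and box-filter down. SGSSAA proper uses sparse/rotated sample positions applied through the driver, so treat this only as an illustration of the cost and the cleanliness, not of the exact technique.

```cpp
// Ordered-grid supersampling sketch: render at (w*n) x (h*n), then average each
// n x n block down to one output pixel. 2x2 samples ~= 4x the shading work.
#include <cstdio>
#include <vector>

std::vector<float> boxDownsample(const std::vector<float>& hiRes,
                                 int w, int h, int n)
{
    std::vector<float> out(static_cast<size_t>(w) * h, 0.0f);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f;
            for (int sy = 0; sy < n; ++sy)
                for (int sx = 0; sx < n; ++sx)
                    sum += hiRes[(y * n + sy) * (w * n) + (x * n + sx)];
            // Edge pixels become partial coverage values, i.e. smooth edges, no jaggies.
            out[y * w + x] = sum / (n * n);
        }
    return out;
}

int main() {
    // 2x2 output image rendered at 2x2 supersampling (4x4 internal).
    const int w = 2, h = 2, n = 2;
    std::vector<float> hiRes(w * n * h * n, 0.0f);
    // Paint a hard diagonal edge into the high-res buffer.
    for (int y = 0; y < h * n; ++y)
        for (int x = 0; x < w * n; ++x)
            hiRes[y * (w * n) + x] = (x > y) ? 1.0f : 0.0f;
    auto lo = boxDownsample(hiRes, w, h, n);
    for (int y = 0; y < h; ++y)
        std::printf("%.2f %.2f\n", lo[y * w + 0], lo[y * w + 1]);
    return 0;
}
```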
 
So I think most modern games' graphics are garbage and that old games often look a lot better (especially when I play older games at 5120x2880 super resolution).
Try Avatar or TWWH3 at 4K. Even Spider-Man is beautiful. If you have an LED monitor, just turn the contrast and saturation up to acceptable levels. Or watch a YouTube video of a PS1 game and see how far 3D has come.
 
I think the kind of stuff that can be done (or will be done in the future) with lighting using path tracing looks pretty darn impressive. I look forward to when mainstream consumer hardware has enough grunt to run it as standard.

"Good graphics" is ultimately a subjective matter, but I think a lot of games of various art styles (realistic, anime, cartoon...etc.) can benefit from better lighting.
 
Don't need RT; DLAA only for me. A great story can captivate me over graphics.
Just finished the main story of God of War Ragnarok.
Hellblade II has the best graphics but a very short story; not everybody's cup of tea, but I enjoyed it a lot, just wish it wasn't so short.
 