
Demand VS Optimization

Are games becoming demanding or just badly optimized?

  • Yes, they are becoming more demanding.

    Votes: 5 12.8%
  • Yes, some are badly optimized.

    Votes: 28 71.8%
  • No, they run great for me.

    Votes: 6 15.4%
  • They are not demanding; I just have a potato for a PC.

    Votes: 0 0.0%

  • Total voters
    39
This is something that has been on my mind a few times. I have been wondering whether today's games are more demanding or just badly optimized. There are games that can use up almost 6-8 GB of VRAM, and then there are a few games that won't even run if the system has less than 4 GB of system memory. Take, for example, the game that kicked off a meme that continues to this day: "but can it run Crysis?" It was an older game (it came out in 2007) that we thought was demanding, but even today it will not run great on a new PC. So was it demanding, or just badly optimized? Will game developers stop putting in the effort to optimize their games if fast CPUs become mainstream and we all have GPUs with 6-8 GB of VRAM?
 
As long as it runs GTA V!!! (and GTA VI... :D)
:peace:
 
We've been experiencing the issue of un-optimised games for a few years now. There is usually a group of 'usual suspects' whom you can count on to release un-optimised games, as well as the odd few who have accidentally slipped into it, like stepping into a bucket of paint while painting your house.

Usual suspects (to name but a few):

Bethesda -- namely anything Skyrim or Fallout.
Bohemia -- ARMA/DayZ -- though maybe not so much DayZ, as the devs are supposedly rewriting the entire game in a newer engine. The current DayZ can still run like runny poop on some machines, and the same goes for ARMA III -- it is seriously unoptimised.
Rockstar North -- Grand Theft Auto, 'nuff said. Rockstar put a lock on one of their earlier GTA games that would not launch on my PC; I had to hack the game to get it to run, and even then it ran like absolute poop. It would not allow me to enable max graphical settings, and back then I was running 970 SLI.
Ubisoft -- The Crew, Watch Dogs, Far Cry, Assassin's Creed, The Division, Ghost Recon (2017) -- some of these were extremely badly optimised.
 
I don't know if it's just my general awesomeness, or the fact that I'm an MLG, but I never have any issues with underperforming games. I can run Arma II OA (fully modded) at over 150 frames per second at 1200p. :pimp:

I think once you've played a game like Arma II: Operation Arrowhead, most other unoptimized games pale in comparison. I remember you could have two GTX 690s and be LUCKY to see 50 FPS :roll: FFS, Bohemia Interactive!

Bethesda is another one.
I've never had any problems with Rockstar, though.
 
There is no rule to this. Some run great and look great, some look bad and run bad; most are somewhere in between. Games don't look as good as they could, I'll tell you that much. We've got ~10 TFLOPS cards available for purchase, but rather than using their potential, developers take old game engines, sprinkle them with some new technologies like screen space reflections, depth of field and HBAO+, and think they've done enough; what it really looks like is a polished turd. I can't complain about performance, since my GTX 1080 has no problems at 1440p, but I feel like graphics haven't progressed much. My top-10 list of best-looking games still has 2013's Crysis 3, 2014's Alien: Isolation and 2015's The Witcher 3 higher than most 2016/17 games I've played. Quantum Break is #1 best-looking, but it's a case of bad optimization: it doesn't even push the GPU to maximum utilization.
 
Will game developers stop putting in the effort to optimize their games if fast CPUs become mainstream and we all have GPUs with 6-8 GB of VRAM?

No, they are working harder than ever. They have to produce a working game that is a technical wonder in just two years, or even one. On top of that, hardware is advancing more and more slowly, so different APIs and programming techniques have to be employed to cope, which makes the situation even worse.

Think about this: it takes about 1,000 lines of code in something like Vulkan just to get a damn triangle to show up on the screen. And that's not even the most efficient way of doing it.
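
To give a flavour of that verbosity, here is a minimal sketch in C of just the very first step of a Vulkan "hello triangle": creating the instance. The calls are standard Vulkan 1.0 API; the program compiles and runs but draws nothing yet, and everything that actually produces a triangle still sits on top of it.

Code:
#include <vulkan/vulkan.h>
#include <stdio.h>

int main(void)
{
    /* Step 1 of roughly 10: create a Vulkan instance. */
    VkApplicationInfo app = {
        .sType            = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "triangle",
        .apiVersion       = VK_API_VERSION_1_0,
    };
    VkInstanceCreateInfo ci = {
        .sType            = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "failed to create Vulkan instance\n");
        return 1;
    }
    /* Still to come before one triangle appears: physical device
       selection, logical device + queues, surface + swapchain,
       render pass, shader modules, graphics pipeline, framebuffers,
       command pool/buffers, and fences/semaphores. */
    vkDestroyInstance(instance, NULL);
    return 0;
}

Multiply that ceremony by every remaining step and the ~1,000-line figure stops sounding like an exaggeration.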
 
Maybe. Back in the day, games (and other software) HAD to run well because the machines were so weak compared to what we have today. Hardware has since made leaps and bounds over what used to be available, so developers have either gotten lazy in that regard or turned their attention to other things rather than making the game work on slow hardware.
 
No, they are working harder than ever. They have to produce a working game that is a technical wonder in just two years, or even one. On top of that, hardware is advancing more and more slowly, so different APIs and programming techniques have to be employed to cope, which makes the situation even worse.

Think about this: it takes about 1,000 lines of code in something like Vulkan just to get a damn triangle to show up on the screen.

I don't doubt that games are harder to make; that's EA's #1 excuse for the inclusion of microtransactions in their games. On the other hand, I believe there are some very skilled people, and teams of people, in the industry who are good at what they do, in the same way that Visual Basic or C++ forms a foundation that makes it easy to move on to other, more advanced programming languages once you've nailed the basics. I don't think it's that much harder to program than existing games if you're familiar with the engine you're using; the problem, I think, is generally one of creativity and keeping the game as fresh and original as possible, and there are too many games that copy each other. For example, DayZ clones, or battle royale games like Fortnite and PUBG -- these games aren't doing anything new.

I hear it's a headache programming for the PC, because not everyone's setup is the same, and developers have to cater to those who might not have an absolute beast of a machine that can wreck any game. So I echo the sentiment about a studio only having a two-year cycle to develop a new game; sometimes that isn't long enough. It's the main reason why games are built for the console from the ground up.

I'm sure that most studios have some of the top talent in the industry on their teams, but somehow bugs always make it past QC. Sometimes fixing a minor bug will break many other things in the game, but there are games that, even after 3-5 years in development, still run like crap.
 
Most PC-exclusive games are not well optimised, but usually turning off one or a few settings like AA or AO gets you good framerates on most hardware.
You only need to find the settings that balance performance and IQ on your hardware.
For example, in Warhammer 2, if I play on Ultra I get about 40-50 fps in battles, with dips to 30 in larger 2v2 battles; dropping to High settings gains me about 20 fps.
 
I'm sure that most studios have some of the top talent in the industry on their teams, but somehow bugs always make it past QC. Sometimes fixing a minor bug will break many other things in the game, but there are games that, even after 3-5 years in development, still run like crap.

I am sure they have skilled programmers, but their productivity has a limit; you can't scale it indefinitely. In this field, if you have 10 people working on something instead of one, they won't necessarily get it done 10x faster or 10x better.
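
A rough way to put numbers on that intuition (my analogy, not the poster's) is to borrow Amdahl's law: let $p$ be the fraction of the work that can actually be split up and $n$ the number of people:

$$\mathrm{speedup}(n) = \frac{1}{(1 - p) + p/n}$$

If only 80% of the work divides cleanly ($p = 0.8$), ten people deliver $1/(0.2 + 0.08) \approx 3.6\times$, nowhere near $10\times$.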
 
It depends, but the thing is, if you run today's games at settings similar to a console's, a potato PC can run them. That is the optimized setting; going beyond that may be somewhat optimized, but you're still just brute-forcing things.

A good example: Rome II vs. Total War: Warhammer. Rome II's CPU optimization was abysmal, but that was a direct consequence of the DirectX API and the way the engine was originally coded. Keep in mind the game engine used came out almost 9 years ago, so the optimizations that were made were for gunpowder-era design; hell, the entire engine was built with that in mind, then forced to do new things. With DX12, some engine limitations were removed thanks to changes in the API.

Generally speaking, game optimization is a balance of three major things: the settings (optimized for console), the API used, and the game engine used. Think of it as a Venn diagram; the point where all three meet is essentially console gaming. As you change settings beyond that, you're pulling out of that focused center. Depending on the API, the base settings could offer high frame rates, but Ultra settings may push post-process effects to an extreme degree that the API might not handle well. This creates an imbalance that results in constant bitching.

For example, the Total War series. The engine used in today's entries is 9 years old and was made with DX9/10 in mind; in that era, dual threading was pretty much the mainstay, and even that was problematic to implement. DX11 was just an extension of 10 with tessellation. Warhammer on DX12 removed constraints, which allowed for better threading optimization. Now add in graphics settings: Ultra unit size means massive armies, and each soldier needs individual animations matched against an opposing soldier, along with various stats to predict the next animation used. Now do that for 5,000 vs. 5,000 enemies on screen. To put it bluntly, that's a lot of processing. The same applies to other game engines.
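
To make that cost concrete, here is a deliberately naive sketch in C (my illustration; soldier_t, update_army and the animation choice are made up, not from any real Total War code). If every soldier scans every enemy, 5,000 vs. 5,000 is 25 million distance checks per frame before any pathing, stats or rendering happen:

Code:
#include <stddef.h>
#include <float.h>

typedef struct { float x, y; int anim; } soldier_t;

static float dist2(const soldier_t *a, const soldier_t *b)
{
    float dx = a->x - b->x, dy = a->y - b->y;
    return dx * dx + dy * dy;
}

/* Naive per-frame update: each soldier scans all enemies to pick a
   target and an animation. O(n_own * n_foe) work, every frame. */
void update_army(soldier_t *own, size_t n_own,
                 const soldier_t *foe, size_t n_foe)
{
    for (size_t i = 0; i < n_own; i++) {
        size_t best = 0;
        float best_d = FLT_MAX;
        for (size_t j = 0; j < n_foe; j++) {
            float d = dist2(&own[i], &foe[j]);
            if (d < best_d) { best_d = d; best = j; }
        }
        own[i].anim = (int)(best % 8); /* stand-in for real animation choice */
    }
}

Real engines cut this down with spatial partitioning and batching, but the per-unit work still piles up as unit sizes climb.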

Unreal Engine is quite adept at small maps and locations. Notice that many open-world games built on it have issues with stutter or asset-loading problems, because the engine wasn't, until recently, designed to do that.

The Unigine engine does tessellation quite well, and basic lighting too, but it uses a huge amount of unoptimized post-process effects. These create delays.

Grand Theft Auto IV, for example, was never meant to run at its Ultra settings. With hardware capable of running it, sure, you could push it to an extreme, but as the game was coded it was never meant to function at that point. That's not to say it's unoptimized; it simply wasn't coded with that in mind. Yet if a PC game doesn't have absurd Ultra settings, people bitch about it.

Fundamentally speaking, if you want higher-than-console graphics settings, you are essentially ASKING for unoptimized games. Optimizing for various settings is possible, sure, but optimizing for each individual architecture and graphics card is pretty much impossible. As such, pushing settings higher and higher results in a workload that is inherently unoptimized, because you are pushing the visuals beyond the scope of the developers' focused settings.

Example: want Skyrim or Fallout to look even more amazing? Push uGridsToLoad up, fundamentally expanding the LOD range dramatically. While not demanding graphically, it puts a HUGE strain on the CPU, since the game now loads actors and physics for those regions, which then need to be calculated even when you can't see them. This is inherent to game design as a whole; it boils down to balance, and on PC, developers allow us the ability to destroy the balance they created, all for ultra-pretty graphics.
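
For reference, the commonly cited tweak lives in Skyrim.ini and looks something like this (a hedged sketch: 5 is the stock value, 7 and up is where the CPU strain kicks in, and the buffer formula is community convention rather than official documentation):

Code:
; Skyrim.ini
[General]
uGridsToLoad=7
; community rule of thumb: uExterior Cell Buffer = (uGridsToLoad + 1)^2
uExterior Cell Buffer=64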

That does not mean unoptimized games are not released, because they really are, with a patch-later mentality. The biggest offenders tend to be movie tie-ins, Early Access titles, etc.

Of the games I have played of late, the most unoptimized one I have found is PUBG: I run low settings, or turn things off entirely, to maintain a stable frame rate even with a GTX 1080 Ti. Even that is improving regularly, though.

Then of course comes the problem with anything that is coded: you can fix one bug but potentially create another, or several. It's a cascading issue that takes time to work through, and it's time-consuming and manpower-consuming, meaning expensive. That doesn't excuse companies from not fixing bugs, but it generally means bug fixing is not as easy as "hey, look at this quest, why is it not working?" The issue could be caused by something unrelated that can't be seen in a typical debug. All of that needs to be taken into account.
 
I would say that as games get more technically advanced, the ratio of visual quality to optimization gets worse. It's like buying a high-end GPU: yes, you get more technical prowess, but much less bang for your buck.

As far as titles go, the worst offenders for me this year regarding poor optimization are RE7 and Wolfenstein II. I can actually run Ghost Recon Wildlands better than either of them, which is saying something, because it has a MUCH larger game world, it's very well detailed with great visual quality, you can travel through it with no load screens, and there's tons of active AI everywhere. I can also run it at pretty good settings.

Regarding your poll: it makes little sense, because literally every option in it could apply for many people. There's no single right answer.
 
Some are badly optimized, yes, but most games are generally made to work with the 'average' system specs of the playerbase: add a few pieces of visual eye candy for the higher tier and tone some down for the lower-tier systems. Since the average system gets better every year, the devs' ceiling rises as well. If you don't upgrade to somewhere near that 'average', of course you'll get left out, even if your old system could potentially handle the game with some very efficient coding. It's just cost-effective thinking: why spend money and time optimizing for the 1-5% of the population with low-end rigs when you can spend it optimizing for the average system, which is more like 50% of the population?

Also, for those games that use 6-8 GB of VRAM and have a 'high demand': I think it's a matter of scaling. Since the GTX 1060 (the top of the midrange) now comes with 6 GB of VRAM, people naturally want to take advantage of it, so developers code their games to scale with the system they're running on (e.g., they create higher settings).

For example, GTA V: 2 GB of VRAM worked fine for me, and the game used about that amount. Then I upgraded to 8 GB of VRAM; it still worked fine while using more VRAM, and it allowed me to raise my settings and get fewer stutters. Same with BF1: if you have a dual core it copes fine, and give it 6 cores and it uses all of them too.

So while your GTA V uses 6 GB of VRAM, that doesn't necessarily mean it's a demanding game, because there are people running GTA V with 2 GB of VRAM. The game just utilizes your rig's capability.
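
A minimal sketch in C of what that kind of scaling might look like (entirely hypothetical; pick_texture_quality and the thresholds are my illustration, not anything from GTA V's code): the game inspects the card it finds and sizes its asset quality to match, so a bigger card "uses" more VRAM for the same scene.

Code:
#include <stdio.h>

typedef enum { TEX_LOW, TEX_MEDIUM, TEX_HIGH, TEX_ULTRA } tex_quality;

/* Pick a texture tier from the detected VRAM budget (in MB). */
tex_quality pick_texture_quality(unsigned vram_mb)
{
    if (vram_mb >= 6144) return TEX_ULTRA;  /* 6 GB+: full-res assets */
    if (vram_mb >= 4096) return TEX_HIGH;
    if (vram_mb >= 2048) return TEX_MEDIUM; /* the "2 GB worked fine" case */
    return TEX_LOW;
}

int main(void)
{
    printf("2 GB card -> quality tier %d\n", pick_texture_quality(2048));
    printf("8 GB card -> quality tier %d\n", pick_texture_quality(8192));
    return 0;
}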

Scaling is not perfect, however; take core count. Developers have to design games with the 'average' system in mind, so when the average system was a dual core, as in recent years, it was better to code the game for two cores and then try scaling to more cores if they could, and they usually couldn't. Now the norm is quad core, so we are seeing more games that utilize 4 cores and above, and I hope that continues, because core wars.
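
In the same spirit, here is a hedged POSIX-flavoured sketch in C of scaling one update loop to however many cores the machine reports, instead of hard-coding two threads (update_range and the entity array are illustrative only):

Code:
/* compile with: cc -pthread scale.c */
#include <pthread.h>
#include <unistd.h>

#define N_ENTITIES  100000
#define MAX_THREADS 64

static float state[N_ENTITIES];

typedef struct { size_t begin, end; } range_t;

/* Worker: updates its slice of the entity array. */
static void *update_range(void *arg)
{
    range_t *r = (range_t *)arg;
    for (size_t i = r->begin; i < r->end; i++)
        state[i] += 1.0f; /* stand-in for real per-entity work */
    return NULL;
}

int main(void)
{
    long cores = sysconf(_SC_NPROCESSORS_ONLN); /* 2 on an old dual core, 8+ today */
    if (cores < 1) cores = 1;
    if (cores > MAX_THREADS) cores = MAX_THREADS;

    pthread_t tid[MAX_THREADS];
    range_t chunk[MAX_THREADS];
    size_t per = N_ENTITIES / (size_t)cores;

    for (long t = 0; t < cores; t++) {
        chunk[t].begin = (size_t)t * per;
        chunk[t].end = (t == cores - 1) ? N_ENTITIES : chunk[t].begin + per;
        pthread_create(&tid[t], NULL, update_range, &chunk[t]);
    }
    for (long t = 0; t < cores; t++)
        pthread_join(tid[t], NULL);
    return 0;
}

The hard part in a real game isn't spawning the threads; it's making the per-entity work independent enough that the slices don't fight over shared state.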
 
Ubisoft -- The Crew, Watch Dogs, Far Cry, Assassin's Creed, The Division, Ghost Recon (2017) -- some of these were extremely badly optimised.

Ubi is all over the place :) WD and AC Unity: horrible. The Division and Far Cry: awesome.

Most of this, however, isn't optimization-related; it's the basic architecture of the game and its engine that is at fault for bad performance. Engines evolve over time, and they can evolve past their original intended use. Ubisoft has repurposed its engines a hundred times, and some settings work well while others do not. The Division was built on a brand-new engine and offers superb visuals against great performance and use of system resources; all of that game's problems stem from networking. Watch Dogs: repurposed Anvil.
 
A lot of optimization is lost because game devs are always in a time crunch; they go with what works, which is usually barely enough to pass internal QA. Many of those same devs also do not know how to take advantage of multiple cores, whether from lazy, weak coding habits or from learning barely enough to get a job.
 
The majority are poorly optimized, just wham-bam-thank-you-ma'amed together, because they suffer from consolitis.
 
I have been wondering whether today's games are more demanding or just badly optimized.
Both, and that has always been the case. Why that's a bad question:

1978: computers and consoles had extremely limited resources. Even the most simplistic games were demanding; to make them run, they had to be optimized to the point that every bit mattered. The Atari 2600 had 128 bytes of RAM.

1988: developers had somewhat more freedom and better tools, but they were still seriously limited. They could now afford things, like sprites, that were unfathomable previously. The NES had 2,048 bytes of RAM.

1998: developers had more flexibility, but they still had to fit *everything* inside the constraints of the hardware (especially memory, both ROM and RAM). Game worlds could get bigger, textures more refined, and content more plentiful, but hardware was still a huge constraint. The N64 had 4 MiB of RAM. The PlayStation had a huge leg up because of the CD medium: they could actually use licensed audio and it wouldn't sound like ass.

2008: the ROM constraints had been alleviated thanks to high-capacity discs and the ability to change discs mid-game; however, RAM was still an issue, on PC because of the proliferation of 32-bit operating systems, and on consoles simply because they didn't have much. The Xbox 360 had 512 MiB of RAM.

2018: the memory constraints are finally mostly gone, but now developers want to stream everything to limit or eliminate loading screens. The bottleneck has moved from pure memory capacity to a CPU/GPU/memory question of what to display when. This is fundamentally why benchmarks include minimum framerate: it's in the instant when the CPU is deciding what the GPU needs to do that the framerate drops and players notice. The Xbox One X has 12 GiB of RAM.

So games have always been demanding, and optimization is mostly the arduous task of reducing that demand within a given timeframe. These are both things game developers have to wrestle with, and they can never both be completely satisfied, only balanced.
 
I had more problems in the past, if anything. I think the parity with console architecture nowadays actually helps too.

And don't get me started on 90s gaming. I come from the DOS and early Windows world first. You haven't seen bad gaming performance (or frustrating configuration) unless you were there.
 
Yeah... back when there weren't standard APIs for everything, every game had to be optimized for every kind of hardware it might encounter. It was terrible until the late 1990s, when game developers and hardware manufacturers coalesced around DirectX.
 
idk, I feel like things are better than ever as far as optimization goes. Sure, there are culprits who don't care, but most understand that the bulk of game sales are on console and low-to-midrange PC.

I just dropped from an R9 290, which was dated and nowhere near the fastest thing out there, to a GTX 760, and I have to say that at 1080p it still plays everything just fine.

I remember having to RMA a 9800 XT 256 MB back in the day and popping in my 9000 128 MB in the meantime... Half-Life 2 still played, because it dropped most of the complexity in its DirectX 8.1 mode, but so many others crawled and couldn't handle my monitor's native 1600x1200. I had to drop to 1280x1024 on most, and to 1024x768 on a couple (Doom 3, Rome: Total War).

That was easily as big a drop, and I've got to say it's been fine.
 