
No one buys games for graphics.

You are funny with your stupid bias, as this is the same game on the same hardware after the first game update.



I purposely separated these vids to catch some folks out, and you fell for it.

Still, it shows just how bad it is; some Nvidia users just prop up their own purchases to keep saying it's not a game or dev issue. We would expect a 4080 to be faster; I mean, it's $500 more than my 7900 XT.

This isn't bias, it's just that 5800X3Ds are not invincible speed-demon CPUs. Having one doesn't automatically mean every single game should run at "full ultra" settings. They're actually pretty slow unless your workload (or specific game) massively benefits from the large cache. When this isn't the case, performance just tumbles.

As for Radeons, is there a single game that doesn't launch in an utterly sorry state on AMD cards? Or that AMD doesn't take months to release a supported driver? It's just not bias, I'm bringing up the unpleasant and the inconvenient of being a Radeon gamer, that is all. New games always run poorly, crash or have bugs, with very belated game ready releases. It's been that way as long as I can remember.

Well, Tetris is addictive and was built with barely any graphics.

Game play is what matters. If it looks good then you can enjoy it even more.

Tetris, a good if not the perfect example. Played Tetris Effect? Probably the most wonderful puzzle game around, I'm a huge fan. It's particularly insane in VR, I hear, though I haven't tried. It doesn't make old NES Tetris for example unenjoyable in the slightest, but are you really going to look at all the love and effort, as well as the eye candy in Effect and brush it off as "needless and lame"?
 
@Dr. Dro respectfully we don't need to hear Nvidia marketing here, thank you.

Here are recommended Ultra specs for FM.

 
Good gameplay makes you forget graphics every time.
If not, the game is not good.
Simple as that.

You got it right in the first paragraph, using "I". However, you've no business speaking about what anyone else might feel is important.

So no, it's not at all as "simple as that". Very far from it.
 
@Dr. Dro respectfully we don't need to hear Nvidia marketing here, thank you.

Here are recommended Ultra specs for FM.


Have you noticed that none of these CPUs are the large-cache variants? The 5800X is faster than the 5800X3D if cache isn't involved, because it clocks higher and has a lower LLC latency (the X3D slice incurs a clock cycle penalty). The 11700K has something like a tiny 16 MB L3, if I remember correctly. It may well be that Forza Motorsport doesn't benefit from cache, couple that with the Radeon driver being the distinct piece of work it tends to be, and it's no wonder that you're sub-60fps on ultra high settings. Even then the game doesn't look unplayable to me, drop the raytracing which no doubt is enabled on Ultra and you're probably (completely) playable again.
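To put the cache-sensitivity point into something concrete, here is a minimal sketch (purely illustrative, not taken from the game or any real benchmark): a dependent pointer-chase whose working set fits in the 5800X3D's 96 MB L3 but not in a regular 5800X's 32 MB runs dramatically faster on the X3D part, while a small working set gains essentially nothing from the extra cache.

```cpp
// Minimal pointer-chase sketch (illustrative only): the working set size,
// not a couple hundred MHz of clock, dominates the per-access latency.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

// Build a single random cycle (Sattolo's algorithm) and chase it, so every
// load depends on the previous one and the prefetcher can't hide misses.
static double ns_per_hop(std::size_t bytes, std::size_t hops = 20'000'000) {
    const std::size_t n = bytes / sizeof(std::size_t);
    std::vector<std::size_t> next(n);
    std::iota(next.begin(), next.end(), std::size_t{0});
    std::mt19937_64 rng{42};
    for (std::size_t i = n - 1; i > 0; --i) {
        std::uniform_int_distribution<std::size_t> pick(0, i - 1);
        std::swap(next[i], next[pick(rng)]);
    }

    const auto t0 = std::chrono::steady_clock::now();
    std::size_t idx = 0;
    for (std::size_t i = 0; i < hops; ++i) idx = next[idx];
    const auto t1 = std::chrono::steady_clock::now();

    volatile std::size_t sink = idx;  // keep the loop from being optimised away
    (void)sink;
    return std::chrono::duration<double, std::nano>(t1 - t0).count() / hops;
}

int main() {
    // 4 MB fits in either chip's L3; 64 MB fits only in the X3D's 96 MB;
    // 256 MB spills to DRAM on both.
    for (std::size_t mb : {4, 64, 256}) {
        std::printf("%3zu MB working set: %.1f ns per dependent load\n",
                    mb, ns_per_hop(mb * 1024 * 1024));
    }
    return 0;
}
```

The buffer sizes and hop count are arbitrary; the only point is that when the working set spills out of L3, latency jumps by an order of magnitude, and when it doesn't, the extra cache buys nothing.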
 
Hard disagree, it looks like you're letting nostalgia for simpler times cloud your judgment. All of these are PS1-era games from the 1990s and early 2000s.
Hard disagree with what, exactly? I said I found myself enjoying these more than the recent ones.
I know I'm right, because it's me I'm talking about.
OP asked me to give my opinion. My opinion is that Re-Volt is hands down the most fun RC car game I've played; it has its soul and is tons of fun in multi. I enjoy it.
I still play war3 with friends on LAN once in a while, as well as Age 2, and it's the most fun.
I played Tony Hawk a week ago, and I enjoyed doing tricks I didn't know I could do, as well as listening to the soundtrack, which is a banger.
etc...
I enjoy this more than Cyberpunk 2077, for instance; walking down a neon-lit, soulless city filled with giant dildos and junkies doesn't scream fun to me.
Do you see what I'm getting at?
The last game which managed to check a lot of boxes for me was Plague Tale Requiem. It was beautiful in my eyes as well as story-driven.
The only reason I bought this game is that I'm French and wanted to play a medieval game set in my country during that part of history, so I bought the first one, then the second because I enjoyed the first, and I didn't regret it. Not about graphics.
 
Have you noticed that none of these CPUs are the large-cache variants? The 5800X is faster than the 5800X3D if cache isn't involved, because it clocks higher and has a lower LLC latency (the X3D slice incurs a clock cycle penalty). The 11700K has something like a tiny 16 MB L3, if I remember correctly. It may well be that Forza Motorsport doesn't benefit from cache, couple that with the Radeon driver being the distinct piece of work it tends to be, and it's no wonder that you're sub-60fps on ultra high settings. Even then the game doesn't look unplayable to me, drop the raytracing which no doubt is enabled on Ultra and you're probably (completely) playable again.
You don't know what you are talking about.

I had the 5800X myself before the X3D variant.

I am sure 200 MHz is responsible for a drop from 90-110+ FPS to 40 in your own head, my friend.
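As rough arithmetic (assuming stock boost clocks of roughly 4.7 GHz for the 5800X and 4.5 GHz for the 5800X3D, and optimistically assuming frame rate scales linearly with clock):

$$
\frac{4.7\ \mathrm{GHz}}{4.5\ \mathrm{GHz}} \approx 1.04
\quad\Rightarrow\quad
40\ \mathrm{fps} \times 1.04 \approx 42\ \mathrm{fps} \ll 90\text{-}110\ \mathrm{fps}
$$

A ~200 MHz deficit is worth a few percent at best, nowhere near the gap in question.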

The 7000 series, especially the 7900s, are in 3090 territory and can handle RT much better than the 6000 series; they are roughly on par with the 4070/4070 Ti in RT except for path tracing.
 
It's just not bias, I'm bringing up the unpleasant and the inconvenient of being a Radeon gamer, that is all. New games always run poorly, crash or have bugs, with very belated game ready releases. It's been that way as long as I can remember.

Can't say I ever had that impression of things while I was running a 6800XT. It worked well and performed well in every game I chucked at it.
 
Hard disagree with what, exactly? I said I found myself enjoying these more than the recent ones.
I know I'm right, because it's me I'm talking about.
OP asked me to give my opinion. My opinion is that Re-Volt is hands down the most fun RC car game I've played; it has its soul and is tons of fun in multi. I enjoy it.
I still play war3 with friends on LAN once in a while, as well as Age 2, and it's the most fun.
I played Tony Hawk a week ago, and I enjoyed doing tricks I didn't know I could do, as well as listening to the soundtrack, which is a banger.
etc...
I enjoy this more than Cyberpunk 2077, for instance; walking down a neon-lit, soulless city filled with giant dildos and junkies doesn't scream fun to me.
Do you see what I'm getting at?
The last game which managed to check a lot of boxes for me was Plague Tale Requiem. It was beautiful in my eyes as well as story-driven.

Cyberpunk was/is a flopped game, agree with you on this one, but I had things like Starfield, Baldur's Gate 3, Hogwarts Legacy, Jedi Survivor, Final Fantasy XVI, God of War and Ragnarok, Horizon Zero Dawn and Forbidden West, etc. - all of these are current-gen high-spec games which are really enjoyable, for example.

Will agree and have from the start pointed out that it's not mutually exclusive, but I personally don't feel gameplay has been sacrificed for graphics, or has to be, or vice versa. The biggest problem as always is budget.

You don't know what you are talking about.

I had the 5800X myself before the X3D variant.

I am sure 200 MHz is responsible for a drop from 90-110+ FPS to 40 in your own head, my friend.

The 7000 series, especially the 7900s, are in 3090 territory and can handle RT much better than the 6000 series; they are roughly on par with the 4070/4070 Ti in RT except for path tracing.

You're talking about next-gen ultra and a compromise on the same line. The 3090 is 3 years old by now (and so is Zen 3). Sure it's still faster at RT than anything AMD's released since bar the 7900 XTX, but it's still not the very latest hardware and you should expect that next-gen games pushing maximum graphics settings would weigh down on such a card.

Can't say I ever had that impression of things while I was running a 6800XT. It worked well and performed well in every game I chucked at it.

This year for example, AMD missed the entire Q1 AAA launch window without a single game ready driver and had an entire branch and series of drivers that worked only on the 7900 XTX, leaving everyone else in the dark. People tend to forget things like this.
 
Cyberpunk was/is a flopped game, agree with you on this one, but I had things like Starfield, Baldur's Gate 3, Hogwarts Legacy, Jedi Survivor, Final Fantasy XVI, God of War and Ragnarok, Horizon Zero Dawn and Forbidden West, etc. - all of these are current-gen high-spec games which are really enjoyable, for example.

Will agree and have from the start pointed out that it's not mutually exclusive, but I personally don't feel gameplay has been sacrificed for graphics, or has to be, or vice versa. The biggest problem as always is budget.



You're talking about next-gen ultra and a compromise on the same line. The 3090 is 3 years old by now. Sure it's still faster at RT than anything AMD's released since bar the 7900 XTX, but it's still not the very latest hardware and you should expect that next-gen games pushing maximum graphics settings would weigh down on such a card.
I was playing it at 1080p, not 4K like they were recommending, and still dropping below 60, and the frame rate is all over the place.

You have gone from "oh, Nvidia is better, AMD is poor and you should feel bad and know it!" to "just expect all games coming out to crush your high-end system". The 4090 is a halo-tier product like the 3090 and its Ti variant; they are ridiculous. Back when I had the GTX 480, that was the peak of the high end, or you could buy a GTX 460 2Win, which beat it, and get halo-level performance.

5970 is a halo tier product vs a 5870. 6990 vs the 6970. GTX 780 Ti and TITAN BLACK vs 780.

Except back then you were on the high end and you definitely had high end framerates.


It's all in your own head, @Dr. Dro, so you can justify what you buy. Shilled and donkey-pilled.
 
I bought the first Crysis for the graphics, then I found out that I needed a much more powerful PC.
 
You're talking about next-gen ultra and a compromise on the same line. The 3090 is 3 years old by now (and so is Zen 3). Sure it's still faster at RT than anything AMD's released since bar the 7900 XTX, but it's still not the very latest hardware and you should expect that next-gen games pushing maximum graphics settings would weigh down on such a card.
While true, we can still get a pretty educated opinion on a game's overall performance vs its looks vs the hardware thrown at it.

And honestly, even at 90+ FPS, I still think what you get on screen for that on current-gen high end is a fucking disgrace. It's a bloody racing game that seems to render with the efficiency of a Ubisoft title. These games should be pushing 120+ on midrange hardware.
 
I was playing it at 1080p, not 4K like they were recommending, and still dropping below 60, and the frame rate is all over the place.

You have gone from "oh, it's Nvidia" to "just expect all games coming out to crush your high-end system". The 4090 is a halo-tier product like the 3090 and its Ti variant; they are ridiculous. Back when I had the GTX 480, that was the peak of the high end, or you could buy a GTX 460 2Win, which beat it, and get halo-level performance.

5970 is a halo tier product vs a 5870. 6990 vs the 6970. GTX 780 Ti and TITAN BLACK vs 780.

Except back then you were on the high end and you definitely had high end framerates.

See, that's another problem I brought up in my initial post. You're playing at 1080p, so why do you even need graphics intended for UHD displays? It's gonna look super compressed.

Good that you mention dual-GPU cards; back then we had SLI and CrossFire. Both have gone the way of the dodo, and rightfully so: despite the wow factor, I'm willing in retrospect to say they sucked. I had both the 5970 (forced-on AFR, and CF couldn't be disabled in the drivers until many years later) and 3-way 480 (very poor support and scaling). They always had compatibility issues and incurred CPU overhead at a time when CPUs weren't exactly oozing performance like they do today. The original Titan had one distinct advantage over other cards, which is that it had 6 GB... 10 years ago, and about 1.5x the performance of the 680 (so a little slower than a 690 provided SLI worked well), but without scaling woes or the 2 GB limitation. It was a product for enthusiasts... which I also had. And yes, the R9 290X was a wiser buy and a better card.

While true, we can still get a pretty educated opinion on a game's overall performance vs its looks vs the hardware thrown at it.

And honestly, even at 90+ FPS, I still think what you get on screen for that on current-gen high end is a fucking disgrace. It's a bloody racing game that seems to render with the efficiency of a Ubisoft title.

It's a way of seeing things, and quite agreeable, but I'm perhaps more interested in how the game achieves said visuals than in the visuals themselves.
 
See, that's another problem I brought up in my initial post. You're playing at 1080p, so why do you even need graphics intended for UHD displays? It's gonna look super compressed.

Good that you mention dual-GPU cards; back then we had SLI and CrossFire. Both have gone the way of the dodo, and rightfully so: despite the wow factor, I'm willing in retrospect to say they sucked. I had both the 5970 (forced-on AFR, and CF couldn't be disabled in the drivers until many years later) and 3-way 480 (very poor support and scaling). They always had compatibility issues and incurred CPU overhead at a time when CPUs weren't exactly oozing performance like they do today. The original Titan had one distinct advantage over other cards, which is that it had 6 GB... 10 years ago, and about 1.5x the performance of the 680 (so a little slower than a 690 provided SLI worked well), but without scaling woes or the 2 GB limitation. It was a product for enthusiasts... which I also had. And yes, the R9 290X was a wiser buy and a better card.
You are just creating arguments to justify your original post, but the original post was just BS against AMD because you own Intel + Nvidia.

Then you insert an AMD Radeon card at the end to try and appear not so.

You are still forgetting the lifespan of those cards and just how fast they were when they launched; the 4000 and 7000 series are not old.
They tower well above console hardware.

The devs recommended a spec, and it should perform to that spec. Do you not see the problem yet? Or are you still trying to cover your tracks while maintaining you are right?
 
It's a way of seeing things, and quite agreeable, but I'm perhaps more interested in how the game achieves said visuals than in the visuals themselves.
And I totally get that angle too. I mean, sure more calculations are being done. But are they pulling their weight?

If anything, wrt graphics and game logic and their advancements... I mean, it's part of the reason I play games too. See the evolution, experience it. I'm finding it hard to speak of evolution here. More like sidegrading, with no real end goal other than to make money off what is essentially a subpar product, run on subpar technology. With RT, we're literally beta testing the goddamn tech with our wallets, and so far, honestly... what the hell have we really gained? Yeah, GPUs are mighty busy doing all the work, and yet, I can put it next to a well-designed, dynamically lit scene in raster and be hard pressed to spot differences - even in motion. Differences are there, certainly. But does it really matter if we can read a billboard in a puddle of water? Similarly, the upscaling technology - the longer we get to experience it in the wild, the more it is becoming a crutch to release games without optimization. All this new technology only seems to inspire uninspired, unmotivated and low-creativity products. I hate it.
 
This isn't bias, it's just that 5800X3Ds are not invincible speed-demon CPUs. Having one doesn't automatically mean every single game should run at "full ultra" settings. They're actually pretty slow unless your workload (or specific game) massively benefits from the large cache. When this isn't the case, performance just tumbles.

As for Radeons, is there a single game that doesn't launch in an utterly sorry state on AMD cards? Or that AMD doesn't take months to release a supported driver? It's just not bias, I'm bringing up the unpleasant and the inconvenient of being a Radeon gamer, that is all. New games always run poorly, crash or have bugs, with very belated game ready releases. It's been that way as long as I can remember.



Tetris, a good if not the perfect example. Played Tetris Effect? Probably the most wonderful puzzle game around, I'm a huge fan. It's particularly insane in VR, I hear, though I haven't tried. It doesn't make old NES Tetris for example unenjoyable in the slightest, but are you really going to look at all the love and effort, as well as the eye candy in Effect and brush it off as "needless and lame"?
I played Tetris Effect; it's awesome. They really enhanced the experience.
 
And I totally get that angle too. I mean, sure more calculations are being done. But are they pulling their weight?

Arguably not, given that people are upset about the rising system specs (back to the first post, economic rut, people unwilling to spend money on frivolous things such as entertainment). It will be a transitional period until hardware is fast enough to negate these rising needs, which will take some time still.
 
You just have to ask the specific questions:

Do I buy games specifically for graphics? No.
Are games with basic graphics necessarily boring? No.
Is it true that advanced graphics make a game more enjoyable? It depends.
Is it true that a few games require advanced graphics to be fully enjoyable? Yes!
Do you enjoy games with advanced graphics? Hell yes!
Do you like games that depend on such advanced graphics to fully flesh themselves out? YES.
Is it true that the average gamer considers graphical fidelity a generational marker? And the answer to that is absolutely.

The reason this generation feels "underwhelming" is that system requirements are about to shoot up to insane heights due to the use of ray and path tracing, very high resolution textures and general assets (which are increasingly unique, with more and more variety), advanced positional audio, etc. All the while, in real life, we're in an economic rut, with people unwilling to spend money to throw hardware at the problem; and at the same time, game developers have gotten really good at mastering the traditional raster-based techniques that make games look more than reasonably realistic - they look pretty much amazing as it is. This presents an ugly development reality, because a LOT of time and resources are spent to make games look this good with traditional technology.

Add all of that to the fact that desktop panels have not really advanced in resolution in the past 10 years whatsoever, and that the vast majority of displays are still the 1080p/Full HD standard with a few 1440p panels at the higher end, and there's just no point in software developed with UHD/4K and beyond in mind - leading to this illusion that things haven't really advanced despite the increase in requirements.

Enter Alan Wake 2, the game that's triggering all this discussion. It gave the boot to the ancient Nvidia Maxwell and Pascal GPUs as well as AMD's underqualified RDNA 1 hardware, and it requires DX12 Ultimate. People seem to be upset by it dropping downlevel hardware, but you have to remember, Nvidia's Turing is 6 years old and already offered DX12U support from day one. The upper settings aren't supported on AMD because, again, the game makes extensive use of raytracing, which their hardware and/or drivers are simply balls at. This conflicts with "gamer pride" and the idea that "latest-generation cards must always run ultra-high settings". At the end of the day, yes, the problem is you, the gamer, and not "optimization". Is asking for an RTX 2060 6 years later really too much? IMHO, no, it is not.
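For what "requires DX12 Ultimate" actually means in practice, it boils down to four hardware caps. Here is a minimal sketch of how a game could probe them (purely illustrative, not Remedy's actual check; assumes a recent Windows SDK):

```cpp
// Illustrative DX12 Ultimate capability probe: raytracing tier 1.1, VRS tier 2,
// mesh shaders and sampler feedback are the four caps the badge groups together.
#include <cstdio>
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12-capable adapter at feature level 12_0.");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7));

    const bool ultimate =
        o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1 &&
        o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2 &&
        o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1 &&
        o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;

    // Turing (RTX 2060 and up) and RDNA 2 onwards report all four; Maxwell,
    // Pascal and RDNA 1 do not, which is exactly where the cut-off lands.
    std::printf("DX12 Ultimate feature set: %s\n", ultimate ? "yes" : "no");
    return ultimate ? 0 : 1;
}
```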

But that's just how I personally see things.
@Dr. Dro Try and keep up with your actual argument instead of defecating on our intellects and yourself at the same time; this is embarrassing.

You went from "oh, upped to a 2060" and bashing others, then to "oh, it's a Radeon, of course it won't run well" - a 7900 XT.

You are beyond ignorant and some of us can see right through you.


If you are working for Nvidia, or have some kind of mechanism that makes money through them or Intel, then I guess I can understand; otherwise you are just a dumbass.
 
Arguably not, given that people are upset about the rising system specs (back to the first post, economic rut, people unwilling to spend money on frivolous things such as entertainment). It will be a transitional period until hardware is fast enough to negate these rising needs, which will take some time still.
I don't think that's gonna happen, especially given the way GPUs are moving right now. There's also the uncanny valley problem. At some point, more realism can actually counteract the (artistic) goal you want to achieve. This will inevitably mean that these techs are not universally applicable, because in most games they honestly have no merit; they only cost performance. Games compete amongst each other too - having lower system specs is an incredible boost to potential sales. It's part of the reason the PC gaming market can diversify as it does.
 
You are just creating arguments to justify your original post, but the original post was just BS against AMD because you own Intel + Nvidia.

Then you insert an AMD Radeon card at the end to try and appear not so.

You are still forgetting the lifespan of those cards and just how fast they were when they launched; the 4000 and 7000 series are not old.
They tower well above console hardware.

The devs recommended a spec, and it should perform to that spec. Do you not see the problem yet? Or are you still trying to cover your tracks while maintaining you are right?

Let's just not turn this into a full scale fight, I've no interest. You can see through me all you want, because I make it no secret that I currently dislike AMD, but it's not petty clubism or brand loyalty. That much I can assure you, in fact, few could ever claim to be as committed to AMD as I used to be 5 years ago. I had the full stack of flagships, and all of their HBM generations. Two Fury X's even, to end the CF era with a bang. Vega Frontier, because I believed in the mission. Radeon VII, because it was surreal hardware design at the time. I joined their beta-tester program. Was super active in it, too. But growing differences led to a bitter parting (on my end). And no, I just saw your edit, I am not trying to "justify" my expenses. I'm a tech enthusiast on a tech forum, and one of the perks of having finances in order is that you can buy any toys you like (within reason, notice I don't have a 4090).

What I'm saying, and will maintain, is that your game performs substandard relative to your hardware's technical capabilities because the drivers suck.

I don't think that's gonna happen, especially given the way GPUs are moving right now. There's also the uncanny valley problem. At some point, more realism can actually counteract the (artistic) goal you want to achieve. This will inevitably mean that these techs are not universally applicable, because in most games they honestly have no merit; they only cost performance. Games compete amongst each other too - having lower system specs is an incredible boost to potential sales. It's part of the reason the PC gaming market can diversify as it does.

Happen, it will, but it'll be due to natural causes, if you get what I mean. It'll still take 2 or 3 hardware generations for the flagship to achieve this full potential, and a little more for the affordable series to start handling these types of workloads with ease.
 
Let's just not turn this into a full scale fight, I've no interest. You can see through me all you want, because I make it no secret that I currently dislike AMD, but it's not petty clubism or brand loyalty. That much I can assure you, in fact, few could ever claim to be as committed to AMD as I used to be 5 years ago. I had the full stack of flagships, and all of their HBM generations. Two Fury X's even, to end the CF era with a bang. Vega Frontier, because I believed in the mission. Radeon VII, because it was surreal hardware design at the time. I joined their beta-tester program. Was super active in it, too. But growing differences led to a bitter parting (on my end). And no, I just saw your edit, I am not trying to "justify" my expenses. I'm a tech enthusiast on a tech forum, and one of the perks of having finances in order is that you can buy any toys you like (within reason, notice I don't have a 4090).

What I'm saying, and will maintain, is that your game performs substandard relative to your hardware's technical capabilities because the drivers suck.



Happen, it will, but it'll be due to natural causes, if you get what I mean. It'll still take 2 or 3 hardware generations for the flagship to achieve this full potential, and a little more for the affordable series to start handling these types of workloads with ease.
When you use your own experiences and beliefs to make an argument that is not based upon them, but to try and oppress everyone else the way you did, you sound like you are trying to market stuff to others based on what you own.

Instead of saying "my time with Radeon was not great, I prefer to use Nvidia, and in some instances Nvidia is more ready from the get-go in newer titles" - which is objectively false in the majority of cases and does not hold up to scrutiny... that would still be fine, as you'd be explaining your point of view.

But you went on to say everyone is an idiot, this is why and this is the only truth.

When called out you want to pretend you are some paragon of peace appealing to moderators because you lost intellectual footing.

As you stated, this issue is with you, no one else, a projection of your own issues.

Sort it and this won't happen again.
 
Tell that to classic text-based games like Zork or MUD, or 4-color block graphic classics like MULE, Missile Command, or Centipede. No one bought those games for the fancy visuals -- meaning the developers knew -- knew -- the gameplay had to be outstanding.

Today? Developers will spend 90% of their time on the visuals, and 8 of the remaining 10% trying to optimize performance so the thing even runs properly. Gameplay is a belated afterthought.
Obviously you haven't talked to any developers or done any game development. A decent dev team delegates what's being done: one person might do atmospherics, another would do character and NPC graphics, and yet another might do landscape graphics. But when it comes to gameplay, it takes several people to do gameplay mechanics, UI and quests, and then there is QA time needed, which is usually cut short due to launch deadlines.

So no, 90% is not visuals; indie games don't even spend that much time on visuals. If you really want to know about development, check out
, he is part of a team working on an MMO; the streamer himself worked on EverQuest 1.

My first thought on this thread was that it's one huge whine-fest about graphics, when in fact you can't tell what gamers like or don't like; to each his own.
 
My first thought on this thread was that it's one huge whine-fest about graphics, when in fact you can't tell what gamers like or don't like; to each his own.

Sums it up nicely. It was productive (sort of), so I'm satisfied :)
 
Happen, it will, but it'll be due to natural causes, if you get what I mean. It'll still take 2 or 3 hardware generations for the flagship to achieve this full potential, and a little more for the affordable series to start handling these types of workloads with ease.
Time will tell. We've also seen a very promising technology fade out of existence (3dfx).
 
This year for example, AMD missed the entire Q1 AAA launch window without a single game ready driver and had an entire branch and series of drivers that worked only on the 7900 XTX, leaving everyone else in the dark. People tend to forget things like this.

That may well be true (I haven't a clue, since I've been without the 68 for longer than that), but you clearly stated "It's been that way as long as I can remember". That's also quite possibly true, but only if you completely lack long term memory. My own memory is utterly shite, but I can remember quite clearly when it wasn't like that.
 
I'll never play games that look like Minecraft...
 