
AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4

Without RT though, how do we get better graphics? I think we hit diminishing returns with raster a long time ago. RDR2 and A Plague Tale already look photorealistic enough. So we either pause graphics (and the requirement for new GPUs) or push RT. Daniel Owen has made a really good video about this; I think you should check it out.
You're asking a question I feel no need to answer. A better question is: do we need better graphics, and if we do, what is an acceptable price for it?

That's the question I ask myself, and the market will ask the same one after the initial hype dies down and reality sets in. The push for RT is exactly that: a push. It's not a desire gamers have; it's something many take for granted, much like every other development. At the same time, you see a divide where a select group can access said features and the vast majority cannot.
 
Without RT though, how do we get better graphics? I think we hit diminishing returns with raster a long time ago. RDR2 and A Plague Tale already look photorealistic enough. So we either pause graphics (and the requirement for new GPUs) or push RT. Daniel Owen has made a really good video about this; I think you should check it out.
Graphics improvements have already slowed down to a level where you only have new GPUs every 2 years instead of 1, and your upgrades easily last 2-3 generations at least. That's 4-6 years. I remember when you had to buy a new GPU every year to stay up to date. I don't think RT is gonna push us out of this stagnation. Not at the pace it's currently progressing anyway.

Edit: My personal take is that with current prices, I'm fine with a little bit of stagnation (that is, not spending money all the time).
 
I don't know about crypto. Sure, mining is dead, but...

On the other hand, I do agree about "max stupid".
Personally, I'm planning to upgrade my PC with this generation (the 9070/XT is the most probable candidate), but that's it. I'm gonna call it quits until 2030 at least (hopefully).
The max stupid idea revolves around a sudden, collective realization that the current price is utterly bonkers. What's the practical use of a $100k bitcoin? It's too valuable to give away for anything. So what's it gonna do? Be a big number all the time? Or will you sell it at a loss?
 
You're asking a question I feel no need to answer. A better question is: do we need better graphics, and if we do, what is an acceptable price for it?

That's the question I ask myself, and the market will ask the same one after the initial hype dies down and reality sets in. The push for RT is exactly that: a push. It's not a desire gamers have; it's something many take for granted, much like every other development. At the same time, you see a divide where a select group can access said features and the vast majority cannot.
Sure, I wasn't even suggesting that pushing RT is better than just hitting the pause button. I'm not sure myself, lol.
 
Without RT though, how do we get better graphics? I think we hit diminishing returns with raster a long time ago. RDR2 and A Plague Tale already look photorealistic enough. So we either pause graphics (and the requirement for new GPUs) or push RT. Daniel Owen has made a really good video about this; I think you should check it out.
I'd honestly prefer we pause graphical fidelity and focus on art design instead. A realism-focused game won't look realistic in 5-10 years, but a game with good art design will be timeless.

Realism is good enough now; we're trading too many resources for too little gain.
 
Sure, I wasn't even suggesting that pushing RT is better than just hitting the pause button. I'm not sure myself, lol.
That's exactly it, right? I mean, in the end, everything's a deal with pros and cons; you weigh them and conclude whether it's worth it or not.
 
I'd honestly prefer we pause graphical fidelity and focus on art design instead. A realism-focused game won't look realistic in 5-10 years, but a game with good art design will be timeless.

Realism is good enough now; we're trading too many resources for too little gain.
Sure, the PS1 is a good example of that. It tried to push fidelity, and it kinda looks like crap nowadays versus SNES or even N64 games that are still good to look at.
 
I'd honestly prefer we pause graphical fidelity and focus on art design instead. A realism-focused game won't look realistic in 5-10 years, but a game with good art design will be timeless.

Realism is good enough now; we're trading too many resources for too little gain.
I agree. Besides, there are paths of development towards realism still left open. For example, people have only just started to look somewhat lifelike. We have beautiful light rays, shiny reflections, and ambient occlusion, which is all nice, but why do people still look like plastic dolls, especially in the rain? Not to mention cloth and hair, which don't look wet at all. And sand and mud, which we still do with bump mapping.
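To illustrate the bump mapping point: the heightmap only perturbs the shading normal, so the lighting picks up detail while the surface stays geometrically flat, which is exactly why sand and mud read as painted-on up close. A rough sketch in plain Python (the ripple heightmap and all the numbers here are made up for illustration, not from any engine):

```python
import numpy as np

def bump_normal(height, x, y, eps=1e-3, strength=1.0):
    """Perturb a flat surface's normal from heightmap finite differences.

    Classic bump mapping: only the *shading* normal changes. The
    geometry stays a flat plane, which is why bump-mapped sand or mud
    still has a perfectly flat silhouette up close.
    """
    dhdx = (height(x + eps, y) - height(x - eps, y)) / (2 * eps)
    dhdy = (height(x, y + eps) - height(x, y - eps)) / (2 * eps)
    n = np.array([-strength * dhdx, -strength * dhdy, 1.0])
    return n / np.linalg.norm(n)

# Made-up heightmap: small ripples standing in for sand.
ripples = lambda x, y: 0.05 * np.sin(40 * x) * np.sin(40 * y)

light = np.array([0.0, 0.5, 1.0])
light = light / np.linalg.norm(light)

n = bump_normal(ripples, 0.3, 0.7)
print("diffuse intensity:", max(0.0, float(n @ light)))  # lit as if bumpy, shaped as if flat
```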
 
I agree. Besides, there are paths of development towards realism still left open. For example, people have only just started to look somewhat lifelike. We have beautiful light rays, shiny reflections, and ambient occlusion, which is all nice, but why do people still look like plastic dolls, especially in the rain? Not to mention cloth and hair, which don't look wet at all. And sand and mud, which we still do with bump mapping.
The cake is a lie! (I actually think the non-RTX cake scene is far more evocative.)

 
Waiting until post-5070 rings true. They want full performance and street-price figures before pulling the trigger.
Strap in. This is going to take a while longer, but not much longer after the 5070's public release.
 
What rush? The overwhelming majority of games add RT as an afterthought, and what's there is, nine times out of ten, completely irrelevant or simply detracts from the image while costing FPS. Just the fact that it's marketed doesn't mean it's worth a damn. The RT applications that stay fully within the unified API space are often of that nature, while those outside of it are performance killers forcing proprietary upscaling. And what do you mean, handhelds? They can barely run the game as it is.

We're over 6 years into this RT paradigm now. It ain't going places much if you ask me.
Completely and utterly wrong. Real-time ray tracing is the future of graphics rendering, despite how many times you and others like you attempt to poo-poo it. Its uptake has simply been delayed by a number of factors:
  • It's a complete paradigm shift compared to rasterisation - you don't just need appropriate tools, but the appropriate mindset. Game developers who were born and raised on rasterisation are going to take time to get to grips with RT, and especially will have to unlearn a vast quantity of the stupid bulls**t hackery required to coerce rasterisation to render things somewhat realistically.
  • Game development is no longer about pushing the boundaries of technology, but making money. Even if developers want to implement RT, their managers aren't necessarily going to let them because of the extra training and development time, and thus cost. This creates inertia.
  • Hardware isn't quite powerful enough to handle it yet. You might say "then it shouldn't have been introduced", but you need to make game developers aware of and comfortable with a technology sooner rather than later.
  • Hardware isn't getting powerful enough at a fast enough rate to handle it. Unfortunately RT was introduced just before we hit the Moore's Law wall, which is particularly important given how hardware-intensive RT is.
RT has been the holy grail of graphics rendering forever. We may not yet be able to hold that grail, but we can at least touch it. If you'd suggested the latter would be a possibility to any computer graphics researcher a decade ago, they'd have laughed you out of the room - and yet here we are.

You don't like RT, we get it, but stop allowing that irrational dislike to blind you to the fact that RT is, in every aspect, the future of realistic graphics rendering that is superior to rasterisation in every conceivable way. In another decade, the only conversation about the latter will be in relation to graphics from before the RT era.
 
The cake is a lie! (I actually think the non-RTX cake scene is far more evocative.)

Interesting comparison. I think the RT version looks more grim and menacing. I'm just not sure whether it's supposed to be that way. A few light sources and textures seem to be a bit off, though, especially in the orange lighting. Papers on the wall seem to have a different brightness level than the rest of the wall, which is weird. Lights and their reflections look awesome, though. This is just my opinion.
 
I agree. Besides, there are paths of development towards realism still left open. For example, people have only just started to look somewhat lifelike. We have beautiful light rays, shiny reflections, and ambient occlusion, which is all nice, but why do people still look like plastic dolls, especially in the rain? Not to mention cloth and hair, which don't look wet at all. And sand and mud, which we still do with bump mapping.
Because rasterisation is a hack built on top of a hack built on top of a fridge that's propped up on bricks on the side of a steep hill in a tornado. The more hacks you add to try and achieve realism, the more likely it is that one will interact unintentionally with another - this is precisely why weird graphics glitches occur in game engines. RT is the only sane solution to this, because it simulates the world the way the world works, and you don't need any hacks to do that.
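For anyone who hasn't seen why people call RT the "sane" model: the whole algorithm boils down to intersection tests against the actual scene. A toy sketch (one hypothetical sphere and point light, plain Python, not any engine's code) where the shadow falls out of the simulation instead of needing a shadow-map hack:

```python
import numpy as np

def hit_sphere(origin, direction, center, radius):
    """Return distance along a unit-length ray to the sphere, or None on a miss."""
    oc = origin - center
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

# Made-up scene: one sphere, one point light.
center, radius = np.array([0.0, 0.0, -3.0]), 1.0
light_pos = np.array([2.0, 2.0, -1.0])

def shade(origin, direction):
    t = hit_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0                        # background
    p = origin + t * direction            # hit point
    n = (p - center) / radius             # surface normal
    to_light = light_pos - p
    dist = np.linalg.norm(to_light)
    to_light = to_light / dist
    # Shadow ray: ask the same intersection routine whether anything
    # blocks the path to the light. No shadow maps, no extra hacks.
    s = hit_sphere(p + 1e-3 * n, to_light, center, radius)
    if s is not None and s < dist:
        return 0.0                        # in shadow
    return max(0.0, float(n @ to_light))  # Lambertian diffuse

eye = np.array([0.0, 0.0, 0.0])
print("pixel brightness:", shade(eye, np.array([0.0, 0.0, -1.0])))
```

Reflections, refraction, and global illumination are all the same move: spawn another ray and ask the scene what it hits.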
 
Completely and utterly wrong. Real-time ray tracing is the future of graphics rendering, despite how many times you and others like you attempt to poo-poo it. Its uptake has simply been delayed by a number of factors:
  • It's a complete paradigm shift compared to rasterisation - you don't just need appropriate tools, but the appropriate mindset. Game developers who were born and raised on rasterisation are going to take time to get to grips with RT, and especially will have to unlearn a vast quantity of the stupid bulls**t hackery required to coerce rasterisation to render things somewhat realistically.
  • Game development is no longer about pushing the boundaries of technology, but making money. Even if developers want to implement RT, their managers aren't necessarily going to let them because of the extra training and development time, and thus cost. This creates inertia.
  • Hardware isn't quite powerful enough to handle it yet. You might say "then it shouldn't have been introduced", but you need to make game developers aware of and comfortable with a technology sooner rather than later.
  • Hardware isn't getting powerful enough at a fast enough rate to handle it. Unfortunately RT was introduced just before we hit the Moore's Law wall, which is particularly important given how hardware-intensive RT is.
RT has been the holy grail of graphics rendering forever. We may not yet be able to hold that grail, but we can at least touch it. If you'd suggested the latter would be a possibility to any computer graphics researcher a decade ago, they'd have laughed you out of the room - and yet here we are.

You don't like RT, we get it, but stop allowing that irrational dislike to blind you to the fact that RT is, in every aspect, the future of realistic graphics rendering that is superior to rasterisation in every conceivable way. In another decade, the only conversation about the latter will be in relation to graphics from before the RT era.
If that's the case, then why do we have the same ratio of raster vs RT hardware even on Nvidia GPUs since Turing? If RT is the way to go, then surely we should be seeing some indication of at least Nvidia investing in it more heavily than in raster and/or general performance, right? - This isn't a form of disagreement, more like a genuine question.

Because rasterisation is a hack built on top of a hack built on top of a fridge that's propped up on bricks on the side of a steep hill in a tornado. The more hacks you add to try and achieve realism, the more likely it is that one will interact unintentionally with another - this is precisely why weird graphics glitches occur in game engines. RT is the only sane solution to this, because it simulates the world the way the world works, and you don't need any hacks to do that.
RT only simulates lights and shadows as far as I know. They're just a portion of the world around us, not the entirety of it.
 
Without RT though, how do we get better graphics?
We could start by finally admitting that much of it is a software rather than a hardware problem. I have 20-year-old games where "primitive" lighting looks interesting purely due to tricks like placing coloured lighting behind moving fans, or ceiling lights that get knocked by a grenade explosion mid-combat, start swinging, and cast moving shadows, causing an "Oh sh*t, did something move in that corner?" moment. Game developers lost that art sometime during the last decade, when most lighting turned into a low-effort, shader-based "fill every room with the same invisible glowing fog" thing copy-pasted from game to game. Lack of attention to detail, rather than lack of hardware, is the reason lighting now looks flat and boring. It's absolutely possible to make lighting look good without ray tracing. Game devs just stopped trying.
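That swinging-light trick is also trivially cheap, which is the point. A little sketch of the idea (plain Python, made-up numbers, not any engine's code): a damped pendulum drives the light's position each frame, and the engine's existing shadow casting does the rest.

```python
import math

def swinging_light(t, anchor=(0.0, 3.0), length=1.0,
                   theta0=0.6, damping=0.4, g=9.81):
    """Light position on a damped small-angle pendulum at time t (seconds).

    theta0 is the initial swing angle (radians) imparted by, say, a
    grenade blast; damping slowly settles the light back to rest.
    """
    omega = math.sqrt(g / length)                 # natural frequency
    theta = theta0 * math.exp(-damping * t) * math.cos(omega * t)
    x = anchor[0] + length * math.sin(theta)
    y = anchor[1] - length * math.cos(theta)
    return x, y                                   # feed to the light's transform

# Per-frame update at 60 fps: the moving light makes every static shadow sweep.
for frame in range(4):
    t = frame / 60.0
    print(f"t={t:.3f}s light at {swinging_light(t)}")
```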

"So we either pause graphics (and the requirement for new gpus)"
That actually sounds fine to me. Look at how enemies acted in LithTech engine games: convincingly communicating with each other whilst flanking you in FEAR (2005), or flipping tables over and using them as cover whilst firing over the top, exposing just a sliver of their bodies, in No One Lives Forever. Look at how Thief (1998) pushed the boundaries of hardware-accelerated 3D audio propagation, or how the writing in games like SOMA made you actually stop and think whilst playing, versus the same old "Greetings, Citizen. After you've blown $2k on a GPU to stare at the ray-traced reflections off that fish's eyeball in the stream, you can go fetch me 50 Nirnroot!" style of recycled fetch-quest. Take a look at the graphics in Oblivion: what improved them wasn't "MOAR hardware for MOAR textures" but things like the Unique Landscapes mod, where people actually put an ounce of creativity into not making every area look exactly the same, or realistic windmill sizes in Skyrim instead of the weirdly shrunken and deformed defaults. We really are at the stage where, if you want better games today, GPUs, RT, etc. are at the bottom of the list of things to improve.

The cake is a lie! (I actually think the non-RTX cake scene is far more evocative.)

Unfortunately, trying to retroactively shoehorn RTX into old games often changes the whole tone to the point where it no longer fits the game. When I saw Quake RTX screenshots, I thought "Is this Quake?" (the game that's supposed to have an intentionally gloomy "Gothic Lovecraftian Sci-Fi" art style), because it ended up looking more like the Egyptian levels in Serious Sam FE, and the game basically lost all its 'character'... :rolleyes:
 
Interesting comparison. I think the RT version looks more grim and menacing. I'm just not sure whether it's supposed to be that way. A few light sources and textures seem to be a bit off, though, especially in the orange lighting. Papers on the wall seem to have a different brightness level than the rest of the wall, which is weird. Lights and their reflections look awesome, though. This is just my opinion.
It does look awesome in places. I just think you need a director's touch. Reminds me of procedural generation in that aspect.
 
RT feels a bit rushed or railroaded as a technology to me. We already seem to be entering the "RT is now mandatory" era while hardware (except for the top end, and even then it requires software tricks like upscaling) has yet to catch up to running it at a decent pace.

I appreciate that it makes devs' lives easier, but we're going backwards in terms of framerate and fidelity because of it. I'm sure RT lighting looks great in most games, except upscaling now introduces visual noise, which renders the improvement moot.
 
For me personally, the only game I played that made me think "wow, RT is the real deal" was Control. In every other game I played with RT, the difference was so minute I wondered what the heck I was trading framerate for.
 
For me personally, the only game I played that made me think "wow, RT is the real deal" was Control. In every other game I played with RT, the difference was so minute I wondered what the heck I was trading framerate for.
For me, the "wow effect" kicks in when I turn RT on, but fades away in about 5 minutes. If I turn it off and keep playing for half an hour, I won't even know that there was a difference. Even in Control.
 
If that's the case, then why do we have the same ratio of raster vs RT hardware even on Nvidia GPUs since Turing? If RT is the way to go, then surely we should be seeing some indication of at least Nvidia investing in it more heavily than in raster and/or general performance, right? - This isn't a form of disagreement, more like a genuine question.
Because if NVIDIA doesn't increase its generation-on-generation performance in every aspect, the reviewers and buying public are going to trash them, and their investors will be mad. So NVIDIA, which literally doesn't care about rasterisation anymore, has to keep delivering linear rasterisation improvements anyway with more of the same old fixed-function rasterisation hardware, at the same time they deliver far greater RT improvements with new fixed-function hardware. At some stage I expect that they will either merge the rasterisation and RT hardware somewhat to prevent so much duplication, or be able to reuse the RT hardware to emulate rasterisation workloads, or possibly some combination of the two. Knowing NVIDIA they're already working hard on this problem.

RT only simulates lights and shadows as far as I know. They're just a portion of the world around us, not the entirety of it.
Close your eyes and tell me how much you see. Now reconsider your statement that light is "just a portion" of the world.

If you are referring to how most games implement RT now, it is via a hybrid RT/rasterisation path which causes more problems than it solves, because now you have the innumerable hacks of raster interacting with the correctness of RT. But as per my previous post, these experiments - because that is what they are - are necessary for game developers to begin to come to grips with RT and the paradigm shift it entails.
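For readers unsure what a "hybrid RT/rasterisation path" looks like in practice: roughly, the rasteriser still draws the scene into a G-buffer, and ray-traced passes replace individual effects (shadows, reflections, GI) on top of it before compositing. A bare-bones structural sketch with stand-in stubs (all names here are hypothetical, not any engine's actual API):

```python
from dataclasses import dataclass

@dataclass
class GBuffer:
    """Per-pixel surface data produced by the raster pass."""
    albedo: tuple
    normal: tuple
    depth: float

def raster_pass(scene):
    # Stand-in for the rasteriser: emits one G-buffer "pixel".
    return GBuffer(albedo=(0.8, 0.2, 0.2), normal=(0.0, 0.0, 1.0), depth=3.0)

def rt_shadow_pass(gbuffer, scene):
    # Stand-in for the RT pass: a real engine would reconstruct the
    # world position from depth and trace a ray toward each light.
    return 1.0  # visibility: 1.0 = fully lit, 0.0 = fully shadowed

def composite(gbuffer, shadow):
    # Final shade: raster-produced surface modulated by RT visibility.
    return tuple(c * shadow for c in gbuffer.albedo)

scene = object()  # hypothetical scene handle
gb = raster_pass(scene)
print(composite(gb, rt_shadow_pass(gb, scene)))
```

The friction described above comes from that seam: the raster half still carries its approximations, and the RT half has to agree with them pixel by pixel.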
 
For me, the "wow effect" kicks in when I turn RT on, but fades away in about 5 minutes. If I turn it off and keep playing for half an hour, I won't even know that there was a difference. Even in Control.
But that's true for... everything. Higher framerates, input latency, higher resolution, better textures. Heck, sometimes I use my crap headphones to listen to music and they sound fine. I then need to go back to my high end ones to go WOW.
 
Because if NVIDIA doesn't increase its generation-on-generation performance in every aspect, the reviewers and buying public are going to trash them, and their investors will be mad. So NVIDIA, which literally doesn't care about rasterisation anymore, has to keep delivering linear rasterisation improvements anyway with more of the same old fixed-function rasterisation hardware, at the same time they deliver far greater RT improvements with new fixed-function hardware. At some stage I expect that they will either merge the rasterisation and RT hardware somewhat to prevent so much duplication, or be able to reuse the RT hardware to emulate rasterisation workloads, or possibly some combination of the two. Knowing NVIDIA they're already working hard on this problem.


Close your eyes and tell me how much you see. Now reconsider your statement that light is "just a portion" of the world.

If you are referring to how most games implement RT now, it is via a hybrid RT/rasterisation path which causes more problems than it solves, because now you have the innumerable hacks of raster interacting with the correctness of RT. But as per my previous post, these experiments - because that is what they are - are necessary for game developers to begin to come to grips with RT and the paradigm shift it entails.
I'd argue that yes, RT lighting will solve issues with rasterized lighting.
We're at a standstill (or even regression?) regarding rasterized geometry/textures, though, and (in my possibly misinformed opinion) that ain't on RT to fix. That's on engine devs to do right.
 
For me, the "wow effect" kicks in when I turn RT on, but fades away in about 5 minutes. If I turn it off and keep playing for half an hour, I won't even know that there was a difference. Even in Control.
There are multiple reasons for this.
  • Rasterisation has become incredibly good at simulating the real world, so good that the basic RT we currently have isn't able to outperform it visually. That's a consequence of literally decades of work on rasterisation, and far less on RT.
  • You've become used to how rasterisation simulates the real world, so games using rasterisation don't look "off" to you, even when their rendering is actually incorrect compared to the real world.
  • Conversely your brain gets used to RT quickly, because the latter does such a good job at simulating the real world.
 
For me personally, the only game I played that made me think "wow, RT is the real deal" was Control. In every other game I played with RT, the difference was so minute I wondered what the heck I was trading framerate for.
Was that a mod? Or did they add it in officially?
 