
Ratchet & Clank Patch Adds AMD Radeon Ray Tracing Support

btarunr

Editor & Senior Moderator
Staff member
The latest patch for "Ratchet & Clank: Rift Apart," released on Wednesday, finally adds real-time ray tracing support for AMD Radeon GPUs. Patch v1.808.0.0 requires AMD Radeon users to install at least the one-off 23.10.23.03 driver that AMD released specifically for the game. The company's latest main-trunk driver remains Adrenalin 23.7.2 WHQL, so you'll need the off-trunk driver to use ray tracing for now. "Ratchet & Clank: Rift Apart" on PC leverages ray tracing for reflections, shadows, and ambient occlusion. These toggles in the game's external setup program were grayed out on older AMD drivers.



View at TechPowerUp Main Site | Source
 
Nice, so just a driver update wasn't enough, you also needed a game update
 
Nice, so just a driver update wasn't enough, you also needed a game update
It was kind of obvious to me, since the developer and AMD were both working on it, as stated in the press. I still remember people bitching about the driver being bad because AMD listed it as a known issue in the driver notes. Nobody understood that it had to be listed even though it wasn't necessarily the driver's fault. Now here it is: the developer had to update the game in order for AMD cards to use RT.
 
Last edited:
The Way It's Meant To Be Played...

Well, will there be an update to the performance testing for R&C?
 
It was kind of obvious to me, since the developer and AMD were both working on it, as stated in the press. I still remember people bitching about the driver being bad because AMD listed it as a known issue in the driver notes. Nobody understood that it had to be listed even though it wasn't necessarily the driver's fault. Now here it is: the developer had to update the game in order for AMD cards to use RT.

Not so obvious for some forum members. ;) I got pretty roasted for pointing it out, lol.
 
Not so obvious for some forum members. ;) I got pretty roasted for pointing it out, lol.
I know you got grilled badly for it. I really don't understand people being so narrow-minded sometimes; it's mind-boggling to me. Something so simple can be twisted so much to prove a point.
AMD had to put this in the list whether it was driver-related or not. From the information we all received, and the game itself being a port, it was not a driver issue; the developer had to buckle up and fix it. Porting a game from one platform to another, when the hardware is basically the same or similar, is not as big of an issue as people tend to portray it. And that is a fact.
Is there room for improvement in the driver? There is always room for improvement.
 
Nice, so just a driver update wasn't enough, you also needed a game update
Because games have never had to be updated to add new features before :laugh: :roll: :laugh:

I swear people went absolutely BANANAS over AMD RT not being available at launch.
 
I swear people went absolutely BANANAS over AMD RT not being available at launch.
And rightfully so? You've spent $1000 on a GPU and you get a second-class experience from a AAA game? (No matter whose fault it is, this shouldn't happen.)
 
And rightfully so? You've spent $1000 on a GPU and you get a second-class experience from a AAA game? (No matter whose fault it is, this shouldn't happen.)
Does it even make sense? Since it's a console port... they use Radeons, right? It's beyond weird, IMO.
 
Does it even make sense? Since it's a console port... they use Radeons, right? It's beyond weird, IMO.
As far as I know, most RT implementations on consoles are software-based, so you still need a different implementation when porting to PC.
 
As far as I know, most RT implementations on consoles are software-based, so you still need a different implementation when porting to PC.
How can a weak console possibly perform RT on the CPU, not hardware-accelerated? I don't think so.

What I can think of is that RT was simply not activated, as is so often the case with RT on consoles: not used because the hardware is too weak. But that doesn't explain how it worked in the PC port on GeForce from the get-go and not on Radeon. To me this seems beyond weird, again.
 
How can a weak console possibly perform RT on the CPU
It doesn't; it runs on the GPU, it just doesn't use the RT cores. Crysis Remastered, for example, had a software implementation of RT, and it worked fine even on last-gen consoles. They don't use the hardware RT because it's just too weak.
 
It doesn't; it runs on the GPU. Crysis Remastered, for example, had a software implementation of RT, and it worked fine even on last-gen consoles. They don't use the hardware RT because it's just too weak.
That was an exception, though... so unless you have a source, this is hard to believe. Hardware RT is way more performant than doing it in software, like 10x so, if not more.
 
That was an exception, though... so unless you have a source, this is hard to believe. Hardware RT is way more performant than doing it in software, like 10x so, if not more.
There are at least two engines that do RT this way on consoles: Crytek's and UE5 in Fortnite. The PS5 has 36 RT cores, which is basically an RX 6700; just looking at a few benchmarks, it goes without saying that it's very weak in RT performance. I can't even find performance numbers for it on TPU; the slowest card in their charts is an RX 6800. So it's weak, that's for sure.
 

There are at least two engines that do RT this way on consoles: Crytek's and UE5 in Fortnite. The PS5 has 36 RT cores, which is basically an RX 6700; just looking at a few benchmarks, it goes without saying that it's very weak in RT performance. I can't even find performance numbers for it on TPU; the slowest card in their charts is an RX 6800. So it's weak, that's for sure.
Yeah, like I said, RT is often deactivated on consoles because of bad RT performance. If it's used at all, it's probably paired with an upscaler; I cannot imagine otherwise.
 
And rightfully so? You've spent $1000 on a GPU and you get a second-class experience from a AAA game? (No matter whose fault it is, this shouldn't happen.)
Meh. This kind of thing happens all the time. A game launches without a feature because it isn't ready yet, then gets patched less than a month after launch.

The game was perfectly playable without AMD RT and still looked fantastic. I never once thought "wow, this game is ugly AF without RT, I want my money back".

People really need to chill TF out. It was a supplementary lighting technology that was coming in a patch and that the dev was working on. This isn't a Remnant II situation where the dev gets lazy and throws a band-aid at something, and that band-aid is required to get the game working right. The devs said it was coming, but they were having issues, and somehow this is the second coming of the devil.
 
Does it even make sense? Since it's a console port... they use Radeons, right? It's beyond weird, IMO.
A developer on Moore's Law is Dead spoke on this very thing. I'm going to relay what was said the way I took it, in a nutshell... Games programmed for the PlayStation do NOT translate into easy adoption on PC AMD parts. The code is completely different for how the console operates versus how the PC operates. Also, devs are dealing with only one system configuration when it comes to the PS5. If you get a chance to watch it, it was easily one of the better guests he has had on. The dude is extremely knowledgeable.
 
A developer on Moore's Law is Dead spoke on this very thing. I'm going to relay what was said the way I took it, in a nutshell... Games programmed for the PlayStation do NOT translate into easy adoption on PC AMD parts. The code is completely different for how the console operates versus how the PC operates. Also, devs are dealing with only one system configuration when it comes to the PS5. If you get a chance to watch it, it was easily one of the better guests he has had on. The dude is extremely knowledgeable.
Of course, and it makes complete sense, but it still doesn't explain why the PC port worked with RT on GeForce from the get-go and Radeon didn't. And I'm refraining from saying more on this; everyone can add their own thoughts.
 
Of course, and it makes complete sense, but it still doesn't explain why the PC port worked with RT on GeForce from the get-go and Radeon didn't. And I'm refraining from saying more on this; everyone can add their own thoughts.
My guess would be:
-Management wanted to get the game out ASAP
-They worked with Nvidia earlier in the game's dev timeline
-Management decided that RT not working on AMD at launch was an "acceptable risk"
 
My guess would be:
-Management wanted to get the game out ASAP
-They worked with Nvidia earlier in the game's dev timeline
-Management decided that RT not working on AMD at launch was an "acceptable risk"
I mean, it should be an acceptable risk, because R&C already looks and runs great on the PS5. So even without RT, an AMD user can still enjoy the game with better textures, higher resolution, and higher framerates. NVIDIA users just got extra visual fidelity early, which, yes, does add to the user experience, but is (IMO) not so major that it would be a deal breaker.

At the moment, the only thing that really catches the eye with RT is the shiny/realistic(??) reflections implemented in almost all the games that have it. This is just me, but even as an NVIDIA and 7000-series owner myself, I don't care much about RT right now.
 
I mean, it should be an acceptable risk, because R&C already looks and runs great on the PS5. So even without RT, an AMD user can still enjoy the game with better textures, higher resolution, and higher framerates. NVIDIA users just got extra visual fidelity early, which, yes, does add to the user experience, but is (IMO) not so major that it would be a deal breaker.

At the moment, the only thing that really catches the eye with RT is the shiny/realistic(??) reflections implemented in almost all the games that have it. This is just me, but even as an NVIDIA and 7000-series owner myself, I don't care much about RT right now.
Ironically, one of the later levels looks worse on Nvidia hardware with RT enabled; the lighting has really low contrast and looks weird.
 
Ironically, one of the later levels looks worse on Nvidia hardware with RT enabled; the lighting has really low contrast and looks weird.
Exactly, and this is why RT is not one of my current factors when purchasing a GPU for gaming. I completely understand if someone who wants the "absolute best visual experience" would go for a more RT-capable card, but again, that's not me.
 
Of course, and it makes complete sense, but it still doesn't explain why the PC port worked with RT on GeForce from the get-go and Radeon didn't. And I'm refraining from saying more on this; everyone can add their own thoughts.
I said this very thing in the original post. What I said in this post has nothing to do with Nvidia. Was R&C on PC an Nvidia-sponsored game?
 