
AMD-made PlayStation 5 Semi-custom Chip Has Ray-tracing Hardware (not a software solution)

btarunr

Editor & Senior Moderator
Sony's next-generation PlayStation 5 could land under many Christmas trees... in the year 2020, as the company plans a Holiday 2020 launch for the 4K-ready, 8K-capable entertainment system, which has a semi-custom chip many times more powerful than the current generation's to support its lofty design goals. By late 2020, Sony calculates that some form of ray-tracing could be a must-have for gaming, and is working with its chip designer AMD to add just that: hardware acceleration for ray-tracing, and not just something that's pre-baked or emulated over GPGPU.

Mark Cerny, a system architect at Sony's US headquarters, got into the specifics of the hardware driving the company's big platform launch for the turn of the decade in an interview with Wired. "There is ray-tracing acceleration in the GPU hardware," he said, adding, "which I believe is the statement that people were looking for." Besides raw processing-power increases, Sony will focus on getting the memory and storage subsystems right. The two are interdependent, and with fast NAND flash-based storage, Sony can rework memory management to free up more processing resources. AMD has been rather tight-lipped about ray-tracing on its Radeon GPUs; CEO Lisa Su has been dismissive of the prominence of the tech, saying "it's one of the many technologies these days." The company's mid-2019 launch of the "Navi" GPU family skips ray-tracing hardware. The semi-custom chip's GPU at the heart of the PlayStation 5 was last reported to be based on the same RDNA architecture.



 
Does seem like wasted resources. 80% of the games that have RTX now don't impress me at all in on/off toggle comparison videos on YouTube. I would have rather seen a beefier chip allowing other things to be turned up higher, like shadows, etc. Developers who want to do RTX on the PS5 will sadly have to turn down some other settings in their tweaking process to achieve 4K at 60 fps.
 
Let's hope that AMD's hardware for ray tracing will also be usable for other stuff than ray tracing alone. So games that don't implement ray tracing can run even better.
 
Let's hope that AMD's hardware for ray tracing will also be usable for other stuff than ray tracing alone. So games that don't implement ray tracing can run even better.

Can't be done that way. You either use fixed-function hardware, which accelerates RT better but has no other use, or you skip it and rely on the shaders themselves.
 
Well, the same units are usable for sound, I suppose :)
 
Does seem like wasted resources. 80% of the games that have RTX now don't impress me at all in on/off toggle comparison videos on YouTube. I would have rather seen a beefier chip allowing other things to be turned up higher, like shadows, etc. Developers who want to do RTX on the PS5 will sadly have to turn down some other settings in their tweaking process to achieve 4K at 60 fps.

Pretty much this.
 
Does seem like wasted resources. 80% of the games that have RTX now don't impress me at all in on/off toggle comparison videos on YouTube. I would have rather seen a beefier chip allowing other things to be turned up higher, like shadows, etc. Developers who want to do RTX on the PS5 will sadly have to turn down some other settings in their tweaking process to achieve 4K at 60 fps.

Raytracing is here to stay.
It is the only way to do proper shadows and lighting.

You don't have to raytrace a whole scene. You can use it on certain elements, which limits the performance hit while still allowing for much better IQ.

Ampere next year will probably make RTX much more relevant. Turing is on the slow side. Only the 2080 Super and 2080 Ti can do it decently at 1440p.
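The "certain elements only" idea is easy to sketch: shade the frame the usual rasterized way, and trace rays for just one effect. A toy example (all geometry and names here are made up for illustration, not any engine's actual API): cast a single shadow ray per surface point against a list of sphere occluders to decide lit vs. shadowed.

```python
def ray_hits_sphere(origin, direction, center, radius):
    """True if a ray (normalized direction) hits the sphere at t > 0."""
    ox = origin[0] - center[0]
    oy = origin[1] - center[1]
    oz = origin[2] - center[2]
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c          # 'a' is 1 for a normalized direction
    if disc < 0.0:
        return False
    t = (-b - disc ** 0.5) / 2.0    # nearest intersection along the ray
    return t > 1e-4                  # small bias avoids self-shadowing

def in_shadow(point, light_pos, occluders):
    """Cast one shadow ray from the shaded point toward the light."""
    d = [light_pos[i] - point[i] for i in range(3)]
    length = (d[0] ** 2 + d[1] ** 2 + d[2] ** 2) ** 0.5
    direction = [x / length for x in d]
    return any(ray_hits_sphere(point, direction, c, r) for c, r in occluders)

# A unit sphere at (0, 2, 0) blocks the light above the first point only.
occluders = [((0.0, 2.0, 0.0), 1.0)]
print(in_shadow((0.0, 0.0, 0.0), (0.0, 5.0, 0.0), occluders))  # True
print(in_shadow((3.0, 0.0, 0.0), (3.0, 5.0, 0.0), occluders))  # False
```

One ray per pixel for shadows (or reflections) costs far less than path-tracing the whole scene, which is why hybrid renderers take exactly this route.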
 
So much skepticism :D
- First of all, Sony will not implement Microsoft's DXR; they will have their own RTRT baked into OpenGL or OpenCL. I believe they will address asymmetrical load balancing across both the CPU and GPU, unlike Microsoft's DXR, which is restricted to the GPU alone.
- Sony has its own developer pool in SIE; unlike on the desktop, where we only had one GPU maker bribing every developer.
- Just like las said, you don't have to apply every feature to the full scene, and on top of that, frame scaling is there to ease the process.
 
Raytracing is here to stay.
It is the only way to do proper shadows and lighting.

You don't have to raytrace a whole scene. You can use it on certain elements, which limits the performance hit while still allowing for much better IQ.

Ampere next year will probably make RTX much more relevant. Turing is on the slow side. Only the 2080 Super and 2080 Ti can do it decently at 1440p.
"proper" shadows and lighting?

Games were experimenting with shadows and lighting back on the PlayStation 1, FFS!

You don't "NEED" raytracing to make functional shadows and lighting. Just as games were doing terrain deformation long before DX11 came along, RT is just another way of doing shadows. It isn't the be-all and end-all of graphics, and many people (including myself) have looked at "RT"-enhanced games and seen almost no improvement, certainly not enough to justify such a massive performance loss and dedicated hardware!
 
Does seem like wasted resources. 80% of the games that have RTX now don't impress me at all in on/off toggle comparison videos on YouTube. I would have rather seen a beefier chip allowing other things to be turned up higher, like shadows, etc. Developers who want to do RTX on the PS5 will sadly have to turn down some other settings in their tweaking process to achieve 4K at 60 fps.

There is a big difference between buying a first-generation, arguably overpriced RTX card and hoping that software will pop out of nowhere... and a closed platform like the PS5, which will stay around for years and which devs can optimize for with more ease. Then a good amount of RT games will bleed over from the PS and XB to PC, and eventually you'll have plenty of software where RT doesn't feel tacked on.
 
Raytracing is here to stay.
It is the only way to do proper shadows and lighting.

You don't have to raytrace a whole scene. You can use it on certain elements, which limits the performance hit while still allowing for much better IQ.

I don't mind raytracing being there; it's one of the steps in the evolution of graphics technology. However, having part of a GPU reserved for only one use is like technology from 15 years ago, when we had separate vertex and pixel shaders, which were later replaced by unified shaders.
 
"proper" shadows and lighting?

Games were experimenting with shadows and lighting back on the PlayStation 1, FFS!

You don't "NEED" raytracing to make functional shadows and lighting. Just as games were doing terrain deformation long before DX11 came along, RT is just another way of doing shadows. It isn't the be-all and end-all of graphics, and many people (including myself) have looked at "RT"-enhanced games and seen almost no improvement, certainly not enough to justify such a massive performance loss and dedicated hardware!

Yes, you do. The shadows you see in games today, without raytracing, are not realistic. You can't do this without raytracing, DUH. Google it. Tons of videos about this. Game devs talking about it. You know more than game developers?
 
Does seem like wasted resources. 80% of the games that have RTX now don't impress me at all in on/off toggle comparison videos on YouTube. I would have rather seen a beefier chip allowing other things to be turned up higher, like shadows, etc. Developers who want to do RTX on the PS5 will sadly have to turn down some other settings in their tweaking process to achieve 4K at 60 fps.
the fact that you associate ray-tracing with RTX is pretty fucked

"proper" shadows and lighting?

Games were experimenting with shadows and lighting back on the PlayStation 1, FFS!

You don't "NEED" raytracing to make functional shadows and lighting. Just as games were doing terrain deformation long before DX11 came along, RT is just another way of doing shadows. It isn't the be-all and end-all of graphics, and many people (including myself) have looked at "RT"-enhanced games and seen almost no improvement, certainly not enough to justify such a massive performance loss and dedicated hardware!
I wonder if you said the same about tessellation when it first became available on first-gen cards lol

Yes, you do. The shadows you see in games today, without raytracing, are not realistic. You can't do this without raytracing, DUH. Google it. Tons of videos about this. Game devs talking about it. You know more than game developers?
this guy gets it
 
I wonder if you said the same about tessellation when it first became available on first-gen cards lol

Tessellation is still sparingly used, a decade after it was introduced.
 
I don't mind raytracing being there; it's one of the steps in the evolution of graphics technology. However, having part of a GPU reserved for only one use is like technology from 15 years ago, when we had separate vertex and pixel shaders, which were later replaced by unified shaders.

You are right - Nvidia found other uses for the tensor and RT cores outside of ray tracing tho. Can't remember the features, since I don't have Turing and I don't really care about Turing. I will get Ampere on launch tho.

This is brand new tech. I don't think ray tracing will matter before next year, with Ampere. Turing is too slow for proper ray tracing and I bet AMD's solution will be too (full scene ray tracing that is).

Ray tracing is part of the evolution for pc graphics, which has not changed for the last few years. Just more of the same. Ray tracing is going to bring us next level graphics, over time. True lighting and shadows (as in realistic).
 
Tessellation is still sparingly used, a decade after it was introduced.
Fact of the matter is it is used, and it is useful in the areas where it is used. Saying that raytracing is useless is a stupid and ignorant thing to say. It's the next step game devs and hardware need to take in order to achieve the next-gen look without relying on rasterization hacks.
 
Fact of the matter is it is used

Fact of the matter is that it isn't used regularly because people figured out cheaper ways to achieve the same effect; see parallax occlusion mapping. RT isn't useless, but there is no reason to believe it won't end up the same way tessellation did: a feature so expensive that the hardware never scaled fast enough for it to be used effectively, while rasterized methods proved to be good enough.
 
Yes, hardware is not software.
 
Fact of the matter is that it isn't used regularly because people figured out cheaper ways to achieve the same effect; see parallax occlusion mapping. RT isn't useless, but there is no reason to believe it won't end up the same way tessellation did: a feature so expensive that the hardware never scaled fast enough for it to be used effectively, while rasterized methods proved to be good enough.
Sorry, but parallax occlusion mapping is a screen-space effect and doesn't get close to tessellation in terms of visual quality, not to mention the artifacting, since it is an SS effect.
Games will need to stop using hacks for effects to look truly mind-blowing, and they will in time, just not very soon; real-time ray-tracing is the first step toward that.
 
Sorry, but parallax occlusion mapping is a screen-space effect and doesn't get close to tessellation in terms of visual quality, not to mention

Well, sorry, but POM is not a screen-space effect; I have a feeling you don't know how these things work. Whereas tessellation generates tons of extra geometry (which is why it remains prohibitively expensive to this day and will continue to be, as RT probably will), POM works by displacing existing geometry to create depth in textures. It's much faster and is indiscernible from tessellation the vast majority of the time.

[attached image: screenshot demonstrating POM]


As you can see, there are also no artifacts, since it's not a screen-space effect. You've probably seen POM countless times in games and didn't even know it; maybe you even mistook it for tessellation.

Games will need to stop using hacks

What's absolutely hilarious about this is that you'd be amazed at the lengths developers go to just to make DXR function in real time, because it never works out of the box; they still need to find hacks to make it feasible.
 
Can't be done that way. You either use fixed-function hardware, which accelerates RT better but has no other use, or you skip it and rely on the shaders themselves.
Not if the solution is built around an FPGA, which is the most likely scenario.
It won't be as fast as a purpose-built RT core, but it would allow relatively quick reconfiguration to, e.g., physics.
 
Because a noisy mess smoothed out by RTX counts as "proper" shadows and lighting.

lel xD don't forget to mention DLSS:
[attached image: DLSS comparison screenshot]

It also gives you "proper" anti-aliasing, much like TXAA, only better because it gives even more blur :)
 
I believe ray tracing is the future, but I also believe nVidia should have used their limited 12 nm die shrink to improve conventional rasterization for the 20 series and waited until the upcoming die shrink (post-20 series) to tack on RTX. I think the real reason they didn't was that they had an inkling that next-gen consoles were going to be pushing ray tracing, and they decided to have the fundamentals in place long before that happened. Just like many other things that have hit PC gaming first, I believe it'll be when console games have the tech built in at the base level that we'll see it become important to PC GPUs in general.

1) PC GPUs get it first in limited cases and it's amazing, if fringe.
2) Consoles make it commonplace in games.
3) Console ports carry it back over to PC, and new GPUs have to be able to do it to make those games run well.

That's how I see it going for ray tracing. I assume that by the end of the upcoming console generation, ray tracing will have almost completely replaced conventional lighting and shadows in games, because it probably is the better way to handle the problem. But I don't think nVidia made the right call with the 20 series, because I don't think they needed to launch that far in advance of the next-gen consoles, and I do think smoother 4K/high-framerate gaming across more segments would have been more useful to gamers of the last two years. Well, more useful than limited ray-tracing effects that will quickly become obsolete when the next-gen consoles have their own take that isn't exactly the same as RTX.
 
Well, sorry, but POM is not a screen-space effect; I have a feeling you don't know how these things work. Whereas tessellation generates tons of extra geometry (which is why it remains prohibitively expensive to this day and will continue to be, as RT probably will), POM works by displacing existing geometry to create depth in textures. It's much faster and is indiscernible from tessellation the vast majority of the time.

RT is here to stay for sure. This is pretty straightforward if you know anything about how rendering works. Parallax occlusion mapping is not displacing any geometry, and its illusion breaks at grazing angles, when the view angle is large relative to the plane normal. POM is a heightmap-based UV search/lookup, implemented by searching for the heightmap intersection along the view-direction ray for the fragment being processed; it may use cone stepping, binary search, or even ray marching. Search for relief mapping, look into relaxed cone stepping. It is fragment-shader logic; no vertex/geometry stage is involved.
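The heightmap search described above can be illustrated with a toy CPU-side version (real POM runs in a fragment shader; the heightmap, step count, and depth scale below are made up for illustration): march along the view ray in texture space until the ray drops below the stored height, then use that offset UV for the texture lookup.

```python
def sample_height(heightmap, u, v):
    """Nearest-neighbour lookup into a 2D list, clamped to the edges."""
    h = len(heightmap)
    w = len(heightmap[0])
    x = min(max(int(u * w), 0), w - 1)
    y = min(max(int(v * h), 0), h - 1)
    return heightmap[y][x]

def parallax_occlusion_uv(heightmap, u, v, view_dir, depth_scale=0.1, steps=32):
    """Linear-search variant of the POM lookup: walk the view ray through
    the height field and return the UV where it first dips below the surface.
    view_dir = (x, y, z) with z > 0 pointing toward the viewer."""
    # Per-step UV offset, scaled by how grazing the view direction is.
    dx = -view_dir[0] / view_dir[2] * depth_scale / steps
    dy = -view_dir[1] / view_dir[2] * depth_scale / steps
    layer = 0.0                 # current ray depth, 0 = top of the surface
    layer_step = 1.0 / steps
    cu, cv = u, v
    # March until the ray depth reaches the surface depth (1 - height).
    while layer < 1.0 and layer < 1.0 - sample_height(heightmap, cu, cv):
        cu += dx
        cv += dy
        layer += layer_step
    return cu, cv

# A flat heightmap at full height: the search stops immediately, UV unchanged.
flat = [[1.0] * 8 for _ in range(8)]
print(parallax_occlusion_uv(flat, 0.5, 0.5, (0.3, 0.0, 1.0)))  # (0.5, 0.5)
```

Binary search or cone stepping would replace the fixed-step loop with a smarter search, but the principle is the same: it is all per-fragment texture math, with no extra geometry generated.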
 