Thursday, October 29th 2020

Microsoft: Only Consoles Supporting Full RDNA 2 Capabilities Are Xbox Series X and Series S, Excludes PlayStation 5

Microsoft has today published another article on its Xbox Wire blog, dedicated to news about the Xbox consoles and their ecosystem. In light of yesterday's launch of AMD's Radeon RDNA 2 graphics cards, Microsoft has congratulated its partner AMD, the provider of the SoCs for its next-generation consoles. Besides the celebrations and congratulations, Microsoft has proceeded to show off what the Xbox Series X and Series S consoles are capable of, and how they integrate the RDNA 2 architecture. The company notes that hardware-accelerated DirectX Raytracing, Mesh Shaders, Sampler Feedback, and Variable Rate Shading units are built in, so game developers can take advantage of them.
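Variable Rate Shading, one of the features listed, lets the GPU shade low-detail regions of the screen at a coarser rate than high-detail ones. A minimal conceptual sketch of that idea (not the actual D3D12 API; the tile size and variance threshold below are arbitrary values chosen purely for illustration):

```python
# Toy illustration of the idea behind Variable Rate Shading (VRS):
# flat screen tiles can be shaded coarsely (one shade per 2x2 pixels)
# while detailed tiles keep full 1x1 shading. Tile size and threshold
# are arbitrary for this sketch, not values from any console or API.

def choose_shading_rates(luma, tile=4, threshold=0.01):
    """Return a per-tile grid of shading rates ('1x1' or '2x2') for a
    2D list `luma` of luminance values in [0, 1]."""
    rows, cols = len(luma), len(luma[0])
    rates = []
    for ty in range(0, rows, tile):
        row_rates = []
        for tx in range(0, cols, tile):
            block = [luma[y][x]
                     for y in range(ty, min(ty + tile, rows))
                     for x in range(tx, min(tx + tile, cols))]
            mean = sum(block) / len(block)
            var = sum((v - mean) ** 2 for v in block) / len(block)
            # Flat tiles (e.g. sky) tolerate coarser shading.
            row_rates.append("2x2" if var < threshold else "1x1")
        rates.append(row_rates)
    return rates

flat = [[0.5] * 8 for _ in range(8)]                         # uniform image
noisy = [[(x + y) % 2 for x in range(8)] for y in range(8)]  # checkerboard
print(choose_shading_rates(flat))   # every tile coarse: '2x2'
print(choose_shading_rates(noisy))  # every tile full rate: '1x1'
```

Real tier-2 VRS hardware consumes a per-tile rate image much like this, but the heuristic deciding the rates is up to the game engine.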

Another interesting point Microsoft made was that "Xbox Series X|S are the only next-generation consoles with full hardware support for all the RDNA 2 capabilities AMD showcased today." What this translates into is that Microsoft claims to be the only console maker using RDNA 2's full potential. This could leave Sony out in the cold with its PlayStation 5, implying that it does not support all the features of AMD's new GPU architecture. Microsoft did not name any specific features, however, so we will have to wait and see what Sony has left out, if anything.
Source: Xbox Wire

47 Comments on Microsoft: Only Consoles Supporting Full RDNA 2 Capabilities Are Xbox Series X and Series S, Excludes PlayStation 5

#1
ratirt
I know it is marketing, but I think Microsoft would not throw statements like these into the ether if there wasn't a bit of truth in them.
Posted on Reply
#2
Vya Domus
They're probably just talking about some of the features supported by DX12 Ultimate.
Posted on Reply
#3
dinmaster
microsoft: stupid sony with no ir for ir remotes...
sony: nvme bro

lol just stupid squabbling from microsoft. whatever it actually is, im sure it amounts to nothing.. at least to the gaming experience.
Posted on Reply
#4
mtcn77
You see, there are good kinds of competition, like the one between the console brands... And then there is the bad kind, toxic and petty: green team, red team tinted.
Posted on Reply
#5
ratirt
If something is missing in the PS5's RDNA 2 support, I wonder what that might be and how it will impact sales and the console's features.
We know it's not ray tracing, since both consoles support it.
Posted on Reply
#6
Chomiq
ratirt
If something is missing in the PS5's RDNA 2 support, I wonder what that might be and how it will impact sales and the console's features.
We know it's not ray tracing, since both consoles support it.
The PS5 uses a different API, so of course it won't support features that were designed for DX12 Ultimate.

MS is grasping at straws to prove that their console is better than the other.
Posted on Reply
#7
Diverge
Makes sense, since the Xbox is basically a Windows 10 PC, and the PlayStation has roots in FreeBSD/Linux.
Posted on Reply
#8
chris.london
ratirt
If something is missing in the PS5's RDNA 2 support, I wonder what that might be and how it will impact sales and the console's features.
We know it's not ray tracing, since both consoles support it.
The DirectStorage API, for one. Sony has its own, maybe even superior, implementation with dedicated hardware.
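For context, the appeal of a DirectStorage-style I/O model is submitting many small asset reads as one asynchronous batch instead of blocking on each read in turn. A toy sketch of that request pattern (the function names, asset names and sizes here are invented for illustration; this is not the real DirectStorage API surface):

```python
# Toy sketch of a batched, asynchronous asset-loading model, the kind
# of request pattern DirectStorage-style APIs are built around.
# `read_asset` stands in for an NVMe read plus decompression.

import concurrent.futures

def read_asset(name, size):
    # Placeholder for a real storage read; returns dummy bytes.
    return name, bytes(size)

def load_batch(requests):
    """Submit a whole batch of (name, size) read requests at once
    and gather the results as a dict of name -> data."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(read_asset, n, s) for n, s in requests]
        return dict(f.result() for f in futures)

assets = load_batch([("terrain.tex", 64), ("props.tex", 32)])
print(sorted(assets))               # ['props.tex', 'terrain.tex']
print(len(assets["terrain.tex"]))   # 64
```

The point of the batch model is that the storage stack can reorder and overlap the requests, which is where fast NVMe hardware pays off.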
Posted on Reply
#9
Valantar
dinmaster
microsoft: stupid sony with no ir for ir remotes...
sony: nvme bro

lol just stupid squabbling from microsoft. whatever it actually is, im sure it amounts to nothing.. at least to the gaming experience.
Lol, what are you even talking about here?


I wonder what they mean by this though. Seems like there are two options to me: either they are being nit-picky about detailed API features (maybe the PS5 lacks mesh shader support, VRS support, or some such?), or the Xboxes have some version of Infinity Cache on board. I don't quite see where it would fit going by their own annotated die shot from Hot Chips, but I guess that stripe in between the CPU and GPU cores might be a small IC? Doesn't look like it, but I guess it's a (very distant) possibility. Or maybe the XSS has IC to make up for its lower memory bandwidth and size? We don't know anything at all about the XSS die, after all.
Posted on Reply
#10
Vayra86
ratirt
I know it is marketing, but I think Microsoft would not throw statements like these into the ether if there wasn't a bit of truth in them.
The real question is whether that bit of truth is even half relevant, because you can rest assured Sony will have some sort of stand-in tech to replace whatever is missing.

This is just a desperate attempt to differentiate within a gaming landscape that is becoming more samey every generation. I mean really, the only noticeable difference between platforms now is the controller in your hands. You're playing 95% ports and 90% of them are also on PC.

All I've been reading on the new consoles is just utter nonsense. It's a horsepower increase; everything else is marketing and can be safely ignored. The library will consist of lots of respins, remakes and re-releases. It gets fake 4K with lots of internal scaling and tweaking, dynamic detail layers... It gets carried by subscription and streaming services plus constant iterations and updates of existing games with new price tags.

I'm not touching this shitstorm with a 10 foot pole, especially now that we know competition is back in the PC GPU space and big strides are made in technologies, engines etc. Thanks for that, consoles, now you can f right off again ;)
Posted on Reply
#11
mtcn77
Vayra86
The real question is whether that bit of truth is even half relevant, because you can rest assured Sony will have some sort of stand-in tech to replace whatever is missing.
I think they are talking about SDWA operations. Would anybody care to explain?
Posted on Reply
#12
Vayra86
mtcn77
I think they are talking about SDWA operations. Would anybody care to explain?
If MS can't specify it themselves, I'm certainly not wasting time trying to figure out what we should assume they meant.

Moreover, if they didn't specify, it's not relevant to begin with. Otherwise they would have made a big thing of it.
Posted on Reply
#13
ratirt
Chomiq
PS5 uses different API so of course it won't support features that were designed for DX 12 Ultimate.

Ms is grasping at straws to prove that their console is better than the other.
Which API does it support?
Vayra86
The real question is whether that bit of truth is even half relevant, because you can rest assured Sony will have some sort of stand-in tech to replace whatever is missing.

This is just a desperate attempt to differentiate within a gaming landscape that is becoming more samey every generation. I mean really, the only noticeable difference between platforms now is the controller in your hands. You're playing 95% ports and 90% of them are also on PC.

All I've been reading on the new consoles is just utter nonsense. It's a horsepower increase; everything else is marketing and can be safely ignored. The library will consist of lots of respins, remakes and re-releases. It gets fake 4K with lots of internal scaling and tweaking, dynamic detail layers... It gets carried by subscription and streaming services plus constant iterations and updates of existing games with new price tags.

I'm not touching this shitstorm with a 10 foot pole, especially now that we know competition is back in the PC GPU space and big strides are made in technologies, engines etc. Thanks for that, consoles, now you can f right off again
I'm not touching it either, I'm just curious what it is. I was going for the PS5; I even bought a 65-inch TV especially for it. If there is any deficiency in features which may impact picture quality or performance, I'd like to know about it. If the consoles are otherwise the same (they kind of are, considering what's inside), then maybe such a difference will decide which one you pick, based on quality or game performance.
To be honest, consoles like the Xbox and PlayStation have been quite similar, with the minor difference of a few titles exclusive to each one. Not like the Switch, which I acquired two weeks ago :)
Posted on Reply
#14
Valantar
ratirt
Which API it supports?
Sony has their own custom APIs for the PS5.
Posted on Reply
#15
ratirt
Valantar
Sony has their own custom APIs for the PS5.
OK, but what is it? Do we know anything about it, or just that the PS has its own? If they have their own API, was it based on techniques we know or something they created from scratch?
Posted on Reply
#16
Dredi
ratirt
OK, but what is it? Do we know anything about it, or just that the PS has its own? If they have their own API, was it based on techniques we know or something they created from scratch?
It is akin to Vulkan.
Posted on Reply
#17
TheLostSwede
Valantar
I wonder what they mean by this though. Seems like there are two options to me: either they are being nit-picky about detailed API features (maybe the PS5 lacks mesh shader support, VRS support, or some such?), or the Xboxes have some version of Infinity Cache on board. I don't quite see where it would fit going by their own annotated die shot from Hot Chips, but I guess that stripe in between the CPU and GPU cores might be a small IC? Doesn't look like it, but I guess it's a (very distant) possibility. Or maybe the XSS has IC to make up for its lower memory bandwidth and size? We don't know anything at all about the XSS die, after all.
I guess you didn't read the presentation slides? Infinity Cache is based on the Zen L3 cache, so it might not even be possible to implement it in the console SoCs or an APU due to this, or you'd end up with two sets of L3 cache. It could also be that the CPU L3 cache is shared with the GPU in the console SoCs.
ratirt
OK, but what is it? Do we know anything about it, or just that the PS has its own? If they have their own API, was it based on techniques we know or something they created from scratch?
en.wikipedia.org/wiki/PlayStation_4_system_software#System
Posted on Reply
#18
Valantar
TheLostSwede
I guess you didn't read the presentation slides? Infinity Cache is based on the Zen L3 cache, so it might not even be possible to implement it in the console SoCs or an APU due to this, or you'd end up with two sets of L3 cache. It could also be that the CPU L3 cache is shared with the GPU in the console SoCs.


en.wikipedia.org/wiki/PlayStation_4_system_software#System
I did read that, but I don't see the issue. If the CPU in an APU can have an L3 cache that is exclusive to it - which they all do! - why couldn't the GPU also have one? Does the CPU or SoC fabric care specifically how many levels of cache the GPU has as long as data transfers work as they should?
Posted on Reply
#19
BoboOOZ
Valantar
I did read that, but I don't see the issue. If the CPU in an APU can have an L3 cache that is exclusive to it - which they all do! - why couldn't the GPU also have one? Does the CPU or SoC fabric care specifically how many levels of cache the GPU has as long as data transfers work as they should?
I think it could, definitely, but given the memory specifications of the two consoles, the bandwidth seems sufficient for the GPU horsepower without Infinity Cache.

Plus, if MS had Infinity Cache in their Xbox, I'm pretty sure they would mention it. What they are mentioning is only on the software/processing side, nothing about the hardware/architecture, so it's just API wars all over again.

However, I'm pretty sure this kind of information could sway a lot of unknowing users... Gotta love console wars :)
Posted on Reply
#20
lynx29
BoboOOZ
I think it could, definitely, but given the memory specifications of the two consoles, the bandwidth seems sufficient for the GPU horsepower without Infinity Cache.

Plus, if MS had Infinity Cache in their Xbox, I'm pretty sure they would mention it. What they are mentioning is only on the software/processing side, nothing about the hardware/architecture, so it's just API wars all over again.

However, I'm pretty sure this kind of information could sway a lot of unknowing users... Gotta love console wars :)
Exclusives sway buyers, and Sony wins on exclusives every generation with very unique story-based games. Either way, I expect both consoles to sell out for many months. /shrug
Posted on Reply
#21
Aquinus
Resident Wat-man
Nonissue. That's because the PS5 uses Vulkan and not the MS graphics stack. Fine by me, I like the diversity.
Posted on Reply
#22
Flanker
ratirt
OK, but what is it? Do we know anything about it, or just that the PS has its own? If they have their own API, was it based on techniques we know or something they created from scratch?
IIRC, Sony named their rendering APIs GNMX and GNM, and they are somewhat similar to DirectX 11 and DirectX 12 respectively.
Posted on Reply
#23
TheLostSwede
Valantar
I did read that, but I don't see the issue. If the CPU in an APU can have an L3 cache that is exclusive to it - which they all do! - why couldn't the GPU also have one? Does the CPU or SoC fabric care specifically how many levels of cache the GPU has as long as data transfers work as they should?
It's more of a cost/space concern. Cache takes up a humongous amount of space, and having two L3 caches is simply not going to be a sensible move cost-wise.
Just look at how much space the L3 cache eats up on a single CCX. What do you want in a GPU: cache or compute?
Keep in mind that a Zen CCX has 2 MB of L3 per core, so the 128 MB Infinity Cache is going to take up a whole heap more space.

Posted on Reply
#24
Valantar
BoboOOZ
I think it could, definitely, but given the memory specifications of the 2 consoles, the bandwidth seems sufficient for the GPU horsepower without infinity cache.

Plus, if MS had the infinty cache in their Xbox, i'm pretty sure they would mention it. What they are mentioned is only on the software/processing side, nothing about the hardware/architecture, so it's just API wars all over again.
They definitely would, but not if AMD explicitly forbade them from mentioning it before RDNA 2 launched for PCs - there's no doubt that there are agreements in place on those details, after all. But still, this is of course pure speculation, and I definitely see the first of the two options I suggested as the most likely. And I also agree that the VRAM bandwidth alone should carry the console GPUs - it's much higher than for the dGPUs, after all.
Posted on Reply
#25
TheLostSwede
Flanker
IIRC, Sony named their rendering APIs GNMX and GNM, and they are somewhat similar to DirectX 11 and DirectX 12 respectively.
If you'd read the Wikipedia link I provided, you would've seen that one is high-level and the other low(er)-level and not really different generations.
Posted on Reply