Wednesday, April 21st 2021

DirectStorage API Works Even with PCIe Gen3 NVMe SSDs

Microsoft on Tuesday, in a developer presentation, confirmed that the DirectStorage API, designed to speed up the storage sub-system, is compatible even with NVMe SSDs that use the PCI-Express Gen 3 host interface. It also confirmed that all GPUs compatible with DirectX 12 support the feature. A feature making its way to the PC from consoles, DirectStorage enables the GPU to directly access an NVMe storage device, paving the way for GPU-accelerated decompression of game assets.

This works to reduce latencies at the storage sub-system level and to offload the CPU. Any DirectX 12-compatible GPU technically supports DirectStorage, according to Microsoft; the company, however, recommends DirectX 12 Ultimate GPUs "for the best experience." GPU-accelerated game asset decompression is handled via compute shaders. In addition to reducing latencies, DirectStorage is said to accelerate the Sampler Feedback feature in DirectX 12 Ultimate.
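To make the division of labor concrete, here is a minimal Python analogy of the two paths described above. This is purely illustrative: zlib stands in for the game's asset codec, and the function names are invented for this sketch, not part of any real DirectStorage interface (the actual API is a C++/Win32 one).

```python
import zlib

# Game assets ship compressed on disk to save space and I/O bandwidth.
asset = b"texture pixels " * 1024
compressed = zlib.compress(asset)

# Traditional path: the CPU reads the file, decompresses it, then
# copies the result into GPU memory -- every asset costs CPU cycles.
def cpu_decompress(blob):
    return zlib.decompress(blob)

# DirectStorage path (conceptual): the NVMe read is routed to the GPU,
# and a compute shader unpacks the data in VRAM. A plain function
# stands in for that shader here.
def gpu_decompress_shader(blob):
    return zlib.decompress(blob)

assert cpu_decompress(compressed) == asset
assert gpu_decompress_shader(compressed) == asset
```

The decompression work is identical either way; what changes is which processor does it and how many copies the data makes on the way to VRAM.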
More slides from the presentation follow.

Source: NEPBB (Reddit)

76 Comments on DirectStorage API Works Even with PCIe Gen3 NVMe SSDs

#1
ZoneDymo
"PCIe Gen3 NVMe SSD"
you were a poet but you didn't even know it
#3
Space Lynx
Astronaut
DeathtoGnomes: You'll have to explain this one to me :p
the letter e.

also, looks like it's time to upgrade to gen4 nvme after all.
#4
ZoneDymo
DeathtoGnomes: You'll have to explain this one to me :p
PCI-Eee
Gen Three
NVMee
SSDEee

it just all rhymes :P
#5
ADB1979
Will this work on non-DX12 titles? There are plenty of DX9, 10, and 11 titles that would benefit from this boost.

Also, when MicroShaft says "current GPUs" are compatible, what does "current" mean? Does this only apply to DX12 GPUs, or DX11 GPUs as well?
#6
windwhirl
ADB1979: Will this work on non-DX12 titles? There are plenty of DX9, 10, and 11 titles that would benefit from this boost.

Also, when MicroShaft says "current GPUs" are compatible, what does "current" mean? Does this only apply to DX12 GPUs, or DX11 GPUs as well?
Looks to me like it's DX12-exclusive. Besides, it must be implemented by the developer; it's not the user's decision whether to use it.
#7
Space Lynx
Astronaut
windwhirl: Looks to me like it's DX12-exclusive. Besides, it must be implemented by the developer; it's not the user's decision whether to use it.
It will probably be implemented in future PS5/Xbox Series X ports.
#8
1d10t
From my brief understanding, this only works with storage that connects directly to the CPU, so it has its limitations. Anything attached to the motherboard's south bridge, or running in SATA mode, will not get the same treatment. Or did I just misread the graph above?
#9
Panther_Seraphin
1d10t: From my brief understanding, this only works with storage that connects directly to the CPU, so it has its limitations. Anything attached to the motherboard's south bridge, or running in SATA mode, will not get the same treatment. Or did I just misread the graph above?
Looking at the graphs, it basically takes the CPU out of the equation for decompressing data and relies on the GPU to do it.

So, depending on implementation, it could increase frame rates by removing the CPU cycles the main game thread spends decompressing assets for the GPU. It could also improve loading times: instead of the CPU decompressing assets and then passing them to the GPU, the GPU can take care of that while the CPU is doing things like level building and AI creation.
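A rough Python analogy of that overlap, with a thread pool standing in for the GPU's decompression queue. Everything here is illustrative (zlib for the codec; `gpu_queue`, `build_level_state`, and friends are made-up names, not any real API):

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

# Eight compressed "textures" waiting on the SSD.
level_assets = [zlib.compress(bytes([i]) * 65536) for i in range(8)]

def decompress_asset(blob):
    # Stand-in for the GPU compute shader unpacking data into VRAM.
    return zlib.decompress(blob)

def build_level_state():
    # Stand-in for the work the CPU keeps doing meanwhile:
    # level building, AI creation, and so on.
    return {"level_built": True, "ai_spawned": True}

# Hand decompression off to "the GPU" and keep the main thread free
# for game-side setup, instead of serializing the two stages.
with ThreadPoolExecutor() as gpu_queue:
    futures = [gpu_queue.submit(decompress_asset, b) for b in level_assets]
    state = build_level_state()          # overlaps with decompression
    textures = [f.result() for f in futures]

assert state["level_built"] and len(textures) == 8
```

The total wall time approaches the longer of the two stages rather than their sum, which is the loading-time win being described.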
#10
Zubasa
1d10t: From my brief understanding, this only works with storage that connects directly to the CPU, so it has its limitations. Anything attached to the motherboard's south bridge, or running in SATA mode, will not get the same treatment. Or did I just misread the graph above?
Requiring NVMe means that drives running in AHCI / SATA mode are out of the question by default.
Now the question is for drives running in NVMe mode through the PCH.
My gut feeling is most likely no, since going through the PCH naturally causes a latency penalty which could nullify the benefits.
Also, this might cause security issues, as the PCH is connected to a lot of external I/O that could potentially be attacked.
So most likely it just works on drives connected directly to the "northbridge"/uncore/SoC/IO die.
#11
Mussels
Freshwater Moderator
It's gonna let compressed data go from NVMe -> GPU VRAM without needing the CPU to decompress it first (GPUs are pretty good at that sort of multi-threaded workload these days).

So it should let textures stream across fast, get decompressed and smashed open fast, and generally make load times and texture pop-in go away.
#12
Makaveli
Zubasa: Requiring NVMe means that drives running in AHCI / SATA mode are out of the question by default.
Now the question is for drives running in NVMe mode through the PCH.
My gut feeling is most likely no, since going through the PCH naturally causes a latency penalty which could nullify the benefits.
Also, this might cause security issues, as the PCH is connected to a lot of external I/O that could potentially be attacked.
So most likely it just works on drives connected directly to the "northbridge"/uncore/SoC/IO die.
They also didn't specify which version of NVMe:
  • 1.0e (January 2013)
  • 1.1b (July 2014)
  • 1.2 (November 2014)
    • 1.2a (October 2015)
    • 1.2b (June 2016)
    • 1.2.1 (June 2016)
  • 1.3 (May 2017)
    • 1.3a (October 2017)
    • 1.3b (May 2018)
    • 1.3c (May 2018)
    • 1.3d (March 2019)
  • 1.4 (June 2019)
    • 1.4a (March 2020)
    • 1.4b (September 2020)
I'm going to assume they are starting it at 1.2
ADB1979: Will this work on non-DX12 titles? There are plenty of DX9, 10, and 11 titles that would benefit from this boost.

Also, when MicroShaft says "current GPUs" are compatible, what does "current" mean? Does this only apply to DX12 GPUs, or DX11 GPUs as well?
Devs will not go back and implement this in older titles; it's not worth spending the resources on that.
#13
Unregistered
Well, the only NVMe drive I have is my 500 GB boot drive. Not putting any games there. Gonna wait and see if it's worth buying an extra 2 TB NVMe just for this.
#14
ADB1979
windwhirl: Looks to me like it's DX12-exclusive. Besides, it must be implemented by the developer; it's not the user's decision whether to use it.
Makaveli: Devs will not go back and implement this in older titles; it's not worth spending the resources on that.
I think I misunderstood how it functions. I read it as being separate from the games themselves and sitting at the DX level; as such, it seems unlikely that it will work on DX11 or older hardware, or DX11 or older games.
Vanny: Well, the only NVMe drive I have is my 500 GB boot drive. Not putting any games there. Gonna wait and see if it's worth buying an extra 2 TB NVMe just for this.
I have a 250 GB "boot" drive attached to my SB, and a 2 TB "games" drive attached directly to my Ryzen CPU, so it looks like I hit the jackpot already; I'm already ideally configured from the looks of things.
___

It will be interesting to see how this pans out over time. It is certainly a boon that AMD created the APUs for the XBONE and PS5, so us PC gamers will also get the rewards of simple (and better) game ports, and this DX12/DirectStorage/XBONE update will likely also pass through, as the XBONE is of course essentially just a PC now. The future is bright.
#15
Unregistered
ADB1979: I have a 250 GB "boot" drive attached to my SB, and a 2 TB "games" drive attached directly to my Ryzen CPU, so it looks like I hit the jackpot already; I'm already ideally configured from the looks of things.
Does it need to be attached to the CPU? My boot drive is set up that way, and I can't even get to the slot without removing my chonky CPU cooler.
#16
r9
"A feature making its way to the PC from consoles" that's just wrong.
#17
windwhirl
r9: "A feature making its way to the PC from consoles" that's just wrong.
Eh, not really. With consoles, manufacturers can change everything every gen, so they don't have to keep old code around and can go bonkers innovating how they do things. PCs can actually be held back by the need for backwards compatibility (it's just never really measured until someone comes up with a new idea that needs to do away with the old ones).
#18
1d10t
Panther_Seraphin: Looking at the graphs, it basically takes the CPU out of the equation for decompressing data and relies on the GPU to do it.

So, depending on implementation, it could increase frame rates by removing the CPU cycles the main game thread spends decompressing assets for the GPU. It could also improve loading times: instead of the CPU decompressing assets and then passing them to the GPU, the GPU can take care of that while the CPU is doing things like level building and AI creation.
Zubasa: Requiring NVMe means that drives running in AHCI / SATA mode are out of the question by default.
Now the question is for drives running in NVMe mode through the PCH.
My gut feeling is most likely no, since going through the PCH naturally causes a latency penalty which could nullify the benefits.
Also, this might cause security issues, as the PCH is connected to a lot of external I/O that could potentially be attacked.
So most likely it just works on drives connected directly to the "northbridge"/uncore/SoC/IO die.
Kinda hope this new technology is more flexible, like the implementation in the PS5 and XSX consoles; they support expansion even though it's "proprietary". I guess this piece is still in its infancy; let's just wait and see what the future could bring.
#19
mechtech
MS, you know what would be nice? A 24" 4K monitor that doesn't need 300% scaling to see stuff ;)
#20
windwhirl
mechtech: MS, you know what would be nice? A 24" 4K monitor that doesn't need 300% scaling to see stuff ;)
Blame the madmen that want and push for those.
#21
Mussels
Freshwater Moderator
It's possible devs could implement this separate from DX12, but it's for new games only.
The key is that it requires a DX12 GPU to support the feature, not that it requires the game to be running in DX12 mode.
(At this stage no one knows if the feature will get rolled back to older GPUs; we've only got a few vague clues.)
#22
mechtech
windwhirl: Blame the madmen that want and push for those.
I'm really enjoying my 4K 27" screen. Looking at a 24" 1080p screen now is like looking at an 8-bit video game. However, I do find it a bit big for the desk; 23-24" would be about the perfect size. I mean, this is 2021 after all, and what's the typical resolution on a smartphone these days...
#23
windwhirl
mechtech: I'm really enjoying my 4K 27" screen. Looking at a 24" 1080p screen now is like looking at an 8-bit video game. However, I do find it a bit big for the desk; 23-24" would be about the perfect size. I mean, this is 2021 after all, and what's the typical resolution on a smartphone these days...
Smartphones are not comparable. They're controlled with what is the equivalent of a giant mouse pointer. They need massive pixel density so that a large amount of content can fit in a tiny screen, 6 inches at best, if not smaller, and massive scaling for the user to control the UI and other stuff relatively easily, like tapping on links or selecting content. Take out the pixel density and everything will be big. Take out the scaling and you can try to fine-control things while your finger amounts to area bombardment for your touch screen (and that's provided I have somewhat average fingers for the sample; I know people with way thicker fingers than mine. Also, the scaling on my phone can't go to 100%; 125% is the minimum, which is what I used here).

For reference: a 1080p screen at 21.5 inches (so around 102 PPI, a bit above the 92 PPI of a 1080p 24-inch screen), my phone's 1280x720 5.7-inch screen scaled to match its real-world size against my display at the upper right, and Firefox Responsive Design mode at the lower right to show what it would be like if phones didn't have high-pixel-density displays. The giant black/white circles are the size of my fingertip on the screen.


Your issue is that you need or want to see a massive amount of content (otherwise you wouldn't have a 4K screen), have a small desk, and want to use normal size scales. You can't have all three; something's gotta give. You gotta step down your resolution, get a bigger desk, or get used to scaling.
#24
Panther_Seraphin
Ever wondered why BAR size suddenly got brought up, even though it's been in the PCI-e spec for years?

I can honestly see the theory of this working on DX11 cards, but as with anything in the tech world, unless it's pushing something new or shiny, you will pretty much never see it back-ported en masse.
#25
Zubasa
Panther_Seraphin: Ever wondered why BAR size suddenly got brought up, even though it's been in the PCI-e spec for years?

I can honestly see the theory of this working on DX11 cards, but as with anything in the tech world, unless it's pushing something new or shiny, you will pretty much never see it back-ported en masse.
Mussels: It's possible devs could implement this separate from DX12, but it's for new games only.
The key is that it requires a DX12 GPU to support the feature, not that it requires the game to be running in DX12 mode.
(At this stage no one knows if the feature will get rolled back to older GPUs; we've only got a few vague clues.)
But then, is it that much to ask for a DX12-compatible GPU? Everything since Fermi and GCN is "compatible" with DX12.
Older GPUs don't even get driver updates anymore, so even if M$ makes them work somehow, they won't get the driver needed.
