Monday, January 25th 2021

Kosin Demonstrates RTX 3090 Running Via Ryzen Laptop's M.2 Slot

Kosin, a Chinese subsidiary of Lenovo, recently published a video showing how they modded their Ryzen notebook to run an RTX 3090 from the NVMe M.2 slot. Kosin used their Xiaoxin Air 14 laptop with a Ryzen 5 4600U processor for the demonstration. The system's internal M.2 NVMe SSD was removed and an M.2-to-PCIe expansion cable was attached, allowing the RTX 3090 to be connected. Finally, the laptop housing was modified to let the PCIe cable exit the chassis, and a desktop power supply was attached to the RTX 3090 for power.

The system booted and correctly detected and utilized the attached RTX 3090. It performed admirably, scoring 14,008 points in 3DMark Time Spy; for comparison, an RTX 3090 paired with a desktop Ryzen 5 3600 scores 15,552, and paired with a Ryzen 7 5800X it scores 17,935. While pairing an RTX 3090 with a mid-range mobile processor is an extreme example, it goes to show how much performance is achievable over the M.2 connector. The x4 PCIe 3.0 link of the laptop's M.2 slot can carry a maximum of roughly 4 GB/s, while the x16 PCIe 3.0 slot on previous-generation processors offered 16 GB/s, and the new x16 PCIe 4.0 connector doubles that to 32 GB/s of available bandwidth.
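The bandwidth figures above can be sanity-checked from PCIe's signaling rates. A minimal sketch in Python (PCIe 3.0 and newer use 128b/130b line encoding, which is where the slightly-below-round numbers come from; the article rounds them up):

```python
# Theoretical one-direction PCIe link bandwidth:
# transfer rate (GT/s) x encoding efficiency x lane count, converted to bytes.

def pcie_bandwidth_gbps(transfer_rate_gt: float, lanes: int) -> float:
    """Usable bandwidth in GB/s. 1 GT/s carries 1 Gb/s of raw bits."""
    efficiency = 128 / 130  # 128b/130b encoding (PCIe 3.0 and newer)
    return transfer_rate_gt * efficiency * lanes / 8  # bits -> bytes

# The article's quoted figures are these values rounded:
print(round(pcie_bandwidth_gbps(8.0, 4), 2))    # PCIe 3.0 x4 (M.2 slot): 3.94 (~4 GB/s)
print(round(pcie_bandwidth_gbps(8.0, 16), 2))   # PCIe 3.0 x16: 15.75 (~16 GB/s)
print(round(pcie_bandwidth_gbps(16.0, 16), 2))  # PCIe 4.0 x16: 31.51 (~32 GB/s)
```

The function name and structure are just illustrative; the constants (8 GT/s for gen 3, 16 GT/s for gen 4, 128b/130b encoding) come from the PCIe specification.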
Source: PC Watch

55 Comments on Kosin Demonstrates RTX 3090 Running Via Ryzen Laptop's M.2 Slot

#1
Mussels
Moderprator
Now this is the kind of dumb tech idea I love
#3
Mussels
Moderprator
I mean, why can't they do a console with a screw-in NVMe expansion like the old CF cards or whatever they were called

Let us screw in an NVMe drive in an enclosure, or alternate devices that fit the connector (NVMe cards, Wi-Fi cards, an external GPU connector)
#4
lexluthermiester
MusselsNow this is the kind of dumb tech idea i love
If you think about it, it's not that dumb. A lot of engineering went into this demo.
#5
my_name_is_earl
I haven't seen 1... ONE!!! RTX 3XXX series card listed that I could add to cart since release. Go pound sand, Nvidia. My 2080 Super will have to do for possibly the entire 2021.
#6
sam_86314
Pretty cool and interesting I guess, but using this method (or a similar one) for an eGPU is nothing new...


This video came out in 2014.
#7
Mussels
Moderprator
sam_86314Pretty cool and interesting I guess, but using this method (or a similar one) for an eGPU is nothing new...


This video came out in 2014.
This is via NVMe, which is a lot faster than mini PCI-E - that's the key difference. Almost every modern laptop has an NVMe slot that runs at 3.0 x4 or 4.0 x4 bandwidth, making the potential for this a lot greater than with custom solutions
#8
Deathy
MusselsThis is via NVME, which is a lot faster than mini PCI-E - thats the key difference. Almost every modern laptop has an NVME slot that runs 3.0 x4 or 4.0 x4 bandwidth, making the potential for this a lot greater than with needing custom solutions
NVMe is a transfer protocol; mini PCI-E is a form factor used to carry USB and PCI-E signals. The NVMe equivalent is the M.2 slot, which carries SATA or PCI-E signals. In both cases, people used the PCI-E functionality of the slot to have a GPU talk to the system via riser cards and adapters. The difference you allege is like saying "This PCI-E 2.0 x1 slot is totally different from this PCI-E 3.0 x4 slot." No more "custom solutions" are needed for either method, since both are based on open standards anyone can manufacture for. It's just a matter of whether or not mass-producing something is profitable.

Edit: This has been done a lot in the M-STX world, where people use the x4 M.2 slot for external GPUs.
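The protocol-versus-form-factor distinction made in this comment can be sketched as a toy model; the names below are invented for illustration, not a real API:

```python
# Illustrative toy model: connectors (form factors) vs. the signal types
# they carry, per the comment above. NVMe is a protocol layered on PCIe,
# not a connector shape.

FORM_FACTOR_SIGNALS = {
    "Mini PCIe": {"PCIe", "USB"},
    "M.2 (M-key)": {"PCIe", "SATA"},
    "PCIe x16 slot": {"PCIe"},
}

# Storage protocols and the transport each one rides on:
PROTOCOL_TRANSPORT = {"NVMe": "PCIe", "AHCI": "SATA"}

def can_host_egpu(form_factor: str) -> bool:
    """A GPU only needs PCIe lanes, whatever shape the connector takes."""
    return "PCIe" in FORM_FACTOR_SIGNALS.get(form_factor, set())

print(can_host_egpu("M.2 (M-key)"))  # True
print(can_host_egpu("Mini PCIe"))    # True
```

The point the model makes: both mini PCI-E and M.2 qualify as eGPU hosts for the same reason, because both expose PCIe lanes; the generation and lane count only determine how fast the link is.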
#9
Yukikaze
How is something (an m.2 eGPU) that has been done at least dozens of times before (going back to 2017) front-page news? The eGPU community has been using m.2 slots pretty much since they existed to hook up GPUs to things that typically can't (NUCs, Laptops, SFF systems, etc).
#10
Mussels
Moderprator
DeathyNVME is a transfer protocol, mini PCI-E is a form factor that is used to conduct USB and PCI-E signals. The NVME equivalent is the M.2 slot, which is used to conduct SATA or PCI-E signals. In both cases, people used the PCI-E functionality of the slot to have a GPU talk to the system via riser cards and adapters. The difference you allege is like saying "This PCI-E 2.0 x1 slot is totally different from this PCIE-E 3.0 x4 slot.". No more "custom solutions" needed for either method, since both are based on normal standards everyone can manufacture for. It's just a matter of whether or not mass producing something is profitable.

Edit: This has been done a lot in the M-STX world, where people use the x4 M.2 slot for external GPUs.
Yes it's been done, but it's still noteworthy because so few people do it. PCI-E 4.0 x4 is a lot better than an x1 slot; this is actually usable.
#11
kayjay010101
Gotta agree with people here, M.2 risers to use eGPUs is not a new thing. It's used extensively in mining, and for smaller computers like SFF, Laptops and NUCs.
#12
medi01
What a shame they still equip Ryzen notebooks with overpriced, power-hungry green crap.
Let's pretend the 6900 XT doesn't beat the 3090 in the newest games, shall we...
#13
Deathy
MusselsYes its been done, but its still noteworthy because so few people do it. PCI-E 4.0 x4 is a lot better than a 1x slot, this is actually usable.
I wasn't arguing that it isn't noteworthy. But news should be written and published to provide context. So you should explain that this has been done before, that this can be done with other connectors, how this is different and the same compared to normal eGPU solutions via TB, how the future might look with USB 4.0 being more PCIe friendly. Otherwise this is just a Reddit post of a YT video that explains what happens in the video. Basically. :)
#14
junglist724
MusselsYes its been done, but its still noteworthy because so few people do it. PCI-E 4.0 x4 is a lot better than a 1x slot, this is actually usable.
This is PCI-E 3.0. Having to leave your laptop chassis open doesn't seem practical.
lexluthermiesterIf you think about it, it's not that dumb. A lot of engineering went into this demo.
What engineering? This is just an m.2 to x4 PCI-E riser plugged into a laptop with the bottom panel taken off. Anyone could spend $23 on Amazon and do the same.
#15
DeathtoGnomes
is this just a fancy way of saying laptops don't need a Thunderbolt port?
#16
kayjay010101
junglist724This is PCI-E 3.0. Having to leave your laptop chassis open doesn't seem practical.

What engineering? This is just an m.2 to x4 PCI-E riser plugged into a laptop with the bottom panel taken off. Anyone could spend $23 on Amazon and do the same.
Less than that, eBay has M.2-to-PCIe risers for <$10. I have one plugged into my system right now for a PCIe drive, to 'convert' it to an M.2 drive, as I ran out of PCIe slots.
#17
Asni
The best part of this mod is that the gaming experience would be great with this configuration, whereas it's awful over the USB or TB protocol because of the additional latency and uneven frametimes.
Bandwidth doesn't tell the whole story.
#18
1d10t
YukikazeHow is something (an m.2 eGPU) that has been done at least dozens of times before (going back to 2017) front-page news? The eGPU community has been using m.2 slots pretty much since they existed to hook up GPUs to things that typically can't (NUCs, Laptops, SFF systems, etc).
You could also utilize USB 3.0 back then, when the mining craze emerged :D
One of my favorites was ADT-Link; they had all the mods you can imagine.
#19
InVasMani
So they used a PCIe riser on a PCIe x4 slot, essentially. Got it. For people who don't grasp what I just said: an M.2 slot is just a PCIe x4 slot physically wired into another form factor. This raises the question of why laptop makers don't make these slots dual-purpose in the first place, without the intricate disassembly and dodgy workaround that take away the whole point of the machine being safely portable. I wouldn't want to actually move that around, from the looks of it. The M.2 slot has been used for other purposes as well, with adapters converting it into multiple SATA, SAS, or U.2 ports; I forget which, but probably each, knowing the industry and how a lot of bases tend to get covered.
1d10tYou can also utilizing USB 3.0 back then when mining craze emerged :D
One of my favorites was ADT Link, they had various mod you can imagine.
That's a good point. Now I wonder if I could use USB-C on my Samsung Galaxy S20+, install Linux on it, then attach a GPU over USB-C and maybe get Steam OS working in some form or another. That would be rather wild. I know you can use USB-C to connect a portable NVMe device or two and have something like 4 TB to 8 TB of NVMe storage on it, which itself is kinda funny to think about. That storage density will also increase as time ticks onward thanks to 3D NAND layers and ever-increasing capacities. The other aspect is that the storage would be faster than the built-in storage, if I had to guess. Getting a GPU working from USB-C on a Samsung Galaxy S20+ via a Linux install and using it for Steam would actually be more impressive in a lot of ways, and way beyond the intended use scope of the phone itself. Theoretically I think the display EDID could actually do 3000 x 1350 at 96 Hz as well, judging from the math and what I know of CRU and display resolution and refresh rate modding.
#20
Yukikaze
1d10tYou can also utilizing USB 3.0 back then when mining craze emerged :D
One of my favorites was ADT Link, they had various mod you can imagine.
No, USB 3.0 cannot support an eGPU and never could.
InVasManiThat's a good point now I wonder if I could use USB-C on my Samsung Galaxy S20+ and install Linux on it then install a GPU from the USB-C and get maybe STEAM OS working in some form or another. That would be rather wild. I know you could use the USB-C to connect a portable NVME device or two and have like 4TB to 8TB of NVME storage on it which itself is kinda funny to think about. That storage density will also increase as time ticks onward thanks to 3D NAND layers and ever increasing capacities. The other aspect that storage would be faster than the built in storage if I had to guess. Getting a GPU working from USB-C on a Samsung Galaxy S20+ via a Linux OS install and using it for STEAM would actually be more impressive in a lot of ways and way beyond the intended use scope of the phone itself. Theoretically I think the display EDID could actually do 3000 x 1350 96Hz as well judging from the math and what I know of CRU with display resolution and refresh rate modding.
USB-C without Thunderbolt cannot support eGPU.
#21
bug
So they made a PCIe card work over a slot that talks PCIe? Amazing, I didn't think it could be done :wtf:
#22
Mussels
Moderprator
I just saw more news on this, with Lenovo doing the same thing

We may be about to see a trend of external GPU enclosures utilising M.2 NVMe slots, no longer needing fancy proprietary stuff


my excitement is simple: for decades I've wanted laptops to have a universal way to upgrade GPUs, and this may end up being it.
#23
Haile Selassie
DeathyNVME is a transfer protocol, mini PCI-E is a form factor that is used to conduct USB and PCI-E signals. The NVME equivalent is the M.2 slot, which is used to conduct SATA or PCI-E signals. In both cases, people used the PCI-E functionality of the slot to have a GPU talk to the system via riser cards and adapters. The difference you allege is like saying "This PCI-E 2.0 x1 slot is totally different from this PCIE-E 3.0 x4 slot.". No more "custom solutions" needed for either method, since both are based on normal standards everyone can manufacture for. It's just a matter of whether or not mass producing something is profitable.

Edit: This has been done a lot in the M-STX world, where people use the x4 M.2 slot for external GPUs.
Correct, this was being done about seven years ago.
#24
newtekie1
Semi-Retired Folder
MusselsThis is via NVME, which is a lot faster than mini PCI-E - thats the key difference. Almost every modern laptop has an NVME slot that runs 3.0 x4 or 4.0 x4 bandwidth, making the potential for this a lot greater than with needing custom solutions
NVMe has nothing to do with this. NVMe is a storage protocol that runs over the PCI-E bus. If you don't have a storage device plugged in, the connector is nothing more than another form factor for PCI-E x4. It doesn't matter if it is mini PCI-E or M.2; it is still just a PCI-E x4 connection to the graphics card.
Musselsmy excitement is simple: for decades i've wanted laptops to have a universal way to upgrade GPU's, and this may end up being it.
That is what Thunderbolt does, in a much nicer form factor too. And Thunderbolt achieves that because it is also just a PCI-E x4 connection.
#25
evernessince
The only thing I got from this is that nearly everyone but consumers can easily get the new video cards.