
AMD Mantle allows adding VRAM on multi-GPU configs.

Joined
Feb 2, 2015
Messages
2,707 (0.80/day)
Location
On The Highway To Hell \m/
Because I'm stupid, and I don't know everything. And I'm genuinely curious. So are you saying it's pretty much irrelevant? I guess I need that explained to me. I have a gut feeling this is going to make it "matter" now. I just can't work it all out in my head.
 
Joined
Jun 13, 2012
Messages
1,328 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super (2800MHz @ 1.0V, ~60MHz overclock at -0.1V, 180-190W draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Now tell me what PCI-E bandwidth/lane count has to do with all of this. Something tells me there's still quite a bit missing from this picture. Like, does it make x16 more or less necessary/advantageous for each available GPU, for instance?

Yeah, that's one thing I've been wondering about: whether full bandwidth will be required, and how much of a hit you'd take at, say, x8 or even PCI-E 2.0.

Why? We've seen benchmark results on PCI-E lane scaling, and even x4 2.0 slots barely touch performance.

What he means is: if both cards' RAM is being shared, so with 2x 4GB cards you have use of all 8GB between the GPUs, what issues might moving data from one card to the other bring up, given the limited PCI-E bandwidth compared to the speed of the memory on the GPU? It all sounds like a great idea in theory, until you hit the possible logistical issues of one card having to access the second card's memory over PCI-E, which could cause FPS to tank. A lot is still unknown about it.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
The cards won't need to transfer more data between each other, they'll transfer *less* - the bandwidth will go to the cards as if they were a single GPU.

That said, W1zzard has done benchmarks on PCI-E performance scaling, and generally x4 2.0 is enough for 99% of single-GPU cards out there - so an x8 slot or x4 3.0 will have plenty of bandwidth for uses like this.
 
Joined
Jun 13, 2012
Messages
1,328 (0.31/day)
The cards won't need to transfer more data between each other, they'll transfer *less* - the bandwidth will go to the cards as if they were a single GPU.

That said, W1zzard has done benchmarks on PCI-E performance scaling, and generally x4 2.0 is enough for 99% of single-GPU cards out there - so an x8 slot or x4 3.0 will have plenty of bandwidth for uses like this.

Um, that second line seems like you missed the point completely. For card 1 to talk to card 2, if it's one giant RAM pool, the data would have to go over the PCI-E bus, which means going back through the CPU, since that's where the controller is (unless you have a board with a separate PLX chip). Just because performance scaling shows a GPU doesn't need full PCI-E bandwidth now, if both GPUs' RAM acts like one big pool, it could in theory start maxing out the PCI-E bus whenever data has to be moved from one card to the other. And because PCI-E is MUCH slower than the VRAM on a video card, that could cause FPS drops. Even if it doesn't saturate PCI-E completely, the extra latency could cause drops as well.
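The gap being argued about here is easy to put in rough numbers. A minimal sketch (all figures approximate and merely illustrative of 2015-class hardware, not measured values):

```python
# Rough comparison: a card's local VRAM bandwidth vs. the PCIe link that
# any cross-card memory access would have to traverse. Illustrative only.

def gddr5_bandwidth_gbs(clock_mhz=1250, bus_width_bits=256):
    # GDDR5 transfers 4 bits per pin per clock (quad data rate).
    return clock_mhz * 1e6 * 4 * (bus_width_bits / 8) / 1e9

def pcie_bandwidth_gbs(lanes=16, gen=3):
    # Approximate usable per-lane throughput in GB/s after encoding overhead.
    per_lane = {1: 0.25, 2: 0.5, 3: 0.985}[gen]
    return lanes * per_lane

vram = gddr5_bandwidth_gbs()   # ~160 GB/s for a midrange 2015 card
link = pcie_bandwidth_gbs()    # ~15.8 GB/s for PCIe 3.0 x16
print(f"VRAM: {vram:.0f} GB/s, PCIe 3.0 x16: {link:.0f} GB/s "
      f"(~{vram / link:.0f}x slower over the bus)")
```

Even in the best case (a full x16 gen-3 slot), remote memory access runs an order of magnitude slower than local VRAM, before counting the latency of the round trip through the CPU's root complex.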
 

Mussels

Why are they going to need more bandwidth than Crossfire or SLI currently uses? Right now VRAM is duplicated between them, which is a lot of unnecessary traffic. Cut that out, and things should get easier, not harder.

If you can run a single card in a slot without performance issues, then running it as part of Crossfire or SLI will be exactly the same.
 
Joined
Aug 22, 2014
Messages
39 (0.01/day)
System Name Bird's Monolith
Processor Intel i7-4770k 4.6 GHz (liquid metal)
Motherboard Asrock Z87 Extreme3
Cooling Noctua NH-D14, Noctua 140mm Case Fans
Memory 16 GB G-Skill Trident-X DDR3 2400 CAS 9
Video Card(s) EVGA 1080ti SC2 Hybrid
Storage 2 TB Mushkin Triactor 3D (RAID 0)
Display(s) Dell S2716DG / Samsung Q80R QLED TV
Case Fractal Design Define R4
Audio Device(s) Audio Engine D1 DAC, A5+ Speakers, SteelSeries Arctis 7
Power Supply Seasonic Platinum 660 W
Mouse SteelSeries Rival
Keyboard SteelSeries Apex
Software Windows 10 Pro x64
This will not work the way you're expecting it to. It is not feasible to just expect the VRAM to add up, and here is why. Currently GPUs use alternate frame rendering (AFR) to render the same scene. Because they are rendering the same scene, it makes sense that each GPU requires access to all the same assets. Currently SLI and Crossfire use their bridges (or XDMA, in the case of AMD's bridgeless Crossfire) to transfer framebuffers back and forth, either for final output or to supply additional pixel info for temporal shading effects. There is no feasible way, from an engineering standpoint, for GPUs to cross-access each other's memory. Essentially you would have to build a bridge that crosses over and merges both memory buses and handles handshaking; the cost/benefit ratio and the diminishing returns from the added overhead make it impractical.

For this reason, data is duplicated instead of cross-accessed. Now Mantle (and DX12) lets the developer explicitly control what gets loaded into the memory of each GPU independently; however, each GPU is still limited to rendering what it has the assets for. From an alternate frame rendering standpoint this isn't very useful: you're rendering the same scene, so both GPUs still need all the same assets.
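The duplication point can be put in simple arithmetic. A sketch with a hypothetical `shared_fraction` parameter (the portion of assets every GPU must hold a copy of, chosen purely for illustration):

```python
# Usable asset memory across two cards: classic AFR duplicates everything,
# so two 4 GB cards still behave like 4 GB. Explicit APIs let the developer
# place some data on only one GPU; only that non-shared portion adds up.

def usable_vram_gb(card_gb, n_cards, shared_fraction):
    # shared_fraction = 1.0 means full duplication (today's AFR behavior).
    unique_per_card = card_gb * (1 - shared_fraction)
    return card_gb * shared_fraction + unique_per_card * n_cards

print(usable_vram_gb(4, 2, shared_fraction=1.0))  # AFR today: 4.0 GB usable
print(usable_vram_gb(4, 2, shared_fraction=0.5))  # half duplicated: 6.0 GB
```

The "2x 4GB = 8GB" headline only holds at `shared_fraction=0`, i.e. no asset shared between GPUs at all, which AFR of a single scene can't achieve.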

Now, if you could somehow isolate certain objects from the scene, have the GPUs render those independently, and then superimpose their pixel data into a final frame, it could maybe work that way. The developer would have to ensure that the overhead of such a procedure doesn't exceed the benefit of the increased rendering throughput. Does that make sense? For example, have one GPU draw the environment and the second GPU draw the objects in the environment: characters, buildings, etc. There is overhead whenever one part of the image relies on pixel data from the other. For example, if a developer is running a sharpening filter, do they run it all at once after the final frame is composed, or do they transfer framebuffers on the fly? There is some optimization potential there, and I think it would certainly be cool, but as far as I know no game engines have this kind of functionality yet, and it's definitely not something that will start working with the flick of a switch.
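The compositing overhead this describes can be estimated. A rough sketch of the cost of shipping one GPU's finished frame across PCIe for the other to superimpose (link figure assumed to be PCIe 3.0 x16; all numbers illustrative):

```python
# Per-frame cost of transferring a completed framebuffer over PCIe for
# compositing, as a share of a 60 fps frame budget. Illustrative figures.

def transfer_ms(width, height, bytes_per_pixel=4, link_gbs=15.75):
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes / (link_gbs * 1e9) * 1000

budget_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps
for w, h in [(1920, 1080), (3840, 2160)]:
    t = transfer_ms(w, h)
    print(f"{w}x{h}: {t:.2f} ms ({t / budget_ms * 100:.1f}% of frame budget)")
```

A single handoff per frame looks affordable; the danger is in effects that need several round trips per frame, where that percentage multiplies.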

Now, where I think this is a really cool idea: imagine you're playing an open-world game split-screen with a friend. Traditionally in that kind of scenario the two characters are not allowed out of the same general area of each other, limited by how many assets can be stored and managed. However, if you could allocate an independent GPU to each person, you'd both be able to roam the world as far away from each other as you want, provided your CPU and RAM allow it.

This has some super cool applications! Imagine a computer with 4 GPUs in it: you could have each one drive a separate monitor, and four people could participate in the same game, all playing on the same machine! FPS games with no network latency, open-world games with no proximity limits. I have two monitors and an SLI setup; it would be freakin' rad if I could play GTA V with a friend and we each had our own monitor. Just plug in a second controller and go!

I really hope developers are thinking of these kinds of possibilities.

Why are they going to need more bandwidth than Crossfire or SLI currently uses? Right now VRAM is duplicated between them, which is a lot of unnecessary traffic. Cut that out, and things should get easier, not harder.

If you can run a single card in a slot without performance issues, then running it as part of Crossfire or SLI will be exactly the same.

VRAM is duplicated between cards, but it's not done over the SLI/Crossfire bridge; it's all loaded over PCIe by simply sending both cards the same data. Framebuffers are sent across the SLI bridge for output or for sequential frame-dependent image operations. The bandwidth of an SLI/Crossfire bridge would be wholly inadequate for high-speed memory operations on the scale of VRAM.
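That bridge-bandwidth claim checks out with rough math. A sketch assuming ~1 GB/s for the classic SLI connector (an approximate, commonly cited figure, not a spec quote):

```python
# The classic SLI bridge (~1 GB/s, approximate) handles framebuffer
# handoff comfortably but is hopeless for VRAM-scale traffic.

BRIDGE_GBS = 1.0  # assumed classic SLI bridge throughput

def framebuffer_traffic_gbs(width, height, fps, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * fps / 1e9

fb = framebuffer_traffic_gbs(1920, 1080, 60)
print(f"1080p60 framebuffer handoff: {fb:.2f} GB/s "
      f"({fb / BRIDGE_GBS * 100:.0f}% of bridge capacity)")
# Compare with a midrange card's ~160 GB/s of local VRAM bandwidth:
# the bridge falls short by two orders of magnitude for general
# memory access, which is why data is duplicated over PCIe instead.
```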
 
Last edited:
Joined
Apr 8, 2012
Messages
575 (0.13/day)
System Name Main rig
Processor Intel i7-4790K Devils Canyon
Motherboard Asus Z97-A
Cooling Antec Kuhler 620
Memory 16GB Corsair Vengeance Pro
Video Card(s) Sapphire R9-290 Vapor-X
Storage Samsung 840 EVO 1TB
Case Corsair 600T
Audio Device(s) Sound Blaster X-FI HD Platinum
Power Supply Corsair AX750 Gold
I think they mean a more efficient use, such as each card rendering different assets, not necessarily additive memory.
 
Joined
Sep 17, 2014
Messages
20,949 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I will believe this rumor when it actually hits a release; until then, I'll write it off as an empty promise. I say this with the assumption that we are also going to be able to use cards from Nvidia and AMD together.

It all sounds great, but it only counts when we see it perform. The reason is simple: look at Mantle as it is currently versus what it was promised to be.

@ShredBird: I would also imagine that the split between GPUs could be done on the post-processing and AA front. Have one GPU handle geometry and objects, and the other handle the post-processing layer and AA, which can eat up tons of VRAM.
 
Joined
Apr 8, 2012
Messages
575 (0.13/day)
I'm not sure what was promised with Mantle that it didn't deliver.

Works awesome for me.
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
You must not play BF4? A AAA title that it still isn't working well with (a memory leak, to name one issue). AMD also anticipated a lot more market saturation with it by now, and it just hasn't taken off.
 
Joined
Apr 8, 2012
Messages
575 (0.13/day)
No problems for me: BF4, Thief, DAI, Civ: BE.

I get about a 20 FPS uplift in each of those games over DX10/11.
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
It's not all about FPS... As I said, there are known problems with Mantle in BF4, memory leaks and other issues among them (feel free to Google). I cannot use it with a 295x2 in BF4, even with one GPU disabled, no matter what driver: it stutters, and after a couple of hours my memory fills up and it stutters even more.

Perhaps it has been fixed since I last tried it a couple of months ago, but the DX11 API is a lot more stable for me in BF4.
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,875 (3.07/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000\Paradigm P Studio 20, Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
You must not play BF4? A AAA title that it still isn't working well with (a memory leak, to name one issue). AMD also anticipated a lot more market saturation with it by now, and it just hasn't taken off.

It's not all about FPS... As I said, there are known problems with Mantle in BF4, memory leaks and other issues among them (feel free to Google). I cannot use it with a 295x2 in BF4, even with one GPU disabled, no matter what driver: it stutters, and after a couple of hours my memory fills up and it stutters even more.

Perhaps it has been fixed since I last tried it a couple of months ago, but the DX11 API is a lot more stable for me in BF4.

I played it for a few hours maxed out on a single 290 at 3200x1800 without a single issue. The game sucks because, well, the game just sucks anyway.
 
Joined
Mar 19, 2005
Messages
302 (0.04/day)
Location
Chicago
System Name Big Beast
Processor AMD Ryzen 7 5800x3D
Motherboard Asus Strix x570-I Gaming
Cooling NZXT AIO
Memory Patriot Viper 16GB 3666MHz
Video Card(s) AMD Radeon 7900XTX
Storage Lots of it
Display(s) Samsung CHG70 27"
Case NZXT H1
Power Supply 850watt
Software Windows 11 Professional x64
Now, if you could somehow isolate certain objects from the scene, have the GPUs render those independently, and then superimpose their pixel data into a final frame, it could maybe work that way. The developer would have to ensure that the overhead of such a procedure doesn't exceed the benefit of the increased rendering throughput. Does that make sense? For example, have one GPU draw the environment and the second GPU draw the objects in the environment: characters, buildings, etc.

Lucid Virtu does this:
They talk about it at around the 2:00 mark. There is a better demo of it using Unreal Tournament 3, but I couldn't find it.
 

AsRock

Lucid sucks balls, sorry, it's crap: it never worked correctly for me, with one issue or another, like not swapping to the add-in card, and 20-30W of extra power usage at idle.

Been there, tried it, and dumped it.
 
Joined
Aug 22, 2014
Messages
39 (0.01/day)
Lucid Virtu does this:
They talk about it at around the 2:00 mark. There is a better demo of it using Unreal Tournament 3, but I couldn't find it.

Do manufacturers even put Lucid on boards anymore? And LMAO at this video: they claim that buying identical video cards and installing a multi-GPU bridge is too difficult for users, and therefore there was a market need for Lucid. While it's pretty cool that it can isolate assets like that, I think it'll be much better handled at the API level.
 

Mussels

Lucid sucks balls, sorry, it's crap: it never worked correctly for me, with one issue or another, like not swapping to the add-in card, and 20-30W of extra power usage at idle.

Been there, tried it, and dumped it.

My motherboard supports it and I tried to use it, but it basically relied on timely software updates to recognise titles, and that never happened.
 

AsRock

My motherboard supports it and I tried to use it, but it basically relied on timely software updates to recognise titles, and that never happened.

Well, after seeing the issues I was having - the promised better idle power usage and powering down the main card when it wasn't required both failed - I went to their site to check for updates, only to find they wanted me to pay for upgrades, so I thought fck it.

Free at this time, but typically $55 ^^. May try it again just to see if it fails as badly.

http://lucidlogix.com/free-upgrade/

EDIT: Maybe not, as their website apparently is not kept up to date lmao (unless that message is wrong too lol).
 
Last edited:

Mussels

Well, after seeing the issues I was having - the promised better idle power usage and powering down the main card when it wasn't required both failed - I went to their site to check for updates, only to find they wanted me to pay for upgrades, so I thought fck it.

Same here. I wanted to use it to lower power consumption in 2D, but noooope.
 