
Building new all-purpose computer (gaming, video, file-server) with Ryzen 7950X and ASUS Crosshair Extreme X670E/Gigabyte Aorus Extreme/MSI Godlike

My current computer, an i7-7920X, is getting long in the tooth, and I am planning a new computer for gaming, video/photo work, BOINC (World Community Grid), mining and file serving.
I think I will get the Phanteks Enthoo 719 case (it is under 24 inches high, has space for 12 HDDs and has a front USB-C port). I already bought an EVGA T2 1600W power supply on sale.
I have three 3090s (EVGA FTW x2, MSI Suprim X).
In the new system I want to use the two EVGA 3090s (as DaVinci Resolve Studio and other programs can make use of them).
From the motherboard I need 10G Ethernet, 2x USB4, 3-5 M.2 slots and one additional PCIe slot (x4 Gen 3?) to connect another 6-8 HDDs (motherboards have about 6 SATA connectors).
I usually run Windows 11, but I was wondering about running Windows Server or Proxmox and having my daily system as a VM with direct access to the graphics card. That way, if a Windows update borks the system, I can just roll back to the last working snapshot.
1. Case: I looked at E-ATX cases from Fractal Design (Define XL and Meshify 2), Thermaltake (Level 20 GT), Phanteks (Enthoo Pro 2) and Corsair (5000D and 7000D). The Enthoo 719 fits best under the desk (under 24 inches high) and is not too wide.
2. MB: I have had motherboards from Asus, ASRock and Gigabyte with good results. I could go with any of the three flagship boards from Asus, Gigabyte and MSI, as all have 2x PCIe 5.0 slots (x16, or x8/x8), 10G Ethernet and USB4. I need an additional PCIe slot for SATA drives.
3. OS: is it worth dealing with Proxmox or Windows Server and running the OS as a VM? On my current computers I have Windows 11 on two M.2 drives; if one crashes, I use the other (had to do that recently).
I will use 360mm AIO water cooling (I have a Phanteks available) or could do a Corsair 420mm one.
Any suggestions?
 
I would seriously consider dividing your computing tasks between two physically separate machines. Going by how you would like to run your OS, I think you are inviting problems by running it on top of extra layers of software like VMs. In situations like this I always keep in mind a saying attributed to Einstein: "Make everything as simple as possible, but not simpler."
 
Buy a 3090 for £400...
 
I would seriously consider dividing your computing tasks between two physically separate machines. Going by how you would like to run your OS, I think you are inviting problems by running it on top of extra layers of software like VMs. In situations like this I always keep in mind a saying attributed to Einstein: "Make everything as simple as possible, but not simpler."
I already have an HTPC/gaming computer hooked up to the living room TV and a couple of others running. The wife wants me to simplify the whole setup. As all three premium motherboards have 4-5 M.2 slots, I am thinking about using one just for gaming (OS + games) and one for productivity (DaVinci Resolve, Adobe cloud). That should make the system more resilient by separating the gaming setup from the productivity setup. I can get an external M.2 reader and clone those drives periodically.
 
@sgunes
What are your monitors?

Consider an active backplate instead of an AIO, or use Thermalright Extreme Odyssey pads for the VRAM...
 
@sgunes
What are your monitors?

Consider an active backplate instead of an AIO, or use Thermalright Extreme Odyssey pads for the VRAM...
Monitors, as in screens: LG 34" 5x2k.
Monitors, as in hardware monitoring: HWiNFO64. I keep VRAM under 94 °C by limiting power to about 70%; memory at +1200 MHz.

Regarding the case, I currently favor the Fractal Design Define 7 XL (space for a Blu-ray disc drive, 15 3.5" HDDs and 5 2.5" SSDs).
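Since the VRAM target above is managed by capping power, a small script can cross-check the power/temperature side of that from the command line. This is a minimal sketch, assuming `nvidia-smi` is on the PATH; note that `nvidia-smi` exposes core temperature and power draw but not the VRAM junction temperature, which is why HWiNFO64 remains the tool of record on Windows. The 94 °C limit here is the figure from the post, applied to the core reading only as a rough proxy.

```python
import subprocess

def parse_smi_csv(output: str) -> list[dict]:
    """Parse `nvidia-smi --query-gpu=... --format=csv,noheader,nounits` output."""
    rows = []
    for line in output.strip().splitlines():
        index, temp, power = [field.strip() for field in line.split(",")]
        rows.append({"index": int(index), "temp_c": float(temp), "power_w": float(power)})
    return rows

def read_gpus() -> list[dict]:
    """Query all NVIDIA GPUs via nvidia-smi (must be installed and on PATH)."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=index,temperature.gpu,power.draw",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_smi_csv(out)

if __name__ == "__main__":
    # Offline demo with a captured sample line; call read_gpus() on a real machine.
    TEMP_LIMIT_C = 94  # ceiling mentioned above; VRAM junction can read hotter than core
    sample = "0, 65, 280.5\n1, 72, 310.0\n"
    for gpu in parse_smi_csv(sample):
        flag = "OVER" if gpu["temp_c"] >= TEMP_LIMIT_C else "ok"
        print(f"GPU {gpu['index']}: {gpu['temp_c']:.0f} C, {gpu['power_w']:.0f} W [{flag}]")
```

The power cap itself can be set with `nvidia-smi -pl <watts>` (requires admin rights), which is the CLI equivalent of the ~70% slider mentioned above.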
 
Given your system's use case, I'll suggest you build at least three separate machines instead of one.

A computer for gaming and video/photo. (Don't even think about using a mainstream PC with dual GPUs for DaVinci Resolve or any other software; an HEDT PC is needed there. Believe me, I've tested hundreds of hardware configurations for various customer workflows.)

A computer for BOINC (World Community Grid) and mining. (These might even be separated as well, depending on how intensively you want to mine and run GPU computing on BOINC. Mining is one of the two most degrading operations you can run on a system; I don't suggest doing it on a daily system that contains files of high importance.)

A computer for file serving. (Depending on the files' importance, you can use consumer or enterprise hardware for the system; my software suggestion would be TrueNAS SCALE.)
 
Given your system's use case, I'll suggest you build at least three separate machines instead of one.

A computer for gaming and video/photo. (Don't even think about using a mainstream PC with dual GPUs for DaVinci Resolve or any other software; an HEDT PC is needed there. Believe me, I've tested hundreds of hardware configurations for various customer workflows.)

A computer for BOINC (World Community Grid) and mining. (These might even be separated as well, depending on how intensively you want to mine and run GPU computing on BOINC. Mining is one of the two most degrading operations you can run on a system; I don't suggest doing it on a daily system that contains files of high importance.)

A computer for file serving. (Depending on the files' importance, you can use consumer or enterprise hardware for the system; my software suggestion would be TrueNAS SCALE.)
Thanks for the suggestions. I mostly agree with your assessment.
I already have a QNAP with 18 drives (TS-832X and 2 expansions) as my primary file server, so the drives in the new build (transplanted from my current setup) would just be a backup to the QNAP.
I have 4 other computers doing mining/BOINC and the plan is to have the new machine doing the same when I am not doing anything with it (>90% of the time).
I looked at the Threadripper PRO line, but they are priced extraordinarily high now, and I think a 16-core/32-thread 7950X OC'd with DDR5-6000 (CL30) would probably beat the 24-core/48-thread 5965WX (with DDR4-3200 memory), which is priced at 2,399 USD, not just in single-core but also in multi-core tasks. For that price I can almost get the CPU + MB + 128GB of DDR5-6000 memory. G.SKILL Trident Z5 RGB Series 64GB (2 x 32GB) 288-Pin PC RAM DDR5 6000 Intel XMP 3.0 Desktop Memory Model F5-6000J3040G32GX2-TZ5RK - Newegg.com. I don't really need more than 128GB of memory, and the MSI Godlike gives me enough PCIe lanes to cover everything.
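The value argument above can be put into rough numbers. This is a back-of-the-envelope sketch: the $2,399 price for the 5965WX comes from the post, while the $699 figure for the 7950X is an assumed launch MSRP, not a quoted price.

```python
def cost_per_core(price_usd: float, cores: int) -> float:
    """Naive cost-per-core metric; ignores clocks, IPC, memory and platform cost."""
    return price_usd / cores

# 5965WX price is from the post above; the 7950X price is an assumption.
tr_5965wx = {"price": 2399.0, "cores": 24}
r9_7950x  = {"price": 699.0,  "cores": 16}   # assumed launch MSRP

tr  = cost_per_core(tr_5965wx["price"], tr_5965wx["cores"])
am5 = cost_per_core(r9_7950x["price"],  r9_7950x["cores"])
print(f"5965WX: ${tr:.2f}/core, 7950X: ${am5:.2f}/core, ratio {tr / am5:.1f}x")
```

Even granting the Threadripper extra multi-core scaling from its additional cores and memory channels, a per-core price gap of more than 2x is what makes the "CPU + MB + 128GB for the price of one HEDT chip" point in the post plausible.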
I don't work in IT or professional video production, so nothing is really mission-critical. This is more for personal travel videos/pictures.
 
I'm not fond of QNAP, Synology, Asustor or TerraMaster. They are not as reliable as they should be, nor as flexible as I would like them to be. I suggest you use the QNAP as backup or cold storage instead of as your primary file server. Those machines (all of the brands I listed) have a tendency to break down close to the end of their warranty period or just after it. I prefer to build my own with more reliable products (Supermicro SoC boards, for example). Keeping a separate file server will also prolong your disks' lifetime and reduce possible accidents and issues.

As for AM5, I can't speculate since it is not out yet and I don't know its performance; I can only give suggestions about what is already here. But I can say this much: even if you buy the highest-end X670E motherboard, the PCIe availability will be the same because of the CPU's PCIe lane count (unless an AIB partner offers a workstation-grade motherboard with a PCIe 5.0 switch included, which is unlikely).

I'm still waiting for my Threadripper PRO 5000 series CPUs to arrive for further testing; I'll test those with PBO and manual OC under liquid cooling and compare against stock performance. Currently we offer the 3975WX for multi-GPU setups.

I'll send you a link to a DaVinci Resolve performance article from one of our competitors (our brand is not publicly well known, but our customers are: Walt Disney, Sony, Warner Bros, etc.), since the information in it is valid and correct.

DaVinci Resolve article

Here you'll see that a mainstream system with a single GPU is more than enough. Multi-GPU setups are a little overboard for those platforms.

If you have other computers for mining/BOINC, I would suggest leaving the new system out of that operation. You don't have to put all of your eggs in a single basket and increase its chances of falling. When you degrade your daily system and start to have issues, it is not funny; don't ask how I know.
 
The article states explicitly that DaVinci Resolve Studio (the paid version) can use multiple GPUs.
Regarding the other suggested upgrades: you seem to have missed the point that I am not an IT or video professional and this is just a hobby.
I have probably more computing power, redundancy and resiliency than 99% of all hobbyists.
 
Honestly, I would go AM5 over any sort of HEDT. You can put plenty of RAM on these systems, and the Godlike board supports dual PCIe 5.0 slots, so there should be no issue upgrading way down the line.

If you have the cash to burn, then go for dual 3090s, even if I hate myself for saying that...

Still cheaper than HEDT.
 
My current computer, an i7-7920X, is getting long in the tooth, and I am planning a new computer for gaming, video/photo work, BOINC (World Community Grid), mining and file serving.
I think I will get the Phanteks Enthoo 719 case (it is under 24 inches high, has space for 12 HDDs and has a front USB-C port). I already bought an EVGA T2 1600W power supply on sale.
I have three 3090s (EVGA FTW x2, MSI Suprim X).
In the new system I want to use the two EVGA 3090s (as DaVinci Resolve Studio and other programs can make use of them).
From the motherboard I need 10G Ethernet, 2x USB4, 3-5 M.2 slots and one additional PCIe slot (x4 Gen 3?) to connect another 6-8 HDDs (motherboards have about 6 SATA connectors).
I usually run Windows 11, but I was wondering about running Windows Server or Proxmox and having my daily system as a VM with direct access to the graphics card. That way, if a Windows update borks the system, I can just roll back to the last working snapshot.
1. Case: I looked at E-ATX cases from Fractal Design (Define XL and Meshify 2), Thermaltake (Level 20 GT), Phanteks (Enthoo Pro 2) and Corsair (5000D and 7000D). The Enthoo 719 fits best under the desk (under 24 inches high) and is not too wide.
2. MB: I have had motherboards from Asus, ASRock and Gigabyte with good results. I could go with any of the three flagship boards from Asus, Gigabyte and MSI, as all have 2x PCIe 5.0 slots (x16, or x8/x8), 10G Ethernet and USB4. I need an additional PCIe slot for SATA drives.
3. OS: is it worth dealing with Proxmox or Windows Server and running the OS as a VM? On my current computers I have Windows 11 on two M.2 drives; if one crashes, I use the other (had to do that recently).
I will use 360mm AIO water cooling (I have a Phanteks available) or could do a Corsair 420mm one.
Any suggestions?

Well, that's a lot of stuff on a single computer, and a single point of failure. You might use a hypervisor to run multiple OSes on the new box to keep some separation between the different things you want to do, but that kind of setup is way out of my league. Having separate machines sounds like a better option overall for balancing what you want to do, managing performance per task, and limiting the impact of downtime and failures.

Your requirements seem much greater than mine, but I figured I'd share my general VM experience below.

I use VMware Workstation as a daily driver for my work (I don't think VMware Workstation supports direct GPU access, but it does support 3D acceleration), and it's OK for my tasks, which aren't too demanding (I configure 6 cores and 20GB of RAM for a Win10 guest) on my 5950X host with 64GB of RAM and PCIe storage. I never really tried gaming in the VM other than a test of Age of Empires II, which ran just fine.

Snapshots are nice for making daily backups of the VM easier and faster, but if you need to restore a whole image back to your local disk, the file copy might take a long time, especially over gigabit Ethernet. A QLC drive is not recommended, as the copy will take forever once the cache is exhausted. In my case I use a USB3 disk as the primary daily backup, because restoring from it is almost 2x faster, and a NAS as a secondary daily backup for a redundant copy with additional disk redundancy.

On the plus side, having at least two machines capable of running the VM means that if one machine goes down, you can transfer your backup to the other machine and still have a daily VM to fire up without additional setup. My daily machine literally blew up on me recently, and I was able to transfer my VM to my backup/gaming PC, which really saved my butt until I got a replacement daily PC.

Other thoughts: AM5 is of course going to be brand new, and who knows what growing pains there will be. Also, if you're dealing with a lot of file-serving data and VMs, full ECC RAM support might be a consideration. If you want to run the GPU through a VM, you will need to figure out what support is needed for that; from reading about it a long time ago, I think UEFI/BIOS and hypervisor support are both required for it to work.
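The two-tier backup habit described here (USB3 disk as primary, NAS as secondary) is easy to script with a checksum pass, so a silently corrupted VM image never becomes the only copy. This is a minimal sketch with hypothetical paths; real VM backup tooling would also handle snapshots and open files.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path, chunk: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large VM images don't need to fit in RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def backup_verified(src: Path, dests: list[Path]) -> None:
    """Copy src into each destination directory and verify every copy by checksum."""
    want = sha256_of(src)
    for dest in dests:
        target = dest / src.name
        shutil.copy2(src, target)  # copy2 preserves timestamps
        if sha256_of(target) != want:
            raise IOError(f"checksum mismatch for {target}")
        print(f"verified {target}")

# Hypothetical paths: USB3 disk as primary backup, NAS share as secondary.
# backup_verified(Path("daily-vm.vmdk"), [Path("E:/vm-backup"), Path("//nas/vm-backup")])
```

The checksum pass roughly doubles read traffic, but over USB3 or 10G that is still far cheaper than discovering a bad image during a restore.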
 
The article states explicitly that DaVinci Resolve Studio (the paid version) can use multiple GPUs.
Regarding the other suggested upgrades: you seem to have missed the point that I am not an IT or video professional and this is just a hobby.
I have probably more computing power, redundancy and resiliency than 99% of all hobbyists.
Currently I run DaVinci Resolve with 8K RAW 60fps footage on a 12-core/24-thread Ryzen 9 5900X with only one NVIDIA 3090 and 64GB of DDR4-3200 RAM. The 8K footage runs well in a 4K 30fps timeline (without proxies), but playback sometimes struggles slightly (depending on the color grading and the workflow; if I first cut and then do the color grading, there is no problem). For the rest of DR my PC is fine, and it renders 8K output at 16-17fps when I use the right codec.

Like you, I am not a professional videographer, but I am looking forward to something faster, to depend less on the workflow and to save time. An AM4 5950X would do about 12% better than what I have, but an AM5 Ryzen 9 7950X (and also an Intel 13900K) would absolutely destroy my current PC in DR Studio. DDR5 will bring a big boost to DR, as we can see from some DR benchmarks (12900K and 13900K with DDR4 vs DDR5). If I were using only 4K, I would stay with my AM4 5900X; there I never had any issues.

In your case, where I guess you don't use 8K, a 5950X or a 12900K would already be great, and a 7950X (or a 13900K) would be a nice-to-have, more than you need, with wow-level performance. So you don't need a Threadripper or an Epyc; with a 7950X you would be ready for when you move to 8K. I would absolutely go for AM5, as this socket should also support the next two generations of AMD CPUs after the 7950X, which would make for an easy upgrade if you unexpectedly need more power, while on AM4 or any current Intel platform your motherboard will not accept new generations of CPUs. And any future 8950X or 9950X should easily beat most or all of the current Threadrippers in DaVinci Resolve, for way less money.
 
Currently I run DaVinci Resolve with 8K RAW 60fps footage on a 12-core/24-thread Ryzen 9 5900X with only one NVIDIA 3090 and 64GB of DDR4-3200 RAM. The 8K footage runs well in a 4K 30fps timeline (without proxies), but playback sometimes struggles slightly (depending on the color grading and the workflow; if I first cut and then do the color grading, there is no problem). For the rest of DR my PC is fine, and it renders 8K output at 16-17fps when I use the right codec.

Like you, I am not a professional videographer, but I am looking forward to something faster, to depend less on the workflow and to save time. An AM4 5950X would do about 12% better than what I have, but an AM5 Ryzen 9 7950X (and also an Intel 13900K) would absolutely destroy my current PC in DR Studio. DDR5 will bring a big boost to DR, as we can see from some DR benchmarks (12900K and 13900K with DDR4 vs DDR5). If I were using only 4K, I would stay with my AM4 5900X; there I never had any issues.

In your case, where I guess you don't use 8K, a 5950X or a 12900K would already be great, and a 7950X (or a 13900K) would be a nice-to-have, more than you need, with wow-level performance. So you don't need a Threadripper or an Epyc; with a 7950X you would be ready for when you move to 8K. I would absolutely go for AM5, as this socket should also support the next two generations of AMD CPUs after the 7950X, which would make for an easy upgrade if you unexpectedly need more power, while on AM4 or any current Intel platform your motherboard will not accept new generations of CPUs. And any future 8950X or 9950X should easily beat most or all of the current Threadrippers in DaVinci Resolve, for way less money.
I agree with you and with what A Computer Guy wrote.
As a hobbyist (prosumer), my planned 7950X system with the Asus ROG Crosshair X670E (as that is the only current flagship MB with USB4 built in), 128GB of DDR5-6000 memory, Arctic or Corsair 420mm water cooling, and my dual EVGA 3090 FTW Ultras should be plenty powerful. Currently I am thinking about using one Gen 5 M.2 with Windows 11 for gaming and general stuff, and one Gen 5 M.2 with Windows 11 for DaVinci and Adobe (Lightroom and Photoshop). I still have a 5950X with an MSI Suprim X 3090 in my HTPC/gaming PC as backup.
I might be able to OC this system more easily than a TR or Epyc system.
Threadripper and Epyc would be way more expensive, and overkill.

Well, that's a lot of stuff on a single computer, and a single point of failure. You might use a hypervisor to run multiple OSes on the new box to keep some separation between the different things you want to do, but that kind of setup is way out of my league. Having separate machines sounds like a better option overall for balancing what you want to do, managing performance per task, and limiting the impact of downtime and failures.

Your requirements seem much greater than mine, but I figured I'd share my general VM experience below.

I use VMware Workstation as a daily driver for my work (I don't think VMware Workstation supports direct GPU access, but it does support 3D acceleration), and it's OK for my tasks, which aren't too demanding (I configure 6 cores and 20GB of RAM for a Win10 guest) on my 5950X host with 64GB of RAM and PCIe storage. I never really tried gaming in the VM other than a test of Age of Empires II, which ran just fine.

Snapshots are nice for making daily backups of the VM easier and faster, but if you need to restore a whole image back to your local disk, the file copy might take a long time, especially over gigabit Ethernet. A QLC drive is not recommended, as the copy will take forever once the cache is exhausted. In my case I use a USB3 disk as the primary daily backup, because restoring from it is almost 2x faster, and a NAS as a secondary daily backup for a redundant copy with additional disk redundancy.

On the plus side, having at least two machines capable of running the VM means that if one machine goes down, you can transfer your backup to the other machine and still have a daily VM to fire up without additional setup. My daily machine literally blew up on me recently, and I was able to transfer my VM to my backup/gaming PC, which really saved my butt until I got a replacement daily PC.

Other thoughts: AM5 is of course going to be brand new, and who knows what growing pains there will be. Also, if you're dealing with a lot of file-serving data and VMs, full ECC RAM support might be a consideration. If you want to run the GPU through a VM, you will need to figure out what support is needed for that; from reading about it a long time ago, I think UEFI/BIOS and hypervisor support are both required for it to work.
I downloaded Hyper-V VMs with Windows 11 from Microsoft, and there is a way to assign a graphics card to a VM. With the new Zen 4 Ryzens, which have their own graphics built in, the hypervisor could use the GPU in the CPU, and the VM would then have all the power of the discrete graphics card available to it.
 
Be careful which motherboard you buy regarding virtualization and GPU passthrough. To do it with Hyper-V, you need a motherboard with Access Control Services (ACS) and IOMMU support. The ACS part is not well documented; IOMMU alone is not enough.

You can test passthrough capability with this script:

You can also share a GPU between virtual machines with Microsoft GPU-P. Check it out here:

 
Be careful which motherboard you buy regarding virtualization and GPU passthrough. To do it with Hyper-V, you need a motherboard with Access Control Services (ACS) and IOMMU support. The ACS part is not well documented; IOMMU alone is not enough.

You can test passthrough capability with this script:

You can also share a GPU between virtual machines with Microsoft GPU-P. Check it out here:

I am currently leaning towards the Asus Crosshair X670E (as it is the only one with 10G Ethernet and USB4 built in). Is there any way to find out if this MB has ACS and IOMMU? The manual for the Asus is not available on the website yet.
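Spec sheets rarely list ACS, so beyond the vendor manual the practical check is empirical: boot a Linux live USB on the board (with IOMMU enabled in UEFI) and look at how finely devices are split into groups under `/sys/kernel/iommu_groups` — good ACS support shows up as each GPU isolated in its own small group. A sketch of that check (Linux-only; the heuristic in `passthrough_friendly` is a rule of thumb, not a guarantee):

```python
from collections import defaultdict
from pathlib import Path

def iommu_groups(root: Path = Path("/sys/kernel/iommu_groups")) -> dict[str, list[str]]:
    """Map IOMMU group number -> list of PCI addresses in that group."""
    groups: dict[str, list[str]] = defaultdict(list)
    # Layout is /sys/kernel/iommu_groups/<group>/devices/<pci-address>
    for dev in root.glob("*/devices/*"):
        groups[dev.parts[-3]].append(dev.name)
    return dict(groups)

def passthrough_friendly(groups: dict[str, list[str]], pci_addr: str) -> bool:
    """A device is easy to pass through when it shares its group with (almost) nothing."""
    for members in groups.values():
        if pci_addr in members:
            # A lone device, or a GPU plus its own audio function, is the good case.
            return len(members) <= 2
    raise KeyError(f"{pci_addr} not found in any IOMMU group")

if __name__ == "__main__":
    for num, members in sorted(iommu_groups().items(), key=lambda kv: int(kv[0])):
        print(f"group {num}: {', '.join(members)}")
```

If a GPU lands in one large group together with chipset devices, passthrough of just that GPU will be messy regardless of what the marketing material says.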
 
The MSI ACE has everything you're looking for except USB4. Whatever MB you pick, make sure to look through the manual and block diagram, because often things like SATA ports, M.2 slots and PCIe slots become disabled when the system is fully populated.
 
The MSI ACE has everything you're looking for except USB4. Whatever MB you pick, make sure to look through the manual and block diagram, because often things like SATA ports, M.2 slots and PCIe slots become disabled when the system is fully populated.
Which, for a board that costs $1,200, is insanity. Every bloody thing had better work, no matter what. They used to make PCIe bridge chips for this reason, so it's not like it isn't doable...



Subbed to watch this build take place.
 
My current computer, an i7-7920X, is getting long in the tooth, and I am planning a new computer for gaming, video/photo work, BOINC (World Community Grid), mining and file serving.
I think I will get the Phanteks Enthoo 719 case (it is under 24 inches high, has space for 12 HDDs and has a front USB-C port). I already bought an EVGA T2 1600W power supply on sale.
I have three 3090s (EVGA FTW x2, MSI Suprim X).
In the new system I want to use the two EVGA 3090s (as DaVinci Resolve Studio and other programs can make use of them).
From the motherboard I need 10G Ethernet, 2x USB4, 3-5 M.2 slots and one additional PCIe slot (x4 Gen 3?) to connect another 6-8 HDDs (motherboards have about 6 SATA connectors).
I usually run Windows 11, but I was wondering about running Windows Server or Proxmox and having my daily system as a VM with direct access to the graphics card. That way, if a Windows update borks the system, I can just roll back to the last working snapshot.
1. Case: I looked at E-ATX cases from Fractal Design (Define XL and Meshify 2), Thermaltake (Level 20 GT), Phanteks (Enthoo Pro 2) and Corsair (5000D and 7000D). The Enthoo 719 fits best under the desk (under 24 inches high) and is not too wide.
2. MB: I have had motherboards from Asus, ASRock and Gigabyte with good results. I could go with any of the three flagship boards from Asus, Gigabyte and MSI, as all have 2x PCIe 5.0 slots (x16, or x8/x8), 10G Ethernet and USB4. I need an additional PCIe slot for SATA drives.
3. OS: is it worth dealing with Proxmox or Windows Server and running the OS as a VM? On my current computers I have Windows 11 on two M.2 drives; if one crashes, I use the other (had to do that recently).
I will use 360mm AIO water cooling (I have a Phanteks available) or could do a Corsair 420mm one.
Any suggestions?
Admit that you were waiting all along until you could upgrade to a new PC with another 7000 series X CPU...
 
Which, for a board that costs $1,200, is insanity. Every bloody thing had better work, no matter what. They used to make PCIe bridge chips for this reason, so it's not like it isn't doable...



Subbed to watch this build take place.
You're still limited. For USB4 you must give up the second x4 Gen 5 link; the CPU only has 24 lanes in total.

Edit: you lose an M.2 as well, or something else in the trade.
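The lane trade-off described above can be made concrete by treating the CPU's lanes as a fixed budget and subtracting each consumer. The allocations below are illustrative only, using the 24-lane figure from the post, not taken from any specific board's block diagram; the real split always needs checking against the manual.

```python
def allocate_lanes(budget: int, consumers: dict[str, int]) -> int:
    """Subtract each consumer from the lane budget; raise if oversubscribed."""
    remaining = budget
    for name, lanes in consumers.items():
        remaining -= lanes
        if remaining < 0:
            raise ValueError(f"out of lanes at {name!r}: short by {-remaining}")
    return remaining

# Illustrative CPU-lane split (always verify against the real block diagram):
cpu_budget = 24
plan = {
    "GPU slot (x16, or x8/x8 with two cards)": 16,
    "M.2 #1 (Gen 5 x4)": 4,
    "USB4 controller (x4)": 4,  # this slice is what costs you the second Gen 5 M.2
}
print("CPU lanes left:", allocate_lanes(cpu_budget, plan))
```

Anything beyond this budget (extra M.2, SATA HBA, 10G NIC) has to hang off the chipset's own downstream lanes and share the chipset uplink, which is exactly why fully populating a board disables slots.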
 
I may just wait 3-4 months and get the 7950X with 3D V-Cache in January or February 2023. By that time we will have an idea about the three big motherboards and whether there are any problems with them.
 
Seems like this should be a few separate builds. I understand you’re trying to combine it all in one but at some point it’s better to just have multiple setups.
 
I have a somewhat similar purpose for my machine. I have a Threadripper 3970X with 256GB of RAM, though.
I run multiple containers, no VMs for now. I game on Linux and have DaVinci Resolve installed for 4K video edits.

I did build a separate Xeon-D machine as my file server, though. It's sitting in a rack, all connected together with 10G Ethernet.
 
Bear in mind that if you have 128GB of RAM, you may not be able to run it at 6000 MHz.

I agree with others here that it should be two or more systems.

If you like the idea of one system, which I also do, consider a case with two motherboards.

I would do custom water; clearly price isn't really an issue. For the 3090 non-Ti especially, custom cooling allows you to actively cool the backplates, and you can run the VRAM at 100%+. With the right setup you can have startlingly low-maintenance systems, basically just cleaning the fans/rads. A pure copper loop + Mayhems XTR nano is the way.

In your position, unless you urgently need a new system, I would seriously consider waiting a couple of months for new Zen 4 steppings/X3D versions and for AGESA to mature. In the meantime you can design your loop and case(s).

Good luck with the upgrade and I and many others will be happy to see updates and perhaps give advice.
 
Bear in mind that if you have 128GB of RAM, you may not be able to run it at 6000 MHz.
On AM5? Not going to happen anytime soon. 2x dual-rank at high frequency isn't an option right now; maybe after a few AGESA updates. I know DDR5-3600 works with 128GB after the failsafe boot :)
 