
2x HighPoint Rocket R7608A Gen5 RAID AIC installed in AMD EPYC 9004 series computer system

Joined
Sep 16, 2024
Messages
42 (0.14/day)
System Name AMD EPYC workstation
Processor 2x AMD EPYC 9004 series 96cores 192threads
Motherboard Gigabyte MZ73-LM0
Cooling water-cooled AIO
Memory 1.5TB (64GB x 24) SK Hynix DDR5-5600
Video Card(s) MSI RTX4090 SUPRIM LIQUID X
Storage 8x Crucial T705 GEN5 SSDs in RAID0
Power Supply 2x ACBEL 1200W Platinum Redundant Power Supply
Software Windows 11 Pro for Workstations
A total of eight Crucial T705 GEN5 SSDs installed as a Windows-bootable RAID0.
Attached are two CrystalDiskMark results: one with dual R7608A RAID AICs (4x Crucial T705 GEN5 SSDs on each card), the other with a single R7608A (with all 8x Crucial T705 SSDs).
The CrystalDiskMark settings were reset to default after testing.
The cards sit too close together and ventilation is severely affected; the R7608A needs extra fan module(s).
 

Attachments

  • HighPoint05.jpg
    205.8 KB · Views: 130
  • 1000114082.jpg
    2.5 MB · Views: 125
  • 1000114083.jpg
    4.1 MB · Views: 126
  • 1000114097.jpg
    3.9 MB · Views: 130
  • 1000114077.jpg
    3.5 MB · Views: 119
  • 1000114078.jpg
    3.4 MB · Views: 115
  • HighPoint03x.jpg
    180.3 KB · Views: 105
  • HighPoint04.jpg
    421.9 KB · Views: 129
  • Diskbench_final2.jpg
    431.6 KB · Views: 130
  • Diskbench_final1.jpg
    431.5 KB · Views: 132
I am trying to understand your question. The top card will definitely be starved of air in the pictured configuration. It is completely choked off.

Could you rephrase/clarify your question?
 
Helluva rig :pimp:
How hot do the drives get? Maybe you could swap the positions of the lower RAID AIC and the GPU riser cable.
 
Two PCIe lanes per SSD on a fully populated card, versus four lanes per SSD when cross-RAIDing two half-populated cards for maximum sequential speed.
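The lane arithmetic above can be sketched as a quick calculation. This is a back-of-envelope sketch with assumed round numbers (32 GT/s per Gen5 lane, 128b/130b encoding, the T705 being an x4 drive), not a measured result:

```python
# Usable bandwidth per PCIe 5.0 lane: 32 GT/s with 128b/130b encoding -> ~3.94 GB/s
GBPS_PER_LANE = 32e9 * 128 / 130 / 8 / 1e9

def ceiling(ssds: int, lanes_per_card: int, cards: int) -> float:
    """Host-link sequential ceiling in GB/s for `ssds` x4 drives behind x16 cards."""
    lanes_per_ssd = min(4, cards * lanes_per_card / ssds)  # a drive can't use more than x4
    return ssds * lanes_per_ssd * GBPS_PER_LANE

print(ceiling(8, 16, 1))  # one fully populated card: x2 per drive -> ~63 GB/s
print(ceiling(8, 16, 2))  # two half-populated cards: x4 per drive -> ~126 GB/s
```

So splitting the drives across two cards roughly doubles the host-link ceiling, which matches the cross-RAID motivation.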
 
Helluva rig :pimp:
How hot do the drives get? Maybe you could swap the positions of the lower RAID AIC and the GPU riser cable.
There are only four PCIe 5.0 x16 slots, two per CPU
 
Additional cooling fans have been added and the temperature of the HighPoint Rocket R7608A cards is ok.
 

Attachments

  • HighPoint sensors01.jpg
    279.6 KB · Views: 76
  • HighPoint sensors02.jpg
    273.8 KB · Views: 78
  • Diskbench_final2.jpg
    431.1 KB · Views: 75
The Windows-bootable RAID read speed hits 106 GB/s
 

Attachments

  • Max Read Speed.jpg
    253 KB · Views: 222
What programs do you run?
As the HighPoint R7608A cards are installed in slot 4 and slot 3 (CPU0), I have to set CrystalDiskMark to run on CPU0.

PCIe 5.0 runs at 32 GT/s per lane, so two x16 slots (x32 in total) carry roughly 128 GB/s raw (~120 GB/s real world). As I am just a rookie in computer tech, I could only hit 106 GB/s and am unable to harness the full strength of PCIe 5.0
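A back-of-envelope check of those figures, assuming two full Gen5 x16 slots and 128b/130b line encoding (both assumptions, not measured values):

```python
raw = 2 * 16 * 32 / 8       # 2 slots x 16 lanes x 32 GT/s / 8 bits -> 128.0 GB/s raw
usable = raw * 128 / 130    # 128b/130b line encoding -> ~126 GB/s usable
measured = 106.0            # the benchmark result reported above
print(f"{measured / usable:.0%} of the host-link ceiling")
```

Reaching the mid-80s percent of the link ceiling is plausible once protocol and controller overhead are accounted for.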
 

Attachments

  • MZ73-LM0.jpg
    261.4 KB · Views: 65
  • Processor affinity.jpg
    605.1 KB · Views: 72
As the HighPoint R7608A cards are installed in slot 4 and slot 3 (CPU0), I have to set CrystalDiskMark to run on CPU0.

PCIe 5.0 runs at 32 GT/s per lane, so two x16 slots (x32 in total) carry roughly 128 GB/s raw (~120 GB/s real world). As I am just a rookie in computer tech, I could only hit 106 GB/s and am unable to harness the full strength of PCIe 5.0
Protocol overhead and command processing probably account for the remaining throughput. Still insane speeds, though.

That's 37% of a 3.7 GHz CPU spent on overhead. No controller card I'm aware of could handle that.
 
Protocol overhead and command processing probably account for the remaining throughput. Still insane speeds, though.

That's 37% of a 3.7 GHz CPU spent on overhead. No controller card I'm aware of could handle that.
Forget desktop PCs, as they have only 20 PCIe lanes (AMD: 28 lanes, 24 usable)
 
I have to set CrystalDiskMark to run on CPU0.
Sorry, I should have been clearer in my query. I wasn't asking about benchmarking apps. Instead I was enquiring about the intended use for the computer, e.g. are you using it for coding, rendering or AI LLMs? It must be something that warrants high speed storage.

With the likelihood of one or more of the 8 RAID0 drives eventually going down and taking everything with it, I assume the data is ephemeral and copied elsewhere?

I cannot see any storage drives in the Windows system. Are you throwing data over a 100GbE QSFP28 fibre optic link to fast storage in another machine?
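The RAID0 risk mentioned above can be put in rough numbers. A minimal sketch, assuming a hypothetical 1% annual failure rate per drive and independent failures (both assumptions for illustration only):

```python
# In RAID0, losing any single drive destroys the whole array,
# so the array survives only if every drive survives.
afr = 0.01                             # assumed per-drive annual failure rate
drives = 8
array_survival = (1 - afr) ** drives   # all 8 drives must survive the year
print(f"{array_survival:.3f}")         # -> 0.923
```

In other words, the array's failure risk is roughly eight times that of a single drive, which is why an external backup matters here.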
 
Sorry, I should have been clearer in my query. I wasn't asking about benchmarking apps. Instead I was enquiring about the intended use for the computer, e.g. are you using it for coding, rendering or AI LLMs? It must be something that warrants high speed storage.

With the likelihood of one or more of the 8 RAID0 drives eventually going down and taking everything with it, I assume the data is ephemeral and copied elsewhere?

I cannot see any storage drives in the Windows system. Are you throwing data over a 100GbE QSFP28 fibre optic link to fast storage in another machine?
Yup, there is a backup in my NAS (pic below), and my home network runs on 100G QSFP28 optics
1000114956.jpg


My i9-14900K PC has a built-in 10G RJ45 port and also runs on a Windows-bootable RAID0
1000114957.jpg

1000114958.jpg

boot RAID.jpg


Sorry, I should have been clearer in my query. I wasn't asking about benchmarking apps. Instead I was enquiring about the intended use for the computer, e.g. are you using it for coding, rendering or AI LLMs? It must be something that warrants high speed storage.

With the likelihood of one or more of the 8 RAID0 drives eventually going down and taking everything with it, I assume the data is ephemeral and copied elsewhere?

I cannot see any storage drives in the Windows system. Are you throwing data over a 100GbE QSFP28 fibre optic link to fast storage in another machine?
My 100G QSFP throughput
1000114959.jpg
 
Thanks, but there's still no mention of the standard apps you run under Windows on an 8-drive RAID0 array, apart from benchmarking.
 