
RAID 5 for games?

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
What about a sshd?

The small SSD cache on SSHDs doesn't really help performance a whole lot.

I just realized I could go even cheaper and get another 500GB WDC Black and put it in RAID 0 with the one I already have. I don't need all 128GB of my SSD for my OS and programs either... currently I'm skimming by on a 30GB partition for that. I figure I could RAID some hard drives, and use the rest of my SSD to either store the most demanding games on, or set it up as a cache for the RAID array, though it would only be 80GB or so.

An 80GB cache would be better than nothing. I ran with a 60GB cache on my 2TB drive for years. I just recently finally upgraded to a 480GB cache only because I replaced my MX200 OS drive with a new SSD, and didn't really have another use for the MX200.
 
  • Like
Reactions: hat

phill

Moderator
Staff member
Joined
Jun 8, 2011
Messages
15,973 (3.39/day)
Location
Somerset, UK
System Name Not so complete or overkill - There are others!! Just no room to put! :D
Processor Ryzen Threadripper 3970X
Motherboard Asus Zenith 2 Extreme Alpha
Cooling Lots!! Dual GTX 560 rads with D5 pumps for each rad. One rad for each component
Memory Viper Steel 4 x 16GB DDR4 3600MHz not sure on the timings... Probably still at 2667!! :(
Video Card(s) Asus Strix 3090 with front and rear active full cover water blocks
Storage I'm bound to forget something here - 250GB OS, 2 x 1TB NVME, 2 x 1TB SSD, 4TB SSD, 2 x 8TB HD etc...
Display(s) 3 x Dell 27" S2721DGFA @ 7680 x 1440P @ 144Hz or 165Hz - working on it!!
Case The big Thermaltake that looks like a Case Mods
Audio Device(s) Onboard
Power Supply EVGA 1600W T2
Mouse Corsair thingy
Keyboard Razer something or other....
VR HMD No headset yet
Software Windows 11 OS... Not a fan!!
Benchmark Scores I've actually never benched it!! Too busy with WCG and FAH and not gaming! :( :( Not OC'd it!! :(
If you don't want to re-install your entire Steam library when a drive fails, RAID-5 is the way to go. RAID-5 only has a couple of downsides compared to RAID-0, and it has some upsides too.
  • RAID-5 offers redundancy, so you won't lose all of your stuff if you have a single drive failure.
  • RAID-5 offers read performance similar to RAID-0, which tends to scale with the number of disks you have.
  • RAID-5 has worse write performance than RAID-0 due to the need to store parity data for redundancy.
I personally have a Steam library on both my RAID-5 (HDDs) and my RAID-0 (SSDs). The bigger games that I don't play as often go on the RAID-5; the smaller or more frequently played games go on the SSD RAID-0, and that works out fairly well for me.
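Those trade-offs can be sketched with some quick arithmetic. This is a toy model under ideal-scaling assumptions, not a benchmark; the disk count, size, and single-drive speed below are illustrative numbers:

```python
# Rough model of the RAID trade-offs described above (a sketch, not a benchmark).
# Assumes n identical disks and ideal scaling across them.

def raid_characteristics(n_disks, size_gb, single_read_mbs):
    """Return {level: (usable capacity in GB, ideal sequential read in MB/s)}."""
    return {
        # RAID-0 stripes across all disks: full capacity, reads scale with n.
        "RAID-0": (n_disks * size_gb, n_disks * single_read_mbs),
        # RAID-5 gives up one disk's worth of space to parity; sequential
        # reads still come from all disks, so they scale similarly.
        "RAID-5": ((n_disks - 1) * size_gb, n_disks * single_read_mbs),
        # RAID-1 mirrors: one disk's capacity, single-disk reads in this model.
        "RAID-1": (size_gb, single_read_mbs),
    }

levels = raid_characteristics(n_disks=4, size_gb=1000, single_read_mbs=130)
print(levels["RAID-5"])  # (3000, 520): 3 TB usable, ~520 MB/s ideal reads
```

Real arrays fall short of the ideal numbers (parity updates, controller overhead, seek times), but the capacity math is exact.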

FWIW, RAID-5 read speeds can be pretty good. This is what I get with 4x 1TB drives. Bigger drives are going to have better transfer speeds too, since a single 1TB drive can't really break 130MB/s at its best. It's the access time that makes a hard drive feel slow, which is why SSDs feel fast.

RAID-5 (4x 1TB WD Black):

RAID-0 SSDs: Corsair Force GT 120GB


Single 1TB WD Black:

I like the testing software, is that in Linux per chance??

Give it a go and let us know your findings!! :)
 
  • Like
Reactions: hat

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
I like the testing software, is that in Linux per chance??
Yup, it's my daily driver. ...and the software:
[benchmark screenshot]

Give it a go and let us know your findings!! :)
Running Linux, or running games off the RAID-5? I've been running Ubuntu exclusively for a couple of years now and it's fine for what I do with it. As for running games off the RAID-5, I already do that and it's fast enough. Obviously not as fast as the SSDs, but fast enough that it doesn't really bother me at all.
 
Joined
Jun 15, 2016
Messages
1,042 (0.36/day)
Location
Pristina
System Name My PC
Processor 4670K@4.4GHz
Motherboard Gryphon Z87
Cooling CM 212
Memory 2x8GB+2x4GB @2400MHz
Video Card(s) XFX Radeon RX 580 GTS Black Edition 1425MHz OC+, 8GB
Storage Intel 530 SSD 480GB + Intel 510 SSD 120GB + 2x500GB hdd raid 1
Display(s) HP envy 32 1440p
Case CM Mastercase 5
Audio Device(s) Sbz ZXR
Power Supply Antec 620W
Mouse G502
Keyboard G910
Software Win 10 pro
RAID 5 is obsolete, replaced by RAID 6. You need an extra HDD, but you get no performance drop when one HDD goes bad..
 
  • Like
Reactions: hat

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Mechanical drives suck for that. They are only for storage IMO. Way too slow compared to an SSD, and RAID5 using the onboard controller is NOT good regardless...

Get another 500GB SSD and run RAID0, or go M.2 -> 1TB Samsung 970 Evo

My 2x 500GB 850 Evo have been running flawlessly in RAID0 for years now. Very fast. Only con is 3s longer boot time. Every bench or game loading time is faster in RAID0 config (tested several titles and benchmarks when I bought the 2nd). RAID0 is fine for this, zero important data (gamesaves and everything else is backed up in cloud + network).

Going M.2 NVME on next build tho (no RAID).
 
Last edited:
  • Like
Reactions: hat

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,194 (1.49/day)
Location
Charleston, SC
System Name Tower of Power / Sechs
Processor i7 14700K / i7 5820k @ 4.5ghz
Motherboard ASUS ROG Strix Z690-A Gaming WiFi D4 / X99S GAMING 7
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / CORSAIR Hydro Series H100i Extreme
Memory CORSAIR Vengeance LPX 32GB (2 x 16GB) DDR4 3600 / G.Skill DDR4 2800 16GB 4x4GB
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / ASUS TUF Gaming GeForce RTX 3070 V2 OC Edition
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / Samsung 870 PRO 500GB M.2
Display(s) Samsung 32" Odyssey G5 Gaming 144hz 1440p, ViewSonic 32" 72hz 1440p / 2x ViewSonic 32" 72hz 1440p
Case Phantek "400A" / Phanteks “Enthoo Pro series”
Audio Device(s) Realtek ALC4080 / Azalia Realtek ALC1150
Power Supply Corsair RM Series RM750 / Corsair CXM CX600M
Mouse Glorious Gaming Model D Wireless / Razer DeathAdder Chroma
Keyboard Glorious GMMK with box-white switches / Keychron K6 pro with blue switches
VR HMD Quest 3 (128gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 10 Pro x64
Benchmark Scores Yes
I saw this yesterday right before I left the office. I meant to reply when I got home but forgot... One question I've not seen asked (or I overlooked it) is: do you plan to use software RAID or hardware RAID? I would never use RAID 5 in a software array. If you do plan to use hardware, be sure to have a BBU so you can enable write-back cache for better performance. If you plan to use software RAID, then I would go with RAID 0, or for the best of both worlds RAID 10, but that's double the drives. Think of it as 2x RAID 0 arrays mirrored.

Years ago I used 3x 250GB WD RE4 drives in RAID 0 for my game drive and it was fast, but nothing compared to today's SSDs. I would suggest buying a big mechanical storage drive and a 256-512GB SSD, and using something like Steam Mover to move the games you are playing to the SSD; I would do this over even setting up an SSD cache drive. It creates symbolic links. You can create a batch file to do this, but Steam Mover is easy enough for anyone (I feel like a spammer now.. but if the bot ban hammers me, I know the guy that owns the site.. ;).. lol). I have created my own program, but it's not as polished as Steam Mover.. hehe. @FordGT90Concept may have a symbolic link program of his own somewhere on here, but like I said, I've never run into any issues with Steam Mover. I haven't used it in years, though, since I created my own.
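The core of what a Steam Mover-style tool does is small: relocate a game folder to a faster drive, then leave a symbolic link at the old path so Steam still finds it. Here is a minimal sketch; the library paths are hypothetical, and on Windows creating symlinks may require admin rights or Developer Mode (a directory junction via `mklink /J` avoids that):

```python
# Sketch of a "Steam Mover"-style relocation: move the folder, link it back.
import os
import shutil

def move_with_symlink(src, dst):
    """Move the folder at `src` to `dst`, then leave a link at `src`."""
    shutil.move(src, dst)                           # relocate the game folder
    os.symlink(dst, src, target_is_directory=True)  # old path now points at the new home

# Example (hypothetical library paths -- adjust to your own drives):
# move_with_symlink(r"D:\SteamLibrary\steamapps\common\BigGame",
#                   r"C:\FastSSD\BigGame")
```

After this, anything opening files under the old path transparently reads from the new drive, which is why Steam never notices the game moved.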

Also, on a side note: I have 2x 250GB SSDs in RAID 0 and I don't notice any better performance than my 850 Pro 256GB in games and programs. Sure, the benchmarks say it's faster, but in the real world I don't notice it. I've only ever really noticed it when copying huge files like DBs and backups, but nothing in programs or games. Actually, there was only really one game it made a difference in, and that was Rage with its mega textures, but that was on a PCI-E SSD with 1200 write and 1400 read.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I do not. The only Steam-related program I have is to fix Steam installscript errors.

Steam mostly runs from itself. If you just copy the Steam directory to another drive and run Steam, it will update everything itself.

Any kind of RAID is fine for games. All games have systems to handle slow loading. RAID5 will have faster read performance than a single drive but with a slight latency penalty.

My Origin, Steam, and GOG libraries are on a single 6 TB hard drive. It's 3/4 full. I'm going to have to buy a ~12 TB hard drive soon and copy everything over.
 

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,194 (1.49/day)
Location
Charleston, SC
System Name Tower of Power / Sechs
Processor i7 14700K / i7 5820k @ 4.5ghz
Motherboard ASUS ROG Strix Z690-A Gaming WiFi D4 / X99S GAMING 7
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / CORSAIR Hydro Series H100i Extreme
Memory CORSAIR Vengeance LPX 32GB (2 x 16GB) DDR4 3600 / G.Skill DDR4 2800 16GB 4x4GB
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / ASUS TUF Gaming GeForce RTX 3070 V2 OC Edition
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / Samsung 870 PRO 500GB M.2
Display(s) Samsung 32" Odyssey G5 Gaming 144hz 1440p, ViewSonic 32" 72hz 1440p / 2x ViewSonic 32" 72hz 1440p
Case Phantek "400A" / Phanteks “Enthoo Pro series”
Audio Device(s) Realtek ALC4080 / Azalia Realtek ALC1150
Power Supply Corsair RM Series RM750 / Corsair CXM CX600M
Mouse Glorious Gaming Model D Wireless / Razer DeathAdder Chroma
Keyboard Glorious GMMK with box-white switches / Keychron K6 pro with blue switches
VR HMD Quest 3 (128gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 10 Pro x64
Benchmark Scores Yes
I do not. The only Steam-related program I have is to fix Steam installscript errors.

Steam mostly runs from itself. If you just copy the Steam directory to another drive and run Steam, it will update everything itself.

Any kind of RAID is fine for games. All games have systems to handle slow loading. RAID5 will have faster read performance than a single drive but with a slight latency penalty.

My Origin, Steam, and GOG libraries are on a single 6 TB hard drive. It's 3/4 full. I'm going to have to buy a ~12 TB hard drive soon and copy everything over.

That's good to know. Yeah, I haven't used Steam Mover in a while, nor the program I wrote to move the games folder and create symbolic links. Currently I have a 1TB SSHD (it was a gift) as my main drive for games, and then a few SSDs for the games I play the most.. which is slim to none now.. lol. I mainly play FO4VR (when I get time) and I did create symbolic links for the mods, only because I didn't want to have them installed twice for pancake FO4 and FO4VR.
 
  • Like
Reactions: hat
Joined
Jan 20, 2017
Messages
328 (0.12/day)
System Name Burning a hole through my wallet
Processor 3700X
Motherboard Maximus 8 Hero
Cooling Custom loop (EK Extreme 360 Rad, Supremacy evo w/AM4 bracket)
Memory 2x16 Corsair Vengeance Pro RGB @3200MHz
Video Card(s) EVGA 2080s hybrid
Storage 960 Evo, 660p, P1, BX500, 2x WD Black, Ironwolf Pro
Display(s) Predator 27" 4k 144hz HDR
Case NZXT h700i
Power Supply EVGA G3 850
Mouse Logitech G502 hero
Keyboard Drop ALT W/holypanda switches
Software Win 10 Pro 64, Ubuntu 20.04, Manjaro (latest)
Personally, I put the most-played games from Steam on my 960 Evo, put the rest on my RAID 0 2x2TB WD Blacks (~300 gigs free), and leave the bazillion gigs of Battlefield games and other Origin crap on my 4TB data drive.
 
  • Like
Reactions: hat
D

Deleted member 178884

Guest
I wonder how most people here even live off 1TB. I've hit 1.1TB of storage used on my Toshiba X300 6TB and it's rising fast. I'd recommend Toshiba's X300 HDDs. If you're looking for raw performance, I'd take a look at Kingston SSDs. Here's my CrystalDiskMark video (skip to around 4:30)

When you dump the cash on a 12TB drive, go helium; they are excellent performers and come in a great range of capacities.
 
  • Like
Reactions: hat
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK Supremacy, GPU full-cover, all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
That's what they said back when I got my 512GB SSD, which is now in another system. I wasn't impressed with the performance in games at that time...
Times change; now some games are a nightmare on a HDD, for example PUBG.
OP, buy one 3TB drive and just get familiar with Steam's move-folder option, then move games between your new HDD and your old 512GB SSD. Win-win, and cheap too.
 
  • Like
Reactions: hat

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,731 (3.41/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
The 512GB SSD isn't an option anymore. It's in another system and has been for some time, and that's where it's likely to stay.

For those asking what type of RAID I would use specifically: most likely software RAID from within Windows. I'm not too happy about that either, because software RAID uses CPU cycles. I would set it up from the BIOS, but I've recently learned that isn't true hardware RAID either; I've heard bad things about onboard RAID. The only way to get true hardware RAID is to buy a separate controller card, which I won't do. That's expensive, and the whole idea behind using RAID is to save some money, because it would be cheaper than an SSD.
 
Joined
Jan 20, 2017
Messages
328 (0.12/day)
System Name Burning a hole through my wallet
Processor 3700X
Motherboard Maximus 8 Hero
Cooling Custom loop (EK Extreme 360 Rad, Supremacy evo w/AM4 bracket)
Memory 2x16 Corsair Vengeance Pro RGB @3200MHz
Video Card(s) EVGA 2080s hybrid
Storage 960 Evo, 660p, P1, BX500, 2x WD Black, Ironwolf Pro
Display(s) Predator 27" 4k 144hz HDR
Case NZXT h700i
Power Supply EVGA G3 850
Mouse Logitech G502 hero
Keyboard Drop ALT W/holypanda switches
Software Win 10 Pro 64, Ubuntu 20.04, Manjaro (latest)
Personally, I have my drives software-RAIDed in RAID 0 and have yet to have an issue with them. On my home server I ran RAID off the mobo chipset, but both drives failed due to heat (HAF XB Evo hot-swap bays suck for cooling in 24/7 operation), and now there are 2x 3TB Seagate NAS drives (NOT IronWolfs) in it on a dedicated card with no issue.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK Supremacy, GPU full-cover, all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
The 512GB SSD isn't an option anymore. It's in another system and has been for some time, and that's where it's likely to stay.

For those asking what type of RAID I would use specifically: most likely software RAID from within Windows. I'm not too happy about that either, because software RAID uses CPU cycles. I would set it up from the BIOS, but I've recently learned that isn't true hardware RAID either; I've heard bad things about onboard RAID. The only way to get true hardware RAID is to buy a separate controller card, which I won't do. That's expensive, and the whole idea behind using RAID is to save some money, because it would be cheaper than an SSD.
Well then, personally I would go with option D: get two NVMe SSDs and that 3TB HDD. Use one 256GB NVMe for the OS, a new 512GB NVMe for favourite and currently-played games, and the 3TB for storage.
I did 3x Western Digital Blacks in RAID 0 for a few years until I got an SSD. I found it pointless: latency increased a bit, with occasional lags during really heavy CPU use, and the read speeds were not as quick. Then I got a volume corruption; that's life.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
The 512GB SSD isn't an option anymore. It's in another system and has been for some time, and that's where it's likely to stay.

For those asking what type of RAID I would use specifically: most likely software RAID from within Windows. I'm not too happy about that either, because software RAID uses CPU cycles. I would set it up from the BIOS, but I've recently learned that isn't true hardware RAID either; I've heard bad things about onboard RAID. The only way to get true hardware RAID is to buy a separate controller card, which I won't do. That's expensive, and the whole idea behind using RAID is to save some money, because it would be cheaper than an SSD.
Highpoint cards are affordable and good. The RocketRAID in my server is 11 years old now and going strong.

Just beware that none of Highpoint's cards support GPT booting. You do not want to make a RAID built on them your boot volume (because you'll be limited to 2TB capacity).

Operating system RAID < chipset RAID < PCIe AIB RAID

The only problem with chipset RAID is that you can only transfer the RAID from like (e.g. Intel) to like (Intel). If you want to move it from Intel to AMD, you have to start from scratch.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,731 (3.41/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
It seems I messed up my OP with a simple typo. I mentioned I had a 128GB SSD for the OS, and a 500GB SSD. If that were the case... this thread wouldn't exist, lol. The 500GB is a HDD, not an SSD. I did have a 512GB SSD at one time, but it's now in another system.

I considered 3x 1TB drives in RAID 5 initially because they're cheap, cheaper than even a 1TB SSD. Then I went and mentioned RAID 1 for some reason, which is nowhere near what I need for this. I'm talking about mostly game installs and potentially other data that I wouldn't care about losing. I don't need redundancy beyond the convenience factor: if/when a drive does die, I wouldn't have to reinstall games. Though I have in the past collected cool stuff that I can no longer find, mostly Quake-related stuff... remembering that, I'd think RAID 5 would be the way to go. Losing stuff just sucks.

About types of raid, I found this @FordGT90Concept
https://skrypuch.com/raid/

This basically says onboard raid (fake RAID) is horrible, so if you're going to do that, you may as well use software RAID. Hardware RAID can be really bad too, if the controller fails...
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I wouldn't trust that website, because it doesn't even know what the difference is: "Fake RAID" is a host bus adapter (HBA), whereas "Hardware RAID" is a host bus controller (HBC). I also completely disagree with the article, because HBAs live below the software stack, so the RAID is completely invisible to software that doesn't directly interface with the drivers. This isolates the RAID from all software environments, and that's a huge plus, even if there isn't extra hardware facilitating the RAID.

Let's go point by point:
> Use RAID 0 or 1.
CPUs these days have gobs of power to perform parity checks. It's perfectly fine to use RAID5 on them.

>Can't nest RAID arrays.
>Create RAID arrays using whole disks only.
Gee, I wonder why: Redundant Array of Inexpensive Disks. This is exactly the point I was making above: software RAID isn't implemented below the software stack. It's just file system trickery to spread the data out over multiple drives. Legit RAID doesn't do that, because the entire point of it is to prevent data loss from mechanical failure. What the author is advocating here is complicating RAID for no apparent reason.

>Pray your motherboard never dies or keep several identical ones on hand.
False. Like I said, AMD can detect AMD and Intel can detect Intel. My RAID1 migrated from a dual socket 771 board to a single socket 1151. I was amused because I didn't even intend to keep it. It was actually fairly difficult to purge the RAID1 completely from both drives (think I ended up erasing them). It's also hilarious that the author brings this up because software RAID locks you into the specific operating system that built it.

>Can't have hot spares and can't support hot swappable drives.
True on the first point but no one looking at a four drive RAID should care. I did RAID5+1 hot spare for a decade. The volume was starting to get full and I thought to myself: "why am I powering a drive that's doing utterly nothing on the 0.00000000001% chance that two drives will die at the same time?" Also, I have all that data backed up on an external so should that 0.00000000001% come to pass, I'd just shrug and order two replacement drives instead of one. So I did the sane thing and put that drive into the RAID and increased the capacity of the RAID by 50%. Seriously, if you aren't talking 8+ drives, it's not even worth considering. Even then, I seriously wouldn't promote the idea unless you have 16+ drives in the RAID. In which case, you wouldn't be using RAID 0, 1, 5, nor 10 anyway so this point is entirely moot.

Virtually all newer motherboards support hot plugging; you just have to tell the firmware which ports to enable it on. Not that I would hot swap in the first place. I've been bitten by enough fans to know that's a terrible idea.


Like I said: Operating system RAID < chipset RAID (HBA) < PCIe AIB RAID (HBC)

There's a lot of software RAID advocates out there and I really don't understand it. Software RAID has more overhead than HBA RAID and a lot of arguments for it are nonsensical. Let me go point by point again:
In Linux, you can create RAID devices using any regular block device (including whole drives, partitions, regular files, other RAID devices, etc) with mdadm. You can mix and match RAID levels using RAID 0, 1, 4, 5, 6, 10 and linear (linear not really being RAID per se, but it's handled by the same framework). You can also arbitrarily nest RAID devices, so you can create a RAID 0 of RAID 6s of RAID 1s if that's what floats your boat. You can also physically rip the drives out of one machine, plug them into another, and your RAID array(s) will continue working as before, with no twiddling needed.
Only in Linux. If the drives are plugged into different controllers than the machine they came from and Linux doesn't support those controllers, it's boned.

mdadm of course supports all of the features you'd expect like hot spares, hot swappable drives (hardware permitting), but it also has several other useful features. Of particular note is that you can grow a RAID 5 array completely online (it calls this feature reshaping). That is, take an n drive array with n-1 capacity, add an additional drive and (completely online) end up with an n+1 drive array with n capacity. Furthermore, you can add in as many drives as you'd like and compose them into the same array, hanging them off the ports on the motherboard, ports on an expansion card, external drives, drives on the network...
"hardware permitting" :roll: Yeah, those are hardware features. Implementing them in software is pretty easy but, you see, if the hardware doesn't support it, it's not supported period. Software can never supercede hardware so why is software so fantastic?

Remember how I was talking about integrating the hot spare on my RocketRAID above? That was done entirely by the RocketRAID card and it continued to work on it regardless if the operating system was loaded or not. Hell, that RocketRAID DID integrate a hot-spare into the RAID while I was sleeping one time. You know what's fantastic about RAID controllers? They'll even do it while the computer is sitting at the BIOS screen--no operating system loaded at all. Oh, 0% CPU load rebuilding the array too but it was noticeably slow (because that's just the way it is with RAID5).

Yes, enterprise can use software RAID across multiple systems and cards but that's the only use-case where software RAID makes sense: after the hardware itself is redundant. As I said before, I see no sense in software RAID until you exceed 16 drives.

Well, that all sounds great, but what about performance? The good news is that performance of Software RAID is generally on par with Hardware RAID and almost always (significantly) better than Fake RAID. You might not have noticed, but in the last decade or two, CPUs have become very fast, greatly outpacing hard drive speed. Even with a full RAID 5 resync in progress with many fast drives, you're unlikely to see more than 25% CPU usage, and that's just on a single core, these days you probably have at least 4 cores. RAID levels that don't involve parity (0, 1, 10, linear) incur essentially no CPU load.
That's freakin' hilarious, because the bottleneck is the drives themselves. Software RAID is never on par with hardware RAID, because the parity work doesn't even hit the system bus. "Fake RAID" is also faster than software RAID because it happens at the hardware level instead of in software. When software tries to do something, it has to go through hundreds of lines of instruction before it reaches the data. For example, software has to interpret its own instructions when a read/write is requested, then it has to send it to the kernel, which interprets it through its file system, which invokes the driver, and finally read/write operations commence. Every step along the way, there are checks being made to ensure the requests are valid. File systems often have some error correction code too. None of that happens with "fake RAID" or "hardware RAID." To the operating system, all it sees is a single drive; the RAID accepts those commands and does what it needs to do regardless of how many drives are under it. Software RAID is fundamentally inefficient.


You can tell an HBA from an HBC because HBCs have dedicated RAM.
 
Last edited:
  • Like
Reactions: hat

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,731 (3.41/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
How often would raid 5 be doing these parity checks, or any other form of raid that would consume CPU power? It seems to me it would happen mostly when writing, which in my case, wouldn't happen all too often.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Every single read and write does parity checks. If the sector size is 512 bytes, writing 1,000,000 bytes would result in no less than 1954 parity calculations. When reading, it recalculates parity and checks it against the stored checksum to determine whether the data integrity is good. If not, it uses that information to determine which drive oopsie whoopsied (it logs and fixes it).
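To sanity-check that arithmetic, here's a quick Python sketch. The 512-byte sector size and the 1,000,000-byte write come from the post above; treating parity as a per-sector operation is a simplification (real controllers work in whole stripes):

```python
import math

SECTOR_SIZE = 512  # bytes per sector, as in the example above

def parity_ops_for_write(num_bytes: int) -> int:
    """Lower bound on parity calculations for a write of num_bytes.

    Every sector the write touches forces its stripe's parity to be
    recomputed at least once.
    """
    return math.ceil(num_bytes / SECTOR_SIZE)

print(parity_ops_for_write(1_000_000))  # -> 1954
```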
 
Last edited:
  • Like
Reactions: hat

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
I would never use RAID 5 in a software array.
If this were the mid 2000s, I would agree with you, but with modern hardware there isn't really a good reason to use hardware RAID unless you're building a server and you just want to make it rock solid. Even with full-on software RAID 5 in mdadm, I was able to get enough performance to saturate 1Gbps Ethernet, which is good enough for me when it comes to archival and streaming video.
Any kind of RAID is fine for games. All games have systems to handle slow loading. RAID5 will have faster read performance than a single drive but with a slight latency penalty.
Ehhh, not as much as you would think. Check out those benchmarks I ran; access latency on the RAID-5 versus a single disk in the array is actually almost the same.
The only problem with chipset RAID is that you can only transfer the RAID from like to like (e.g. Intel to Intel). If you want to move it from Intel to AMD, you have to start from scratch.
By the same token, though, if I took my setup and dropped it into another board that supports RSTe, it would be detected out of the box. The same will likely happen if you stick with AMD when starting with it, and it's the same deal with HighPoint or LSI. It's not really a problem, just something to be aware of when going with RAID. This might actually even be an argument for software RAID, because as long as the RAID is supported in the OS and you can boot, it will be detected, since it isn't hardware dependent. Just a thought.
It's just file system trickery to spread the data out over multiple drives.
I don't agree with that statement. Software RAID is still acting on the device as if it were a RAID controller; the difference is that commands that would typically be issued to a controller are handled by the driver instead (which is really what FakeRAID in setups like RSTe actually does to a very large extent). The driver knows the topology of the RAID, so when it issues a read or write it, like a regular hardware controller, figures out which disk the data resides on and then accesses it. Software versus hardware doesn't change that; it's just a matter of what's handling it, the driver or dedicated RAID hardware. Nothing stops you (in Linux) from using a software RAID device as a block device with no file system and just writing bytes directly to the array. File systems have nothing to do with software RAID implementations.
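As a toy illustration of that "the driver knows the topology" point, here's roughly what the address translation looks like for plain striping (RAID 0). This isn't mdadm's actual layout code, and the chunk size is an arbitrary assumption, just the idea:

```python
CHUNK_SECTORS = 128  # sectors per chunk (64 KiB chunks at 512-byte sectors); assumed

def raid0_map(logical_sector: int, num_disks: int) -> tuple[int, int]:
    """Map a logical sector to (disk index, sector on that disk) for RAID 0.

    A software driver does this translation on every request;
    a hardware controller does the same thing in firmware.
    """
    chunk = logical_sector // CHUNK_SECTORS   # which chunk of the array
    offset = logical_sector % CHUNK_SECTORS   # position inside the chunk
    disk = chunk % num_disks                  # chunks rotate across the disks
    disk_chunk = chunk // num_disks           # how many chunks deep on that disk
    return disk, disk_chunk * CHUNK_SECTORS + offset

print(raid0_map(300, 3))  # -> (2, 44): third disk, sector 44
```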
Only in Linux. If the drives are plugged into different controllers than the machine they came from and Linux doesn't support those controllers, it's boned.
With software RAID, if Linux doesn't support the controller, you won't see the disks at all. If the controller is supported and you can at least see the disks, mdadm will work fine, as all it needs is the ability to interact with disks the way anything else in the OS does. If you can read and write to a disk, it's fair game for software RAID. I'm not saying software RAID is the best option; I'm saying it's not a bad option when you have no other good options, or if your performance needs are met by something like mdadm (say, streaming video or archiving).

"hardware permitting" :roll: Yeah, those are hardware features. Implementing them in software is pretty easy but, you see, if the hardware doesn't support it, it's not supported period. Software can never supercede hardware so why is software so fantastic?
This doesn't need to be a pissing contest. There is a time and a place for mdadm, and that isn't all the time. The benefits of software RAID are a low barrier to entry, since you don't really need any special hardware to start using it, and flexibility. Performance isn't better, but it's good enough a lot of the time, which for some people is okay, because benchmarks aren't everything.
"Fake RAID" is also faster than software RAID because it's happening at the hardware level instead of software.
No, it's not. That's flat out incorrect. It's handled by the driver, which is kernel-space code, you know, a lot like software RAID. The only difference is that the disk topology is determined by the controller, but actually reading and writing to the array is completely dependent on the driver.
When software tries to do something, it has to go through hundreds of instructions before it reaches the data. For example, software has to interpret its own instructions when a read/write is requested, then it has to send it to the kernel, which interprets it through its file system, which invokes the driver, and finally the read/write operations commence.
Every single read and write does parity checks. If the sector size is 512 bytes, writing 1,000,000 bytes would result in no less than 1954 parity calculations. When reading, it recalculates parity and checks it against the stored checksum to determine whether the data integrity is good. If not, it uses that information to determine which drive oopsie whoopsied (it logs and fixes it).
I'm not sure that's an entirely accurate statement for all RAID implementations. I don't think most controllers are going to check parity unless there is a reason to, whether that's because it's doing a validation run or because it hit a block where there was an issue reading. I don't think it calculates parity on every read; otherwise performance would be a lot worse and far more similar to write speeds, which it isn't.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
No, it's not. That's flat out incorrect. It's handled by the driver, which is kernel-space code, you know, a lot like software RAID. The only difference is that the disk topology is determined by the controller, but actually reading and writing to the array is completely dependent on the driver.
It's technically firmware RAID, not driver RAID. Features of the RAID can be enabled and disabled from within the firmware itself; however, more complex tasks like rebuilding the RAID only occur while the driver is loaded.


I'm not sure that's an entirely accurate statement for all RAID implementations. I don't think most controllers are going to check parity unless there is a reason to, whether that's because it's doing a validation run or because it hit a block where there was an issue reading. I don't think it calculates parity on every read; otherwise performance would be a lot worse and far more similar to write speeds, which it isn't.
All drives perform the read operation simultaneously, so in a minimum RAID5: disk0 reads one sector that contains data, disk1 reads one sector that also contains data, and disk2 reads one sector that contains parity data. The controller calculates parity from disk0 and disk1 and checks whether it equals the stored parity data. If yes, the data is made available to software; if no, it fixes it and then makes it available. Like I said, it adds a little latency (<1ms), but since you're getting double the data, it ends up being a lot more throughput.
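For the curious, the parity math itself is just XOR. A minimal Python sketch of that read path (parity rotation across disks is omitted, and "GAME"/"DATA" are made-up sample blocks):

```python
def xor_blocks(*blocks: bytes) -> bytes:
    """XOR equal-length byte blocks together."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

# Minimum RAID 5: two data disks plus one parity disk.
d0 = b"GAME"
d1 = b"DATA"
parity = xor_blocks(d0, d1)          # written alongside the data

# Read: recompute parity and compare against what's stored.
assert xor_blocks(d0, d1) == parity  # integrity good

# Rebuild: if d1 dies, XORing the survivors recovers it.
assert xor_blocks(d0, parity) == d1
```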
 
Last edited:
  • Like
Reactions: hat

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
First, I'm going to address Software RAID vs. Firmware RAID vs. Hardware RAID.

Software RAID is RAID handled entirely by the OS. This type has the highest overhead, and it is also the most unreliable in my experience, though the reliability can vary greatly depending on the OS itself. For example, Windows software RAID is pretty horrible, but FreeNAS software RAID is decent (but then again, FreeNAS was built from the ground up with software RAID in mind). My advice is: don't use software RAID if you are running Windows.

Firmware RAID is RAID that is configured by the controller, but most of the heavy-lifting calculations are handled by the computer's CPU. Motherboard onboard RAID falls into this category, as well as most inexpensive "hardware" RAID cards, such as most HighPoint cards. This isn't a dig at HighPoint; I use them, and they are really good RAID cards for the money. If you are looking to get into RAID, IMO, picking up an inexpensive HighPoint card is a good idea. But heck, even using the onboard RAID works too. There is still system overhead, but with modern CPUs it isn't noticeable. Most of these controllers use a single-threaded calculation, so at most it will load a single core, and it really only does so on writes; reads don't really cause any load. With modern 4+ core processors, this isn't really a big deal for a home system.

Hardware RAID is RAID that is handled entirely by the controller. All the calculations are done by the controller, so there is very little system overhead. This is important in a server environment where there is a heavy load on the server constantly, but in a home environment it usually isn't. Obviously hardware RAID controllers can be expensive, but we are seeing very good deals on used ones come up on eBay as data centers replace their older cards with new ones.


Second, I'd like to address the statement that games have a mechanism to deal with slow hard drives. Yes, this is true to an extent. However, there are a lot of games that end up with issues from a slow hard drive, particularly large open-world games. I've noticed in Far Cry 5 particularly that a slow hard drive causes stutter, but I've even noticed it in GTA:V as well. It finally prompted me to replace my HDD from 2013 with a better one, as well as adding an SSD cache to it. The stuttering is now gone.

Finally, what to do in the OP's situation. I'm going to go back to my original suggestion and say a single hard drive with an SSD cache. You can get a decent 3TB hard drive these days for under $50. Pair it with an inexpensive SSD (or use part of the one already in the system) using PrimoCache, and you've got a great setup for running games off of. Even if you throw in a dedicated 240GB SSD for the cache, with the $30 for PrimoCache you're still looking at under $150.
 
Joined
Oct 19, 2007
Messages
8,198 (1.36/day)
Processor Intel i9 9900K @5GHz w/ Corsair H150i Pro CPU AiO w/Corsair HD120 RBG fan
Motherboard Asus Z390 Maximus XI Code
Cooling 6x120mm Corsair HD120 RBG fans
Memory Corsair Vengeance RBG 2x8GB 3600MHz
Video Card(s) Asus RTX 3080Ti STRIX OC
Storage Samsung 970 EVO Plus 500GB , 970 EVO 1TB, Samsung 850 EVO 1TB SSD, 10TB Synology DS1621+ RAID5
Display(s) Corsair Xeneon 32" 32UHD144 4K
Case Corsair 570x RBG Tempered Glass
Audio Device(s) Onboard / Corsair Virtuoso XT Wireless RGB
Power Supply Corsair HX850w Platinum Series
Mouse Logitech G604s
Keyboard Corsair K70 Rapidfire
Software Windows 11 x64 Professional
Benchmark Scores Firestrike - 23520 Heaven - 3670
I have a RAID guide stickied on this forum. If you want to check it out, link is in my sig. :)
 
  • Like
Reactions: hat

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,194 (1.49/day)
Location
Charleston, SC
System Name Tower of Power / Sechs
Processor i7 14700K / i7 5820k @ 4.5ghz
Motherboard ASUS ROG Strix Z690-A Gaming WiFi D4 / X99S GAMING 7
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / CORSAIR Hydro Series H100i Extreme
Memory CORSAIR Vengeance LPX 32GB (2 x 16GB) DDR4 3600 / G.Skill DDR4 2800 16GB 4x4GB
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / ASUS TUF Gaming GeForce RTX 3070 V2 OC Edition
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / Samsung 870 PRO 500GB M.2
Display(s) Samsung 32" Odyssy G5 Gaming 144hz 1440p, ViewSonic 32" 72hz 1440p / 2x ViewSonic 32" 72hz 1440p
Case Phantek "400A" / Phanteks “Enthoo Pro series”
Audio Device(s) Realtek ALC4080 / Azalia Realtek ALC1150
Power Supply Corsair RM Series RM750 / Corsair CXM CX600M
Mouse Glorious Gaming Model D Wireless / Razer DeathAdder Chroma
Keyboard Glorious GMMK with box-white switches / Keychron K6 pro with blue swithes
VR HMD Quest 3 (128gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 10 Pro x64
Benchmark Scores Yes
@Aquinus - True, it has gotten better, but for speed without the need for redundancy, I would only use RAID 0. If I needed redundancy and speed on a software RAID array, then I would spend the extra cash on a RAID 10 array. Most average users make the mistake of thinking RAID 0 arrays are just 2x drives and that going with a RAID 5 array would be quicker using 3x drives, which is wrong (not saying you fall into this category). RAID 0 can handle more than 2x drives, but it can be a double-edged sword, because adding too many drives can have a negative impact on your system using software RAID. Around 2005-2007 I used a software RAID 0 array with 3x WD RE4 250GB, which gave me 750GB of storage with around 170-200 MB/s write/read.

Also, users wanting to use RAID need to know the importance of labeling your RAID array. ;)
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
RAID10 is wasteful, with only 50% of the total drive capacity available. RAID5 gives you n-1 drives of capacity. If we were talking a 3-drive RAID5 versus a 2-drive RAID0, the read performance would be about the same, but the write performance of RAID5 is much worse.
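The capacity arithmetic is easy to verify for n equal-sized drives. A tiny sketch (the level names are just labels for the formulas above):

```python
def usable_fraction(level: str, n: int) -> float:
    """Usable capacity as a fraction of raw capacity for n equal drives."""
    if level == "raid10":
        return 0.5            # every block is mirrored
    if level == "raid5":
        return (n - 1) / n    # one drive's worth of space holds parity
    if level == "raid0":
        return 1.0            # no redundancy at all
    raise ValueError(level)

print(usable_fraction("raid5", 3))   # -> 0.666... (2 of 3 drives usable)
print(usable_fraction("raid10", 4))  # -> 0.5 (half of raw capacity)
```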

Basic overview of common levels: https://www.datarecovery.net/articles/raid-level-comparison.aspx

I would never recommend RAID0 with more than two drives. The risk of data loss keeps going up and up.
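That risk claim follows directly from the math: RAID 0 dies if any one member dies, so the loss probability compounds with each drive you add. A small sketch (the 5% annual failure rate is a made-up number for illustration, and this assumes independent failures):

```python
def raid0_loss_probability(p_drive: float, num_drives: int) -> float:
    """Chance the array is lost: any single drive failing kills RAID 0."""
    return 1 - (1 - p_drive) ** num_drives

# Hypothetical 5% annual failure rate per drive:
for n in (2, 3, 4):
    print(n, round(raid0_loss_probability(0.05, n), 4))
# -> 2 0.0975 / 3 0.1426 / 4 0.1855
```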

Around 2005-2007 I used a RAID 0 software array with 3x WD RE4 250gb which gave me 750gb storage with around 170-200 write / read.
Here's my 11 year old RAID5:
[attached benchmark screenshot: cdm.png]

Heh, I thought it felt painfully slow. Well, there's the proof. It needs upgrading but, meh, the drives are still good. :roll:
 
Last edited: