
Need specifics on RAID 10

How is the speed writing from the cache to the RAID though? You can't cache all file transfers to a RAID, even more so if you're storing something that changes a lot like a VM drive.

I get maybe 40 MB/s from cache to array; that's limited by the parity calculations. But that's not usually a problem, as the 1 TB drive is more than large enough to dump any data I want onto it. The array is not used for modifying large data files, but for long-term storage with some redundancy.

ESXi and the operating systems run from a separate hard drive, but I have run them on the array without issue.
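For anyone wondering why the move from the cache drive to the parity-protected array is so much slower than a plain copy, here is a minimal sketch of the read-modify-write cycle a parity update implies; the block size and function name are illustrative only, not unRAID or FlexRAID internals.

```python
# Minimal sketch of the XOR parity a non-striped parity array has to
# maintain on every write (names and block size are illustrative, not
# unRAID/FlexRAID internals).

BLOCK = 64 * 1024  # 64 KiB chunks, an arbitrary example size

def update_parity(old_data: bytes, new_data: bytes, old_parity: bytes) -> bytes:
    """Recompute the parity block after a data block changes.

    new_parity = old_parity XOR old_data XOR new_data
    Every write therefore costs a read of the old data, a read of the
    old parity, the XOR itself, and two writes -- which is why writes
    that land on the array sit well below raw disk speed.
    """
    return bytes(p ^ od ^ nd for p, od, nd in zip(old_parity, old_data, new_data))

if __name__ == "__main__":
    old = bytes(BLOCK)              # block full of zeros
    new = bytes([0xFF]) * BLOCK     # block rewritten with 0xFF
    parity = bytes(BLOCK)
    parity = update_parity(old, new, parity)
    print(parity[:4])               # b'\xff\xff\xff\xff'
```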
 


Aquinus

Resident Wat-man
I get maybe 40 MB/s from cache to array; that's limited by the parity calculations. But that's not usually a problem, as the 1 TB drive is more than large enough to dump any data I want onto it. The array is not used for modifying large data files, but for long-term storage with some redundancy.

ESXi and the operating systems run from a separate hard drive, but I have run them on the array without issue.

Ah okay. 40MB/s is a little slow for my taste. My RAID-5 off of the X79 PCH gives me something like 140-160MB/s. It also rebuilds pretty fast.
 

brandonwh64

Addicted to Bacon and StarCrunches!!!
Ah okay. 40MB/s is a little slow for my taste. My RAID-5 off of the X79 PCH gives me something like 140-160MB/s. It also rebuilds pretty fast.

With RAID 0 on two first-gen SATA HDDs I get around the low 100s (MB/s), but this is on an onboard Marvell chip.
 
It's not as fast because the data is not striped.

With RAID 5, if you lose more than one disk you lose everything.
If I lose 2 disks, I only lose what's on those 2 disks; all the rest of the files on the other 8 drives are intact, because it keeps each file whole on a single drive.
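A rough sketch of the failure math being described, assuming a ten-data-disk pool with one dedicated parity disk; the function names and the all-data-disk worst case are my assumptions, not anything from unRAID or FlexRAID.

```python
# Toy failure-domain comparison for the scenario described above:
# ten data disks plus one dedicated parity disk versus a striped
# RAID 5 of the same disks. Purely illustrative.

DATA_DISKS = 10

def raid5_data_lost(failed_disks: int, data_disks: int = DATA_DISKS) -> int:
    """Striped RAID 5 (single parity): two or more failures lose the
    whole array, because every file is spread across every disk."""
    return 0 if failed_disks <= 1 else data_disks

def parity_pool_data_lost(failed_data_disks: int) -> int:
    """Dedicated parity with whole files per disk: a single failure is
    rebuilt from parity; beyond that, only the failed data disks are
    lost and the surviving disks stay readable on their own."""
    return 0 if failed_data_disks <= 1 else failed_data_disks

for failed in (1, 2, 3):
    print(f"{failed} failed: RAID 5 loses {raid5_data_lost(failed)} disks of data, "
          f"parity pool loses {parity_pool_data_lost(failed)}")
```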
 

newtekie1

Semi-Retired Folder
I should have noted using software RAID or a crappy hardware controller, but headache-wise I still say the fewer drives the easier/better... and sure, using a higher-end hardware controller card he wouldn't have any issues, but that will cost him a lot of money, like you said. Plus, I wouldn't set up a RAID 5 or 10 without using a BBU, and that's even more money.

I've got around $800 in one of my hardware RAID arrays. It's an LSI 3ware 9650SE 4-port (about $650) and I bought a BBU (around $110), and that's not including the hard drives, which were around $220 each using enterprise drives. So sinking money into just a 4-port RAID array can get costly quick. At the moment I'm running 9 hardware RAID arrays.

If a RAID-5 array is set up properly, a BBU is not necessary. Though for a server, an expensive UPS is never a bad idea.
 

Easy Rhino

Linux Advocate
Staff member
Why does everyone think you need an expensive controller for lots of drives? I've got a $45 Highpoint RAID card that supports 10 drives... My entire server hard drive setup that I have now supports 10 drives and cost about $310 when it was all said and done.

I know that you know what you are talking about, but it needs to be clarified: a lot of people confuse a RAID controller card, a software RAID controller card, and a plain host card.

There are massive speed and performance benefits to a RAID-on-chip card over a simple software RAID controller. So if you want RAID 5, and you want it very fast, and you want it to do fancy things like write-through caching and parity calculations, then you need a controller that does RAID 5 on the card, with its own RAM, etc.
 

newtekie1

Semi-Retired Folder
I know that you know what you are talking about, but it needs to be clarified: a lot of people confuse a RAID controller card, a software RAID controller card, and a plain host card.

There are massive speed and performance benefits to a RAID-on-chip card over a simple software RAID controller. So if you want RAID 5, and you want it very fast, and you want it to do fancy things like write-through caching and parity calculations, then you need a controller that does RAID 5 on the card, with its own RAM, etc.

You are right to an extent, but also wrong. Nowadays even the software cards can do write-through and give very decent speeds with RAID 5. The $45 card I have, for example, does write-through without a problem and easily sustains 110-120 MB/s transfers to and from it, which is enough to pretty much saturate a gigabit connection and hence enough for any server. Yes, it is still using the CPU to do the work, unlike a true hardware card, but even in my dual-core Celeron server the CPU is barely bothered by the extra load. In fact, when reading/writing, the CPU load barely breaks 5%, and the i7 in my main rig just laughs it off like the load isn't even there. Yeah, if I wanted it faster for an array directly in the machine I was using, or had a 10-gigabit network, I'd want a hardware RAID controller. You are right, there are massive performance gains with a hardware card. But for a home server that is storing data that doesn't need to be accessed at extremely fast speeds, there is no point in a super-expensive hardware RAID card; in fact, even in most businesses there isn't a need for it unless they are doing something like video editing directly off the file server.
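A quick back-of-the-envelope check on the "saturate a gigabit connection" point; the protocol-overhead figure below is an assumed ballpark, not a measured value.

```python
# Rough ceiling for file transfers over gigabit Ethernet. The ~6%
# protocol overhead is an assumed ballpark (Ethernet + IP + TCP + SMB),
# not a measured number; real results vary with frame size and protocol.

LINK_BITS_PER_S = 1_000_000_000
OVERHEAD = 0.06

raw_mb_s = LINK_BITS_PER_S / 8 / 1_000_000       # 125 MB/s theoretical
usable_mb_s = raw_mb_s * (1 - OVERHEAD)          # ~117 MB/s practical

print(f"theoretical: {raw_mb_s:.0f} MB/s, realistic: ~{usable_mb_s:.0f} MB/s")
# Anything in the 110-120 MB/s range from the array is already enough
# to keep a single gigabit client fully fed.
```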
 

Mindweaver

Moderato®™
Staff member
If a RAID-5 array is set up properly, a BBU is not necessary. Though for a server, an expensive UPS is never a bad idea.

There again, I would not set up a RAID-5 without a BBU, and I would still have a UPS as well. I wouldn't consider a RAID-5 as being set up properly without a BBU. I'm not saying you are wrong, newtekie1, I'm just stating how I would handle it.

If you can't afford a BBU for your hardware RAID-5 controller card, then you shouldn't be using a hardware RAID-5 array; just go software.
 

newtekie1

Semi-Retired Folder
There again, I would not set up a RAID-5 without a BBU, and I would still have a UPS as well. I wouldn't consider a RAID-5 as being set up properly without a BBU. I'm not saying you are wrong, newtekie1, I'm just stating how I would handle it.

If you can't afford a BBU for your hardware RAID-5 controller card, then you shouldn't be using a RAID-5 array.

A BBU is only required if you are using write-back cache, so no, a BBU isn't necessary. And even if you are using write-back, a properly configured inexpensive UPS is enough to keep you safe; you don't need a BBU on the RAID controller itself.

However, yes, I would say if you can afford a true hardware RAID-5 controller and don't get the BBU, you're doing it wrong.
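To make the write-back versus write-through distinction concrete, here is a toy cache model; it is not any controller's actual firmware, just a sketch of why the write-back window is what a BBU (or UPS) has to cover.

```python
# Toy model of the two controller cache policies. Illustrative only --
# this is not how any real RAID firmware is written.

class Disk:
    def __init__(self):
        self.blocks = {}
    def write(self, addr, data):
        self.blocks[addr] = data      # pretend this is the slow part

class Controller:
    def __init__(self, disk, write_back: bool):
        self.disk = disk
        self.write_back = write_back
        self.cache = {}               # DRAM on the card: lost on a power cut

    def write(self, addr, data):
        if self.write_back:
            self.cache[addr] = data      # ack immediately, flush later (fast,
                                         # but needs a BBU to survive power loss)
        else:
            self.disk.write(addr, data)  # write-through: ack only once the
                                         # data is actually on disk (safe, slower)

    def flush(self):
        for addr, data in self.cache.items():
            self.disk.write(addr, data)
        self.cache.clear()

    def power_loss(self):
        self.cache.clear()            # whatever wasn't flushed is gone

disk = Disk()
wb = Controller(disk, write_back=True)
wb.write(0, b"important")
wb.power_loss()                       # cache dropped before flush
print(0 in disk.blocks)               # False: this is the risk a BBU covers
```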
 

Mindweaver

Moderato®™
Staff member
A BBU is only required if you are using write-back cache, so no, a BBU isn't necessary. And even if you are using write-back, a properly configured inexpensive UPS is enough to keep you safe; you don't need a BBU on the RAID controller itself.

There again, I stated what I would do, and I never said it was necessary, buddy. Why would anyone not want to use that cache? The performance is great. I think the gains outweigh the small extra cost.


However, yes, I would say if you can afford a true hardware RAID-5 controller and don't get the BBU, you're doing it wrong.

:toast:
 

newtekie1

Semi-Retired Folder
There again, I stated what I would do, and I never said it was necessary, buddy. Why would anyone not want to use that cache? The performance is great. I think the gains outweigh the small extra cost.




:toast:

Well, like I said, if you are putting it in a server on a gigabit network, the performance improvement is useless, as even software cards using write-through are enough to saturate the network. So using the cache just adds a chance of data loss for no gain at all.
 

Easy Rhino

Linux Advocate
Staff member
You are right to an extent, but also wrong. Nowadays even the software cards can do write-through and give very decent speeds with RAID 5. The $45 card I have, for example, does write-through without a problem and easily sustains 110-120 MB/s transfers to and from it, which is enough to pretty much saturate a gigabit connection and hence enough for any server. Yes, it is still using the CPU to do the work, unlike a true hardware card, but even in my dual-core Celeron server the CPU is barely bothered by the extra load. In fact, when reading/writing, the CPU load barely breaks 5%, and the i7 in my main rig just laughs it off like the load isn't even there. Yeah, if I wanted it faster for an array directly in the machine I was using, or had a 10-gigabit network, I'd want a hardware RAID controller. You are right, there are massive performance gains with a hardware card. But for a home server that is storing data that doesn't need to be accessed at extremely fast speeds, there is no point in a super-expensive hardware RAID card; in fact, even in most businesses there isn't a need for it unless they are doing something like video editing directly off the file server.

The problem with software RAID comes when running more than one virtual machine: performance comes to a crawl with all of that cross traffic. The best I could get with software RAID running several VMs was 50 MB/s transfers, with caching enabled.
 

newtekie1

Semi-Retired Folder
The problem with software RAID comes when running more than one virtual machine: performance comes to a crawl with all of that cross traffic. The best I could get with software RAID running several VMs was 50 MB/s transfers, with caching enabled.

You're right, but that can bring even hardware controllers down to a crawl. It can be a problem with any hard drive setup, simply because hard drives just aren't good with heavy random access loads. A good hardware controller with a large amount of RAM for cache definitely helps the situation, though, since the RAM cache helps make up for the shortcomings of the hard drives. But even some of the software controllers that allow you to use an SSD as a cache drive are a lot better with this type of load than they used to be.
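A rough sketch of why heavy random access hurts so much on spinning disks; the seek time below is an assumed typical figure for a 7200 RPM drive, not a spec for any particular model.

```python
# Back-of-the-envelope IOPS for a 7200 RPM drive under fully random I/O.
# Seek time is an assumed typical value, not a measured spec.

avg_seek_ms = 8.5                              # typical 7200 RPM desktop drive
rotational_latency_ms = 0.5 * 60_000 / 7200    # half a revolution ~= 4.17 ms
service_time_ms = avg_seek_ms + rotational_latency_ms

iops = 1000 / service_time_ms                  # ~79 random IOPS per spindle
random_4k_mb_s = iops * 4096 / 1_000_000       # ~0.3 MB/s at 4 KiB per I/O

print(f"~{iops:.0f} IOPS, ~{random_4k_mb_s:.2f} MB/s random 4K")
# Compare that with 100+ MB/s sequential: several VMs issuing random I/O
# collapse throughput regardless of the controller, which is why a big
# RAM cache or an SSD cache tier helps so much.
```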
 
20 posts later and the OP has said NOTHING. rofl.
 

Easy Rhino

Linux Advocate
Staff member
You're right, but that can bring even hardware controllers down to a crawl. It can be a problem with any hard drive setup, simply because hard drives just aren't good with heavy random access loads. A good hardware controller with a large amount of RAM for cache definitely helps the situation, though, since the RAM cache helps make up for the shortcomings of the hard drives. But even some of the software controllers that allow you to use an SSD as a cache drive are a lot better with this type of load than they used to be.

The cards now come with NAND flash on them, so you can ditch the BBU and don't have to bother with any sort of hybrid RAID setup. Those get expensive, though.

20 posts later and the OP has said NOTHING. rofl.

If you look at his thread record you would know why.

Still provides for good conversation, though.
 

Mindweaver

Moderato®™
Staff member
I've got one comment to share with you: edit your posts, don't double post. I'm sure you have been told this multiple times.
 

Aquinus

Resident Wat-man
With RAID 0 on two first-gen SATA HDDs I get around the low 100s (MB/s), but this is on an onboard Marvell chip.

...and I would find that barely acceptable. When you're working on the RAID locally for things like VMs, you really want it to be quick.
 

brandonwh64

Addicted to Bacon and StarCrunches!!!
...and I would find that barely acceptable. When you're working on the RAID locally for things like VMs, you really want it to be quick.

But for an everyday PC it makes opening things like Word, Chrome, and other small apps that much better to deal with.
 
1) You asked a broad question. Ten minutes on Google would answer 90% of your question and bring us all to the same page. Being angry that this was suggested isn't reasonable.

2) JBOD exists (just a bunch of drives). It spans multiple physical drives so that they appear as only one logical drive. No redundancy, no performance enhancement. There is no reason I can currently see for a home user to have a JBOD array.

3) RAID has to rewrite each drive. You cannot just put in an extra two drives and then RAID them. Back up your data, and be prepared for a clean drive.

4) Sweet jebus, you're looking at an expensive setup. This is one of the cheapest 8-port SATA II RAID controllers I could find: http://www.newegg.com/Product/Product.aspx?Item=N82E16816115026 If you really want that, it's your prerogative.

5) RAID =/= backup. It's been touched on already, but RAID arrays aren't a backup. They might let a number of drives fail catastrophically without total data loss, but they cannot address slow degradation. Also, remember that you're building an array out of the same type of drive; if it has a design flaw, then more than one is likely to fail (in rapid succession).

6) Why in Hades do you want RAID 10? You lose half of the drive space and get only a bit of a performance boost. If you're looking at that kind of investment, the solution is clear: buy an SSD for the primary applications, run a 4-drive RAID 5 array for storage, and bite the bullet on 3 TB drives. You wind up with the OS drive and 9 TB of storage. A ten-drive RAID 10 array only gives an additional 6 TB of space, while costing at least 3 times as much (assuming you find a RAID controller that only costs as much as two drives).

TL;DR:
Like the 3930K, if you have to ask about this topic you likely don't need it. RAID 10 is unlikely to be useful given that you want data backup and not continuous-operation reliability. You can increase your storage capacity, while decreasing cost, by using RAID 5.


OK, what you're saying makes a lot of sense. I wasn't sure; I thought RAID 10 meant more stability and less chance of losing data. Data security is very important to me; we're talking years of data. A performance increase is not very important to me at all. Currently my main drive, with all primary applications, is a 256GB Agility 4 SSD. Everything else is just storage. Yes, I access these files, but normal hard drive speed is sufficient for all my bulk data. My SSD as the primary drive is sufficient to store all my applications and is very fast; I love it.

Given that a performance increase is not important, but the risk of data loss matters most, do you still recommend RAID 5? Because the point of RAID 10 is data redundancy, so as to significantly minimize the risk of catastrophic data loss due to hard drive failure.
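For weighing RAID 5 against RAID 10 on capacity versus redundancy, here is a small sketch using the 3 TB drives suggested earlier; the formulas are the standard ones, and the drive counts are just the examples from this thread.

```python
# Usable capacity and failure tolerance for the two layouts being
# discussed, assuming equal-size drives.

def raid5(drives: int, size_tb: float):
    usable = (drives - 1) * size_tb
    return usable, "survives any 1 drive failure"

def raid10(drives: int, size_tb: float):
    usable = drives / 2 * size_tb
    # Always survives 1 failure; survives more only if no mirror pair
    # loses both of its members.
    return usable, "survives 1 guaranteed, up to half the drives if lucky"

for name, (usable, tolerance) in {
    "4x3TB RAID 5": raid5(4, 3),
    "10x3TB RAID 10": raid10(10, 3),
}.items():
    print(f"{name}: {usable:.0f} TB usable, {tolerance}")
# 4x3TB RAID 5   -> 9 TB usable
# 10x3TB RAID 10 -> 15 TB usable, at well over twice the drive cost
```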

Cost is of course also important; I would look for the best deals. Currently my setup of just using multiple hard drives is working out fine for me, especially since I use Shell Link Extension to make virtual folders located on other drives to give a "virtual storage increase" - "virtual" not as a technical term, but simply meaning "you don't have to click into the other drives to access the data".

Let me give an example of what I currently do:

For my video collection (it's not really private, so I can show you all a screenshot of the file structure), I put all the folders on a different drive but added junctions within the one drive, giving the illusion that they are all in the same location. Hope you grasp what I'm showing you:

[screenshot: video folder structure showing junction-linked folders]

As you can see, almost all the folders are junction links; basically it means that if I move a file into the folder, it ends up on the other drive. (A couple of the folders I haven't moved yet; that's why you don't see the chain-link overlay on the folder icons for Alias and Motive.)

So it's working fine for me to have, effectively, JBOD; the only reason to use RAID is that it would simplify things a bit by making it all one huge drive instead of several drives. Is it even worth it to do RAID for my situation?
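For anyone wanting to reproduce that junction setup, a minimal sketch using Windows' built-in mklink via Python; the paths are placeholders, not the actual folder layout from the screenshot.

```python
# Create an NTFS junction so a folder on another drive appears inside
# the main collection folder (Windows only). Paths are examples, not
# the poster's actual layout.

import subprocess

def make_junction(link_path: str, target_path: str) -> None:
    """Wrap `mklink /J`, which is a cmd.exe built-in rather than a
    standalone executable, hence the `cmd /c` invocation."""
    subprocess.run(
        ["cmd", "/c", "mklink", "/J", link_path, target_path],
        check=True,
    )

if __name__ == "__main__":
    # Files dropped into D:\Videos\Alias end up physically on E:.
    make_junction(r"D:\Videos\Alias", r"E:\Videos\Alias")
```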

Try unRAID or FlexRAID.
Easy to expand without reformatting.
No need to buy expensive controllers.
Everything is done in software, so it's easy to change hardware.

Data is not striped, so if you lose 1 drive you can recover via the parity drive, and if you lose 2 you only lose the data on those disks, not the entire array.

Whoa, FlexRAID looks great from their website. Anyone have any comments or experience with it?

20 posts later and the OP has said NOTHING. rofl.

Sorry for the delay in response; I was asleep last night, woke up this morning, and only just checked the thread again. Happy that there are so many useful responses.
 

Wrigleyvillain

PTFO or GTFO
OK, cool. Though next time try to put it in the right section, man.
 
