
Best RAID Controller Card?

Joined
Dec 16, 2010
Messages
1,662 (0.34/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, 2 x 512GB Samsung PM981a, 4 x 4TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
As much as we all may argue about what is best, RAID remains a tradeoff between reliability and cost. There is no "optimal" RAID configuration, since all RAID configurations have a greater than zero chance of losing all data. What the person building the RAID needs to do is identify what failure rate is acceptable and then design a RAID configuration around that.
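To put rough numbers on that last point, here is a quick Python sketch of how you might compare yearly data-loss odds for a few layouts. The 3% per-disk annual failure rate and the 4-disk array are made-up assumptions, and the model is deliberately crude, so treat it as illustration only:

```python
from math import comb

afr = 0.03   # assumed annual failure rate per disk (made-up, for illustration)
n = 4        # disks in the array

def p_at_least(k, n, p):
    """Probability that at least k of n independent disks fail within the year."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Crude model: RAID 0 is lost on the first failure, RAID 5 on the second,
# RAID 6 on the third. Counting any two failures in the same year as fatal
# for RAID 5 is pessimistic, while ignoring unrecoverable read errors during
# a rebuild is optimistic, so only the relative ordering is meaningful.
print(f"Single disk      : {afr:.4%}")
print(f"RAID 0, 4 disks  : {p_at_least(1, n, afr):.4%}")
print(f"RAID 5, 4 disks  : {p_at_least(2, n, afr):.4%}")
print(f"RAID 6, 4 disks  : {p_at_least(3, n, afr):.4%}")
```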
 
Joined
Nov 4, 2005
Messages
11,655 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Compared to even a single disk, RAID 5, 1, or 10 are the only truly redundant and fault-tolerant arrays that minimize risk.
 
Joined
Jul 19, 2013
Messages
10 (0.00/day)
On a multiple controller depth to physical medium setup such as this, NCQ can increase the overhead and increase the queuing time in real-world reads and writes. It's the RAID card's job to handle the disks; use the setup for Fast Path on these LSI cards. They wouldn't have made it and shown such great numbers if their logic was somehow worse than the standard.


Lastly, TRIM will not pass through on these disks with a RAID card, and other features Windows implements for SSDs will not pass through either. Since the RAID card will be blissfully unaware there is cache on the hybrid drive, it will treat them as standard mechanical drives and perform standard mechanical drive operations, which will get run through the cache and wear it out ever so slightly faster.

As for the multiple controller depth paragraph above, I have no idea what you are saying, to be honest. It reads as gibberish; I have tried to interpret it five times now. Could you possibly post that in English?

TRIM is inconsequential on an SSHD. It doesn't use TRIM commands; it isn't an SSD. You lack a basic understanding of SSHD technology. The internal algorithms promote and demote information to the cache based on block-level analysis of access patterns. The data is not deleted directly from the cache by the filesystem, which is what issues the TRIM command, so there is no need for TRIM.
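The actual promotion logic in these drives' firmware is proprietary, but the block-level idea can be sketched roughly like this (a toy Python model with made-up names and thresholds, not Seagate's actual Adaptive Memory algorithm):

```python
from collections import Counter

class ToyHybridCache:
    """Toy model of SSHD-style block promotion: hot blocks get copied into a
    small flash cache based on read frequency, with no filesystem or TRIM
    involvement at all."""
    def __init__(self, cache_blocks=4, promote_after=3):
        self.hits = Counter()          # read counts per block
        self.cache = set()             # blocks currently held in flash
        self.cache_blocks = cache_blocks
        self.promote_after = promote_after

    def read(self, block):
        self.hits[block] += 1
        if block in self.cache:
            return "flash"             # served from the SSD portion
        if self.hits[block] >= self.promote_after:
            if len(self.cache) >= self.cache_blocks:
                # evict the coldest cached block to make room
                coldest = min(self.cache, key=lambda b: self.hits[b])
                self.cache.discard(coldest)
            self.cache.add(block)      # promote the hot block
        return "platter"

drive = ToyHybridCache()
for _ in range(4):
    drive.read(42)                     # block 42 becomes hot
print(drive.read(42))                  # -> "flash"
```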

As much as we all may argue about what is best, RAID remains a tradeoff between reliability and cost. There is no "optimal" RAID configuration, since all RAID configurations have a greater than zero chance of losing all data. What the person building the RAID needs to do is identify what failure rate is acceptable and then design a RAID configuration around that.

Agreed. Data replication is a must, regardless of the RAID setup. RAID is not backup.

Compared to even a single disk, RAID 5, 1, or 10 are the only truly redundant and fault-tolerant arrays that minimize risk.

So RAID 50, 60, and other nested RAID arrays aren't redundant? Check again.
 
Joined
Sep 30, 2004
Messages
507 (0.07/day)
Processor dual G34 6128he
Motherboard h8dg6
Cooling OEM
Memory 48gb DDR3 ECC REG
Video Card(s) onboard
Storage 12x 3tb 4x 15k sas
Display(s) crappy lcd
Case chenbro rackmouunt
Audio Device(s) onboard
Power Supply seasonic 3x redundant
Software ProxMox
For hardware RAID on a budget, I recommend the Adaptec 5805. It is not the newest kid in town, but it is more capable than just about any home system will require.

The new 6805 would be great also, but it is over double the price and you will not push enough utilization to see a performance difference. Do not get a 6xxx series with an E in the model; those are HBA-style cards (software).

If you do need an HBA, I would go with a Dell H310 or H200: low cost, abundant, and capable of doing the job. (HBA = Host Bus Adapter; it's a bridge, not a hardware implementation. But if you want to dabble in ZFS, SnapRAID, or some other software variant, then they are the way to go.)
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.25/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Really, I think not. I would not use those drives for any type of RAID array, because I care about my data. They are not built for that purpose. The money spent on those drives could have easily gotten him 10k RPM enterprise drives with proper error recovery. I have, and have used, many RAID arrays in a production environment. Now, can he use those drives? Yes, but why? I'm just suggesting that his money could have been spent better. He is buying the drives thinking they will be faster because of the hybrid SSD cache, but in reality he would be better off getting regular SSDs. You are putting a lot of faith in a hybrid drive that is not built for RAID.

Using an SSHD is no worse than using a standard desktop drive. Seagate desktop drives actually seem to work far better in RAID than Western Digital drives in my experience; I believe this is because Seagate doesn't actually disable the error recovery features while WD does. Yes, for about the same price he could have gone with Constellation drives, but performance would have been worse. The SSD cache on these drives does work with RAID, so these drives actually work very well in RAID.

As much as we all may argue about what is best, RAID remains a tradeoff between reliability and cost. There is no "optimal" RAID configuration, since all RAID configurations have a greater than zero chance of losing all data. What the person building the RAID needs to do is identify what failure rate is acceptable and then design a RAID configuration around that.


RAID 0 is the only RAID level that doesn't minimize risk; all other RAID levels are more redundant and less risky than a single hard drive.
 

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,193 (1.50/day)
Location
Charleston, SC
System Name Tower of Power / Sechs
Processor i7 14700K / i7 5820k @ 4.5ghz
Motherboard ASUS ROG Strix Z690-A Gaming WiFi D4 / X99S GAMING 7
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / CORSAIR Hydro Series H100i Extreme
Memory CORSAIR Vengeance LPX 32GB (2 x 16GB) DDR4 3600 / G.Skill DDR4 2800 16GB 4x4GB
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / ASUS TUF Gaming GeForce RTX 3070 V2 OC Edition
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / Samsung 870 PRO 500GB M.2
Display(s) Samsung 32" Odyssy G5 Gaming 144hz 1440p, ViewSonic 32" 72hz 1440p / 2x ViewSonic 32" 72hz 1440p
Case Phantek "400A" / Phanteks “Enthoo Pro series”
Audio Device(s) Realtek ALC4080 / Azalia Realtek ALC1150
Power Supply Corsair RM Series RM750 / Corsair CXM CX600M
Mouse Glorious Gaming Model D Wireless / Razer DeathAdder Chroma
Keyboard Glorious GMMK with box-white switches / Keychron K6 pro with blue swithes
VR HMD Quest 3 (128gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 10 Pro x64
Benchmark Scores Yes
Using an SSHD is no worse than using a standard desktop drive.

I never said it was worse than a standard desktop drive, buddy, and with that said, I wouldn't use a standard drive either. Plus, I just think that for the money that was spent he would be better off with a RAID-specific drive over these hybrid drives. I hope these drives work well for the OP; I can't say I'm not interested in his results when it's complete. Also, I think we are all trying to help the OP. :toast:
 
Joined
Dec 16, 2010
Messages
1,662 (0.34/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, 2 x 512GB Samsung PM981a, 4 x 4TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
RAID 0 is the only RAID level that doesn't minimize risk; all other RAID levels are more redundant and less risky than a single hard drive.

You're right, but there are endless discussions on the internet as to which RAID level is best, with many people insisting on using the same RAID level for all applications. My argument is that the RAID level used depends on the application; certain data is more valuable than other data. It might make sense to use a less redundant RAID level if your data can be easily copied back from a backup or the internet. In that case, spending lots of money to ensure a very high level of data security is money wasted.

RAID is a tradeoff of three factors, and you simply cannot design an array that excels in all three factors:

Low cost
High performance
High data security

I value low cost and high data security with no regard for performance, so I use RAID 6 for my home storage. Someone like the OP, who values high performance and high data security with little regard for cost, might choose RAID 10. Someone who values low cost and high performance with no care for data integrity might choose RAID 0.
 
Joined
Nov 4, 2005
Messages
11,655 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
As for the multiple controller depth paragraph above, I have no idea what you are saying, to be honest. It reads as gibberish; I have tried to interpret it five times now. Could you possibly post that in English?

So RAID 50, 60, and other nested RAID arrays aren't redundant? Check again.
Both the disk and the RAID card have caching algorithms for data. In a RAID 5 array, where parity calculation and caching are already in use, NCQ has almost no effect in my real-world experience with HDDs featuring 64MB of cache as well as hardware-based controller cards.
Depending on the existing queue depth and request length, it can have a minimally positive or negative effect.


I am unaware of the caching policies used by SSHDs from different manufacturers. I agree many are probably based on real-world block usage, but many seem to feature real-time caching of files as well, meaning a RAID card writing back to the disks from its own cache will cause odd reads and writes to the drive's internal SSD cache.


I don't know a single person on this board or any other who uses, or would use, nested arrays for their personal systems. Way to take it overboard.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.25/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I never said it was worse than a standard desktop drive, buddy, and with that said, I wouldn't use a standard drive either. Plus, I just think that for the money that was spent he would be better off with a RAID-specific drive over these hybrid drives. I hope these drives work well for the OP; I can't say I'm not interested in his results when it's complete. Also, I think we are all trying to help the OP.

I never said you said it was worse than a standard desktop drive, buddy. My point was that, in my experience, desktop drives from Seagate work just as well as the enterprise drives, and the SSHD drives perform better.


You're right, but there are endless discussions on the internet as to which RAID level is best, with many people insisting on using the same RAID level for all applications. My argument is that the RAID level used depends on the application; certain data is more valuable than other data. It might make sense to use a less redundant RAID level if your data can be easily copied back from a backup or the internet. In that case, spending lots of money to ensure a very high level of data security is money wasted.

RAID is a tradeoff of three factors, and you simply cannot design an array that excels in all three factors:

Low cost
High performance
High data security

I value low cost and high data security with no regard for performance, so I use RAID 6 for my home storage. Someone like the OP, who values high performance and high data security with little regard for cost, might choose RAID 10. Someone who values low cost and high performance with no care for data integrity might choose RAID 0.

Yes, you certainly can design an array that excels at all three factors. My RAID5 array was faster than a single drive, more secure than a single drive, and relatively low cost since it used standard desktop drives.
 
Joined
Dec 16, 2010
Messages
1,662 (0.34/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, 2 x 512GB Samsung PM981a, 4 x 4TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
Yes, you certainly can design an array that excels at all three factors. My RAID5 array was faster than a single drive, more secure than a single drive, and relatively low cost since it used standard desktop drives.

By low cost I meant low cost per capacity, in which case you actually support my original post. Your RAID 5 is more expensive per capacity and slower than a RAID 0 even if your configuration excels at data security. RAID is all about trading storage space for redundancy, so it will never be less expensive for the same capacity compared to JBOD or non-redundant RAID 0. You will never be able to design an array that is better than any other configuration at all three factors simultaneously.
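To illustrate the capacity-for-redundancy trade, here is a small Python sketch. The four 4TB drives at $120 each are arbitrary assumptions; only the ratios between layouts matter:

```python
# Rough cost-per-usable-TB comparison, assuming 4 identical 4 TB drives at
# an illustrative $120 each; real prices and performance obviously vary.
drives, size_tb, price = 4, 4, 120

layouts = {
    "RAID 0":  drives * size_tb,            # no redundancy
    "RAID 5":  (drives - 1) * size_tb,      # one drive's worth of parity
    "RAID 6":  (drives - 2) * size_tb,      # two drives' worth of parity
    "RAID 10": drives * size_tb / 2,        # mirrored pairs
}

total = drives * price
for name, usable in layouts.items():
    print(f"{name:7s} usable = {usable:4.0f} TB   $/usable TB = {total / usable:6.2f}")
```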
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.25/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
By low cost I meant low cost per capacity, in which case you actually support my original post. Your RAID 5 is more expensive per capacity and slower than a RAID 0 even if your configuration excels at data security. RAID is all about trading storage space for redundancy, so it will never be less expensive for the same capacity compared to JBOD or non-redundant RAID 0. You will never be able to design an array that is better than any other configuration at all three factors simultaneously.
I see what you're saying.
 

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,193 (1.50/day)
Location
Charleston, SC
System Name Tower of Power / Sechs
Processor i7 14700K / i7 5820k @ 4.5ghz
Motherboard ASUS ROG Strix Z690-A Gaming WiFi D4 / X99S GAMING 7
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / CORSAIR Hydro Series H100i Extreme
Memory CORSAIR Vengeance LPX 32GB (2 x 16GB) DDR4 3600 / G.Skill DDR4 2800 16GB 4x4GB
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / ASUS TUF Gaming GeForce RTX 3070 V2 OC Edition
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / Samsung 870 PRO 500GB M.2
Display(s) Samsung 32" Odyssy G5 Gaming 144hz 1440p, ViewSonic 32" 72hz 1440p / 2x ViewSonic 32" 72hz 1440p
Case Phantek "400A" / Phanteks “Enthoo Pro series”
Audio Device(s) Realtek ALC4080 / Azalia Realtek ALC1150
Power Supply Corsair RM Series RM750 / Corsair CXM CX600M
Mouse Glorious Gaming Model D Wireless / Razer DeathAdder Chroma
Keyboard Glorious GMMK with box-white switches / Keychron K6 pro with blue swithes
VR HMD Quest 3 (128gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 10 Pro x64
Benchmark Scores Yes
I never said you said it was worse than a standard desktop drive buddy. My point was that, in my experience, desktop drives from Seagate work just as well as the enterprise drives, and the SSHD drives perform better.

Oh, I see, buddy. After thinking about it a while, why wouldn't an SSHD be worse than a standard drive? I mean, the SSHD has twice the parts to fail: the mechanical part that can fail and the SSD part that could fail. I don't have any experience with SSHDs; will an SSHD still work if the SSD part fails? I know the answer to whether it will still work if the mechanical part fails, and that answer is "No". This is part of the reason I would never use an SSHD in RAID.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.25/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Oh, I see, buddy. After thinking about it a while, why wouldn't an SSHD be worse than a standard drive? I mean, the SSHD has twice the parts to fail: the mechanical part that can fail and the SSD part that could fail. I don't have any experience with SSHDs; will an SSHD still work if the SSD part fails? I know the answer to whether it will still work if the mechanical part fails, and that answer is "No". This is part of the reason I would never use an SSHD in RAID.
The SSDs in these drives are extremely simple compared to traditional SSDs; they are actually more like an SD card than an SSD. Normal SSDs have multiple flash chips, and if just one goes bad the whole drive is toast. The SSDs on these SSHDs are a single chip. Sure, that is an extra chip on the PCB that could fail, but is it likely? No.

I'm sure the next question is what happens when the SSD wears out. The SSD isn't going to wear out within the life of the drive. Seagate actually uses 16GB flash chips on the drives with 8GB of SSD cache, and 8GB chips on the early drives with 4GB of SSD cache. They do this so they can have 50% over-provisioning, which will make the SSD last a very, very long time; normal SSDs use something like 10% over-provisioning. Plus, since the cache is only written to in small amounts, the life of the SSD is basically a non-issue.

Now, what happens if that SSD does fail? Well, I'm not entirely sure. However, since the data isn't stored solely on the SSD but is also on the hard drive portion, the data should be safe if the SSD fails. In fact, I would guess the drive would continue to function, though at the speed of a traditional hard drive. Then again, it could also be just like what happens when the traditional cache on a hard drive fails.
 
Joined
May 1, 2007
Messages
320 (0.05/day)
System Name Windows XP Pro 32 bit
Processor Intel Core 2 Quad Q6600
Motherboard Asus Striker II Formula
Cooling XIGMATEK HDT-S1283 cpu heatsink
Memory 2x 2gb Crucial Ballistix ram
Video Card(s) Geforce 280 GTX 1gb
Storage Seagate 200 GB - ST3200820AS / 120gb Western Digital HD
Display(s) ViewSonic VX922
Case Antec 900
Audio Device(s) Creative X-Fi Titanium Fatal1ty Professional
Power Supply HIPER 880W Power Supply
Wow, looks like this conversation went a long way.

If you're interested in how my SSHDs are doing with my motherboard RAID 10 config, I will answer that with "so far so good." I've been running them for about 8 months or so without a hitch.

I was expecting more of a performance boost, so I was concerned that the SSD part of my drives wasn't working because they were configured as RAID. However, after some simple Google searching it appears that these drives can and do work with RAID configs.

So after investigating further, I updated my Intel Matrix Storage Manager and found that write-cache buffer flushing was enabled and the cache mode was off.

I turned the cache mode to "Write Back" (you need to turn write-cache buffer flushing off before doing this), and I believe it is now configured to use the SSD part of my hard drives. Time will tell if this brings a performance boost, but it's worth knowing that the SSD part probably won't work without these adjustments in the Intel Matrix Storage Manager.

 

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,193 (1.50/day)
Location
Charleston, SC
System Name Tower of Power / Sechs
Processor i7 14700K / i7 5820k @ 4.5ghz
Motherboard ASUS ROG Strix Z690-A Gaming WiFi D4 / X99S GAMING 7
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / CORSAIR Hydro Series H100i Extreme
Memory CORSAIR Vengeance LPX 32GB (2 x 16GB) DDR4 3600 / G.Skill DDR4 2800 16GB 4x4GB
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / ASUS TUF Gaming GeForce RTX 3070 V2 OC Edition
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / Samsung 870 PRO 500GB M.2
Display(s) Samsung 32" Odyssy G5 Gaming 144hz 1440p, ViewSonic 32" 72hz 1440p / 2x ViewSonic 32" 72hz 1440p
Case Phantek "400A" / Phanteks “Enthoo Pro series”
Audio Device(s) Realtek ALC4080 / Azalia Realtek ALC1150
Power Supply Corsair RM Series RM750 / Corsair CXM CX600M
Mouse Glorious Gaming Model D Wireless / Razer DeathAdder Chroma
Keyboard Glorious GMMK with box-white switches / Keychron K6 pro with blue swithes
VR HMD Quest 3 (128gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 10 Pro x64
Benchmark Scores Yes
That's good to know. So, have you had any luck with finding a RAID Card? I would say you'll see a good improvement with Hardware RAID. I've had a lot of success with my LSI 3Ware 9650SE card, but it's still pretty high in price. I've heard some people have some success with the HighPoint RocketRaid 640L, and it's around 100 bucks. If I'm not mistaken I think Kreij bought one to use with SSD's, but I don't know if he used RAID10 or not.
 
Joined
May 1, 2007
Messages
320 (0.05/day)
System Name Windows XP Pro 32 bit
Processor Intel Core 2 Quad Q6600
Motherboard Asus Striker II Formula
Cooling XIGMATEK HDT-S1283 cpu heatsink
Memory 2x 2gb Crucial Ballistix ram
Video Card(s) Geforce 280 GTX 1gb
Storage Seagate 200 GB - ST3200820AS / 120gb Western Digital HD
Display(s) ViewSonic VX922
Case Antec 900
Audio Device(s) Creative X-Fi Titanium Fatal1ty Professional
Power Supply HIPER 880W Power Supply
That's good to know. So, have you had any luck with finding a RAID Card? I would say you'll see a good improvement with Hardware RAID. I've had a lot of success with my LSI 3Ware 9650SE card, but it's still pretty high in price. I've heard some people have some success with the HighPoint RocketRaid 640L, and it's around 100 bucks. If I'm not mistaken I think Kreij bought one to use with SSD's, but I don't know if he used RAID10 or not.
After all the mixed responses, I wasn't sure if getting a RAID controller was the best choice or not.
The SATA II on the LSI kind of bothers me; SATA III would definitely be preferable, right?
 
Joined
Dec 16, 2010
Messages
1,662 (0.34/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, 2 x 512GB Samsung PM981a, 4 x 4TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
You are completely mistaken about what you are doing. The SSD part of your hard disks is enabled 100% of the time; there is nothing you can do to enable or disable it.

What you have done is enable write back caching on the OS. This is very dangerous for data security, especially when it is run at the OS level. If you care about your data, do not use it.

Write back caching means that when your OS sends a command to write data to disk, it is not written to disk immediately but is actually written into a RAM cache. Once the data is in the RAM cache, the OS (and you) receive a confirmation that the data has been written to disk (even though it hasn't yet). Then, in the background, the RAID manager moves data from the RAM cache to the array as fast as the disks will accept it. The problem is that if the system loses power or crashes, then all the data that is in RAM but not yet written to disk is lost forever.

This loss of data can be caused by a power outage, or, since the Intel RAID is run by the OS, even an improper shutdown of the OS (such as a system crash or an improper restart) can cause all data in the RAM cache to be lost. The corruption can happen silently, and you might not even know it is occurring. Just checking that the files you wrote before the crash are on the disk isn't enough: the master file table can be fine and show your files as on the disk, but one day you will go to open those files and find that parts of them are corrupt.

Professional RAID cards that use write back caching use RAM on the RAID card itself (separating it from the OS) so that an OS level crash does not corrupt the data in the card's RAM. But that still means that the cache can be lost in a power outage. Any smart person using a RAID card either has a battery backup on the card to preserve the cache in a power loss or disables the cache altogether.
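For anyone who wants to see why that matters, here is a toy Python model of an acknowledged-but-unflushed write after a crash. It is a deliberately simplified sketch, not how any real controller or OS cache is implemented:

```python
class ToyVolume:
    """Toy model of write caching: 'disk' is what survives a crash,
    'cache' is volatile RAM that vanishes when the machine goes down."""
    def __init__(self, write_back):
        self.write_back = write_back
        self.disk, self.cache = {}, {}

    def write(self, key, data):
        if self.write_back:
            self.cache[key] = data      # acknowledged while only in RAM
        else:
            self.disk[key] = data       # write-through: on disk before the ack
        return "ack"                    # what the application sees either way

    def flush(self):
        self.disk.update(self.cache)    # background destage to the platters
        self.cache.clear()

    def crash(self):
        self.cache.clear()              # power loss: RAM contents are gone

wb = ToyVolume(write_back=True)
wb.write("report.doc", "v2")            # the app gets an ack...
wb.crash()                              # ...but the box dies before the flush
print(wb.disk.get("report.doc"))        # -> None: acknowledged data is lost
```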
 
Joined
May 1, 2007
Messages
320 (0.05/day)
System Name Windows XP Pro 32 bit
Processor Intel Core 2 Quad Q6600
Motherboard Asus Striker II Formula
Cooling XIGMATEK HDT-S1283 cpu heatsink
Memory 2x 2gb Crucial Ballistix ram
Video Card(s) Geforce 280 GTX 1gb
Storage Seagate 200 GB - ST3200820AS / 120gb Western Digital HD
Display(s) ViewSonic VX922
Case Antec 900
Audio Device(s) Creative X-Fi Titanium Fatal1ty Professional
Power Supply HIPER 880W Power Supply
Thanks, sir. So to verify, do I want to keep write-cache buffer flushing on and set the cache mode to off or read-only?
 
Joined
Dec 16, 2010
Messages
1,662 (0.34/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, 2 x 512GB Samsung PM981a, 4 x 4TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
Thanks, sir. So to verify, do I want to keep write-cache buffer flushing on and set the cache mode to off or read-only?

If you have system memory to spare, use read-only, since that has no chance for data loss and can speed up reading of frequently accessed files. The only reason you would want to disable the cache completely is if your system was very low on memory.
 
Joined
May 1, 2007
Messages
320 (0.05/day)
System Name Windows XP Pro 32 bit
Processor Intel Core 2 Quad Q6600
Motherboard Asus Striker II Formula
Cooling XIGMATEK HDT-S1283 cpu heatsink
Memory 2x 2gb Crucial Ballistix ram
Video Card(s) Geforce 280 GTX 1gb
Storage Seagate 200 GB - ST3200820AS / 120gb Western Digital HD
Display(s) ViewSonic VX922
Case Antec 900
Audio Device(s) Creative X-Fi Titanium Fatal1ty Professional
Power Supply HIPER 880W Power Supply
If you have system memory to spare, use read-only, since that has no chance for data loss and can speed up reading of frequently accessed files. The only reason you would want to disable the cache completely is if your system was very low on memory.
Thanks. When I read what it did, I thought it would use the SSD part of my hard drives for the cache instead of my actual RAM.
I've set it back to read-only, as my data is important to me (and so is the time it takes to configure my computer back :p).
 

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,193 (1.50/day)
Location
Charleston, SC
System Name Tower of Power / Sechs
Processor i7 14700K / i7 5820k @ 4.5ghz
Motherboard ASUS ROG Strix Z690-A Gaming WiFi D4 / X99S GAMING 7
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / CORSAIR Hydro Series H100i Extreme
Memory CORSAIR Vengeance LPX 32GB (2 x 16GB) DDR4 3600 / G.Skill DDR4 2800 16GB 4x4GB
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / ASUS TUF Gaming GeForce RTX 3070 V2 OC Edition
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / Samsung 870 PRO 500GB M.2
Display(s) Samsung 32" Odyssy G5 Gaming 144hz 1440p, ViewSonic 32" 72hz 1440p / 2x ViewSonic 32" 72hz 1440p
Case Phantek "400A" / Phanteks “Enthoo Pro series”
Audio Device(s) Realtek ALC4080 / Azalia Realtek ALC1150
Power Supply Corsair RM Series RM750 / Corsair CXM CX600M
Mouse Glorious Gaming Model D Wireless / Razer DeathAdder Chroma
Keyboard Glorious GMMK with box-white switches / Keychron K6 pro with blue swithes
VR HMD Quest 3 (128gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 10 Pro x64
Benchmark Scores Yes
Yeah, it's only advisable to use write back on a hardware RAID controller with a BBU (Battery Backup Unit) in case of power loss. I would suggest getting a hardware RAID card. I think you would see a performance increase even using SATA II, but the 640L is SATA III.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
What you have done is enable write back caching on the OS. This is very dangerous for data security, especially when it is run at the OS level. If you care about your data, do not use it.

I would say that this is true unless you are running server-grade hardware on a UPS. I'm confident that a Xeon and ECC memory (running 100% stock, like server hardware should) will be fine. Write-through would probably be optimal, because you still cache the data even though you wait for writes to flush to the array.
 
Joined
Dec 16, 2010
Messages
1,662 (0.34/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, 2 x 512GB Samsung PM981a, 4 x 4TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
I would suggest getting a hardware RAID card. I think you would see a performance increase even using SATA II, but the 640L is SATA III.

There might be a performance increase going from SATA II to SATA III, but given the same SATA spec, those RocketRAID cards will not be any better than integrated RAID. Those low end RocketRAID cards are software RAID as well. It doesn't matter either way because RAID 10 involves no parity calculations.

He has an X48 system, so the only benefit would be that he can plug the RocketRAID card into one of the PCIe 2.0 slots on the northbridge (PCIe 2.0 x4 = 2GB/s) and not be constrained by the 1GB/s uplink from the southbridge. Of course, I am not sure about the performance of SSHDs and whether 4 of them can actually break 1GB/s. If they can't, then the bottleneck is the drives themselves and the add on card is pointless.
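As a rough sanity check, here is a quick Python back-of-the-envelope. The ~190 MB/s per-drive sequential figure is an assumption rather than a measurement, and bursts served out of the SSHDs' flash cache could run faster:

```python
# Back-of-the-envelope bandwidth check; the 190 MB/s per-drive figure is an
# assumed sequential rate for a 7200 RPM SSHD, not a measured number.
per_drive_mb_s = 190
drives = 4

southbridge_uplink = 1000      # ~1 GB/s uplink from the southbridge
pcie2_x4 = 2000                # ~2 GB/s for a PCIe 2.0 x4 slot

aggregate = per_drive_mb_s * drives
print(f"4 drives sequential ~ {aggregate} MB/s")
print("Exceeds southbridge uplink:", aggregate > southbridge_uplink)
print("Exceeds PCIe 2.0 x4 slot  :", aggregate > pcie2_x4)
```

With those assumptions the four drives top out around 760 MB/s, under the 1 GB/s uplink, which would make the add-on card pointless for sequential throughput.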

I would say that this is true unless you were running server grade hardware on a UPS. I have confidence in a Xeon and ECC memory (running 100% stock, like server hardware should,) will be fine. Write-through would probably be optimal, because you still cache it even if you wait for writes to flush to the array.

I agree; your situation is different because you are taking every precaution to prevent system crashes. My storage server with a RAID card has no battery backup, so I have all its write caches (including the on-disk caches) disabled.
 
Joined
Nov 4, 2005
Messages
11,655 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
I run it, but I'm willing to live with the risks (I back up), and the Agility drives don't feature a cache that needs flushing anyway...


Any benchmark numbers for us to ogle?
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
I agree; your situation is different because you are taking every precaution to prevent system crashes. My storage server with a RAID card has no battery backup, so I have all its write caches (including the on-disk caches) disabled.

Even without a BBU or UPS, write-through is useful for reads. There is no risk of data loss because data is written to the disk synchronously with the cache. It's supposed to improve read speeds when recently written data is accessed shortly after being written.
 