Anything and everything is capable of error. What matters is the margin (MTBF, if you will): the longer a system can go without an error, the better it is for science. When building a supercomputer, for instance, you buy parts that are proven to work at a given spec, you verify in a separate machine that a processor isn't bad, then you plug it into the server and run it until it goes bad. You do everything possible to keep errors to a bare minimum; anything that tends to cause errors is avoided.

One final query: is a box-stock computer capable of error? If so, it would also be liable to produce inaccurate/erroneous results, correct?
What we're talking about here is an error once a month versus an error once a year. That would be a 12:1 ratio. The bigger the ratio, the better it is for science/computing/whatever.
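The ratio above can be sketched in a few lines; the MTBF figures here are illustrative placeholders, not measurements of any real machine:

```python
# Hypothetical mean-time-between-failure (MTBF) comparison.
# These numbers are made-up illustrations, not measured values.
HOURS_PER_MONTH = 730    # average hours in a month (8760 / 12)
HOURS_PER_YEAR = 8760    # hours in a year

mtbf_errs_monthly = HOURS_PER_MONTH  # machine that errors roughly once a month
mtbf_errs_yearly = HOURS_PER_YEAR    # machine that errors roughly once a year

ratio = mtbf_errs_yearly / mtbf_errs_monthly
print(f"MTBF ratio: {ratio:.0f}:1")  # → MTBF ratio: 12:1
```

The bigger that ratio, the longer a run can go before you have to throw results out and start over.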
Regardless of how big or small the ratio is, all science should be double-checked--if not by the computers that did it in the first place, then by someone else through peer review (in which case you get laughed at).