Tuesday, April 7th 2009
Video of the Day: 23 GeForce GTX 295 Video Cards Installed in a Single Server Rig
Wanna see what crazy looks like? How about 23 NVIDIA GeForce GTX 295 video cards installed in a single rig. Yep, that's not a typo. Named Atlas Folder, this wholesale pack of G200b GPUs (dubbed a folding server farm) is like an industrial grinder for Folding@Home. Watch the cool YouTube video here and feel free to leave a comment. I bet even NVIDIA engineers dream of such a monstrous distributed computing station.
Source:
MaximumPC
62 Comments on Video of the Day: 23 GeForce GTX 295 Video Cards Installed in a Single Server Rig
You can see a discussion here: foldingforum.org/viewtopic.php?f=38&t=9340&start=0
and here: foldingforum.org/viewtopic.php?f=38&t=9011
Jason
On a daily basis, how much time do you spend managing all your clients? At the moment I have only 6 clients, and sometimes more than one goes down while I'm at work.
The power consumption of a single shelf is ~1,125 watts. I have 5 shelves plus more misc equipment, so power consumption is probably around 7,000 watts continuous. I haven't added it all up lately.

Well, thanks, but there is an important difference. True supercomputers use 64-bit floating-point math (double precision), while Folding@Home makes do with 32-bit floating-point math (single precision). So yes, building supercomputers is getting cheaper fast, but there's still no free (cheap) lunch. If you need double precision, you still must pony up the cash.

Not much... I can check for dead EUE clients anywhere from my iPhone by going here; it's updated every 60 seconds. If I need to make a correction, all of the machines are controllable through LogMeIn, and I have LogMeIn Ignition on my iPhone, so I can log into the servers and tweak things as necessary.
I made a post a couple of weeks ago about how I control and implement all of the clients. I think it's kinda slick even if I do say so myself, but necessity is the mother of invention. Controlling all the clients individually got old very, very fast. I had to think of a better way. You can see my post about that here.
Thanks for the interest guys,
Jason
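The shelf-level power math quoted above works out roughly like this. A back-of-the-envelope sketch: the shelf count and ~1,125 W per-shelf draw are the figures Jason gives; the misc overhead and the electricity rate are assumptions filled in to make the numbers meet his ~7,000 W continuous estimate.

```python
# Back-of-the-envelope power estimate for the rack described above.
watts_per_shelf = 1_125   # measured draw of a single shelf (~1,125 W, per the comment)
shelves = 5
misc_overhead = 1_375     # assumed draw of networking/misc gear (not stated exactly)

shelf_total = watts_per_shelf * shelves    # 5,625 W from the shelves alone
rack_total = shelf_total + misc_overhead   # ~7,000 W continuous, matching the estimate

# At an assumed residential rate of ~$0.12/kWh, continuous operation costs roughly:
kwh_per_day = rack_total / 1000 * 24       # ~168 kWh/day
cost_per_day = kwh_per_day * 0.12          # ~$20/day

print(f"Shelves: {shelf_total} W, rack: ~{rack_total} W, ~${cost_per_day:.0f}/day")
```

At those numbers the power bill alone runs to several hundred dollars a month, which puts the "no free (cheap) lunch" remark in perspective.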
Did I understand correctly that you are driver-limited? Therefore a max of 8 GPUs per PC, so you would need a minimum of 3 independent PCs to run your 24 cards? Or does the x2 thing knock that down to just 4 cards (8 GPU dies)?
PS. Write to NVIDIA, show them your stuff, get them to fix the drivers, and have them sponsor your rack.
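The card-vs-die arithmetic behind that question can be sketched out both ways. The 8-GPU-per-machine driver limit is the figure the comment cites; whether it counts physical cards or GPU dies is exactly the open question, so both interpretations are shown (and the rig actually holds 23 cards, not 24):

```python
import math

cards = 23         # GTX 295s in the rig described in the article
dies_per_card = 2  # each GTX 295 is a dual-GPU (x2) card
driver_limit = 8   # GPUs per machine, per the comment above

# If the driver limit counts GPU dies: 23 cards x 2 dies = 46 dies.
machines_by_die = math.ceil(cards * dies_per_card / driver_limit)   # 46 / 8 -> 6 PCs

# If the driver limit counts physical cards:
machines_by_card = math.ceil(cards / driver_limit)                  # 23 / 8 -> 3 PCs

print(f"Counting dies: {machines_by_die} PCs; counting cards: {machines_by_card} PCs")
```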
I wonder what it will actually be crunching.
Just kidding, and that's one hell of a folding rig you have!
It is neat to see all that awesome hardware (especially the GPUs) working hard to find a cure :toast:
For those who want to learn more about his cause, check out the link to his website at the bottom of his sig.
I'm skeptical about this F@H. Do you know exactly what data is processed?
Stanford says it's protein folding and other molecular dynamics, but is it really?
That being said, looks like I've found another admin who names a group of server units after characters in Atlas Shrugged. :p Ah, critics.
Cures take a long, long time to develop.
I wonder if it is i7s...
I think you can also run WCG while folding...
As for connecting them all to the same monitor, I'm sure the rack has a KVM, which allows multiple machines to share the same keyboard, monitor, and mouse. Once set up, though, the machines should be able to run headless, with remote access via LogMeIn or Remote Desktop.