
IBM’s bionic computers bleed electronic blood

Mar 26, 2010
7,680 (2.69/day)
Jakarta, Indonesia
System Name micropage7
Processor Intel G4400
Motherboard MSI B150M Bazooka D3
Cooling Stock ( Lapped )
Memory 16 GB Team Xtreem DDR3
Video Card(s) Nvidia GTX460
Storage Seagate 1 TB, 500 GB and SSD A-Data 128 GB
Display(s) LG 19 inch LCD Wide Screen
Case HP dx6120 MT
Audio Device(s) Stock
Power Supply Be Quiet 600 Watt
Software Windows 7 64-bit
Benchmark Scores Classified

In the 1950s, the highest priority for national defense was the Air Force ballistic missile program. The ICBM program, and with it the entire space program and eventually the internet, was made possible by the IBM 360 mainframe computer and its immediate predecessors. Last week at its Zurich lab, IBM gave a media tour of some of the company's latest concepts, concepts it believes will be just as revolutionary as the 360-era computing devices. The key to making supermachines 10,000 times more efficient, IBM says, is to build bionic computers cooled and powered by electronic blood.

Bytes and flops can be handy measures for characterizing file sizes or gauging how long a simulation might take to give you an answer. If, however, you are budgeting a datacenter, or even just a processor or battery for a smartphone, a more useful measure might be operations per joule. If you are doing something even more exotic, like designing machines to navigate the highways and byways of a circulatory system, operations per liter may be more relevant still. As a recent article astutely noted, an important element in the units we choose to describe our computing efforts is the need to capture a sense of direction and, more importantly, of progress.

IBM’s stated goal is a one-petaflop computer in a 10-liter volume. In other words, it wants to shrink a petaflop machine that would fill a room today down to a desktop. Today’s top dog, China’s Tianhe-2, has a petabyte of memory and achieves 33.86 petaflops (quadrillion floating-point calculations per second) using 32,000 Xeon processors and 48,000 Xeon Phi accelerators. The machine runs Kylin Linux and requires some 17.8 megawatts of power. Power draws on that scale previously led IBM to develop a prototype system called Aquasar, which uses branching tubes to deliver liquid coolant right where it is needed. IBM specced this system with some unusual units: 7.9 trillion operations per second per kilogram of carbon dioxide released into the atmosphere.
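The figures above make the operations-per-joule and operations-per-liter metrics easy to work out. A minimal sketch, using only the Tianhe-2 and IBM numbers quoted in the article:

```python
# Back-of-the-envelope efficiency figures from the article's numbers:
# Tianhe-2 at 33.86 petaflops on 17.8 MW, and IBM's goal of one
# petaflop in a 10-liter volume.

TIANHE2_FLOPS = 33.86e15    # floating-point operations per second
TIANHE2_POWER_W = 17.8e6    # sustained power draw in watts

# Operations per joule: each watt is one joule per second, so this is
# simply flops divided by watts.
ops_per_joule = TIANHE2_FLOPS / TIANHE2_POWER_W
print(f"Tianhe-2: {ops_per_joule / 1e9:.2f} gigaflops per joule")  # ~1.90

# IBM's density target expressed as operations per second per liter.
target_flops = 1e15
target_volume_liters = 10
print(f"IBM target: {target_flops / target_volume_liters:.1e} flops per liter")
```

Roughly 1.9 gigaflops per joule for Tianhe-2, which is the baseline IBM's claimed 10,000× efficiency improvement would be measured against.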

In a previous article we suggested that, ultimately, resources like power and cooling for high-density computing need to be sourced by volume rather than supplied through costly wires and power-ground planes on a board. The logical extension of that design rationale (delivering these resources through an increasingly fractalized system of tubes) would be to immerse the entire computer. While there are technical hurdles to achieving something like that at scale, IBM has made significant advances in a technology it calls a redox flow battery. IBM’s flow-battery computer has a microfluidic chip that uses vanadium electrolytes in different oxidation states to generate voltages ranging from 0.5 to 3 volts. It is capable of delivering up to a watt of power per square centimeter of board surface, and potentially provides a significant amount of heat removal.
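The quoted flow-battery figures imply some simple electrical consequences. A hedged sketch, assuming the article's 1 W/cm² power density and 0.5–3 V range; the derived current densities and the 600 cm² board area are illustrative assumptions, not figures from IBM:

```python
# What the article's flow-battery numbers imply: at a fixed power
# density, lower cell voltage means proportionally higher current
# density (P = V * I, so I = P / V).

POWER_DENSITY_W_PER_CM2 = 1.0   # article's "watt per square centimeter"

for volts in (0.5, 1.5, 3.0):   # article's quoted 0.5-3 V range
    amps_per_cm2 = POWER_DENSITY_W_PER_CM2 / volts
    print(f"at {volts} V: {amps_per_cm2:.2f} A/cm^2")

# Hypothetical example: a 600 cm^2 board fully covered by the
# electrolyte network (board size is an assumption for illustration).
board_area_cm2 = 600
print(f"board budget: {POWER_DENSITY_W_PER_CM2 * board_area_cm2:.0f} W")
```

The trade-off it highlights is why the low end of the voltage range is the demanding case: at 0.5 V the electrolyte channels would have to carry 2 A for every square centimeter served.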

Grams of CO2 as a computer specification
IBM’s new yardstick of flops per kilogram of CO2 appears to assume that the CO2 cost of building the computer is negligible compared to the cost of running it. For life based on cells, that assumption can be somewhat justified, because cells use only a little more energy when they divide or grow than they do in normal operation. However, until we can construct computers as easily as we can inoculate a vat of broth with a rapidly replicating bacterium, construction costs will probably remain a major factor.

To draw a comparison, a commodity like steel might have little or no CO2 cost associated with its operation, but significant costs for its construction. Melting and processing the metal for cast girders might take 1,300 megajoules and generate 235 kg of CO2 per tonne of steel. For computers, efficiency has increased as they have proliferated over time, but the two effects clearly have not offset one another. In other words, the average total computing need and budget per capita has grown over time, as has computing’s share of our total energy budget. Perhaps we need a better measure for computing, one that captures what we actually want to do rather than how much of something we use to do it.
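The steel comparison can be made concrete with the article's per-tonne figures. A minimal worked example; the five-tonne order size is a hypothetical chosen for illustration:

```python
# Embodied (construction) cost of steel, per the article's figures:
# 1,300 MJ of processing energy and 235 kg of CO2 per tonne.

ENERGY_MJ_PER_TONNE = 1300
CO2_KG_PER_TONNE = 235

tonnes = 5.0  # hypothetical girder order, for illustration only
print(f"{tonnes} t of steel: {tonnes * ENERGY_MJ_PER_TONNE:.0f} MJ, "
      f"{tonnes * CO2_KG_PER_TONNE:.0f} kg CO2")

# CO2 intensity of the processing energy itself:
print(f"{CO2_KG_PER_TONNE / ENERGY_MJ_PER_TONNE:.3f} kg CO2 per MJ")
```

Unlike a running datacenter, all of that CO2 is emitted before the girder does a single day of useful work, which is exactly the embodied cost the flops-per-kilogram-CO2 yardstick leaves out.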



PC Gaming Enthusiast
Jul 25, 2008
9,490 (2.74/day)
Louisiana -Laissez les bons temps rouler!
Processor Core i7-3770k 3.5Ghz, O/C to 4.2Ghz fulltime @ 1.19v
Motherboard ASRock Fatal1ty Z68 Pro Gen3
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F CPU cooler
Memory 2x 8GB Mushkin Redline DDR-3 1866
Video Card(s) MSI GTX 980 Ti Gaming 6G LE
Storage 1x 250GB MX200 SSD; 2x 2TB WD Black; 1x4TB WD Black;1x 2TB WD Green (eSATA)
Display(s) HP 25VX 25" IPS @ 1920 x 1080
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Logitech G500
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
VERY interesting!! Thanks for this fascinating article. It appears we are getting closer to science fiction every day!