
Asetek to Offer Liquid Cooling Solutions for Data Centers

Discussion in 'News' started by Cristian_25H, Jan 2, 2012.

  1. Cristian_25H

    Cristian_25H News Poster

    Joined:
    Dec 6, 2011
    Messages:
    4,244 (4.43/day)
    Thanks Received:
    1,136
    Location:
    Still on the East Side
    Asetek Inc., the world's leading supplier of liquid cooling solutions for computers, announced today that it will offer a range of liquid cooling solutions to address a diverse set of cooling challenges facing modern HPC clusters and data centers. These challenges range from reducing latency in high-frequency trading applications, to increasing performance and density in servers, HPC clusters and data centers, to decreasing the amount of energy used for cooling. All of these solutions utilize the company's proven CPU and GPU cooling technology, which is commercially deployed today in hundreds of thousands of computers around the world.


    The benefit of Asetek liquid cooling technology is its efficiency in removing heat directly from processors and moving that heat to an optimal place for transferring it to the environment. The success of Asetek's technology results from its reliability, affordability and suitability for installation in high volume computer manufacturing. Asetek data center liquid cooling solutions utilize this technology to provide three general levels of server cooling:

    - Internal Loop Liquid Cooling enables the use of the fastest processors, including high wattage processors, in high density servers.

    - Rack CDU Liquid Cooling removes processor and/or GPU heat from rack servers and blades out of the data center without the use of traditional computer room air conditioners or water chillers, enabling extreme densities at the server, rack and data center level. The strongest value proposition, however, is that the solution uses free outside ambient air cooling, allowing around 50% power savings on data center cooling costs.

    - Sealed Server Liquid Cooling removes all server heat from the data center via liquid; literally no air from inside the data center is used for server cooling. This solution enables high density with high performance processors and ambient room temperature server cooling.

    "We have studied the server market and engaged with our customers. While much of what is written suggests that the problem of data center cooling is monolithic, we have discovered the need is for a diverse set of solutions to meet specific data center performance, density and efficiency objectives," said André Sloth Eriksen, Founder and CEO of Asetek. "Using proven Asetek technology to engineer a range of cooling solutions gives Asetek a unique ability to address the wide diversity of cooling challenges that exist in the HPC and data center market today."

    Leveraging its existing CPU and GPU liquid cooling technology brings both proven reliability and high-volume cost advantages to Asetek's data center liquid cooling solutions. Low-pressure system design is a cornerstone of Asetek's proven reliability: low pressure puts less stress on joints and connections, substantially reducing failures. The company's integrated pump–cold plate units for CPU and GPU cooling are well suited for supplying redundancy in servers with multiple processors. All liquid channels are helium integrity tested, and all systems are liquid filled and sealed at the factory for their lifetime, eliminating the need for any liquid handling by the server OEM or data center operator. Additional information on Asetek data center cooling solutions is available at http://www.asetek.com/markets/data-centers.
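    The "around 50% power savings on cooling" claim can be sanity-checked with a back-of-the-envelope PUE (Power Usage Effectiveness) calculation. A minimal sketch, assuming illustrative PUE figures (1.8 for a traditional CRAC-cooled room, 1.4 with free-air liquid cooling) that are not taken from Asetek's announcement:

    ```python
    def cooling_overhead(pue: float) -> float:
        """Non-IT overhead per watt of IT load; PUE = total power / IT power."""
        return pue - 1.0

    def cooling_savings(pue_before: float, pue_after: float) -> float:
        """Fractional reduction in overhead power when PUE improves."""
        return 1.0 - cooling_overhead(pue_after) / cooling_overhead(pue_before)

    # Example: CRAC-cooled room at PUE 1.8 vs free-air liquid cooling at PUE 1.4.
    # Overhead drops from 0.8 W to 0.4 W per IT watt, i.e. a 50% saving.
    print(f"{cooling_savings(1.8, 1.4):.0%}")  # -> 50%
    ```

    The point of the sketch is only that a modest PUE improvement halves the cooling overhead; the actual savings depend on climate, load, and the baseline facility.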
  2. cdawall where the hell are my stars

    Joined:
    Jul 23, 2006
    Messages:
    20,648 (7.07/day)
    Thanks Received:
    2,971
    Location:
    some AF base
    Those look like some mean rads/fans. I also spot a MCP355 in that first photo.
  3. MoonPig

    MoonPig

    Joined:
    Aug 7, 2008
    Messages:
    5,725 (2.63/day)
    Thanks Received:
    886
    Location:
    Wakefield, UK
    Duct tape looks interesting...

    Do want though.
  4. The Terrible Puddle

    Joined:
    Jun 2, 2011
    Messages:
    108 (0.09/day)
    Thanks Received:
    9
    Fans must be damn noisy.
  5. cdawall where the hell are my stars

    Joined:
    Jul 23, 2006
    Messages:
    20,648 (7.07/day)
    Thanks Received:
    2,971
    Location:
    some AF base
    Early build, I'm sure; that one doesn't look quite done yet.


    And it's an enterprise server, so they don't really care.
  6. TiN New Member

    Joined:
    Aug 28, 2009
    Messages:
    184 (0.10/day)
    Thanks Received:
    68
    Location:
    Taipei
    Most watercooling builds from modders look much better/cleaner :D
    I would not install this in my servers, no way :D

    One of the main ideas of watercooling is to reduce noise, which doesn't matter at all in the server market :)
  7. sy5tem

    Joined:
    Nov 13, 2004
    Messages:
    447 (0.13/day)
    Thanks Received:
    48
    Location:
    Canada/quebec/Montreal
    That's very true lol, I have 3 servers at iWeb and 2 at my office... and they are all very noisy, but I don't care... iWeb is far from me, and in my office they're in an isolated, cooled room :p so I don't see the point of WC servers (I use WC for my gaming system)
  8. lauri_hoefs New Member

    Joined:
    May 4, 2011
    Messages:
    24 (0.02/day)
    Thanks Received:
    6
    Location:
    Korpi
    Well, as you say, with servers noise is usually not an issue, as big iron really requires dedicated server rooms anyway. But I don't think noise reduction is what this is for.

    I think the benefit is that with water the temperature delta is much smaller compared to air cooling, so hot components could gain longer lifespans. This could be a huge benefit with mission-critical hardware.

    And you could conduct most of the heat out of the server room through a couple of water pipes, instead of huge air ducts, so that might help with server room placement issues. (edit: Take a look at the last picture, the blade chassis, and you get the idea.)
  9. overclocking101

    overclocking101

    Joined:
    Apr 8, 2009
    Messages:
    2,886 (1.49/day)
    Thanks Received:
    405
    Location:
    vermont
    my thoughts exactly
  10. TiN New Member

    Joined:
    Aug 28, 2009
    Messages:
    184 (0.10/day)
    Thanks Received:
    68
    Location:
    Taipei
    lauri_hoefs

    Before thinking about watercooling in mission-critical servers, answer this question: what temperatures are we talking about when both the watercooling loop and the heatsinks sit in the same chassis, while big bulky rads occupy valuable space in these Asetek designs? Also, for MC tasks, service/maintenance time must be minimized. In case of a fan failure, it takes a few minutes to replace the fan, and most servers don't even need a reboot because fans are usually hot-swappable. Even if a CPU fan fails, MC servers may allow hot-swap replacement of the CPU/cooling. What about a watercooling loop? :)

    Understand me right, I love Asetek WBs and phase-change units, but servers... different story :)
  11. brandonwh64

    brandonwh64 Addicted to Bacon and StarCrunches!!!

    Joined:
    Sep 6, 2009
    Messages:
    18,440 (10.36/day)
    Thanks Received:
    5,995
    Location:
    Chatsworth, GA
    Most servers are in air-conditioned rooms, so this would not really be useful to the major companies that own such rooms. Our server rooms are cooled to 65°F 24/7; a simple air cooler would suffice.
    Crunching for Team TPU
  12. lauri_hoefs New Member

    Joined:
    May 4, 2011
    Messages:
    24 (0.02/day)
    Thanks Received:
    6
    Location:
    Korpi
    I see what you mean, but I don't think this would make replacing fans or other parts any harder or slower, if designed well, of course. I don't think putting the rads inside the casings is a good idea, or that it would be a good idea to replace cooling systems in existing server rooms.

    But the blades, they are a different story. In existing blades the fans are either hotswappable ones on the blade cards, or on the chassis. Why would it be different with water cooling? And if you look at the blade chassis, it looks like it's actually designed to be hotswappable and easy to maintain, but of course we can't be sure without any extra info.

    Yes, I see several problems with the concept too, but I also see why this could be feasible.

    Ideally, taking a little longer to replace a blade card could pay off if there was more uptime due to fewer component failures :)
