# does this make sense?

#### freeboy

As my temps climb in a water-cooled system, there is a greater delta (hopefully I am using that word correctly) between the ambient temperature of the air flowing through the rad and the coolant. Does it stand to reason that as the temps rise in my loop, more heat wattage is being dissipated at the same fan speeds because of this? Or is the effect so fractional, even with an efficient rad and system, that the difference makes little difference? I understand it's theoretical, but I have a fairly efficient rad and good shrouding, though only three fans and not six, and I am trying to judge the system.

In other words, am I pulling more heat out of the loop, in watts dissipated, at a loop temp of 85C than at 55C?
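A rough way to put numbers on that intuition, assuming a simple linear radiator model `Q = UA * (T_coolant - T_ambient)`. The `UA` value (overall heat transfer coefficient times area, in W/K) here is made up for illustration; the real figure depends on the specific rad, fan speed, and airflow:

```python
# Simple linear radiator model: heat shed is proportional to the
# delta between coolant and incoming air. UA is an assumed,
# illustrative value, not measured from any real radiator.

UA = 10.0         # watts dissipated per kelvin of delta (assumed)
T_AMBIENT = 25.0  # air temperature entering the rad, degrees C

def watts_dissipated(t_coolant_c: float) -> float:
    """Heat shed by the radiator at a given coolant temperature."""
    return UA * (t_coolant_c - T_AMBIENT)

print(watts_dissipated(55.0))  # 300.0 W at a 30 C delta
print(watts_dissipated(85.0))  # 600.0 W at a 60 C delta
```

So under this model, doubling the delta doubles the watts shed at the same fan speed, which is why the loop stops climbing once the delta is big enough to match the heat input.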

#### MxPhenom 216

##### Corsair Fanboy
If your hardware is getting hotter, that heat is being transferred and carried by the water in your loop to the radiator. So yes, as your temperatures rise toward equilibrium you will have more heat to dissipate.

You will hit equilibrium when your temperatures reach a point where they are neither rising nor falling.

#### The Von Matrices

If you keep the system identical except for the heat input from the processor/graphics card/etc., then this is true.

But you can have a high loop temperature without dissipating any more heat. For example, if you lower the speed of the pump, the coolant temperature will rise, but the system is not dissipating heat at a faster rate.
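That caveat can be sketched with the same kind of linear model: at steady state the radiator sheds exactly what the components put in, whatever temperature the coolant settles at. A lower pump speed shows up as a lower effective UA, so the loop runs hotter while still dissipating the same wattage. The UA and wattage figures below are illustrative assumptions:

```python
# At equilibrium, P_in = UA * (T_eq - T_ambient). Solving for T_eq
# shows a hotter loop does not mean more watts removed if the heat
# input is unchanged. All numbers are assumed for illustration.

T_AMBIENT = 25.0  # degrees C
P_IN = 300.0      # component heat load, watts (assumed)

def equilibrium_temp(ua: float) -> float:
    """Coolant temperature at which dissipation balances input."""
    return T_AMBIENT + P_IN / ua

# Full pump speed: higher effective UA (assumed), cooler loop
print(equilibrium_temp(10.0))  # 55.0 C, shedding 300 W
# Reduced pump speed: lower effective UA, hotter loop, same 300 W shed
print(equilibrium_temp(5.0))   # 85.0 C, still shedding 300 W
```

Both cases dissipate exactly 300 W at equilibrium; only the temperature at which the balance is struck changes.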

#### MxPhenom 216

##### Corsair Fanboy
> If you keep the system identical except for the heat input from the processor/graphics card/etc., then this is true.
>
> But you can have a high loop temperature without dissipating any more heat. For example, if you lower the speed of the pump, the coolant temperature will rise, but the system is not dissipating heat at a faster rate.

Or if you don't have the right fan speeds for your radiator.

#### freeboy

I am assuming all things equal... thanks.