
Why don't cases separate CPU from GPU for thermal reasons?

As a watercooler I don't have that problem, but couldn't something like mounting the GPU on a riser, isolated from the rest of the PC, be a working solution?

I believe the original Q is exploring the idea of a true dual-compartment design, one that offers real thermal isolation. This means each compartment would have its own dedicated airflow with separate intake and exhaust. What we have today is a good number of mini-ITX cases that let you mount the GPU behind the motherboard to somewhat mimic that dual-compartment design, but these chambers are not physically sealed off. You still end up with CPU and GPU exhaust heat mixing within the case, and there's no distinct, configurable cooling for the individual chambers.
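For a rough sense of scale on why per-chamber airflow is attractive, here's a minimal back-of-envelope sketch (Python, with hypothetical wattages and airflows, not measurements): the steady-state temperature rise of the air leaving a sealed chamber is just the heat dumped in divided by the heat capacity of the air flowing through, which is exactly the quantity separate intake/exhaust would let you size per chamber.

```python
# Back-of-envelope: bulk air temperature rise in an isolated chamber.
# Assumes steady state and perfect mixing; wattages/CFM are hypothetical.

CFM_TO_M3S = 0.000471947  # 1 cubic foot per minute in m^3/s
RHO = 1.2                 # air density, kg/m^3 (sea level, ~20 °C)
CP = 1005.0               # specific heat of air, J/(kg*K)

def exhaust_delta_t(watts: float, cfm: float) -> float:
    """Temperature rise of the air passing through a sealed chamber."""
    mass_flow = RHO * cfm * CFM_TO_M3S   # kg/s
    return watts / (mass_flow * CP)      # °C rise over intake

# Hypothetical chambers: a 350 W GPU vs a 150 W CPU.
print(f"GPU chamber @ 100 CFM: +{exhaust_delta_t(350, 100):.1f} °C")  # ~6.1 °C
print(f"CPU chamber @  60 CFM: +{exhaust_delta_t(150, 60):.1f} °C")   # ~4.4 °C
```

The exact numbers don't matter; the point is that each chamber's fans could be sized to its own heat load instead of everything sharing one mixed volume.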

In general, on the back of this thread:

It's not a necessary requirement, but I kinda like the concept. I'm not sure how I feel about the increased case footprint and thickness this design seems to require, but it defo seems like an interesting/fun option to have. All it really takes is one fearless maverick of a manufacturer to roll the dice, with execution and attention to detail prioritised, and boom - Bob's done the dirty and come out clean, with the dice nailing six every time. People are drawn to novelty, whether practical or not - I'm betting it would easily draw attention! lol, if circus-level RGB, pricier AIOs on 65 W processors and huge fish-tank cases with tiny components can pull it off - not forgetting the mainstream push of LCD screens onto cases, mobos, air coolers, AIOs, fans, GPUs, etc. (the Computex 2025 madness concurs) - surely something a little more useful, with sealed-off compartments for improved cooling, can't be that wild of a concept to take off. I say bring it!

@Pets4Ever, beyond the explored concept, something like this also looks promising:

 
Sure you do, you just accept the higher temps on components you do not care about as much as what you are watercooling :D
Weeell, maaybe... I'll get a new case later this month and add a third radiator. I know what you mean, but I can get the heat off the CPU/GPU even better: :toast:
 
Weeell, maaybe... I'll get a new case later this month and add a third radiator. I know what you mean, but I can get the heat off the CPU/GPU even better: :toast:
AIO is just a bit better than air.. I should build a loop. How hard can it be..

:nutkick:
 
AIO is just a bit better than air.. I should build a loop. How hard can it be..

:nutkick:
Just make sure that the fittings are tight - I fucked up two motherboards and an R9 290 when I had a leak, on two occasions :rolleyes:

I guess that routing the hoses/tubes nicely is the hardest part; that takes the most time for me.
 
The two main sources of heat are the CPU and GPU.
This part right here, disagreement in philosophy = disagreement in engineering.
Some of us already separate these heat sources with blowers or physical partitions.
So on that point it hasn't been a problem for ages. There are some bigger issues though.
Like what do you do about runaway chipset temps? Gotta put something on it.
Luckily my X570 board has a fan or I'd be in trouble.
 
This part right here, disagreement in philosophy = disagreement in engineering.
Some of us already separate these heat sources with blowers or physical partitions.
So on that point it hasn't been a problem for ages. There are some bigger issues though.
Like what do you do about runaway chipset temps? Gotta put something on it.
Luckily my X570 board has a fan or I'd be in trouble.
IDK, it just seems like there should be a better way to go about it than slapping everything in the same compartment.


Right now I'm working on designing a case that intuitively seems like it would make temp management easier. Essentially, four compartments: motherboard & CPU, PSU, GPU, and drives. Each would be designed to function as a wind tunnel that accommodates fans of appropriate size. By keeping the four compartments less cluttered, the idea is to allow for fewer, larger fans that can quietly provide the cooling needed without requiring high RPMs and creating much noise. If this case ever gets made I'll post an update. Maybe I'll post some shitty Paint sketches later, idk lol
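On the "fewer, larger, slower fans" point, the classic fan affinity laws back up that intuition. A minimal sketch (Python), assuming geometrically similar fans and ignoring static pressure, so treat the output as a rough trend rather than a datasheet:

```python
# Rough fan-affinity sketch of the "fewer, larger, slower fans" idea.
# Classic approximations for geometrically similar fans:
#   airflow Q ~ N * D^3                     (N = RPM, D = diameter)
#   noise change ~ 50*log10(N2/N1) + 70*log10(D2/D1) dB
import math

def rpm_for_same_airflow(rpm1: float, d1: float, d2: float) -> float:
    """RPM a larger fan needs to match a smaller fan's airflow."""
    return rpm1 * (d1 / d2) ** 3

def noise_change_db(rpm1: float, d1: float, rpm2: float, d2: float) -> float:
    """Approximate sound-power change between the two fans."""
    return 50 * math.log10(rpm2 / rpm1) + 70 * math.log10(d2 / d1)

# Hypothetical swap: one 120 mm fan at 1800 RPM -> one 200 mm fan.
rpm2 = rpm_for_same_airflow(1800, 120, 200)
print(f"200 mm fan needs ~{rpm2:.0f} RPM for the same airflow")   # ~390 RPM
print(f"Noise change: {noise_change_db(1800, 120, rpm2, 200):+.1f} dB")  # roughly -18 dB
```

Real 120 mm and 200 mm fans aren't geometrically similar, and restrictive compartments add pressure losses, but the trend - same airflow at far lower RPM and far less noise - is why the large-fan approach is plausible.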
 
I drilled holes for the hoses myself. Some generic black thing, IIRC from CompUSA.
That's an old case. Must post a picture!

I'm trying to tune up my cooling solution, and to me it's stupid that the intake is in the front when new GPUs have this flow-through design that shoves hot air straight into the CPU cooler.
I think cases should use the back as air intake; it could lessen the heat load on the CPU and GPU.
Top-mounted radiators that exhaust through the top can benefit from rear intake, especially when the rad is right next to it.

Yes

imgur.com/a/lian-li-x-noctua-URqWPPU
You might fare better if you reverse the fan flow, especially considering how close the rear fan is to the CPU.


Going back to the OP's question, I think the reason is that there's no practical way to standardize it for air cooling, given the great variety of GPU and CPU cooler designs. I've come to the conclusion that for ATX layouts the best thing is a side panel consisting entirely of intake fans blowing air onto everything.
 
I've come to the conclusion that for ATX layouts the best thing is a side panel consisting entirely of intake fans blowing air onto everything.
Conventional wisdom, and even some testing, seems to indicate that side intakes mess up the front-to-back or back-to-front airflow with turbulence. What kind of setup are you talking about?
 
Conventional wisdom, and even some testing, seems to indicate that side intakes mess up the front-to-back or back-to-front airflow with turbulence. What kind of setup are you talking about?
Literally only side intake airflow, with passive exhaust everywhere else. Imagine taking off your side panel and replacing it with a box fan, or 9 x 120 mm fans, or 4 x 200 mm fans.
 
No "sometimes". All the time. The CPU and GPU must know how to communicate at all times. That requires logic, or computer programming based on established protocols.

That is not what you said earlier. Before, you replied to my comment about ATX boards, claiming the distances between the CPU and PCIe slots vary. No, they don't - at least not to the first PCIe slot, which is where graphics cards typically reside.
I muddled it up from the beginning, sorry.

By logic I was referring to the fact that lanes can come e.g. from the CPU or from the chipset, etc., not the protocol. And by distance, that the electrical traces are, and can be, longer than the straight-line run to the first slot (and likely vary slightly between models as well). So from an "electrical" point of view there are differences. Also, some boards use the second slot... but nitpicks :)

I 100% agree with you on the rest, and let's hope we get fewer furnace components :)

Let’s nerd out more next time
 
That's an old case. Must post a picture!


Top-mounted radiators that exhaust through the top can benefit from rear intake, especially when the rad is right next to it.


You might fare better if you reverse the fan flow, especially considering how close the rear fan is to the CPU.

Going back to the OP's question, I think the reason is that there's no practical way to standardize it for air cooling, given the great variety of GPU and CPU cooler designs. I've come to the conclusion that for ATX layouts the best thing is a side panel consisting entirely of intake fans blowing air onto everything.
That's what I tried, and I got far worse temps.

The way I have it now is the best.
 
That's what I tried, and I got far worse temps.
That's kind of surprising, but possibly you would need to cordon off the area on the top, bottom and side of the CPU cooler, so the top fan won't suck away the intake before it reaches the CPU, or so the GPU exhaust won't mix with the intake.

 
Anywhoo.. even back then I was into high heat applications
Back then, I was searching for ET with 4-6 computers running full throttle, 24/7. Despite them being in the cool basement, I ended up putting in a window AC to keep ambient room temps down. I finally came to my senses when the "level payment" for my electric bill doubled.

Conventional wisdom, and even some testing, seems to indicate that side intakes mess up the front-to-back or back-to-front airflow with turbulence.
"Testing" being the key word there. I've done much with many cases and fan configurations and totally agree, side panel fans actually cause turbulence that disrupts the desired "flow" of air through the case, degrading the cooling. This was verified by measuring via software and with my trusty multimeter using thermal-couplers placed at various points inside the cases.

There was a notable exception. When the side panel fan blew into a tube attached to the side panel, and that tube channeled the air directly onto a downward-firing CPU cooler or directly onto the GPU's cooler, case cooling was not impacted and processor cooling improved. But note this was dependent on the location of that side panel fan. Every case was different, and with some, the side panel fan was not aimed at either processor - and then case cooling suffered.

IMO, side panel case cooling is, generally speaking, counterproductive - with only a few exceptions.
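For anyone wanting to repeat that kind of A/B testing on the software side, here's a minimal logging sketch (Python, using psutil's sensors_temperatures(), which reads the board's onboard sensors on Linux but isn't available on Windows). Run it once per fan configuration under the same load and compare the CSVs:

```python
# Log onboard temperature sensors to CSV, one reading per second for a
# minute - a software companion to thermocouple probing. Linux only.
import csv
import time

import psutil

with open("temps.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "sensor", "temp_c"])
    start = time.time()
    for _ in range(60):
        for chip, entries in psutil.sensors_temperatures().items():
            for entry in entries:
                label = entry.label or chip  # some sensors have no label
                writer.writerow([round(time.time() - start, 1),
                                 label, entry.current])
        time.sleep(1)
```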

And by distance that the electrical traces are and can be longer than to the first slot(and likely vary slightly between models as well).
As the crow flies, the physical distance between the CPU and the first PCIe slot will be the same. This is because the ATX Form Factor dictates where the CPU socket will be mounted, as well as where the PCIe slots on the motherboard and in the case will be located.

Now for sure, if the motherboard maker decides to run the circuit traces around the motherboard a few times before connecting to the slot, that will affect latency adversely. But that would certainly be poor design, and since motherboard makers, even with budget boards, don't want to introduce unnecessary latency issues, they design their boards with traces as short as possible - in large part because that is cheaper too.
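For perspective on the size of that effect, a quick back-of-envelope (Python, with round textbook numbers for FR-4; the exact dielectric constant varies by board):

```python
# Back-of-envelope: latency cost of extra trace length on FR-4.
# Signals travel at roughly c / sqrt(Er); Er ~ 4 for FR-4 dielectric.
C = 3.0e8    # speed of light in vacuum, m/s
ER = 4.0     # approximate dielectric constant of FR-4

speed = C / ER ** 0.5          # ~1.5e8 m/s, i.e. ~15 cm per nanosecond
extra_trace_m = 0.05           # a hypothetical extra 5 cm of routing
extra_delay_ns = extra_trace_m / speed * 1e9
print(f"Extra delay from 5 cm of trace: ~{extra_delay_ns:.2f} ns")  # ~0.33 ns
```

So a detour costs fractions of a nanosecond; keeping traces short and direct is still standard practice because it's cheaper and better for signal integrity, not because the raw latency penalty is large.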

The required logic would not be different as, again, dealing with wait states is already in the code.
 
As the crow flies, the physical distance between the CPU and the first PCIe slot will be the same. This is because the ATX Form Factor dictates where the CPU socket will be mounted, as well as where the PCIe slots on the motherboard and in the case will be located.

Now for sure, if the motherboard maker decides to run the circuit traces around the motherboard a few times before connecting to the slot, that will affect latency adversely. But that would certainly be poor design, and since motherboard makers, even with budget boards, don't want to introduce unnecessary latency issues, they design their boards with traces as short as possible - in large part because that is cheaper too.

The required logic would not be different as, again, dealing with wait states is already in the code.
But now you are just twisting my words and continuing to use your own definitions of them? Or over-exaggerating what I said?

Some boards have an x4 slot on top, some route lanes through the chipset (logic), and various versions of this have existed for years. Traces are laid to minimize length (also to ensure the right length) and interference, and both will differ depending on PCB layers, chipset, form factor and features. And logic as in circuits is not the same as the electrical protocol logic.

I'm not sure if there is miscommunication here, if we mean different things, or if we just want to be right. Either way I'm happy - I'm not in it for the winning :)

Over and out
 
Actually, I was trying to clarify (untwist) by explaining the difference between the physical location requirements of the PCIe slots and CPU socket, as defined and required by the ATX Form Factor standard, compared to how a manufacturer's design/layout of its circuit "traces" may add to that distance.

So actually, I think we are on the same page now. :)

Have a good day.
 
But now you are just twisting my words and continuing to use your own definitions of them? Or over-exaggerating what I said?

Some boards have an x4 slot on top, some route lanes through the chipset (logic), and various versions of this have existed for years. Traces are laid to minimize length (also to ensure the right length) and interference, and both will differ depending on PCB layers, chipset, form factor and features. And logic as in circuits is not the same as the electrical protocol logic.

I'm not sure if there is miscommunication here, if we mean different things, or if we just want to be right. Either way I'm happy - I'm not in it for the winning :)

Over and out
Actually, I was trying to clarify (untwist) by explaining the difference between the physical location requirements of the PCIe slots and CPU socket, as defined and required by the ATX Form Factor standard, compared to how a manufacturer's design/layout of its circuit "traces" may add to that distance.

So actually, I think we are on the same page now. :)

Have a good day.

Keeping it thread-relevant: both of your conclusions are right but wrongly concluded - you are not on the same page, but definitely in the same compartment :laugh:

Literally only side intake airflow, with passive exhaust everywhere else. Imagine taking off your side panel and replacing it with a box fan, or 9 x 120 mm fans, or 4 x 200 mm fans.

This one's been around a while... I'm just digging through my mental archive of old and cursed modded images. lol, the best of the bunch was this Aussie guy who slapped on a living-room fan like it was a NASA prototype.

Nowadays there's a bunch of great case options out there which are fully perforated/meshed on all sides, so no shortage of passive exhaust. A full-on fan fest of a side panel might do a much better job than expected. Regardless of the gains, sadly I've sold my soul to tempered glass. Wasn't always like that, but now I can't live without the silicon porn stars strutting their stuff through the window.
 
This part right here, disagreement in philosophy = disagreement in engineering.
Some of us already separate these heat sources with blowers or physical partitions.
So on that point it hasn't been a problem for ages. There are some bigger issues though.
Like what do you do about runaway chipset temps? Gotta put something on it.
Luckily my X570 board has a fan or I'd be in trouble.

I thought all of the X570 chipsets, until very near the end of production, had an embedded fan on them for that very reason. I think only a few of the X570 boards I dealt with skipped the active cooling, and those tended to have large passive coolers (and still ran a bit hot).

-Edit-
Google seems to back me up on this one:
X570 chipsets are known for running hot, and temperatures of 70-80°C are considered normal, sometimes even reaching 90°C under load. This is often due to the chipset's design and the increased power consumption of PCIe 4.0. Unless the system shuts down due to overheating, it's unlikely to be a serious issue, according to a Reddit thread.
-Edit end-
 
I thought all of the X570 chipsets, until very near the end of production, had an embedded fan on them for that very reason. I think only a few of the X570 boards I dealt with skipped the active cooling, and those tended to have large passive coolers (and still ran a bit hot).

-Edit-
Google seems to back me up on this one:
X570 chipsets are known for running hot, and temperatures of 70-80°C are considered normal, sometimes even reaching 90°C under load. This is often due to the chipset's design and the increased power consumption of PCIe 4.0. Unless the system shuts down due to overheating, it's unlikely to be a serious issue, according to a Reddit thread.
-Edit end-
My X570 overheated due to poor mounting pressure on the heatsink from loose screws. The first sign of trouble was NVMe drives being dropped from the system during heavy file copies. Fixing the loose screws fixed it; then I did an additional mod and temps were much nicer.
 
I thought all of the X570 chipsets, until very near the end of production, had an embedded fan on them for that very reason. I think only a few of the X570 boards I dealt with skipped the active cooling, and those tended to have large passive coolers (and still ran a bit hot).

-Edit-
Google seems to back me up on this one:
X570 chipsets are known for running hot, and temperatures of 70-80°C are considered normal, sometimes even reaching 90°C under load. This is often due to the chipset's design and the increased power consumption of PCIe 4.0. Unless the system shuts down due to overheating, it's unlikely to be a serious issue, according to a Reddit thread.
-Edit end-
You're correct, X570 was actively cooled, X570S was the passively cooled variant.
 
Lots and lots of ducting! I'm actually a little surprised nobody has linked this yet:

At this point in time I don't really think it's so much cases that need changing, but rather ATX. BTX was the solution, but Intel didn't make it mandatory and OEMs weren't interested in retooling all of their equipment.

That being said, there are some case ideas with potential when used with AIOs/custom loops. These are about the best I think we'll see anytime soon.
 
Lots and lots of ducting! I'm actually a little surprised nobody has linked this yet:

At this point in time I don't really think it's so much cases that need changing, but rather ATX. BTX was the solution, but Intel didn't make it mandatory and OEMs weren't interested in retooling all of their equipment.

That being said, there are some case ideas with potential when used with AIOs/custom loops. These are about the best I think we'll see anytime soon.

I usually watch OT's SFF content and occasionally dip into the peripheral stuff. If I had paid more attention to the video titles rather than occasionally relying on thumbnails, I wouldn't have missed this one. I'll check it out - 10 mins of bedtime storytelling from one of the best content creators on YT - that'll be a treat and a half.
 
I found a good solution :)

 


The Omen 45L does this - it has 3 decorative fans up front, then a top AIO compartment separate from the case... makes so much sense tbh
 
Cooler Master Cosmos II with an Arctic LF II 420 externally mounted. Five Silverstone FHP141s at full speed (173 CFM each) keep my 5.9 GHz 14700K nice and cool. I also had the same setup mounted on the Corsair 780 T next to it, but I had to RMA that 420 (wonky pump), so I put in a spare LF II 280 with 4 Silverstone FHP141s, intending it to be temporary. It works so well with my 6 GHz 14900K that I just left it like that. I was going to use an LF II 360, but I would have had to drill new mounting holes, and the 280 bolted right in.

Not only do they have outstanding performance, but none of my LF IIs would have fit inside the cases, especially with the 38 mm fans in push/pull. I paid $20 for the Cosmos II and $35 for the 780 T, both in mint condition. The Cosmos required a lot of Dremel surgery, but all of the sliding panels still work, and if I put the center panel in and removed the AIO it'd look totally stock. The 780 T required very little work for the mods: 240 and 280 mm AIOs will mount in the top with no cutting or drilling, though a bit of work was needed to get the 420 to fit properly.
 

Three 180 mm fans in the bottom pushing 110 CFM each straight up seems to have worked well for Silverstone in the Raven RV02 case, as proven by Gamers Nexus - it still holds their coolest-case award. The only problem is that if you wanted to go AIO or custom loop, you either needed one of their proprietary 180 mm AIO coolers or an external rad for the custom loop.
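Plugging the RV02's numbers into the same bulk-air arithmetic as earlier in the thread shows why it measures so well (Python; the 500 W system load is a hypothetical figure, not from the GN review):

```python
# Quick sanity check on the RV02's bottom-intake layout (rough numbers).
# Three 180 mm fans at 110 CFM each move ~330 CFM of fresh air upward.
CFM_TO_M3S = 0.000471947   # 1 CFM in m^3/s
RHO, CP = 1.2, 1005.0      # air density kg/m^3, specific heat J/(kg*K)

total_cfm = 3 * 110
system_watts = 500         # hypothetical full-load CPU + GPU heat
mass_flow = RHO * total_cfm * CFM_TO_M3S
delta_t = system_watts / (mass_flow * CP)
print(f"Bulk air rise at {total_cfm} CFM, {system_watts} W: ~{delta_t:.1f} °C")  # ~2.7 °C
```

With every component sitting in air only a couple of degrees above ambient, the layout barely needs to do anything clever - the sheer flow-through does the work.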


 