
Silverstone HELA 2050 W

The HELA 2050 has enough output power to start a truck and comes with a 12-pin PCIe connector to support the next-generation graphics cards releasing in 2022. It is currently the strongest desktop PSU money can buy, and build quality is top-notch. However, it is priced accordingly.

 
The price versus current PSUs on the market is not outrageous, though. For example, my 1200 W Seasonic was $349.

My 1200 W Platinum PSU = $0.29/W
Silverstone 2050 W Platinum = $0.29/W
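A minimal sketch of the price-per-watt comparison above; the prices are the figures quoted in this thread (the ~$600 HELA price comes from a later post), not current market prices:

```python
def price_per_watt(price_usd: float, rated_watts: float) -> float:
    """Return cost in USD per rated watt, rounded to cents."""
    return round(price_usd / rated_watts, 2)

seasonic_1200 = price_per_watt(349, 1200)  # $349 as quoted above
hela_2050 = price_per_watt(600, 2050)      # ~$600 as quoted later in the thread

print(seasonic_1200, hela_2050)  # both land at about $0.29/W
```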
 
A PSU for a kilowatt PC? That would be physically uncomfortable; you're going to need a special air-conditioning setup just to handle the hot spot this thing will sit in. This isn't high-end, this is industrial.
 
lol

That's beyond the typical receptacle power output in North America, at least on 120 VAC, unless you run a dedicated circuit with 12 AWG wire and a 20 A breaker.
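A back-of-the-envelope check of the point above, assuming the usual NEC practice of limiting a continuous load to 80% of the breaker rating:

```python
def continuous_limit_watts(volts: float, breaker_amps: float) -> float:
    """Continuous-load ceiling for a circuit: 80% of breaker capacity (NEC practice)."""
    return volts * breaker_amps * 0.8

std_15a = continuous_limit_watts(120, 15)  # typical 120 V / 15 A circuit
ded_20a = continuous_limit_watts(120, 20)  # dedicated 12 AWG / 20 A circuit

print(std_15a, ded_20a)  # 1440.0 W vs 1920.0 W at the wall
```

Even the dedicated 20 A circuit tops out below what 2050 W of DC output would draw at the wall once conversion losses are included, which is why 120 V operation is the limiting case.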

I got a 550 W Seasonic Focus Gold for $60 CAD new. $/W for PSUs has been at an all-time low lately, but I guess when there are no GPUs to build new PCs with, there must be a good number of PSUs sitting on shelves.
 
A $600 PSU without a power switch.
Yeah, fuck this shit.

Well, I'm sure they figured only miners will be buying them, so the off position is not needed. :roll:
 
In many silent PCs that will be quite irritating. Shouldn't that be listed as a downside too?

I don't think anyone would run this in a PC specified to be as silent as possible.
 
A 2050 W PSU! A bit crazy. Funny how, despite power shortages in Europe and surging electricity prices, we're seeing these escalations in power consumption again from GPUs and CPUs. How much is it going to cost to run this anywhere near full capacity 365 days a year?

Never been happier to spend £65 on an 850 W unit on my latest rebuild. The rig I'm using to play at 4K pulls down about 600 W (not including the screen) in demanding games, and I don't think I'll be pursuing more than that!
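A rough answer to the running-cost question above; the €0.30/kWh tariff is a placeholder, so plug in your own rate:

```python
def annual_cost(load_watts: float, price_per_kwh: float, hours_per_day: float = 24) -> float:
    """Yearly energy cost for a constant load, in the tariff's currency."""
    kwh_per_year = load_watts / 1000 * hours_per_day * 365
    return round(kwh_per_year * price_per_kwh, 2)

full_tilt = annual_cost(2050, 0.30)  # 2050 W around the clock at €0.30/kWh
print(full_tilt)  # roughly €5,400 per year
```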
 
Will this PSU future proof me for an entry level Intel CPU and Nvidia GPU in 2023 or should I go bigger wattage?
 
Why would anyone build something like this with no on/off switch?
I'm not trying to be a wiseguy; I genuinely don't understand how and why they'd do this.
Can anyone explain?
 
2 kW output is insane. It's also beginning to approach the limit of what a consumer-grade PSU can deliver. Why? Because 230 V mains can deliver about 3 kW max, and you wouldn't want to be pulling that much power continuously, for safety reasons as well as your electricity bill. AFAIK there's no standard mains supply in the world that can deliver more than 3 kW from a standard wall socket, so the maximum PSU output is likely to be around 2.8 kW if it's really efficient. I wouldn't want to be anywhere near that if there's a short circuit... A partial short circuit would be even worse, since the supply would just keep pouring power into it, quite likely starting a fire.
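A sketch of where that ~2.8 kW ceiling comes from: DC output is wall power times efficiency. The 13 A figure matches a UK fused plug; other 230 V regions differ, and the 94% efficiency is an assumed best case:

```python
def max_dc_output(mains_volts: float, socket_amps: float, efficiency: float) -> float:
    """Best-case DC output a PSU could sustain from one wall socket."""
    return mains_volts * socket_amps * efficiency

ceiling = max_dc_output(230, 13, 0.94)  # assumed very efficient unit on a UK socket
print(round(ceiling))  # roughly 2.8 kW
```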
 
2 kW output is insane. It's also beginning to approach the limit of what a consumer-grade PSU can deliver. Why? Because 230 V mains can deliver about 3 kW max, and you wouldn't want to be pulling that much power continuously, for safety reasons as well as your electricity bill. AFAIK there's no standard mains supply in the world that can deliver more than 3 kW from a standard wall socket, so the maximum PSU output is likely to be around 2.8 kW if it's really efficient. I wouldn't want to be anywhere near that if there's a short circuit... A partial short circuit would be even worse, since the supply would just keep pouring power into it, quite likely starting a fire.

Agreed.

I love the fact that we are seeing node shrinks literally every other year. However, it's a shame that there is little to no focus on power efficiency. Node shrinks have been used to draw more power, rather than less. We now have GPUs consuming 400–500 W at non-startup peaks, and all the circulating rumors say next-gen GPUs will consume even more. I think government regulation will eventually come down on these.
 
Five years of warranty? My $149 EVGA 1300 G2 has a 10-year warranty.
 
The power consumption figures announced by NVIDIA and Intel are crazy. Instead of raising efficiency, they are just pulling more power to provide more speed. It is crazy, and unsustainable. If they continue down that road, it will be the end of the PC platform as we know it... The conspiracy theorist in me suggests that maybe that is the ultimate goal.
 
Agreed.

I love the fact that we are seeing node shrinks literally every other year. However, it's a shame that there is little to no focus on power efficiency. Node shrinks have been used to draw more power, rather than less. We now have GPUs consuming 400–500 W at non-startup peaks, and all the circulating rumors say next-gen GPUs will consume even more. I think government regulation will eventually come down on these.
Yeah, for comparison, we all know how brightly a 100 W incandescent lamp can illuminate a room; a 500 W version would be far too bright, literally pouring energy into the room and forcing you to wear shades. A PC at this level dumps the same energy as extreme heat, significantly warming up the area around it and the room. Too much for a desktop; more like something that needs to live in an air-conditioned data centre set to frosty!
 
I wonder if the HELA naming is analogous to Asus Thor PSUs.

While on the topic of energy efficiency: new nodes are more efficient than older ones. It's just that the entire industry is moving towards stuffing more and more transistors into the same area, at which point switching losses and power leakage climb steeply.
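The switching-loss point above follows from the standard CMOS dynamic-power relation P ≈ α·C·V²·f: packing more transistors into the same area raises the effective switched capacitance C, so power climbs even as each transistor shrinks. All numbers below are illustrative, not measurements of any real GPU:

```python
def dynamic_power(activity: float, cap_farads: float, volts: float, freq_hz: float) -> float:
    """CMOS dynamic power: activity factor * switched capacitance * V^2 * frequency."""
    return activity * cap_farads * volts**2 * freq_hz

base = dynamic_power(0.2, 1e-9, 1.0, 2e9)    # baseline die (illustrative values)
denser = dynamic_power(0.2, 2e-9, 1.0, 2e9)  # twice the transistors in the same area

print(base, denser)  # doubling C doubles the dynamic power
```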

And while Radeon this generation has a higher peak average current draw at the ~20 ms scale than GeForce, the latter literally fries its own protection circuitry running ultra-optimised games. Whether that is the fault of the protection circuit or the GPU core, I lean towards the latter, pulling almost 1000 W transients at the µs scale. I hope NVIDIA shoots itself in the foot with 500 W next gen and serves as an example of why not to keep shoving in transistors until core power density approaches that of a nuclear pile, with transients heading toward rocket-nozzle territory.
At least on the CPU side we have PL2 and PPT figures; GPU boosting algorithms and power use are just unknown until observed. That, and a high-end GPU easily pulls more power than a CPU.
 
How can there be no information on the 12 V outputs? Can you not simply remove the heatsink and see what is written on the 12 V devices? I generally love all the tear-downs and component reviews on here, but not this review. The lack of detailed data, combined with a comment you made about not wanting to tear it down completely, almost indicates laziness or a lack of interest in this particular review.
 
Yes, I got lazy, after thousands of tear-downs and reviews. Comments like yours make me wonder why I still bother writing PSU reviews.
Also, you clearly have NO IDEA how hard it is to remove/desolder heatsinks from Enhance platforms.

Lack of detailed DATA!!! SERIOUSLY!!!!
 
The power consumption figures announced by NVIDIA and Intel are crazy. Instead of raising efficiency, they are just pulling more power to provide more speed. It is crazy, and unsustainable. If they continue down that road, it will be the end of the PC platform as we know it... The conspiracy theorist in me suggests that maybe that is the ultimate goal.
The way I see it, they'd love for every last one of us enthusiasts to be good little peons on consoles, OR PREBUILTS.
It would be so simple to make a standard console with a standard CPU, GPU, RAM and so on: far fewer updates to worry about, and so much easier to develop for when everyone has the same hardware.
I can just see it now: you can buy an MSI Console, Asus Console, Gigabutt Console, AssRock Console, etc.
That's the way I see it; I might be wrong, though.
 
Yes, I got lazy, after thousands of tear-downs and reviews. Comments like yours make me wonder why I still bother writing PSU reviews.
Also, you clearly have NO IDEA how hard it is to remove/desolder heatsinks from Enhance platforms.

Lack of detailed DATA!!! SERIOUSLY!!!!

I have been repairing and replacing individual MOSFETs and IGBTs in industrial data-center power supplies and network distribution nodes for 40 years, work most technicians would never even attempt, so obviously you and I have a totally different understanding of what is complicated or difficult.
 
I have been repairing and replacing individual MOSFETs and IGBTs in industrial data-center power supplies and network distribution nodes for 40 years, work most technicians would never even attempt, so obviously you and I have a totally different understanding of what is complicated or difficult.
In that case, maybe you should do a PSU review to those high standards. I'd love to see it.
 