
What kills GPUs?

That hasn't been a problem in more than 15 years... and even then it was only a problem because makers were switching away from leaded solder. There were growing pains associated with that transition, but they had those problems worked out within a year.
Yep, just ask the early Xbox 360s... oh wait, we can't - they're all dead.
 
GPUs don't exactly die; it's usually the components around them that simply wear out.

Capacitors are known to wear faster once they run above "nominal" temperatures - a capacitor baking at an average of 80 °C ages far worse than one at 60 or even 40 °C. The usual rule of thumb for electrolytics is that expected life roughly halves for every 10 °C rise.
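
A minimal sketch of that rule of thumb, assuming a hypothetical 2,000 h @ 105 °C rated part (the numbers are illustrative, not from any datasheet):

```cpp
#include <cmath>
#include <cstdio>

// Rule of thumb for electrolytic caps: expected life roughly
// doubles for every 10 °C you run below the rated temperature.
double cap_life_hours(double rated_hours, double rated_temp_c, double actual_temp_c) {
    return rated_hours * std::pow(2.0, (rated_temp_c - actual_temp_c) / 10.0);
}

int main() {
    // Hypothetical 2000 h @ 105 °C part - numbers are illustrative only.
    std::printf("at 80 C: %7.0f h\n", cap_life_hours(2000, 105, 80));  // ~11300 h
    std::printf("at 60 C: %7.0f h\n", cap_life_hours(2000, 105, 60));  // ~45300 h
    std::printf("at 40 C: %7.0f h\n", cap_life_hours(2000, 105, 40));  // ~181000 h
    return 0;
}
```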

All video cards are tested before they ship - they actually put those things in an oven to see what the card does and to assure you're getting roughly what you paid for.

I would not worry - by the time your GPU actually dies, you're looking at 5+ years of 24/7 usage.
 
For sure, I never saw a dead core on those GPU repair videos. /sarcasm

I did not state how the GPU itself dies.

Just checking if there is any new information in this older topic.

I would not buy any reflowed item if I were told about it before purchase - nor anything that has been put in an oven. I consider that trash-level value, knowing the money is most likely lost and gone. E.g. the €35 graphics card I bought and sold in early 2023.
 
Water and power surges will fry your parts... so maybe don't take it swimming, and maybe get a UPS if you get nasty storms in your area.

In over two decades of tinkering, GPUs have only ever died... by my hand... while tweaking or doing something stupid :D (I almost killed my 5090).
 
I miss single-slot cards that could game. Like an HD 4850 - maybe later snag another for CrossFire and have better performance than the HD 4870. The 4870 is only 150 W. Today, a 9070 XT is 300 W no problem, with a big heatsink and middle-of-the-road performance!

And they say we are more efficient today. Not at double the wattage we aren't... But is that an apples-to-apples comparison, when DLSS and DX12 weren't a thing? Those cards are only DX10, I believe.

So it takes (as a guess...) 150 W more power today to buy and game on a "70"-series AMD GPU versus one from the past.
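
For what it's worth, "efficient" usually means performance per watt, not wattage alone, so the FPS side of the comparison matters too. A toy calculation with purely made-up numbers (hypothetical, not benchmarks):

```cpp
#include <cstdio>

int main() {
    // Hypothetical numbers purely for illustration - not benchmarks.
    double old_fps = 60.0,  old_watts = 150.0;   // HD 4870-class card
    double new_fps = 240.0, new_watts = 300.0;   // 9070 XT-class card
    std::printf("old: %.2f FPS per watt\n", old_fps / old_watts);  // 0.40
    std::printf("new: %.2f FPS per watt\n", new_fps / new_watts);  // 0.80
    // Double the power draw, but if performance is more than double,
    // perf-per-watt still went up. Wattage alone doesn't settle it.
    return 0;
}
```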

Modern cards... may not have to be so big and bulky, but the public demands ever-higher FPS numbers - even 60 isn't enough.
It's why I have been praying someone like Unreal or id really invested in mGPU.

My previous rig was a pair of AMD 6870 1 GB cards back when BF3 came out. I miss those days when they were actually faster than the top-end single card, but they did have issues with CrossFire at times, and the smaller memory buffer meant the 1% and 0.1% lows suffered when it hit the limit :(

Best thing is that if you look at AMD's latest in the datacenter, they have taken this design language and run with it in the MI355X parts.



God I would love to rock a pair of 6950XTs or something like a pair of 9070 or 9070XTs just for fun again.
 
I did some mGPU with 6700 and 6700 XT cards, and also a 6500 XT in an mGPU configuration. It was only good for DX12 anyway. Most people don't need it, don't care, or never even knew it was possible.

mGPU support was dropped with the 7000 series. So you could do the 6950 XT in CrossFire (because I wanted to say CrossFire) and probably have an outstanding experience with your DX12 games.

Having a second GPU while using only one of them for non-DX12 games will cost you a performance penalty that most people wouldn't enjoy.
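
For anyone curious what that looks like at the API level: with DX12 explicit multi-adapter, the game itself enumerates and drives each GPU - no chipset pairing or driver profile involved. A minimal sketch of just the enumeration step via DXGI (Windows-only; the actual work-splitting is on the engine):

```cpp
// Build on Windows with: cl /EHsc enum_adapters.cpp dxgi.lib
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

int main() {
    Microsoft::WRL::ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    Microsoft::WRL::ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        // Each adapter here could get its own D3D12 device; the game,
        // not the driver, decides how to split work between them.
        wprintf(L"Adapter %u: %s, %zu MB dedicated VRAM\n",
                i, desc.Description, desc.DedicatedVideoMemory >> 20);
    }
    return 0;
}
```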

Anyways---
All of the interesting computer stuff is becoming a thing of the distant past. QuadFire is now a dream, quad SLI a fantasy. :(
 
mGPU works in compute; it works less efficiently in gaming. Micro-stuttering was a thing, and not every game would even scale an extra 50% from installing a second card.
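
A toy illustration of that micro-stutter: under badly paced AFR (alternate-frame rendering), the average FPS can look fine while every other frame gap is huge. The gap numbers here are invented for the example:

```cpp
#include <cstdio>

int main() {
    // Made-up frame-to-frame present gaps (ms) from badly paced AFR:
    // two GPUs alternate frames, and presents bunch up in pairs.
    double gaps_ms[6] = {5.0, 28.0, 5.0, 28.0, 5.0, 28.0};
    double total = 0.0;
    for (double g : gaps_ms) total += g;
    double avg = total / 6.0; // 16.5 ms -> ~61 FPS "average"
    std::printf("average: %.1f ms (~%.0f FPS)\n", avg, 1000.0 / avg);
    std::printf("but every other gap: 28 ms (~%.0f FPS feel)\n", 1000.0 / 28.0);
    return 0;
}
```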

I actually did kill a GPU, but not in the sense you expect.

One of them was from overvolting: from the stock 1.4 V up to 2.16 V while running on phase change (-40 °C).
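
Rough math on why that voltage alone was brutal: dynamic (switching) power scales roughly with f·V², so at a fixed clock the voltage bump by itself is worth about 2.4x the stock power:

```cpp
#include <cstdio>

int main() {
    // Dynamic (switching) power scales roughly with f * V^2,
    // so at a fixed clock the voltage bump alone gives:
    double v_stock = 1.40, v_oc = 2.16;
    double ratio = (v_oc * v_oc) / (v_stock * v_stock);
    std::printf("~%.2fx stock switching power\n", ratio); // ~2.38x
    return 0;
}
```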

The other was from attaching what was a CPU heatsink, with bare screws straight into the PCB mounting holes. It wasn't the screws that did it, but the mounting pressure on the GPU - that pretty much cracked the thing.
 
Technically it's not a fantasy... you'd just have to write your own drivers to make it work... it is still technically possible on pretty much any PCIe bus.

Also, we all know why they did this: they wanted to move all of that multi-GPU scaling to the datacenter so they could charge 10x the price.

Once that cools down they might bring it back...
 
It technically is. There are no motherboards produced that support SLI or CrossFire - that support was a chipset requirement, and boards used to advertise it. It hasn't been a thing for several years now. So it's not just about drivers for the GPU; the board, and also the software (games), would need to implement the support as well.
 
SLI was always vendor/license/chipset locked. ATI CrossFire had some restrictions, but mostly hardware-based.
"AMD CrossFireX" and "AMD MGPU" are not and never were chipset-locked.

"Multiple GPU" is returning, but not for rendering; compute-only.
AMD and Intel both are very open about wanting their AI/MI customers to have as many of their GPGPUs in-system(s) as possible.

Ironically,
If game rendering eschews Rasterization entirely for (compute-based) AI inference, we'll already have multi-GPU capability. :laugh:
 
Heh, yeah. It's all dead.
Today we get a single GPU: no Ageia PhysX, no NV PhysX, no SLI, no CrossFire, no mGPU, no NVLink - none of it. All wiped off the features table. Have some frame gen instead. :D
 
The Space Invaders cards died due to a combination of extremely high heat on the VRAM chips close to the PCIe slot and issues with lackluster soldering materials.

Some 3000-series cards also died to similar issues (which were addressed, AFAIK, on the 3090 Ti).
 