
RTX 3080 Users Report Crashes to Desktop While Gaming

I think the problem is over-revving. When it goes past 7,000 rotations per minute, the fuel system should cut ignition to prevent damaging the pistons and the valvetrain.
 
I think the problem is over-revving. When it goes past 7,000 rotations per minute, the fuel system should cut ignition to prevent damaging the pistons and the valvetrain.
Off topic, but I hate rev limiters; it feels like someone pulled the handbrake when you exceed that limit. :shadedshu:
Nanny state for engines.
 
I wonder how many dumbasses use 2x 8-pin connectors on a single power cable?
 
The delusion of the article poster... Nvidia never has system-crashing driver problems... Never...

*cough* I smell some serious fecal matter in the air... *cough*
 
I use a Corsair HX 1000 W Platinum PSU, and guess what: my MSI RTX 3080 Gaming X keeps crashing to the desktop when playing games. I could easily run 2x EVGA 1080 Ti and a 10700K OC'd to 5.1 GHz without a single crash, so it's surely not a PSU thing. Lowering the base clock via Afterburner to 1440 MHz and the boost to 1710 MHz seems to solve the problem in my case. It sucks paying a price premium over the FE only to have to settle for the same clocks in the end.
 
Have no sympathy for gerbils buying this abomination of a video card.

YOU people bitched (and rightly so) about the AMD Vega series because of excessive heat and wattage issues: an increase in performance at the expense of an increase in wattage. As stated before, I bought an EVGA 1070 SC because of the excessive wattage/heat issues the Vega series had. The 1070 is a great card that I still use in my backup computer.

NOW I get to bitch about how crappy these cards really are for the same damn thing. An increase in performance at the expense of an increase in wattage.

A 30% increase in performance for a 30% wattage increase IS NOT AN ADVANCEMENT.

And you are not even getting the quality of silicon that you should get for an $800 video card (add shipping and tax and most 3080 cards will hit this price) that will work nicely as a winter heater. Nvidia is hoarding the best for later cards.

The world does NOT run on 4K. Real 4K gaming for the masses is over five years off, as they haven't milked the 1080p market to death yet. Right now 1440p is the sweet spot for video games, and prices are starting to come down for everyone to enjoy the experience.

Just wait until the holiday season really begins to make your purchases. By then, the newer 3080s should have their issues ironed out.

If AMD gives me a video card with performance similar to a 3080 at LESS wattage, I'll buy one. And if they get their act together, I'll buy from Nvidia as well.

This is not me taking one side. This is me looking at how the industry is giving less to the masses through marketing hype.
 
I have an RTX 2080, and after updating to the latest drivers and the Windows 10 May 2020 Update (2004), I had my first BSOD in two years. Definitely some software issue; my RTX 3080 is on the way.
 
I wonder how many dumbasses use 2x 8-pin connectors on a single power cable?
I used to run my 980 Ti with one cable, though now I'm using two, as recommended.
 
Have no sympathy for gerbils buying this abomination of a video card. ...
Vega didn't offer much performance for all that power draw (just a tad faster than a 1070, which was available almost a year earlier).
On top of that, going from Vega 56 to Vega 64, perf/W took almost a 20% drop, making it obvious the architecture was at its limit. By comparison, going from the 3080 to the 3090, perf/W only declines 8%, and most of that is probably because of the added VRAM.

Personally, I'd like to see manufacturers put a 250W hard limit on their GPUs, but as long as people are buying these monstrosities (in my eyes), who am I to argue?
 
Have no sympathy for gerbils buying this abomination of a video card. ...

While you are quick to complain, Ampere still has higher perf/W than Turing. Talking about high power consumption alone is pointless because it is configurable; for example, a thin-and-light laptop can have a 2080 Super configured with a 90 W TGP and still be faster than a desktop 5600 XT with a 180 W TGP.
That was why Fermi and Vega were failures: not because they were power hungry, but because they were just too inefficient.

[Charts: performance per watt at 2560×1440 and 3840×2160]


Note that the perf/W figures for the 3080 at 1080p/1440p are not correct; W1zzard said so himself, because power consumption is measured during 4K gaming, and at 1080p/1440p the 3080 is not drawing as much power.

At the end of the day, the first thing most people care about is perf/dollar, followed by perf/W, then additional features. With Ampere, Nvidia has placed the most emphasis on perf/dollar while getting a little efficiency uplift on the side.

I'm sure Nvidia specifically chose Samsung 8N for this reason; had gaming Ampere been produced on TSMC 7+, it would have been much more efficient, but pricier. Yes, AMD has a chance to achieve slightly higher efficiency with RDNA2; however, they won't be competitive in perf/dollar without drowning their profit margins (which are already really low to begin with).
 
Performance per watt is possibly the least impressive metric of a new GPU architecture. It would take an unprecedented amount of bad design to conceive a GPU with worse perf/W. Vega had better performance per watt than Polaris as well; that's why they put it in APUs.

So when NVIDIA proudly states that their new GPUs have higher performance per watt, that basically means nothing. What's impressive is when you get higher perf/watt and lower power.
 
Performance per watt is possibly the least impressive metric of a new GPU architecture. ...

Just wait for a laptop with an Ampere GPU then; a 3080 with a 90 W TGP would dominate any other mobile chip.
Mobile Navi is pretty much defunct.
 
3080 with 90W TGP would dominate any other mobile chip.

Dream on; a 90 W 3080 would probably be no faster, or even slower, than a mobile 2080.
 
Wanna bet?

You know what's funny? The RTX 3000 mobile name is already used for mobile Quadros. They must have known back then how great Ampere would be for mobile :roll:
 