
RTX 3080 Users Report Crashes to Desktop While Gaming

This is just the silicon lottery; some chips can't clock past 2 GHz no matter the voltage. My 2080 Ti is the same, and I'm not alone (the GPU is stable at 1995 MHz/0.975 V, but 2010 MHz is not stable no matter what).
The best thing to do is set a maximum clock in the frequency/voltage curve (undervolting) that the GPU will not boost above. The GPU may not set any benchmark records, but the efficiency gain is nice, and that's what matters when gaming.
 

Attachments

  • undervolt.png
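If you want to verify where a capped/undervolted curve actually sits under load, a quick monitoring loop works. Below is a minimal Python sketch, assuming the official nvidia-ml-py (pynvml) bindings are installed; the sample count and interval are arbitrary choices, not from the post above. Run it while a game or benchmark is looping:

```python
# Minimal monitoring sketch, assuming the official nvidia-ml-py (pynvml)
# bindings (pip install nvidia-ml-py). SAMPLES and INTERVAL_S are arbitrary
# illustrative choices, not values from the post above.
import time
import pynvml

SAMPLES = 60      # number of readings to take
INTERVAL_S = 1.0  # seconds between readings

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

peak_mhz = 0
for _ in range(SAMPLES):
    clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # API reports mW
    peak_mhz = max(peak_mhz, clock_mhz)
    print(f"core: {clock_mhz} MHz  board power: {power_w:.1f} W")
    time.sleep(INTERVAL_S)

print(f"peak core clock observed: {peak_mhz} MHz")
pynvml.nvmlShutdown()
```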
So it begins. There were multiple problems with the RTX 2000 series when it was released too. Best to wait several months for them to iron out all the problems.

Yeah, the first 2080 Tis had Micron chips that liked to kick the bucket.
 
Was looking forward to spoiling myself this round of GPUs, but these early issues always give me the jitters. Will wait to see how this plays out; not like there is stock anywhere anyway :laugh:
 
Yeah, the first 2080 Tis had Micron chips that liked to kick the bucket.
Samsung's chips did it too... but the issue that plagued these cards out of the gate has all but been eliminated. According to Mindfactory, 2080 Ti return rates were ~5%, whereas the 5700 XT's was 3.5%. Of the 2080 Tis, the MSI Lightning and two Palit-branded cards somehow managed to reach 11% return rates; the rest were much lower overall. Nvidia, overall, was slightly lower.

 
Over 2 GHz sounds like a lot. Their spec boost is 1710 MHz.
Heck, my Turing 2080 has the same spec boost, and it rarely, if ever, exceeds 2 GHz, and it crashes shortly after that (apparently due to a voltage deficit).
 
Over 2 GHz sounds like a lot. Their spec boost is 1710 MHz.
Heck, my Turing 2080 has the same spec boost, and it rarely, if ever, exceeds 2 GHz, and it crashes shortly after that (apparently due to a voltage deficit).
IDK man, the stock boost of the 2080 Ti is 1545 MHz, and most cards boost to 1800~1900 MHz stock on air anyway.
 
IDK man, the stock boost of the 2080 Ti is 1545 MHz, and most cards boost to 1800~1900 MHz stock on air anyway.
At stock, yes, most Turings and Pascals boost to 18xx or 19xx MHz and should work fine in games in the high-1900 MHz range. Over 2000 MHz it starts becoming a problem. I am kind of surprised that Ampere would even try to boost that high, given the OC results we have seen in reviews.
 
At stock, yes, most Turings and Pascals boost to 18xx or 19xx MHz and should work fine in games in the high-1900 MHz range. Over 2000 MHz it starts becoming a problem. I am kind of surprised that Ampere would even try to boost that high, given the OC results we have seen in reviews.
Not that surprising, given how short that peak boost lasts.
Clock speed-wise, Turing clocks slightly higher than Ampere.
For example, my card briefly peaks over 2010 MHz and settles at around 1840 MHz at stock.
The really good chips can peak over 2100 MHz.
 
Or, you know... they simply never experienced the issue during testing. Feels like cherry-picked samples here?
I wonder how many ran FurMark... :p :rolleyes:
 
Well then, I'm thinking maybe the extreme shortage of RTX 3080s at the beginning is actually a good thing for me. I'll stick with my 2080 Ti until the 20 GB version comes out, or at least until I hear they've ironed out the bugs in the 3080 (and they're back in stock).
Don't bet that 20 GB of VRAM is going to run cooler. Not unless Nvidia actively does something about cooling it.
 
So it begins. There were multiple problems with the RTX 2000 series when it was released too. Best to wait several months for them to iron out all the problems.
Waiting a couple of months is generally a good idea with any new platform; most substantial problems should be identified by then.

It's too early to tell whether these claims are actual hardware issues, driver issues, or just cases of user error and misinformation. The most important thing is to precisely identify potential problems, so unrelated problems are not mixed together, like what happened with the RTX 2080 Ti "memory defects", where a few defective cards were mixed in with some early driver bugs. The embarrassing driver issues were quickly resolved, but not before many users confused them with actually bad samples. The RTX 2080 Ti still had lower RMA rates than most other GPUs, but the myth of defective GPUs lives on.

-----

I will be looking out for reports on the long-term reliability of GDDR6X hardware with PAM4 encoding.
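For background on that concern: PAM4 signaling packs two bits into every symbol by using four amplitude levels instead of NRZ's two, which doubles throughput per transfer but leaves much smaller voltage margins between levels, hence the reliability question. A toy Python sketch of the encoding idea (the Gray-coded level mapping below is a common PAM4 convention, not necessarily Micron's exact GDDR6X implementation):

```python
# Toy illustration of PAM4 signaling: four amplitude levels carry two bits
# per symbol, versus one bit per symbol for NRZ. The Gray-coded mapping
# below is a common PAM4 convention, not necessarily what GDDR6X uses.
PAM4_LEVELS = {(0, 0): 0, (0, 1): 1, (1, 1): 2, (1, 0): 3}  # bit pair -> level
LEVEL_BITS = {v: k for k, v in PAM4_LEVELS.items()}

def pam4_encode(bits):
    """Pack a flat bit list into PAM4 symbols (two bits per symbol)."""
    assert len(bits) % 2 == 0
    return [PAM4_LEVELS[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def pam4_decode(symbols):
    """Unpack PAM4 symbols back into the original bits."""
    return [b for s in symbols for b in LEVEL_BITS[s]]

bits = [1, 0, 1, 1, 0, 0, 0, 1]
symbols = pam4_encode(bits)
print(symbols)  # 4 symbols for 8 bits -> half the transfers NRZ would need
assert pam4_decode(symbols) == bits
```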
 
As with the 5700 XTs, I suspect the issue is other parts in the system.

How many of these users have a 750 W PSU that has been cycled through their last two builds?

It's easy to blame the shiny new GPU, but maybe you should look at that dusty, 8-year-old, $69 80+ Bronze unit.
 
At stock, yes, most Turings and Pascals boost to 18xx or 19xx MHz and should work fine in games in the high-1900 MHz range. Over 2000 MHz it starts becoming a problem. I am kind of surprised that Ampere would even try to boost that high, given the OC results we have seen in reviews.

My GTX 1080 (Pascal) runs fine at 2140 MHz on the core; I must have been lucky. Seems to me these new cards will benefit from a full water block on them.
 
These GPUs are not the first ones to draw a lot of power. A 5700 XT uses a lot of power too.
From what I understand, it's not the wattage. It's how rapidly the power requirements change on the newer cards, causing the PSU to miss the target.

I don't think most users are using $250+ PSUs like most reviewers are.
 
As with the 5700 XTs, I suspect the issue is other parts in the system.

How many of these users have a 750 W PSU that has been cycled through their last two builds?

It's easy to blame the shiny new GPU, but maybe you should look at that dusty, 8-year-old, $69 80+ Bronze unit.
So true.
Even expensive PSUs go bad, though, depending on how much load they take over time. They may no longer be able to deliver clean power and handle rapidly changing loads, even though on paper they should. I would recommend replacing any PSU that's 6-8 years old (not necessarily throwing it away, but putting it in a spare computer or similar, where it may still be usable).
 
From what I understand, it's not the wattage. It's how rapidly the power requirements change on the newer cards, causing the PSU to miss the target.

That's how all GPUs behave: when a frame is rendered, the power spikes up, and in between frames it falls to a very low value. This is nothing new; that's what I am saying.

And at the end of the day, if they somehow managed to make a GPU that draws current in such a bizarre way that most PSUs can't handle it, that's still their fault.

As with the 5700 XTs, I suspect the issue is other parts in the system.

How many of these users have a 750 W PSU that has been cycled through their last two builds?

It's easy to blame the shiny new GPU, but maybe you should look at that dusty, 8-year-old, $69 80+ Bronze unit.

This is quite an absurd argument; you are basically suggesting it's always something else and never the GPU. As I already said, this is not the first time a GPU has drawn a lot of power, so why is this time around different?
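To put rough numbers on the spike argument: a card can average well under its rated TDP while individual millisecond readings land far above it, and it's those transients, not the average, that can trip a marginal PSU's over-current protection. A synthetic Python sketch (all figures invented purely for illustration):

```python
# Synthetic illustration of average vs. transient GPU power draw. All numbers
# are made up: the point is that a load can average ~320 W while millisecond
# spikes land far higher, which is what trips over-current protection on a
# marginal PSU, not the average wattage.
import random

random.seed(42)

trace_w = []
for ms in range(1000):               # one second of 1 ms samples
    draw = random.uniform(280, 340)  # "typical" gaming draw
    if ms % 16 == 0:                 # brief spike at the start of each frame
        draw += random.uniform(120, 180)
    trace_w.append(draw)

avg = sum(trace_w) / len(trace_w)
peak = max(trace_w)
print(f"average draw: {avg:.0f} W")     # roughly what a reviewer would report
print(f"peak 1 ms draw: {peak:.0f} W")  # what the PSU's protection actually sees
```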
 
At stock, yes, most Turings and Pascals boost to 18xx or 19xx MHz and should work fine in games in the high-1900 MHz range. Over 2000 MHz it starts becoming a problem. I am kind of surprised that Ampere would even try to boost that high, given the OC results we have seen in reviews.

>2000 MHz isn't nearly as much of an issue for Pascal in terms of stability as it seemed to be for Turing. And now Ampere.

The only thing stopping you on Pascal was temperature and losing bins. 1070s and 1080s all boosted over 2 GHz, and a few bins over as well. I had a 1070 FTW doing 2140+, and this 1080 can do 2100. And it's not like I'm doing anything special: stock air cooler, pretty shit maintenance :D

I think what's tipping Ampere over is just too many new variables: a memory upgrade, GPU Boost changes... and a pretty high TDP to keep under control. They'll probably get it right in the end, and it might very well cost clocks, in a definitive way. In a worst-case scenario they might neuter these cards: force a certain voltage floor or limit the peak clock... not a pretty sight.
 
But I thought Nvidia was pure perfection... :rolleyes: This should teach people to wait before they buy, but nah, that would require the use of logic. Too hard.
 
That's how all GPUs behave: when a frame is rendered, the power spikes up, and in between frames it falls to a very low value. This is nothing new; that's what I am saying.

And at the end of the day, if they somehow managed to make a GPU that draws current in such a bizarre way that most PSUs can't handle it, that's still their fault.



This is quite an absurd argument; you are basically suggesting it's always something else and never the GPU. As I already said, this is not the first time a GPU has drawn a lot of power, so why is this time around different?

Fully agree. It's up to Nvidia to comply with standards, not the other way around.

In that sense, this reminds me of the power draw over the PCIe slot thing a few years back.
 
But I thought Nvidia was pure perfection... :rolleyes: This should teach people to wait before they buy, but nah, that would require the use of logic. Too hard.
If everybody waited, who'd spot these issues for you? (OK, you're probably not getting one of these any more than I am, but still...)
 
If everybody waited, who'd spot these issues for you? (OK, you're probably not getting one of these any more than I am, but still...)

Maybe we should figure out a master plan to get all the annoying people to buy the first ones and beta test for us.

Scalpers were doing a fine job but noooo
 
Fully agree. It's up to Nvidia to comply with standards, not the other way around.

In that sense, this reminds me of the power draw over the PCIe slot thing a few years back.
Which standards? Last I checked, you didn't get a PCI-SIG stamp of approval for anything that used more than 8+6 pins for power.
Though that might have changed; I think they had an update over the summer, but I don't know what's in it.
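For reference, the budget behind that "8+6 pins" remark works out like this under the long-standing PCI-SIG figures (75 W from the x16 slot, 75 W per 6-pin connector, 150 W per 8-pin connector); a tiny sketch:

```python
# Rough PCI-SIG power budget math for an 8+6 pin card. The per-connector
# limits are the standard PCI-SIG figures; the example card is hypothetical.
SLOT_W = 75        # PCIe x16 slot delivers up to 75 W
SIX_PIN_W = 75     # per 6-pin PCIe power connector
EIGHT_PIN_W = 150  # per 8-pin PCIe power connector

budget_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"8+6 pin power ceiling: {budget_w} W")  # 300 W under the spec
```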

Maybe we should figure out a master plan to get all the annoying people to buy the first ones and beta test for us.

Scalpers were doing a fine job but noooo
Let's not forget those who actually have that much disposable income and are happy to assist.
 