
NVIDIA AIC Partners Clarify RTX 3080/3090 Crash to Desktop Issues, Capacitor Choices

Yeah, this makes early adoption look worse than before, that's for sure. I'll wait a while before upgrading now.

Edit: I probably couldn't buy a 3080 or 3090 if I wanted to anyway, since stock is gone.
 
I can't understand why there is such a buzz about a preliminary assessment from all the brands.
All their voices combined, once filtered of sweet-talk, deliver a simple message .... we are aware of the problem.
 
Of course they blame "...overzealous Boost algorithms ..."

SO OBVIOUS! NOW THEY ARE SOFTENING UP PEOPLE so they will accept a new firmware with nerfed boost clocks.

Lowering performance is the only way they can fix their poor capacitor choices without taking the GFX cards back.

IF YOU HAVE A 3080, RETURN IT NOW WHILE YOU HAVE A CHANCE, DO NOT ACCEPT LOWER PERFORMANCE.

Some manufacturers made poor capacitor choices to make more money, it was greed pure and simple.

Who the flock saves a couple of measly cents on a $699 card? They must be really cheap.

What I heard is that ASUS and GIGABYTE went all in even on the cheapest versions; has anyone heard of any problems with their cards?

This is why I enjoy hardware launches. So many conspiracy theories and so much hate from people who aren't affected because they didn't even buy the product, so much virtue signalling and drama. Someone more cynical might compare such situations to a horde of rabid dogs looking for a fight, but rabid dogs actually have teeth, so more like a bunch of hyperactive kittens - cute when angry, but ultimately powerless.

I kind of regret ordering one. Not because of the issue at hand, I don't bother with overclocking, but the whole drama might lead to some price cuts - not very probable, but who knows. So yeah, put as much tinfoil on your head as you want, it's unlikely most people will care. And since the new drivers apparently fix the issue without nerfing performance (and so does turning hardware scheduling off, if some reports are to be believed), there might be more to the issue than "omg, bad caps!".
 
So many conspiracy theories and so much hate from people who aren't affected because they didn't even buy the product, so much virtue signalling and drama.

Who told you that whenever NVIDIA launches new hardware, the vast majority of consumers act like blind sheep and open their wallets?
If NVIDIA thinks it will mobilize even 1% of its entire worldwide customer base for a new purchase, they are crazier than I thought.
 
Who told you that whenever NVIDIA launches new hardware, the vast majority of consumers act like blind sheep and open their wallets?
If NVIDIA thinks it will mobilize even 1% of its entire worldwide customer base for a new purchase, they are crazier than I thought.
I'm not saying that anywhere. I just find it amusing that the most angry and aggressive consumers are usually those completely unaffected by the issue.
 
I'm not saying that anywhere. I just find it amusing that the most angry and aggressive consumers are usually those completely unaffected by the issue.
Understood. I can only speak for myself: I have good emotional reasons to want to watch the AMD logo go up in flames and burn, but I refuse to share my sentiments with AMD product owners.
I bet that makes me a good person :)
 
Understood. I can only speak for myself: I have good emotional reasons to want to watch the AMD logo go up in flames and burn, but I refuse to share my sentiments with AMD product owners.
I bet that makes me a good person :)
Well, might be a start in a good direction ;)
Maybe they aren't paralyzed by it?
Paralyzed how? I mean, the 3080 is obviously operating on the edge, with no headroom left. Like any product, it has flaws. I just spent my money on it (and the cheapest ones are closer to $900 where I live, yay Europe), knowing full well that overclocking it even slightly might have unwanted consequences or be impossible at all, and that some things might come out after some time.
It's exactly the same with anything, things might have hidden defects and that's just something you have to accept as a consumer, especially if you're an early adopter. The manufacturer obviously did something to mitigate the issue, and at this point it seems to be just the "outrage culture" making people behave stupid.
 
Paralyzed how? I mean, the 3080 is obviously operating on the edge, with no headroom left. Like any product, it has flaws. I just spent my money on it (and the cheapest ones are closer to $900 where I live, yay Europe), knowing full well that overclocking it even slightly might have unwanted consequences or be impossible at all, and that some things might come out after some time.
It's exactly the same with anything, things might have hidden defects and that's just something you have to accept as a consumer, especially if you're an early adopter. The manufacturer obviously did something to mitigate the issue, and at this point it seems to be just the "outrage culture" making people behave stupid.
Mind you, I feel for the consumers whose cards break, me being one of them. It's just that people might not put the blame where it belongs. You not being one of them, I'm in total agreement. I don't like SJWs myself.
Also, this discussion with you brought me another perspective. People don't just want purchase justification, but also effort justification: it is not enough for them that Nvidia did a terrific job, or that the device already comes packed with its unique character and quirks. They want the unreachable in the name of personal glory. They don't see already-developed traits; they want to create traits of their own. It is as if, had Nvidia left overclocking headroom, people would have been happier. So funny to see that consumer word-of-mouth marketing has superseded Nvidia's concerted efforts at their own branding.
Guys, I also don't understand the "hardware scheduling" stuff. Didn't Nvidia drop that with Kepler?
 
Last edited:
They are either using the terms interchangeably or they don't know what they're talking about.
People use names as a generic term all the time and are normally understood unless speaking to a pedant.
e.g.: Hoover meaning vacuum, Coke meaning cola, etc.
 
The English up above is so bad I can't tell if they're pro Nvidia, AMD or choice.
I've given up trying to decipher, my sanity is not worth the kost. ;)
 
Someone more cynical might compare such situations to a horde of rabid dogs looking for a fight, but rabid dogs actually have teeth, so more like a bunch of hyperactive kittens - cute when angry, but ultimately powerless.

I've been bitten and scratched by a very scared small kitten I was trying to catch, when I was younger. It left me with bloody hands that took a few days to stop hurting. Trust me, kittens and cats are anything but powerless, and once you experienced the pain they can inflict you'll try to avoid them at all costs. I've seen huge dogs running away from small cats, like their life depended on it. Probably after similar painful experiences.
 
I've given up trying to decipher, my sanity is not worth the kost. ;)
Lol, it's actually sad because I really was reading the posts a couple times trying to make sense of it all.

I've been bitten and scratched by a very scared small kitten I was trying to catch, when I was younger. It left me with bloody hands that took a few days to stop hurting. Trust me, kittens and cats are anything but powerless, and once you experienced the pain they can inflict you'll try to avoid them at all costs. I've seen huge dogs running away from small cats, like their life depended on it. Probably after similar painful experiences.

Smaller cats have super sharp claws since they're thinner. At least that sounds believable, right?
 
You don't say
 
Nvidia and Intel have laughed in the face of consumers, knowing they had the market covered. AMD has already spanked Intel, and I can't wait to see them slap around Ngreedia
 
Designing today's electronics is much more challenging: the edges of digital square waves, for example, are much sharper than in the past, so the power-supply decoupling capacitors and the printed circuit board inter-plane capacitance need to be combined to achieve a low impedance at much higher frequencies. For high signal integrity and low electromagnetic interference (EMI), the layout and the decoupling parts used are critical.

"POSCAP" is a solid electrolytic chip capacitor developed by SANYO (now Panasonic) where the Anode is sintered Tantalum and the Cathode is a highly conductive polymer formed. The resonant frequency depends on the series and is approximately 100KHz to 1MHz and above this frequency the impedance increases as it is dominated by the series inductance. Refer page 19;


Multilayer Ceramic Capacitors (MLCC) soldered in parallel with POSCAPs are normally needed to lower the impedance at the higher frequencies; they have a resonant frequency of approximately 10 MHz to 100 MHz, for example.


Printed Circuit Board (PCB) inter-plane capacitance is created by pairs of large conductor planes with a dielectric (e.g. FR-4), normally connected in parallel to the power supply decoupled by the POSCAP and MLCC capacitors. These planes have extremely low series inductance and provide low power-supply impedance from 500 MHz to above 5 GHz. The capacitance value, however, is much lower than that of MLCC and POSCAP types.

Therefore, without MLCCs (ceramic capacitors) you may have signal integrity issues, depending on the supply current and the capacitance needed.
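The frequency ranges above drop out of the usual series R-L-C model of a real capacitor: impedance hits its minimum (equal to the ESR) at the self-resonant frequency 1/(2π√(LC)) and looks inductive above it. Here's a minimal Python sketch of that model; the C/ESL/ESR values are illustrative numbers I picked to land in the ranges quoted, not figures from any datasheet or any actual RTX 3080 board.

```python
import math

def cap_impedance(f_hz, c_farads, esl_henries, esr_ohms):
    """Impedance magnitude of a real capacitor (series R-L-C model)."""
    w = 2 * math.pi * f_hz
    reactance = w * esl_henries - 1.0 / (w * c_farads)
    return math.sqrt(esr_ohms ** 2 + reactance ** 2)

def resonant_frequency(c_farads, esl_henries):
    """Self-resonant frequency: the impedance minimum, where |Z| = ESR."""
    return 1.0 / (2 * math.pi * math.sqrt(esl_henries * c_farads))

# Illustrative values: a 330 uF polymer (POSCAP-style) part
# vs a single 0.47 uF MLCC with much lower series inductance.
poscap = dict(c_farads=330e-6, esl_henries=3e-9, esr_ohms=0.01)
mlcc = dict(c_farads=0.47e-6, esl_henries=0.5e-9, esr_ohms=0.003)

print(f"polymer SRF ~ {resonant_frequency(330e-6, 3e-9) / 1e3:.0f} kHz")
print(f"MLCC    SRF ~ {resonant_frequency(0.47e-6, 0.5e-9) / 1e6:.1f} MHz")

# Above its SRF the polymer cap looks inductive, so the MLCC takes over.
for f in (100e3, 1e6, 10e6, 100e6):
    zp = cap_impedance(f, **poscap)
    zm = cap_impedance(f, **mlcc)
    print(f"{f / 1e6:7.2f} MHz  polymer {zp * 1e3:9.2f} mOhm  MLCC {zm * 1e3:9.2f} mOhm")
```

Running it puts the polymer part's impedance minimum near a couple hundred kHz and the ceramic's around 10 MHz, which is exactly why the two are used side by side on the supply rail.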
 
Nvidia and Intel have laughed in the face of consumers, knowing they had the market covered. AMD has already spanked Intel, and I can't wait to see them slap around Ngreedia

Except that AMD is like a mouse nibbling at a snake. The mouse can eventually kill the snake, but it's not going to happen anytime soon. Just think: if everyone used the same process node, how would AMD be doing? Relatively worse, in my opinion.
 
This is why you shouldn't blindly buy a product right at launch, fueled by hype.

At least we can appreciate their sacrifices; those cards are not cheap by any means. Here, the RTX 3080 is $1000 and the RTX 3090 over $2000.

I always wait a couple of months or even a year: more AIB models, ironed-out drivers, etc.
 
Yeah, why give it to you (the customer) as an extra feature, when they can sell it to you as an extra feature and at the same time help their models move up the performance charts on hardware sites?

Certainly; that's why I call events such as these 'Karma'.
 
Finally we have some confirmation of the SP-CAP and MLCC usage


Yes, replacing 2 SP-CAPs with 20 MLCCs can improve stability by 30 MHz with the same driver version. With the latest driver, stability should be even better at high clocks, I guess.
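For what it's worth, the arithmetic behind swapping bulk caps for a bank of MLCCs is simple: n identical capacitors in parallel give n times the capacitance and one nth of the ESL and ESR, which lowers the impedance across the board at high frequency. A quick sketch with made-up per-MLCC values (0.47 uF / 0.5 nH / 3 mOhm are illustrative, not measured from any actual card):

```python
def parallel_bank(n, c_farads, esl_henries, esr_ohms):
    """Equivalent C, ESL and ESR of n identical capacitors in parallel."""
    return n * c_farads, esl_henries / n, esr_ohms / n

c, esl, esr = parallel_bank(20, 0.47e-6, 0.5e-9, 0.003)
print(f"20x MLCC bank: C = {c * 1e6:.1f} uF, "
      f"ESL = {esl * 1e12:.0f} pH, ESR = {esr * 1e3:.2f} mOhm")

# Note: the bank's self-resonant frequency equals that of a single MLCC
# (C scales up by n while L scales down by n, so the L*C product is
# unchanged), but the impedance minimum it reaches there is n times lower.
```

Whether that electrical improvement translates into the reported ~30 MHz of extra boost stability is exactly what the experiment above was trying to measure.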
 
This is another experiment in the right direction, but the word "confirmation" cannot be used.
In the language that electronics engineers and the industry use, confirmation means testing at least ten samples and seeing identical behavior at the end of the test.
I am one step closer to speculating that this is GPU-architecture drama and/or a creepage-distance issue above the 1.9 GHz mark.
 
Here's a clarification for y'all:


The cards are 100% stable at stock and start crashing when people OC them too much, which has always been the case. I mean, the beast already consumes 320 W; why would you want to add a 2% OC on top of that while increasing your power consumption by an additional ~50 W?
Is that something Nvidia said (need a source for that), or something you made up?

I've given up trying to decipher, my sanity is not worth the kost. ;)
wait, what sanity?

I got some in my pocket.
 
Finally we have some confirmation of the SP-CAP and MLCC usage


Yes, replacing 2 SP-CAPs with 20 MLCCs can improve stability by 30 MHz with the same driver version. With the latest driver, stability should be even better at high clocks, I guess.
This man has balls!

It's a nice demonstration; "confirmed" is a risky word, but it's very good work.
 