
MSI Talks about NVIDIA Supply Issues, US Trade War and RTX 2080 Ti Lightning

Looking at that first picture, what the hell was NVIDIA thinking?

As so often happens in this industry, they let the engineers do the thinking, but clearly didn't take the time to optimise the design.
It's also a rather power hungry chip, so it needs good power delivery, which means a more expensive board.
I did a rough count and it looks like the RTX 2080Ti has 3x as many tantalum capacitors as the GTX 1080Ti, both being MSI cards in this case.
Even the Titan X (Pascal) looks simple in comparison, which makes the RTX 2080Ti look like a huge failure from a design standpoint.
I wouldn't be surprised if this was the most complex consumer card that Nvidia has made to date and I'm not talking about the GPU itself, but rather the PCB design.
 
Looking at that first picture, what the hell was NVIDIA thinking?

You only need one look at those Pascal and Turing boards to see that RTX is the most wasteful practice in GPU history, with the most meagre payoff ever. They took years and several generations to fine-tune Kepler, and then they do this. o_O

I've seriously questioned whether we should be paying for all of that waste since day one. Still not convinced.
 
Priceless. I'm sure nVidia is paying more due to the die size. Personally, I believe they should not have released it; I guess they wanted it out before AMD release their own version of ray tracing.

Yeah, well done nVidia, awesome marketing strategy: knowing full well the card would be expensive to begin with, knowing full well there's the nVidia tax, and then third parties have to find a way to make a profit too.

They are talking about AMD CPUs.

So nearly doubling the component count wouldn't increase the cost and production time? I guess you're not that familiar with SMT/SMD production?
Do you think all these "little" parts are free?

[Images: front.jpg and back.jpg, showing the RTX 2080 Ti PCB]

Compare that to the 1080Ti below and you'll see that the power delivery circuitry alone is a lot more complex.

[Images: front.jpg and back.jpg, showing the GTX 1080Ti PCB]

Instead of normal solid-state capacitors, they're using what looks like tantalum capacitors, which are some 4-5x more expensive for starters.
GDDR6 is most likely more expensive than GDDR5X, as it's "new" technology and production has most likely not ramped up fully.
Obviously the GPU itself is more expensive, as it's a bigger chip, but that one is on Nvidia.
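
To put the capacitor cost point in perspective, here's a rough back-of-the-envelope estimate in Python. None of these figures come from an actual BOM; the unit prices and counts are made-up placeholders, purely to illustrate how a 4-5x unit-price gap multiplies across a board with roughly three times as many of these capacitors.

```python
# Back-of-the-envelope estimate of the extra capacitor cost.
# All figures are illustrative placeholders, not real BOM numbers.

polymer_unit_cost = 0.10       # hypothetical price of a solid polymer cap (USD)
tantalum_unit_cost = 0.45      # hypothetical tantalum price, roughly 4-5x higher

caps_1080ti = 20               # hypothetical capacitor count on the 1080Ti board
caps_2080ti = caps_1080ti * 3  # "roughly 3x as many", per the rough count earlier

cost_1080ti = caps_1080ti * polymer_unit_cost
cost_2080ti = caps_2080ti * tantalum_unit_cost

print(f"1080Ti-style caps: ${cost_1080ti:.2f} per board")
print(f"2080Ti-style caps: ${cost_2080ti:.2f} per board")
print(f"Extra cost from capacitors alone: ${cost_2080ti - cost_1080ti:.2f}")
```

A few dollars per board doesn't sound like much, but that's just one line item, and it compounds with everything else described below.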

Then take into account that the machines picking and placing the components can only operate so fast. A decent SMT machine today can do 100,000 components per hour, but this also depends on the PCB layout and the type of components. A PCB normally goes through a couple of these machines, and in between there are reflow ovens that the boards have to pass through to solder the components to the board. The production lines move at little more than a snail's pace, so all of this takes time.

Once all the components have been placed and soldered, the boards are manually checked and, these days, most likely machine checked as well for any issues with component placement and soldering. That takes time too. Finally, the parts that have to be fitted by hand are added, and the boards go through yet another reflow oven to solder those parts in place. Once that's done, you have people fitting the cooling to the cards, which is obviously a manual job and takes time. Then you have to test the cards to make sure they're working properly, and normally there should be a burn-in test that can take 24 hours or even longer.

So the more complex the PCB is, the longer it takes to make and the more expensive it gets.
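
For the pick-and-place step specifically, here's a minimal sketch using the 100,000 components per hour figure mentioned above. The board component counts are made-up placeholders, just to show how doubling the part count doubles the machine time before you even get to reflow, inspection, hand assembly and burn-in.

```python
# Minimal sketch: how component count maps to pick-and-place machine time.
# The rate is the figure quoted above; the component counts are made-up
# placeholders, not measured values for any real board.

PLACEMENTS_PER_HOUR = 100_000

def placement_seconds(component_count: int,
                      rate_per_hour: int = PLACEMENTS_PER_HOUR) -> float:
    """Ideal placement time for one board, ignoring reflow, inspection,
    hand assembly and burn-in, which add far more wall-clock time."""
    return component_count / rate_per_hour * 3600

for name, count in [("simpler board (hypothetical)", 1_000),
                    ("board with ~2x the parts (hypothetical)", 2_000)]:
    print(f"{name}: ~{placement_seconds(count):.0f} s of machine time")
```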


THANK YOU. People think things should be given to them for free, and that if a card is complex and advanced, "they should not have released it". WTF.
 
Look how relatively barren Vega Frontier Edition is by comparison:
[Image: AMD-Radeon-Vega-Frontier-Edition-PCB_1.jpg]

Or the Sapphire RX 580 Nitro+:
[Image: 580np-11b.jpg]


NVIDIA is not making any AIB friends with the Turing cards.
 
By nVidia, which, surprisingly, somehow managed to get (fewer than) 100 cards out to reviewers, even though they are sold out in most shops.

How could nVidia achieve that? Mysterious...

If a company can't even spare 100 cards for marketing to try to sell a whole product lineup, they have issues.

As so often happens in this industry, they let the engineers do the thinking, but clearly didn't take the time to optimise the design.
It's also a rather power hungry chip, so it needs good power delivery, which means a more expensive board.
I did a rough count and it looks like the RTX 2080Ti has 3x as many tantalum capacitors as the GTX 1080Ti, both being MSI cards in this case.
Even the Titan X (Pascal) looks simple in comparison, which makes the RTX 2080Ti look like a huge failure from a design standpoint.
I wouldn't be surprised if this was the most complex consumer card that Nvidia has made to date and I'm not talking about the GPU itself, but rather the PCB design.

Did you look at the Titan V design? You said they "didn't take time to optimize the design" but why wouldn't they? They had time and you think it's not optimized but it probably is.
 
Did you look at the Titan V design? You said they "didn't take time to optimize the design" but why wouldn't they? They had time and you think it's not optimized but it probably is.

It does indeed look like they borrowed a lot from the Titan V power design. Obviously that card uses HBM, so you'd expect it to be quite different, yet it doesn't seem to be all that different.
Normally the third-party board makers tend to improve upon the reference designs, be it from Nvidia or AMD, hence my comment about optimisation.
That said, things aren't always improved upon. Read @W1zzard's comments in older graphics card reviews about the fact that the main voltage controller always seems to be a "budget" part that doesn't support things like I2C for monitoring. This isn't just done on the lower-end cards, but on flagship cards as well. There's really no reason to cut corners on things like this, but apparently Nvidia could save a quarter per card, so they decided to pocket the difference. On occasion the third-party board makers add better components, but without software support that doesn't always translate into additional features.

It's possible that we're looking at a power hog here, but GDDR6 was supposed to draw less power than GDDR5/5X, which should result in lower overall card power draw. In other words, the only reason for the rather extreme power regulation would be the GPU itself using far more power than the equivalent Pascal-based GPU.
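
Going back to the I2C monitoring point: here's a minimal sketch of what polling a voltage controller over I2C/SMBus could look like, assuming a controller that actually exposes telemetry. The bus number, device address and voltage scaling are hypothetical placeholders; a real controller defines its own register map and data format (0x8B is the standard PMBus READ_VOUT command code).

```python
# Illustrative sketch: polling a VRM controller over I2C/SMBus, assuming the
# controller exposes telemetry at all. Bus number, address and scaling are
# hypothetical placeholders, not values for any specific part.

from smbus2 import SMBus  # pip install smbus2

I2C_BUS = 1        # hypothetical I2C bus number
VRM_ADDR = 0x40    # hypothetical controller address on that bus
READ_VOUT = 0x8B   # standard PMBus READ_VOUT command code

with SMBus(I2C_BUS) as bus:
    raw = bus.read_word_data(VRM_ADDR, READ_VOUT)
    # Real parts encode the value in a controller-specific format (e.g. PMBus
    # LINEAR16); the divisor here is a stand-in, not an actual conversion.
    vout = raw / 4096.0
    print(f"Reported output voltage: {vout:.3f} V (illustrative scaling)")
```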

Just looking at the board layouts, I'm getting a weird feeling that something didn't quite go as planned on Nvidia's side, at least not when it came to raw performance. Whereas Pascal performed so well that they could move GPUs down a tier compared to past architectures, it seems Turing worked out the other way around. As such, the RTX 2080 Ti should've been the Titan card, but due to underwhelming performance gains over Pascal, it sits where it sits today, is costly to produce and also retails at a Titan-level price point.
This is the first time since at least the 400-series that we've seen a top-of-the-range card retail for over $1,000 (OK, the MSRP is $999, but that's unlikely to be the street price).
Sure, that's over an eight-year time period and we need to consider things like inflation, but in this case it's an overly inflated price for an overly complex and underperforming part. On top of that, top-of-the-range Ti cards have only existed since the 700-series, and for three generations they never surpassed $699.

Yes, Nvidia has added a ton of new technology, and even though no one was really asking for any of it in a consumer graphics card today, it does help push the envelope. Unfortunately, it seems to have been a push too far, too soon.
 