Monday, October 9th 2017

MSI Talks about NVIDIA Supply Issues, US Trade War and RTX 2080 Ti Lightning

Back on September 27th, MSI talked candidly with PConline at the MSI Gaming New Appreciation Conference in Shanghai. Multiple MSI executives were available to answer questions regarding products, launches, and potential issues. The first question asked was about the brewing US-China trade war and whether it will affect prices of graphics cards and CPUs. Liao Wei, Deputy General Manager of MSI's Global Multimedia Business Unit and MSI Headquarters Graphics Card Products, gave a direct answer: since NVIDIA's GPU cores are fabricated by TSMC in Taiwan and memory is supplied by Samsung and Hynix in South Korea and the United States respectively, there is little chance of further graphics card price hikes. CPU prices may increase on the Intel side, however, while AMD is expected to be unaffected.
They were next asked why NVIDIA's GeForce RTX graphics cards are already out of stock and hard to come by. Liao Wei again replied, stating that RTX 2080 shipments are actually holding up quite well, but RTX 2080 Ti shipments have been relatively small due to production yields. While this is to be expected considering the large size of the GPU die, it also means NVIDIA appears to be focusing more on RTX 2080 production. Liao Wei also touched on the fact that the RTX series is far more complex than the previous generation, with the RTX 2080 Ti using some 2600 components compared to the GTX 1080 Ti at 1600. For further comparison, the RTX 2080 is said to use 2400 components and the RTX 2070 some 2200. So not only are the newer GPUs more complex and made of more components, they also face a production crunch: it was possible to produce 3500 GPUs a day for the GTX 1080 Ti, while the RTX 2080 Ti can only be produced at a rate of 1800 per day. It is likely this is before screening for defective chips.
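The figures above invite a quick back-of-envelope check. This is only a sketch: the component counts and daily rates are the ones quoted in the article, and the comparison itself is plain arithmetic.

```python
# Quick arithmetic on the component counts and daily production
# rates quoted in the article (GTX 1080 Ti vs. RTX 2080 Ti).
components = {"GTX 1080 Ti": 1600, "RTX 2080 Ti": 2600,
              "RTX 2080": 2400, "RTX 2070": 2200}
daily_output = {"GTX 1080 Ti": 3500, "RTX 2080 Ti": 1800}

# The 2080 Ti carries 62.5% more parts...
extra_parts = components["RTX 2080 Ti"] / components["GTX 1080 Ti"] - 1
# ...but daily output falls by roughly 48.6%.
output_drop = 1 - daily_output["RTX 2080 Ti"] / daily_output["GTX 1080 Ti"]

print(f"RTX 2080 Ti carries {extra_parts:.1%} more components")
print(f"but daily output is down {output_drop:.1%}")
```

If output scaled purely inversely with component count, it would drop only about 38%, not roughly 49%, which fits the article's point that die yields on the large GPU, and not board complexity alone, are limiting RTX 2080 Ti supply.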

With prices so high, everyone is already waiting impatiently for the RTX 2070 launch to see just what it offers, and PConline were not afraid to ask when it would become available. MSI have stated they will officially launch their RTX 2070 graphics cards on October 17th, with production already well underway. They also touched on when the MSI Lightning series would make its return, with Yang Shengzhi, Senior Associate of MSI's Global Multimedia Products Department, stating it would be available very soon and will carry additional features that consumers have been gravitating towards, such as a better appearance and RGB lighting. That said, they reaffirmed that the card's true purpose will be overclocking, though they want to make it more consumer friendly. Source: PConline
Add your own comment

30 Comments on MSI Talks about NVIDIA Supply Issues, US Trade War and RTX 2080 Ti Lightning

#1
AsRock
TPU addict
AMD is expected to be unaffected.
Priceless, I am sure nVidia are paying more due to size. Personally I believe they should not have released it; I guess they wanted it out before AMD release their own version of ray tracing.

Yeah, well done nVidia, awesome marketing strategy: knowing full well the card would be expensive to begin with, knowing full well the nVidia tax, and then 3rd parties have to find a way to make a profit too.
US Trade War and nVidia Tax
Posted on Reply
#2
FordGT90Concept
"I go fast!1!11!1!"
Intel doesn't surprise me. Their CPU production infrastructure is very global. The odds of something hitting China are good.


I'm mostly surprised by how production costs for Turing are so much higher than Pascal. What are these "components" he's describing?


Hynix memory is manufactured in the USA? Color me surprised.
Posted on Reply
#3
okidna
AsRock said:
Priceless, i am sure nVidia are paying more due to size. Personally i believe they should of not released it, i guess they wanted it out before AMD release their own version of ray tracing.
Whoa, hold your horses, that's not about GPU price, but CPU price:

crazyeyesreaper said:
However CPU side prices may increase on the Intel side, however, AMD is expected to be unaffected.
Posted on Reply
#4
TheOne
Hard to believe there is a shortage when so many were given out to reviewers.
Posted on Reply
#5
Frick
Fishfaced Nincompoop
TheOne said:
Hard to believe there is a shortage when so many were given out to reviewers.
Reviewer cards are statistically non-existent.
Posted on Reply
#6
AsRock
TPU addict
okidna said:
Whoa, hold your horses, that's not about GPU price, but CPU price:
Ahh ok, but it's still a high nVidia tax; I still think they should not have released it.
Posted on Reply
#7
TheOne
Frick said:
Reviewer cards are statistically non-existent.
My point was they've spent more time and energy trying to hype this series than shore up stock for actual customers.
Posted on Reply
#8
yotano211
TheOne said:
Hard to believe there is a shortage when so many were given out to reviewers.
Most of the time those review cards have to be returned or passed on to other reviewers.
Posted on Reply
#9
TheOne
yotano211 said:
Most of the time those review cards have to be returned or passed on to other reviewers.
That is becoming more common with review samples. I don't know how many had to return these, but the youtubers were playing with theirs to see if they could get high scores on 3DMark in SLI/NVLink.
Posted on Reply
#10
yotano211
TheOne said:
That is becoming more common with review samples. I don't know how many had to return these, but the youtubers were playing with theirs to see if they could get high scores on 3DMark in SLI/NVLink.
I guess the youtubers that got to keep theirs have bigger followers.
Posted on Reply
#11
TheOne
yotano211 said:
I guess the youtubers that got to keep theirs have bigger followers.
Probably. It was a shame what they were doing to them though, especially with people waiting.
Posted on Reply
#12
yotano211
TheOne said:
Probably. It was a shame what they were doing to them though, especially with people waiting.
Not really a shame. I don't know the contract they have with Nvidia, but I am sure the youtubers that got to keep them had to advertise them, including trying to break records on 3DMark or other silly things.
Posted on Reply
#13
TheOne
yotano211 said:
Not really a shame. I don't know the contract they have with Nvidia, but I am sure the youtubers that got to keep them had to advertise them, including trying to break records on 3DMark or other silly things.
I think it was boredom, volt modding and air-conditioned water cooling, but I don't know, I just skimmed one video.
Posted on Reply
#14
yotano211
TheOne said:
I think it was boredom, volt modding and air-conditioned water cooling, but I don't know, I just skimmed one video.
I didn't like the one that Linus did.
Posted on Reply
#15
TheOne
yotano211 said:
I didn't like the one that Linus did.
He's not very methodical.
Posted on Reply
#16
John Naylor
I gotta think that prices will remain high until the stock of 10xx cards dwindles.

As for youtubers, I pay no attention unless it's from an author with a reputable web site. I'd rather see Bert and Ernie do tech reviews than Jayz2cents and Linus. Hightechlegion, aliebabeltech, hardwarecanucks, Bill Owen and Singularity Computers are the only ones I find worthwhile.
Posted on Reply
#17
Frick
Fishfaced Nincompoop
TheOne said:
My point was they've spent more time and energy trying to hype this series than shore up stock for actual customers.
They have departments and budgets for both: Marketing is one, and Production or whatever it's called is the other.
Posted on Reply
#18
Basard
FordGT90Concept said:
I'm mostly surprised by how production costs for Turing is so much higher than Pascal. What are these "components" he's describing?
Most likely all of the surface-mount transistors, resistors, capacitors... AND the Founders Edition cards have something like 75 screws holding the heat sinks together.

TheOne said:
Probably. It was a shame what they were doing to them though, especially with people waiting.
I would expect nothing less. It helps us know what we're buying--or not buying.
Posted on Reply
#19
TheOne
Basard said:
I would expect nothing less. It helps us know what we're buying--or not buying.
I wouldn't mind a good in-depth OC video, but what they were doing was impractical and compromising the hardware, and I just hate seeing things wasted.
Posted on Reply
#20
TheLostSwede
FordGT90Concept said:
I'm mostly surprised by how production costs for Turing is so much higher than Pascal. What are these "components" he's describing?
So nearly doubling the component count wouldn't increase the cost and production time? I guess you're not that familiar with SMT/SMD production?
Do you think all these "little" parts are free?

Just compare that to the 1080Ti and you'll see that just the power delivery circuitry is a lot more complex.

Instead of normal solid-state capacitors, they're using what look like tantalum capacitors, which are some 4-5x more expensive, for starters.
GDDR6 is most likely more expensive than GDDR5X, as it's "new" technology and production has most likely not ramped up fully.
Obviously the GPU itself is more expensive, as it's a bigger chip, but that one is on Nvidia.

Then take into account that the machines picking and placing the components can only operate so fast. A decent SMT machine today can do 100,000 components per hour, though this also depends on the PCB layout and the type of components. A PCB normally goes through a couple of these machines, and in between there are reflow ovens the boards have to pass through to solder the components to the board. The production lines move at little more than a snail's pace, so it takes time to do these things. Once the boards have had all the components placed and soldered, they're manually checked and, these days, most likely machine checked as well for any issues with component placement and soldering. This takes time. Finally, the parts that have to be added by hand are added, and the boards then go through yet another reflow oven to solder those parts in place. Once that's done, you have people fitting the cooling to the cards, which is obviously a manual job that takes time. Then you have to test the cards to make sure they're working properly; normally there should be a burn-in test that can take 24 hours or even longer.

So the more complex the PCB is, the longer it takes to make and the more expensive it gets.
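To put rough numbers on the placement step described above, here is a minimal sketch. Only the 100,000 components/hour machine rate and the component counts come from this thread; treating placement as machines working at that flat rate, with everything else ignored, is an assumption for illustration.

```python
# Pure pick-and-place time per board at the quoted machine rate.
RATE_PER_HOUR = 100_000  # components per hour for a decent SMT machine

def placement_seconds(component_count: int, machines: int = 1) -> float:
    """Seconds of pure placement time, split evenly across `machines`."""
    return component_count / (RATE_PER_HOUR * machines) * 3600

for name, parts in [("GTX 1080 Ti", 1600), ("RTX 2080 Ti", 2600)]:
    print(f"{name}: {placement_seconds(parts):.1f} s of placement")
# GTX 1080 Ti: 57.6 s, RTX 2080 Ti: 93.6 s
```

Placement alone adds roughly 36 seconds per RTX 2080 Ti board versus a GTX 1080 Ti, and every extra component is also another chance for a placement or solder defect at inspection, so the complexity compounds through the reflow, checking, and manual-assembly stages described above.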
Posted on Reply
#21
Basard
TheOne said:
I wouldn't mind a good in depth OC video, but what they were doing was impractical and compromising the hardware, and I just hate seeing things wasted.
Well, I partially agree. It's their job though. I'd do the same if I had a pile of RTX cards at my disposal, but then, I might just sell them all and buy something useful--probably why I'm not a youtuber.
Posted on Reply
#22
FordGT90Concept
"I go fast!1!11!1!"
Looking at that first picture, what the hell was NVIDIA thinking?
Posted on Reply
#23
medi01
TheOne said:
Hard to believe there is a shortage when so many were given out to reviewers.
By nVidia, which, surprisingly, is able to somehow get (less than) 100 cards for reviewers, even though they are sold out in most shops.

How could nVidia achieve that? Mysterious...
Posted on Reply
#24
Aquinus
Resident Wat-man
crazyeyesreaper said:
It is likely this is before screening for defecting chips.
Defecting chips? Where are they defecting to, AMD? :laugh:
Posted on Reply
#25
TheLostSwede
FordGT90Concept said:
Looking at that first picture, what the hell was NVIDIA thinking?
As so often happens in this industry, they let the engineers do the thinking, but clearly didn't take the time to optimise the design.
It's also a rather power hungry chip, so it needs good power delivery, which means a more expensive board.
I did a rough count and it looks like the RTX 2080Ti has 3x as many tantalum capacitors as the GTX 1080Ti, both being MSI cards in this case.
Even the Titan X (Pascal) looks simple in comparison, which makes the RTX 2080Ti look like a huge failure from a design standpoint.
I wouldn't be surprised if this was the most complex consumer card that Nvidia has made to date and I'm not talking about the GPU itself, but rather the PCB design.
Posted on Reply
Add your own comment