Thursday, February 16th 2012

NVIDIA Kepler Yields Lower Than Expected.

NVIDIA seems to be playing the blame game, according to an article over at Xbit. This is what they had to say: "NVIDIA Corp.'s chief executive officer said that, in addition to the continuously increasing capital expenditures the company has run into in recent months, it expects lower-than-expected gross margins in the forthcoming quarter. The company blames low yields of its next-generation graphics chips, code-named Kepler, which are made at TSMC’s 28nm node. “Decline [of gross margin] in Q1 is expected to be due to the hard disk drive shortage continuing, as well as a shortage of 28nm wafers. We are ramping our Kepler generation very hard, and we could use more wafers. The gross margin decline is attributed almost entirely to the yields of 28nm being lower than expected. That is, I guess, unsurprising at this point,” said Jen-Hsun Huang, chief executive officer of NVIDIA, during a conference call with financial analysts.

NVIDIA’s operating expenses have been increasing for about a year now: from $329.6 million in Q1 FY2012 to $367.7 million in Q4 FY2012, and the company expects OpEx to be around $383 million in the ongoing Q1 FY2013. At the same time, it expects its gross margin in Q1 FY2013 to decline below 50% for the first time in many quarters, to 49.2%. NVIDIA has very high expectations for its Kepler generation of graphics processing units (GPUs). The company claims that it has signed contracts to supply mobile versions of GeForce “Kepler” chips with every single PC OEM in the world. In fact, NVIDIA says Kepler is the best graphics processor ever designed by the company. “[With Kepler, we] won design wins at virtually every single PC OEM in the world. So, this is probably the best GPU we have ever built, and the performance and power efficiency is surely the best that we have ever created,” said Mr. Huang.

Unfortunately for NVIDIA, yields of Kepler are lower than the company originally anticipated, and its costs are therefore higher. NVIDIA's chief executive remains optimistic and claims that the situation with the Fermi ramp-up was even worse. “We use wafer-based pricing now; when the yield is lower, our cost is higher. We have transitioned to wafer-based pricing for some time, and our expectation, of course, is that the yields will improve as they have in the previous generation nodes, and as the yields improve, our output would increase and our costs will decline,” stated the head of NVIDIA.
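As a back-of-the-envelope illustration of why wafer-based pricing ties costs to yield, the arithmetic can be sketched as follows. All figures here are made up for the example; they are not NVIDIA's or TSMC's actual numbers.

```python
# Under wafer-based pricing, the fab charges a fixed price per wafer,
# so the cost of each sellable chip scales inversely with yield.

def cost_per_good_die(wafer_price, dies_per_wafer, yield_rate):
    """Cost of one sellable chip when the fab charges per wafer."""
    good_dies = dies_per_wafer * yield_rate
    return wafer_price / good_dies

# Same hypothetical wafer, two yield scenarios:
low_yield = cost_per_good_die(5000.0, 200, 0.30)   # early 28nm ramp
mature = cost_per_good_die(5000.0, 200, 0.60)      # mature node
print(f"low yield: ${low_yield:.2f}/die, mature: ${mature:.2f}/die")
# → low yield: $83.33/die, mature: $41.67/die
```

Halving the yield doubles the per-chip cost, which is why Huang expects costs to decline as yields improve, just as they did on previous nodes.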

Kepler is NVIDIA's next-generation graphics processor architecture that is projected to bring considerable performance improvements and will likely make the GPU more flexible in terms of programmability, which should speed up development of applications that take advantage of GPGPU (general-purpose processing on GPU) technologies. Some of the technologies that NVIDIA has promised to introduce in Kepler and Maxwell (the architecture that will succeed Kepler) include a virtual memory space (which will allow CPUs and GPUs to use "unified" virtual memory), pre-emption, an enhanced ability of the GPU to autonomously process data without the help of the CPU, and so on. Entry-level chips may not get all the features that the Kepler architecture has to offer."
Source: Xbit Laboratories

75 Comments on NVIDIA Kepler Yields Lower Than Expected.

#1
DannibusX
omegared26 said:
Don't worry, NVIDIA fanboys, Kepler will be out when AMD has the 8xxx series.
Wow. You're worthless.
#2
omegared26
:)

You think so? Kid, did I talk bad to you? Maybe you're good for nothing, but I didn't say anything to you, so please just shut up. See ya, DannibusX.
#4
hellrazor
mastrdrver said:
How is it that nVidia is the only one affected by the hard drive shortage? AMD never said its GPU sales were affected by the shortage.
Because they have computers, and computers really like to have hard drives?
#6
m1dg3t
Nvidia never surprises me, NEVER :ohwell:
#7
EpicShweetness
I've heard this little line too many times from NVIDIA people: "All your AMD cards are good at is gaming."
Why yes sir, you are correct, my AMD card is very competitive in that sense compared to your NVIDIA one, and guess what? I bought my card for gaming, so why would I want a chip that has all these extra features that sound awesome but that I hardly use? By making such a colossal chip it's much more complex and much harder to develop. Stop wasting your time with it, NVIDIA, and bring us gaming performance so I don't have to pay so much!
#8
omegared26
EpicShweetness said:
I've heard this little line too many times from NVIDIA people: "All your AMD cards are good at is gaming."
Why yes sir, you are correct, my AMD card is very competitive in that sense compared to your NVIDIA one, and guess what? I bought my card for gaming, so why would I want a chip that has all these extra features that sound awesome but that I hardly use? By making such a colossal chip it's much more complex and much harder to develop. Stop wasting your time with it, NVIDIA, and bring us gaming performance so I don't have to pay so much!
I have used ATI and NVIDIA cards over 10 years, and I saw big differences in colors and image quality. My conclusion was that ATI/AMD is better in all things (price, video quality...).
#9
Super XP
EpicShweetness said:
I've heard this little line too many times from NVIDIA people: "All your AMD cards are good at is gaming."
Why yes sir, you are correct, my AMD card is very competitive in that sense compared to your NVIDIA one, and guess what? I bought my card for gaming, so why would I want a chip that has all these extra features that sound awesome but that I hardly use? By making such a colossal chip it's much more complex and much harder to develop. Stop wasting your time with it, NVIDIA, and bring us gaming performance so I don't have to pay so much!
NV has no choice but to design multi-use Graphics Cards that do a lot more than just gaming. This is needed so they can remain competitive, and not just in gaming.

If NV just stuck to simple designs that offered solid top-class gaming performance, they would have been close to filing for Chapter 11. IMO.
#10
Rowsol
omegared26 said:
You think so? Kid, did I talk bad to you? Maybe you're good for nothing, but I didn't say anything to you, so please just shut up. See ya, DannibusX.
mmmmk
#12
Recus
omegared26 said:
Don't worry, NVIDIA fanboys, Kepler will be out when AMD has the 8xxx series.
Since mid-range Kepler will be faster than all of the 7000 series, one card will rule them all. Flagship Kepler will compete with the 8000 series.
#14
radrok
omegared26 said:
I have used ATI and NVIDIA cards over 10 years, and I saw big differences in colors and image quality. My conclusion was that ATI/AMD is better in all things (price, video quality...).
Could you please explain this "difference" in colours?
I'm really looking forward to what you will pull out now.
#15
omegared26
radrok said:
Could you please explain this "difference" in colours?
I'm really looking forward to what you will pull out now.
Man, the colors on AMD are better than NVIDIA's. You will see that if you have two PCs with the same monitors (one with an AMD card and one with an NVIDIA card): AMD has better colors, NVIDIA has pale colors. Believe me, when you run that test you will see I'm right.
#16
radrok
You should calibrate the monitor every time you switch GPU before doing any analysis on colour.
If you just plug and forget, then you can't really complain about colours.
#17
Recus
omegared26 said:
And lol, since NVIDIA will cost 3x more than AMD you should buy 4 of those, but I bet you don't have money for that.
If I'm buying 4 cards, I don't care about the price.
#18
Fluffmeister
omegared26 said:
Man, the colors on AMD are better than NVIDIA's. You will see that if you have two PCs with the same monitors (one with an AMD card and one with an NVIDIA card): AMD has better colors, NVIDIA has pale colors. Believe me, when you run that test you will see I'm right.
AMD does have awesome colors.
#19
pr0n Inspector
omegared26 said:
I used in 10 years Ati cards and Nvidia cards and i saw big differences at collors and quality of image and my conclusion was Ati-Amd is better in all things (price,video quality.....)
Are you trying to reverse-troll?
Because the last thing I want is a video card meddling with my images and videos. Maybe that's why I color-manage everything I can.
#20
W1zzard
colors are even better on LSD

I think the OP is talking about analog VGA outputs, which, due to their analog nature, can suffer from picture quality degradation depending on the components used on the board. This shouldn't happen with DVI/HDMI/DP.
#21
wolf
Performance Enthusiast
Yields on big chips, on a new node, are low? Really? Noooooo... :rolleyes:

It's happened before, and it'll happen again. They will be competitive cards, I'm sure of that.
#22
Red_Machine
AMD has nVIDIA by the balls. They've got an arrangement with TSMC to prioritise AMD chips, which leaves nVIDIA with very little fab time.
#23
eidairaman1
The Exiled Airman
Red_Machine said:
AMD has nVIDIA by the balls. They've got an arrangement with TSMC to prioritise AMD chips, which leaves nVIDIA with very little fab time.
You have to realize TSMC makes other chips as well, not just AMD's or Nvidia's.
#24
the54thvoid
eidairaman1 said:
You have to realize TSMC makes other chips as well, not just AMD's or Nvidia's.
Yar. I think Apple has a deal with TSMC. A big deal.
#25
eidairaman1
The Exiled Airman
the54thvoid said:
Yar. I think Apple has a deal with TSMC. A big deal.
However, it seems TSMC has had a lot of teething problems over the last 5 years.