Friday, August 15th 2008

Jen-Hsun Huang (NVIDIA): ''We Underestimated RV770''

NVIDIA suffered its first loss-making quarter in five years. There are several contributors to this, chief among them a write-off of up to US $200 million to cover the cost of recalling and replacing faulty mobile graphics processors.

Another factor has been a replenished product lineup from competitor AMD/ATI that takes on NVIDIA products across the mid-range, high-end, and enthusiast segments of the market. In essence, ATI now has a product to counter NVIDIA in every possible segment, with more being readied for launch.

Seeking Alpha spoke with CEO Jen-Hsun Huang, who was quoted as saying:
We underestimated the price performance of our competitor’s most recent GPU, which led us to mis-position our fall lineup. The first step of our response was to reset our price to reflect competitive realities. Our action put us again in a strong competitive position but we took hard hits with respect to our overall GPU ASPs and ultimately to our gross margins. The price action was particularly difficult since we are just ramping 55-nanometer and the weak market resulted in taking longer than expected to work through our 65-nanometer inventory.
Huang says that with their transition to the 55 nm silicon fabrication process, they hope to do better.

Source: Seeking Alpha

92 Comments on Jen-Hsun Huang (NVIDIA): ''We Underestimated RV770''

#1
Nyte
Weer said:
Technically speaking, GDDR5 is a waste of money when you can get the same bandwidth with a higher bus width.
Higher bus width is considerably more complex in area, cost, and design than just buying GDDR5 chips, which cost almost the same as GDDR3 chips.

Trust me.
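Nyte's trade-off can be sketched with a quick back-of-the-envelope calculation: peak memory bandwidth is just bus width times per-pin data rate, so a narrow bus with fast GDDR5 can approach a wide bus with slower GDDR3. The figures below are approximate launch specs for the HD 4870 and GTX 280, used purely for illustration:

```python
# Back-of-the-envelope memory bandwidth: bus width vs. per-pin data rate.
# Clock/rate figures are approximate launch specs, for illustration only.

def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) x (transfers per second)."""
    return bus_width_bits / 8 * data_rate_gt_s

# HD 4870: narrow 256-bit bus, but fast GDDR5 (900 MHz base, 3.6 GT/s effective)
hd4870 = mem_bandwidth_gb_s(256, 3.6)    # ~115.2 GB/s

# GTX 280: slower GDDR3 (~1107 MHz, ~2.2 GT/s effective), but a wide 512-bit bus
gtx280 = mem_bandwidth_gb_s(512, 2.214)  # ~141.7 GB/s

print(f"HD 4870: {hd4870:.1f} GB/s, GTX 280: {gtx280:.1f} GB/s")
```

The wide bus wins on raw bandwidth, but, as Nyte notes, doubling the bus width means more memory chips, more PCB traces, and a larger die perimeter for the extra memory controller pads, whereas GDDR5 gets comparable bandwidth out of the same 256-bit bus.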
#2
Scrizz
The GDDR5 makes a huge difference, wish I had it :cry:
#3
KainXS
I have to agree with tk, the difference between the GTX280 and 4870 is 20% tops

maybe not even that high
#4
Megasty
KainXS said:
I have to agree with tk, the difference between the GTX280 and 4870 is 20% tops

maybe not even that high
Even in Crysis it's really close... This comes from my good buddy who managed to blow one of his 3 GTX280s :D

http://www.youtube.com/watch?v=DYnXxI1UjxE

I say he did a great job editing the 4 videos together :toast:
#5
captainskyhawk
candle_86 said:
the 4870 is slower than the 280; the 4870's competition is the 260. It took the 4870 X2 for AMD to take a lead.
I'd say that considering the 4870 actually bests the 280 in 50% of games up to 1920, it's hardly slower -- I'd say they were pretty much neck and neck, with the 280 having some benefits at even higher resolutions.
#6
btarunr
Editor & Senior Moderator
candle_86 said:
I am not a fanboi
The Sun rises in Sunnyvale, sets in Tokyo.
#7
knowledge123
I would hope that this would put them in a more humble and reflective mood; after their 'we're going to open a can of whoop-ass' (:rolleyes:) comments and smack talking, I am glad that ATi have lived up to and beyond what nVidia thought they were capable of. :toast: :cool: :slap:
Let us hope that they won't be as stuck-up about themselves. nVidia are a good company, but I feel that in recent times they have grown too big for their boots, and I'm very happy that ATi have put them back in line. :)
#8
Wile E
Power User
candle_86 said:
I am not a fanboi, I am making a statement which is true: clock for clock the HD4850 and 4870 are close. The 4870 is slightly faster clock for clock, but not by much, as said, 8% at best, and that's above 1680x1050; below that they are rather close, because of the higher latency associated with GDDR5. This is not my statement; it was made in a news post here on TPU a month or so back. Call me what you want, but I'm stating facts. The reason the 4870 is faster is that its core and memory are clocked higher than the 4850's, that's it.
You also realize that the 4870's native RAM speed is actually 100MHz lower than the 50's, right? The 70's check in at 900MHz, the 50's check in at 999MHz. Bump that GDDR5 to 999MHz, and see what performance differences you come up with.
Weer said:
Technically speaking, GDDR5 is a waste of money when you can get the same bandwidth with a higher bus width.
Do you realize that it is more expensive to increase the bit width of the bus than it is to use GDDR5?
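Wile E's clock comparison hinges on transfers per clock: GDDR5 moves four bits per pin per clock against GDDR3's two, so the 4870's "slower" 900MHz memory still delivers far more bandwidth than the 4850's 999MHz GDDR3. A rough sketch using the clocks quoted above (both cards use a 256-bit bus; numbers are illustrative):

```python
# Effective per-pin throughput = base clock x transfers per clock.
# GDDR3 is double-pumped (2 transfers/clock); GDDR5 is quad-pumped (4).

def effective_gbps_per_pin(base_clock_mhz: float, transfers_per_clock: int) -> float:
    """Effective data rate per pin in Gb/s."""
    return base_clock_mhz * transfers_per_clock / 1000

BUS_BITS = 256  # both the HD 4850 and HD 4870 use a 256-bit bus

hd4850 = BUS_BITS / 8 * effective_gbps_per_pin(999, 2)  # GDDR3 -> ~63.9 GB/s
hd4870 = BUS_BITS / 8 * effective_gbps_per_pin(900, 4)  # GDDR5 -> ~115.2 GB/s

print(f"HD 4850: {hd4850:.1f} GB/s, HD 4870: {hd4870:.1f} GB/s")
```

So despite the lower base clock, the 4870 ends up with roughly 80% more peak memory bandwidth than the 4850 on the same bus.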
#9
blueskynis
candle_86 said:
Nvidia will have a response, they always do. And GDDR5 is pure marketing; a 4850 with the core running at the same speed as the 4870 is freakishly close to it in scores at that point. The GDDR5 has too much latency to be useful; GDDR3 is still more efficient until they fix the high latency of GDDR5. I have a good idea, though: Nvidia will launch an answer to the HD4870X2, most likely a dual-GPU G200B-based card.
High latency does not impact the performance of GPUs the way it impacts CPUs.
#10
candle_86
knowledge123 said:
I would hope that this would put them in a more humble and reflective mood; after their 'we're going to open a can of whoop-ass' (:rolleyes:) comments and smack talking, I am glad that ATi have lived up to and beyond what nVidia thought they were capable of. :toast: :cool: :slap:
Let us hope that they won't be as stuck-up about themselves. nVidia are a good company, but I feel that in recent times they have grown too big for their boots, and I'm very happy that ATi have put them back in line. :)
If you remember anything you'd know Nvidia keeps ATI in check, not the other way around. This would be the second time ATI has had a decent card out, and the last time that happened Nvidia fought back hard.

The FX5800 was a joke, but the FX5900Ultra was actually faster in the games of its day than the 9800Pro; it wasn't until the next-gen GPUs arrived that the FX5900Ultra lost its place. The 6800Ultra was tied with the X800XT, and the 6800GT soundly trumped the X800Pro. When ATI got cocky with the X850XT, a few months later came the 7800GTX; ATI got cocky again, so the 7800GTX 512; ATI tried again, so the 7900GTX; and ATI tried once more, so we got the 7950GX2. ATI offered little to no threat to Nvidia for 2007 and half of 2008. Nvidia doesn't like to lose; all ATI did was wake a sleeping beast.
#11
Wile E
Power User
candle_86 said:
If you remember anything you'd know Nvidia keeps ATI in check, not the other way around. This would be the second time ATI has had a decent card out, and the last time that happened Nvidia fought back hard.

The FX5800 was a joke, but the FX5900Ultra was actually faster in the games of its day than the 9800Pro; it wasn't until the next-gen GPUs arrived that the FX5900Ultra lost its place. The 6800Ultra was tied with the X800XT, and the 6800GT soundly trumped the X800Pro. When ATI got cocky with the X850XT, a few months later came the 7800GTX; ATI got cocky again, so the 7800GTX 512; ATI tried again, so the 7900GTX; and ATI tried once more, so we got the 7950GX2. ATI offered little to no threat to Nvidia for 2007 and half of 2008. Nvidia doesn't like to lose; all ATI did was wake a sleeping beast.
ATI didn't get cocky. How in hell did you come to that conclusion? First, the 5900Ultra didn't beat the 9800Pro, and even if it did happen to match it in some games, ATI still had the 9800XT.

7800GTX might have been the answer to X850, but ATI answered right back with X1800. Then nVidia released 7900, and ATI threw X1900 in their face. I can't remember if 7950 or 1950 came out first, but that doesn't matter, ATI either matched, or beat nVidia in every price segment at that time. They just never put out an answer to GX2. They didn't have to. The driver support for it was so terrible, it died before it could ever take off.

Your fanboyism has blinded you to what ACTUALLY happened. NV didn't take a solid lead until they released 8800, which ATI left unanswered for much too long.
#12
newconroer
PCpraiser100 said:
HA! Another way of persuading customers to come back to the now-crippled video card company! I love it! Good job Nvidia, however it's not worth it. BTW I hate Crysis anyway, so ciao...

BTW shiman0, GDDR5 has so much more bandwidth that Nvidia is planning to put this memory in their GTX 300 series. Since new technologies mean steep prices, ATI will have GDDR5-powered cards in the $100-$200 price range in no time!
If you 'hate' Crysis, then you should be laughing at the absurdity that is the 4870 X2 (or a GTX 280, for that matter).


:)


We don't have any proof of Nvidia using GDDR5, even if they said they might down the road. PROOF is when it's in your system, and WORKING.

blueskynis said:
High latency does not impact the performance of GPUs like it does impact CPUs.
Lower latency should, in theory, do wonders for your system performance; unfortunately it doesn't. And it probably doesn't matter as much as we'd like to think with GPUs either; however, I find it's usually the other way around.
I would continue to take lower-latency memory over higher frequency, definitely on a graphics card. Random-access seek time is far more important to me than bandwidth, especially if the bus is twice the size as well.

GDDR3 has proven itself to be very good memory for GPUs; GDDR5 has proven itself to be acceptable memory for GPUs. Those are two entirely different things.
#13
[I.R.A]_FBi
candle, if GDDR5 is just marketing, how much are you willing to bet that Nvidia will leave GDDR3 behind soon?
#14
qwerty_lesh
Although I agree with most of what's said here, I don't feel that ATI's 2xxx series was all that good. From what I've seen, they really didn't make much of a comeback until they released the 3xxx series; I know where I work Nvidia was still popular for price and performance until the 3xxx series got released. And yeah, the 4xxx series rocks; they perform very well and are cheaper than current Nvidia counterparts.
#15
Tatty_One
Senior Moderator
I really don't know why some get so antagonised about this gfx card war. It's pretty simple really: one side comes out on top sometimes, the other side other times... that's gotta be good, right? Good for us consumers at least. So now we find ourselves in a position where ATi has the fastest single-card solution and NVidia the 2nd fastest, ATi just about the 3rd with NVidia just about the fourth, so what... But to say that just because the "other side" to your preference brings a technology to the table that pushes the boundaries of performance (AKA GDDR5), that it's pointless is, TBH, fanboism of the highest order... and that's coming from a fanboi!

Enjoy the technology, enjoy the breakthroughs and enjoy the competitive pricing that brings, because no doubt, down the line the "other side" will edge in front... and when they do, things get even cheaper! :rockout:
#16
TheMailMan78
Big Member
Ya know, you guys are arguing for nothing. He didn't say ATI anywhere! I think he's talking about Intel's IGPU :laugh:

Anyway, ATI has won this match. No arguing this by any logic. However, this is a war, not a battle, so to both teams... "ALL MEN FORWARD!" :rockout:
#17
btarunr
Editor & Senior Moderator
TheMailMan78 said:
Ya know, you guys are arguing for nothing. He didn't say ATI anywhere! I think he's talking about Intel's IGPU :laugh:
He said GPU, not IGP. Intel doesn't make discrete GPUs now. Obviously he wasn't referring to S3 Graphics.
We underestimated the price performance of our competitor’s most recent GPU, which led us to mis-position our fall lineup.
#18
TheMailMan78
Big Member
btarunr said:
He said GPU, not IGP. Intel doesn't make discrete GPUs now. Obviously he wasn't referring to S3 Graphics.
You're right. I was just joking anyway ;) I think it's pretty clear what he was saying.
#19
erocker
Tatty_One said:
I really don't know why some get so antagonised about this gfx card war. It's pretty simple really: one side comes out on top sometimes, the other side other times... that's gotta be good, right? Good for us consumers at least. So now we find ourselves in a position where ATi has the fastest single-card solution and NVidia the 2nd fastest, ATi just about the 3rd with NVidia just about the fourth, so what... But to say that just because the "other side" to your preference brings a technology to the table that pushes the boundaries of performance (AKA GDDR5), that it's pointless is, TBH, fanboism of the highest order... and that's coming from a fanboi!

Enjoy the technology, enjoy the breakthroughs and enjoy the competitive pricing that brings, because no doubt, down the line the "other side" will edge in front... and when they do, things get even cheaper! :rockout:
Exactly. People argue for the sake of arguing, thinking their view is more important or more correct than the other's, for the sake of boosting their self-esteem. I like Huang's statement as it's honest. It's not something that is seen very often.
#20
Nyte
We should make non-constructive arguments a bannable offense; that would silence a lot of fanboys, in my opinion.
#21
Viscarious
I cant stop laughing at all of you people!

Buy what you want and shut up. Play your games and be happy. If you want to play the latest games, then buy the best card from your favorite company, but cut all the bashing and biased remarks. You're not doing anything but raising your blood pressure and starting useless arguments.

Megasty's sig says everything anyone should really give a damn about: "gods are created through gaming not 3dmark..." Who gives a crap about your 22,000 3DMark score when you go 3 and 22 in Call of Duty 4?

I know this won't stop anything, so I'll come back to get another good laugh in. :D
#22
cdawall
where the hell are my stars
I got 22 and 3 with a 6200TC hehe
#23
Viscarious
cdawall said:
I got 22 and 3 with a 6200TC hehe
Exactly. Ride that card hard till it dies!
#24
Megasty
I really don't get it when it comes to the fanboism. These are gfx cards, not baseball teams. Gfx card performance is set in stone right off the production block; they have a variable range of operation due to binning & driver issues. Huang's statement is very fitting given they've been the gfx leaders for the last 2 years. They got too complacent & it cost them, but it's not the end of the world. NV will always be NV.
#25
CDdude55
Crazy 4 TPU!!!
I would put a GTX 280 with my Core 2 Duo E4400 at stock 2.0GHz. :)

I know, I'm a madman.