
NVIDIA Kepler Yields Lower Than Expected.

Discussion in 'News' started by TheMailMan78, Feb 16, 2012.

  1. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    12,976 (4.86/day)
    Thanks Received:
    1,631
However, it seems TSMC has had a lot of teething problems over the last five years.
     
    1c3d0g says thanks.
  2. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.44/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    AMD GPU sales were affected by the shortage, and they probably did say so in their own report:

    http://www.anandtech.com/show/5465/...ort-169b-revenue-for-q4-657b-revenue-for-2011

    Also, while AMD has the bigger market share in laptops, Nvidia holds about 60% of the desktop market, so it's more affected there than AMD is. In any case, Nvidia's Q4 results were much better than AMD's, so it's just a matter of explaining why their operating expenses were higher than before.
     
  3. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    21,133 (7.83/day)
    Thanks Received:
    7,659
    That, and AMD has way more fab time than NVIDIA. There is a reason NVIDIA was downgraded. NVIDIA saying this is just telling you, "Get ready to pay out the ass for our new GPU." Stockholders are not fanboys. They play no favorites.
     
  4. Prima.Vera

    Prima.Vera

    Joined:
    Sep 15, 2011
    Messages:
    2,250 (1.98/day)
    Thanks Received:
    293
    Exactly what I was thinking. Plus, you can add the obvious delay in launching the cards. Fermi all over again! :) ;)
     
  5. BlackOmega

    BlackOmega

    Joined:
    Feb 7, 2009
    Messages:
    624 (0.30/day)
    Thanks Received:
    159
    Location:
    Michigan, USA
    Actually I have to agree with this.
    He can't but I can ;).

    Nvidia has sacrificed image quality for performance.
    Now this goes back a little bit, but back when I was using some 8800s in SLI, when I switched from the 175.19 driver to the 180.xx driver I noticed that my framerate doubled [in BF2142] but all of the colors washed out. At the time I was using a calibrated Dell Trinitron UltraScan monitor, so I immediately noticed the difference in color saturation and overall image quality.
    I actually switched back to the 175.19 driver and used it as long as I possibly could. Then I made the switch to ATi and couldn't have been happier. Image quality and color saturation were back, not to mention the 4870 I bought simply SMOKED my SLI setup. :D

    EDIT:
    Makes me wonder if the same thing that happened when Fermi came out is going to happen again: people waited and waited, then Fermi debuted and flopped, and all of the ATi cards sold out overnight.
     
  6. alwayssts

    Joined:
    May 13, 2008
    Messages:
    371 (0.16/day)
    Thanks Received:
    82
    EXACTLY.

    Compound this:

    AMD has 32 CUs and really only needs slightly more than 28 most of the time. The 7950 is a fine design, and it doesn't really hurt the lineup if yields are low on the 7970. Tahiti is over-designed, probably for the exact reason mentioned: a big chip on a new node. Even if GK104 did have the perfect ROP:shader mix, the wider bus and (otherwise unneeded) bandwidth of the 7950 should make up that performance versus a similar part with a 256-bit bus, because the 7950 is not far off that reality. Point to AMD on flexibility to reach a given performance level.

    Again, I think the 'efficient/1080p/GK104-like' 32-ROP design will come with Sea Islands, when 28nm is mature and 1.5V 7Gbps GDDR5 is available; think something similar to a native 7950 with a 256-bit bus at higher clocks. Right now, that chip will be Pitcairn (24 ROPs), because it is smaller and lines up with market realities. Point to AMD on being realistic.

    nVIDIA appears to have 16 less-granular big units, which is itself a yield problem, like Fermi on a less drastic level because the die is smaller. If the shader design is 90% performance-per-clock (2 CUs versus 1 SM) or less versus AMD, every single SM is needed to balance the design. I wager that is either the reality or very close to it, considering 96 SPs, even with realistic use of the SFUs, is not 90% of 128. Yes, scalar is 100% efficient, but AMD's VLIW4/MIMD designs are not that far off on average. Add that Fermi should need every bit of 5Gbps memory bandwidth per 1GHz core clock and 2 SMs (i.e. 32 ROPs/16 SMs/256-bit, 28 ROPs/14 SMs/224-bit), and you don't have any freaking wiggle room at all if your memory controller or core design over- or under-performs.
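
    A quick back-of-the-envelope sketch of that arithmetic (the 96-vs-128 shader counts and the 5Gbps-per-1GHz pairing are the rumored figures from this post, not confirmed specs):

    ```python
    # Back-of-the-envelope check of the ratios quoted above.
    # All figures are the rumored numbers from this post, not confirmed specs.
    sm_shaders = 96            # rumored shaders per Kepler SM
    cu_pair_shaders = 2 * 64   # two GCN CUs at 64 shaders each
    print(f"1 SM vs 2 CUs: {sm_shaders / cu_pair_shaders:.0%}")  # 75%, well short of 90%

    # Raw bandwidth at the quoted 5 Gbps effective memory speed:
    def bandwidth_gb_s(mem_gbps, bus_bits):
        return mem_gbps * bus_bits / 8  # GB/s

    print(bandwidth_gb_s(5.0, 256))  # 160.0 GB/s for the 256-bit / 16 SM config
    print(bandwidth_gb_s(5.0, 224))  # 140.0 GB/s for the 224-bit / 14 SM config
    ```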

    Conclusion:

    So if you are nVIDIA, you are sitting on a large die with big units that are all needed at their maximum level to compete against the competition's salvage design. As efficient as Fermi can be? Yes. Smart choices for this point in time? Not even close.

    Design epic fail.
     
  7. theoneandonlymrk

    theoneandonlymrk

    Joined:
    Mar 10, 2010
    Messages:
    3,411 (2.02/day)
    Thanks Received:
    572
    Location:
    Manchester uk
    If you just plug and forget with both cards, you have a reasonable untweaked comparison, and NV looks poorer. Simples.
     
    More than 25k PPD
  8. radrok

    radrok

    Joined:
    Oct 26, 2011
    Messages:
    2,990 (2.73/day)
    Thanks Received:
    803
    Location:
    Italy
    Do you realize it makes no sense to not optimize things? If default is fine for you then okay, be my guest.
     
  9. theoneandonlymrk

    theoneandonlymrk

    Joined:
    Mar 10, 2010
    Messages:
    3,411 (2.02/day)
    Thanks Received:
    572
    Location:
    Manchester uk
    Read again, I never said that. I said if you plug and forget with both, that would then be a fair comparison, and NV looks worse. Simples.
     
    More than 25k PPD
  10. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    14,049 (4.51/day)
    Thanks Received:
    7,205
    Location:
    Edmonton, Alberta
    To me, it makes no sense TO optimize anything. The average user is going to do just that (plug and forget), so while "optimized" systems may be better, most users will do no such thing, just because it's a pain in the butt, or they do not know how.

    For a professional, where colour matters, sure, calibration of your tools is 100% needed. But not all PC users use their PCs in a professional context, and most definitely not the gamer-centric market that find their way on to TPU.


    You need to be able to relate the user experience, not the optimal one, unless every user can get the same experience with minimal effort. When that requires educating the consumer, you can forget about it.
     
  11. radrok

    radrok

    Joined:
    Oct 26, 2011
    Messages:
    2,990 (2.73/day)
    Thanks Received:
    803
    Location:
    Italy
    I understand your point, Dave; still, I think it's a waste not to inform yourself about things and get the best experience you can out of your purchases.


    With all due respect, your sentence makes no sense to me, sorry.
     
  12. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.44/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    AMD does not have "better" colors, it has "more saturated" colors. Oversaturated colors. Several studies have demonstrated that when people are presented with two identical images side by side, one natural and the other oversaturated, they tend to prefer the oversaturated one; well, 70% of people do. But the thing is, it's severely oversaturated, and the colors are not natural by any means. They are not the colors you can find in real life.

    So what is "better"? What is your definition of better? I guess if you belong to the 70% of people whose definition of better is more saturated then I guess that AMD has a more appealing default color scheme. If your definition of better is "more close to reality, more natural" then you'd prefer Nvidia's scheme.

    Saying that AMD has better color is like saying that fast food tastes better because they use additives to make it "taste more". I guess people who get addicted to fast food do think it tastes better, but in the end it's just a matter of taste, and so is color.
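
    To illustrate what "more saturated" means in practice, here is a minimal sketch of a saturation boost in HSV space (the 1.3 factor is purely illustrative, not a measured driver default):

    ```python
    import colorsys

    def boost_saturation(rgb, factor=1.3):
        """Scale the saturation of an RGB triple (floats in 0-1), clamped to 1.0."""
        h, s, v = colorsys.rgb_to_hsv(*rgb)
        return colorsys.hsv_to_rgb(h, min(1.0, s * factor), v)

    # Same hue and brightness, just pushed further from gray:
    print(boost_saturation((0.6, 0.4, 0.3)))
    ```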
     
    driver66 says thanks.
  13. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    21,133 (7.83/day)
    Thanks Received:
    7,659
    Having used AMD for years and just now using an NVIDIA card, I can say with full confidence that what you just said is BS. They look the same. I didn't even have to recalibrate for process colors.
     
    Zubasa, HalfAHertz and driver66 say thanks.
  14. radrok

    radrok

    Joined:
    Oct 26, 2011
    Messages:
    2,990 (2.73/day)
    Thanks Received:
    803
    Location:
    Italy
    I agree with you TheMailMan78, in fact no one has given us proof to strengthen their argument.
    That's why I asked the person who brought the "colour" argument in the first place.
     
  15. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.44/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    It was true some years ago, at least; I honestly don't know if it's true now, but people still say the same. In any case, my point was that there's no "better" color, just more saturated or less saturated color, and it's all about what you prefer. The one truth is that most of the media we are fed nowadays is oversaturated anyway, so it's just a matter of what extent of oversaturation you really prefer.

    And I find it kinda funny that you chose to call BS on my post and not any of the preceding ones. :cool:
     
  16. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    21,133 (7.83/day)
    Thanks Received:
    7,659
    I call yours BS because I expect more out of you....;):toast:

    Don't sink to it, man.
     
  17. pr0n Inspector

    pr0n Inspector

    Joined:
    Dec 8, 2008
    Messages:
    1,334 (0.62/day)
    Thanks Received:
    164
    There used to be a 16-235 vs. 0-255 levels issue. But that was dealt with long ago, and it was not the video card's job anyway.
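
    For reference, the 16-235 (video/limited) to 0-255 (PC/full) expansion is a simple linear remap; a minimal sketch of it, which also shows why limited-range output on a full-range display looks washed out:

    ```python
    def limited_to_full(v):
        """Expand a limited-range (16-235) level to full range (0-255)."""
        return max(0, min(255, round((v - 16) * 255 / 219)))

    # If this remap is skipped, black sits at 16 and white at 235 on a
    # full-range display -- exactly the washed-out look described earlier.
    print(limited_to_full(16), limited_to_full(235))  # 0 255
    ```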
     
  18. LAN_deRf_HA

    LAN_deRf_HA

    Joined:
    Apr 4, 2008
    Messages:
    4,555 (1.90/day)
    Thanks Received:
    952
    Nvidia sacrificed IQ with the 7xxx series; that was it. Still to this day I rag on people who bought 7xxx cards, because it was empty framerates. It's the first time I can recall a new card generation having lower IQ than the previous one. The driver issue you talk about is well behind BOTH companies. Both got into the habit of releasing drivers around card-release time that had IQ errors that increased performance. Namely, I can think of this happening in Crysis 1 around the time the 3870/8800 GT were being compared, but the issue was always corrected in successive driver releases.

    You're doing it wrong. You need screenshots. I've seen this a lot in AA-quality comparison shots in reviews, as recently as Metro 2033's release. AMD cards are more saturated, at least as of then.
     
  19. sergionography

    Joined:
    Feb 13, 2012
    Messages:
    266 (0.27/day)
    Thanks Received:
    33
    Well, Nvidia did drop the hot clocks, which allowed more cores in the GPU, and they will no longer be limited in clocks since the shaders and the cores run at the same frequency (before, with hot clocks, they always had scaling issues). They radically changed the Fermi makeup, and it seems like they know what they are doing. As for the GTX 660, I read leaks that it is a 340mm² chip, compared to the 365mm² of the HD 7970, and is meant to compete with and come close to the HD 7970, which seems reasonable, though I'm not sure how they will pull off a GTX 680/670 (probably like the GTX 470/480, with disabled hardware).

    So while I agree with you overall, Nvidia isn't in such a bad place; only their biggest chip is. In the worst case, Nvidia ends up with a top-end GPU that is 10-20% slower than AMD's top end, but I doubt that. Even with the 256-bit bus that everyone is so worked up about, I don't think bandwidth will be a problem in most scenarios, especially considering that most people buying Nvidia don't really do multi-GPU setups, while for AMD it's almost a must for Eyefinity.

    Also, I heard leaks that Nvidia was debating whether to call the GK104 a GTX 660 or a GTX 680, when the GK110 was supposed to take that spot but isn't coming anytime soon. So I don't know whether the yield issues forced Nvidia's hand or whether they think the GK104 is sufficient. Either way, we need competition already, and for cards with 340mm² and 365mm² die sizes, they should be well within the $350-400 price range, and that's considering TSMC's 20% more expensive wafer prices.
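
    As a rough sanity check on that die-size math, here's the classic gross dies-per-wafer approximation (my own illustration, not from the thread; it ignores defect density, scribe lines, and edge exclusion, so usable yields are lower than this):

    ```python
    import math

    def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
        """Classic gross dies-per-wafer approximation (no defect/yield model)."""
        d = wafer_diameter_mm
        return int(math.pi * (d / 2) ** 2 / die_area_mm2
                   - math.pi * d / math.sqrt(2 * die_area_mm2))

    for area in (340, 365):  # the rumored GK104 and Tahiti die sizes above
        print(f"{area} mm^2 -> ~{dies_per_wafer(area)} gross dies per 300 mm wafer")
    ```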
     
    Last edited: Feb 18, 2012
  20. pr0n Inspector

    pr0n Inspector

    Joined:
    Dec 8, 2008
    Messages:
    1,334 (0.62/day)
    Thanks Received:
    164

    I don't think we were talking about the image quality of 3D engines.
     
  21. TheGuruStud

    TheGuruStud

    Joined:
    Sep 15, 2007
    Messages:
    1,620 (0.62/day)
    Thanks Received:
    168
    Location:
    Police/Nanny State of America
    I have my 7950, Nvidia, so na na na boo boo. Go cry to mommy. We knew yields were low LAST YEAR (for both camps)!

    Fantastic card, btw :) Runs much better than the 6950s I had. At 1,175 core so far. Still testing :)
    With a non-reference cooler and OCed it still won't go above low 60s. The fans are still silent.
     
  22. Inceptor

    Inceptor

    Joined:
    Sep 21, 2011
    Messages:
    497 (0.44/day)
    Thanks Received:
    119
    The other way around.
     
    1c3d0g says thanks.
  23. Wrigleyvillain

    Wrigleyvillain PTFO or GTFO

    Joined:
    Oct 13, 2007
    Messages:
    7,667 (2.99/day)
    Thanks Received:
    1,775
    Location:
    Chicago
    Uh, I don't know... I can't really speak to multi-monitor, but offhand I know a lot more people running CrossFire than SLI, and pretty much always have (if the opposite is in fact what you were saying).
     
  24. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    39,795 (13.19/day)
    Thanks Received:
    14,178
    For Nvidia Surround (3 monitors) you need two cards. For AMD Eyefinity you only need one.
     
    m1dg3t and Razerian say thanks.
  25. m1dg3t

    m1dg3t

    Joined:
    May 22, 2010
    Messages:
    2,247 (1.39/day)
    Thanks Received:
    513
    Location:
    Canada
    And with Eyefinity you can run up to 6 screens. ;)
     
