
Why does GTX690 consume less power than GTX680 2 way SLI?

Discussion in 'NVIDIA' started by Gainward, May 1, 2012.

  1. Gainward

    Joined:
    Nov 2, 2010
    Messages:
    39 (0.03/day)
    Thanks Received:
    0
    Location:
    Sweden/Helsingborg
    So we've seen the specs of this new card (GTX 690) and I'm really impressed with its TDP (300W),
    while the GTX 680 has a TDP of 195W. Considering that the GTX 690 is based on two GTX 680 GPUs, I assumed the GTX 690 would have a TDP of 380W or something.


    Here, I read that the GTX 690 doesn't even consume 300W; it's actually 263W :eek:


    So I'm a little bit confused :banghead:
     
  2. the54thvoid

    the54thvoid

    Joined:
    Dec 14, 2009
    Messages:
    3,338 (1.90/day)
    Thanks Received:
    1,566
    Location:
    Glasgow - home of formal profanity
    Slower clocks, better cooling (cooler chips mean better power efficiency), plus some tinkering, and you have a few reasons.

    And then wait for the reviews for actual power draw; there may well be power throttling too.
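    The point about slower clocks being disproportionately cheap in power can be sketched with the usual first-order dynamic-power relation, P ≈ C·V²·f: because voltage enters squared, and lower clocks permit lower voltage, a modest downclock cuts power by much more than the clock difference alone. A minimal sketch, with made-up placeholder figures (these are not real GTX 680/690 voltages or a real capacitance):

    ```python
    # Illustrative only: why a ~9% downclock plus a small undervolt saves far
    # more than 9% power. Uses the first-order dynamic CMOS power relation
    # P = C * V^2 * f. All numbers below are invented placeholders.

    def dynamic_power(capacitance, voltage, frequency):
        """Approximate dynamic switching power: P = C * V^2 * f."""
        return capacitance * voltage**2 * frequency

    # Hypothetical single-GPU operating point (GTX 680-like clock).
    p_full = dynamic_power(capacitance=1.0, voltage=1.175, frequency=1006e6)

    # Hypothetical dual-GPU operating point: ~9% lower clock, ~6% lower voltage.
    p_binned = dynamic_power(capacitance=1.0, voltage=1.10, frequency=915e6)

    # Two downclocked, undervolted GPUs vs. two full-speed GPUs.
    savings = 1 - (2 * p_binned) / (2 * p_full)
    print(f"approximate dynamic-power savings: {savings:.0%}")
    # prints: approximate dynamic-power savings: 20%
    ```

    Under these assumed numbers, the clock and voltage cuts alone account for roughly a fifth of the power, before any binning for low leakage is considered.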
     
  3. Fourstaff

    Fourstaff Moderator Staff Member

    Joined:
    Nov 29, 2009
    Messages:
    9,185 (5.20/day)
    Thanks Received:
    1,976
    Location:
    Home
    Also, all the best chips are binned for the 690.
     
  4. MasterInvader

    MasterInvader

    Joined:
    Sep 16, 2011
    Messages:
    275 (0.25/day)
    Thanks Received:
    58
    Location:
    Portugal
    Easy answer: the 690 doesn't have dual 680s on board, it's more like dual "670s", and it was the same story with the GTX 590.

    Regarding performance, a dual-GPU card will never be as good as two-way SLI [two separate cards].
     
  5. sneekypeet

    sneekypeet Unpaid Babysitter Staff Member

    Joined:
    Apr 12, 2006
    Messages:
    21,582 (6.97/day)
    Thanks Received:
    6,109
    Seriously? Not trolling, but it seems that if they were binning cores, they wouldn't be running them underclocked relative to the original cores. Or are you saying they're binned by voltage only?
     
  6. Gainward

    Joined:
    Nov 2, 2010
    Messages:
    39 (0.03/day)
    Thanks Received:
    0
    Location:
    Sweden/Helsingborg
    Why wouldn't they use those "best GPUs" in 680s?
     
  7. LAN_deRf_HA

    LAN_deRf_HA

    Joined:
    Apr 4, 2008
    Messages:
    4,554 (1.92/day)
    Thanks Received:
    952
    Pretty sure when they bin for dual-GPU cards it's for heat/power. Regardless of how well binned they are, they'd still draw a good bit more power at 680 speeds.
     
  8. Gainward

    Joined:
    Nov 2, 2010
    Messages:
    39 (0.03/day)
    Thanks Received:
    0
    Location:
    Sweden/Helsingborg
    Would the 690 consume the same power if it ran at the same frequencies as the 680?
     
  9. Fourstaff

    Fourstaff Moderator Staff Member

    Joined:
    Nov 29, 2009
    Messages:
    9,185 (5.20/day)
    Thanks Received:
    1,976
    Location:
    Home
    In addition to running underclocked and undervolted, they still need to cherry-pick some chips (at the very least in the earlier production batches), otherwise there would be problems. Don't quote me on this, though; it's just a tidbit I ran across but have since lost the source for. They did it for the 5970, then the 6990, and then the 590, so I suppose they did it with the 690 too.
     
  10. DarkOCean

    DarkOCean

    Joined:
    Jan 28, 2009
    Messages:
    1,616 (0.78/day)
    Thanks Received:
    349
    Location:
    on top of that big mountain on mars(Romania)
    Nvidia are a little optimistic about their TDPs; wait for wizz's review and see the truth when it's out.
     
  11. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,980 (4.51/day)
    Thanks Received:
    7,115
    Location:
    Edmonton, Alberta
    Typically, chips used in dual-GPU cards get much the same binning as laptop chips do. That is, they tend to look for chips with the lowest leakage, which of course means the least heat. And I mean that directly: leakage = heat given off.

    THEN they downclock even further (usually for laptop chips, sometimes for dual-GPU chips) to widen that bin to a reasonable size, so they have a large supply of chips, with the expected demand for the product dictating the target. They have a specific number of chips in mind, so they adjust clocks and allowed leakage limits to make sure they can fill that "ordered number" of chips.

    This is why we get re-spins. If targets are not met in the way I just described, or there is a critical flaw that prevents operation, they make changes and get new wafers. A0 could be the first design, first run; A1 the same design, second run; B0 a revised design, first run; B1 its second run, etc, etc, etc...

    There's no way to know exactly what those revisions mean, or what sort of sorting they are doing for which power targets and frequencies (as the power consumed is largely tied to the frequency the chip runs at), so it's impossible to figure out just exactly how they are binning chips. But generally speaking, it's actually pretty basic.
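    The bin-widening process described above (relax clocks/voltage until enough dies qualify to meet demand) can be sketched as a toy simulation. Everything here is invented for illustration: the leakage distribution, the limits, and the target count are not real yield data.

    ```python
    # Toy sketch of the binning idea: score each die by leakage, and if the
    # strict bin doesn't yield enough parts, relax the limit (conceptually,
    # by lowering clocks/voltage) until the "ordered number" is met.
    # All numbers are invented placeholders.

    import random

    random.seed(42)

    # Simulated wafer lot: each die gets a random leakage score (lower = cooler).
    dies = [random.gauss(1.0, 0.2) for _ in range(1000)]

    def bin_dies(dies, leakage_limit):
        """Dies at or under the leakage limit qualify for the dual-GPU part."""
        return [d for d in dies if d <= leakage_limit]

    target = 400                 # hypothetical "ordered number" of chips
    leakage_limit = 0.85         # strict initial bin

    # Widen the acceptable leakage window until the bin meets demand.
    while len(bin_dies(dies, leakage_limit)) < target:
        leakage_limit += 0.01

    qualified = bin_dies(dies, leakage_limit)
    print(f"limit {leakage_limit:.2f} yields {len(qualified)} qualifying dies")
    ```

    The trade-off the post describes falls out of the loop: each relaxation of the limit buys more supply at the cost of admitting hotter, leakier dies, which is exactly why the shipping clocks end up below single-card clocks.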
     
