
GIGABYTE GeForce GTX 680 Graphics Card Pictured

Discussion in 'News' started by btarunr, Mar 19, 2012.

  1. jaredpace
  2. MxPhenom 216
  3. TRWOV
  4. alexsubri
    I'm more into graphic design :)
     
  5. radrok
  6. semantics
    Benchmarks put unrealistic and nonsensical amounts of load on GPU tasks; who would have known? :D
     
  7. Rahmat Sofyan
    ...

    Hahaha... LOL, it'll increase the heat, I guess :laugh:
     
  8. CrAsHnBuRnXp
    Was the bold necessary?

    Source

    Simple Google search; the first result yielded that. Not sure what you mean by 4K, though.

    OK, now this just sounds to me like someone who is brain dead and just wants the latest and greatest, and thinks that if it has a .1 of a difference it must be super better than the regular non-.1 version, because they really have no tech knowledge to begin with.

    And even though those cards didn't have that .1, didn't NVIDIA still whoop their ass?
     
    Last edited: Mar 20, 2012
  9. rvalencia
    NV's DX10.x kitbash has some DX10.1-like features.
     
  10. micropage7
    I dunno, what's the point of placing the connector like that?
     
  11. [H]@RD5TUFF
    Cable management is my only guess; at least they're trying something new instead of cranking out the same old stuff.
     
  12. HuLkY
    The design SUCKS! What the hell is this "GeForce GTX" thing? At least bring back the old GTX 280 glowing LED :( , and no backplate?!

    In EVGA We TRUST
     
  13. Dj-ElectriC
    People must understand that although the GTX 680 is a high-end card, its replacement will come relatively fast. It's just NVIDIA's quick answer to the HD 7900 series.
    Don't get bummed about not getting high-end looks on this card.
    "In EVGA We TRUST"? Makes no sense.
    I will trust whoever gives me a good product.
     
  14. Casecutter
    Probably has to do with freeing up real estate on the PCB for traces. The card is built from $250-300 worth of parts and components, but with that you get firmware/software optimizations that target 60 FPS gameplay... right?

    Can we continue using the normal maximum/average/minimum way of calculating performance when 60 FPS is always the target? From what I see, this thing promotes the "average" and nothing higher. It's a way of unearthing efficiency and lowering thermals while giving it a shot of nitrous when it isn't keeping up with the mean! So will all the old data and graphs necessarily translate apples-to-apples with what was provided in the past? Will it look/play great? In theory, yes, though will Adaptive V-Sync sire new glitchiness in place of the traditional lag and spikes as we've known them? Averaging at 60 FPS will not look all that different from, say, a 7970 that, by the old way of taking an average, hits 80 FPS (that's what NVIDIA is betting on). In my mind this really stops the whole "I'm fastest" contest and changes it to "I'm the same at providing the average," because no one can tell the difference. Kind of the old "more than a mouthful is a waste"!
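
    For what it's worth, the adaptive half of that is simple in principle. A minimal sketch of the idea, my own simplification and not NVIDIA's actual driver logic:

    Code:
    # Adaptive V-Sync in a nutshell: keep v-sync on while the GPU holds
    # the refresh rate, drop it when the GPU can't, so the frame rate
    # degrades smoothly instead of snapping from 60 down to 30 FPS.

    REFRESH_HZ = 60.0
    BUDGET_MS = 1000.0 / REFRESH_HZ          # 16.67 ms per frame at 60 Hz

    def vsync_enabled(frame_time_ms):
        """True -> sync (no tearing); False -> tear rather than halve the rate."""
        return frame_time_ms <= BUDGET_MS

    print(vsync_enabled(14.0))  # True:  GPU keeps up, sync away the tearing
    print(vsync_enabled(20.0))  # False: plain v-sync would hold this frame a
                                # second refresh (30 FPS); tearing shows it
                                # at ~50 FPS instead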

    I don't know what this means now, but traditional testing like W1zzard has done may well have very little merit any longer. We might need more graphs like the "spiky" frame-time plots [H]ard|OCP has done of games as they're played; although now there will just be this slightly wavy green line hugging right at 60 FPS. Well, that's boring, and it might really minimize graphics card reviews as we know them: sure, it plays BF3... 60 FPS.
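
    And that's exactly why averages alone mislead. A toy example (made-up frame times, not measured data) of what a spiky frame-time graph catches and an average hides:

    Code:
    # A run that is smooth 95% of the time but hitches hard now and then.
    frame_times_ms = [16.7] * 95 + [50.0] * 5

    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    avg_fps = 1000.0 / avg_ms
    p99_ms = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms))]

    print("average: %.1f FPS" % avg_fps)              # ~54 FPS, looks nearly fine
    print("99th-percentile frame: %.0f ms" % p99_ms)  # 50 ms, visible stutter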

    It feels like NVIDIA took the ball, ran into left field, and is saying, "the game as you knew it has changed." Which isn't wrong or bad, but it really feels like the old way of figuring out the best card has changed... yet the price remains the same! Now, here's a question: why even have enthusiast cards now? In theory, any GPU that's strong enough to game the newest or most demanding titles, as long as it can clock fast enough for the few milliseconds needed to keep the game from dipping below 60 FPS, is all the best new offerings need to be. The new mantra will be "we can render that @ XXXX resolution with this level/type of AA" (as in NVIDIA's TXAA anti-aliasing algorithm). It's no longer about being fastest or faster. So if AMD takes Cape Verde and does the same, are we all okay with it?
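
    That "clock it fast enough for the few milliseconds" idea, sketched out with illustrative numbers (not Kepler's actual boost tables):

    Code:
    # Raise the clock only while frames are missing the 60 FPS budget, and
    # only within the power/thermal ceiling; coast back down otherwise.
    BASE_MHZ, MAX_MHZ, STEP_MHZ = 1006, 1110, 13   # illustrative bins
    TARGET_MS = 1000.0 / 60.0

    def next_clock(mhz, frame_time_ms, headroom):
        if frame_time_ms > TARGET_MS and headroom:
            return min(mhz + STEP_MHZ, MAX_MHZ)    # the shot of nitrous
        if frame_time_ms < TARGET_MS:
            return max(mhz - STEP_MHZ, BASE_MHZ)   # coast, save heat
        return mhz

    mhz = BASE_MHZ
    for ft in (18.0, 18.0, 15.0, 14.0):            # two slow frames, then easy ones
        mhz = next_clock(mhz, ft, headroom=True)
        print("%.1f ms -> %d MHz" % (ft, mhz))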

    :twitch:
     
    Last edited: Mar 20, 2012
