NVIDIA GeForce GTX 760 2048 MB

Discussion in 'Reviews' started by W1zzard, Jun 25, 2013.

  1. DayKnight

    Joined:
    Nov 27, 2012
    Messages:
    194 (0.32/day)
    Thanks Received:
    9
    I don't care. I'm plenty happy with the overall performance of my GPU!

    Though I did want the GTX 660 to have 1,000 or more CUDA cores.
  2. Ed_1

    Joined:
    Dec 14, 2006
    Messages:
    304 (0.11/day)
    Thanks Received:
    45
    Right, it's only the reference cards that run hot; any of the third-party twin-fan units run much cooler, and that's without even setting up a fan profile.
    For example, I think the reference 660 Ti was around 75°C under full load, but with the MSI 660 Ti it went down to 68-70°C, and with slight adjustments to the fan profile it's no problem keeping temps at 60°C.
    This is with minimal noise too, using an auto profile rather than a straight-line xx RPM across all temps.
  3. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,127 (0.94/day)
    Thanks Received:
    82
    Location:
    So. Cal.
    That's basically the sum of it, and on those points it still feels pricey... I mean, other than the "hocus pocus" of its 256-bit bus (which provides little to no benefit), it should've been a 660 Ti GSO/SE.
  4. Ed_1

    Joined:
    Dec 14, 2006
    Messages:
    304 (0.11/day)
    Thanks Received:
    45
    Except they're pricing it much lower than the 660 Ti was released at, which was in the $300+ range.
    But yes, it's just a tweaked GK104 core; there's no smaller die here.
  5. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (4.04/day)
    Thanks Received:
    3,480
    I do love a good rebrand!

    /sarcasm
  6. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,127 (0.94/day)
    Thanks Received:
    82
    Location:
    So. Cal.
    The GTX 660 Ti MSRP was $300; account for the 14% lower CUDA and TMU counts on this gelding and 300 - 14% = $258, so there's no real price differential. Then consider how this continues to "water down" GK104: the total number of derivatives off the wafer is now four. Nvidia's price per chip is really low versus a Tahiti that has three (although the Tahiti LE is nowhere near the same volumes) and a bigger die, 365 mm² vs. 294 mm² (almost 25% bigger). Nvidia is still left with the lion's share of meat on its GK104 production.

    Considering that we've seen both the "Tahiti LE 7870" at $200 after rebate and now special 7950s at $220 after rebate, $250 wasn't nearly as assertive as they could have been, although as the GTX 670 goes EOL we might see Nvidia get aggressive. I thought we might have seen a $230 MSRP, but that would've put the stockpiles of GTX 670 in a precarious position.
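    For anyone checking the arithmetic, here is a quick sketch of the two figures above (the MSRP, the cut-down percentage and the die sizes are simply the numbers quoted in this post, nothing official):
    Code:
    # Back-of-the-envelope check of the figures quoted above (not official data).
    gtx660ti_msrp = 300        # USD launch MSRP quoted above
    spec_cut      = 0.14       # ~14% fewer CUDA cores / TMUs on the GTX 760
    print(gtx660ti_msrp * (1 - spec_cut))        # 258.0 -> the "$258" figure

    tahiti_die = 365           # mm^2, Tahiti
    gk104_die  = 294           # mm^2, GK104
    print((tahiti_die / gk104_die - 1) * 100)    # ~24.1 -> "almost 25% bigger"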
    Last edited: Jun 27, 2013
  7. Lagittaja

    Joined:
    Dec 12, 2012
    Messages:
    19 (0.03/day)
    Thanks Received:
    1
    'cept the reference-clocked EVGA blower goes for 199€ + shipping (12€ to Finland).
    Reference-clocked ACX 209€, SC blower 209€ and SC ACX 209€.

    The cheapest 7950 goes for 240€ + shipping from Germany; the good 7950s go for 280€+.

    AMD is in agony lol, even more when you consider that the reference 760 cards on average OC'd to 1245 MHz on the core. I checked over a dozen reviews, and not all of them even said what the actual boost clock under load was, only mentioning what GPU-Z reported for the boost.
  8. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,127 (0.94/day)
    Thanks Received:
    82
    Location:
    So. Cal.
    Going by W1zzard's six reviews, the average is about 1175 MHz (16%); while that's substantial, it's the luck of the draw as to what the dynamic OC actually provides in FPS increases. His reviews show wide-ranging results of between 12.8% and 17.7% gains in that 1175 MHz range. Even his highest recorded, on the EVGA GTX 760 SC at 1220 MHz, gave a 16% increase in BF3.

    Compare that to, say, W1zzard's Joker (Tahiti LE) review, where a 24% OC provides 18% more FPS, or the 7950 IceQ, where a 27% OC realizes 22.8% more FPS.

    Pinning your hopes on what you get from Nvidia's boost clock profiles is more of a "crap shoot" than what you'd normally presume from OC'ing a card. Sure, sometimes a card is a dud, but with Nvidia a card may show a high MHz figure, and that might not be the only constraint holding back performance.
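    As a small sketch of the comparison being made here (overclock percentage versus FPS percentage; the helper and the numbers are just the figures quoted in this thread, used for illustration):
    Code:
    # Illustrative only: overclock % vs. FPS gain %, using numbers quoted in the thread.
    def pct(new, base):
        return (new - base) / base * 100

    # EVGA GTX 760 SC: 1072 MHz stock base (quoted later in the thread) -> 1220 MHz OC,
    # for ~16% more FPS in BF3 per the review.
    print(round(pct(1220, 1072), 1))          # ~13.8% core overclock

    # Tahiti LE "Joker": 24% OC -> 18% FPS; 7950 IceQ: 27% OC -> 22.8% FPS (quoted above)
    for oc, fps in [(24, 18), (27, 22.8)]:
        print(round(fps / oc, 2))             # FPS gained per percent of overclock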
  9. Ed_1

    Joined:
    Dec 14, 2006
    Messages:
    304 (0.11/day)
    Thanks Received:
    45
    I have an MSI 660 Ti PE/OC so I know the specs; the 660 Ti was just a 670 with a 192-bit memory bus.
    Yes, they lowered the core count, but gave back the 256-bit bus, which from a performance standpoint seems more balanced (it is faster than the 660 Ti by a small amount).

    As for pricing, it seems about right for the performance it gives; sure, lower is always better from the consumer's point of view.
  10. Lagittaja

    Joined:
    Dec 12, 2012
    Messages:
    19 (0.03/day)
    Thanks Received:
    1
    lol, butthurt AMD fan
    P.S. Both my rigs have AMD GPUs right now.
  11. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,127 (0.94/day)
    Thanks Received:
    82
    Location:
    So. Cal.
    It got the same clocks as the 670, but with one memory controller that didn't function, along with one block of raster operators fused off; that's how I'd articulate it. The 760 is basically a price reduction in the guise of a new, improved model; other than higher clocks and improved boost profiles, it isn't providing much different from what a custom OC GTX 660 Ti gave us back around September 2012 for $320. So basically Nvidia is giving on-par performance for 18% less cash while using up parts they'd accumulated in a bin; I'm OK with that.

    Not a fan, just expounding data and known specifications... not "I've looked around on the review sites and here is some opinion"... which is basically why you don't show or expound on what I said, instead of just childish banter.

    Who's the fan? :slap:
  12. Lagittaja

    Joined:
    Dec 12, 2012
    Messages:
    19 (0.03/day)
    Thanks Received:
    1
    You clearly don't even know how GPU Boost 1.0 or 2.0 actually works :D

    The EVGA card.
    [​IMG]

    That's stock: 1072 MHz base, 1137 MHz boost.
    AVERAGE: 1219 MHz.
    Increase the base to 1220 and the boost to 1286 and the actual boost will be even higher.

    You sir fail :nutkick:

    lol

    Or how about the reference card, with stock base at 980 MHz and boost at 1033 MHz:
    [​IMG]
    Average 1067

    booyah
    Last edited: Jun 29, 2013
  13. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,127 (0.94/day)
    Thanks Received:
    82
    Location:
    So. Cal.
    Oh, I see what you mean. What I'd like to see is that same graph showing the OC'd settings W1zzard ran with.

    But what you posted explains little. I'd like someone to explain how, on the first EVGA chart, the max and median can both be 1228 MHz.

    While those charts are interesting, they don't dispel the fact that the EVGA GTX 760 SC at a 1220 MHz base clock (14% overclock) and 1840 MHz memory (23% overclock) then only offers a 16% increase in BF3. I think what we see here is that, when it's OC'd and there's temperature headroom, the dynamic OC will force it to increase voltage to maintain that highest plateau whether or not the render load actually requires it. It would be interesting to see the power it uses to hit that factored in.

    As W1zzard's charts indicate, "A light color means the clock / voltage combination is rarely used and a dark color means it's active a lot."

    What I find odd on the reference card graph is that, from the 980 MHz base, it isn't boosting to the claimed 1033 MHz right off, but running at less than that... more often, as shown by the dark diamonds? I thought you were supposed to get at least the advertised 1033 MHz as a minimum. It seems strange for Nvidia to state and advertise the reference 980/1033 MHz boost when, clearly by W1zzard's chart, the card appears to run below 1033 MHz fairly often while averaging a 1067 MHz boost. I would say they could logically advertise that average as the nominal figure. The EVGA chart shows that at stock it maintains higher clocks more often than W1zzard's 1220 MHz OC'd number from the previous page, while W1zzard's chart never has even a light diamond at the 1137 MHz GPU Boost the card is advertised at?

    So yes, I clearly don't get their boost algorithms. Please point me to a good, comprehensive article that explains Nvidia GPU Boost 2.0, so I/we can fully understand what you already must completely grasp. If you could spend some time providing explanations for what I'm pointing out, that would be helpful. Posting some graphs and saying I don't understand is your prerogative. I've searched, and basically I come up with the marketing slides Nvidia has provided, although those just skim the surface.

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/30.html
    http://www.hardwarecanucks.com/foru...-geforce-gtx-titan-gk110-s-opening-act-4.html
  14. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,645 (3.93/day)
    Thanks Received:
    11,383
    the transparency can be kinda misleading because it's only like 98% transparent. if enough (but still few) samples add up it will make the point look dark, and another point which has way more samples will appear just as dark.

    that's why i added the statistical analysis. median = max happens when more than 50% of samples are at the maximum (see below)

    i thought about adding a histogram, but that's too complex for most readers

    [​IMG]
    [​IMG]

    nvidia's "rated" boost clock frequency is some kind of average, it might not be an actual clock frequency
    Last edited: Jun 30, 2013
  15. Ketxxx

    Ketxxx Heedless Psychic

    Joined:
    Mar 4, 2006
    Messages:
    11,510 (3.75/day)
    Thanks Received:
    570
    Location:
    Kingdom of gods
    I think it's fair to point out to everybody screaming that the 760 is "giving hell" to the 7950 that the 7950 results likely aren't updated from the original 7950 review, meaning old, well-known problematic drivers. Then there's also the question of whether the 7950 results are from one of the first 7950s, which had a GPU clock of only 800 MHz, or from an updated 7950 with its 950 MHz GPU clock. All of this should be borne in mind, people.
  16. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,645 (3.93/day)
    Thanks Received:
    11,383
    what makes you think that? how is that even possible, unless i finally admit that i'm a time traveller
  17. Ketxxx

    Ketxxx Heedless Psychic

    Joined:
    Mar 4, 2006
    Messages:
    11,510 (3.75/day)
    Thanks Received:
    570
    Location:
    Kingdom of gods
    I said "likely" because obviously I don't know if you periodically re-run tests with updated drivers to keep results more accurate or not, nor do you specify. I'm not psychic nor do I own a crystal ball.

    I simply pointed out a reminder to people they should bare in mind the 7950 results are going to be impacted if they are the same results as from the original review, and again as its not specified depending on if those results were got from a early 800MHz 7950 or a later 950MHz 7950 is going to have a impact as well and such things should be taken into account by everybody saying the 760 is giving the 7950 "hell".
  18. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,645 (3.93/day)
    Thanks Received:
    11,383
    don't need to be a psychic, just read and use your brain, which is apparently too much to expect. howtf did you come to "likely" ?

    how do you explain the games selection that i test? the test setup page specifies the driver version, too. oh and there is haswell
  19. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (4.04/day)
    Thanks Received:
    3,480
    I'm not sure, but I think this might be a small hint at what hardware and driver versions are used for the tests...

    [​IMG]
    W1zzard says thanks.
  20. Lagittaja

    Joined:
    Dec 12, 2012
    Messages:
    19 (0.03/day)
    Thanks Received:
    1
    Temperature. The card has a temperature target of 80°C. It will boost to that 1124 MHz instantly, but when it reaches its temperature target it will downclock to save its ass from overheating.
    Now take a look at the EVGA card with the better cooler. :peace:

    Now, when the temperature goes down a bit it'll bring the speed back up a little, and it'll give it a little more if the temperature allows. Also, don't read the dynamic OC clocks/voltage picture as a linear progression; it just shows the clocks as they were used by the card.
    First it goes to 975, then 988, then 1001? No. It goes straight to the max it can and then backs down if the temperature gets too high.
    With the Titan/700 series you can change the temperature target. That is the beauty of GPU Boost 2.0.

    With AMD's PowerTune, it only takes power consumption into account, and isn't that a predefined value, so there are no actual calculations going on in the driver? Or have I misunderstood?
    GPU Boost 2.0 looks at power consumption via actual meters on board the card and at the GPU's temperature sensor.
    If you want the card not to consume much power, okay, you can do that: just change the power target and give it priority.
    If you don't want the card to go higher than, say, 77°C under load, okay, you can do that: just change the temperature target and give it priority.
    Or you can set them both and link them.

    In my opinion, this is freaking beautiful.
    I can't wait to get my card. Waiting on EVGA to release the damn FTW version :D
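    A rough sketch of that control idea, not NVIDIA's actual algorithm (the intermediate clock bins, the power target and the step size below are invented numbers; only the 80°C target and the 1072/1137 MHz clocks come from this thread): the card jumps to its highest boost bin and backs off while it exceeds the temperature or power target.
    Code:
    # Illustrative sketch only; bins, power target and step size are invented numbers.
    BOOST_BINS = [1137, 1124, 1110, 1097, 1084, 1072]   # MHz, highest first

    def pick_clock(temp_c, power_w, temp_target=80, power_target=170):
        """Highest boost bin allowed by the temperature and power targets."""
        overshoot = max(temp_c - temp_target, 0) + max(power_w - power_target, 0)
        steps = min(int(overshoot // 2), len(BOOST_BINS) - 1)  # back off one bin per 2 "units" over
        return BOOST_BINS[steps]

    print(pick_clock(temp_c=70, power_w=150))   # 1137: cool card stays at the top bin
    print(pick_clock(temp_c=84, power_w=150))   # 1110: over the 80 C target, backs off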
    Last edited: Jun 30, 2013
  21. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,645 (3.93/day)
    Thanks Received:
    11,383
    powertune does not _measure_ power consumption. it basically looks at an elaborate gpu load % and guesstimates power draw from that. this means every card behaves the same, there is no temperature variation or manufacturing variance
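    a tiny sketch of that contrast (purely illustrative; the linear load-to-power model below is invented, not AMD's): an estimate derived from load gives every card of a model the same number for the same load, while a measured reading reflects that particular board
    Code:
    # Illustrative only; the load-to-power model is invented, not AMD's actual one.
    def powertune_estimate(gpu_load_pct, tdp_w=200):
        # Same formula on every card -> no per-board or temperature variance.
        return tdp_w * (0.30 + 0.70 * gpu_load_pct / 100)

    def boost20_reading(board_sensor_w):
        # GPU Boost 2.0 style: trust the on-board power measurement as-is.
        return board_sensor_w

    print(powertune_estimate(90))    # 186.0 W on *every* card at 90% load
    print(boost20_reading(172.5))    # whatever this particular board actually draws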
    Lagittaja says thanks.
  22. Lagittaja

    Joined:
    Dec 12, 2012
    Messages:
    19 (0.03/day)
    Thanks Received:
    1
    +1 Thanks for the explanation.
    Btw, did you notice any coil whine while testing the EVGA card? There's a dude on the EVGA forums asking whether others are having that.
  23. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,127 (0.94/day)
    Thanks Received:
    82
    Location:
    So. Cal.
    Okay, that's something I hadn't contemplated, that it does that right off. I don't see a real reason why it couldn't/wouldn't build up gradually for the render load, instead of jumping to use power and build heat it doesn't absolutely need for the rendering load. I figure that more often than not the graphics load wouldn't be there, so why go full-out 100% and consume energy (heat) if it's only raising the FPS above what's needed for smooth play? I suppose that's where their Adaptive VSync (60 FPS) software comes in, which isn't used in such tests, so that may account for the jumping straight up to max. Also, I take it the "OC vs. Voltage" graphs aren't quantifying a particular title or benchmark, so it's hard to determine the render load(s) they're depicting. Here's something I'd like to ask: are there differences in the dynamic profiles a card like the EVGA has loaded in its BIOS versus the reference card?
    Is there any driver involvement in Nvidia's Boost? I've never heard there is.
    And that is why I said earlier,
    I didn't intend for this to go so far off the rails, but this is how we discover what is so often not truly ascertained from the marketing slides companies offer.
    Last edited: Jul 1, 2013
