
NVIDIA GeForce GTX 760 2048 MB

Joined
Nov 27, 2012
Messages
357 (0.19/day)
Likes
28
System Name Broken Butterfly.
Processor Intel Core i5-3570.
Motherboard Asus P8H67-M LE.
Cooling Cooler Master Hyper 212 Plus.
Memory 2x4GB Corsair Value Series.
Video Card(s) EVGA GTX 660 FTW Signature 2.
Storage 180GB Intel SSD 330 Series, WD Blue 500GB (AAKX).
Display(s) Samsung BX2031.
Case CM Elite 334.
Power Supply Corsair CX 430.
#26
I don't care. I'm way happy with the overall performance of my GPU!

Though, I did want the GTX 660 to have 1,000 or more CUDA cores.
 
Joined
Dec 14, 2006
Messages
376 (0.09/day)
Likes
53
System Name Ed-PC
Processor Intel i5-3570k
Motherboard Asus P8Z77 V-Pro
Cooling CM 212 evo
Memory Crucial Ballistix Tactical Tracer DDR3 1600 8GB
Video Card(s) Nvidia MSI 660ti PE OC
Storage WD black 500gig
Case Corsair 500R
Audio Device(s) onboard
Power Supply Corsair 650TX V2
Software Win7 Pro 64bit
#27
Too hot and, by the way, who buys reference cards?
Non-reference cards should provide better thermals, but they come at a price.
So, this last breath of GK104 is nothing more than squeezing the last buck out of it before back to school or even Christmas...
Right, it's only the reference cards that run hot; any of the 3rd-party twin-fan units run much cooler, and that's without even setting up a fan profile.
For example, I think the reference 660 Ti was around 75°C under full load, but with the MSI 660 Ti it went down to 68-70°C, and with slight adjustments to the fan profile it's no problem keeping temps at 60°C.
This is with minimal noise too, on an auto profile, not a straight-line xx RPM across all temps.
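For anyone wanting to see what a simple custom curve looks like versus a flat xx RPM setting, here's a rough sketch; the temperature/speed points are purely illustrative, not any vendor's defaults:

```python
# Rough illustration of a custom fan curve vs. a flat fan speed.
# The curve points below are made-up examples, not any vendor's defaults.

CURVE = [(30, 30), (50, 40), (60, 55), (70, 75), (80, 100)]  # (GPU temp °C, fan %)

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan % between the defined curve points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # past the last point: run at full speed

for t in (45, 60, 68, 75):
    print(f"{t} °C -> {fan_speed(t):.0f}% fan")
```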
 
Joined
Apr 19, 2011
Messages
1,735 (0.70/day)
Likes
206
Location
So. Cal.
#28
So, this last breath of GK104 is nothing more than squeezing the last buck out of it.
higher power consumption on newer series card, running on even more cutback silicon
That's basically the sum of it, and on those points it still feels pricey... I mean, other than the "hocus pocus" that is its 256-bit bus (it provides little to no benefit), it should've been a 660 Ti GSO/SE.
 
Joined
Dec 14, 2006
Messages
376 (0.09/day)
Likes
53
System Name Ed-PC
Processor Intel i5-3570k
Motherboard Asus P8Z77 V-Pro
Cooling CM 212 evo
Memory Crucial Ballistix Tactical Tracer DDR3 1600 8GB
Video Card(s) Nvidia MSI 660ti PE OC
Storage WD black 500gig
Case Corsair 500R
Audio Device(s) onboard
Power Supply Corsair 650TX V2
Software Win7 Pro 64bit
#29
That's basically the sum of it, and on those points it still feels pricey... I mean, other than the "hocus pocus" that is its 256-bit bus (it provides little to no benefit), it should've been a 660 Ti GSO/SE.
Except they are pricing it much lower than the 660 Ti was released at, which was in the $300+ range.
But yes, it is just a tweaked GK104 core; no die shrink has happened here.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
14,659 (3.97/day)
Likes
8,232
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K at stock (hits 5 gees+ easily)
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (4 x 4GB Corsair Vengeance DDR3 PC3-12800 C9 1600MHz)
Video Card(s) Zotac GTX 1080 AMP! Extreme Edition
Storage Samsung 850 Pro 256GB | WD Green 4TB
Display(s) BenQ XL2720Z | Asus VG278HE (both 27", 144Hz, 3D Vision 2, 1080p)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair HX 850W v1
Software Windows 10 Pro 64-bit
#30
I do love a good rebrand!

/sarcasm
 
Joined
Apr 19, 2011
Messages
1,735 (0.70/day)
Likes
206
Location
So. Cal.
#31
Except they are pricing it much lower than the 660 Ti was released at
The GTX 660 Ti's MSRP was $300; account for the 14% lower CUDA and TMU counts on this gelding and 300 - 14% = $258, so there's no real price differential. Then consider how this continues to "water down" GK104: the total number of GK104 derivatives is now at four. Nvidia's price per chip is really low vs. a Tahiti that has three (although the Tahiti LE is nowhere near the same volumes) and a bigger die, 365 mm² vs. 294 mm² (almost 25% bigger). Nvidia is still left with the lion's share of meat on their GK104 production.

Considering that we've seen both the "Tahiti LE" 7870 at $200 after rebate and now special 7950s at $220 after rebate, $250 wasn't nearly as assertive as they could have been, although as the GTX 670 goes EOL we might see Nvidia get aggressive. I thought we might have seen a $230 MSRP, but that would've put the stockpiles of GTX 670s in a precarious position.
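To put rough numbers on the spec-adjusted comparison above, here's a quick sketch using only the MSRPs, cut-down percentage, and die sizes quoted in this thread, nothing official:

```python
# Quick back-of-envelope check of the spec-adjusted price argument above.
# All figures are the ones quoted in this thread, not official pricing data.

gtx_660ti_msrp = 300   # USD launch MSRP
spec_cut = 0.14        # ~14% fewer CUDA cores / TMUs on the GTX 760
gtx_760_msrp = 250     # USD launch MSRP

spec_adjusted = gtx_660ti_msrp * (1 - spec_cut)
print(f"Spec-adjusted 660 Ti price: ${spec_adjusted:.0f}")                # ~$258
print(f"Real differential vs. $250: ${spec_adjusted - gtx_760_msrp:.0f}")

tahiti_mm2, gk104_mm2 = 365, 294   # die areas quoted above
print(f"Tahiti die is {tahiti_mm2 / gk104_mm2 - 1:.1%} bigger than GK104")
```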
 
Joined
Dec 12, 2012
Messages
32 (0.02/day)
Likes
5
System Name Butterfly Effect
Processor i7-3770K 4.0Ghz uv
Motherboard Asus Maximus IV Gene-Z
Cooling HR-02 Macho
Memory 4x4GB 2133/10-12-12-32/2T
Video Card(s) GTX 670 2GB (OC 1267/1750)
Storage 840 EVO 250GB + WD30EFRX
Display(s) BenQ G2410HD @ 75Hz
Case Lian Li PC-A05NB
Audio Device(s) Asus Xonar DX
Power Supply Seasonic 660XP²
#32
Listed for 239€ in Europe... Would have liked to see it a tad cheaper, maybe 229€, but it's not too bad. How does it stack up against the Tahiti LE variants of the 7870? I'd be interested in seeing that.
'cept the reference-clocked EVGA blower goes for 199€ + shipping (12€ to Finland).
The reference-clocked ACX is 209€, the SC blower 209€, and the SC ACX 209€.

The cheapest 7950 goes for 240€ + shipping from Germany; the good 7950s go for 280€+.

AMD is in agony, lol, even more so when you consider that the reference 760 cards on average OC'd to 1245 MHz on the core. I checked over a dozen reviews, and not all of them even said what the actual boost clock under load was, only mentioning what GPU-Z reported for the boost.
 
Joined
Apr 19, 2011
Messages
1,735 (0.70/day)
Likes
206
Location
So. Cal.
#33
consider that the reference 760 cards on average OC'd to 1245 MHz on the core. I checked over a dozen reviews, and not all of them even said what the actual boost clock under load was, only mentioning what GPU-Z reported for the boost.
Going by W1zzard's six reviews, the average is around 1175 MHz (a 16% OC). While that's substantial, it's the luck of the draw as to what the dynamic OC actually provides in FPS increases. His reviews show wide-ranging results of between a 12.8-17.7% increase in that 1175 MHz range. Even his highest recorded, on the EVGA GTX 760 SC at 1220 MHz, gave a 16% increase in BF3.

Compare that to, say, W1zzard's Joker (Tahiti LE) review, where a 24% OC provides 18% more FPS, or the 7950 IceQ, where a 27% OC realizes 22.8% more FPS.

Pinning your hopes on what you get from Nvidia's boost clock profiles is more of a crapshoot than what you'd normally presume you'd get from OC'ing a card. Sure, sometimes a card is a dud, but with Nvidia, even when a card shows a high MHz, that might not be the only constraint holding back performance.
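For reference, here's that comparison as a quick sketch, using only the review figures quoted above (rounded, as cited in this thread):

```python
# FPS gained per percent of core OC, using the review figures quoted above.
# Numbers are as cited in this thread (rounded), not re-measured.

data = [
    ("GTX 760 @ ~1175 MHz, low end",  16.0, 12.8),
    ("GTX 760 @ ~1175 MHz, high end", 16.0, 17.7),
    ("Joker Tahiti LE",               24.0, 18.0),
    ("7950 IceQ",                     27.0, 22.8),
]

for name, oc_pct, fps_pct in data:
    # ratio of FPS gain to clock gain: a rough "scaling efficiency"
    print(f"{name:30s} {oc_pct:4.1f}% OC -> {fps_pct:4.1f}% FPS "
          f"(ratio {fps_pct / oc_pct:.2f})")
```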
 
Joined
Dec 14, 2006
Messages
376 (0.09/day)
Likes
53
System Name Ed-PC
Processor Intel i5-3570k
Motherboard Asus P8Z77 V-Pro
Cooling CM 212 evo
Memory Crucial Ballistix Tactical Tracer DDR3 1600 8GB
Video Card(s) Nvidia MSI 660ti PE OC
Storage WD black 500gig
Case Corsair 500R
Audio Device(s) onboard
Power Supply Corsair 650TX V2
Software Win7 Pro 64bit
#34
The GTX 660 Ti's MSRP was $300; account for the 14% lower CUDA and TMU counts on this gelding and 300 - 14% = $258, so there's no real price differential. Then consider how this continues to "water down" GK104: the total number of GK104 derivatives is now at four. Nvidia's price per chip is really low vs. a Tahiti that has three (although the Tahiti LE is nowhere near the same volumes) and a bigger die, 365 mm² vs. 294 mm² (almost 25% bigger). Nvidia is still left with the lion's share of meat on their GK104 production.

Considering that we've seen both the "Tahiti LE" 7870 at $200 after rebate and now special 7950s at $220 after rebate, $250 wasn't nearly as assertive as they could have been, although as the GTX 670 goes EOL we might see Nvidia get aggressive. I thought we might have seen a $230 MSRP, but that would've put the stockpiles of GTX 670s in a precarious position.
I have an MSI 660 Ti PE/OC, so I know the specs; the 660 Ti was just a 670 with a 192-bit memory bus.
Yes, they lowered the core count but gave back the 256-bit bus, which from a performance point of view seems more balanced (it is faster than the 660 Ti by a small amount).

As for pricing, it seems about right for the performance it gives; sure, lower is always better from a consumer's point of view.
 
Joined
Dec 12, 2012
Messages
32 (0.02/day)
Likes
5
System Name Butterfly Effect
Processor i7-3770K 4.0Ghz uv
Motherboard Asus Maximus IV Gene-Z
Cooling HR-02 Macho
Memory 4x4GB 2133/10-12-12-32/2T
Video Card(s) GTX 670 2GB (OC 1267/1750)
Storage 840 EVO 250GB + WD30EFRX
Display(s) BenQ G2410HD @ 75Hz
Case Lian Li PC-A05NB
Audio Device(s) Asus Xonar DX
Power Supply Seasonic 660XP²
#35
Going by W1zzard's six reviews, the average is around 1175 MHz (a 16% OC). While that's substantial, it's the luck of the draw as to what the dynamic OC actually provides in FPS increases. His reviews show wide-ranging results of between a 12.8-17.7% increase in that 1175 MHz range. Even his highest recorded, on the EVGA GTX 760 SC at 1220 MHz, gave a 16% increase in BF3.

Compare that to, say, W1zzard's Joker (Tahiti LE) review, where a 24% OC provides 18% more FPS, or the 7950 IceQ, where a 27% OC realizes 22.8% more FPS.

Pinning your hopes on what you get from Nvidia's boost clock profiles is more of a crapshoot than what you'd normally presume you'd get from OC'ing a card. Sure, sometimes a card is a dud, but with Nvidia, even when a card shows a high MHz, that might not be the only constraint holding back performance.
lol, butthurt AMD fan.
P.S. Both my rigs have AMD GPUs right now.
 
Joined
Apr 19, 2011
Messages
1,735 (0.70/day)
Likes
206
Location
So. Cal.
#36
the 660 Ti was just a 670 with a 192-bit memory bus
It got the same clocks as the 670, but with one memory controller disabled, along with one block of raster operators fused off, is how I'd articulate it. The 760 is basically a price reduction in the guise of a new, improved model that, other than higher clocks and improved boost profiles, doesn't provide much different from what a custom-OC GTX 660 Ti gave us back around September 2012 for $320. So basically Nvidia is giving on-par performance for roughly 18% less cash, while using up parts they'd accumulated in a bin; I'm OK with that.

lol, butthurt AMD fan.
Not a fan, just expounding data and known specifications... Not "I've looked around on the review sites and this is some opinion"... which is basically why you don't show or expound on what I said, instead of just childish banter.

Who's the fan? :slap:
 
Joined
Dec 12, 2012
Messages
32 (0.02/day)
Likes
5
System Name Butterfly Effect
Processor i7-3770K 4.0Ghz uv
Motherboard Asus Maximus IV Gene-Z
Cooling HR-02 Macho
Memory 4x4GB 2133/10-12-12-32/2T
Video Card(s) GTX 670 2GB (OC 1267/1750)
Storage 840 EVO 250GB + WD30EFRX
Display(s) BenQ G2410HD @ 75Hz
Case Lian Li PC-A05NB
Audio Device(s) Asus Xonar DX
Power Supply Seasonic 660XP²
#37
You clearly don't even know how GPU Boost 1.0 or 2.0 actually works :D

The EVGA card.


That's stock: 1072 MHz base, 1137 MHz boost.
Average: 1219 MHz.
Increase the base to 1220 and boost to 1286 and the actual boost will be even higher.

You, sir, fail. :nutkick:

lol

Or how about the reference card, with a stock base of 980 MHz and boost of 1033 MHz?

Average: 1067 MHz.

booyah
 
Joined
Apr 19, 2011
Messages
1,735 (0.70/day)
Likes
206
Location
So. Cal.
#38
Increase the base to 1220 and boost to 1286 and the actual boost will be even higher.
Oh, I see what you mean. What I'd like to see is that same graph showing the OC'd settings W1zzard ran with.

But what you posted explains little. I'd like someone to explain how, on the first EVGA chart, the max and median can both be 1228 MHz.

While those charts are interesting, they don't dispel the fact that the EVGA GTX 760 SC at a 1220 MHz base clock (a 14% overclock) and 1840 MHz memory (a 23% overclock) then only offers a 16% increase in BF3. I think what we see here is that, when OC'd, if there's temperature headroom the dynamic OC will force the card to increase voltage to maintain that highest plateau whether or not the render load actually requires it. It'd be interesting to see the power it uses to hit that factored in.

As W1zzard's charts indicate, "A light color means the clock / voltage combination is rarely used and a dark color means it's active a lot."

What I find odd on the reference card graph is that from the base of 980 MHz it isn't boosting to the claimed 1033 MHz right off, but running at less than that more often, as shown by the dark diamonds. I thought you were supposed to get at least the 1033 MHz minimum that's advertised. It seems strange for Nvidia to state and advertise the reference 980/1033 MHz boost when, clearly by W1zzard's chart, the card appears to run fairly often below 1033 MHz while averaging a 1067 MHz boost; I would say they could logically advertise that as the average/nominal. The EVGA chart shows that at stock it maintains higher clocks more often than W1zzard's 1220 MHz OC'd number from the previous page, yet W1zzard's chart never has even a light diamond at the 1137 MHz GPU Boost clock the card is advertised at.

So yes, I clearly don't get their boost algorithms. Please point me to a good and comprehensive article that explains Nvidia Boost 2.0, so I/we can fully understand what you already must completely grasp. If you could spend some time providing explanations for what I'm pointing out, that would be helpful. Posting some graphs and saying I don't understand is your prerogative. I've searched, and basically I come up with the marketing graphs that Nvidia has provided, although those just skim the surface.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/30.html
http://www.hardwarecanucks.com/foru...-geforce-gtx-titan-gk110-s-opening-act-4.html
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
17,160 (3.43/day)
Likes
18,113
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
#39
the transparency can be kinda misleading because it's only like 98% transparent. if enough (but still few) samples add up it will make the point look dark, and another point which has way more samples will appear just as dark.

that's why i added the statistical analysis. median = max happens when more than 50% of samples are at the maximum (see below)

i thought about adding a histogram, but that's too complex for most readers




nvidia's "rated" boost clock frequency is some kind of average, it might not be an actual clock frequency
 

Ketxxx

Heedless Psychic
Joined
Mar 4, 2006
Messages
11,507 (2.65/day)
Likes
562
Location
Kingdom of gods
System Name Prowler. V9.
Processor Intel i5 3570k @ 4.6GHz 1.2v
Motherboard Asrock Z77 Extreme6
Cooling Modded CoolIT ECO ALC, 3x 120mm Coolermaster Sickleflow fans
Memory 2x4GB G.Skill Ripjaws @ 2133MHz 10-10-10-25 1N (T)
Video Card(s) HD7950 Vapor-X @ 1.25GHz 1.05v, 6GHz 1.5v
Storage WD Caviar Black 640GB, 32MB cache, SATA
Display(s) 22" LG Flatron W2242S
Case NZXT Apollo
Audio Device(s) Asus Xonar DX 7.1 PCI-E
Power Supply Corsair HX850w modular
Software Windows 7 x64
#40
I think it's fair to point out, to everybody screaming that the 760 is "giving hell" to the 7950, that the 7950 results likely aren't updated from the 7950 review, meaning old, well-known problematic drivers. Then there's also the question of whether the 7950 results are from one of the first 7950s, which had a GPU clock of only 800 MHz, or from an updated 7950 with its 950 MHz GPU clock. All of this should be borne in mind, people.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
17,160 (3.43/day)
Likes
18,113
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
#41
that the 7950 results likely aren't updated from the 7950 review
what makes you think that? how is that even possible, unless i finally admit that i'm a time traveller
 

Ketxxx

Heedless Psychic
Joined
Mar 4, 2006
Messages
11,507 (2.65/day)
Likes
562
Location
Kingdom of gods
System Name Prowler. V9.
Processor Intel i5 3570k @ 4.6GHz 1.2v
Motherboard Asrock Z77 Extreme6
Cooling Modded CoolIT ECO ALC, 3x 120mm Coolermaster Sickleflow fans
Memory 2x4GB G.Skill Ripjaws @ 2133MHz 10-10-10-25 1N (T)
Video Card(s) HD7950 Vapor-X @ 1.25GHz 1.05v, 6GHz 1.5v
Storage WD Caviar Black 640GB, 32MB cache, SATA
Display(s) 22" LG Flatron W2242S
Case NZXT Apollo
Audio Device(s) Asus Xonar DX 7.1 PCI-E
Power Supply Corsair HX850w modular
Software Windows 7 x64
#42
I said "likely" because obviously I don't know if you periodically re-run tests with updated drivers to keep results more accurate or not, nor do you specify. I'm not psychic nor do I own a crystal ball.

I simply pointed out a reminder that people should bear in mind the 7950 results are going to be impacted if they are the same results as in the original review. And again, since it's not specified, whether those results came from an early 800 MHz 7950 or a later 950 MHz 7950 is going to have an impact as well; such things should be taken into account by everybody saying the 760 is giving the 7950 "hell".
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
17,160 (3.43/day)
Likes
18,113
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
#43
I said "likely" because obviously I don't know if you periodically re-run tests with updated drivers to keep results more accurate or not, nor do you specify. I'm not psychic nor do I own a crystal ball.
don't need to be a psychic, just read and use your brain, which is apparently too much to expect. howtf did you come to "likely" ?

how do you explain the games selection that i test? the test setup page specifies the driver version, too. oh and there is haswell
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
14,659 (3.97/day)
Likes
8,232
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K at stock (hits 5 gees+ easily)
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (4 x 4GB Corsair Vengeance DDR3 PC3-12800 C9 1600MHz)
Video Card(s) Zotac GTX 1080 AMP! Extreme Edition
Storage Samsung 850 Pro 256GB | WD Green 4TB
Display(s) BenQ XL2720Z | Asus VG278HE (both 27", 144Hz, 3D Vision 2, 1080p)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair HX 850W v1
Software Windows 10 Pro 64-bit
#44
I'm not sure, but I think this might be a small hint at what hardware and driver versions are used for the tests...

 
Joined
Dec 12, 2012
Messages
32 (0.02/day)
Likes
5
System Name Butterfly Effect
Processor i7-3770K 4.0Ghz uv
Motherboard Asus Maximus IV Gene-Z
Cooling HR-02 Macho
Memory 4x4GB 2133/10-12-12-32/2T
Video Card(s) GTX 670 2GB (OC 1267/1750)
Storage 840 EVO 250GB + WD30EFRX
Display(s) BenQ G2410HD @ 75Hz
Case Lian Li PC-A05NB
Audio Device(s) Asus Xonar DX
Power Supply Seasonic 660XP²
#45
Oh, I see what you mean. What I'd like to see is that same graph showing the OC'd settings W1zzard ran with.

But what you posted explains little. I'd like someone to explain how, on the first EVGA chart, the max and median can both be 1228 MHz.

While those charts are interesting, they don't dispel the fact that the EVGA GTX 760 SC at a 1220 MHz base clock (a 14% overclock) and 1840 MHz memory (a 23% overclock) then only offers a 16% increase in BF3. I think what we see here is that, when OC'd, if there's temperature headroom the dynamic OC will force the card to increase voltage to maintain that highest plateau whether or not the render load actually requires it. It'd be interesting to see the power it uses to hit that factored in.

As W1zzard's charts indicate, "A light color means the clock / voltage combination is rarely used and a dark color means it's active a lot."

What I find odd on the reference card graph is that from the base of 980 MHz it isn't boosting to the claimed 1033 MHz right off, but running at less than that more often, as shown by the dark diamonds. I thought you were supposed to get at least the 1033 MHz minimum that's advertised. It seems strange for Nvidia to state and advertise the reference 980/1033 MHz boost when, clearly by W1zzard's chart, the card appears to run fairly often below 1033 MHz while averaging a 1067 MHz boost; I would say they could logically advertise that as the average/nominal. The EVGA chart shows that at stock it maintains higher clocks more often than W1zzard's 1220 MHz OC'd number from the previous page, yet W1zzard's chart never has even a light diamond at the 1137 MHz GPU Boost clock the card is advertised at.

So yes, I clearly don't get their boost algorithms. Please point me to a good and comprehensive article that explains Nvidia Boost 2.0, so I/we can fully understand what you already must completely grasp. If you could spend some time providing explanations for what I'm pointing out, that would be helpful. Posting some graphs and saying I don't understand is your prerogative. I've searched, and basically I come up with the marketing graphs that Nvidia has provided, although those just skim the surface.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/30.html
http://www.hardwarecanucks.com/foru...-geforce-gtx-titan-gk110-s-opening-act-4.html
Temperature. The card has a temperature target of 80°C. It will boost to that 1124 MHz instantly, but when it reaches its temperature target it will downclock to save its ass from overheating.
Now take a look at the EVGA card with the better cooler. :peace:

Now, when the temperature goes down a bit, it'll up the speed a little. It'll give it a little more if temps allow. Also, don't look at the dynamic OC clocks/voltage picture as linear; it just shows the clocks as they were used by the card.
First it goes to 975, then 988, then 1001? No. Straight to the max it can go, then back down if the temperature gets too high.
With the Titan/700 series you can change the temperature target. That is the beauty of GPU Boost 2.0.

With AMD's PowerTune, it only takes power consumption into account, and isn't that a predefined value, so there are no actual calculations going on by the driver? Or have I misunderstood?
GPU Boost 2.0 looks at the power consumption via actual meters on board the card and at the temperature sensor of the GPU.
If you want the card not to consume much power, okay, you can do that: just change the power target and give it priority.
If you don't want the card to go higher than, say, 77°C under load, okay, you can do that: just change the temperature target and give it priority.
Or you can set them both and link them.

In my opinion, this is freaking beautiful.
I can't wait to get my card. Waiting on EVGA to release the damn FTW version :D
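Very loosely, the behaviour described above could be sketched like this; note this is only a toy model of a temperature/power-target loop, not Nvidia's actual GPU Boost 2.0 algorithm, and the clocks, targets, and sensor readings are all made up:

```python
# Toy model of a boost loop with temperature and power targets.
# Not Nvidia's actual GPU Boost 2.0 algorithm; values are illustrative only.

BASE_MHZ, MAX_BOOST_MHZ, STEP_MHZ = 1072, 1228, 13
TEMP_TARGET_C, POWER_TARGET_W = 80, 170

def next_clock(current_mhz, temp_c, power_w):
    """Jump toward the max bin, then back off if either target is exceeded."""
    if temp_c > TEMP_TARGET_C or power_w > POWER_TARGET_W:
        return max(BASE_MHZ, current_mhz - STEP_MHZ)        # back down a bin
    return min(MAX_BOOST_MHZ, current_mhz + 4 * STEP_MHZ)   # climb quickly

# Simulated (made-up) sensor readings over a few sampling intervals:
clock = BASE_MHZ
for temp_c, power_w in [(65, 150), (72, 160), (81, 172), (83, 174), (79, 165)]:
    clock = next_clock(clock, temp_c, power_w)
    print(f"temp {temp_c:2d} °C, power {power_w:3d} W -> {clock} MHz")
```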
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
17,160 (3.43/day)
Likes
18,113
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
#46
With AMD's PowerTune, it only takes power consumption into account, and isn't that a predefined value, so there are no actual calculations going on by the driver?
powertune does not _measure_ power consumption. it basically looks at an elaborate gpu load % and guesstimates power draw from that. this means every card behaves the same, there is no temperature variation or manufacturing variance
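In code terms, the difference might look roughly like this; a toy contrast only, not either vendor's real algorithm, and all the coefficients are invented:

```python
# Toy contrast between an estimated power limiter (PowerTune-style) and a
# measured one (Boost 2.0-style). Neither is a real vendor algorithm and
# all coefficients here are invented for illustration.

IDLE_W, TDP_W, POWER_LIMIT_W = 15, 200, 185

def estimated_power(gpu_load_pct):
    """Guess power from load alone: every card gets the same answer."""
    return IDLE_W + (TDP_W - IDLE_W) * gpu_load_pct / 100

def should_throttle_estimated(gpu_load_pct):
    return estimated_power(gpu_load_pct) > POWER_LIMIT_W

def should_throttle_measured(measured_power_w):
    """Use an on-board power reading: leaky vs. efficient chips now differ."""
    return measured_power_w > POWER_LIMIT_W

# Two hypothetical cards at the same 95% load, one leakier than the other:
print(should_throttle_estimated(95))      # same verdict for every card
print(should_throttle_measured(192.0))    # leaky sample -> throttles
print(should_throttle_measured(178.0))    # efficient sample -> doesn't
```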
 
Joined
Dec 12, 2012
Messages
32 (0.02/day)
Likes
5
System Name Butterfly Effect
Processor i7-3770K 4.0Ghz uv
Motherboard Asus Maximus IV Gene-Z
Cooling HR-02 Macho
Memory 4x4GB 2133/10-12-12-32/2T
Video Card(s) GTX 670 2GB (OC 1267/1750)
Storage 840 EVO 250GB + WD30EFRX
Display(s) BenQ G2410HD @ 75Hz
Case Lian Li PC-A05NB
Audio Device(s) Asus Xonar DX
Power Supply Seasonic 660XP²
#47
+1 Thanks for the explanation.
Btw, did you notice any coil whine while testing the EVGA card? There's a dude on the EVGA forums asking if any others are having that.
 
Joined
Apr 19, 2011
Messages
1,735 (0.70/day)
Likes
206
Location
So. Cal.
#48
It will boost to that 1124 MHz instantly, but when it reaches its temperature target it will downclock to save its ass from overheating...
Also, don't look at the dynamic OC clocks/voltage picture as linear; it just shows the clocks as they were used by the card. First it goes to 975, then 988, then 1001? No. Straight to the max it can go, then back down if the temperature gets too high.
Okay, that's something I hadn't contemplated it did right off. I just don't see a real reason why it couldn't/wouldn't build up gradually to match the render load, instead of jumping to use power and build heat it doesn't absolutely need for the rendering load. I figure that more often than not the graphics load wouldn't be there, so why go full-out 100% and consume energy (heat) if it's only increasing the FPS above what's needed for smooth play? I suppose that's where the idea of their Adaptive VSync (60 FPS) software comes in, which isn't used in such tests, so that may account for the jumping straight up to max. Also, I take it the "OC vs. Voltage" graphs aren't quantifying a particular title or benchmark, so it's hard to determine the render load(s) they're depicting. Here's something I'd like to ask: are there differences in the dynamic profiles a card like the EVGA has loaded in its BIOS versus the reference card?
With AMD's PowerTune there are no actual calculations going on by the driver?
Is there any use of drivers in Nvidia's Boost? I never heard that there is.
nvidia's "rated" boost clock frequency is some kind of average, it might not be an actual clock frequency.
this means every (AMD) card behaves the same, there is no temperature variation or manufacturing variance
And that is why I said earlier,
It's the luck of the draw as to what the dynamic OC actually provides in FPS increases.
I didn't intend for this to go so far off the tracks, but this is how we discover what is so often not truly spelled out in the marketing slides companies offer.
 