Monday, March 6th 2017

GeForce GTX 1080 Ti Overclocked Beyond 2 GHz Put Through 3DMark

An NVIDIA GeForce GTX 1080 Ti reference-design graphics card was overclocked to 2062 MHz core and 11404 MHz (GDDR5X-effective) memory, and put through the 3DMark suite. The card sustained its overclock without breaking a sweat, its core temperature hovering around 63°C; the power limit was reportedly raised to 122% to hold the clocks. In the standard FireStrike benchmark (1080p), the card churned out a graphics score of 31,135 points, followed by 15,093 points in FireStrike Extreme (1440p), and 7,362 points in the 4K Ultra HD version of the benchmark, FireStrike Ultra. The card also scored 10,825 points in the TimeSpy DirectX 12 benchmark. Overall, the card appears to be roughly 30-40% faster than an overclocked GTX 1080.
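For context, here is a minimal back-of-the-envelope sketch of what those figures work out to. The reference specs used (352-bit memory bus, 1582 MHz boost clock, 11008 MT/s effective memory speed) are assumptions based on the GTX 1080 Ti's published specifications, not figures from the source:

```python
# Back-of-the-envelope figures for the overclock reported above.
# Assumed (not stated in the article): 352-bit memory bus, 1582 MHz
# reference boost clock, 11008 MT/s effective reference memory speed.
BUS_WIDTH_BITS = 352
REF_BOOST_MHZ = 1582
REF_MEM_MTS = 11008      # 11 Gbps GDDR5X, effective data rate

OC_CORE_MHZ = 2062
OC_MEM_MTS = 11404

def bandwidth_gbs(effective_mts, bus_bits):
    """Memory bandwidth in GB/s from effective data rate and bus width."""
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

print(f"Core clock vs. reference boost: {OC_CORE_MHZ / REF_BOOST_MHZ - 1:+.1%}")
print(f"Memory bandwidth: {bandwidth_gbs(REF_MEM_MTS, BUS_WIDTH_BITS):.0f} GB/s "
      f"-> {bandwidth_gbs(OC_MEM_MTS, BUS_WIDTH_BITS):.0f} GB/s")
```

On those assumptions, the core overclock works out to roughly +30% over the reference boost clock, and memory bandwidth rises from about 484 GB/s to about 502 GB/s.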
Sources: ChipHell, VideoCardz

72 Comments on GeForce GTX 1080 Ti Overclocked Beyond 2 GHz Put Through 3DMark

#51
GreiverBlade
Overclocked or boosted? Because it's just like the previous batch...

User A: "my card OC @ 2025mhz UUU!1!1!!!"
User B: " wow my card does not even goes up past 1706mhz"
User C: ".... i think User A is talking about boost ..."
User B: "oh .. well if so ... mine in boost goes up to 2100mhz..."

:rolleyes:

Edit: yep, after a second look at the pics: max boost frequency 2062 MHz, meaning an OC clock of 1531 MHz (well, if it does 2062 for 1531, that's a better boosting ratio than my 2100/2088 for 1706, although I keep my card at 1557 since it's not really underperforming for now).
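(A quick sketch of the boost-ratio arithmetic above, using only the clock figures quoted in this thread; the labels are just illustrative:)

```python
# Redoing the boost-ratio arithmetic from the post above; the
# (boost MHz, set-clock MHz) pairs are the figures quoted in this thread.
cards = {
    "1080 Ti sample": (2062, 1531),
    "GTX 1070":       (2100, 1706),
}

for name, (boost_mhz, set_mhz) in cards.items():
    print(f"{name}: {boost_mhz}/{set_mhz} = {boost_mhz / set_mhz:.3f}x boost ratio")
```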

BOOOORING
Caring1Maybe in gaming, but in benchmarks the 1060 is still ahead.
Errrr @Caring1, as much as I like you and your post... I have to write: if a card performs better in gaming but slightly worse in benchmarks, then it's the "winner". Benchmarks serve nothing but self-contentment (well, gaming too... but you don't interact with a benchmark).

For me, a (non-game) benchmark is worth horse sh!t when it comes to deciding which card is better.
#52
Prima.Vera
GreiverBlade...for me 3dmark is worth horse sh!t when it come to decide which card is better.
Contrary to others' opinions, 3DMark's benchmark engines are very well coded to use up to 16 CPU threads and to fully utilize all GPU capacity regardless of brand. That's why that software is used by 90% of the reviewers out there.
The game engines, however... hmm. Most of them are not optimised to fully use all the resources of the CPU/GPU; rather, they are optimized whichever way the wind blows. Or the way the cash flows from you-know-who... ;)
#53
GreiverBlade
Prima.VeraContrary to others opinions, 3DMark's benchmarks engines are very well coded to use up to 16 threads of CPU, and to fully utilize all GPU capacity despite of brand. That's why those software are used by 90% of the reviewers out there.
The game engines however...hmm. Most of them are unoptimised to fully use all resources of the CPU/GPU, but specially they are optimized by the way the wind blows. Or the cash flow from you know who... ;)
Yep, but nope... a bench is not indicative of a "real use situation".

Ah, TPU's GPU reviews use benchies? I always saw games :D (and Heaven 4.0 sometimes... since Heaven is a GPU hog, that one is a tad better... for GPU reviews).

Though if card A performs better in gaming than card B and worse in benchmarks, card A is the better of the two... (not literally speaking; if you focus on breaking benchmark scores, well, yep, you take card B... or a Titan XP/1080 Ti...)

Let me re-edit my post... I should generalize ;)
#54
Caring1
GreiverBladeif the card perform better in gaming but slightly worse in benchmarks ... then it's the "winner" ... benchmark serve nothing but self contentment (well gaming too ... but you don't interact with a benchmark ...)

for me (non game) Benchmark is worth horse sh!t when it come to decide which card is better.
I agree, but reviewers and tech sites tend to cite figures from benches, not real-world usage.
IMO, which of the 1060 and the 480 is better depends on the game being played.
#55
bug
GreiverBladeyep but, nope ... bench is not indicative of "real use situation"
As @Prima.Vera said, a benchmark like 3DMark will show what a card can do. Whether or not a game engine uses a card to its fullest is a different discussion. In the end, yes, it all comes down to how the few titles one plays work on a given card. But dismissing synthetic benchmarks altogether isn't a good idea.
#56
PowerPC
P4-630
That's a bit overboard, don't you think? If it was half a year, I would tend to agree. But it's looking more like 2-3 months until a potentially groundbreaking card from AMD (again, we don't know, but RX480 didn't disappoint). I don't get how 2-3 months isn't worth the wait right now.
#57
peche
Thermaltake fanboy
Captain_TomBut let's be clear: I don't have a crystal ball, and I am not saying Vega will 100% beat Pascal. But if Vega 10 can't at least MATCH the 1080 Ti, it will be a Bulldozer-level failure in my opinion.
+1 here...
Despite all that's been said, there is another thing to be thankful for: this war has helped us get better price tags on several things...
#58
GhostRyder
Prima.VeraCheaper?? You're joking right? Or you grow money on the trees?
I meant that I could sell my Titan XP for still over $1k (heck, I still see them sell for around $1,200), then buy a GTX 1080 Ti and end up getting some money back for similar or better performance in games.
GreiverBladeoverclocked or boosted? because it's like the previous batch ...

User A: "my card OC @ 2025mhz UUU!1!1!!!"
User B: " wow my card does not even goes up past 1706mhz"
User C: ".... i think User A is talking about boost ..."
User B: "oh .. well if so ... mine in boost goes up to 2100mhz..."

:rolleyes:

edit: yep after a second look at the pics : max boost frequency 2062mhz .... meaning a OC clock of 1531mhz (well if it does 2062 for 1531 that's a better boosting ratio than my 2100/2088 for 1706, although i keep my card at 1557 since it's not really underperforming for now)

BOOOORING


errrr @Caring1 as much as i like you and your post ... i have to write : if the card perform better in gaming but slightly worse in benchmarks ... then it's the "winner" ... benchmark serve nothing but self contentment (well gaming too ... but you don't interact with a benchmark ...)

for me (non game) Benchmark is worth horse sh!t when it come to decide which card is better.
Yea, it may not clock as much higher as originally thought. Or it may just be able to hold that boost range better.
#59
efikkan
sweetGTX 1060 was faster than RX 480 when released but now you can see RX 480 caught up and pulled ahead.
No, RX 480 is only better in AMD optimized games.
sweetIf AMD can pull a Fury die sink which is able to clock at 1500 MHz, and of coz add more fking ROPs, they can match 1080Ti for real.
Nope. Fiji couldn't match GM200 because AMD couldn't keep their 4096 cores saturated. If ROPs were the problem you would have seen a clear correlation with higher resolutions.
Captain_TomWow you just flat out can't read. Compare what it used to be to now. I said the 480 closed most of the gap, and Vega adds substantial Architectural enhancements. In fact the difference between Vega and Polaris seems to be FAR larger than that between Fiji and Polaris.

Let's see how this pans out ;)
I can read, but you either can't remember or don't understand your own words:
Keep in mind that Polaris closed most of the TFLOPS/performance gap that used to exist
Regardless of how you want to interpret this, Polaris did little to close the gap as you claimed.
I've already provided you with the Performance/FLOPs for the GPUs:
RX 480 (5161 GFlop/s) is beaten by the GTX 1060 (3855 GFlop/s), even though RX 480 has 34% more computational power
You can't seriously claim that Polaris "closed most of the […] gap"; that's insane.

And in terms of performance/watt it's even worse. GP104 is ~80% and GP102 ~85% more efficient than Polaris, which is a larger gap than with Maxwell. I'm pointing this out because AMD needs to solve all of these problems before they can become competitive. It would be the achievement of the century to do it in a single year! So please stop terrorizing everyone who bursts your AMD bubble by pointing out the facts and realistic estimates for AMD products!
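For anyone who wants to re-check those GFLOP/s figures, here is a minimal sketch of the theoretical-FP32 arithmetic behind them. The shader counts and base clocks are assumed from the cards' published reference specs; only the resulting GFLOP/s numbers appear in the post above:

```python
# Theoretical single-precision throughput: shaders x clock x 2 FLOPs per
# cycle (one fused multiply-add). Shader counts and base clocks below are
# assumed from published reference specs, not taken from this thread.
specs = {
    "RX 480":   {"shaders": 2304, "clock_mhz": 1120},
    "GTX 1060": {"shaders": 1280, "clock_mhz": 1506},
}

def fp32_gflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1000  # GFLOP/s

rx = fp32_gflops(**specs["RX 480"])
gtx = fp32_gflops(**specs["GTX 1060"])
print(f"RX 480:   {rx:.0f} GFLOP/s")                          # ~5161
print(f"GTX 1060: {gtx:.0f} GFLOP/s")                         # ~3855
print(f"RX 480 raw-compute advantage: {rx / gtx - 1:+.0%}")   # ~+34%
```

The ~34% raw-compute advantage matches the figure quoted in the post, which is the basis of the performance-per-FLOP argument.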
#60
deemon
PowerPCThat's a bit overboard, don't you think? If it was half a year, I would tend to agree. But it's looking more like 2-3 months until a potentially groundbreaking card from AMD (again, we don't know, but RX480 didn't disappoint). I don't get how 2-3 months isn't worth the wait right now.
2-3 months left of the 4-year wait. I gave up on it a year ago and went with a 1070... hopefully AMD delivers for the next GPU refresh :D
#61
P4-630
deemonI myself gave up on it year ago and went with 1070
Great card! No regrets!
:toast:
#63
rtwjunkie
PC Gaming Enthusiast
BasardAMD should have never bought ATI......
That has been the only thing keeping them afloat financially. Without ATI, they'd have gone under. Also, being diversified, unlike Intel and NVIDIA, is a long-term advantage, I believe.
#64
bug
rtwjunkieThat has been the only thing keeping them afloat financially. Without ATI, they'd have gone under. Also, being diversified, unlike Intel and NVIDIA is an advantage long-term, I believe.
Hehe. As far as "diversified" is concerned, it's AMD that's trailing. NVIDIA has SoCs and, through them, a foothold in car infotainment systems. And Intel is in so many markets, I don't know where to begin.
#65
Captain_Tom
peche+1 here...
Despite all said, there is another thing to thank, this war helped us to get a better price tag on several things....
Honestly for me it's not just about price this time: I really do want AMD to raise the performance bar substantially.


What I don't get is why so many people are impressed by the performance of the Titan/1080 Ti. I mean, it's bloody 2017 and we are supposed to be impressed with BARELY acceptable 4K performance for $700?! I got a Fury @ 1135/525 a year ago for $310, and the FAR newer 1080 Ti is only ~70% stronger for 225% of the money. Wake me up when we have ANY card that can do 4K at 100 Hz+ gaming...
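Whether or not you agree with the conclusion, the value arithmetic being invoked is simple to sketch. The prices and the "~70% stronger" delta below are the poster's own claims, not measurements:

```python
# Rough value comparison using the poster's own figures above (prices and
# the "~70% stronger" delta are quoted claims, not benchmark results).
fury_price_usd, ti_price_usd = 310, 700
fury_perf, ti_perf = 1.0, 1.7   # relative gaming performance, as claimed

print(f"Price ratio: {ti_price_usd / fury_price_usd:.0%}")   # ~226%
print(f"Perf per $100: Fury {fury_perf / fury_price_usd * 100:.2f}, "
      f"1080 Ti {ti_perf / ti_price_usd * 100:.2f}")
```

On those claimed numbers, the 1080 Ti costs about 2.26x as much for about 1.7x the performance, i.e. lower performance per dollar, which is the point being argued.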
#67
GreiverBlade
I just realized that my factory-OC 1070 (untouched) boosts to 1923 from 1557... oh well, only 139 MHz lower in boost... :D
#68
bug
Captain_TomHonestly for me it's not just about price this time: I really do want AMD to raise the performance bar substantially.


What I don't get is why so many people are impress by the performance of the Titan/1080 Ti.
I don't think anyone is impressed by performance alone. It's the efficiency that reached incredible levels with Pascal (i.e., what you can squeeze out of 250 W). But you can't talk about efficiency to an AMD fan, because, well...

At the same time, yes, I also await the day when a mid-range card will push 4K; that will be quite an achievement. But that day will not come in 2017 or 2018. Hopefully in 2019.
#69
Captain_Tom
bugI don't think anyone is impressed by performance alone. It's the efficiency that got to incredible levels with Pascal (i.e. what you can squeeze out of 250W). But you can't talk about efficiency to an AMD fan, because, well...
Hmm? Not sure why. After all, it wasn't until Pascal that Nvidia was more efficient. That is, unless you count the 580 as efficient and ignore that the R9 Nano existed.

Efficiency, like price, is a design choice. Anyone can do it.
#70
Prima.Vera
rtwjunkieThat has been the only thing keeping them afloat financially. Without ATI, they'd have gone under...
Some bad mouths are or were claiming the opposite, haha...
#71
bug
Prima.VeraSome bad mouths are or were claiming the opposite, haha...
Well, it's not exactly the case here, but that is the reason conglomerates exist: when one part of the business stalls, it falls back on the parts that don't.
#72
Vayra86
Captain_TomHonestly for me it's not just about price this time: I really do want AMD to raise the performance bar substantially.


What I don't get is why so many people are impress by the performance of the Titan/1080 Ti. I mean it's bloody 2017 and we are supposed to be impressed with BARELY acceptable 4K performance for $700?! I got a Fury @ 1135/525 a year ago for $310, and the FAR newer 1080 Ti is only ~70% stronger for 225% the money. Wake me up when we have ANY card that can do some 4K at 100Hz+ gaming...
Why are people impressed? Because they know that 4K is still in its early stages, and because the 1080 Ti keeps following the natural trend of roughly +30% performance over the previous-gen card, and usually even more than that (in fact, Pascal breaks the trend with another 5-10% extra performance compared to Fermi > Kepler > Maxwell). And not only that: it does so with much better perf/watt, low temps, and good OC capability. People can push the most powerful SKU and, with decent cooling, easily stay below 70°C, whereas the previous gen would constantly bump into BIOS temp limits of 80°C+.

The fact is, you are sugar-coating what hasn't come to market yet (Vega) because the '1080 Ti' can't even do 4K/ultra/60 fps. The logic is lost here. Vega will NOT be able to push that setting either; I'll provide you that bit of realism right now.

Also, you are STILL double posting.