
GeForce GTX 1080 Ti Overclocked Beyond 2 GHz Put Through 3DMark

GTX 1060 was faster than RX 480 when released, but now you can see RX 480 has caught up and pulled ahead.
Maybe in gaming, but in benchmarks the 1060 is still ahead.
These are similar platforms with Ryzen CPUs and the two GPUs mentioned; if you look at the graphics score, the 1060 is ahead despite being paired with the slower CPU.
http://www.3dmark.com/spy/1325916?_ga=1.266540942.181503479.1487944691
http://www.3dmark.com/spy/1325911?_ga=1.207237779.181503479.1487944691
 
Overclocked or boosted? Because it's like the previous batch ...

User A: "my card OC @ 2025mhz UUU!1!1!!!"
User B: " wow my card does not even goes up past 1706mhz"
User C: ".... i think User A is talking about boost ..."
User B: "oh .. well if so ... mine in boost goes up to 2100mhz..."

:rolleyes:

edit: yep, after a second look at the pics: max boost frequency 2062 MHz ... meaning an OC clock of 1531 MHz (well, if it does 2062 for 1531, that's a better boosting ratio than my 2100/2088 for 1706, although I keep my card at 1557 since it's not really underperforming for now)
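A quick back-of-the-envelope check of those boost ratios (a minimal sketch in Python; the clocks are just the ones quoted in this thread):

```python
# Boost-ratio comparison using the clocks quoted above.
# "set" is the configured/OC clock, "boost" the observed maximum boost clock.
cards = {
    "1080 Ti in the article": (1531, 2062),
    "my 1070":                (1706, 2100),
}

for name, (set_clock, boost_clock) in cards.items():
    ratio = boost_clock / set_clock
    print(f"{name}: {set_clock} -> {boost_clock} MHz, ratio ~{ratio:.2f}x")

# Approximate output:
# 1080 Ti in the article: 1531 -> 2062 MHz, ratio ~1.35x
# my 1070: 1706 -> 2100 MHz, ratio ~1.23x
```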

BOOOORING

Maybe in gaming, but in benchmarks the 1060 is still ahead.
errrr @Caring1, as much as I like you and your post ... I have to write: if the card performs better in gaming but slightly worse in benchmarks ... then it's the "winner" ... benchmarks serve nothing but self-contentment (well, gaming too ... but you don't interact with a benchmark ...)

for me, a (non-game) benchmark is worth horse sh!t when it comes to deciding which card is better.
 
...for me, 3DMark is worth horse sh!t when it comes to deciding which card is better.
Contrary to others' opinions, 3DMark's benchmark engines are very well coded to use up to 16 CPU threads and to fully utilize all GPU capacity regardless of brand. That's why that software is used by 90% of the reviewers out there.
The game engines, however... hmm. Most of them are not optimized to fully use all the CPU/GPU resources; instead, they are optimized by the way the wind blows. Or by the cash flow from you know who... ;)
 
Contrary to others' opinions, 3DMark's benchmark engines are very well coded to use up to 16 CPU threads and to fully utilize all GPU capacity regardless of brand. That's why that software is used by 90% of the reviewers out there.
The game engines, however... hmm. Most of them are not optimized to fully use all the CPU/GPU resources; instead, they are optimized by the way the wind blows. Or by the cash flow from you know who... ;)
yep, but nope ... a bench is not indicative of a "real use" situation

ah, TPU GPU reviews use benchies? I always saw games :D (and Heaven 4.0 sometimes ... since Heaven is a GPU hog ... well, that one is a tad better ... for GPU reviews)

though if card A performs better in gaming than card B and worse in benchmarks, card A is the better of the two ... (not literally speaking; if you focus on breaking benchmark scores ... well, yep, you take card B ... or a Titan XP/1080 Ti ...)

let me re-edit my post ... I should generalize ;)
 
if the card performs better in gaming but slightly worse in benchmarks ... then it's the "winner" ... benchmarks serve nothing but self-contentment (well, gaming too ... but you don't interact with a benchmark ...)

for me, a (non-game) benchmark is worth horse sh!t when it comes to deciding which card is better.
I agree, but reviewers and tech sites tend to cite figures from benches, not real-world usage.
IMO, which of the 1060 and the 480 is better depends on the game played.
 
yep, but nope ... a bench is not indicative of a "real use" situation

As @Prima.Vera said, a benchmark like 3DMark will show what a card can do. Whether or not a game engine uses a card to its fullest is a different discussion. In the end, yes, it all comes down to how the few titles one plays work on a given card. But dismissing synthetic benchmarks altogether isn't a good idea.
 
That's a bit overboard, don't you think? If it were half a year, I would tend to agree. But it's looking more like 2-3 months until a potentially groundbreaking card from AMD (again, we don't know, but the RX 480 didn't disappoint). I don't get how 2-3 months isn't worth the wait right now.
 
But let's be clear: I don't have a crystal ball, and I am not saying Vega will 100% beat Pascal. But if Vega 10 can't at least MATCH the 1080 Ti, it will be a Bulldozer-level failure in my opinion.
+1 here...
Despite all that's been said, there is another thing to be thankful for: this war helped us get better price tags on several things...
 
Cheaper?? You're joking, right? Or do you grow money on trees?
I meant that I could sell my Titan XP for still above $1k (heck, I still see them sell for around $1,200), then buy a GTX 1080 Ti and end up getting some money back for similar/more performance in games.
edit: yep, after a second look at the pics: max boost frequency 2062 MHz ... meaning an OC clock of 1531 MHz ...
Yeah, it may not go as much further as originally thought. Or it may just be able to hold that boost range better.
 
GTX 1060 was faster than RX 480 when released, but now you can see RX 480 has caught up and pulled ahead.
No, the RX 480 is only better in AMD-optimized games.

If AMD can pull off a Fury die shrink that is able to clock at 1500 MHz, and of course add more fking ROPs, they can match the 1080 Ti for real.
Nope. Fiji couldn't match GM200 because AMD couldn't keep its 4096 cores saturated. If ROPs were the problem, you would have seen a clear correlation at higher resolutions.

Wow, you just flat out can't read. Compare what it used to be to now. I said the 480 closed most of the gap, and Vega adds substantial architectural enhancements. In fact, the difference between Vega and Polaris seems to be FAR larger than that between Fiji and Polaris.

Let's see how this pans out ;)
I can read, but you either can't remember or can't understand your own words:
Keep in mind that Polaris closed most of the TFLOPS/performance gap that used to exist
Regardless of how you want to interpret this, Polaris did little to close the gap as you claimed.
I've already provided you with the Performance/FLOPs for the GPUs:
RX 480 (5161 GFLOP/s) is beaten by the GTX 1060 (3855 GFLOP/s), even though the RX 480 has 34% more computational power
You can't seriously claim that Polaris "closed most of the […] gap", that's insane.

And in terms of performance/watt it's even worse: GP104 is ~80% and GP102 ~85% more efficient than Polaris, which is a larger gap than with Maxwell. I'm pointing this out because AMD needs to solve all these problems before they can become competitive; it would be an achievement of the century to do that in a single year! So please stop attacking everyone who bursts your AMD bubble by pointing out the facts and realistic estimates for AMD products!
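For reference, here is where peak-FLOPS figures like those usually come from (a minimal sketch; peak FP32 throughput is shader count × clock × 2 ops per FMA, and the clocks below are the base clocks that reproduce the numbers quoted above):

```python
# Peak FP32 throughput = shaders * clock (MHz) * 2 ops per FMA, in GFLOP/s.
def peak_gflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1000

rx480   = peak_gflops(2304, 1120)  # RX 480: 2304 shaders at its 1120 MHz base clock
gtx1060 = peak_gflops(1280, 1506)  # GTX 1060: 1280 shaders at its 1506 MHz base clock

print(f"RX 480:   {rx480:.0f} GFLOP/s")    # ~5161
print(f"GTX 1060: {gtx1060:.0f} GFLOP/s")  # ~3855
print(f"RX 480 has ~{(rx480 / gtx1060 - 1) * 100:.0f}% more raw compute")  # ~34%
```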
 
That's a bit overboard, don't you think? If it were half a year, I would tend to agree. But it's looking more like 2-3 months until a potentially groundbreaking card from AMD (again, we don't know, but the RX 480 didn't disappoint). I don't get how 2-3 months isn't worth the wait right now.

2-3 months left of the 4-year wait. I myself gave up on it a year ago and went with a 1070 ... hopefully AMD delivers with the next GPU refresh :D
 
AMD should have never bought ATI......

That has been the only thing keeping them afloat financially. Without ATI, they'd have gone under. Also, being diversified, unlike Intel and NVIDIA, is a long-term advantage, I believe.
 
That has been the only thing keeping them afloat financially. Without ATI, they'd have gone under. Also, being diversified, unlike Intel and NVIDIA, is a long-term advantage, I believe.
Hehe. As far as "diversified" is concerned, it's AMD that's trailing. Nvidia has SoCs and, through them, a foothold in car infotainment systems. And Intel is in so many markets I don't know where to begin.
 
+1 here...
Despite all that's been said, there is another thing to be thankful for: this war helped us get better price tags on several things...

Honestly for me it's not just about price this time: I really do want AMD to raise the performance bar substantially.


What I don't get is why so many people are impressed by the performance of the Titan/1080 Ti. I mean, it's bloody 2017 and we are supposed to be impressed with BARELY acceptable 4K performance for $700?! I got a Fury @ 1135/525 a year ago for $310, and the FAR newer 1080 Ti is only ~70% stronger for 225% of the money. Wake me up when we have ANY card that can do 4K at 100 Hz+ gaming...
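Taking the poster's own figures at face value (a minimal sketch; the ~70% uplift and the prices are the claims above, not independently verified), the perf-per-dollar arithmetic works out like this:

```python
# Performance-per-dollar comparison using the figures claimed in the post above.
fury_price, fury_perf = 310, 1.00   # overclocked Fury as the baseline
ti_price,   ti_perf   = 700, 1.70   # 1080 Ti, claimed to be ~70% faster

fury_value = fury_perf / fury_price * 100   # relative performance per $100
ti_value   = ti_perf / ti_price * 100

print(f"Fury:    {fury_value:.2f} perf/$100")  # ~0.32
print(f"1080 Ti: {ti_value:.2f} perf/$100")    # ~0.24
print(f"1080 Ti delivers ~{(1 - ti_value / fury_value) * 100:.0f}% less performance per dollar")  # ~25%
```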
 
Sleep tight then sweet prince.
 
I just realized that my 1070's factory OC (untouched) boosts to 1923 from 1557 ... oh well, only 139 MHz lower in boost ... :D
 
Honestly for me it's not just about price this time: I really do want AMD to raise the performance bar substantially.


What I don't get is why so many people are impressed by the performance of the Titan/1080 Ti.

I don't think anyone is impressed by performance alone. It's the efficiency that reached incredible levels with Pascal (i.e., what you can squeeze out of 250 W). But you can't talk about efficiency to an AMD fan, because, well...

At the same time, yes, I also await the day when a mid-range card will push 4K; that will be quite an achievement. But that day will not come in 2017 or in 2018. Hopefully in 2019.
 
I don't think anyone is impressed by performance alone. It's the efficiency that reached incredible levels with Pascal (i.e., what you can squeeze out of 250 W). But you can't talk about efficiency to an AMD fan, because, well...


Hmm? Not sure why. After all, it wasn't until Pascal that Nvidia was more efficient. That is, unless you count the 580 as efficient and ignore that the R9 Nano existed.


Efficiency, like price, is a design choice. Anyone can do it.
 
That has been the only thing keeping them afloat financially. Without ATI, they'd have gone under...
Some bad mouths are or were claiming the opposite, haha...
 
Some bad mouths are or were claiming the opposite, haha...
Well, it's not exactly the case here, but that is the reason conglomerates exist: when one part of the business stalls, it falls back on the parts that don't.
 
Honestly for me it's not just about price this time: I really do want AMD to raise the performance bar substantially.


What I don't get is why so many people are impressed by the performance of the Titan/1080 Ti. I mean, it's bloody 2017 and we are supposed to be impressed with BARELY acceptable 4K performance for $700?! I got a Fury @ 1135/525 a year ago for $310, and the FAR newer 1080 Ti is only ~70% stronger for 225% of the money. Wake me up when we have ANY card that can do 4K at 100 Hz+ gaming...

Why are people impressed? Because they know that 4K is still in its early stages, and because the 1080 Ti keeps following the natural trend of roughly +30% performance over the previous-gen card, and usually even more than that (in fact, Pascal breaks the trend with another 5-10% extra performance compared to Fermi > Kepler > Maxwell). But not only that; it does so with much better perf/watt, low temps, and good OC capability. People can push the most powerful SKU and, with decent cooling, stay below 70 °C easily, whereas the previous gen would constantly bump into BIOS temp limits of 80 °C+.

The fact is, you are sugar-coating what hasn't come to market yet (Vega), because the 1080 Ti can't even do 4K/ultra/60 fps. The logic is lost here. Vega will NOT be able to push those settings either; I'll give you that bit of realism right now.

Also, you are STILL double posting.
 