Wednesday, May 18th 2016

NVIDIA GeForce GTX 1070 Clock Speeds Revealed

NVIDIA posted the product page of its upcoming GeForce GTX 1070 graphics card, confirming its clock speeds and related specifications. The card features a nominal GPU clock speed of 1506 MHz, with a maximum GPU Boost frequency of 1683 MHz. The memory is clocked at 2000 MHz (actual), or 8 GHz (GDDR5-effective), working out to a memory bandwidth of 256 GB/s. The company also rates the card's single-precision floating-point performance at 6.45 TFLOP/s. Other key specs include 1,920 CUDA cores, 120 TMUs, and 64 ROPs. The GeForce GTX 1070 goes on sale on the 10th of June, 2016.
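Both headline figures follow from the listed specs. As a quick back-of-the-envelope check (a minimal sketch; the 256-bit memory bus is the GTX 1070's known spec, not something stated on the product page):

```python
# Sanity check of the quoted GTX 1070 figures from the listed specs.

actual_mhz = 2000                        # real memory clock
effective_gbps = actual_mhz * 4 / 1000   # GDDR5 transfers 4x per clock -> 8 Gb/s per pin

bus_width_bits = 256                     # known GTX 1070 bus width (not stated above)
bandwidth_gbs = effective_gbps * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")    # 256 GB/s

cuda_cores = 1920
boost_ghz = 1.683
tflops = 2 * cuda_cores * boost_ghz / 1000   # 2 FP32 ops per core per clock (FMA)
print(f"FP32 throughput: {tflops:.2f} TFLOP/s")         # ~6.46, matching the quoted ~6.45 rating
```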

123 Comments on NVIDIA GeForce GTX 1070 Clock Speeds Revealed

#101
BiggieShady
Vayra86: No, not always. Maxwell was very much voltage limited before temperature.
They limited the voltage very reasonably, IMO, for overclocking on air. Maybe you're thinking specifically of water cooling. When you say limited, do you mean limited in a way that you get unstable voltages and clocks at relatively low temperatures, or limited in a way that the maximum clock and voltage offsets are still stable at relatively low temperatures (and you feel the GPU could do more)?
Posted on Reply
#102
Vayra86
BiggieShady: They limited the voltage very reasonably, IMO, for overclocking on air. Maybe you're thinking specifically of water cooling. When you say limited, do you mean limited in a way that you get unstable voltages and clocks at relatively low temperatures, or limited in a way that the maximum clock and voltage offsets are still stable at relatively low temperatures (and you feel the GPU could do more)?
Limited in the sense that adding more voltage does not pay off in terms of higher clocks. Maxwell has a bit of a glass ceiling that way unless you start doing stuff that won't work for 24/7 use. Water cooling is almost useless on Maxwell compared to solid air coolers.

I'm not talking about the artificial BIOS limitations. Kepler had the same limitation (1.21 V) but was temperature limited before voltage.
Posted on Reply
#103
BiggieShady
Vayra86: I'm not talking about the artificial BIOS limitations.
Oh, got it.
Vayra86: Limited in the sense that adding more voltage does not pay off in terms of higher clocks.
Yeah, that only means that there exists a much lower temperature point where that voltage delta would result in higher clocks. On LN2, GM204 goes up to 2.3 GHz, and GM200 up to 2 GHz.
Jump from 28nm to 16nm and you are right there on air lol
Posted on Reply
#104
Vayra86
BiggieShady: Oh, got it.

Yeah, that only means that there exists a much lower temperature point where that voltage delta would result in higher clocks. On LN2, GM204 goes up to 2.3 GHz, and GM200 up to 2 GHz.
Jump from 28nm to 16nm and you are right there on air lol
That's the thing, we don't game on LN2, and water versus air won't do the trick ^^

Maybe Pascal will be different, but currently, everything points to the opposite. If all cards cap out at 2 GHz, which now seems to be the case on either water or air from the numbers we have, that's Maxwell v2. And let's not forget NVIDIA needed those numbers to make their claims - while their stock clocks are a good 300 MHz lower. In the meantime, their architecture talks speak of removing GPU functionality that wasn't strictly needed for gaming, or streamlining it to achieve higher clocks (enter the GP100 + NVLink release for the pro market). All of this doesn't point to a big gain from going 28nm > 16nm alone, but rather to a combination of these efforts.

So far the only real gain we see from 16nm is the vastly reduced leakage, which results in lower power draw - the actual performance still requires roughly the same power envelope, because it needs higher clocks to get there.
Posted on Reply
#105
Caring1
I have a feeling EVGA will be announcing their GTX 1070 6GB soon.
Posted on Reply
#106
BiggieShady
Vayra86: we don't game on LN2
No, we don't ... I thought it was obvious why I mentioned LN2: what's achievable on Maxwell only under liquid nitrogen, we get on Pascal by pulling sliders in Afterburner. I was trying, unsuccessfully, to put things in perspective.
Posted on Reply
#107
newconroer
Eroticus: so $200 less for a different BIOS and GDDR5?
At least this time there's a genuinely different memory architecture being used. In past generations, the difference between the little and big brother flagship cards has been minimal, and in recent years, with a simple overclock, little brother was as strong as big brother.

I really thought NVIDIA would have dumped the two-flagship strategy and just gone with one flagship and one extreme (or flagship+). I guess their manufacturing still isn't up to snuff, given that they keep pumping out what are effectively defective products and have to downgrade them.
Posted on Reply
#108
rtwjunkie
PC Gaming Enthusiast
newconroer: At least this time there's a genuinely different memory architecture being used. In past generations, the difference between the little and big brother flagship cards has been minimal, and in recent years, with a simple overclock, little brother was as strong as big brother.

I really thought NVIDIA would have dumped the two-flagship strategy and just gone with one flagship and one extreme (or flagship+). I guess their manufacturing still isn't up to snuff, given that they keep pumping out what are effectively defective products and have to downgrade them.
Wait, what flagship(s)? I know it's just a minor point, but we already know the 104 chip used by the 1080 and 1070 is not the big/full chip. We know because NVIDIA has said so, and it's the way they've released things since the 6xx series. The flagship of Pascal is yet to come.
Posted on Reply
#109
BiggieShady
rtwjunkie: The flagship of Pascal is yet to come.
... and in the meantime, the mid-range chip is the fastest on the planet by a healthy margin, and it will effectively be NVIDIA's "flagship" through the rest of this year, currently at a near-flagship price with the Founders Edition ... I think that's what confuses people.
The 1080 has much to offer, and they will sell like hotcakes for $600 until Christmas, when the price will be reduced with the launch of AMD's Vega series.
Posted on Reply
#110
newconroer
rtwjunkie: Wait, what flagship(s)? I know it's just a minor point, but we already know the 104 chip used by the 1080 and 1070 is not the big/full chip. We know because NVIDIA has said so, and it's the way they've released things since the 6xx series. The flagship of Pascal is yet to come.
Poor wording by me?

I've been treating the x70/x80/Titan as the flagships, with three variants. Anything under that is midrange or lower.
I'll say, then, that I wish they'd dump the x70 iteration of enthusiast products and only have the high-end x80 model and then the big-daddy flagship.

However, because little brothers are just broken big brothers, I can't see a way around that. Unless they make flagships first, treat them as big brother, and use the broken ones as little brothers.

You know, if we just replace 'brother' with 'sister', people will think we're talking about BioShock and not graphics cards.
BiggieShady: ...they will sell like hotcakes for $600 until Christmas, when the price will be reduced with the launch of AMD's Vega series.
And the release of the Ti (or whatever they call the next flagship). Who here is going to feel silly when the Ti launches for the same price as the standard 1080 did?

Come on, board partners, throw a wrench in here somewhere and stop the madness.
Posted on Reply
#111
ViperXTR
True flagships are the x80 Ti and Titan these days, starting from the GTX 680/HD 7970 era.

NVIDIA made GK104 as a midrange GPU, but it ended up as fast as or faster than the HD 7970, AMD's high end, so they turned it into an x80 part instead of the x60 part their Gx04 codenames usually occupy in the midrange.

And there are more rumoured/guessed specs floating around for their "true" flagship and the mid-mainstream parts; I could see that happening with the 1060 though, heh.
Posted on Reply
#112
EarthDog
EarthDog: So...... those mismatched memory people from the speculation thread... what say you?
BUMP..

It was such a big deal before... now nobody wants to/can comment??
Posted on Reply
#113
BiggieShady
EarthDog: BUMP..

It was such a big deal before... now nobody wants to/can comment??
Why don't you post the thread link ... we should at least know who the individuals you're calling out are.
Posted on Reply
#116
EarthDog
While it isn't a direct translation to games, that result sure doesn't look like the 980Ti is "running circles" around the 1070, ehh @Vayra86? :pimp:
Posted on Reply
#117
matar
P4-630: Mine as well... Asus GTX 1070 Strix. :D Running on Intel HD 530 at the moment :D
LOL, I feel you, the HD 530 puts all your games on pause.
Thankfully I have good video cards in my case, 2x GTX 560 Ti 2GB in SLI, and I have a spare GT 740 lying around just in case my video cards ever fail.
Posted on Reply
#118
Vayra86
EarthDog: While it isn't a direct translation to games, that result sure doesn't look like the 980Ti is "running circles" around the 1070, ehh @Vayra86? :pimp:
Unsigned drivers, 1860 MHz core clock... (+200 MHz from stock :))

Oh wow, with a beefy OC the 1070 can *match* a 980ti at stock.

Who's not running circles around what?? @EarthDog, I had expected more of you, mate.
Posted on Reply
#119
EarthDog
1. It is beating a 980ti by 5%, not matching it.

2.
This time we are not including overclocking results in our charts
Did I miss where they overclocked in that chart after they said they didn't? They said what 3DMark is REPORTING, but with new cards, particularly unreleased cards, it typically isn't accurate. It also shows it's about 5% faster. So if you take this supposed overclock you are talking about away, it's likely right around 980Ti speeds... a far cry from 'running circles' around it... whatever you define 'running circles' around it to be...

EDIT: If you will note, in W1z's 1080 review a 453 MHz overclock yielded a 12.8% increase in performance. So even if that result is overclocked (which I don't believe it is), a 200 MHz overclock would yield right around 5% gains, putting it at 980Ti speeds. Again, a far cry from 'running circles' around it.
www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/30.html
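In rough numbers, assuming 3DMark performance scales linearly with the clock offset (an approximation, not a measured result):

```python
# Linear-scaling estimate: project the 1070's gain from W1z's 1080 data point.

baseline_oc_mhz = 453    # overclock applied in the GTX 1080 review linked above
baseline_gain = 0.128    # the 12.8% performance increase it yielded

supposed_oc_mhz = 200    # the disputed 1070 clock offset
estimated_gain = baseline_gain * supposed_oc_mhz / baseline_oc_mhz
print(f"Estimated gain: {estimated_gain:.1%}")   # ~5.7%, right around the 5% lead
```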
Posted on Reply
#120
Vayra86
EarthDog: 1. It is beating a 980ti by 5%, not matching it.

2.
Did I miss where they overclocked in that chart after they said they didn't? They said what 3DMark is REPORTING, but with new cards, particularly unreleased cards, it typically isn't accurate. It also shows it's about 5% faster. So if you take this supposed overclock you are talking about away, it's likely right around 980Ti speeds... a far cry from 'running circles' around it... whatever you define 'running circles' around it to be...

EDIT: If you will note, in W1z's 1080 review a 453 MHz overclock yielded a 12.8% increase in performance. So even if that result is overclocked (which I don't believe it is), a 200 MHz overclock would yield right around 5% gains, putting it at 980Ti speeds. Again, a far cry from 'running circles' around it.
www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/30.html
Well, of course it was an exaggeration to say 'running circles', but it is unlikely 3DMark is reporting 'the wrong clock' when it just so happens to correlate well with the reports about Pascal and clocking. Even if this is just GPU Boost, we have no idea how it really works out in terms of headroom for further OC'ing, while we *do* know that the 980Ti can extract a good bit of perf from OC'ing.

NVIDIA needed 2.1 GHz clocks on the 1080 to produce a snappy marketing line. Let's not get ahead of ourselves :) With regard to the question posed by the OP, I think it is a safe bet to say the 980Ti will be at the same perf level, if not a slightly higher one, taking the end results of OCs on both cards into account.
Posted on Reply
#122
EarthDog
Vayra86: Well, of course it was an exaggeration to say 'running circles', but it
Oh.. OF COURSE you were exaggerating!!! :)

It is quite common for 3DMark to have wrong clocks... I just said that. Now, am I sure...? Of course not. However, when it matches a 1080 exactly, and knowing 3DMark has a history of not reporting the correct clocks (particularly on new cards), it isn't a leap to think the clocks are being reported wrong. I also mentioned the overclocking with the 1080 and what that yielded... so...

Only time will tell, but they were spot on with the 1080 in their leaked benchmarks... I wouldn't bet my life on it, but it's going to be damn close to the 980Ti, contrary to your assertion (which you are backing off of now).
Posted on Reply
#123
Vayra86
P4-630: This thread you mean: www.techpowerup.com/forums/threads/waiting-for-gtx1070-non-reference-or-buy-a-gtx980ti.222683/ :D
EarthDog: Oh.. OF COURSE you were exaggerating!!! :)

It is quite common for 3DMark to have wrong clocks... I just said that. Now, am I sure...? Of course not. However, when it matches a 1080 exactly, and knowing 3DMark has a history of not reporting the correct clocks (particularly on new cards), it isn't a leap to think the clocks are being reported wrong. I also mentioned the overclocking with the 1080 and what that yielded... so...

Only time will tell, but they were spot on with the 1080 in their leaked benchmarks... I wouldn't bet my life on it, but it's going to be damn close to the 980Ti, contrary to your assertion (which you are backing off of now).
Lol. And lol. Yeah, I may have to walk back my earlier remark there, just a little ^^ But... still waiting on in-game benches.
Posted on Reply