Wednesday, May 18th 2016

NVIDIA GeForce GTX 1070 Clock Speeds Revealed

NVIDIA posted the product page of its upcoming GeForce GTX 1070 graphics card, confirming its clock speeds and related specifications. The card features a nominal GPU clock speed of 1506 MHz, with a maximum GPU Boost frequency of 1683 MHz. The memory is clocked at 2000 MHz (actual), or 8 GHz (GDDR5-effective), working out to a memory bandwidth of 256 GB/s. The company also rates the card's single-precision floating-point performance at 6.45 TFLOP/s. Other key specs include 1,920 CUDA cores, 120 TMUs, and 64 ROPs. The GeForce GTX 1070 goes on sale on the 10th of June, 2016.
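For readers who want to check where those headline figures come from, here is a minimal back-of-the-envelope sketch in Python; it assumes the usual 2 FP32 operations per CUDA core per clock (FMA) and a 256-bit memory bus, neither of which is stated explicitly above.

# Rough sanity check of the GTX 1070 figures quoted in the article.
cuda_cores = 1920
boost_clock_ghz = 1.683                                  # maximum GPU Boost clock
fp32_tflops = cuda_cores * 2 * boost_clock_ghz / 1000    # 2 FP32 ops per core per clock (FMA)
print(f"FP32 throughput: {fp32_tflops:.2f} TFLOP/s")     # ~6.46, in line with the quoted 6.45

effective_rate_gbps = 8                                  # 8 Gbps per pin (GDDR5-effective)
bus_width_bits = 256                                     # assumed 256-bit memory bus
bandwidth_gbs = effective_rate_gbps * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")     # 256 GB/s, as quoted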

123 Comments on NVIDIA GeForce GTX 1070 Clock Speeds Revealed

#76
ZoneDymo
Fluffmeister: We have all certainly established medi01 is not a fan of those meanies nVidia.

He does need to tone it down, though; it's getting old now.
Ermm, not sure where you are getting that from; did you even read what was typed?
Medi01 simply stated that a GTX 1080 is not that interesting to get compared to a GTX 980 Ti; that's just consumer advice (and, perhaps more importantly regarding your remark, both are Nvidia cards...).
#77
jabbadap
Legacy-ZA: The perfect balance would have been 7.5 TFLOP/s and 2048 CUDA cores... It's not there, so I guess nVidia is making sure they have that little bit left if AMD is going to pull out a card with similar performance but a much better price point. They will then probably release the Ti version with GDDR5X memory too. Pretty much going to be a nice middle finger for 1070 buyers.
Nah, there are greater profits to be had in the mobile space. That would probably be a GTX 1080M or Quadro P5000M (with lower clock speeds and lower FP32 TFLOPS)... But of course it all depends on how many working dies with that core config there are and how close they get to the company's own top dog. Not saying it's not possible; hell, NVIDIA has even done it before (the Tesla-era GTX 260 and the Fermi-era GTX 560 Ti 448). The ball is now in AMD's court; when they finally get their competition out, NVIDIA will react accordingly.

I think it's now safe to say NVIDIA's next cards are the once-rumored GP104-150 as a GTX 1060 Ti and GP106-based GTX 1060/GTX 1050 Ti. I'm quite sure NVIDIA will not let AMD take the whole profitable low-to-mid range with their Polaris GPUs.
#78
HumanSmoke
medi01: AMD just made vague statements about Zen while saying nothing definitive, via AMD talking head John Taylor as told to a shitty gossip site.
Who cares? Stay on topic and you won't be remembered as a complete twat.
#79
Fluffmeister
ZoneDymo: Ermm, not sure where you are getting that from...
Seriously? I think you need to stick your head back into the sand.
ZoneDymo: Medi01 simply stated that a GTX 1080 is not that interesting to get compared to a GTX 980 Ti; that's just consumer advice (and, perhaps more importantly regarding your remark, both are Nvidia cards...).
The GTX 1080 is interesting to everyone; even medi01 admits the market is getting flooded with cheap 980 Tis (and that is already better than anything the competition currently offers).

We all no doubt appreciate medi01's words of wisdom, and you're right, they are both nVidia cards... thanks for pointing that out.
#80
GhostRyder
Fluffmeister: We have all certainly established medi01 is not a fan of those meanies nVidia.

He does need to tone it down, though; it's getting old now.
Pot calling the kettle black again? Since we have all established your loyalties, I would suggest against talking about others and their loyalties...
medi01: Let me spoil it for ya:
1) 980 Tis easily hit 1400-1500 MHz. That's 50% OC.
2) An OCed 980 Ti is rather close to the 1080, for (thank you, nZilla, do moar, please, these dudes with G-Sync monitors deserve even moar).
3) 980 Tis are being dumped en masse; cards can be had for about $400. Where do you get with a $699/€789 (fuck you, nZilla) 1080 vs that?
4) Most reviewers were able to OC the 1080 by only 12% over stock, so it ain't that good.
5) +8-15 fps at 4K is hardly "spanking", no matter how you spin it.
The problem is you're assuming things based on the first wave of reviews instead of waiting and giving it time. Many of the OC tools are not even set up properly yet to support the 1080, which will change in time. Not to mention driver updates and whatnot... The gains of the 1080 will hit much harder once consumers have a chance and companies have a chance to adjust drivers, software, and such to fit the card. Its price tag on the FE version is high, but everyone just has to wait until they hit the market to judge and see what happens. Same goes for the 1070, since we have yet to see anything other than its base and boost clocks...
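For context on the percentages being argued over here, this quick Python sketch maps them onto the published reference base clocks (1000 MHz for the GTX 980 Ti, 1607 MHz for the GTX 1080); the clock targets themselves are just the numbers quoted in the exchange above.

# Mapping the thread's overclocking percentages onto reference base clocks.
def oc_percent(stock_mhz, oc_mhz):
    """Overclock expressed as a percentage over the stock base clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

def oc_clock(stock_mhz, percent):
    """Clock reached by raising the stock base clock by a given percentage."""
    return stock_mhz * (1 + percent / 100)

print(f"980 Ti at 1500 MHz: +{oc_percent(1000, 1500):.0f}% over base")  # +50%
print(f"GTX 1080 at +12%: {oc_clock(1607, 12):.0f} MHz")                # ~1800 MHz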
#81
efikkan
GhostRyder: The problem is you're assuming things based on the first wave of reviews instead of waiting and giving it time. Many of the OC tools are not even set up properly yet to support the 1080, which will change in time.
No, overclocking tools interface with NVAPI and don't need to be calibrated for each new product. If reviewers are having trouble with overclocking, it's caused by the hardware and/or the driver. Keep in mind that the GTX 1080/1070/... will be basically "pre-overclocked" compared to Maxwell and Kepler, so a lot of the overclocking headroom is already utilized.
GhostRyder: The gains of the 1080 will hit much harder once consumers have a chance and companies have a chance to adjust drivers, software, and such to fit the card.
By software, do you mean games? If games need to be optimized specifically for a product, then someone has written terrible code...
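To illustrate efikkan's point that these utilities mostly just talk to the driver through a public NVIDIA API rather than shipping per-GPU calibration, here is a small monitoring sketch. The use of NVML via the pynvml bindings is an assumption for illustration only; NVML is a different API from the NVAPI that Windows overclocking tools actually use, and this sketch only reads clocks and temperature, it does not overclock anything.

# Query current GPU clocks and temperature through NVIDIA's public NVML API.
# (pip install nvidia-ml-py; requires an NVIDIA driver to be installed.)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)          # first GPU in the system

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):                            # older bindings return bytes
    name = name.decode()

core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)

print(f"{name}: core {core_mhz} MHz, memory {mem_mhz} MHz, {temp_c} C")
pynvml.nvmlShutdown()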
#82
rtwjunkie
PC Gaming Enthusiast
efikkan: By software, do you mean games? If games need to be optimized specifically for a product, then someone has written terrible code...
No, I'm sure by software he means things like Precision and Afterburner, for example. Those programs need to be updated to include coding for new GPUs as they come out.
#83
Fluffmeister
GhostRyder: Pot calling the kettle black again? Since we have all established your loyalties, I would suggest against talking about others and their loyalties...
It's funny, isn't it? I don't see you getting upset with medi01's trolling; we all know where you stand too.

Like I said, you stink of double standards.

*Can't believe I interrupted a game of DoD to respond to your usual BS.
#84
efikkan
rtwjunkie: No, I'm sure by software he means things like Precision and Afterburner, for example. Those programs need to be updated to include coding for new GPUs as they come out.
They only need updating when the native API changes, and they either work properly or they don't. All they do is pass some parameters to the driver, and that doesn't need to be calibrated for new GPUs.
#85
GhostRyder
efikkan: No, overclocking tools interface with NVAPI and don't need to be calibrated for each new product. If reviewers are having trouble with overclocking, it's caused by the hardware and/or the driver. Keep in mind that the GTX 1080/1070/... will be basically "pre-overclocked" compared to Maxwell and Kepler, so a lot of the overclocking headroom is already utilized.


By software, do you mean games? If games need to be optimized specifically for a product, then someone has written terrible code...
rtwjunkie: No, I'm sure by software he means things like Precision and Afterburner, for example. Those programs need to be updated to include coding for new GPUs as they come out.
Yep, that's what I was talking about. Fact is, time will show better overclocking, especially once the cards are in the hands of the public. Then we can judge both the 1080 and the 1070 and their overclocking, to see whether it really lives up to expectations, has limiters, or is going to be a binning game to get close to the numbers shown.
Fluffmeister: It's funny, isn't it? I don't see you getting upset with medi01's trolling; we all know where you stand too.

Like I said, you stink of double standards.

*Can't believe I interrupted a game of DoD to respond to your usual BS.
First, apparently you can't read, since I obviously said something addressing his foolish remarks...

Second, your constant claim that I am a fanboy is laughable... Guess I am the worst fanboy out there for the reds, since the majority of products I own are from anyone but AMD. In fact, the only AMD products I own at the moment are the three R9 290Xs in my main rig, which currently has an Intel processor. Guessing my laptop (GTX 675M), backup/spare rig (i5 3570K, 550 Ti), and server (Intel Xeon 5670 x2, Nvidia GTX 950), which all have the other side's products, must really make me a total fanboy. It's getting kinda tiresome that, by your standards, anyone not insulting AMD on a constant basis is a fanboy of the red team.
#86
Fluffmeister
GhostRyder: First, apparently you can't read, since I obviously said something addressing his foolish remarks...

Second, your constant claim that I am a fanboy is laughable... Guess I am the worst fanboy out there for the reds, since the majority of products I own are from anyone but AMD. In fact, the only AMD products I own at the moment are the three R9 290Xs in my main rig, which currently has an Intel processor. Guessing my laptop (GTX 675M), backup/spare rig (i5 3570K, 550 Ti), and server (Intel Xeon 5670 x2, Nvidia GTX 950), which all have the other side's products, must really make me a total fanboy. It's getting kinda tiresome that, by your standards, anyone not insulting AMD on a constant basis is a fanboy of the red team.
You're sweet. I can post pics of the various AMD CPUs I've owned over the years too, plus countless ATi cards (naturally, when they were the performance kings in their respective fields). I guess that means you need to take back those misguided opinions you have of me too.

Hell, I just bought my stepfather a cheap HP 255 laptop, fully AMD powered... yeah, you're boring me too. You seem to single me out these days; I guess it's gotten personal now? I'm flattered and all that, but I'd prefer it if you put me on ignore and did us both a favour.
#87
bogmali
In Orbe Terrum Non Visi
I would suggest continuing the discussion without the name-calling/trolling/baiting. Is that too much to ask?
#88
Fluffmeister
bogmali: I would suggest continuing the discussion without the name-calling/trolling/baiting. Is that too much to ask?
Well said, though I'm not sure why GhostRyder thanked your post considering he was the one name-calling/trolling/baiting me. Hey ho, comedy gold all round.

Anyway, looking forward to seeing what custom AIB GTX 1070s can do. :cool:
#89
Caring1
medi01: AMD just made bold statements about Zen, namely that it will compete with Intel "not only on price" but also on "performance, power". They plan to get into "EliteBooks" and "XPS" and something from (fucking) Lenovo too.
They already compete in that end of the market with APUs :rolleyes:
#90
ViperXTR
lel, whatever happened to this thread.

Hope there will be a smaller version of the card; I don't feel like slapping a huge card into my mATX rig, though I hear some not-so-good feedback on the prolonged 970 Minis from Gigabyte and Asus.
#91
GhostRyder
Love how Fluffmeister always jumps in acting all innocent. Reminds me of South Park's Eric Cartman...
Caring1: They already compete in that end of the market with APUs :rolleyes:
Compete is a strong word; they do a decent job if you need the GPU and don't want a discrete GPU because of power draw, but they are in dire need of a complete overhaul.

One big concern I still have for the GTX 1070 is the power limit. Wonder how far it will go?
#92
BiggieShady
GhostRyder: One big concern I still have for the GTX 1070 is the power limit. Wonder how far it will go?
Compared to the GTX 1080? Well, base and boost clocks are lowered both to widen the gap between the 1070 and 1080, and to improve yields with 5 disabled TPCs. So there's a higher chance the silicon lottery gives you a low-binned chip that goes crazy with a slight overclock, compared to last generation.
The 970 and 980 were much more closely clocked at stock and therefore went through a much more similar binning process.
Statistically the 1070 should clock worse than the 1080, but the highest-clocking Pascal should be a cherry-picked 1070, on account of the 5 disabled TPCs.

edit: I just realized you were talking about the power limit
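As an aside, the TPC count mentioned here lines up exactly with the 1,920 CUDA cores on the product page, assuming the widely reported GP104 layout of 20 TPCs with one 128-core SM each (the layout itself is not stated in this thread):

# Deriving the GTX 1070's CUDA core count from its disabled TPCs.
cores_per_sm = 128          # FP32 CUDA cores per Pascal SM (one SM per TPC)
full_gp104_tpcs = 20        # full GP104 as shipped in the GTX 1080
disabled_tpcs = 5           # fused off for the GTX 1070

gtx1070_cores = (full_gp104_tpcs - disabled_tpcs) * cores_per_sm
print(gtx1070_cores)        # 1920, matching the product page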
#93
medi01
An AMD VP did definitely say that; speaking in his name would be too much even for wccft.
The promises look quite optimistic (I was worried when they kept mum for a while), with "we'll close the gap" and "we've never been so close".
(Although I can remember times when they were ahead, that was more than 10 years ago, so it probably doesn't count.)

Anyhow, if AMD delivers, you can get your beloved Intel/nVidia chips for less, so why not cross your fingers for AMD for that alone?

I, for one, will likely buy a 480(X) (to replace my 380) and later on likely Zen (to replace my i5 750).
GhostRyder: The gains of the 1080 will hit much harder once consumers have a chance and companies have a chance to adjust drivers, software, and such to fit the card.
Well, on the driver front, especially when talking about anomalies such as the 290X getting faster versus the 780 Ti for the third consecutive year, people normally put that down to AMD needing more time to develop good drivers. So the theory goes that nVidia drivers are good and fast right away.
Eroticus: GTX 1080 vs Titan X is not even 1.3x stronger, so don't keep your hopes that high.
That graph was the 1080 versus the cheaper previous-gen card whose model number ended in 80, not versus the Titan X.

The curious part is that the graphs for the 1080 vs 980 are exactly the same as for the 1070 vs 970... =)))
EarthDog: We are seeing reference cards hit 2100 MHz and generally being temperature limited.
Isn't that nearly always the case with overclocking?
There is a certain threshold up to which pushing things increases power consumption reasonably, but at some point power consumption grows too quickly.

Now, the 980 Ti was a brilliant OC-er and hands down the best high-end card of the last gen (yep, I said that; actually, I have no problem saying that, it's just that some nZilla fanbois go defensive for no reason when reading my comments). It could easily get +40% over stock, and even 50% wasn't that rare.

With a 12% OC the 1080 simply doesn't cut it. It might in the future, but note that there isn't much time left if Vega is coming in October, as rumored.
Oh, and that's if GloFo/Samsung 14 nm doesn't fruck up, which, from what I've heard, namely:
a) Apple's TSMC 16 nm chips are superior to the same chip on Samsung's 14 nm (I frankly don't get how it could be "the same chip" given the differences in the process, but oh well)
b) Nobody really forced nVidia to go with TSMC. They would NOT risk their high-end dominance simply to get cheaper chips, so at the very least there is no clear superiority in what Samsung's 14 nm is doing
c) Some vague badmouthing about GloFo underperforming, again and again

it could well do.
Caring1: They already compete in that end of the market with APUs :rolleyes:
There is no Dell XPS with an AMD APU/CPU.
No high-end Lenovo either.
HP might have some, as they (Compaq) tend to favor AMD.
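medi01's point above about power consumption ramping up disproportionately past a certain clock has a simple first-order explanation: dynamic power scales roughly with voltage squared times frequency, so once extra clock speed needs extra voltage, power grows much faster than performance. A toy sketch of that relation follows; the percentages are made up purely for illustration.

# First-order illustration of why power climbs faster than clock speed once
# voltage has to rise: dynamic power ~ C * V^2 * f (a simplification).
def relative_power(freq_scale, volt_scale):
    """Power relative to stock under the P ~ V^2 * f approximation."""
    return volt_scale ** 2 * freq_scale

# Hypothetical numbers, for illustration only:
print(f"+20% clock at stock voltage:   {relative_power(1.20, 1.00):.2f}x power")  # 1.20x
print(f"+20% clock needing +10% volts: {relative_power(1.20, 1.10):.2f}x power")  # ~1.45x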
#94
loop
MrGenius: Again with the lack of reading comprehension. :shadedshu:

So let me clarify. When compared to another single GPU, the GTX 1080, even with NO OC, spanks their ass every time. The non-OC GTX 1070 probably will too (if not, then close to it). You'll see. Then we'll OC them both and the previous-gen cards will get spanked even worse. Comparing an OCed card to a non-OCed card is hardly a fair comparison.

Comparing 4K FPS results is meaningless to the vast majority of us too. Who cares? You even threw in some no-AA results to muck things up even worse.

Nice job! :rolleyes:
Are you serious? Did you see any of the charts I posted??????
Did I say to anyone who is going to buy a new VGA card now to go and take a 980 Ti and not a 1080?????????
My point is that if you already have a custom 980 Ti now, there is no need to update; the 30% is only when compared to a stock 980 Ti (1 GHz). The cards I posted above ARE of course OCed, BUT factory OCed and running like that at STOCK, and both can OC to at least 1.4 GHz. Or do users have to underclock their cards because reviewers want to show what has to be shown?
Are you Mr. Genius or Mr. Dumb?
#95
P4-630
loop: there is no need to update
How do we update our video card to a GTX 1070? :D
I'd like to know!
#96
rtwjunkie
PC Gaming Enthusiast
P4-630: How do we update our video card to a GTX 1070? :D
I'd like to know!
It will be a Patch Tuesday download! :-)
#97
loop
Upgrade, not update. OK now?
#98
Caring1
rtwjunkie: It will be a Patch Tuesday download! :)
More like an Nvidia driver update in 2 or 3 years will update the 1080 to 1070 performance levels, just in time for a new round of super cards. :laugh:
#99
GhostRyder
BiggieShady: Compared to the GTX 1080? Well, base and boost clocks are lowered both to widen the gap between the 1070 and 1080, and to improve yields with 5 disabled TPCs. So there's a higher chance the silicon lottery gives you a low-binned chip that goes crazy with a slight overclock, compared to last generation.
The 970 and 980 were much more closely clocked at stock and therefore went through a much more similar binning process.
Statistically the 1070 should clock worse than the 1080, but the highest-clocking Pascal should be a cherry-picked 1070, on account of the 5 disabled TPCs.

edit: I just realized you were talking about the power limit
Yeah, I just meant the power limit the card is allowed in general. If it's capped hard and vendors don't have custom versions/modded BIOSes to allow for more, I wonder how that will affect things, and how much voltage will affect these cards in general. Maxwell was not very responsive to voltage last round, but the way this is going it sounds like this round is different, which is what I hope for (just a personal preference, plus I want a reason to grab Classifieds or similar cards with heavy voltage-delivery systems for better overclocking).
#100
Vayra86
medi01: Isn't that nearly always the case with overclocking?
There is a certain threshold up to which pushing things increases power consumption reasonably, but at some point power consumption grows too quickly.
No, not always. Maxwell was very much voltage limited before it was temperature limited.