
GTX 1070 specs and the reason that NVIDIA didn't unveil it!

Reason they don't need to speak too much about the 1070 is because it's not the main card. 1080 is. You don't go into all the depth about the 2nd tier card when you're touting how good the first one is.

And while this thread is full of nonsense and rumour:

1080 reaches 2.5GHz under AIB water cooling for a mighty 12.8 Teraflops.

http://wccftech.com/geforce-gtx-108...iants-including-25-ghz-liquid-cooled-edition/

If this is true I'll be amused at the ensuing hand wringing from certain peeps. If it's not true - then all is good in the world of balance.
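For what it's worth, the rumored 12.8 TFLOPs figure checks out arithmetically. A quick sketch (the 2560 CUDA cores are the GTX 1080's announced count; FP32 throughput counts a fused multiply-add as two operations per cycle):

```python
# Peak FP32 throughput in TFLOPs: cores * clock (GHz) * 2 ops (one FMA) per cycle.
def tflops(cuda_cores, clock_ghz):
    return cuda_cores * clock_ghz * 2 / 1000.0

# GTX 1080: 2560 CUDA cores; the rumor claims 2.5 GHz under AIB water cooling.
print(tflops(2560, 2.5))    # 12.8 -- matches the rumored figure
print(tflops(2560, 1.733))  # ~8.87 -- the announced boost-clock number
```

So the 12.8 number is just the announced core count at the rumored 2.5 GHz; nothing exotic behind it.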
 
The upshot here is that while this is the first time NVIDIA has used this specific ROP/MC configuration in a product, this is not the first product they have designed with segmented or otherwise unbalanced memory configurations. Since the GTX 500 series, NVIDIA has used unbalanced/asymmetrical memory configurations on some midrange SKUs, most recently on the GTX 660 and GTX 660 Ti. In the case of both of those cards, NVIDIA utilized a 192-bit memory bus with 2GB of VRAM attached, which meant that some memory controllers had more VRAM attached to them than others. The end result, as it turns out, is very similar, and while NVIDIA has never explained in depth how they handle memory allocation on those cards, it turns out that it's very similar to the GTX 970's memory segmentation. Which is to say that NVIDIA actually has multiple generations of experience with segmented memory, and this is not the first time they have implemented it. Rather, this is the first time we've seen such a configuration on a high-performance card such as the GTX 970.
Cite: http://www.anandtech.com/show/8935/...cting-the-specs-exploring-memory-allocation/2 (toward the bottom)
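To make the "segmented memory" idea concrete, here's a toy sketch. The segment names, sizes, and bandwidth figures are illustrative (the 3.5GB/0.5GB split matches the GTX 970's published layout, but the allocator logic is my simplification, not NVIDIA's actual driver): the driver exposes a fast, fully interleaved segment plus a slower leftover segment, and fills the fast one first.

```python
# Toy model of a segmented VRAM pool: allocations fill the fastest
# segment first and spill into the slow segment only when it is full.
# Segment state is cumulative across calls, like a real heap.
def allocate(request_mb, segments):
    """segments: list of [free_mb, name], ordered fastest first."""
    placed = []
    for seg in segments:
        take = min(request_mb, seg[0])
        if take:
            seg[0] -= take
            placed.append((seg[1], take))
            request_mb -= take
    if request_mb:
        raise MemoryError("out of VRAM")
    return placed

# GTX 970-style layout: 3.5 GB fast + 0.5 GB slow (bandwidths illustrative).
gtx970 = [[3584, "fast"], [512, "slow"]]
print(allocate(3000, gtx970))  # [('fast', 3000)] -- fits in the fast segment
print(allocate(1000, gtx970))  # [('fast', 584), ('slow', 416)] -- spills over
```

This is also why games staying under the fast segment's size see no penalty at all, which matches how the GTX 660/660 Ti behaved for years without anyone noticing.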

Actually, even the 400 series had asymmetrical memory. The GTX 460 v2 comes to mind.
 
Hello people.
I will probably buy the 1070 after it's released, since I haven't used a desktop PC for about 10 years.
I am new to this graphics card and desktop hardware stuff, so this may look stupid to you, but what does it mean that the 1070 has 1920 SP? And what is an SP? :)
 
...what is SP :)
In simplest terms they are shaders, or Shader Processors.

SP = Shader Processor = Shader Processing Unit = Shading Unit = CUDA core = Shader

BTW, 1920 shaders is a lowball guesstimate. Other guesstimates run as high as 2304 shaders. The true number will likely not be released until the 1070 hits shelves (or thereabouts).

http://www.techpowerup.com/gpudb/2840/geforce-gtx-1070
http://www.anandtech.com/show/10304/nvidia-announces-the-geforce-gtx-1080-1070/2

EDIT: If we were talking AMD terms SP = Stream/Shader Processor = Shader Processing Unit = Shading Unit = Shader
 
The 1070 is not fully revealed, because for Nvidia's purposes the 1080 is the most important piece of tech at the moment. It is the current flagship of the Pascal family, and will forever be the standard-bearer of the architecture, as the GTX 680 was for Kepler, 7970 for GCN, GTX 980 for [real] Maxwell and Fury X for the Fiji product family.

It really isn't a surprise. AMD does it all the time. Look at how many times Anandtech has had to make tables documenting two-prong releases like this, in which the lesser card's specs are virtually blank and the writer has to substitute question marks and amusing snippets like (who knows?) or (less than insert name of flagship) instead of real numbers.

It's a cut down version of the big brother that you can get for less $$$. And as far as Nvidia or AMD is concerned, that's all you need to know at the time of release.

Whether you think that's fair is up to you, but if they fully documented the 1070, the 1080 would no longer be the star of the show. When you're promoting a brand new architecture or technology, the posterboy needs to be flawless and the best you can give. The gimped, cheapened 1070 is not that product.

That said, this has been an amusing theory to read.
 
1080 reaches 2.5GHz under AIB water cooling for a mighty 12.8 Teraflops.
I am about to explode with curiosity if that is the case! Now I really want one, if they really can go almost an extra 1GHz on the core clock!

Personally, I will still wait for both to get reviewed either way before I make a purchase. But right now the 1080 sounds fantastic; not sure yet about the 1070.
 
Apparently the GTX 1070 is nothing but a damaged 1080 chip.
Um, pretty much every card generation has done that: 980/970, 680/670. And in case you think AMD doesn't do it, they have done just the same: 290X/290, etc. It's done to avoid scrapping a chip that can still be sold at a cheaper price point.

BTW, 1920 shaders is a low ball guesstimate. Other guesstimates are as high as 2304 shaders. The true number will likely not be released until the 1070 hits shelves(or there abouts).
I am betting it's 2048; it would be a round number.
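Whatever the count ends up being, it translates almost linearly into peak throughput. A rough comparison of the candidate shader counts (the 1.6 GHz clock here is purely a placeholder assumption, since the 1070's clocks haven't been confirmed either):

```python
# Peak FP32 TFLOPs = shaders * clock (GHz) * 2 (one FMA = 2 ops) / 1000.
CLOCK_GHZ = 1.6  # placeholder; the 1070's real clocks are unannounced

for shaders in (1920, 2048, 2304):
    print(shaders, round(shaders * CLOCK_GHZ * 2 / 1000.0, 2))
# 1920 -> 6.14, 2048 -> 6.55, 2304 -> 7.37
```

So the gap between the low and high guesstimates is about 20% of raw throughput at any given clock, which is why the shader count matters more than any other leaked spec.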
 
SP = Shader Processor = Shader Processing Unit = Shading Unit = CUDA core = Shader

Thanks a lot, that was really informative. :) The 1070 is currently my dream card; I'm hoping it will let me play and stream at the same time.
 
ZOMG the card has 1080 in the name LULzzzzlelellelelelelellelelehehehehehhehehekekekkehiim12
go home tpu is not Reddit we are grown ups here .. well mostly
this thread serves no informative purpose other than to circle jerk
NO
 
.. well mostly
this thread serves no informative purpose other then to circle jerk
NO

so THAT'S what's all over the floor.
 
No. It's mostly cat shit. With one large dinosaur dropping. ;)
 
this thread serves no informative purpose other then to circle jerk

and you're the gonorrhea.

I love when people actively shit on a thread, especially a benign one. It shows the vulnerability of the poster; reflects poorly on themselves.

Just unsub it if you don't like it and let people discuss what they want in peace.
 
Does this mean GTX 1070 won't be able to render at 1080p ? Will it stretch the image from 1070p to 1080p or will you get black lines? XD
 
Maybe not, but I don't know of any game yet that uses 8GB of VRAM at 1080p. I will buy a GTX 1070 and use it for 1080p, and even 7GB of VRAM is more than enough for a very long time!

I did not buy a GTX 970 because of the memory issue: it only had 3.5GB of fast VRAM, and there are games now exceeding 3.5GB of VRAM usage.
So you won't buy a 970 because of the memory issue, but you WOULD buy a 1070 with the same issue? You yourself acknowledge the 970's memory layout is starting to become a problem, and the same thing will happen to the 1070. The 970 came out a year and a half ago; the same timespan could also apply to the 1070.

Games like Forza run horribly at 1080p with less than 4GB of VRAM. Look up the performance of a 2GB 770 vs a 4GB one in that game.

With game devs finally using console hardware to full effect, expect this even more in the next few years. The PS3 and 360 effectively had a 256MB video buffer, yet the PC ports could easily hit a GB of VRAM usage, and were pushing above that late in the generation (BF3 could hit 2GB of VRAM usage). It wouldn't surprise me if games started using 7GB+ at high res in the next few years, at which point the 1070 will be in the position the 970 is now, with certain games pushing right up against its usable VRAM. It seems like a huge waste of money to buy a chip you know is crippled in a way that could cut its lifespan short due to an artificial limitation.
 
I'm confused, first off, why people are surprised, and second, why people care so much. The card will play 4K at least as well as the 980/290(X)/390(X), and will finally let NV do VR worth a shit.
 
So you wont buy a 970 because of the memory issue, but you WOULD buy a 1070 with said issue?

As I said, I play at 1080p, so even 7GB of VRAM will be plenty for me for many years.
I did not buy a GTX 970 because it probably does not have enough VRAM to play upcoming games at decent settings.
 
What you ignore, because you only seem to have read the summary article: it's actually "faster than Titan X" in VR only when using simultaneous multi-projection, not in general.
 
*source: ... umm... I promised to not tell.

Somehow there are three pages of discussion even without a source, so this is pure conjecture/nonsense.

Gotta love it when TPU is used as a tool to spread baseless speculation.

EDIT: If/when there is any credit to this information, it'd be front page news, don't you think?
 