
NVIDIA GeForce GTX 960 Launch Date Revealed

I doubt it. I don't think the GTX 960 will be much faster than an HD 7970, and those came with 3 GB and work just fine for the resolutions they are capable of running anyway.
I would say 3 GB, but it's 128-bit. We all remember what happened the last time we got a GPU with unbalanced memory. :P

Remember, 2.1GB is still over 2, so you would have to bump up to 4. I think enough games in the next few years will surpass the 2GB mark to want/need 4GB.
 
The 980 has a 256-bit bus, I don't know why anyone would expect anything more than 128-bit for the mid-range. Plus, the 256-bit handles 4k just fine, so there should be no question 128-bit will handle 1080p just fine. I'd bet two of these in SLI would kill 1440p.

Memory bandwidth is almost never the limiting factor.
 
Memory bandwidth is almost never the limiting factor.

Somehow I think this will die together with "you don't need more than 500W for single GPU systems" - i.e. never...
 
Exactly, because everyone has $350, $550, or a grand for a graphics card.

Better off just getting a higher-performing, second-hand older card, or waiting for the new gen to come out, than wasting money on this mediocre-performing thing. Why buy a new card when you know it's going to be "meh" at best?
 
Better off just getting a higher-performing, second-hand older card, or waiting for the new gen to come out, than wasting money on this mediocre-performing thing. Why buy a new card when you know it's going to be "meh" at best?

Think of this card as Tonga... but good.
 
Think of this card as Tonga... but good.

Basically, this. The 960 should be able to do about 60% of the performance of the 980, based on the specs we know and the rumors of shader counts. But its power consumption should be amazing; I'd be surprised if it used 100 W.
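
For what it's worth, that ~60% figure roughly falls out of a naive shaders-times-clock comparison. A back-of-envelope sketch in Python; the 1024-shader count and the clocks below are rumors/assumptions, not confirmed specs:

```python
# Naive ALU-throughput comparison: shaders x clock (ignores memory
# bandwidth, ROPs and boost behaviour). GTX 960 figures are rumoured.
GTX980_SHADERS, GTX980_CLOCK_MHZ = 2048, 1126   # known GTX 980 base clock
GTX960_SHADERS, GTX960_CLOCK_MHZ = 1024, 1178   # assumed / rumoured

ratio = (GTX960_SHADERS * GTX960_CLOCK_MHZ) / (GTX980_SHADERS * GTX980_CLOCK_MHZ)
print(f"GTX 960 ~ {ratio:.0%} of GTX 980 shader throughput")
# ~52% raw; the gap tends to narrow at 1080p, which is roughly where
# the ~60% guess comes from.
```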
 
Affirmative I'll keep my current GTX760…
970 maybe next December…:roll:
 
The 980 has a 256-bit bus, I don't know why anyone would expect anything more than 128-bit for the mid-range. Plus, the 256-bit handles 4k just fine, so there should be no question 128-bit will handle 1080p just fine. I'd bet two of these in SLI would kill 1440p.

Memory bandwidth is almost never the limiting factor.

4K resolution, yes, but all Nvidia GeForce cards are limited to 8-bit output, and 4K content is 10-bit 4:4:4; even 4K streams will be 10-bit 4:2:2. So attaching a GeForce to a 10-bit panel or a decent TV isn't the smartest thing. Not to mention they are already compressing 4K output to 8-bit 4:2:2 instead of 8-bit 4:4:4, so you're already getting degraded results if content quality matters to you when you're not gaming.
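
To put rough numbers on the 10-bit 4:4:4 versus 8-bit 4:2:2 point, here is a minimal sketch of uncompressed pixel data rates; the 4K60 figures are just an assumption for illustration:

```python
# Uncompressed video data rate for different bit depths / chroma subsampling.
# 4:4:4 keeps 3 samples per pixel; 4:2:2 averages 2; 4:2:0 averages 1.5.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, fps, bit_depth, subsampling):
    """Raw (uncompressed) video bandwidth in gigabits per second."""
    bits_per_pixel = bit_depth * SAMPLES_PER_PIXEL[subsampling]
    return width * height * fps * bits_per_pixel / 1e9

for depth, sub in [(10, "4:4:4"), (10, "4:2:2"), (8, "4:2:2")]:
    rate = data_rate_gbps(3840, 2160, 60, depth, sub)   # 4K60 assumed
    print(f"{depth}-bit {sub}: {rate:.1f} Gbit/s uncompressed")
# 10-bit 4:4:4 carries ~88% more raw data than 8-bit 4:2:2, which is why
# output pipelines and links often fall back to 8-bit and/or 4:2:2.
```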
 
4K resolution, yes, but all Nvidia GeForce cards are limited to 8-bit output, and 4K content is 10-bit 4:4:4; even 4K streams will be 10-bit 4:2:2. So attaching a GeForce to a 10-bit panel or a decent TV isn't the smartest thing. Not to mention they are already compressing 4K output to 8-bit 4:2:2 instead of 8-bit 4:4:4, so you're already getting degraded results if content quality matters to you when you're not gaming.

I think you'd be hard pressed to find anyone that could actually tell the difference in a blind test.
 
I think you'd be hard pressed to find anyone that could actually tell the difference in a blind test.

It's pretty straightforward unless you have bad eyesight. There are countless examples on the web for comparison.

[Attached image: 1364467829_10_bit_vs_8_bit_Encoding.jpg]

[Attached image: 10-bit.jpg]

If the output is not native you'll experience more banding (color smearing).

Blu-rays are also going to be providing 4K content in 10-bit 4:4:4.
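
To make the banding point concrete: banding is just the visible steps left over when a smooth gradient is quantized to too few code values. A minimal sketch that only counts distinct steps (no display needed):

```python
# Count how many distinct code values an 8-bit vs 10-bit pipeline can use
# across a subtle brightness ramp -- fewer distinct steps = visible banding.
def quantize(value, bits):
    """Map a 0.0-1.0 brightness to an integer code at the given bit depth."""
    return round(value * ((1 << bits) - 1))

# A gentle gradient spanning only 5% of the brightness range, 1000 samples.
gradient = [0.50 + 0.05 * i / 999 for i in range(1000)]

for bits in (8, 10):
    steps = {quantize(v, bits) for v in gradient}
    print(f"{bits}-bit: {len(steps)} distinct steps across the ramp")
# Roughly 13 steps at 8-bit vs ~52 at 10-bit over the same subtle ramp --
# that difference is what shows up as banding on large smooth areas.
```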
 
It's 2015; a 128-bit bus, 2 GB, and an asking price of $200 (or more) is overpriced for me.

Hey, the HD2900XT had a 512-bit bus. Honestly, what do the numbers matter as long as the card performs? Things like larger memory buses and more VRAM take up die space and power, and Nvidia is trying to cut down on both. If this card lands between the GTX770 and GTX780, it will be an absolute steal.

It's pretty straightforward unless you have bad eyesight. There are countless examples on the web for comparison.

[Attached image: 10-bit.jpg]

Your proof/example in a discussion of image quality is seriously a 285x158 JPG?
 
It's pretty straightforward unless you have bad eyesight. There are countless examples on the web for comparison.

[Attached image: 1364467829_10_bit_vs_8_bit_Encoding.jpg]

[Attached image: 10-bit.jpg]

If the output is not native you'll experience more banding (color smearing).

Blu-rays are also going to be providing 4K content in 10-bit 4:4:4.

And everything you see on the internet is artificially made to look a lot worse than the real world difference because most are viewing them on 8-bit panels with 8-bit consumer cards. Heck, the pictures you posted are 8-bit jpegs. How can you show the difference 10-bit makes in an 8-bit picture?

I've actually seen them side by side, it is pretty much impossible to tell the difference.
 
And everything you see on the internet is artificially made to look a lot worse than the real world difference because most are viewing them on 8-bit panels with 8-bit consumer cards. Heck, the pictures you posted are 8-bit jpegs. How can you show the difference 10-bit makes in an 8-bit picture?

I've actually seen them side by side, it is pretty much impossible to tell the difference.

Your proof/example in a discussion of image quality is seriously a 285x158 JPG?

No point in providing native 10-bit examples if you're viewing them through an 8-bit GPU process... :rolleyes:
 
No point in providing native 10-bit examples if you're viewing them through an 8-bit GPU process... :rolleyes:
Exactly. It is far better to show an 8-bit image, make an artificially terrible copy of it, and claim the difference is what you see between 8-bit and 10-bit... :rolleyes:

Look how much smoother the 10-bit image is! Oh wait, that is how smooth an 8-bit image is, because it IS an 8-bit image... so what does 10-bit look like? Answer: pretty much the same as the 8-bit.
 
I'm glad a few people jumped in and shut down the 128-bit jibber jabber. 128 is just a number; what matters is performance. This is a new generation, with a new architecture, a new memory controller, and even new memory itself. The proof is always in the pudding and only so much can be taken from a spec sheet, so let's just wait and see, eh?

As for 2 GB only... I feel there will be 4 GB cards. Texture sizes are growing and so are memory needs. 2 GB wasn't enough for my GTX 670, IMO, and it won't be enough for this card if you intend to keep it for 2-3 years.

You only need to need 2.1 GB of VRAM for the difference between 2 and 4 to be obvious. 2 GB is clearly the stock amount to help this card reach its price point, and also to help push people to the 970 if they want 4 GB.
 
Those 10-bit examples above are greatly exaggerated just to prove a point, the same as the 4K upscaling examples where you supposedly turn grainy, crappy video into a super sharp image. We know it's never like that, but you'll see fancy side-by-side comparisons, just to prove a point.

And with 10-bit output you also need a monitor that is actually capable of displaying 10-bit; otherwise, it's like sticking a 750 HP V10 engine on a bicycle...
 
Those who say this is "meh" are missing the point. This card is meant for the resolution the overwhelming majority of gamers have: 1080p. At THAT resolution it should be a screamer at minimal power usage. You can't compare it and say it won't do 1600p with all visual features.

With compression, this Maxwell can perform just as well at 128 bit as older dies did on 256 bit. I know it's hard to let go, but times and technology change, and get more efficient, so the old standards no longer apply.
 
Okay, just an example for everyone who doesn't get the bandwidth on these GPUs. The Titan has a 384-bit bus while a GTX 680 only has a 256-bit one, hence 50% more memory bandwidth (assuming clocks and latencies are identical).

I'll try to explain the whole concept a bit more: the following is a simplified model of the factors that determine the performance of RAM (not only on graphics cards).

Factor A: Frequency

RAM runs at a clock speed. RAM running at 1 GHz "ticks" 1,000,000,000 (a billion) times a second. With every tick, it can receive or send one bit on every lane. So a theoretical RAM module with only one memory lane running at 1 GHz would deliver 1 gigabit per second; since there are 8 bits to a byte, that means 125 megabytes per second.

Factor B: "Pump Rate"

DDR-RAM (Double Data Rate) can deliver two bits per tick, and there are even "quad-pumped" buses that deliver four bits per tick, but I haven't heard of the latter being used on graphics cards.


Factor C: Bus width


RAM doesn't just have a single lane to send data; even the Intel 4004 had a 4-bit bus. The graphics cards discussed here have 256 and 384 bus lanes respectively.

All of the above factors are multiplied to calculate the theoretical maximum at which data can be sent or received:

**Maximum throughput in bytes per second = Frequency * PumpRate * BusWidth / 8**

Now let's do the math for these two graphics cards. They both seem to use the same type of RAM (GDDR5 with a pump rate of 2), both running at 3 GHz.

GTX 680: 3 GHz * 2 * 256 / 8 = 192 GB/s

GTX Titan: 3 GHz * 2 * 384 / 8 = 288 GB/s
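
The same arithmetic in a few lines of Python, extended with a hypothetical 128-bit configuration on the 7 Gbps-class GDDR5 rumored for the GTX 960; that spec and the compression gain used below are assumptions, not confirmed figures:

```python
# Peak memory bandwidth in GB/s: frequency * pump rate * bus width / 8.
# This is the theoretical maximum only -- latency (Factor D) is ignored.
def bandwidth_gbs(freq_ghz, pump_rate, bus_width_bits):
    return freq_ghz * pump_rate * bus_width_bits / 8

print(bandwidth_gbs(3.0, 2, 256))   # GTX 680:   192.0 GB/s
print(bandwidth_gbs(3.0, 2, 384))   # GTX Titan: 288.0 GB/s

# Hypothetical 128-bit card on 7 Gbps-effective GDDR5 (3.5 GHz * 2):
raw = bandwidth_gbs(3.5, 2, 128)    # 112.0 GB/s raw
effective = raw * 1.25              # assumed ~1.25x gain from Maxwell's
print(raw, effective)               # delta colour compression (illustrative)
```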

Factor D: Latency - or reality kicks in

This factor is a LOT harder to calculate than all of the above combined. Basically, when you tell your RAM "hey, I want this data", it takes a while until it comes up with the answer. This latency depends on a number of things and is really hard to calculate, and usually results in RAM systems delivering way less than their theoretical maxima. This is where all the timings, prefetching and tons of other stuff comes into the picture. Since it's not just numbers that could be used for marketing, where higher numbers translate to "better", the marketing focus is mostly on other stuff.

Conclusion
So, since NVIDIA is making use of advanced texture compression, I see absolutely no problem regarding the smaller memory bus. The new architecture gives them the ability to decrease the bus bandwidth, and it really shouldn't be considered a problem. Since more than half of the "gaming community" uses less than 1080p (most of them are on 1680x1050), there is absolutely nothing wrong with the 2 GB of VRAM. Okay, enough.
 
How does this 960 compare to the 780?
 
I'm sure the cut-down GM204 will be in the 960 Ti right around the time when the 970 sales slow, which will probably be around March at this rate. It was rumored to be released a few months back, but now we get this version of the 960 instead.

I presume this was meant to be lower priced, but when they saw the sales of 970 (and to a lesser degree 980), they decided not to kill the golden goose by making a part that bludgeoned its sales the way the 970 bludgeons the 980. If not for the widespread complaints of coil whine and the lack of a reference design (with nVidia reference cooler), the 970 would be virtually the only card selling.

So the last thing nVidia really wants on a new GPU die is one that lets people forego the 970 in favor of a 960...
 
the lack of a reference design (with nVidia reference cooler)
The 970 actually has one I think, though it could just be marketing. I know EVGA said they might have one out by early 2015 since there was so much interest.
 
I have NO coil whine with my Gigabyte 970, I am very happy with it!
 
Those 10-bit examples above are greatly exaggerated just to prove a point, the same as the 4K upscaling examples where you supposedly turn grainy, crappy video into a super sharp image. We know it's never like that, but you'll see fancy side-by-side comparisons, just to prove a point.

And with 10-bit output you also need a monitor that is actually capable of displaying 10-bit; otherwise, it's like sticking a 750 HP V10 engine on a bicycle...

At least you understand what it takes for it to work.

People running 8-bit GPUs with 6-bit TN panels want to see a difference. I'm pretty sure it's the same old "I want to argue for the sake of it" mentality :rolleyes:

All Nvidia has to do is enable 10-bit processing on their GeForce line like they do on their Quadro cards. AMD has been doing it for a while, and they can be future-proof for true 4K content. I'm sure a lot of people would appreciate it down the line, even those eyeing this GTX 960.
 