• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

NVIDIA GeForce GTX 960 Specs Confirmed

If the GTX 960 is 10% better performance-wise and draws only half the power of my HD 7870 XT, it's an instant buy for me.
 
It's also liable to become the top card in the Chinese market for Colorful and similar manufacturers of that ilk.
It sure is. I bet vendors like them will make custom cards based on it with little price impact.
 
@Cheeseball seems that most of us just have to wait & see how good this card will be once it's out.
 
I wonder how it compares to the 660 Ti in terms of raw performance?

This would be a good card to swap the 660 ti out with in this machine. I could then replace the 5870 in another machine with my 660 ti.
 
You guys think this would be a good replacement for my aging HD 7870 XT (Tahiti LE, 1536 cores) on my secondary PC?
Tahiti LE?
Nope, it will not. It will be better in some areas (lower power consumption and noise), but for $200 it will not be an upgrade that will really make a difference. Except if your favorite games use PhysX effects; in that case it could look like a serious upgrade to you.
 
You'd be surprised. Here are my total figures for VRAM usage on 1440p. I like to think I review a spread of varied games of all styles. Some of those include Early Access titles which are optimised horribly.
[image: 1440p VRAM usage chart]


Now include the chart with Watch_Dogs, Assassin's Creed Unity, Dragon Age Inquisition, or any other one that represents more of what people who play at 1440p (or higher) will have when going from PS4 or Xbox One exclusives to a PC port.

A bunch of indies and some 360-built ports--might as well include the Saints Row Gat out of Hell benchmarks too right?--don't really represent why having less than 2GB is unwise going forward.
 
Looks like a very nice card for 1080p gaming. I hope the rumored $200 price point is true. It will be very successful if so.
If NVIDIA goes aggressive with pricing, down to $180, that card will sell like hot cakes and will do what the 7850 did for AMD a few years back.
 
Now include the chart with Watch_Dogs, Assassin's Creed Unity, Dragon Age Inquisition, or any other one that represents more of what people who play at 1440p (or higher) will have when going from PS4 or Xbox One exclusives to a PC port.

A bunch of indies and some 360-built ports--might as well include the Saints Row Gat out of Hell benchmarks too right?--don't really represent why having less than 2GB is unwise going forward.

I'll run an article later tonight or tomorrow with a couple of the AAA titles, detailing VRAM as well as memory bandwidth usage and PCIe bus usage on Maxwell. Hoping to get hold of a card without Maxwell compression with similar GB/s memory bandwidth figures (looks like the 770 is a match) to note differences not just in VRAM usage, but also memory bandwidth usage. But that depends on whether I can source a card for tests.
 
Dragon Age Inquisition takes 2400 MB of VRAM at 1440 x 900 :)
 
What would this say about mobile solutions? The 860M is already based on Maxwell; if a "960M" is coming along, would it be any better than the 860M?
 
That's strange: NVIDIA uses a cut-down GM204 with 1024 cores/128-bit for their mobile GTX 965M, but it seems that their GM206 (this GTX 960) has the same configuration?


They are stockpiling chips better than 1024/128-bit for their GTX 960 Ti.
 
So, double the 660 performance means Nvidia is saying hands-down this beats a 770, correct?
Is this stock or OC (the double-660 claim)?
If it's stock, that would mean I could reach GTX 970 levels of performance :0
 
Is this stock or OC (the double-660 claim)?
If it's stock, that would mean I could reach GTX 970 levels of performance :0

Who knows? The whole thing is vague.
 
Really curious how this performs up against a 780.
 
All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.
 
All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.

Just buy the best you can get right now. "future proof" really needs to stop "trending" when it comes to this stuff.
 
Any word about video decoding options? Tegra X1 has full H265/VP9 decoding. Really hope this has it too.

If it's like all other GeForce cards, it won't. Tegra X1 supports 10-bit, making it possible to be fully 4K compliant. GeForces are all 8-bit. Decoding and encoding will still work.

For full H265/VP9 you have to get a Radeon HD 6xx0 or newer, FirePro, or Quadro card
through DP 1.2+ or HDMI 1.4+.

Content->Processing->Panel
 
All I wanna know is whether it will make Crysis 3 look shitty for decent playability. And of course FC4 and the last 2 COD titles - all require DX11/SM5, which my prehistoric GTS 250 ain't got. However, the "future proof" aspect keeps me worried... so I may end up with the 970 ultimately.

You obviously keep a card for a long time, so maybe a GTX 970 would be best for you, but I think it's a pretty safe bet that the GTX 960 is going to come in between a 760 and a 770. Which side it leans to is unknown. Probably towards the 770 side. We'll know that pretty soon, but a GTX 960 would be a heck of a nice upgrade for you from that GTS 250.
 
You obviously keep a card for a long time, so maybe a GTX 970 would be best for you, but I think it's a pretty safe bet that the GTX 960 is going to come in between a 760 and a 770. Which side it leans to is unknown. Probably towards the 770 side. We'll know that pretty soon, but a GTX 960 would be a heck of a nice upgrade for you from that GTS 250.
Yeah, I know, the 960 would definitely be a great upgrade *NOW*, but the card I want should be able to tackle, say, Crysis 4 - at least minimally (>30 fps @ lowest settings) @ 1080p. Is that too naive to expect? I hope W1zzard includes FC4 and the last 2 COD titles in the review. I'll get a pretty good idea then. Not expecting too much though.
 
Yeah, I know, the 960 would definitely be a great upgrade *NOW*, but the card I want should be able to tackle, say, Crysis 4 - at least minimally (>30 fps @ lowest settings) @ 1080p. Is that too naive to expect? I hope W1zzard includes FC4 and the last 2 COD titles in the review. I'll get a pretty good idea then. Not expecting too much though.


Wait for Wizz's review before making a decision.
 
If it's like all other GeForce cards, it won't. Tegra X1 supports 10-bit, making it possible to be fully 4K compliant. GeForces are all 8-bit. Decoding and encoding will still work.
For full H265/VP9 you have to get a Radeon HD 6xx0 or newer, FirePro, or Quadro card
Say what???? That's news, considering even AMD's latest Tonga offering doesn't have H.265 decode support.

[image: DXVA decode support chart for the R9 285]
 
I'll run an article later tonight or tomorrow with a couple of the AAA titles, detailing VRAM as well as memory bandwidth usage and PCIe bus usage on Maxwell. Hoping to get hold of a card without Maxwell compression with similar GB/s memory bandwidth figures (looks like the 770 is a match) to note differences not just in VRAM usage, but also memory bandwidth usage. But that depends on whether I can source a card for tests.

I'd be very interested in the results! In the past people have compared the 770 2GB and 770 4GB and not found any benefit to the increased vram, except in SLI, and barely then. Vram requirements are a very hot topic right now. Many are *claiming* that lack of vram is hurting performance in new games, but they always have cards that are slow as well as low on vram, and assume vram is the culprit when it probably isn't.

Would be great to have a testbed set up with two identical fast cards except for double the vram on one. The 770s are pretty ideal, or maybe the 960s once the 4GB version comes out.
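For anyone wanting to sanity-check the pairing, the raw figure comes straight from bus width times effective memory clock. A minimal sketch (the clock figures are assumptions from the cards' published GDDR5 specs, not from this thread):

```python
def gddr5_bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Theoretical memory bandwidth in GB/s:
    (bus width in bytes) x (effective transfers per second)."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Assumed published specs: GTX 960 is 128-bit at 7010 MHz effective,
# GTX 770 is 256-bit at 7010 MHz effective.
print(gddr5_bandwidth_gbps(128, 7010))  # ~112 GB/s raw for the GTX 960
print(gddr5_bandwidth_gbps(256, 7010))  # ~224 GB/s raw for the GTX 770
```

On raw numbers the 770 carries double the bandwidth; presumably the interesting question in such a test is how much of that gap Maxwell's compression closes in practice.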
 
If it's like all other GeForce cards, it won't. Tegra X1 supports 10-bit, making it possible to be fully 4K compliant. GeForces are all 8-bit. Decoding and encoding will still work.

For full H265/VP9 you have to get a Radeon HD 6xx0 or newer, FirePro, or Quadro card
through DP 1.2+ or HDMI 2.0.

Content->Processing->Panel

I have to admit I have no idea what you mean. But I can guarantee that you can't hardware-decode H265/VP9 with an HD 6xxx; heck, the Tonga R9 285 was the first AMD GPU that could decode H.264 4K 60p video.
 
I have to admit I have no idea what you mean. But I can guarantee that you can't hardware-decode H265/VP9 with an HD 6xxx; heck, the Tonga R9 285 was the first AMD GPU that could decode H.264 4K 60p video.

H265/VP9 are mainly for the 4K 10-bit 4:2:0+ standard. You do have options for lower or higher quality.

You could let the CPU do the workload if it's fast enough and still be crippled by your 8-bit GPU.

Content (true H265/VP9 4K 10-bit 4:2:0+) -> Processing (CPU/GPU): if your GPU processes it with 8-bit output, you're already downgrading the quality before it gets to your panel.
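The bit-depth point can be made concrete with back-of-envelope arithmetic. A sketch of the uncompressed data rate for 4K 4:2:0 video at 8-bit versus 10-bit (the resolution and frame-rate figures are illustrative assumptions, not from the thread):

```python
def raw_video_rate_gbps(width: int, height: int, fps: int,
                        bits_per_sample: int,
                        samples_per_pixel: float = 1.5) -> float:
    """Uncompressed video data rate in Gbit/s.

    4:2:0 chroma subsampling averages 1.5 samples per pixel:
    each 2x2 pixel block has 4 luma samples plus one Cb and one Cr.
    """
    return width * height * fps * samples_per_pixel * bits_per_sample / 1e9

# Assumed example: 3840x2160 at 60 fps.
print(raw_video_rate_gbps(3840, 2160, 60, 8))   # ~5.97 Gbit/s at 8-bit
print(raw_video_rate_gbps(3840, 2160, 60, 10))  # ~7.46 Gbit/s at 10-bit
```

The codec compresses this heavily either way; the point is that an 8-bit pipeline discards the extra two bits per sample before the panel ever sees them.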
 