
NVIDIA to Launch GeForce GTX 960 in January

You already need pretty much nothing less than the top end to play the latest games on the highest settings.
Blame the game developers for that.
 
Cool. I have GTX 660 and the wife is on an HD 6850, so the GTX 960 is potentially a decent upgrade for us.
 
2 GB definitely is not enough.
I play on a GTX 750 Ti 2 GB and have to drop textures to minimum on Assassin's Creed, medium on Far Cry 4, medium on Shadow of Mordor, and medium on Watch Dogs to avoid stuttering.
I also close Firefox before playing to free up another 200 MB or so of memory, and it will only get worse with new games, so no, 2 GB is not enough.
 
2 GB definitely is not enough.
I play on a GTX 750 Ti 2 GB and have to drop textures to minimum on Assassin's Creed, medium on Far Cry 4, medium on Shadow of Mordor, and medium on Watch Dogs to avoid stuttering.
I also close Firefox before playing to free up another 200 MB or so of memory, and it will only get worse with new games, so no, 2 GB is not enough.

Your GPU is fine for a lot of games at high settings, but it's underpowered for the most demanding ones. More VRAM wouldn't help a GPU in that class.
 
2 GB definitely is not enough.
I play on a GTX 750 Ti 2 GB and have to drop textures to minimum on Assassin's Creed, medium on Far Cry 4, medium on Shadow of Mordor, and medium on Watch Dogs to avoid stuttering.
I also close Firefox before playing to free up another 200 MB or so of memory, and it will only get worse with new games, so no, 2 GB is not enough.

Yeah, that has nothing to do with video card RAM. My two-year-old GTX 670 with 2 GB can run all those games at high to ultra at 1080p, no problem. Also, freeing up system memory by closing apps has nothing to do with the VRAM on the video card.

What does Firefox have to do with VRAM?

Nothing.
Edit: nothing, unless they're also trying to play a Unity-based browser game at the same time, or running Firefox on a second monitor while gaming on the primary. Either way, system memory and VRAM are two different things.
 
I guess it utilizes the GPU to accelerate page rendering. Not sure.

Major edit: OK, ignore the previous screenshots. I noticed GPU memory usage was set to "min." because I'm an idiot. Checked it again after 30 minutes of typical Firefox browsing; max memory usage was 300 MB. So yes, Windows combined with a browser does use some VRAM, but 2 GB is still plenty of headroom for almost every game at 1080p.

That being said, some games (Titanfall, The Witcher 2, etc.) have "insane" texture settings aimed specifically at cards with 4 GB+ of VRAM, but Ultra still looks great in those games, so :rolleyes:
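If anyone wants to reproduce that check without a monitoring overlay, here's a minimal sketch using nvidia-smi; it assumes the tool is on your PATH and that your driver exposes memory telemetry through it. Run it once with the browser open and once with it closed and compare.

```python
import subprocess

# Query total VRAM in use; the difference between "browser open" and
# "browser closed" is roughly what the browser (plus the desktop) is holding.
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"]
)
print(out.decode().strip())  # e.g. "300 MiB, 2048 MiB"
```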
 
If it does use GPU rendering, it can't be more than 100 MB.
It depends on what kind of pages you are browsing; not sure.
[screenshots of GPU memory usage while browsing]
 
It will be a great budget card for sure, enough for the average casual gamer. For everyone else there are already other cards to go for. :)
 
2 GB definitely is not enough.
I play on a GTX 750 Ti 2 GB and have to drop textures to minimum on Assassin's Creed, medium on Far Cry 4, medium on Shadow of Mordor, and medium on Watch Dogs to avoid stuttering.
I also close Firefox before playing to free up another 200 MB or so of memory, and it will only get worse with new games, so no, 2 GB is not enough.

You get stuttering in those games not because of too little VRAM but because of the bus. A 128-bit bus is not wide enough to handle large textures plus any AA and AF, etc. ;)
 
I remember telling someone in some thread not to hold their breath over this card.

I'm sorry, there is no way these specs are real. With these specs it won't achieve GTX 760 or 770 performance.

This rumor = my ass ;)
 
WTF, 128-bit?!
A 128-bit bus is not wide enough to handle large textures plus any AA and AF, etc. ;)
Bus width is only one of many factors affecting memory bandwidth, which in turn is only one of many factors affecting overall performance.

You can't just say "it's 128 bit so it's not wide enough" or else you're no different than the people who say "my processor has more GHz so it's better!"

I feel like this thread has devolved into an extension of "Choose R9 290 Series for its 512-bit Memory Bus: AMD"
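For what it's worth, here's a rough sketch of how bus width and memory data rate combine into peak bandwidth. The clocks below are illustrative assumptions, not confirmed GTX 960 specs.

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak theoretical bandwidth in GB/s: bytes per transfer times billions of transfers per second."""
    return bus_width_bits / 8 * data_rate_gbps

# Illustrative assumptions only -- neither line is a confirmed GTX 960 spec.
print(peak_bandwidth_gbs(128, 7.0))  # 112.0 GB/s: 128-bit bus, 7 Gbps GDDR5
print(peak_bandwidth_gbs(256, 6.0))  # 192.0 GB/s: 256-bit bus, 6 Gbps GDDR5 (GTX 760-class)
```

And even that is only the theoretical peak; caches, compression, and how a game actually streams textures decide how much of it matters in practice.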
 
Wow, this website is becoming the old WCCFtech. All rumors, full BS.
So what is your source, OP? You dreamed it last night and made it news?

You guys should use your brains, and if you don't have a brain, such as the OP, then use logic. 128-bit is for low-end cards. The GTX 960 will have a 256-bit bus just like the GTX 760, and 4 GB of RAM, while 128-bit will be left for GT cards or something like the GTX 750 Ti.

And at least I can provide proof. It's right there on Zauba. That's an official shipment listing 256-bit and 4 GB of RAM versus your stupid rumor. Which one shall we believe, hmmm?

[screenshot of the GTX 960 Zauba shipment listing]
 
Wow, this website is becoming the old WCCFtech. All rumors, full BS.
So what is your source, OP? You dreamed it last night and made it news?

You guys should use your brains, and if you don't have a brain, such as the OP, then use logic. 128-bit is for low-end cards. The GTX 960 will have a 256-bit bus just like the GTX 760, and 4 GB of RAM, while 128-bit will be left for GT cards or something like the GTX 750 Ti.

And at least I can provide proof. It's right there on Zauba. That's an official shipment listing 256-bit and 4 GB of RAM versus your stupid rumor. Which one shall we believe, hmmm?

[screenshot of the GTX 960 Zauba shipment listing]
Finally, someone logical. :)
Tell 'em, Goku. :D
Methinks this article is pure clickbait. :0
 
Bus width is only one of many factors affecting memory bandwidth, which in turn is only one of many factors affecting overall performance.

You can't just say "it's 128 bit so it's not wide enough" or else you're no different than the people who say "my processor has more GHz so it's better!"

I feel like this thread has devolved into an extension of "Choose R9 290 Series for its 512-bit Memory Bus: AMD"
Then what are the many other factors affecting memory bandwidth?
Anyone can criticize; very few can provide explanations... ;)
 
Those acting as if @btarunr just makes stuff up: are you aware he is the moderator who does the news, and has been for quite a while? Very little of what he posts is pure speculation; most of it comes to fruition pretty closely. He rarely posts pure rumors.

That being said, if Bunjomanjoman would like to spend some research time and get something more concrete out of the notoriously secretive Nvidia, please do, and then share it with us.

Maybe the original post will not come true, and maybe it will end up like the manifest listed above. That's still not a reason to blatantly attack.
 
Those acting as if @btarunr just makes stuff up: are you aware he is the moderator who does the news, and has been for quite a while? Very little of what he posts is pure speculation; most of it comes to fruition pretty closely. He rarely posts pure rumors.

That being said, if Bunjomanjoman would like to spend some research time and get something more concrete out of the notoriously secretive Nvidia, please do, and then share it with us.
So NVIDIA has, in this short span of time, increased Maxwell's performance and efficiency enough to replace a GTX 770 with a 128-bit bus and 2 GB of VRAM??

I will admit that if this rumor said the bus was 192-bit I would have believed it, but 128-bit, never :) (that's half the performance).

This rumor is complete BS.
 
So NVIDIA has, in this short span of time, increased Maxwell's performance and efficiency enough to replace a GTX 770 with a 128-bit bus and 2 GB of VRAM??

I will admit that if this rumor said the bus was 192-bit I would have believed it, but 128-bit, never :) (that's half the performance).

This rumor is complete BS.

Possibly so, and a well-spoken rebuttal. My point is for intelligent discussion and counterpoint. I actually tend to agree speculatively with you on the bus width. My objection is that instead of logically providing a counter, Bunjomanjoman decided to use his first post on TPU to blatantly and rudely attack. There's a right way and a wrong way to argue.

As to the 960, again, I'm inclined to agree with you about a 192-bit bus, but since I have not been able to find anything more concrete myself, because Nvidia is so secretive, I will treat btarunr's report as the absolute minimum for the specs, so it's not all bad.
 
Possibly so, and a well-spoken rebuttal. My point is for intelligent discussion and counterpoint. I actually tend to agree speculatively with you on the bus width. My objection is that instead of logically providing a counter, Bunjomanjoman decided to use his first post on TPU to blatantly and rudely attack. There's a right way and a wrong way to argue.

As to the 960, again, I'm inclined to agree with you about a 192-bit bus, but since I have not been able to find anything more concrete myself, because Nvidia is so secretive, I will treat btarunr's report as the absolute minimum for the specs, so it's not all bad.
We shall see next year :D
Hopefully NVIDIA doesn't cut down on the VRAM; me wants 4 GB of VRAM :D
 
Bus width is only one of many factors affecting memory bandwidth, which in turn is only one of many factors affecting overall performance.

You can't just say "it's 128 bit so it's not wide enough" or else you're no different than the people who say "my processor has more GHz so it's better!"

I feel like this thread has devolved into an extension of "Choose R9 290 Series for its 512-bit Memory Bus: AMD"
Yes, another big factor is the memory speed, which can be clocked higher to make up for a narrower bus. However, as we know, the practical limit right now is around 7 GHz effective, and even with some overclockers hitting up to 8 GHz on the memory, that will not fully make up for it. That being said, it doesn't matter much, because this card is clearly aimed at the lower end; it's probably going to be about hitting 1080p at high to ultra settings.

Having a wider bus to match the memory makes a card run a lot better the higher the resolution gets, which is why the R9 290/290X are very good at 4K and deliver at least somewhat playable performance there. That article is mostly in reference to that: if you're choosing to buy one or two 290/290X or 970/980 cards (even a 780/780 Ti), you are buying them for high resolutions, not 1080p. Buying SLI/CrossFire setups of these cards is way overkill for anything below 1440p.

This card is not for those resolutions; it is targeted at the lower-mid range of gaming, handling 1080p. Even if you SLI them, you will probably only be able to do 1440p at reasonable settings. I actually think that while it seems low-end, it sits in the appropriate spot.
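To put rough numbers on the clock-versus-bus point above, here's a quick sketch; the 7-8 Gbps figures are just the GDDR5 ceiling mentioned in the post, used as assumptions rather than known GTX 960 clocks.

```python
# Peak GB/s = (bus width in bits / 8) * effective data rate in Gbps
print(128 / 8 * 7.0)  # 112.0 GB/s -- 128-bit at 7 Gbps
print(128 / 8 * 8.0)  # 128.0 GB/s -- 128-bit even at an optimistic 8 Gbps overclock
print(256 / 8 * 7.0)  # 224.0 GB/s -- 256-bit at the same 7 Gbps
```

So pushing the memory clock from 7 to 8 Gbps buys roughly 15% more bandwidth, nowhere near the 2x that the wider bus provides.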
 
These cards are really not meant for anything over 1080p in the first place...
That's what I see: a card for mainstream 1080p. Sure, more than a 750 Ti, which offers "better than entry-level" 1080p, though nowhere near superlative 1080p. If priced under $200 I could see Nvidia placing it very close to the R9 285 and offering good $/perf. It's not going to deliver if you're thinking of a 2560x upgrade with it. As others see it, the gap between this GTX 960 and the 970 will need a 960 Ti (192-bit) at some point.

Based on the performance of the 750 Ti and the 970, a middle-ground 960 Ti with a 192-bit bus and 3 GB of memory could easily be a "best buy" of its time.
I'd say that is spot-on, but why cannibalize 970 sales? I don't see Nvidia looking to offer any further cut-down GM204 (until, or if, at any time), except perhaps if AMD comes with FinFET first, and they have said in so many words they aren't...
 
Today's games do not really benefit from resolutions higher than 1080p because they were never designed for a resolution higher than that. Sure, you can crank the resolution of a game up to 4K, but you are not improving the graphics in any way; you are just increasing the resolution. You need games to be developed with higher-resolution TEXTURES first before you can really enjoy the visual experience of 1440p or even 4K gaming.
 