Saturday, May 19th 2012

NVIDIA Responds to Reports of Kepler V-Sync Stuttering Issue

Over the past month, users of NVIDIA GeForce Kepler-based graphics cards have been reporting intermittent stuttering in games with v-sync enabled. A fairly long thread on the NVIDIA forums became the rallying point for users noticing the issue, although it wasn't universally reproducible. Tom's Hardware sought a statement from NVIDIA on the issue. In its statement, NVIDIA said that it has traced the issue to a driver bug, that the fix requires extensive testing before it ships in the next major driver release, and that gamers affected in the meantime should disable v-sync via the NVIDIA Control Panel.
We have received reports of an intermittent v-sync stuttering issue from some of our customers. We’ve root caused the issue to a driver bug and identified a fix for it. The fix requires extensive testing though, and will not be available until our next major driver release targeted for June (post-R300). For users experiencing this issue, the interim workaround is to disable v-sync via the Nvidia Control Panel or in-game graphics settings menu.
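For readers wondering why a driver bug like this shows up as stutter rather than just lower frame rates: with v-sync on, a finished frame can only be shown at the next vertical blank, so a render time even slightly over the refresh interval doubles the on-screen frame time. The sketch below is a minimal, hypothetical simulation of that quantization, not NVIDIA's actual driver logic; the function name and the example frame times are illustrative assumptions.

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # one refresh interval: ~16.67 ms

def presented_frame_times(render_times_ms, vsync=True):
    """Return how long each frame stays on screen, in milliseconds.

    With v-sync, a frame waits for the next vertical blank, so its display
    time is rounded up to a whole number of refresh intervals. Without
    v-sync the buffer flips as soon as rendering finishes (risking tearing).
    """
    out = []
    for t in render_times_ms:
        if vsync:
            out.append(max(1, math.ceil(t / VBLANK_MS)) * VBLANK_MS)
        else:
            out.append(t)
    return out

# A hiccup from 15 ms to 17 ms doubles the displayed frame time under
# v-sync (one interval -> two intervals), which is perceived as a stutter.
print(presented_frame_times([15.0, 17.0, 15.0]))
print(presented_frame_times([15.0, 17.0, 15.0], vsync=False))
```

This is also the intuition behind Adaptive V-Sync: above the refresh rate it behaves like v-sync, and below it it flips immediately, trading a little tearing for smoother pacing.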
Source: Tom's Hardware

47 Comments on NVIDIA Responds to Reports of Kepler V-Sync Stuttering Issue

#1
RejZoR
So much for Adaptive V-Sync...
Posted on Reply
#2
Fluffmeister
They have a fix which they are already testing internally at the moment. It's not the end of the world.
Posted on Reply
#3
radrok
If I had such an issue I'd be really pissed; I can't bear stuttering, it drives me mad.

I hope they'll find and deliver a fix soon for those affected.
Posted on Reply
#4
Xzibit
Someone broke something. If it requires extensive testing, then Kepler doesn't handle things the way Fermi does and it's more extensive than they're letting on, or else a beta driver would have been issued.

GPU Boost
Adaptive V-Sync

Hopefully it gets fixed, because who wants to pay $400+ for a brand-new card and not get smooth gameplay without tearing?
Posted on Reply
#5
theoneandonlymrk
What, a launch driver issue on a freshly launched arch from NVIDIA? Some would go completely over the top about this, especially if it were AMD, but these things happen and it will be ironed out in no time.

by: Xzibit
Hopefully it gets fixed, because who wants to pay $400 for a brand-new card and not get smooth gameplay without tearing?
As I have often said, he who buys such a thing hot off the design board and doesn't expect to find the odd issue is daydreaming or living in rosy land.
Posted on Reply
#6
MxPhenom 216
Corsair Fanboy
by: RejZoR
So much for Adaptive V-Sync...
Things can't work right if the software for them sucks...

by: Xzibit
Someone broke something. If it requires extensive testing, then Kepler doesn't handle things the way Fermi does and it's more extensive than they're letting on, or else a beta driver would have been issued.

GPU Boost
Adaptive V-Sync

Hopefully it gets fixed, because who wants to pay $400+ for a brand-new card and not get smooth gameplay without tearing?
That's obvious. Kepler is not the same architecture as Fermi; it is vastly different, so it's not going to handle things the same way. IT'S NEW.
Posted on Reply
#7
Xzibit
by: nvidiaintelftw
That's obvious. Kepler is not the same architecture as Fermi; it is vastly different, so it's not going to handle things the same way. IT'S NEW.
All the more reason to TEST prior to launch. Come on, you're introducing Adaptive V-Sync and you break V-Sync. :laugh:

People in development need to get their act together.
Posted on Reply
#8
semantics
by: Xzibit
All the more reason to TEST prior to launch. Come on, you're introducing Adaptive V-Sync and you break V-Sync. :laugh:

People in development need to get their act together.
It's broken for Kepler, but it works fine for Fermi and, I think, every NVIDIA chip they still support, way back to the 8000 series if not earlier. I don't know what they've dropped support for, but I have an 8800 in another computer and I got adaptive v-sync, so I think they tested just fine. Testing every configuration possible is a waste of time and money. Piss off 5% of your users and be prepped to fix it quickly and cheaply, appeasing the other 95%, or piss off 99% of your users because you never update or try anything new.
Posted on Reply
#9
atikkur
lesson learned, dont get too excited with new tech.
Posted on Reply
#10
Xzibit
by: semantics
It's broken for Kepler, but it works fine for Fermi and, I think, every NVIDIA chip they still support, way back to the 8000 series if not earlier. I don't know what they've dropped support for, but I have an 8800 in another computer and I got adaptive v-sync, so I think they tested just fine. Testing every configuration possible is a waste of time and money. Piss off 5% of your users and be prepped to fix it quickly and cheaply, appeasing the other 95%, or piss off 99% of your users because you never update or try anything new.
That makes even less sense. 296.10 is the last driver for non-Kepler cards; everything enabling Adaptive V-Sync for Fermi is an RC or beta.

301.10 was the official 6xx-series driver. So "it works with Fermi but not Kepler" makes no sense at all.
Posted on Reply
#11
atikkur
by: Xzibit
That makes even less sense. 296.10 is the last driver for non-Kepler cards; everything enabling Adaptive V-Sync for Fermi is an RC or beta.

301.10 was the official 6xx-series driver. So "it works with Fermi but not Kepler" makes no sense at all.
I heard it has to do with the GPU Boost feature, so it only affects Kepler cards; non-Kepler doesn't have this issue.
Posted on Reply
#12
Amrael
This issue has been on board since day one of 301.10, because it happened to me when I edited the driver's .inf file to make it work on the NVIDIA 500 series. I chalked it up to an incompatibility issue, but for it to happen on the hardware the driver is supposed to support is ridiculous. It's not the end of the world, but it does feel shoddy or rushed, something a $400 or $500+ piece of hardware shouldn't suffer from. That's just my opinion.
Posted on Reply
#13
Ikaruga
It's there with Fermi drivers too (just like the huge added input lag). Most serious gamers knew the 200+ drivers were buggy and that v-sync (along with scaling) has been unusable and simply wrong since 266.58.
Now it's suddenly a big issue, when the extra code added for adaptive v-sync made it more apparent to the lasermouse kids and they can't play "COD 732 MW:Whatever" and "BF27: The Revenge of the Lensflare" anymore.

I like NVIDIA, and I understand it must be a very hard task to maintain and organize a unified driver for such a vast amount of chips and cards... but these are very expensive cards, and people expect nothing but the best for their money, so NV should really focus more on their drivers, imo.
Posted on Reply
#14
Vulpesveritas
Proof: NVIDIA does not have infallible drivers.
Both AMD and NVIDIA seem to have issues with their 28nm hardware; NVIDIA more so, it seems, judging by the lack of anything but high-end GPUs.
Posted on Reply
#15
Amrael
No company does (have infallible drivers), but damn, these are really expensive parts, at least for me. And why be empathetic with them? They just want our money; it's not like they haven't been holding back video card performance for years just to sell us "the next greatest thing." Anyway, these latest NVIDIA cards are freaking monsters; hopefully they'll fix this bug in the next driver release.
Posted on Reply
#16
Vulpesveritas
by: Amrael
No company does (have infallible drivers), but damn, these are really expensive parts, at least for me. And why be empathetic with them? They just want our money; it's not like they haven't been holding back video card performance for years just to sell us "the next greatest thing." Anyway, these latest NVIDIA cards are freaking monsters; hopefully they'll fix this bug in the next driver release.
Eh, I wouldn't call them "monsters," given that they barely outperform the AMD Radeon HD 79xx series in gaming frames per second and are horrible at GPGPU tasks for their price.
But I agree they're rather pricey. lol
Posted on Reply
#17
RejZoR
by: nvidiaintelftw
Things can't work right if the software for it sucks.....
Thats obvious. Kepler is not the same architecture as Fermi, it is vastly different so its not going to handle things the same way. ITS NEW.
I guess some companies still haven't figured out why they have QA departments. And I'm not saying NVIDIA is the only one that doesn't know...
Posted on Reply
#18
Amrael
by: Vulpesveritas
Eh, I wouldn't call them "monsters," given that they barely outperform the AMD Radeon HD 79xx series in gaming frames per second and are horrible at GPGPU tasks for their price.
But I agree they're rather pricey. lol
OK, they don't really crush them, but they do beat them in most benchmarks, and the price isn't too high over the AMD offerings. Anyway, I don't think any video card is worth more than $300, and a lot of people should stop listening to lame propaganda about maxed-out games. Most games on PC run really well with a decent video card and beat consoles hands down, so what's the need for $500+ video cards?
Posted on Reply
#19
Recus
That's what happens when you enable adaptive v-sync and v-sync in the in-game options. :D

In my case, Binary Domain and Sniper Elite V2 don't have the stuttering issue with adaptive v-sync. :P
Posted on Reply
#20
Aquinus
Resident Wat-man
by: Amrael
OK, they don't really crush them, but they do beat them in most benchmarks, and the price isn't too high over the AMD offerings. Anyway, I don't think any video card is worth more than $300, and a lot of people should stop listening to lame propaganda about maxed-out games. Most games on PC run really well with a decent video card and beat consoles hands down, so what's the need for $500+ video cards?
What makes you think it isn't worth the price? A lot of GPUs today are priced based on performance. The point of playing a game on a computer instead of a console (at least for me) is performance and graphics, plus the mouse and keyboard for some strategy games, but honestly that's about it. I will agree with the other half of what you said, in the sense that the best video card isn't needed. I bought a 6870 the day it was released and it already ran games really well without a second card, but I got a second one for CrossFire because I wanted to know how smooth gaming could get, and it gets pretty smooth. I can completely understand why anyone would buy a high-end GPU, but unless you're running three displays, I don't think the extra horsepower will really be worth it.
Posted on Reply
#21
Jurassic1024
by: Amrael
Anyway, I don't think any video card is worth more than $300, and a lot of people should stop listening to lame propaganda about maxed-out games. Most games on PC run really well with a decent video card and beat consoles hands down, so what's the need for $500+ video cards?
That can be said for millions of products. We buy them because we want to, because we are enthusiasts. Just because you can't justify blowing $500 on a video card doesn't mean the rest of us shouldn't bother.

It's pretty sad you posted that in the comment section of a computer hardware site. Makes no sense at all.

Here's your homework:
Google > define:enthusiast
Posted on Reply
#22
xenocide
I had my heart set on a GTX670, but with such terrible availability I'm going to either wait for the next series, or just get an HD7870 :/
Posted on Reply
#23
beck24
AMD fanbois gnash their teeth over an NVIDIA driver bug; over AMD's numerous bugs and CrossFire stuttering problems, I hear only crickets.
Posted on Reply
#24
Xzibit
by: beck24
AMD fanbois gnash their teeth over an NVIDIA driver bug; over AMD's numerous bugs and CrossFire stuttering problems, I hear only crickets.
Didn't know you had to be an AMD fanboy to want your NVIDIA hardware to work properly.

Both companies have issues, but that's not the news topic now, is it?

Guess I should take my three most recent video cards and send them back to NVIDIA. :rolleyes:

If a company promotes one feature and breaks another, the product isn't working properly when it launches. Consumers and potential consumers have a right to complain about said feature.

Don't be ignorant.
Posted on Reply
#25
beck24
So they're fixing it. Don't get your panties in a wad. Relax.
Posted on Reply
Add your own comment