Aquinus
You're describing, once again, stuttering. If the screen is tearing, then V-Sync probably isn't actually on; there are plenty of driver-level settings that override what you and the application want. V-Sync does introduce stutter (not tearing), though. If you read that second-to-last link you provided, the asker even edited his StackOverflow question to say that scenes appeared to be getting rendered multiple times: the output was synchronized, but the rendering that actually altered the frame buffer was not. Therefore the "changes" that actually occurred were happening slower than the frame rate.
This is the result of bad programming, not bad V-Sync.
This guy's question was answered:
StackOverflow said:
It seems like you're not synchronizing your 60Hz frame loop with the GPU's 60Hz VSync. Yes, you have enabled Vsync in Nvidia but that only causes Nvidia to use a back-buffer which is swapped on the Vsync. You need to set the swap interval to 1 and perform a glFinish() to wait for the Vsync.
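For anyone who wants to see what that answer means in practice, here's a minimal sketch (GLFW is assumed here for window and context creation; the original question may have used something else): set the swap interval to 1 so the buffer swap is tied to the vertical blank, then call glFinish() so the CPU-side frame loop actually blocks until that swap has completed.

```c
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit())
        return 1;
    GLFWwindow *win = glfwCreateWindow(640, 480, "vsync demo", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);  /* swap on the next vblank, not immediately */

    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);  /* ...render the scene here... */
        glfwSwapBuffers(win);          /* queues the swap for the vblank */
        glFinish();                    /* block until the queued work, including
                                          the swap on most drivers, is done */
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```

Without the glFinish(), the driver can queue up several frames ahead, which is exactly the "scenes rendered multiple times" behavior the asker observed.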
Once again, look above. He has a physical memory problem, which could very well be the cause (an occasional memory swap WILL drop a frame); this discussion of ours is only a tangent.
Edit: Adaptive V-Sync isn't really pure V-Sync.