
[DISPROVED] Apparently one can use Freesync with nVidia cards ...

HTC

Joined: Apr 1, 2008
Messages: 4,700 (0.75/day)
Location: Portugal
System Name: HTC's System
Processor: Ryzen 5 5800X3D
Motherboard: Asrock Taichi X370
Cooling: NH-C14, with the AM4 mounting kit
Memory: G.Skill 16GB DDR4 kit, F4-3200C16D-16GTZB
Video Card(s): Sapphire Pulse 6600 8 GB
Storage: 1x Samsung NVMe 960 EVO 250 GB + 1x 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s): LG 27UD58
Case: Fractal Design Define R6 USB-C
Audio Device(s): Onboard
Power Supply: Corsair TX 850M 80+ Gold
Mouse: Razer Deathadder Elite
Software: Ubuntu 20.04.6 LTS
Hmm, what's the bet they don't patch it out, but also don't actually provide proper support?
 
NVIDIA has no reason not to support the adaptive sync VESA standard that FreeSync implements. Maybe the 2### quietly does.
 
Interesting. Very interesting. I reckon that even though NVIDIA is so dominant over AMD, G-SYNC is tanking, so they've unofficially enabled it in order to gauge market response. If so, we should see a press release and official support in the next few months. Just my 2 cents.
 
Hmm, what's the bet they don't patch it out, but also don't actually provide proper support?
They technically can't provide "proper support" because FreeSync certification is between AMD and the monitor manufacturer. They can make the card appropriately respond to the capabilities of the monitor, which is what adaptive sync is all about. There just won't be any optimization/certification unless NVIDIA makes the same arrangement with the monitor manufacturer. Even so, I'm not certain that a single monitor SKU could be made fully compliant with both AMD and NVIDIA certifications. Adaptive sync strikes me as more of a protocol than anything else, not that it couldn't benefit from optimizations.
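To put the "protocol" point another way, here's a rough sketch of the idea (the names and structure below are mine, not any real driver API): the monitor advertises the refresh range it can handle, and the card simply holds each frame on screen for however long that frame took to render, clamped to that range. Nothing in that basic handshake needs a per-vendor certification.

```python
# Toy model of adaptive sync as a protocol (hypothetical names, not a real API):
# the monitor reports its refresh range, the GPU matches scanout to frame time.

from dataclasses import dataclass

@dataclass
class MonitorCaps:
    min_hz: float  # lowest refresh the panel can hold (e.g. 40 Hz)
    max_hz: float  # highest refresh the panel supports (e.g. 144 Hz)

def next_refresh_interval(frame_time_s: float, caps: MonitorCaps) -> float:
    """How long (in seconds) the panel holds the next frame."""
    longest = 1.0 / caps.min_hz    # can't hold a frame longer than the panel allows
    shortest = 1.0 / caps.max_hz   # can't scan out faster than the panel allows
    return min(max(frame_time_s, shortest), longest)

caps = MonitorCaps(min_hz=40, max_hz=144)
print(next_refresh_interval(0.025, caps))  # 25 ms frame (40 fps) -> held exactly 25 ms
print(next_refresh_interval(0.005, caps))  # 5 ms frame (200 fps) -> clamped to ~6.9 ms (144 Hz)
```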

And here's the kicker: NVIDIA could still charge a premium for certification of monitors where AMD doesn't. Just because the solution is completely software based doesn't mean NVIDIA can't ask for a lot of money for a certification that AMD gives away. Don't quote me on this, but I think AMD doesn't charge anything for certification: the manufacturer has to pay to ship the monitor to AMD, with the understanding that AMD may not return it, so the cost of AMD certification is the cost of the monitor plus shipping. There's no fee for the certification itself.
 
NVIDIA has no reason not to support the adaptive sync VESA standard that FreeSync implements. Maybe the 2### quietly does.

Except that, judging by the comments on the source, it also works with the 1080 Ti.
 
We'll have to wait for comment from NVIDIA, then. The author speculated it was the result of a change in Windows' Desktop Window Manager that NVIDIA didn't account for.

One would certainly expect NVIDIA to make an announcement if they were jumping on the adaptive sync bandwagon.
 
Just use a monitor with a Hz counter if you're not sure whether FreeSync is active or not
 
unfortunately not for everyone:

[attached screenshot]
 
When I set the refresh rate of my monitor (Acer 34" 75 Hz FreeSync 21:9 monitor) to 75 Hz and played my usual test game (I just installed my RTX 2080 Ti into my gaming rig yesterday), I found that although the framerate was very good, motion seemed to stutter. When I set it back to 60 Hz, it was buttery smooth, whaddup widdat?! :confused:
 
I thought I had seen Linus do one not too long ago.
 
NVIDIA has no reason not to support the adaptive sync VESA standard that FreeSync implements. Maybe the 2### quietly does.

for no one

The article first says that it's also the 1080 Ti working fine, regardless of display. The only constant is the 411.70 driver.

But then, in a comment below the article, you read this from the author:

[screenshot of the author's comment]
 
G-SYNC is tanking
The nVidia tax is what tanked G-Sync. The price premium to get a half-decent 4K display with G-Sync is absurd. nVidia has to understand that if you force people to pay for the GPU with one kidney, you can't then try to force them to buy an overpriced G-Sync monitor with the second kidney, as dialysis is kind of expensive and you're going to be fresh out of kidneys.
 
Btw: wasn't this supposed to be incompatible???

No. NVidia never exposed it. They make good cash pushing G-Sync. I have no doubt virtually all of their newer cards are capable of it. But that means losing a large portion of G-Sync revenue by making it available.

I'm really curious to see how this plays out. Perhaps in the next driver release it'll be gone...

Edit...

Update 9/30/18 3:22 AM: After further research and the collection of more high-speed camera footage from our G-Sync displays, I'm confident the tear-free gameplay we're experiencing on our FreeSync displays in combination with GeForces is a consequence of Windows 10's Desktop Window Manager adding its own form of Vsync to the proceedings when games are in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards. Pending a response from Nvidia as to just what we're experiencing, I'd warn against drawing any conclusions from our observations at this time and sincerely apologize for the misleading statements we've presented in our original article. The original piece continues below for posterity.
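In other words (and this is just my own toy model of what the update describes, not how DWM is actually implemented): a compositor that only latches complete frames at each fixed vblank will never show a torn frame, even with no adaptive sync involved at all.

```python
# Toy model of fixed-refresh composition (my own sketch, not the DWM API):
# at every vblank the compositor picks the most recent *finished* game frame
# and scans out that whole frame, so a tear line never appears even though no
# adaptive sync is involved. Frames can still be skipped or repeated when the
# game's rate doesn't match the refresh rate.

def composited_scanout(frame_ready_times, refresh_hz, refreshes):
    """Which completed game frame is shown at each fixed-rate vblank."""
    shown = []
    for k in range(refreshes):
        vblank = k / refresh_hz
        finished = [i for i, t in enumerate(frame_ready_times) if t <= vblank]
        shown.append(finished[-1] if finished else None)  # latest whole frame only
    return shown

# Frames finishing at arbitrary times (an uncapped game) on a 60 Hz panel:
ready = [0.000, 0.011, 0.019, 0.030, 0.041, 0.047]
print(composited_scanout(ready, 60, 4))  # [0, 1, 3, 5]: frames skipped, none torn
```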
 
The nVidia tax is what tanked G-Sync. The price premium to get a half-decent 4K display with G-Sync is absurd. nVidia has to understand that if you force people to pay for the GPU with one kidney, you can't then try to force them to buy an overpriced G-Sync monitor with the second kidney, as dialysis is kind of expensive and you're going to be fresh out of kidneys.

That G-Sync tax, as you put it, is based upon the mistaken assumption that G-Sync and Freesync are the same thing. In one sense they are, in that, as named, both provide adaptive sync. However, the G-Sync package is not the same as the Freesync package.

- Freesync provides adaptive sync technology which kicks in at about 40 fps. While it continues to "sync" above 60 fps, the observable impact diminishes the further you get past 60 fps.

- G-Sync provides adaptive sync technology which kicks in at about 30 fps. While it continues to "sync" above 60 fps, the observable impact diminishes the further you get past 60 fps. However, the G-Sync monitor package also includes a hardware module which provides Motion Blur Reduction technology (ULMB). In any game where you can maintain 70 fps or so, with the sync effect diminishing the higher your card can manage, ULMB provides an experience that most gamers view as superior to adaptive sync in any form. AMD has no corresponding technology. Although some monitor manufacturers did provide some form of MBR technology with the early Freesync models, the number of such offerings has diminished.

That's due to the conflict in how Freesync is marketed. It's supposed to be the less expensive alternative to G-Sync ... but when it's equipped to compete head to head with G-Sync and ULMB by providing a hardware module for MBR, the "G-Sync tax" disappears. If you want to play at 1440p, especially at high refresh rates, then nvidia can deliver a lot more fps as well as the ability to virtually eliminate motion blur. As such, I'd recommend:

60 fps @ 1080p in Witcher 3 - 144 Hz TN monitor w/ 1060 3GB w/ MBR-capable monitor
40 fps @ 1440p in Witcher 3 - 144 Hz TN monitor w/ 1060 6GB & G-Sync, or w/ Fury & Freesync
100 fps @ 1440p in Witcher 3 - 165 Hz IPS monitor (AU Optronics panel) w/ 1080 Ti & G-Sync
75 fps @ 2160p in Witcher 3 - 144 Hz IPS monitor (AU Optronics panel) w/ 2080 *

* Would recommend the Ti (even two) but won't until the 10xx series stock is sold out and the tariff disappears so that prices get back to normal levels. However, I don't see 4K as a real option for all but the "money no object" crowd for at least one more generation.
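To make the sync ranges mentioned above concrete, here's a crude sketch (the floor values are the ballpark figures from this post, not official specs, and the helper function is purely illustrative):

```python
# Crude sketch of adaptive sync range behaviour (floor values are the ballpark
# figures from the post above, not official specs).

def sync_behaviour(fps: float, floor_hz: float, panel_max_hz: float) -> str:
    if fps < floor_hz:
        return "below range: tearing/stutter unless frame-doubling (LFC-style) kicks in"
    if fps <= panel_max_hz:
        return f"adaptive sync active: panel refreshes at ~{fps:.0f} Hz"
    return f"above range: capped at the panel's {panel_max_hz:.0f} Hz"

print(sync_behaviour(35, floor_hz=40, panel_max_hz=144))   # ~Freesync floor, per this post
print(sync_behaviour(35, floor_hz=30, panel_max_hz=144))   # ~G-Sync floor, per this post
print(sync_behaviour(200, floor_hz=30, panel_max_hz=144))  # above the panel's range
```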
 
@John Naylor You can have either G-SYNC mode on, or ULMB, but not both together. Are you saying that they can run together?

BTW, my current monitor, the BenQ XL2720Z, has a strobing backlight and no restrictions on the refresh rates at which it will work, so it's actually better than ULMB. Using it at 144 Hz is particularly awesome, with such smooth and clear animation (where there are no dropped frames from the PC), and at 60 Hz, gloriously migraine-inducing!

This is the killer feature that I bought it for and will look for again in my next monitor.
 
Pretty old news...
 
Thanks to this I have discovered something more useful (at least to me): you can use any video card with an analog display if you have a Ryzen APU.
Old VGA never dies.
 
It was a mistake... Windows was applying Vsync because they were using borderless windowed mode. Nothing to see here.
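That would also explain the stutter reported a few posts up at 75 Hz: if the game keeps delivering ~60 fps and the compositor v-syncs it to a 75 Hz panel, the frames can't pace evenly. A quick back-of-the-envelope check (illustrative arithmetic only, assuming a locked 60 fps source):

```python
# Back-of-the-envelope frame pacing (illustrative only, assumes a locked source
# frame rate): count how many display refreshes each game frame stays on screen.

def frame_hold_pattern(game_fps: int, refresh_hz: int, refreshes: int = 10):
    shown = [k * game_fps // refresh_hz for k in range(refreshes)]  # frame index at refresh k
    return [shown.count(f) for f in sorted(set(shown))]

print(frame_hold_pattern(60, 75))  # [2, 1, 1, 1, 2, 1, 1, 1] -> uneven holds, reads as stutter
print(frame_hold_pattern(60, 60))  # [1, 1, 1, 1, 1, 1, 1, 1, 1, 1] -> even, looks smooth
```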
 
That G-Sync tax, as you put it, is based upon the mistaken assumption that G-Sync and Freesync are the same thing. In one sense they are, in that, as named, both provide adaptive sync. However, the G-Sync package is not the same as the Freesync package.

- Freesync provides adaptive sync technology which kicks in at about 40 fps. While it continues to "sync" above 60 fps, the observable impact diminishes the further you get past 60 fps.

- G-Sync provides adaptive sync technology which kicks in at about 30 fps. While it continues to "sync" above 60 fps, the observable impact diminishes the further you get past 60 fps. However, the G-Sync monitor package also includes a hardware module which provides Motion Blur Reduction technology (ULMB). In any game where you can maintain 70 fps or so, with the sync effect diminishing the higher your card can manage, ULMB provides an experience that most gamers view as superior to adaptive sync in any form. AMD has no corresponding technology. Although some monitor manufacturers did provide some form of MBR technology with the early Freesync models, the number of such offerings has diminished.

That's due to the conflict in how Freesync is marketed. It's supposed to be the less expensive alternative to G-Sync ... but when it's equipped to compete head to head with G-Sync and ULMB by providing a hardware module for MBR, the "G-Sync tax" disappears. If you want to play at 1440p, especially at high refresh rates, then nvidia can deliver a lot more fps as well as the ability to virtually eliminate motion blur. As such, I'd recommend:

60 fps @ 1080p in Witcher 3 - 144 Hz TN monitor w/ 1060 3GB w/ MBR-capable monitor
40 fps @ 1440p in Witcher 3 - 144 Hz TN monitor w/ 1060 6GB & G-Sync, or w/ Fury & Freesync
100 fps @ 1440p in Witcher 3 - 165 Hz IPS monitor (AU Optronics panel) w/ 1080 Ti & G-Sync
75 fps @ 2160p in Witcher 3 - 144 Hz IPS monitor (AU Optronics panel) w/ 2080 *

* Would recommend the Ti (even two) but won't until the 10xx series stock is sold out and the tariff disappears so that prices get back to normal levels. However, I don't see 4K as a real option for all but the "money no object" crowd for at least one more generation.
Great explanation. However, when it comes down to it, most people look at $$$ first; it's not "it's only 50 bucks, dude", it's a lot more.
 
After further research and the collection of more high-speed camera footage from our G-Sync displays, I'm confident the tear-free gameplay we're experiencing on our FreeSync displays in combination with GeForces is a consequence of Windows 10's Desktop Window Manager adding its own form of Vsync to the proceedings when games are in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards. Pending a response from Nvidia as to just what we're experiencing, I'd warn against drawing any conclusions from our observations at this time and sincerely apologize for the misleading statements we've presented in our original article. The original piece continues below for posterity.

tl;dr "We were totally wrong about this, but we're gonna leave this article up for the clicks it will draw"
 
tl;dr "We were totally wrong about this, but we're gonna leave this article up for the clicks it will draw"

Reported this post to get the title in the OP changed by a mod, since I was unable to do it myself.
 
At least the hack still works if you have an AMD GPU and route the NVIDIA card's output through it with software... if only I had an AMD iGPU, lol
 