
NVIDIA GPUs Can be Tricked to Support AMD FreeSync

If you'd read the OP, you'd notice it's done using the graphics card integrated into the APU.


You must be new here. nVidia has its own level, named after its CEO.

I got sidetracked by the OP mentioning a 550+1080Ti combo, I guess. :)
 
This whole debacle would go away if VESA would buck up and make FreeSync a mandatory part of DisplayPort.
 
This whole debacle would go away if VESA would buck up and make FreeSync a mandatory part of DisplayPort.
You mean DisplayPort Adaptive-Sync, right? :)
As well as HDMI 2.1 with its (again optional) VRR.
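For reference, "FreeSync" on the monitor side largely comes down to the panel advertising a supported refresh range in its EDID, which the driver then reads. A minimal sketch of pulling that range out of an EDID Display Range Limits descriptor (tag 0xFD, per EDID 1.4); the 18-byte descriptor below is synthetic example data for a hypothetical 48-144 Hz panel, not a real monitor dump:

```python
def parse_range_limits(desc: bytes):
    """Extract refresh rate limits from an 18-byte EDID display descriptor."""
    assert len(desc) == 18
    # Display descriptors start with three zero bytes, then a tag byte;
    # 0xFD marks the Display Range Limits descriptor.
    if desc[0:3] != b"\x00\x00\x00" or desc[3] != 0xFD:
        return None  # not a range-limits descriptor
    min_v, max_v = desc[5], desc[6]   # vertical rate limits, Hz
    min_h, max_h = desc[7], desc[8]   # horizontal rate limits, kHz
    return {"v_hz": (min_v, max_v), "h_khz": (min_h, max_h)}

# Synthetic descriptor: 48-144 Hz vertical, 30-160 kHz horizontal.
desc = bytes([0, 0, 0, 0xFD, 0, 48, 144, 30, 160,
              22, 0, 10, 32, 32, 32, 32, 32, 32])
print(parse_range_limits(desc))  # {'v_hz': (48, 144), 'h_khz': (30, 160)}
```

The min/max vertical rates here are what bound the variable refresh window a FreeSync-capable driver can use; real EDIDs also carry offset flags in byte 4, which this sketch ignores.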
 
This whole debacle would go away if VESA would buck up and make FreeSync a mandatory part of DisplayPort.

That's what it is? Although they call it Adaptive Sync. What am I missing?
 
In theory, this should allow you to pair your high-end GTX 1080 Ti with a $50 RX 550 that supports FreeSync, instead of paying the $200+ G-SYNC tax.

By "G-Sync tax" I assume you mean the hardware module missing from FreeSync monitors unless the monitor manufacturer provides it? Yeah, the module that adds a significant increase in cost when those manufacturers do provide it? Buying a FreeSync monitor to run an nVidia card is like going off-roading in a vehicle without 4WD. No ULMB, no thanks.
 
By "G-Sync tax" I assume you mean the hardware module missing from FreeSync monitors unless the monitor manufacturer provides it? Yeah, the module that adds a significant increase in cost when those manufacturers do provide it? Buying a FreeSync monitor to run an nVidia card is like going off-roading in a vehicle without 4WD. No ULMB, no thanks.

It's not like the cards couldn't comply. They just wanted to own the display market.

edit: For the record, I think they deserved that chance. They just failed.
 
Since 2009 in embedded DisplayPort. It was added to DisplayPort 1.2a in 2014 as an optional feature.
Btw, GSync in laptops uses eDP's Adaptive Sync.
 
Since 2009 in embedded DisplayPort. It was added to DisplayPort 1.2a in 2014 as an optional feature.
Btw, GSync in laptops uses eDP's Adaptive Sync.
Yeah, I meant it was defined and functional. I'm pretty sure HDMI had something similar as well, but I can't dig anything up atm.
 
Yeah, I meant it was defined and functional. I'm pretty sure HDMI had something similar as well, but I can't dig anything up atm.
HDMI did not have VRR support until the upcoming 2.1. Both GSync and FreeSync are doing proprietary stuff for HDMI support.
 
The only one who can actually scold nVidia and bring them to their knees is Microsoft itself, since nVidia is screwing around with their OS and monetizing OS feature fragmentation.

Currently G-Sync is already a pain in the arse for them; it is constantly broken in Insider builds because it interacts with the whole display driver model.

Microsoft already made such a move in the past by spanking Creative.

Somebody needs to have a serious talk. Most importantly, the user experience suffers from this proprietary nonsense; with G-Sync, FreeSync, etc. factoring into buying a panel, consumer choices are limited, and most people don't even understand that. A pure circus from both AMD and nVidia, tbh... some things should be left alone and kept common, just like graphics APIs.
 
The only one who can actually scold nVidia and bring them to their knees is Microsoft itself, since nVidia is screwing around with their OS and monetizing OS feature fragmentation.

Currently G-Sync is already a pain in the arse for them; it is constantly broken in Insider builds because it interacts with the whole display driver model.

Microsoft already made such a move in the past by spanking Creative.

Somebody needs to have a serious talk. Most importantly, the user experience suffers from this proprietary nonsense; with G-Sync, FreeSync, etc. factoring into buying a panel, consumer choices are limited, and most people don't even understand that. A pure circus from both AMD and nVidia, tbh... some things should be left alone and kept common, just like graphics APIs.

So apparently it's not just Linus with the problems :P

 
GSync's days are numbered the moment DP Adaptive Sync or more likely HDMI VRR (due to TVs using HDMI) starts being mandatory. Unfortunately, this is not too likely right now because there will be low-end monitor/TV manufacturers who will not want to have a (more expensive) scaler capable of that in their displays. We will get there some day though.
 
GSync's days are numbered the moment DP Adaptive Sync or more likely HDMI VRR (due to TVs using HDMI) starts being mandatory. Unfortunately, this is not too likely right now because there will be low-end monitor/TV manufacturers who will not want to have a (more expensive) scaler capable of that in their displays. We will get there some day though.

Well, for now, I get to use FreeSync over HDMI as well (it's on all new Samsungs). 4K is limited to 60 Hz though... unlike those new G-Sync BFGDs coming.
 
Sure (and I wish they did), but I'm just talking about the topic at hand. Nvidia. These slots don't grow on trees!
Are PCIe slots so valuable, really? :-)
I'm on mITX, so that's a different story. But based on System Specs most people here use ATX, so...?
 
Are PCIe slots so valuable, really? :)
I'm on mITX, so that's a different story. But based on System Specs most people here use ATX, so...?

They are to me..

I'm using ATX, but I'm kind of limited. I'm also using a Core-X without the full PCIe lane count the i9 has... so one of my slots is disabled as it is, and this Vega hogs up the space of two slots.
 
LOL, you realize G-Sync and ULMB are never active together, right?

If you buy into G-Sync for the ULMB, you've lost the plot...
I'm going to leave that without engaging in the dumb conversation you've just tried to start.
You have to realize that other people have a brain too, not just you.
 
OK..

It's still an important little detail when it comes to emulating FreeSync on an Nv card.
 
OK..

It's still an important little detail when it comes to emulating FreeSync on an Nv card.
What is an important detail?
 
What is an important detail?

That you aren't missing anything FreeSync would offer if you lack ULMB. But I see you are in resistance mode; that's fine, just drop it.
 
That you aren't missing anything FreeSync would offer if you lack ULMB. But I see you are in resistance mode; that's fine, just drop it.
No, but you are not getting 100% of the functionality getting a FreeSync monitor over a G-Sync one either. And strobing is a BIG one.

If anything, it's you that needs to drop the condescending tone, not just here, but in general. No one was talking about ULMB in G-Sync mode until you tried to convince people here that I was.
 
LOL, you realize G-Sync and ULMB are never active together, right?

If you buy into G-Sync for the ULMB, you've lost the plot...
The caveat is that when people can sustain high fps, they tend to prefer ULMB over adaptive refresh rates.
 
No, but you are not getting 100% of the functionality getting a FreeSync monitor over a G-Sync one either. And strobing is a BIG one.

If anything, it's you that needs to drop the condescending tone, not just here, but in general. No one was talking about ULMB in G-Sync mode until you started thinking I was.

Fair enough, but that's not the topic here. The topic is FreeSync and opening up the use of these monitors for Nvidia cards. And, on top of that, you can find monitors with strobing and no G-Sync too.

Relax man. Shit.

You first posted with "no ULMB, no thanks"... which really is off-topic here.
 