Negative. Adaptive sync is agnostic. If you have an adaptive sync graphics processor and an adaptive sync monitor, adaptive sync will be enabled by default. The purpose of the technology is to act without user input. Again, the goal is to reduce bandwidth requirements as well as idle power consumption. The branding matters not.
That's describing the panel, which inadvertently describes the minimum refresh rate the eDP link will run at. The frame rate from the GPU can be lower--eDP will fill in the gaps to keep the panel at or above its minimum.
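Here's a minimal sketch of that gap-filling idea (illustrative only -- the function name and the numbers are my own assumptions, not actual eDP or driver code):

```python
import math

# Illustrative sketch -- not real eDP/driver code. Core idea: when GPU
# frames arrive slower than the panel's minimum refresh rate allows,
# the last frame is scanned out again to fill the gap.

def refreshes_per_frame(frame_interval_s: float, panel_min_hz: float) -> int:
    """Number of times one GPU frame is shown so that no single
    refresh interval exceeds 1 / panel_min_hz seconds."""
    return max(1, math.ceil(frame_interval_s * panel_min_hz))

# Example: GPU renders at 20 fps (50 ms/frame) on a panel with a 40 Hz
# minimum (25 ms max interval) -> each frame is shown twice, so the
# panel stays at 40 Hz while the content runs at 20 fps.
print(refreshes_per_frame(1 / 20, 40))  # -> 2
```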
Looks like you realized your mistake in the last paragraph, but we'll address that soon.
Adaptive-Sync is agnostic, but again... FreeSync is not. And the point of this topic is FreeSync vs G-Sync.
From AMD: "FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits."
More from AMD...
Q: What are the requirements to use FreeSync?
A: To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort™ connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.
Yup, agnostic.
I never said 1.3 is when it was first supported. It will be the first version supported by NVIDIA, Intel, and the rest of the industry.
Actually, you did say it. You also said DisplayPort 1.3 would allow it to work natively without any other hardware. Want me to quote that too? You made a big stink about how DisplayPort 1.3 will be the end of G-Sync, but nothing in DisplayPort 1.3's spec supports that claim of yours.
All hardware with DisplayPort 1.3 ports will support adaptive sync, and that includes NVIDIA.
FYI, AMD Crimson drivers added "Low Framerate Compensation" for FreeSync:
G-Sync lost its technical edge with a driver update. eDP/adaptive sync is just that awesome.
Edit: Interesting caveat there: "greater than or equal to 2.5 times the minimum refresh rate."
Minimum refresh -> required maximum refresh:
30 Hz -> 75 Hz
35 Hz -> 87.5 Hz
40 Hz -> 100 Hz
42 Hz -> 105 Hz
47 Hz -> 117.5 Hz
48 Hz -> 120 Hz
56 Hz -> 140 Hz
That's definitely something buyers should be aware of.
Edit: Looks like LFC should work on all 144 Hz displays, since 144 / 2.5 = 57.6 Hz, which is above any typical panel minimum.
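For anyone who wants to sanity-check that 2.5x rule, here's a quick sketch (the multiplier comes from the quoted requirement; the function name is mine):

```python
# Sketch of the quoted LFC requirement: the panel's maximum refresh
# must be greater than or equal to 2.5x its minimum refresh.

def required_max_hz(panel_min_hz: float) -> float:
    """Smallest maximum refresh rate that satisfies the 2.5x rule."""
    return 2.5 * panel_min_hz

# Reproduces the list above.
for low in (30, 35, 40, 42, 47, 48, 56):
    print(f"{low} Hz min -> needs >= {required_max_hz(low)} Hz max")

# The 144 Hz observation: 144 / 2.5 = 57.6, so any panel whose minimum
# refresh is 57 Hz or lower qualifies on a 144 Hz display.
print(144 / 2.5)  # -> 57.6
```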
Did you just find out about the driver update?
There are downsides to it being a driver update: new monitors need new driver updates, some monitors may be left out (and some are, by the way), and it is definitely not the most elegant solution, which leaves room for bugs (some users experience overshoot, flickering, and ghosting). Then you have the caveat that you mentioned. The upper-end resolutions are where this will have the most effect.
Verdict: Good start, but still room for improvement.
So no, NVIDIA hasn't lost the technical edge... yet.
Also, here is the video in which they recommended those changes to AMD.
This debate is tired. I said what I wanted to say.
You know the jargon, but your information is off.
Do your pre-emptive victory cheer. Get the last word in. I'm done. We are not getting anywhere. You never see any point of view but your own.