Wednesday, September 24th 2014

NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties

NVIDIA's G-SYNC technology is rivaled by AMD's Project FreeSync, which is based on a technology standardized by the Video Electronics Standards Association (VESA) as Adaptive-Sync. The technology lets GPUs and monitors keep the display's refresh rate in sync with the GPU's frame rate, so the resulting output appears fluid. VESA's technology requires no special hardware inside standards-compliant monitors and is royalty-free. NVIDIA's G-SYNC, by contrast, relies on specialized hardware that display makers have to source from NVIDIA, which effectively amounts to a royalty.
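For readers unfamiliar with variable refresh, the timing difference described above can be sketched in a few lines. This is a minimal, hypothetical Python model, not any vendor's driver code; the 30-144 Hz panel range and the helper names are assumptions for illustration.

```python
# Illustrative sketch: with a fixed 60 Hz refresh, a finished frame must
# wait for the next scheduled refresh tick, while an adaptive-sync panel
# refreshes the moment the frame is ready (within its supported range).
import math

REFRESH_HZ = 60
INTERVAL = 1.0 / REFRESH_HZ            # fixed refresh period, ~16.7 ms
VRR_MIN, VRR_MAX = 1 / 144, 1 / 30     # hypothetical 30-144 Hz panel range

def fixed_display_time(frame_ready):
    """Frame is shown at the next fixed refresh tick after it is ready."""
    return math.ceil(frame_ready / INTERVAL) * INTERVAL

def adaptive_display_time(frame_ready, last_refresh):
    """Panel refreshes when the frame is ready, clamped to its VRR range."""
    elapsed = frame_ready - last_refresh
    return last_refresh + min(max(elapsed, VRR_MIN), VRR_MAX)

# A frame that takes 20 ms (50 fps) misses the 16.7 ms tick and is held
# until 33.3 ms on a fixed display, but is shown at 20 ms with VRR.
print(round(fixed_display_time(0.020) * 1000, 1))          # 33.3
print(round(adaptive_display_time(0.020, 0.0) * 1000, 1))  # 20.0
```

The held frame on the fixed display is the stutter (or, without vsync, the tearing) that both G-SYNC and Adaptive-Sync set out to eliminate.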

When asked by Chinese publication Expreview whether NVIDIA GPUs will support VESA Adaptive-Sync, the company said it wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, not DisplayPort 1.2a, which VESA's new technology requires. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so NVIDIA isn't slow to adopt new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive-Sync?
Source: Expreview

114 Comments on NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties

#101
Naito
HumanSmoke: Everyone (should) buy on features, reliability, and performance; group all those everyones together and you have market share.
Marketing may muddle that a fair bit for the technologically un-savvy.
#102
HumanSmoke
Naito: Marketing may muddle that a fair bit for the technologically un-savvy.
Yep, and that also plays as a subset of OEM sales. OEMs are by far the largest market for discrete graphics, so their marketing, pricing, competition, and whatever features they throw into the mix (freebies, financing, add-ons, discounts) have a large part to play. I know of a few people who won't stray from the old guard such as Hewlett-Packard, even though their telephone support rivals Guantanamo Bay and American Idol in the pantheon of "cruel and unusual punishment".
#103
bwat47
astrix_au: Yeah, Mantle runs awesome on the 290X; other cards are a different story, especially older GCN cards. I almost bought two GTX 780 Tis, but I liked the idea of a low-level API. I'm thinking of possibly going to Nvidia one day, possibly 2x 980 Tis; they seem well priced.
I have an older GCN 1.0 card (AMD 280X), and BF4 with Mantle runs absolutely butter-smooth on it.
#104
arbiter
Strider: PhysX is a perfect example: the engine can run on any hardware; the only reason it won't run on AMD or any other GPU is because it's locked, not because of any hardware limitation. This is something Nvidia did shortly after buying the engine from Ageia. In fact, you used to be able to run PhysX on an ATI GPU via a modified driver. However, Nvidia went to great lengths to prevent this, and now if you want to run PhysX on anything but a pure Nvidia system, you need a hybrid AMD/Nvidia setup and a modified driver. The only reason this is not blocked yet is because it's a software-level block and there is little Nvidia can do to stop it.
Um, Nvidia was WILLING to license PhysX to AMD, but they refused to license it, so that puts it on AMD.
#105
HumanSmoke
arbiter: Um, Nvidia was WILLING to license PhysX to AMD, but they refused to license it, so that puts it on AMD.
Correct. Nvidia made a number of overtures to AMD, including what amounted to a "come and get it" offer that was picked up by most of the tech sites at the time:
Jason Paul, GeForce Product Manager: We are open to licensing PhysX, and have done so on a variety of platforms (PS3, Xbox, Nintendo Wii, and iPhone to name a few). We would be willing to work with AMD, if they approached us. We can’t really give PhysX away for “free” for the same reason why a Havok license or x86 license isn’t free—the technology is very costly to develop and support. In short, we are open to licensing PhysX to any company who approaches us with a serious proposal.
Then of course there was the Radeon PhysX episode that AMD fanboy revisionists conveniently sweep under the carpet. You can understand AMD not wanting to be at the mercy of Nvidia-owned tech going forward (just as Nvidia wouldn't support Mantle while AMD is in sole charge of its direction), but certain people seem intent on rewriting history to fit their own fairy-tale ideal of good and evil.
#106
arbiter
HumanSmoke: Correct. Nvidia made a number of overtures to AMD, including what amounted to a "come and get it" offer that was picked up by most of the tech sites at the time.

Then of course there was the Radeon PhysX episode that AMD fanboy revisionists conveniently sweep under the carpet. You can understand AMD not wanting to be at the mercy of Nvidia-owned tech going forward (just as Nvidia wouldn't support Mantle while AMD is in sole charge of its direction), but certain people seem intent on rewriting history to fit their own fairy-tale ideal of good and evil.
AMD wants it for free so they get free R&D from Nvidia. I don't know what world AMD lives in to think Nvidia is gonna foot the bill for all the R&D work while AMD gets to use it for free and make a profit from it.

That's probably the reason AMD pushed for Adaptive Sync as a standard: a cheap, easy way to get their tech into monitors. If they tried going about it Nvidia's way, they don't have the weight with most companies to pull it off.
#107
HisDivineOrder
I wouldn't worry about nVidia supporting or not supporting a technology that so far has no monitors even announced.

Perhaps when a monitor capable of using the new VESA spec actually is announced then nVidia can test to see if they can support it with a patch or an update to their cards or if they need a new line of cards to support it.

As it is right now, they'd be guessing since there are only Gsync capable monitors on the market.

If I read that quote exactly the way it's stated, I'm reading it as the guy saying, "(Today) we're focusing only on gsync (because there aren't any Freesync/Adaptive Sync (VESA's)-capable monitors out right now)."

And until they're even announced, it's going to be hard for nVidia to fully test said monitors out to be sure they can guarantee compatibility. Meanwhile, you want them to, what? Take sales away from Gsync because of monitors that might be available in six months, but probably will be out in 12?

Heh.
#108
arbiter
HisDivineOrder: I wouldn't worry about nVidia supporting or not supporting a technology that so far has no monitors even announced.

Perhaps when a monitor capable of using the new VESA spec actually is announced then nVidia can test to see if they can support it with a patch or an update to their cards or if they need a new line of cards to support it.
Yeah, it looks like no monitors using it will be out until mid-to-late Q1, since the chips won't ship until the end of the year. It does seem to give a certain group of people something to complain about when in reality there is nothing out there that uses it.
#109
Relayer
arbiter: Um, Nvidia was WILLING to license PhysX to AMD, but they refused to license it, so that puts it on AMD.
GPU-accelerated PhysX has zero value to AMD, which is why they won't pay for it. It would be like AMD charging for Mantle and expecting nVidia to buy it, especially when you consider they've said they aren't interested even for free. lol

Likely the last feature that would make me choose a card would be PhysX.
#110
Deadlyraver
NVIDIA is taking advantage of its position; they have the power to do this so long as consumers will be consumers.
#112
arbiter
Fluffmeister: When is Mantle going to be open?
The way it's going? Never. The longer it takes, the fewer game devs will consider it viable.
#113
Relayer
arbiter: The way it's going? Never. The longer it takes, the fewer game devs will consider it viable.
They said the end of the year. What do you mean by "the way it's going"?