Wednesday, September 24th 2014

NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties

NVIDIA's G-SYNC technology is rivaled by AMD's Project FreeSync, which is based on a technology standardized by the Video Electronics Standards Association (VESA) as Adaptive Sync. The technology lets GPUs and monitors keep display refresh rates in sync with GPU frame rates, so the resulting output appears fluid. VESA's technology requires no special hardware inside standards-compliant monitors and is royalty-free, unlike NVIDIA's G-SYNC, which relies on specialized hardware that display makers have to source from NVIDIA, making it a de facto royalty.

When asked by Chinese publication Expreview whether NVIDIA GPUs will support VESA Adaptive Sync, the company said it wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, not DisplayPort 1.2a, which the new VESA technology requires. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so it's not as if NVIDIA is slow to catch up with new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to hold back Adaptive Sync?
Source: Expreview

114 Comments on NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties

#76
HumanSmoke
15th WarlockThe Tegra theory would make sense, except for the fact that in its current iteration Nvidia is using an FPGA chip and programming it with proprietary software to enable G-Sync.
Much cheaper than Tegra then. Seems to add to the weight of evidence that G-Sync, if not paying its way directly for Nvidia, is at least recouping its development costs via a combination of sales, marketing, and brand differentiation.
Posted on Reply
#77
arbiter
HumanSmokeMuch cheaper than Tegra then. Seems to add to the weight of evidence that G-Sync, if not paying its way directly for Nvidia, is at least recouping its development costs via a combination of sales, marketing, and brand differentiation.
New tech is always expensive; when there are, say, half a dozen different monitors on the market with G-Sync, costs will come down.
Posted on Reply
#78
Strider
I am sure this has been said somewhere in all of the comments, but what's one more time. lol

Adaptive Sync has been around far longer than G-Sync, just in laptops, and now that it's a DisplayPort 1.2a+ standard, it will be open to ALL desktop hardware.

Nvidia had no reason to create G-Sync; they could have done exactly what AMD did and pushed for Adaptive Sync, but they chose to create a completely separate proprietary technology. At a hardware level, G-Sync has no real advantages over Adaptive Sync or FreeSync.

Nvidia only did what they did because they are like Apple: they go to great lengths to maintain a closed ecosystem and are dead set against open anything in many aspects of their business. It's just how they operate.

PhysX is a perfect example: the engine can run on any hardware, and the only reason it won't run on AMD or any other GPU is because it's locked, not because of any hardware limitation. This is something Nvidia did shortly after buying the engine from Ageia. In fact, you used to be able to run PhysX on an ATI GPU via a modified driver. However, Nvidia went to great lengths to prevent this, and now if you want to run PhysX on anything but a pure Nvidia system, you need a hybrid AMD/Nvidia setup and a modified driver. The only reason this is not blocked yet is because it's a software-level block and there is little Nvidia can do to stop it.

The thing is, there is no point; by locking down PhysX, Nvidia has come really close to killing it. The number of games that use it at the GPU level is minuscule, and dropping rapidly, compared to Havok or engine-specific physics, both of which can do anything PhysX can and are not hardware locked or limited.

More recently, Nvidia has gone so far as to lock the libraries used with GameWorks, actually hindering the performance of non-Nvidia GPUs.

I am not trying to come off as a hater, or fanboy, just pointing out facts.

In my opinion, if this is true, it's a bad move for Nvidia. Hardware and software are moving more toward open standards, and Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing. In the end, this will only hurt Nvidia's business. There will be little to no reason to buy G-Sync over an adaptive sync capable display. There will be fewer displays and models supporting G-Sync than Adaptive Sync, since the latter is a standard and G-Sync is not. The G-Sync displays will likely cost more, since the hardware is proprietary, and you will get no real added benefit other than the opportunity to wear your green team tag with pride.

=]
Posted on Reply
#79
Naito
I still see G-Sync kicking around for a while. Why? Because of a little feature that now seems to be exclusively paired with G-Sync, called ULMB (Ultra Low Motion Blur). It might not be long before Nvidia finds a way to have both technologies enabled at once (or at least to a degree). This will give an obvious advantage over Adaptive Sync, especially when high FPS rates are key. Adaptive Sync/G-Sync are pretty much useless at high FPS, so in a game like CS, hardcore gamers will probably go for a G-Sync monitor that comes with ULMB (basically the LightBoost trick of yesteryear) to get the competitive edge. As far as I know, except for a one-off Samsung feature (which wasn't as good as the LightBoost hack), there are no other competing technologies to LightBoost/ULMB in the PC market.

So to sum up, if you want something like LightBoost or ULMB in the future, you'll most likely have to buy a G-Sync monitor, as I'm sure ULMB will remain an exclusive feature.
Posted on Reply
#80
Relayer
Solidstate89So is nVidia just not going to support newer versions of DP? 1.3 is already available, and with 4K becoming more and more the norm, they can't hope to just not update their DP standard in the hopes of not supporting VESA Adaptive Sync.

They're going to have to support it eventually, whether they like it or not.
Adaptive Sync is an optional feature. Even then, Adaptive Sync is not FreeSync. FreeSync is AMD's way of offering real-time dynamic refresh rates (syncing the monitor's refresh rate with the card's output), and it uses the capabilities of Adaptive Sync to accomplish it. nVidia can use other features of Adaptive Sync (lowering the refresh rate to fixed values for video playback, for example) without offering the real-time refresh rate adjustment that would interfere with G-Sync.
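
For illustration only, here's a rough sketch of that video playback idea (my own example with made-up numbers and function names, not anything from the VESA spec): the card just picks a fixed refresh rate inside the panel's supported range that is an integer multiple of the video frame rate, so 24 fps film gets a clean 2:2 cadence instead of 3:2 judder at 60 Hz.

def pick_video_refresh(video_fps, panel_min_hz=30, panel_max_hz=75):
    """Return the lowest refresh rate within the panel's range that is an
    integer multiple of the video frame rate, or None if none fits."""
    candidate = video_fps
    while candidate <= panel_max_hz:
        if candidate >= panel_min_hz:
            return candidate
        candidate += video_fps
    return None

print(pick_video_refresh(23.976))  # ~47.95 Hz: each film frame shown exactly twice
print(pick_video_refresh(30))      # 30 Hz: one refresh per video frame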

Another possibility is they are going to offer it, but are just not saying so because they don't want to hurt current Gsync sales.
Posted on Reply
#81
HumanSmoke
StriderI am not trying to come off as a hater, or fanboy, just pointing out facts.
If that's the case, you're doing a piss poor job
Striderand Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing.
Meanwhile in the real world....Nvidia holds 62% of the discrete graphics market. These are facts - even when AMD/ATI have had a dominant product they haven't translated that into market share.

StriderPhysX is a perfect example: the engine can run on any hardware, and the only reason it won't run on AMD or any other GPU is because it's locked, not because of any hardware limitation. This is something Nvidia did shortly after buying the engine from Ageia.
That's right - shortly after Nvidia PAID $150 million for Ageia, which was shortly after AMD turned down the opportunity to buy the same company.
Wanting PhysX but not wanting to pay for it...well, that's like waiting for your neighbour to buy a lawnmower rather than buy one yourself, then turning up on his doorstep to borrow it....and expecting him to provide the gas for it.
StriderThere will be little to no reason to buy G-Sync over an adaptive sync capable display.
You mean aside from the fact that you can't buy an Adaptive-Sync monitor at the moment?
StriderThere will be fewer displays and models supporting G-Sync than Adaptive Sync, since the latter is a standard and G-Sync is not.
And Nvidia will most likely adopt Adaptive-Sync once it does become mainstream. At the moment Adaptive Sync is a specification - Nvidia makes no money off a VESA specification; it does, however, derive benefit from current G-Sync sales.
Posted on Reply
#82
Ferrum Master
HumanSmokeNvidia, is recouping its development costs via a combination of sales, marketing, and brand differentiation.
I guess you are still bad at math. This product can be profitable only if it sells many hundreds of thousands...

Tegra 4 caused over $400 million in operating losses in the Tegra division over two years, as the same R&D team is incapable of efficiency. I cannot understand what kind of numbers roll around in your head, but pulling off any kind of beta silicon, ironing it out, and feeding the binary blob coders and advertising monkeys costs millions...

It won't be profitable ever... Snake oil.
Posted on Reply
#84
HumanSmoke
Ferrum MasterI guess you are still bad at math. This product can be profitable only if it sells many hundreds of thousands...
And I guess that you don't understand how a business built around a brand works. There is more to profit than sales of individual SKUs. You think AMD launched the 295X2 to deliberately lose money (I'm pretty sure it won't sell in the "hundreds of thousands")?
Most other people who understand how the business actually works realise that halo and peripheral products are tools to enable further sales of mainstream products. Know what else doesn't turn a monetary profit? Gaming development SDKs, software utilities, Mantle(piece), PhysX, NVAPI, and most limited edition hardware. The profit comes through furtherance of the brand.
Why else do you think AMD poured R&D into their gaming program, Mantle(piece), and resources to bring analogues of ShadowPlay and GeForce Experience into being?

For a self-professed business genius, you don't seem very astute in the strategy of marketing and selling a brand.
Posted on Reply
#85
Ferrum Master
HumanSmokeAnd I guess that you don't understand how a business built around a brand works. There is more to profit than sales of individual SKUs. You think AMD launched the 295X2 to deliberately lose money
Apples and oranges; a dual card costs nothing silicon-wise, just a new PCB. It's not building a whole new ASIC.

Mantle is just a natural side product of xbone/PS4 SDK development. It also doesn't require new ASICs...

Boney... You act like a cheap car salesman...
Posted on Reply
#86
RCoon
HumanSmokeMantle(piece)
I think I may need to patent that after mentioning it in the gpu release thread. Sounds like it's catching on.
Posted on Reply
#87
HumanSmoke
Ferrum MasterApples and oranges; a dual card costs nothing silicon-wise, just a new PCB. It's not building a whole new ASIC.
Yeah right, an AIO costs nothing! 16 layer PCB costs nothing! PLX chip costs nothing!
Ferrum MasterMantle is just a natural side product of xbone/PS4 SDK development. It also doesn't require new ASICs...
So Mantle hasn't cost AMD anything? What about the game bundles AMD give away with their cards? AMD don't turn a profit on them. The $5-8 million they gave EA/DICE for BF4 sponsorship? AMD don't sell BF4, they give it away - IT COSTS THEM - why? Because it furthers the brand to have AAA titles associated with the company's graphics.

Could you make your argument any weaker? (Don't answer that as I'm sure you'll outdo yourself ;))
RCoonI think I may need to patent that after mentioning it in the gpu release thread. Sounds like it's catching on.
Just doing my bit for the proud tradition of internet speculation and my love of the running gag - also an appreciative nod toward your theory on AMD's Forward Thinking™ Future
Posted on Reply
#88
Relayer
xenocideFactoring in how G-Sync works, I refuse to believe FreeSync will offer a comparable experience.
How G-Sync works is irrelevant. How FreeSync works is all that matters.

In theory, it's a really simple process. The card polls the monitor to find out what its refresh rate range is. Say it's 30-60 Hz. Any frame that falls within that range is released immediately and the screen is instructed to refresh with it. If the card is going too slow, it will resend the previous frame. Too fast, and it will buffer the frame until the monitor is ready.
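
In rough pseudo-Python, that decision would look something like this (just my own sketch with made-up numbers and names, not how any real driver is written):

MIN_HZ, MAX_HZ = 30, 60            # refresh range reported by the monitor
MIN_FRAME = 1.0 / MAX_HZ           # fastest allowed interval, ~16.7 ms
MAX_FRAME = 1.0 / MIN_HZ           # slowest allowed interval, ~33.3 ms

def pace(elapsed_since_refresh, new_frame_ready):
    """Decide what gets scanned out right now."""
    if elapsed_since_refresh >= MAX_FRAME:
        return "resend previous frame"      # card too slow: panel must refresh anyway
    if new_frame_ready and elapsed_since_refresh >= MIN_FRAME:
        return "refresh with new frame"     # inside the range: release immediately
    if new_frame_ready:
        return "buffer new frame"           # card too fast: hold until panel is ready
    return "wait"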

With that said, we'll have to wait until samples are available for testing before we know for sure. It could be worse, or it could be better.
arbiterNew tech is always expensive; when there are, say, half a dozen different monitors on the market with G-Sync, costs will come down.
When there is competition in the marketplace, the price will come down. As long as nVidia is the only one producing the needed hardware, they have no reason to lower prices just because they are selling to multiple OEMs.
Posted on Reply
#89
Ferrum Master
HumanSmokeYeah right, an AIO costs nothing! 16 layer PCB costs nothing! PLX chip costs nothing!

So Mantle hasn't cost AMD anything? What about the game bundles AMD give away with their car

Could you make your argument any weaker?
Neither the PCB nor the PLX chip really costs them. First of all, the bridge ain't theirs. The PCB is usually already drawn for the mobile solution and you just have to stack them together; the VRM part is also designed from the manufacturer's reference examples... It ain't a new ASIC...

The water cooling solution is also not designed from zero; they use Asetek, and again it's not a new ASIC built from scratch.

Okay, the Battlefield PR... I see you have gone even further into cheap car sales. And how is that connected with G-Sync? They both invest in game companies.

How is pulling your nose into AMD's PR business connected to G-Sync R&D costs and the profitability of this program...
Boney you left your pelvic bone somewhere...
Posted on Reply
#90
Naito
astrix_auI read somewhere that you can't use both at the same time. Here are a couple links I just found with those statements.

www.blurbusters.com/lightboost-sequel-ultra-low-motion-blur-ulmb/

hardforum.com/showthread.php?t=1812458

I don't know if they changed that yet though.
Yeah, they can't work in tandem (but maybe in the future?), but currently one only exists where the other does; I have not seen a monitor that supports only G-Sync without ULMB, or vice versa.
Posted on Reply
#91
Relayer
HumanSmokeIf that's the case, you're doing a piss poor job

Meanwhile in the real world....Nvidia holds 62% of the discrete graphics market. These are facts - even when AMD/ATI have had a dominant product they haven't translated that into market share.

That's right - shortly after Nvidia PAID $150 million for Ageia, which was shortly after AMD turned down the opportunity to buy the same company.
Wanting PhysX but not wanting to pay for it...well, that's like waiting for your neighbour to buy a lawnmower rather than buy one yourself, then turning up on his doorstep to borrow it....and expecting him to provide the gas for it.

You mean aside from the fact that you can't buy an Adaptive-Sync monitor at the moment?

And Nvidia will most likely adopt Adaptive-Sync once it does become mainstream. At the moment Adaptive Sync is a specification - Nvidia makes no money off a VESA specification; it does, however, derive benefit from current G-Sync sales.
Why would anyone care about nVidia, their market share, financials, etc? Unless you work for them, or own stock. I sure don't choose my hardware by looking at what everyone else buys or who's going to make the highest profit on my purchase. I buy by price, features, reliability, and performance. If Matrox offered a card that was better suited for my needs, I'd buy it. I wouldn't care that they only have a very tiny part of the market. I'd be happy that I could find a product that catered to my needs.
Posted on Reply
#92
Naito
RelayerWhy would anyone care about nVidia, their market share, financials, etc? Unless you work for them, or own stock. I sure don't choose my hardware by looking at what everyone else buys or who's going to make the highest profit on my purchase. I buy by price, features, reliability, and performance. If Matrox offered a card that was better suited for my needs, I'd buy it. I wouldn't care that they only have a very tiny part of the market. I'd be happy that I could find a product that catered to my needs.
That wasn't the original argument. It was in reply to:
Striderand Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing
HumanSmoke was just giving the actual statistics to prove that statement wrong.
Posted on Reply
#93
Relayer
NaitoThat wasn't the original argument. It was in reply to:

HumanSmoke was just giving the actual statistics to prove that statement wrong.
38% is a large share, and that's just discrete. Add in APUs and the consoles, and AMD does have major influence.
Posted on Reply
#94
Naito
Relayer38% is a large share, and that's just discrete. Add in APUs and the consoles, and AMD does have major influence.
Consoles don't influence the PC market that much. And if you factor in APUs, you'll then have to bring Intel IGPs into the equation.

Besides, the statement again says "Nvidia no longer rules the discrete GPU world", not anything about other markets.
Posted on Reply
#95
HumanSmoke
Ferrum MasterNeither the PCB nor the PLX chip really costs them. First of all, the bridge ain't theirs. The PCB is usually already drawn for the mobile solution and you just have to stack them together; the VRM part is also designed from the manufacturer's reference examples... It ain't a new ASIC...
Design costs money. Testing costs money. Validation costs money. The AIO costs money whether it is already designed or not (you think Asetek said to AMD, "Hey, we made this from one of our existing products, so you can have it for free"? :shadedshu: ). The PCB cost is higher and it still needs to be laid out - which costs money.
Ferrum MasterThe water cooling solution is also not designed from zero; they use Asetek, and again it's not a new ASIC built from scratch.
Asetek have a contract to SELL AIOs to AMD, not give them away to the needy. The AIO adds to the bill of materials (BoM), as do the PCB and the PLX chip.
Ferrum MasterOkay the battlefield PR... I see you have gone even further in cheap car sales. And how that is connected with gsync? They both invest into game companies.
Which goes back exactly to what I was saying about profit not being inextricably linked to the direct sale of any particular part.
HumanSmokeObviously the company isn't losing out or they wouldn't be making it (or they assume the expenditure is worth it to the company in other terms) and OEM/ODM's wouldn't be using the module.
Ferrum MasterHow is pulling your nose into AMD's PR business connected to G-Sync R&D costs and the profitability of this program...
Pulling my nose??? Also, I never mentioned G-Sync's R&D costs, other than the fact that G-Sync is a means to elevate, publicize, and market the Nvidia brand. Just as allying with a AAA game title does exactly the same thing, just as producing a halo hardware part does exactly the same thing, just as investing money and resources into a software ecosystem to support the hardware does exactly the same thing.

This is obviously beyond you, or you just fail to see how a business uses brand strategy to maintain and build market share. It's not rocket science. Please feel free to do whatever you're doing, but your audience just decreased by at least one.
Ferrum MasterBoney you left your pelvic bone somewhere...
Is that another weird Latvian saying, like "pulling my nose"?
Posted on Reply
#96
NC37
ZOMG! nVidia not supporting new standards that are clearly better but for some reason they just don't want to... that's... that's... not news one bit. Moving along, nothing to see here.
Posted on Reply
#97
Naito
Seems to be a bit too much of ye olde nose pulling going on in this thread.
Posted on Reply
#98
Ferrum Master
Boney, you are talking about BoM costs that the final buyer covers, not the R&D costs...

You are comparing pocket-money costs, like designing an AIO... not making a brand new product...

The saying is just as weird as the way you operate with facts... Just goofing around with numbers like a cheap car salesman...
Posted on Reply
#99
Naito
NC37ZOMG! nVidia not supporting new standards that are clearly better but for some reason they just don't want to... that's... that's... not news one bit. Moving along, nothing to see here.
But we have not yet seen Adaptive Sync/FreeSync in action or available commercially. How can you make such a claim?
Posted on Reply
#100
HumanSmoke
RelayerWhy would anyone care about nVidia, their market share, financials, etc? Unless you work for them, or own stock.
Well, firstly, all I was doing was pointing out an incorrect "statement of fact" from Strider, and secondly...
Because market share has a direct bearing upon revenue, and revenue has a direct bearing on R&D, and R&D has a direct bearing on future products, and future products have a direct bearing upon market share? You see where this is going? You really only have to look at the histories of 3dfx, S3, Rendition, Cirrus Logic, Trident, SGI, Tseng Labs, 3DLabs, and a host of other IHVs to see what happens when the cycle fails to provide uplift.
RelayerI sure don't choose my hardware by looking at what everyone else buys or who's going to make the highest profit on my purchase. I buy by price, features, reliability, and performance. If Matrox offered a card that was better suited for my needs, I'd buy it. I wouldn't care that they only have a very tiny part of the market. I'd be happy that I could find a product that catered to my needs.
Likewise, if enough people think the same way, then their market share increases, they have more funds for development, and they stay competitive. Matrox won't offer you a card better suited to your needs because they failed to evolve as fast as ATI and Nvidia. The G400 was a great series of cards, but the writing was on the wall when they failed to develop a relevant successor. Everyone (should) buy on features, reliability, and performance - group all of those individual buyers together and you have market share.
Posted on Reply