
NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties

Joined
Aug 22, 2007
Messages
3,454 (0.57/day)
Location
CA, US
System Name :)
Processor Intel 13700k
Motherboard Gigabyte z790 UD AC
Cooling Noctua NH-D15
Memory 64GB GSKILL DDR5
Video Card(s) Gigabyte RTX 4090 Gaming OC
Storage 960GB Optane 905P U.2 SSD + 4TB PCIe4 U.2 SSD
Display(s) Alienware AW3423DW 175Hz QD-OLED + Nixeus 27" IPS 1440p 144Hz
Case Fractal Design Torrent
Audio Device(s) MOTU M4 - JBL 305P MKII w/2x JL Audio 10 Sealed --- X-Fi Titanium HD - Presonus Eris E5 - JBL 4412
Power Supply Silverstone 1000W
Mouse Roccat Kain 122 AIMO
Keyboard KBD67 Lite / Mammoth75
VR HMD Reverb G2 V2
Software Win 11 Pro
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
The Tegra theory would make sense, except for the fact that in its current iteration Nvidia is using an FPGA chip and programming it with proprietary software to enable G-Sync.
Much cheaper than Tegra then. Seems to add to the weight of evidence that G-Sync, if not paying its way directly for Nvidia, is recouping its development costs via a combination of sales, marketing, and brand differentiation.
 
Joined
Jun 13, 2012
Messages
1,326 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000MHz
Video Card(s) Asus Dual GeForce RTX 4070 Super (2800MHz @ 1.0V, ~60MHz overclock at -0.1V, 180-190W draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Much cheaper than Tegra then. Seems to add to the weight of evidence that G-Sync, if not paying its way directly for Nvidia, is recouping its development costs via a combination of sales, marketing, and brand differentiation.

New tech is always expensive; once there are, say, half a dozen different G-Sync monitors on the market, costs will come down.
 
Joined
May 23, 2011
Messages
52 (0.01/day)
System Name Strider's Rig
Processor AMD FX-8350 Vishera @ 4.5GHz
Motherboard ASUS Sabertooth 990FX R2.0 Gen 3
Cooling Thermaltake Frio Processor Cooler (Push/Pull)
Memory 16GB A-Data XPG Gaming Series DDR3 1866
Video Card(s) Sapphire R9 290 Tri-X OC Edition
Storage 5 WD Black SATA III / Seagate 600 Series 240GB SSD / 3TB WD My Book Ext/ 1TB Seagate Passport Ext
Display(s) Primary Acer 23 inch HD Widescreen LCD 1920 x 1080 / Secondary HP 1920x1080 "Work" LCD Display
Case Cooler Master HAF 932 Black Steel
Audio Device(s) Sound Blaster Z 5.1 / Steel Series 9H
Power Supply Corsair HX850W Professional Modular 850W PSU
Software Windows 7 Ultimate 64
I am sure this has been said somewhere in all of the comments, but what's one more time. lol

Adaptive Sync has been around far longer than G-Sync, just in laptops, and now that it's part of the DisplayPort 1.2a+ standard, it will be open to ALL desktop hardware.

Nvidia had no reason to create G-Sync; they could have done exactly what AMD did and pushed for adaptive sync, but they chose to create a completely separate proprietary technology. At a hardware level, G-Sync has no real advantages over adaptive sync or FreeSync.

Nvidia only did what they did because they are like Apple: they go to great lengths to maintain a closed ecosystem and are dead set against anything open in many aspects of their business. It's just how they operate.

PhysX is a perfect example: the engine can run on any hardware, and the only reason it won't run on AMD or any other GPU is because it's locked, not because of any hardware limitation. This is something Nvidia did shortly after buying the engine from Ageia. In fact, you used to be able to run PhysX on an ATI GPU via a modified driver. However, Nvidia went to great lengths to prevent this, and now if you want to run PhysX on anything but a pure Nvidia system, you need a hybrid AMD/Nvidia setup and a modified driver. The only reason this is not blocked yet is because it's a software-level block and there is little Nvidia can do to stop it.

The thing is, there is no point: by locking down PhysX, Nvidia has come really close to killing it. The number of games that use it at the GPU level is minuscule, and dropping rapidly, compared to Havok or engine-specific physics, both of which can do anything PhysX can and are not hardware-locked or limited.

More recently, Nvidia has gone so far as to lock the libraries used with GameWorks in ways that actually hinder the performance of non-Nvidia GPUs.

I am not trying to come off as a hater, or fanboy, just pointing out facts.

In my opinion, if this is true, it's a bad move for Nvidia. Hardware and software are moving more toward open standards, and Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing. In the end, this will only hurt Nvidia's business. There will be little to no reason to buy G-Sync over an adaptive sync capable display. There will be fewer displays and models that support G-Sync than adaptive sync, since adaptive sync is a standard and G-Sync is not. The G-Sync displays will likely cost more, since the hardware is proprietary, and you will get no real added benefit other than the opportunity to wear your green team tag with pride.

=]
 
Joined
Oct 10, 2009
Messages
929 (0.18/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
I still see G-Sync kicking around for a while. Why? Because of a little feature that now seems to be exclusively paired with G-Sync, called ULMB (Ultra Low Motion Blur). It might not be long before Nvidia finds a way to have both technologies enabled at once (or at least to a degree). That would give it an obvious advantage over Adaptive Sync, especially when high FPS rates are key. Adaptive Sync/G-Sync are pretty much useless at high FPS rates, so in a game like CS, hardcore gamers will probably go for a G-Sync monitor that comes with ULMB (basically the LightBoost trick of yesteryear) to get the competitive edge. As far as I know, except for a one-off Samsung feature (which wasn't as good as the LightBoost hack), there are no other competing technologies to LightBoost/ULMB in the PC market.

So to sum up, if you want something like LightBoost or ULMB in the future, you'll most likely have to buy a G-Sync monitor as I'm sure ULMB will remain an exclusive feature.
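
For a rough sense of why strobing matters: on a normal sample-and-hold panel, each frame stays lit for the entire refresh interval, and perceived motion blur scales with that visibility time (persistence). A back-of-envelope sketch in Python - the strobe pulse width here is an assumed illustrative number, not a published ULMB spec:

    # Perceived motion blur scales with persistence (how long each frame stays lit).
    def persistence_ms(refresh_hz, strobe_pulse_ms=None):
        frame_ms = 1000.0 / refresh_hz           # full refresh interval
        if strobe_pulse_ms is None:
            return frame_ms                      # sample-and-hold: lit the whole frame
        return min(strobe_pulse_ms, frame_ms)    # strobed: lit only during the pulse

    print(persistence_ms(120))       # ~8.3 ms at 120 Hz sample-and-hold
    print(persistence_ms(120, 1.5))  # 1.5 ms with an assumed 1.5 ms strobe

Roughly a 5x cut in persistence at the same refresh rate, which is why a strobed backlight looks so much sharper in motion.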
 
Joined
Jan 13, 2009
Messages
424 (0.08/day)
So is nVidia just not going to support newer versions of DP? DisplayPort 1.3 is already available, and with 4K becoming more and more the norm, they can't hope to simply freeze their DP support just to avoid VESA Adaptive-Sync.

They're going to have to support it eventually, whether they like it or not.
Adaptive-Sync is an optional feature. Even then, Adaptive-Sync is not FreeSync. FreeSync is AMD's way of offering real-time dynamic refresh rate (syncing the monitor's refresh rate with the card's output); it uses the capabilities of Adaptive-Sync to accomplish it. nVidia can use other features of Adaptive-Sync (lowering the refresh rate to fixed values for video playback, for example) and not offer real-time refresh rate adjustment, so as not to interfere with G-Sync.

Another possibility is they are going to offer it, but are just not saying so because they don't want to hurt current G-Sync sales.
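
As an illustration of that video-playback use, here's a hypothetical Python sketch of picking a fixed refresh rate within a panel's advertised range - the 30-75 Hz range and the snap-to-multiple policy are assumptions for illustration, not anything from an actual driver or the VESA spec:

    # Pick the lowest in-range refresh rate that is an integer multiple of the
    # video frame rate, so each frame is shown a whole number of times (no judder).
    def fixed_refresh_for_video(fps, min_hz=30.0, max_hz=75.0):
        multiple = 1
        while fps * multiple <= max_hz:
            if fps * multiple >= min_hz:
                return fps * multiple
            multiple += 1
        return None  # no clean multiple in range; keep the default refresh rate

    print(fixed_refresh_for_video(23.976))  # 47.952 Hz for film content
    print(fixed_refresh_for_video(30))      # 30 Hz
    print(fixed_refresh_for_video(25))      # 50 Hz for PAL content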
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
I am not trying to come off as a hater, or fanboy, just pointing out facts.
If that's the case, you're doing a piss poor job
and Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing.
Meanwhile in the real world....Nvidia holds 62% of the discrete graphics market. These are facts - even when AMD/ATI have had a dominant product they haven't translated that into market share.



PhysX is a perfect example: the engine can run on any hardware, and the only reason it won't run on AMD or any other GPU is because it's locked, not because of any hardware limitation. This is something Nvidia did shortly after buying the engine from Ageia.
That's right - shortly after Nvidia PAID $150 million for Ageia, which was shortly after AMD turned down the opportunity to buy the same company.
Wanting PhysX but not wanting to pay for it...well, that's like waiting for your neighbour to buy a lawnmower rather than buy one yourself, then turning up on his doorstep to borrow it....and expecting him to provide the gas for it.
There will be little to no reason to buy G-Sync over an adaptive sync capable display.
You mean aside from the fact that you can't buy an Adaptive-Sync monitor at the moment?
There will be fewer displays and models that support G-Sync than adaptive sync, since adaptive sync is a standard and G-Sync is not.
And Nvidia will most likely adopt Adaptive-Sync once it does become mainstream. At the moment Adaptive-Sync is a specification - Nvidia makes no money off a VESA specification; it does, however, derive benefit from current G-Sync sales.
 
Last edited:
Joined
Nov 18, 2010
Messages
7,124 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
Nvidia, is recouping its development costs via a combination of sales, marketing, and brand differentiation.

I guess you are still bad at math. This product can be profitable only if it sells in the many hundreds of thousands...

Tegra 4 caused over $400 million in operating losses in the Tegra division over two years, as the same R&D team is incapable of efficiency. I cannot understand what kind of numbers roll around in your head, but to pull off any kind of beta silicon, iron it out, and feed the binary-blob coders and advertising monkeys costs millions...

It won't be profitable ever... Snake oil.
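
Run the numbers yourself; here's a break-even sketch with entirely made-up figures (none of these are actual Nvidia numbers):

    # Break-even volume = one-off development cost / contribution margin per module.
    rnd_cost = 10_000_000   # assumed one-off R&D outlay, USD
    module_price = 100      # assumed price of a G-Sync module to monitor OEMs
    unit_cost = 60          # assumed FPGA + board cost per module

    print(round(rnd_cost / (module_price - unit_cost)))  # 250000 modules to break even

Even with those generous assumptions you need a quarter of a million modules just to cover development, and premium gaming monitors don't sell in those volumes.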
 
Joined
Jul 16, 2014
Messages
26 (0.01/day)
Location
Sydney, Australia
System Name Da Swift
Processor Intel i7 4770K
Motherboard ASUS Maximus VI Formula
Cooling Corsair H80i
Memory Kingston 16GB 1600Mhz
Video Card(s) 2x MSI R9 290X Gaming 4G OC (Crossfire)
Storage Samsung 840 EVO 500 GB (OS) Rapid, Samsung 840 EVO 256GB
Display(s) ASUS Swift PG278Q
Case Antec 900 II V3
Audio Device(s) Astro A50 Headset, Logitech Speakers
Power Supply Seasonic 1000W 80 Plus Platinum
Software Windows 8.1 64bit (Classic Shell - Windows 7 Theme)
Benchmark Scores 3DMark Fire Strike - 16353 http://www.3dmark.com/fs/2911875
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
I guess you are still bad at math. This product can be profitable only if it sells in the many hundreds of thousands...
And I guess that you don't understand how a business built around a brand works. There is more to profit than sales of individual SKUs. You think AMD launched the 295X2 to deliberately lose money (I'm pretty sure it won't sell in the "hundreds of thousands")?
Most other people who understand how the business actually works realise that halo and peripheral products are tools to enable further sales of mainstream products. Know what else doesn't turn a monetary profit? Gaming development SDKs, software utilities, Mantle(piece), PhysX, NVAPI, and most limited edition hardware. The profit comes through furtherance of the brand.
Why else do you think AMD poured R&D into their gaming program, Mantle(piece), and resources to bring analogues of ShadowPlay and GeForce Experience into being?

For a self-professed business genius, you don't seem very astute in the strategy of marketing and selling a brand.
 
Joined
Nov 18, 2010
Messages
7,124 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
And I guess that you don't understand how a business built around a brand works. There is more to profit than sales of individual SKUs. You think AMD launched the 295X2 to deliberately lose money

Apples and oranges: a dual card costs nothing silicon-wise, just a new PCB. It's not building a whole new ASIC.

Mantle is just a natural side product of Xbone/PS4 SDK development. It also does not require new ASICs...

Boney... You act like a cheap car salesman...
 
Joined
Apr 19, 2012
Messages
12,062 (2.75/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Apples and oranges: a dual card costs nothing silicon-wise, just a new PCB. It's not building a whole new ASIC.
Yeah right, an AIO costs nothing! 16 layer PCB costs nothing! PLX chip costs nothing!
Mantle is just a natural side product of Xbone/PS4 SDK development. It also does not require new ASICs...
So Mantle hasn't cost AMD anything? What about the game bundles AMD give away with their cards? AMD don't turn a profit on them. The $5-8 million they gave EA/DICE for BF4 sponsorship? AMD don't sell BF4, they give it away - IT COSTS THEM. Why? Because it furthers the brand to have AAA titles associated with the company's graphics.

Could you make your argument any weaker? (Don't answer that as I'm sure you'll outdo yourself ;))

I think I may need to patent that after mentioning it in the GPU release thread. Sounds like it's catching on.

Just doing my bit for the proud tradition of internet speculation and my love of the running gag - also an appreciative nod toward your theory on AMD's Forward Thinking™ Future
 
Last edited:
Joined
Jan 13, 2009
Messages
424 (0.08/day)
Factoring in how G-Sync works, I refuse to believe FreeSync will offer a comparable experience.
How Gsync works is irrelevant. How FreeSync works is all that matters.

In theory, it's a really simple process. The card polls the monitor to find out what its refresh rate range is. Say it's 30-60 Hz. Any frame that falls within that range is released immediately, and the screen is instructed to refresh with it. If the card is running too slow, it will resend the previous frame. Too fast, and it will buffer the frame until the monitor is ready.

With that said we'll have to wait until samples are available for testing before we know for sure. It could be worse, or it could be better.
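
A minimal sketch of that pacing logic in Python, purely to illustrate the description above (the 30-60 Hz range is the hypothetical one from my example; none of this comes from a real driver):

    # Hypothetical VRR pacing: the card polls the panel's refresh range,
    # then decides per frame whether to refresh now, repeat, or hold.
    MIN_HZ, MAX_HZ = 30, 60
    MAX_INTERVAL = 1.0 / MIN_HZ   # frame later than this -> resend previous frame
    MIN_INTERVAL = 1.0 / MAX_HZ   # frame sooner than this -> buffer it briefly

    def pace_frame(seconds_since_last_refresh):
        if seconds_since_last_refresh > MAX_INTERVAL:
            return "resend previous frame"          # card too slow
        if seconds_since_last_refresh < MIN_INTERVAL:
            return "buffer until monitor is ready"  # card too fast
        return "refresh immediately"                # within the panel's range

    print(pace_frame(1 / 50))   # 50 fps -> refresh immediately
    print(pace_frame(1 / 25))   # 25 fps -> resend previous frame
    print(pace_frame(1 / 100))  # 100 fps -> buffer until monitor is ready

(A real driver would repeat the old frame as soon as the deadline passes rather than waiting for the new one, but as a decision table this is the idea.)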

New tech is always expensive; once there are, say, half a dozen different G-Sync monitors on the market, costs will come down.

When there is competition in the marketplace, the price will come down. As long as nVidia is the only one producing the needed hardware, they have no reason to lower prices, because every OEM has to buy from them.
 
Joined
Nov 18, 2010
Messages
7,124 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
Yeah right, an AIO costs nothing! 16 layer PCB costs nothing! PLX chip costs nothing!

So Mantle hasn't cost AMD anything? What about the game bundles AMD give away with their cards?

Could you make your argument any weaker?

Neither the PCB nor the PLX chip really costs. First of all, the bridge ain't theirs. The PCB is usually already drawn for the mobile solution and you just have to stack them together; the VRM part is also designed from the manufacturer's reference examples... It ain't a new ASIC...

The water cooling solution is also not designed from zero; they use Asetek, and again it's not a new ASIC built from scratch.

Okay, the Battlefield PR... I see you have gone even further into cheap car sales. And how is that connected with G-Sync? They both invest in game companies.

How is pulling your nose into AMD's PR business connected to G-Sync R&D costs and the profitability of this program...
Boney you left your pelvic bone somewhere...
 
Last edited:
Joined
Oct 10, 2009
Messages
929 (0.18/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
I read somewhere that you can't use both at the same time. Here are a couple of links I just found with those statements.

http://www.blurbusters.com/lightboost-sequel-ultra-low-motion-blur-ulmb/

http://hardforum.com/showthread.php?t=1812458

I don't know if they changed that yet though.

Yeah, they can't work in tandem (though that may change in future), but currently one only exists where the other does: I have not seen a monitor that supports G-Sync without ULMB, or vice versa.
 
Joined
Jan 13, 2009
Messages
424 (0.08/day)
If that's the case, you're doing a piss poor job

Meanwhile in the real world....Nvidia holds 62% of the discrete graphics market. These are facts - even when AMD/ATI have had a dominant product they haven't translated that into market share.




That's right - shortly after Nvidia PAID $150 million for Ageia, which was shortly after AMD turned down the opportunity to buy the same company.
Wanting PhysX but not wanting to pay for it...well, that's like waiting for your neighbour to buy a lawnmower rather than buy one yourself, then turning up on his doorstep to borrow it....and expecting him to provide the gas for it.

You mean aside from the fact that you can't buy an Adaptive-Sync monitor at the moment?

And Nvidia will most likely adopt Adaptive-Sync once it does become mainstream. At the moment Adaptive-Sync is a specification - Nvidia makes no money off a VESA specification; it does, however, derive benefit from current G-Sync sales.
Why would anyone care about nVidia, their market share, financials, etc., unless you work for them or own stock? I sure don't choose my hardware by looking at what everyone else buys or who's going to make the highest profit on my purchase. I buy by price, features, reliability, and performance. If Matrox offered a card that was better suited to my needs, I'd buy it. I wouldn't care that they only have a very tiny part of the market. I'd be happy that I could find a product that catered to my needs.
 
Joined
Oct 10, 2009
Messages
929 (0.18/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
Why would anyone care about nVidia, their market share, financials, etc., unless you work for them or own stock? I sure don't choose my hardware by looking at what everyone else buys or who's going to make the highest profit on my purchase. I buy by price, features, reliability, and performance. If Matrox offered a card that was better suited to my needs, I'd buy it. I wouldn't care that they only have a very tiny part of the market. I'd be happy that I could find a product that catered to my needs.

That wasn't the original argument. It was in reply to:

and Nvidia no longer rules the discrete GPU world, AMD has a very large share and it's showing no signs of slowing

HumanSmoke was just giving the actual statistics to prove that statement wrong.
 
Joined
Jan 13, 2009
Messages
424 (0.08/day)
That wasn't the original argument. It was in reply to:



HumanSmoke was just giving the actual statistics to prove that statement wrong.
38% is a large share, and that's just discrete. Add in APUs and the consoles and AMD does have major influence.
 
Joined
Oct 10, 2009
Messages
929 (0.18/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
38% is a large share, and that's just discrete. Add in APUs and the consoles and AMD does have major influence.

Consoles don't influence the PC market that much. And if you factor in APUs, you'll then have to bring Intel IGPs into the equation.

Besides the statement again, says "Nvidia no longer rules the discrete GPU world", not other markets.
 
Last edited:
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Neither the PCB nor the PLX chip really costs. First of all, the bridge ain't theirs. The PCB is usually already drawn for the mobile solution and you just have to stack them together; the VRM part is also designed from the manufacturer's reference examples... It ain't a new ASIC...
Design costs money. Testing costs money. Validation costs money. The AIO costs money whether it is already designed or not (you think Asetek said to AMD, "Hey, we made this from one of our existing products, so you can have it for free"? :shadedshu: ). The PCB cost is higher and it still needs to be laid out - which costs money.
The water cooling solution is also not designed from zero; they use Asetek, and again it's not a new ASIC built from scratch.
Asetek have a contract to SELL AIOs to AMD, not give them away to the needy. The AIO adds to the bill of materials (BoM), as do the PCB and the PLX chip.
Okay, the Battlefield PR... I see you have gone even further into cheap car sales. And how is that connected with G-Sync? They both invest in game companies.
Which goes back exactly to what I was saying about profit not being inextricably linked to the direct sale of any particular part.
Obviously the company isn't losing out or they wouldn't be making it (or they assume the expenditure is worth it to the company in other terms), and OEM/ODMs wouldn't be using the module.
How is pulling your nose into AMD's PR business connected to G-Sync R&D costs and the profitability of this program...
Pulling my nose??? Also, I never mentioned G-Sync's R&D costs other than the fact that G-Sync is a means to elevate, publicize, and market the Nvidia brand. Just as allying with a AAA game title does exactly the same thing, just as producing a halo hardware part does exactly the same thing, just as investing money and resources into a software ecosystem to support the hardware does exactly the same thing.

This is obviously beyond you, or you just fail to see how business uses brand strategy to maintain and build market share. It's not rocket science. Please feel free to keep doing whatever you're doing, but your audience just decreased by at least one.
Boney you left your pelvic bone somewhere...
Is that another weird Latvian saying, like "pulling my nose"?
 
Joined
Oct 30, 2008
Messages
1,758 (0.31/day)
System Name Lailalo
Processor Ryzen 9 5900X Boosts to 4.95Ghz
Motherboard Asus TUF Gaming X570-Plus (WIFI
Cooling Noctua
Memory 32GB DDR4 3200 Corsair Vengeance
Video Card(s) XFX 7900XT 20GB
Storage Samsung 970 Pro Plus 1TB, Crucial 1TB MX500 SSD, Segate 3TB
Display(s) LG Ultrawide 29in @ 2560x1080
Case Coolermaster Storm Sniper
Power Supply XPG 1000W
Mouse G602
Keyboard G510s
Software Windows 10 Pro / Windows 10 Home
ZOMG! nVidia not supporting new standards that are clearly better but for some reason they just don't want to... that's... that's... not new news one bit. Moving along, nothing to see here.
 
Joined
Oct 10, 2009
Messages
929 (0.18/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
Seems to be a bit too much of ye olde noise pulling going on in this thread.
 
Joined
Nov 18, 2010
Messages
7,124 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
Boney, you are talking about BoM costs that the final buyer covers, not the R&D cost...

You are comparing pocket-money costs, like designing an AIO... not making a brand-new product...

The saying is just as weird as the way you operate with facts... Just goofing around with numbers like a cheap car salesman...
 
Last edited:
Joined
Oct 10, 2009
Messages
929 (0.18/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
ZOMG! nVidia not supporting new standards that are clearly better but for some reason they just don't want to... that's... that's... not new news one bit. Moving along, nothing to see here.

And yet we have not seen Adaptive-Sync/FreeSync in action or commercially available. How can you make such a claim?
 