
Intel's Chris Hook Confirms Commitment to Support VESA Adaptive Sync on Intel GPUs

Joined
Aug 8, 2015
Messages
112 (0.04/day)
Location
Finland
System Name Gaming rig
Processor AMD Ryzen 7 5900X
Motherboard Asus X570-Plus TUF /w "passive" chipset mod
Cooling Noctua NH-D15S
Memory Crucial Ballistix Sport LT 2x16GB 3200C16 @3600C16
Video Card(s) MSI RTX 3060 TI Gaming X Trio
Storage Samsung 970 Pro 1TB, Crucial MX500 2TB, Samsung 860 QVO 4TB
Display(s) Samsung C32HG7x
Case Fractal Design Define R5
Audio Device(s) Asus Xonar Essence STX
Power Supply Corsair RM850i 850W
Mouse Logitech G502 Hero
Keyboard Logitech G710+
Software Windows 10 Pro
Isn't AMD FreeSync in the supporting card's driver, not the monitor? I'm going to look that up again; that's what I thought AMD claimed, so if so, any monitor should work.

Simply put, FreeSync lets AMD's video cards and APUs directly and dynamically control the refresh rate of a connected monitor. Most monitors are locked into refreshing 60 times per second, but quick ones will refresh 75, 120 or 144 times per second. With FreeSync enabled, the monitor will refresh its image in sync with the game that's being played, up to its maximum rate, adjusting down when necessary.
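That quoted description can be sketched as a toy simulation (my own illustrative Python, not anything from AMD's driver): with adaptive sync, the panel starts a new scanout when the GPU delivers a frame, clamped to the monitor's supported refresh range.

```python
def vrr_scanout(frame_times_ms, vrr_min_hz, vrr_max_hz):
    """Toy model: when does a variable-refresh monitor start each scanout?

    The panel follows the GPU's frame pacing, but it can't refresh faster
    than its max rate, and it self-refreshes if no frame arrives before
    the minimum-rate timeout.
    """
    min_interval = 1000.0 / vrr_max_hz   # fastest allowed refresh interval
    max_interval = 1000.0 / vrr_min_hz   # slowest: panel refreshes anyway
    scanouts, t = [], 0.0
    for ft in frame_times_ms:
        t += min(max(ft, min_interval), max_interval)
        scanouts.append(round(t, 2))
    return scanouts

# A 40-144 Hz panel tracking 10 ms, 16.7 ms and 30 ms frames:
# the 30 ms frame is cut off by the 25 ms minimum-rate timeout.
print(vrr_scanout([10.0, 16.7, 30.0], 40, 144))
```

With a fixed 60 Hz monitor the intervals would all be forced to 16.7 ms instead, which is where the stutter and tearing come from.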


Doesn't any monitor have to be able to do that, not just a FreeSync monitor? Sounds like they're just branding any capable monitor as FreeSync for sales hype. Though I guess with the branding you at least know it can do it.

Sounds like monitor overclocking in the end.

https://www.pcgamer.com/how-to-overclock-your-monitor-to-a-higher-refresh-rate/
Ah, the Nvidia cash cow: you buy a $500 GPU of theirs, then feel the need to buy a monitor at its price plus extra Nvidia hardware built into it, at their extra cost.

Like I say, AMD FreeSync is just adaptive monitor overclocking through their driver software, FreeSync-approved or not, if the monitor is capable of it. A FreeSync-branded monitor just shows you it can do it without guessing whether the one you have or get will. [opinion]

Yeah, Nvidia could easily do it.
I like the concept and AMD's way (it works and it's simple); it's just sad that AMD only applies it to their own cards' software. I don't see Nvidia coming down off their high horse and implementing it in theirs. I had to move off my AMD cards due to lack of support for things that worked great before. My 7850 was a solid card for what it was, but when later drivers stopped supporting games I run, it was time to move on to Nvidia, where all my stuff works.

one example
https://steamcommunity.com/app/12140/discussions/0/864961721978159686/

Use the older 12.6 driver and it works; use a later one and it doesn't. I'm not going to swap drivers all day to play this game or that, which is what it was getting to with AMD, along with the black-screening on the 14.xx drivers and up.

FreeSync/Adaptive-Sync requires that the monitor use a scaler that supports it, so it can't be done in the graphics driver alone. All monitors have a scaler, so Adaptive-Sync needs no extra hardware, just a scaler that is capable.

Nvidia's G-Sync module, on the other hand, is a separate piece of hardware which Nvidia sells to monitor manufacturers at a set price, and they need to integrate it into the product, raising costs. The new HDR-capable G-Sync module is said to cost around $400-500 (for the manufacturer), which adds to the cost of the monitor.
 
Last edited:
Joined
Apr 3, 2012
Messages
4,360 (0.99/day)
Location
St. Paul, MN
System Name Bay2- Lowerbay/ HP 3770/T3500-2+T3500-3+T3500-4/ Opti-Con/Orange/White/Grey
Processor i3 2120's/ i7 3770/ x5670's/ i5 2400/Ryzen 2700/Ryzen 2700/R7 3700x
Motherboard HP UltraSlim's/ HP mid size/ Dell T3500 workstation's/ Dell 390/B450 AorusM/B450 AorusM/B550 AorusM
Cooling All stock coolers/Grey has an H-60
Memory 2GB/ 4GB/ 12 GB 3 chan/ 4GB sammy/T-Force 16GB 3200/XPG 16GB 3000/Ballistic 3600 16GB
Video Card(s) HD2000's/ HD 2000/ 1 MSI GT710,2x MSI R7 240's/ HD4000/ Red Dragon 580/Sapphire 580/Sapphire 580
Storage ?HDD's/ 500 GB-er's/ 500 GB/2.5 Samsung 500GB HDD+WD Black 1TB/ WD Black 500GB M.2/Corsair MP600 M.2
Display(s) 1920x1080/ ViewSonic VX24568 between the rest/1080p TV-Grey
Case HP 8200 UltraSlim's/ HP 8200 mid tower/Dell T3500's/ Dell 390/SilverStone Kublai KL06/NZXT H510 W x2
Audio Device(s) Sonic Master/ onboard's/ Beeper's!
Power Supply 19.5 volt bricks/ Dell PSU/ 525W sumptin/ same/Seasonic 750 80+Gold/EVGA 500 80+/Antec 650 80+Gold
Mouse cheap GigaWire930, CMStorm Havoc + Logitech M510 wireless/iGear usb x2/MX 900 wireless kit 4 Grey
Keyboard Dynex, 2 no name, SYX and a Logitech. All full sized and USB. MX900 kit for Grey
Software Mint 18 Sylvia/ Opti-Con Mint KDE/ T3500's on Kubuntu/HP 3770 is Win 10/Win 10 Pro/Win 10 Pro/Win10
Benchmark Scores World Community Grid is my benchmark!!
I don't care what any of you say. The guys who have done the work testing the systems have proven to me that G-Sync is not worth the money. In fact, Nvidia cards, of which I have a couple, work fine. But, contrary to what so many people wrongly say, AMD cards also work.

My next card will be a 580. Laugh if you must. I have a 1080p TV and an A-10 build I will be using with the 580 to play games, as I find time to. If any of you think I will not get at least 70 fps in my games, you are stoned or stupid. The complete reality is, we as humans cannot see past 40 fps. Granted, I understand TV/monitor refresh vs. GPU FPS. I still think there is too much e-peening about this.

How many here are PRO gamers? And isn't it true that very few of them even bother with 4K? A few days ago, I stood face to screen with a 1080p and a 4K monitor, side by side.

Know what? They looked the same. Both were 42 inches. Both were playing the same thing, from the same antenna, from a 4K stream. No difference!

Granted, 4K is a better-quality signal and carries more data, but the difference is barely noticeable, if at all, on a 42-inch screen. So on my TV, at 1080p and 40 inches, it would not make a difference. If I were to buy a 50-inch or bigger screen, one would probably notice the difference.

2 cents.

BTW, I would get a FreeSync monitor. Why pay more, even if you have the money?

The video that Kyle made? What makes any of you not believe what was said? It was 50% either way, and 100% said the $200 premium was not worth it.

Linus, who spent all day testing it, showed the lag associated with G-Sync in all but one scenario with VESA off.

I cannot justify the slight difference Nvidia has. Nor should it matter, unless one is a pro gamer.

Pay more for what? Again? This is like a political argument: too much bias!

:lovetpu:
 
Joined
Feb 17, 2017
Messages
852 (0.32/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
Isn't that basically "seems smoother to me, therefore it is?"

Sorry dude, I'm not really buying it either in this instance.

It's not like you have to BUY it; it is like that. Whether you believe it or not doesn't really matter.


Dude, FreeSync's maximum supported refresh rate range is 9–240 Hz. How much lower do you want to go? If for whatever reason your game is running at 8 frames per second or less, adjust your bloody settings instead.

https://en.wikipedia.org/wiki/FreeSync

And this is a gross oversimplification, but it's also where you logically end up if you extrapolate from how NV's and AMD's respective approaches work: if you have two screens with the same specs, and one does directly what the PC tells it while the other relies on an in-between board, which one is likely to produce the better result?

The fact is, for all practical intents and purposes the two are pretty evenly matched, while one costs considerably more.

Oh yeah? Now find me a monitor that has FreeSync working from under 30 fps, please. Actually, most of them start from about 45 or higher, leaving frame drops (beyond a reasonable amount) uncovered. So there's that, if you need stats compared to stats; but as I already said, the difference is only noticeable in person, it's not like you can compare their specs.
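For context on those range floors: drivers work around a monitor's minimum refresh with Low Framerate Compensation (LFC), repeating each frame enough times that the effective refresh lands back inside the range. A rough sketch of the idea (my own illustrative Python, not AMD's actual algorithm); note it only works when the range maximum is at least about double the minimum, which is why very narrow ranges can't be compensated:

```python
def effective_refresh(fps, vrr_min, vrr_max):
    """Pick a refresh rate for a given frame rate, LFC-style.

    Returns (refresh_hz, frames_shown_per_source_frame), or None when the
    range is too narrow to multiply the frame rate back into range.
    """
    if fps > vrr_max:
        return vrr_max, 1          # above range: cap at the max rate
    if fps >= vrr_min:
        return fps, 1              # inside range: native adaptive sync
    n = 2                          # below range: repeat each frame n times
    while n * fps < vrr_min:
        n += 1
    if n * fps <= vrr_max:
        return n * fps, n
    return None                    # e.g. a 56-61 Hz range at 50 fps

print(effective_refresh(24, 40, 144))  # 24 fps shown twice -> 48 Hz
print(effective_refresh(50, 56, 61))   # narrow range: no valid multiple
```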
 
Joined
Sep 17, 2014
Messages
20,972 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
There's nothing I can show that can prove anything, because you have to own it or at least see it in person; you can't show differences like those in any video or benchmark. It's not like we're saying video card X is faster than video card Y. Also, input lag isn't the only thing to evaluate here: G-Sync works from a lower frequency/framerate and feels smoother. Besides, that was a pretty approximate test, and Crysis 3 isn't even a good game to test this stuff on.
He himself doesn't understand what he was trying to prove in the end. So yeah, as I said, there's no video or comparison that tells you which one is best; you just have to try it yourself. And yes, since there's hardware involved, that's most likely what's going to happen in the end.
Oh, and you're starting to sound more and more like a butthurt fanboy.



Was he? Then I guess the community was wrong, or he's very dependent on the amount of money he receives from a company. That would explain why he's totally unreliable, as everyone with a normally functioning brain would see.

Sorry bud, but the perceivable difference between G-Sync and recent FreeSync (not the earliest implementations) just isn't there. It's entirely in the realm of the unnoticeable, and Linus's test shows that quite well. In fact, he also tested several different setting tweaks, forcing Vsync on and off etc., and G-Sync and FreeSync each respond differently to those settings: each has its ideal setup, and once that's used, they both work very well and are near identical even in button-to-pixel response and frame pacing. I'm not even a fan of Linus at all, but that was a pretty decent test.

The problems with FreeSync did indeed happen because monitors would not support a wide range of refresh rates, but realistically, if you use a decent monitor, FreeSync is completely comparable. And even on less decent monitors with a tighter FreeSync range, as long as you stay in that range and use the right in-game settings, the experience is 100% the same as with G-Sync. The only thing you may not get is monitor strobing, but you can't use that together with adaptive sync anyway.

About sub-30 fps... you can just Vsync that (available on any monitor, at no extra cost), and the input lag penalty is nearly the same, because it's SLOW anyway. Not sure why that is even remotely part of this discussion: who the hell games at sub-30 fps on high-end gear? You're grasping at straws, desperately defending a grossly overpriced flavor of adaptive sync. Buyer's remorse?

In the end, budget matters. Spending 300-400 extra on G-Sync is money that cannot go to a higher resolution, a better panel, or a VA/IPS instead of a crappy TN, and those are much more vital elements of a good monitor. You can alleviate most of the problems of fixed-refresh-rate screens with clever tweaking of the driver's sync options: frame capping, Fast Sync, and Adaptive Sync. With the added bonus of being able to use strobing as well, which does a LOT more for gaming quality of life. Hell, you can even spend the 400 bucks on a fatter GPU so you never hit low FPS and remove the need for adaptive sync entirely. G-Sync is literally one of the worst investments to make in a monitor unless your budget is infinite, never mind the vendor lock-in, given that you tend to keep a monitor across several GPUs.
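Of the driver-side tricks listed above, frame capping is the simplest to picture: hold frame times at or above a target so they stay inside the refresh window. A minimal sleep-based limiter sketch (my own illustrative Python; real driver limiters use higher-precision busy-wait hybrids):

```python
import time

def run_with_frame_cap(render_frame, target_fps, n_frames):
    """Render n_frames, sleeping as needed so no frame finishes early.

    Keeps the frame interval at >= 1/target_fps seconds, which is the
    crude core of what driver-level frame limiters do.
    """
    target = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()                        # the game's work for one frame
        elapsed = time.perf_counter() - start
        if elapsed < target:                  # finished early: wait it out
            time.sleep(target - elapsed)

# Cap a trivial "renderer" at 100 fps for 5 frames (~50 ms total).
run_with_frame_cap(lambda: None, 100, 5)
```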
 
Joined
Jun 29, 2018
Messages
9 (0.00/day)
Processor Q6700 2.66 GHz
Motherboard yup
Cooling fan
Software installed
While I'm happy to see an open source standard, as opposed to the very expensive alternative, embraced by Intel here... I fail to see what good it does... unless Intel really has something up their sleeve in the discrete graphics card department.
It allows for a stutter-free gameplay experience on iGPUs, where hitting that VSync cap can prove difficult more often than not.

I wish my HD 530 supported FreeSync...
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,732 (3.41/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
I know that the age-old standard vSync isn't perfect... but if your game already isn't performing well, I'd think it would still be laggy, no matter what vSync you're using (or none at all). FPS dips that low aren't good no matter what.
 
Joined
Jan 8, 2017
Messages
8,989 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
FreeSync and Vulkan are being pushed by Raja and Chris

And it's a good thing, too bad Intel practically has nothing relevant to exert this push upon.

and there are monitors like the LG 43UD79-B, which has a range of 56~61 Hz; absolute garbage.

Meh, still better than nothing and it's cheap.
 
Joined
Feb 17, 2017
Messages
852 (0.32/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
Sorry bud, but the perceivable difference between G-Sync and recent FreeSync (not the earliest implementations) just isn't there. It's entirely in the realm of the unnoticeable, and Linus's test shows that quite well. In fact, he also tested several different setting tweaks, forcing Vsync on and off etc., and G-Sync and FreeSync each respond differently to those settings: each has its ideal setup, and once that's used, they both work very well and are near identical even in button-to-pixel response and frame pacing. I'm not even a fan of Linus at all, but that was a pretty decent test.

The problems with FreeSync did indeed happen because monitors would not support a wide range of refresh rates, but realistically, if you use a decent monitor, FreeSync is completely comparable. And even on less decent monitors with a tighter FreeSync range, as long as you stay in that range and use the right in-game settings, the experience is 100% the same as with G-Sync. The only thing you may not get is monitor strobing, but you can't use that together with adaptive sync anyway.

About sub-30 fps... you can just Vsync that (available on any monitor, at no extra cost), and the input lag penalty is nearly the same, because it's SLOW anyway. Not sure why that is even remotely part of this discussion: who the hell games at sub-30 fps on high-end gear? You're grasping at straws, desperately defending a grossly overpriced flavor of adaptive sync. Buyer's remorse?

In the end, budget matters. Spending 300-400 extra on G-Sync is money that cannot go to a higher resolution, a better panel, or a VA/IPS instead of a crappy TN, and those are much more vital elements of a good monitor. You can alleviate most of the problems of fixed-refresh-rate screens with clever tweaking of the driver's sync options: frame capping, Fast Sync, and Adaptive Sync. With the added bonus of being able to use strobing as well, which does a LOT more for gaming quality of life. Hell, you can even spend the 400 bucks on a fatter GPU so you never hit low FPS and remove the need for adaptive sync entirely. G-Sync is literally one of the worst investments to make in a monitor unless your budget is infinite, never mind the vendor lock-in, given that you tend to keep a monitor across several GPUs.

It is there. The fact that you and others here don't see it doesn't make it "not there". The test Linus made proves nothing and isn't even a reliable test. Input lag != low framerate; the two don't feel at all the same, and one isn't always better than the other, it depends on the game and everything. It's not like you have to play all the time at sub-30 FPS (or rather sub-40, since very few monitors support FreeSync at framerates that low; most of them don't go under 40 Hz/fps), but even with high-end hardware you'll end up having some dips under 40 fps at certain resolutions; 30-75 Hz/FPS is where this tech should work best. I'm not defending it, I'm just stating the facts: Nvidia's tech is better. Then you can say whatever you want about the price, the perf/price ratio, the quality of the monitors carrying the tech, the availability; I'm strictly talking about performance. And no, it's not buyer's remorse, because I didn't buy either of them, but I tried them extensively. If I were to buy something, I'd find a good deal on a G-Sync monitor, also because I have an Nvidia card.

Oh c'mon, stop with this 300-400-500; everyone seems to have a different perspective on these prices here. You can find good G-Sync monitors for like 100/150€ over the FreeSync versions, where a comparison is possible. That said, I'm totally with you that it's probably better to have other tech instead of these.
 
Joined
Sep 17, 2014
Messages
20,972 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
It is there. The fact that you and others here don't see it doesn't make it "not there". The test Linus made proves nothing and isn't even a reliable test. Input lag != low framerate; the two don't feel at all the same, and one isn't always better than the other, it depends on the game and everything. It's not like you have to play all the time at sub-30 FPS (or rather sub-40, since very few monitors support FreeSync at framerates that low; most of them don't go under 40 Hz/fps), but even with high-end hardware you'll end up having some dips under 40 fps at certain resolutions; 30-75 Hz/FPS is where this tech should work best. I'm not defending it, I'm just stating the facts: Nvidia's tech is better. Then you can say whatever you want about the price, the perf/price ratio, the quality of the monitors carrying the tech, the availability; I'm strictly talking about performance. And no, it's not buyer's remorse, because I didn't buy either of them, but I tried them extensively. If I were to buy something, I'd find a good deal on a G-Sync monitor, also because I have an Nvidia card.

Oh c'mon, stop with this 300-400-500; everyone seems to have a different perspective on these prices here. You can find good G-Sync monitors for like 100/150€ over the FreeSync versions, where a comparison is possible. That said, I'm totally with you that it's probably better to have other tech instead of these.

Something can be technically better and still totally not worth it. It goes like that more often than not: examples being "the megapixel race" the "screen real estate race on smartphones" or "1ms 240hz at the expense of everything else" etc etc ad infinitum.

My main point is, G-Sync is heavily overrated. There are many other and cheaper ways to get a good, even much better, viewing and gaming experience. That, combined with vendor lock-in and a fat price increase, makes for a pretty questionable use of budget, whereas FreeSync with its low cost is inherently much more sensible.
 
Joined
Apr 3, 2010
Messages
800 (0.16/day)
Location
US
System Name Desktop
Processor AMD Ryzen 5 5600X [3.7GHz/4.6GHz][6C/12T]
Motherboard ASUS TUF Gaming X570-PRO [X570]
Cooling Cooler Master Hyper 212 RGB Black Edition
Memory G.SKILL Ripjaws V Series 32GB [DDR4 3600][2x16GB][16-19-19-39@1.35V]
Video Card(s) ASUS KO GeForce RTX 3060 Ti V2 OC Edition 8GB GDDR6 [511.65]
Storage [OS] Samsung 970 Evo 500GB | [Storage] 980 1TB | 860 Evo 1TB | 850 Evo 500GB | Seagate Firecuda 2TB
Display(s) LG 27GL850 [27"][2560x1440@144Hz][Nano IPS][LED][G-SYNC Compatible][DP]
Case Corsair Obsidian 750D
Audio Device(s) Realtek ALC S1200A High Definition Audio CODEC
Power Supply EVGA SuperNOVA 1000 G1+ [+12V: 83.3A 999.6W][80 Plus Gold]
Mouse Logitech M570 Trackball
Keyboard Corsair Gaming K55 RGB
Software Microsoft Windows 10 Pro [21H1][64-bit]
Joined
Feb 17, 2017
Messages
852 (0.32/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
Something can be technically better and still totally not worth it. It goes like that more often than not: examples being "the megapixel race" the "screen real estate race on smartphones" or "1ms 240hz at the expense of everything else" etc etc ad infinitum.

My main point is, G-Sync is heavily overrated. There are many other and cheaper ways to get a good, even much better, viewing and gaming experience. That, combined with vendor lock-in and a fat price increase, makes for a pretty questionable use of budget, whereas FreeSync with its low cost is inherently much more sensible.

Not totally not worth it. G-Sync is a better version of adaptive sync than FreeSync, and if one isn't worth it then neither is the other, money aside. I've already said multiple times that I'm totally with you regarding the price, although it's not $300-400 or even $500 more compared to FreeSync monitors.
 
Joined
Jan 8, 2017
Messages
8,989 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I'm sure NVIDIA will block it long before Intel starts using it anyway.

I am not quite sure about that. I don't fully understand how it works, but given the recent AMD FreeSync thing, it seems this is tied more to Windows and how it handles framebuffers, specifically how it lets third parties access framebuffers from other devices. It may be up to MS to block this sort of thing, and if they do, that would be some proper corporate ill intent.
 
Joined
Apr 3, 2010
Messages
800 (0.16/day)
Location
US
System Name Desktop
Processor AMD Ryzen 5 5600X [3.7GHz/4.6GHz][6C/12T]
Motherboard ASUS TUF Gaming X570-PRO [X570]
Cooling Cooler Master Hyper 212 RGB Black Edition
Memory G.SKILL Ripjaws V Series 32GB [DDR4 3600][2x16GB][16-19-19-39@1.35V]
Video Card(s) ASUS KO GeForce RTX 3060 Ti V2 OC Edition 8GB GDDR6 [511.65]
Storage [OS] Samsung 970 Evo 500GB | [Storage] 980 1TB | 860 Evo 1TB | 850 Evo 500GB | Seagate Firecuda 2TB
Display(s) LG 27GL850 [27"][2560x1440@144Hz][Nano IPS][LED][G-SYNC Compatible][DP]
Case Corsair Obsidian 750D
Audio Device(s) Realtek ALC S1200A High Definition Audio CODEC
Power Supply EVGA SuperNOVA 1000 G1+ [+12V: 83.3A 999.6W][80 Plus Gold]
Mouse Logitech M570 Trackball
Keyboard Corsair Gaming K55 RGB
Software Microsoft Windows 10 Pro [21H1][64-bit]
I am not quite sure about that. I don't fully understand how it works, but given the recent AMD FreeSync thing, it seems this is tied more to Windows and how it handles framebuffers, specifically how it lets third parties access framebuffers from other devices. It may be up to MS to block this sort of thing, and if they do, that would be some proper corporate ill intent.

It's possible that it will be too difficult to block to be worth the effort, but NVIDIA will probably still look into it. Of course, it would be nice if NVIDIA would just support it as a budget option. I just hope that if Intel manages a dedicated GPU on par with or better than NVIDIA's, they price it reasonably.
 
Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
Oh yeah? Now find me a monitor that has FreeSync working from under 30 fps, please. Actually, most of them start from about 45 or higher, leaving frame drops (beyond a reasonable amount) uncovered. So there's that, if you need stats compared to stats; but as I already said, the difference is only noticeable in person, it's not like you can compare their specs.

You do realize that there are no G-Sync monitors that go under 30 Hz either, right? You were criticizing FreeSync, but the tech itself is not the limitation. Manufacturers are free to implement it the way they choose, and you are free to buy the one that suits your needs. The FreeSync monitors you complain about for not going under 45 Hz pretty much all cost less than $300; you will not find a G-Sync display in that range. You can always spend more for a higher-quality FreeSync display, and it will still cost less than a comparable G-Sync equivalent. There is no arguing with that.

My next card will be a 580. Laugh if you must. I have a 1080p TV and an A-10 build I will be using with the 580 to play games, as I find time to. If any of you think I will not get at least 70 fps in my games, you are stoned or stupid. The complete reality is, we as humans cannot see past 40 fps. Granted, I understand TV/monitor refresh vs. GPU FPS. I still think there is too much e-peening about this.

Know what? They looked the same. Both were 42 inches. Both were playing the same thing, from the same antenna, from a 4K stream. No difference!

Granted, 4K is a better-quality signal and carries more data, but the difference is barely noticeable, if at all, on a 42-inch screen. So on my TV, at 1080p and 40 inches, it would not make a difference. If I were to buy a 50-inch or bigger screen, one would probably notice the difference.

There's no reason to laugh at a 580; at 1080p that thing runs pretty much any game maxed out if you keep the AA reasonable. That said, it's the A-10 that will be holding you back. Unless you're talking about a Fairchild Republic A-10, in which case I'd love to know how you plug a 580 and a TV into it.

Regarding TV size, 1080p is quite sharp even at 50 inches; it all depends on how far from the TV you sit and what the dot pitch is.

The "we as humans cannot see past 40 fps" part is categorically wrong, just google it. Going from 60Hz to 75Hz alone makes a huge difference, and 120Hz is like a whole another level. That said, trying for higher than 144Hz is useless for all practical reasons.
 
Joined
Feb 17, 2017
Messages
852 (0.32/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
You do realize that there are no G-Sync monitors that go under 30Hz either, right? You were criticizing FreeSync, and the tech itself is not the limitation. Manufacturers are free to implement it the way they choose, and you are free to buy the one that suits your needs. The FreeSync monitors that you complain about for not going under 45Hz pretty much all cost less than $300. You will not find a G-Sync display at that range. You can always spend more for a higher quality FreeSync display and it'll still cost less than a comparable G-Sync equivalent. There is no arguing with that.

I'm pretty sure I've seen G-Sync monitors starting from 26 Hz, or maybe 25 Hz; if I manage to find one, I'll post it here. Besides that, G-Sync is still better, and you can argue as much as you want.
 
Joined
Aug 10, 2018
Messages
90 (0.04/day)
Processor Ryzen 1700X
Motherboard Asus x370 Prime-Pro
Cooling MSI Frozr L (Push-Pull)
Memory 3000mhz CL15
Video Card(s) RX 5700 XT Gigabyte OC
Storage 15TB of Mixed Storage
Display(s) 3440x1440
Audio Device(s) Asus Xonar DX + Sennheiser HD 598
Power Supply 750W
Besides that, G-Sync is still better, and you can argue as much as you want.
I kek'd at this. Thanks for making my day.
 
Joined
Feb 17, 2017
Messages
852 (0.32/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
Joined
Mar 10, 2015
Messages
3,984 (1.19/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
30-75 Hz/FPS is where this tech should work best

My monitor has a FreeSync range of 30-75 fps and cost $475. 32" 3440 x 1440.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.62/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Besides that, G-Sync is still better, and you can argue as much as you want.
Two things will always be true of G-Sync: it will cost more and have higher input lag (both because of the extra hardware step involved). That first one is a deal breaker for most potential customers.

Let these words sink in for you: there's absolutely nothing the G-Sync module can do that the GPU can't do in regards to adaptive sync.

So how is it "better?" The only thing empirically "better" I can think of is that monitor manufacturers can be a lot lazier implementing G-Sync than adaptive sync. Consumers pay a premium for their laziness though so...yeah...not really "better."
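The core timing decision in adaptive sync really is GPU-side and simple enough to sketch: start scanout when the frame is ready, as long as the time since the last refresh stays inside the panel's window; if no new frame arrives by the deadline, refresh anyway with the old one. A toy model of that variable-VBLANK pacing (my own simplification, not any driver's real code):

```python
def next_scanout(last_scanout: float, frame_ready: float,
                 vrr_min_hz: float, vrr_max_hz: float) -> float:
    """Return the time (seconds) at which the next scanout starts.

    Toy model of variable-VBLANK pacing: the GPU stretches the blanking
    interval until the next frame is ready, bounded by the panel's
    supported refresh window.
    """
    earliest = last_scanout + 1.0 / vrr_max_hz  # can't refresh faster than max
    latest = last_scanout + 1.0 / vrr_min_hz    # must refresh before the min-rate deadline
    # Scan out when the frame is ready, clamped into the window.
    return min(max(frame_ready, earliest), latest)

# 30-75 Hz panel: a frame ready 20 ms after the last scanout goes out at 20 ms
print(next_scanout(0.0, 0.020, 30, 75))  # 0.02
# A frame ready after 50 ms misses the ~33.3 ms deadline; the panel refreshes anyway
print(next_scanout(0.0, 0.050, 30, 75))  # ~0.0333 (previous frame repeated)
```

Nothing in that loop needs hardware in the monitor beyond a scaler that tolerates a variable blanking interval, which is the whole point of the VESA standard.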


Back on topic, I really want to know how Intel is going to handle it. AMD created the FreeSync brand for certification purposes. It stands to reason that Intel would create a similar brand also for certification purposes. Even though one monitor could be FreeSync branded and theoretically work fine with Intel's brand, the monitors might end up carrying branding for both. I don't think that was VESA's intent with the standard. It was supposed to be universal and just work like DisplayPort in general. AMD creating FreeSync may have created a snowball effect that was unintended.

On the other hand, maybe Intel will simply jump on the FreeSync bandwagon where any given FreeSync monitor will work powered by an Intel GPU too. If Intel does throw its hat in behind FreeSync entirely, I'm going to laugh at NVIDIA. How long will they beat their dead horse? NVIDIA could seriously lose the home theater PC market if they don't abandon G-Sync over the next decade.
 
Joined
Sep 28, 2005
Messages
3,161 (0.47/day)
Location
Canada
System Name PCGR
Processor 12400f
Motherboard Asus ROG STRIX B660-I
Cooling Stock Intel Cooler
Memory 2x16GB DDR5 5600 Corsair
Video Card(s) Dell RTX 3080
Storage 1x 512GB Mmoment PCIe 3 NVME 1x 2TB Corsair S70
Display(s) LG 32" 1440p
Case Phanteks Evolve itx
Audio Device(s) Onboard
Power Supply 750W Cooler Master sfx
Software Windows 11
After seeing my performance in Fallout 4 with a GTX 1070, I don't think I will need any kind of special vsync. My FPS doesn't exceed 30 in the downtown area of that game (and surprisingly, my CPU/GPU usage stays low even in the low-FPS areas, dunno why).
 
Joined
Feb 17, 2017
Messages
852 (0.32/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
Two things will always be true of G-Sync: it will cost more and have higher input lag (both because of the extra hardware step involved). That first one is a deal breaker for most potential customers.

Let these words sink in for you: there's absolutely nothing the G-Sync module can do that the GPU can't do in regards to adaptive sync.

So how is it "better?" The only thing empirically "better" I can think of is that monitor manufacturers can be a lot lazier implementing G-Sync than adaptive sync. Consumers pay a premium for their laziness though so...yeah...not really "better."


Back on topic, I really want to know how Intel is going to handle it. AMD created the FreeSync brand for certification purposes. It stands to reason that Intel would create a similar brand also for certification purposes. Even though one monitor could be FreeSync branded and theoretically work fine with Intel's brand, the monitors might end up carrying branding for both. I don't think that was VESA's intent with the standard. It was supposed to be universal and just work like DisplayPort in general. AMD creating FreeSync may have created a snowball effect that was unintended.

On the other hand, maybe Intel will simply jump on the FreeSync bandwagon where any given FreeSync monitor will work powered by an Intel GPU too. If Intel does throw its hat in behind FreeSync entirely, I'm going to laugh at NVIDIA. How long will they beat their dead horse? NVIDIA could seriously lose the home theater PC market if they don't abandon G-Sync over the next decade.

You mean G-Sync has less input lag. "Let these words sink in for you"... yeah, sure, I should completely ignore what I saw, and what people I know saw, because someone on the internet is telling me to? I'll surely do that.
Customers pay a premium for premium performance; in NVIDIA's case it's a premium premium. You're free not to buy it, but not to ignore the fact that it's better, even if only slightly, partly because of the proprietary hardware implementation.

My monitor has a FreeSync range of 30-75 Hz and cost $475. 32", 3440x1440.

I'm sure it's a GREAT monitor. What model are we talking about exactly?
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.62/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
You mean G-sync has less input lag.
GeForce has less input lag than Radeon because of architectural differences. That has naught to do with how G-Sync is implemented.
 
Joined
Feb 17, 2017
Messages
852 (0.32/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
GeForce has less input lag than Radeon because of architectural differences. That has naught to do with how G-Sync is implemented.

Could be, but the result is that overall G-Sync has (or feels like it has) less input lag. Whether you attribute it to the card or to the monitor, that's how it is.
 
Joined
Sep 7, 2017
Messages
3,244 (1.33/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
After seeing my performance in Fallout 4 with a GTX 1070, I don't think I will need any kind of special vsync. My FPS doesn't exceed 30 in the downtown area of that game (and surprisingly, my CPU/GPU usage stays low even in the low-FPS areas, dunno why).

Seriously? That sounds bad. I'm surprised the 1070 would go so low. What resolution?
 