
FreeSync or G-Sync or both?

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
AMD has the console market. I guarantee you next-generation consoles will have adaptive sync because it makes a huge difference. In fact, I wouldn't be at all surprised if Nintendo's, Sony's, and Microsoft's interest is the reason why AMD pursued adaptive sync the way they did. They undeniably have the most to gain. As TVs turn to adaptive sync, so too will monitors.

Yes, NVIDIA, using their ~80% market share, is driving card buyers into a walled garden of G-sync monitors, but, as I said previously, G-sync monitors are prohibitively priced, which keeps the market tiny. I'd argue the market for FreeSync monitors is already larger simply because of their lower price--despite AMD's smaller market share.

Bear in mind that there is a third player that dwarfs AMD and NVIDIA: Intel. Intel has announced that adaptive sync support is coming. G-sync will only exist as long as NVIDIA wants to keep funding it. The decision has already been made that adaptive sync is the future. No one wants to license G-sync from NVIDIA.
 
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
AMD has the console market. I guarantee you next-generation consoles will have adaptive sync because it makes a huge difference. In fact, I wouldn't be at all surprised if Nintendo's, Sony's, and Microsoft's interest is the reason why AMD pursued adaptive sync the way they did. They undeniably have the most to gain. As TVs turn to adaptive sync, so too will monitors.
Good catch!

Bear in mind that there is a third player that dwarfs AMD and NVIDIA: Intel. Intel has announced that adaptive sync support is coming. G-sync will only exist as long as NVIDIA wants to keep funding it. The decision has already been made that adaptive sync is the future. No one wants to license G-sync from NVIDIA.
That is exactly how I expect it to pan out:
My guess is we won't see a 2nd iteration of G-Sync. Nvidia is smart when it comes to business and they will adopt Adaptive Sync when the window of opportunity to make some sweet $ with G-Sync is about to close.
 
Joined
Dec 18, 2005
Messages
8,253 (1.24/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 NVMe boot drive partition.. 1TB SanDisk SATA.. 1TB Transcend SATA.. 1TB 970 Evo NVMe M.2..
Display(s) 27" Asus PG279Q ROG Swift 165 Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
i have made a few recent changes.. i was running a pair of 970 cards with a non adaptive sync 1080p 144 Hz 24 inch gaming panel.. the combination worked fine.. i never saw tearing or any problems..

i then got rid of the 970 cards and bought a couple of 980 TI cards.. again the combination worked fine for gaming.. total overkill at 1080p but it worked fine..

i then decided i needed an IPS panel for my photo editing.. i was never entirely happy with the colour gamut on the TN gaming panel.. so just to make sure i got the best of both worlds.. i bought an asus rog swift 1440 panel.. it was f-cking expensive..

i have no problems with nvidia g-sync.. it works fine.. but all i really wanted was a fast IPS style panel.. the g-sync just happened to come with it..

i often buy things just to see.. i am often not "wowed" by the results.. but i do think its the only real way to find out.. its also part of the fun for me..

when i have done my "finding out" i stop spending money and simply use the stuff for a f-cking long time.. i have only just stopped using my last system put together back in 2007.. :)

my 2007 hardware was still okay.. what made it not okay was the software generations left it behind.. DX being an obvious example.. i expect the same thing to happen again in the years to come.. new software will leave my old hardware behind.. tis eventually what happens to good hardware..

one thing i am sure of.. 1440 (2K) is too much for a 970 level card.. 1080 (1K) gaming is still quite viable and from a bang for buck point of view makes the most sense of all.. the other thing is 4K gaming aint here yet even with the best and most expensive hardware..

trog
 
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
[...]
trog

I don't wanna sound mean or anything, but in what way does your comment contribute to the thread? At least you could've told us whether you are happy with your Asus RoG Swift monitor and whether G-Sync makes a difference :)

And please refer to 2,560 x 1,440 pixels as QHD or 1440p. Whenever I hear 1440p called 2K I feel a certain sudden urge ... :banghead:
 
Joined
Dec 18, 2005
Messages
8,253 (1.24/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 NVMe boot drive partition.. 1TB SanDisk SATA.. 1TB Transcend SATA.. 1TB 970 Evo NVMe M.2..
Display(s) 27" Asus PG279Q ROG Swift 165 Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
dude do not tell me what i should and should not do.. it aint your place..

i did use the K reference in brackets and it does make sense.. 4K being 4 times as many pixels as 1K.. and needing 4 times the power to run..

but if you want to take advantage of what i have learned.. ask me a simple question and i will do my best to answer it.. :)

if not, your own comments have no relevance to the thread.. :)

trog
 
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
dude do not tell me what i should and should not do.. it aint your place..
I didn't mean to sound so harsh*. My apologies! I actually liked your comment and effort to share your personal experience. I just felt it was lacking valuable info that would have added substance to the question at hand...

* English is not my first language and sometimes it's hard to get your intentions/message across in written language that is not your mother tongue.

i did use the K reference in brackets and it does make sense.. 4K being 4 times as many pixels as 1K.. and needing 4 times the power to run..
Question: Does 2K need double the amount of power as 1K?
Hint: It does not ;)
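The quick pixel math behind this, as a sketch (assuming GPU load scales roughly with pixel count, which is a simplification, not a law):

Code:
# Pixel counts for the resolutions being argued about in this thread.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "4K UHD (3840x2160)": 3840 * 2160,
}
base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels = {pixels / base:.2f}x 1080p")

# Prints:
# 1080p (1920x1080): 2,073,600 pixels = 1.00x 1080p
# 1440p (2560x1440): 3,686,400 pixels = 1.78x 1080p
# 4K UHD (3840x2160): 8,294,400 pixels = 4.00x 1080p

So 4K really is 4x the pixels of 1080p, but 1440p is only ~1.78x 1080p - not double.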

but if you want to take advantage of what i have learned.. ask me a simple question and i will do my best to answer it.. :)
I kinda did... Are you happy with your monitor purchase? Do you notice tangible benefits from G-Sync?
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
2K = 2048x1080, for the record. 1080p is 1920x1080. I've never heard of 1K... because it doesn't seem to exist.

2K is NOT 2560x1440!!!!

Here is a reference: https://en.wikipedia.org/wiki/Display_resolution#/media/File:Vector_Video_Standards8.svg

It doesn't help the confusion when big e-tailers like Newegg classify monitors in odd ways. They say 2K monitors are the following:

2560 x 1440 (2K) (397)
2048 x 1536 (2K) (2)
2048 x 768 (3)
2560 x 1024 (8)
2560 x 1080 (2K) (110)
2560 x 1600 (2K) (60)
3440 x 1440 (2K) (29)

http://www.newegg.com/LCD-LED-Monitors/SubCategory/ID-20
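For reference, here is a rough map of the naming soup (a sketch; the DCI entries are the only formally defined ones, the rest are common shorthand):

Code:
# Marketing labels vs. actual pixel dimensions.
labels = {
    "720p / HD":       (1280, 720),
    "1080p / Full HD": (1920, 1080),
    "DCI 2K":          (2048, 1080),
    "QHD / 1440p":     (2560, 1440),
    "UWQHD":           (3440, 1440),
    "4K UHD":          (3840, 2160),
    "DCI 4K":          (4096, 2160),
}
for name, (w, h) in labels.items():
    print(f"{name:16s} {w} x {h}  ({w * h / 1e6:.1f} MP)")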
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
It doesn't help the confusion when big e-tailers like Newegg classify monitors in odd ways. They say 2K monitors are the following
No doubt... SO annoying......

Sorry for the threadjack... LOL!
 
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
I'd rather see them use (W)QHD, UWQHD and so on, or use the pixel height together with the corresponding aspect ratio, e.g. 1440p 21:9.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
18,914 (2.86/day)
Location
Piteå
System Name Black MC in Tokyo
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400 MHz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Line6 UX1 + some headphones, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
VR HMD Acer Mixed Reality Headset
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
Huh. You can get a Freesync monitor for €170.

I'd rather see them use (W)QHD, UWQHD and so on, or use the pixel height together with the corresponding aspect ratio, e.g. 1440p 21:9.

OR they can just write it as number x number, as god intended.
 
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
OR they can just write it as number x number, as god intended.
Too easy! And too much data to process for the average Joe.
 
Joined
Dec 18, 2005
Messages
8,253 (1.24/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 NVMe boot drive partition.. 1TB SanDisk SATA.. 1TB Transcend SATA.. 1TB 970 Evo NVMe M.2..
Display(s) 27" Asus PG279Q ROG Swift 165 Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
"I kinda did... Are you happy with your monitor purchase? Do you notice tangible benefits from G-Sync?"

i am.. but it was expensive.. compared to my high frame rate non g-sync 144 Hz monitor i cant say as i do..

i think that dynamic syncing allows lower frame rates to be run with everything still looking smooth.. if the frame rates are high enough i dont think there is much to be gained from it..

my monitor will do 165 Hz but i run it at 120 Hz.. my graphics cards with a game like mad max at 1440 will do 160 frames per second but i run it frame rate capped at about 90 fps..

which means (with g-sync) my monitor refresh is 90 Hz.. its all still super smooth at 90 fps and i save 200 watts of power compared to running the game at 160 fps..

i am not so sure that g-sync is worth the price premium it currently carries.. but the amd version is much cheaper.. building from scratch the whole scene currently favours the amd solution..

sorry i cant be more definite but not everything has a simple black and white type answer..

is it worth buying.. yes but only if you can afford the price premium.. its not a must have thing and i think (starting from scratch) the extra money would be better spent on the graphics cards to push the frame rates higher in the first place..

but in the sense it enables smooth gaming at lower frame rates it has to be a good thing.. shame about the hefty nvidia price premium.. maybe that will come down in the future..
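roughly what i am describing, as a toy model (just an illustration.. not how the driver actually works.. the vrr window numbers are examples, real panels vary)..

Code:
# Toy model: with adaptive sync the panel refresh follows the game's
# frame rate inside the panel's supported window; without it the panel
# just runs at a fixed rate regardless of fps.
VRR_MIN, VRR_MAX = 30, 165  # Hz, assumed window for illustration

def panel_refresh(fps, vrr=True, fixed_hz=120):
    if not vrr:
        return fixed_hz  # fixed refresh: frames tear or wait
    return max(VRR_MIN, min(fps, VRR_MAX))  # refresh tracks fps

for fps in (45, 90, 160):
    print(f"{fps} fps -> {panel_refresh(fps)} Hz with vrr, "
          f"{panel_refresh(fps, vrr=False)} Hz without")

with the 90 fps cap the cards only render 90 frames a second instead of 160.. thats where the 200 watt saving comes from.. and g-sync keeps the panel refresh at 90 Hz so it all still looks smooth..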

trog

 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
i would have to say that newegg and what seems to be a number of others dont consider 720p in the equation at all anymore, or as a part of hd..

sorry 720p you getn thrown overboard.. we sinkn and dont want none
 
Joined
Jan 31, 2005
Messages
2,050 (0.29/day)
Location
Denmark
System Name Commercial towing vehicle "Nostromo"
Processor 5800X3D
Motherboard X570 Unify
Cooling EK-AIO 360
Memory 32 GB Fury 3666 MHz
Video Card(s) 4070 Ti Eagle
Storage SN850 NVMe 1TB + Renegade NVMe 2TB + 870 EVO 4TB
Display(s) 25" Legion Y25g-30
Case Lian Li LanCool 216 v2
Audio Device(s) B & W PX7 S2e
Power Supply HX1500i
Mouse Harpe Ace Aim Lab Edition
Keyboard Scope II 96 Wireless
Software Windows 11 23H2
wow - I think I got my answers ...... to sum up:

1) so could it be better to buy a neutral monitor with 144 Hz (no FreeSync and no G-Sync)? - I think yes (no dependence on a manufacturer)
2) or is there a monitor that has both technologies? - there isn't, and it prob. won't happen, so no
3) I have a GTX 970 - so is it better to buy a G-Sync monitor? - yes and no, as I don't know which will be my next GPU purchase
4) And how does a monitor with FreeSync behave on nVidia, or G-Sync with an AMD GPU? - they cannot use each other's technologies, so prob. not a good idea - tossing money out of the window

I have decided not to buy my colleague's GTX 970 - I'll wait and see what AMD and nVidia come up with next - I think I'll go with a FreeSync/G-Sync-free monitor with a decent response time and refresh rate.

Thank you all for the "enlightenment"
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
while nv doesnt support adaptive sync now they will in the future. couldnt really say how long.. maybe 2yrs'ish.
good idea to hold out for the new gpu's if you can, and i would have to say that an adaptive sync or freesync monitor is a pretty neutral choice.. much more so than gsync, and it will have better compatibility in the future.
they all have the ability to enable or disable adaptive sync. adaptive sync is a part of the displayport standard and now the same is coming to hdmi.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,944 (2.61/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
the way g-sync seems to work for me is it enables smoother running at lower fps.. at 1440 your 970 is not going to be banging out high frame rates.. in fact it will struggle..

with a more powerful card just a 144 Hz monitor without dynamic syncing does a good enough job..

beastie has said it about right..

trog

Freesync and g-sync are essentially the same thing. They allow for dynamic refresh rates, so the refresh rate is tied to the number of frames per second sent over by the GPU.
 
Joined
Dec 17, 2015
Messages
115 (0.04/day)
Way too early to wave buhbye to G-Sync. Don't let the rest of this post undermine how big of a win it was for AMD to get Intel's backing on FreeSync, because they are rivals after all, so that is HUGE. Buuut... the win is for integrated graphics. Kaby Lake could possibly be the first Intel CPU whose integrated graphics can take advantage of FreeSync, but most likely we are looking at Cannonlake, so this gives Nvidia time to make some political power plays. But even then, how many people running on integrated graphics are going to be aiming for gaming monitors? Integrated graphics has come a long way, but offering good FPS, especially on new titles, is going to be more miss than hit.

This is extremely important, especially when you compare FreeSync to G-Sync and how the technologies differ. Monitors have a minimum refresh rate, and G-Sync and FreeSync currently handle dips below it differently. Right now, when your FPS dips below that minimum refresh rate on a G-Sync panel, the refresh rate remains variable, so gameplay stays very smooth. FreeSync, on the other hand, stays locked at the minimum refresh rate, which results in both tearing and judder. So if the monitor's minimum refresh rate is 40 and your FPS dips to 20, it will stay locked at 40 Hz. If the FPS you are getting in a game is above the minimum refresh rate of the monitor, the experience will be butter smooth, just like G-Sync. I'm sure at some point AMD will fix this, probably by doubling the refresh rate and repeating a frame like G-Sync does. G-Sync is also superior when it comes to overshoot, so there is less of a blurring effect. Currently G-Sync does everything FreeSync can do, but better, and the added bonus is the module in the monitor handles a lot of the calculations, reducing load on the GPU or CPU.
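To illustrate the frame-repeating idea (a sketch of the concept only, not NVIDIA's actual module logic; the 40-144 Hz window is an example):

Code:
# Low-framerate handling: when fps drops below the panel's minimum
# refresh rate, show each frame multiple times so the effective refresh
# lands back inside the supported window.
PANEL_MIN, PANEL_MAX = 40, 144  # Hz, example window

def effective_refresh(fps):
    if fps >= PANEL_MIN:
        return fps, 1  # in range: one refresh per frame
    repeats = 1
    while fps * repeats < PANEL_MIN and fps * (repeats + 1) <= PANEL_MAX:
        repeats += 1
    return fps * repeats, repeats

for fps in (20, 25, 35, 60):
    hz, n = effective_refresh(fps)
    print(f"{fps} fps -> each frame shown {n}x, panel runs at {hz} Hz")

Without the repeats (the FreeSync behavior described above), the panel just pins at 40 Hz while the game delivers 20 fps, and the two drift apart - hence the tearing and judder.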

Though if G-Sync did lose, it wouldn't be the first time superior tech has lost, so it really depends on how much pride gets in the way of Nvidia's future decisions. Their track record in this regard is pretty bad, so that would support the feeling that G-Sync will fail. It would be a shame too, because not only is G-Sync the better alternative, Nvidia has an extremely large fan base with the pockets to pay for gaming monitors. If they reduced (or better, removed) the cost they impose on the monitor manufacturers, it would put them in a position to go toe-to-toe with AMD and Intel, because again, people who care about gaming monitors will have a dedicated graphics card.


It should be noted that if you did invest in G-Sync today, it's not like your investment is out the window if all of a sudden Nvidia threw its hands up. The module in the monitor handles pretty much everything, so your panel will still perform as advertised. It's not like the HD-DVD and Blu-ray war, because in that war you invested in additional costs - buying HD-DVDs instead of Blu-ray movies. You're still buying the same games, your PC is still the same, and you interact with everything the same way. The only difference, though not to be understated, is you are locked into buying Nvidia graphics cards as long as you wish to take advantage of G-Sync.

Right now there is no clear-cut winner. AMD is making all the right moves, and getting Intel's backing is almost unheard of. For the most part their technology is on par with Nvidia's, with the limitations described above. Nvidia has the superior tech and the market share, but Nvidia is going to have to drop some of its pride and play like they're scared. The technology on both sides is definitely worth it, so we gamers are left waiting to see who leaves the colosseum alive. Right now the lowest-cost bidder is leading.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Think of G-sync as the modern-day Betamax.

All hardware with DisplayPort 1.3 ports will support adaptive sync, and that includes NVIDIA. We don't know yet whether Pascal cards will have DisplayPort 1.3 or not.

There is no competition here. NVIDIA developed G-sync to operate on the backbone of DisplayPort 1.2. DisplayPort 1.2a and newer have the feature built in. The relationship is like a person competing with their own heart.

AMD is not at the core here, VESA is. Look at the back of your computer and count how many VESA ports there are. Mine has 3 DVI and 3 DisplayPort.

The embedded DisplayPort (eDP) chip is a lot cheaper than the G-sync module; eDP will also get cheaper over time as adaptive sync becomes the norm for all displays.
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
That's the thing - I don't want to be bound to nVidia or AMD GPUs - I want freedom ;-)
- if my next build were an AMD GPU and I had purchased a G-Sync monitor - that would be like tossing money out of the window.....

Then there is the simple solution of a frame rate cap and enough hardware to keep running at stable FPS. It has always worked very well and is the lowest-input-lag solution in every single situation, and you can freely adapt it to any hardware setup and any game.

That's how I do it, and I will keep doing it until both AMD and Nvidia get their heads out of their asses selling us tech that is readily available already, since it is part of the standards now. It will inevitably come around someday, and the current solutions are buying into an ecosystem, which is exactly why we don't game on Apple or a console. Any gamer with a decent set of brains and insight into the market dynamics stays clear of either solution; sorry if I offended those that buy into this crap (though not really...).
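The idea is simple enough to sketch (a minimal fixed-budget frame limiter; in practice you'd use the engine's own limiter or an external tool, and the 90 fps target is just the number from earlier in the thread):

Code:
import time

TARGET_FPS = 90
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds per frame

def render_frame():
    pass  # stand-in for the game's simulation + render work

deadline = time.perf_counter()
for _ in range(TARGET_FPS * 5):  # run ~5 seconds' worth of frames
    render_frame()
    deadline += FRAME_BUDGET
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)  # burn off the rest of the frame budget
    else:
        deadline = time.perf_counter()  # ran long: reset, don't try to catch up

Capped like this, the GPU never renders more frames than you asked for, frame times stay even, and it works regardless of whose sync tech the monitor carries.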
 
Joined
Dec 17, 2015
Messages
115 (0.04/day)
Think of G-sync as the modern-day Betamax.

All hardware with DisplayPort 1.3 ports will support adaptive sync, and that includes NVIDIA. We don't know yet whether Pascal cards will have DisplayPort 1.3 or not.

There is no competition here. NVIDIA developed G-sync to operate on the backbone of DisplayPort 1.2. DisplayPort 1.2a and newer have the feature built in. The relationship is like a person competing with their own heart.

AMD is not at the core here, VESA is. Look at the back of your computer and count how many VESA ports there are. Mine has 3 DVI and 3 DisplayPort.
Those analogies are just bad. You don't need to buy G-Sync games, so the Betamax comparison doesn't hold. Investing in Betamax and then switching to VHS meant a library overhaul. Going with G-Sync now and then switching to FreeSync with a future monitor purchase is just that - a monitor switch, and people do that all the time. The competing-with-its-own-heart analogy also eludes me. DisplayPort 1.3 won't make G-Sync's job any harder; it will just make FreeSync's job easier. How much, who knows at this point.

Too much is in the air at this point. What will AMD's fix be for FreeSync dropping its variable refresh rate below a monitor's minimum refresh rate? Of lesser importance, how will it catch up on overshoot? How will Nvidia address the proprietary concerns and the price premium? Right now I will agree with you that AMD has the upper hand, but it's not as simple as you think it is. The two technologies can co-exist if Nvidia removes the price premium in favor of knowing G-Sync monitors will keep its clientele (long-term investment vs. short) and can prove to monitor manufacturers that it's worth their time and money to have a G-Sync lineup.


edit:
Apple and console analogy is also terrible. You are not buying an ecosystem. You are not buying a system. It is supporting tech that you can decide to take advantage of or not. People see the word proprietary and immediately put blinders on and think it's all the same - it's not.
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
Think of G-sync as the modern-day Betamax.

All hardware with DisplayPort 1.3 ports will support adaptive sync, and that includes NVIDIA. We don't know yet whether Pascal cards will have DisplayPort 1.3 or not.

There is no competition here. NVIDIA developed G-sync to operate on the backbone of DisplayPort 1.2. DisplayPort 1.2a and newer have the feature built in. The relationship is like a person competing with their own heart.

AMD is not at the core here, VESA is. Look at the back of your computer and count how many VESA ports there are. Mine has 3 DVI and 3 DisplayPort.

The embedded DisplayPort (eDP) chip is a lot cheaper than the G-sync module; eDP will also get cheaper over time as adaptive sync becomes the norm for all displays.
guess working hand in hand with vesa on dp like no one else did means nothing at all
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
edit:
Apple and console analogy is also terrible. You are not buying an ecosystem. You are not buying a system. It is supporting tech that you can decide to take advantage of or not. People see the word proprietary and immediately put blinders on and think it's all the same - it's not.

Did you really think about what you've just said?

You would first go out, pay a premium for either Free- or Gsync and then opt for a GPU of the competing brand that won't support the very feature you just paid a premium for?

Sense - it makes none.
 