
FreeSync or G-Sync or both?

Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m.2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
i just swapped from a new-ish 144 Hz 1 ms TN gaming panel to an IPS 4 ms gaming panel.. i did it for photo editing.. colour reproduction and viewing angles are far better on the IPS style panel and the blacks are deeper..

but it does come down to what folks are used to.. i expect to keep the monitor i now have for quite some time.. i have not the slightest desire to move to 4K even though i do have the gpu power to do it..

trog
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
But response times on IPS and PLS are slow. And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours. So unless you stay out of the cheaper segment IPS is not going to be (much) better than TN while still having the response time trade off. Although I suspect you buy only $300+ monitors.
My TN right now is 5ms. Most IPS panels can match that. IPS always has better viewing angles than TN. Most TVs and smartphones have an IPS display.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
But response times on IPS and PLS are slow. And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours. So unless you stay out of the cheaper segment IPS is not going to be (much) better than TN while still having the response time trade off. Although I suspect you buy only $300+ monitors.

TNs are 6-bit where IPS are 8-bit. Most TNs found on the higher-end gaming monitors are still 6-bit+FRC, with colour coverage between 72-75%. You can easily find an IPS panel for $150 with a 75% colour range. Even an eco or value IPS panel with 65% coverage will look better than a TN panel, and you can get those monitors for $120 and under.
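As a rough illustration of the 6-bit vs 8-bit point (a sketch, not any vendor's actual algorithm — `native_colours` and `frc_sequence` are hypothetical helpers): a 6-bit panel only has 64 levels per channel, and FRC (frame rate control) fakes the in-between levels by alternating the two nearest native levels across refreshes so the eye averages them.

```python
def native_colours(bits_per_channel):
    """Distinct colours a panel can show natively."""
    return (2 ** bits_per_channel) ** 3

def frc_sequence(target_8bit, frames=4):
    """Alternate the two nearest 6-bit levels so their average over
    `frames` refreshes approximates the 8-bit target level.
    (6-bit levels are 4 steps apart on the 0-255 scale.)"""
    low = target_8bit // 4              # nearest 6-bit level at or below
    high = min(low + 1, 63)
    # how many of the `frames` refreshes should use the higher level
    n_high = round((target_8bit - low * 4) / 4 * frames)
    return [high] * n_high + [low] * (frames - n_high)

print(native_colours(6))   # 262144  (native 6-bit)
print(native_colours(8))   # 16777216 (native 8-bit)
print(frc_sequence(130))   # [33, 33, 32, 32] -> averages to 8-bit level 130
```

So a 6-bit+FRC panel approximates the 8-bit colour count temporally, which is why it can still look noticeably worse than a true 8-bit IPS despite accepting the same 24-bit signal.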
 
Last edited:
Joined
Dec 17, 2015
Messages
115 (0.04/day)
Negative. Adaptive sync is agnostic. If you have an adaptive sync graphics processor and an adaptive sync monitor, adaptive sync will be enabled by default. The purpose of the technology is to act without user input. Again, the goal is to reduce bandwidth requirements as well as reduce idle power consumption. The branding matters not.

That's describing the panel, which inadvertently describes the minimum refresh rate the eDP will refresh at. The frame rate from the GPU can be lower--eDP will fill in the gaps to keep it at or above the minimum.

Looks like you realized your mistake on the last paragraph but we'll address that soon. :)

AdaptiveSync is agnostic but again... FreeSync is not. And the point of this topic is FreeSync vs G-Sync.

From AMD: FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits

More from AMD...
Q: What are the requirements to use FreeSync?
A: To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort™ connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.​

Yup, agnostic. :rolleyes:

I never said 1.3 is when it was first supported. It will be the first version supported by NVIDIA, Intel, and the rest of the industry.
Actually you did say it. You also said that DisplayPort 1.3 will allow it to work natively without any other hardware. Want me to quote that too? You made a big stink about how DisplayPort 1.3 will be the end of G-Sync, but nothing in DisplayPort 1.3's spec supports that claim of yours.
All hardware with DisplayPort 1.3 ports will support adaptive sync and that includes NVIDIA.



FYI, AMD Crimson drivers added "Low Framerate Compensation" for FreeSync:

G-sync lost its technical edge with a driver update. eDP/adaptive sync is just that awesome. :laugh:


Edit: Interesting caveat there: "greater than or equal to 2.5 times the minimum refresh rate."
30 Hz -> 75 Hz
35 Hz -> 87.5 Hz
40 Hz -> 100 Hz
42 Hz -> 105 Hz
47 Hz -> 117.5 Hz
48 Hz -> 120 Hz
56 Hz -> 140 Hz

That's definitely something buyers should be aware of.

Edit: Looks like LFC should work on all 144 Hz displays.
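The 2.5x rule above can be sketched in a few lines (hypothetical helpers, not AMD's actual driver code): when the game's frame rate drops below the panel's minimum refresh, the driver repeats each frame enough times to land the effective refresh back inside the variable-refresh window.

```python
def lfc_supported(min_refresh, max_refresh):
    """LFC requires the max refresh to be >= 2.5x the minimum
    (the caveat quoted above)."""
    return max_refresh >= 2.5 * min_refresh

def effective_refresh(fps, min_refresh):
    """Repeat each frame until the refresh rate is back at or
    above the monitor's minimum."""
    if fps >= min_refresh:
        return fps              # already inside the window
    multiplier = 2
    while fps * multiplier < min_refresh:
        multiplier += 1
    return fps * multiplier

# A 40-144 Hz panel qualifies (144 >= 100)...
print(lfc_supported(40, 144))       # True
# ...but a 48-75 Hz panel does not (75 < 120).
print(lfc_supported(48, 75))        # False
# A 25 fps game on the 40 Hz floor panel is shown at 50 Hz (each frame twice).
print(effective_refresh(25, 40))    # 50
```

Which also shows why all 144 Hz displays should qualify: even a 48 Hz floor needs only 120 Hz at the top.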

Did you just find out about the driver update? :) There are downsides to it being a driver update, such as needing driver updates for new monitors, some monitors being left out (and some are, btw), and it is definitely not the most elegant solution, which leaves room for bugs (some users experience overshoot, flickering and ghosting). Then you have the caveat that you mentioned. The upper-end resolutions are where this will have the most effect.

Verdict: A good start, but still room for improvement

So no, Nvidia didn't lose the technical edge ...yet.

Also here is the video in which they recommended those changes to AMD.


This debate is tired. I said what I wanted to say.
You know the jargon but your information is off.
Do your pre-emptive victory cheer. Get the last word in. I'm done. We are not getting anywhere. You never see anyone else's point of view but your own.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Looks like you realized your mistake on the last paragraph but we'll address that soon. :)

AdaptiveSync is agnostic but again... FreeSync is not. And the point of this topic is FreeSync vs G-Sync.

From AMD: FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits

More from AMD...
Q: What are the requirements to use FreeSync?
A: To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort™ connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.​

Yup, agnostic. :rolleyes:
Intel will undeniably call it something else. My point is that the name does not matter. If you buy a monitor that is branded as "FreeSync compatible," you'll be able to plug it into an AMD graphics card and use "FreeSync" or plug it into an Intel IGP and it will use "[insert name here]." The branding doesn't matter. If your GPU supports external adaptive sync (be it DisplayPort 1.2a or newer, or whatever HDMI backbone AMD is using for that) and your monitor supports adaptive sync, it will work. The name used to market it doesn't matter. You cited AMD, so AMD naturally uses FreeSync branding. It is still agnostic--part of the DisplayPort standard.

Actually you did say it. You also said that DisplayPort 1.3 will allow it to work natively without any other hardware. Want me to quote that too? You made a big stink about how DisplayPort 1.3 will be the end of G-Sync, but nothing in DisplayPort 1.3's spec supports that claim of yours.
facepalm.jpg

Instead of arguing, I'm just going to be very blunt. External adaptive sync was/is supported starting at...
AMD: DisplayPort 1.2a
Everyone else: DisplayPort 1.3

DisplayPort 1.3 FAQs
VESA said:
Q: Does the release of DisplayPort 1.3 mean that DisplayPort 1.2 products are obsolete?

A: Not at all. VESA develops and publishes standards like DisplayPort prior to their actual deployment in the field. DisplayPort 1.2a represents the latest interconnect technology now available to consumers from manufacturers. The new DisplayPort capabilities included in DisplayPort 1.3 have begun the cycle of hardware development that will result in such technology becoming available to consumers in a range of products over the next few years. And like other new versions of DisplayPort, DisplayPort 1.3 is backward compatible with earlier DisplayPort standards.
The reason 1.2a exists is that AMD wanted support in 2014 and didn't want to wait until 2016 for 1.3, because G-sync was already out the door. AMD...and VESA...couldn't wait.

Did you just find out about the driver update? :) There are downsides to it being a driver update, such as needing driver updates for new monitors, some monitors being left out (and some are, btw), and it is definitely not the most elegant solution, which leaves room for bugs (some users experience overshoot, flickering and ghosting). Then you have the caveat that you mentioned. The upper-end resolutions are where this will have the most effect.
The "driver update" is AMD only. It requires no changes in the monitor. Note the last bullet on the slide: "No user configuration or proprietary monitor hardware required." If you're wondering why that is, it is because of the eDP chip inside the monitor. The LFC bug was purely in software because AMD didn't code in a solution to the "FPS < Min Refresh" problem. That's been fixed.

Despite those couple of complaints, my first impression of LFC with the Crimson driver is one of satisfied acceptance. It's a great start, there is room to improve, but AMD is listening and fixing technologies to better compete with NVIDIA as we head into 2016.
AMD will fix the bugs in time and do so through software updates. There's nothing wrong with the hardware in the card nor the monitor. Their algorithms are still just a little rough around the edges. Such is the nature of working with a complex industry standard.

Intel and NVIDIA are likely to go through the same iterations to apply and improve external adaptive sync support.
 
Last edited:
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
FreeSync monitors come "for free" (and that's a problem for AMD, in some way) since most scaler chips out there support the tech out of the box.
AMD's way is FREE FOR EVERYONE to use, being an industry standard. (please spare me the "adaptive sync is not FreeSync" argument; in this context it's the same thing).

Tests show FreeSync to be slightly superior to G-Sync (the former increases FPS a bit, the latter decreases it).

nVidia's stuff comes at a price (more chips to integrate into the monitor), AND IT LIMITS THE MONITOR TO HAVING ONLY ONE PORT: a G-Sync enabled DISPLAYPORT. (it will still work as a normal DisplayPort though).
FreeSync has no such restriction. (AMD even promised FreeSync over HDMI!)

Since nVidia officially announced it is NOT licensing the G stuff to anyone (and frankly there being no good reason to endorse G over F anyhow), Intel or AMD jumping on the G-Sync wagon is extremely unlikely. (nVi exclusiveness is its only point)

All that is needed for nVidia to jump on the FreeSync wagon is for AMD to re-gain GPU market share (which is shamefully low given how many great products they have on the market at the moment, but that's a different discussion).
 
Last edited:
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
Let's all keep our mouths shut until the arrival of Pascal and Polaris :toast:
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Let's all keep our mouths shut until the arrival of Pascal and Polaris :toast:

What? Without speculation there wouldn't be much posting at all in tech forums. I'm speculating that of course.
 
Joined
Jan 26, 2016
Messages
407 (0.14/day)
Location
UK
System Name it needs a name?
Processor Xeon E3-1241 v3 @3.5GHz- standard clock
Motherboard Asus Z97A 3.1
Cooling Bequiet! Dark Rock 3 CPU cooler, 2 x 140mm intake and 1 x 120mm exhaust PWM fans in the case
Memory 16 GB Crucial DDR3 1600MHz CL9 Ballistix Sport 2 x 8 GB
Video Card(s) Palit 980ti Super Jetscream
Storage Sandisk X110 256GB SSD, Sandisk Ultra II 960GB SSD, 640GB WD Blue, 12TB Ultrastar
Display(s) Acer XB270HU
Case Lian Li PC 7H
Audio Device(s) Focusrite Scarlett 6i6 USB interface
Power Supply Seasonic P660
Mouse cheapo logitech wireless
Keyboard some keyboard from the 90's
Software Win10pro 64bit
Tests show FreeSync to be slightly superior to G-Sync (former increases FPS a bit, latter decreases).
Which tests? Everything I've read gives gsync a slight advantage at the moment.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
90% sure Polaris will support DisplayPort 1.3 and 60% on Pascal. The future of GPUs is higher resolutions and refresh rates. The best path to those is DP 1.3.

I think the reason why Fiji doesn't have HDMI 2.0 is because AMD wanted HDMI 2.0a for external adaptive sync. AMD had no intent of ever using HDMI 2.0. That inevitably means HDMI 2.1 is likely to make external adaptive sync a standard too but it will come some time after Polaris (or announced at about the same time).
 
Last edited:
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
What? Without speculation there wouldn't be much posting at all in tech forums. I'm speculating that of course.
Speculation, huh? I thought my rather provocative* and puny attempt at being funny would lead to even more posts. Trolls attract attention, right? :fear:
*self-irony
90% sure Polaris will support DisplayPort 1.3 and 60% on Pascal.
Found this:
This release time frame will coincide with AMD’s and NVIDIA’s launch of next generation of GPUs which will be equipped with DisplayPort 1.3.
Source: http://120hzmonitors.com/samsung-144hz-3440x1440-ultrawide-monitors/
 
Last edited:
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Haha, can one be DP 1.3 compliant without the FreeSync (Adaptive Sync) thing? :)

Everything I've read gives gsync a slight advantage at the moment
Everything you read... where? And what "slight advantage"?
Tests show GS reducing FPS a bit (about -2%), while FS gives an (even smaller) bump.
http://www.anandtech.com/show/9097/the-amd-freesync-review/4

But, hell yeah, then there is FUDzilla, fueled by pay rolled nTrolls quite a bit, about OMG nidividiaaaas infinite greatness.
The effectiveness of troll tactics is depressing. The 380 is about 10% faster than the 960, consumes about 20% more power, and is regarded as "you can fry... on it". (applies to most AMD vs nVidia products).
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Hmm......this thread is getting more and more interesting. I just wonder - am I the only one who does not switch monitors every second year?? I maybe switch GPU every second or third year,
but my monitor..... well the one I primarily use now is 5+ years old - and it works great for the games I play (the newest is Fallout 4) - but then again I only run at 1920 x 1080.
All I get out of this thread is still: buy a monitor that has a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.
Monitors are something I don't upgrade very often, the only exception being that I switched from a 2160p monitor to a 1440p 144 Hz monitor because it was better for my gaming and I wanted to try FreeSync on it. I normally change my cards every other generation, with some exceptions in the past, but I try to stick to this method now. Also, if a monitor says "FreeSync", it means it is adaptive sync; they are only using that name because it will be more recognized by gamers, but it's the exact same thing (FreeSync is just AMD's implementation on their cards to take advantage of it).

The problem is whether you want to pay for it or not at the end of the day. Adaptive Sync (FreeSync) monitors are generally more like standard monitors: they are not proprietary and contain extra hookups, whereas a G-Sync monitor is going to have one DP input and be locked exclusively to Nvidia hardware. It's a hard sell right now either way, because DP 1.3 monitors could come out this year and be supported by all (heck, maybe we could get lucky and 1.2a monitors will work on them with Adaptive Sync), and then you will have a much more obvious choice. Otherwise (or if Nvidia locks it out) we're still going to be stuck with the same debate. It's not really a matter of which is better, as within their ranges they perform equally well (though again, G-Sync goes down to 30 Hz and the others are generally down to 40 Hz, with one I believe at 35 currently), so it's a matter of whether you want to pay for it, what GPU you have, and how long you want to keep both.
 
Joined
Jan 26, 2016
Messages
407 (0.14/day)
Location
UK
System Name it needs a name?
Processor Xeon E3-1241 v3 @3.5GHz- standard clock
Motherboard Asus Z97A 3.1
Cooling Bequiet! Dark Rock 3 CPU cooler, 2 x 140mm intake and 1 x 120mm exhaust PWM fans in the case
Memory 16 GB Crucial DDR3 1600MHz CL9 Ballistix Sport 2 x 8 GB
Video Card(s) Palit 980ti Super Jetscream
Storage Sandisk X110 256GB SSD, Sandisk Ultra II 960GB SSD, 640GB WD Blue, 12TB Ultrastar
Display(s) Acer XB270HU
Case Lian Li PC 7H
Audio Device(s) Focusrite Scarlett 6i6 USB interface
Power Supply Seasonic P660
Mouse cheapo logitech wireless
Keyboard some keyboard from the 90's
Software Win10pro 64bit
Everything you read... where? And what "slight advantage"?
Tests show GS reducing FPS a bit (about -2%), while FS gives an (even smaller) bump.
http://www.anandtech.com/show/9097/the-amd-freesync-review/4

But, hell yeah, then there is FUDzilla, fueled by pay rolled nTrolls quite a bit, about OMG nidividiaaaas infinite greatness.
The effectiveness of troll tactics is depressing. The 380 is about 10% faster than the 960, consumes about 20% more power, and is regarded as "you can fry... on it". (applies to most AMD vs nVidia products).
from the summary of the link you posted-

"Except for a glitch with testing Alien Isolation using a custom resolution, our results basically don’t show much of a difference between enabling/disabling G-SYNC/FreeSync – and that’s what we want to see. While NVIDIA showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even showed a measurable 2.5% performance increase with G-SYNC and Tomb Raider. But again let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync meanwhile shows results that are well within the margin of error."

So they found a glitch with one game, Alien Isolation.


Here's some gsync vs freesync articles

http://www.tomshardware.co.uk/amd-freesync-versus-nvidia-g-sync-reader-event,review-33278.html

http://www.pcworld.com/article/2974...rate-displays-make-pc-games-super-smooth.html

http://www.techradar.com/news/compu...cards/nvidia-g-sync-vs-amd-freesync-1289637/2

http://www.digitaltrends.com/computing/nvidia-g-sync-or-amd-freesync-pick-a-side-and-stick-with-it/


I'm not saying gsync is a better option-there are benefits to both. Freesync being cheaper is a big plus in my view.
But mostly it depends on what GPU is used.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Haha, can one be DP 1.3 compliant without FreeSync (Adaptive Sync) thing? :)
No in regard to the hardware (by installing a DisplayPort 1.3 connector, the hardware is there to do it) and yes in regard to the software (the drivers determine under what conditions to send a signal to the monitor). Most of the magic occurs in the eDP chip in the display, but the drivers still have to instruct it what to do by sending it a signal.

There is a real danger that NVIDIA cards could look for a G-sync module and, if they don't find one, disable adaptive sync. I'm thinking this might actually be likely in Pascal for two reasons:
1) NVIDIA wants to milk the G-sync cash cow a while longer.
2) NVIDIA needs time to not only implement adaptive sync in their drivers but also fine tune it so it matches or exceeds AMD's implementation.

Why would NVIDIA deliberately put itself behind AMD when they have a competitive product that is smoother despite being more expensive? I can only think of one reason why they wouldn't: cost. Obviously I don't have insider knowledge to know if that is a factor or not.


What isn't known is how Intel plays into this. They don't have a horse in the race yet like AMD and NVIDIA do. Are they going to push their implementation out the door and tweak it like AMD, or are they going to bide their time perfecting it before they push it out the door? I'm 50/50 on that one. I don't know.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
So they found a glitch with one game, Alien Isolation.
Check other bits, it's just easier to see there:
http://www.anandtech.com/Gallery/Album/4315#6

Here's some gsync vs freesync articles
Sorry, I missed which one of them is saying GS is faster than FS.

I'm not saying gsync is a better option-there are benefits to both. Freesync being cheaper is a big plus in my view.
But mostly it depends on what GPU is used.

G-Sync has no visible technological advantage ("oh, the monitor's manufacturer needs to do bla" isn't technical), yet it costs about $100 to support and restricts the number of ports to one.
It's an attempt to leverage a dominant market position to bar competitors, akin to PhysX.
There is no (customer) benefit in it. nVidia could benefit from it, if AMD didn't strike back with FS. It still can if AMD goes bust.

There is a real danger that NVIDIA cards could look for a G-sync module and if it doesn't find it, disable adaptive sync.
So, "Dear Asus, put this chip of mine into your monitor, or else I disable adaptive sync"?
But then, which tech will be used for adaptive sync? And if it is GS (in which case the chip is really, really needed), what about the problem that caused all monitors with GS support to have only a single port - did they somehow overcome it?
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
So, "Dear Asus, put this chip of mine into your monitor, or else I disable adaptive sync"?
But then, which tech will be used for adaptive sync? And if it is GS (in which case the chip is really, really needed), what about the problem that caused all monitors with GS support to have only a single port - did they somehow overcome it?
All displays with the VESA-compliant eDP chip. It's really eDP versus G-sync module.

I doubt NVIDIA has any interest in adding more than one port to G-sync displays. I mean, the module makes it expensive; why would you buy the display and not use it? :laugh:

I guess there is a third option I didn't mention: NVIDIA could sell their own eDP chip that is G-sync compliant as well. The G-sync module could check for non-NVIDIA cards and block adaptive sync functionality if found.

I don't know what scares me more: the fact they could do this shady stuff or that I feel it is likely they will. If they do, I hope VESA and AMD pile on lawsuits. AMD would be in a better position because this would be anti-competitive behavior if NVIDIA did it.
 
Last edited:
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m.2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
i own a g-sync monitor so far i cant notice much of a difference with it (g-sync not the monitor) on or off.. mind you i dont play games at 30 or 40 fps.. :)

to be honest i read that much copy and paste or entirely speculative bollocks on this place i am beginning not to trust my own judgement.. :)

trog
 
Joined
Sep 3, 2010
Messages
3,527 (0.71/day)
Location
Netherlands
System Name desktop | Odroid N2+ |
Processor AMD Ryzen 5 3600 | Amlogic S922X |
Motherboard Gigabyte B550M DS3H |Odroid N2+ |
Cooling Inter-Tech Argus SU-200, 3x Arctic P12 case fans | stock heatsink + fan |
Memory Gskill Aegis DDR4 32GB | 4 GB DDR4 |
Video Card(s) Sapphire Pulse RX 6600 (8GB) | Arm Mali G52 |
Storage SK Hynix SSD 240GB, Samsung 840 EVO 250 GB, Toshiba DT01ACA100 1T | Samsung 850 Evo 500GB |
Display(s) AOC G2260VWQ6 | LG 24MT57D |
Case Asus Prime 201 | Stock case (black version) |
Audio Device(s) integrated
Power Supply BeQuiet! Pure Power 11 400W | 12v barrel jack |
Mouse Logitech G500 |Steelseries Rival 300
Keyboard Qpad MK-50 (Cherry MX brown)| Blaze Keyboard
Software Windows 10, Various Linux distros | Gentoo Linux
i just swapped from a new-ish 144 Hz 1 ms TN gaming panel to an IPS 4 ms gaming panel.. i did it for photo editing.. colour reproduction and viewing angles are far better on the IPS style panel and the blacks are deeper..

but it does come down to what folks are used to.. i expect to keep the monitor i now have for quite some time.. i have not the slightest desire to move to 4K even though i do have the gpu power to do it..

trog

My TN right now is 5 ms, and most IPS panels can match that. IPS always has better viewing angles than TN, and many TVs and smartphones use IPS panels.

TNs are typically 6-bit where IPS panels are 8-bit. Most TNs found in higher-end gaming monitors are still 6-bit + FRC, with colour gamut coverage between 72-75%. You can easily find an IPS panel for $150 with around 75% coverage, and even an economy or value IPS panel with around 65% coverage will look better than a TN panel; those monitors can be had for $120 and under.
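To make the FRC trick concrete: a 6-bit panel can only show 64 levels per channel (64³ ≈ 262 thousand colours), so it fakes the in-between 8-bit levels by rapidly alternating between its two nearest native levels across frames. This is just a toy sketch of the idea, not any panel vendor's actual dithering algorithm:

```python
# Toy sketch of temporal dithering (FRC), NOT a real panel's algorithm:
# approximate an 8-bit channel value on a 6-bit panel by alternating
# between the two nearest 6-bit levels over several frames.
def frc_frames(value8, n_frames=4):
    """Return the 6-bit level shown on each frame for an 8-bit input."""
    lo = value8 >> 2               # nearest native 6-bit level below (0..63)
    hi = min(lo + 1, 63)           # next level up, clamped at the maximum
    frac = value8 & 0b11           # how far the input sits between the two
    # Show 'hi' on 'frac' of the 4 frames and 'lo' on the rest;
    # the eye averages the flicker into an in-between shade.
    return [hi if i < frac else lo for i in range(n_frames)]

# 8-bit value 130 sits between 6-bit levels 32 and 33:
print(frc_frames(130))   # -> [33, 33, 32, 32]
```

Averaged over those four frames the panel shows 32.5 out of 63, which maps back to roughly 130 out of 255, so a static image looks close to true 8-bit even though the panel never displays more than 64 levels per channel at any instant.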

Note: I was talking about actual "independent" test results and not manufacturer specs. Specs can be outright lies, especially with monitors and fans. I have been reading reviews on hardware.info recently, and they warned quite clearly about that (they are the kind who grab colorimeters and oscilloscopes to test the screens they review).
And Windows reserves 2 bits of the eight for stuff other than actual colour info (all monitors are 16-17 million colours max).
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Uh... no. There are only a handful of color formats in Windows: 8-bit (256-color palette), 16-bit (5 bits per color, with green sometimes getting 6 bits), 24-bit (8 bits per color), or 32-bit (8 bits per color + alpha).

6-bit displays still get 24 (if not 32) bits of color information; the display simply can't reproduce that many colors.
For 10-bit displays, the graphics driver has to override Windows' color settings: NVIDIA AMD

AMD is pushing 10-bit support to become the norm through HDR.
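For what it's worth, the 24/32-bit layout Ford describes is easy to show in code. This is purely an illustrative sketch of how an ARGB pixel packs 8 bits per channel into one 32-bit value, not anything Windows-specific:

```python
# Illustration of the 32-bit format described above: 8 bits per channel
# (alpha, red, green, blue) packed into a single 32-bit integer.
def pack_argb(a, r, g, b):
    """Pack four 0-255 channel values into one 32-bit ARGB pixel."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_argb(pixel):
    """Split a 32-bit ARGB pixel back into its four 8-bit channels."""
    return ((pixel >> 24) & 0xFF, (pixel >> 16) & 0xFF,
            (pixel >> 8) & 0xFF, pixel & 0xFF)

# 8 bits per colour channel gives 2**24 = 16,777,216 colours -- that is
# where the "16-17 million" figure comes from; the alpha byte carries
# transparency, not extra colour.
opaque_orange = pack_argb(255, 255, 128, 0)
print(hex(opaque_orange))          # -> 0xffff8000
print(unpack_argb(opaque_orange))  # -> (255, 255, 128, 0)
```

A 10-bit-per-channel pixel needs 30 bits for colour alone, which is why it no longer fits the classic 32-bit ARGB layout and the driver has to step in with a different format.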
 
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
And Windows reserves 2 bits of the eight for stuff other than actual colour info (all monitors are 16-17 million colours max).
Please verify these sorts of things next time to avoid embarrassment in the future.
 
Joined
Sep 3, 2010
Messages
3,527 (0.71/day)
Location
Netherlands
System Name desktop | Odroid N2+ |
Processor AMD Ryzen 5 3600 | Amlogic S922X |
Motherboard Gigabyte B550M DS3H |Odroid N2+ |
Cooling Inter-Tech Argus SU-200, 3x Arctic P12 case fans | stock heatsink + fan |
Memory Gskill Aegis DDR4 32GB | 4 GB DDR4 |
Video Card(s) Sapphire Pulse RX 6600 (8GB) | Arm Mali G52 |
Storage SK Hynix SSD 240GB, Samsung 840 EVO 250 GB, Toshiba DT01ACA100 1T | Samsung 850 Evo 500GB |
Display(s) AOC G2260VWQ6 | LG 24MT57D |
Case Asus Prime 201 | Stock case (black version) |
Audio Device(s) integrated
Power Supply BeQuiet! Pure Power 11 400W | 12v barrel jack |
Mouse Logitech G500 |Steelseries Rival 300
Keyboard Qpad MK-50 (Cherry MX brown)| Blaze Keyboard
Software Windows 10, Various Linux distros | Gentoo Linux
Although "embarrassment" is not the thing I feel right now, I see your point.

Although I have to say that FordGT90Concept gave a better answer than you by showing, through explanation, why I was wrong. That usually works better than personal attacks, especially since his reply was enough to make me realise I'd had a derp moment.

BTW, if you still wish to "spank" me for being stupid, please do so in PM (directed at Dethroy , not Ford).
 
Joined
Oct 29, 2012
Messages
842 (0.20/day)
Location
Germany
System Name Perf/price king /w focus on low noise and TDP
Processor Intel Xeon E3-1230 v2
Motherboard Gigabyte GA-B75M-D3H
Cooling Thermalright HR-02 Macho Rev.A (BW)
Memory 16GB Corsair Vengeance LP Black
Video Card(s) Gigabyte GTX 670 OC
Storage 525GB Crucial MX300 & 256GB Samsung 830 Series
Display(s) Home: LG 29UB65-P & Work: LG 34UB88-B
Case Fractal Design Arc Mini
Audio Device(s) Asus Xonar Essence STX /w Sennheiser HD 598
Power Supply be quiet! Straight Power CM E9 80+ Gold 480W
Mouse Roccat Kone XTD optical
Keyboard SteelSeries Apex M500
Software Win10
A personal attack? It was meant as advice, nothing else. But looking at my statement from your point of view, I can see how it may come across differently (the written word is not very good at conveying intent).
 