
FreeSync or G-Sync or both?

  • Thread starter: Deleted member 6693
it seems to me that currently AMD is a "poor man's choice" (hardcore fan base apart).. people buy AMD (cpu or gpu) because it's the cheaper option, not because it's the better option..

looked at this way, g-sync costing more than freesync just follows the otherwise normal pattern..

i think most people tend to look at a monitor as an extra and not as an integral part of their system.. it's the way i used to be.. carefully plan out a system and then chuck in a cheap keyboard, mouse and monitor more as an after-thought than anything else..

i have g-sync but never "bought into it".. it just came as part and parcel of the high end monitor i decided to buy.. i aint gonna knock it but aint gonna enthuse over it either..

i run g-sync and a frame rate cap.. a practice that doesn't seem to be that common.. i aint quite figured out what the best frame rates to run at are but i am getting there.. :)

trog
 
Did you really think about what you've just said?
You're really going to make me do all the work for you aren't you?

Your comparison to Apple...
- With an Apple ecosystem you are hard locked to all software you can use on that computer
- With an Apple ecosystem you are hard locked to any hardware it does and does not accept
- You cannot forego anything to remove those limitations

Your comparison to Consoles...
- With a console all your hardware is locked (except possible HDD upgrades)
- With a console you can only play games designed for that console
- You cannot forego anything to remove those limitations

G-Sync
- You need an Nvidia GPU to take advantage of the G-Sync technology
- You can forego an Nvidia GPU and simply use the monitor as a 144Hz (or whatever the refresh rate is) monitor
- It does not impact your ability to use any software you want
- It does not remove your ability to use any hardware you want

You would first go out, pay a premium for either Free- or Gsync and then opt for a GPU of the competing brand that won't support the very feature you just paid a premium for?

Is it the best or even the smartest move? Of course not but the option is still available. But that wasn't the point. The point is your analogies don't apply here. They are flawed. Either you are trying too hard to think of analogies or your understanding is lacking. I'm guessing the former despite your rudeness but perhaps you should take your own advice.
 
Of course it is póssible, but it is not sénsible... We are posting in a thread where the question posed is 'Freesync or Gsync'. I'll agree the analogy is slightly flawed, but you get the point. It makes no sense to pay a premium for features that you cannot even use, and it can be a reason not to buy into either technology and wait, or not pay any premium for it at all. At thát point, buying either tech is sensible. And that lack of premium will only happen when it gets adopted as standard by both GPU vendors.

With regards to ecosystems, this is also a function of the mind/perception, because you can easily mod an Apple comp to run Windows or build a Hackintosh; it just takes some extra effort. When you buy G-Sync, there is a very slim chance you are ever buying an AMD GPU to go with it; it doesn't fit the perception of what you bought earlier. That is effectively buying into an ecosystem as well. I can also connect my Apple stuff to a non-Apple PC, can't I? It just won't work as well, and that effectively pushes you into buying same-brand stuff.
 
You don't need to buy G-Sync games so the analogy with Betamax is not the same. Investing in Betamax and then switching to VHS meant a library overhaul.
You have to buy a Betamax player (NVIDIA card) and Betamax cassettes (G-sync equipped monitor). Switching to adaptive sync means new monitors and cards (at least until NVIDIA supports it). The similarities are undeniably there.

Going with G-Sync now and then switching to FreeSync with a future monitor purchase is just that - a monitor switch, and people do that all the time. The "competing with its own heart" bit also eludes me. DisplayPort 1.3 won't make G-Sync's job any harder, it will just make FreeSync's job easier. How much, who knows at this point.
Presently, G-sync does not support VESA adaptive sync, but G-sync only exists on the back of VESA's DisplayPort standard (where AMD has already extended the VESA standard to HDMI as well). The analogy stems from the fact that no competition can exist because VESA DisplayPort is not going to compete with VESA DisplayPort. G-sync is a cobbled-together layer placed on top of the adaptive sync standard that has to go away because it requires a device that is not directly compliant with the eDP standard. The G-sync brand could appear on eDP hardware, but doing so would create a lot of confusion in terms of GPU support. It would be best if the G-sync brand was abandoned, but I don't know if NVIDIA will do that--kind of moot until they debut a DisplayPort 1.3 card.

What will AMD's fix be for FreeSync dropping its variable refresh rate below a monitor's min refresh rate?
Reduce settings to get reasonable framerates. Who, seriously, thinks <30 FPS is acceptable? Additionally, there are some indicators that eDP does take minimum refresh rate into consideration, but manufacturers are forced to provide a minimum refresh rate via EDID, so they give a higher number than the eDP can handle. More info here.

The two technologies can co-exist if Nvidia removes the price premium in favor of knowing G-Sync monitors will keep its clientele (long term investment vs short) and can prove to monitor manufacturers that it's worth their time and money to have a G-Sync lineup.
They cannot. G-sync is not a standard, whereas adaptive sync is. The former cannot exist in a market where the latter does--at least not without huge and continued investment from NVIDIA, which they'll have to take a loss on to keep going.


There's an important distinction to be made here: AMD FreeSync branding is moot. FreeSync is an implementation of the adaptive sync standard. It's kind of like how the NX bit is called "XD bit" by Intel and "Enhanced Virus Protection" by AMD. They're different names for the same thing. G-sync, presently, does not fit VESA's adaptive sync mold. Sure, monitor manufacturers advertise "FreeSync" to catch the eye of potential buyers but that is a misnomer. If you buy a FreeSync monitor and plug it into an Intel DisplayPort 1.3 port, it will work just like it would on an AMD card. Adaptive sync is vendor neutral--just like plug and play functionality of all DisplayPort monitors.


guess working hand in hand with vesa on dp like no one else did means nothing at all
AMD did the same with Mantle and Vulkan. Consumers benefit hugely from AMD advancing all of these open standards because they can use any hardware they want (even NVIDIA). How is that a bad thing? The obligatory meme:
[meme image]

NVIDIA can't ignore DisplayPort 1.3 because it comes with a huge boost to bandwidth 4K displays need. If they support DP 1.3 and deliberately axe/ignore adaptive sync in the name of G-sync, they deserve to burn in hell.
 
Too much is up in the air at this point. What will AMD's fix be for FreeSync dropping its variable refresh rate below a monitor's min refresh rate? Of lesser importance, how will it catch up on overshoot? How will Nvidia address the proprietary concerns and price premium? Right now I will agree with you that AMD has the upper hand but it's not as simple as you think it is. The two technologies can co-exist if Nvidia removes the price premium in favor of knowing G-Sync monitors will keep its clientele (long term investment vs short) and can prove to monitor manufacturers that it's worth their time and money to have a G-Sync lineup.
I am confused, because they both do that: once you exit the range you lose FreeSync/G-Sync. G-Sync just goes lower at the moment on the monitors available.


You're really going to make me do all the work for you aren't you?

Your comparison to Apple...
- With an Apple ecosystem you are hard locked to all software you can use on that computer
- With an Apple ecosystem you are hard locked to any hardware it does and does not accept
- You cannot forego anything to remove those limitations

Your comparison to Consoles...
- With a console all your hardware is locked (except possible HDD upgrades)
- With a console you can only play games designed for that console
- You cannot forego anything to remove those limitations

G-Sync
- You need an Nvidia GPU to take advantage of the G-Sync technology
- You can forego an Nvidia GPU and simply use the monitor as a 144Hz (or whatever the refresh rate is) monitor
- It does not impact your ability to use any software you want
- It does not remove your ability to use any hardware you want



Is it the best or even the smartest move? Of course not but the option is still available. But that wasn't the point. The point is your analogies don't apply here. They are flawed. Either you are trying too hard to think of analogies or your understanding is lacking. I'm guessing the former despite your rudeness but perhaps you should take your own advice.
However, the problem in purchasing a G-Sync monitor is the expense, as they cost extra (quite a bit) over a non-sync or Adaptive Sync (FreeSync) monitor. Even though you can use it as a normal monitor, it's also more difficult, as most (I believe I have seen only one so far with more than one) have only a single DisplayPort connection, making them harder to use for other tasks or with multiple devices hooked up without adapters and such. They are strictly designed to be a one-computer gaming monitor for Nvidia cards, which is fine, but it makes it hard to recommend to people unless they don't mind selling off an expensive monitor if they decide to switch vendors, or keeping it without being able to use the tech they paid extra for.

NVIDIA can't ignore DisplayPort 1.3 because it comes with a huge boost to bandwidth 4K displays need. If they support DP 1.3 and deliberately axe/ignore adaptive sync in the name of G-sync, they deserve to burn in hell.

I have a feeling they are going to block it, for no other reason than that they want to keep pushing G-Sync even when they add DP 1.3.
 
I have a feeling they are going to block it, for no other reason than that they want to keep pushing G-Sync even when they add DP 1.3.
I hope not but that possibility exists. I have a feeling they won't block it because display manufacturers won't play ball unless NVIDIA greases their palms. NVIDIA goes where the profits are and I don't see profits in G-sync's future--I see losses.

There's only 9 G-sync monitors available for sale now:
http://www.geforce.com/hardware/technology/g-sync/where-to-buy-g-sync-monitors-and-modules

There are over 50 adaptive sync monitors (9 are HDMI):
http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync

Note the number of manufacturers too. Adaptive sync already has broad industry backing where G-Sync does not.


Additionally, the FreeSync page says right on it that Polaris will have DisplayPort 1.3 support. 90% sure Pascal will too.

I think it is very possible 2016 is the end for G-sync. It will be put on legacy support.
 
I hope not but that possibility exists. I have a feeling they won't block it because display manufacturers won't play ball unless NVIDIA greases their palms. NVIDIA goes where the profits are and I don't see profits in G-sync's future--I see losses.

There's only 9 G-sync monitors available for sale now:
http://www.geforce.com/hardware/technology/g-sync/where-to-buy-g-sync-monitors-and-modules

There are over 50 adaptive sync monitors (9 are HDMI):
http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync

Note the number of manufacturers too. Adaptive sync already has broad industry backing where G-Sync does not.


Additionally, the FreeSync page says right on it that Polaris will have DisplayPort 1.3 support. 90% sure Pascal will too.

I think it is very possible 2016 is the end for G-sync. It will be put on legacy support.
the next nv fad will be an operating system :oops: the next amd venture will be getting another hsa member to make gpus and push out nv.. poor nv needs to wear a helmet o_O even apple is making jokes about them :laugh:
 
You and me both... it's not right. It's like calling an apple an orange....

2K = 2048x1080, for the record. 1080p is 1920x1080. I've never heard of 1K... because it doesn't seem to exist.

2K is NOT 2560x1440!!!! 1440p or QHD is the correct way to refer to it.

Here is a reference: https://en.wikipedia.org/wiki/Display_resolution#/media/File:Vector_Video_Standards8.svg

2k is 1920x1080, like 4k is 3840x2160, which you can see is less than 4000, so 4k is further away from its number than 2k is.

That said, 1920x1080 has every right to be so-called 2k, as 4k is way off its mark with 3840. He means 1920x1080 being 1k when really it's not; they just started a new naming scheme is all.
 
1K (1024 x 576) -- Probably only exists on cheap phones.
Full HD/2K (1920 x 1080)
Ultra HD/4K (3840 x 2160)
5K (5120 x 2880)

Note they are all 16:9 and they all round to their #K.


For the record, I think calling a resolution by any name other than its resolution is stupid.
 
you are probably right.. maybe even more so now that extra wide screens are becoming popular.. the term 4K is accepted and commonly used.. it is easy to remember and write down..

it's become the new must-have buzzword for some.. 4K is roughly 8 million pixels.. 1920 x 1080 is roughly 2 million pixels.. 2560 x 1440 is roughly or a little less than 4 million pixels..

the ratio of 1 2 4 makes sense from a power-needed-to-move-them-all-about or amount-of-pixels-on-the-screen point of view.. hence my incorrect use of 1K 2K and 4K.. tis just the way my poor old brain makes sense of it.. :)
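for anyone who wants to check those rough numbers, here is a minimal Python sketch (purely illustrative; 1080p is taken as the baseline):

```python
# Quick arithmetic check of the pixel counts mentioned above, with 1080p as the baseline.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K UHD (3840x2160)": (3840, 2160),
}

base = 1920 * 1080  # ~2.07 million pixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} Mpixels, {pixels / base:.2f}x the pixels of 1080p")
```

that works out to roughly 2.1, 3.7 and 8.3 million pixels, i.e. about a 1 : 1.8 : 4 ratio, so the "1 2 4" rule of thumb is close enough..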

i go back to the 640 x 480 days.. he he

trog
 
@Vayra86
Even if Nvidia says FreeSync wins, they will still provide backward compatibility support for G-Sync. You only have two options for graphics cards anyway. It's one or the other. Not like you are losing out on a breadth of options. The reason Nvidia dominates the market share is because most of its user base is loyal and will keep buying Nvidia cards. This has been a thing since before G-Sync and will most likely still be a thing. So being locked into Nvidia isn't as big of a negative to many (read, not all) of them.

While the price premium does suck (not arguing this), at least you get something for that premium - superior tech. Many people are quite OK with paying a premium for better technology. I am. I supported SSDs when they were new and expensive. If I get something for my money it's not as evil as many make it out to be. To be clear, since people are analogy happy, I am not comparing G-Sync to SSDs. Only saying I personally don't mind paying a price premium for better tech.


@FordGT90Concept
The Betamax player analogy is really pointless. You should not need me to tell you Betamax Cassettes != a friggin monitor. The similarities exist only in your head. Stop with the analogies. We get it - you hate G-Sync and Nvidia. Instead of trying to think of analogies, focus on differences in the technology.

Now if you go back and read my posts you will see that I agree with you that AMD has Nvidia by the short ones and that Nvidia has a track record of bad habits. What I don't agree with you on is that it's as simple as you make it out to be. There is a very clear and distinct difference there.

First you make it sound like all of a sudden some recent information came out that put the final nail in G-Sync's coffin - using DisplayPort 1.3 as your argument. Truth is, what you are referring to is old news - well, for the computer industry. VESA created Adaptive Sync in 2009 but it was not implemented. Nvidia took the opportunity to develop and release G-Sync. In response to that AMD announced FreeSync, which was VESA's Adaptive Sync. So all of a sudden a technology that wasn't pushed at all was used to combat Nvidia's release of G-Sync. Adaptive Sync was actually supported by DisplayPort 1.2a - so no... 1.3 is not when Adaptive Sync was first supported. It should be noted that 1.2a was released in 2014... so was the spec for 1.3. Even with that you still need a FreeSync-enabled monitor as it needs the chip. FreeSync monitors win on price. G-Sync monitors win on tech. Since Nvidia is not supporting Adaptive Sync we are back at square one... you need an AMD card for FreeSync and you need an Nvidia card for G-Sync. So we can discuss DisplayPort all we want but in the end very little has changed in that regard. In spite of the DisplayPort changes, guess what... monitor manufacturers are still releasing G-Sync monitors. :eek:

A noteworthy change is Intel's integrated graphics will now support AdaptiveSync, but how many people running integrated graphics will be buying a gaming monitor?

You kind of have AMD's triumph on DisplayPort 1.3 a little off. The big win with DisplayPort 1.3 is the fact that it enables 5120×2880 displays at a 60 Hz refresh rate, 1080p monitors up to 240 Hz, 170 Hz for HDR 1440p screens, and 144 Hz for 3840×2160. The upper-end displays will most likely have an announcement date toward the end of the year - in case anyone is curious.
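A rough way to see where those figures come from: DisplayPort 1.3 moves the link to HBR3 (8.1 Gbit/s per lane over four lanes, with 8b/10b encoding leaving roughly 25.9 Gbit/s of usable payload). The sketch below is only a back-of-the-envelope estimate that counts raw pixel data and ignores blanking intervals and audio, so real requirements are somewhat higher; the bit depths chosen for each mode are my assumptions, not anything from a spec sheet.

```python
# Back-of-the-envelope bandwidth check for the DisplayPort 1.3 modes mentioned above.
# Raw pixel data only: blanking intervals and protocol overhead are ignored, so the real
# requirements are somewhat higher than these figures.

DP13_PAYLOAD_GBPS = 4 * 8.1 * 0.8   # 4 lanes of HBR3 at 8.1 Gbit/s, 8b/10b -> ~25.9 Gbit/s usable

modes = [
    ("5120x2880 @ 60 Hz, 8 bpc",         5120, 2880,  60, 24),
    ("1920x1080 @ 240 Hz, 8 bpc",        1920, 1080, 240, 24),
    ("2560x1440 @ 170 Hz, 10 bpc (HDR)", 2560, 1440, 170, 30),
    ("3840x2160 @ 144 Hz, 8 bpc",        3840, 2160, 144, 24),
]

for name, w, h, hz, bpp in modes:
    gbps = w * h * hz * bpp / 1e9
    verdict = "fits" if gbps <= DP13_PAYLOAD_GBPS else "tight/over once blanking is added"
    print(f"{name}: {gbps:5.1f} Gbit/s of {DP13_PAYLOAD_GBPS:.1f} available -> {verdict}")
```

The first three fit comfortably; the 4K/144 Hz case comes out slightly over the payload limit, which suggests that mode leans on reduced blanking or chroma subsampling in practice.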

The above is great, but honestly what I feel is the biggest win for AMD is getting FreeSync to work over HDMI 2.0, since not all monitors support DisplayPort. This will definitely increase the number of monitors that are FreeSync certified. Granted, this is a low cost solution since HDMI doesn't have the bandwidth DisplayPort does, so all serious gamers will still go for a DisplayPort monitor. But it does open up options in the lower price tier of monitors. So now low cost game systems can enjoy variable refresh rates on low cost monitors. AMD always kind of dominated on low cost GPUs, but now there is even more reason for people looking for a low cost GPU to invest in AMD instead of Nvidia.


As far as your clarification of FreeSync and AdaptiveSync, you should realize that the two, while based on the same technology, are not mutually inclusive. A FreeSync certified monitor is always based on AdaptiveSync but not every AdaptiveSync monitor is FreeSync certified. Your same paragraph makes it sound like FreeSync is completely plug-n-play with DisplayPort 1.3. Again, you need an AMD graphics card for FreeSync to work just like you need an Nvidia graphics card for G-Sync. See how we came full circle again? Now when Intel releases its first line of CPUs that are AdaptiveSync enabled, then your statement becomes true, but DisplayPort 1.3 does not magically enable FreeSync on its own.

Also you should not assume every monitor's min refresh rate is 30Hz because some are actually 40Hz and above. It's not hard to dip below 40 FPS on a recent title. If you are going to buy a FreeSync monitor this is one of the most important things you should look up.


That list of G-Sync monitors you posted is out of date. AMD still has the advantage but there are more than 9 G-Sync monitors :rolleyes:
- List of FreeSync monitors
- List of G-Sync monitors

Again, the important distinction between your POV and mine is that I don't think it is as clear cut as you do. As I said, AMD has the upper hand and Nvidia has a bad track record with the tech it likes to push. In fact, I think I provided more examples of how AMD has the upper hand. But the analogies in this thread are severely flawed. Your information is a bit off and you paint this picture that if someone invests in a G-Sync monitor they're screwed - not true.
 
VESA created Adaptive Sync in 2009 but it was not implemented.
Only in eDP (think laptop monitors)...
VESA said:
http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.

DisplayPort 1.2a was designed specifically for AMD to push FreeSync out the door. DisplayPort 1.3 is to put everyone (AMD, Intel, NVIDIA, and so on) on the same page with support for external adaptive sync. There are no DisplayPort 1.3 devices out yet--they're coming this year en masse.

A noteworthy change is Intel's integrated graphics will now support AdaptiveSync, but how many people running integrated graphics will be buying a gaming monitor?
Businesses, everywhere. When nothing changes on the screen (think your typical workstation screen), the GPU can literally shut off because the display already has everything it needs thanks to its onboard memory in the eDP. This tech goes far beyond high refresh rates.


The only reason why AMD bothered with FreeSync and HDMI is for consoles...likely beginning with the Nintendo NX. It's not really aimed at the computer market...other than the Nano which is aimed at HTPCs. Remember, the Nano has 3 DisplayPorts, each capable of Multi-Stream Transport, where its sole HDMI port can only power one display. AMD is very committed to DisplayPort and the only reason why they put up with HDMI at all is because home theaters are sticking to it.


As far as your clarification of FreeSync and AdaptiveSync, you should realize that the two, while based on the same technology, are not mutually inclusive. A FreeSync certified monitor is always based on AdaptiveSync but not every AdaptiveSync monitor is FreeSync certified. Your same paragraph makes it sound like FreeSync is completely plug-n-play with DisplayPort 1.3. Again, you need an AMD graphics card for FreeSync to work just like you need an Nvidia graphics card for G-Sync. See how we came full circle again? Now when Intel releases its first line of CPUs that are AdaptiveSync enabled, then your statement becomes true, but DisplayPort 1.3 does not magically enable FreeSync on its own.
Negative. Adaptive sync is agnostic. If you have an adaptive sync graphics processor and an adaptive sync monitor, adaptive sync will be enabled by default. The purpose of the technology is to act without user input. Again, the goal is to reduce bandwidth requirements as well as reduce idle power consumption. The branding matters not.

Of course this isn't true of G-Sync, in its current state, because it is non-standard.


Also you should not assume every monitor's min refresh rate is 30Hz because some are actually 40Hz and above. It's not hard to dip below 40 FPS on a recent title. If you are going to buy a FreeSync monitor this is one of the most important things you should look up.
That's describing the panel, which inadvertently describes the minimum refresh rate the eDP will refresh at. The frame rate from the GPU can be lower--eDP will fill in the gaps to keep it at or above the minimum.
 
Ya I addressed that...

Nvidia took the opportunity to develop and release G-Sync. In response to that AMD announced FreeSync, which was VESA's Adaptive Sync. So all of a sudden a technology that wasn't pushed at all was used to combat Nvidia's release of G-Sync. Adaptive Sync was actually supported by DisplayPort 1.2a - so no... 1.3 is not when Adaptive Sync was first supported.
 
I never said 1.3 is when it was first supported. It will be the first supported by NVIDIA, Intel, and the rest of the industry.



FYI, AMD Crimson drivers added "Low Framerate Compensation" for FreeSync:
[screenshot: AMD Crimson driver slide announcing Low Framerate Compensation]

http://videocardz.com/57776/amd-launches-radeon-software-crimson-driver

G-sync lost its technical edge with a driver update. eDP/adaptive sync is just that awesome. :laugh:


Edit: Interesting caveat there: "greater than or equal to 2.5 times the minimum refresh rate."
30 Hz -> 75 Hz
35 Hz -> 87.5 Hz
40 Hz -> 100 Hz
42 Hz -> 105 Hz
47 Hz -> 117.5 Hz
48 Hz -> 120 Hz
56 Hz -> 140 Hz

That's definitely something buyers should be aware of.

Edit: Looks like LFC should work on all 144 Hz displays.
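To put rough numbers on that caveat, here is a small Python sketch of the rule as described in the Crimson notes; the helper names and the exact frame-repeat behaviour are my own illustration, not AMD's actual implementation:

```python
import math

def lfc_supported(min_hz: float, max_hz: float) -> bool:
    # Per the caveat above: LFC needs max refresh >= 2.5x the panel's minimum refresh.
    return max_hz >= 2.5 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Refresh rate the panel would plausibly run at for a given frame rate."""
    if fps >= min_hz:
        return min(fps, max_hz)        # inside the VRR window: refresh tracks the frame rate
    if not lfc_supported(min_hz, max_hz):
        return min_hz                  # no LFC: VRR stops helping below the panel's floor
    repeats = math.ceil(min_hz / fps)  # repeat each frame enough times to get back in the window
    return fps * repeats

# A few example panels driven at 25 fps:
for panel_min, panel_max in [(30, 75), (40, 144), (48, 75)]:
    print(f"{panel_min}-{panel_max} Hz panel: LFC supported = {lfc_supported(panel_min, panel_max)}, "
          f"25 fps shown at {effective_refresh(25, panel_min, panel_max)} Hz")
```

For a 144 Hz panel, 144 / 2.5 = 57.6, so any advertised minimum up to about 57 Hz qualifies - which lines up with the note above that LFC should work on all 144 Hz displays.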
 
Without sounding trollish - I'd only buy a FreeSync or G-Sync monitor if I was a brand loyalist OR had no problem buying a new monitor when I changed gfx cards.

I've owned Nvidia since I left my 7970s behind, but I still wouldn't splash out on a G-Sync. And likewise, I'm not buying an AMD card just to buy a FreeSync monitor.

Unless Nvidia supports adaptive sync (in which case G-Sync dies), it's a lottery between cards and monitors. Without knowing what the next gen cards are doing, the F/G sync options are like technology prisons you pay to lock yourself into. No thanks. I'll stick with monitor-agnostic powerful cards instead (be it Fury X or 980ti).
 
Edit: I myself am waiting for better monitor offerings (not happy with the current options) and Pascal + Polaris..
same here ... seems that my 980 and a 60Hz 1080p monitor are enough and G-Sync/FreeSync are not really worth anything ... (especially G-Sync ... seriously, proprietary + added cost? no thanks nVidia) with all my apologies to those who think G-Sync is "tha bomb"

(i mean, what's the point of that tech when my framerate is already stable enough and there's no stutter no matter what game i play?)

if i decide one day to go 1440p 144hz (the day manufacturers are "less nuts" on pricing ...) maybe i will consider a G-Sync or FreeSync monitor depending on my GPU (or the 3rd tech that will be "open", replace both, and add no cost to an already expensive enough monitor)
well ... also any "ROG SWIFT" or "gaming" 144hz 1440p monitor costs a little more than my GPU ... while my Philips 27E3LH cost 1/3 of it; i know i am surely missing something ...

Without sounding trollish - I'd only buy a FreeSync or G-Sync monitor if I was a brand loyalist OR had no problem buying a new monitor when I changed gfx cards.
totally true

Unless Nvidia supports adaptive sync (in which case G-Sync dies), it's a lottery between cards and monitors. Without knowing what the next gen cards are doing, the F/G sync options are like technology prisons you pay to lock yourself into. No thanks. I'll stick with monitor-agnostic powerful cards instead (be it Fury X or 980ti).
and true (and G-Sync needs to die anyway, but nVidia wants to make the most money out of the brand's loyalists)
 
liking a certain brand doesn't always turn a person into a flag waving mindless zealot.. i like intel and i like nvidia.. i do so for what i think are good reasons..

in days long past i used to like amd and ati.. i did have good reasons back then.. reasons which sadly don't exist any more.. :)

having spent a fair bit of dosh on a good gaming IPS panel which happens to have g-sync.. i will be a bit pissed off if support for it does die in the near future..

i could live without it.. but i do find it useful..

trog
 
the only chance of g-sync surviving is if they support both and only sell it as the highest premium gaming experience, with every monitor coming perfectly calibrated.. 1ms.. rimless.. curved.. and then there is the realm of 5k.
 
the only chance of g-sync surviving is if they support both and only sell it as the highest premium gaming experience, with every monitor coming perfectly calibrated.. 1ms.. rimless.. curved.. and then there is the realm of 5k.

it will definitely die off then.. he he

i reckon i am gonna turn mine off just to see what perceptible difference it actually makes.. i won't be entirely surprised if i don't see any difference.. but unlike most i can at least find out for real.. he he

trog

ps.. running mad max at 90 fps at 1440 resolution with the monitor refresh rate set at 120 hz and g-sync off i can't see any noticeable difference.. i will leave it off for longer and look f-cking harder.. he he he
 
Hmm......this thread is getting more and more interesting. I just wonder - am I the only one who does not switch monitors every second year?? I maybe switch GPU every second or third year,
but my monitor ..... well, the one I primarily use now is 5+ years old - and it works great for the games I play (the newest is Fallout 4) - but then again I only run at 1920 x 1080.

All I get out of this thread is still: buy a monitor that has a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.
 
Not at all, I keep mine as long as possible. My HDTV is about 4 years old now and I cannot see it being replaced any time soon unless it breaks down. I'm in no rush; G-Sync and FreeSync are not even matured yet, and on top of that Home Theater has not caught up yet.

I just wish AMD drivers would recognize that it's a 10bit panel lol.

So I am just waiting to see what happens.
 
I'd hope to get 5+ yrs out of a monitor
 
Hmm......this thread is getting more and more interesting. I just wonder - am I the only one who does not switch monitors every second year?? I maybe switch GPU every second or third year,
but my monitor ..... well, the one I primarily use now is 5+ years old - and it works great for the games I play (the newest is Fallout 4) - but then again I only run at 1920 x 1080.

All I get out of this thread is still: buy a monitor that has a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.
I haven't changed monitors in probably 7+ years. I change GPUs every 5 or fewer years. I'm going to keep using what I got until it dies. At which point, I'm hoping for an affordable HDR, IPS, adaptive sync panel. I'm hoping they don't die soon because the former is only just happening. HDR and IPS are definitely more important to me than adaptive sync.
 
But response times on IPS and PLS are slow. And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours. So unless you stay out of the cheaper segment IPS is not going to be (much) better than TN while still having the response time trade off. Although I suspect you buy only $300+ monitors.
 