
NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
Go get some more milk from mommy.
You were making a good point up until that.
Mommy should have aborted this mouth breather.
Really?

Maybe you're the one that needs growing up, eh?

I do agree with adaptive-sync being very useful, but whether or not it's the future remains to be seen.

It's like playing on CRT again.
CRTs had motion blur as well. It was less pronounced at lower refresh rates, but it was still there.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
You continue to assume gaming can only happen on the very high end...

Well, I have G-Sync and I choose ULMB over it. It's simply a much better experience. I rarely experience tearing, so why would I accept motion blur all the time when I can have no/low motion blur and lower input lag?

G-Sync and FreeSync are good for high-res gaming on high settings, where fps can dip way below 100. I sometimes play single-player games with it. But fast-paced multiplayer games? Never. ULMB all the way. It's much easier to track enemies with no motion blur; it's a night-and-day difference.

I'm not telling people how to play their games, but I have tried tons of gaming monitors and G-Sync and FreeSync have never really impressed me. Mostly because I very rarely settle for less than 100 fps, and tearing is not a big issue at high fps.
 

bug

Joined
May 22, 2015
Messages
13,226 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Well, I have G-Sync and I choose ULMB over it. It's simply a much better experience. I rarely experience tearing, so why would I accept motion blur all the time when I can have no/low motion blur and lower input lag?

G-Sync and FreeSync are good for high-res gaming on high settings, where fps can dip way below 100. I sometimes play single-player games with it. But fast-paced multiplayer games? Never. ULMB all the way. It's much easier to track enemies with no motion blur; it's a night-and-day difference.

I'm not telling people how to play their games, but I have tried tons of gaming monitors and G-Sync and FreeSync have never really impressed me. Mostly because I very rarely settle for less than 100 fps, and tearing is not a big issue at high fps.
So all this is you telling us that at high FPS ULMB is better and at low FPS GSync/FreeSync is better? We already knew that.

And about the lag that syncing adds: it means your mouse cursor will be lagging at most until the next refresh. At 60 fps that's about 17ms. At 100 fps it's 10ms. That's still lag, but well into negligible territory.
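If anyone wants to sanity-check those numbers, here's a minimal Python sketch (the refresh rates are just example values and the helper name is mine) that works out the worst-case wait until the next refresh:

Code:
def worst_case_sync_lag_ms(refresh_hz: float) -> float:
    # A synced frame can wait at most one full refresh interval before it is scanned out.
    return 1000.0 / refresh_hz

for hz in (60, 100, 120, 144, 240):
    print(f"{hz:>3} Hz -> up to {worst_case_sync_lag_ms(hz):.1f} ms of added wait")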
 

las

So all this is you telling us that at high FPS ULMB is better and at low FPS GSync/FreeSync is better? We already knew that.

And about the lag that syncing adds: it means your mouse cursor will be lagging at most until the next refresh. At 60 fps that's about 17ms. At 100 fps it's 10ms. That's still lag, but well into negligible territory.

I'm saying that at high fps, ULMB is better than G-Sync/FreeSync.

https://www.blurbusters.com/persistence-vs-motion-blur/

It all adds up. I'm going for the lowest possible input lag.
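The linked article boils down to a simple rule of thumb: roughly 1 ms of persistence gives about 1 pixel of motion blur per 1000 px/s of eye-tracked motion. Here's a rough Python sketch, with assumed example numbers, of why a low-persistence mode tracks so much better than plain sample-and-hold:

Code:
def blur_width_px(persistence_ms: float, eye_tracking_speed_px_per_s: float) -> float:
    # Rule of thumb: 1 ms of persistence smears ~1 pixel
    # for every 1000 px/s of on-screen motion you track with your eyes.
    return persistence_ms * eye_tracking_speed_px_per_s / 1000.0

speed = 2000.0  # assumed example: tracking a target moving 2000 px/s
print(blur_width_px(8.3, speed))  # ~17 px smear: 120 Hz sample-and-hold (full-frame persistence)
print(blur_width_px(1.0, speed))  # ~2 px smear: strobed backlight with a ~1 ms pulse (ULMB-like)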
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
It all adds up. I'm going for the lowest possible input lag.
You're missing something very important. There is a point where the pursuit of gaming perfection actually gets in the way of gaming enjoyment. I speak from experience. 120Hz is a very good high bar that isn't out of reach of the average PC gamer. However, 99% of all games are still very enjoyable at 60Hz, which is one of the reasons it's a standard across the industry.
 

las

You're missing something very important. There is a point where the pursuit of gaming perfection actually gets in the way of gaming enjoyment. I speak from experience. 120Hz is a very good high bar that isn't out of reach of the average PC gamer. However, 99% of all games are still very enjoyable at 60Hz, which is one of the reasons it's a standard across the industry.

30-60 fps and 60 Hz are more of a compromise born of hardware limitations.

High-end TVs, smartphones, tablets, PC monitors, etc. You see 120 Hz support on tons of devices these days. 60 Hz was never enough. Yes, it's "acceptable", but EVERYONE can see the difference between 60 fps/Hz and 120 fps/Hz when set up properly. (Yep, I have seen people buy 120-240 Hz monitors and use them at 60 Hz...) Even many stores that showcase high-refresh-rate monitors run them at 60 Hz. No wonder some people think it's all marketing.

It won't take many years before 120 Hz is the new 60 Hz. At least on high-end stuff.
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
It won't take many years before 120 Hz is the new 60 Hz. At least on high-end stuff.
People have been saying that for the better part of 2 decades, and yet here we are, still very much in a 60Hz world. The reason is simple: science and the limitations of human visual perception. The human eye can only perceive between 20 and 30 individual frames per second. Above that, it simply becomes a perception of smoothness. While we can tell that there is a difference between 30->60Hz and 60->120Hz, we will never see the individual frames. For the vast majority of the human race, 60Hz will always be seen as fluid and smooth. For elitists, like myself, 120Hz is the standard. Anything above 120Hz is a waste of time and resources because of the limitations of the human eye. This is factual science and the display-building industry knows it.
 

bug

We've had this very conversation back in the CRT days: is 100Hz enough, or do you really need 240Hz?
This time, though, the problem is a little different: fast-refresh screens were expensive back then as well, but today, because of higher resolutions, the video cards that can push them are prohibitively expensive.

Edit: this is in reply to post #81
 

las

I can easily tell that 240fps/240Hz is smoother than 120fps/120Hz, but 1080p TN is a dealbreaker for me atm. 1440p/120-165Hz IPS is "fine" for now.
It's not a huge difference like 60fps/60Hz to 120/120, but it's definitely noticeable and provides even better smoothness.


In the CRT days, 100 Hz was the bare minimum for me. Just because 60 Hz isn't flickering on LCD/OLED does not mean it's smooth. It's just as terrible in terms of motion.

People have been saying that for the better part of 2 decades, and yet here we are, still very much in a 60Hz world.

They might have said that, but it's happening now.

Pretty much all 2018 high-end TVs support 120 Hz natively.
The iPad Pro is 120 Hz native. Several Android phones have 90-120 Hz native panels.
More and more PC monitors and laptops have 120-240 Hz native panels.

Every single person can see the difference between 60 and 120 Hz. Most just don't know that 60 Hz is crap. It was chosen because of bandwidth and hardware limitations.
 

bug

@lexluthermiester 60 fps may be fine, but most panels can't even refresh that fast. They need overdrive to keep all transitions at 16ms or less, and overdrive can, and most of the time does, introduce artifacts. That doesn't make a 60Hz panel unusable, but it does mean there's room for improvement.
las's 120Hz with ULMB is actually 60Hz, but with a black frame inserted after each rendered frame. It's a trick that makes overdrive unnecessary, but needs panels that can refresh fast.
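To put some rough numbers on that trade-off: the shorter the lit portion of each refresh, the lower the persistence, but also the lower the average brightness. A quick Python sketch with assumed pulse widths (actual ULMB pulse lengths vary per monitor):

Code:
def strobe_stats(refresh_hz: float, lit_ms: float) -> tuple[float, float]:
    frame_ms = 1000.0 / refresh_hz   # length of one refresh cycle
    duty = lit_ms / frame_ms         # fraction of the cycle the image is actually lit
    return frame_ms, duty

for lit_ms in (8.33, 2.0, 1.0):      # sample-and-hold vs. two assumed strobe pulse widths
    frame_ms, duty = strobe_stats(120, lit_ms)
    print(f"{lit_ms:.2f} ms lit of {frame_ms:.2f} ms -> ~{duty:.0%} relative brightness")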
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
@lexluthermiester 60 fps may be fine, but most panels can't even refresh that fast. They need overdrive to keep all transitions at 16ms or less, and overdrive can, and most of the time does, introduce artifacts. That doesn't make a 60Hz panel unusable, but it does mean there's room for improvement.
las's 120Hz with ULMB is actually 60Hz, but with a black frame inserted after each rendered frame. It's a trick that makes overdrive unnecessary, but needs panels that can refresh fast.
Ah ok, I gotcha. Not all panels are like that.
 
Joined
Mar 15, 2018
Messages
33 (0.01/day)
I want to agree, but I don't see FreeSync winning any victories here, as nvidia GPUs dominate Steam's numbers and AMD faffs around with Vega.

The nvidia lock-in will absolutely backfire on them the moment AMD gets their act together.

Nvidia's dominance comes from pushing more sponsored games and adding more proprietary software that only they understand.

That started back in the PhysX era. AMD made the best graphics card with the HD 5870, better than the GTX 480, yet people still bought Nvidia because of MINDSET, not performance. That's what happens when the Nvidia logo is always flying around in every game.
 
Joined
Mar 18, 2015
Messages
2,960 (0.89/day)
Location
Long Island
It's running security checks to ensure only Nvidia cards are connected; that requires a VM, which needs RAM and a fast processor...

I see it the other way around... the number of FreeSync monitors offering any type of MBR technology is shrinking fast. FreeSync and G-Sync will drop to insignificance for the enthusiast gamer as 144+ Hz becomes the order of the day.

Just check how many Freesync and how many G-Sync monitors are on the market. Even the second biggest TV manufacturer allowed Freesync support on some of their 2018 models. People can taunt AMD but getting a much cheaper monitor with the same specifications is good for everyone.

1. Too bad no such thing exists when you compare apples to apples; the prices end up about the same. FreeSync monitors with manufacturer-supplied Motion Blur Reduction (MBR) were roughly the same price as G-Sync when both broke onto the scene. Over time, said manufacturers have been including this feature less and less, and the quality of these solutions is variable since each manufacturer's implementation is different.

2. One of the reasons for the above is the marketplace. nVidia sold more than 2 times as many GTX 970s as all 20+ AMD 200 and 300 series cards combined. Back then, AMD had no answer to the 970 on up... today, AMD has no answer to the 1060 on up.

3. Let's look at the list of "best gaming monitors":

https://www.blurbusters.com/faq/120hz-monitors/

Of the 29 monitors ....

.... all 29 have some form of MBR technology
.... 9 are Freesync
.... 20 are G-Sync

Depending on budget (incl peripherals) and resolution, we recommend

2160p w/ $3k+ Budget ... wait for 11xx series and 144 Hz IPS HDR G-Sync Monitors w/ ULMB
1440p w/ $2k+ Budget ... GTX 1070 series and 144 Hz IPS G-Sync Monitors w/ ULMB
1080p w/ ~1.5k budget ... GTX 1060 w/ G-Sync or at least MBR capable (Classic LightBoost Blur Reduction technology) Monitor
1080p w/ <1.5k budget ... RX 580 w/Free-Sync and / or MBR capable (Classic LightBoost Blur Reduction technology) Monitor

That started back in the PhysX era. AMD made the best graphics card with the HD 5870, better than the GTX 480, yet people still bought Nvidia because of MINDSET, not performance. That's what happens when the Nvidia logo is always flying around in every game.

This reads like Faux News... it doesn't agree with any published test numbers. I don't see any support for that position in TPU reviews, at least in the upper tiers.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/31.html

"It seems the GTX 1060 is everything AMD wanted the RX 480 to be"

While that comment was for the reference model, it's the AIB cards that really matter to gaming enthusiasts. Let's give AMD an advantage by comparing the 1060 with the later-generation 580.



We see that the AIB 580 is 1.064 times as fast as the reference 1060 and that, when overclocked, it's another 7.1% faster:
https://www.techpowerup.com/reviews/Gigabyte/AORUS_RX_580_XTR/33.html

So the card is 1.14 (1.064 x 1.071) times as fast as the reference 1060.

The AIB 1060 is 1.041 times as fast as the reference 1060, but it overclocks another 12.1%, making it 1.17 (1.041 x 1.121) times as fast as the reference 1060.

https://www.techpowerup.com/reviews/Gigabyte/GTX_1060_Xtreme_Gaming/26.html
https://www.techpowerup.com/reviews/Gigabyte/GTX_1060_Xtreme_Gaming/29.html

GTX 1060 power consumption / noise / OC temps: 138 watts / 28 dBA / 63°C
RX 580 power consumption / noise / OC temps: 243 watts / 34 dBA / 74°C

Desired AIB 1060s (6GB) are going for $289 to $329 on Newegg... while desired RX 580s are running about $10 cheaper. Performance-wise I'd call it almost a wash and worthy of a $10 premium. But the 580 uses 76% more power, requiring the expense of a PSU that is 100 watts larger, which easily erases the cost advantage. It's also 50% louder and adds about 11°C in GPU temps. And let's not forget that the 580 is a generation later than the 1060, which originally competed against the 480.
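For anyone who wants to check that arithmetic, here is the same chain of multiplications in a few lines of Python (the factors are the ones quoted from the TPU reviews above; the rounding is mine):

Code:
# Relative performance vs. the reference GTX 1060, using the factors quoted above
aib_580_oc  = 1.064 * 1.071   # AIB RX 580 uplift x its overclocking headroom
aib_1060_oc = 1.041 * 1.121   # AIB GTX 1060 uplift x its overclocking headroom
print(f"AIB RX 580, overclocked:   {aib_580_oc:.2f}x the reference 1060")   # ~1.14x
print(f"AIB GTX 1060, overclocked: {aib_1060_oc:.2f}x the reference 1060")  # ~1.17x

# Power-draw comparison from the numbers above
print(f"RX 580 draws {243 / 138 - 1:.0%} more power than the GTX 1060")     # ~76%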

I'd love it if AMD could muster some competition for nVidia, but despite loads of "[insert AMD's new tech here] is going to change everything"... it just hasn't happened. Pre-launch excitement fizzled post-launch as the 200, 300, 400, 500, Fury and Vega failed to live up to the hype. We last saw competition at the high end (when both cards were overclocked) with the 780 vs 290X, and the 780 Ti rendered that battle irrelevant. But much more so than the top-tier battle, I'm worried that nVidia has taken the crown down another tier with each generation. With the 9xx series... the domination dropped down another two tiers to the x70... with the 10xx it dropped to the xx60. Personally, most of the users we have built for are what I call "hardware whores"... total lack of loyalty, jumping on whatever has the best numbers... overclocked.

So no... we PC enthusiasts are not buying anything because of a "mindset", at least at the high end. I will agree, however, that at the low end "mindset" has value.

The best example I can give you here is IBM laptops. IBM made the A20 series which, back in the days when print media dominated, was awarded best laptop every year. It was very expensive and could easily run upwards of $3K, and if you wanted "the best" you'd have to pay that and get an A20, because no one was offering anything comparable. At some point, some bean counter decided that IBM didn't sell enough of the A20 and discontinued the line. Soon after, IBM lost laptop dominance and eventually spun the division off to Lenovo. Without making those magazine covers, the shine was off IBM... just as every junior high school kid needs 'Air Jordans' to make sure he "makes the team", every business exec wanted that IBM logo visible when he or she entered a business meeting.

But for those buying individual components, it's going to be all about the numbers. So when a teenager goes shopping with mom for that new PC as little Johnny transitions to junior high, that nVidia logo will draw attention because little Johnny read on the internet that "the nVidia 1080 Ti is the best". He wants to tell his friends that he has an nVidia card... he'll also want water cooling and lots of RGB, all to impress his friends, regardless of whether any of those choices give him less than AMD components or air coolers would. But again, while the uninformed consumer may be fooled by this mindset, I think anyone who is spending time reading TPU forums, and who has read TPU reviews, is making the choice "by the numbers".
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
2. One of the reasons for the above is the marketplace. nVidia sold more than 2 times as many GTX 970s as all 20+ AMD 200 and 300 series cards combined. Back then, AMD had no answer to the 970 on up... today, AMD has no answer to the 1060 on up.
R9 290X is still, to this day, faster than GTX 970. R9 290X/R9 390/RX 580 are all faster than GTX 1060. Vega 56 is a great deal faster than GTX 1070, and Vega 64 is in GTX 1070 Ti and GTX 1080 territory.

3. Let's look at the list of "best gaming monitors":

https://www.blurbusters.com/faq/120hz-monitors/

Of the 29 monitors ....

.... all 29 have some form of MBR technology
.... 9 are Freesync
.... 20 are G-Sync
You're only looking at the top list, which is panels with > 144 Hz. The best panels on the market are in the "Other Brands of Blur Reduction" list, namely the FreeSync 2 Samsungs.
 
Joined
Jan 2, 2014
Messages
232 (0.06/day)
Location
Edmonton
System Name Coffeelake the Zen Destroyer
Processor 8700K @5.1GHz
Motherboard ASUS ROG MAXIMUS X FORMULA
Cooling Cooled by EK
Memory RGB DDR4 4133MHz CL17-17-17-37
Video Card(s) GTX 780 Ti to future GTX 1180Ti
Storage SAMSUNG 960 PRO 512GB
Display(s) ASUS ROG SWIFT PG27VQ to ROG SWIFT PG35VQ
Case Cooler Master HAF X Nvidia Edition
Audio Device(s) Logitech
Power Supply COOLER MASTER 1KW Gold
Mouse LOGITECH Gaming
Keyboard Logitech Gaming
Software MICROSOFT Redstone 4
Benchmark Scores Cine Bench 15 single performance 222
$500 extra sucks!

But that's the price you pay for performance... some people are happy with 60Hz; others like myself prefer higher Hz and no screen tearing, judder or input lag, plus better benchmarking!

I myself am waiting for ASUS' new 200Hz monitor to come out, the "ROG SWIFT PG35VQ"!

https://www.asus.com/ca-en/Monitors/ROG-SWIFT-PG35VQ/

35" 21:9 "3K/HDR/200Hz"... it should be able to run all new games at 100+ FPS in Ultra mode with the upcoming next-gen 1180/1180Ti.

4K Ultra mode is still very hard to run at high FPS. You need 1080Ti SLI just to get 60 to 90fps in new games.
 

Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
$500 extra sucks!

But that's the price you pay for performance... some people are happy with 60Hz; others like myself prefer higher Hz and no screen tearing, judder or input lag, plus better benchmarking!

I myself am waiting for ASUS' new 200Hz monitor to come out, the "ROG SWIFT PG35VQ"!

https://www.asus.com/ca-en/Monitors/ROG-SWIFT-PG35VQ/

35" 21:9 "3K/HDR/200Hz"... it should be able to run all new games at 100+ FPS in Ultra mode with the upcoming next-gen 1180/1180Ti.

4K Ultra mode is still very hard to run at high FPS. You need 1080Ti SLI just to get 60 to 90fps in new games.

Why are you waiting for it if you already have it in your system specs?

Another monitor that's taking its time to come out, over a year already, and it's an AUO AHVA screen as well. Expect 120Hz + OC to 200Hz.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,731 (3.41/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
I never looked until today... and damn, the cheapest gsync monitors seem to cost as much as a high end graphics card! Still not sure what's wrong with regular vsync. Latency and all that... never really experienced it myself. Seems I can also use "fast" vsync as well, with any monitor, which supposedly gives you the best of both worlds anyway... But yeah, that gsync is way too expensive to be worth it.

I'm stuck using a 60hz 1920x1080 screen now... but years ago I had a 120hz monitor. It was nice until it died on me. No fancy gsync then at the time either...
 
Joined
Jan 2, 2014
Messages
232 (0.06/day)
Why are you waiting for it if you already have it in your system specs?

Another monitor that's taking its time to come out, over a year already, and it's an AUO AHVA screen as well. Expect 120Hz + OC to 200Hz.

They've had 240Hz monitors for years too, but only TN 1080p.

"UWQHD" 3K HDR 200Hz is a new happy-medium VA technology: better than 1080p and 1440p, but less than 4K.

I'll probably only reach 200fps in older games like Diablo III, which maxes out @ 165fps and which I still play quite a bit.

Building a second rig right now... my older one becomes my wife's...

And yes Nvidia G-Sync HDR is ridiculously expensive.
 
Joined
Aug 2, 2011
Messages
1,451 (0.31/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,1.5TB Caviar Green
Display(s) Acer X34S, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502 Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
I never looked until today... and damn, the cheapest gsync monitors seem to cost as much as a high end graphics card! Still not sure what's wrong with regular vsync. Latency and all that... never really experienced it myself. Seems I can also use "fast" vsync as well, with any monitor, which supposedly gives you the best of both worlds anyway... But yeah, that gsync is way too expensive to be worth it.

I'm stuck using a 60hz 1920x1080 screen now... but years ago I had a 120hz monitor. It was nice until it died on me. No fancy gsync then at the time either...

It's really hard to explain just how much of a difference G-Sync makes in games where your framerate can't constantly hit the monitor's max refresh rate. It's night and day. Once you see it with your own eyeballs you'll be a true believer.

I too am waiting for a 3440x1440 screen with higher than 120Hz refresh. 120Hz ones are on the market right now, and are pretty good; but I fear going under 144Hz for regular desktop use will hurt.
 