Friday, May 23rd 2014

Acer Delivers World's First 4K Display with NVIDIA G-SYNC Technology

Acer announces the new Acer XB280HK gaming monitor as the world's first 4K2K display featuring NVIDIA G-SYNC technology, providing stunning, ultra-smooth, tear-free imagery and rich colors for outstanding gaming experiences. It features Acer's flicker-less, low-dimming, and ComfyView technologies, which reduce eye strain for smooth and comfortable extended viewing.

Part of the new XB0 line of large gaming monitors, the Acer XB280HK is intended to be paired with enthusiast PCs for immersive, ultra high-end gaming. It features a spacious 28-inch LED-backlit display with 4K2K Ultra HD resolution (3840 x 2160 pixels), four times the resolution of 1080p Full HD, and presents stunning, high-quality images for outstanding visual enjoyment.
With a GeForce GTX-powered PC, NVIDIA G-SYNC display technology synchronizes the display's refresh rate to the GPU's frame output to eliminate screen tearing and minimize display stutter and input lag, delivering smoother, faster, more breathtaking gaming experiences. Scenes appear instantly, objects look sharper and more vibrant, and gameplay is more fluid and responsive, giving gamers significant performance advantages.
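To make the refresh-synchronization idea concrete, here is a minimal toy model (an illustration only, not how the G-SYNC hardware or driver is actually implemented) comparing a fixed 60 Hz refresh with V-Sync against a variable-refresh panel that starts scanning out each frame as soon as the GPU finishes it, limited only by a 144 Hz ceiling. All timing values are assumptions chosen for illustration.

```python
# Toy model of fixed-refresh V-Sync vs. a variable-refresh (G-SYNC-style) panel.
# Purely illustrative; real hardware and driver behaviour is more involved.
import random

REFRESH_INTERVAL = 1000.0 / 60     # ms between scanouts on a fixed 60 Hz panel
MIN_VRR_INTERVAL = 1000.0 / 144    # a 144 Hz panel can't refresh faster than this

def simulate(frames=1000, mean_render_ms=20.0, jitter_ms=8.0):
    t = 0.0                        # running time, ms
    last_vrr_scanout = -MIN_VRR_INTERVAL
    vsync_wait = vrr_wait = 0.0
    for _ in range(frames):
        t += max(1.0, random.gauss(mean_render_ms, jitter_ms))  # frame finishes
        # Fixed refresh + V-Sync: the finished frame waits for the next vblank.
        next_vblank = (int(t // REFRESH_INTERVAL) + 1) * REFRESH_INTERVAL
        vsync_wait += next_vblank - t
        # Variable refresh: scanout starts as soon as the frame is ready,
        # limited only by the panel's maximum refresh rate.
        scanout = max(t, last_vrr_scanout + MIN_VRR_INTERVAL)
        vrr_wait += scanout - t
        last_vrr_scanout = scanout
    return vsync_wait / frames, vrr_wait / frames

vs, vr = simulate()
print(f"avg wait before scanout - fixed 60 Hz V-Sync: {vs:.1f} ms, variable refresh: {vr:.1f} ms")
```

In this simplified model the fixed-refresh path makes each finished frame wait for the next vblank, while the variable-refresh path almost never waits, which is the behaviour the marketing copy above is describing.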

The Acer XB280HK builds in several features that take into consideration prolonged usage by heavy users such as programmers, writers, and graphic designers:
  • Flicker-less technology - a stable power supply eliminates screen flicker, which is particularly beneficial for heavy users because it helps reduce eye strain.
  • Low-dimming technology - brightness can be adjusted to as low as 15 percent in low-light environments to make it easy on the eyes. Standard monitor settings start at a 30 percent brightness level.
  • ComfyView technology - the non-glare panel reduces reflections from light sources.
The Acer XB280HK features 170/170 degree viewing angles so that brilliantly-colored images can be seen from almost every angle. DisplayPort v1.2 transmits video signals, and four USB 3.0 ports are conveniently located at the side and bottom of the display for connecting a keyboard, mouse, or mobile devices.

The Acer XB280HK monitor is made with post-consumer recycled plastic and features a distinctive red ring on the base stand. The multi-functional ErgoStand allows the screen to tilt from -5° to 35° to ensure the best viewing angle; the base rotates 120° to the left or right for easy screen sharing; the panel height can be raised by up to 150 mm for optimum comfort; and the screen pivots from horizontal to vertical for more viewing perspectives.

This eco-friendly monitor features a mercury- and arsenic-free panel, LED backlighting for reduced power consumption, and is ENERGY STAR qualified.

The Acer XB280HK starts shipping in Q2 in Pan America, EMEA, Japan, and Taiwan.

63 Comments on Acer Delivers World's First 4K Display with NVIDIA G-SYNC Technology

#26
Hilux SSRG
Glad to see vendors producing more 4K monitors, it just means 1440p is coming down in price more :rockout:.
Posted on Reply
#27
Patriot
DP 1.2a has FreeSync (no vendor lock-in).
DP 1.3 is 8K @ 60 Hz or 4K @ 120 Hz.
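A rough back-of-the-envelope check on those figures, assuming 4 lanes of HBR3 at 8.1 Gbit/s each, roughly 80% usable after 8b/10b coding, 24-bit colour, and blanking intervals ignored; the 8K @ 60 Hz mode only works out with chroma subsampling or compression:

```python
# Rough sanity check on the DP 1.3 figures above. Assumptions: 4 lanes of HBR3
# (8.1 Gbit/s per lane), ~80% usable after 8b/10b coding, 24-bit colour, and
# blanking intervals ignored - back-of-the-envelope only.
def payload_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed pixel-data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

DP13_EFFECTIVE_GBPS = 4 * 8.1 * 0.8   # ~25.9 Gbit/s of usable bandwidth

for name, w, h, hz in [("4K @ 120 Hz", 3840, 2160, 120),
                       ("8K @ 60 Hz (4:4:4)", 7680, 4320, 60)]:
    need = payload_gbps(w, h, hz)
    verdict = "fits" if need <= DP13_EFFECTIVE_GBPS else "needs chroma subsampling / compression"
    print(f"{name}: ~{need:.1f} Gbit/s vs ~{DP13_EFFECTIVE_GBPS:.1f} Gbit/s available -> {verdict}")
```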
Posted on Reply
#28
HM_Actua1
G-sync FTW!!!!!


Anyone who has tried or owns a Gsync monitor knows it's worth every penny and IS a game changer!

All you AMD haters......poor kids gonna hate.
Posted on Reply
#29
HM_Actua1
Arjai: Blah, blah, 4K, blah 120 Hz, blah, blah, power? Blah, G-Sync, Blah, price, epeen, blah.

'bout covers it, right?:)
Hate much guy?
Posted on Reply
#30
Octavean
Hilux SSRG: Glad to see vendors producing more 4K monitors, it just means 1440p is coming down in price more :rockout:.
One would think...

However, all these new 4K monitor announcements are based on the same 28" TN panel. I'm not saying this is bad, because I have one and think it's really quite good. However, higher quality IGZO and IPS monitors will typically cost more in general regardless of the resolution. In other words, corners are cut to lower the price, or rather, price dictates design choices.

Cheap 2560x1440 monitors may come in mass soon. In fact, I saw one on Newegg not too long ago (Acer-manufactured, IIRC), but that doesn't necessarily mean it's going to be a monitor many would think of as worthwhile.

I have two Auria 2560x1440 IPS monitors and I think they are great too. However, if I were buying a non-4K monitor now, I would much prefer something like the 21:9 LG 34UM95 3440x1440.
Posted on Reply
#31
Arjai
Hitman_Actual: Hate much guy?
Take offense much?

Fer chissake's I get bashed by some turd, for makin' an inoffensive joke? Boo on YOU!! Grow up.
Posted on Reply
#32
Hilux SSRG
Octavean: I have two Auria 2560x1440 IPS monitors and I think they are great too. However, if I were buying a non-4K monitor now, I would much prefer something like the 21:9 LG 34UM95 3440x1440.
That LG 34UM95 is quite the looker and on my wishlist!!

How long have you had the Auria's?
Posted on Reply
#33
Arjai
Hilux SSRG: That LG 34UM95 is quite the looker and on my wishlist!!

How long have you had the Auria's?
WOW!
Posted on Reply
#34
techy1
List of Upcoming G-SYNC Displays
Acer XB280HK - 28″ - 3840×2160 - 60 Hz

www.blurbusters.com/gsync/list-of-gsync-monitors/

... so the answer is NO, no - you will not need to CrossFire a 295X2 just to run video games. Sad that GPUs in the last few generations gave us no price/performance increase (yes, there was a performance increase vs. the previous gen - but there was also a huge price increase).
Posted on Reply
#35
RejZoR
I don't understand why anyone would spend almost a grand on a monitor that locks you into NVIDIA. So, if they somehow release another GeForce FX, you're stuck with their crap if you want to have G-Sync. Because it won't work with Radeons... Waiting for VESA's FreeSync-powered displays instead... that will simply force NVIDIA to support it no matter what.
Posted on Reply
#36
semantics
RejZoR: I don't understand why anyone would spend almost a grand on a monitor that locks you into NVIDIA. So, if they somehow release another GeForce FX, you're stuck with their crap if you want to have G-Sync. Because it won't work with Radeons... Waiting for VESA's FreeSync-powered displays instead... that will simply force NVIDIA to support it no matter what.
Why wait 1-2 years for FreeSync, which has unknown capabilities, when you can have G-Sync, which is the lowest lag you can possibly get out of a monitor, right now? FreeSync has no promise of the lag-free behavior that G-Sync offers, plus G-Sync's offering is better than no sync at all in terms of lag.

Waiting is always a shitty reason not to buy hardware. "Why not just wait" is always going to be true-ish: why buy a GPU/CPU now when you can wait a year or so for a new GPU/CPU? Why not just wait for prices to fall in a year or so? "Why not wait" is only applicable to me if it's within 2 months with a firm release date.
Posted on Reply
#37
Xzibit
semantics: Why wait 1-2 years for FreeSync, which has unknown capabilities, when you can have G-Sync, which is the lowest lag you can possibly get out of a monitor, right now? FreeSync has no promise of the lag-free behavior that G-Sync offers, plus G-Sync's offering is better than no sync at all in terms of lag.


Posted on Reply
#39
Fluffmeister
semantics: Why wait 1-2 years for FreeSync, which has unknown capabilities, when you can have G-Sync, which is the lowest lag you can possibly get out of a monitor, right now? FreeSync has no promise of the lag-free behavior that G-Sync offers, plus G-Sync's offering is better than no sync at all in terms of lag.

Waiting is always a shitty reason not to buy hardware. "Why not just wait" is always going to be true-ish: why buy a GPU/CPU now when you can wait a year or so for a new GPU/CPU? Why not just wait for prices to fall in a year or so? "Why not wait" is only applicable to me if it's within 2 months with a firm release date.
It's a valid point, especially because FreeSync is still an unknown quantity. I mean, beyond some shitty demo on a laptop, the true ins and outs of FreeSync and the various pros and cons vs. G-Sync remain unknown.

I'm sure a minority of nVidia users want Mantle support, but jumping ship for a handful of mediocre games isn't ideal either.

But at the end of the day, haters gonna hate, and one company's bottom line never really suffers.
Posted on Reply
#40
Xzibit
Fluffmeister: It's a valid point, especially because FreeSync is still an unknown quantity. I mean, beyond some shitty demo on a laptop, the true ins and outs of FreeSync and the various pros and cons vs. G-Sync remain unknown.

I'm sure a minority of nVidia users want Mantle support, but jumping ship for a handful of mediocre games isn't ideal either.

But at the end of the day, haters gonna hate, and one company's bottom line never really suffers.
I like the irony in your post
Posted on Reply
#41
Fluffmeister
Xzibit: I like the irony in your post
Speaking of haters. ^

I was just looking at your random graphs that offer zero context.

Nice! On one hand you have countless frames rendered into a tearing mess with zero input lag... versus what amounts to V-synced goodness with no input lag.

Of course, two games are hardly conclusive, as I'm sure you are aware, but promising results for sure!
Posted on Reply
#42
semantics
Xzibit
Link: www.blurbusters.com/gsync/preview2/

Conclusion 1:
With VSYNC OFF averages of 72ms and 74ms, this is very similar to G-SYNC averages of 77ms and 74ms respectively. The variability of the averages appears to fall well below the noise floor of the high variability of Battlefield 4, so Blur Busters considers the differences in averages statistically insignificant.
Conclusion 2:
At first, it was pretty clear that G-SYNC had significantly more input lag than VSYNC OFF. It was observed that VSYNC OFF at 300fps versus 143fps had fairly insignificant differences in input lag (22ms/26ms at 300fps, versus 24ms/26ms at 143fps). When I began testing G-SYNC, it immediately became apparent that input lag suddenly spiked (40ms/39ms for 300fps cap, 38ms/35ms for 143fps cap). During fps_max=300, G-SYNC ran at only 144 frames per second, since that is the frame rate limit. The behavior felt like VSYNC ON suddenly got turned on.

The good news now comes: As a last-ditch, I lowered fps_max more significantly to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counterstrike: GO! Except there was no tearing, and no stutters anymore, the full benefits of G-SYNC without the lag of VSYNC ON.
General conclusion to the article:
As even the input lag in CS:GO was solvable, I found no perceptible input lag disadvantage to G-SYNC relative to VSYNC OFF, even in older source engine games, provided the games were configured correctly (NVIDIA Control Panel configured correctly to use G-SYNC, and game configuration updated correctly). G-SYNC gives the game player a license to use higher graphics settings in the game, while keeping the gameplay smooth.
Good thing you skipped over the Crysis 3 graph as that didn't fit your narrative!
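For what it's worth, here is a rough sketch of why fps_max=143 still spiked while 120 fixed it in the numbers quoted above: if the in-game frame limiter has even a little timing jitter, a 143 fps cap still lets plenty of frames arrive faster than a 144 Hz panel can accept them. The 0.5 ms jitter figure below is an assumed value for illustration, not something Blur Busters measured.

```python
# Sketch of the fps_max effect described in the Blur Busters quote above.
# Assumption (illustrative, not measured): the in-game limiter has ~0.5 ms of
# timing jitter, so at a 143 fps cap some frames still outrun a 144 Hz panel.
import random

PANEL_MAX_HZ = 144
panel_min = 1000.0 / PANEL_MAX_HZ                # ~6.94 ms between scanouts

def frac_frames_too_fast(fps_cap, jitter_ms=0.5, n=100_000):
    """Fraction of frames delivered faster than the panel can take them."""
    target = 1000.0 / fps_cap
    hits = sum(1 for _ in range(n)
               if max(0.5, random.gauss(target, jitter_ms)) < panel_min)
    return hits / n

for cap in (300, 143, 120):
    print(f"fps_max={cap}: ~{frac_frames_too_fast(cap):.0%} of frames outrun the 144 Hz panel")
```

Under that assumption, nearly every frame outruns the panel at a 300 fps cap, roughly half still do at 143, and essentially none do at 120, which lines up with the lag pattern Blur Busters reported.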
Posted on Reply
#43
SIGSEGV
Fluffmeister: It's a valid point, especially because FreeSync is still an unknown quantity. I mean, beyond some shitty demo on a laptop, the true ins and outs of FreeSync and the various pros and cons vs. G-Sync remain unknown.

I'm sure a minority of nVidia users want Mantle support, but jumping ship for a handful of mediocre games isn't ideal either.

But at the end of the day, haters gonna hate, and one company's bottom line never really suffers.
Says NVIDIA's top marketing chief.
Posted on Reply
#44
Fluffmeister
SIGSEGV: Says NVIDIA's top marketing chief.
Feel free to post G-Sync vs. Project FreeSync results and I'll hold my hands up and say I'm wrong.

I'm happy your girl liked your post though.
Posted on Reply
#45
Xzibit
semantics: Link: www.blurbusters.com/gsync/preview2/

Conclusion 1: With VSYNC OFF averages of 72ms and 74ms, this is very similar to G-SYNC averages of 77ms and 74ms respectively. The variability of the averages appears to fall well below the noise floor of the high variability of Battlefield 4, so Blur Busters considers the differences in averages statistically insignificant.

Conclusion 2:At first, it was pretty clear that G-SYNC had significantly more input lag than VSYNC OFF. It was observed that VSYNC OFF at 300fps versus 143fps had fairly insignificant differences in input lag (22ms/26ms at 300fps, versus 24ms/26ms at 143fps). When I began testing G-SYNC, it immediately became apparent that input lag suddenly spiked (40ms/39ms for 300fps cap, 38ms/35ms for 143fps cap). During fps_max=300, G-SYNC ran at only 144 frames per second, since that is the frame rate limit. The behavior felt like VSYNC ON suddenly got turned on.

The good news now comes: As a last-ditch, I lowered fps_max more significantly to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counterstrike: GO! Except there was no tearing, and no stutters anymore, the full benefits of G-SYNC without the lag of VSYNC ON.

General conclusion to the article:As even the input lag in CS:GO was solvable, I found no perceptible input lag disadvantage to G-SYNC relative to VSYNC OFF, even in older source engine games, provided the games were configured correctly (NVIDIA Control Panel configured correctly to use G-SYNC, and game configuration updated correctly). G-SYNC gives the game player a license to use higher graphics settings in the game, while keeping the gameplay smooth.
Images are no more cherry-picked than your original statement.
semantics: Why wait 1-2 years for FreeSync, which has unknown capabilities, when you can have G-Sync, which is the lowest lag you can possibly get out of a monitor, right now? FreeSync has no promise of the lag-free behavior that G-Sync offers, plus G-Sync's offering is better than no sync at all in terms of lag.
Your statement didn't include any specifics other than no V-Sync vs. G-Sync in terms of lag. That's what I was responding to.

As BB points out, they had to manually adjust options to equate V-Sync off with G-Sync in high-fps games. The conclusion BB comes to speaks for itself.
Posted on Reply
#46
semantics
Xzibit: Images are no more cherry-picked than your original statement.



Your statement didn't include any specifics other than no V-Sync vs. G-Sync in terms of lag. That's what I was responding to.

As BB points out, they had to manually adjust options to equate V-Sync off with G-Sync in high-fps games. The conclusion BB comes to speaks for itself.
Good thing you didn't link to that conclusion so it couldn't speak for itself! Or explain at all why you posted random bar graphs!

It's statistically equal to no V-Sync but has the benefit of producing full frames in line with the monitor, so it gives a better user experience. From a technical standpoint it should be as good as or better than no V-Sync in terms of lag in all cases. That is not always going to be true, because it's run by drivers, so however NVIDIA programs those drivers is going to affect case-specific behavior. One could just test for case-specific instances; consider the 30-60 fps frame-rate range: if you use lower polling rates on the mouse, you'd get more chances for the response times to not line up well and produce results that look better than no V-Sync. But then that would be making graphs to prove a point rather than just doing testing and drawing a conclusion.
Posted on Reply
#47
Fluffmeister
Well said, sir. It absolutely boggles my mind that people think it's OK to post random graphs without any context, or more specifically without linking to the original author's hard work and letting the results speak for themselves.

If that isn't cherry-picking, God knows what is.
Posted on Reply
#48
Xzibit
semantics: Good thing you didn't link to that conclusion so it couldn't speak for itself! Or explain at all why you posted random bar graphs!
I think I did.
Xzibit: Images are no more cherry-picked than your original statement.

Your statement didn't include any specifics other than no V-Sync vs. G-Sync in terms of lag. That's what I was responding to.
As BB points out, they had to manually adjust options to equate V-Sync off with G-Sync in high-fps games. The conclusion BB comes to speaks for itself.
Conclusion 1:
With VSYNC OFF averages of 72ms and 74ms, this is very similar to G-SYNC averages of 77ms and 74ms respectively. The variability of the averages appears to fall well below the noise floor of the high variability of Battlefield 4, so Blur Busters considers the differences in averages statistically insignificant.

Conclusion 2:
At first, it was pretty clear that G-SYNC had significantly more input lag than VSYNC OFF. It was observed that VSYNC OFF at 300fps versus 143fps had fairly insignificant differences in input lag (22ms/26ms at 300fps, versus 24ms/26ms at 143fps). When I began testing G-SYNC, it immediately became apparent that input lag suddenly spiked (40ms/39ms for 300fps cap, 38ms/35ms for 143fps cap). During fps_max=300, G-SYNC ran at only 144 frames per second, since that is the frame rate limit. The behavior felt like VSYNC ON suddenly got turned on.

The good news now comes: As a last-ditch, I lowered fps_max more significantly to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counterstrike: GO! Except there was no tearing, and no stutters anymore, the full benefits of G-SYNC without the lag of VSYNC ON.
General conclusion to the article:
As even the input lag in CS:GO was solvable, I found no perceptible input lag disadvantage to G-SYNC relative to VSYNC OFF, even in older source engine games, provided the games were configured correctly (NVIDIA Control Panel configured correctly to use G-SYNC, and game configuration updated correctly). G-SYNC gives the game player a license to use higher graphics settings in the game, while keeping the gameplay smooth.
Nowhere does Blur Busters say G-Sync is better than no sync in terms of lag, like your original statement.
semantics: Why wait 1-2 years for FreeSync, which has unknown capabilities, when you can have G-Sync, which is the lowest lag you can possibly get out of a monitor, right now? FreeSync has no promise of the lag-free behavior that G-Sync offers, plus G-Sync's offering is better than no sync at all in terms of lag.
It provides a better gaming experience, but that's not what your original statement implied.

I don't think I'm disputing anything that was said by Blur Busters pertaining to experience, just your original statement, which I can't find BB agreeing with. If they do, I can't seem to find the link.
Posted on Reply
#49
Prima.Vera
Patriot: DP 1.2a has FreeSync (no vendor lock-in).
DP 1.3 is 8K @ 60 Hz or 4K @ 120 Hz.
The optimistic release timeline for DP 1.3 is the first quarter of next year...
MxPhenom 216: They do? $799

The Asus Swift ROG monitor

www.asus.com/us/News/xXtX0FNhXQWPrry7
No, I mean the price here for the monitor is listed as 1K Euros...
Posted on Reply
#50
semantics
Xzibit
semantics: Good thing you didn't link to that conclusion so it couldn't speak for itself! Or explain at all why you posted random bar graphs!
I think I did.

Nowhere does Blur Busters say G-Sync is better than no sync in terms of lag, like your original statement.


It provides a better gaming experience, but that's not what your original statement implied.

I don't think I'm disputing anything that was said by Blur Busters pertaining to experience, just your original statement, which I can't find BB agreeing with. If they do, I can't seem to find the link.
Pretty sure you did not, unless you're calling hotlinking images from Blur Busters "linking to the article". I had to recognize the watermark on the bottom right of the image, go to Blur Busters, and find where you linked them from.
I also clarified the statement later on, but you seem to be stuck on the old statement.
www.techpowerup.com/forums/threads/acer-delivers-worlds-first-4k-display-with-nvidia-g-sync-technology.201120/page-2#post-3112446

Anyway, I'm not going to bother explaining to you how lag from your mouse to your monitor works and then how G-Sync works, and letting you draw the inevitable conclusion, as you're being purposely obtuse.
Posted on Reply