Monday, June 6th 2016

ASUS Readying a 144 Hz 4K Ultra HD Monitor

ASUS is readying what could be the world's first 4K Ultra HD monitor with a 144 Hz refresh rate. The monitor features a 27-inch AHVA panel by AU Optronics, and relies on DisplayPort 1.3 for sufficient bandwidth to drive its advertised resolution (3840 x 2160 @ 144 Hz). There's no word on whether the monitor supports adaptive-sync technologies such as G-SYNC or FreeSync. DisplayPort 1.3 support can be found on some of the latest GPUs, such as the GeForce GTX 1080 and the Radeon RX 480.
Source: PCGH

76 Comments on ASUS Readying a 144 Hz 4K Ultra HD Monitor

#26
Thoas1990
The guy in the video the picture is from says there will be G-Sync and FreeSync models later on.
#27
RejZoR
droopyROOut of curiosity for the 144Hz master race, what rigs do you use to play Witcher 3, for example, at 144 fps minimum? What do you do when games don't have >60 fps support, like Fallout 4? Thanks.
Even if you're not reaching the desired framerate, it still feels way more responsive and smooth, and outputs a sharper image, than a 60Hz LCD screen. If the game doesn't support 144Hz, I won't play it because it hurts my eyes.
#28
droopyRO
RejZoRIf the game doesn't support 144Hz, I won't play it because it hurts my eyes.
Oookkk ...
I thought that sharp images come from resolution/PPI, not refresh rate, and that in games input lag matters more than refresh rate. I guess I'll have to try one of these 144Hz displays one day, even though I don't play twitch shooters.
#29
R-T-B
droopyROOookkk ...
I thought that sharp images are given by resolution/ppi not refresh rate and that in games input lag>refresh rate. I guess i will have to try one of this 144Hz displays one day even though i don`t play twitch shooters.
Yeah, sharpness has nothing to do with refresh rate or input lag.
#30
Prima.Vera
RejZoRNow that's a nice monitor. 4K at 144Hz. And it's not even TN, which is surprising. Once you go 144Hz, there's no going back. If I'm forced to use 60Hz it literally makes my eyes hurt; everything feels so sluggish. When I hooked my old 4K LCD TV to my PC, the lag at 30Hz (it only supports such input) was so bad I just couldn't use it at all.

This won't be cheap though. I have a 1080p 144Hz monitor and it was super expensive as it is, and this is freaking 4K AHVA lol :D
RejZoREven if you're not reaching the desired framerate, it still feels way more responsive, smooth and outputting a sharp image than on 60Hz LCD screens. If the game doesn't support 144Hz, I won't play it because it hurts my eyes.
You're overreacting a lot, like a spoiled child. Take a chill pill. :)) There is a difference between 30Hz and 60Hz, but only a slight one between 60Hz and above. Relax!
I have a 3440x1440@100Hz monitor, and there is almost no difference between gaming at 100Hz and gaming at 60Hz on my old monitor. Also because I never used VSYNC either way.
#31
GC_PaNzerFIN
Oh well, looks like I need GTX 1080 SLi after all. :ohwell:

I was looking to get the PG348Q or PG279Q from Asus, but the panels from AUO have been such garbage, plus the random G-Sync problems, that I don't want to bet my money on it. I am interested in this one, but I really need to see some favorable reviews first.
#32
FordGT90Concept
"I go fast!1!11!1!"
VinskaIsn't DP1.3's max bandwidth not enough to run 4K@144Hz?
3840×2160×24×144=28.67Gbit/s
DP1.3 goes up to 25.92Gbit/s for data. Which is enough for 4K@120Hz, but not enough for 144Hz.
I guess it must be using DP1.4, then, as DP1.4 adds Display Stream Compression (DSC), which would allow it to push those extra few frames.
And what I want (10-bit color) would come to 35.8 Gb/s. :(

That doesn't include the overhead of DisplayPort, the adaptive sync overhead (doubt that is much), or the HDR overhead (not sure how much that uses).

Edit: DSC is supposed to be 3:1 compression so...should theoretically be possible but only with DisplayPort 1.4. No Polaris for me! :cry:
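The arithmetic in this post can be sketched as a quick back-of-the-envelope check (the 25.92 Gbit/s figure is the commonly cited DP 1.3/1.4 payload rate: the 32.4 Gbit/s raw HBR3 link rate minus 8b/10b encoding overhead; blanking intervals and protocol overhead are ignored, so real requirements are somewhat higher):

```python
# Back-of-the-envelope display bandwidth check (uncompressed pixel data only).

def required_gbps(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel data rate in Gbit/s for the given mode."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

# Effective DP 1.3/1.4 payload: 32.4 Gbit/s raw minus 8b/10b encoding overhead.
DP13_PAYLOAD_GBPS = 25.92

for bpp, label in [(24, "8-bit"), (30, "10-bit")]:
    need = required_gbps(3840, 2160, bpp, 144)
    verdict = "fits" if need <= DP13_PAYLOAD_GBPS else "exceeds DP 1.3 payload"
    print(f"4K@144Hz {label}: {need:.2f} Gbit/s ({verdict})")

# 8-bit comes to ~28.67 Gbit/s and 10-bit to ~35.83 Gbit/s, both over 25.92,
# while 4K@120Hz 8-bit (~23.89 Gbit/s) still fits -- hence the DSC speculation.
```

This reproduces the figures quoted above: 4K@144Hz needs more than DP 1.3's payload even at 8-bit color, while 4K@120Hz squeaks in.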
#33
Caring1
I think popping down to Aldi and picking up a cheap 48" 4K UHD TV will do the job, 60Hz is enough for my system.
#34
librin.so.1
FordGT90ConceptThat doesn't include overhead of DisplayPort
VinskaDP1.3 goes up to 25.92Gbit/s for data.
25.92Gbit/s is with DP overhead already removed. DP1.3/1.4 runs at 32.4Gbit/s with overhead included.
#36
RejZoR
Prima.VeraYou're overreacting a lot like a spoiled child. Take a chill pill. :)) There is a difference between 30Hz and 60Hz, but only a slight one between 60Hz and more. Relax!
I have a 3440x1440@100Hz monitor, and there is almost no difference between gaming at 100Hz and 60Hz on my old monitor. Also because I never used VSYNC either ways.
Seat me in front of a 144Hz screen that works at only 60Hz and I'll tell you something is not right. I don't know if it's just my screen or all 144Hz screens in general. If I use a 60Hz screen at 60Hz, it's fine. If I use a 144Hz screen at 144Hz, it's fine. If I use a 144Hz screen at 60Hz, I can sense it's running at a lower refresh rate. I did use 120Hz once and it was a lot less noticeable, but I could still sense it.
Caring1I think popping down to Aldi and picking up a cheap 48" 4K UHD TV will do the job, 60Hz is enough for my system.
The problem with LCD TVs is input lag. Some have horrendous input lag, like my first-generation 4K Philips LCD TV. It's an amazing TV with an outstanding image, but if I hook up my PS2 to it, it's impossible to play because the delay between input and what actually happens on screen is so huge. It's like half a second of delay, if not more. For a PC monitor this is even more important imo. It certainly has to be tested before purchase if possible.
#37
Vayra86
droopyROOookkk ...
I thought that sharp images are given by resolution/ppi not refresh rate and that in games input lag>refresh rate. I guess i will have to try one of this 144Hz displays one day even though i don`t play twitch shooters.
I think (hope, at least...) what RejZoR is referring to is the "it's hard to go back" experience. I have it too. When I play games on a regular 60Hz fixed refresh, it feels choppier than it should, or than it ever did before I owned a 120Hz gaming panel. And whenever it drops below 60 for even a split second, I feel like I'm playing a total piece of shit, even though a few years ago I would play at a shitty 25-40 fps at most on a crappy laptop TN.

Even today I would never trade away the guarantee of having above 75 fps for a higher resolution. I'm happier gaming at 1080p and looking at some jaggies than losing 120Hz and the tear-free, super smooth action that comes with it. Gaming is all about *motion resolution*, and much less about how detailed a static image can be. That is why, for me, high refresh panels > resolution and most everything else.

The additional advantage is that high refresh panels just don't have visible tearing when you play with uncapped FPS either, they handle variable framerates so much better even without any kind of special sync (Vsync/Gsync/Freesync). You don't need it at all for anything that runs above 75 fps.
#38
librin.so.1
RejZoRProblem with LCD TV's is input lag. Some have horrendous input lag. Like my first generation 4K Philips LCD TV. It's an amazing TV with outstanding image, but if I hook up my PS2 on it, it's impossible to play it because the delay between input and actual happenings is so huge. It's like half a second delay if not more. As PC monitor, this is even more important imo. Certainly has to be tested before purchase if possible.
Some TVs have a "gaming mode" knob buried deep in their settings, which greatly reduces the latency[1]. The latency still tends to be too damn high for my tastes.

[1] AFAIK TVs do a whole crapton of processing / filtering / yadda yadda on the image, often using several frames as reference and thus also buffering quite a bit for that, too. Hence the lag. Gaming mode, in TVs that have it, turns off most, but not all, of it.
#39
Vayra86
VinskaSome TVs have "gaming mode" knob slapped deep in their settings, which greatly reduces the latency[1]. The latency still tends to be too damn high for my tastes.

[1] AFAIK TVs do a whole lotta crapton on processing / filtering / yadda yadda on the image, often using several frames as reference and thus also buffering quite a bit for that, too. Hence the lag. Gaming mode in TVs that have it turns most, but not all of it.
TVs do all sorts of shit you don't want. They oversaturate color, they interpolate static things like your crosshair in an FPS so you get a static crosshair that becomes blurry as fuck, they have a minimum input lag of 24ms (the best LCD Sony Bravia has this, only one model, all others are higher, and 4K is a factor of two above that), and they hide many darker shades in blackness or backlight bleed.

Staying far, far away from that for gaming - been there done that :P
#40
RejZoR
VinskaSome TVs have "gaming mode" knob slapped deep in their settings, which greatly reduces the latency[1]. The latency still tends to be too damn high for my tastes.

[1] AFAIK TVs do a whole lotta crapton on processing / filtering / yadda yadda on the image, often using several frames as reference and thus also buffering quite a bit for that, too. Hence the lag. Gaming mode in TVs that have it turns most, but not all of it.
It depends. My TV doesn't change a single bit even if I turn off all image enhancement features. The latest generation of Philips, particularly the new 4K model 43PUS6501 that I've tested, has almost unnoticeable lag even with all features enabled in normal mode, while still delivering an excellent image.
#41
xorbe
Yeah, my new 4K TV over-saturates red, sharpness was jacked to max by default, and I have to adjust the audio delay in MPC-HC.
#42
thesmokingman
techy1there are... precisely 0 GPUs that can run everything at 4K 60fps, but we are close - the 1080 Ti and AMD Vega could be... then you Crossfire/SLI those biatches and there you go... but those top GPUs x2 (or x3) + a top system will be a fraction of that monitor's price though :(
Crosses fingers lol...
#43
TheinsanegamerN
Vayra86God no, Dual GPU needs to die ASAP if I'm totally honest. Give us a solid foolproof implementation of DX12's asymmetric GPU scaling instead. Mix and match, game/engine independent support. At that point we can talk about anything other than single GPU imo - and that is entirely up to the lacking support on several big titles in recent years. Both NV and AMD have dropped the ball countless times because they (also) rely on developer time for each specific game. We had just survived the Frame Pacing issue, and AMD had just gotten Crossfire on point... and then we get DX12 that destroys the dual GPU market again with a vague 'DIY' implementation, with MS putting the final nail in the coffin of multi-GPU altogether with that abomination they call UWP.

Too bad that is utopia; just like dual GPU and great support, it will always be a painful exercise at some point, sooner or later, and always in the games where you want that horsepower the most (remember The Division just now? they waited until Pascal before they came out with a fix). I still have the non-existent SLI support for The Elder Scrolls Online very fresh in my memory too :)
So, instead of dual GPUs, you want DX12's multi-GPU scaling? Which uses more than one GPU? These two things are not mutually exclusive. Ideally, DX12 would replace typical SLI/Crossfire, but to insist that single GPU is the way to go? That is incredibly short-sighted.

Every generation's wafers get more expensive, and big GPUs get exponentially harder to make. Two small GPUs are easier to make and more profitable than one big GPU. DX12 has support for multiple GPUs, but it is a new API; devs are not experienced with it yet. Dual GPU support will come, though, just like with DX11. The last few years have been bad all around for games, not just in the dual GPU market: broken games, delayed support, draconian DRM, on top of brand new APIs. Give the industry a few years to figure out DX12, and dual GPUs will become a proper solution again.

And before anyone says "well, game engines won't support it": UE3 doesn't support AA, yet every UE3 game I've played has an option for AA. Just because an engine doesn't support it out of the box doesn't mean it won't work, or that it won't get widespread adoption once someone figures it out. Once one dev gets UE4 playing nicely with multiple GPUs, everyone else will copy the implementation if they have any will to support the PC market. Devs that don't, well, that's no different than it's been for the past 10 years.
#44
PP Mguire
Vayra86TV's do all sorts of shit you don't want. They oversaturate color, the interpolate static things like your Crosshair in an FPS so you get a static crosshair that becomes blurry as fuck, they have a minimum input lag of 24ms (the best LCD Sony Bravia has this, only one model, all others are higher, and 4K is a factor two above that) and they hide many darker shades in blackness or backlight bleed.

Staying far, far away from that for gaming - been there done that :p
Source it. If my TV was 24ms I'd be back on a monitor before you could say "lag". Some TVs require a process to get the lag down, but it does work. It also helps to research the TV in question before actually buying it if gaming is intended.
#45
alwayssts
PP MguireSource it. If my TV was 24ms I'd be back on a monitor before you could say lag. Some TVs require a process to get the lag off but it does work. It also helps to do research on the TV in question before actually buying it if gaming is intended.
Right. My old P series has been fine: 18.7ms according to rtings/CNET (on the port that will do 4K60 @ 4:2:0 or 1080p120). The new ones seem even better, while being slightly more versatile:

HDMI2.0a ports: 34.7ms
( 4k@60 4:4:4, 8-bit | 4k@60, 4:2:2, 12-bit | 4k@60, 4:2:0 12-bit)

HDMI 1.4 'Gaming Port': 17.7ms
(4k@60, 4:2:2, 8-bit | 4k@60, 4:2:0, 10-bit | 1080p@120, 4:4:4, 10-bit | 1080p@120, 4:2:2, 12-bit)

So, in other words, 2-3 frames... and in reality pretty much ALL gaming is going to account for three frames of lag (including from a typical wireless controller, etc.).

I don't find anything in that range bad at all; my old LG was 32-33ms and I found that fine as well. If I can keep lag under a few frames @ 60fps I'm totally happy, and can't really notice it, especially when using a controller. That said, I don't play competitive FPS.


I'm with whoever believes 4K @ 120Hz, 10-bit [HDR], with adaptive sync on large "displays" (TVs) is the be-all, end-all of these conversations. That said, I'm fine with whatever Vizio continues to put out until SOMEBODY picks up the DisplayPort ball for 65'' screens. I, like certainly everybody else, hope that is in time for OLED to take center stage in our lives.
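For illustration, the "2-3 frames" conversion above is just the measured lag divided by the frame time (one frame at 60 fps lasts 1000/60 ≈ 16.7 ms; the latencies are the ones quoted in the post):

```python
# Convert measured input lag (ms) into an equivalent number of frames
# at a given refresh rate.

def lag_in_frames(lag_ms, refresh_hz=60):
    frame_time_ms = 1000 / refresh_hz  # ~16.7 ms per frame at 60 Hz
    return lag_ms / frame_time_ms

for label, ms in [("HDMI 2.0a ports", 34.7), ("HDMI 1.4 gaming port", 17.7)]:
    print(f"{label}: {ms} ms = {lag_in_frames(ms):.1f} frames at 60 Hz")
```

The 34.7 ms ports come out to roughly two frames at 60 Hz and the 17.7 ms port to roughly one; adding a frame or so for a typical wireless controller gives the 2-3 frame total the post describes.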
#46
Legacy-ZA
"27-Inch 1080p OLED 144Hz monitor for me. Keep your 4k monitors until nGreedia stops their bull**** and just launch the top cards from the get go again. I am sure 2x 1080Ti's could have gotten at least between 100-120 FPS, but Noooooooooooooooooooooooooooooooooooooooooooo, we have to wait another year now.
#47
TheGuruStud
ChaitanyaThere aren't too many GPUs capable of handling 4k at 60fps, I hope display makers push GPU manufacturers to seriously make faster GPUs.
But you can also play at 1080p with no blurring and a really high framerate. If it supports adaptive sync, it will be amazing for 1080p. Pixel density ftw. Although I'd rather have a 32 or 40".
#48
Dethroy
4K? Not interested. Such a waste of GPU power. Gimme 1440p ultrawide w/ HDR & 144Hz.
#49
nemesis.ie
FordGT90ConceptEdit: DSC is supposed to be 3:1 compression so...should theoretically be possible but only with DisplayPort 1.4. No Polaris for me! :cry:
Polaris will apparently support DP1.4 once it's official. :)
#50
R-T-B
TVs also often don't support 4:4:4 chroma, which is a nice way of saying the color image is stored at a lower resolution than the monochrome (luma) image. The human eye doesn't tend to notice this very much, but it isn't good.
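A rough sketch of why subsampling saves bandwidth, using the standard J:a:b notation (a = chroma samples in the first row of a Jx2 pixel block, b = chroma samples in the second row; luma stays full resolution):

```python
# Relative bandwidth of chroma-subsampled formats vs full-resolution 4:4:4.

def relative_bandwidth(j, a, b):
    """Fraction of 4:4:4 bandwidth used by a J:a:b format, per Jx2 block."""
    luma = 2 * j            # full-resolution luma samples in a Jx2 block
    chroma = 2 * (a + b)    # two chroma planes (Cb and Cr)
    full = 2 * j * 3        # 4:4:4 carries three samples per pixel
    return (luma + chroma) / full

for fmt in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    print(f"4:{fmt[1]}:{fmt[2]} -> {relative_bandwidth(*fmt):.0%} of 4:4:4")
```

So 4:2:2 carries two thirds and 4:2:0 carries half the data of full 4:4:4, which is why TVs and HDMI modes lean on it when bandwidth is tight.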