
NVIDIA Expects Partners to Release 4K, 144 Hz G-Sync Displays With HDR in April

Are you saying that extra inch is the deal breaker between 1440p and 4K?

Scratching my head at that too. I would be pickier about those enormous bezels than about a 2.54 cm larger diagonal (hmm, is "diameter" the word in English for rectangles? :( ).
 
I'd rather have a 34" UW with higher res and at least 144 Hz... I have a 34" UW panel that does 166 Hz (low res :( ), but going down from that would be painful, and going smaller than a 34" UW is a no-go.

I just hope this means that the bandwidth limitation is going to be lifted soon... i.e. a 2080 Ti... I am not about SLI; too many times I have done it and too many times I have dealt with its shortfalls.
 
Are you saying that extra inch is the deal breaker between 1440p and 4K?

I wouldn't say deal breaker. More personal preference? I installed a 27-inch 1080p monitor for the president of my company and it looked pixelated (IMO), then we installed a 28-inch 2160p one and it almost seemed too small (IMO), then we installed a 27-inch 1440p one and it seemed just right. Not too big (pixelated) and not too small. (He's old, about 65, and needed a monitor with a lot of real estate to show his vast empire of properties to his rich friends.) lol
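Those impressions track with plain pixel-density arithmetic: PPI is the pixel-count diagonal divided by the physical diagonal in inches. A quick back-of-the-envelope sketch with the three sizes from the post above (the helper name is my own):

```python
import math

def ppi(diagonal_in: float, width_px: int, height_px: int) -> float:
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# The three monitors discussed above:
for size, w, h in [(27, 1920, 1080), (28, 3840, 2160), (27, 2560, 1440)]:
    print(f'{size}" {w}x{h}: {ppi(size, w, h):.0f} PPI')
# -> roughly 82, 157 and 109 PPI respectively
```

Which lines up with the anecdote: ~82 PPI looks pixelated up close, ~157 PPI makes unscaled UI elements tiny, and ~109 PPI sits in the comfortable middle at 100% scaling.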
 
I wouldn't say deal breaker. More personal preference? I installed a 27-inch 1080p monitor for the president of my company and it looked pixelated (IMO), then we installed a 28-inch 2160p one and it almost seemed too small (IMO), then we installed a 27-inch 1440p one and it seemed just right. Not too big (pixelated) and not too small. (He's old, about 65, and needed a monitor with a lot of real estate to show his vast empire of properties to his rich friends.) lol

Well, this is a gaming monitor, and a very bad option for your use case anyway. But I would say it depends more on how the OS handles high DPI than anything else.
 
I wouldn't say deal breaker. More personal preference? I installed a 27-inch 1080p monitor for the president of my company and it looked pixelated (IMO), then we installed a 28-inch 2160p one and it almost seemed too small (IMO), then we installed a 27-inch 1440p one and it seemed just right. Not too big (pixelated) and not too small. (He's old, about 65, and needed a monitor with a lot of real estate to show his vast empire of properties to his rich friends.) lol

Again, each to their own.

In addition to my 27" 4k display, I have a 25" Dell 1440p display and I think that's about the right size for that resolution. Prior to that I had a 23" 2048x1152 display and that felt about right for that size...

I just like to have a lot of desktop real estate when I'm not playing games on my screen, as I actually use it for work as well as playing games on it. I don't want to have to move my head around just to see the entire screen, so going much larger doesn't seem practical to me in that sense.
 
Again, each to their own.

In addition to my 27" 4k display, I have a 25" Dell 1440p display and I think that's about the right size for that resolution. Prior to that I had a 23" 2048x1152 display and that felt about right for that size...

I just like to have a lot of desktop real estate when I'm not playing games on my screen, as I actually use it for work as well as playing games on it.

Ooh, a true 2K 16:9 display; some CRT maybe?
 
Ooh, a true 2K 16:9 display; some CRT maybe?

Nope, Samsung made a couple and Acer made a couple. Mine was a Samsung. The backlight slowly faded, though; good old days of fluorescent backlights...
 
Well, this is a gaming monitor, and a very bad option for your use case anyway. But I would say it depends more on how the OS handles high DPI than anything else.

Thought we were having a side conversation about 4K and sizes rather than the actual monitor. But I agree, I wouldn't be installing this monitor for business work.

Again, each to their own.

In addition to my 27" 4k display, I have a 25" Dell 1440p display and I think that's about the right size for that resolution. Prior to that I had a 23" 2048x1152 display and that felt about right for that size...

I just like to have a lot of desktop real estate when I'm not playing games on my screen, as I actually use it for work as well as playing games on it. I don't want to have to move my head around just to see the entire screen, so going much larger doesn't seem practical to me in that sense.

Yes, to each their own. Not all eyes are equal.
 
....I honestly believe that Nvidia has been sandbagging for so long that they actually do have something that can handle this... for a price, of course.
 
When I see benchmarks with the newest games that can push 144 fps+ @ 4K with MAX settings, I might raise an eyebrow. But I know that is going to be a good long while.

Uhm, right, why? It has G-Sync, so staying within the 30-144 range (though preferably over 40 fps) should be enough; that's the crucial point of having VRR in the first place. Those high frame rates are usually only needed for competitive gaming anyway, and lowering graphical fidelity is no problem in those use cases (it might even be preferable, to avoid distraction from non-essential graphical effects).
 
....I honestly believe that Nvidia has been sandbagging for so long that they actually do have something that can handle this... for a price, of course.

But they are the leader and AMD has nothing close to offer; they have no reason to release anything new when they are making BANK! on the current gen and pumping it out. They are only competing against themselves. I am sure they could release a 2080 Ti today, but why would they?
 
I will be interested in seeing where these are priced. Still waiting to upgrade from my old 30" Dell.
 
My grandfather can't tell the difference between 480p and 1080p at any distance. A person's vision is a big factor.

I don't think taking an extreme example is a good idea when advising the average person. I can see the benefits of 4K and my vision is awful. That's largely thanks to the existence of glasses.
 
Coming from a 27" 4K monitor to a 38" 3840x1600 monitor, I don't see text being any less crisp than on the 4K monitor. I'd say the 38" is better than the 27" in almost all categories. A 40" 4K monitor with 144 Hz and HDR would be a much better sell. But knowing it has G-Sync, it would be hella expensive.
 
Once you go 21:9 you cannot go back.
 
Uhm, right, why? It has G-Sync, so staying within the 30-144 range (though preferably over 40 fps) should be enough; that's the crucial point of having VRR in the first place. Those high frame rates are usually only needed for competitive gaming anyway, and lowering graphical fidelity is no problem in those use cases (it might even be preferable, to avoid distraction from non-essential graphical effects).
And how many people actually game competitively?

Meanwhile in the real world, people want all the details on, like a game was designed to be seen. The GPUs that can do that, especially with these monitor specs, are that elusive white whale (or unicorn, for those who prefer).

Edit: I realized there might be some differing views on “competitive”. To me, this means the pros, who get paid. Otherwise I see it as recreational.
 
I can't even get Nvidia to work properly on 4K TVs (I've heard it's not a prob with AMD). Windows HDR mode creates an awful dimming of the screen. I've seen others complaining of the same thing for the past year too.

So yeah, screw them. When Vega gets cheaper, I'm jumping ship.
 
And how many people actually game competitively?

Meanwhile in the real world, people want all the details on, like a game was designed to be seen. The GPUs that can do that, especially with these monitor specs, are that elusive white whale (or unicorn, for those who prefer).

Edit: I realized there might be some differing views on “competitive”. To me, this means the pros, who get paid. Otherwise I see it as recreational.

Yeah, I should have said fast-paced multiplayer games or esports (god I hate that term). If the game is not made by id Software, ~100 fps at 4K is a kind of rare imaginary animal.
 
I simply refuse to pay for Gsync due to its vendor lock-in. It's a bad practice for any consumer who cares about the marketplace. You keep monitors across multiple GPU upgrades, so it just doesn't make proper sense to me.

And having played on high refresh, you do get accustomed to a snappiness in everything that 60 fps/hz or lower just can't offer, Gsync or not, while 4K resolution detail is lost for the most part in moving images. Just don't see the merit of adding such a huge GPU performance requirement for it.
 
I simply refuse to pay for Gsync due to its vendor lock-in. It's a bad practice for any consumer who cares about the marketplace. You keep monitors across multiple GPU upgrades, so it just doesn't make proper sense to me.

And having played on high refresh, you do get accustomed to a snappiness in everything that 60 fps/hz or lower just can't offer, Gsync or not, while 4K resolution detail is lost for the most part in moving images. Just don't see the merit of adding such a huge GPU performance requirement for it.

Once you play on Gsync at 144 Hz+, you cannot go back. I can see what you are saying, but I still think Gsync is worth the extra pennies I spent.
 
I simply refuse to pay for Gsync due to its vendor lock-in. It's a bad practice for any consumer who cares about the marketplace. You keep monitors across multiple GPU upgrades, so it just doesn't make proper sense to me.

And having played on high refresh, you do get accustomed to a snappiness in everything that 60 fps/hz or lower just can't offer, Gsync or not, while 4K resolution detail is lost for the most part in moving images. Just don't see the merit of adding such a huge GPU performance requirement for it.
Let me know when monitor manufacturers get off their ass and build strobing backlights into a decent panel. Only BenQ, as far as I know, has done that, and the panels they used are trash compared to what is being offered today. When was the last one even sold? It was a trash TN, and years ago, last I saw.

Only Nvidia supports backlight strobing via ULMB, so let me know when manufacturers actually make a damn product that has it built in... otherwise my options are ULMB... or no ULMB... the latter isn't a valid option...

Once you play on Gsync at 144 Hz+, you cannot go back. I can see what you are saying, but I still think Gsync is worth the extra pennies I spent.

Gsync is lame. ULMB is where it's at... though there is supposedly a way to trick it so both work at once.
 
Announcement in April (on the 1st), sales in July :)
They will announce only the price, so people can be prepared for when sales begin :)
 
Once you play on Gsync at 144 Hz+, you cannot go back. I can see what you are saying, but I still think Gsync is worth the extra pennies I spent.

I'm not contesting the use of Gsync, especially (in fact, exclusively) when you dip below 60 FPS; the advantages there are clear. But high refresh and Gsync in fact exclude one another: you don't need it when you are pushing 100+ FPS. At that point you can just use Fast Sync, which is free AND combines well with all sorts of other monitor goodies like a strobing backlight.
 