Friday, April 23rd 2021

iiyama Announces 27'' WQHD 165 Hz IPS GB2770QSU Gaming Monitor

Boosted by Fast IPS, a new type of IPS panel technology, the new G-Master delivers next-level gaming with an outstanding 0.5 ms Moving Picture Response Time and renders high-fidelity, vivid battleground scenes with outstanding color accuracy. HDR 400 certified, the GB2770QSU is a true 8-bit monitor guaranteeing 400 cd/m² luminance. Offering FreeSync Premium Pro, the highest tier of FreeSync support, the monitor dynamically adjusts its vertical refresh rate to match the frame rate of the graphics card, resulting in virtually zero tearing or stuttering. Low framerate compensation effectively removes the minimum refresh rate boundary.

To enhance visibility in shadowed areas, users can switch on the Black Tuner function, improving viewing performance and helping to spot the enemy earlier. Whether you're playing a first-person shooter or a strategy game, predefined and custom gaming modes are available via the on-screen display menu, making it easy to adjust the screen settings to personal preference. The new Red Eagle is equipped with HDMI and DisplayPort inputs. It also features a headphone connector and a USB hub with a charging function. Flicker-free and blue light reducer technologies reduce the strain put on the eyes by long hours spent in front of the screen.
Featuring a height-adjustable stand with tilt and swivel capabilities, the G-Master GB2770QSU helps you maintain a healthy body posture and promotes comfort during marathon gaming sessions.

The monitor is also available in Full HD resolution in 27" and 24" sizes.
Source: iiyama

8 Comments on iiyama Announces 27'' WQHD 165 Hz IPS GB2770QSU Gaming Monitor

#1
DeathtoGnomes
Featuring a height-adjustable stand with tilt and swivel capabilities
I was gonna say that stand looks pretty tall; it's nice to see taller stands.
Posted on Reply
#2
Caring1
"0.5 ms Moving Picture Response Time"
What's that in GtG?
Posted on Reply
#3
Nordiga
Very nice, professional look! Some might say it is boring, but I like it plain and simple; it's a monitor, after all.
The sensible pricing this brand is known for is pretty much guaranteed; now I just hope the performance is on par with or better than the competition.
Posted on Reply
#4
Chrispy_
Caring1"0.5 ms Moving Picture Response Time"
What's that in GtG?
I believe that's marketing bullshit lingo for "strobing backlight" or "black frame insertion"

GtG is going to be the same for iiyama as for everyone else, provided they don't screw up the overdrive firmware (and that happens often enough that you really do need to look at reviews with detailed response-time testing of any monitor model).

LG's 27" 1440p IPS panels seem to reliably test at 5ms or so, with a few terrible midtone outliers if the firmware of that particular model sucks.
AUO's equivalent AHVA panels seem to go down to sub-4 ms without significant overshoot in gamer-focused premium models (Acer Nitro, Asus ROG).

0.5 ms is, of course, utter lies in every possible meaning of the term. Even stretching the truth with a strobing backlight is dumb, because at a 0.5 ms strobe length it'd be putting out a peak theoretical brightness of 33 nits, and thanks to real-world physics probably closer to 25 nits. Unusable even in a darkened room, but good enough for a marketing lie because nobody's ever going to hold them accountable.
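The 33-nit figure follows from the backlight duty cycle. A minimal sanity check, assuming (my numbers, not the commenter's) the panel's rated 400 cd/m² sustained brightness driven at 165 Hz with a single 0.5 ms pulse per frame:

```python
# Rough sanity check of the strobing-backlight brightness claim above.
# Assumptions: 165 Hz refresh, 400 cd/m^2 sustained panel brightness,
# and one 0.5 ms backlight pulse per frame.
refresh_hz = 165
frame_ms = 1000 / refresh_hz            # ~6.06 ms per frame
strobe_ms = 0.5
duty_cycle = strobe_ms / frame_ms       # fraction of each frame the backlight is lit
effective_nits = 400 * duty_cycle
print(round(effective_nits))            # -> 33
```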
Posted on Reply
#5
voltage
"true 8-bit"

what happened to 10? too expensive to produce?
Posted on Reply
#6
Chrispy_
voltage"true 8-bit"

what happened to 10? too expensive to produce?
I'm pretty sure that 10-bit panels are too niche to make any real development headway. The advantage of 10-bit panels over 8-bit with FRC is really hard to spot. 6-bit panels were easy to spot as grainy, patterned, and noisy, but since then pixels have become smaller, refresh rates have increased (making FRC more effective), and the jumps between colours are far more subtle. Mass-produced 8-bit panels are getting all the investment and being developed the fastest. The advantages over 10-bit are threefold: cheaper, faster, brighter.
  • Cheaper because of economies of scale and 8-bit panels being suitable for both 8- and 10-bit output.
  • Faster because switching to one of 256 intensities is much faster than switching to one of 1024 intensities.
  • Brighter because the simpler local subpixel drivers (the 'unlit' parts between subpixels) are smaller and block less of the backlight, giving better efficiency and peak brightness.
6-bit panels with FRC dithering weren't great: the dithering was slower, larger, and in very noticeable steps, often using basic, static dithering grids that made it even more obvious.
8-bit panels with FRC dithering are far, far better; you'd likely need a high-speed camera with a great macro lens, and then some histogram manipulation of the results, to clearly identify it. Improvements in dithering tech also make it less uniform and closer to random noise/film grain.
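As a sketch of how FRC temporal dithering works in principle, an 8-bit panel can approximate a 10-bit level by cycling between two adjacent 8-bit levels so the time-averaged intensity matches. The simple 4-frame ordered pattern below is illustrative only, not any vendor's actual scheme:

```python
# Toy FRC (frame rate control) sketch: approximate a 10-bit target level
# on an 8-bit panel by alternating adjacent 8-bit levels across frames.
def frc_frames(target_10bit, n_frames=4):
    base = target_10bit // 4            # nearest-lower 8-bit level
    step_ups = target_10bit % 4         # how many of the 4 frames show base + 1
    return [min(base + 1, 255) if i < step_ups else base for i in range(n_frames)]

frames = frc_frames(513)                # 10-bit 513 sits between 8-bit 128 and 129
print(frames)                           # -> [129, 128, 128, 128]
print(sum(frames) / len(frames))        # -> 128.25, i.e. exactly 513 / 4
```

Modern implementations use spatially varying, pseudo-random patterns instead of a fixed cycle, which is why the result reads as faint noise rather than visible flicker.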
Posted on Reply
#7
lynx29
Chrispy_I'm pretty sure that 10-bit panels are too niche to make any real development headway. The advantage of 10-bit panels over 8-bit with FRC is really hard to spot. 6-bit panels were easy to spot as grainy, patterned, and noisy, but since then pixels have become smaller, refresh rates have increased (making FRC more effective), and the jumps between colours are far more subtle. Mass-produced 8-bit panels are getting all the investment and being developed the fastest. The advantages over 10-bit are threefold: cheaper, faster, brighter.
  • Cheaper because of economies of scale and 8-bit panels being suitable for both 8- and 10-bit output.
  • Faster because switching to one of 256 intensities is much faster than switching to one of 1024 intensities.
  • Brighter because the simpler local subpixel drivers (the 'unlit' parts between subpixels) are smaller and block less of the backlight, giving better efficiency and peak brightness.
6-bit panels with FRC dithering weren't great: the dithering was slower, larger, and in very noticeable steps, often using basic, static dithering grids that made it even more obvious.
8-bit panels with FRC dithering are far, far better; you'd likely need a high-speed camera with a great macro lens, and then some histogram manipulation of the results, to clearly identify it. Improvements in dithering tech also make it less uniform and closer to random noise/film grain.
I have a 10-bit + FRC panel (effectively 12-bit), and I can't tell the difference at all between that and 8-bit mode. lol

edit: doesn't look like this panel will be coming to the USA from what I can see... :( I'm actually interested in this one if the price is right ($330)
Posted on Reply
#8
Chrispy_
lynx29I have a 10-bit + FRC panel (effectively 12-bit), and I can't tell the difference at all between that and 8-bit mode. lol

edit: doesn't look like this panel will be coming to the USA from what I can see... :( I'm actually interested in this one if the price is right ($330)
I can't remember where I read it, but I remember an article about colourblindness citing that normal human vision can distinguish around 3 million individual colour shades, of which only a few hundred thousand are blues; the majority are reds and greens.



7-bit is about 2 million colours and 8-bit is about 16.8 million, so 3 million shades puts typical human vision at barely more than 7-bit colour, which is why you can discern no difference between 8-bit and 12-bit on your fancy 10-bit+FRC monitor!
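The per-bit-depth totals (levels per channel, cubed across R, G, and B) work out as follows; treating "shades" as the full RGB cube is my reading, but the arithmetic itself is standard:

```python
# Total distinct colours at each per-channel bit depth: (2^bits)^3.
for bits in (6, 7, 8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels ** 3:,} colours")
# 7-bit gives ~2.1 million and 8-bit ~16.8 million, so a ~3 million-shade
# visual system sits only slightly above 7-bit, as argued above.
```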

The best granularity we have is limited to a very narrow range of greyscale in the near-black range, since light sensitivity in humans is non-linear while the 0-255 greyscale steps of 8-bit colour are linear increases in intensity. If you want to test this, make an 8-bit colour fade from 0 to 64 greyscale and view it fullscreen in a dark room. You will struggle to see banding if you focus on it, since your focal area is mostly colour vision, but if you pay attention to what is just outside your focal point you may spot some banding.
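The banding test described above can be generated with a few lines of stdlib-only Python; the PGM output format, image size, and filename are my choices, not part of the original suggestion:

```python
# Write an 8-bit horizontal gradient from greyscale 0 to 64 as an ASCII
# PGM image, to view fullscreen in a dark room (no third-party libraries).
width, height = 640, 360                        # arbitrary size
row = [x * 65 // width for x in range(width)]   # levels 0..64, ~10 px per band
with open("near_black_gradient.pgm", "w") as f:
    f.write(f"P2\n{width} {height}\n255\n")     # PGM header: format, size, maxval
    for _ in range(height):
        f.write(" ".join(map(str, row)) + "\n")
```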

That means human vision is probably about 9-bit for dark greyscale peripheral vision, and 6.5 to 7.5-bit for colour vision.

The reason 10- and 12-bit panels exist is probably compatibility with 10- and 12-bit raw images. Even if people struggle to tell the difference between 8-bit and 12-bit, when taking raw 12-bit source material and then stretching/editing the histogram, there is still enough colour data to avoid banding when outputting at 8-bit.

TL;DR: human vision is non-linear, and beyond 8-bit there is almost no point.

Edit - found a better, linear-scale image that shows just how useless our eyes are at blue light:

Posted on Reply
Copyright © 2004-2021 www.techpowerup.com. All rights reserved.
All trademarks used are properties of their respective owners.