
MSI Announces a Trio of Optix MPG Series Gaming Monitors

Joined
Feb 14, 2020
Messages
123 (0.08/day)
What's easier to understand for a normal consumer - the well-known standard it fully supports, or its bandwidth, which nobody knows apart from enthusiasts/specialists? (Clue: the former.)
You don't get the problem: almost every seller calls their own cables "HDMI 2.1", but only "Ultra High Speed" cables are certified to support the bandwidth these specs need. You need an expensive signal generator (such as the Murideo 8K Seven) to properly test a cable like that.
While with short 1-3 m cables it makes no difference at all, when you buy long cables you need a REAL certification.

MSI shouldn't promote misleading tags.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
You don't get the problem: almost every seller calls their own cables "HDMI 2.1", but only "Ultra High Speed" cables are certified to support the bandwidth these specs need. You need an expensive signal generator (such as the Murideo 8K Seven) to properly test a cable like that.
While with short 1-3 m cables it makes no difference at all, when you buy long cables you need a REAL certification.
... so the solution would be to bar anything but actual 48Gbps cables from using the HDMI 2.1 label? That's hardly difficult to achieve. Sure, people will always try to get around things like that, and you'll have grey-market crap seeping in through the cracks, but that's already reality today. Simplifying things will only make them better.
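For a rough sense of the bandwidth involved, here's a minimal back-of-the-envelope sketch (my own illustrative figures, not an official HDMI calculation): it estimates the uncompressed RGB data rate for a few modes with a single assumed ~25% allowance for blanking and link-encoding overhead, and compares it against the nominal 18 Gbps (HDMI 2.0) and 48 Gbps (HDMI 2.1 "Ultra High Speed") link limits.

```python
# Rough sketch, not an official HDMI bandwidth calculation: estimates the raw
# video data rate for a given mode and compares it against the nominal link
# limits of HDMI 2.0 (18 Gbps TMDS) and HDMI 2.1 (48 Gbps FRL). Blanking and
# encoding overhead are folded into one assumed fudge factor.

def video_bitrate_gbps(h, v, refresh_hz, bits_per_channel, overhead=1.25):
    """Approximate link bandwidth needed for uncompressed RGB video."""
    bits_per_pixel = 3 * bits_per_channel          # R, G, B
    active = h * v * refresh_hz * bits_per_pixel   # active pixel data only
    return active * overhead / 1e9                 # blanking/encoding allowance

HDMI_2_0_GBPS = 18.0   # nominal TMDS limit
HDMI_2_1_GBPS = 48.0   # nominal FRL limit ("Ultra High Speed" cables)

modes = {
    "4K60 8-bit":   (3840, 2160, 60, 8),
    "4K120 10-bit": (3840, 2160, 120, 10),
    "8K60 10-bit":  (7680, 4320, 60, 10),
}
for name, mode in modes.items():
    need = video_bitrate_gbps(*mode)
    ok20 = "OK" if need <= HDMI_2_0_GBPS else "no"
    ok21 = "OK" if need <= HDMI_2_1_GBPS else "no (needs DSC)"
    print(f"{name}: ~{need:.1f} Gbps | HDMI 2.0: {ok20} | HDMI 2.1: {ok21}")
```

The point being that the high-refresh modes these monitors advertise only fit within the 48 Gbps budget, so a cable that silently tops out at HDMI 2.0-era bandwidth will still "work" at lower settings - which is exactly why the labelling matters.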
 

Deleted member 185088

Guest
32" at 1440p is the same PPI as 24" at 1080p so please explain how come it's too low?

You would have to use a 20" 1080p display to match a 27" 1440p display's PPI value.
I think you misunderstood me; I meant 32" 4K, not 1440p.
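The PPI math behind these comparisons is simple diagonal geometry; here's a quick sketch (assuming square pixels and taking the quoted diagonal sizes at face value):

```python
# Quick PPI check for the sizes discussed above.
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.1f} PPI')   # ~91.8
print(f'32" 1440p: {ppi(2560, 1440, 32):.1f} PPI')   # ~91.8 (same as 24" 1080p)
print(f'27" 1440p: {ppi(2560, 1440, 27):.1f} PPI')   # ~108.8
print(f'20" 1080p: {ppi(1920, 1080, 20):.1f} PPI')   # ~110.1 (close to 27" 1440p)
print(f'32" 2160p: {ppi(3840, 2160, 32):.1f} PPI')   # ~137.7 (the 32" 4K case)
```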
 
Joined
Oct 4, 2017
Messages
695 (0.29/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2


This image is very misleading because the layers are arranged in the opposite order from the viewer's perspective. The quantum dots on this product are used in an enhancement film (QDEF); they are placed right after the backlight layer in order to enhance the blue light of the backlight, as seen here:



It's a shame to see monitors this far behind in terms of technology, because this is essentially the old quantum dot technology used in commercial TVs from 2013. Nowadays (from 2019 on), TVs also have the quantum dot layer applied in front of the LC layer, much closer to the eye of the viewer; this is called a QD color filter (QDCF), as seen here:

(In this example QDs are not used to enhance the backlight.)


Yet in the PC monitor market we get to enjoy 2013 technology in 2021... Can't wait for OLED to shake up this lethargic PC monitor market!
 
Joined
Jun 10, 2014
Messages
790 (0.22/day)
Location
Poland
System Name Proper
Processor 5900X + OC
Motherboard GB X570s Elite AX
Cooling WC Heatkiller 3.0 LT
Memory G.Skill 3600 CL16
Video Card(s) Zotac RTX 3070 Ti Trinity LC'ed + OC
Storage KC2500 1TB + A2000 1TB
Display(s) GB M32Q
Case Fractal Define R6 USB C
Audio Device(s) Creative AE-7 + Phonic AM120
Power Supply Seasonic Prime PX-850
Mouse Log G502 X LS
Keyboard Keychron K5 Opt.brown
Yet in the PC monitor market we get to enjoy 2013 technology in 2021... Can't wait for OLED to shake up this lethargic PC monitor market!

OLED, because of burn-in, isn't really suited to work as a PC monitor. For gaming/movies it's fine, but for everything other than that (working, web browsing, etc.)... Yes, panels can regenerate, but not back to 100% condition, and each successive regeneration could yield worse results.

I look forward more to microLED/high-zone-count miniLED; however, the former does not exist yet, and the latter's current prices of $2,500-4,000 are ridiculous.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro


This image is very misleading because the layers are arranged in the opposite order from the viewer's perspective. The quantum dots on this product are used in an enhancement film (QDEF); they are placed right after the backlight layer in order to enhance the blue light of the backlight, as seen here:



It's a shame to see monitors this far behind in terms of technology, because this is essentially the old quantum dot technology used in commercial TVs from 2013. Nowadays (from 2019 on), TVs also have the quantum dot layer applied in front of the LC layer, much closer to the eye of the viewer; this is called a QD color filter (QDCF), as seen here:

(In this example QDs are not used to enhance the backlight.)


Yet in the PC monitor market we get to enjoy 2013 technology in 2021... Can't wait for OLED to shake up this lethargic PC monitor market!
Are they able to make QD color filters at the types of pixel densities monitors require, at sufficient brightness? That's the main question here. I would guess there are technical reasons for this not happening - if not, they could just make the same panels on the same production processes in the same factories, just denser and cut into smaller final sizes.

Also, as mentioned above, OLED has severe issues for large parts of PC usage, with burn-in/image retention still not being solved. It's partially manageable, but it's highly usage dependent: OLED is essentially fine for movie and varied gaming use, but if you play one game a lot or use the desktop a lot, you're going to get retention of various parts of the UI after a while, and no mechanisms for lessening it can fully fix the issue. QD-OLED might alleviate this further by using a single color optimally tuned for longevity for the actual OLED layer, but you still can't get past the fact that organic LEDs degrade over time, while non-organic LEDs essentially do not (yes, they also dim, but at a vastly lower rate). If all you do is game (many different games) and/or watch movies/videos, OLEDs today are great (if large). If you spend a lot of time on the desktop or in the same game, LCDs are still going to be better for years to come.

Edit: The HWUB review is in. In summary: as expected, this is very close to the Asus PG32UQ, though not identical. The panel is likely the same, as response time performance is very similar - just tuned slightly differently (trading slightly more overshoot for slightly faster response times). No single overdrive mode is really suited to the full range of refresh rates, though. The stand-out is color gamut and accuracy when calibrated, which is staggeringly good. So, if what you're after is a do-it-all monitor with a strong emphasis on content creation/photo and video editing but also good (if not great) gaming performance, this looks like the best 32" 2160p144 monitor out there. But it still doesn't come close to the panel performance (response times, overshoot) of the ~27" class panels. It seems we have to wait for the next generation of 32" panels for that to happen (by which time the 27" class will no doubt have moved on once again).
 
Joined
Oct 4, 2017
Messages
695 (0.29/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
OLED, because of burn-in, isn't really suited to work as a PC monitor. For gaming/movies it's fine, but for everything other than that (working, web browsing, etc.)... Yes, panels can regenerate, but not back to 100% condition, and each successive regeneration could yield worse results. I look forward more to microLED/high-zone-count miniLED; however, the former does not exist yet, and the latter's current prices of $2,500-4,000 are ridiculous.

Burn-in issues with OLED are way overblown nowadays, and they can be easily mitigated with basic tricks (hiding the taskbar and/or icons, frequently switching wallpapers, etc.). Sure, the risk does exist, but it really requires heavy abuse by the user to manifest itself. I don't agree with the statement that OLED isn't suited for a PC monitor even when considering workloads outside of gaming/content consumption; it all depends on how much your workload stresses the monitor. LG has already released an OLED monitor targeting professionals, the 32EP950-B, so there is that.

MiniLED, even at extremely high zone counts, still manifests all the shortcomings of FALD; no matter how many mini LEDs you use, they will never be small enough to mimic the self-emissive nature of OLED - that can only be achieved by MicroLED. Had MiniLED been much more affordable, it could be worth considering as a stop-gap until MicroLED or QNED (not to be confused with LG QNED, which uses MiniLEDs instead of MicroLEDs for the backlight layer) hit the market, BUT at these sorts of ridiculous prices MiniLED is DOA. I can buy 3 or 4 48'' OLED panels for the price of a single MiniLED monitor; even if I abuse those OLED panels they will last me at least 2 years each, so we are talking about 3-6 years' worth of usage while blowing the MiniLED panel out of the water in terms of motion clarity and picture quality. So size limitations aside (there are 42'' OLED panels coming, and even smaller sizes according to reputable sources), MiniLED is a waste of money in my opinion.

Now, MicroLED sure is the endgame, since it combines all the advantages of OLED and LCD technologies without any apparent drawback - but the real drawback of MicroLED is the price. Chances are QNED displays (MicroLED blue light backlight + quantum dot color filter) and QD-OLED (OLED blue light backlight + quantum dot color filter) will hit the market much, much sooner than MicroLED, and in the case of ink-jet-printed OLED the prices will come down, so we are talking about products that will cost orders of magnitude less than any initial MicroLED product.

Are they able to make QD color filters at the types of pixel densities monitors require, at sufficient brightness? That's the main question here. I would guess there are technical reasons for this not happening - if not, they could just make the same panels on the same production processes in the same factories, just denser and cut into smaller final sizes.

Absolutely - quantum dots have a size of 2-8 nm (https://nanosys.com/science); even MicroLED-level pixel density is a joke for QDs, let alone today's technologies. QDs are not self-emissive, they are just a color filter, so brightness is not a criterion - it depends on the backlight layer. So to answer your question: no, you are guessing wrong. The only reason this is happening in the TV market but not in the PC monitor market is how lethargic the PC monitor market is.

As I said, I can't wait for OLED - especially now that Chinese companies such as TCL are heavily investing in it and are coming up with ink jet printing methods, etc. - to enter this market and shake it up for good. OLED will only get cheaper as time goes on. Imagine monitors costing in the $500-700 range while blowing any existing FALD product (no matter the number of dimming zones) out of the water in terms of picture quality and motion clarity...
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Absolutely - quantum dots have a size of 2-8 nm (https://nanosys.com/science); even MicroLED-level pixel density is a joke for QDs, let alone today's technologies. QDs are not self-emissive, they are just a color filter, so brightness is not a criterion - it depends on the backlight layer. So to answer your question: no, you are guessing wrong. The only reason this is happening in the TV market but not in the PC monitor market is how lethargic the PC monitor market is.

As I said, I can't wait for OLED - especially now that Chinese companies such as TCL are heavily investing in it and are coming up with ink jet printing methods, etc. - to enter this market and shake it up for good. OLED will only get cheaper as time goes on. Imagine monitors costing in the $500-700 range while blowing any existing FALD product (no matter the number of dimming zones) out of the water in terms of picture quality and motion clarity...
QDs are absolutely not a color filter - that is precisely why they are as good as they are. A filter takes a wide range of things (light, ground coffee, rocks, whatever) and lets through a selective subset of what is put into it. In other words, filters are lossy - that is inherent to the concept of filtration. QDs transform light: they absorb light of one wavelength/set of wavelengths, become excited by the energy of that light, and emit light of another set of wavelengths. So while they aren't technically self-emissive, they are closer to that than to being a filter. Assuming a single, controlled input wavelength, QDs can convert nearly 100% of input light to useful colored light. An RGB color filter will, no matter what you do, waste ~66% of its light through filtration. That's where the QD benefit comes from.
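To make the loss argument concrete, here's a toy sketch with idealized numbers (the ~95% quantum yield is an assumption for illustration, not a measured figure for any real QD film):

```python
# Toy comparison of per-subpixel light throughput: an idealized RGB absorptive
# filter on a white backlight vs an idealized quantum-dot converter on a blue
# pump. All numbers are illustrative assumptions, not measurements.

def rgb_filter_throughput(incident):
    # Each subpixel's dye passes roughly a third of a broadband white
    # backlight and absorbs the rest as heat.
    return incident / 3.0

def qd_converter_throughput(incident, quantum_yield=0.95):
    # A QD subpixel absorbs the blue pump and re-emits at the target
    # wavelength; losses are limited to the (assumed) quantum yield.
    return incident * quantum_yield

backlight = 100.0  # arbitrary units reaching each subpixel
print(f"RGB filter subpixel:   ~{rgb_filter_throughput(backlight):.0f} / {backlight:.0f} units")
print(f"QD converter subpixel: ~{qd_converter_throughput(backlight):.0f} / {backlight:.0f} units")
```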

Hence my question. Individual quantum dots are of course tiny, but the question becomes how large an area you need to collect a sufficient number to reliably and uniformly transform light at the intensity and wavelengths you need, in a sufficiently controlled manner. So it doesn't depend on the size of the quantum dots, but on how the films are made - i.e. panel production technology.

As for OLED, there is no level of mitigation that can get around the fact that desktop OSes have fixed UI elements, nor that most desktop use cases involve a lot of high-APL scenarios - web browsing, text documents, spreadsheets, whatever. And high-APL scenarios are what causes the most wear and thus have the highest chance of causing retention on an OLED. You can't pixel-shift your way out of your web browser occupying roughly the same area on your screen every time you use it. And sure, things like dark mode can alleviate things a tad further, but it won't solve any of the above examples, and soon you're looking at a scenario where you're spending a significant amount of time and work to avoid image retention. Is that worth it? It sure isn't to me. I want to use my monitor, not manage it.
 
Joined
Oct 4, 2017
Messages
695 (0.29/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
QDs are absolutely not a color filter - that is precisely why they are as good as they are. A filter takes a wide range of things (light, ground coffee, rocks, whatever) and lets through a selective subset of what is put into it. In other words, filters are lossy - that is inherent to the concept of filtration. QDs transform light: they absorb light of one wavelength/set of wavelengths, become excited by the energy of that light, and emit light of another set of wavelengths. So while they aren't technically self-emissive, they are closer to that than to being a filter. Assuming a single, controlled input wavelength, QDs can convert nearly 100% of input light to useful colored light. An RGB color filter will, no matter what you do, waste ~66% of its light through filtration. That's where the QD benefit comes from.

What I meant by "color filter" is how they are referred to when used at the front of the display; this is an industry-standard denomination (QDCF/QDCC): https://www.nature.com/articles/s41598-020-72468-8



Of course, they are given this denomination not because they act as a filter per se, but because they replace the RGB color filter layer. Realistically they do indeed work as color converters (hence QDCC): blue light coming from the backlight layer excites the red and green QD subpixels, which convert it into pure red and green light, and passes through the conventional (read: no QD) blue subpixel. Anyhow, as I said, the brightness of the panel doesn't depend on the QDCF/QDCC layer, since quantum dots are not self-emissive yet (Nanosys is working on it); it depends on the blue light source, aka the backlight.
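As a rough illustration of why this conversion only works in one direction (a blue pump can drive green/red QD emission, but not the reverse), here's a small sketch using typical, assumed wavelengths:

```python
# Sketch of why a blue backlight can pump green/red QD subpixels but not the
# other way around: photon energy falls with wavelength, so down-conversion
# (blue -> green/red) is allowed, while up-conversion would require extra
# energy. Wavelengths are typical illustrative values, not panel specs.
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    return H * C / (wavelength_nm * 1e-9) / EV

for name, wl in [("blue pump", 450), ("green QD", 530), ("red QD", 630)]:
    print(f"{name}: {wl} nm ~ {photon_energy_ev(wl):.2f} eV")

# Blue (~2.76 eV) exceeds green (~2.34 eV) and red (~1.97 eV), so a QD can
# absorb blue and re-emit at the lower-energy color; the difference is lost
# as the Stokes shift.
```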

Hence my question. Individual quantum dots are of course tiny, but the question becomes how large an area you need to collect a sufficient number to reliably and uniformly transform light at the intensity and wavelengths you need, in a sufficiently controlled manner. So it doesn't depend on the size of the quantum dots, but on how the films are made - i.e. panel production technology.

Well, I did already answer this question: the manufacturing of QDCF/QDCC layers is mature enough that they can handle MicroLED requirements. They can be patterned with precision down to 3 micrometers; that's enough precision for any current or upcoming technology:

 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Well, I did already answer this question: the manufacturing of QDCF/QDCC layers is mature enough that they can handle MicroLED requirements. They can be patterned with precision down to 3 micrometers; that's enough precision for any current or upcoming technology:

But that doesn't actually answer my question. MicroLED displays have absolutely terrible pixel density (the smallest/densest in existence is, what, 70-something inches at 2160p?), and pixel density is the key to smaller displays. If that patterning capability down to 3 microns is correct, that should be plenty (though judging by the illustrations I wonder if that's a typo and it should say 30? Still tiny though), but that still doesn't answer whether the tech is suitable for implementation on smaller panels yet. There are plenty of possible challenges, from leakage around the subpixels (which would be more visible at closer viewing distances and, being inherent to not having a passive color filter, could be especially noticeable on white/high-APL images, which are more common on monitors than TVs) to whether the dense patterning required could affect brightness, contrast, or other factors. Or it could of course just be down to economics, with high-end TVs outselling high-end monitors by several orders of magnitude.
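To put those patterning figures in perspective, here's a rough subpixel-pitch estimate for the panel sizes being compared (simple stripe-subpixel geometry, black matrix ignored, panel sizes chosen as examples):

```python
# Rough subpixel-pitch estimate: how much horizontal width each RGB subpixel
# gets on different panels, to compare against the claimed QD patterning
# resolution. Assumes a plain 3-subpixel-per-pixel stripe layout.
import math

def subpixel_width_um(h_px, v_px, diagonal_in, subpixels_per_px=3):
    aspect = h_px / v_px
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    width_um = width_in * 25.4 * 1000          # inches -> micrometers
    return width_um / (h_px * subpixels_per_px)

panels = {
    '75" 4K TV':      (3840, 2160, 75),
    '32" 4K monitor': (3840, 2160, 32),
    '27" 4K monitor': (3840, 2160, 27),
}
for name, p in panels.items():
    print(f"{name}: ~{subpixel_width_um(*p):.0f} um per subpixel")

# Even a 27" 4K monitor leaves ~52 um of width per subpixel, so patterning in
# the 3-30 um range wouldn't be the bottleneck by itself - which is why my
# question is about film uniformity, leakage and economics rather than raw
# feature size.
```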
 
Joined
Oct 4, 2017
Messages
695 (0.29/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
If that patterning capability down to 3 microns is correct, that should be plenty (though judging by the illustrations I wonder if that's a typo and it should say 30? Still tiny though)

It's not a typo; the SEM image is probably looking at the far right side of the green/red subpixel structure. On the far left side you can see the subpixels are much smaller and the gap between them is even smaller; this is likely where the 3 micron claim comes from.

There are plenty of possible challenges, from leakage around the subpixels (which would be more visible at closer viewing distances and, being inherent to not having a passive color filter, could be especially noticeable on white/high-APL images, which are more common on monitors than TVs) to whether the dense patterning required could affect brightness, contrast, or other factors. Or it could of course just be down to economics, with high-end TVs outselling high-end monitors by several orders of magnitude.

I think this will answer most of your questions:

 