
Samsung Electronics Unveils Its New Odyssey, ViewFinity and Smart Monitor Lineups at CES, World's First 7,680 x 2,160 DP 2.1 Monitor

Guess that Neo G9 (or an equivalent from others) might be my upgrade in 3-4 years' time

Gonna be less interesting if it contains "smart" functionality, but it sounded to me like the Neo G9 and ViewFinity didn't include any of that shit.
 
I'm done with Samsung after an OTA update of theirs bricked my not-even-2-year-old Samsung Galaxy A11.

And after seeing a friend's Sammy QLED TV blanking out, I refuse their crap.

Same here, not a single Samsung monitor is entering the house anymore. The quality and especially the longevity are just below par every time. It happens with phones, tablets, monitors and TVs. They look nice out of the box, but corners have been cut everywhere, and the software is also meh at best.

We're looking at a QLED VA TV rn that is barely 2 years old and has worse backlight/uniformity issues than my 6-year-old Eizo VA monitor had when I sold it. It will, or will not, detect audio properly when the PC is turned on. It turns itself back on again when we press the standby button on the remote - sometimes. All in a single product! Quite an achievement.

Note that Samsung pushing Smart Hub features into computer monitors means you'll now get ad pop-ups. That is the only reason for this feature - nobody cares about being able to watch Netflix without using the PC you already have connected.
Yet another strike for Sammy, even if it didn't need any more...
 
Perhaps supporting proper UHBR 20 ports on devices is expensive too for some reason. AMD wanted the promotional benefits of supporting the newest DisplayPort version without actually going through the effort and expense of supporting the fastest spec. And they seem to have successfully duped a lot of people.

Of course using "proper" UHBR 20 (lol, UHBR 13.5 is also "proper", and AMD still has the first GPU to support it - Intel only supports UHBR 10, which is still a big improvement over DP 1.4 because of the encoding efficiency on DP 2.0) would be more expensive; it's a much higher bandwidth that needs better signal integrity, etc.

I don't get how you can call that duping people - what do you call what Nvidia did, saying "it's not necessary because of DSC", then? The misleading thing AMD did was use the 8K moniker during the presentation while referring to an "8K" ultrawide - it's almost like they knew some new display with that resolution was coming soon :cool:
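
For anyone who wants to sanity-check that encoding-efficiency point, here's a rough back-of-the-envelope sketch (assuming 4 lanes and counting only the line-coding overhead; VESA's published effective figures come out slightly lower because of extra protocol overhead):

```python
# Rough DisplayPort bandwidth math: raw vs. usable data rate.
# Assumes 4 lanes; only line-coding overhead is counted (8b/10b on
# DP 1.4, 128b/132b on DP 2.x). VESA's official effective figures
# are slightly lower still due to extra protocol overhead.

LANES = 4

links = {
    # name: (per-lane rate in Gbit/s, line-coding efficiency)
    "DP 1.4 HBR3":     (8.1,  8 / 10),     # 8b/10b -> 80% efficient
    "DP 2.1 UHBR10":   (10.0, 128 / 132),  # 128b/132b -> ~97% efficient
    "DP 2.1 UHBR13.5": (13.5, 128 / 132),
    "DP 2.1 UHBR20":   (20.0, 128 / 132),
}

for name, (per_lane, eff) in links.items():
    raw = per_lane * LANES
    print(f"{name}: raw {raw:5.1f} Gbit/s, usable ~{raw * eff:5.2f} Gbit/s")
```

That's why even UHBR 10 is a big step up: its raw 40 Gbit/s yields roughly 38.8 Gbit/s usable, versus only about 25.9 Gbit/s usable out of DP 1.4's raw 32.4 Gbit/s.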

Gonna be less interesting if it contains "smart" functionality, but it sounded to me like the Neo G9 and ViewFinity didn't include any of that shit.

The only one that doesn't mention the smart-whatever crap is the Neo G9 (the mini-LED "8K" one). Both the QD-OLED G9 and the ViewFinity mention Smart Hub features :nutkick:
 
Of course using "proper" UHBR 20 (lol, UHBR 13.5 is also "proper", and AMD still has the first GPU to support it - Intel only supports UHBR 10, which is still a big improvement over DP 1.4 because of the encoding efficiency on DP 2.0) would be more expensive; it's a much higher bandwidth that needs better signal integrity, etc.

I don't get how you can call that duping people - what do you call what Nvidia did, saying "it's not necessary because of DSC", then? The misleading thing AMD did was use the 8K moniker during the presentation while referring to an "8K" ultrawide - it's almost like they knew some new display with that resolution was coming soon :cool:
People have been duped by AMD, whether AMD intended this or not. There are so many people saying that they didn't realize that RDNA3 cards don't support the max DP 2.1 bandwidth, so obviously AMD wasn't being very clear. The fact that they only supported UHBR 13.5 was buried in the footnotes of their RDNA3 presentation, and all most people saw and heard was "DisplayPort 2.1." That seems pretty deceptive to me.

Nvidia is obviously deceptive all the time, but in this case, DP 2.1 really isn't necessary because of DSC. You need DSC with or without DP 2.1 in order to get the monitor's full refresh rate, so it's really the same either way. And DSC is genuinely a "visually lossless" compression algorithm. Even if you've studied the effect it has on image quality and have the ability to compare images side by side, I promise that you will have an exceptionally hard time telling them apart. I've read the white paper and looked at the example images, and it really is borderline impossible to tell which is which even when you're trying your hardest to. So yeah, if your expected use case for DP 2.1 is to avoid DSC, then a) you probably won't, and b) it wouldn't be necessary anyway.
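
To put rough numbers on the "DSC either way" point, here's a quick sketch using this thread's 7,680 x 2,160 panel at the 175 Hz mentioned later in the thread (assuming 10-bit RGB, a typical 12 bpp DSC target, and ignoring blanking intervals, which add a few percent):

```python
# Rough check of "you need DSC with or without DP 2.1" for a
# 7680x2160 @ 175 Hz panel. Assumes 10-bit RGB (30 bpp) uncompressed
# vs. a typical DSC target of 12 bpp; blanking intervals are ignored
# (real timings need a few percent more bandwidth).

W, H, HZ = 7680, 2160, 175
pixel_rate = W * H * HZ  # pixels per second

def stream_gbps(bits_per_pixel: int) -> float:
    return pixel_rate * bits_per_pixel / 1e9

print(f"Uncompressed 30 bpp: {stream_gbps(30):5.1f} Gbit/s")  # ~87.1
print(f"DSC @ 12 bpp:        {stream_gbps(12):5.1f} Gbit/s")  # ~34.8

# Effective link budgets: UHBR 13.5 ~52.2 Gbit/s, UHBR 20 ~77.4 Gbit/s.
# The uncompressed 10-bit stream doesn't fit even on UHBR 20, while
# the DSC stream fits easily inside UHBR 13.5 - so it's DSC either way.
```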
 
People have been duped by AMD, whether AMD intended this or not. There are so many people saying that they didn't realize that RDNA3 cards don't support the max DP 2.1 bandwidth, so obviously AMD wasn't being very clear. The fact that they only supported UHBR 13.5 was buried in the footnotes of their RDNA3 presentation, and all most people saw and heard was "DisplayPort 2.1." That seems pretty deceptive to me.

Nvidia is obviously deceptive all the time, but in this case, DP 2.1 really isn't necessary because of DSC. You need DSC with or without DP 2.1 in order to get the monitor's full refresh rate, so it's really the same either way. And DSC is genuinely a "visually lossless" compression algorithm. Even if you've studied the effect it has on image quality and have the ability to compare images side by side, I promise that you will have an exceptionally hard time telling them apart. I've read the white paper and looked at the example images, and it really is borderline impossible to tell which is which even when you're trying your hardest to. So yeah, if your expected use case for DP 2.1 is to avoid DSC, then a) you probably won't, and b) it wouldn't be necessary anyway.

I have no problems with DSC generally speaking, I just don't like this message of "AMD duped people by supporting only UHBR 13.5". These are the first products to come to market with DP 2.1, and I'd call your attention to the marketing slides AMD used.

[Attached: AMD RDNA3 marketing slides]

Most people won't even know the specifics of DP 2.1; to say they are being duped when the information is clearly there is a big stretch. In my opinion the only misleading part was the "8K" ultrawide thing (meh, 8K is a bullshit term anyway, but it is indeed misleading).

DP 2.1 was created with a huge future-proofing margin; it will be a while until UHBR 20 becomes popular or even necessary, when, for example, just the cables will be super expensive, and you'd really need to be running a huge resolution/refresh rate to need that much bandwidth. The publicized bitrate numbers obfuscate this, but UHBR 13.5 has double the effective bandwidth of DP 1.4 (the data rate goes from 25.92 to 52.22 Gbit/s).
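
A quick illustration of what that doubling means in practice, again assuming 10-bit color at this panel's 7,680 x 2,160 resolution and ignoring blanking overhead:

```python
# Highest uncompressed refresh rate each link's effective bandwidth
# allows at 7680x2160 with 10-bit RGB (30 bpp). Blanking overhead is
# ignored, so real-world limits land a little lower.

PIXELS = 7680 * 2160
BPP = 30

for name, eff_gbps in [("DP 1.4    (25.92 Gbit/s)", 25.92),
                       ("UHBR 13.5 (52.22 Gbit/s)", 52.22)]:
    max_hz = eff_gbps * 1e9 / (PIXELS * BPP)
    print(f"{name}: up to ~{max_hz:.0f} Hz uncompressed")  # ~52 vs ~105
```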
 
One thing is for certain: this has been a very informative post. It made me actually get into the weeds and understand the marketing jargon.
 
The question is if this also supports HDMI 2.1; if not, it will only run properly on the 7900 XTX/XT.
 
I have no problems with DSC generally speaking, I just don't like this message of "AMD duped people by supporting only UHBR 13.5". These are the first products to come to market with DP 2.1, and I'd call your attention to the marketing slides AMD used.
That's true. I think that 13.5 is all they could do for now... I mean, it's more than enough; it can do 8K 165, what more do you want? This monitor is exactly half of that, btw... 16M pixels, double 4K; 8K is 4x 4K.
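
The pixel math checks out, for what it's worth - a one-liner sanity check:

```python
# This panel is exactly two 4K panels side by side; true 8K is 4x 4K.
px_4k   = 3840 * 2160   #  8,294,400  (~8.3 Mpx)
px_this = 7680 * 2160   # 16,588,800  (~16.6 Mpx)
px_8k   = 7680 * 4320   # 33,177,600  (~33.2 Mpx)
print(px_this / px_4k, px_8k / px_4k)  # 2.0 4.0
```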
 
That's true. I think that 13.5 is all they could do for now... I mean, it's more than enough; it can do 8K 165, what more do you want? This monitor is exactly half of that, btw... 16M pixels, double 4K; 8K is 4x 4K.

The "8k" AMD used in their marketing slide wasn't "real" "8k", it was this dual 4k ultrawide. A bit weird samsung is using 175Hz as max refresh but exact max values are dependent on timings used which can vary between devices so meh, maybe it works (they also don't mention what spec of DP2.1 they're supporting but in all likelihood is the same as amd - UHBR 13.5)
 
Having gotten a hand-me-down Neo G9 '21 version (a friend upgraded to the Odyssey Ark and sold the Neo G9 for 400 + lunch), I'm tempted to get the '22 version or the OLED variant. I definitely wouldn't be using it in 8K though; either 4K or lower, as I find that having such a wide screen doesn't really allow for equal scaling (between the desktop environment and games/webpages/videos).
 
Unfortunate, what you experienced. I've had the CF791 since launch, still going strong. I'm just waiting for a high-refresh-rate 21:9 2160p ultrawide to replace it.
Yep, but what was worse was Samsung's attitude. This wasn't the first time I had to deal with SS regarding that monitor either.

Moving on, I will keep my eyes peeled for a replacement panel, or a monitor that's been damaged but whose panel is usable.
 