
AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Here's a demo:

At 9-bit, there would be a 254.5 step between 254 and 255. At 10-bit, there would be 254.25, 254.5, and 254.75 steps between 254 and 255. Of course it doesn't work like that in binary, but in practice that's the difference.

Whether the eye can perceive the difference at 8-bit is highly subjective. I can see the transition line between 254 and 253, but I can't really make out the other two. I think 9-bit would be good, but I'm leaning towards the idea that 10-bit is excessive.
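For anyone who wants to check the arithmetic, here's a quick Python sketch (illustrative only; real pipelines work on integer code values rather than fractions of the 8-bit scale):

Code:
# How many levels each bit depth gives over the same 0.0-1.0 signal range.
for bits in (8, 9, 10):
    levels = 2 ** bits
    step = 1.0 / (levels - 1)
    print(f"{bits}-bit: {levels} levels, step = {step:.6f} of full range")

# Levels that fall strictly between 8-bit codes 254 and 255, expressed on the
# 8-bit scale (254.5 at 9-bit; 254.25 / 254.5 / 254.75 at 10-bit).
for bits in (9, 10):
    scale = 2 ** (bits - 8)
    between = [254 + i / scale for i in range(1, scale)]
    print(f"{bits}-bit values between 254 and 255: {between}")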
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Here's a demo:

At 9-bit, there would be a 254.5 step between 254 and 255. At 10-bit, there would be 254.25, 254.5, and 254.75 steps between 254 and 255. Of course it doesn't work like that in binary, but in practice that's the difference.

Whether the eye can perceive the difference at 8-bit is highly subjective. I can see the transition line between 254 and 253, but I can't really make out the other two. I think 9-bit would be good, but I'm leaning towards the idea that 10-bit is excessive.

Thanks. But it's not only about your eyes, it's also about the monitor/panel you are using. For example, I can see the line between 252 and 253 on MVA; on a TN panel I can see all 3 lines, but only at absurd angles ... looking at it straight on there are no lines at all and it looks like one solid color. On IPS I can also only see 1 line, between 252 and 253. Then again, since you chose the bright end of the palette, the lack of visible difference might also be caused by your panel's brightness limitations ... is that where the HDR brighter-light thing comes in?

For comparison, I took the liberty of copying your idea to darker colors, and here I can see all 4 colors clearly on my 3 different displays (Dell IPS, Dell TN, and BenQ MVA):


That example already quite clearly illustrates, IMHO, that in the average scenario the brightest and possibly also the darkest colors are just cut off on ordinary cheap 8-bit monitors; NOT that you wouldn't see the difference if it were actually there!
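A minimal Pillow sketch for regenerating this kind of dark test strip (the exact values in the image above aren't given in the post, so the ones below are placeholders; adjust as needed):

Code:
from PIL import Image, ImageDraw

values = (1, 2, 3, 4)          # placeholder dark grays, one 8-bit step apart
band_w, height = 200, 200
img = Image.new("RGB", (band_w * len(values), height))
draw = ImageDraw.Draw(img)
for i, v in enumerate(values):
    # one solid band per value; any visible seam means your panel resolves the step
    draw.rectangle([i * band_w, 0, (i + 1) * band_w - 1, height - 1], fill=(v, v, v))
img.save("dark_bands.png")     # view at 100% zoom, straight on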
 
Joined
Apr 2, 2011
Messages
2,659 (0.56/day)
Thanks. But it's not only about your eyes, it's also about the monitor/panel you are using. For example, I can see the line between 252 and 253 on MVA; on a TN panel I can see all 3 lines, but only at absurd angles ... looking at it straight on there are no lines at all and it looks like one solid color. On IPS I can also only see 1 line, between 252 and 253. Then again, since you chose the bright end of the palette, the lack of visible difference might also be caused by your panel's brightness limitations ... is that where the HDR brighter-light thing comes in?

For comparison, I took the liberty of copying your idea to darker colors, and here I can see all 4 colors clearly on my 3 different displays (Dell IPS, Dell TN, and BenQ MVA):


That example already quite clearly illustrates, IMHO, that in the average scenario the brightest and possibly also the darkest colors are just cut off on ordinary cheap 8-bit monitors; NOT that you wouldn't see the difference if it were actually there!



I took your picture. I cut out one of the four colors. Tell me, which one is it?



Tired of waiting, it's 128.


Here's another image. Tell me if it's the same color, or a different one (no cheating, I uploaded 3 different versions and files). Based upon that, what number is it (off of your spectrum)?


Last one: is this the same as the two above, the same as one of the two above, or completely different from the two above? What number is it?





I can't really assume that you're answering honestly, because you could utilize poor viewing angles to see differences in the colors. I can't even assume you'll be honest about not reading ahead. As such, I'll give you the benefit of the doubt and assume that you got all these right. The answers, by the way, are 128-128-126.

What have we proven? In a static environment you can alter luminosity, and because of the underlying technology you might be able to tell the difference in a continuous spectrum. Now I'm going to need you to be honest with yourself here: how often do you see a continuous spectrum of color? While I'm waiting for that honesty, let's tackle the whole fallacy of the sunset image.


Sunsets are a BS test. They're taking a continuous spectrum, and by the nature of digital storage, they're never going to have the same number of colors available to reproduce that continuous spectrum. The absolute best we can ever hope for is producing a spectrum whose steps are indistinguishable. What has been demonstrated isn't that; what has been demonstrated is an image artifacted to hell by compression. Decompression can't account for a lot of differences between the colors, leading to blockiness where a spectrum value rounds to the same producible number. 10-bit won't fix that. 10-bit can't save an image compressed to hell. What 10-bit can do is make in-game color gradients more continuous, but there again I have to ask you whether you can actually tell the difference between the colors above when they might appear on screen together for fractions of a second.
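A rough sketch of that rounding argument in Python (using coarse re-quantization as a stand-in for lossy compression, which is a big simplification of what a real codec does):

Code:
import numpy as np

ramp8 = np.round(np.linspace(0, 255, 1920)).astype(int)   # clean 8-bit gradient
crushed = (ramp8 // 16) * 16                               # crude "compression": only 16 shades survive
as_10bit = crushed * 4                                     # naive 8-bit -> 10-bit re-encode

print(len(np.unique(ramp8)))     # 256 distinct shades in the clean ramp
print(len(np.unique(crushed)))   # 16 after the crush
print(len(np.unique(as_10bit)))  # still 16: extra bits can't restore what rounding threw away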



Let's keep all of this simple. Your example, as stated previously, is BS. The 3-to-8-bit difference being the same as 8-to-10 is laughable. The sunsets can be reproduced right now if my static image is compressed to hell first. You've shown one continuous spectrum, which is compressed to fit onto a screen. Tell me, how does one slider that's supposed to hold 256^3 colors fit across a screen fewer than 1920 pixels wide without being compressed?
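The back-of-the-envelope arithmetic behind that question:

Code:
total_colors = 256 ** 3             # full 8-bit RGB palette
strip_width = 1920                  # pixels available for the slider
print(total_colors)                 # 16,777,216
print(total_colors // strip_width)  # ~8,738 colors collapsed onto each pixel column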

I will state this again: 10-bit color matters less than increased frame rate, HDR colors, and the number of pixels driven. If you'd like to argue that 10-bit is somehow a gigantic leap forward, which is what you did and continue to do, you've got some hurdles. You're going to have to justify said "improvement" with thousands of dollars in hardware, your software is going to have to support 10-bit, and you're going to have to do all of this when the competition just plugged a new card in and immediately saw improvements. Do you sell a new video card on the fact that you could invest thousands in new hardware to see mathematically and biologically insignificant gains, or do you buy the card because the next game you want to play will run buttery smooth? Please keep in mind that what you are arguing is that more colors are somehow better before you respond with "I can tell the difference on my current monitor (which I would have to replace to actually see benefits from 10-bit video data)."

This isn't a personal attack. This is me asking if you understand that what you are arguing is silly in the mathematical and biological sense. It made sense to go from 5 bits per color to 8 bits per color. It doesn't make the same sense to go from 8 bits per color to 10 bits per color (mathematically derived above), despite actually producing a lot more colors. You're just dividing the same spectrum further. HDR would increase the spectrum size. Pixel count (or more specifically density) would allow more color changes without incongruity. Increasing to 10 bits of color information doesn't do the same, and its costs are huge (right now).


You're welcome to continue arguing the point. I've lost the drive to care. You're welcome to let your wallet talk, and I'll let mine talk. If Nvidia's offering isn't better, Polaris will be the first time I've seen a reason to spend money since the 7xxx series from AMD. It won't be for the 10-bit colors. It won't be for the HBM. It'll be because I can finally turn up the eye candy in my games, have them play back smoothly, and do all of this with less power draw. That's what sells me a card, and what sells most people their cards. If 10-bit is what you need, power to you. It's a feature I don't even care about until Polaris demonstrates playing games better than what I've already got.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Here's another image. Tell me if it's the same color, or a different one.

And it would prove what?


I will state this again: 10-bit color matters less than increased frame rate, HDR colors, and the number of pixels driven.
How can you get to HDR colors without 10 bit?


If Nvidia's offering isn't better, Polaris will be the first time I've seen a reason to spend money since the 7xxx series from AMD.
It's there today; at the very least, Fury Nano (after the recent nice price cut) vs the 980 (non-Ti).
The 380 and 390 are strong products as well.
 

FordGT90Concept

"I go fast!1!11!1!"
HDR I think is more about the ability of the panel to block light (for blacker blacks) and to let light through (for brighter whites) than about the color depth. Reason: at 8-bit and 10-bit alike, black is black and white is white as far as the binary is concerned. AMD wants to make 10-bit color the norm for all panels so that the graphics card can send the data regardless of whether the monitor can actually reproduce it.

HDR (the contrast ratio between black and white) is far more important than the color depth (8-bit or 10 bit) as far as I'm concerned. You can clearly see the benefits of HDR--you can't clearly see the benefits of 10-bit under most circumstances.
 
Joined
Apr 2, 2011
Messages
2,659 (0.56/day)
And it would prove what?



How can you get to HDR colors without 10 bit?



It's there today, at the very least, Fury Nano (after recent nice price cut) vs 980 (non ti).
380,390 are strong products as well.

You seem to be allergic to reading. My point was spelled out below (in that post). I'm tempted to do the same, but since I've taken the effort to speak with you, I guess I can try to understand your point before completely losing relevance by asking a question that's answered later (in that post).

HDR doesn't require 10 bit. HDR functionally makes the scale for color bigger. @Xzibit's graph does an excellent job showing it. Mathematically, 64,64,64 on an HDR monitor might equate to 128,128,128 on a standard monitor. This will increase the range between different values, which will make 10-bit relevant because the colors will be separated by more than on an 8-bit monitor. At the same time, if you need x to make y relevant, then y on its own is significantly less relevant. This is even more true when x- and y-enabled devices are still perched well outside the reasonable consumer pricing zone.
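A simplified way to put numbers on that (assuming a 100-nit SDR panel, a 1000-nit HDR panel, and a linear code-to-light mapping purely for illustration; real displays use gamma or PQ transfer curves, so the actual figures differ):

Code:
def step_nits(bits: int, peak_nits: float) -> float:
    """Average luminance gap between adjacent code values."""
    return peak_nits / (2 ** bits - 1)

print(f"8-bit over 100-nit SDR:   {step_nits(8, 100):.3f} nits per step")
print(f"8-bit over 1000-nit HDR:  {step_nits(8, 1000):.3f} nits per step")   # much coarser
print(f"10-bit over 1000-nit HDR: {step_nits(10, 1000):.3f} nits per step")  # back near SDR fineness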



No, just no. You're telling me that, despite my current 7970 GPU being on the same process node as the Fury and the 980, they are worth buying. You're saying that a purchase of $500+ is warranted, to get marginally better performance. You're of course glossing over the imminent release of Pascal and Polaris (in a thread about Polaris, no less) to tell me that both of these cards represent a worthwhile delta in performance for their substantial price tag. Either you don't understand the value of money, or you have a definition of a worthwhile reason to upgrade that allowed you to upgrade through each and every one of Intel's recent processors, despite the popular consensus being that Skylake is really the first time we've had a CPU worth upgrading to since Sandy Bridge. Let me make this clear though: buying a new card today, I'd go with your selections. New versus upgrading is an entirely different consideration, though.


I think you've adequately demonstrated both a personal lack of effort by not reading my comment, and an understanding of the situation which I can't link to a rational drive. As such, I'm going to ask just one thing from you: show me any proof of your statements and logic. deemon put forward the effort to find an infographic. If you're going to make a claim, support it. Math, graphics, infographics, and the like make your points salient. You've come to the discussion believing that you don't even have to try to understand the opposing argument or fully comprehend what is being said.

In contrast, I believe I understand where deemon is coming from. It's a technically correct point: more color variations are better. What it isn't is a point which I feel should be used to sell monitors, or demonstrated with misleading material. This is where we differ, and it seems to be an irreconcilable point. There isn't a right answer here, but there are plenty of things that can be contradicted if said without comprehending their meaning.
 

FordGT90Concept

"I go fast!1!11!1!"
I think cards going back a really long way have had hardware support for 10-bit, but it was disabled in drivers except on professional cards (FirePro and Quadro). It sounds like AMD is going to stop disabling 10-bit on consumer cards because AMD wants all monitors to support 10-bit no matter the application (kind of like their mentality towards adaptive sync).
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
I think cards going back a really long way have had hardware support for 10-bit, but it was disabled in drivers except on professional cards (FirePro and Quadro). It sounds like AMD is going to stop disabling 10-bit on consumer cards because AMD wants all monitors to support 10-bit no matter the application (kind of like their mentality towards adaptive sync).

10-bit has to be supported at the content/software level to make a difference. If it isn't, it'd be no different than the current conversion; you'd just be filling in bits to get to 10. It wouldn't make much sense having it on when the monitor itself is 10-bit non-HDR or just a regular (6-bit+FRC/8-bit) monitor. You'd also just be increasing bandwidth usage. Those monitors would have to have a TCON capable of down-sampling, and you'd be up-sampling any content that isn't compliant. It could turn into a mess of up- and down-sampling before it is viewed.

Supporting it normally is more likely. The EDID of a monitor/TV can tell the GPU/device what it's capable of, and the user would have the option to enable or disable it.
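For reference, a minimal sketch of the padding/truncation being described (bit replication up to 10-bit, plain truncation back down; this is a generic technique, not a description of any particular driver or TCON):

Code:
def expand_8_to_10(v8: int) -> int:
    """Pad an 8-bit code value to 10 bits by replicating the top two bits."""
    return (v8 << 2) | (v8 >> 6)

def truncate_10_to_8(v10: int) -> int:
    """Drop the two extra bits (real hardware would usually dither instead)."""
    return v10 >> 2

print(expand_8_to_10(0), expand_8_to_10(255))   # 0 and 1023: endpoints are preserved
print(truncate_10_to_8(512))                    # 128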
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
Polaris will be the first time I've seen a reason to spend money since the 7xxx series from AMD....
...my current 7970 GPU....


You're saying that a purchase of $500+ is warranted, to get marginally better performance...
That's a much narrower statement, from 7xxx.
Are you seriously saying that Fury cards are only "marginally" faster than 7970?

My point was spelled out
Into a wall of text, when the main point could have been expressed in a single sentence: compression will eat up those extra bits in 10-bit.
Well, we'll see about that.

HDR doesn't require 10 bit.
Depends on what one means by "require". In my book, if you have a 10-bit panel (and all the HDR panels I've heard about are 10-bit, and to my knowledge it will stay like that for quite a while), 10-bit support in your graphics card is required.
 
Joined
Apr 2, 2011
Messages
2,659 (0.56/day)
That's a much narrower statement, from 7xxx.
Are you seriously saying that Fury cards are only "marginally" faster than 7970?


Into a wall of text, when the main point could have been expressed in a single sentence: compression will eat up those extra bits in 10-bit.
Well, we'll see about that.


Depends on what one means by "require". In my book, if you have a 10-bit panel (and all the HDR panels I've heard about are 10-bit, and to my knowledge it will stay like that for quite a while), 10-bit support in your graphics card is required.

If you want to reply, and deserve a response, you read what somebody else wrote. If you don't want to reply to a wall of text, don't quote parts of it.


You continue to be ignorant. Technically the 290 was an improvement on the 7970. The question isn't what is better, but whether the price-to-performance increase is reasonable. Going from a 7970 (with a healthy overclock) to a Fury isn't reasonable. This is the same argument you seem to be constantly making, under the assumption that there's no such thing as a cost-to-benefit weighting on money spent. Heck, HDR and 10-bit together are objectively better than what we have now, but I'm not rushing out to spend thousands of dollars on new hardware. If you really wanted to continue this logic, you should own the Intel enthusiast platform's X offering, because everything else compromises core count and thus is worse.


You now try to summarize my entire point in a sentence, and utterly fail. Congratulations, you can't see the other side and it is therefore wrong. Third-grade-level reasoning there. What I've been saying is that 10 bit offers more distinction between already hard to differentiate colors. Your monitor is probably 1920x1080 (judging from sales figures). There are therefore more colors in the spectrum than can adequately fit on a horizontal line. How do you believe that's done? All you need to do is remove a ton of them and round the remaining color values so that (2^8)^3 = 256^3 = 16,777,216 colors fit into 1920 pixels. Do you not see the hypocrisy there? Better yet, I explicitly stated that HDR would change the limitations of colors (pure colors would be beyond the traditional 256 value range) such that HDR would make 10-bit more reasonable, yet you've somehow glossed over that conclusion.

I'd like you to ask yourself one question. @deemon is not an idiot, and has a point which has some foundation. Why then did they follow up the spectrum with 4 color blotches? I'll give you a hint: it was the most intelligent move I could have imagined, and while I don't agree with the point, the argument was a laudable response to me. What was demonstrated was an argument that our current spectrum doesn't do enough to differentiate color values. It didn't rely on compression or crappy math. It displayed the point simply and concisely. If you saw a marked difference between the four colors, you have to admit that 10-bit color values would have erased that distinction by having so many more values in between them that the colors wouldn't be distinguishable.

I don't buy the argument because I don't watch the solid-color-polygon network. Motion is what our eyes are better at discerning, which is why frame rate and pixel count are more important. deemon is free to let their money push for 10-bit color. My money is going to push for higher pixel counts and frame rates. Right now, 10-bit carries a huge price premium on the hardware side, which is another huge hit against it being particularly useful. Whenever 10-bit and HDR-capable monitors are reasonably priced (i.e., reasonable enough that most monitors/TVs use the technology), then 10-bit will be viable. At that point, @deemon will be 100% correct. For now, with Polaris being a few months from release, it's not a concern. Unsurprisingly, I only care about now, because if I forever lived in the future I'd never be able to buy anything and be happy with its performance.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
Technically the 290...

No, thanks.
I have asked a simple question, "Are you seriously saying that Fury cards are only "marginally" faster than 7970" and I want to see a simple answer to it, like YES (look, it's merely X% faster) or NO.

I read your wall of text and didn't find the answer to that.

Now to the newly introduced "reasonably priced" argument.
MSRP of the 7970 was $550.
MSRP of the Fury Nano (though any would do) is $499.
Fury STOMPS all over the 7970 performance-wise; it's nowhere near a "marginal" improvement. The 290X was +30-40% over the 7970 on average, and the Furys are faster than that.

10 bit offers more distinction between already hard to differentiate colors.
Among other things.
To me, this is the least compelling benefit it brings.
Much more interesting is the wider gamut.


I explicitly stated that HDR would change the limitations of colors (pure colors would be beyond the traditional 256 value range) such that HDR would make 10-bit more reasonable, yet you've somehow glossed over that conclusion.
And why do I need to address your own objections to your own arguments?


What was demonstrated was an argument that our current spectrum doesn't do enough to differentiate color values.
I don't need to see bars to get that point.
 
Joined
Apr 2, 2011
Messages
2,659 (0.56/day)
No, thanks.
I have asked a simple question, "Are you seriously saying that Fury cards are only "marginally" faster than 7970" and I want to see a simple answer to it, like YES (look, it's merely X% faster) or NO.

I read your wall of text and didn't find the answer to that.

Now to the newly introduced "reasonably priced" argument.
MSRP of the 7970 was $550.
MSRP of the Fury Nano (though any would do) is $499.
Fury STOMPS all over the 7970 performance-wise; it's nowhere near a "marginal" improvement. The 290X was +30-40% over the 7970 on average, and the Furys are faster than that.


Among other things.
To me, this is the least compelling benefit it brings.
Much more interesting is wider gamut.



And why do I need to address your own objections of your own arguments?



I don't need to see bars to get that point.

I've really got to ask, troll or idiot?

Now that I've taken the effort to say that, let's define why. In a single game, the 290X was capable of going from the 7970's 70 FPS to 95 FPS (http://gpuboss.com/gpus/Radeon-R9-290X-vs-Radeon-HD-7970). That "30-40%" improvement means nothing when the difference is based upon setting everything to maximum and playing at a constant resolution. You've pulled numbers that mean very little when I'm literally already playing with all of the eye candy set to maximum. If the 7970 could only get 70 FPS at moderate settings and the 290X could get 95 at high settings, I'd gladly agree with you. The problem here is that numbers aren't useful without context, which you are either incapable of providing or too lazy to provide. I'll say this much: the reviewers here said that the 290X was a waste at any resolution less than 2560x1600, and I'd have to agree.



I'd like to follow that up with the comment that you are either being intentionally obtuse, or revision of history is acceptable to you. MSRP today and at release aren't the same thing. You quote the release price of the 7970, but the newly discounted price of the Fury is on display (release was $550 or $650). Tell me, isn't the best bang for your buck then going to be the oldest high-end card out there? If I were to follow that logic, a 7970 can be had today for a couple hundred dollars. The Fury is now $500. That means the Fury costs roughly 250% as much. Can the Fury perform at 250% of the output of a 7970? Nope.


Finally, you don't get 10-bit at all. Please stop talking like you do. The wider range of colors is all HDR. Let's make the example simple. You've got a red LED. Said LED can be signaled between 0 and 2 volts. The steps between on and off (0-2 volts) at 8-bit would be 0.0078 V wide. This means that 1 volt would be 128. Now we've got 10-bit color. The LED is powered the same way, 0-2 volts. The difference is that each step is now 0.0020 V wide. That means 1 volt would now be 512. In both cases, for 1 volt we've got different numbers, but the color is the same exact value. That's the difference between 10-bit and 8-bit color encoding (grossly simplified).
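The same LED example, worked through numerically (same simplification as above: a 0-2 V drive range mapped linearly onto the available code values):

Code:
def step_volts(bits: int, v_max: float = 2.0) -> float:
    return v_max / (2 ** bits - 1)        # ~0.0078 V per step at 8-bit, ~0.0020 V at 10-bit

def code_for_voltage(volts: float, bits: int, v_max: float = 2.0) -> int:
    return round(volts / v_max * (2 ** bits - 1))

print(step_volts(8), step_volts(10))
print(code_for_voltage(1.0, 8))    # 128: 1 V at 8-bit
print(code_for_voltage(1.0, 10))   # 512: the same 1 V at 10-bit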

You quote a "wider gamut" of colors. The earlier post by Xzibit explains that is what HDR does. Let me give you a clue, HDR is High Dynamic Range; what you are talking about is a greater range in colors. I'm not sure how that eluded your detection, but I think we can agree that the logic is easy to comprehend now.




Why are you continuing to argue the point? I can only assume that you're one of the people who upgraded to a Fury or the 2xx- or 3xx-series cards; cards that were developed on the same node as the 7970, but pushed further from the factory to help differentiate them when performance gains from minor structural changes proved insignificant. I will say that the 9xx series from Nvidia is excellent for DX11 performance. The 3xx series from AMD is doing a very good job driving huge monitors. The problem with both is that their improvements aren't noticeable in gaming, despite benchmarks telling us otherwise. Polaris and Pascal are setting themselves up as a huge leap in performance that will be appreciable. If you'd like to argue otherwise, I implore you to go spend the money. The entire point here was to discuss Polaris, but you've made it about why I should support an insanely expensive feature and buy a GPU today. I can't understand why, but you seem hell-bent.
 
Joined
Feb 8, 2012
Messages
3,013 (0.68/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
10-bit has to be supported at the content/software level to make a difference. If it isn't, it'd be no different than the current conversion; you'd just be filling in bits to get to 10. It wouldn't make much sense having it on when the monitor itself is 10-bit non-HDR or just a regular (6-bit+FRC/8-bit) monitor. You'd also just be increasing bandwidth usage. Those monitors would have to have a TCON capable of down-sampling, and you'd be up-sampling any content that isn't compliant. It could turn into a mess of up- and down-sampling before it is viewed.
Not actually down-sampling, rather dithering. There will have to be a couple of years where all 8-bit color content is dithered to be displayed on 10-bit panels ... just like 8-bit content is now dithered to be displayed on 6-bit panels. (Well, almost like it, because dithering works both ways, whether you're increasing or decreasing the resolution of the color space.)
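A minimal sketch of the dithering idea (one-row error diffusion taking 8-bit values down to a 6-bit panel; going the other way, 8-bit content onto a 10-bit panel, is mostly padding, optionally with the same trick in reverse):

Code:
import numpy as np

def dither_row_8_to_6(row8: np.ndarray) -> np.ndarray:
    """Quantize one row of 8-bit values to 6-bit codes, diffusing the rounding error forward."""
    out = np.empty(len(row8), dtype=int)
    err = 0.0
    for i, v in enumerate(row8.astype(float)):
        target = (v + err) / 255 * 63              # rescale onto the 6-bit range
        q = min(max(int(round(target)), 0), 63)
        out[i] = q
        err = (v + err) - q / 63 * 255             # carry the leftover error to the next pixel
    return out

# Flat mid-gray input: the output mixes codes 31 and 32 so the average lands between them.
print(dither_row_8_to_6(np.full(64, 128)))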
 