I can't seem to find a simple answer to this question, so here goes. In my latest display search I ran into some debate over whether a given monitor is truly 8-bit or 10-bit. Take these Dell panels, for instance:

Dell U2713HM http://www.newegg.com/Product/Product.aspx?Item=9SIA0ZX1AJ7781
Dell U3014 http://www.newegg.com/Product/Product.aspx?Item=N82E16824260132

From what I gather, 10-bit output generally requires a Quadro or FirePro card, but there may be cases where the right GeForce card can manage it too? I also saw that at 2560x1440 it doesn't matter whether you use HDMI (rev 1.3 or higher) or DisplayPort, and that for AdobeRGB support you need HDMI 1.4 or DisplayPort. I just wish all of this were laid out a bit more simply.

Does anyone have good experience with these? I know that either way it will be a nice upgrade from my TN Samsung. I recently splurged on a 780 Ti, so I'm wondering whether the extra leap in panel is worth it for my photo/video work, in addition to the wind-down headshot sessions.
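To sanity-check the HDMI vs. DisplayPort point for myself, I threw together a rough back-of-the-envelope bandwidth calculation in Python. The blanking overhead and the effective link rates are approximations pulled from memory (HDMI 1.3/1.4 single-link and DisplayPort 1.2 HBR2 figures), so treat it as a sketch rather than a spec reference:

```python
# Rough check: can HDMI 1.3+ or DisplayPort 1.2 carry 2560x1440 @ 60 Hz
# at 8-bit and 10-bit per channel?  Timing and link-rate numbers are
# approximate, not taken from the actual CVT/HDMI/DP spec tables.

def required_gbps(h_active, v_active, refresh_hz, bits_per_channel,
                  blanking_overhead=1.12):
    """Approximate video data rate in Gbit/s, including blanking."""
    pixel_clock = h_active * v_active * refresh_hz * blanking_overhead
    return pixel_clock * bits_per_channel * 3 / 1e9  # 3 channels (RGB)

# Effective payload rates after 8b/10b coding (approximate):
links = {
    "HDMI 1.3/1.4 (single link)": 8.16,   # 10.2 Gbit/s raw
    "DisplayPort 1.2 (HBR2 x4)": 17.28,   # 21.6 Gbit/s raw
}

for bpc in (8, 10):
    need = required_gbps(2560, 1440, 60, bpc)
    print(f"2560x1440@60Hz, {bpc}-bit/channel: ~{need:.1f} Gbit/s needed")
    for name, capacity in links.items():
        verdict = "OK" if need <= capacity else "too slow"
        print(f"  {name}: {capacity} Gbit/s -> {verdict}")
```

If those numbers are in the right ballpark, both connections have enough raw bandwidth for 2560x1440 at 60 Hz even at 10-bit, so the practical limitation seems to be GPU/driver support (the Quadro/FirePro thing) rather than the cable itself.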