I think it falls on VESA. FreeSync exists because VESA didn't go far enough with DisplayPort 1.2a, which may have stemmed from a lack of understanding of the issues in pushing an unfinished frame out to the monitor. To compensate, AMD had to create its own standard (FreeSync), which requires monitors not only to support Adaptive-Sync (to accept the signal) but also to enforce a minimum refresh rate internally, so that if the GPU fails to deliver it, the panel doesn't destroy itself. VESA needs to require Adaptive-Sync monitors to have a memory buffer that can hold the previous frame in full, and to pull from that buffer whenever the input rate falls below the panel's minimum.
Why should the buffer be in the monitor? Two reasons:
1) Reduces DisplayPort bandwidth usage which may be needed elsewhere.
2) It is extremely simple to calculate how large a frame buffer needs to be in a monitor based on its specs (height * width * bits per pixel), whereas a single GPU running 6 displays would have to maintain 6 separate buffers of potentially ridiculous size to satisfy the need (assuming it isn't pulled from its existing memory).
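To put a number on that height * width * bits-per-pixel formula, here's a quick sketch (my own illustration, not from any spec) of what a full-frame buffer costs for a common panel:

```python
# Rough sketch: bytes needed for a monitor to hold one complete frame,
# per the height * width * bits-per-pixel formula above.

def frame_buffer_bytes(width: int, height: int, bits_per_pixel: int = 24) -> int:
    """Size in bytes of a buffer holding one full frame."""
    return width * height * bits_per_pixel // 8

# One 4K display at 24-bit color: 3840 * 2160 * 3 = 24,883,200 bytes (~25 MB)
single = frame_buffer_bytes(3840, 2160)

# A GPU driving 6 such displays would need 6 of these if the buffering
# were done on the GPU side instead of in each monitor.
six_displays = 6 * frame_buffer_bytes(3840, 2160)

print(single, six_displays)
```

A monitor only ever needs one buffer sized for its own panel, which is why pushing the requirement into the display keeps it cheap and predictable.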
If VESA did this, FreeSync and G-Sync would go the way of the dodo and GPU design wouldn't get any more complicated.
VESA has done this, and better, just not in desktop form. The LVDS panels are the same; it's just a matter of adapting it to standalone monitors. I pointed this out more than a year ago here.
It's what Nvidia tried to copy, except Nvidia is using the AUX channel to the sink and thus disabling the standard's audio pass-through. They still don't have any kind of backlight control.
If sales of VRR panels take hold, I'd expect DP 1.4a to look more like the above.