Tuesday, September 20th 2011
Super-High 4096 x 4096 Display From An IGP? The Upcoming Ivy Bridge Can Do It
The new Ivy Bridge processors, due out in about six months, have one apparently overlooked but important feature. No, it's not the greatly increased speed (roughly double that of Sandy Bridge, or more) or the advanced feature set. It's actually the super-high resolution capability: specifically, 4096 x 4096 pixels. This astonishing capability exceeds what any of the top-end discrete graphics cards, such as the NVIDIA GTX 590 or AMD HD 6990, can deliver through a single monitor port. It's so high, in fact, that there's almost no content at that resolution and no monitor that can handle it. The IGP can actually play multiple 4K video streams, too. Intel, unsurprisingly, is talking up the gaming possibilities at such a resolution; I'd like to see what kind of monster GPU could actually handle gaming there. It will be interesting to see what uses this capability gets put to generally - and just how much the whole setup will cost.
Source:
VR-ZONE
54 Comments on Super-High 4096 x 4096 Display From An IGP? The Upcoming Ivy Bridge Can Do It
4Kx4K is about improving the pixel density to show more information, more clearly.
As newtekie said, it's all about 2D environment. Desktop publishers would love this, finance industry would love this, photo editing would love this, people who like looking at maps would love this.
Remember that a regular A4 or Letter PRINTER is delivering (at 300dpi) circa 3Kx4K, so this resolution will allow you to see and read much more clearly.
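The print comparison is easy to sanity-check with a little arithmetic (the exact figures come out slightly lower than "3Kx4K", but in the same ballpark). A minimal sketch, using the standard A4 and US Letter page sizes:

```python
# Pixels needed to match a 300 dpi print, for A4 and US Letter pages.
def pixels_at_dpi(width_in, height_in, dpi=300):
    return round(width_in * dpi), round(height_in * dpi)

a4 = pixels_at_dpi(8.27, 11.69)    # A4 is 210 x 297 mm (~8.27 x 11.69 in)
letter = pixels_at_dpi(8.5, 11.0)  # US Letter is 8.5 x 11 in

print(a4)      # (2481, 3507) -- roughly 2.5K x 3.5K
print(letter)  # (2550, 3300)
```

So a 4096 x 4096 desktop really would exceed full-page 300 dpi print density, with room to spare.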
99% of current medical imaging is NOT 3D; it's the 2D images that the regular doctor or specialist is actually looking at.
Indeed very impressive for an IGP (mine begins to slow down when rendering at 1280x1024 :( ), but we will see about the performance.
If you were serious, then you were just plain wrong (misinformed or misinterpreting my post, that is)
I'm talking about HDMI Connector B
DVI has twice as many wires as HDMI, again, because the HDMI connector profile being used (Type A) is single-link. HDMI achieves the ability to run at these resolutions, and to carry audio and DRM data, solely by sending the clock rate through the roof (and likely by tweaking the protocol, I suppose).
If a theoretical HDMI Type B device existed at the 1.4 spec, it would easily surpass DVI for bandwidth. HDMI Type A basically matches single-link DVI while also carrying audio and DRM. HDMI Type A at the 1.4 spec supports 4096×2160p24 at a clock rate of 340 MHz. Imagine an HDMI Type B device (with 29 pins, as opposed to Type A's 19) running at 340 MHz. That could just as easily be achieved by cranking the clock rate on DVI. Again, this is all hypothetical, as DVI is being abandoned and no HDMI Type B devices exist.
www.youtube.com/watch?v=piCvq4hc5BU&feature=feedf
Let it buffer if it can't handle it in real time and then watch. In "original" resolution, of course.
Do we need that resolution, even if our current monitors can't support it?
Short answer: yes. Long answer: yeeeeeeeeeeeees. :)
1080p is crap compared to that, even on normal displays.
At 4K it is becoming obvious that jpeg is out of date. We need something that is more efficient for CPU processing and scaling. Scalable or fractal graphics files, and also "multiple entry points" to the graphics file so that the separate CPU cores can parallelize/multi-thread the work more easily.
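The "multiple entry points" idea above can be sketched in a few lines: a format that records a byte offset per tile in its header lets each core decode a tile independently. This is purely illustrative; `decode_tile` is a hypothetical stand-in for real entropy decoding, and a real implementation would use processes (or C code that releases the GIL) to get actual parallel speedup:

```python
# Sketch of per-tile entry points enabling parallel image decode.
from concurrent.futures import ThreadPoolExecutor

def decode_tile(data, offset, length):
    # Hypothetical stand-in "decode": real code would run entropy
    # decoding here; we just invert the bytes of one tile.
    return bytes(b ^ 0xFF for b in data[offset:offset + length])

def decode_image(data, tile_index):
    # tile_index: list of (offset, length) entry points stored in the
    # file header, one per independently decodable tile.
    with ThreadPoolExecutor() as pool:
        tiles = pool.map(lambda t: decode_tile(data, *t), tile_index)
    return b"".join(tiles)

data = bytes(range(16)) * 4                  # 64 "compressed" bytes
index = [(i, 16) for i in range(0, 64, 16)]  # four independent tiles
assert decode_image(data, index) == bytes(b ^ 0xFF for b in data)
```

JPEG's restart markers already gesture at this, but a format designed around tile-level entry points would make the parallel split trivial.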
Then again, if you tried to run DVI at higher frequencies, the connector itself might become an obstacle. Hence why Cat6 is so stringent about how much of each pair can be untwisted inside the connector. DVI may or may not limit the potential MHz just by its pin positions. I guess we'll never know.
I've done a lot of work with shielded Cat6, and your average DVI cable is far more impressive (it's like many Cat6 cables rolled into one).
The screws are awesome, especially when dealing with KVMs. Never, ever do I have to worry about something coming undone unless I want it undone.
100% agree on all points.
1: Yup.
2: It's more the connector itself. Really good quality signal meters can tell, to within ~1", how far down a cable any given connector is; that's how much signal interference connectors create on high-frequency signals. As signal frequency goes up, so does the impact connectors have on the signal. I'm not an electrician, but I have talked at length with one or two who've done commercial high-frequency signalling installs, plus what I've read. I was just speculating that the decreased spacing between pins, and hence between any one signal wire and its paired ground, on an HDMI cable may give it an advantage at obscenely high frequencies over short distances (although of course the DVI connector was never intended for really high frequencies, and HDMI has poorer cabling).
3: Again, I agree. Unfortunately Joe Average doesn't. I can't describe to you how many times I've seen a nest of USB A to Micro B cables labelled like "mp3 player", "camera", "Flash drive", etc. Hey guys... They're all the same cable; the "U" stands for universal :shadedshu Generally, consumers like simple, easy, small, cheap, and stylish. If it were you and I running the world, I have a feeling that the current auto-tune craze in "popular" music wouldn't be the norm either.
Anywho </offtopic>
This won't be used for anything but ultra high-end monitors for photo editing and the like. 4Kx4K is just a max anyways. It'll still handle all the standard form factors (so any res that's a multiple of 4:3, 5:4, 16:9 which our monitors can display)
But in practical terms, it will be 4Kx2K for the consumer and 4Kx4K for the workstation user. Unfortunately, the TFT industry is YEARS away from providing us with these displays. YES, there is a 2K by 2K display for the airline and mapping industries, at $10,000 per unit in volume quantities, and there is a 4Kx3K monitor for the medical industry at $35,000 per unit. I would be happy with either, but there's no way I can afford something like that.
So... we are at least 5 years away from 4Kx2K at affordable prices.
"HD" and y1080 took us off the path to 4K. I bought my first 1600x1200 nearly 10 years ago. TFT have some down in price, and quality has improved, but resolutions and pixel densities have not. (Except the iPad and iPhone, but they are small screens)
Would my screen be classed as HD? The reason I mention it is because I have more height than 1080, but as my screen isn't widescreen I don't need 1920 pixels, so to me it says my screen is still HD lol.
And AMD has invested heavily in DisplayPort, due to its high bandwidth support,
so there is no reason why AMD and NVIDIA can't support this at the moment.
To get 4K, you will need DisplayPort; there is no way around that. HDMI/DVI/VGA are horribly outdated when it comes to the raw bandwidth needed to push 4K x 4K.
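The raw-bandwidth point is easy to put in numbers. Ignoring blanking overhead (which only makes things worse), uncompressed 24-bit video needs width x height x 24 x refresh bits per second:

```python
# Uncompressed video bandwidth in Gbit/s, ignoring blanking intervals.
def video_gbps(w, h, hz, bpp=24):
    return w * h * bpp * hz / 1e9

print(video_gbps(1920, 1080, 60))  # ~3.0 Gbit/s  (1080p60)
print(video_gbps(4096, 2160, 60))  # ~12.7 Gbit/s (4K x 2K at 60 Hz)
print(video_gbps(4096, 4096, 60))  # ~24.2 Gbit/s (4K x 4K at 60 Hz)
```

For scale: DisplayPort 1.2's four HBR2 lanes carry about 17.28 Gbit/s of payload, so even DisplayPort can't drive 4K x 4K at 60 Hz on a single link; it would take a reduced refresh rate, multiple links, or a future version of the standard.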
Doesn't AMD Eyefinity support something like 8K x 8K? That's double that of Intel's Ivy Bridge :P
Give me packets.
Give me flexibility.
Every new generation of Intel IGP brings more bullshit hype, and they never fail to deliver less-than-stellar performance after release. This kind of press release is not aimed at computer enthusiasts but rather targeted at shareholders concerned about Intel's lack of presence in the high-end GPU market.
This "ability" is either impossible (due to limitations of all the other hardware involved) or pointless. As for real performance, I would expect better than current APUs by a small margin (still limited by memory speed) and APUs due out with Ivy Bridge is released will be just as good or better in real world performance.