
Super-High 4096 x 4096 Display From An IGP? The Upcoming Ivy Bridge Can Do It

qubit

Overclocked quantum bit
The new Ivy Bridge processors, due out in about six months, have one apparently overlooked but important feature. No, it's not the greatly increased speed (roughly double that of Sandy Bridge, or more) or the advanced feature set. It's actually the super-high resolution capability: specifically, 4096 x 4096 pixels. This astonishing figure is higher than any of the top-end discrete graphics cards, such as the NVIDIA GTX 590 or AMD HD 6990, can output via a single monitor port. It's so high, in fact, that there's almost no content at that resolution and no monitor that can handle it. The IGP can actually play multiple 4K video streams, too. NVIDIA, unsurprisingly, is talking up the gaming possibilities at such a resolution; I'd like to see what kind of monster GPU could handle it. It will be interesting to see what uses this capability gets put to generally - and just how much the whole setup will cost.



View at TechPowerUp Main Site
 
Any talk about anything over 1920x1080 can only be good.
 
welcome 2 the matrix

Wow. Already unnecessarily powerful... now with even more POWER!!!!! :rockout: Man, I love Intel :rockout::rockout::rockout:


Yeah, yeah, yeah, nothing on earth right now can effectively use this. Think of it as future-proofing for the next decade. Or hologram-ready :) yeah, that's the ticket
 
I guess it was Intel's turn for a fluff post. Let's be realistic: current dedicated GPUs are capable of outputting at a resolution far beyond their actual processing capability. All GPUs now list 2560 x 1600 as the maximum resolution for a single screen. There are maybe two that could actually render that at movie frame rates, and only CRT monitors could display that res.

On the flip side, we have seen higher-than-4K resolutions from AMD Eyefinity; granted, it was always across more than one LCD and more than one GPU. In the end, this will just be some digits on a spec sheet that get ignored and overlooked, as they admit it will serve no purpose for lack of content. And if you do run a 4K video, I am sure it will fall well below the 24 FPS movies use. Don't expect this on-die GPU to go beyond the sub-$100 performance segment. So yes, I am going to hate a little here, but somebody has to do it.
 
Sure, it can drive a monitor at 4096*4096, but what about games? They will likely be a slideshow. No GTX 590 or HD 6990 is going to drive a 16-megapixel display as fluidly as a 1920*1080 (~2 megapixel) one, so how is some Intel IGP sharing a DDR3 interface with the CPU going to do anything well at that resolution? Sounds like a USB video dongle displaying a Word document at that resolution.
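
For a rough sense of that gap, here's a quick back-of-the-envelope sketch in Python (assumes 60 Hz and ignores blanking intervals):
Code:
# Pixels per frame and per second at 60 Hz -- blanking intervals ignored.
for name, w, h in [("1920 x 1080", 1920, 1080), ("4096 x 4096", 4096, 4096)]:
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP/frame, {pixels * 60 / 1e6:.0f} MP/s")

# 1920 x 1080:  2.1 MP/frame,  124 MP/s
# 4096 x 4096: 16.8 MP/frame, 1007 MP/s  (roughly 8x the fill-rate load)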
 
4096 x 4096 in a 2D environment? Big deal. By the time we see that even come close to mainstream, when the people cheap enough to use IGPs are looking to buy monitors at that resolution, Ivy Bridge will be as distant a memory as the Pentium III.
 
I'd like a 4KHD (guess it'll be called S-HD or something) monitor now please.
 
The GPU isn't the problem, it's the cable. There's currently no means to deliver that high a resolution to a display without using multiple cables. This is why NVIDIA and AMD haven't done it. It has me wondering about LightPeak or some other new technology they aren't talking about.

Eyefinity 6 can do a higher resolution (7680x3200, 24.5 MP) but over six discrete cables--Intel might simply be leaving out that little bit of information.
 
As everybody's said, the point is moot. I have a hard time believing that Intel's "IGP" can do any more than move windows around the desktop at that resolution without becoming a slideshow. They've come a long way since the Intel GMA of old, but not that far. Let's face it: photo editing is the only application where such an ultra-high res could be used without the dismal framerate rendering it completely useless. Even so, it would be irritating to use.
 
The GPU isn't the problem, it's the cable. There's currently no means to deliver that high a resolution to a display without using multiple cables. This is why NVIDIA and AMD haven't done it. It has me wondering about LightPeak or some other new technology they aren't talking about.

Eyefinity 6 can do a higher resolution (7680x3200, 24.5 MP) but over six discrete cables--Intel might simply be leaving out that little bit of information.

I didn't even think about that. DisplayPort comes the closest but stops in the 3500 x 2600-ish range @ 30 bpp. Thunderbolt would be the only possible option.

P.S. It is no longer called LightPeak.
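
A rough sketch of where that DisplayPort ceiling comes from (assuming DisplayPort 1.2's 17.28 Gb/s payload rate, 30 bpp, 60 Hz, and ignoring blanking overhead):
Code:
# Maximum pixels per frame a single DisplayPort 1.2 link can feed,
# assuming 17.28 Gb/s of payload, 30 bpp deep colour, 60 Hz, no blanking.
DP_DATA_RATE = 17.28e9  # bits/s
BPP = 30
REFRESH = 60            # Hz

max_pixels = DP_DATA_RATE / (BPP * REFRESH)
print(f"ceiling:     {max_pixels / 1e6:.1f} MP/frame")  # ~9.6 MP
print(f"3500 x 2600: {3500 * 2600 / 1e6:.1f} MP")       # ~9.1 MP -> fits
print(f"4096 x 4096: {4096 * 4096 / 1e6:.1f} MP")       # ~16.8 MP -> doesn't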
 
DL-DVI maxes out at 3840x2400. Didn't I read somewhere about QL-DVI? (dual DL-DVI cables in tandem)

I know HDMI has a "Professional" version of their spec taped out, including a proprietary connector, which is supposed to address the disparity between HDMI and DVI as far as resolution capability goes, but last I read about it, no device was actually using the "pro" connector. Probably has sky-high licensing fees too.
wikipedia said:
Type B
This connector (21.2 mm × 4.45 mm) has 29 pins and can carry double the video bandwidth of type A, for use with very high-resolution future displays such as WQUXGA (3,840×2,400).[48][49] Type B is electrically compatible with dual-link DVI-D, but has not yet been used in any products.[48][50]

I imagine if they spec'd for higher quality cabling and pushed out a new revision of the specification, new devices could probably operate on a combination of the 29-pin HDMI connector and higher clock rates. Although DVI and DisplayPort could do so just as easily.
Wikipedia - DVI said:
Dual link maximum data rate is limited only by the bandwidth limits of the copper the DVI cable is constructed of and by the DVI signal's source.
 
What does DL-DVI max out at? Didn't I read somewhere about QL-DVI? (dual DL-DVI cables in tandem)

I know HDMI has a "Professional" version of their spec taped out including a proprietary connector, which is supposed to address the disparity between HDMI and DVI as far as resolution capability, but last I read about it, no device was actually using the "pro" connector. Probably has sky-high licensing fees too.

DL-DVI maxes out at 3,840 × 2,400 @ 33 Hz and HDMI 1.4 maxes out at 4096×2160 p24. As you seem to be unaware, all additional exploration into DVI ended when HDMI became the standard. There were a number of projects in the works for DVI that did not and will not ever be completed. The final nail in the coffin was DisplayPort removing some of the proprietary issues with HDMI's license.

There is no disparity in resolution between DVI and HDMI. The only reason the specs are different is that DVI is very old and was designed with 4:3 computer monitors in mind, while HDMI was done mainly for 16:9 with all display systems in mind.

And as far as I know, there never has been an HDMI Pro. Sounds like some marketing gimmick BS from Monster Cables to sell their $100 cables to simple folk who don't know better.
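
For reference, those two maximums line up with the payload bandwidth of each link. A quick sanity check (assuming roughly 7.92 Gb/s for dual-link DVI and 8.16 Gb/s for HDMI 1.4 after 8b/10b coding, 24-bit colour, blanking ignored):
Code:
# Required bandwidth = width x height x bit depth x refresh (no blanking).
modes = {
    "DL-DVI,   3840 x 2400 @ 33 Hz": (3840, 2400, 33, 7.92e9),
    "HDMI 1.4, 4096 x 2160 @ 24 Hz": (4096, 2160, 24, 8.16e9),
}
for name, (w, h, hz, link) in modes.items():
    need = w * h * 24 * hz  # bits/s at 24-bit colour
    print(f"{name}: needs {need / 1e9:.2f} Gb/s of {link / 1e9:.2f} Gb/s")

# DL-DVI,   3840 x 2400 @ 33 Hz: needs 7.30 Gb/s of 7.92 Gb/s  (just fits)
# HDMI 1.4, 4096 x 2160 @ 24 Hz: needs 5.10 Gb/s of 8.16 Gb/s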
 
I didn't even think about that. DisplayPort comes the closest but stops in the 3500 x 2600-ish range @ 30 bpp. Thunderbolt would be the only possible option.

P.S. It is no longer called LightPeak.
Thunderbolt's effective bandwidth is 10 Gb/s. DisplayPort is 17.28 Gb/s.

Thunderbolt is copper-based. LightPeak is optical/fiber-based. Optical is ideal for sending imagery but it is neither simple nor cheap. Maybe Intel had a breakthrough.


DL-DVI maxes out at 3840x2400. Didn't I read somewhere about QL-DVI? (dual DL-DVI cables in tandem)
At an abnormal 33 Hz. 2560x1600 is the maximum dual-link DVI can handle at 60 Hz. 60 Hz is the standard for computers.

Quad-link DVI is an impossibility due to not having enough physical connections in the DVI standard. You might be thinking of 2 x dual-link DVI (literally two inputs on the monitor) which a lot of very high resolution (5+ MP) professional monitors use.
 
And Wikipedia left out the fact that DVI is limited by the encoding used to translate a signal to DVI standards, an encoding scheme that will no longer be updated or improved. Note that HDMI has not physically changed; it has simply adhered to higher production standards while its encoding scheme has been updated to improve resolution and bandwidth.
 
So what about multiple cables? It's how we've been doing 4K for years now.
 
DL-DVI maxes out at 3,840 × 2,400 @ 33 Hz and HDMI 1.4 maxes out at 4096×2160 p24. As you seem to be unaware, all additional exploration into DVI ended when HDMI became the standard. There were a number of projects in the works for DVI that did not and will not ever be completed. The final nail in the coffin was DisplayPort removing some of the proprietary issues with HDMI's license.

There is no disparity in resolution between DVI and HDMI. The only reason the specs are different is that DVI is very old and was designed with 4:3 computer monitors in mind, while HDMI was done mainly for 16:9 with all display systems in mind.

And as far as I know, there never has been an HDMI Pro. Sounds like some marketing gimmick BS from Monster Cables to sell their $100 cables to simple folk who don't know better.

I'm perfectly aware DVI is no longer updated. Nice try at the troll though. "Pro" was just my name for it, for lack of a better word. The connector is "HDMI B", and as Wikipedia and every other source on the planet says, nothing uses it. Of course there is no HDMI Pro, and if there were a brand to sell an "HDMI Pro" cable, you're damn right it would be Monster, trying to milk a "superior" gold-coated, diamond-encrusted, blessed-by-the-saints-of-Endor HDMI 1.4 "A" cable. When I said pro, I just meant intended for professional use, in the same way 2xDL-DVI is.

At an abnormal 33 Hz. 2560x1600 is the maximum dual-link DVI can handle at 60 Hz. 60 Hz is the standard for computers.

Quad-link DVI is an impossibility due to not having enough physical connections in the DVI standard. You might be thinking of 2 x dual-link DVI (literally two inputs on the monitor) which a lot of very high resolution (5+ MP) professional monitors use.

Exactly. QL-DVI was one brand's marketing name for the 2xDL-DVI system. IF someone updated the DVI spec for a higher-quality cable so it could handle a higher clock rate (which is what HDMI did, electrically speaking; HDMI then just fiddled with the TMDS protocol to add DRM and interleave audio on the video pins at a higher clock rate), actual QL-DVI-effective bandwidth may be possible on a DVI connector.
 
Sure, Intel might be able to do 4K x 4K, but at what kind of frame rate? No fun watching YouTube videos on a $32K TV that shows them as a slideshow.

Also, Intel != drivers.

Until that problem is solved, this is just another Intel "marketing" point.
 
Thunderbolt is copper-based. LightPeak is optical/fiber-based. Optical is ideal for sending imagery but it is neither simple nor cheap. Maybe Intel had a breakthrough.

Thunderbolt is LightPeak. Several revisions were made during its development.

Wikipedia said:
Originally conceived as an optical technology, Thunderbolt switched to electrical connections to reduce costs and to supply up to 10W of power to connected devices.[14]

In 2009, Intel officials said the company was "working on bundling the optical fibre with copper wire so Light Peak can be used to power devices plugged into the PC."[15] In 2010, Intel said the original intent was "to have one single connector technology" that would allow "electrical USB 3.0 […] and piggyback on USB 3.0 or 4.0 DC power."[16]

In January 2011, Intel's David Perlmutter told Computerworld that initial Thunderbolt implementations would be based on copper wires.[17] "The copper came out very good, surprisingly better than what we thought," he said.[18]

Intel and industry partners are still developing optical Thunderbolt hardware and cables.[19] The optical fiber cables are to run "tens of meters" but will not supply power, at least not initially.[20][21][22] They are to have two 62.5-micron-wide fibers to transport an infrared signal up to 100 metres (330 ft).[23] The conversion of electrical signal to optical will be embedded into the cable itself, allowing the current DisplayPort socket to be future compatible, but eventually Intel hopes for a purely optical transceiver assembly embedded in the PC.

As of now, Thunderbolt is PCIe x4 and DisplayPort rolled into one. Bandwidth is 10 Gbit/s bi-directional. Switch that to a single direction and you could currently get 20 Gbit/s, putting it just north of DisplayPort's max.

Reference: http://www.intel.com/technology/io/thunderbolt/ under the "What is Thunderbolt" section
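
Putting those numbers against the resolution in question, a rough estimate of the refresh rate each link could manage at 4096 x 4096 and 24-bit colour (blanking ignored):
Code:
# Max refresh = link bandwidth / (pixels per frame x bit depth).
PIXELS = 4096 * 4096
for name, rate in [("Thunderbolt, 10 Gb/s", 10e9),
                   ("DisplayPort, 17.28 Gb/s", 17.28e9)]:
    print(f"{name}: {rate / (PIXELS * 24):.1f} Hz max")

# Thunderbolt, 10 Gb/s:    24.8 Hz max
# DisplayPort, 17.28 Gb/s: 42.9 Hz max  -- neither reaches 60 Hz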
 
...actual QL-DVI-effective bandwidth may be possible on a DVI connector.
It most likely is, but only over very short distances. The Digital Display Working Group thought about cable lengths and signal quality when they introduced the DVI standard; the HDMI founders did not. DVI has more than double the data wires of HDMI and the quality of said cables is much, much greater (better isolation, thicker, etc.). DVI, therefore, should be able to push at least double the bandwidth over the same distance but, as TheLaughingMan pointed out, DVI is no longer being expanded because of DisplayPort.


...but eventually Intel hopes for a purely optical transceiver assembly embedded in the PC.
That is LightPeak. Thunderbolt is the result of the future meeting reality. The goal that is LightPeak is still very much alive. I wouldn't be surprised at all if Thunderbolt always remains copper-based. DisplayPort was not intended for optical signals so they're going to have to put in place massive limitations (the optical/electrical conversions in the cables themselves) by not changing the connectors.


24-bit, 4096x4096 @ 60 Hz = 24.159191040 Gb/s
32-bit, 4096x4096 @ 60 Hz = 32.212254720 Gb/s
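
For anyone who wants to reproduce those figures, it's just width x height x bit depth x refresh:
Code:
# 4096 x 4096 pixels at 60 Hz, for 24- and 32-bit colour depths.
for depth in (24, 32):
    bps = 4096 * 4096 * depth * 60  # bits per second
    print(f"{depth}-bit, 4096x4096 @ 60 Hz = {bps / 1e9:.9f} Gb/s")

# 24-bit, 4096x4096 @ 60 Hz = 24.159191040 Gb/s
# 32-bit, 4096x4096 @ 60 Hz = 32.212254720 Gb/s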
 
Do not be so impressed with this technological terror they've constructed. The ability to project to 4k resolutions is insignificant next to the power of a GPU.
 
I'm perfectly aware DVI is no longer updated. Nice try at the troll though.

That was not a troll post. I was kinda serious.

And HDMI "B" is nothing special. It was just a revision designation. When they update 1.4a it will become 1.4b. If they change something major or there is a planned upgrade in bandwidth or performance, it will be 1.5.

24-bit, 4096x4096 @ 60 Hz = 24.159191040 Gb/s
32-bit, 4096x4096 @ 60 Hz = 32.212254720 Gb/s

I did say maybe. LightPeak is not some grand scheme or goal. It was just a code name. They may reuse the code name, but I really, really doubt they would.
 
Do not be so impressed with this technological terror they've constructed. The ability to project to 4k resolutions is insignificant next to the power of a GPU.

Hehe: Star Wars reference :D
 
Correct me if I'm wrong, but could this be used effectively in medical imaging?
 
No; everything that uses high resolutions (read: CT and MRI scans) also needs substantial computational power. IGPs are simply not powerful enough to render dozens of slices in real time; that takes a serious GPU.
 