Thursday, October 16th 2014

8K A Great Challenge: NVIDIA and AMD

Even as 4K Ultra HD (3840 x 2160) is beginning to enter the consumer mainstream, with 28-inch displays priced around $600, and Apple toying with 5K (5120 x 2880) in its next-generation Retina iMac desktops, Japanese display maker Sharp threw a spanner in the works by unveiling a working prototype of its 8K (7680 x 4320 pixels) display at the CEATEC trade show in Japan.

Two of the industry's biggest graphics processor makers, NVIDIA and AMD, reacted similarly to the development, calling 8K "a great challenge." Currently, neither company has a GPU that can handle the resolution, which packs four times as many pixels as 4K. Driving an Ultra HD display over DVI already takes two dual-link DVI connections, and DisplayPort 1.2 and HDMI 2.0 have just enough bandwidth to drive Ultra HD at 60 Hz. To drive 8K, both NVIDIA and AMD believe you would need more than one current-generation GPU: the display would connect to both cards over independent connectors, and the cards would somehow treat the single display as four Ultra HD displays. We imagine Sharp demoed its display at a very low refresh rate to compensate for the bandwidth limitation. After ten years of Full HD tyranny, display resolutions are finally developing at a normal rate again. It's time for GPU makers and display interconnects to keep up.
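As a rough sanity check of those claims (our own back-of-the-envelope arithmetic, not figures from either GPU maker), here is the uncompressed data rate each resolution demands versus what a single current link can carry:

```python
# Rough uncompressed-video bandwidth estimate (our own figures, not
# NVIDIA's or AMD's). Ignores blanking intervals, which add roughly
# another 10-20% on top of these numbers.

def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_4k = raw_gbps(3840, 2160, 60)  # ~11.9 Gbit/s
uhd_8k = raw_gbps(7680, 4320, 60)  # ~47.8 Gbit/s

DP12 = 17.28    # DisplayPort 1.2 effective payload, Gbit/s (HBR2, 8b/10b)
HDMI20 = 14.4   # HDMI 2.0 effective payload, Gbit/s

print(f"4K60: ~{uhd_4k:.1f} Gbit/s (fits one DP 1.2 or HDMI 2.0 link)")
print(f"8K60: ~{uhd_8k:.1f} Gbit/s (~{uhd_8k / DP12:.1f}x one DP 1.2 link)")
```

With blanking overhead on top, that works out to three to four DisplayPort 1.2 links, which is consistent with the "four Ultra HD displays" arrangement described above.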
Source: Expreview

93 Comments on 8K A Great Challenge: NVIDIA and AMD

#51
FordGT90Concept
"I go fast!1!11!1!"
But the infrastructure in question is not CAT-8 (CAT-6 is still kind of rare); it is DisplayPort and especially HDMI, both of which use smaller-gauge cable and more wires. There are no specs at present that can handle 8K without multiple cables. The industry needs to decide whether higher and higher resolution displays will be the new norm, and if so, create a standard that can respond in kind (think DVI with single and dual-link: the standard had room for expansion even though few displays needed it).
Posted on Reply
#52
ZeDestructor
FordGT90ConceptBut the infrastructure in question is not CAT-8 (CAT-6 is still kind of rare); it is DisplayPort and especially HDMI, both of which use smaller-gauge cable and more wires. There are no specs at present that can handle 8K without multiple cables. The industry needs to decide whether higher and higher resolution displays will be the new norm, and if so, create a standard that can respond in kind (think DVI with single and dual-link: the standard had room for expansion even though few displays needed it).
My point was more that, in terms of bandwidth, the tech is here and largely usable (much like Ethernet and, even more closely, PCIe, DisplayPort uses a packet-based architecture, which is also why I chose Ethernet as the example). The standard needs a bit of updating, but not as much as you would think. If copper cabling becomes an issue, fibre will replace it, but DisplayPort at its core will likely remain unchanged (besides allowing higher bit rates within the protocol). This is also the most likely place where 8K is stuck: deciding whether the world is ready to jump to all-fibre. I, for one, am all for it.

Oh, and high-res displays will become the norm: the industry has just found a new cash cow, and engineers are more than happy to have fun designing that stuff!
Posted on Reply
#53
JDG1980
BansakuIt takes 2 top tier GPUs to effectively play a game at 4K with moderate settings at a decent frame-rate. I don't see 8K coming any time soon. Could you imagine the heat coming off a GPU that can fully render 8K with all of the rendering goodies?
It isn't all about gaming. Once you've worked with a high-DPI smartphone or tablet, the text reproduction on a standard PC monitor looks crude and blurry in comparison. And the only reason it even looks as good as it does is because of a lot of terrible hacks, like font hinting and sub-pixel AA. Once screens move to 300 PPI and beyond, these hacks can be done away with. We will finally have print-quality reproduction on a monitor screen with no compromises.
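To put numbers on that (a quick sketch with panel sizes we picked ourselves for illustration):

```python
# Pixel density (PPI) for a few illustrative panel sizes (our own
# picks); 300 PPI is the rough print-quality threshold the comment
# above refers to.
import math

def ppi(h_px, v_px, diagonal_inches):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(h_px, v_px) / diagonal_inches

print(f'24" 1920x1200: {ppi(1920, 1200, 24):.0f} PPI')  # ~94: hinting needed
print(f'28" 3840x2160: {ppi(3840, 2160, 28):.0f} PPI')  # ~157: better
print(f'32" 7680x4320: {ppi(7680, 4320, 32):.0f} PPI')  # ~275: near print
```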
Posted on Reply
#54
ZeDestructor
JDG1980It isn't all about gaming. Once you've worked with a high-DPI smartphone or tablet, the text reproduction on a standard PC monitor looks crude and blurry in comparison. And the only reason it even looks as good as it does is because of a lot of terrible hacks, like font hinting and sub-pixel AA. Once screens move to 300 PPI and beyond, these hacks can be done away with. We will finally have print-quality reproduction on a monitor screen with no compromises.
Font hinting, sub-pixel rendering and AA are all amazing; they just work much better on high-DPI screens than they do on low-DPI ones, as evidenced by all manner of devices with high-DPI screens (they all still use font smoothing).
Posted on Reply
#55
Katanai
Most of the posts here are simply wrong: you're viewing 8K from too narrow a PC-centric perspective. 8K is first and foremost a video format resolution, and from that perspective there is nothing wrong with it. There are 8K cameras right now that take awesome footage, the best-looking footage ever captured, comparable only to the large IMAX cameras. Those 8K cameras, hooked up to an 8K display, can drive it at that resolution perfectly. Not at a low refresh rate, as the writer of the article imagined: 8K cameras can record at 120 frames per second and, if the monitor supports it, display that footage at 120 Hz. Any 8K monitor that exists right now (and there are a few, this is not the first one) can display 8K footage at 60 Hz without any problems. When this resolution will be available to PC users as a viable alternative, I don't know...
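For reference, the raw data rate that 8K/120Hz footage implies (our own arithmetic, assuming 24-bit colour and no blanking overhead):

```python
# Uncompressed data rate of the 8K @ 120 Hz footage described above
# (our own arithmetic; assumes 24-bit colour, ignores blanking).
rate_gbps = 7680 * 4320 * 120 * 24 / 1e9
print(f"8K @ 120 Hz: ~{rate_gbps:.0f} Gbit/s uncompressed")  # ~96 Gbit/s
```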
Posted on Reply
#56
ZeDestructor
KatanaiMost of the posts here are simply wrong: you're viewing 8K from too narrow a PC-centric perspective. 8K is first and foremost a video format resolution, and from that perspective there is nothing wrong with it. There are 8K cameras right now that take awesome footage, the best-looking footage ever captured, comparable only to the large IMAX cameras. Those 8K cameras, hooked up to an 8K display, can drive it at that resolution perfectly. Not at a low refresh rate, as the writer of the article imagined: 8K cameras can record at 120 frames per second and, if the monitor supports it, display that footage at 120 Hz. Any 8K monitor that exists right now (and there are a few, this is not the first one) can display 8K footage at 60 Hz without any problems. When this resolution will be available to PC users as a viable alternative, I don't know...
I'm expecting 8K displays in the next 24 months myself.

We have 5K screens already :)
Posted on Reply
#57
Katanai
ZeDestructorI'm expecting 8K displays in the next 24 months myself.

We have 5K screens already :)
The writer of this article is mainly responsible for the confusion here, as he presented this as something completely new. He should have researched it a bit more before posting. Here is a clip from 2011 showing a working 8K display at 60 Hz, lol.
Posted on Reply
#58
FordGT90Concept
"I go fast!1!11!1!"
Using what interface? I came across this, and if the cable coming out is the one I think it is, it looks proprietary (almost like SATA). I'm also pretty sure it is encoding via HEVC from 30 Gbps down to 88 Mbps, which is then sent to the display and decoded to 4K.
Posted on Reply
#59
Katanai
FordGT90ConceptUsing what interface? I came across this, and if the cable coming out is the one I think it is, it looks proprietary (almost like SATA). I'm also pretty sure it is encoding via HEVC from 30 Gbps down to 88 Mbps, which is then sent to the display and decoded to 4K.
I don't know what interface is being used, but here is an 8K 120Hz panel from LG.
Posted on Reply
#60
FordGT90Concept
"I go fast!1!11!1!"
They have to be using multiple HDMI/DisplayPort or something proprietary.
Posted on Reply
#61
ZeDestructor
KatanaiThe writer of this article is mainly responsible for the confusion here, as he presented this as something completely new. He should have researched it a bit more before posting. Here is a clip from 2011 showing a working 8K display at 60 Hz, lol.
10bit too! HNNNG!
FordGT90ConceptUsing what interface? I came across this, and if the cable coming out is the one I think it is, it looks proprietary (almost like SATA). I'm also pretty sure it is encoding via HEVC from 30 Gbps down to 88 Mbps, which is then sent to the display and decoded to 4K.
Looks like twin-axial cabling sending raw uncompressed frames (at 30 Gbit/s) to a splitter box that divides each frame into 17 much smaller strips, sent as SDI signals to the encoders.

PS: SATA 3.0 (6Gbit/s) uses twin-axial cabling for the differential pairs, hence the similarity.
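Taking the figures quoted in this exchange at face value (30 Gbit/s raw, 88 Mbit/s encoded, 17 strips; none independently verified), the arithmetic looks like this:

```python
# Quick arithmetic on the figures quoted above (30 Gbit/s raw camera
# output, 88 Mbit/s after HEVC, 17 SDI frame strips). The inputs come
# from this thread, not from a spec sheet.
raw_bps = 30e9
encoded_bps = 88e6
strips = 17

print(f"HEVC compression ratio: ~{raw_bps / encoded_bps:.0f}:1")  # ~341:1
print(f"Per-strip rate: ~{raw_bps / strips / 1e9:.2f} Gbit/s")    # ~1.76
# ~1.76 Gbit/s per strip exceeds HD-SDI (1.485 Gbit/s) but fits
# comfortably within 3G-SDI (2.97 Gbit/s).
```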
Posted on Reply
#62
ZeDestructor
FordGT90ConceptThey have to be using multiple HDMI/DisplayPort or something proprietary.
Nope, just Twin-Ax from the camera to the splitter box. Yes, 30Gbit/s over a single pair :)

You commonly see twin-ax in 40 and 100 Gigabit Ethernet for very short runs (7 m or less) that don't require fibre, running at either 10.3125 Gbit/s per lane (one twin-ax pair per lane) for 100GBASE-CR10 or 25.78125 Gbit/s per lane for 100GBASE-CR4.
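The lane arithmetic checks out once you account for Ethernet's 64b/66b line coding (a quick sketch; the per-lane rates are the standard signalling figures):

```python
# 100 Gigabit Ethernet lane math: per-lane signalling rate x lane
# count x 64b/66b coding efficiency recovers the 100 Gbit/s payload.
def payload_gbps(lane_rate_gbps, lanes, coding=64 / 66):
    return lane_rate_gbps * lanes * coding

print(f"100GBASE-CR10: {payload_gbps(10.3125, 10):.1f} Gbit/s payload")
print(f"100GBASE-CR4:  {payload_gbps(25.78125, 4):.1f} Gbit/s payload")
```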
Posted on Reply
#63
FordGT90Concept
"I go fast!1!11!1!"
Still a proprietary cable. Cable standards need to catch up to the proposed display standards.
Posted on Reply
#64
ZeDestructor
FordGT90ConceptStill a proprietary cable. Cable standards need to catch up to the proposed display standards.
That was a professional video camera. Those always use their own standards, but almost completely interoperable ones: SDI and HD-SDI, for instance, are common interconnects there, while HDMI and DisplayPort are used for little besides connecting TVs and computer monitors. The only possibly proprietary cable in that video is the camera-to-splitter-box cable. Everything else will be running on standards that consumers simply never interact with.

Consumer cabling will follow later, with far fewer reliability features, much like SATA vs SAS.
Posted on Reply
#65
The Von Matrices
ZeDestructor#1 DisplayPort can do ~25 Gbit/s right now, and according to the DisplayPort page on Wikipedia, 8K at 24-bit @ 60 Hz lies around 50-60 Gbit/s. For 8K at 30-bit @ 120 Hz, 125-150 Gbit/s should be the bandwidth we're looking at. CAT-8 cabling (4-pair, 8 wires) is currently being finalized to provide 40 Gbit/s over 100 m. DP is a 4-lane cable (one "pair" per lane). Using CAT-8-grade cabling with the right transceivers over the usual 5 m maximum length of DP cabling (a 5 m CAT-8 cable should be good for at least 100 Gbit/s), 8K is perfectly feasible with current tech. Expensive, but feasible. Hell, odds are that CAT-8 cabling will be good for 100 Gbit/s over the full 100 m length thanks to new electronics... Pennsylvania State University people theorized in 2007 that 32 or 22 nm circuits would do 100 Gbit/s over 100 m of CAT-7A.
You're looking at the wrong side of the issue. The cables can be cheap, but the transceivers are what make things expensive. Look at Thunderbolt: the transceivers, not the cable between them, are why you can't get a Thunderbolt cable for less than $30 no matter how short it is. Similarly, I would love to have 10Gb Ethernet connecting me to the storage server in my house, but I can't justify paying $150 for each 10GBase-T NIC.
Posted on Reply
#66
ZeDestructor
The Von MatricesYou're looking at the wrong side of the issue. The cables can be cheap, but the transceivers are what make things expensive. Look at Thunderbolt: the transceivers, not the cable between them, are why you can't get a Thunderbolt cable for less than $30 no matter how short it is. Similarly, I would love to have 10Gb Ethernet connecting me to the storage server in my house, but I can't justify paying $150 for each 10GBase-T NIC.
Fair point, although I think that the transceiver issue can be solved fairly easily by scaling production up.

Also, where do you find a 10GBASE-T NIC for $150?! The lowest I've seen an X540-T1 go for was around the $300 mark... and then I'd need to get a new switch...
Posted on Reply
#67
Sony Xperia S
RojanUP2414Q, been out for months
Good, but still, I realised that I don't want or need this brand, and it's expensive at $700. Meh :rolleyes:

I need my 22-inch 4K monitor for $300 NOWWW!!!
Posted on Reply
#68
The Von Matrices
ZeDestructorFair point, although I think that the transceiver issue can be solved fairly easily by scaling production up.

Also, where do you find a 10GBASE-T NIC for $150?! The lowest I've seen an X540-T1 go for was around the $300 mark... and then I'd need to get a new switch...
You're right that NICs cost closer to $300 than the $150 I stated. The cheapest NIC I could find is a Marvell-based StarTech card that costs $284. The switches, though, are relatively cheap per port compared to the NICs; an 8-port switch costs only $824, or $103 per port.
Posted on Reply
#69
ZeDestructor
The Von MatricesYou're right that NICs cost closer to $300 than the $150 I stated. The cheapest NIC I could find is a Marvell-based StarTech card that costs $284. The switches, though, are relatively cheap per port compared to the NICs; an 8-port switch costs only $824, or $103 per port.
Not bad..

Though I think I'll be waiting another 2-5 years before I move up. If I'm buying completely new, I want something quiet; if second-hand, at least 24 ports.
Posted on Reply
#70
anonymous6366
First off, 1080p 120Hz > 4K 60Hz.
As for 8K, well, as other people have said, there isn't a video card that can handle it.
I would also agree with the comments about this being a marketing strategy to trick rich people who don't know anything and think the most expensive product is always the best (I know people like this). I'm still in the stone age with a 1680x1050 monitor, so what do I know, lol.
Posted on Reply
#71
ZeDestructor
anonymous6366First off, 1080p 120Hz > 4K 60Hz.
As for 8K, well, as other people have said, there isn't a video card that can handle it.
I would also agree with the comments about this being a marketing strategy to trick rich people who don't know anything and think the most expensive product is always the best (I know people like this). I'm still in the stone age with a 1680x1050 monitor, so what do I know, lol.
I have better vision than most people (excluding people with glasses), and all I want is nicer text. 4K is just an in-between step IMO; 8K is where I would stop, for screens. VR needs higher... Seriously, just look at normal 8pt text on a 100%-scale (100 dpi/ppi) screen at 800% magnification... so horrible, and I can see the horribleness on a normal 24" 1920x1200 (100 ppi) screen :(

As for the marketing, well, it's marketing... it's always gonna be aimed at rich people with more money than sense.
Posted on Reply
#72
Sony Xperia S
ZeDestructorAs for the marketing, well, it's marketing... it's always gonna be aimed at rich people with more money than sense.
Marketing is not specifically aimed at a given class, be it the poor who have just enough for something or the rich who are looking for somewhere to waste their money; it is invented to show products in a better light than they actually deserve.

OK, classically it should just showcase the product, but nowadays it's more like imagination and fantasies...
Posted on Reply
#73
Hood
NaitoReally!? I haven't even moved to 1440p yet! Even though I would love to! :(
I took the plunge a few months ago and ordered a Crossover 27QW Perfect Pixel model from Newegg (about $400). It looks amazing. The 27" 2560x1440 IPS panel is a definite step up from 1080p, and games have no problem even with my single GPU (660 Ti). These monitors ship direct from South Korea and arrive in 3 days. No dead pixels, and it's still working perfectly. Prices have dropped to about $350. The panels used are the same Samsung or LG panels found in the expensive name brands ($700-$900). The stand isn't adjustable, but that was easily remedied by setting it on a 3-inch riser, which doubles as a place to slide my keyboard under (for more desk space).
Posted on Reply
#74
johnspack
Here For Good!
Funny, I'm sure these same arguments were had over 1080p a few years ago in here. Eventually 4K will be the norm, and further down the line, 8K. I'm still pissed over the 1200p thing; why wasn't that the norm? Video cards will catch up, 4K monitors will become standard, and we'll all be laughing at the old days when we only had 1080p. Let's allow advancement, so it will become the norm...
Posted on Reply
#75
newconroer
" and somehow treat the single display as four Ultra HD displays. " This is the concerning part... and 4k faced it already until I think Iiyama figured out a resolution ..

johnspackFunny, I'm sure these same arguments were had over 1080p a few years ago in here. Eventually 4K will be the norm, and further down the line, 8K. I'm still pissed over the 1200p thing; why wasn't that the norm? Video cards will catch up, 4K monitors will become standard, and we'll all be laughing at the old days when we only had 1080p. Let's allow advancement, so it will become the norm...
Well, quite a few of us have been laughing at 1080p for a long time now. The computer graphics and monitor market is to the audio-visual industry what Formula 1 is to the automotive industry: the cutting-edge stuff influences what becomes common in the marketplace.

The problem with 1080p is that it has outlived its welcome and would have been replaced by 1440p/1600p already if television manufacturers and broadcast networks weren't so lazy or behind the times. It also doesn't help that the popular consoles have only now reached 1080p.

1080p needs to die, and die quick.
Posted on Reply