Wednesday, September 11th 2019

Intel Says Its Upcoming Gen12 GPUs Will Feature Biggest Architecture Change In A Decade

Intel is slowly realizing its plans to "one-up" its GPU game, starting with the first 10 nm Ice Lake CPUs featuring Gen11 graphics, which give users of integrated GPUs much more performance than they previously got. Fortunately, Intel doesn't plan to stop there. Thanks to a recent merge request in the Mesa repository on GitLab, we can now expect the biggest GPU performance bump in over a decade with the arrival of Gen12-based GPUs, found in next-generation Tiger Lake processors.

In this merge request, Francisco Jerez, a member of Intel's open source Linux graphics team, stated the following: "Gen12 is planned to include one of the most in-depth reworks of the Intel EU ISA since the original i965. The encoding of almost every instruction field, hardware opcode and register type needs to be updated in this merge request. But probably the most invasive change is the removal of the register scoreboard logic from the hardware, which means that the EU will no longer guarantee data coherency between register reads and writes, and will require the compiler to synchronize dependent instructions anytime there is a potential data hazard..."
Planned for release sometime around 2020/2021 (with Tiger Lake), Gen12 graphics features a complete overhaul of the Execution Unit on a scale we haven't seen since the i965 debut. There will be less hardware logic checking data for coherency, possibly resulting in lower latency and higher performance. That workload shifts from logic built into the hardware to the compiler, which must figure out whether dependent data is ready, so fewer GPU clock cycles are wasted on that function.

Source: Phoronix via HotHardware
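The compiler-managed dependency tracking Jerez describes can be sketched in miniature: before emitting each instruction, the compiler checks whether any still-pending long-latency instruction creates a data hazard with it, and inserts an explicit wait if so. The instruction format and the "sync" marker below are invented for illustration only; the real Gen12 EU ISA encodes this software-scoreboard information in the instructions themselves.

```python
# Toy model of compiler-side dependency tracking ("software scoreboarding").
# Register names, the dict-based instruction format and the "sync" marker
# are assumptions made for this sketch, not the actual Gen12 encoding.

def needs_sync(earlier, later):
    """True if 'later' depends on 'earlier': RAW, WAR or WAW hazard."""
    raw = bool(set(earlier["dst"]) & set(later["src"]))  # read-after-write
    war = bool(set(earlier["src"]) & set(later["dst"]))  # write-after-read
    waw = bool(set(earlier["dst"]) & set(later["dst"]))  # write-after-write
    return raw or war or waw

def schedule(instrs):
    """Emit the stream, inserting an explicit wait whenever a still-pending
    long-latency instruction hazards with the next instruction."""
    out, in_flight = [], []
    for ins in instrs:
        for prev in list(in_flight):          # copy: we mutate while iterating
            if needs_sync(prev, ins):
                out.append(("sync", prev["op"]))  # wait for prev to retire
                in_flight.remove(prev)
        out.append((ins["op"],))
        if ins.get("long_latency"):           # e.g. a memory load
            in_flight.append(ins)
    return out

program = [
    {"op": "send r10<-mem", "dst": ["r10"], "src": [], "long_latency": True},
    {"op": "mul r2,r0,r1",  "dst": ["r2"],  "src": ["r0", "r1"]},
    {"op": "add r3,r10,r2", "dst": ["r3"],  "src": ["r10", "r2"]},
]

print(schedule(program))
```

The independent `mul` is issued while the load is still in flight; only the `add`, which reads `r10`, gets a sync inserted in front of it. With the hardware scoreboard gone, getting this analysis wrong no longer stalls the pipeline — it silently reads stale data, which is why the compiler change is so invasive.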

67 Comments on Intel Says Its Upcoming Gen12 GPUs Will Feature Biggest Architecture Change In A Decade

#51
64K
potato580+
so does it mean we finally have a third option in the gpu market? i believe in intel, wait and observe :)
Yes, but at what level of performance, pricing, quality and drivers support we don't know yet.
#52
Spencer LeBlanc
64K
2020 release doesn't mean January 1st 2020. Intel could take over a year from now and still call it a 2020 release.
I have nothing against Intel's products. I've owned many i5/i7s. I choose the best performance in the range I can afford when available, regardless of AMD/Intel.
However, they are fucking liars.

laszlo
you can write anything, but without self-censoring you may soon lose that privilege... this is a decent forum, not an underground one...
My bad, link me to forum rules plz.
#54
lexluthermiester
notb
If only it existed.
It DOES exist. It happens now. Reality is knocking...
#55
R0H1T
notb
One of the most efficient GPUs available (including ARM)? Triple 4K support?
Citation needed - their competition at AMD absolutely smashes them in every metric possible. Now if you're comparing ICL, with its LPDDR4x advantage, how about waiting for the real competitor from AMD?
ARM - don't even go there, Apple literally smashes everything including Intel in the CPU+GPU space in the sub 15~10W space!
notb
Most polished hardware encoder/decoder?
That's debatable, though it's hard to find too many holes in that argument.
#56
notb
lexluthermiester
It DOES exist. It happens now. Reality is knocking...
Show me. :-)
R0H1T
Now if you're comparing ICL, with LPDDR4x advantage, how about wait for the real competitor from AMD?
Wow. That's an interesting suggestion.
Why aren't you so eager to wait for Zen2's real competitor from Intel? :-) It should arrive more or less at the same time as Zen2 mobile SoCs.

At this very moment Intel has a more powerful mobile SoC. Why can't you just admit it? Why is it so painful to you?
ARM - don't even go there, Apple literally smashes everything including Intel in the CPU+GPU space in the sub 15~10W space!
And what is the maximum resolution for A12? How many screens? :-)

You don't get this, right? Somehow you just can't grasp the idea that computers are about functionality first, performance second. For you it's all about benchmarks and smashing. :-)
#57
Apocalypsee
Intel needs to improve their drivers if they want to get serious about graphics, not just pure hardware performance. Last time I used an Intel HD 4600 I had problems in some games (like no reflections in water, poor texture filtering quality, etc.). They need a better control panel to configure graphics settings too.

I don't mind Intel ramping up their IGP performance, perhaps with competition we will see some AMD CPU with HBM on some market (gaming notebook or NUC)
#58
Vayra86
notb
One of the most efficient GPUs available (including ARM)? Triple 4K support? Most polished hardware encoder/decoder?

Out of all PCs existing maybe 10% are used for gaming and maybe 1% for some kind of semi-conscious GPGPU - most of these have a graphics card anyway.
For the rest of users Intel HD is pretty much perfect.
They're increasing performance to support higher resolutions. That's it.
Yes, I'm sure there is more than 3% actually waiting for triple 4K support right about now, more so than just wanting to play a few games and not have it suck.

APUs are usable without this and have been for over a decade. You're right, it's a res bump, and therefore it's about 0% interesting :) Even with its efficiency, it's another baby step yawnfest.
#59
notb
Apocalypsee
Intel needs to improve their drivers if they want to get serious about graphics, not just pure hardware performance. Last time I used an Intel HD 4600 I had problems in some games (like no reflections in water, poor texture filtering quality, etc.). They need a better control panel to configure graphics settings too.
True. I play older games mostly, but some just didn't want to work on IGP (despite it being fast enough).
Intel is said to be addressing this in Xe drivers. It's going to be a proper, unified gaming driver (from IGP to big cards).
I don't mind Intel ramping up their IGP performance, perhaps with competition we will see some AMD CPU with HBM on some market (gaming notebook or NUC)
I find this quite surprising as well. I don't know why we're not getting small gaming PCs to compete with consoles.
Of course you can get a powerful NUC or something from Zotac. But it's still using Windows or Linux. I'd rather have a tiny OS optimized for games - just like on consoles.
Steam OS is just an average Linux distro with a fancy skin.
Vayra86
Yes, I'm sure there is more than 3% actually waiting for triple 4K support right about now, more so than just wanting to play a few games and not have it suck.
It's fairly normal today for software developers, analysts or admins to work on 3 screens, so I'm not sure what you're trying to say... Even most accountants I've seen prefer at least 2.

Windows is awful at scaling, which means office screens are mostly ~20-24" 1080p (or a much better 1920x1200). As that gets fixed, 4K will become the new norm.
MacOS scaling works properly and their ecosystem has already moved to (and past) 4K.
APUs are usable without this and have been for over a decade. You're right, it's a res bump, and therefore it's about 0% interesting :) Even with its efficiency, it's another baby step yawnfest.
APUs are way too big, way too power hungry and not as polished as IGP. Intel could make a bigger IGP as well - every CPU could get something from Iris range. But what's the point?
Intel is slowly improving their IGP to address current needs of non-gaming consumers and business.
#60
R0H1T
notb
And what is the maximum resolution for A12? How many screens? :)

You don't get this, right? Somehow you just can't grasp the idea that computers are about functionality first, performance second. For you it's all about benchmarks and smashing. :)
What let me try that again ~ what the what o_O

I'm sure you know the answer to that, care to share what's the maximum resolution (not iPhone screen resolution!) actually supported by the A12 GPU? Never mind that A13 is already in mass production & will probably sell 10x 10000x the (ICL) units Intel moves by then, as the new iPhones launch :rolleyes:

You're saying that with a straight face, really? I guess the iPhone or iPad is just a toy to you, huh? Not to mention you claimed that Intel's IGP is on par with & equally(?) power efficient as any of its competitors - both claims are dubious at best :shadedshu:
#61
micropage7
in a decade, so they're gonna compare the new chip to the one released in 2009
#62
Octopuss
Who cares about biggest changes in a decade when it's still useless integrated GPU that can play a video or Minecraft at best?
#63
lexluthermiester
Octopuss
Who cares about biggest changes in a decade when it's still useless integrated GPU that can play a video or Minecraft at best?
That will be a part of the improvements being made. Besides, who cares? Buy a Radeon or Geforce card if you want to game.
#64
Vlada011
Intel has HUUUUGE problems building processors competitive with AMD's.
They have no technology for a radical improvement in per-core performance.
#65
Octopuss
Vlada011
Intel has HUUUUGE problems building processors competitive with AMD's.
They have no technology for a radical improvement in per-core performance.
Someone didn't get the topic.
#66
jabbadap
lexluthermiester
That will be a part of the improvements being made. Besides, who cares? Buy a Radeon or Geforce card if you want to game.
Point being: laptops. Maybe some NUCs too; I can't even remember when the last Iris desktop CPU was (some soldered -R Skylake maybe?). Ice Lake laptops with a 25 W TDP will offer MX150-like performance, so no more pesky little Nvidia dGPU needed for lightweight gaming. As a Linux user I'm actually looking forward to those; Intel's open source drivers have been rock solid so far.
#67
Arctucas
Call me when Intel produces discrete graphics cards that match the performance of AMD's and nVIDIA's flagship cards.