Wednesday, August 15th 2018

Intel Teases Their Upcoming Graphics Cards for 2020

Just in time for SIGGRAPH, the world's leading computer graphics conference, the team around Raja Koduri and Chris Hook has posted a teaser video on Twitter for Intel's upcoming graphics cards, which are scheduled to become available in 2020.
The video is produced in the style typical of what Chris Hook used to release at AMD. It starts with a history lesson praising Intel's achievements in the graphics department, then promises that in 2020, Intel discrete graphics cards "will be set free, and that's just the beginning".

In the comments for the video, Chris Hook, who left AMD to join Intel as head of marketing for its graphics department, said: "Will take time and effort to be the first successful entrant into the dGPU segment in 25 years, but we have some incredible talent at Intel, and above all, a passion for discrete graphics."

You can find the video here.

80 Comments on Intel Teases Their Upcoming Graphics Cards for 2020

#51
StrayKAT
eidairaman1Bring S3, PowerVR/STMicro/VideoLogic back along with Hitachi SuperH!
Nostalgia time. My first PC had a "Cirrus Logic".. I think with a whopping 1MB.. or maybe 4? And a Cyrix CPU. :)
Posted on Reply
#52
scorpion_amd13
rtwjunkieRaja wasn’t brought in on the ground floor. This had already been under development for awhile before he left AMD.
That is likely, especially if the rumors about the cards being targeted at professionals pan out. But if Raja had nothing to do with developing the architecture, why bother bringing him aboard? It's not like his efforts at AMD led to the best GPUs they ever put out...

I'm very curious to know more about the architecture. That should clarify a lot of things.
Posted on Reply
#53
GoldenX
Bring Voodoo back! Stupid Nvidia didn't even get SLI right.
Posted on Reply
#54
StrayKAT
GoldenXBring Voodoo back! Stupid Nvidia didn't even get SLI right.
..And their cringy ads to boot. :P

Posted on Reply
#55
RejZoR
That's great news actually. Competition is great, but I fear that Intel might eat into AMD's market share and actually make things worse, turning it into a two-player game again, just with different players. We certainly don't want that to happen. So, I hope that NVIDIA continues with its success, that Intel does really well with its new graphics card, and that AMD makes a huge comeback as well. If all that happens, it'll be a golden era for gaming graphics. Assuming Intel is targeting that and not professionals only. Fingers crossed.
Posted on Reply
#56
dj-electric
RejZoRThat's great news actually. Competition is great, but I fear that Intel might eat into AMD's market share and actually make things worse, turning it into a two-player game again, just with different players. We certainly don't want that to happen. So, I hope that NVIDIA continues with its success, that Intel does really well with its new graphics card, and that AMD makes a huge comeback as well. If all that happens, it'll be a golden era for gaming graphics. Assuming Intel is targeting that and not professionals only. Fingers crossed.
This has a big chance of turning into a 2 player game again.
Posted on Reply
#57
azdesign
Next it will be Nvidia wanting to make their own CPU :laugh:
I'm still rooting for VIA though
Posted on Reply
#58
silentbogo
azdesignNext it will be Nvidia wanting to make their own CPU :laugh:
They kinda do.... for the past 8-10 years or so...
Posted on Reply
#59
Vya Domus
FordGT90ConceptIn fact, I wouldn't be surprised at all if Intel's card has no ROPs or TMUs (things mostly for gaming/rendering). Think Xeon Phi but instead of x86, it's a new SIMD architecture (a whole lot of shaders and not much else).
No matter how hard you try, if you don't include fixed-function hardware blocks like ROPs and TMUs, it will never succeed as a GPU, even if it's meant for professionals. Intel would be shooting themselves in the foot and making the same mistake yet again. They have to get it out of their head that they can just turn CPUs into GPUs.
Posted on Reply
#60
oxidized
scorpion_amd13There have always been brilliant minds at intel. That didn't stop them from utterly failing both times they tried to do a dedicated GPU.



Money doesn't solve everything. See intel's woes with the 10nm process. Also, see how their first two attempts at dedicated GPUs turned out. They need people who can focus people and spending in the right direction.

This being said, something's really not right here. It really hasn't been that long since Raja defected from AMD and went on to intel. Building an architecture from the ground up takes a while. It seems that intel are in a hurry to ship this thing, and that's never a good sign.

Even if the hardware part is done well, Intel still has to invest a lot in drivers. Their IGP drivers are the worst thing since square wheels. And they'll have significantly less than two years to make the drivers work well (they have to have the hardware first). That's a big warning sign.
Again, we can't know these facts; we can't know whether Koduri joined a project that was already underway. If they manage to release their cards in 2020 and they don't suck, it probably means they had been working on this for much longer than Koduri has been there. We just have to wait, but this is Intel we're talking about, and this time doesn't sound like the last time.
Posted on Reply
#61
FordGT90Concept
"I go fast!1!11!1!"
Vya DomusNo matter how hard you try, if you don't include fixed-function hardware blocks like ROPs and TMUs, it will never succeed as a GPU, even if it's meant for professionals. Intel would be shooting themselves in the foot and making the same mistake yet again. They have to get it out of their head that they can just turn CPUs into GPUs.
That's the point. Intel wants to be selling cards into the $2000+ market, which means compute products for corporate customers. What makes the most sense to me is packaging a monolithic SIMD architecture with an IGP embedded, so it's still technically a GPU but bad at real-time rendering (making it terrible for gaming, yet still very useful in that segment because of its performance at creating pre-renders).

The growth is in deep learning, neural networks, and AI. Intel can make a fortune targeting those clients without gamers.

We'll have to wait and see if Intel makes a similar announcement at Gamescom.
Posted on Reply
#62
stimpy88
You can certainly tell Raja is "working" there...

The hype train has departed to a station called Disappointment Ave...

He is another one like Peter Molyneux. Lots of grand, waffling speeches, and very little real-life content to match up to it after the dust settles, yet the media hail him as a visionary.
Posted on Reply
#63
krykry
stimpy88You can certainly tell Raja is "working" there...

The hype train has departed to a station called Disappointment Ave...

He is another one like Peter Molyneux. Lots of grand, waffling speeches, and very little real-life content to match up to it after the dust settles, yet the media hail him as a visionary.
Raja isn't the problem, they employed the marketing guy responsible for "POOR VOLTA", Chris Hook.
Posted on Reply
#64
Vayra86
stimpy88You can certainly tell Raja is "working" there...

The hype train has departed to a station called Disappointment Ave...

He is another one like Peter Molyneux. Lots of grand, waffling speeches, and very little real-life content to match up to it after the dust settles, yet the media hail him as a visionary.
krykryRaja isn't the problem, they employed the marketing guy responsible for "POOR VOLTA", Chris Hook.
This. This is what you're seeing here, and Intel has been searching hard for years for a new big business move.

ARM isn't really taking off, x86-64 is starting to look more difficult than it ever was, IoT is too small and insignificant (plus: full of privacy and security concerns), and everything else just isn't interesting for core business. Graphics, though... that is an area where the market is screaming for progress and where competition is lacking. And it's the unicorn Intel never really could catch in the right way.

Raja probably convinced Intel to take another shot. Of all people. I'm not holding out high hopes; all this is, is the right voice at the right time, with no product to really back it up.
Posted on Reply
#65
HD64G
In 2020, at best they will have a mid-range GPU compared to the competition of that time, and with mediocre drivers that will make their GPU seem even weaker and less efficient. Don't underestimate the driver aspect of a GPU: AMD was crippled for years until they managed to have great drivers, and many were buying nVidia because of drivers back then. To have real competition from the blue side in the GPU market (gaming-wise at least) we will have to wait at least until 2022-23, even for the middle class of performance. They won't get into it by offering cheap GPUs, imho. That would ruin the company's image as a market leader. Unless they get hurt heavily in the CPU market and desperately need market share in GPUs to save the day, as AMD did in the 2013-2017 period.
Posted on Reply
#66
CrAsHnBuRnXp
I foresee it happening like this:

AMD releases a new GPU. It's subpar like it always is.
Intel then releases their new GPU shortly after that (within a few months) and it's superior to AMD's but still falls slightly behind Nvidia's current offering at the time.
Nvidia releases their product within a six-month time frame of Intel's and completely demolishes both AMD and Intel, like they always seem to do, but we still pay a slightly higher premium for their cards because it's Nvidia, and they continue to dominate the GPU marketplace.
Posted on Reply
#67
bug
HD64GIn 2020, at best they will have a mid-range GPU compared to the competition of that time, and with mediocre drivers that will make their GPU seem even weaker and less efficient. Don't underestimate the driver aspect of a GPU: AMD was crippled for years until they managed to have great drivers, and many were buying nVidia because of drivers back then. To have real competition from the blue side in the GPU market (gaming-wise at least) we will have to wait at least until 2022-23, even for the middle class of performance. They won't get into it by offering cheap GPUs, imho. That would ruin the company's image as a market leader. Unless they get hurt heavily in the CPU market and desperately need market share in GPUs to save the day, as AMD did in the 2013-2017 period.
i740 contradicts pretty much everything you just said. Just sayin'.
Posted on Reply
#68
Vayra86
HD64GIn 2020, at best they will have a mid-range GPU compared to the competition of that time, and with mediocre drivers that will make their GPU seem even weaker and less efficient. Don't underestimate the driver aspect of a GPU: AMD was crippled for years until they managed to have great drivers, and many were buying nVidia because of drivers back then. To have real competition from the blue side in the GPU market (gaming-wise at least) we will have to wait at least until 2022-23, even for the middle class of performance. They won't get into it by offering cheap GPUs, imho. That would ruin the company's image as a market leader. Unless they get hurt heavily in the CPU market and desperately need market share in GPUs to save the day, as AMD did in the 2013-2017 period.
It's not just drivers you need, it's also an architecture that:

- uses die space very efficiently
- can be fully utilized for every type of workload (and remains efficient doing so)
- can work within a limited TDP budget (~250-300W tops)
- can scale well
- does delta compression and efficiently uses memory
- has a flexible clock range
- can mitigate heat/throttling issues while keeping performance

.... and another few dozen things we've not seen Intel do yet.
Posted on Reply
#69
Dr_b_
StrayKATLet them fail then, if you think they suck so much. :p

But they shouldn't stop before they even start. What if AMD had such a defeatist attitude 5 years ago?
GPUs aren't Intel's core competency, no pun intended. And if they can't get their business in order with their CPU manufacturing and design, as indicated by their recent marketing failures, lack of 10nm progress, or even architectural design, then this is truly just hype. It's a realistic assessment, not an attitude, based on their trajectory, past and present.

Intel has made a lot of expensive mistakes. For example, they bought McAfee. They also already tried and failed to make a dGPU. The problem is they are trotting out a lot of marketing BS now, and it's not credible. "In 2030 we are going to change the world!!111one" Sure you are.

They don't have a Steve Jobs to pull an iPhone out of his ass and save them; they have... Raja Koduri, and a lot of management issues. Maybe something has changed at Intel and they will be successful. If so, and the dGPU is better than everyone else's, I'll be right there standing in line with you waiting to get one. And maybe I'll see a unicorn today too.
Posted on Reply
#70
bug
Dr_b_GPUs aren't Intel's core competency, no pun intended. And if they can't get their business in order with their CPU manufacturing and design, as indicated by their recent marketing failures, lack of 10nm progress, or even architectural design, then this is truly just hype. It's a realistic assessment, not an attitude, based on their trajectory, past and present.

Intel has made a lot of expensive mistakes. For example, they bought McAfee. They also already tried and failed to make a dGPU. The problem is they are trotting out a lot of marketing BS now, and it's not credible. "In 2030 we are going to change the world!!111one" Sure you are.

They don't have a Steve Jobs to pull an iPhone out of his ass and save them; they have... Raja Koduri, and a lot of management issues. Maybe something has changed at Intel and they will be successful. If so, and the dGPU is better than everyone else's, I'll be right there standing in line with you waiting to get one. And maybe I'll see a unicorn today too.
If we forget they have been building integrated GPUs since forever and that they know best where their manufacturing stands, you are spot on :wtf:
Posted on Reply
#71
GoldenX
StrayKAT..And their cringy ads to boot. :p

That's better than most AMD ads.
Posted on Reply
#72
mtcn77
The only real truth to Intel's failure is a marketing one. Hardware just does what it is told to do. If Intel is to succeed, they need better software support, and on that front I'd say they can succeed.
Intel needed to promote Broadwell as a single-threaded specialty chip - like they did with the 7350K, posthumously. A 128MB L4 cache that almost negates any calls to memory should have been promoted as such; Intel's shortcoming isn't an engineering one.
Posted on Reply
#73
XiGMAKiD
Intel needs to provide a real benefit if they want people to buy their card, as Nvidia holds the top performance crown and AMD (usually) wins on perf/dollar. I doubt Intel is going to price their card with low margins, so it's not going to be a mainstream line; the high-end segment seems to be the best place for them to test their new card while maintaining good margins.
Posted on Reply
#74
mtcn77
XiGMAKiDIntel needs to provide a real benefit if they want people to buy their card, as Nvidia holds the top performance crown and AMD (usually) wins on perf/dollar. I doubt Intel is going to price their card with low margins, so it's not going to be a mainstream line; the high-end segment seems to be the best place for them to test their new card while maintaining good margins.
What they have is good software engineering. If they can leverage that and show continuous support for features, they have a winner.
PS: they have this, which I find interesting. Anything you can sample with a Gaussian you can use for centroid-domain antialiasing, which I find a good match for building a bilateral filter.
Any 2x antialiasing mode will automatically generate three taps. The intermediate boundary is the most prone to artifacting. Bilateral filters have no negative 'intermediary' lobes however much you sharpen, so they can seam any image together without incurring any 'shock' of their own. That makes them a good basis for post-processing an image - which GPUs naturally do. If you are into type or video filtering, bilateral is a must for preserving edge detail.
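For readers unfamiliar with the technique, here is a minimal single-channel bilateral filter sketch in Python with NumPy. It only illustrates the edge-preserving idea being described; the function name, window radius, and sigma values are illustrative assumptions and have nothing to do with Intel hardware or any specific antialiasing mode.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_space=2.0, sigma_range=0.1):
    """Edge-preserving smoothing of a float image in [0, 1]: each output pixel
    is a weighted average of its neighbours, where weights fall off with both
    spatial distance and intensity difference, so strong edges are not blurred
    across."""
    h, w = img.shape
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)

    # Precompute the spatial (Gaussian) part of the weights once.
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    spatial = np.exp(-(xx**2 + yy**2) / (2 * sigma_space**2))

    for y in range(h):
        for x in range(w):
            window = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            # Range weights: penalise neighbours whose intensity differs a lot
            # from the centre pixel, which is what preserves edges.
            rng = np.exp(-((window - img[y, x]) ** 2) / (2 * sigma_range**2))
            weights = spatial * rng
            out[y, x] = np.sum(weights * window) / np.sum(weights)
    return out

# Example: smooth a noisy gradient while keeping its hard edge intact.
if __name__ == "__main__":
    img = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
    img[:, 32:] += 0.5                            # hard edge down the middle
    noisy = img + np.random.normal(0.0, 0.05, img.shape)
    smoothed = bilateral_filter(noisy)
    print(smoothed.shape)
```

Because the range term all but zeroes the weight of neighbours on the other side of a strong edge, the noise is averaged away while the edge itself survives, which is the "no negative lobes, no shock" behaviour the post refers to.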
Posted on Reply
#75
StrayKAT
mtcn77What they have is good software engineering. If they can leverage that and show continuous support for features, they have a winner.
PS: they have this, which I find interesting. Anything you can sample with a Gaussian you can use for centroid-domain antialiasing, which I find a good match for building a bilateral filter.
Any 2x antialiasing mode will automatically generate three taps. The intermediate boundary is the most prone to artifacting. Bilateral filters have no negative 'intermediary' lobes however much you sharpen, so they can seam any image together without incurring any 'shock' of their own. That makes them a good basis for post-processing an image - which GPUs naturally do. If you are into type or video filtering, bilateral is a must for preserving edge detail.
I don't doubt you know your stuff, but every time you talk, you're way over my head. ;)

Posted on Reply