
Intel Could Unveil its Graphics Card at 2019 CES

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
It looks like Intel is designing its discrete graphics processor at a breakneck pace, with a team put together by Raja Koduri. Development is moving so fast that the company could be ready with a working product to show the world by the 2019 International CES, held in early January next year. Intel's development of a graphics processor is likely motivated by the company's survival instinct not to fall behind NVIDIA and AMD in making super-scalar architectures that cash in on two simultaneous tech booms: AI and blockchain computing.

A welcome side effect for gamers is the restoration of competition. NVIDIA has been ahead of AMD in PC graphics processor performance and efficiency since 2014, with the latter only playing catch-up in the PC gaming space. AMD's architectures have proven efficient in other areas, such as blockchain computing. NVIDIA, on the other hand, has invested heavily in AI, with specialized components on its chips called "tensor cores," which accelerate neural-net building and training.



 
Joined
Aug 13, 2010
Messages
5,380 (1.08/day)
Seems very, very soon. I would guess they might show more of a proof of concept, with possibly an H2 2019 launch?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
26,958 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Just wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)
 
Joined
Aug 13, 2010
Messages
5,380 (1.08/day)
Just wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)

Their actions suggest they are very ambitious about this one. I would believe that Intel has the capability of making decent, competitive products in this segment. They definitely have the money, manpower and tools.

It's just that I don't think they can do it this fast. I would believe a CES 2020 paper launch, not a 2019 one.
And sure, Intel knows how to lose battles. They do almost every other month in some segments.

If a product does see the light of day in 7 months, it might have an awkward start, but this trigger needs to happen for the next ones to be a lot better. Hopefully this is a race they can even attend.
 
Joined
Nov 18, 2010
Messages
7,112 (1.46/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
IMHO they will fail...

But as always, they will salvage some parts of the development into their CPU architecture, as with the ring bus for example. So it won't be a complete loss if it fails.

I am afraid they will go the emulation route, mending the handicap with brute force...
 
Joined
Oct 22, 2014
Messages
13,210 (3.83/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
I can't see why they can't pull one out of the hat. They've been making co-processors for data centres and the like for years; surely adding a display output to them can be done, as NVIDIA has on some of their top-tier cards.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Well, if you read the source:

TweakTown said:
My sources are telling me to expect something late this year with all attention to be placed on Intel at CES 2019 in January, where Intel could unveil their new GPU.

"Something" could mean just about anything: CPU, chipset, mobile, automotive, Larrabee 2, Knights Hill. That quote is the foundation of the entire article's speculation.

Also, you have to wonder on what process. 10 nm isn't working out, so is it going to be their 14 nm when others will be using 7 nm?
 
Last edited:
Joined
Feb 13, 2012
Messages
522 (0.12/day)
Just wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)
They've been making GPUs for a while now, so they have probably learned enough about the business model. As for failing, you technically can't fail at making a discrete GPU, because GPUs are parallel and shortcomings can be compensated with more cores, as long as it's within reason. Since they already have somewhat decent GPU IP, it's just a matter of scaling their design up. Yes, Intel graphics are behind AMD and NVIDIA, but that's also because they don't make their chips big enough. An Intel i7-8700K with 24 EUs has a die size of 155 mm², with about a third of the chip devoted to graphics. Another aspect is that Intel has proven to be efficient with power usage, since these EUs are designed with mobility in mind. The challenge now is scaling from about 50 mm² worth of graphics to 250 mm²+.

AMD requires 484 mm² worth of Vega cores to compete with 300 mm² Pascal chips, and it isn't exactly failing, except at the high end where it can't scale any further due to cost. Intel could do the same thing initially.
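The die-area argument above is simple enough to sanity-check with arithmetic (the figures are the estimates quoted in this post, not official Intel die measurements):

```python
# Back-of-the-envelope EU scaling, using the numbers quoted above
# (poster's estimates, not official die measurements).

DIE_8700K_MM2 = 155     # claimed total i7-8700K die area
GPU_FRACTION = 1 / 3    # roughly a third of the die is graphics
EUS_8700K = 24          # execution units in that graphics block

gpu_area = DIE_8700K_MM2 * GPU_FRACTION   # ~52 mm2 of graphics
area_per_eu = gpu_area / EUS_8700K        # ~2.2 mm2 per EU

# Naive linear scale-up to a 250 mm2 discrete part:
TARGET_MM2 = 250
est_eus = TARGET_MM2 / area_per_eu        # ~116 EUs

print(f"{gpu_area:.0f} mm2 graphics, {area_per_eu:.2f} mm2/EU, "
      f"~{est_eus:.0f} EUs in a {TARGET_MM2} mm2 die")
```

Linear scaling ignores memory controllers, display I/O and interconnect overhead, so a real chip would fit fewer EUs per mm², but it shows the rough magnitude being discussed.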
 
Joined
Feb 19, 2009
Messages
1,151 (0.21/day)
Location
I live in Norway
Processor R9 5800x3d | R7 3900X | 4800H | 2x Xeon gold 6142
Motherboard Asrock X570M | AB350M Pro 4 | Asus Tuf A15
Cooling Air | Air | duh laptop
Memory 64gb G.skill SniperX @3600 CL16 | 128gb | 32GB | 192gb
Video Card(s) RTX 4080 |Quadro P5000 | RTX2060M
Storage Many drives
Display(s) M32Q,AOC 27" 144hz something.
Case Jonsbo D41
Power Supply Corsair RM850x
Mouse g502 Lightspeed
Keyboard G913 tkl
Software win11, proxmox
Benchmark Scores 33000FS, 16300 TS. Lappy, 7000 TS.
Just wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)

I'm pretty sure their Linux support will be good.
They can jump right into the server market with OpenCL, with an alliance with AMD on the software stack.
 
Joined
Feb 13, 2012
Messages
522 (0.12/day)
I'm pretty sure their Linux support will be good.
They can jump right into the server market with OpenCL, with an alliance with AMD on the software stack.
That would actually be good for both of them. Someone definitely needs to put CUDA in its place.
 
Last edited:

dorsetknob

"YOUR RMA REQUEST IS CON-REFUSED"
Joined
Mar 17, 2005
Messages
9,105 (1.31/day)
Location
Dorset where else eh? >>> Thats ENGLAND<<<
Speculation (as at the moment that's all we can do):
An Intel discrete graphics card combined with the iGPU could kick ass (running Intel's version of AMD's Hybrid CrossFire setup).
 
Joined
Aug 13, 2010
Messages
5,380 (1.08/day)
Speculation (as at the moment that's all we can do):
An Intel discrete graphics card combined with the iGPU could kick ass (running Intel's version of AMD's Hybrid CrossFire setup).

I would much like something like what exists in laptops: a dGPU with its own fast memory and all, working solo on 3D rendering, and the iGPU taking over when the system is at complete idle, using the dGPU's outputs on the same system.

That is, on mainstream platforms. Of course, us HEDT users would have to use the dGPU even at idle :)
 
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Actually, with the resources they have, and with the ex-ATI staff already under their roof, I have a feeling we are going to see a surprise from Intel. They have their own fabs, all resources are in-house; I think they are going for a killer product, even if initially it will be for the professional market only.
 
Joined
Jun 25, 2010
Messages
854 (0.17/day)
I don't consider this fast. As has been said, Intel attempted this before, so they know what's involved. Not to mention they have been making iGPUs forever; there's not much of a leap from that to a dedicated GPU. With all their different products, I would assume they have nearly everything they need to create a dedicated GPU in a reasonable time frame. I mean, they even have their own fabs.

As W1zzard stated, this might not be for the consumer market, but I can't see it not being a possibility. Their driver does need work, but the foundation is there, and the same goes for the hardware side of the equation. As for whether they will fail, well, that's just speculation; any number of infinite possibilities can happen. I just hope for more competition in this space, I am tired of the same two after all this time. Good luck buying out this competitor, NVIDIA; this is no 3dfx. :p
 
Last edited:
Joined
Jun 21, 2011
Messages
165 (0.04/day)
Just wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)

The main question I want answered is: what type of GPU is Intel developing (and why are we still calling certain chips Graphics Processing Units)?

Intel's earlier failures with graphics development had mostly to do with the resources required to sustain long-term development cycles, and with "political will" at Intel, because the "old" Intel was a CPU company. The current Intel has diversified its portfolio and can do whatever it wants (vide the asinine McAfee deal).

It all comes down to "how long a deadline and how many resources did they give Raja?". While Raja's experience has been desktop GPUs, I wouldn't put it past Intel to repurpose a poor design just for the sake of saving the product, and end up with a crappy AI core.

I am not very impressed by Raja, tbh. His experience at AMD (even though you can't peg him entirely for the VEGA design) shows he oversells his product.
 
Joined
Jul 16, 2014
Messages
8,115 (2.29/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Artic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse Steeseries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
Speculation (as at the moment that's all we can do):
An Intel discrete graphics card combined with the iGPU could kick ass (running Intel's version of AMD's Hybrid CrossFire setup).
I would speculate that Intel is ditching IGP to favor discrete.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Actually, with the resources they have, and with the ex-ATI staff already under their roof, I have a feeling we are going to see a surprise from Intel. They have their own fabs, all resources are in-house; I think they are going for a killer product, even if initially it will be for the professional market only.

Indeedy, it will be great to have a third player enter the market and provide more competition.

More freedom and choice is great.
 

iO

Joined
Jul 18, 2012
Messages
526 (0.12/day)
Location
Germany
Processor R7 5700x
Motherboard MSI B450i Gaming
Cooling Accelero Mono CPU Edition
Memory 16 GB VLP
Video Card(s) AMD RX 6700 XT
Storage P34A80 512GB
Display(s) LG 27UM67 UHD
Case none
Power Supply SS G-650
Maybe a small chip to replace Vega M in Kaby Lake-G, or an AI-specific accelerator or the like.
But not a full-blown big GPU out of nowhere that's competitive with NVIDIA.
 
Joined
Jan 8, 2017
Messages
8,863 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Just wondering, isn't anybody considering that Intel might just fail again at this?

If this really is a consumer product, then I'm pretty much convinced it will fail. But as you said, I am much more inclined to believe this is meant for datacenters; Intel really doesn't like how NVIDIA is eating away at server-space sales.

They've been making GPUs for a while now, so they have probably learned enough about the business model.

They really didn't learn much; some of their decisions are absolutely baffling, such as this: http://www.joshbarczak.com/blog/?p=667

Why they would put so much effort into hardware functionality that is avoided like the plague by the graphics industry is beyond comprehension.

Yes, Intel graphics are behind AMD and NVIDIA, but that's also because they don't make their chips big enough.

Even something of similar size from ARM or Qualcomm would wipe the floor with these things in pretty much every relevant category, including power efficiency (though to be fair, it would beat AMD/NVIDIA as well). Their architecture simply performs poorly and is optimized for things no one needs, as pointed out above. I don't know what the cause of their failure to implement competitive GPUs in this segment is; it's either a lack of money allocated to it, or a lack of talented people. I'm tempted to believe it's the latter, and the addition of Raja won't be enough as far as I am concerned.
 
Last edited:
Joined
Aug 11, 2014
Messages
866 (0.25/day)
Processor ryzen 5 5600x
Motherboard AB350m Pro4
Cooling custom loop
Memory TEAMGROUP T-Force TXKD416G3600HC18ADC01 16gbs XMP
Video Card(s) HP GTX1650 super 4gb
Storage MZVLB256HBHQ-000H1 PM981a (256GB)/3TB HDD
Display(s) Nitro XF243Y Pbmiiprx
Case Rosewill CULLINAN
Audio Device(s) onboard
Power Supply Corsair 750w
Mouse Best Buy Insignia
Keyboard Best Buy Insignia
Software Win 10 pro
This does not surprise me. Intel already has graphics cores on their CPUs; the problem is they were limited in how powerful they could be, because they had to run next to a CPU and pull all their power from the socket. All they really have to do is add more shaders and GDDR5X or GDDR6. It is not impossible for them to add more execution units to their GT3e and triple the pipelines. This, with faster RAM and a greatly increased power threshold, is well within reason. The added power headroom will also allow them to increase core clocks. Power-wise it should be around 120-135 W, so it might not be the most power-saving, but it could still be a good entry- to mid-range card.

Edit: not to mention they will have refined the process used on their current integrated GPUs for the upcoming discrete ones. Plus, stuff like this just was not thought up yesterday; they have been planning it for a few years and have already worked out some of the issues. I won't be surprised if this lands in the RX 560 - GTX 1060 category of cards. It's also not surprising that they are waiting for GDDR6 to come into full production by the time they show this.
 
Last edited:
Joined
Sep 15, 2016
Messages
473 (0.17/day)
The hiring of Raja and this product's development were likely planned at least a year in advance, with the intention of releasing a product by CES 2019... large tech companies like Intel aren't giving away all of their secrets in press releases.

I'm less concerned with the performance or adoption of an Intel GPU versus the consumer/professional options from its competitors, and more with how a MAJOR change in direction (from onboard to discrete) will lead to localized optimizations.

Xeon Phi - absolves the memory/thread wall (MIC), locally
3D XPoint - absolves RAM limits / storage latency (Micron joint venture)
Intel ??? GPU - absolves parallel-bus latency through firmware or driver optimizations

If the first iteration of Intel's discrete GPU doesn't scare its competition... the second or third will.
 
Last edited:
Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I can see a low-end consumer card with enhanced performance over their iGPU, but not a gaming-competitive one. As I have said before, I expect more of an application accelerator than a mere GPU, to push edge use cases and possibly be an AI accelerator too. It certainly would be strange, IMHO, if Intel just made a straight discrete GPU. To me, AMD and NVIDIA are way too far ahead to catch in graphics, but not in other areas, i.e. FPGAs, AI and database acceleration, and Intel would have a better chance if they can leverage their own special sauce into the mix and have it become integral, like GPUs themselves. No one else can even compete on that level at this point. And that USP could get them selling.

Has no one mentioned the totally rubbish rendering of 3D objects on Intel? I have seen three-way Intel/AMD/NVIDIA film and game rendering comparisons that shine a light on the rather poor game reproduction on Intel hardware; they do films fine, to be fair.
 
Joined
Apr 18, 2015
Messages
234 (0.07/day)
It's just that I don't think they can do it this fast. I would believe a CES 2020 paper launch, not a 2019 one.

Why would you think that?
They already have working GPUs, working drivers, and all the infrastructure and knowledge in place; and don't forget they have been doing GPUs for 20 years.
It's just scaling everything to a whole new level.
From a programming perspective, going from 1 core to 2 cores is hard; however, once your software works with 20, switching to 2,000 is just a parameter change. (They have 20-40 EUs in their current integrated GPUs.)
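To illustrate the kernel of truth in that claim, here is a toy sketch (not a real GPU programming model): in a data-parallel style where each work item is independent, the worker count really is just a launch parameter. What the toy ignores is everything that makes real scaling hard: memory bandwidth, interconnect and scheduling.

```python
# Toy data-parallel "kernel": each element is processed independently,
# so the degree of parallelism is literally just a launch parameter.
from concurrent.futures import ThreadPoolExecutor

def kernel(x):
    # per-element work with no cross-worker communication
    return x * x

def launch(data, workers):
    # same code runs with 2 workers or 2000; only the parameter changes
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(kernel, data))

if __name__ == "__main__":
    data = list(range(1_000))
    # identical results regardless of the worker count
    assert launch(data, 2) == launch(data, 20) == [x * x for x in data]
```

Whether that observation transfers to scaling real silicon is exactly what the rest of the thread disputes.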
 
Joined
Jan 8, 2017
Messages
8,863 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
From a programming perspective, going from 1 core to 2 cores is hard; however, once your software works with 20, switching to 2,000 is just a parameter change. (They have 20-40 EUs in their current integrated GPUs.)

It doesn't work like that at all, whether you are referring to GPUs or CPUs.
 
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
One stupid question, and apologies in advance for it... :D
Aren't shader cores proprietary to NVIDIA and AMD only? If so, how can Intel develop a new GPU without those? Sorry, just asking...
 