
Intel Powers-on the First Xe Graphics Card with Dev Kits Supposedly Shipping

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,232 (0.91/day)
Intel is working hard to triumphantly bring its first discrete GPU lineup to market, after years of past efforts to launch such a lineup ended in failure. During its Q3 earnings call, some exciting news was presented, with Intel's CEO Bob Swan announcing that "This quarter we've achieved power-on exit for our first discrete GPU DG1, an important milestone." By "power-on exit", Mr. Swan refers to post-silicon debug techniques that involve putting a prototype chip on a custom PCB for testing and seeing if it works/boots. With a successful test, Intel now has a working product capable of running real-world workloads and software that is almost ready for sale.

Additionally, the developer kit for the "DG1" graphics card is supposedly being sent to various developers around the world, according to Eurasian Economic Commission (EEC) listings. Called the "Discrete Graphics DG1 External FRD1 Accessory Kit (Alpha) Developer Kit", this bundle is marked as a prototype in the alpha stage, meaning that the launch of discrete Xe GPUs is only a few months away. This confirms previous rumors that Xe GPUs will launch sometime in mid-2020, possibly in the July/August time frame.


 
Joined
Jul 16, 2014
Messages
8,118 (2.27/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Artic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse Steeseries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
if only they would put the GPU ...


:rockout:RIGHT SIDE UP! :roll:
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Seen rumors mid-range Xe is supposed to run multi-GPU with the new Iris Pro.
Would be a killer move, AMD should have done it a long time ago.
 
Last edited:
Joined
Aug 12, 2019
Messages
1,726 (1.00/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 arous master
Cooling corsair h150i
Memory 4x8 3200mhz corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500gb Samsung 970 Evo PLus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
I haven't been following Intel on releasing the discrete GPU.
Are these consumer or workstation GPUs, or something else?
 
Joined
Feb 11, 2009
Messages
5,403 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Seen rumors mid-range Xe is supposed to run multi-GPU with the new Iris Pro.
Would be a killer move, AMD should have done it a long time ago.

Ermm, why and how?
As the article mentions, this has been claimed before (heck, there have been multiple iterations of the idea of multiple different GPUs working together), but it has never actually come to anything good/usable.

Next to that... AMD should have done this a long time ago... with the APUs? Why?
The more powerful CPUs from AMD don't have a built-in GPU, and those that do are fairly low-end, meant for office work or laptops etc., where it makes sense.
To pair that with a dedicated GPU is a bit silly, really.

On the article itself, more non-news from Intel it seems:
Intel Marketing: "We won't let you forget us"
 
Last edited:
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Seen rumors mid-range Xe is supposed to run multi-GPU with the new Iris Pro.
Would be a killer move, AMD should have done it a long time ago.
AMD has done a lot of R&D on this, and so have Nvidia and Intel. There are fundamental issues with how GPUs work today that prevent this from being effective for gaming (and other real-time rendering applications). Either someone has to come up with a new approach (and collectively they have tried a lot of things) or something has to change about how graphics are rendered/computed - a new paradigm and APIs, probably.

Rumors are pretty much always there, as there is constant research going on around multi-GPU solutions. Crossfire and SLI were/are the most efficient methods for now. Again, this is for gaming and other real-time rendering applications. GPGPU and similar (non-real-time rendering) applications, especially data centers and distributed GPGPU with lower latency/time dependency, are much easier and have largely been figured out for a long while (research into improving these has obviously not stopped and is ongoing, and things are better in that area as well; the pain points and bottlenecks are simply different).
 
Low quality post by fancucker
Joined
Oct 25, 2019
Messages
203 (0.12/day)
Hopefully the final nail in AMD's coffin and the first true Nvidia competitor after years of inefficient, regressive architectures. RDNA can barely compete against 12nm with a new node, and even AMD's ashamed to release a hot 5800XT
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Seen rumors mid-range Xe is supposed to run multi-GPU with the new Iris Pro.
Would be a killer move, AMD should have done it a long time ago.
Only Vulkan and D3D12 can do that, but virtually no developers take the time to implement it. As far as I know, only Sniper Elite 4 supports it. And it's hardware agnostic: it would work across an NVIDIA GPU, an AMD GPU, and Iris Pro integrated graphics if all were in one machine.
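For a rough picture of what that hardware-agnostic path looks like from the developer's side, here is a minimal C++ sketch (my own illustration under D3D12, not anything from Sniper Elite 4 or Intel) of the very first step of explicit multi-adapter: enumerate every adapter DXGI exposes and create an independent device on each one, discrete or integrated. Everything after this point, from splitting the work to copying and synchronizing the results, is the application's job.

[CODE]
// Minimal explicit multi-adapter starting point: one ID3D12Device per physical GPU.
// Link against d3d12.lib and dxgi.lib. Error handling trimmed for brevity.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDeviceOnEveryAdapter()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)      // skip the WARP software rasterizer
            continue;

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);                    // NVIDIA, AMD or Intel - vendor doesn't matter here
    }
    // From here on the engine owns everything: per-device queues, cross-adapter
    // heaps, fences, and the decision of which GPU renders what.
    return devices;
}
[/CODE]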
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
On the article itself, more non-news from Intel it seems:
Intel Marketing: "We won't let you forget us"
As long as Intel brings more competition to the GPU space, we should be happy. When they said they were bringing out GPUs, we knew that it would take a couple of years or more. Some progress reports - especially significant ones like this - are definitely welcome.
Only Vulkan and D3D12 can do that, but virtually no developers take the time to implement it. As far as I know, only Sniper Elite 4 supports it.
A simplified version of the problem is that while Crossfire and SLI (and their DX12 counterpart, implicit mGPU) do multi-GPU largely automatically and the developer only needs to enable it and test it to make sure they avoid a bunch of gotchas, doing this in Vulkan or DX12 explicit mGPU requires the developer to basically implement it themselves.
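To make that last distinction concrete, below is a small, hypothetical Vulkan sketch of what "the developer basically has to implement it" starts with: the API will happily enumerate the dGPU and the iGPU and give you a logical device for each, but nothing beyond that is automatic (queue family 0 is assumed to support graphics purely to keep the example short).

[CODE]
// Enumerate all GPUs and create one independent VkDevice per GPU.
// Splitting the frame between them, moving data across and keeping them in
// sync is entirely the application's problem - there is no driver-side AFR here.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main()
{
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    VkInstance instance;
    vkCreateInstance(&ici, nullptr, &instance);

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    std::vector<VkDevice> devices;
    for (VkPhysicalDevice gpu : gpus)
    {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("found GPU: %s\n", props.deviceName);

        float priority = 1.0f;
        VkDeviceQueueCreateInfo qci{VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO};
        qci.queueFamilyIndex = 0;              // simplification: assume family 0 does graphics
        qci.queueCount       = 1;
        qci.pQueuePriorities = &priority;

        VkDeviceCreateInfo dci{VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO};
        dci.queueCreateInfoCount = 1;
        dci.pQueueCreateInfos    = &qci;

        VkDevice device;
        if (vkCreateDevice(gpu, &dci, nullptr, &device) == VK_SUCCESS)
            devices.push_back(device);         // no implicit sharing between these devices
    }
    return 0;
}
[/CODE]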
 
Joined
Feb 11, 2009
Messages
5,403 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Hopefully the final nail in AMD's coffin and the first true Nvidia competitor after years of inefficient, regressive architectures. RDNA can barely compete against 12nm with a new node, and even AMD's ashamed to release a hot 5800XT

ok slow down there alt account, I know it does not matter if this one gets banned because you have your main account and can just make a new alt account, but still, have a bit more respect for yourself.

As long as Intel brings more competition to the GPU space, we should be happy. When they said they were bringing out GPUs, we knew that it would take a couple of years or more. Some progress reports - especially significant ones like this - are definitely welcome.

Idk, to me this is just non-news; if you want an update for every little screw they put into the card, then we can have a LOT of articles about this.

We knew it was coming, we knew the expected release date, and... nothing has changed except that we know it's coming a bit more certainly now? Yay.

Delays would be news, but reinforcing what we already know really isn't, unless there were previous rumors that the originally expected date was not going to be met.
 
Joined
Jan 8, 2017
Messages
8,942 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
It was possible a long time ago to do Crossfire using the GPU in the APU. This isn't new.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
A simplified version of the problem is that while Crossfire and SLI (and their DX12 counterpart, implicit mGPU) do multi-GPU largely automatically and the developer only needs to enable it and test it to make sure they avoid a bunch of gotchas, doing this in Vulkan or DX12 explicit mGPU requires the developer to basically implement it themselves.
Crossfire/SLI work completely differently from what Sniper Elite 4 has. Namely, they send all of the data to all of the cards, each card renders part of the scene, and then the parts are put together and displayed. It's an extremely inefficient way to do it and severely punishes the better card when paired in an asymmetric configuration.

What Vulkan and D3D12 can do is schedule work based on warp load. An Xe card, for example, could get 25 frames scheduled per second while the Iris Pro gets 5. I'm not entirely sure how that works in regards to VRAM, but all of the GPUs work on a need-to-know basis rather than cloning everything like SLI/Crossfire do. It's much more efficient from a performance and power consumption perspective.
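A toy sketch of that kind of load-aware scheduling (my own illustration with made-up frame times, not how any shipping engine or Intel driver does it): each new frame goes to whichever GPU will be free soonest, so a fast Xe card naturally ends up with roughly 25 of every 30 frames and the Iris Pro with about 5.

[CODE]
#include <array>
#include <cstdio>

struct Gpu {
    const char* name;
    double avgFrameMs;        // rolling average frame time measured on this GPU
    int    framesIssued = 0;
};

// Pick the GPU whose queue drains first: smallest (frames issued) x (time per frame).
int pickGpu(const std::array<Gpu, 2>& gpus)
{
    double best = 1e30;
    int bestIdx = 0;
    for (int i = 0; i < 2; ++i) {
        double busyUntil = gpus[i].framesIssued * gpus[i].avgFrameMs;
        if (busyUntil < best) { best = busyUntil; bestIdx = i; }
    }
    return bestIdx;
}

int main()
{
    std::array<Gpu, 2> gpus{{ {"Xe dGPU",       8.0},      // ~125 fps on its own (made-up number)
                              {"Iris Pro iGPU", 40.0} }};  //  ~25 fps on its own (made-up number)

    for (int frame = 0; frame < 30; ++frame)
        ++gpus[pickGpu(gpus)].framesIssued;

    for (const Gpu& g : gpus)
        std::printf("%s rendered %d of 30 frames\n", g.name, g.framesIssued);
    // Prints 25 vs 5 - the split follows the ratio of the two frame rates,
    // which is exactly what a cloned SLI/Crossfire setup cannot do.
}
[/CODE]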
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
@FordGT90Concept, exactly. The key here is that this work scheduling has to be managed by the developer.
I have not looked very closely into what exactly the couple of available explicit mGPU games do in terms of memory usage (and I do not have Crossfire or SLI right now), but VRAM is not the biggest concern with it right now.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Yup, and despite that being available for many years now, only Sniper Elite 4 does it as far as I know. So unless Intel has something new up their sleeve, this isn't exciting at all.

VRAM is high bandwidth and high latency. In GPU design, VRAM performance is the number one factor they have to engineer around. On every clock in which the VRAM is able to respond, they have to have all of their requests queued while still (hopefully) executing on previous data. Anything that hinders VRAM performance can severely reduce work throughput.
 
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
In before devs start making fun of Intel.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Scheduling 25 frames on the dGPU and 5 on the iGPU is a tried-and-true approach and pretty much what Crossfire/SLI do, minus differences in scheduling. Intel actually had the better (or at least more innovative) idea a few years back when they were talking about mGPU stuff: getting the iGPU to do postprocessing effects. Basically the dGPU does most of the frame and the last few postprocessing effects are done on the iGPU. These effects need fewer resources (textures, buffers, basically things in memory). It didn't work all that well, but the idea is sound. Managing the data, scheduling and performance levels of different GPUs throughout this is still a pain though.

I think this is getting off the main topic, sorry.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Scheduling 25 frames on the dGPU and 5 on the iGPU is a tried-and-true approach and pretty much what Crossfire/SLI do, minus differences in scheduling.
It is not. The dedicated GPU would produce frames at roughly the same rate the iGPU does, so you'd get about a 90% uplift over iGPU performance. If the dedicated GPU is capable of producing frames faster than 190% of the iGPU's performance, then you're better off leaving Crossfire/SLI disabled. As I said, Crossfire/SLI split the screen, each card renders its area of the screen, and the parts are then put together. They cannot take advantage of asymmetric hardware.
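For what it's worth, that 190% figure checks out on the back of an envelope if you assume an even SFR split where each card renders half of every frame and the slower card sets the pace; the little C++ sketch below (with made-up frame rates) just walks through the arithmetic.

[CODE]
#include <algorithm>
#include <cstdio>

int main()
{
    double igpuFps = 30.0;   // iGPU alone (hypothetical)
    double dgpuFps = 70.0;   // dGPU alone (hypothetical)

    // Even split: each card only renders half a frame, but the pair can only
    // finish a frame once the slower half is done.
    double pairFps = 2.0 * std::min(igpuFps, dgpuFps);   // = 60 fps here

    // Break-even is the dGPU at ~2x the iGPU (the "190%" once you allow for
    // overhead): this 70 fps dGPU is already faster on its own than the pair.
    std::printf("SFR pair: %.0f fps, dGPU alone: %.0f fps\n", pairFps, dgpuFps);
}
[/CODE]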


Post processing effects do make some sense.
 
Joined
Oct 28, 2019
Messages
228 (0.14/day)
AMD has done a lot of R&D on this, and so have Nvidia and Intel. There are fundamental issues with how GPUs work today that prevent this from being effective for gaming (and other real-time rendering applications). Either someone has to come up with a new approach (and collectively they have tried a lot of things) or something has to change about how graphics are rendered/computed - a new paradigm and APIs, probably.

Rumors are pretty much always there, as there is constant research going on around multi-GPU solutions. Crossfire and SLI were/are the most efficient methods for now. Again, this is for gaming and other real-time rendering applications. GPGPU and similar (non-real-time rendering) applications, especially data centers and distributed GPGPU with lower latency/time dependency, are much easier and have largely been figured out for a long while (research into improving these has obviously not stopped and is ongoing, and things are better in that area as well; the pain points and bottlenecks are simply different).

Uhm, but since both SLI and Crossfire scale to something like 90-99% with dual GPUs in several games (so the potential is there), what is this R&D work focused on? Making the scaling 100% all the time in all games?
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
It is not. The dedicated GPU would produce frames at roughly the same rate the iGPU does, so you'd get about a 90% uplift over iGPU performance. If the dedicated GPU is capable of producing frames faster than 190% of the iGPU's performance, then you're better off leaving Crossfire/SLI disabled. As I said, Crossfire/SLI split the screen, each card renders its area of the screen, and the parts are then put together. They cannot take advantage of asymmetric hardware.
Crossfire and SLI do either AFR or SFR. With current rendering technologies, AFR seems to be the more common method of choice. Both work at the level of a frame (or two frames). The 25-frames-on-dGPU and 5-on-iGPU example works at a higher level and is effectively the same solution but with better scheduling. Scheduling in this case is a more complex problem for asymmetric hardware than it seems at first glance.
Uhm, but since both SLI and Crossfire scale to something like 90-99% with dual GPUs in several games (so the potential is there), what is this R&D work focused on? Making the scaling 100% all the time in all games?
It works in some games and fails in many others. SLI/Crossfire users can create custom profiles and tweak settings to make games work better, but even that does not always help. An mGPU solution needs to be more reliable in terms of which games it works in.

Currently there are games that scale perfectly, but there is an increasing number of games where it does not scale or scales very poorly. That takes away a lot of the value.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Hey, I have an idea, why don't you start a GPU company and make something that bests the lot of them, for a quarter of the price? Everyone here would be eternally grateful to you...
Adding to this - IP and patents are going to be a very serious problem for any new player in the GPU market. It might be worth checking the various lawsuits between companies making GPUs. AMD, Intel and Nvidia are all covered with extensive cross-licensing agreements. Other vendors have been mostly bought up and surviving separate ones are generally in the mobile GPU market - ARM's Mali, Qualcomm's (old AMD-sourced) Adreno and Imagination's PowerVR.
 
Joined
Mar 14, 2018
Messages
76 (0.03/day)
It may not be competition for high-end cards like the 5700 XT or 2070, but just small graphics aimed at use in notebooks.

As for the release date, this is first silicon, and it may require a few revisions before going to shops. Each revision takes some 5 months (3 months for manufacturing, 2 months for analyzing and preparing the next revision).
 
Joined
Apr 24, 2019
Messages
182 (0.10/day)
My guess is their top-tier dGPU will be RX 570 tier... which is not bad for a first attempt. I just want AMD to make an RX 580 tier APU (1080p 60 fps)... do it and 80% of gamers will forget about dGPUs.
 
Joined
Jan 8, 2017
Messages
8,942 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Adding to this - IP and patents are going to be a very serious problem for any new player in the GPU market.

Actually I am wondering why no one has tried licensing ARM Mali GPUs and making something custom with them.
 
Joined
Dec 28, 2006
Messages
4,378 (0.69/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
What's the improvement over the i740? Why hasn't Intel told us? I want to know how much of a performance uplift there is from their last dedicated card.
 