Saturday, June 4th 2022

Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

The Intel Extreme Masters (IEM) event is currently taking place in Dallas, and at the event Intel had one of its Arc Limited Edition graphics cards on display. It's unclear if it was a working sample or just a mockup, as it wasn't running in a system or even mounted inside one. Instead, it seems Intel thought it was a great idea to mount the card standing up on its port side inside an acrylic box, on top of a rotating base. The three pictures snapped by @theBryceIsRt and posted on Twitter don't reveal anything we haven't seen so far, except the placement of the power connectors.

It's now clear that Intel has gone for a typical placement of the power connectors, and the card in question has one 8-pin and one 6-pin power connector. In other words, Intel will not be using the new 12-pin power connector that is expected to be used by most next-generation graphics cards. We should mention that @theBryceIsRt is an Intel employee and, according to his Twitter profile, the Intel Arc community advocate, so the card wasn't just spotted by some passerby. Intel has not yet revealed any details as to when it's planning on launching its Arc-based graphics cards.
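For a rough sense of what that connector layout implies, the standard PCIe limits are 75 W from the x16 slot, 75 W from a 6-pin connector and 150 W from an 8-pin connector, which works out to a ceiling of roughly 300 W of board power; this is an assumption based on the PCIe spec, not a figure Intel has confirmed for this card. A minimal sketch of the arithmetic in Python:

# Rough board-power ceiling implied by the pictured connector layout.
# Per-source limits are the standard PCIe values (assumed here, not
# confirmed by Intel for this specific card).
PCIE_SLOT_W = 75     # power deliverable through the PCIe x16 slot
SIX_PIN_W = 75       # one 6-pin PCIe power connector
EIGHT_PIN_W = 150    # one 8-pin PCIe power connector

ceiling_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Implied board-power ceiling: {ceiling_w} W")  # prints 300 W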
Source: @theBryceIsRt

108 Comments on Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

#51
defaultluser
aQiOver the course of the years, Nvidia has developed impressive harmony between its hardware and software. Likewise, AMD has done the same.
Intel has excellent hardware and proper innovation throughout its architectural history, yet it lacks software harmony. This is where they are making sure their Arc series makes a breakthrough, and why they have been delaying the release for the last 4+ months.
A common example of what I'm saying is the power consumption vs. performance graphs. Intel graphics eat watts but produce lower numbers compared to Nvidia or AMD cards at the same power draw.
Intel has produced working GPUs by now and has even showcased the Limited Edition, but it is still going through the phase of software-to-hardware harmony. They are working exceptionally hard on software/drivers to harness the power of their power-hungry hardware.
I believe the blue team will definitely make a difference in the market, especially against those Nvidia and AMD cards with hardware issues that leave users to throw them away and buy new. At least Intel hardware should be superior from a general point of view.
The problem with this methodology: in all previous instances where there have been critical delays, the market has had only a single competitor to deal with.

AMD's delay of TeraScale by nearly a year could be covered by continued refreshes of the beefcake X1900 XTX, and NVIDIA survived the 8-month delay of Fermi with endless re-brands of G92 and GT200 x2.

What happens to a delayed architecture when there are two other strong releases waiting on their doorstep? Will Intel even get a foothold, without massively discounted parts?
#52
Bomby569
mechtechNo one really wants to downgrade, but something is better than nothing. Also, you can only do what budget/time allow. I play Terraria at 4K with my RX 480 :)
That seems fair; better than playing Terraria at 4K on a 3090, and yes, they exist, I know one (not Terraria, but point proven).
#53
Steevo
Someone check for wood screws
#54
johnpa
Now you see it, now you don't.

Classic vaporware.

From the article:
It's unclear if it was a working sample or just a mockup.
#55
AusWolf
FrickOhhh man you haven't tried Freelancer on a big 4K monitor. Generally I agree with you, but that game (and the texture mod) REALLY shines at higher resolution and 32".
That's an old game that doesn't need a 3080 even at 4K. But now that you mentioned it, I've got a light gaming capable HTPC and a 4K TV, so I guess it's time to try. :rolleyes:
#56
Unregistered
I have a 4K TV, and not once have I tried running my games on it with my 1080 Ti. I'm happy with my 1440p monitor. If I ever get £2k+ to piss up a wall on a 3090/Ti, would I buy one just to try gaming at 4K? Would I f ck.
#57
Frick
Fishfaced Nincompoop
AusWolfThat's an old game that doesn't need a 3080 even at 4K. But now that you mentioned it, I've got a light gaming capable HTPC and a 4K TV, so I guess it's time to try. :rolleyes:
Seriously, do it. If you liked Freelancer back then you'll love it now. Sit close to it. I too dismissed the notion of "increased screen estate hightens imershun" but in this case it really works.
TiggerI have a 4K TV, and not once have I tried running my games on it with my 1080 Ti. I'm happy with my 1440p monitor. If I ever get £2k+ to piss up a wall on a 3090/Ti, would I buy one just to try gaming at 4K? Would I f ck.
It's not so much the resolution as the size. Sure, the programs/games that actually scale look good, but not "worth it" good. At this point in time I actually wouldn't mind a 42" 4K monitor.
#58
Vayra86
aQiOver the course of the years, Nvidia has developed impressive harmony between its hardware and software. Likewise, AMD has done the same.
Intel has excellent hardware and proper innovation throughout its architectural history, yet it lacks software harmony. This is where they are making sure their Arc series makes a breakthrough, and why they have been delaying the release for the last 4+ months.
A common example of what I'm saying is the power consumption vs. performance graphs. Intel graphics eat watts but produce lower numbers compared to Nvidia or AMD cards at the same power draw.
Intel has produced working GPUs by now and has even showcased the Limited Edition, but it is still going through the phase of software-to-hardware harmony. They are working exceptionally hard on software/drivers to harness the power of their power-hungry hardware.
I believe the blue team will definitely make a difference in the market, especially against those Nvidia and AMD cards with hardware issues that leave users to throw them away and buy new. At least Intel hardware should be superior from a general point of view.
It was going fine until you said Intel doesn't deal in planned obsolescence / insinuated that Intel cards magically won't suffer from getting old and just dying. I think you oughta change the color of those glasses to transparent, uncolored.
#59
Unregistered
FrickSeriously, do it. If you liked Freelancer back then you'll love it now. Sit close to it. I too dismissed the notion of "increased screen estate hightens imershun" but in this case it really works.


It's not so much the resolution as the size. Sure, the programs/games that actually scale look good, but not "worth it" good. At this point in time I actually wouldn't mind a 42" 4K monitor.
We have a 58" 4k TV
#60
Vayra86
FrickSeriously, do it. If you liked Freelancer back then you'll love it now. Sit close to it. I too dismissed the notion of "increased screen estate hightens imershun" but in this case it really works
Yeah but too bad those on screen markers are so effin huge man! Kinda turned me off and boy did the game age a bit, too.
#61
R0H1T
ZetZetIt doesn't matter as long as the price is right. Even if their top card is around the next-generation RTX 4060 or 7600 XT, it can still sell quite well.
With the impending global recession, don't be so sure! In fact, if oil prices stay this high for longer and the war continues, you can bet your bottom dollar that Intel will be cursing themselves for not launching this 1-2 years back :slap:
#62
AusWolf
FrickSeriously, do it. If you liked Freelancer back then you'll love it now. Sit close to it. I too dismissed the notion of "increased screen estate hightens imershun" but in this case it really works.
I didn't like it. I LOVED IT! :D I'll definitely try it on my TV. I don't know why I didn't think about this before. :rolleyes: I can imagine screen estate adds to the experience when it gives you more space (literally) to look at.

I just don't feel like a bigger monitor would add enough to my general gaming experience to justify its own cost, plus that of a faster and noisier GPU. That is, I'd much rather game at 1080p on a 24" monitor with a dead-silent 6500 XT than pay for a 4K monitor and a not-so-silent 3080.
#63
aQi
defaultluserThe problem with this methodology: in all previous instances where there have been critical delays, the market has had only a single competitor to deal with.

AMD's delay of TeraScale by nearly a year could be covered by continued refreshes of the beefcake X1900 XTX, and NVIDIA survived the 8-month delay of Fermi with endless re-brands of G92 and GT200 x2.

What happens to a delayed architecture when there are two other strong releases waiting on their doorstep? Will Intel even get a foothold, without massively discounted parts?
Intel won't stand tough against the RDNA 3 and RTX 4000 series, but they could at least try to catch up. I remember Intel job openings for discrete GPU graphics driver development in the US and China on an urgent basis. I guess they want the bells to ring, but they just need more and more time to suit up against both the green and red teams.
Vayra86It was going fine until you said Intel doesn't deal in planned obsolescence / insinuated that Intel cards magically won't suffer from getting old and just dying. I think you oughta change the color of those glasses to transparent, uncolored.
Well, I'm not saying their hardware is immortal, but at least the build quality and engineering are a bit ahead of others. At least their hardware carries their reputation: whether it's a processor or even some controller on a PCIe x1 card, it will perform as well as it should even after a decade.
#64
Steevo
aQiIntel won't stand tough against the RDNA 3 and RTX 4000 series, but they could at least try to catch up. I remember Intel job openings for discrete GPU graphics driver development in the US and China on an urgent basis. I guess they want the bells to ring, but they just need more and more time to suit up against both the green and red teams.


Well, I'm not saying their hardware is immortal, but at least the build quality and engineering are a bit ahead of others. At least their hardware carries their reputation: whether it's a processor or even some controller on a PCIe x1 card, it will perform as well as it should even after a decade.
Laughs in 1100T on decade-old AMD components…

You should try trolling elsewhere or read a little more before spouting off literal fanboy BS.

Flexing boards, flexing CPUs, the terrible idea of the original 775 coolers that caused noobs and even pros to kill boards, their relabeled and unsupported Killer NIC series, their hardware security issues, the number of overheated chips that died in OEMs.

Intel isn't magical; they are a for-profit company just like AMD and their bullshitdozer that got stuck in the ponds.

Quality is quality; buying a $300 board from either means it should last longer than a $79 board from either. Same for coolers they don't make, memory, PSUs, and hard drives.
#65
chstamos
The technical know-how from that Bitboys Oy purchase sure is paying dividends already.
#66
damric
Seems like the perfect card to play Star Citizen
#67
PapaTaipei
AusWolfI agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded my 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
Agreed. RT is also 100% useless. However, I can attest that 4K is nice for non-competitive games. If the 20xx and 30xx cards didn't have those 100% useless tensor/RT cores, they could have 30-40% more performance per die. Also, I saw the UE5 demos; yes, it looks cool, but at the cost of MASSIVE storage requirements and a complete rework of the data pipeline, and even then it would still have a bad framerate and, more importantly, extremely bad mouse input for competitive games.
#68
chrcoluk
AusWolfI agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded my 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
They kind of are; there are two things which, in my opinion, are keeping the market alive.

One is generated demand via hardware-exclusive features, e.g. with Nvidia: G-Sync, DLSS and hardware-based RT. All three features, thanks to AMD, have alternatives that don't require hardware lock-in: VRR, FSR and RT using rasterisation hardware.

I get the merit of VRR, and Nvidia deserve praise for introducing the concept, but they did attempt vendor lock-in, not only via the GPU but also via expensive chips in monitors. DLSS I also get the merit of, and out of the three this is for me by far the most useful tech in terms of potential, but it was initially limited to games where devs implement it (so a fraction of new games released); however, we have more recently seen a new variant of DSR that uses DLSS, so it can now be implemented driver-side (I'm not sure if FSR can be done via the driver; if someone can clarify this for me, that would be great). Finally, RT. This one I have little love for; I feel lighting can be done very well in games via traditional methods, and I thought it was lame when RT-focused games would ship with heavily nerfed non-RT lighting, so that to have good lighting you needed RT hardware. AMD have at least proven the hardware isn't needed, which makes this a bit less lame. But ultimately RT has served to increase the level of hardware required to achieve a given quality level, so it is a bad thing for me.

The second source of demand for new GPUs has been fuelled by the high-FPS/Hz craze, which is primarily driven by first-person shooter and esports gamers. If there were no high-refresh-rate monitors, then Nvidia and AMD would be having a much harder time convincing people to buy new-gen GPUs, as we're starting to get to the point where a single GPU can handle anything thrown at it, to a degree, at 60 fps. The "to a degree" caveat is that, of course, game developers, whether through laziness or bungs thrown at them by AMD/Nvidia, are optimising their games less, so more grunt is needed to hit a quality and frame rate target. Nvidia have been caught in the past when a developer was rendering tessellation underneath the ground in a game, which helped sell Nvidia cards. There were also problems noticed in the FF15 benchmark tool, where it was rendering things out of view, which lowered the framerate. In that game as well, Nvidia-exclusive features absolutely tanked the framerate and sent VRAM demand skywards.
#69
phanbuey
idk... I'm super stoked playing games at 4K with a 48" OLED. That kind of experience is something else -- to the point that it makes some boring games fun.

I think 4K is worth it... text sharpness, gfx fidelity, it's really nice... I think a 4080 would be a good upgrade to my gaming experience -- esp. using larger screens.
#70
defaultluser
aQiIntel won't stand tough against the RDNA 3 and RTX 4000 series, but they could at least try to catch up. I remember Intel job openings for discrete GPU graphics driver development in the US and China on an urgent basis. I guess they want the bells to ring, but they just need more and more time to suit up against both the green and red teams.
And how many quarters of losses in the graphics card sector is Intel going to continue to endure before the board cuts their losses?

At least compute GPUs they can sorta justify (via the supercomputer contracts), but how long do you expect Intel to continue to be #3 in the consumer market while bleeding these losses? (Until they fire the entire driver team and absorb the best into compute modules?)
#71
eidairaman1
The Exiled Airman
trsttteFor all the haters, they do have demo computers running Intel GPUs at the event. Laptops with the smaller version of the silicon are also becoming available worldwide (I'm not sure the drivers are up to par yet, but at least they should work better than the 5700 XT did at its initial launch).

There are also a lot of reasons to buy an Intel GPU other than games, like Linux support, the advanced media engine, or if you want to dabble with oneAPI stuff.

Anyway, now you made me sound like an Intel fanboy :D. It's fun to mock the forever-delayed Intel GPU, but can we keep some semblance of reality?


The 8+6 power connector is disappointing. I hate 6-pin connectors; more often than not, if you're not running custom cables, you'll be left with a 2-pin connector dangling off the GPU. I guess this was designed so long ago that the new Gen5 power connector wasn't available :D
Linux support for AMD has been around since the ATi days.
#72
aQi
SteevoLaughs in 1100T on decade-old AMD components…

You should try trolling elsewhere or read a little more before spouting off literal fanboy BS.

Flexing boards, flexing CPUs, the terrible idea of the original 775 coolers that caused noobs and even pros to kill boards, their relabeled and unsupported Killer NIC series, their hardware security issues, the number of overheated chips that died in OEMs.

Intel isn't magical; they are a for-profit company just like AMD and their bullshitdozer that got stuck in the ponds.

Quality is quality; buying a $300 board from either means it should last longer than a $79 board from either. Same for coolers they don't make, memory, PSUs, and hard drives.
I agree those are issues faced by most of the companies (partners). I am highlighting the GPU soldering issue and the artifacting issue we see every now and then with both Nvidia and AMD cards. If you look at it objectively, there is a lower rate of Intel's later northbridges and southbridges desoldering and causing issues, something I used to experience with nForce chips. If you look further into this, Intel ICs on other peripherals also prove to have a longer life and fewer malfunctions.
You are right, and that's what I mentioned earlier. Intel has software issues, and its software-to-hardware harmony is weaker compared to Nvidia and AMD. This is what I am trying to say: it wants to cover this segment for its Arc GPUs, which is causing the massive delay.
#73
eidairaman1
The Exiled Airman
Vayra86It was going fine until you said Intel doesn't deal in planned obsolescence / insinuated that Intel cards magically won't suffer from getting old and just dying. I think you oughta change the color of those glasses to transparent, uncolored.
Their socket changes every 2 years leave users high and dry.
#74
aQi
defaultluserAnd how many quarters of losses in the graphics card sector is Intel going to continue to endure before the board cuts their losses?

At least compute GPUs they can sorta justify (via the supercomputer contracts), but how long do you expect Intel to continue to be #3 in the consumer market while bleeding these losses? (Until they fire the entire driver team and absorb the best into compute modules?)
Lol, if Intel is really concerned with giving back to the gamer community and devoting itself to graphics, then they won't care about losing anything at all, but I highly doubt that's exactly where things are going. Nvidia was the only company on Tesla Motors' panel for graphics and AI. AMD has attracted a lot of interest from time to time by introducing its low-cost and highly effective RDNA architecture. Intel introduced hardware AI in its 10th-gen processor family, yet there was always something missing; Intel was leaving itself behind and foresaw losing markets in most industries due to poor graphics support. It's the right time to release the so-called Limited Edition graphics card, but there is no point in releasing something which has poor driver support. If I were Intel, I would do the same: hold everything back just to make a proper appearance rather than worry about losing money.
#75
R0H1T
defaultluserAnd how many quarters of losses in the graphics card sector is Intel going to continue to endure before the board cuts their losses?
If the past is any indication, then 8-12 quarters easily, assuming only their consumer cards bear losses, because if the HPC or enterprise ones are also sold on something like "contra revenues" then they'll have to jump ship faster than mice on the Titanic o_O

Of course, knowing Intel, they'll probably charge you for OCing these cards or turning 3D acceleration on via their premium (upgrade) plans :laugh: