Saturday, June 4th 2022

Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

The Intel Extreme Masters (IEM) event is currently taking place in Dallas, and at the event Intel had one of its Arc Limited Edition graphics cards on display. It's unclear if it was a working sample or just a mockup, as it wasn't running or even mounted inside a system. Instead, it seems Intel thought it was a great idea to mount the card standing upright on its port side inside an acrylic box, on top of a rotating base. The three pictures snapped by @theBryceIsRt and posted on Twitter don't reveal anything we haven't seen so far, except the placement of the power connectors.

It's now clear that Intel has gone for a typical placement of the power connectors, and the card in question has one 8-pin and one 6-pin power connector. In other words, Intel will not be using the new 12-pin power connector that is expected to be used by most next-generation graphics cards. We should mention that @theBryceIsRt is an Intel employee and the Intel Arc community advocate, according to his Twitter profile, so the card wasn't just spotted by some passerby. Intel has not yet revealed any details as to when it's planning to launch its Arc-based graphics cards.
Source: @theBryceIsRt

108 Comments on Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

#1
Jism
Nice prototype.
Posted on Reply
#2
Tomgang
Interesting. But coming this late to the game in the current GPU market, I would say Intel's Arc GPUs will end up competing with AMD RDNA3 and NVIDIA RTX 4000 series cards, and not so much the current generation of graphics cards.
Posted on Reply
#3
Jism
Nah, their initial batch failed. It didn't meet expectations. So they fix it, respin it and hope for the best. By the time it's released, the next generation is knocking at the door with figures of 120 FPS 4K-ready cards.
Posted on Reply
#4
ZetZet
Tomgang: Interesting. But coming this late to the game in the current GPU market, I would say Intel's Arc GPUs will end up competing with AMD RDNA3 and NVIDIA RTX 4000 series cards, and not so much the current generation of graphics cards.
It doesn't matter as long as the price is right. Even if their top card is around the next-generation RTX 4060 or 7600 XT, it can still sell quite well.
Posted on Reply
#6
GreiverBlade
given the recent rumors ...

8+6pin for a 3060Ti equivalent :ohwell:

well, I'll wait for the reviews before forming any definitive opinions (preferably TPU reviews :lovetpu: what? too bootlicker? ok... :oops: )
Posted on Reply
#7
Vayra86
Tigger: Why so big when it will be no better than a GT710 /s
Wait... are you bashing Intel now? :D Welcome to the club!
Posted on Reply
#8
aQi
Over the course of the years, NVIDIA has developed impressive harmony between their hardware and software. Likewise, AMD has done the same.
Intel has excellent hardware and proper innovation throughout its architectural history, yet lacks that software harmony. This is where they are making sure their Arc series makes a breakthrough, and why they have been delaying the release for the last 4+ months.
A common example of what I'm saying is the power consumption vs. performance graphs. Intel graphics eat watts but produce lower numbers compared to NVIDIA or AMD cards at the same power draw.
Intel has produced working GPUs by now and has even showcased the Limited Edition, but is still going through the phase of software-to-hardware harmony. They are working hard on software/drivers to harness the power of their power-hungry hardware.
I believe the blue team will definitely make a difference in the market, especially considering those hardware issues that render NVIDIA and AMD cards useless, leaving users to throw them away and buy new ones. At least Intel hardware should be superior from a general point of view.
Posted on Reply
#9
AusWolf
ZetZet: It doesn't matter as long as the price is right. Even if their top card is around the next-generation RTX 4060 or 7600 XT, it can still sell quite well.
I agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded from my 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
Posted on Reply
#10
Unregistered
AusWolf: I agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded from my 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
Too many "must haves" in the world.
#11
AusWolf
Tigger: Too many "must haves" in the world.
Absolutely.

My TL;DR: if the Arc can deliver 3060 (Ti) level performance with low-noise cooler options at a good price, it'll be a winner (at least in my world).
Posted on Reply
#13
GreiverBlade
AusWolf: Absolutely.

My TL;DR: if the Arc can deliver 3060 (Ti) level performance with low-noise cooler options at a good price, it'll be a winner (at least in my world).
but the price will not be low/good ... it's Intel we're talking about ... and they seem to be aiming to make their top dog an LE...
also, given the 8+6-pin ... it's already kind of grim for the power consumption (or they went overkill for the sake of pricing it higher)

my "fear", since it's their top dog ... is that they will price it too high, because if they price it adequately, their mid and low end will have abysmal pricing, starting from the second one after the A770
Posted on Reply
#14
chrcoluk
ZetZet: It doesn't matter as long as the price is right. Even if their top card is around the next-generation RTX 4060 or 7600 XT, it can still sell quite well.
Pretty much this, it's the price/perf ratio; how the card achieves that won't matter to most. Hopefully its power/perf is good.
Posted on Reply
#15
Uroshi
Just me, or does that PCIe connector seem to not have any traces?
Posted on Reply
#16
V3ctor
I have seen so much info about these GPUs over the last 2 years that I don't know when this vaporware will finally launch.

Too much time has passed; they can maybe perform close to the current-gen GPUs, but the next gen will probably wipe the floor with them.
I just hope they help lower the prices of GPUs; to me that will be a victory.
Posted on Reply
#17
DeathtoGnomes
Jism: Nice prototype.
Never seen a professional 3D printed object? :p
Posted on Reply
#18
64K
My guess is that it will be mostly a "paper launch". I could be wrong though.

Intel makes tens of billions of USD in profit from their CPU line. Whatever profit they stand to make from their future discrete GPUs will be very small in comparison.
Posted on Reply
#19
TheLostSwede
News Editor
AusWolf: I agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded from my 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
Pointless in your opinion.
Been playing at 4K for several years already and in some games I wouldn't want to go back.
Posted on Reply
#20
InVasMani
If this is the best Raja can come up with, AMD made the right choice.
Posted on Reply
#21
TheLostSwede
News Editor
Uroshi: Just me, or does that PCIe connector seem to not have any traces?
Could be a mockup.
Posted on Reply
#22
Vayra86
GreiverBlade: but the price will not be low/good ... it's Intel we're talking about ... and they seem to be aiming to make their top dog an LE...
also, given the 8+6-pin ... it's already kind of grim for the power consumption (or they went overkill for the sake of pricing it higher)

my "fear", since it's their top dog ... is that they will price it too high, because if they price it adequately, their mid and low end will have abysmal pricing, starting from the second one after the A770
Of course. A competitive product will cause price movement. Arc is late and not enough; it won't even compete. It will land in the hands of collectors, early adopters and naive consumers. At a high price.

That's fine for a first-gen product though. Look at RDNA1. It took another gen to make an actually competitive stack out of it. Similarly: Turing vs Ampere. And for both you could argue they're also not quite perfect just yet, either missing a feature set or gobbling up lots of power to get there. But even against that, Arc is never going to be competitive.
Posted on Reply
#23
Bomby569
Seems too closed in at the back and sides, not good for thermals.
Posted on Reply
#24
AusWolf
TheLostSwede: Pointless in your opinion.
Been playing at 4K for several years already and in some games I wouldn't want to go back.
Fair enough. Correction: pointless / overpriced.

I'm still fine with 1080p and I feel no need to upgrade. An expensive 4K monitor would necessitate a costly GPU upgrade. No thanks.
Posted on Reply
#25
Daven
Only the ego’s at Intel would showcase an otherwise insignificant piece of technology as if it were some powerful artifact found in an ancient temple.
Posted on Reply