
Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

Fair enough. Correction: pointless / overpriced.

I'm still fine with 1080p and I feel no need to upgrade. An expensive 4K monitor would necessitate a costly GPU upgrade. No thanks.

That's what has been holding back 4K gaming adoption for years now. At first it was the cost of the monitors, but those have come way down in price; it's still the cost of a video card that can run it properly that has held down, and will continue to hold down, 4K gaming adoption for most people.

I have been gaming for almost four decades now, and in all those years I've seen it over and over again. Games always have, and always will, require more and faster resources.
 
Only the egos at Intel would showcase an otherwise insignificant piece of technology as if it were some powerful artifact found in an ancient temple.
Every time Intel feels they are being left behind, they wave their arms and dangle the carrot in front of suckers.

I say bring up the news when it's physically on store shelves or being sold by e-tailers. I mean, an RX 6000 series refresh came out before these were due to see the light of day, if that ever happens.

Deadware to me.
 
Couldn't they at least show working eval samples.....
 
I agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded from my 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
Yea, but in light of the above, iNtEl has probably been collaborating & colluding with the game makers for quite some time to make it so that all new games released near or after the "overpowered monsters" are out in the wild will REQUIRE a minimum of a high-end ARC (or equivalent) card just to run at mid-high res/FR's....

In fact, this may well be the reason for the continued launch delays, so that ALL new games are highly optimized to run on high-end iNtEl's cards way moar betta than the other brands of cards.....

I hope this is not the case, but this IS intel we are talking about here, so who knows what kinda sneaky sh*t they have been doing behind the scenes.....
 
Fair enough. Correction: pointless / overpriced.

I'm still fine with 1080p and I feel no need to upgrade. An expensive 4K monitor would necessitate a costly GPU upgrade. No thanks.
Well, some of us use our computers for more than playing games.
Then again, right now I can't afford to buy anything, but at least most games that I play, I can play at 4K.
But yes, you're right that this is becoming a stupidly expensive hobby.

Couldn't they at least show working eval samples.....
Sorry, the drivers aren't at a state that would allow anything to be shown in public.
 
Only the ego’s at Intel would showcase an otherwise insignificant piece of technology as if it were some powerful artifact found in an ancient temple.

Do you think Intel are peasants, to present technology in someone's kitchen?
 
They probably still don’t have working silicon fit for public demos even after all this time. Given everything over the last five years, I can’t understand why anyone cares about Intel GPUs.

I think most of what we are seeing is just hope that Intel will release something that will bring GPU costs down to reasonable levels, but the fact is those days are long gone. Most people probably won't buy an Intel discrete GPU anyway until Intel proves it will support its GPUs with proper drivers. It's like with AMD GPUs: so many claim they will buy AMD if the value is better, but they buy Nvidia anyway, and that's why Nvidia holds 80% of the discrete GPU market.
 
Just like their IGPs, the i740, and Larrabee.

 
I think most of what we are seeing is just hope that Intel will release something that will bring GPU costs down to reasonable levels, but the fact is those days are long gone. Most people probably won't buy an Intel discrete GPU anyway until Intel proves it will support its GPUs with proper drivers. It's like with AMD GPUs: so many claim they will buy AMD if the value is better, but they buy Nvidia anyway, and that's why Nvidia holds 80% of the discrete GPU market.

To be fair, the driver side of things is very important on a GPU, almost as important as the hardware.
 
For all the haters: they do have demo computers running Intel GPUs at the event. Laptops with the smaller version of the silicon are also becoming available worldwide (I'm not sure the drivers are up to par yet, but at least they should work better than the 5700 XT's initial launch drivers).

There are also a lot of reasons to buy an Intel GPU other than games, like Linux support, the advanced media engine, or if you want to dabble with oneAPI stuff.

Anyway, now you made me sound like an Intel fanboy :D. It's fun to mock the forever-delayed Intel GPU, but can we keep some semblance of reality?


The 8+6 power connector is disappointing; I hate 6-pin connectors. More often than not, if you're not running custom cables, you'll be left with a 2-pin connector dangling off the GPU. I guess this was designed so long ago that the new Gen5 power connector wasn't available :D
 
They probably still don’t have working silicon fit for public demos even after all this time. Given everything over the last five years, I can’t understand why anyone cares about Intel GPUs.
The rumours suggest the silicon is just fine, but the drivers aren't anywhere near something that could be handed over to the general public.
That said, laptops are going up for pre-order in Australia and New Zealand, so there must be some kind of working drivers... Still about a month out though, maybe...

do you think Intel are peasents to present technology in someones kitchen?
Pat will be holding a BBQ for the launch...
 
To be fair, the driver side of things is very important on a GPU, almost as important as the hardware.

I would say even more so than the hardware. It's pretty much useless to have the fastest hardware if the drivers won't run it properly.
 
[Image: AMD FineWine meme (source: Reddit)]


We need an Intel version of the meme.
 
I would say even more so than the hardware. It's pretty much useless to have the fastest hardware if the drivers won't run it properly.
Something that's very obvious when it comes to ARM-based SoCs made by companies in the PRC.
There's a reason why the RPi is so popular, despite being ho-hum in the hardware department.
 
I've seen very hyped products turn out to be garbage and low-key ones turn out amazing. We just have to wait. But my expectations are very low: they will probably do great in the low-end market with a good price point, where AMD and Nvidia have been shooting themselves in the foot constantly, and that's it. I doubt they can compete in the other markets.
 
Regarding the comments about Intel laptops on the market: all of these contain Arc A3 series chips (96 and 128 EUs), which is the bare-minimum IGP implementation in a discrete form factor. Intel has had this tech for many years now. The trick is making something above IGP levels of performance, which Intel has only slightly exceeded with the A370M. No higher mobile SKUs, and certainly no desktop SKUs, have appeared yet.

By the way, working silicon implies working hardware AND software. It's not so easy to have one without the other, so it's implied.
 
Interesting. But coming this late to the game in the current GPU market, I would say Intel's Arc GPUs will end up competing with AMD RDNA3 and Nvidia RTX 4000 series cards, and not so much the current generation of graphics cards.
Late is a relative term. But yes, I would agree that they will be up against RDNA3 and the NV 4000 series.
 
Fair enough. Correction: pointless / overpriced.

I'm still fine with 1080p and I feel no need to upgrade. An expensive 4K monitor would necessitate a costly GPU upgrade. No thanks.

Ohhh man, you haven't tried Freelancer on a big 4K monitor. Generally I agree with you, but that game (and the texture mod) REALLY shines at higher resolution and 32".
 
Pointless in your opinion.
Been playing at 4K for several years already and in some games I wouldn't want to go back.
No one really wants to downgrade, but something is better than nothing. Also, you can only do what budget/time allow. I play Terraria at 4K with my RX 480 :)
 