
Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

That shows just how badly Nvidia misled consumers with deceptive advertising, frankly. A lot of people bought RTRT cards off the back of that video expecting something at least close to it, and yet look at Cyberpunk 2077: they had to cut back polygon counts just to fit in the bits of RTRT they did ship. On top of that, Nvidia's demo runs at 24 FPS, and nobody in their right mind would call 24 FPS reasonably fluid; the input lag is just too high. It's noticeably better at 30 FPS and still not great, better again at 36 FPS, and it's really not until about 48 FPS average that things start to feel fairly responsive. Until GPUs can integrate better frame interpolation around a render target like that to compensate, frame rates in that range are never going to satisfy end users. Frankly, even 60 FPS is a bit of a crutch for input responsiveness.
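The frame-time arithmetic behind those numbers is simple: an average of N FPS means 1000/N ms per frame, so going from 24 to 48 FPS halves the baseline wait before your input can show up on screen. A quick sketch using the same FPS targets as above:

```python
# Frame time in milliseconds for a given average frame rate: 1000 / fps.
# This is only the render interval; total input lag adds engine, driver,
# and display latency on top of it.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (24, 30, 36, 48, 60):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
```

At 24 FPS each frame alone eats ~41.7 ms, versus ~20.8 ms at 48 FPS, which is why the jump to ~48 FPS average feels like the threshold for responsiveness.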

It is deceptive. Nvidia can call it RTRT and leave out the part where it's only partly RTRT. It's actually a mixture of Ray Tracing and Rasterization.

It will be interesting to see how Intel markets their gaming cards.
 
The crazy part is that those Titan V GPUs are each faster than an RTX 3060, and they used four of them. They also used expensive-as-hell HBM, so they weren't cheap at all. Interestingly they had only 12 GB each, and with traditional path-traced rendering more VRAM capacity is often the bigger win; SLI doesn't increase memory capacity, and pooling VRAM through the DX12 API seems to be a pipe dream. I've heard nothing about it since, beyond its use to market new DX12 hardware, push the DX12 API itself, and drive Windows 10 adoption.

The industry would probably be better off without DX12, with Microsoft contributing to the development of an open API like Vulkan instead, but we can't have that, right!? Microsoft needs something to hold over consumers to push them onto newer OSes; otherwise they'd be fine with Linux.
 
If the past is any indication, then 8-12 quarters easily, and that assumes only their consumer cards bear losses; if the HPC or enterprise ones are also sold on something like "contra revenue," then they'll have to jump ship faster than mice on the Titanic o_O

Of course knowing Intel they'll probably charge you for OCing these cards or turning 3d acceleration on via their premium (upgrade) plans :laugh:


And that's my whole point. If you think Intel will magically catch up in three years, I want whatever you're smoking. That's barely a single architecture generation!

Intel's board will cut the cruft after 2-3 years, and for the third time Intel will pretend it never had discrete graphics.
 
Is it possible that Intel's new ARC architecture might be more viable in cryptomining than in gaming? At least that would give Intel some road to profitability...
 
That shows just how badly Nvidia misled consumers with deceptive advertising, frankly. A lot of people bought RTRT cards off the back of that video expecting something at least close to it [...]
No, NVIDIA did not manipulate anyone with a fucking tech demo. Unless those people are morons who don't know what a tech demo is.
 
Is it possible that Intel's new ARC architecture might be more viable in cryptomining than in gaming? At least that would give Intel some road to profitability...


I doubt it - the reason you still can't buy an RTX 3060 at anywhere near MSRP is its 192-bit bus combined with enough VRAM to mine while multitasking. Mining performance depends on memory bandwidth, plus enough spare VRAM to multitask.

Why would something new and shiny on Arc somehow speed up memory-limited compute?
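As a rough illustration of why bandwidth is the ceiling: Ethash does 64 random reads of a 128-byte DAG page per hash, so no card can exceed its memory bandwidth divided by ~8 KiB per hash. A back-of-the-envelope sketch (the 360 GB/s figure is the RTX 3060's spec bandwidth; real-world hashrates land below this bound):

```python
# Ethash touches 64 random 128-byte DAG pages per hash,
# so each hash moves about 64 * 128 = 8192 bytes over the memory bus.
BYTES_PER_HASH = 64 * 128

def max_hashrate_mhs(bandwidth_gbs: float) -> float:
    """Bandwidth-bound hashrate ceiling in MH/s."""
    return bandwidth_gbs * 1e9 / BYTES_PER_HASH / 1e6

# RTX 3060: 192-bit bus with 15 Gbps GDDR6 -> 360 GB/s
print(f"~{max_hashrate_mhs(360):.0f} MH/s ceiling")
```

That works out to roughly 44 MH/s for a 3060-class card, which is why compute-side improvements on a new architecture wouldn't move the needle unless the memory subsystem does too.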
 
@defaultuser
Maybe instead of trying (and failing) to compete with the duopoly in gaming video cards, Intel should've tried to compete in the cryptomining arena -- it's not very eco-friendly, but it could be profitable?
 
Are these things being made at Intel's own fabs, or at TSMC?

If they're at TSMC, they're wasting resources/production capacity that could be used for decent GPUs, so instead of lowering prices and giving the market an alternative, they could dig the graphics card supply problem into an even bigger hole than it's in now.
 