
Intel Could Unveil its Graphics Card at 2019 CES

One stupid question, and apologies in advance for it... :D
Aren't shader cores proprietary to nVidia and AMD only? If so, how can Intel develop a new GPU without them? Sorry, just asking....

There is no such thing as a proprietary shader core; it's a loosely defined term anyway.

Sure, they patent their GPUs in their entirety, and their ISAs, but these change very often and it's not worth it to waste time copying entire architectures. Not as a long-term strategy, anyway.
 
Just wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)
I'm hoping that, with all the ATI staff they have hired, they have at least some people with a reasonable amount of experience in making a consumer-facing driver for 3D graphics that doesn't have the same issues their iGPU drivers do. It's the only reason they have, really, for all those hires.

You'll note I didn't say AMD employees.... :p;)

One stupid question, and apologies in advance for it... :D
Aren't shader cores proprietary to nVidia and AMD only? If so, how can Intel develop a new GPU without them? Sorry, just asking....

Doing graphics is just doing math. That's why GPUs are so good for "mining". There's no patent on doing math equations in that way, and where there is, Intel has licenses for most of the tech they need that they don't outright own. There's no story to be told at all about any of that side of this.
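To make that concrete, here's a minimal sketch (hypothetical kernel and names, not anything from Intel, nVidia, or AMD) of what a "shader core" actually spends its time on: per-pixel Lambert diffuse shading is just a dot product and a clamp, repeated in parallel across millions of pixels, and that same raw arithmetic throughput is exactly what mining exploits.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical illustration only: per-pixel diffuse (Lambert) shading.
// Each "shader" invocation is a dot product and a clamp -- plain
// arithmetic, which is why the same hardware is also good at mining.
__global__ void lambert(const float3 *normals, float3 light,
                        float *shade, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float3 nv = normals[i];
        float d = nv.x * light.x + nv.y * light.y + nv.z * light.z;
        shade[i] = d > 0.0f ? d : 0.0f;   // max(N . L, 0)
    }
}

int main()
{
    const int n = 1 << 20;                 // ~1M "pixels"
    float3 *normals;
    float  *shade;
    cudaMallocManaged(&normals, n * sizeof(float3));
    cudaMallocManaged(&shade,   n * sizeof(float));
    for (int i = 0; i < n; ++i)
        normals[i] = make_float3(0.0f, 0.0f, 1.0f);  // all facing +Z

    float3 light = make_float3(0.0f, 0.0f, 1.0f);    // light along +Z
    lambert<<<(n + 255) / 256, 256>>>(normals, light, shade, n);
    cudaDeviceSynchronize();

    printf("shade[0] = %f (expected 1.0)\n", shade[0]);
    cudaFree(normals);
    cudaFree(shade);
    return 0;
}
```

Nothing in that math is ownable; what's patented is the specific hardware that runs it fast, and Intel can design its own version of that.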
 
I agree. With this discrete GPU, Intel can kill more birds with one stone, like AI and cryptocurrency, with gaming as a side dish, assuming they can keep up not only with the hardware development cycle but also the software support.
 
I agree. With this discrete GPU, Intel can kill more birds with one stone, like AI and cryptocurrency, with gaming as a side dish, assuming they can keep up not only with the hardware development cycle but also the software support.
If there were a choice between AI and crypto, I'd think they would lean heavily towards AI. As the saying goes, two's company, three's a crowd; I hope Intel can stay away from crypto "stuff".
 
That would actually be good for both of them. Someone definitely needs to put CUDA in its place.
Easier said than done, simply because CUDA (as proprietary as it might be) works. I mean, CUDA-enabled programs usually beat OpenCL 2 implementations, and not having OpenCL 2 support in place is (was?) supposed to be a weakness for Nvidia.
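For a sense of why CUDA "works" in practice, here's a minimal sketch of the classic saxpy (y = a*x + y) in CUDA; the kernel, the launch, and managed memory are essentially the whole program, while the OpenCL equivalent also needs explicit platform/device/context/queue/program-build boilerplate on the host side. Everything below is illustrative, not taken from any benchmark mentioned in the thread.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Illustrative saxpy kernel: y = a*x + y, one element per thread.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;
    // Managed memory: no explicit host<->device copies needed.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

That terseness, plus mature libraries and tooling on top of it, is a big part of the lock-in any challenger has to overcome.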
 
I hate the fact that x86 has all this licensing red tape locking out competition, while Intel can just jump right into GPU development.
 
I'm glad you said "usually", because sometimes it just can't do basic math correctly, lol.
Yeah, that was only for Volta and was probably a driver bug. Nothing to do with CUDA itself.
 
I hate the fact that x86 has all this licensing red tape locking out competition, while Intel can just jump right into GPU development.

They made a concept of their own: https://www.techpowerup.com/241669/intel-unveils-discrete-gpu-prototype-development

It's not infringing on Nvidia's or AMD's tech. It's a free market, hence their ability to create a new GPU.

Intel owned the whole market for many years with integrated GPUs, thanks to delivering complete motherboards and chips that don't need a discrete graphics card in the first place. They still hold a great portion of the desktop market. It's just not suitable for, or even comparable with, gaming.

It's just "2D" and some basic 3D. They have created some IGP's into their CPU's lately but they are not strong enough (esp with Vega and HBM) to compete in the first place.
 
Just wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)

Larrabee type of fail? Certainly not. Not bringing exceptional performance? Well, yeah, but so what?
Fail, learn from it, improve is the normal cycle.
If they are serious about GPU presence, I simply don't see what could stop them.

If there is a company on the planet which can enter consumer GPU market, it's Intel.

I hate the fact that x86 has all this licensing red tape locking out competition, while Intel can just jump right into GPU development.
I don't follow.
It's understandable to hate "x86 licensing red tape" (not that we would have real competition even if there were no licensing problems).
But hating it because GPUs aren't limited the same way? Huh?

Intel actually paid nVidia and now pays AMD for GPU patents.
 
I'm mostly concerned because we've seen Intel abuse its power in the past while locking out competition at the same time. I just don't feel like welcoming in trouble, personally. Intel has done some anti-competitive things to both Nvidia and AMD in the past, so them getting into their competitors' primary business is a pretty reasonable concern from a consumer standpoint. Yeah, it could bring short-lived increased competition, followed by a decade of the most incremental updates they can get away with while raising prices gradually. Then again, we've sort of reached that point now anyway, with Nvidia behaving in much the same way. I think the biggest concern is how it might cripple AMD at a point in time where they're really just becoming competitive again, when it's desperately needed for consumers. I view it as a double-edged sword that cuts both ways.
 
I think the biggest concern is how it might cripple AMD at a point in time where they're really just becoming competitive again, when it's desperately needed for consumers. I view it as a double-edged sword that cuts both ways.

The only thing competitive about AMD is their marketing team.
 
The only thing competitive about AMD is their marketing team.
That's not true and you know it. At the same time, if every company could only succeed when the competition was playing nicely, then we wouldn't have any successful companies at all. AMD needs to man up and play the hand they were dealt. Which they have done, lately.
 
Just wondering, isn't anybody considering that Intel might just fail again at this? Or that their GPU isn't targeted at gamers at all? (which needs a ton of driver support)
The previous iteration wasn't a fail; they made big money, dropped support for products released less than 1-2 years earlier, and moved on. Now they're focused on the Internet of Things to make big money on fools who trust their "technological superiority", so I'm waiting for them to rebrand and repurpose these products to sell in data center markets.
 