
Intel Downplays the Growing Popularity of NVIDIA CUDA

btarunr

Editor & Senior Moderator
Pat Gelsinger, co-general manager of Intel's Digital Enterprise Group, told Custom PC that NVIDIA's CUDA programming model would be nothing more than an interesting footnote in the annals of computing history.

Gelsinger says that programmers simply don't have enough time to learn how to program for new architectures like CUDA. In his words: "The problem that we've seen over and over and over again in the computing industry is that there's a cool new idea, and it promises a 10x or 20x performance improvement, but you've just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future." The fact that the Sony CELL didn't live up to its hype as something superior to existing computing architectures proves his point.

Gelsinger adds that Intel's Larrabee graphics chip will be based entirely on Intel Architecture x86 cores, precisely so that developers can program for the graphics processor without having to learn a new language. Larrabee will have full support for APIs like DirectX and OpenGL.
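To make the "orifice" concrete, here is a minimal sketch (not from the article; vector addition and the function names are illustrative assumptions) of the same operation written first as an ordinary C loop, which is what Intel argues maps naturally onto Larrabee's x86 cores, and then as a CUDA kernel:

```cuda
#include <cuda_runtime.h>

// Familiar x86-style code: an ordinary serial loop that any C/C++ compiler
// (and, per Intel's pitch, a Larrabee toolchain) can build as-is.
void vector_add_cpu(const float* a, const float* b, float* c, int n)
{
    for (int i = 0; i < n; ++i)
        c[i] = a[i] + b[i];
}

// The same operation as a CUDA kernel: the loop disappears and each GPU
// thread computes one element, which is the kind of model shift Gelsinger
// calls an "orifice".
__global__ void vector_add_cuda(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}
```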



View at TechPowerUp Main Site
 
The man is right, but Larrabee will have the same issues because its "programming model" is still different: massive parallelism. "programmers simply don't have enough time to learn how to program for new architectures"
 
There's very little furniture in that GPU schematic; even the S3 Chrome S27 made it look pretty :p

jk
 
The man is right, but Larrabee will have the same issues because its "programming model" is still different: massive parallelism. "programmers simply don't have enough time to learn how to program for new architectures"
True to a point... there are ready-to-go libraries for x86, and compilers that can be extended for Larrabee. While the average programmer will not get optimal performance out of Larrabee, they can work within their regular IDE and the compiler will do most of the work.

With CUDA (or PhysX, for that matter) you need a huge shift in how you think/develop, plus you need to learn the new programming model.
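As a rough illustration of that shift (a hedged sketch only; the kernel name, buffer sizes and launch configuration are assumptions, not anything from the posts above), here is the host-side ceremony a CUDA port adds on top of the kernel itself, with explicit device allocations, copies and a launch configuration that have no counterpart when the same loop is compiled for an x86 CPU:

```cuda
#include <cuda_runtime.h>

// Same one-element-per-thread kernel as in the sketch further up the thread.
__global__ void vector_add_cuda(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

// Host-side wrapper: everything in here is extra work the CUDA model asks for
// compared with calling a plain x86 function on the same arrays.
void vector_add_on_gpu(const float* a, const float* b, float* c, int n)
{
    size_t bytes = (size_t)n * sizeof(float);
    float *d_a = nullptr, *d_b = nullptr, *d_c = nullptr;

    // Explicit device allocations and host-to-device copies.
    cudaMalloc((void**)&d_a, bytes);
    cudaMalloc((void**)&d_b, bytes);
    cudaMalloc((void**)&d_c, bytes);
    cudaMemcpy(d_a, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, b, bytes, cudaMemcpyHostToDevice);

    // The launch configuration (blocks x threads per block) is part of the
    // "new programming model" being debated in this thread.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add_cuda<<<blocks, threads>>>(d_a, d_b, d_c, n);

    // Copy the result back and release device memory.
    cudaMemcpy(c, d_c, bytes, cudaMemcpyDeviceToHost);
    cudaFree(d_a);
    cudaFree(d_b);
    cudaFree(d_c);
}
```

None of this is hidden by default, which is the sort of overhead the posts above are arguing about.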
 
I agree with Lemonadesoda. It takes a large amount of time and money to learn new code and update the game engines, mapping tools and everything else that's part of this stuff.
 
The man is right, but Larrabee will have the same issues because its "programming model" is still different: massive parallelism. "programmers simply don't have enough time to learn how to program for new architectures"

Well, you know how it is -- they have to say that. With all this talk about CUDA and ATI's GPGPU and how disgustingly fast they're able to crunch numbers -- even if they're limited in what they can do -- Intel's starting to look left out of the loop to the public, at least where buzzwords like "folding" and "number crunching" are involved.
 
There is no acceptable learning curve for new gaming engines. That's why most games today only use one of the three proven engines. NVIDIA's best bet is to continue to adapt CUDA to fit those engines. We all know about the storm that came about when physics were introduced into the Crytek engine. CryEngine 2 is a monster, but CUDA can still only help it. Developing new CUDA drivers for every crackpot engine that comes around would be a complete waste.

That Larrabee chip looks really plain; maybe we'll be able to play Doom on it :p
 
So if there is an orifice that these companies have to go through, why are ATI and NVIDIA the most popular among gamers while Intel isn't? I say Intel is feeling threatened by both ATI and NVIDIA.

 
^Exactly, Intel is afraid of all these new techs, which could some day compete with their best parts.
Number crunching is number crunching, and if some day GPUs could crunch numbers as well as a processor, this might be a problem for them.
 
No, 'cause they don't have x86.
 
Sony's Cell does live up to its hype, and I'm sure there is more to come.

I for one am interested in seeing what Intel does with their Larrabee cards and this ray tracing they are so fond of.
 
Technically, ATI doesn't exist as a company, just a product line now, so yes, ATI does have x86 licensing through AMD.
 
Technically, ATI doesn't exist as a company, just a product line now, so yes, ATI does have x86 licensing through AMD.

You sure?

*Begins to drool* ... :rockout:
 
Well, most of the head positions at ATI left after the buyout, aka forced retirement (a good sum of money for them to quit). And it was a buyout, so AMD could implement an x86 graphics card, but the coding will be tough regardless, so I believe the route will be "what isn't broke, don't fix it". Beyond that, isn't there a link here stating that ATI cards as of late have some ray tracing capability anyway?
 
Sony's Cell does live up to its hype, and I'm sure there is more to come.

I for one am interested in seeing what Intel does with their Larrabee cards and this ray tracing they are so fond of.

AMD already beat Intel to it...

You sure?

*Begins to drool* ... :rockout:

Indeed. However "ATi" still hasnt been dropped. AMD, unlike nvidia actually keep what they buy alive.
 
Intel is bringing up a weak argument because their attempt in the "gaming" graphics industry is so far behind.

They may well be right, but the current systems came about because people were willing to make them work. Both ATI and Nvidia are willing to try to make theirs work.
 
AMD already beat Intel to it...



Indeed. However "ATi" still hasnt been dropped. AMD, unlike nvidia actually keep what they buy alive.

The only thing NVIDIA kept was SLI, when it was 3dfx that created that solution.
 
The only thing NVIDIA kept was SLI, when it was 3dfx that created that solution.

The "SLI" by 3DFX wasn't an abbreviation for Scalable Link Interface. So no, the name isn't a carry forward of an old brand name.
 
The initials stood for Scan-Line Interleave, which was the same concept; the only thing NVIDIA did was try to improve upon it.
 