Intel Equipped to Lead Industry to Era of Exascale Computing

But will it run Crysis?
 
x86 PCI "blade" aka larrabee-phoenix, is a great way to scale a PC/workstation/server for asymmetric computational power.

One CPU "horse" and 100 "mice". Perfect for 99% of real-world applications where CPU is not 90%+ all the time, but where you want lots of background threads, services, applications or VMs "ready".

But I have said it before, and will say it again: the Windows task scheduler is not yet up to scratch to handle this. MS needs to build a more sophisticated scheduler that understands asymmetric CPU scaling.
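
Right now an application has to place its own background work by hand. A rough sketch of what that looks like today (Linux/pthreads, core numbers purely made up for illustration); what the post above is asking for is the scheduler doing this automatically instead of every app hard-coding which cores are the "mice":

// Sketch: park mostly-idle background work on the small cores ("mice")
// and leave the fast core ("horse") free for foreground work.
// Build with: g++ -pthread sketch.cpp
#include <pthread.h>
#include <sched.h>
#include <thread>
#include <vector>
#include <cstdio>

// Pin a std::thread's underlying pthread to one CPU index.
static void pin_to_core(std::thread& t, int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(t.native_handle(), sizeof(set), &set);
}

int main() {
    std::vector<std::thread> background;
    // Pretend cores 1..3 are the slow "mice"; core 0 is the big "horse".
    for (int core = 1; core <= 3; ++core) {
        background.emplace_back([core] {
            std::printf("background service idling on core %d\n", core);
            // ... long-running, mostly-idle service work would live here ...
        });
        pin_to_core(background.back(), core);
    }
    for (auto& t : background) t.join();
    // Nothing keeps the foreground work on the fast core unless the
    // scheduler cooperates -- which is exactly the complaint.
}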
 
CUDA vs x86: x86 is much better, and virtually anyone can program for it without having to learn much new. x86 is also far more flexible, in that it handles logic-heavy processing much better than CUDA.
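
To make the programmability point concrete, here's a quick sketch of a parallel sum in nothing but standard C++ threads, the kind of thing any x86 programmer can write without learning a new model. The CUDA version of the same job would need a separate kernel, device memory allocations, host-to-device copies and a launch configuration. Numbers and names here are purely illustrative:

// Parallel sum with plain std::thread -- ordinary loops, ordinary pointers.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<float> data(1 << 20, 1.0f);   // 1M elements, all 1.0
    const unsigned workers =
        std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(workers, 0.0);
    std::vector<std::thread> pool;

    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            // Each thread sums a contiguous slice of the array.
            std::size_t chunk = data.size() / workers;
            std::size_t begin = w * chunk;
            std::size_t end   = (w == workers - 1) ? data.size()
                                                   : begin + chunk;
            partial[w] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& t : pool) t.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("sum = %.0f\n", total);   // expect 1048576
}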

Oh, I get it: just like those Pentium 4s, ordinary x86 architecture, like a CPU.
 
2 million homes? 1.6GW/2000000=0.8Watt per home
 
Until Intel can make drivers: I'll believe it when I see it.

If nVidia is around long enough, then they will be first. If they are not, then AMD will be. In either case Intel will be last. If I could put money on it, I would.
 
2 million homes? 1.6GW/2000000=0.8Watt per home

1.6GW is 1.6 billion watts...not 1.6 million. It would equal 800W per home. :slap:
 
1.6GW is 1.6 billion watts...not 1.6 million. It would equal 800W per home. :slap:

Is that really enough for a household? A normal one, that is. I know a lot of us on here use half to almost that much through our comps, but 800W?... That sounds like an Xbox, a TV, a light per room, no AC, and when you want to use the microwave you'd better unplug your fridge, lol.
 
Is that really enough for a household? A normal one, that is. I know a lot of us on here use half to almost that much through our comps, but 800W?... That sounds like an Xbox, a TV, a light per room, no AC, and when you want to use the microwave you'd better unplug your fridge, lol.

Averaged over the course of a day it might be 800 watts.....

Lots of people have only the fridge running while they're at work or asleep.
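
For anyone re-checking the math, a quick sketch using the figures quoted above (1.6 GW across 2 million homes); the last line is just that same 800 W averaged over a full day:

// Sanity check on the per-home power figures quoted in the thread.
#include <cstdio>
int main() {
    const double total_watts = 1.6e9;   // 1.6 GW = 1.6 billion watts
    const double homes       = 2.0e6;   // 2 million homes
    const double per_home_w  = total_watts / homes;              // 800 W
    const double per_home_kwh_day = per_home_w * 24.0 / 1000.0;  // 19.2 kWh/day
    std::printf("%.0f W per home, %.1f kWh per home per day\n",
                per_home_w, per_home_kwh_day);
}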
 
Ohhh, the P2 is back! Slot CPUs again.
 
I can't see 32 cores beating a GPU with 512 cores or 3200 SPs.
 
Is that a Radeon card with an Intel sticker? :laugh:

No.

This is what has had ATI and Nvidia scared for some time now. As Intel is the leading developer in advanced CPU architecture, it's not a stretch to think they could be, and are, looking into the GPU market. The first steps of that we already see in CPUs with integrated GPUs on the chip; this is just the next step.
 
I can't see 32 cores beating a GPU with 512 cores or 3200 SPs.

There is more to it than just cores; it's all about the architecture.
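
A back-of-the-envelope sketch of why the raw core count misleads: assuming Larrabee-style 512-bit vector units (16 fp32 lanes per core), 32 x86 cores already expose as many SIMD lanes as a 512-"core" GPU. What actually decides the outcome is clocks, memory bandwidth and how well real code keeps those lanes fed:

// Illustrative lane-count comparison only; figures are assumptions, not specs.
#include <cstdio>
int main() {
    const int x86_cores  = 32;
    const int simd_lanes = 16;    // 512-bit vector unit / 32-bit floats
    const int gpu_sps    = 512;   // "cores" in the GPU marketing sense
    std::printf("x86 fp32 lanes: %d\n", x86_cores * simd_lanes);  // 512
    std::printf("GPU fp32 lanes: %d\n", gpu_sps);                 // 512
    // Same lane count on paper -- architecture and software decide the rest.
}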
 
There is more to it than just cores; it's all about the architecture.

One of the reasons why Intel quad cores beat AMD hexa cores.
 
No.

This is what has had ATI and Nvidia scared for some time now. As Intel is the leading developer in advanced CPU architecture, it's not a stretch to think they could be, and are, looking into the GPU market. The first steps of that we already see in CPUs with integrated GPUs on the chip; this is just the next step.

In which case I still stand by my statement. Until Intel can prove they can write drivers, ATI and nVidia are far, far from scared of this. It's more along the lines of this: :roll:
 
I doubt these things need much in the way of drivers (just a little something to say "hey, you've got extra x86 processors over here"). Also, I suspect these cards are going to sell for $2000+ a pop, so it isn't like any of us are in the market for them. IBM, Cray, and other supercomputer manufacturers will likely buy them up by the hundreds of thousands, though.
 
In which case I still stand by my statement. Until Intel can prove they can write drivers, ATI and nVidia are far, far from scared of this. It's more along the lines of this: :roll:


Basically, Intel could rule the scientific research and discrete graphics card markets if they were willing to spend that much R&D money on it. They have deep pockets. But there isn't that much money to be made in the enthusiast and scientific markets versus family internet machines and business solutions.
 
It's the Intel version of NVIDIA Tesla. XD
 
I doubt these things need much in the way of drivers (just a little something to say "hey, you've got extra x86 processors over here"). Also, I suspect these cards are going to sell for $2000+ a pop, so it isn't like any of us are in the market for them. IBM, Cray, and other supercomputer manufacturers will likely buy them up by the hundreds of thousands, though.

Basically, Intel could rule the scientific research and discrete graphics card markets if they were willing to spend that much R&D money on it. They have deep pockets. But there isn't that much money to be made in the enthusiast and scientific markets versus family internet machines and business solutions.

If they didn't need drivers (or only very basic ones), then why do the nVidia Tesla and AMD FirePro cards (respectively) cost so much?

It's because of drivers.

Intel wants the cash cow. Problem is, they don't have the feed to feed it with.
 