Discussion in 'News' started by btarunr, Jun 21, 2011.
But will it run Crysis?
It's going to try, and fail miserably at it.
An x86 PCIe "blade," aka larrabee-phoenix, is a great way to scale a PC/workstation/server for asymmetric computational power.
One CPU "horse" and 100 "mice". Perfect for the 99% of real-world applications where the CPU isn't at 90%+ utilization all the time, but where you want lots of background threads, services, applications, or VMs kept "ready".
But I have said it before and will say it again: the Windows task scheduler is not yet up to scratch to handle this. MS needs to build a more sophisticated scheduler that understands asymmetric CPU scaling.
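A scheduler that understood asymmetric cores could, for example, confine low-priority background work to the slow "mice" and reserve the fast "horse" for the foreground task. A minimal sketch of that idea, using Linux's CPU-affinity API (not Windows') purely as an illustration, with the "horse"/"mice" split being a hypothetical assignment:

```python
import os

# Hypothetical split: call CPU 0 the fast "horse" and the rest the "mice".
all_cpus = os.sched_getaffinity(0)
horse = {0}
mice = (all_cpus - horse) or horse  # single-core fallback: everything is CPU 0

def confine_to_mice():
    """Pin the current process to the slow cores only (Linux-only API)."""
    os.sched_setaffinity(0, mice)

confine_to_mice()
confined = os.sched_getaffinity(0)   # now limited to the "mice" set
os.sched_setaffinity(0, all_cpus)    # restore the full set afterwards
```

A real asymmetric-aware scheduler would do this dynamically and per-thread, but the affinity mask is the basic mechanism available today.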
Oh, I get it: just like those Pentium 4s, ordinary x86 architecture, like a CPU.
2 million homes? 1.6GW/2000000=0.8Watt per home
Until Intel can make drivers: I'll believe it when I see it.
If nVidia is around long enough, then they will be first. If they are not, then AMD will be. In either case Intel will be last. If I could put money on it, I would.
1.6GW is 1.6 billion watts...not 1.6 million. It would equal 800W per home.
Is that really enough for a household? A normal one, that is. I know a lot of us on here use half to almost that much through our computers, but 800 W? That sounds like an Xbox, a TV, a light per room, no AC, and when you want to use the microwave you'd better unplug your fridge, lol.
Averaged over the course of a day, it might be 800 watts per home.
Lots of people only have the fridge running during work/sleep anyway.
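The back-of-the-envelope numbers above are easy to check, taking the figures from the posts (1.6 GW total, 2 million homes) as given:

```python
# Check the per-home figure: 1.6 GW shared across 2 million homes.
total_watts = 1.6e9      # 1.6 GW = 1.6 billion watts, not 1.6 million
homes = 2_000_000

watts_per_home = total_watts / homes
kwh_per_day = watts_per_home * 24 / 1000  # continuous draw over 24 hours

print(watts_per_home)  # 800.0 W per home
print(kwh_per_day)     # 19.2 kWh per day
```

So the original 0.8 W figure came from reading "GW" as millions of watts; the corrected 800 W is an average continuous draw, i.e. about 19 kWh per day.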
Ohhh, the P2 is back! Slot CPUs again.
Don't forget P1 and early P3 stuff, as well as some AMD-style slot processors!
Nice high-flow bracket.
I can't see 32 cores beating a GPU with 512 cores or 3200 SPs.
This is what has had ATI and NVIDIA scared for some time now. Intel is the leading developer of advanced CPU architecture, so it's not a stretch to think they could be, and are, looking into the GPU market. The first step of that we already see in CPUs with integrated GPUs on the chip; this is just the next step.
There is more to it than just cores, it's all about the architecture.
One of the reasons why Intel quad cores beat AMD hexa cores.
This is true in most cases.
In which case I still stand by my statement. Until Intel can prove they can write drivers, ATI and nVidia are far, far from scared of this. It's more along the lines of this:
I doubt these things need much in the way of drivers (just a little something something to say "hey, you've got extra x86 processors over here"). Also, I suspect these cards are going to sell for $2000+ a pop, so it isn't like any of us are in the market for them. IBM, Cray, and other supercomputer manufacturers will likely buy them up by the hundreds of thousands, though.
Basically, Intel could rule the scientific research and discrete graphics card markets if they were willing to spend that much R&D money on it. They have deep pockets. But there isn't that much money to be made in the enthusiast and scientific markets versus family internet machines and business solutions.
Would you say this is the x86 equivalent of CUDA?
It's the Intel version of NVIDIA Tesla. XD
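The comparison holds in one important way: unlike CUDA, which requires its own kernel programming model, a many-core x86 card could in principle run ordinary threaded code unchanged. A minimal sketch of that style of data-parallel code, using only the Python standard library and assuming nothing card-specific:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker runs ordinary code on an ordinary x86 core;
    # no special kernel language is required.
    return sum(x * x for x in chunk)

data = list(range(1000))
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]

# Split the work across a pool of threads and combine the partial results.
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # same value as the serial sum(x * x for x in range(1000))
```

The same divide-and-combine pattern would just scale out to more workers on a 32- or 50-core x86 part, which is exactly the pitch being made for these cards.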
If they didn't need drivers (or only very basic ones), then why do the NVIDIA Tesla and AMD FirePro cards cost so much?
It's because of drivers.
Intel wants the cash cow. Problem is, they don't have the feed to feed it with.