
Larrabee Only by 2010

btarunr

Editor & Senior Moderator
Last week, Intel announced its Visual Computing Research Center at Saarland University in Saarbrücken, Germany. During its opening ceremony, details emerged about when Intel plans to commercially introduce Larrabee, the company's take on graphics processing using x86-based parallelism. The company categorically stated that one could expect Larrabee to be out only by early 2010.

"I would expect volume introduction of this product to be early next year," said Intel chief executive Paul Otellini. Until now, Larrabee was known to be introduced coarsely around the 2009-2010 time-frame. "We always said it would launch in the 2009/2010 timeframe," said Intel spokesperson Nick Knupffer in an email to PC Magazine. "We are narrowing that timeframe. Larrabee is healthy and in our labs right now. There will be multiple versions of Larrabee over time. We are not releasing additional details at this time," he added. In the same event, Intel displayed a company slide with a die-shot of Larrabee, revealing what looked like the x86 processing elements. Sections of the media were abuzz with inferences drawn on the die-shot, some saying that it featured as many as 32 processing elements.

 
since we're in the middle of 09 now, that means 6-12 months.

That means they're competing with next-gen Nvidia and ATI products... it'll be a fun period :)
 
since we're in the middle of 09 now, that means 6-12 months.

That means they're competing with next-gen NVIDIA and ATI products... it'll be a fun period :)

FTFY. Maybe this is why NV is going close to overkill with GT300. Can't underestimate someone like Intel.
 
FTFY? Furry tree flies... yellow?
 
To be honest, if it has performance even similar to what we have NOW, it will be way over my expectations. Really, who believes Larrabee will compete with G300 and RV800?
Although, you don't need to be the performance leader to make a profit (AMD, for example).
 
To be honest, if it has performance even similar to what we have NOW, it will be way over my expectations. Really, who believes Larrabee will compete with G300 and RV800?
Although, you don't need to be the performance leader to make a profit (AMD, for example).

I believe it will compete. My god, if anyone can pull off 32 CPUs matching a GPU, it's Intel.
 
FTFY = fixed that for you. Compare the post I quoted to yours.
 
oh lawl. typo for the lose.
 
But adding more shaders doesn't give the same performance yield as it once did for GPU architectures like ATI's and NVIDIA's.
I hope Larrabee doesn't turn out to be a flop; it's such a unique architecture. Intel's taking quite a gamble and has to fight the stigma their poopy integrated graphics offerings have left them with.
Just reading about Larrabee makes it sound ever more awesome: built off the Pentium architecture but modified like crazy. The way it splits threads into strands and all that is cool; it doesn't use stream processors AFAIK.
I think PC Perspective has some detailed articles about it.

To be honest, if it has performance even similar to what we have NOW, it will be way over my expectations. Really, who believes Larrabee will compete with G300 and RV800?
Although, you don't need to be the performance leader to make a profit (AMD, for example).


I believe that to begin with, we'll be lucky if we see close to the same performance as current cards. The benefit, though, is that unlike the card manufacturers, which will have to go back to the drawing board when slapping more SPs on doesn't make their cards faster, Intel (provided the product doesn't get canned) could in theory just cruise along with its scalable performance, not needing to worry about hitting some performance wall.

Anyway, Intel may face the same conundrum they had with Itanium, which I hope doesn't stop Larrabee from being a big change to computing/gaming/etc.
 
2010 GPU market by performance: #1 AMD/ATI, #2 Intel, #3 NVIDIA. GT300 = EPIC FAIL
 
Larrabee as a GPU is a trojan horse.

With 32x 512-bit vector MIMD it will whoopass CUDA.

A 512-bit vector can handle 8 double-precision (64-bit) floats or long integer variables. And there are 32 of them. That is equivalent to 256x the throughput of a regular math unit in a single-core processor.

That means for math work, Larrabee has the potential to be (256/4) = 64x as fast as a quad-core CPU clock-for-clock. It will probably run at half the frequency of a regular CPU, maybe slower, but that would still equate to at least 16-24x the speed of a quad core.

And math libraries can be very rapidly modified for Larrabee x86.

R.I.P. CUDA/Tesla.

nV know this. GT300 has to compete. Look at the panic in their eyes.

Moore's Law will be bust open for the 2008-2010 season. And the Soda-Darwin law applies: survival of the fastest.
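The lane arithmetic in the post above can be sanity-checked with a quick back-of-envelope script. This is a minimal sketch: the 32-core count and 512-bit vector width are the post's rumored figures, not confirmed Intel specs, and the quad-core baseline assumes one scalar double per core per cycle.

```python
# Back-of-envelope check of the vector-lane arithmetic above.
# Assumed figures (from the post, not official specs):
VECTOR_WIDTH_BITS = 512   # per-core vector unit width
CORES = 32                # rumored Larrabee core count
DOUBLE_BITS = 64          # a double-precision float is 64 bits wide

# 512 / 64 = 8 doubles processed per vector per core
lanes_per_core = VECTOR_WIDTH_BITS // DOUBLE_BITS

# 8 lanes x 32 cores = 256 lanes chip-wide
total_lanes = lanes_per_core * CORES

# Baseline: a quad-core CPU retiring one scalar double per core per cycle
speedup_clock_for_clock = total_lanes // 4

print(lanes_per_core, total_lanes, speedup_clock_for_clock)  # 8 256 64
```

At half the clock of a desktop CPU, that theoretical 64x clock-for-clock figure would halve to 32x, which is roughly where the post's 16-24x estimate lands once you discount further for real-world utilization.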
 
Oh, and you can bet your ass Intel will put out an i7 server motherboard with 4 PCI-E x16 slots just waiting for 4x Larrabees to pillage CPU farms everywhere.
 
Larrabee as a GPU is a trojan horse.

With 32x 512-bit vector MIMD it will whoopass CUDA.

A 512-bit vector can handle 8 double-precision (64-bit) floats or long integer variables. And there are 32 of them. That is equivalent to 256x the throughput of a regular math unit in a single-core processor.

That means for math work, Larrabee has the potential to be (256/4) = 64x as fast as a quad-core CPU clock-for-clock. It will probably run at half the frequency of a regular CPU, maybe slower, but that would still equate to at least 16-24x the speed of a quad core.

And math libraries can be very rapidly modified for Larrabee x86.

R.I.P. CUDA/Tesla.

nV know this. GT300 has to compete. Look at the panic in their eyes.

Moore's Law will be bust open for the 2008-2010 season. And the Soda-Darwin law applies: survival of the fastest.

NVIDIA switched to MIMD, but I think these cards have the potential to surprise everyone. I think they might be the first PCI-E 3.0 cards too

well, maybe not lol:D

but it's really funny how nobody is talking about how NVIDIA stole Intel's plan to use MIMD. They obviously did, but everyone still rags on Intel for doing things differently
 
I haven't seen it mentioned even once that they will be made as a graphics card.
 
Bah, 6-12 more months? I don't know if my 8800 GT can take that much more abuse. :(
 
actually one thing gets to me.

what the hell does the "only" in the title actually mean?
 
I suppose he knew that already :)
I think he meant "what's their next move in the graphics sector?"
 
Good news; gives me time to get my system running
 
I think Intel is taking too long to try to hit a moving target. Plus, they haven't had a good track record with graphics performance. NVIDIA and AMD will at least have revised versions of G300 and RV800 out by the time Intel releases Larrabee, and they may be well on the way to G400 and RV900 by then. It looks like G300 will be a good bit faster than G200, and AMD will be doing their thing also, so I don't quite see how Intel will have anything better at release.

I do believe Intel can make a better one eventually, but they need to make the decision to put the effort and time in for the long term.
 
Intel has a great track record with graphics performance, at least for those who do only light gaming or no gaming at all. Their IGPs are wildly successful. Intel has only been working on this for 2-3 years at most, and it usually takes about 4-5 years to develop a new processor. The rate they are going at is excellent.

Again, Intel's design is superscalar. It isn't very hard for them to release a processor with far more cores, but it comes at a price. Intel shouldn't have much problem staying way ahead of NVIDIA and AMD.

Remember, the only thing that's really special about GPUs is their extremely high FLOP performance. It doesn't matter how a GPU gets it (that's only reflected in price/profit margins), so long as it gets it.
 