
NVIDIA Quashes Larrabee and Fusion Hype

I am an NVIDIA owner, but they seem to be getting desperate. They know they have no answer for Fusion, and AMD will be taking a lot of OEM systems and eliminating the need for dedicated graphics. NVIDIA is just downplaying Fusion because they have no answer for it, or their answer will be late.

Not true. For media enthusiasts, a dedicated GPU with high memory bandwidth is a must for accelerating hi-def, and as for gaming, well, nuff said. And Intel hasn't built a quality GPU in 15 years; the last decent thing they built was the i740 to compete with the Riva 128.
 
Lemme just say this about Fusion:
It will sell like water!!
For laptops, UMPCs, low-cost machines, etc...
 
Actually candle, the way AMD is putting out the specs for Fusion, it will do hi-def very well, not to mention the CPU is gonna be multi-core, so it will have plenty of bandwidth to do what's needed. This is a boon not only for laptops, HTPCs and netbooks, but also for entertainment devices, cell phones, etc. It's just the beginning. I for one would love to get my hands on a kick-ass Fusion chip with a decent GPU (hell, an HD 3200 would work) and work out some sort of CrossFire power-saving feature on my desktop.

@ the dude who posted the "expensive millimeters" bit from the NVIDIA guy: I was so gonna post that, but I didn't 'cause you put it there. Thanks man.
 
No, I don't think so. NVIDIA makes good GPUs; maybe they want to tell AMD "we can do CPUs too," but that hasn't happened.
 
I think NVIDIA needs to rework a GPU to be used as a CPU, since a GPU is so much faster. Think of the possibilities if ATI started making AMD CPUs. AMD would be king until Intel found a way to steal what they were doing.
 
Oh ya? I remember when people used to say dual video cards wouldn't work, back in the Voodoo 2 days. GPUs are already processing CPU information via CUDA. If there's a will, there's a way.
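Just as a rough illustration of what "processing CPU information via CUDA" looks like in practice (a minimal sketch; the kernel name, array sizes and values below are arbitrary, and this is the standard vector-add pattern rather than anything vendor-official):

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each thread handles one array element: plain host data is copied to the
// GPU, processed in parallel, and copied back.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                      // 1M elements, arbitrary
    const size_t bytes = n * sizeof(float);

    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f\n", hc[0]);             // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

The work here is uniform and data-parallel, which is exactly the kind of "CPU information" that maps well onto a GPU.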
 
It's not nearly the same thing.
 
A GPU can do 95% of the work a CPU can; add a few hundred more transistors to it and you will have a GPU that can double as a CPU. You can also use CUDA to translate x86 into something a GPU understands.
 
Fusion is going to be a godsend to every laptop owner out there.
 
Depends on power, really. How many low-power Atom-class chips can AMD make at 45 nm?
 
A GPU can do 95% of the work a CPU can; add a few hundred more transistors to it and you will have a GPU that can double as a CPU. You can also use CUDA to translate x86 into something a GPU understands.
Exactly. People say it's different only because they think GPU = graphics. Well, just program it to do something else.
 
That's the whole point of Fusion: dirt-cheap OEM systems that need to run Vista Aero, play back some basic video, and be cheap, cheap, cheap. This is by far the biggest market in the PC industry, about 5,000 times bigger (educated guess) than all this GTX 260/280, 4870 X2 stuff.

Completely agree, and unless I'm mistaken, this OEM market has been the intended primary market for Fusion since AMD first acknowledged its existence.



OT - I wouldn't be surprised if Intel has some kind of smart-ass comment in response to these statements from nVidia...

...although I'm fairly certain AMD will keep their comments to themselves and let Fusion sales (once it's released) speak for themselves.
 
So what? NVIDIA created GeForce 9 hype...
 
A GPU can do 95% of the work a CPU can; add a few hundred more transistors to it and you will have a GPU that can double as a CPU. You can also use CUDA to translate x86 into something a GPU understands.

It's not just that easy! You have to redo the whole architecture and change a lot of stuff; that's gonna take years.
 
It's not just that easy! You have to redo the whole architecture and change a lot of stuff; that's gonna take years.

Indeed.

CPU = slow all-terrain vehicle
GPU = fast car designed for drag-racing
 
A GPU can do 95% of the work a CPU can; add a few hundred more transistors to it and you will have a GPU that can double as a CPU. You can also use CUDA to translate x86 into something a GPU understands.

A GPU is terrible at normal computing. The architecture is not efficient at anything but floating-point calculations. They do horribly at everything else, which is where most desktop apps reside.

And using CUDA to translate x86 into something a GPU can understand is called emulation. Emulation would make it perform worse than just using a CPU to begin with.
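To spell out why that counts as emulation (a toy sketch; the three-opcode "guest" ISA below is invented for illustration and is nothing like real x86): every guest instruction has to be fetched, decoded and dispatched in a strictly serial loop, so one emulated core gets only one GPU thread's worth of scalar performance, which is exactly where a GPU is weakest.

#include <cstdio>
#include <cuda_runtime.h>

// Toy "guest" ISA: NOT x86, just an invented three-opcode machine used to
// show the shape of an emulator's fetch/decode/execute loop.
enum Op { OP_ADD, OP_MUL, OP_HALT };
struct Insn { int op; int dst; int src; };

// One GPU thread emulates one guest CPU. Every guest instruction costs a
// fetch, a decode (the switch) and an execute step, all serialized, so the
// GPU's parallelism buys nothing for a single emulated core.
__global__ void emulate(const Insn *code, float *regs)
{
    int pc = 0;
    for (;;) {
        Insn i = code[pc++];                    // fetch
        switch (i.op) {                         // decode / dispatch
        case OP_ADD: regs[i.dst] += regs[i.src]; break;
        case OP_MUL: regs[i.dst] *= regs[i.src]; break;
        case OP_HALT: return;
        }
    }
}

int main()
{
    Insn prog[] = { {OP_ADD, 0, 1}, {OP_MUL, 0, 1}, {OP_HALT, 0, 0} };

    Insn *dcode;
    float *regs;
    cudaMalloc(&dcode, sizeof(prog));
    cudaMallocManaged(&regs, 4 * sizeof(float));
    regs[0] = 1.0f; regs[1] = 2.0f;
    cudaMemcpy(dcode, prog, sizeof(prog), cudaMemcpyHostToDevice);

    emulate<<<1, 1>>>(dcode, regs);             // a single emulated core
    cudaDeviceSynchronize();
    printf("r0 = %.1f\n", regs[0]);             // (1 + 2) * 2 = 6

    cudaFree(dcode);
    cudaFree(regs);
    return 0;
}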

Besides that, modern CPUs are more complicated to design than modern GPUs. GPUs don't have branch predictors, out-of-order execution, ways to compensate for cache misses, etc., etc. (see the sketch below).

The short version: GPUs make bad general-purpose CPUs.
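A quick sketch of the branching point (assumptions: the kernel names and sizes are invented for illustration): threads in a warp run in lockstep, so a data-dependent if/else forces the hardware to execute both paths with the inactive threads masked off, whereas a CPU's branch predictor simply follows the taken path. Uniform floating-point math like the first kernel is the case a GPU is actually built for.

#include <cuda_runtime.h>

// Uniform, data-parallel floating-point work: the case a GPU is built for.
__global__ void scale(float *x, float k, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        x[i] *= k;                              // every thread does the same math
}

// Data-dependent branching: threads in the same warp that disagree on the
// condition diverge, and the warp runs both sides serially with inactive
// threads masked off. A CPU with branch prediction and out-of-order
// execution handles this kind of code far more gracefully.
__global__ void branchy(const int *cond, float *x, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (cond[i] & 1)
        x[i] = x[i] * 2.0f + 1.0f;              // path A
    else
        x[i] = x[i] * 0.5f - 1.0f;              // path B
}

int main()
{
    const int n = 1 << 16;
    float *x;
    int *c;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(int));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; c[i] = i; }

    scale<<<(n + 255) / 256, 256>>>(x, 3.0f, n);
    branchy<<<(n + 255) / 256, 256>>>(c, x, n);
    cudaDeviceSynchronize();

    cudaFree(x);
    cudaFree(c);
    return 0;
}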
 
Imagine the big fans and the power consumption just for the CPU. Everyone would have to buy high-capacity PSUs, and you could forget about HTPCs.
 
Everybody uses big fans on CPUs anyway, either that or water cooling.
 
Where did you get your statistics from?
 