| System Name | My Gaming System |
|---|---|
| Processor | Intel i7 4770K @ 4.4 GHz |
| Motherboard | Asus Maximus VI Impact (ITX) |
| Cooling | Custom full-system water-cooling loop |
| Memory | G.Skill Sniper 8 GB @ 1866 MHz |
| Video Card(s) | EVGA GTX 780 Ti SC |
| Storage | Samsung SSD EVO 120 GB - Samsung SSD EVO 500 GB |
| Display(s) | ASUS W246H - 24" widescreen TFT active-matrix LCD |
| Case | BitFenix Prodigy |
| Power Supply | Corsair AX760 modular PSU |
| Software | Windows 8.1 Home Premium |
ACCORDING TO A REPORT at Semiaccurate, Nvidia's Fermi GPU chip has recently been downsized.
Nvidia has had to cut back its Fermi GPU because of the chip's low wafer yields and its high heat output, if the latest reports are to be believed.
Fermi has long been thought to be too hot to handle and Nvidia has now cut down the number of stream processors to 448 instead of the previous 512, and it has admitted that the GPU chip will be a 225 Watt part. Nvidia's own Fermi information pages still show it as having 512 SPs, but according to a recent PDF this is no longer the case.
Semiaccurate thinks Nvidia cut back the number of stream processors either because the Fermi chip can't be produced in volume with fewer than two major flaws in its SP array, or because its power draw and heat output were just too much for Nvidia's graphics board partners to deal with, or both. Those reasons do sound plausible.
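The yield argument is easy to illustrate with a toy model (this is my own sketch, not from the article): if defects land randomly on the die, a part that can fuse off a flawed SP cluster and still ship tolerates more defects, so a much larger fraction of dies become sellable. The defect rate below is a made-up assumption purely for illustration.

```python
import math

def yield_fraction(max_tolerated_flaws: int, mean_flaws_per_die: float) -> float:
    """Poisson model: probability a die has at most `max_tolerated_flaws`
    defects landing in its stream-processor array."""
    return sum(
        math.exp(-mean_flaws_per_die) * mean_flaws_per_die ** k / math.factorial(k)
        for k in range(max_tolerated_flaws + 1)
    )

# Hypothetical defect density: on average 2 flaws per die's SP array.
mean = 2.0

# A full 512-SP part tolerates no flaws; a cut-down part that fuses off
# a flawed cluster can (roughly) tolerate one flaw in this toy model.
full_die = yield_fraction(0, mean)  # ~13.5% of dies usable
cut_die = yield_fraction(1, mean)   # ~40.6% of dies usable
print(f"full 512-SP yield: {full_die:.1%}")
print(f"cut 448-SP yield:  {cut_die:.1%}")
```

Even with these invented numbers, the usable-die fraction roughly triples, which is why salvaging partially defective dies by disabling units is standard practice across the GPU industry.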
But really, a single GPU chip that draws 225W maximum and reportedly 190W gaming average is going to run hot and be hard to cool. Let's just put it this way, we wouldn't fancy leaving the thing running on top of some dried newspaper and kindling.
Update In response to readers' valid comments, we have revised the foregoing paragraph in recognition that other high-end graphics chips have similarly high maximum power draws, and we've amended other text below. However, for just one example, Nvidia's GTX280 graphics chip has been reported to pull less than 150W running 3DMark06, or just over 75 per cent of what Fermi is expected to draw in typical gaming. So we stand by our general opinion that Fermi is going to be a very hot graphics chip when it finally appears, and not in a good way. - Ed.
This new information makes some of the praise that's still on Nvidia's Fermi information page look a little odd. "Fermi surpasses anything announced by Nvidia's leading GPU competitor (AMD)", said Tom Halfhill, senior analyst and editor at Microprocessor Report. Dave Patterson, the director of the parallel computing research laboratory at UC Berkeley, added, "I believe history will record Fermi as a significant milestone."
Maybe not so much a milestone as a millstone, perhaps. Fermi is starting to seem like a GPU chip too far.
What seems certain is that Nvidia's fab partner TSMC is finding it challenging, to say the least, to manufacture the chip, and that it will have to be cooled by a massive heat-piped heatsink and likely multiple fans, if not a waterblock or a more exotic cooling method like phase-change refrigeration.
Whether this will all work out for the Green Goblin remains to be seen. But at this point it looks as though Nvidia's Fermi GPU chip might be delayed again. µ
http://www.theinquirer.net/inquirer/news/1567017/nvidia-fermi-gpu-cut
Being someone who prefers ATI over Nvidia... is this finally going to be the time that ATI takes back the crown? Or is this a sign that Nvidia's new Fermi is so powerful that it just can't be contained?
I posted similar threads in both the Nvidia and ATI forums to hear from both sides.
Enjoy .....FTW ATI!