Thursday, October 1st 2009

NVIDIA 'Fermi', Tesla Board Pictured in Greater Detail, Non-Functional Dummy Unveiled

Unveiled during the keynote of the GPU Technology Conference 2009 by none other than NVIDIA CEO Jen-Hsun Huang, NVIDIA's Fermi architecture looks promising, at least in the field of GPGPU, which was discussed extensively in his address. The first reference board based on NVIDIA's newest 'GT300' GPU is a Tesla HPC processor card, which quickly became the face of the Fermi architecture. Singapore's HardwareZone and PCPop caught some of the first close-up pictures of the Tesla accelerator and the GPU's BGA package itself. Decked in a dash of chrome, the Tesla HPC processor card isn't particularly long; its designers have evidently compacted the layout a great deal. It draws power from one 8-pin and one 6-pin PCI-E power connector, which aren't located next to each other. The cooler's blower also draws air through openings in the PCB, and a backplate further cools the GPU (and possibly other components) from behind. From the looks of it, the GPU package isn't any larger than that of the GT200 or its predecessor, the G80. Looks like NVIDIA is ready with a working prototype against all odds, after all, doesn't it? Not quite. On close inspection of the PCB, this doesn't look like a working sample. Components that should have pins protruding through from the other side don't have them, and the PCB appears to end abruptly. Perhaps it's only a dummy made for display at GTC, to give an indication of how the finished card will look. In other words, it doesn't look like NVIDIA had a working prototype/sample of the card it intended to display the other day.
Sources: Singapore HardwareZone, PCPop

94 Comments on NVIDIA 'Fermi', Tesla Board Pictured in Greater Detail, Non-Functional Dummy Unveiled

#26
Hunt3r
Stupid... it will still be expensive
Posted on Reply
#27
a111087
It seems to me that NVIDIA just shows off this dummy so people will look away from the new ATI cards. It's like showing an even bigger bone to a dog so she forgets about all the other bones she already has, and she'll stare at the bigger bone even though she can't have it yet.
NVIDIA is just saying "look this way, geeks, we have something tasty for you if you don't spend your money on ATI cards; it can be yours REALLY soon, so just hold on! Just a little more!"
I don't blame them, it's a good tactic from a business point of view.
Posted on Reply
#28
[I.R.A]_FBi
newtekie1: Post counts and length of time here mean nothing, it is the quality of the post that matters. And frankly both sucked in that department. Though I'd take the post telling us to ignore the fanboy with a little decent on-topic information over the purely fanboyish post any day.

And with the exception of bta, I have the largest post count in this thread, and I've been here the longest, if that really matters to you, and I'll tell you IRA's post was very fanboyish and very trollish. But I'm sure he meant it more in a joking manner than a trollish/fanboyish manner.

And of course it is a PR stunt, that is exactly what these types of things are; ATI had bunches of them before the RV870 release, it is how the game works.

And there are official specs, they came out yesterday.
I realise things are taken more seriously than they need to be at times in these threads, so maybe I'll excuse myself.
Posted on Reply
#29
Binge
Overclocking Surrealism
a111087: It seems to me that NVIDIA just shows off this dummy so people will look away from the new ATI cards. It's like showing an even bigger bone to a dog so she forgets about all the other bones she already has, and she'll stare at the bigger bone even though she can't have it yet.
NVIDIA is just saying "look this way, geeks, we have something tasty for you if you don't spend your money on ATI cards; it can be yours REALLY soon, so just hold on! Just a little more!"
I don't blame them, it's a good tactic from a business point of view.
What other perspective is there for them? As long as they deliver a functioning product to the end user their business is sound. As an example, would you have liked a poster with the card's picture on it, or a life-sized model?
Posted on Reply
#30
15th Warlock
newtekie1: And with the exception of bta, I have the largest post count in this thread, and I've been here the longest, if that really matters to you, and I'll tell you IRA's post was very fanboyish and very trollish. But I'm sure he meant it more in a joking manner than a trollish/fanboyish manner.
Actually, I've been here the longest, and I also think he was joking :toast: :p

Can't brag about my post count though :o :respect:
Posted on Reply
#31
a111087
Binge: What other perspective is there for them? As long as they deliver a functioning product to the end user their business is sound. As an example, would you have liked a poster with the card's picture on it, or a life-sized model?
Like I said, I don't blame them; it was a good move for them.
Posted on Reply
#32
Unregistered
So when is this card supposed to be out? By out, I mean in the stores and ready for purchase.

Is team green really going to give ATI the entire holiday season with no direct competition? If they are, what a gift.
#33
AsRock
TPU addict
R_1: No offence man, but part of the PCB is cut off.
Just what I was thinking; you can see the 8-pin connector, and it looks like 4 pins of the 6-pin PCI-E connector have been cut off, lol. And then they just stuck the plastic parts of the PCI-E connectors wherever they thought looked good, lol.
Posted on Reply
#34
Onderon
happita: You shouldn't call veterans here with 5000+ posts "fanboys". It shows just how little you know, with your outstanding 17 total posts and having joined this community only a month ago.

I'm really looking forward to what NVIDIA has to offer in this generation. I do agree that this is a pathetic PR attempt to make NVIDIA FANBOYS drool over something that doesn't have any official specs to speak about.
pr0n Inspector: That's exactly right.
Onderon, learn to spam like them before calling them fanboys. :laugh:
(Excuse me, people, but I'm not spamming and my post count doesn't matter. I'm not going to say anything to you, but IF you read his post, "NVIDIOTIC display" was my conclusion. I'm not starting a flame war nor looking for trolls; it was just my comment. PS: try to read my other posts before judging me.) This may be obviated by others.

Yes, it was a great move by them, but the price has to be aggressive. Yeah, it will be a massive cruncher, but as time goes on more people want "bang for the buck", and they have to remember that some or most people won't use all the CUDA capabilities and will just want raw graphics power. Some speculation says that with a bigger processor, a bigger memory interface, and a multi-layer PCB, the price is going to be around $400-500.
Posted on Reply
#35
Velvet Wafer
Lol, this is just cheap. They won't even design the dummy right :laugh:
Posted on Reply
#36
CDdude55
Crazy 4 TPU!!!
Awesome, been waiting for NVIDIA to show something. :)
Posted on Reply
#37
TheLaughingMan
Nvidia reps. were asked about the chrome accent and they responded with, "We gotcha Chrome Package. We gotcha Woodgrain. We gotcha spinner fan that keeps spinning when your computer turns off."
Posted on Reply
#38
tkpenalty
Unfortunately the speculation behind the 2% yield rates may be true... this is baaaaaddddddd..

(secretly proclaiming victory).
Posted on Reply
#39
shevanel
"Nvidia seems to be gearing the world up for this. The mantra they keep chanting is that “graphics performance isn’t enough anymore.” Compute really matters a whole heckuva lot, they tell us. This sounds like PR code for “the card is going to be 50% more expensive than the competition and not 50% faster in games, so please place as much importance on GPU compute apps as possible so we look like a better value.”
Posted on Reply
#41
aj28
MAC292OH10: Would love to see how fast these crunch Weather Research & Forecasting simulations... :roll:
Wouldn't be surprised if those are the type of benchmarks we begin to see from nVidia as the PR stuff rolls on. Problem is, those aren't GeForce numbers, they're Quadro numbers. Hell, even most Quadro customers don't care about that stuff... Unless they roll this chip immediately to the GPGPU crowd (which doesn't actually exist yet), they're going to have issues marketing this thing.
newtekie1: And there are official specs, they came out yesterday.
The specs being released right now don't mean much. It can have as many cores as it wants, 'cause for all we know the thing could run at 200 MHz. Even nVidia doesn't have those numbers yet...
shevanel: This sounds like PR code for 'the card is going to be 50% more expensive than the competition and not 50% faster in games, so please place as much importance on GPU compute apps as possible so we look like a better value.'
For all the interest the community has shown, gamers don't care about GPGPU. Not enough to open their wallets for it anyway... If a card they already own supports it, yeah, maybe they'll fold in their spare time, but no one is going to go out and buy a GPU (or pay extra for one) because it has those types of capabilities. It's nonsense...

What we need is a G92 successor and a new Intel chipset.
Posted on Reply
#43
lism
There's more of a market for a Quadro or FireGL type of card than for plain GeForces or Radeons.

Did you know the hardware in a Quadro or FireGL is exactly the same, except for the GUID of the card itself? These cards basically sell for $1,000 to $2,000 on average because the driver support is much better than for a regular 3D card. Maybe NVIDIA is depending on this professional modelling market instead of gamers.

I guess that's where the big money is, and there will be a shrunk-down version of this card with a gamer stamp on it. On the other hand, NVIDIA was pretty lousy back in the DX 10.1 era, rebranding the 8x00 series into the 9x00, GT200, etc. I think ATI was the more innovative team this time, and NVIDIA is going to pay for it.

Until Q1, ATI pretty much holds the market, especially with Christmas coming. :)
Posted on Reply
#47
Beowulf
pr0n Inspector: Charlie Demerjian's a f--ktard. Save your brain cells, don't read his bullshit.
He might have a grudge against nVidia, but the things he brings up in the article are not false. So yes, read it and make up your own mind about it.

To me this certainly makes NVIDIA a company I'm less inclined to deal with. And these cards are about 5+ months out anyway, so this is just a poor attempt at ruining DAAMIT's party. :shadedshu
Posted on Reply
#48
ZoneDymo
Makes me wonder what Nvidia is thinking.
Why release a dummy model that looks like this, when we all know this is nothing like the final design?
Posted on Reply
#49
inferKNOX
TheLaughingMan: Nvidia reps. were asked about the chrome accent and they responded with, "We gotcha Chrome Package. We gotcha Woodgrain. We gotcha spinner fan that keeps spinning when your computer turns off."
Lol, classic! :laugh:

Wow, the saying's really true, huh? The bigger they are, the harder they fall (i.e., the bigger the company and the GPU, in this case)! :shadedshu
Posted on Reply
#50
pr0n Inspector
Beowulf: He might have a grudge against nVidia, but the things he brings up in the article are not false. So yes, read it and make up your own mind about it.

To me this certainly makes NVIDIA a company I'm less inclined to deal with. And these cards are about 5+ months out anyway, so this is just a poor attempt at ruining DAAMIT's party. :shadedshu
No, do not read it. Read this thread.
Posted on Reply