Thursday, January 7th 2010

NVIDIA GF100 Graphics Card Chugs Along at CES

NVIDIA's next-generation graphics card based on the Fermi architecture, whose consumer variant is internally referred to as GF100, got down to business at the CES event being held in Las Vegas, USA, performing live demonstrations of its capabilities. The demo PC houses one such accelerator, which resembles the card from past sightings, so it is safe to assume this is what NVIDIA's reference GF100 design will look like. The accelerator draws power from 6-pin and 8-pin power connectors. It has no noticeable back-plate, a black PCB, and a cooler shroud with typical NVIDIA styling. The demo rig was seen running Unigine Heaven in a loop, showing off the card's advanced tessellation capabilities in DirectX 11 mode. The most recent report suggests that market availability can be expected in March, later this year. No performance figures have been made public as yet.
A short video clip after the break.

Source: PCWatch
Add your own comment

105 Comments on NVIDIA GF100 Graphics Card Chugs Along at CES

#1
20mmrain
I don't know. If the card really beats a 5870 by 36%, wouldn't Nvidia be touting it left and right? I'm hoping to see some concrete numbers by the end of CES. Really, we aren't seeing much that we haven't already seen.
Well, here is the video I was looking at; let me know what you think. I don't know if you have seen this one.

http://www.youtube.com/watch?v=gkI-ThRTrPY

But like I said, it looks smoother than my 5870, but he was also toggling through the tessellation. Plus, like I said, they had 6 months longer. Not that that's an excuse....... my whole point, whether I am a 5870 owner or not, is that 6 months later and only 36% better (if that's what it is) is not that impressive. IMO :)
Posted on Reply
#2
EastCoasthandle
erocker said:
I don't know. If the card really beats a 5870 by 36%, wouldn't Nvidia be touting it left and right? I'm hoping to see some concrete numbers by the end of CES. Really, we aren't seeing much that we haven't already seen.
Yes, they would, as they did with the 8800 GTX back then, for example. The reason for that trend is that they would have headroom to overclock it, making the higher clock the new standard should AMD be able to match or beat their part. They are being very conservative by not providing this information, especially when they are possibly two quarters away from the card being available for purchase. IMO, I call this a red flag. But perhaps before the day is over we'll actually get performance numbers.
Posted on Reply
#3
cadaveca
My name is Dave
Realistically, it can only hurt them to release actual performance numbers, as the time between now and release will give AMD a set goal to beat with HD5XXX refreshes. I'd prefer to see it running a myriad of apps with stability, and no more.
Posted on Reply
#4
gumpty
Benetanegia said:

Tom's Hardware on Fermi - http://www.tomshardware.com/reviews/ces-2010-fermi,2527-4.html


He's claiming Fermi to be faster than the HD5970, and that's from an Nvidia guy this time.
No, he didn't. He claimed it was faster than AMD's fastest GPU, not AMD's fastest graphics card. He was claiming it was faster than a 5870.
Posted on Reply
#5
a_ump
Eh, I didn't expect performance numbers at all; I was hoping to see official specs, clock speeds 'n whatnot. But now I don't expect to see anything until the day of release... :(
Posted on Reply
#6
EastCoasthandle
I believe the very opposite of that. If they truly had the lead they are claiming, they would compensate by using that headroom to just increase the clock rate, which is something they've always done.
Posted on Reply
#7
20mmrain
cadaveca said:
Realistically, it can only hurt them to release actual performance numbers, as the time between now and release will give AMD a set goal to beat with HD5XXX refreshes. I'd prefer to see it running a myriad of apps with stability, and no more.
You do have a point there. But still, think about it: they released their GTX 200 series first and are releasing Fermi last. They have had hella more time to work on their card, so it should be faster.

But it still seems fishy to me.
If they come out with a much faster and awesome card..... I am at peace with that. I think that could only help the consumer in the end.
My whole focus, though, is that everything they have been doing with this card just seems fishy.

But looking at it from this stand point too..... If I were Nvidia and I had the best card the world has ever seen. I might not want to say anything as well. Especially if I thought that it might be the undoing of my competition! I just hope that it doesn't go that far!
Posted on Reply
#8
cadaveca
My name is Dave
Heh.. to me, I think Jen-Hsun has been seen at the poker tables in Vegas. ;)
Posted on Reply
#9
Steevo
I am sure the NV card will be faster than a 5870, but I'm also sure it will cost at least 50% more. The question is how much faster exactly, and how much more money exactly. They need to get some figures out there, and fast, as ATI still has the fastest/only DX11 card and is taking more of the market share every day. Plus, with a huge die on the NV card, the chance of yield problems at the foundry, and thus short supply, will only cause more people to choose the in-stock ATI variant if they don't have a large performance/price lead.


I know I can't wait to see what this card brings, for folding, GTA4, and price. If it is right I'm going green this time.
Posted on Reply
#10
20mmrain
cadaveca said:
Heh.. to me, I think Jen-Hsun has been seen at the poker tables in Vegas. ;)
You too? I thought that was him, but I couldn't tell without him holding his fake tessellation card.
Posted on Reply
#11
[H]@RD5TUFF
I'm still waiting for someone to proclaim this as fake, and say there has to be an ATI card doing the DX11 demo.:ohwell:
Posted on Reply
#12
TheMailMan78
Big Member
From the pictures one can note the following facts:
• There is no backplate like on the Tesla Fermis
• Given the holes in the PCB, cooling appears to be more of a challenge - but this could also be preparation for future SLI versions
• 1x 6-pin and 1x 8-pin power connector: according to the PCI-E specifications, the card is allowed to draw up to 300 watts
Is 300 watts the max for PCI-E as a whole, or just for those connectors? Ima confused.

Source
Posted on Reply
#13
20mmrain
[H]@RD5TUFF said:
I'm still waiting for someone to proclaim this as fake, and say there has to be an ATI card doing the DX11 demo.
It was. Didn't you know ATI had a GeForce version?
Posted on Reply
#14
Benetanegia
TheMailMan78 said:
Is 300watt max for a PCI-E or just those connections? Ima confused.

Source
It's just saying that according to the PCIe specifications...

code:
8-pin = 150 W
6-pin = 75 W
MB PCIe slot = 75 W

150 + 75 + 75 = 300 W


...the card is allowed to draw up to 300 watt

According to that, the card will draw something between 225 W and 300 W, UNLESS it's like the Tesla cards, which also have 1x 8-pin and 1x 6-pin but only need 150 W external; that is, you can use 2x 6-pin OR 1x 8-pin. In that case power consumption would be up to 225 W, like in the Tesla cards. IMO the GeForce card consumes 250-ish watts at the moment, and that's why Nvidia PR guys have told the press at the event that the configuration could change in the final product. If it were something like 275-300 W they wouldn't be able to change anything.
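[Editor's note] The power-budget arithmetic above can be sketched as a tiny helper; the connector limits are the standard PCIe figures quoted in the thread, and the function name is just for illustration:

```python
# Maximum power sources per the PCI Express specification,
# as quoted in the post above (all values in watts).
PCIE_SLOT_W = 75    # power delivered through the x16 slot itself
SIX_PIN_W = 75      # 6-pin auxiliary connector
EIGHT_PIN_W = 150   # 8-pin auxiliary connector

def max_board_power(six_pin=0, eight_pin=0):
    """Maximum power a card may draw given its auxiliary connectors."""
    return PCIE_SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(max_board_power(six_pin=1, eight_pin=1))  # GF100 demo card: 300
print(max_board_power(six_pin=2))               # Tesla-style 2x 6-pin budget: 225
```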
Posted on Reply
#15
fullinfusion
1.21 Gigawatts
I shall keep my comments to myself lol :eek:.... go green team..go! :laugh:
maybe next time :slap:
Posted on Reply
#16
OneCool
So this one wasn't put together with wood screws? :wtf:

baaaaaaaa hahahaha :roll:
Posted on Reply
#17
kid41212003
Benetanegia said:

He's claiming Fermi to be faster than the HD5970, and that's from an Nvidia guy this time. Real or not, it's one step closer to being an official statement, not rumors or fakes (now, were they really fakes?). I'll choose to believe him, because some months ago I made my own calculations based on the specs and reached that conclusion, as some of you may remember. We'll see, but I'm optimistic.
He said "GPU", so I assume he meant a single GPU card :laugh:.
Posted on Reply
#18
btarunr
Editor & Senior Moderator
AMD does occasionally refer to its dual-GPU cards as GPUs.
Posted on Reply
#20
Benetanegia
And they said "fastest GPU". There'd be no need to stress fastest if they were talking about the HD5870; no one is going to think they were referring to the HD5850. :laugh:

And like bta said, ATI themselves refer to their dual-GPU cards as simply "GPU". Ever since the HD3870X2 they have always tried to push the idea that the X2 is the high end and the single-GPU cards are the performance-level cards. They don't want any differentiation based on the number of GPUs; the X2 card is just their top card (even the new name, HD5970, points to this), hence the fastest card.
Posted on Reply
#21
[H]@RD5TUFF
20mmrain said:
It was. Didn't you know ATI had a GeForce version?
No I didn't, silly me. :wtf:
Posted on Reply
#22
Weer
btarunr said:
I'll say it again, it does not make a difference. Besides, they are not doing a performance evaluation there, so there's no scope to even speculate on something this trivial.
Gosh, for some reason, I've come to respect you.
Posted on Reply
#23
Thrackan
As far as Tom's Hardware is concerned, I think they lost their status as "independent reviewers" a long time ago, especially when it comes to GFX cards. Their GFX charts constantly favor the green team, to an absurd extent.

Let me see some actual test results and we can start talking again, Tom!
Posted on Reply
#24
Bjorn_Of_Iceland
Thrackan said:
As far as Tom's Hardware is concerned, I think they lost their status as "independent reviewers" a long time ago, especially when it comes to GFX cards. Their GFX charts constantly favor the green team, to an absurd extent.
Any direct evidence for that?
Posted on Reply
#25
Thrackan
Bjorn_Of_Iceland said:
Any direct evidence for that?
Just look at the charts, use common sense, and compare with plenty of other reviews.
Posted on Reply
Add your own comment