
NVIDIA GF100 Graphics Card Chugs Along at CES

I too thought it looked like 3 wires at first, until I looked again. It is only two: 1x 6-pin + 1x 8-pin. It must just be the way they had the wires situated.

Also, did anyone notice that the guy running the demo happened to keep switching the modes during the really intense tessellation parts? It was either him doing it or that is the way they had it set up to run.
In the parts that did run with full tessellation, it looked a little choppy at times. I do think it looked a little smoother than my single 5870... but not by much. Not enough to really impress me.

So let's say for a second that this thing really does beat a 5870 by 36%. Well, I wouldn't be surprised; it is only coming out half a year later. If you ask me, that number wouldn't be that impressive for 6 months later.

But I still like the idea of the technology they are using, and I think it will be a great addition to the world of GPUs. It will keep things moving forward, that is for sure!

But if I were Nvidia and I was that confident in its performance, I would have left the FPS counters at the bottom and the top turned on, out there for all to see. The fact that they didn't makes me wonder.

I don't know. If the card really beats a 5870 by 36%, wouldn't Nvidia be touting it left and right? I'm hoping to see some concrete numbers by the end of CES. Really, we aren't seeing much that we haven't already seen.
 
I don't know. If the card really beats a 5870 by 36%, wouldn't Nvidia be touting it left and right? I'm hoping to see some concrete numbers by the end of CES. Really, we aren't seeing much that we haven't already seen.

Well, here is the video I was looking at; let me know what you think. I don't know if you have seen this one.

http://www.youtube.com/watch?v=gkI-ThRTrPY

But like I said, it looks smoother than my 5870, but he was also toggling through the tessellation. Plus, like I said, they had 6 months longer. Not that that is an excuse... my whole point, whether I am a 5870 owner or not, is that 6 months later and only 36% better (if that's what it is) is not that impressive. IMO :)
 
I don't know. If the card really beats a 5870 by 36%, wouldn't Nvidia be touting it left and right? I'm hoping to see some concrete numbers by the end of CES. Really, we aren't seeing much that we haven't already seen.

Yes, they would, as they did with the 8800 GT back then, for example. The reason for that trend is that they would keep headroom to overclock it, making the higher clock the new standard should AMD be able to match or beat their part. They are being very conservative by not providing this information, especially when they are possibly 2 quarters away from the cards being available for purchase. IMO, I call this a red flag. But perhaps before the day is over we will actually get performance numbers.
 
Realistically, it can only hurt them to release actual performance numbers, as the time between now and release would give AMD a set goal to beat with HD5XXX refreshes. I'd prefer to see it running a myriad of apps with stability, and no more.
 
Eh, I didn't expect performance numbers at all; I was hoping to see official specs, clock speeds and whatnot. But now I don't expect to see anything until the day of release... :(
 
I believe the very opposite of that. If they truly have the lead they are claiming, they could always compensate for a refresh by using that headroom and just increasing the clock rate, which is something they've always done.
 
Realistically, it can only hurt them to release actual performance numbers, as the time between now and release would give AMD a set goal to beat with HD5XXX refreshes. I'd prefer to see it running a myriad of apps with stability, and no more.

You do have a point there. But still, think about it: they released their GTX 200 series first and are releasing Fermi last. They have had way more time to work on their card, so it should be faster.

But it still seems fishy to me.
If they come out with a much faster, awesome card, I am at peace with that; I think that could only help the consumer in the end.
My whole point, though, is that everything they have been doing with this card just seems fishy.

But looking at it from this standpoint too: if I were Nvidia and I had the best card the world has ever seen, I might not want to say anything either, especially if I thought it might be the undoing of my competition! I just hope it doesn't go that far!
 
Heh... to me, I think Jen-Hsun has been seen at the poker tables in Vegas. ;)
 
I am sure the NV card will be faster than a 5870, but I'm also sure it will cost at least 50% more. The question is how much faster exactly, and how much more money exactly. They need to get some figures out there fast, as ATI still has the fastest (and only) DX11 cards and is taking more of the market share every day. Plus, with a huge die on the NV card, the chance of yield problems at the foundry, and thus short supply, will only push more people to choose the in-stock ATI variant if Nvidia doesn't have a large performance/price lead.


I know I can't wait to see what this card brings for folding, GTA4, and price. If the price is right, I'm going green this time.
 
Heh... to me, I think Jen-Hsun has been seen at the poker tables in Vegas. ;)

You too? I thought that was him, but I couldn't tell without him holding his fake tessellation card. ;)
 
I'm still waiting for someone to proclaim this as fake, and say there has to be an ATI card doing the DX11 demo.:ohwell:
 
From the pictures one can note the following facts:
• There is no backplate like on the Tesla Fermis
• Given the holes in the PCB, cooling seems to be trickier; but this could also be preparation for future SLI versions
• 1x 6-pin and 1x 8-pin power connector: according to the PCI-E specification, the card is allowed to draw up to 300 W

Is 300 W the max for PCI-E overall, or just for those connectors? I'm confused.

Source
 
I'm still waiting for someone to proclaim this as fake, and say there has to be an ATI card doing the DX11 demo.

It was; didn't you know ATI had a GeForce version?
 
Is 300 W the max for PCI-E overall, or just for those connectors? I'm confused.

Source

It's just saying that, according to the PCI-E specifications...

Code:
8-pin          = 150 W
6-pin          =  75 W
MB PCI-E slot  =  75 W

150 + 75 + 75 = 300 W

...the card is allowed to draw up to 300 W.

According to that, the card will draw somewhere between 225 W and 300 W, UNLESS it's like the Tesla cards, which also have 1x 8-pin and 1x 6-pin but only need 150 W of external power; that is, you can use 2x 6-pin OR 1x 8-pin. In that case power consumption would be up to 225 W, like the Tesla cards. IMO the GeForce card consumes 250-ish watts at the moment, and that's why the Nvidia PR guys told the press at the event that the configuration could change in the final product. If it were something like 275-300 W, they wouldn't be able to change anything.
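The connector arithmetic above can be sketched in a few lines of Python (the per-connector limits are the PCI-E figures quoted in the thread; the function name is just for illustration):

```python
# PCI-E power limits per source, in watts, as quoted above.
PCIE_LIMITS_W = {
    "slot": 75,   # power delivered through the motherboard slot itself
    "6pin": 75,   # one 6-pin auxiliary connector
    "8pin": 150,  # one 8-pin auxiliary connector
}

def max_board_power(aux_connectors):
    """Maximum allowed draw: the slot plus each auxiliary connector."""
    return PCIE_LIMITS_W["slot"] + sum(PCIE_LIMITS_W[c] for c in aux_connectors)

# GF100 as shown at CES: 1x 6-pin + 1x 8-pin
print(max_board_power(["6pin", "8pin"]))  # 75 + 75 + 150 = 300

# Tesla-style option mentioned above: 2x 6-pin instead of the 8-pin
print(max_board_power(["6pin", "6pin"]))  # 75 + 75 + 75 = 225
```

Which is why the two plausible ceilings in the discussion are 300 W and 225 W.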
 
I shall keep my comments to myself lol :eek:... go green team, go! :laugh:
Maybe next time :slap:
 
So this one wasn't put together with wood screws? :wtf:

baaaaaaaa hahahaha :roll:
 
He's claiming Fermi to be faster than the HD5970, and that's from an Nvidia guy this time. Real or not, it's one step closer to being an official statement rather than rumors or fakes (or were they really fakes?). I'll choose to believe him, because some months ago I made my own calculations based on the specs and reached the same conclusion, as some of you may remember. We'll see, but I'm optimistic.

He said "GPU", so I assume he meant a single GPU card :laugh:.
 
AMD does occasionally refer to its dual-GPU cards as GPUs.
 
Mmmmmm, I can't wait! Fermi, here I come!!!
 
And they said "fastest GPU". No need to remark "fastest" if they were talking about the HD5870; no one is going to think they were referring to the HD5850. :laugh:

And like bta said, ATI themselves refer to their dual-GPU cards as simply "GPU". Ever since the HD3870 X2 they have tried to push the idea that the X2 is the high end and the single-GPU cards are the performance-level cards. They don't want any differentiation based on the number of GPUs; the X2 card is just their top card (even the new name, HD5970, points to this), hence the fastest GPU.
 
I'll say it again: it does not make a difference. Besides, they are not doing a performance evaluation there, so there's no scope to even speculate on something this trivial.

Gosh, for some reason, I've come to respect you.
 
As far as Tom's Hardware is concerned, I think they lost their status as "independent reviewers" a long time ago, especially when it comes to GFX cards. Their GFX charts constantly favor the green team, to an absurd extent.

Let me see some actual test results and we can start talking again, Tom!
 
As far as Tom's Hardware is concerned, I think they lost their status as "independent reviewers" a long time ago, especially when it comes to GFX cards. Their GFX charts constantly favor the green team, to an absurd extent.
Any direct evidence for that?
 