
GeForce GTX 480 has 480 CUDA Cores?

I'm trying to think of nice things to say about the 480, but when I think about the shader clocks being permanently linked to the core, since the core clock is half the shader clock...

Well, in a little while we'll know for sure whether overclocking sucks or not.

Hopefully it can be unlinked like on the other cards, but with these new cards you never know. T.T
 
The more news tidbits that are released, the more this chip sounds like another HD 2900 disappointment. :(

As Wolf said: bring on the reviews.
 
I'm trying to think of nice things to say about the 480, but when I think about the shader clocks being permanently linked to the core, since the core clock is half the shader clock...

It's not exactly like that, IIRC; the shaders are not linked to the core clock. It's somewhat difficult to say whether the new approach makes the chip worse or better, though. Everything inside the GPC (TMUs, tessellator, setup engine, rasterizer) runs at half the speed of the SPs (the hot clock), but everything outside it (ROPs, main scheduler, L2 cache) runs at the core clock.

IMO it's mostly a good thing, because the only significant units moved to the higher clock are the TMUs. The setup engine, rasterizer and tessellators are supposedly much smaller than the SPs, TMUs or ROPs, so they should not keep the shader domain from reaching higher clocks, or hurt the temperatures and stability of the GPC. The units that are supposed to be more sensitive to clocks, like the ROPs and L2, remain at the slower core clock.
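To put rough numbers on the domains described above, here's a quick sketch; the 1400/700 MHz figures are only the rumored clocks, used as placeholders until real specs show up.

Code:
# Rough sketch of GF100's clock domains as described above.
# The clock values are rumored/placeholder numbers, not confirmed specs.

hot_clock = 1400            # SPs run at the shader ("hot") clock, in MHz
core_clock = 700            # ROPs, main scheduler and L2 run at a separate core clock, in MHz

gpc_clock = hot_clock / 2   # TMUs, tessellators, setup engine and rasterizer run at
                            # half the hot clock, so they scale with a shader overclock

print(f"SPs (hot clock):      {hot_clock} MHz")
print(f"Inside GPC (hot / 2): {gpc_clock:.0f} MHz")
print(f"Outside GPC (core):   {core_clock} MHz")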
 
How do we know what ATI needs? There aren't even competing cards out yet. Everyone on the green team keeps assuming this card will be so awesome, yet it has been how long, and all we keep getting is spin, more spin, and more spin on how awesome it is and how much better it will be. If you really believe all this and feel that good, then keep on breathing the fumes, man.

Putting the green versus red bit aside, Steve raises an even greater point:
The architecture of both GPUs and the software they run hasn't changed fundamentally. So until that does, all of the hype on upcoming GPUs as we know them is neither here nor there.
What we'll get is another powerful card that still falls down like all the rest, in all the same places, for all the same reasons - the same reasons that have existed for the last decade or more.

To me that's not impressive. I like big, I like powerful; it's how I like my vehicles, but not my computer components. I'm tired of getting bigger cases, bigger motherboards, bigger radiators and bigger PSUs, only to have the overrated max FPS of a game plummet straight back down to twenty-five frames because another character strolled onto the screen and all these supposed DirectX special effects, which unfortunately we cannot actually see, have just sucked away the performance.

Don't get me wrong, I'm pro DirectX. When people were crying and whining about DX10 being a failure, I wasn't. I understood, I got it. Had you tried to run a lot of the background processing of DX10 on DX9 (if it were possible), it wouldn't be a pretty sight, and DX11 brings some much-needed tools for developers.

But I'm just not pro waiting six months or more every year to see these 'fabled' graphics processors put on a pedestal and released, yet not provide anything really tangible over the last generation.

Considering brute power alone, and computational flexibility, something like a GTX 295 or 4870 X2 should be MORE than enough for modern games, and they usually are. Heck, I can run the X2 at clocks of 500/500 in about 90% of modern games and still have over 50 fps. But then you get those moments where it all comes crashing down, and no matter how powerful the cards, it never ends.
 
Considering brute power alone, and computational flexibility, something like a GTX 295 or 4870 X2 should be MORE than enough for modern games, and they usually are. Heck, I can run the X2 at clocks of 500/500 in about 90% of modern games and still have over 50 fps. But then you get those moments where it all comes crashing down, and no matter how powerful the cards, it never ends.

I know what you mean by the same old same old, man. I hate those frame rate drops too. However, bear in mind that quite often that low-fps bottleneck can also be at the CPU and not necessarily the GPU. Keeping those frame rates high and consistent is a big challenge when designing a game, and unfortunately it's not possible to prevent high-complexity/detail scenes from tanking the frame rate sometimes. This is why I love running my old DX7 games on my big, grossly overpowered rig: even the lowest points are doing over 100 fps (if vsync is unlocked) and the game runs smooth as butter 100% of the time. :D :rockout:
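Just to put the "max FPS is overrated" point into numbers, here's a quick sketch with made-up frame times; none of these figures come from a real benchmark.

Code:
# Made-up frame times (ms) for a one-second slice of gameplay: mostly fast
# frames, plus one heavy scene where extra characters stroll onto the screen.
frame_times_ms = [10] * 55 + [40] * 5   # 55 frames at 10 ms, 5 frames at 40 ms

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)

print(f"Average FPS:    {avg_fps:.0f}")    # ~80 FPS, looks fine on a benchmark chart
print(f"Worst-case FPS: {worst_fps:.0f}")  # the 25 FPS dip is what you actually feel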
 
What annoys me the most (sorry to those who do this) is when a series of cards isn't even out yet and people start talking about how the "next" version is going to be so much better.

Seriously, let's wait for what's not even out yet first.


+1 dude.
 
Considering brute power alone, and computational flexibility, something like a GTX 295 or 4870 X2 should be MORE than enough for modern games, and they usually are. Heck, I can run the X2 at clocks of 500/500 in about 90% of modern games and still have over 50 fps. But then you get those moments where it all comes crashing down, and no matter how powerful the cards, it never ends.


I see your point, but at the same time I don't. Yeah, they all come crashing down, but crashing down for an X2 or a 295 is like 25 fps, which is annoying but still OK. Crashing down for a single 260 or 4870 is like 15 fps... which is just jerky enough to send me into epileptic shock.

Plus:

All brand new architectures are ahead of their time, because no developer will spend oodles of money and time to develop a game for a hardware feature that 0.0001% of the gaming market has. (Sometimes a token game comes out with ATI/NV sponsorship, but it changes nothing.)

It's just exciting because these cards do bring something new to the table... unlike the 4xxx or GT200 or G92 or RV670 - it's been a long time since that has happened.
 
Fermi is meant for parallel processing.

The disabled SM is to control TDP.
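For reference, the CUDA core count in the thread title falls straight out of the SM math; here's a quick sketch using the widely reported GF100 configuration (16 SMs of 32 cores each, with one SM reportedly disabled on the GTX 480 and two on the GTX 470).

Code:
# Reported GF100 configuration: 16 SMs with 32 CUDA cores each.
CORES_PER_SM = 32
TOTAL_SMS = 16

def cuda_cores(enabled_sms):
    """CUDA core count for a given number of enabled SMs."""
    return enabled_sms * CORES_PER_SM

print("Full GF100:", cuda_cores(TOTAL_SMS))      # 512
print("GTX 480:   ", cuda_cores(TOTAL_SMS - 1))  # 480 (one SM disabled to rein in TDP/yields)
print("GTX 470:   ", cuda_cores(TOTAL_SMS - 2))  # 448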
 
The more news tidbits that are released, the more this chip sounds like another HD 2900 disappointment. :(

As Wolf said: bring on the reviews.

Don't worry, there are some GREEN web sites like GURU3D that will do some 8xAA benchmarks and say ATI was beaten by 10%, but they will not try 8xQ, which is the real 8x for Nvidia ;)

:roll: and there are fools who support that.....
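For anyone wondering about the 8xAA vs. 8xQ distinction: if I remember Nvidia's driver modes correctly, plain "8x" is a coverage-sample mode (CSAA) with only 4 color/Z samples, while 8xQ is true 8x MSAA. A rough sketch of the sample counts, from memory, so double-check against the control panel:

Code:
# Approximate sample layout of Nvidia's AA modes on G80+ hardware (from memory).
# 8xQ is the "real" 8x MSAA referred to above; plain 8x is CSAA.
aa_modes = {
    #  mode       (color/Z samples, coverage samples)
    "4x MSAA":   (4, 4),
    "8x CSAA":   (4, 8),    # what often gets labeled "8xAA" on Nvidia cards
    "8xQ MSAA":  (8, 8),    # true 8x multisampling, comparable to ATI's 8xAA
    "16x CSAA":  (4, 16),
}

for mode, (color, coverage) in aa_modes.items():
    print(f"{mode}: {color} color/Z samples, {coverage} coverage samples")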
 
Or perhaps nVidia is confident that Fermi will perform well at its price point and decided to disable a cluster to improve yields and pave the way for releasing a GTX 485 later on.

Just throwing some thoughts out there.

I think they'll first make a GTX 495 with 2x GTX 470, but with 512 cores. Then later, with better yields, a GTX 485 with some OC.
 
Don't worry, there are some GREEN web sites like GURU3D that will do some 8xAA benchmarks and say ATI was beaten by 10%, but they will not try 8xQ, which is the real 8x for Nvidia ;)

:roll: and there are fools who support that.....

+1, team green will release their next set of cards in 4 yrs, lol
 
Where did all the hate for nVidia's Fermi go??? Anyone??? lol. Don't get me wrong, I like ATI, but I knew this would happen. Nice move, Nvidia; the good price is the key. And nice work, ATI/AMD, keeping Nvidia in check in the price department...
 
So where's the board at then, huh Nvidia, huh!? I don't see it in Wizzard's or other members' hands. To me it's deader than dead itself.
 
The more news tidbits that are released, the more this chip sounds like another HD 2900 disappointment. :(

As Wolf said: bring on the reviews.

x2

All these ongoing delays, the lack of communication from Nvidia, and those rumours about an insanely high TDP are clearly pointing to a disastrous release from Nvidia. If they were confident in their product, they would brag about it to no end instead of hiding it far away from the rogue benchmarkers.
 
Shot of the A3 stepping Fermi die:

[Four die-shot photos: gf100die.jpg and three additional pictures of marked GF100 samples]

more HERE
 
Look at all the scared fanboys trying to convince themselves that these cards are full of fail. How entertaining.
 
Naked pictures are always nice, be they cores or more fleshy stuff :)

So at least 87 cores are ready :p I wonder if they leave that marker stuff under the IHS, or if those are just quality assurance samples.
 

Stares at pic.

Oh nooooo! Fermi A3 silicon also has 2% yields. Because we can clearly see the number 2 written on that die, and as everybody knows, when a company gets only a few samples back from the factory they write numbers on them (and only when they get very few of them; otherwise they would never write on them, it would be stupid to do so) and they always, ALWAYS show the one with the highest number, in this case a 2 (in the other, a 7 :)). That number clearly means 2% yields. :rolleyes:
 
The gap between 470 and 480 is now far too small. This is very bad news, imo.
 
I keep saying... rumooooorrrrrs. Show me the numbers.

There's no attack on Nvidia; it's just fair to recognize when there's a disappointment. That's how human civilization has evolved: by learning from errors.

ATi, for instance, failed with their HD2xxx/3xxx series; that's why I kept my X1650Pro a little longer. Nvidia delivered the mythical G92, and the sole reason I chose my HD4830 over the 9800GT was the price (in my country Nvidia is really overpriced).

Nvidia failed with the FX series and ATi triumphed with the 9xxx (9 a lucky number?).

There's no need to fight; even a fanboy has to admit when his/her company screwed up.

More rumors? See this:

http://www.semiaccurate.com/2010/02/20/semiaccurate-gets-some-gtx480-scores/
 
...
All brand new architectures are ahead of their time, because no developer will spend oodles of money and time to develop a game for a hardware feature that 0.0001% of the gaming market has. (Sometimes a token game comes out with ATI/NV sponsorship, but it changes nothing.)

It's just exciting because these cards do bring something new to the table... unlike the 4xxx or GT200 or G92 or RV670 - it's been a long time since that has happened.
It really is hit or miss when it comes to deploying a new architecture or hardware-level instructions. Heck, when MMX came out we all thought it was the future of games... but then graphics accelerators squashed it. And tbh, when that whole hardware T&L thing came out, very few games were using it and I thought it was just another fad... but then it became a staple of all games.
 
The gap between 470 and 480 is now far too small. This is very bad news, imo.

How so? Looking back at the GT200 cards, the gap between the GT400 cards is actually bigger. I know ATI's cards are a little different, but by nVidia standards this is about right.
 
Look at all the scared fanboys trying to convince themselves that these cards are full of fail. How entertaining.

IMO they are full of fail; released 6-7 months later than they should have been.
 