Friday, October 15th 2010

NVIDIA to Counter Radeon HD 6970 "Cayman" with GeForce GTX 580

AMD is moving through its product development cycle at a breakneck pace, and NVIDIA has trailed it in the DirectX 11 and performance-leadership race by months. This November, AMD will release "Cayman", its newest high-end GPU, which is expected to outperform NVIDIA's GF100, a serious cause for concern for the green team. NVIDIA is back to its old tactic of talking up GPUs that haven't even taken shape, in an attempt to water down AMD's launch. Enter the GF110, NVIDIA's new high-end GPU under design, on which the GeForce GTX 580 is based.

The new GPU is speculated to have 512 CUDA cores, 128 TMUs, and a 512-bit wide GDDR5 memory interface holding 2 GB of memory, with a TDP close to that of the GeForce GTX 480. In the more immediate future, there are prospects of a more realistic-sounding GF100b, which is basically the GF100 with all 512 of its CUDA cores enabled, retaining the 384-bit GDDR5 memory interface and 64 TMUs, with a slightly higher TDP than that of the GTX 480.
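For perspective, a 512-bit interface at GTX 480-like GDDR5 speeds would give roughly a third more theoretical bandwidth than the current 384-bit bus. A minimal sketch of the arithmetic (the ~3.7 Gbps effective data rate is an assumption carried over from the GTX 480, not part of the rumor):

# Theoretical GDDR5 bandwidth: bus width (bits) / 8 * effective data rate (Gbps) = GB/s
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(384, 3.7))  # ~177.6 GB/s, GTX 480-class interface
print(bandwidth_gb_s(512, 3.7))  # ~236.8 GB/s, the rumored 512-bit interface
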
Sources: 3DCenter.org, PCGH

195 Comments on NVIDIA to Counter Radeon HD 6970 "Cayman" with GeForce GTX 580

#51
HalfAHertz
derwin75That's terrific news about the new GF110 Fermi card. But I read somewhere online that the GeForce GTX 580 will have 580 CUDA cores, not 512. Here is the statement:

( NVIDIA announced a next-generation GeForce GTX 500 series, the first high-end model named GeForce GTX 580, based on the Fermi architecture.

The GeForce GTX 580 features 580 CUDA cores to go along with its "580" moniker and has a whopping 2560MB of 384-bit GDDR5 memory.

"Although we're very proud of the GTX 480," NVIDIA President and CEO Jen-Hsun Huang said, "the 400 series is merely a tease for what Fermi can really accomplish. When we release the 500 series later this year, I think everyone will be pleasantly surprised. ) By Softpedia News.

Anyone care to explain this? In my opinion I would rather buy a 580 CUDA core card than a 512 CUDA core card.
I think that's bull, because 580 is not divisible by either 32 or 48, so they'd need a completely new architecture to achieve it...
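(For what it's worth, the divisibility argument checks out in a couple of lines of Python; 32 and 48 are the shaders per SM in GF100 and GF104 respectively.)

# 580 doesn't divide evenly by either Fermi SM width, while 576 and 512 do fit
for per_sm in (32, 48):
    print(per_sm, 580 % per_sm, 576 % per_sm, 512 % per_sm)
# Output: 580 leaves a remainder of 4 for both widths; 576 fits both layouts,
# and 512 fits the 32-wide GF100 SMs.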
#52
Naito
This is ridiculous! Another half-assed re-brand from nVidia. Does this mean the whole 500 series is going to be full of other pointless re-brands as well? Disappointed, nVidia. I am an nVidia fan, and the only two cards I would even consider getting this generation are the GTX 460 and possibly the GTX 465, if it's unlockable to a GTX 470. They should concentrate on at least releasing this 'GTX 580' as a GTX 485 or something (or use 2x GF104 for a GTX 490?), and then improve upon the GF104 SP/cluster ratio and design, add more features, and make sure it doesn't guzzle too much juice, for the GeForce 500 series.

ATI looks more enticing every day. Good on them for kicking nVidia up the arse and increasing competition. Maybe one day I might build a rig with ATI in it...
#53
Naito
derwin75That's terrific news about the new GF110 Fermi card. But I read somewhere online that the GeForce GTX 580 will have 580 CUDA cores, not 512.
It won't be possible. It would more likely be 576 CUDA cores.
#54
the54thvoid
Intoxicated Moderator
This is just silly.

Everyone is now speculating about what NV's next big card is going to be...... Even though they don't even have it yet.

There is only one reason for this 'outlet' of info - to make those about to shell out for 69xx cards stop and go, "oh, maybe i should wait for this.."

Shame on all of you for falling head over heels in love with an idea that is really an attempt at FUD to stall people buying AMD.

Even Benetanegia with his well informed tech speak is just 'speculating'. And that's not an insult Ben :) I'm just saying that one simple release has sparked all this rumour. It's NV marketing doing what it does best - spreading doubt without foundation.

We really all need to realise both companies are out for our cash. And they'll both fight nasty to get it.

Please, let's stop falling for all this marketing shit.
#55
derwin75
HalfAHertzI think that's bull, because 580 is not divisible by either 32 or 48, so they'd need a completely new architecture to achieve it...
This is why Nvidia calls it GF110, as a new architecture. I'm sure Nvidia has already achieved it, but we have to wait and see. For now, it's only a rumor.
#56
Benetanegia
the54thvoidThis is just silly.

Everyone is now speculating about what NV's next big card is going to be...... Even though they don't even have it yet.

There is only one reason for this 'outlet' of info - to make those about to shell out for 69xx cards stop and go, "oh, maybe i should wait for this.."

Shame on all of you for falling head over heels in love with an idea that is really an attempt at FUD to stall people buying AMD.

Even Benetanegia with his well informed tech speak is just 'speculating'. And that's not an insult Ben :) I'm just saying that one simple release has sparked all this rumour. It's NV marketing doing what it does best - spreading doubt without foundation.

We really all need to realise both companies are out for our cash. And they'll both fight nasty to get it.

Please, let's stop falling for all this marketing shit.
Sorry man, that is not correct; you have to re-read the thread pretty much in its entirety. Most people are just whining about this thread/news. I am (and pretty much only me) speculating, and that's because I like speculating; just read any HD6000/Barts/Cayman thread... ;)
It's NV marketing doing what it does best - spreading doubt without foundation.
That's common; SI/NI was also announced when Fermi was about to launch. Fusion has always been there "being the future" after every Intel release. Intel's bla and Nvidia's bla, when AMD is releasing bla, and bla bla bla. Some people get hyped and some don't. I am one of those who doesn't really get hyped, ever, but I can't pass up an opportunity to speculate on something. What is disturbing is people who get offended by the fact that other people are hyped or not. Don't be one of those... ;)

EDIT: I say this ^^ because it's best for your sanity; you can start whining about these hyped people, next you'd take on people who vote in elections hoping for change, then religion, and you'd end up yelling at children in the streets "Santa does not exist, you prick! It's your parents". :laugh:
#57
bear jesus
the54thvoidThis is just silly.

Everyone is now speculating about what NV's next big card is going to be...... Even though they don't even have it yet.

There is only one reason for this 'outlet' of info - to make those about to shell out for 69xx cards stop and go, "oh, maybe i should wait for this.."

Shame on all of you for falling head over heels in love with an idea that is really an attempt at FUD to stall people buying AMD.

Even Benetanegia with his well informed tech speak is just 'speculating'. And that's not an insult Ben :) I'm just saying that one simple release has sparked all this rumour. It's NV marketing doing what it does best - spreading doubt without foundation.

We really all need to realise both companies are out for our cash. And they'll both fight nasty to get it.

Please, let's stop falling for all this marketing shit.
I have to admit, though, it is nice to stop for a second and think about what nvidia has got coming up in the future. I only expect to be choosing between AMD 5xxx, 6xxx or nvidia 4xx cards for my coming upgrade, but still, it's nice to stop thinking constantly about the 6870/50 for a moment like most of us have been recently. :laugh:
BenetanegiaSorry man, that is not correct; you have to re-read the thread pretty much in its entirety. Most people are just whining about this thread/news. I am (and pretty much only me) speculating, and that's because I like speculating; just read any HD6000/Barts/Cayman thread... ;)
True, most are complaining or almost mocking the idea of the spec :laugh:
#58
derwin75
NaitoIt wont be possible. It would be more like 576 cuda cores.
When it comes to graphics technology, anything is possible, so nothing is impossible. Nvidia is always full of surprises.
#59
stupido
the54thvoidThis is just silly.

Everyone is now speculating about what NV's next big card is going to be...... Even though they don't even have it yet.

There is only one reason for this 'outlet' of info - to make those about to shell out for 69xx cards stop and go, "oh, maybe i should wait for this.."

Shame on all of you for falling head over heels in love with an idea that is really an attempt at FUD to stall people buying AMD.

Even Benetanegia with his well informed tech speak is just 'speculating'. And that's not an insult Ben :) I'm just saying that one simple release has sparked all this rumour. It's NV marketing doing what it does best - spreading doubt without foundation.

We really all need to realise both companies are out for our cash. And they'll both fight nasty to get it.

Please, let's stop falling for all this marketing shit.
meh...

calling for common sense is a bit useless when fanboyism kicks in... :laugh:

however, I personally do not see a need for an upgrade... just yet...
I usually upgrade when a game comes that I can not play with my current setup.
for example I upgraded from an 8800 GTS to a GTX 280 because of Crysis... :D

so far I do not see any game coming that will tax my GTX 280 (not even overclocked) so badly that I will wish to cash out for something new... and when that moment comes, I will not be looking at whether it is NV or AMD, but at which one plays "Da game" for the least cash... :cool:
#60
Yellow&Nerdy?
Erm, am I the only one who remembers the article about the "full-fledged" 512SP GTX 480 a couple of months ago? en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html 204 W of extra power consumption for 5% more performance. Not to mention the triple-slot, triple-fan cooler.

For this "GTX 580" to be possible, Nvidia has to make major changes to the GF100 architecture. And I don't see Nvidia having resources to do that, since they just got done releasing their entry-level desktop cards, are still missing a dual-GPU card and have only released their first notebook-graphics.
#61
the54thvoid
Intoxicated Moderator
BenetanegiaEDIT: I say this ^^ because it's best for your sanity; you can start whining about these hyped people, next you'd take on people who vote in elections hoping for change, then religion, and you'd end up yelling at children in the streets "Santa does not exist, you prick! It's your parents".
I'm not whining. Please, for god's sake, don't say I'm whining. I'm just like, "oh ffs, here we go again."

I don't whine; I'm too long in the tooth for that.

And I often 'argue' with religious types. It's fun. And when people say "what's Santa getting you for Christmas?" I often reply, hopefully not sexually abused again.

I have no sanity :wtf:
#62
CDdude55
Crazy 4 TPU!!!
stupidohowever, I personally do not see a need for an upgrade... just yet...
I usually upgrade when a game comes that I can not play with my current setup.
for example I upgraded from an 8800 GTS to a GTX 280 because of Crysis... :D

so far I do not see any game coming that will tax my GTX 280 (not even overclocked) so badly that I will wish to cash out for something new... and when that moment comes, I will not be looking at whether it is NV or AMD, but at which one plays "Da game" for the least cash... :cool:
This is probably the most logical and well-thought-out post I have read so far...
the54thvoidWe really all need to realise both companies are out for our cash. And they'll both fight nasty to get it.

Please, let's stop falling for all this marketing shit.
Agreed.
#63
Hayder_Master
If it's like this, it will kick the 6980's ass and be very close to the 6990.
#64
Atom_Anti
Yellow&Nerdy?Erm, am I the only one who remembers the article about the "full fledged" 512SP GTX 480 a couple of months ago?en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html 204W extra power consumption and 5% more performance. Not to mention the triple slot, triple fan cooler.
I can also remember that :toast:.
hayder.masterIf it's like this, it will kick the 6980's ass and be very close to the 6990.
This miracle Nvidia card is almost impossible, so nobody has to worry about it. I would say Nvidia is in deep shit now and won't have an easy answer to Cayman.
#65
crow1001
buggalugsIf you look through all the games and 3D benchmarks it's nowhere near 50%; the average is 10-20%. You were selective in picking the only game with a decent gap, just like the other guy. For the extra 10-20% you have a blast furnace in your comp and paid an extra $100-$200.

Nvidia is still trying to appeal to the lowest common denominator by making the fastest card at any cost. Sales of the 5xxx series should have taught them that most people are a bit more discerning and will weigh up heat/temps/computer noise and cost.

This thing with a 512-bit bus is going to cost more than the 480 and will most likely be over $800 outside the US. Looks like another fail.
I had a 5870; it was a great card, but the 480 owns it, minimum fps is way better, and yeah, 80°C max temps. Don't worry, I'm sure your card will play the latest titles at max IQ for a good few months yet.

Regarding selective benchmarks: 50% faster in an AMD-endorsed DX11 game meant to show off the 5xxx series, 45% faster in DX11 Metro 2033 :roll: Nvidia, DX11 done right. :rockout:
#66
SNICK
Much like the GTX 380 :laugh:
#67
CDdude55
Crazy 4 TPU!!!
Atom_AntiThis miracle Nvidia card is almost impossible, so nobody has to worry about it. I would say Nvidia is in deep shit now and won't have an easy answer to Cayman.
We know absolutely nothing about this card; how can someone rationally argue that "nobody has to worry about it" without just being a blatant fanboy?
#68
LAN_deRf_HA
This could be a horrendous failure for nvidia. 512 shaders equated to only a 5% speed increase in the reviews I've seen of those one-off boards, and the bus will add what, 5% more if they're lucky? The 6970 is projected to be 10% faster than a 480, so it will have the same performance. Where nvidia is going to lose massively is price. 2 GB of memory? Twice the bus size? Just to break even with a cheaper AMD card? That's going to have an atrocious profit margin, and if AMD puts any real price pressure on them the card's sales could come to a complete halt. I'd estimate a $100-150 price discrepancy if nvidia intends to make a profit, and they still won't claim the top title. Matching the 480's power draw means it too will be unfit for a dual-GPU board.
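(A quick sketch of the arithmetic behind that estimate; the 5%, 5% and 10% figures are the poster's own guesses, not confirmed numbers.)

# Compounding the guessed gains: +5% from all 512 shaders, +5% from the wider bus
gtx580_vs_480 = 1.05 * 1.05   # about 1.10x a GTX 480
hd6970_vs_480 = 1.10          # the post's projection for Cayman
# Both land at roughly +10% over a GTX 480, hence "it will have the same performance".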
#69
crow1001
CDdude55We know absolutely nothing about this card; how can someone rationally argue that "nobody has to worry about it" without just being a blatant fanboy?
AMD fanboys like to make out they know more than the guys who design the Nvidia GPUs; fact is, they know shit and are just out to troll NV threads.
#70
wolf
Performance Enthusiast
wow, for 3 pages, I think this thread has the most poop-talkers and nay-sayers I've seen at once.

so many comments about how it's not possible, not going to happen, don't even try, it will suck if they do.

since when was competition a bad thing? and I don't see anything that Nvidia themselves have said there, it's not an official announcement from them about a new series or anything sheesh.

calm down people, why don't you save ripping on the card for when it arrives.
#71
LAN_deRf_HA
wolfwow, for 3 pages, I think this thread has the most poop-talkers and nay-sayers I've seen at once.

so many comments about how it's not possible, not going to happen, don't even try, it will suck if they do.

since when was competition a bad thing? and I don't see anything that Nvidia themselves have said there, it's not an official announcement from them about a new series or anything sheesh.

calm down people, why don't you save ripping on the card for when it arrives.
This competition is a bad thing. The 580 will need to be very expensive, and rather than start a price war, AMD will just jack up the price of the 6970 to match, like they did with the 58xx series. This is not good for the consumer, because neither company is willing to start a price war outside of the mainstream. I'd expect both the 5970 and the 580 to be between $450 and $550 shortly after launch. At least without the 580, maybe the 480 would have finally gotten cheaper. Now it will just be discontinued.
#72
CDdude55
Crazy 4 TPU!!!
As I have always said, TPU is generally biased towards ATI/AMD cards. People can disagree, but I've been here for around three years and have seen this crap run rampant for a while now, unfortunately.

As stated above, how about we wait for the card, so we can at least see its specs confirmed and have some official announcements from Nvidia themselves? But something tells me this type of irrational thinking will just happen again in those threads too... :(
#73
dir_d
I think Nvidia could get a GF100b out by March, or ditch the GF100 and scale up the GF108 core.
#74
bear jesus
CDdude55As I have always said, TPU is generally biased towards ATI/AMD cards. People can disagree, but I've been here for around three years and have seen this crap run rampant for a while now, unfortunately.

As stated above, how about we wait for the card, so we can at least see its specs confirmed and have some official announcements from Nvidia themselves? But something tells me this type of irrational thinking will just happen again in those threads too... :(
Personally I think every tech site seems to be mainly filled with fanboys of every brand, but to be honest I expect nothing less, as the world seems to be mainly filled with people lacking common sense and logic :roll: All I can hope for is that some day people learn that everything should be judged on a product-by-product basis, not on the logo slapped on it (:laugh: never going to happen).
#75
Atom_Anti
CDdude55We know absolutely nothing about this card; how can someone rationally argue that "nobody has to worry about it" without just being a blatant fanboy?
We know quite a lot about it, even more than about the upcoming HD 6900 :rolleyes:. GF110 = 512+ CUDA cores, 128 TMUs, 512-bit 2 GB memory. That means roughly a 700+ mm² chip, which is too big for TSMC's production line; the TDP would be a lot more than 300 W, and then I could continue with technical problems, heat and price. No way! Just Nvidia trying to keep their fans hanging on somehow :ohwell:.
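(A crude scaling sketch of where such numbers could come from; the die-split percentages and the linear scaling below are illustrative assumptions, not real silicon data.)

# Scale GF100's commonly cited figures by the rumored GF110 spec bumps.
gf100_die_mm2 = 529          # GF100 die size (commonly cited)
gtx480_tdp_w  = 250          # GTX 480 board TDP
tmu_scale = 128 / 64         # rumored 128 TMUs vs. full GF100's 64
bus_scale = 512 / 384        # rumored 512-bit vs. GF100's 384-bit interface
# Assume, purely for illustration, ~25% of the die is texture logic and ~15% is
# the memory interface, with the remaining ~60% unchanged.
die_estimate = gf100_die_mm2 * (0.60 + 0.25 * tmu_scale + 0.15 * bus_scale)  # ~688 mm^2
tdp_estimate = gtx480_tdp_w * bus_scale                                      # ~333 W (naive bus scaling)
print(round(die_estimate), round(tdp_estimate))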