Thursday, October 21st 2010

GeForce GTX 580 Expected to be 20% Faster than GeForce GTX 480

NVIDIA's next enthusiast-grade graphics processor, the GeForce GTX 580, based on the new GF110 silicon, is poised for at least a paper launch by the end of November or early December 2010. Sources in the video card industry told DigiTimes that the GTX 580 is expected to be 20% faster than the existing GeForce GTX 480. The new GPU is built on the existing 40 nm process; NVIDIA's 28 nm GPUs based on the Kepler architecture are expected to take shape only towards the end of 2011. Later this week, AMD is launching the Radeon HD 6800 series performance graphics cards, and it will market-launch its next high-end GPU, codenamed "Cayman," in November.
Source: DigiTimes

98 Comments on GeForce GTX 580 Expected to be 20% Faster than GeForce GTX 480

#27
DarkMatter75
Do I have to buy a dedicated AC unit from APC to use the new card???
Bye bye Nvidia...
#28
Benetanegia
SNICK: Your whole mathematics based on the unitary method is not supposed to be used on electrical components. The TDP of the full GF100 is around 204 W. Based on your mathematics, can you explain why the TDP of the full GF100 is not as much higher as expected?
Do you really believe that was real? That was not coming from Nvidia. How naive can you be? First of all, the revision code (A1, A2, A3) was not shown, so that card, if it was real, was probably an old prototype using A1 or A2 silicon. And why did Nvidia move to A3? Ah yes, because A1 and A2 were broken. The other possibility is that they re-enabled the disabled SM on a normal GTX480, and oh yes, there was a reason that SM was disabled...

If you want to know more exactly how enabling SPs/ROPs truly affects power consumption on real cards, look at the GTX470 vs. the GTX465, because they both have the same clocks. And how much is that, then?

GTX465 = 199 W
GTX470 = 232 W

Let's experiment with that:

GTX465 = 352 SP ; GTX470 = 448 SP
GTX465 = 4 ROP partitions ; GTX470 = 5 ROP partitions

Fictional GTX470 (scaled by SPs) = (199 W x 448) / 352 ≈ 253 W. Hmm.
Fictional GTX470 (scaled by ROPs) = (199 W x 5) / 4 ≈ 249 W. Hmm.

Wow, so it looks like the actual power consumption of the real GTX470 is lower than my math predicts. I knew that from the beginning; the assumptions above are a worst-case scenario.
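Here's that linear-scaling estimate as a quick sketch (a worst-case upper bound, assuming board power scales proportionally with enabled units; real cards come in below it):

```python
# Worst-case TDP estimate: assume board power scales linearly
# with the number of enabled units (SPs or ROP partitions).
def scaled_tdp(base_tdp_w: float, base_units: int, target_units: int) -> float:
    return base_tdp_w * target_units / base_units

GTX465_TDP = 199.0  # W; same clocks as the GTX470

by_sp  = scaled_tdp(GTX465_TDP, 352, 448)  # scale by shader processors
by_rop = scaled_tdp(GTX465_TDP, 4, 5)      # scale by ROP partitions

print(f"Estimate by SPs:  {by_sp:.0f} W")   # ~253 W
print(f"Estimate by ROPs: {by_rop:.0f} W")  # ~249 W
# Measured GTX470: 232 W -- below both estimates.
```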

GF100 consumed a lot; everybody knows that. Everybody should know by now, too, that it was a problem with the fabric (the interconnect layer between the different units within a chip) and that the problem has already been fixed. It was mostly fixed for GF104, as can be seen from its power consumption, and is probably even better for future releases.
#29
1Kurgan1
The Knife in your Back
Panic news to keep ATI's new cards from getting so much attention. Makes me sad. I'm not a real NV fan myself, but better competition between releases would mean better price drops.
#30
LAN_deRf_HA
They must be upping the clock speed a lot. We already saw that going to 512 shaders alone only adds about 5%, and I don't think memory bandwidth is going to add the other 15%. Managing that within the same power consumption as the 480 might actually have taken a lot of work... assuming their claims aren't total BS.
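To put numbers on that (a quick sketch, taking the ~5% figure for the full 512-SP part and the rumored 20% total as given):

```python
# Linear scaling would predict the gain from enabling all 512 SPs:
print(f"Theoretical SP gain: {512 / 480 - 1:.1%}")   # 6.7%; only ~5% observed

# If the full chip brings ~5%, the rest of the rumored 20% must come
# from clocks and/or memory bandwidth:
print(f"Gain needed elsewhere: {1.20 / 1.05 - 1:.1%}")  # ~14.3%
```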
#31
MxPhenom 216
ASIC Engineer
CharlO: Wait, 20% faster than a 480 is like an overclocked 5870 (well, something more), but it is definitely slower than the promised 6870... They are not even getting even on paper now?
What are you on? Can I get some of that?? The HD5870 cannot even hold its ground against a non-overclocked GTX480. A 20% faster GTX480 will annihilate the HD5870 and HD6870. Remember, the 6870 isn't an HD5870; it's basically a 5850. AMD's naming is really messed up this generation.

And for people who are obsessing over the potential high heat and power consumption: just wait and see; it's not worth getting all worked up about it now. Maybe Nvidia has new strategies to overcome this. Better reference coolers, perhaps.
#32
CDdude55
Crazy 4 TPU!!!
Woot, TPU's heavy bias comes out of the woodwork again! *fist pumps*

Hopefully they can up the performance by the time it actually comes out; it's still WAY too early to tell anything.
#33
Benetanegia
CDdude55: Woot, TPU's heavy bias comes out of the woodwork again! *fist pumps*
+1

It's really sad, tbh, sadder every day. Every day that passes, it gets more difficult to talk about tech.

If only the GTX460 had never been released, then maybe, maybe, the comments wouldn't be so baseless. But the GTX460 was released, and it's faster than the GTX465 while consuming 50 W less... Is it possible for Nvidia to repeat that achievement with GF110, or maybe even... take a seat, AMD fanboys... further improve the efficiency a little bit?

- No, because I'm so biased I cannot even see what's in front of my eyes.

- No, because AMD has apparently improved power efficiency further with NI, but there's no way on Earth or otherwise for Nvidia to catch up. Impossible! I mean, Nvidia never had better power efficiency than AMD/ATI. Never, never, never... hmm, well... okay, "only" before Evergreen/Fermi. :rolleyes:

#34
bear jesus
CDdude55: Woot, TPU's heavy bias comes out of the woodwork again! *fist pumps*
Benetanegia: +1

It's really sad, tbh, sadder every day. Every day that passes, it gets more difficult to talk about tech.

If only the GTX460 had never been released, then maybe, maybe, the comments wouldn't be so baseless. But the GTX460 was released, and it's faster than the GTX465 while consuming 50 W less... Is it possible for Nvidia to repeat that achievement with GF110, or maybe even... take a seat, AMD fanboys... further improve the efficiency a little bit?

- No, because I'm so biased I cannot even see what's in front of my eyes.

- No, because AMD has apparently improved power efficiency further with NI, but there's no way on Earth or otherwise for Nvidia to catch up. Impossible! I mean, Nvidia never had better power efficiency than AMD/ATI. Never, never, never... hmm, well... okay, "only" before Evergreen/Fermi. :rolleyes:
You just need to laugh at the stupidity and remember that's all it is. If the AMD fanboys are so interested in power draw, why are they not complaining about the over-300 W TDP of the 5870 X2 cards? (No, not the 5970; I mean the ones with the clocks of the 5870, the things with two 8-pin and one 6-pin connectors.)
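For context, here's the power budget that connector layout allows under the PCIe spec (standard per-connector limits; the card's actual draw is a separate question):

```python
# PCIe power budget for a card with two 8-pin and one 6-pin connectors.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150  # watts, per the PCIe spec

budget = SLOT + SIX_PIN + 2 * EIGHT_PIN
print(f"Maximum spec power: {budget} W")  # 450 W -- room well past a 300 W TDP
```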

I'm no fanboy for either side, as they both suck in their own special ways and are both awesome in their own ways as well. Although stupidity annoys me, I can ignore it in the fanboy comments for both AMD and Nvidia... how about you two join me in laughing at the stupidity? :roll::laugh::roll:

Back on topic: I'm really excited to see what Nvidia will bring out. I just hope it's very close to the 6970's release date, as I'm sick of waiting for my GPU upgrade and I'm starting to think I want something more powerful than 1 GB 460 SLI. Hopefully either AMD or Nvidia can give me something that fits the bill before the end of the year.
#35
Yellow&Nerdy?
I believe this is just Nvidia's PR trying to dampen AMD's launch. I sure hope Nvidia can put something out there to compete with Cayman, because it would definitely help bring prices down. But somehow it's hard to believe Nvidia has anything to counter Cayman and Antilles... Well, we'll see.
#36
KainXS
Damn, I'm truly disappointed to hear this.

But it should at least mean both companies have cards with similar performance, somewhat unlike now.
Benetanegia: +1

It's really sad, tbh, sadder every day. Every day that passes, it gets more difficult to talk about tech.

If only the GTX460 had never been released, then maybe, maybe, the comments wouldn't be so baseless. But the GTX460 was released, and it's faster than the GTX465 while consuming 50 W less... Is it possible for Nvidia to repeat that achievement with GF110, or maybe even... take a seat, AMD fanboys... further improve the efficiency a little bit?

- No, because I'm so biased I cannot even see what's in front of my eyes.

- No, because AMD has apparently improved power efficiency further with NI, but there's no way on Earth or otherwise for Nvidia to catch up. Impossible! I mean, Nvidia never had better power efficiency than AMD/ATI. Never, never, never... hmm, well... okay, "only" before Evergreen/Fermi. :rolleyes:

tpucdn.com/reviews/MSI/N480GTX_GTX_480_Lightning/images/perfwatt_1920.gif
Wow, that review shows that the HD5450 is the fastest card O.O
Did you change that yourself?
#37
erocker
*
CDdude55: Woot, TPU's heavy bias comes out of the woodwork again! *fist pumps*
Love it. Just because a majority of people (pretty much across the internet) feel one way, it's bias. Of course, you're just a "fanboy" for pointing that out. It's all tongue in cheek, but it's either bias or the unwillingness to accept the truth from one "side" or the other. The fact of the matter is, Nvidia has failed to impress many.
#38
bear jesus
KainXS: Wow, that review shows that the HD5450 is the fastest card O.O Did you change that yourself?
That's performance per watt, so it's not really about which card is fastest.
#39
wolf
Performance Enthusiast
Again, a thread full of smack talk about a GPU we know next to nothing about, bar a few who actually have something to contribute. Oh well, can't say I'm not disappointed.

I have a feeling this will just be GF104 plus 50% of it again, making it 576 SPs, back to 48 ROPs, and still 384-bit GDDR5. They just need to improve the memory controller and make sure they hit a good power-consumption target.
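For reference, the arithmetic behind that guess (a sketch using full GF104's published unit counts; the 1.5x part is pure speculation):

```python
# Hypothetical "GF104 x 1.5" built from full GF104's unit counts.
gf104 = {"SPs": 384, "ROPs": 32, "bus_bits": 256}

scaled = {name: int(count * 1.5) for name, count in gf104.items()}
print(scaled)  # {'SPs': 576, 'ROPs': 48, 'bus_bits': 384}
```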
#40
dlpatague
KainXS: Wow, that review shows that the HD5450 is the fastest card O.O Did you change that yourself?
Can you read? It says "Performance per Watt"! That has nothing to do with total performance. Geesh!
#41
CDdude55
Crazy 4 TPU!!!
erocker: Love it. Just because a majority of people (pretty much across the internet) feel one way, it's bias. Of course, you're just a "fanboy" for pointing that out. It's all tongue in cheek, but it's either bias or the unwillingness to accept the truth from one "side" or the other. The fact of the matter is, Nvidia has failed to impress many.
Offering constructive criticism is no issue at all; everyone should be able to point out mistakes and criticize them in hopes of future improvements. But look around: nothing but nonsensical, counterproductive comments yet again on a site that I frequently visit. The majority can say whatever BS they want, but I honestly expect more out of sites like this.
#42
cadaveca
My name is Dave
erocker: Nvidia has failed to impress many.
No doubt. Considering all the problems I've been complaining about on here, you'd think nV would be "my best friend" at this point; however, the Fermi launch put a big damper on their hype, and nothing since then has done much to improve consumer confidence, as is evident in the shift in GPU market share.

They need a killer product. 20% over the GTX480 doesn't cut it... they need 33% or so... then they'd have a real chance.

Mind you, the current rumour about price drops is definitely going to work in their favor.
#43
bear jesus
cadaveca: No doubt. Considering all the problems I've been complaining about on here, you'd think nV would be "my best friend" at this point
To be honest, I don't understand how you don't hate AMD by now :laugh: I'm pretty sure I would, in your position.
#44
erocker
*
CDdude55: Offering constructive criticism is no issue at all; everyone should be able to point out mistakes and criticize them in hopes of future improvements. But look around: nothing but nonsensical, counterproductive comments yet again on a site that I frequently visit. The majority can say whatever BS they want, but I honestly expect more out of sites like this.
What BS? A lot of it is not false, though a lot of it is quite short-sighted or just blunt. I can say that I'm not particularly impressed with certain things on both sides of the coin.
#45
Mindweaver
Moderato®™
20% sounds nice... I just wonder how much room is left for overclocking? I mean, 20% over a 480 should be achievable by overclocking a 480 itself, but how much further can you push the new card?.. I guess time will tell. A 580 that is 20% faster than a 480, plus being able to push it another 20%, would be kickass! I don't know... I hope so, for everyone's sake. We need both to do well to keep prices right, you know?
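For what it's worth, those two gains would compound (simple arithmetic, assuming both 20% figures hold up):

```python
# A card 20% faster than a GTX480, then overclocked another 20%:
print(f"Over a stock GTX480: {1.20 * 1.20 - 1:.0%}")  # 44%
```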
#46
CDdude55
Crazy 4 TPU!!!
erocker: What BS? A lot of it is not false,
I mean the BS posts that contribute nothing to the discussion, never get moderated, and for some reason always fill up certain threads...

No one's denying Fermi's issues. But as you even said, both sides have their issues.
#47
cadaveca
My name is Dave
erocker: I can say that I'm not particularly impressed with certain things on both sides of the coin.
Well that's just it, isn't it?

Both sides kinda have issues, and so really, I blame TSMC.

Bear Jesus, while I can afford to just get rid of stuff and start over again, to me that would make all the time I've spent trying to get things working a waste. As far as I am concerned, either AMD fixes the issues I have or they fail, and I can't truly say that unless I see this out to the end.

But because my usage (3 monitors) dictates that I need a certain level of performance, I'm just plain out of options at this point. The 69xx series is my last hope, or maybe this GTX580 can pull up nV's socks and I'll switch over.

I am not "sticking it out" because I'm a fanboy... I need 60+ FPS at 5760x1080. A little bit of AA would be nice too. The first company that can deliver this gets my cash.

For anyone else... they don't need a GTX580. Seriously... a 480 is more than enough power for anyone with a single monitor.
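For scale, here's the pixel load that triple-monitor usage implies (assuming three 1920x1080 panels side by side, i.e. 5760x1080):

```python
# Triple-1080p surround vs. a single 1080p monitor.
triple = 5760 * 1080   # 6,220,800 px
single = 1920 * 1080   # 2,073,600 px
print(f"{triple / single:.0f}x the pixels to render per frame")  # 3x
```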
#48
N3M3515
cadaveca: No doubt. Considering all the problems I've been complaining about on here, you'd think nV would be "my best friend" at this point; however, the Fermi launch put a big damper on their hype, and nothing since then has done much to improve consumer confidence, as is evident in the shift in GPU market share.

They need a killer product. 20% over the GTX480 doesn't cut it... they need 33% or so... then they'd have a real chance.

Mind you, the current rumour about price drops is definitely going to work in their favor.
At least there's someone who thinks like me.
Bottom line: 20% over the GTX480 is not enough. Considering Cayman XT will be at the very least 30-40% faster than Cypress XT, that would make it neck and neck (equal performance) with the 'GTX580', and as everyone knows, since AMD's chips are cheaper to produce, AMD can lower prices more and still make a profit.
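A rough sanity check of that claim, normalizing to Cypress XT = 1.00 (the GTX480's ~10-15% lead over the HD5870 is an assumption based on contemporary reviews, not a figure from this thread):

```python
# Everything relative to the HD5870 (Cypress XT) = 1.00.
gtx480 = 1.125           # assumed ~10-15% lead over Cypress XT (midpoint)
gtx580 = gtx480 * 1.20   # rumored +20% over the GTX480 -> ~1.35

cayman_low, cayman_high = 1.30, 1.40  # claimed +30-40% over Cypress XT

print(f"GTX580:    ~{gtx580:.2f}x Cypress XT")
print(f"Cayman XT: ~{cayman_low:.2f}x to {cayman_high:.2f}x Cypress XT")
# The rumored GTX580 lands inside the claimed Cayman XT range:
# roughly neck and neck.
```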
#49
KashunatoR
I think the GTX580 will be a winner if it is slightly cooler than the GTX480 and keeps the same OC potential and performance scaling. I own a GTX480 and I will happily upgrade to a GTX580. Many of you speak without knowing the truth: my GTX480, OC'd to 830/1100, doesn't get past 81 degrees Celsius and performs almost on par with the ATI 5970, and that without facing the retarded ATI drivers. You also get PhysX, CUDA, and the best minimum framerate, which is what matters most when you're playing games.
#50
Benetanegia
cadaveca: Both sides kinda have issues, and so really, I blame TSMC.
There's truth there.
cadaveca: For anyone else... they don't need a GTX580. Seriously... a 480 is more than enough power for anyone with a single monitor.
They don't need it, true to an extent. But the thing is, I'm 100% sure that by going with 1.5x GF104 they can get a card with 20% more shaders, 50% more TMUs, and higher clocks, while the chip stays smaller than 500 mm^2 and consumes 50 W less. So why not go for it? And of course, for the consumer it's way better.
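Checking those ratios (the unit counts are published specs; GF104's ~332 mm^2 die size is the commonly cited figure, and scaling area linearly is a rough assumption):

```python
# Hypothetical 1.5x GF104 versus GF100 / GTX480.
gf104_sps, gf104_tmus, gf104_die_mm2 = 384, 64, 332  # full GF104
gtx480_sps, gf100_tmus = 480, 64                     # GF100 has 64 TMUs on die

sps  = 1.5 * gf104_sps       # 576
tmus = 1.5 * gf104_tmus      # 96
die  = 1.5 * gf104_die_mm2   # ~498

print(f"Shaders vs GTX480: {sps / gtx480_sps:.0%}")   # 120% -> +20%
print(f"TMUs vs GF100:     {tmus / gf100_tmus:.0%}")  # 150% -> +50%
print(f"Die estimate:      {die:.0f} mm^2 (< 500)")
```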
N3M3515: since AMD's chips are cheaper to produce, AMD can lower prices more and still make a profit.
That is a very common misconception, based on too many vague assumptions and only one fact.

The fact is that a smaller chip costs less than a big chip when everything else is equal.

But that's not the end of it. There are far too many things we don't know that have just as much effect on profitability:

- How much does AMD pay per wafer, and how much does Nvidia, considering Nvidia buys twice as many wafers and is not going to flee to GloFo as soon as GloFo is ready?
- The chip price is only a fraction of what a card costs. How much does it cost Nvidia to make its cards, and how much does it cost AMD?
- AMD's ability to make smaller chips is based on much more money put into that R&D department than Nvidia spends. How much?
- How much does it cost Nvidia and AMD to operate as companies?