
GeForce GTX 580 to Get Price-Cuts

btarunr

Editor & Senior Moderator
To pave the way for the GeForce GTX 680, which will arrive later this month in small quantities at first, with wide availability in the months to come, NVIDIA is cutting the prices of its GeForce GTX 580 graphics card. The GF110-based behemoth of 2011 will now start at 339.8 EUR, according to European price aggregators such as Geizhals.at. The new price makes the GeForce GTX 580 1.5 GB cheaper than the Radeon HD 7950 and gives it a slightly better price-performance ratio. The 3 GB variants of the GeForce GTX 580 are priced similarly to the HD 7950. The GTX 570 starts at 249 EUR.
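As a rough sanity check on that price-performance claim, here is a minimal sketch of how performance per euro can be compared. The 339.8 EUR and ~400 EUR street prices come from this thread; the relative-performance indices are hypothetical placeholders, not TechPowerUp benchmark numbers.

```python
# Minimal sketch: comparing price-performance (performance per euro).
# Prices come from the thread; the relative-performance indices are
# hypothetical placeholders, NOT measured benchmark results.

cards = {
    "GeForce GTX 580 1.5 GB": {"price_eur": 339.8, "rel_perf": 100},  # baseline (assumed)
    "Radeon HD 7950 3 GB":    {"price_eur": 400.0, "rel_perf": 110},  # assumed ~10% faster
}

for name, c in cards.items():
    perf_per_eur = c["rel_perf"] / c["price_eur"]
    print(f"{name}: {perf_per_eur:.3f} performance points per EUR")
```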



View at TechPowerUp Main Site
 
edit: nevermind, I just can't read lol

Can't wait to see how this card performs, and whether this dynamic OC feature works well or is a flop. I'm eagerly awaiting NV's offerings because I want to give them a crack, since I haven't used an NV card since the 8800 GTS 320 MB.
 
That's interesting...
 
Lol, and so the price dropping begins... as per usual.
 
Still expensive. I can see people getting 2nd hand cards instead.
 
Well, at 340 EUR I see it as good competition even against the current HD 7000 lineup, despite it being the older series. With the absolutely nonsensical prices of the HD 7950 and HD 7970 (both way over 400 EUR), the GTX 580 is still a good option.
 
What Nvidia would be wise to do is release their high-end Kepler products and go for the crown, and just die-shrink the GTX 580 and 570 to go after the HD 7800 series.

The G80 to G92 transition proved you could teach an old dog new tricks: by shrinking an existing die, making minor tweaks, and upping the clock speeds, you could make a good architecture last in the market for a number of years while your competitor struggled to scale new architectures from their high end downward and still keep them competitive.

A more efficient GTX 580 would be a perfect card: knock it down to 28 nm, rebalance the clock speeds accordingly, and even without DX11 I bet they'd sell very well against the 7800 series. And a 28 nm GTX 570/560 Ti would really make a lot of Nvidia's cluttered midrange obsolete.
 
And it only took what, like a year and a half?
 
What Nvidia would be wise to do is release their high-end Kepler products and go for the crown, and just die-shrink the GTX 580 and 570 to go after the HD 7800 series.

The G80 to G92 transition proved you could teach an old dog new tricks: by shrinking an existing die, making minor tweaks, and upping the clock speeds, you could make a good architecture last in the market for a number of years while your competitor struggled to scale new architectures from their high end downward and still keep them competitive.

A more efficient GTX 580 would be a perfect card: knock it down to 28 nm, rebalance the clock speeds accordingly, and even without DX11 I bet they'd sell very well against the 7800 series. And a 28 nm GTX 570/560 Ti would really make a lot of Nvidia's cluttered midrange obsolete.

Wow

And there I was thinking the 580 was a revision of the 480 and had DX11 :|

Flushing the channels before a new launch, what a shocker :lol:
 
And it only took what, like a year and a half?

It was always priced competitively.

I also don't think it's as easy as just shrinking the pre-existing GTX 580 to 28 nm; it would take a hefty amount of reworking. Kepler is intended to replace Fermi as a similar but much more efficient (in terms of power and heat generation) iteration. I don't see them wasting time reworking that design when they already have the improved version in production.
 
I sold my GTX 580 Black Ops today @ $460.
 
$4 cheaper, the 7950 is still faster in the most recent games (exc. Skyrim), and it consumes 80-150 W less than the 580 at peak and maximum. But if you love your card, then what can anyone say? :)
 
With the HD 7950's super-massive overclocking abilities, even $399 would hardly be worth it.
 
It was always priced competitively.

I also don't think it's as easy as just shrinking the pre-existing GTX 580 to 28 nm; it would take a hefty amount of reworking. Kepler is intended to replace Fermi as a similar but much more efficient (in terms of power and heat generation) iteration. I don't see them wasting time reworking that design when they already have the improved version in production.

It was never priced competitively. None of the high-end cards have been for years. One company, usually Nvidia, sets a price they want you to pay, and then the other company releases their cards in between those price points. There is no competition. It's price fixing without actual communication.
 
It was never priced competitively. None of the high-end cards have been for years. One company, usually Nvidia, sets a price they want you to pay, and then the other company releases their cards in between those price points. There is no competition. It's price fixing without actual communication.

I'm not saying I agree with it, not in the slightest, and it is completely price fixing. At this rate Nvidia's cards will be priced to compete with AMD's, so if the GTX680 is faster than the HD7970, it will probably launch at nearly $600.
 
Wow

And there I was thinking the 580 was a revision of the 480 and had DX11 :|

Flushing the channels before a new launch, what a shocker :lol:

I just confused my terms; I intended to write D3D 11.1 but wrote DX11, since DirectX is what we're always used to talking about, not Direct3D. The HD 7900 series launched as the first with Direct3D 11.1 support, which is one of the major changes in Windows 8 and one of the larger selling points of this next generation of GPUs. Obviously Kepler will be Direct3D 11.1 as well, where Fermi was not; Fermi was Direct3D 11. But there you have it, a mistake; good thing I'm not a brain surgeon. :wtf:
 
$4 cheaper, the 7950 is still faster in the most recent games (exc. Skyrim), and it consumes 80-150 W less than the 580 at peak and maximum. But if you love your card, then what can anyone say? :)

Maximum?! That can only be reached while stress testing with Furmark. Which gamer does that?

The figures that matter are the average and peak values, where the differences are 88 W and 85 W respectively. Still a big difference, but not that bad considering we are comparing an old 40 nm GPU to a 28 nm one. All in all, the 7950 looks to be the better choice, unless someone wants things like CUDA and/or PhysX.
 
The 580 is still too expensive in my area; the cheapest I saw was $440.
 
It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680) that was slated to be the GTX 660 Ti at $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.
 
It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680) that was slated to be the GTX 660 Ti at $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.

To me, this price cut suggests the GTX 680 is either going to be significantly cheaper than 500, say 350-400, or significantly faster than most of us expect now, because otherwise there would be no rush to lower the GTX 580's price to 330. If they were going to sell it for 500 and it is 25% faster than the GTX 580, there would still be a place for the GTX 580 at 400 or so (500 / 1.25 = 400 for the same price-performance); there would be no need to go as low as 330, and 250 for the GTX 570. They would be making the new offering look very overpriced, and it would force AMD to lower prices too, BEFORE the GTX 680 launches, which is shooting themselves in the foot, because it's the GTX 680 that needs the fame, not the EOL'd card.
 
It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680) that was slated to be the GTX 660 Ti at $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.

$499 for the GTX680? Lol, that would be a bargain. I've heard prices are going to be a bit more than that.
 
$499 for the GTX680? Lol, that would be a bargain. I've heard prices are going to be a bit more than that.

Normally I would agree, but this card was intended to be a mid-range $299.99 card and has now been renamed and price-jacked to a flagship because of the lack of competition from AMD's 7900s.
 
It's going to be more expensive than the 7970. I could be wrong, I suppose. Lack of competition from AMD? Nvidia doesn't even have a card out yet.
 
Normally I would agree, but this card was intended to be a mid-range $299.99 card and has now been renamed and price-jacked to a flagship because of the lack of competition from AMD's 7900s.

I have not seen anywhere that Nvidia was releasing this as anything other than a high-end card. Can someone provide a link, proof, or something other than hearsay from some other post on another site?
 
It really sucks that we are going to have to pay $499.99 for the GK104 (now the GTX 680) that was slated to be the GTX 660 Ti at $299.99, because AMD's 7950 and 7970 were not as powerful as NV thought they would be.

It doesn't suck; no one is making you purchase it, and if you can't stomach the price, there will be some who do. So Nvidia isn't pressured to release a high-end part while trying to implement some new whiz-bang feature on lower chips to achieve competitiveness… and that's somehow AMD's fault. :laugh:

So in your mind, half a year ago or more, Nvidia knew "with certainty" that the reference 7970s would bring about a 12-18% improvement over the GTX 580, and with that they shelved the "top dog" and embarked on making the GK104 competitive, hinging on whether they could successfully implement some untried feature to boost performance as needed. :cool:

Nvidia was able, even if it was already a "think tank" project, to completely evaluate, implement, and test all the parameters to make the dynamic profile feature function flawlessly in those 6 months? A tall order, but plausible.

You deduce AMD placed the bar too low, so Nvidia runs up to it and figures, ah, what the heck, watch this… rolls its shoulders back and limbos under the bar, wowing the faithful! They throw an arm up in victory, saying we got to the other side; though with new rules, OK? Wait for their next attempt.

So the GK100 is such a virtuoso in performance, power, and price that Nvidia couldn't bring themselves to market it? :wtf:

Don't get me wrong: if it works and they planned it this way (even as a contingency) when first setting out to develop Kepler… kudos to them, and a good game changer. :respect:

But if they got complacent 6 months back and changed their game plan based on speculation about performance, wow, they risked a lot on something that might, or should, have at least been shown previously in a trial product. :twitch:
 