
GeForce GTX 580 to Get Price-Cuts

Discussion in 'News' started by btarunr, Mar 12, 2012.

  1. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.39/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
GK104: the 4 at the end has always been indicative of a midrange/performance part, just like 6 is lower mainstream and 8 is low end. And the existence of GK100 and GK110, indicative of high-end, is very well known, though nothing else is really known except that at some point they were/are in the making.

    @ Casecutter

They didn't know exactly what AMD would bring, but they had an idea a long way back. Specs were known, there were some leaked figures, etc. So they already knew it would be close. They also had the previous generation to compare with, where GF114 (midrange/performance) was faster than Cypress (high-end), even though it released later; it's what GF104 was supposed to be. So if Kepler is much better than Fermi, it was relatively safe to assume they could do better this generation. But they didn't just stop making the high-end chip; who said that? They only scrapped it (if they have cancelled it at all) once the HD 7970's performance was known. According to rumors, things were not going well with the chip, so instead of releasing another cut-down chip like the GTX 480, they cancelled/postponed it. And again, IF they have done so at all, because no one really knows anything about it. If GK104 couldn't compete with Tahiti, they would have forced another cut-down card like the GTX 480.
     
  2. CrAsHnBuRnXp

    CrAsHnBuRnXp

    Joined:
    Oct 19, 2007
    Messages:
    5,573 (2.12/day)
    Thanks Received:
    675
Why not just get the GTX 660, since its performance (according to Tom's Hardware) is that of a GTX 580 and it's cheaper than the 580 (according to the same source)?

    Source
     
  3. Selene

    Joined:
    Jan 21, 2008
    Messages:
    234 (0.09/day)
    Thanks Received:
    13
That's the whole point: the GTX 660 that was going to be around $299.99 is now going to be called the GTX 680 and be around $499.99.
I hope this is not what happens, and I do not blame AMD for NV doing this, but it does show us what happens when you have little or no competition.
If the chart you linked is right, that's not so bad; a little higher than it should be, but not insane.
     
  4. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.39/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
According to every recent rumour, those specs and possibly everything else are false.
     
  5. CrAsHnBuRnXp

    CrAsHnBuRnXp

    Joined:
    Oct 19, 2007
    Messages:
    5,573 (2.12/day)
    Thanks Received:
    675
Well, hopefully that isn't the case, because I've been holding out for months on getting a new card, and if I can get a GTX 660 that has the performance of a GTX 580 for less, I'm going to grab it. If they end up calling it a 680 and pricing it at $500, I'm going to be very pissed for waiting so long.

Well, they need to start releasing information so I know whether or not to buy my GTX 570 now or wait a month until the new shit comes out.
     
  6. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,191 (0.88/day)
    Thanks Received:
    90
    Location:
    So. Cal.
So, as said, why Nvidia put the GK-110 on the back bench of R&D we don't know, but that's totally uncharacteristic. Can you recall the last time Nvidia started with something other than the uber offering?

Meanwhile, it's pretty well established that a GK-104 requires "Dynamic Profiles" to make it perform and stay on top of its game while remaining in the established power envelope. So Kepler is so perfect they benched their "star" and brought up the "B" league? Although you contend AMD screwed the pooch on Tahiti, it's at least not MIA.

The truth... to control Kepler, Nvidia needs to add a software "shine". I doubt you'll be able to take a GK-104, disable "Dynamic Profiles", and clock it anywhere close to matching a 7970 (which will have been out almost 5 months before you get what's called a GTX 680 in hand).

IMO it's slow, hot, and nowhere near efficient; Kepler is worse and late compared to AMD, although for that someone will pay for Nvidia's R&D to rein it in. It's always someone else's fault: last time TSMC, this time AMD. Nice try.

    We wait... :D
     
  7. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.39/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    8800 GT? For example?

And GK107 will be the second chip in the series to be released, before GK106, and that's uncharacteristic too. Maybe it's because they decided to address the markets that need it most first (higher revenue), like, I don't know, doing what they said they would do over a year ago?

    Plus it's GK100 the high-end chip that was put back, not GK110. GK110 is a refresh that may or may not be what GK100 was.

    :roll:

:roll: Yeah, I laugh again. Care to show proof? Because I've seen your posts about the dynamic clocks and it's pretty obvious you don't understand what they are at all, so any further conclusion you think you can draw is just wrong.

And really, 5 months? Maybe the red tint does not allow you to follow the calendar, but it's not even going to be 3 months; 2 months if cards are actually available on the 23rd.

EDIT: Oh, and regarding the HD 7970, only a real fanboy does not see the obvious elephant in the room: Tahiti is 60% bigger than Pitcairn and has 60% more shaders and TMUs, but it's only 25% faster. Factor in clocks and Tahiti is still 20-25% slower than it should be.
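The scaling arithmetic in that claim can be sanity-checked with a quick back-of-envelope calculation; the figures below are the post's rough percentages, not measured benchmarks:

```python
# Back-of-envelope check of the Tahiti vs. Pitcairn scaling claim above.
# Both figures are the post's rough approximations, not measured data.
shader_ratio = 1.60      # Tahiti: ~60% more shaders/TMUs than Pitcairn
observed_speedup = 1.25  # ...but only ~25% faster in practice

# If performance scaled linearly with unit count, the speedup would equal
# the shader ratio; the shortfall is how far Tahiti falls below that.
scaling_efficiency = observed_speedup / shader_ratio
shortfall = 1 - scaling_efficiency

print(f"scaling efficiency vs. linear: {scaling_efficiency:.0%}")  # ~78%
print(f"shortfall: {shortfall:.0%}")  # ~22%, inside the 20-25% range claimed
```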
     
    Last edited: Mar 12, 2012
  8. xenocide

    xenocide

    Joined:
    Mar 24, 2011
    Messages:
    2,154 (1.57/day)
    Thanks Received:
    466
    Location:
    Burlington, VT
You make it sound as though Dynamic Profiles are intended to address some kind of lack of performance, which is definitely not the case. Nvidia is just trying to address what people complain about most: power consumption and heat. Why have their cards run either at 30% when idle or 100% when anything is present, when they can dynamically clock the card so there is no wasted energy or extra heat generation? If a task only requires the card to run at 50%, why run it at 100%? It would be like flooring it between stop signs. It is just inefficient.
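The clock-selection idea described above can be sketched as a toy utilization-driven governor. The thresholds and clock values below are invented for illustration; the actual profile tables Nvidia ships are not public:

```python
# Toy sketch of utilization-driven clock selection, as described above.
# Thresholds and clocks are invented for illustration only.
PROFILES = [  # (min_utilization, clock_mhz), highest threshold first
    (0.90, 950),   # sustained near-full load -> boost clock
    (0.70, 705),   # heavy load -> baseline 3D clock
    (0.40, 500),   # moderate load -> downclock, raising utilization
    (0.00, 300),   # idle / light desktop work
]

def pick_clock(utilization):
    """Return the clock of the first profile whose threshold the load meets."""
    for threshold, clock_mhz in PROFILES:
        if utilization >= threshold:
            return clock_mhz
    return PROFILES[-1][1]  # fallback (unreachable with a 0.0 threshold)

print(pick_clock(1.00))  # 950: full boost under sustained real load
print(pick_clock(0.50))  # 500: no point running flat out at half load
print(pick_clock(0.05))  # 300: idle clock
```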

    According to?

    *crickets*

Yeah, I thought so. People need to remember this isn't just a rework of Fermi, so it's not going to behave the same. It's more than likely going to require less power, run cooler, and perform slightly better, given the specs. Stop trying to make it sound like AMD released the greatest GPU ever, and look at the facts. Ben said it correctly: Tahiti is pretty terrible compared to Pitcairn, which is amazing.
     
  9. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,943 (3.85/day)
    Thanks Received:
    3,534
    Location:
    Quantum well (UK)
    Price drop? Hell yeah, I might just get myself a second one and have some SLI fun with it.

    Note that this here enthusiast has no idea about being sensible with money and I'd still get myself the latest card, regardless. :D
     
  10. NanoTechSoldier New Member

    Joined:
    Mar 10, 2012
    Messages:
    27 (0.03/day)
    Thanks Received:
    7
    What Would You Sell A World First 28nm Graphics Card For..??

    I Think It's A Reasonable Price, For The "Advanced Micro Devices" HD7000 Series.. Plus They're PCIE 3.0 Cards etc.. Not PCIE 2.0, Like The Nvidia Series Cards...

    Graphics Cards, Have A Lot To Do With The Drivers, To Get Performance Too..

    OpenGL Drivers, Tell The OS (Or Application), What To Do & DirectX Drivers, Wait For The OS (Or Application), To Tell The Graphics Card, What To Do & Can Slow Performance Down etc.. (CPU Interupts = CPU Load etc..)

    On The Other Hand.. If An OpenGL Driver, Isn't Written Properly Or Has Bugs.. It Can Cause Problems In System Too & Cause A Performance Drop..

    Point Being... Don't Base Your Graphics Card Purchases, On Price... But, On The Software, Drivers & GPU/s That Run Them etc..
    Drivers Can Always, Be Updated Though & An OpenGL Card Is The Best Option..
     
    Last edited: Mar 13, 2012
  11. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,576 (2.57/day)
    Thanks Received:
    1,329
    Again, some proof other than what you THINK is happening?


I have seen none; Nvidia seems to be doing a great job of keeping it under wraps. I for one welcome the competition, as my wallet wins, but I haven't seen anything concrete. Just a lot of speculation and rumors that are based on rumors that are based on an idea someone had about a post they saw somewhere else.
     
  12. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.39/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
Sorry, but that is not rumor. Look, Nvidia has been using that code-naming scheme for like forever, and there's not a single reason or piece of evidence that it is different this time around. A 4 at the end means midrange/performance. 300 mm² means midrange/performance. 256-bit means midrange/performance. It's known that a GK100 was in the works and then disappeared from the rumor mill, which is why it's supposedly cancelled. And there is certainly a GK110 in the pipeline too. Do you have any VALID reason to believe this chip is anything but their performance chip? No, you don't.
     
  13. xenocide

    xenocide

    Joined:
    Mar 24, 2011
    Messages:
    2,154 (1.57/day)
    Thanks Received:
    466
    Location:
    Burlington, VT
    All valid points. I still believe the GK104 was originally intended to go into a GTX660, and another offering (GK100 or GK110) was intended to be their high-end offering. Nvidia probably just saw the HD7970, did some internal testing, and decided they could make more money just using the GK104 so they bumped a GTX660 to GTX680 and called it a day.

    That it's now called the GTX680 :wtf:

Except that they will probably price the GTX 680 at $600, a GTX 670 (which falls between the HD 7950 and HD 7970) at $450-500, and so on. Odds are there won't really be a price war, since AMD set their prices high, so Nvidia has no reason to lower theirs.
     
  14. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,576 (2.57/day)
    Thanks Received:
    1,329
I find it funny that you all think Nvidia/ATI set complete card prices, when they mostly manufacture a small piece of silicon that is soldered to a board with GDDR, VRMs, a PCB, and many other components that cost money.



ATI sells batches of 7970 GPU dies to Sapphire, HIS, XFX..... at the same price, and it is up to the board maker and the retailer to set the retail price. Same for Nvidia; they have dick all to do with retailers jacking up prices.

I also don't have any reason to doubt the existence of a flying spaghetti monster, that I can fly if I believe hard enough, and that I'm superman.

Nothing against you Bene, but no one in ANY thread has posted anything other than "well, they did X in the past". And if the rumor mill is to be believed, they have had yield issues, heat issues, and performance issues too.
     
  15. CrAsHnBuRnXp

    CrAsHnBuRnXp

    Joined:
    Oct 19, 2007
    Messages:
    5,573 (2.12/day)
    Thanks Received:
    675
Do you write every word in capital letters when using pen and paper too? If not, why do it here? It makes no sense.
     
  16. Dj-ElectriC

    Dj-ElectriC

    Joined:
    Aug 13, 2010
    Messages:
    2,250 (1.41/day)
    Thanks Received:
    851
    NanoTechSoldier,


On a more serious note, a lot of what you wrote are more gimmicks than actually helpful features.
BTW, reading your comments makes my head ache: caps and random punctuation. No offense.
     
    Last edited: Mar 13, 2012
  17. Horrux

    Horrux

    Joined:
    Jun 2, 2011
    Messages:
    735 (0.56/day)
    Thanks Received:
    124
You are correct: the last thing nVidia wants is to force AMD to lower their prices. If they did, it would mean their own offerings were underpriced, and nothing good can come of that for these companies. They maximize profits by pricing comparably on a price/performance basis. But it is definitely not competition.
     
  18. Nihilus

    Joined:
    Jul 19, 2011
    Messages:
    219 (0.17/day)
    Thanks Received:
    21
    Change of heart

A lot of people went from "ROAR, this Kepler will destroy AMD" to "The real Kepler will come much later." Why? Very few can afford the top-tier Nvidia cards as opposed to the AMD cards. The whole point of the Kepler excitement is to get HD 7970 prices down, not to see who has the biggest e-peen. If the GTX 680 has similar performance and price to the GTX 580 with lower power consumption, it is still a win. :toast:
     
  19. xenocide

    xenocide

    Joined:
    Mar 24, 2011
    Messages:
    2,154 (1.57/day)
    Thanks Received:
    466
    Location:
    Burlington, VT
    7970 = $550 MSRP
    GTX680 = $550 MSRP (According to Rumors)

    Whaaaaaaa???
     
  20. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.39/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
No, it's not, unless they do something like the 8800 GT and price it accordingly, below $400 at least. The point is that high-end cards from both Nvidia (GTX 500) and AMD (HD 6000) have been selling for the same price since they launched 15 months ago. HD 7000 raised the price point instead of lowering it, and apparently Nvidia will just follow suit, which makes all the sense in the world for them but is just crap for us. Only 2-3 years ago, similarly sized chips with the same number of VRAM chips and similar VRM circuitry were selling for $150; now we have to pay 3-4x as much for the same thing.

Not to mention that the GTX 570 and the GTX 560 Ti and non-Ti have always sold cheaper than their AMD counterparts. Only the flagship GTX 580 has been more expensive.
     
  21. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,191 (0.88/day)
    Thanks Received:
    90
    Location:
    So. Cal.
Got me there... it took them 5 months to get to a G94 9800 GTX, but some would say they had no reason to rush since there was no competition. I'll give you that.

Can't say I had heard of a set road map with intended releases within a quarter... I'll take your word on that.

My mistype, but as you say, "may or may not be what GK100 was"; so in other words the GK100 went in the trash... got it!

Oh, so you have knowledge of what it's actually going to do? Care to share?

The 7970 released 12/27/11. I said "which will have been out almost 5 months before you get what's called a GTX 680 in hand". Now for most that's reality, as any average guy who wants one will be camped out on every e-tailer hoping to get one in the basket and paid for before the other guy. Ten per e-tailer isn't availability; it's called beta or pre-production. Real accessibility will be at least mid-April, so I stand by 5 months.

That's true... irrefutable data and specs, but the room also has a huge gaping hole where that "whale" went missing. Without a GK1X0 to quantify GK104 Kepler against, I suppose we aren't permitted to "speculate" without specs... we wait. :D
     
  22. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,191 (0.88/day)
    Thanks Received:
    90
    Location:
    So. Cal.
    btarunr...? (so yes I digress) "705 MHz, which clocks down to 300 MHz when the load is lowest, and the geometric domain (de facto "core") will clock up to 950 MHz on high load."
    http://www.techpowerup.com/forums/showthread.php?t=162035

That says it might have profiles that bump the clocks 35% over baseline, as I read it. :cool:
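The 35% figure follows directly from the clocks quoted from the linked post (705 MHz baseline, 950 MHz maximum, 300 MHz idle):

```python
# Quick check of the "35% over baseline" reading, using the clocks
# quoted from the linked TechPowerUp post.
base_mhz, boost_mhz, idle_mhz = 705, 950, 300

boost_over_base = boost_mhz / base_mhz - 1   # how far above baseline it boosts
idle_below_base = 1 - idle_mhz / base_mhz    # how far below baseline it idles

print(f"boost over baseline: {boost_over_base:.0%}")       # ~35%
print(f"idle drop below baseline: {idle_below_base:.0%}")  # ~57%
```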

    Nice opinion, while it's hard to be your own (only) competition.

    Am I not allowed an opinion just like "Benny" ^? :D
     
  23. Casecutter

    Joined:
    Apr 19, 2011
    Messages:
    1,191 (0.88/day)
    Thanks Received:
    90
    Location:
    So. Cal.
First, all 28nm GPU production saw a cost increase that basically wiped out the normal incentive of moving to a die shrink; AMD has contended with that, and so has Nvidia by working from a GK104.
http://forums.nvidia.com/index.php?showtopic=210049

Consider that the GTX 580's MSRP was $500 with 1.5GB and hadn't deviated much from that in 17 months, though it's now about 15% less in the market. The 7970 comes with 3GB, a 15-18% increase in performance, better efficiency, and matches the GTX 580's 384-bit bus, and for that it started out asking an extra 10%.

If Nvidia can bring itself to provide a GTX 680 that can overtake the 7970 here or there for $500, that's how I figure it. But we realize it uses a much more cost-effective chip, a 512-bit bus (though on that I'm not sure; some say 256-bit), and probably just 2GB. But for that you get those Dynamic Profiles.
     
  24. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.39/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
There should be no need for that. I just read the same as you did; the difference is that I paid attention. When at 100% REAL load (not what software shows, which is always false), the GPU will clock to its highest clocks, ALWAYS, even going into OC territory if the 100% is maintained for a long time. A long time in GPU terms, so milliseconds, after which the 100% load (again, REAL, not what Afterburner shows) will be gone and another profile will be loaded. And when load is lower than 100%, say 50%, it will be clocked much lower, so that the chip jumps to a higher utilisation rate. The basis is that, e.g., 500 SPs @ 1000 MHz consume much more than 1000 SPs @ 500 MHz. This will be adjusted dynamically by the hardware, differently for each clock domain and with dozens of profiles, so for 100%, 95%, 90%, etc.
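The "wide and slow beats narrow and fast" point comes from how dynamic power scales. A minimal sketch, assuming the usual P ∝ n·C·V²·f model; the voltages below are made up for illustration, not real GPU figures:

```python
# Why 1000 SPs @ 500 MHz can draw less power than 500 SPs @ 1000 MHz
# for the same nominal throughput: dynamic power scales roughly as
# P ~ units * C * V^2 * f, and higher clocks usually need higher voltage.
# All numbers here are illustrative, not real GPU figures.
def dynamic_power(units, freq_mhz, volt, cap=1.0):
    """Relative dynamic power in arbitrary units."""
    return units * cap * volt**2 * freq_mhz

narrow_fast = dynamic_power(units=500, freq_mhz=1000, volt=1.1)  # higher clock needs more voltage
wide_slow = dynamic_power(units=1000, freq_mhz=500, volt=0.9)    # lower voltage suffices

# Same units*MHz product (500,000 both ways), yet roughly a third less
# power in the wide-and-slow configuration:
print(narrow_fast, wide_slow)
```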

Lol, yeah, and no one will get a card 2 months after official release, sure... :rolleyes:
Either you compare official launches against each other or you simply don't. AMD did a paper launch like no other, but now it's time to disregard that and claim that Nvidia will do a paper launch with 3 months between official launch and availability. Absurd and flawed thinking based on your pure speculation. Dec 22 vs March 22 == 3 months. Period.
     
  25. NanoTechSoldier New Member

    Joined:
    Mar 10, 2012
    Messages:
    27 (0.03/day)
    Thanks Received:
    7
    It's more to piss you Grammer Effected off than anything... LOL.. + to Stop people, using translators etc..

    I hate, having to repeat myself, to people, that can't understand English.. It wastes my time & money...

    "When Life, Gives You Lemons.. Your Screwed.."

    That's the only way, you take it.. With a strap-on.. :confused: LOL..
     
