
Nvidia responds to GTX480 heat concerns

Discussion in 'General Nonsense' started by Goodman, Apr 3, 2010.

  1. Goodman

    Goodman

    Joined:
    Jun 13, 2009
    Messages:
    1,519 (0.80/day)
    Thanks Received:
    324
    Location:
    Canada/Québec/Montreal
    http://www.maximumpc.com/article/news/nvidia_responds_gtx_480_heat_concerns

What do you think: was that a good tradeoff, or a fail...?
  2. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,883 (4.53/day)
    Thanks Received:
    6,963
    Location:
    Edmonton, Alberta
They ruined it when they introduced temp sensors in the first place. Cooking eggs on PC hardware isn't something new...

    Of course the temps don't matter, as long as it really doesn't affect the longevity of the product. Time will tell that story, though.
  3. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,158 (11.65/day)
    Thanks Received:
    9,477
Temps aren't so much the problem on the card; it's the heat in the case.

    300W of heat isn't something easy to dissipate in the average PC case, especially not with low noise.
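A rough back-of-envelope sketch of why 300W is hard to exhaust quietly, using the standard sensible-heat relation ΔT = P / (ρ · cp · V̇) for air; the fan figures below are hypothetical ballpark numbers, not measurements of any particular case:

```python
# Estimate the case-air temperature rise from a 300 W card, assuming
# (hypothetically) all the heat is carried out by case airflow alone.
# dT = P / (rho * cp * V_dot), with air density rho ~1.2 kg/m^3 and
# specific heat cp ~1005 J/(kg*K) at room temperature.

CFM_TO_M3S = 0.000471947  # 1 cubic foot per minute in m^3/s

def temp_rise_c(power_w: float, airflow_cfm: float) -> float:
    rho = 1.2      # kg/m^3, air at ~20 C
    cp = 1005.0    # J/(kg*K)
    v_dot = airflow_cfm * CFM_TO_M3S  # volumetric flow in m^3/s
    return power_w / (rho * cp * v_dot)

# A quiet 120 mm fan moves very roughly 50 CFM:
print(round(temp_rise_c(300, 50), 1))   # ~10.5 C rise above intake air
print(round(temp_rise_c(300, 100), 1))  # ~5.3 C with double the airflow
```

So even with all 300W going straight out the exhaust, a single quiet fan's worth of airflow leaves the case air noticeably warmer than the room, which is exactly the heat the neighboring components sit in.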
  4. Phxprovost

    Phxprovost Xtreme Refugee

    Joined:
    Apr 6, 2009
    Messages:
    1,215 (0.62/day)
    Thanks Received:
    262
    Location:
    Pennsylvania
:shadedshu Can we just stop with this Nvidia Fermi marketing bullshit?
    1. Heat isn't a "trade off"; it's an inherent problem that has always affected and will always affect the operation of chips, until we get to really small processes on new materials and make drastic changes to general consumer cooling methods.
    2. Last time I checked, just because your chip can run at high heat and not die within the first ~hour of use does not mean it was "engineered to run at high heat".
    shevanel, copenhagen69, AsRock and 6 others say thanks.
  5. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,158 (11.65/day)
    Thanks Received:
    9,477
They could use components rated to 150°C and it wouldn't change the fact that the rest of the hardware around it can't take that kind of heat.

    Imagine 100°C+ heat right next to your NB or CPU heatsink, feeding that heat into them? It's not good.

    I've got enough heat problems with my two 4870s, and they're spaced out with two large heatsinks and two fans; let alone the 480 having it all condensed into one spot.
  6. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    12,275 (4.69/day)
    Thanks Received:
    1,419
Even military-standard components need to be kept cool. Just because they state they are designed for high temps doesn't mean they should be run at high temps; otherwise they will fail quickly or cause adverse effects on the nearby components. TBH the real Fermi should be the minor revision that NV usually pulls, like they did with the GF8 and 200 series components.
  7. AthlonX2

    AthlonX2 HyperVtX™

    Joined:
    Sep 27, 2006
    Messages:
    7,159 (2.47/day)
    Thanks Received:
    1,654
Honestly, if anyone can afford to purchase a Fermi then they can afford a good case... if anyone tries to stuff a Fermi in a P180 they're stupid.
  8. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    12,275 (4.69/day)
    Thanks Received:
    1,419
I wanna bet I can stuff the Fermi, the dual-GPU version of it, or two 5970s in my Antec SX830 case and not have issues with cooling.
  9. AthlonX2

    AthlonX2 HyperVtX™

    Joined:
    Sep 27, 2006
    Messages:
    7,159 (2.47/day)
    Thanks Received:
    1,654
Honestly, I had never heard of the Antec 830 you spoke of, so I did a quick Google search. That case is truly an antique, but if it suits your needs I see no reason why it couldn't do the job ;) I personally run all of my systems on a tech bench.
  10. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    12,275 (4.69/day)
    Thanks Received:
    1,419
I can change the guts out easily and everything would still fit properly; if worst came to worst I could modify it, such as changing the 80 mm fans to 120s, etc. The only thing it wouldn't support is any cooling that is larger than 8" tall.
  11. Goodman

    Goodman

    Joined:
    Jun 13, 2009
    Messages:
    1,519 (0.80/day)
    Thanks Received:
    324
    Location:
    Canada/Québec/Montreal
I agree with you... only time will tell...
    Still, I wouldn't buy such a card, but I would buy a 5870 any day :)
  12. Super XP

    Super XP

    Joined:
    Mar 23, 2005
    Messages:
    2,754 (0.80/day)
    Thanks Received:
    538
    Location:
    Ancient Greece, Acropolis
This just means NVIDIA once again proves Fermi runs like a BBQ.
  13. Kantastic

    Kantastic

    Joined:
    May 12, 2009
    Messages:
    5,156 (2.66/day)
    Thanks Received:
    993
    I don't see the 5870 running that hot despite the minuscule performance difference.
  14. Lionheart

    Lionheart

    Joined:
    Apr 30, 2008
    Messages:
    4,037 (1.74/day)
    Thanks Received:
    799
    Location:
    Milky Way Galaxy
    Fermi fail
    Super XP says thanks.
  15. boulard83

    Joined:
    Jan 28, 2009
    Messages:
    399 (0.20/day)
    Thanks Received:
    61
I'll wait for a revision of Fermi. Maybe in a few months they're gonna be able to put something interesting on the table. Let's hope.

    Any high-end gamer/user has a decent case and airflow, but if a traditional guy buys this card he's gonna fry everything! ;)
  16. driver66

    driver66 New Member

    Joined:
    Jun 4, 2007
    Messages:
    1,046 (0.40/day)
    Thanks Received:
    111
    Location:
    indiana
PLEASE!!!! FOR THE LOVE OF GOD (or whatever) STFU about Fermi. It's all rehashed shit that just causes trolling, fanboi shit and flaming!! Let it go :shadedshu
  17. Lionheart

    Lionheart

    Joined:
    Apr 30, 2008
    Messages:
    4,037 (1.74/day)
    Thanks Received:
    799
    Location:
    Milky Way Galaxy
    I agree!
  18. sneekypeet

    sneekypeet Unpaid Babysitter Staff Member

    Joined:
    Apr 12, 2006
    Messages:
    21,528 (7.02/day)
    Thanks Received:
    6,047
  19. DonInKansas

    DonInKansas

    Joined:
    Jun 2, 2007
    Messages:
    5,096 (1.92/day)
    Thanks Received:
    1,265
    Location:
    Kansas
    Quoting out of context: Way fun!
    TRIPTEX_CAN and entropy13 say thanks.
  20. entropy13

    entropy13

    Joined:
    Mar 2, 2009
    Messages:
    4,917 (2.45/day)
    Thanks Received:
    1,193
    "I'm beating a dead horse. Backwards."
  21. Goodman

    Goodman

    Joined:
    Jun 13, 2009
    Messages:
    1,519 (0.80/day)
    Thanks Received:
    324
    Location:
    Canada/Québec/Montreal
Only an Nvidia fanboy like you would say such a thing.... :laugh: (j/k) ;)
    driver66 says thanks.
  22. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    19,859 (6.20/day)
    Thanks Received:
    5,961
It is a big, hot, powerful GPU. I think nVidia tried way too hard with Fermi, trying to make it more than it needed to be. It has processor elements that might make it a powerhouse for doing CUDA work, and it might even make it a DX11 powerhouse too, but really we just want a graphics card.

    I think we would all have been happier if they had just taken GT200b, moved to 40nm, added GDDR5 and DX11 support, and doubled the specs (keeping the 256-bit memory bus, though, since the move to GDDR5 effectively doubles its bandwidth).
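For what it's worth, a quick sketch of that bandwidth point. GDDR3 moves 2 bits per pin per memory clock, GDDR5 moves 4, so the same bus width carries twice the data at the same clock (the 1000 MHz clock below is hypothetical, just for round numbers):

```python
# Peak memory bandwidth in GB/s from bus width, memory clock, and
# transfers per clock (2 for GDDR3-style DDR, 4 for GDDR5-style QDR).

def bandwidth_gbs(bus_bits: int, mem_clock_mhz: float, transfers_per_clock: int) -> float:
    # bytes/s = (bus width in bytes) * clock in Hz * transfers per clock
    return (bus_bits / 8) * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

# Hypothetical 1000 MHz memory clock on the same 256-bit bus:
print(bandwidth_gbs(256, 1000, 2))  # GDDR3-style: 64.0 GB/s
print(bandwidth_gbs(256, 1000, 4))  # GDDR5-style: 128.0 GB/s
```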

You know, nVidia had a real opportunity. G92 being so good allowed them to re-use it as the mid-range segment of the GT200 generation, instead of spending R&D time/money to develop a GT200 derivative that would have been almost identical to G92 anyway. But I wonder where that extra R&D time went, because it seems wasted to me. Or maybe they had too much time and over-engineered the damn thing. But why does it seem to me like they waited until the last minute and then just threw something together? I mean, there is no reason that Fermi should have come 6 months after RV870; with nVidia using G92 to fill the mid-range market, they should have had plenty of time to develop it and get it out before, or at least at the same time as, RV870.

    Now, that being said, I think Fermi is getting a bad rap mainly because of how amazingly good RV870 is. ATi seems to have hit on a real winner. I said this in a previous thread, but it is an amazing feat for a GPU to outperform the dual-GPU cards of the previous generation, a feat that had never actually been done before Fermi. That alone is amazing, but add to it the fact that it does so with better heat and power usage numbers than said previous-generation dual-GPU cards. I think, if RV870 weren't in the equation, our views of Fermi would actually be a lot different. It is only because RV870 is so amazing that Fermi doesn't look good. It is kind of like putting a Corvette ZR1 next to a Lamborghini. The Corvette by itself is actually a pretty decent supercar. However, when compared to the Lamborghini, it looks like a huge underperforming gas guzzler that uses nothing but brute force to achieve its performance.
    runnin17, TRIPTEX_CAN, Steevo and 4 others say thanks.
    Crunching for Team TPU More than 25k PPD
  23. driver66

    driver66 New Member

    Joined:
    Jun 4, 2007
    Messages:
    1,046 (0.40/day)
    Thanks Received:
    111
    Location:
    indiana
    Very well put :toast: /close Fermi threads
  24. SNiiPE_DoGG New Member

    Joined:
    Apr 2, 2009
    Messages:
    582 (0.29/day)
    Thanks Received:
    135
I think this is a bit of backwards thinking, because if you look at the G92 compared to the G80 it's not that different; then if you look at the GT200, it was quite a letdown in numbers compared to G80/G92. So when you say "G92 was so good" you really mean "GT200 was lackluster relative to its predecessor."

    In reality ATI was very, very far behind in the 3xxx series vs the G92, and it was Nvidia's shortcomings in its release of the GT200 that allowed the HD4K series to come rolling in so close in performance to Nvidia's offering. (We all remember the pricing fiasco.)

    In all of the past generations we can see ATI's small-chip strategy paying off: first in the pricing advantages during HD4K, and now in the huge power advantages (which should have been there in HD4K, but ATI never came through) in the HD5K series.
  25. Flyordie

    Flyordie New Member

    Joined:
    Oct 26, 2008
    Messages:
    1,870 (0.88/day)
    Thanks Received:
    247
Is the Thermaltake Element V average?

    Seriously though... it's why I bought the Element V from Sneeky; my In-Win only had one fan. It was a 120 mm exhaust fan, and whenever I played a game it put out a solid flow of heat.

    Two 120 mm fans, one intake and one exhaust, are good enough most of the time.
