
GTX295's future not bright enough to wear shades?

Discussion in 'NVIDIA' started by HossHuge, Dec 5, 2008.

  1. HossHuge

    HossHuge

    Joined:
    Jun 26, 2008
    Messages:
    2,299 (0.69/day)
    Thanks Received:
    893
    Location:
    EDM, AB, CAN
    Just came across this less-than-stellar article about the GTX295.

    from The Inquirer

    NVIDIA IS SET to trickle out the latest batch of 55nm parts. Expreview has some pictures and tidbits about the latest 55nm GT200/GT200b here, and some GX2 info here.

    It looks like the on-again, off-again GT200GX2 is on again, and it is called the GTX295. Yay. The 55nm parts, internally code-named GT206, are finally trickling out like we said they would, with no speed increases and no power gains. What should have been a simple optical shrink is turning into a totally botched job, with the 'real' 55nm parts unlikely to come out until late January at the earliest, following yet another spin.

    Given the lack of gains with the B2 stepping, the GX2/GTX295 still seem unmakable in volume, but such trifling concerns have never stopped Nvidia in the past. We hear they are going to launch it even though they can't make it, along with the requisite 19 parts to Newegg so they can claim it is on sale. Real volume won't happen until (if?) they can fix the power problems.

    We hear that the 'launch' is likely going to happen at a shindig on the 12th of December so they can claim the win they promised before the end of the year. One has to wonder if cherry picking parts in an attempt to use tame press to snow the public is the definition of 'Whoop-ass'? I am sure they will claim a stunning victory in any case.

    One way you can tell how screwed up the chip is is the use of a heat spreader and a stiffener (the metal ring around the chip). If you have a big die, you need mechanical support for it, or it can crack or break bumps. A stiffening ring is usually the cheapest and most efficient way to go, but in many cases, a heat spreader will do the same job.

    The problem with a heat spreader is that it introduces two additional thermal barriers, the paste under the lid and the lid itself, to the cooling of the silicon. Each one makes cooling incrementally less efficient, not to mention material and assembly costs. You don't do this unless you have to.
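
    To put rough numbers on that point, here is a minimal Python sketch of the series thermal-resistance stack being described; every power and resistance value in it is an illustrative assumption, not a measured GT200 figure.

    ```python
    # Rough junction-temperature estimate for a GPU cooling stack.
    # All values below are illustrative assumptions, not measured GT200 data.

    def junction_temp(power_w, ambient_c, resistances_k_per_w):
        """Junction temperature when thermal resistances add in series."""
        return ambient_c + power_w * sum(resistances_k_per_w)

    POWER = 180.0    # assumed GPU power draw, watts
    AMBIENT = 40.0   # assumed case air temperature, deg C

    # Bare die: paste between die and heatsink, then heatsink to air.
    bare_die = [0.05, 0.15]                   # K/W

    # With a heat spreader: paste under the lid and the lid itself are
    # two extra layers in the same series stack.
    with_spreader = [0.05, 0.05, 0.05, 0.15]  # K/W

    print(f"bare die:      {junction_temp(POWER, AMBIENT, bare_die):.1f} C")
    print(f"with spreader: {junction_temp(POWER, AMBIENT, with_spreader):.1f} C")
    ```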

    If you are wondering why every modern CPU out there has one, the answer is simple: so ham-handed monkeys like most DIY people don't crack the die when they clamp the heatsink on. Think AMD K8 here. CPU makers think the cost of a spreader, and the reduction in performance it brings, is worth the protection it gives.

    GPUs, however, come assembled. Factory robots don't break chips, so the mechanical protection is not an issue, but the cost remains. So why did Nvidia do it on the GT200? They can't control hot spots. The lid is a heat spreader, and it helps keep chips with poor hot spot control alive and working.

    When you see a heat spreader on a part that comes assembled, it is a pretty sure sign something is wrong thermally; it simply is not worth the cost and performance drop otherwise. Make no mistake, the spreader and stiffener combo on the GT200b is a bad, bad sign.

    Why is the GT200b such a clustered filesystem check? We heard the reason, and it took us a long time to actually believe it: they used the wrong DFM (Design For Manufacturing) tools for making the chip. DFM tools are basically a set of rules from a fab that tell you how to make things on a given process.

    These rules can be specific to a single process node, say TSMC 55nm, or they can cover a bunch of them. In this case, the rules basically said what you can or can not do at 65nm in order to have a clean optical shrink to 55nm, and given the upcoming GT216, likely 40nm as well. If you follow them, going from 65nm to 55nm is as simple as flipping a switch.
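
    As a back-of-the-envelope illustration of what a clean optical shrink buys (my arithmetic, not Nvidia's numbers), a linear shrink from 65nm to 55nm scales die area by (55/65)^2, roughly a 28% reduction:

    ```python
    # Ideal optical-shrink arithmetic. The 576 mm^2 starting point is the
    # commonly quoted 65nm GT200 die size, used here purely for illustration.

    def shrink_area(area_mm2, old_node_nm, new_node_nm):
        """Die area after a pure optical (linear) shrink, ignoring layout changes."""
        return area_mm2 * (new_node_nm / old_node_nm) ** 2

    gt200_65nm = 576.0                        # mm^2, widely reported figure
    gt200_55nm = shrink_area(gt200_65nm, 65, 55)

    print(f"linear scale factor: {(55 / 65) ** 2:.3f}")   # ~0.716
    print(f"ideal 55nm die:      {gt200_55nm:.0f} mm^2")  # ~412 mm^2
    ```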

    Nvidia is going to be about six months late with flipping that switch. After three jiggles (GT200-B0, -B1 and -B2), it still isn't turning on the requested light, but given the impending 55nm 'launch', it is now at least making sparking sounds.

    The real question is, with all the constraints and checks in place, how the heck did Nvidia do such a boneheaded thing? Sources told us that the answer is quite simple: arrogance. Nvidia 'knew better', and no one is going to tell them differently. It seems incredible unless you know Nvidia; then it makes a lot of sense.

    If it is indeed true, they will be chasing GT200 shrink bugs long after the supposed release of the 40nm/GT216. In fact, I doubt they will get it right without a full relayout, something that will not likely happen without severely impacting future product schedules. If you are thinking that this is a mess, you have the right idea.

    The funniest part is what is happening to the derivative parts. Normally you get a high end device, and shortly after, a mid-range variant comes out that is half of the previous part, and then a low end SKU that is 1/4 of the big boy. Anyone notice that there are all of zero GT200 spinoffs on the roadmap? The mess has now officially bled over into the humor column.

    ouch!! He doesn't paint a pretty picture.
     
  2. KBD New Member

    Joined:
    Feb 23, 2007
    Messages:
    2,477 (0.65/day)
    Thanks Received:
    279
    Location:
    The Rotten Big Apple
    Interesting, but I don't know how trustworthy this source is. I think it confirms the general trend, though, that Nvidia is not the powerhouse it used to be; their GPUs are facing stiff competition from ATI, and their chipset division hasn't been heard from lately.
     
    10 Year Member at TPU
  3. trickson

    trickson OH, I have such a headache

    Joined:
    Dec 5, 2004
    Messages:
    6,486 (1.40/day)
    Thanks Received:
    955
    Location:
    Planet Earth.
    Maybe so, but I wouldn't count them out just yet. ;)
     
    10 Year Member at TPU
  4. erocker

    erocker Senior Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    42,171 (10.47/day)
    Thanks Received:
    18,115
    The first half of the story describes Nvidia's poor practice of using thermal paste, stiffeners, and IHSes. No different from G80, and those turned out to be fine cards. It would be nice if they tried a little harder in the efficiency department. People will still buy them.
     
    10 Year Member at TPU
  5. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (1.92/day)
    Thanks Received:
    900
    Location:
    Sector ZZ₉ Plural Z Alpha
    ATI has been rubbing it in their face recently as well - if ATI continues on their current war-path, nVidia will be forced to change and at least move up to "current" technology . . . meaning GDDR4 or GDDR5, getting away from the monolithic processors, etc.

    as to Inq - they're hit or miss, take their articles with a grain of salt . . . they're the wiki of the tech-reporting world.



    Yep, people will still buy nVidia - the 1337-fans are numerous. It has a lot to do, IMO, with how widely nVidia has spread their logo and name.
     
    10 Year Member at TPU
  6. kid41212003

    kid41212003

    Joined:
    Jul 2, 2008
    Messages:
    3,588 (1.08/day)
    Thanks Received:
    539
    Location:
    California
    I never trust The Inquirer.
    BS, I say.
     
  7. HolyCow02

    HolyCow02 New Member

    Joined:
    Jun 8, 2008
    Messages:
    1,638 (0.49/day)
    Thanks Received:
    111
    Location:
    New York
    The Inq is on and off. And of course it won't matter how bad the heat is; nvidia fanboys will still buy the card and tout its greatness. Just like ATI people did with the 2900 series, which everyone knew was horrible.
     
  8. cooler

    cooler New Member

    Joined:
    Nov 6, 2007
    Messages:
    124 (0.03/day)
    Thanks Received:
    20
    The article may be BS.

    But

    mmm... interesting
     
  9. trickson

    trickson OH, I have such a headache

    Joined:
    Dec 5, 2004
    Messages:
    6,486 (1.40/day)
    Thanks Received:
    955
    Location:
    Planet Earth.
    Funny how everyone wants to put that stake in Nvidia's heart when it takes an HD4870 with 1 GB of RAM to best (and not by much) a GTX260! Makes me laugh. And it takes a card that is rated to run at 110C, the HD4870 X2 with 2GB of RAM, to beat the GTX260! Then this? COME on! I call BS! Sounds like someone is desperate to see Nvidia go down, just like they want Intel to go down when AMD is talked about!
     
    10 Year Member at TPU
  10. HossHuge

    HossHuge

    Joined:
    Jun 26, 2008
    Messages:
    2,299 (0.69/day)
    Thanks Received:
    893
    Location:
    EDM, AB, CAN
    I just read some other stuff from the same writer, and he does seem to have a negative view of the green team. I guess time will tell.
     
  11. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    33,698 (9.41/day)
    Thanks Received:
    17,228
    Location:
    Hyderabad, India
    Jen Hsun Huang's dog raped Charlie's dog. Since then they don't get along well...bad speculation, isn't it?

    NVIDIA is taking its sweet time releasing its 55nm G200 parts, meaning they're not taking chances with it. They could've rushed it, but as you can see, you won't even be seeing them in time for the crucial Xmas shopping season. Tells you something about NV.
     
    phanbuey says thanks.
  12. KBD New Member

    Joined:
    Feb 23, 2007
    Messages:
    2,477 (0.65/day)
    Thanks Received:
    279
    Location:
    The Rotten Big Apple

    Agreed right there; I used mostly Nvidia until now myself. I do like their cards and mobos, and you are right, their name recognition is huge, so it's not only fanboys who'll be buying their stuff. It's in the interest of the consumer that Nvidia releases something darn good to keep the balance in the GPU market.
     
    10 Year Member at TPU
  13. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (1.92/day)
    Thanks Received:
    900
    Location:
    Sector ZZ₉ Plural Z Alpha
    they'll continue - it's a shame, though, as I've mentioned before, ATI would greatly benefit from some of the same marketing tactics. If they got their name out there more, people would buy.

    At one point, I had mentioned the idea that ATI could start asking the big CGI movie companies to place a 15-20 second ad at the beginning of their films . . . most of the companies that do these movies have said they use ATI and/or AMD hardware for production; imagine how many people would see the ATI/AMD logo when these films come out, and again when released to DVD? ATI have been known and regarded for a very long time as having superior IQ, and people wanting to build a media center PC, with a focus on HD media, would be more apt to purchase ATI if they believed those components to be superior in IQ . . .

    ATI's marketing division has been in need of a thorough overhaul since the X1000 series


    The average 4870 doesn't have 1GB of GDDR5, only 512MB - the top-tier cards have 1GB.

    Even still, at market release, the 4870 was over $100 cheaper than a 260 - why would anyone purchase a 260 at that price point? The only thing that has kept the 260 competitive is the massive price-hacks nVidia had to administer to their products.

    BTW, a 4870x2 slaughters a GTX260, and is a match for a GTX 280. Actually, the 4870x2 and GTX280 are pretty close in most aspects - but if you're looking for a single card solution, and have a large monitor, the 4870x2 is the better option.
     
    10 Year Member at TPU
  14. kid41212003

    kid41212003

    Joined:
    Jul 2, 2008
    Messages:
    3,588 (1.08/day)
    Thanks Received:
    539
    Location:
    California
    I think if they could, they would already be doing it; those ads are too costly.
    That's why some people still ask what Pentium it is, even when your computer is an AMD one.
     
  15. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (1.92/day)
    Thanks Received:
    900
    Location:
    Sector ZZ₉ Plural Z Alpha
    Good point by itself - neither AMD nor ATI have ever really had the funds for marketing like Intel or nVidia (although ATI had been capable of it in the past). But, like most things, there can be no rewards without making an effort - you must take a risk to earn from it.

    But, the red camp has always played it safe . . .
     
    10 Year Member at TPU
  16. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,831 (1.04/day)
    Thanks Received:
    600
    Meh, I couldn't care less either way who is on 'top.'

    I just want GPUs to stop touting "100FPS!" and start shouting, "Minimum 40FPS guaranteed!"

    The G80 was the last card that even sparked me to be slightly impressed.
     
    10 Year Member at TPU
  17. wolf2009 Guest

    Tells you that maybe something is wrong at NV. They are not at the forefront of technology. Late to 55nm, late to DX10.1, late to DX9, late to GDDR4, GDDR5.
     
    10 Year Member at TPU
  18. Drizzt5

    Drizzt5 New Member

    Joined:
    Jul 17, 2008
    Messages:
    612 (0.19/day)
    Thanks Received:
    38
    The 4870x2 creams the GTX260, and takes the GTX280 down, especially at higher resolutions.
    It doesn't matter how much RAM is on the thing (you also left out the fact that it is GDDR5); it's about the price/performance, and ATI has won this round.
    And yes, market release was just crazy. The 9800GTX vs the 4850 was epic, then the GTX260 and GTX280 came out expensive, and then the 4870 came along and rocked things. And then the 4870x2 was mostly everything we expected.

    And did anyone notice that when you install CCC and the display drivers, it advertises ATI video cards?
     
  19. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (1.92/day)
    Thanks Received:
    900
    Location:
    Sector ZZ₉ Plural Z Alpha
    agreed, and they'll be late to DX11, PCIE 3.0, GDDR6, etc, etc
     
    10 Year Member at TPU
  20. phanbuey

    phanbuey

    Joined:
    Nov 13, 2007
    Messages:
    5,836 (1.65/day)
    Thanks Received:
    1,395
    Location:
    Austin Texas
    lol... this is gooooood stuff. I love a little bit of healthy smack-talkin'... Nvidia wants the performance crown in the single-card dept.; that is why they're going with the 295.

    That may be, but they are never later than the developers!... What good are DX10.1 and 11 if no one uses them yet? The 55nm bit is true tho; they sat on their butts because they dominated for so long and finally got what they deserved.
     
  21. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    33,698 (9.41/day)
    Thanks Received:
    17,228
    Location:
    Hyderabad, India
    No, it tells you they aren't taking any chances, and making sure current inventories of GTX 260 (192), 260 (216) and 280 get digested. GTX 260 (192) at $200~$220 and Core 216 for a little more, are enough to let NV sail through xmas.
     
  22. HolyCow02

    HolyCow02 New Member

    Joined:
    Jun 8, 2008
    Messages:
    1,638 (0.49/day)
    Thanks Received:
    111
    Location:
    New York

    I would hope they would advertise their cards, since those are the only cards that can use the drivers. So you only see the ads when you already own one of their cards... ;)

    If anything, they need to advertise that a certain game was developed on their hardware... like at the beginning of Crysis where you saw the Nvidia and Intel logos
     
  23. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,979 (2.15/day)
    Thanks Received:
    1,754
    Location:
    PA, USA
    I've got to agree the 4870x2 is a brilliant piece of machinery, and I kind of hesitate to think a 260x2 can beat it. One thing this article is saying is that 55nm GTX2xx cards may take a while before they work perfectly. Come on guys, the 65nm cards work beautifully. After owning a couple I prefer them to ATi cards for gaming, and I was excited for 55nm, but I had heard there were problems with their process. Problems with cramming that much of a chip into a small space don't surprise me. You probably won't see as much of a difference between 65nm and 55nm as you would between 65nm and 45nm. Personally I think nVidia should just drop the 55nm and get straight to working on their next big idea. Staying one step behind the competition is no way to win in this bear market.
     
  24. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (1.92/day)
    Thanks Received:
    900
    Location:
    Sector ZZ₉ Plural Z Alpha
    I would agree on that, but nVidia have been proving quite the opposite for years now . . .

    but, when ATI does stick it to them, I'm sure it hurts. The last time nVidia felt this kind of competition from ATI was the X1900 series.

    This competition is good for us all, though. We will see more innovative products (instead of the re-hashing/re-stickering that goes on when competition isn't up to par . . . *cough*nVidia*cough*), and pushing for new tech to oust the competition.
     
    10 Year Member at TPU
  25. wolf2009 Guest

    You talk like you know it all and that's the end of it; you are forgetting this is all speculation. Nobody except the people working at NV knows the concrete truth.

    I said "maybe". I didn't say something was definitely wrong at NV.
     
    10 Year Member at TPU
