
NVIDIA to Launch GeForce GTX 880 in September

Discussion in 'News' started by btarunr, Aug 1, 2014.

  1. jagd

    Joined:
    Jul 10, 2009
    Messages:
    456 (0.24/day)
    Thanks Received:
    89
    Location:
    TR
    Nvidia will use whatever is available to them at TSMC (assuming they don't change their foundry). There aren't many foundries, and switching to a new process (28nm to 20nm to 16nm, etc.) costs more with every step and brings more difficulty and problems.
    If you look at TSMC's 20nm struggle you'll see what I mean. Nvidia can't simply decide to skip to 16nm, for simple reasons: it will take more time, since getting 16nm installed in the fabs looks a long way off, and in the meantime Nvidia would be stuck at 28nm with a power and price disadvantage.

     
    64K says thanks.
  2. 64K

    64K

    Joined:
    Mar 13, 2014
    Messages:
    377 (1.99/day)
    Thanks Received:
    156
    Good points. I can't see Nvidia being able to stretch the 28nm Maxwells out for another year or more after the release of the GTX 880 either. The engineering for 20nm Maxwell has already been paid for by Nvidia, so they will probably want to release a line of 20nm Maxwells to recoup their investment.
     
  3. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,285 (2.56/day)
    Thanks Received:
    1,177
    We are at the verge of what silicon can do for us. Until we make a breakthrough in graphene or another substance to replace it, we are reaching the limit of how much performance we can get without going bigger on power use and heat dissipation. Essentially, we're close to the point where it's a choice between the Prius, the Evo, or the Ferrari of performance.
     
    10 Million points folded for TPU
  4. GhostRyder

    GhostRyder

    Joined:
    Apr 29, 2014
    Messages:
    954 (6.72/day)
    Thanks Received:
    336
    Location:
    Texas
    I think many of y'all are giving too little credit here... We have already seen what the current process is capable of with the GTX 750 Ti. While that card is nothing amazing performance-wise, what matters is the power consumption difference and the performance it delivers with fewer cores. Comparing the 640 cores in the 750 Ti against the 768 in the 650 Ti, while the 750 Ti pretty much blows the 650 Ti out of the water, shows that you can still improve on the current process. Why should we be so upset that we're not dropping down to a smaller process now, and focus on disappointment before anything is even shown to the public?

    Even using basic logic: since the 750 Ti, with fewer cores, was able to outperform the older-generation 650 Ti while using less power, we can assume that if the GTX 880 has the same number of cores, or even ~15% fewer, its performance would still be above its predecessor's. Of course that assumes core clocks remain roughly the same; depending on the design, it could end up shipping with higher clock and memory speeds (again, just an assumption). A rough back-of-envelope version of that reasoning is sketched below.
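    For illustration, a minimal sketch of that core-count argument in Python. The per-core uplift is inferred only from the 640 vs. 768 core counts above plus the assumption that the 750 Ti at least matches the 650 Ti; clocks are assumed equal, so treat the result as a rough bound, not a prediction.

        # Back-of-envelope version of the core-count argument above.
        # All figures are assumptions for illustration, not benchmarks or leaks.
        KEPLER_650TI_CORES = 768     # GTX 650 Ti (Kepler)
        MAXWELL_750TI_CORES = 640    # GTX 750 Ti (Maxwell, still 28nm)

        # If the 750 Ti at least matches the 650 Ti despite having fewer cores,
        # the implied per-core uplift is at least:
        per_core_uplift = KEPLER_650TI_CORES / MAXWELL_750TI_CORES   # = 1.2

        # Apply that same (assumed) uplift to a GTX 880 with ~15% fewer cores
        # than its predecessor, at roughly the same clocks:
        core_factor = 1.0 - 0.15
        print(f"Relative performance: {core_factor * per_core_uplift:.2f}x")  # ~1.02x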

    We are also basing so much of this on rumors, speculation, and possible release dates.
     
    Xzibit and rtwjunkie say thanks.
  5. mcraygsx New Member

    Joined:
    Jul 29, 2014
    Messages:
    15 (0.29/day)
    Thanks Received:
    2

    +1 OP

    Exactly what this guy said. They could have easily released the GK110, but instead they were selling the 680 at a high-end price. We should know by now what Nvidia does, based on the past.
     
  6. arbiter

    Joined:
    Jun 13, 2012
    Messages:
    181 (0.22/day)
    Thanks Received:
    26
    One thing you forget: AMD will be stuck on the same process if they release a new GPU soon as well. They would face the same issue, so it's not just a problem for Nvidia; it could be an even bigger problem for AMD.
     
  7. 64K

    64K

    Joined:
    Mar 13, 2014
    Messages:
    377 (1.99/day)
    Thanks Received:
    156
    I disagree. TSMC struggled with the 28nm process and Nvidia had to pay per wafer. This resulted in the gimped GTX 780. The 28nm process is much more refined now, so we should see plenty of un-gimped GM210 chips a few months after the GTX 880 actually becomes available for purchase.

    Edit: I'm not defending their price structure though. It's gone balls up.
     
  8. mcraygsx New Member

    Joined:
    Jul 29, 2014
    Messages:
    15 (0.29/day)
    Thanks Received:
    2
    I hope you are right, but consider what we have all seen over the past several years and how Nvidia has been treating us. It's too good to be true that Nvidia won't sell the 880 labeled as a premium product. Of course it will have some advantages over the GeForce 780, but time will tell.
     
  9. 64K

    64K

    Joined:
    Mar 13, 2014
    Messages:
    377 (1.99/day)
    Thanks Received:
    156
    Yes, and that's the downside.

    The GTX 880 should hammer the GTX 780, and I think it will. It may even roll right over the GTX 780 Ti in performance. Time will tell.

    If anyone needs to upgrade their GPU in the next couple of months and wants to go Nvidia, then the GTX 880, priced at around $425, will probably be a good deal. Otherwise, wait for the 20nm Maxwells.
     
  10. arbiter

    Joined:
    Jun 13, 2012
    Messages:
    181 (0.22/day)
    Thanks Received:
    26
    Likely $500 at the lowest, probably $600, being a new GPU. If it has the ~30% gain like rumored, it would be within that price range.
     
  11. Fluffmeister

    Fluffmeister

    Joined:
    Dec 22, 2011
    Messages:
    652 (0.65/day)
    Thanks Received:
    185
    This whole "nVidia are the bad guys" thing is just nonsense. Again, the GK104-based 680 was more than a match for the 7970; they had literally zero reason to release a consumer-oriented card based on the GK110 at that time (regardless of whether it was ready or not).

    I guess it would have been funny to see a GK110-powered 680 vs. the underclocked Tahiti 7970. Embarrassing... but funny.
     
  12. GhostRyder

    GhostRyder

    Joined:
    Apr 29, 2014
    Messages:
    954 (6.72/day)
    Thanks Received:
    336
    Location:
    Texas
    Wow dude, I cannot believe you really believe that. I mean, really, do you think Nvidia's strategy was to release a GPU that merely matched the competition instead of releasing something way more powerful? If Nvidia had had the GK110 ready, they would have easily released it, and at a fitting price point (probably close to the $1k mark, depending).

    This is the same strategy that has been used before, so I do not get why people are so shocked. Compare the Fermi architecture to Kepler in terms of releases and the chips used, and you will see the strategy remains the same. Each cycle follows a similar pattern for these companies; you could call it a tick-tock cycle: release the introductory chip of a new architecture, show off how well it performs, gather data, and then release the full-powered version of the architecture the next cycle.

    VLIW (the TeraScale series) from ATI also followed a similar pattern. This is not much different from the strategies we're all used to, and GCN can be viewed the same way.

    Also, anyone assuming that the GTX 880 is going to be weaker than the 780 Ti is, I feel, going to end up either disappointed or impressed (depending on your outlook). It would not make much sense to release a less powerful GPU as your next-gen GPU...
     
    Casecutter, Roel and 64K say thanks.
  13. 64K

    64K

    Joined:
    Mar 13, 2014
    Messages:
    377 (1.99/day)
    Thanks Received:
    156
    Well said GhostRyder.
     
    GhostRyder says thanks.
  14. Fluffmeister

    Fluffmeister

    Joined:
    Dec 22, 2011
    Messages:
    652 (0.65/day)
    Thanks Received:
    185
    Why would they need to put all their cards on the table right away? That makes absolutely zero sense.

    Fact is, when the GK110 was ready, it came in the form of the K20X for Oak Ridge: low yields, high returns, which made much more sense than appeasing forum warriors at TPU.
     
    rtwjunkie and HumanSmoke say thanks.
  15. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,381 (1.25/day)
    Thanks Received:
    449
    Nvidia did have GK110 up and running in the same time frame. For some reason I have yet to fathom, people seem to think that releasing the GPU as a $1000 card makes more sense than selling it in a $4500-5000 package to a client who was the damned launch customer (contract signed October 2011) and needed nineteen thousand of the things, under a contract that would severely penalize Nvidia if the boards weren't supplied on schedule.
    If Nvidia had intended the GK110 for desktop from the outset - which they could have managed as a paper/soft launch with basically no availability but plenty of PR (i.e. a green-team scenario mirroring the HD 7970's 22nd December 2011 "launch") - they in all likelihood could have had parts out in time. GK110 taped out in early January 2012 (even noted Nvidia-haters tend to agree on this point). Fabrication, testing/debug, die packaging, board assembly, and product shipping to distributors take 8-12 weeks for a consumer GPU - production GK110s are A1 silicon, so no revision was required - which means early-to-mid March 2012 as a possible launch date IF the GTX 680 hadn't proved sufficient... and the launch date for the GTX 680? March 22nd, 2012.
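    As a quick sanity check on that timeline (the exact tape-out day below is a placeholder, since only "early January 2012" is given):

        # Tape-out plus the quoted 8-12 week bring-up window vs. the GTX 680 launch.
        from datetime import date, timedelta

        tape_out = date(2012, 1, 9)          # assumed day for "early January 2012"
        gtx_680_launch = date(2012, 3, 22)   # GTX 680 launch date quoted above

        earliest = tape_out + timedelta(weeks=8)    # start of the 8-12 week window
        latest = tape_out + timedelta(weeks=12)     # end of the window

        print(f"Possible GK110 consumer launch window: {earliest} to {latest}")
        print(f"GTX 680 launch falls inside it: {earliest <= gtx_680_launch <= latest}")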
    Oak Ridge National Laboratory started receiving its first Tesla K20s in September 2012 (1,000 or so in the first tranche), which tallies with the more stringent runtime validation process required for professional boards in general and mission-critical HPC in particular.

    Unbelievable that so much FUD exists about this considering most of the facts are actually well documented by third parties.

    History tells us that the GTX 680 was sufficient. The competition (the 7970) was a known factor, so there was actually zero need to hastily put together a GK110 card. I doubt that a GK110 GTX card would have been any more than a PR stunt in any case, since Oak Ridge's contract superseded any consumer pissing contest.
    True enough. ORNL's Titan was the high-profile, large-order customer, but more than a few people forget that Nvidia was also contracted to supply the Swiss supercomputing institute's Todi system and the Blue Waters system for the National Center for Supercomputing Applications, so around 22,000 boards were required without taking replacements into consideration.
     
    Last edited: Aug 3, 2014
    rtwjunkie and Fluffmeister say thanks.
  16. Fluffmeister

    Fluffmeister

    Joined:
    Dec 22, 2011
    Messages:
    652 (0.65/day)
    Thanks Received:
    185
    ^ Exactly, and once those contracts were fulfilled and yields gradually improved, what did we see some 5-6 months later... *drum roll*... the $1000 GTX Titan, still without any fear of direct competition, and at a price as much about protecting Nvidia's professional product stack as anything else... why the fuck not?

    But no, it should have been $400 and called the 680. Wonders never cease. :p
     
    rtwjunkie says thanks.
  17. rtwjunkie

    rtwjunkie

    Joined:
    Jul 25, 2008
    Messages:
    1,243 (0.55/day)
    Thanks Received:
    416
    Location:
    Louisiana
    Well explained by @Fluffmeister and @HumanSmoke why the mid-level chip ended up as the premier Kepler card (and remained there so long)!!

    Still, since I bought the 780 before the price drop, I prefer to keep the top of the chip line in my main rig. For me it just makes sense to wait until GM210, whenever that is (GTX 980?). Gotta get my money's worth!!

    So anyway, I take back some of my "false advertising" statements about the 680 and the corollary to the 880, with neither top-of-the-line card having the top-of-the-line chip in its lineup. It all comes down to readiness as well as business commitments by Nvidia.
     
    Last edited: Aug 3, 2014
  18. theoneandonlymrk

    theoneandonlymrk

    Joined:
    Mar 10, 2010
    Messages:
    3,384 (2.05/day)
    Thanks Received:
    565
    Location:
    Manchester uk
    You two sound like Nvidia board members some days. It would be nice to get back on topic at some point.
    No new news on the hybrid board, the GTX 880, or anything else going on then, I guess.
     
    GhostRyder and Xzibit say thanks.
  19. Fluffmeister

    Fluffmeister

    Joined:
    Dec 22, 2011
    Messages:
    652 (0.65/day)
    Thanks Received:
    185
    Really? Talking sense is being on the board of nVidia?

    I guess you're right.
     
  20. SIGSEGV

    SIGSEGV

    Joined:
    Mar 31, 2012
    Messages:
    506 (0.56/day)
    Thanks Received:
    107
    According to various sources, AMD has already stated that they will introduce products on TSMC's 20nm process, including GPUs, by next year (2015).
     
  21. 64K

    64K

    Joined:
    Mar 13, 2014
    Messages:
    377 (1.99/day)
    Thanks Received:
    156
    If you need a GPU upgrade and you want 4 GB of VRAM, then go with the GTX 880. At this point I am 100% convinced it will smoke the GTX 780, but know what you're buying. It's not the Maxwell flagship; it's a mid-range GPU and a throwback to the 28nm process. It is by no means the Maxwell flagship, so consider the price and don't be scammed.
     
    rtwjunkie, Xzibit and GhostRyder say thanks.
  22. GhostRyder

    GhostRyder

    Joined:
    Apr 29, 2014
    Messages:
    954 (6.72/day)
    Thanks Received:
    336
    Location:
    Texas
    Makes more sense than releasing an equal product with less VRAM that performs almost exactly the same on average (except when you take higher resolutions into account)...
    5-6 months... Try almost a year later, dude...

    GTX 680 Released: March 22, 2012
    GTX Titan Released: February 19, 2013

    Yeah, they released Titan as a $1k card almost 11 months later, so obviously they had no problem releasing a $1k desktop-grade video card. If they had wanted to get that card out sooner they would have been happy to, and would have charged accordingly, but they had enough trouble even getting the GTX 680 out, which was out of stock and basically required camping at your computer night and day to get one.

    I am getting just as tired as you are of people dragging these threads out into off-topic fanboy arguments.

    But then what is the excuse going to be this time with the 880? Since everyone is convinced an unreleased card, with very little known about it, is going to be inferior to the current lineup...

    Exactly, I'm at a loss as to how certain people keep claiming that this chip sucks before we have even seen anything...
     
    Last edited: Aug 3, 2014
    theoneandonlymrk and Xzibit say thanks.
  23. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,121 (1.29/day)
    Thanks Received:
    252
    They pulled the PR stunt anyway with the intro of the TITAN brand.

    The 680 was good enough for them and they saw a $ benefit. The 580 was FP64 = 1/8, and since then all GeForce cards have gone to FP64 = 1/24, while AMD stuck to FP64 = 1/4 on Tahiti until Hawaii, where they lowered it to FP64 = 1/8.

    Tahiti had FP64 = 1/4, so if you don't take sides it was AMD's "Titan", the successor to the 580, released a year after the 580. Not to mention the prices:
    11/2010 - GTX 580 = $500
    1/2012 - HD 7970 = $550
    2/2013 - GTX Titan = $1000
    The whole "TITAN" argument applies to Tahiti within that same time frame, with the notable exception of CUDA, of course.

    Now both companies are further cutting FP64 for their gaming lines, whereas if Nvidia had stuck to its old ways, TITAN would have been the 580's successor, not the 680 or the 780 (a rough throughput comparison is sketched below).
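    To put those ratios in perspective, here is a minimal sketch using the usual peak-rate formula (FP32 peak = 2 x cores x clock, then scaled by the FP64 ratio). Core counts and clocks are quoted from memory and rounded, so treat the outputs as approximate:

        # Approximate peak throughput implied by the FP64 ratios discussed above.
        # Specs are from memory and rounded; illustrative, not authoritative.
        cards = {
            # name: (shader cores, shader clock in GHz, FP64 ratio)
            "GTX 580 (Fermi)":  (512,  1.544, 1 / 8),
            "GTX 680 (GK104)":  (1536, 1.006, 1 / 24),
            "HD 7970 (Tahiti)": (2048, 0.925, 1 / 4),
        }

        for name, (cores, clock_ghz, ratio) in cards.items():
            fp32 = 2 * cores * clock_ghz    # peak FP32 GFLOPS (one FMA per core per cycle)
            fp64 = fp32 * ratio             # peak FP64 GFLOPS
            print(f"{name:18s} FP32 ~{fp32:5.0f} GFLOPS, FP64 ~{fp64:4.0f} GFLOPS (1/{round(1 / ratio)})")

    Under those assumed specs the 680 has roughly double the FP32 throughput of the 580 but less peak FP64, which is the point being made about the ratio cuts.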

    I hope Maxwell goes back to the old ways but I highly doubt it.
     
    Last edited: Aug 3, 2014
  24. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,381 (1.25/day)
    Thanks Received:
    449
    ...nothing being talked about here....but since you're hanging out the bait...
    Sure did. Seems like a marketing winner.
    Spends a whole tortured introduction trying to get Titan into the topic....then screws it up.
    GeForce GTX Titan: FP64 1:3 rate (with boost disabled - which stands to reason, since overclocking and double precision aren't mutually beneficial from either an error or a power standpoint)
    GeForce GTX Titan Black: FP64 1:3 rate with boost disabled
    GeForce GTX Titan Z: FP64 1:3 rate with boost disabled
    Thanks for reminding me that AMD halved the double-precision ratio for the desktop high end in the current series - though I was already aware of the fact. How about not offering double precision at all on GPUs other than the top one for the Evergreen and Northern Islands series, after offering FP64 on the HD 4000 series RV770? Crazy shit, huh? Or limiting Pitcairn and Curacao to 1:16 FP64 to save die space and keep power demand in check? It's called tailoring the feature set to the segment.

    Horses for courses. FP64 is a die-space luxury largely unneeded in gaming GPUs.
    Nvidia figured out a while ago that the monolithic big die really isn't that economical when sold at consumer prices, which is why the line was bifurcated after the Fermi architecture - who would have thought that selling a 520mm² GPU for $290 (GTX 560 Ti 448) and $350 (GTX 570) wouldn't result in a financial windfall! AMD will likely do the same, since they will need a big die for pro/HSA apps (and Fiji sounds like a 500mm²+ part from all accounts), and keep the second-tier and smaller dies ruled by gaming considerations (just as Barts, Pitcairn, and Curacao are now).
    The old ways of reverting to a 1:8 FP64 rate with Fermi, or the 1:3 rate of the current GTX Titan range? :confused:
     
  25. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,121 (1.29/day)
    Thanks Received:
    252
    WOW.

    Even when I'm not arguing with you, you still come off as a jerk.

    I didn't include TITAN because it was the exception among their top-series cards, even though it has different "branding". I assumed you would know the difference. Sheesh. Didn't think crossing T's and dotting I's was needed for you to understand.

    The old way was not to change the FP64 rate within a chip in the gaming series. GK110 was their first to do that: TITAN and the 780 differ. They saw an opportunity to make $ off the many dies that didn't meet standards; it was a smart business move, but not so good for the consumer.



    P.S.
    I need to stay away from culinary school. Apparently it turns you into an even greater ass.
     
    Last edited: Aug 3, 2014
    GhostRyder says thanks.
