
ATI Believes GeForce GTX 200 Will be NVIDIA's Last Monolithic GPU.

Discussion in 'News' started by Polaris573, Jun 17, 2008.

  1. Polaris573

    Polaris573 Senior Moderator

    Joined:
    Feb 26, 2005
    Messages:
    4,281 (1.21/day)
    Thanks Received:
    718
    Location:
    Little Rock, USA
    The head of ATI Technologies claims that the recently introduced NVIDIA GeForce GTX 200 GPU will be the last monolithic “megachip”, because such chips are simply too expensive to manufacture. The statement was made after NVIDIA executives vowed to keep producing large single-chip GPUs. The GT200 GPU measures about 600 mm², which means only about 97 dies can fit on a 300 mm wafer costing thousands of dollars. Earlier this year NVIDIA’s chief scientist said that AMD is unable to develop a large monolithic graphics processor due to a lack of resources. Mr. Bergman countered that smaller chips are also easier to adapt for mobile computers.

    Source: X-bit Labs
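    The article's wafer arithmetic can be sanity-checked with a standard dies-per-wafer approximation. The 600 mm² die area and 300 mm wafer diameter come from the article; the edge-loss formula is a common industry rule of thumb, not something the article states:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order estimate: wafer area divided by die area, minus an
    edge-loss correction proportional to the wafer circumference."""
    r = wafer_diameter_mm / 2
    gross = math.pi * r * r / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

print(dies_per_wafer(300, 600))  # 90 -- same ballpark as the ~97 the article quotes
```

    The gross count (no edge loss) would be about 117; rectangular dies near the round wafer's edge are wasted, which is why the usable count lands near the figure quoted.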
     
    Last edited by a moderator: Jun 18, 2008
  2. panchoman

    panchoman Sold my stars!

    Joined:
    Jul 16, 2007
    Messages:
    9,595 (3.61/day)
    Thanks Received:
    1,200
    the war for who can build the biggest monolithic gpu? and then you just x2 the monolithic? lol....

    i bet that both companies will have trouble producing big monolithic gpus.. but nvidia more, because the R7 is not near the size of the G2
     
  3. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.65/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha
    I kinda agree, but only on the point that nVidia has been sandbagging their GPU tech for a while now, and I think they're at the furthest they can go with the current architecture.

    But, if it comes down to a resources debate - nVidia can most easily afford titanic productions
     
  4. panchoman

    panchoman Sold my stars!

    Joined:
    Jul 16, 2007
    Messages:
    9,595 (3.61/day)
    Thanks Received:
    1,200
    nvidia has many workarounds, like the PCI freq. trick that they used, and their architecture has been basically the same since like the GeForce 4... and on top of that, they get hurt here because, in order to keep up with AMD's R7 core, they basically slapped 2 G92 cores into a new core and released it. it's like Intel putting two dual-core dies in one package to make a quad core in order to keep up with AMD's Phenom "true" quad core, you know?
     
  5. kenkickr

    kenkickr

    Joined:
    Dec 5, 2007
    Messages:
    4,827 (1.92/day)
    Thanks Received:
    1,452
    I'm all for Nvidia's monolithic production!! I'll just go out and buy a couple of A/C units and fans for my computer room during the summer, and with one of their cards in the house I'll never have to turn the heat on in the fall, winter, and early spring, LOL :laugh:
     
    WarEagleAU says thanks.
    Crunching for Team TPU
  6. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.52/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    ATi is only saying that because they already know their X2s are gonna dust the G200s. Combine that with the cost to produce them & you have a no-brainer. It's like comparing a Viper & a Mack truck. They both have the same HP, but which one is faster :cool:
     
  7. DOM

    DOM

    Joined:
    May 30, 2006
    Messages:
    7,552 (2.46/day)
    Thanks Received:
    828
    Location:
    TX, USA
    :confused: lol, amd still got their ass handed to them. it's a quad core either way, it has 4 on one cpu; "true" doesn't mean anything

    but I want to see what amd has to offer in the gpu department :D
     
    CrAsHnBuRnXp says thanks.
  8. lemonadesoda

    lemonadesoda

    Joined:
    Aug 30, 2006
    Messages:
    6,260 (2.10/day)
    Thanks Received:
    967
    If nVidia can do a fab shrink to reduce die size and to reduce power, they have a clear winner.

    THEREFORE, AMD are creating this "nVidia is a dinosaur" hype because, truth be told, AMD cannot compete with nVidia unless they go X2. And X2? Oh, that's the same total chip size as the GTX 200 (+/- 15%). But with a fab shrink (to the same fab scale as AMD), nVidia would be smaller. Really? Can that really be true? Smaller with the same performance = nVidia's architecture must be better.

    So long as nVidia can manufacture with high yield, they are A-OK.
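    The fab-shrink claim is easy to put numbers on: in an ideal optical shrink, die area scales with the square of the linear feature-size ratio. A quick sketch using the article's ~600 mm² figure and the 65 nm → 55 nm shrink discussed later in the thread (ideal scaling only; real shrinks rarely achieve the full factor):

```python
def shrunk_area(area_mm2: float, old_node_nm: float, new_node_nm: float) -> float:
    """Ideal optical shrink: linear dimensions scale by new/old,
    so area scales by the square of that ratio."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

# A ~600 mm^2 die on 65 nm, shrunk to 55 nm:
print(round(shrunk_area(600, 65, 55)))  # ~430 mm^2 under ideal scaling
```

    Even the ideal case only recovers about 28% of the area, so a shrunk GT200 would still be a very large chip.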
     
  9. DaJMasta

    Joined:
    Nov 6, 2005
    Messages:
    479 (0.15/day)
    Thanks Received:
    38
    Location:
    Silver Spring, MD
    I agree that a GPU of this size in mm² will seldom be seen again, because it costs so much to make. But transistor counts will continue to rise as the manufacturing process gets smaller.
     
  10. PVTCaboose1337

    PVTCaboose1337 Graphical Hacker

    Joined:
    Feb 1, 2006
    Messages:
    9,512 (2.98/day)
    Thanks Received:
    1,143
    Location:
    San Antonio, Texas
    I think AMD is right that NVIDIA is not being progressive, but NVIDIA is getting the most out of their GPU technology... and it seems to be working.
     
  11. dalekdukesboy

    dalekdukesboy New Member

    Joined:
    Mar 7, 2007
    Messages:
    278 (0.10/day)
    Thanks Received:
    27
    I have to reply to this...

    Well, that may be all fine and good and you may have a valid point... but bottom line, what performs better? The nvidia G92 or ATI's R7? The Intel Core 2 duo/quad, or the Phenom? I understand that from a purely theoretical/architectural standpoint ATI/AMD could be more advanced, but no one can objectively tell me the Phenom or ATI's 3870/3850 can even keep up with, never mind beat, Nvidia's G92 or Intel's current CPU lineup.
     
  12. Rurouni Strife New Member

    Joined:
    Aug 19, 2006
    Messages:
    124 (0.04/day)
    Thanks Received:
    12
    My thoughts:
    GPUs will eventually end up kinda like dual/quad core CPUs: you'll have 2 on one die. When? Who knows, but it seems that AMD is kinda working in that direction. However, people complained when the 7950GX2 came out because "it took 2 cards to beat ATI's 1 (1950XTX)". They did it again, but to a lesser degree, with the 3870X2, and it'll become more accepted as time goes on, especially since AMD has said "no more mega GPUs". Part of that is they don't wanna f up with another 2900 and they don't quite have the cash, but they are also thinking $$$. Sell more high-performing midrange parts; that's where all the money is made. And we all know AMD needs cash.
     
  13. mullered07

    mullered07 New Member

    Joined:
    Jan 28, 2007
    Messages:
    2,648 (0.94/day)
    Thanks Received:
    204
    Location:
    UK
    not exactly "to keep up with" Phenom, since the Q series was released like a year before and still pwns Phenom. who actually gives a shit if it's "true" quad or not? it does the job, and better than amd, no?

    i don't understand what you mean by workaround. nvidia has handed amd their ass for the last 2 gens; if they're not even trying and just making the most of old technology, then god help amd if nvidia comes up with a new architecture. ati died the day they were bought by amd :shadedshu
     
  14. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.65/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha
    I kinda agree here as well - TBH, I think multi-GPU setups will be the future over the monolith designs... two or more efficient GPUs can work just as effectively and efficiently, if not better than, one megaPU. With AMD behind ATI at this point, I definitely see that the move towards this implementation is already underway.

    I'm sure that if multi-core GPUs do come marching out of ATI, we'll be seeing a lot of kicking and screaming from the green camp that "it's still 2 GPUs to our 1!!" Which, IMO, I don't believe to be the case. If a chip marches out that has 2 cores on one die, it's still 1 GPU. We don't go around saying "my Q6600 is 4 CPUs, man!"

    Sure, a lot of this progress on ATI/AMD's part has got to be dictated by cost and resources; but I think this is one area where the red camp will be pushing new technology that nVidia will sooner or later have to accept. nVidia can go and counter with a whole new megaPU pushing uber-1337 processing capabilities, and ATI could just say "alright, we'll add 2 more cores to our current design and match you again." nVidia could go back to the drawing board and design yet another 1337 GPU, and ATI could again counter with "alright, we'll add another 3 cores to our current design and take the lead."

    IMHO, the smaller package will be way more cost efficient for both manufacturer and consumer years and years down the road.
     
  15. WarEagleAU

    WarEagleAU Bird of Prey

    Joined:
    Jul 9, 2006
    Messages:
    10,797 (3.56/day)
    Thanks Received:
    546
    Location:
    Gurley, AL
    I have to kind of agree. But they will continue doing it unless ATI does something to counter it. I don't think sales of the GT200 line will be as high as NV hopes. As prices come down, though, they will... but until then....
     
  16. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.65/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha
    I agree as well - but I think we're on the verge of seeing the first dual-core GPU. Initial rumors of the R700 hinted at the possibility, but that seems to have turned out a negative (although we still have yet to see concrete specs on the 4870X2). With the advent of Fusion, though, I think they're further paving the way. R800 could potentially deliver the first dual-core GPU, whenever the HD5000 series is released (probably next year), and if so, in the series after that we could potentially see every card in the lineup (except for the low-end cards) sporting a dual-core GPU.

    TBH, I don't foresee nVidia having the ability to counter that just yet.

    This is all speculation, though, and it's all ways off in the future anyhow. We'll just have to see.
     
    WarEagleAU says thanks.
  17. yogurt_21

    yogurt_21

    Joined:
    Feb 18, 2006
    Messages:
    4,420 (1.39/day)
    Thanks Received:
    575
    Location:
    AZ
    where in the article do you see ATI saying nvidia is a dinosaur? they are merely stating that, based on the performance vs cost to produce of the GTX 280, it will likely be the last of its kind - considering the 9800GX2 was cheaper to produce and offers similar if not better performance.

    it's not like nvidia can't simply go dual or even quad, seeing as they did buy up 3dfx. it would make more sense, as in the end the uber-performance seekers are going to SLI those monolithic GPUs anyway. so why not make a cheaper variant that can be doubled up: those seeking uber performance can buy the X2, while those seeking better price/performance can be accommodated as well. the GeForce 9 series did this quite well.

    and I seriously don't get all the comments about the X2s. I mean, when the Athlon 64 X2s came out, people didn't say "oh, for AMD to be able to beat the Pentium 4 they had to go dual". dual was a means of providing more processing power without increasing clock speed or changing architecture. just because a GPU or CPU has more than one core doesn't mean it's an inferior design; it's just a different way of meeting the same performance demand.

    if anything, the argument against duals should be the diminishing return from the second core, as it is in the CPU market. but if ATI can make a dual that beats nvidia's single for the same or cheaper cost, that's good business, not inferior design.
     
  18. evil bill New Member

    Joined:
    Jan 20, 2006
    Messages:
    370 (0.12/day)
    Thanks Received:
    14
    Location:
    Scotland
    I once read the Nvidia vs ATI "battle" compared to a muscle car like a Viper or Mustang up against a Ferrari. Nvidia's stuff is modern but not overly sophisticated, with its roots in older technologies, whereas ATI/AMD tends to be pretty high-tech and cutting-edge (e.g. the ring-bus memory controller in the HD2900). You therefore get the fans of either camp decrying how the other arrives at its performance level, regardless of how well it performs.

    ATI's problem is that as soon as its technological "higher ground" fails to best the competition, it puts itself under serious pressure.

    Still, hopefully the internal distractions of the ATI/AMD merger are in the past and they can concentrate on doing their stuff and keep the market moving. I agree that Nvidia aren't being pushed hard enough by them and are probably sandbagging tech. Necessity is the mother of invention, and without a strong competitor they will be tempted to make cost savings by stretching old tech for longer.
     
  19. pentastar111

    Joined:
    Aug 18, 2006
    Messages:
    994 (0.33/day)
    Thanks Received:
    31
    Location:
    Los Angeles...U.S.A
    Even if nVidia's cards are a little faster..I'll probably still go ahead as planned with my next build being an all AMD rig...$700 for a vid card :eek: is just tooooooooo much money in my opinion.
     
    captainskyhawk says thanks.
  20. wolf

    wolf Performance Enthusiast

    Joined:
    May 7, 2007
    Messages:
    5,543 (2.03/day)
    Thanks Received:
    842
    this titanic GPU may not fare that well now, but it falls right into the category of future-proofing. like the G80 GTX/Ultra, it will stand the test of time, especially once the 55 nm GT200b comes out with better yields/higher clocks.
     
  21. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.66/day)
    Thanks Received:
    184
    I completely disagree on the single-die multi-core GPU thing. The whole idea of using multiple GPUs is to reduce die size. Doing a dual-core GPU on a single die is exactly the same as making a double-sized chip, but even worse IMO. Take into account that GPUs are already multi-processor devices, in which the cores are tied to a crossbar bus for communication. Look at the GT200 diagram:

    http://techreport.com/articles.x/14934

    In the image the connections are missing, but it suffices to say they are all connected to the "same bus". A dual-core GPU would be exactly the same, because GPUs are already a bunch of parallel processors - except with two separate buses, so it would need an external one, and that would only add latency. What's the point of doing that?

    Yields are not going to be higher, as in both cases you have the same number of processors and the same silicon that has to go (and work) together. In a single-"core" GPU, if one unit fails you can just disable it and sell the chip as a lower model (8800 GT, G80 GTS, HD2900GT, GTX 260...), but in a dual-"core" GPU the whole core would need to be disabled, or you would (most probably) need to disable another unit in the other "core" to keep symmetry. Either way you lose more than with the single-"core" approach, and you gain nothing because the chip is the same size.

    In the case of CPUs, multi-core does make sense, because you can't cut down/disable parts of them except the cache: if one unit is broken you have to throw away the whole core, and if a core is "defective" (it's slower, only half the cache works...) you just cut it off and sell the cores separately. With CPUs it's a matter of "does it work, and if so at what speed?"; with GPUs it's "how many units work?".
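    The harvesting argument can be illustrated with a toy binomial yield model: a chip with N independent units sells as a full part only if every unit is defect-free, but sells as a cut-down part if at most one unit is bad. The 10-unit count and 97% per-unit yield below are made-up illustrative numbers, not real GT200 figures:

```python
import math

def binom_pmf(n: int, k: int, p: float) -> float:
    """Probability of exactly k successes in n independent trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def sellable(n_units: int, p_good: float, max_bad: int) -> float:
    """P(chip is usable) when up to `max_bad` defective units can be fused off."""
    return sum(binom_pmf(n_units, n_units - b, p_good) for b in range(max_bad + 1))

# Hypothetical: 10 shader clusters, each 97% likely to be defect-free.
p_full      = sellable(10, 0.97, 0)  # every cluster must work
p_harvested = sellable(10, 0.97, 1)  # one cluster may be disabled
print(round(p_full, 3), round(p_harvested, 3))  # 0.737 0.965
```

    Under these assumed numbers, allowing one fused-off unit lifts the usable fraction from about 74% to about 97%, which is the economics behind selling cut-down parts like the GTX 260.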
     
    Last edited: Jun 18, 2008
    Bundy says thanks.
  22. Rurouni Strife New Member

    Joined:
    Aug 19, 2006
    Messages:
    124 (0.04/day)
    Thanks Received:
    12
    Can't disagree with you, DarkMatter, you make perfect sense - hadn't thought about that. Perhaps as die sizes get smaller, the way GPUs talk to each other can be improved via a type of HT link or whatever. Then you get shared memory, like what is rumored for R700 (don't know if that's true).

    as for evil bill - the ring bus actually came in on the X1K series of cards, and was just improved for R/RV600
     
  23. WarEagleAU

    WarEagleAU Bird of Prey

    Joined:
    Jul 9, 2006
    Messages:
    10,797 (3.56/day)
    Thanks Received:
    546
    Location:
    Gurley, AL
    True Imperial.

    @yogurt. The logical next step for ATI, and eventually NV, would be dual GPU cores. In a sense it would be like the X2s, but a bit different. Whereas AMD/ATI may not want to go with an uber-huge single core like Nvidia, they may break in with the dual-core GPU. Kind of awesome, to say the least.
     
    pentastar111 says thanks.
  24. hat

    hat Maximum Overclocker

    Joined:
    Nov 20, 2006
    Messages:
    16,937 (5.85/day)
    Thanks Received:
    2,067
    Location:
    Ohio
    You can only make transistors so small. Their current philosophy seems to be "moar transistors, who cares about moar bigger gpus?"

    These ridiculously large gpus are going to put out a ridiculous amount of heat, and make vga coolers ridiculously expensive due to the ridiculous size of the heatsink base needed to cool the ridiculously large gpu.
     
    Crunching for Team TPU
  25. DanishDevil

    DanishDevil

    Joined:
    Oct 6, 2005
    Messages:
    10,203 (3.08/day)
    Thanks Received:
    2,090
    Location:
    Newport Beach, CA
    That's one thing I love about die shrinks. My EK full-cover block cools both of my 3870X2's GPUs to lower temperatures than my E8500's cores at stock. I bet the GTX 280 puts out quite a lot of heat, though, with so much power in a single, larger chip.
     
