
NVIDIA GT300 "Fermi" Detailed

Discussion in 'News' started by btarunr, Sep 30, 2009.

  1. springs113

    Joined:
    May 24, 2007
    Messages:
    604 (0.23/day)
    Thanks Received:
    61
    You guys and your misquotes about the 5870 being 95% faster than the GTX 295... you have really got to be ignorant... relatively speaking, why would a company do that when they can milk us for our money while squeezing out a marginal percentage improvement biannually? Not trying to be disrespectful, but come on, use your head... we knew where the performance was going to be, and for a single card to cost less than a dual card and perform just as well, sometimes better, sometimes worse... that is just awesome for everyone...

    I remember when the GTX 285 was what, $549... the 5850 can beat it for just about half that price. Better yet, look at the 8800 Ultra's price point... these new cards thrash that card easily and cost way, way less. Another example: I remember when the Radeon 9800 XTX came out, I was looking to build a PC at that time, and I remember seeing $500+ on Newegg.
  2. kid41212003

    kid41212003

    Joined:
    Jul 2, 2008
    Messages:
    3,584 (1.59/day)
    Thanks Received:
    533
    Location:
    California
    I'm expecting the most high-end single GPU (GTX390) to be priced at $449.

    Two models lower will be the GTX380 at $359 (faster than the HD5870) and the GTX360 at $299 (= HD5870), which will push the current HD5870 to $295 and the HD5850 to $245.

    The GTS model will be as fast as or a bit faster than the GTX285, with DX11 support, and will be priced at $249, followed by a GT at $200.

    And the GPUx2 version, which uses two GTX380 GPUs and becomes the HD5870X2 killer, will likely be priced around $649.

    :toast:



    Based on baseless sources.
    skylamer says thanks.
  3. Zubasa

    Zubasa

    Joined:
    Oct 1, 2006
    Messages:
    3,980 (1.38/day)
    Thanks Received:
    457
    Location:
    Hong Kong
    You can only hope.
    nVidia never goes the price/performance route; they always push out the $600 monster and you either buy it or you don't.
    After all, if you have the more powerful product, why sell it cheaper?
    Last edited: Sep 30, 2009
  4. LaidLawJones Guest

    I agree with the waiting game. Having bleeding edge may be cool for bragging rights, but sometimes you end up with stuff like a sapphire 580 pure MB, still waiting for RMA, and a nice collection of 3870's.

    I am going to wait until summer for a new build. Prices will have settled, there will be a far larger assortment of cards, and we will see what games/programs are out and able to take advantage of hardware.

    I will be scheduling my lunch break for 13:00 Pacific.
  5. devguy

    devguy

    Joined:
    Feb 17, 2007
    Messages:
    1,239 (0.45/day)
    Thanks Received:
    171
    Location:
    SoCal
    Oh good. It is super exciting that I may soon have the opportunity to GPU hardware accelerate the thousands of Fortran programs I've been writing lately. I even heard that the new version of Photoshop will be written in Fortran!
  6. adrianx New Member

    Joined:
    Jan 9, 2008
    Messages:
    324 (0.13/day)
    Thanks Received:
    22
    Location:
    Bucharest ROMANIA
    Last edited: Sep 30, 2009
  7. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    19,838 (6.19/day)
    Thanks Received:
    5,943
    Seems like a monster. I can almost guarantee the highest end offering will be priced through the roof.

    However, there will be cut-down variants, just like in previous generations. These are the SKUs I expect to be competitive in both price and performance with ATi's parts.

    Judging by the original figures, I expect mainstream parts to look something like:

    352 or 320 Shaders
    320-Bit or 256-bit Memory Bus
    1.2GB or 1GB GDDR5
    Crunching for Team TPU More than 25k PPD
  8. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.48/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    Not so baseless IMO. :)

    http://forums.techpowerup.com/showpost.php?p=1573106&postcount=131

    Although the info about memory in that chart is in direct conflict with the one in the OP, I'm still very inclined to believe the rest. It hints at four models being made, and you are not too far off. I also encourage you to join that thread; we've discussed prices there too, with similar conclusions. :toast:

    @newtekie

    http://forums.techpowerup.com/showpost.php?p=1573733&postcount=143 - That's what I think about the possible versions based on the TechARP chart (the other link above).

    For nerds:
    Back when MIMD was announced, it was also said the design would be much more modular than GT200. That means the ability to disable units is much improved, which makes creating more models more feasible. 40nm yields are not the best in the world, and having four models with a decreasing number of clusters can greatly improve them to very high numbers.
    Last edited: Sep 30, 2009
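The binning argument above can be sketched with a toy salvage-yield model. All the numbers here are illustrative assumptions (a 16-cluster die and a 95% chance that any given cluster is defect-free), not real TSMC 40nm data:

```python
import math

# Toy model: a die has `clusters` independent units, each defect-free with
# probability p_cluster_good. Dies with a few bad clusters can still be sold
# as cut-down SKUs, which raises the fraction of sellable silicon.
def binomial_yields(clusters, p_cluster_good):
    """Probability of exactly d defective clusters, for d = 0..clusters."""
    return [math.comb(clusters, d)
            * (1 - p_cluster_good) ** d
            * p_cluster_good ** (clusters - d)
            for d in range(clusters + 1)]

ys = binomial_yields(16, 0.95)   # hypothetical: 16 clusters, 95% good each
full = ys[0]                     # perfect dies -> top SKU only
salvage = sum(ys[1:3])           # 1-2 bad clusters -> cut-down SKUs
print(f"full-die yield:          {full:.1%}")
print(f"yield with salvage SKUs: {full + salvage:.1%}")
```

With these made-up numbers, fewer than half the dies are perfect, but allowing one or two disabled clusters roughly doubles the sellable fraction, which is exactly why more SKUs help on an immature process.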
  9. ToTTenTranz

    ToTTenTranz New Member

    Joined:
    Sep 8, 2009
    Messages:
    865 (0.48/day)
    Thanks Received:
    167
    Location:
    Porto
    But not a HD5870X2 killer, which will be its competitor, price-wise.


    Furthermore, we still don't know how fast the HD5890 will be, which should somehow address the memory bandwidth bottleneck of the HD5870.
  10. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.48/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    We don't know. Rumors have said the X2 will launch at $550-600. The GTX380 will not launch at that price unless it's much, much faster, and it has two lower versions that are faster than or compete with the HD4870. Forget about the GTX2xx launch prices already; those were based on the pricing strategy of the past. GTX3xx production costs will be much lower than GTX2xx cards at launch, and significantly cheaper than producing a dual card. Not to mention that Nvidia will have a dual card too.

    How? What is it that I missed?
  11. wiak

    wiak

    Joined:
    Sep 5, 2004
    Messages:
    1,744 (0.48/day)
    Thanks Received:
    198
    Location:
    Norway
    :eek:
    I know, and you also have to consider that two 5870s in CrossFire are bottlenecked by a Core i7 965 @ 3.7GHz in some games
    http://www.guru3d.com/article/radeon-hd-5870-crossfirex-test-review/9

    and that most games except Crysis are crappy console ports :rolleyes:
    WarEagleAU says thanks.
  12. Animalpak

    Animalpak

    Joined:
    Feb 8, 2008
    Messages:
    2,081 (0.87/day)
    Thanks Received:
    550
    The specifications of this new GPU are very promising; I look forward to the announcement of the upcoming dual-GPU card from NVIDIA.

    Dual-GPU cards still have a long life in the market. If ATI has announced its X2 and we have seen the pictures, it means Nvidia will do the same.

    They have always been very powerful and less expensive than two boards mounted in two physical PCI-E slots.
  13. leonard_222003 New Member

    Joined:
    Jan 29, 2006
    Messages:
    241 (0.08/day)
    Thanks Received:
    25
    Although I hate Nvidia for what it does to games, I have to say I'm impressed.
    Still, until I see it I won't take it as "the beast"; we have to wait and see what it can do, not only in games but in other stuff too.
    Another thing: all that C++, Fortran... is this what DX11 should be and what the ATI 5870 can do too, or is it exclusive to the GT300 chip?
    I'm asking because it's a big thing; if programmers could easily use a 3-billion-transistor GPU, then the CPU would be insignificant :) in some tasks. Intel should start to feel threatened; AMD too, but they are too small to be bothered by this, and they have a GPU too :) .
  14. VanguardGX

    VanguardGX New Member

    Joined:
    Jun 30, 2008
    Messages:
    27 (0.01/day)
    Thanks Received:
    5
    Location:
    Jamaica
    My words exactly lol!!! This thing is gonna be a number-crunching beast!! Hope it can still play games :)
  15. Animalpak

    Animalpak

    Joined:
    Feb 8, 2008
    Messages:
    2,081 (0.87/day)
    Thanks Received:
    550


    This is not true. Games are built from their first release for every platform, and for the PC version you have more opportunities for further improvements in graphics and stability.

    PC graphics are better. Take for example Assassin's Creed, which on the PC is much better than on consoles, plus Batman: Arkham Asylum, Wolfenstein, Mass Effect, the Call of Duty series, and many, many others.

    The PC is the primary platform for gaming: with the PC you can make games, with the consoles you can only play them.
  16. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,982 (3.21/day)
    Thanks Received:
    1,752
    Location:
    PA, USA
    Call me an idiot and chop off my genitals so I can't reproduce any more retards. I've got my wallet ready and waiting :laugh:
    1c3d0g says thanks.
  17. mechtech

    mechtech

    Joined:
    Dec 26, 2006
    Messages:
    251 (0.09/day)
    Thanks Received:
    18
    Seems more like an F@H card or a GPGPU crunching card. I guess it will push good fps too, but that's kind of useless anyway, since most LCD monitors can only display 60fps, with the exception of the Sammy 2233rz and ViewSonic FuHZion.

    I think the next upgrade for me will be the sammy 2233rz, then a 5850 after the price comes down :)

    Either way though, beastly specs!!
  18. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.48/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    AFAIK that means you can just #include C for CUDA and work with C++ like you would with any other library, and the same for Fortran. That's very good for some programmers indeed, but it only works on Nvidia hardware.

    DX11 and OpenCL are used a little differently, but are no less useful, and on those AMD does it too.

    Indeed, that's already happening. The GPU will never replace the CPU; there will always be a CPU in the PC, but it will go from being powerful enough to run applications fast to being fast enough to feed the GPU that runs the applications fast. This means the end for big overpriced CPUs. Read this:

    http://wallstreetandtech.com/it-inf...ticleID=220200055&cid=nl_wallstreettech_daily

    Instead of using a CPU farm with 8000 processors, they used only 48 servers with 2 Tesla GPUs each.

    And that Tesla is the old Tesla using the GT200 GPU. So that's saying a lot, actually, since GT300 does double precision 10 times faster. While GT200 did 1 TFlop single precision and 100 GFlops double precision, GT300 will do ~2.5 TFlops single precision and 1.25 TFlops double precision. So yeah, if your application is parallel enough, you can now say that Nvidia opened a can of whoop-ass on Intel this time.

    All those games are ports. PC graphics are better because they use better textures, higher resolutions, and proper AA and AF, but the game was coded for the consoles and then ported to PC.
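As a quick sanity check on the speedup claims above, here is the arithmetic on the quoted throughput figures (these are the rumored numbers from this thread, not official specs):

```python
# Throughput figures as quoted in the post (rumored, not official).
gt200_sp, gt200_dp = 1.0e12, 100e9      # GT200: ~1 TFLOP SP, ~100 GFLOPS DP
gt300_sp, gt300_dp = 2.5e12, 1.25e12    # GT300: ~2.5 TFLOPS SP, ~1.25 TFLOPS DP

print(gt300_sp / gt200_sp)  # single-precision speedup: 2.5
print(gt300_dp / gt200_dp)  # double-precision speedup: 12.5
```

So the quoted numbers actually imply a ~12.5x double-precision jump, slightly more than the "10 times faster" stated above.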
  19. WarEagleAU

    WarEagleAU Bird of Prey

    Joined:
    Jul 9, 2006
    Messages:
    10,796 (3.63/day)
    Thanks Received:
    545
    Location:
    Gurley, AL
    No one is mentioning the L1 and L2 caches on this. It's basically a GPU and CPU merge, it seems. I have to say, being an ATI/AMD fanboy, I'm impressed with a rumored spec sheet (if not concrete). I won't be buying it, but it seems like a hell of a dream card.
    AsRock says thanks.
  20. Animalpak

    Animalpak

    Joined:
    Feb 8, 2008
    Messages:
    2,081 (0.87/day)
    Thanks Received:
    550
    I've always noticed that it's a benefit to have a graphics card capable of the highest possible FPS (relative to your monitor's resolution).

    Because in games, especially those with very large rooms and environments, the FPS tends to drop because of the greater pixel workload.

    So a graphics card that reaches 200 fps will drop to 100 fps and you will not notice any slowdown, even with explosions and fast movement, while a card that starts at 100 fps (dropping to 40 in some cases) will show a drastic slowdown.

    This often happens in Crysis, but not in games like Modern Warfare, which has been optimized to run at a stable 60 fps.
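The headroom argument above is really about frame times. A quick sketch, using the fps figures from the post and one 60 Hz refresh interval (~16.7 ms) as the budget:

```python
# Convert frame rates to per-frame render times to see why headroom matters.
def frame_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

BUDGET_MS = 1000.0 / 60  # one 60 Hz refresh interval, ~16.7 ms

# Card with headroom: 200 -> 100 fps is 5 ms -> 10 ms per frame,
# still inside the 60 Hz budget, so no visible slowdown.
print(frame_ms(200), frame_ms(100))  # 5.0 10.0

# Slower card: 100 -> 40 fps is 10 ms -> 25 ms per frame,
# blowing past the budget, so the drop is immediately noticeable.
print(frame_ms(100), frame_ms(40))   # 10.0 25.0
```

The drop feels drastic only once the frame time crosses the display's refresh interval, which is exactly the 100-to-40 fps case described above.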
  21. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,982 (3.21/day)
    Thanks Received:
    1,752
    Location:
    PA, USA
    This is because they remade the shader architecture as MIMD, which would not do well sharing memory bandwidth with everything else. MIMD is optimized to use a pool of cache instead; otherwise there would be some serious latency in shader processing.
    AsRock says thanks.
  22. trt740

    trt740

    Joined:
    May 12, 2006
    Messages:
    10,935 (3.61/day)
    Thanks Received:
    1,113
    This will be a monster.
  23. ZoneDymo

    ZoneDymo

    Joined:
    Feb 11, 2009
    Messages:
    374 (0.18/day)
    Thanks Received:
    58

    Rumours?
    Is it a rumour that Nvidia bought Ageia and left all Ageia card owners for dead?
    Is it a rumour that Nvidia made sure an ATI + Nvidia (for PhysX) configuration is no longer doable?
    Is it a rumour that AA does not work on ATI cards in Batman: AA even though they are able to do it?

    The only rumour is that Nvidia made Ubisoft remove DX10.1 support from AC because AC was running too well on it, with ATI cards of course (Nvidia does not support DX10.1).
  24. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,982 (3.21/day)
    Thanks Received:
    1,752
    Location:
    PA, USA
    Still not monopolizing the market, so your whining won't do anything. This is not the thread for GPU war conspiracy theories.
  25. happita

    happita

    Joined:
    Aug 7, 2007
    Messages:
    2,356 (0.91/day)
    Thanks Received:
    395
    And I was wondering why I was seeing L1 and L2 caches on an upcoming video card. I thought I was going crazy hahaha.

    This will be VERY interesting. If the price is right, I may skip the 5k series and go GT300 :rockout:
