
NVIDIA GeForce G210 and GeForce GT 220 Now Official

Discussion in 'News' started by malware, Jul 9, 2009.

  1. malware New Member

    Joined:
    Nov 7, 2004
    Messages:
    5,476 (1.50/day)
    Thanks Received:
    956
    Location:
    Bulgaria
    Earlier this week we informed you of the existence of two upcoming 40 nm NVIDIA parts, the GeForce GT 220 and GeForce G210. We also gave a hypothetical release date of "early Q4". Well, we were wrong in a good way: the cards are already official and listed on NVIDIA's website.
    Both cards support DirectX 10.1, OpenGL 3.0 and CUDA. The G210 has analog VGA, DisplayPort and DVI outputs, while the GT 220 has VGA, HDMI and DVI. The GeForce G210 has 16 processor cores and a 589 MHz core clock, paired with 512 MB of DDR2 memory on a 64-bit interface at 500 MHz; its shaders run at 1402 MHz. The NVIDIA GeForce GT 220 has 48 processor cores and a 615 MHz core clock, paired with 1 GB of GDDR3 memory at 790 MHz on a 128-bit interface, and a slightly slower shader clock of 1335 MHz. Neither card is expected to be available directly to consumers; both are marked as OEM products meant as entry-level options in pre-built PCs.

    NVIDIA GeForce G210 (OEM Product) Specs
    NVIDIA GeForce GT 220 (OEM Product) Specs
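
    As a quick sanity check on the quoted specs, peak memory bandwidth follows from bus width and transfer rate. A minimal sketch, assuming the listed memory clocks are the pre-double-data-rate figures (both DDR2 and GDDR3 transfer twice per clock):

    ```python
    def bandwidth_gbs(bus_bits, mem_mhz, data_rate=2):
        """Peak memory bandwidth in GB/s: bus width in bytes times the
        effective transfer rate (DDR2/GDDR3 move data twice per clock)."""
        return bus_bits / 8 * mem_mhz * data_rate * 1e6 / 1e9

    g210 = bandwidth_gbs(64, 500)    # G210: 64-bit DDR2 @ 500 MHz -> 8.0 GB/s
    gt220 = bandwidth_gbs(128, 790)  # GT 220: 128-bit GDDR3 @ 790 MHz -> ~25.3 GB/s
    ```

    The roughly 3x bandwidth gap is the clearest on-paper separation between the two cards.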


    Source: NVIDIA
     
    Last edited by a moderator: Jul 9, 2009
  2. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,063 (6.14/day)
    Thanks Received:
    6,122
    Doesn't seem like the GT 220 would be that bad an entry-level card. The G210 would probably make a decent PhysX card...
     
    Crunching for Team TPU 50 Million points folded for TPU
  3. Semi-Lobster

    Semi-Lobster New Member

    Joined:
    Jul 9, 2009
    Messages:
    353 (0.18/day)
    Thanks Received:
    29
    They look like really interesting cards; it's a shame we won't be able to do a proper review of them for a while since they're OEM-only cards.
     
  4. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,398 (11.53/day)
    Thanks Received:
    9,697
    it's funny that the really low-performance "should be onboard VGA" cards have DX10.1, and mainstream NVIDIA cards don't.


    at least they have HDMI and DVI even when using the low-profile bracket, that's good.
     
  5. wolf

    wolf Performance Enthusiast

    Joined:
    May 7, 2007
    Messages:
    5,543 (2.03/day)
    Thanks Received:
    842
    Could be an interesting card to use for PhysX/folding and extra monitors (HDMI :rockout:), and from what I've seen it could be chopped down to fit into an x4 PCI-E slot too.

    EDIT: I've been thinking of buying a single-slot card to do just that for my Asus P6T Deluxe. Packing 48 shaders, not much heat and no need for extra power, it makes a good case for itself.
     
  6. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,747 (11.14/day)
    Thanks Received:
    13,676
    Location:
    Hyderabad, India
    x1 too.
     
  7. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,398 (11.53/day)
    Thanks Received:
    9,697
    i'm waiting for NV to release a 'physics' card: GF 8200 GPU (or better, whatever works), no monitor outputs, 128 MB of RAM, PCI-E x1, small, purely for PhysX (or F@H, lol)
     
  8. wolf

    wolf Performance Enthusiast

    Joined:
    May 7, 2007
    Messages:
    5,543 (2.03/day)
    Thanks Received:
    842
    How many PPD would 16 SPs crunch? That would be the really lol part :p I'd say the best starting point for such a card would be 32 SPs; those alone can make a decent contribution and indeed game (8600 GT).

    On paper the GT 220 seems better than an 8600 GT, so it will even game well at low enough resolution/settings. We've seen what NVIDIA's Ion can do in terms of low... LOW-end gaming; this card paired with any modern desktop CPU is a down-to-the-wire budget gamer's delight, imo.
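
    The on-paper comparison can be put in rough numbers. A sketch, assuming the 3 FLOPs per SP per clock (MAD + MUL) figure usually quoted for this GPU generation, and a reference 8600 GT at 32 SPs and a 1190 MHz shader clock (both assumptions, not from the article):

    ```python
    def shader_gflops(sps, shader_mhz, flops_per_clock=3):
        # Theoretical shader throughput: SPs x shader clock x FLOPs issued
        # per SP per clock (3 assumes the MAD + MUL dual-issue of this era).
        return sps * shader_mhz * flops_per_clock / 1000

    gt220 = shader_gflops(48, 1335)    # ~192 GFLOPS (specs from the article)
    gt8600 = shader_gflops(32, 1190)   # ~114 GFLOPS (reference 8600 GT, assumed)
    ```

    By that crude measure the GT 220 has roughly two-thirds more shader throughput than the 8600 GT, though real-world results depend heavily on memory bandwidth and drivers.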
     
  9. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,063 (6.14/day)
    Thanks Received:
    6,122
    Not really surprising. When the mainstream cores were developed, DX10.1 was useless, and even today it is arguably pointless. The difference is pretty unnoticeable, especially to the average consumer, and I couldn't even tell you which games support DX10.1. Essentially, DX10.1 was a marketing gimmick.
     
  10. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,398 (11.53/day)
    Thanks Received:
    9,697
    wait till DX11 hits and the backward compatibility trickles down; i'll be enjoying my free AA :p
     
  11. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,063 (6.14/day)
    Thanks Received:
    6,122
    What backwards compatibility? The only backwards compatibility DX11 will have is the same backwards compatibility DX10 had...
     
  12. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,398 (11.53/day)
    Thanks Received:
    9,697
    DX11 games will work on DX10 and DX10.1 cards, with features disabled. So I'm getting the DX10.1 features enabled, and NVIDIA users won't.

    To clarify: one exe runs DX11, 10.1 and 10.0; the game merely disables any features that your card doesn't support. MS learned from their mistakes, and they don't want another DX10 fiasco.
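
    The single-exe fallback described here amounts to a highest-first probe over feature levels, similar in spirit to Direct3D 11's feature levels. A minimal illustrative sketch (the names and structure are hypothetical Python, not the real D3D API):

    ```python
    # One executable picks a rendering path by trying feature levels from
    # highest to lowest and taking the first one the installed GPU reports.
    # Level names are illustrative, not real Direct3D identifiers.
    FEATURE_LEVELS = ["11_0", "10_1", "10_0"]

    def pick_render_path(supported):
        for level in FEATURE_LEVELS:
            if level in supported:
                return level  # features above this level get disabled
        raise RuntimeError("no usable feature level")
    ```

    Under this model a DX10.1 card gets the "10_1" path while a plain DX10 card falls back to "10_0", which is exactly the behaviour being debated in this thread.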
     
  13. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,063 (6.14/day)
    Thanks Received:
    6,122
    Only if the game developers actually support DX10.1 features in the game. Not all DX11 games will support DX10.1; DX11 doesn't guarantee DX10.1. If they continue their trend of laziness, then DX10 is all we are going to get.

    The developers have to code in a rendering path for DX10.1 before it can be used. Essentially, every DX11 game would have to have four rendering paths coded for it: DX11, DX10.1, DX10, and DX9. That is a real pain in the ass. I'm going to guess we will be lucky to see DX10.1 and DX9 actually supported in a DX11 game; my guess is they will continue to support only DX10 and DX11.

    And beyond that, who really cares about DX10.1 anyway? I can't even see the difference between DX10 and DX10.1 in the few games that actually support DX10.1.
     
  14. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,398 (11.53/day)
    Thanks Received:
    9,697
    DX11 games support it from the get-go; you're missing the point.

    When they make DX11 games, all this stuff is included as part of DX11.
    It's like running a Source game: they have a drop-down for DX7, 8.1 and 9.0c.

    DX10.1 is mostly speed boosts, particularly with AA. You won't "see" a difference.
     
  15. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,063 (6.14/day)
    Thanks Received:
    6,122
    No, DX11 games do not support DX10 or DX10.1 by default. The rendering paths need to be included by the developers for the game to support DX10/10.1. Unless they have changed something in DX11 that makes it include DX10 in the standard (and they may have, I don't know).

    The Source engine has rendering paths all the way back to DX7, but not all DX9.0c games support DX7 or DX8 natively. It doesn't work like that; the developers have to add support for those versions of DX manually. With the Source engine, that was pretty easy since DX8 was in use when development started. So they started development based on DX8, with DX7 support, then as they developed it, DX9 came out and they added support for that. DX9 included everything needed to run DX8 and DX7 games, since it kept the DX8 and DX7 library files, but a game still had to support using those library files.

    However, DX10 was a whole new API that didn't natively include support for DX9 and earlier. My understanding was that DX11 will be like this also: a completely new API that doesn't include the DX10/10.1 library files. (Again, I could be wrong here.) But even if DX11 does include the DX10/10.1 API, the game developers still have to manually code the game to use them.

    And DX10.1 is mostly performance improvements, you're correct. However, the NVIDIA cards don't need the performance boosts to outperform the ATI cards that have them, so your point is kind of moot anyway... :D
     
  16. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,398 (11.53/day)
    Thanks Received:
    9,697
     
  17. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,063 (6.14/day)
    Thanks Received:
    6,122
    That is pretty obvious; no game developer in their right mind would limit a game to DX11 only. I know that DX11 games will run on DX10 cards, but will they all have DX10.1 support? Has Microsoft required this? Because if they have only required DX10 support, then that is all we will get. I didn't think the DX10 API was part of DX11, or is it?

    Not trying to turn this into a red vs. green thing, just stating a truth. You are correct that DX10.1 is mostly performance-related, so it is kind of pointless to worry about it if the DX10 cards are outperforming the DX10.1 cards.
     
  18. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,398 (11.53/day)
    Thanks Received:
    9,697
    i'm confident MS has required 10.1 to be included in the DX11 specs.

    The only reason the features in 10.1 weren't in 10 originally was that NVIDIA couldn't do it. ATI will simply have better AA performance than NVIDIA until they come out with new cards.
     
