Nvidia expected to offer DirectX 10.1 GPU in Q1 2009

Discussion in 'General Software' started by tylerward, Jul 28, 2008.

  1. tylerward

    tylerward New Member

    Joined:
    Jul 27, 2008
    Messages:
    43 (0.02/day)
    Thanks Received:
    0
    Location:
    Anaheim, CA
    Chicago (IL) – Nvidia has so far declined to say if and when the company will support the DirectX 10.1 API in its GPUs, a technology that has been integrated in AMD’s Radeon cards for some time now. Roadmap information we stumbled across today offers a bit more clarity and suggests that the company’s next-generation desktop and notebook chips will support DirectX 10.1.

    DirectX 10.1 has been a confusing story for most of us, with no clear indication of which graphics card you should buy to get access to the best feature set. ATI Radeon cards, as well as S3's, have supported DirectX 10.1 for a while now, but Nvidia remains silent about its future API plans, leaving the gaming market and its customers in uncertainty.

    A presentation slide we received, but unfortunately cannot share with you in order to protect our source, clearly states that Nvidia will offer DirectX 10.1 support with its next-generation notebook GPUs that are scheduled for a spring 2009 release. DirectX 10.1 is also likely to be offered in the next desktop GPU generation, which should debut either late in Q4 2008 or Q1 2009, with a possible ramp throughout Q1 and Q2 of 2009.

    So, what does that mean? Well, it depends on your view.

    What we know for sure is that with Nvidia’s decision to support DX10.1, the industry will be embracing this API.

    On the very high end, it may mean that you should think twice about spending $500 or more on a DX10.0 card. DX10.1 cards may be the better value proposition, if you want to run the latest games and don’t want to buy another $500 card six months from now. Nvidia’s new GPU generation, we hear, will also be 1.5 to 2 times faster than the current technology.

    This decision may also have some implications for AMD. Realistically, AMD has a six-month advantage over Nvidia in terms of API support right now and appears to have competitive hardware in place as well. If AMD plays this game smart, it should be able to regain market share, as the 4800 series may be the more attractive technology for computer graphics at this time – at least for those of us on a limited budget.

    Oh, and we almost forgot: Nvidia will also switch to GDDR5 memory, most likely within 2008. As GDDR5 chips become more available, we expect the first Nvidia GDDR5 cards to hit the market in Q4.
    Source: http://www.tgdaily.com/content/view/38247/135/

    I have a 10.1 card already from ATI and am wondering when DirectX 10.1 will be out.
    Last edited: Jul 29, 2008
  2. wolf2009 Guest

    DX 10.1 is already out with Vista SP1. Assassin's Creed was the first game to use DX10.1. Ubisoft later issued a patch to remove DX10.1 because they said it caused problems with Nvidia cards (we all know what problems those were, since the game was in the TWIMTBP program and Nvidia's own cards didn't support DX 10.1, LOL).
  3. tylerward

    tylerward New Member

    Joined:
    Jul 27, 2008
    Messages:
    43 (0.02/day)
    Thanks Received:
    0
    Location:
    Anaheim, CA
    So Assassin's Creed has DirectX 10.1. I gotta get this game then and just NOT patch it, since my card is ATI and 10.1 compatible.
  4. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,598 (2.68/day)
    Thanks Received:
    1,568
    Location:
    Missoula, MT, USA
    Just like DX10 was and still is to an extent, it's more gimmick than useful at this point...and with DX11 possibly looming, what makes a DX10.1 card any more future-proof? Won't 11 be compatible with 10 and 10.1 cards? If it's the next standard...


    I dunno...I don't see the point of being deterred from a purchase if a product is better yet supports an older DX10. I really don't see what the big drawback is...maybe when DX10, 10.1 or even 11 is actually used widely enough to make a larger impact on the massive DX9 market will there be a serious point to any of this, imo.

    I say get what suits your wallet and the performance you expect from gaming NOW...or be one of those who keeps waiting for the next, next, next, next best thing and hope your gamble was worth the wait! Either way I don't really feel one would lose, but I'd rather play and enjoy my games now, and for a while, with my 260 than worry if DX10.1 might sneak up and ruin my gaming experience, ya know? Is it a feature to gloat or worry about? Today no, tomorrow no, next year...meh, we'll see.

    :toast:
  5. EastCoasthandle

    EastCoasthandle New Member

    Joined:
    Apr 21, 2005
    Messages:
    6,889 (2.04/day)
    Thanks Received:
    1,505
    This news is dated July 3, 2008. It's hard to say, but if Nvidia does support DX10.1, it's possible to believe that DX11 may not be all it's cracked up to be. Time will tell.
  6. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    2,836 (1.10/day)
    Thanks Received:
    267
    GDDR5 on which cards? It won't be on the GT200 rehashes, and there won't be a GT300 before Christmas, so...
    I'll have a chuckle if they dump useless GDDR5 on the budget card range.



    DX10 is highly useful; it's just that not many developers want to use it. Plain and simple. Watch: when GPU-side physics goes mainstream, 10, 10.1 and whatever else will balloon.


    Though I think DX should just die altogether.
  7. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.55/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    DX11 seems to be more of an optimization of DX10/10.1 than anything else. You wonder why DX11 is compatible with DX10 hardware. It just means that DX10 is more taxing than DX11. Most game producers didn't like the fact that if they built their games to run specifically on DX10, they would take a large performance hit without seeing a great increase in quality - Crytek had to learn this the hard way. Even so, if they built for DX10.1, it would feck up on NV cards :eek: DX10.1 is the apex of system taxing, although it does look better to the eye. NV is always late to join in the gimmicks. If a card can handle DX10.1, then it can handle anything.
  8. tylerward

    tylerward New Member

    Joined:
    Jul 27, 2008
    Messages:
    43 (0.02/day)
    Thanks Received:
    0
    Location:
    Anaheim, CA
    It really just has to do with the API. It makes it easier for developers to create better textures and higher polygon counts while making better use of hardware. That's why hardware makers build the same model cards: if a 2GB X99 came out (made-up name), then at least 6-7 companies would make a 2GB X99, so everyone is playing games and there aren't as many video cards to test. Testing all of DX 8-9 on 10.1 or 11 would take waaaaay too long and be way too much of a hassle.
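
    The API point above can be sketched in a few lines: Direct3D-style runtimes let one game binary ask for the highest "feature level" a card supports and fall back gracefully. This is an illustrative Python sketch; the names (`FEATURE_LEVELS`, `pick_feature_level`) are made up for the example and are not the real Direct3D API, which is C++/COM.

```python
# Illustrative sketch of feature-level negotiation, loosely modeled on how
# Direct3D lets a game request the highest API level the hardware supports.
# All names here are hypothetical, invented for this example.

FEATURE_LEVELS = ["10.1", "10.0", "9.3"]  # preferred order, highest first

def pick_feature_level(supported):
    """Return the highest feature level present in `supported`."""
    for level in FEATURE_LEVELS:
        if level in supported:
            return level
    raise RuntimeError("no usable feature level")

# A Radeon HD 4800-class card exposes 10.1; a GeForce 8/9-class card stops at 10.0,
# yet the same game code runs on both, which is what keeps testing manageable.
print(pick_feature_level({"10.1", "10.0", "9.3"}))  # prints 10.1
print(pick_feature_level({"10.0", "9.3"}))          # prints 10.0
```

    The fallback order is the whole trick: developers write against the API once, and the runtime maps it down to whatever the card can actually do.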
  9. CrAsHnBuRnXp

    CrAsHnBuRnXp

    Joined:
    Oct 19, 2007
    Messages:
    5,451 (2.22/day)
    Thanks Received:
    635
    It auto-patches itself. If you don't let it patch, you can't play it. So basically you are forced to remove DX10.1 and install all its updates.
  10. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.55/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    I never used the auto-update thing. I just play the game using the DX10 exe instead of the silly launcher - no updates, no problems.
