
AMD/ATI Tempts Game Developers with DirectX 10.1

Discussion in 'News' started by btarunr, Jul 1, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    29,748 (10.67/day)
    Thanks Received:
    14,211
    Location:
    Hyderabad, India
    With a market-position resurrection in progress, AMD/ATI looks to compete using a tried and tested tool for technological supremacy over its rival(s): developer-level optimizations for games. Blizzard has been looking at implementing DirectX 10.1 in its future games. If that happens, it becomes a favorable scenario for AMD's products: Blizzard isn't in the habit of making games that run best only on the most expensive hardware, but with DirectX 10.1 it will look to implement certain DX10.1-exclusive effects. That means even mid-range users of ATI products could enjoy the best visuals the game has to offer, something NVIDIA and its users could miss out on.

    AMD is reportedly looking to team up with developers to implement DirectX 10.1 features. An 'old friend' of ATI, Valve, could also adopt DirectX 10.1, with ATI apparently making sure that happens. The next major revision of the Source engine, which drives upcoming titles such as Half Life 2: Episode 3, Portal 2 and Left 4 Dead, could well bring DirectX 10.1, given the flexibility of the Source engine and the ease with which new technologies can be incorporated into it.

    Game developers have a tendency to play it safe, though, and whether there will be any exclusive effects remains to be seen. There is no reason why they shouldn't implement DirectX 10.1, though: the worst-case scenario is that people with compliant hardware get a performance boost where DX10.1 makes a difference over DX10.0. On the surface, DirectX 10.1 is touted to be more of a feature upgrade than a performance one.

    Source: NordicHardware
     
  2. lemonadesoda

    lemonadesoda

    Joined:
    Aug 30, 2006
    Messages:
    6,317 (1.98/day)
    Thanks Received:
    976
    btarunr says thanks.
  3. InnocentCriminal

    InnocentCriminal Resident Grammar Amender

    Joined:
    Feb 21, 2005
    Messages:
    6,484 (1.73/day)
    Thanks Received:
    847
    I have an interview with Gabe Newell from a few months back where Gabe was stressing that DX10 wasn't needed, and that the Source engine was going to become completely multi-threaded before they even attempted moving into DirectX 10 territory. That could mean they've done that now and are ready to start buffing up the graphics with some DX10.1 feature sets.

    Should be interesting, but that said, look at CoD 4 - DX10-style GFX, and it's still DX9. DX10 still needs to do something extraordinary to win me over.
     
  4. Atnevon

    Atnevon

    Joined:
    Sep 5, 2007
    Messages:
    443 (0.16/day)
    Thanks Received:
    29
    "Play it safe"?

    Tell that to EA with Crysis. So much for playing it safe there, and look what that did.
     
  5. ShinyG

    ShinyG New Member

    Joined:
    Sep 17, 2005
    Messages:
    186 (0.05/day)
    Thanks Received:
    13
    Location:
    Romania
    @InnocentCriminal:
    How does a 25-30% bump in performance with AA sound?
    That is the reason Assassin's Creed dumped DX10.1 support: it would have made AMD's cards get better framerates than nVidia's, even though the game "proudly" displays the "nVidia, the way it's meant to be played!" logo.
     
    WarEagleAU says thanks.
  6. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,995 (11.07/day)
    Thanks Received:
    10,304
    from what i heard, DX10.1 was supposed to add 'performance free' 4x AA. that may have just been a rumour.

    10.1 forces 'optional' 10.0 features to become mandatory.

    DX10 titles have been lacking so far, and i agree.. Call of Duty 4 had some damned awesome graphics, and it's only DX9.
     
  7. lemonadesoda

    lemonadesoda

    Joined:
    Aug 30, 2006
    Messages:
    6,317 (1.98/day)
    Thanks Received:
    976
    ^^ "performance free". No. That's marketing spin. From what I understand (see my previous post and links), there was a shader-to-frame-buffer bug that meant it took twice as long to AA in DX10 (2 renders needed) as it did in DX9 (1 render needed). DX10.1 "fixes" that bug.

    So replace "performance for free" with "bug fix to remove performance fail"
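
    A toy sketch of the pass arithmetic being described: under DX10 as commonly implemented, getting depth data for AA meant a second scene render, while DX10.1 removes the need for it. The function and all millisecond figures below are invented purely for illustration, not measured from any game or API.

```python
# Toy model of the AA cost described above. The timings are made up
# for illustration; only the "one render vs. two renders" structure
# reflects the DX10 vs. DX10.1 difference being discussed.

def frame_cost_ms(scene_ms, aa_resolve_ms, needs_extra_render):
    """One scene render, an optional second render needed only to
    feed AA with depth data, plus the AA resolve itself."""
    renders = 2 if needs_extra_render else 1
    return scene_ms * renders + aa_resolve_ms

dx10 = frame_cost_ms(10.0, 2.0, needs_extra_render=True)     # 2 renders
dx10_1 = frame_cost_ms(10.0, 2.0, needs_extra_render=False)  # 1 render
saved = (dx10 - dx10_1) / dx10
print(f"DX10: {dx10} ms, DX10.1: {dx10_1} ms, {saved:.0%} saved")
```

    With these made-up numbers the saving is large; in practice the second render is often cheaper than the main one, which is one way the quoted 25-30% figure could come about.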
     
    Exceededgoku says thanks.
  8. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,995 (11.07/day)
    Thanks Received:
    10,304
    ahh marketing. isn't it great.

    well regardless, i have a 10.1 card (media PC has a 3450) so i can always use that and see what the fuss is about :p
     
  9. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.20/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    That's the same thing :laugh:


    things look good for AMD... :)
     
    WarEagleAU says thanks.
  10. Exceededgoku

    Joined:
    Mar 26, 2006
    Messages:
    446 (0.13/day)
    Thanks Received:
    30
    Location:
    Stamford, UK
    That deserves a thanks :roll:
     
  11. crsh1976

    Joined:
    Jun 8, 2008
    Messages:
    3 (0.00/day)
    Thanks Received:
    2
    Location:
    Montreal, CDN
    Hasn't Nvidia said they have no plans to adopt 10.1, though? I can't see how that helps developers, given Nvidia's rather large market share; why bother putting out performance features that only work on ATI cards?
     
  12. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.62/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    why bother with TWIMTBP?
     
    WarEagleAU says thanks.
  13. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    29,748 (10.67/day)
    Thanks Received:
    14,211
    Location:
    Hyderabad, India
    Take it this way: You can max out a game using your NVIDIA card, but get away with an extra bit of AA performance and some visual enhancement if you have ATI ;)
     
    WarEagleAU says thanks.
  14. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.61/day)
    Thanks Received:
    184
    Actually, from what I've understood of the thing, and after searching a lot about the feature on the Beyond3D and GameDev forums, DX10.1 doesn't need that extra pass, while DX9 and (the implementation that devs make of) DX10 (explanation below) did need it. Anyway, in the same threads many developers(?) point out two interesting things:

    1- Many said you STILL may need that pass for many other things, so getting rid of it is not always beneficial. After reading this I really wonder if that was the case with AC. Maybe they just scrapped the rendering pass because it wasn't needed for AA, totally forgetting that they (another developer, agnostic to the AA implementation) were using it for something else. Fits with the explanation of the devs IMO. After reading a lot about the issue, I'm all for this theory and totally against the one where Nvidia paid and the developers cheated.

    2- Many said you CAN do the same in DX10 as in DX10.1 regarding the feature that improved AA, but because in DX10 it was not mandatory, it was not properly documented; it was a lot harder to implement, though not impossible. According to them the performance in DX10 would be almost the same, but for many of them, due to the lack of documentation, it didn't make a lot of sense. TBH, they also say it's a lot better implemented in DX10.1, in the sense of how easy it is to use, and sometimes that is more important than anything else.
     
    WarEagleAU says thanks.
  15. lemonadesoda

    lemonadesoda

    Joined:
    Aug 30, 2006
    Messages:
    6,317 (1.98/day)
    Thanks Received:
    976
    I feel sorry for the developers. You can't be developing separate code paths for DX9, DX10, AND DX10.1. What a PITA.
     
  16. InnocentCriminal

    InnocentCriminal Resident Grammar Amender

    Joined:
    Feb 21, 2005
    Messages:
    6,484 (1.73/day)
    Thanks Received:
    847
    Yeah I already know about that, pissed me off when they did that. :shadedshu

    @ DarkMatter, what d'you mean by pass?
     
  17. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.61/day)
    Thanks Received:
    184
    Rendering pass. The rendering is always done in more than one pass. You can say layers if you prefer, just like the layers in Photoshop, but devs call them passes, because that's the proper name, as they render one after the other. For example, post-processing effects are in a different rendering pass (more than one, sometimes), HDR lighting is usually in another one I think, shadow maps are calculated in yet another, and AA in another one or two if you didn't make a depth-test pass for some of the above. You don't need all of those, as you can blend some of them AFAIK, but there's always more than one. For instance, DX10.1 eliminates the need for that second AA pass, but as I said above, apparently many devs use that depth data for many other things, rendering that ability useless for them. Once you need to render that pass anyway, it doesn't matter much that you could do AA without it: you have it, so just use it. Here "calculate" instead of "render" may sound easier to understand, because I've found people usually take "render" to mean "display something on the screen", and that's not always its meaning.

    What I'm trying to explain is that, apparently (at least that's what I understand from what I've read), what we saw in AC is not something we will see much of, because it only offers a performance boost in some limited cases. In fact, the AC devs said that by removing that pass, the engine was not rendering the same thing, because that pass was probably used for something else besides AA under DX10, even if it's not very apparent.

    EDIT: It's important to note here that the discussion I'm talking about occurred well before Assassin's Creed, and that it is me who has related those comments about DX10 and 10.1 features, and their pros and cons, to what happened with AC.
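
    The point about passes can be sketched in code (the pass names and function below are hypothetical, used only to show the idea): a frame is an ordered list of passes, and a depth pre-pass can only be dropped when nothing else still consumes its output.

```python
# Hypothetical pass list illustrating the point above: under DX10.1,
# AA alone no longer forces a depth pre-pass -- but other effects in
# the engine may still depend on that pass's output.

def build_frame_passes(api, depth_used_elsewhere):
    passes = []
    aa_needs_prepass = (api == "dx10")  # DX10.1 removes AA's need for it
    if aa_needs_prepass or depth_used_elsewhere:
        passes.append("depth_prepass")
    passes += ["shadow_maps", "main_scene", "post_processing", "aa_resolve"]
    return passes

# Under DX10, AA alone required the pre-pass:
print(build_frame_passes("dx10", depth_used_elsewhere=False))
# Under DX10.1 it can be dropped -- unless something else reads its
# output, which fits the theory about what happened with AC:
print(build_frame_passes("dx10.1", depth_used_elsewhere=True))
```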
     
  18. vojc New Member

    Joined:
    Mar 29, 2008
    Messages:
    85 (0.03/day)
    Thanks Received:
    9
    shame on nvidia for forgetting dx10.1 in its new generation of GPUs
     
  19. InnocentCriminal

    InnocentCriminal Resident Grammar Amender

    Joined:
    Feb 21, 2005
    Messages:
    6,484 (1.73/day)
    Thanks Received:
    847
    Arrh rendering pass, righto!

    ;)
     
  20. imperialreign

    imperialreign New Member

    Joined:
    Jul 19, 2007
    Messages:
    7,043 (2.45/day)
    Thanks Received:
    909
    Location:
    Sector ZZ₉ Plural Z Alpha
    I kinda agree, although we haven't seen any games yet that were designed from the ground up with DX10 in mind as the primary graphics structure. As of now, DX10 support has been more of an afterthought. I think, though, once we start seeing games developed primarily for DX10, we'll start seeing more improvements overall.


    Someone correct me if I'm wrong, but I thought AC reinstated DX10.1 with the version 1.2 patch, after both Ubisoft and nVidia came under a lot of fire for patch 1.1 and the removal of .1 support, with neither company wanting to give a straight answer and neither story matching up?


    IMO, though, I'd like to see DX10.1 support start rolling into games. It would really help boost AMD/ATI's currently growing market share, and ATI could even use the opportunity to work with game developers more and help push their AMD GAME! campaign. We really need the competition in all areas of the market . . . everything hardware-wise has become very stale over the last 3-5 years; too many companies with sole dominance and a lack of real competition.
     
  21. WarEagleAU

    WarEagleAU Bird of Prey

    Joined:
    Jul 9, 2006
    Messages:
    10,812 (3.33/day)
    Thanks Received:
    547
    Location:
    Gurley, AL
    I remember saying a year or so ago that the AMD buyout of ATI would eventually pan out; they would take some bumps and losses, but eventually come back, if given a chance. Seems to me like they are already starting to do that. The new Phenoms are a good example (of course, still no match for Intel performance-wise and OC-wise). These new cards, and this right here, are another prime example.
     
  22. brian.ca New Member

    Joined:
    Nov 1, 2007
    Messages:
    71 (0.03/day)
    Thanks Received:
    14
    I think the point was that ATI is predicted to grab back market share with this generation, so introducing features that are available only to them has gained some value compared to, say, a year ago.
     
    Last edited: Jul 2, 2008
  23. panchoman

    panchoman Sold my stars!

    Joined:
    Jul 16, 2007
    Messages:
    9,595 (3.34/day)
    Thanks Received:
    1,200
    go amd go!
     
  24. brian.ca New Member

    Joined:
    Nov 1, 2007
    Messages:
    71 (0.03/day)
    Thanks Received:
    14
    Some of the things going on now really shouldn't be a surprise... there were predictions back around the merger that things would be rough for AMD/ATI for a year or so before picking back up.

    On the ATI side specifically, I'm pretty sure people were also noting how the new architecture of the R600 had a lot more growth potential, suggesting that while Nvidia's G80 architecture was better at the time, it would hit a wall before ATI's did. Personally, I think that's what has happened now.
     
  25. syeef

    Joined:
    Jul 5, 2008
    Messages:
    287 (0.11/day)
    Thanks Received:
    68
    I think it's all Microsoft's fault...
     
