Tuesday, July 1st 2008

AMD/ATI Tempts Game Developers with DirectX 10.1

With its market position recovering, AMD/ATI looks to compete using a tried and tested tool in the battle for technological supremacy: developer-level optimizations for games. Blizzard has been looking at implementing DirectX 10.1 in its future titles. If that happens, it would be a favorable scenario for AMD's products. Blizzard is not in the habit of making games that run best only on the most expensive hardware, but with DirectX 10.1 it could implement certain DX10.1-exclusive effects, meaning even mid-range ATI users could enjoy the best visuals the game has to offer, something NVIDIA and its users could miss out on.

AMD is reportedly looking to team up with developers to implement DirectX 10.1 features. Valve, an 'old friend' of ATI, could also adopt DirectX 10.1, with ATI apparently working to make that happen. Given the flexibility of the Source engine and the ease with which new technologies can be incorporated into it, the next major revision of the engine, which drives upcoming titles such as Half-Life 2: Episode 3, Portal 2 and Left 4 Dead, could well support DirectX 10.1.

Game developers have a tendency to play it safe, though, and whether there will be any exclusive effects remains to be seen. There is no reason why they shouldn't implement DirectX 10.1: the worst-case scenario is that people with compliant hardware get a performance boost wherever DX10.1 makes a difference over DX10.0. On the surface, DirectX 10.1 is touted as more of a feature upgrade than a performance one.

Source: NordicHardware

24 Comments on AMD/ATI Tempts Game Developers with DirectX 10.1

#2
InnocentCriminal
Resident Grammar Amender
I have an interview with Gabe Newell from a few months back where he was stressing that DX10 wasn't needed and that the Source engine was going to become completely multi-threaded before they even attempted moving into DirectX 10 territory. That could mean they've done that now and they're ready to start buffing up the graphics with some DX10.1 feature sets.

Should be interesting, but saying that, look at CoD 4: DX10-style GFX, and it's still DX9. DX10 still needs to do something extraordinary to win me over.
Posted on Reply
#3
Atnevon
"Play it safe"?

Tell that to EA with Crysis. Some playing it safe there, and look what that did.
Posted on Reply
#4
ShinyG
@[USER=7140]InnocentCriminal[/USER]:
How does a 25-30% bump in performance with AA sound?
That is the reason Assassin's Creed dumped DX10.1 support: it would have made AMD's cards get better framerates than nVidia's, even though the game "proudly" displays "nVidia, the way it's meant to be played!".
Posted on Reply
#5
Mussels
Moderprator
from what i heard, DX10.1 was supposed to add 'performance free' 4x AA. that may have just been rumour.

10.1 forces 'optional' 10.0 features to become mandatory.

DX10 titles have been lacking so far, and i agree.. Call of Duty 4 had some damned awesome graphics, and it's only DX9.
Posted on Reply
#6
lemonadesoda
^^ "performance free". No. That's marketing spin. From what I understand (see my previous post and links), there was a shader-to-frame-buffer bug that meant it took twice as long to AA in DX10 (2 renders needed) as it did in DX9 (1 render needed). DX10.1 "fixes" that bug.

So replace "performance for free" with "bug fix to remove performance fail"
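The "two renders vs. one" claim can be put into a toy cost model. This is a quick sketch only: the pass counts follow the claim in this thread, and the per-pass cost of 10.0 is a made-up unit, not a measurement.

```python
# Toy cost model of the AA behaviour discussed above. Under DX10 (per this
# thread), AA required a second geometry render; DX9 and DX10.1 did not.

def frame_cost(geometry_cost, extra_aa_pass):
    """One main render pass, plus an optional second geometry pass that
    regenerates depth data just so AA can use it."""
    return geometry_cost * (2 if extra_aa_pass else 1)

dx9_or_dx10_1 = frame_cost(10.0, extra_aa_pass=False)  # 1 render
dx10 = frame_cost(10.0, extra_aa_pass=True)            # 2 renders

print(dx10 / dx9_or_dx10_1)  # → 2.0: AA costs twice as much under DX10
```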
Posted on Reply
#7
Mussels
Moderprator
by: lemonadesoda

So replace "performance for free" with "bug fix to remove performance fail"
ahh marketing. isn't it great.

well regardless, i have a 10.1 card (media PC has a 3450) so i can always use that and see what the fuss is about :P
Posted on Reply
#8
tkpenalty
by: lemonadesoda
^^ "performance free". No. That's marketing spin. From what I understand (see my previous post and links), there was a shader-to-frame-buffer bug that meant it took twice as long to AA in DX10 (2 renders needed) as it did in DX9 (1 render needed). DX10.1 "fixes" that bug.

So replace "performance for free" with "bug fix to remove performance fail"
That's the same thing :laugh:


things look good for AMD... :)
Posted on Reply
#9
Exceededgoku
by: lemonadesoda
^^ "performance free". No. That's marketing spin. From what I understand (see my previous post and links), there was a shader-to-frame-buffer bug that meant it took twice as long to AA in DX10 (2 renders needed) as it did in DX9 (1 render needed). DX10.1 "fixes" that bug.

So replace "performance for free" with "bug fix to remove performance fail"
That deserves a thanks :roll:
Posted on Reply
#10
crsh1976
Hasn't Nvidia said they have no plans to adopt 10.1, though? I can't see how that helps developers given Nvidia's rather large market share; why bother to put out performance features that only work on ATI cards?
Posted on Reply
#11
[I.R.A]_FBi
by: crsh1976
Hasn't Nvidia said they have no plan to adopt 10.1 though? I can't see how that helps developers given Nvidia's rather large market share, why bother to put out performance features that only work on ATI cards?
why bother with TWIMTBP?
Posted on Reply
#12
btarunr
Editor & Senior Moderator
by: crsh1976
Hasn't Nvidia said they have no plan to adopt 10.1 though? I can't see how that helps developers given Nvidia's rather large market share, why bother to put out performance features that only work on ATI cards?
Take it this way: you can max out a game using your NVIDIA card, but get an extra bit of AA performance and some visual enhancement if you have ATI ;)
Posted on Reply
#13
DarkMatter
by: lemonadesoda
^^ "performance free". No. That's marketing spin. From what I understand (see my previous post and links), there was a shader-to-frame-buffer bug that meant it took twice as long to AA in DX10 (2 renders needed) as it did in DX9 (1 render needed). DX10.1 "fixes" that bug.

So replace "performance for free" with "bug fix to remove performance fail"
Actually, from what I've understood of the thing, and after searching a lot about the feature on the Beyond3D and GameDev forums, DX10.1 doesn't need that extra pass, while (the implementation that devs do of) DX10 (explanation below) and DX9 did need it. Anyway, in the same threads many developers(?) point out two interesting things:

1- Many said you STILL may need that pass for many other things, so getting rid of it is not always beneficial. After reading this I really wonder if that was the case with AC. Maybe they just scrapped the rendering pass because it wasn't needed for AA, totally forgetting they (another developer agnostic to the AA implementation) were using it for something else. Fits with the explanation of the devs IMO. After reading a lot about the issue, I'm all for this theory and totally against the "Nvidia paid them and the developers cheated" one.

2- Many said you CAN do the same in DX10 as in DX10.1 regarding the feature that improved AA, but because in DX10 it was not mandatory, it was not properly documented; it was a lot harder to implement, but not impossible. According to them the performance in DX10 would be almost the same, but for many of them, due to the lack of documentation, it didn't make a lot of sense. TBH they also say that in DX10.1 it's a lot better implemented, in the sense of how easy it is to use, and sometimes this is more important than anything else.
Posted on Reply
#14
lemonadesoda
I feel sorry for the developers. You can't be developing separate code paths for DX9, DX10, AND DX10.1. What a PITA.
Posted on Reply
#15
InnocentCriminal
Resident Grammar Amender
by: ShinyG
@InnocentCriminal:
How does a 25-30% bump in performance with AA sound?
That is the reason Assassin's Creed dumped DX10.1 support: it would have made AMD's cards get better framerates than nVidia's, even though the game "proudly" displays "nVidia, the way it's meant to be played!".
Yeah I already know about that, pissed me off when they did that. :shadedshu

@ DarkMatter, what d'you mean by pass?
Posted on Reply
#16
DarkMatter
by: InnocentCriminal
Yeah I already know about that, pissed me off when they did that. :shadedshu

@ DarkMatter, what d'you mean by pass?
Rendering pass. The rendering is always done in more than one pass. You can say layers if you prefer, just like layers in Photoshop, but devs call them passes, because that's the proper name, as they render one after the other. For example, post-processing effects are on a different rendering pass (more than one sometimes), HDR lighting is usually on another one I think, shadow maps are calculated on yet another, and AA on another one or two if you didn't make a depth test pass for some of the above. You don't need all of those, as you can blend some of them AFAIK, but there's always more than one. For instance, DX10.1 eliminates the need for that second AA pass, but as I said above, apparently many devs use that depth data for many other things, rendering that ability useless for them. Once you need to render that pass anyway, it doesn't matter much that you can do AA without it: you have it, just use it. Here "calculate" is maybe easier to understand than "render", because I've found people usually take "render" to mean "display something on the screen", and that's not always its meaning.

What I'm trying to explain is that apparently (at least that's what I understand from what I've read), what we saw in AC is not something we will see much, because it only offers a performance boost in some limited cases. In fact, the AC devs said that by removing that pass, the engine was not rendering the same thing, because that pass was probably used for something else besides AA under DX10, even if it's not very apparent.

EDIT: It's important to note here that the discussion I'm talking about occurred well before Assassin's Creed, and that it was me who related those comments about DX10 and 10.1 features, and their pros and cons, to what happened with AC.
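The multi-pass structure described above, including the "depth data reused elsewhere" caveat, can be put into a toy sketch. Pass names and ordering here are purely illustrative, not taken from any real engine.

```python
# Toy sketch of a multi-pass frame and of when DX10.1 actually saves a pass.

def build_frame_passes(depth_reused_elsewhere, per_sample_depth_access):
    """Return the ordered list of rendering passes for one frame.

    per_sample_depth_access: True under DX10.1, where AA can read the
    multisampled depth data directly instead of needing its own pass.
    depth_reused_elsewhere: True when other effects also consume the
    depth pass, so it can't be dropped just because AA no longer needs it.
    """
    passes = ["shadow_map", "main_color"]
    # The dedicated depth pass is only droppable if AA was its sole consumer.
    if depth_reused_elsewhere or not per_sample_depth_access:
        passes.append("depth")
    passes += ["hdr_lighting", "post_process", "aa_resolve"]
    return passes

# DX10.1 removes a pass only when nothing else consumes the depth data:
print(build_frame_passes(False, True))   # no "depth" pass
print(build_frame_passes(True, True))    # "depth" stays in the list
```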
Posted on Reply
#17
vojc
shame on nvidia for forgetting DX10.1 in their new generation of GPUs
Posted on Reply
#19
imperialreign
by: lemonadesoda
I feel sorry for the developers. You can't be developing separate code paths for DX9, DX10, AND DX10.1. What a PITA.
I kinda agree, although we haven't seen any games yet that were designed from the ground up with DX10 in mind as the primary graphics structure. As of now, DX10 support has been more of an afterthought. I think, though, once we start seeing games developed primarily for DX10, we'll start seeing more improvements overall.


Someone correct me if I'm wrong, but I thought AC reinstated DX10.1 with the version 1.2 patch after both Ubisoft and nVidia came under a lot of fire for patch 1.1's removal of .1 support, when neither company wanted to give a straight answer and their stories didn't match up?


IMO, though, I'd like to see DX10.1 support start rolling into games; it would really help boost AMD/ATI's currently growing market, and ATI could even use the opportunity to work with game developers more and help push their AMD GAME! campaign. We really need the competition in all areas of the market . . . everything hardware-wise has become very stale over the last 3-5 years, with too many companies holding sole dominance and a lack of real competition.
Posted on Reply
#20
WarEagleAU
Bird of Prey
I remember saying a year or so ago that the AMD buyout of ATI would eventually pan out; they would take some bumps and losses, but eventually come back if given a chance. Seems to me like they're already starting to do that. The new Phenoms are a good example (of course, still no match for Intel performance-wise and OC-wise). These new cards, and this right here, are another prime example.
Posted on Reply
#21
brian.ca
by: crsh1976
Hasn't Nvidia said they have no plan to adopt 10.1 though? I can't see how that helps developers given Nvidia's rather large market share, why bother to put out performance features that only work on ATI cards?
I think the point was that ATI is predicted to grab back market share with this generation so introducing features that are available only to them has gained some value compared to like a year ago.
Posted on Reply
#23
brian.ca
by: WarEagleAU
I remember saying a year or so ago that the AMD buyout of ATI would eventually pan out; they would take some bumps and losses, but eventually come back if given a chance. Seems to me like they're already starting to do that. The new Phenoms are a good example (of course, still no match for Intel performance-wise and OC-wise). These new cards, and this right here, are another prime example.
Some of the things going on now really should not be a surprise... there were predictions back closer to the merger that things would be rough for them for like a year or so before picking back up for AMD/ATI.

On the ATI side specifically, I'm pretty sure people were also noting how the new architecture of the R600 had a lot more growth potential, suggesting that while Nv's G80 architecture was better at the time, it would hit a wall before ATI's did. Personally, I think that's what's happening now.
Posted on Reply
#24
syeef
I think it's all Microsoft's fault...
Posted on Reply