EastCoasthandle
New Member
- Joined: Apr 21, 2005
- Messages: 6,885 (0.99/day)
Assassin's Creed is a third-person stealth game available on the PC, PS3 and Xbox 360. When it was announced that Ubisoft would remove DX10.1 support due to a costly rendering pass in post-processing, conspiracy theories started brewing. The Tech Report was able to set up a brief interview with Charles Beauchemin, the tech lead for the Assassin's Creed development team, to answer a few questions surrounding this controversy. Below is the interview:
TR: First, what is the nature of the "costly" "post-effect" removed in Assassin's Creed's DX10.1 implementation? Is it related to antialiasing? Tone mapping?
Beauchemin: The post-effects are used to generate a special look to the game. This means some color correction, glow, and other visual effects that give the unique graphical ambiance to the game. They are also used for game play, like character selection, eagle-eye vision coloring, etc.
TR: Does the removal of this "render pass during post-effect" in the DX10.1 path have an impact on image quality in the game?
Beauchemin: With DirectX 10.1, we are able to re-use an existing buffer to render the post-effects instead of having to render it again with different attributes. However, with the implementation of the retail version, we found a problem that caused the post-effects to fail to render properly.
TR: Is this "render pass during post-effect" somehow made unnecessary by DirectX 10.1?
Beauchemin: The DirectX 10.1 API enables us to re-use one of our depth buffers without having to render it twice, once with AA and once without.
TR: What other image quality and/or performance enhancements does the DX10.1 code path in the game offer?
Beauchemin: There is no visual difference for the gamer. Only the performance is affected.
TR: What specific factors led to DX10.1 support's removal in patch 1?
Beauchemin: Our DX10.1 implementation was not properly done and we didn't want the users with Vista SP1 and DX10.1-enabled cards to have a bad gaming experience.
TR: Finally, what is the future of DX10.1 support in Assassin's Creed? Will it be restored in a future patch for the game?
Beauchemin: We are currently investigating this situation.
In a nutshell, the performance gains with DX10.1 and AA are in fact real. The removal of the rendering pass in post-processing is a direct result of DX10.1's efficiency and doesn't affect IQ. What is concerning is that they are non-committal about restoring DX10.1 once it is removed via patch. I encourage you to read more of this article here. One thing that caught my attention about this article is this:
We'll be watching to see what happens next. For our part, the outcome will affect whether and how we use Assassin's Creed and other Ubisoft and Nvidia "TWIMTBP" titles in our future GPU evaluations.
Here is the article from TG Daily. The gist of it is quoted below. Take note of the article's title.
...The difference that developers failed to explain is the way antialiasing is controlled in DirectX 10.0 versus 10.1. In DX10.0, it was impossible to access information for each sample from a depth buffer. This actually led to a costly slowdown in antialiasing operations. 10.1 allows shader units to access all antialiasing buffers. All of this was brought to the limelight by an article over at Rage3D (http://www.rage3d.com/articles/assassinscreed/).
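To make the buffer-access difference concrete, here is a minimal toy model in Python. This is not real Direct3D code and all names are illustrative: it only counts depth passes to show why a DX10.0-style path, which cannot read individual samples of a multisampled depth buffer from a shader, needs a second depth render for the post-effects, while a DX10.1-style path re-uses the existing MSAA depth buffer and reads its samples directly.

```python
# Toy model of the two code paths described above.
# All names are illustrative; this is not real Direct3D code.

def render_depth(width, height, samples):
    """Stand-in for a depth-only render pass: returns a buffer of
    per-pixel sample lists and reports that one pass was spent."""
    buffer = [[[0.5] * samples for _ in range(width)] for _ in range(height)]
    return buffer, 1  # (buffer, passes used)

def post_effect_passes_dx10_0(width, height, samples):
    # The scene pass produces an MSAA depth buffer, but a DX10.0-style
    # shader cannot read its individual samples...
    _, passes = render_depth(width, height, samples)
    # ...so depth is rendered a second time without AA for the post-effects.
    _, extra = render_depth(width, height, samples=1)
    return passes + extra

def post_effect_passes_dx10_1(width, height, samples):
    # A DX10.1-style shader can read every sample of the existing
    # MSAA depth buffer, so that buffer is simply re-used.
    msaa_depth, passes = render_depth(width, height, samples)
    per_sample = msaa_depth[0][0]   # all samples of one pixel are visible
    assert len(per_sample) == samples
    return passes

print(post_effect_passes_dx10_0(4, 4, samples=4))  # 2 depth passes
print(post_effect_passes_dx10_1(4, 4, samples=4))  # 1 depth pass
```

Under these assumptions the DX10.0 path spends two depth passes where the DX10.1 path spends one, which matches the "1 pass versus 2 passes" claim the developers make below.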
According to the following three quotes from software developers, this effect applies to all DirectX 10 titles, and there is a good chance that you've already played their games. We talked to a (DX10.0) game developer close to Ubisoft who requested to remain anonymous; he told us that Ubisoft's explanation walks on thin ice. Here is his response to our inquiry and his take on Ubisoft's statement:
“Felt you might want to hear this out. Read the explanation and laughed hard … the way how DX10.1 works is to remove excessive passes and kill overhead that happened there. That overhead wasn't supposed to happen - we all know that DX10.0 screwed AA in the process, and that 10.1 would solve that [issue]. Yet, even with DX10.0, our stuff runs faster on GeForce than on Radeon, but SP1 resolves scaling issues on [Radeon HD 3800] X2.”
We received a second reply from another game developer, who is currently working on a DirectX 10.1 title that is fully compliant with DX10.0 hardware:
“Of course it removes the render pass! That's what 10.1 does! Why is no one pointing this out, that's the correct way to implement it and is why we will implement 10.1. The same effects in 10.1 take 1 pass whereas in 10 it takes 2 passes.”
A third email reply reached us from a developer at a multiplatform development studio:
“Our port to DX10.1 code does not differ from DX10.0, but if you own DX10.1-class hardware from either Nvidia or ATI, FSAA equals performance jump. Remember "Free FSAA"?”