Wednesday, June 3rd 2009

AMD Demonstrates World’s First Microsoft DirectX 11 Graphics Processor

At a press conference in Taipei, Taiwan today, AMD publicly demonstrated the world’s first Microsoft DirectX 11 graphics processor. The series of demonstrations shed new light on the significantly improved computing experience set to debut at the end of 2009. The fusion of AMD’s new ground-breaking graphics processors with the forthcoming DirectX 11 programming interface is set to forever change both applications and PC gaming for the better. To illustrate, AMD showed numerous examples of faster application performance and new game features using the world’s first true DirectX 11 graphics processor.

  • Get ready for a revolution: Games and other applications are about to get a lot better as a result of AMD’s new graphics hardware and DirectX 11. DirectX 11 features such as tessellation will bring consumers higher-quality, superior-performing games making use of 6th-generation AMD technology. Another DirectX 11 feature, the compute shader, will enable AMD’s DirectX 11 graphics cards to help Windows 7 run faster in a wide range of applications, in a manner that’s completely transparent to users: for example, seamlessly accelerating the conversion of video for playback on portable media players through a drag-and-drop interface.
  • DirectX 11 done right on AMD: The development of DirectX 11 has been broadly influenced by AMD graphics technology. Each new version of DirectX builds on the versions that came before it, and many of the capabilities of DirectX 11 were pioneered on AMD GPUs, including DirectX 10.1, tessellation, compute shaders, Fetch4, custom filter anti-aliasing and high-definition ambient occlusion shading.
  • Bringing consumers DirectX 11 sooner: The preview of the world’s first DirectX 11 graphics processor at Computex 2009 validates AMD’s commitment to delivering leading technologies to market before anyone else, and to continuing to foster innovation in computing.
  • Fueling developer demand: It’s not just consumers who are excited about the prospects of DirectX 11, game developers are also incredibly enthusiastic about taking advantage of new DirectX 11 hardware to bring even better games to market, in large part due to AMD’s readiness to meet their DirectX 11 needs. Many developers have indicated their commitment to building DirectX 11 games initially on AMD’s DirectX 11 hardware, delivering superior performance and compatibility.
“AMD has a long track record of delivering pioneering features that have gone on to become mainstays in the DirectX experience, and we’re doing it again with two mature, AMD-developed technologies in DirectX 11 – tessellation and the compute shader – both of which enable a better DirectX 11 experience for consumers,” said Rick Bergman, Senior Vice President, AMD Products Group. “Today, we’re previewing AMD’s DirectX 11 graphics processor to build enthusiasm for this key technology so developers will have games available at launch and shortly thereafter. With the benefits it delivers to gaming, applications and Windows 7, developers are lining up to get their hands on our hardware, and we’re confident that consumers will too.”

Source: AMD

61 Comments on AMD Demonstrates World’s First Microsoft DirectX 11 Graphics Processor

#1
Wile E
Power User
The 360 has the tessellation option as well. Just an FYI.
Posted on Reply
#2
Mussels
Moderprator
Wile E said:
360 has the tessellation option as well. Just an fyi.
Because it's got sexy ATI hardware inside, that's why :)
Posted on Reply
#3
Valdez
Mussels said:

Whether or not ATI's existing tessellation is compatible with what they used in DX11 is up for debate; no one really knows yet.
They are not compatible.


Battlefield: Bad Company 2 is a DX10.1 title with ATI's tessellation support (2xxx, 3xxx, 4xxx series and Xbox). They ported it to DX11 just to "import" DX11 tessellation, which can be used by the GT300 (and RV870).
They don't use any other DX11 features, only DX11 tessellation.

Slides:
http://hardocp.com/news.html?news=Mzk5NTAsLCxoZW50aHVzaWFzdCwsLDE=
Posted on Reply
#4
Mussels
Moderprator
Valdez said:
They are not compatible.


Battlefield: Bad Company 2 is a DX10.1 title with ATI's tessellation support (2xxx, 3xxx, 4xxx series and Xbox). They ported it to DX11 just to "import" DX11 tessellation, which can be used by the GT300 (and RV870).
They don't use any other DX11 features, only DX11 tessellation.

Slides:
http://hardocp.com/news.html?news=Mzk5NTAsLCxoZW50aHVzaWFzdCwsLDE=
I hope games use ATI's tessellation as well as the DX11 one. They could possibly GPGPU some of it.
Posted on Reply
#5
Easo
Bring it on!
Posted on Reply
#6
Valdez
Mussels said:
From what I've pieced together, anyone with DX9 hardware or above can run a DX11 title; unsupported features are just disabled (and will therefore hurt graphics or performance).

So if you use a DX10 video card, you get DX10-level graphics and performance. If you're an ATI user, you get DX10.1 and the far better antialiasing. If you're a DX11 user... well, you get the whole thing.
As far as I know, DX11 is backward compatible with DX10, not with DX9.
So DX10.x cards could run DX11 titles (with some eye-candy and performance loss), but DX9 cards couldn't.
Just like today, you need a DX10 card to run DX10 titles.
Anyway, DX9-only cards are too weak for today's and future titles.


However, with software rendering (Microsoft's WARP rasterizer) you can render DX10/11 games on the CPU even with a DX9 card, but it will be very slow.
http://msdn.microsoft.com/en-us/library/dd285359.aspx
Posted on Reply
#7
wolf
Performance Enthusiast
That picture really doesn't impress me much; show me the GPU, show me its specs, and show me a fully rendered DX11 scene already!

NVIDIA's first DX11 GPU should be a creature of might too; can't wait for more than buzzwords to surface.
Posted on Reply
#8
Valdez
http://www.anandtech.com/video/showdoc.aspx?i=3573
We can ascertain from the wafer shot AMD provided us that there are about 19.5 dies vertically and 25.5 dies horizontally. As this is a 300mm wafer, we can sort of "guess" the dimensions of the chip at roughly 15.38mm x 11.76mm, resulting in a die area of a little over 180mm^2. This is much smaller than the 55nm RV770, which is a 260mm^2 part.
Compare this die size to the RV740, which weighs in at 137mm^2 and 826 million transistors, and we can estimate very loosely that Evergreen could come in at something over 1 billion transistors.
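As a sanity check, AnandTech's estimate can be reproduced with a few lines of arithmetic using only the figures quoted above (the 300mm wafer diameter and the rough die counts):

```python
# Rough die-size estimate from a wafer shot, using the figures quoted above:
# a 300 mm wafer showing about 25.5 dies across and 19.5 dies down.
WAFER_DIAMETER_MM = 300.0
DIES_ACROSS = 25.5   # horizontal count across the wafer
DIES_DOWN = 19.5     # vertical count down the wafer

die_width = WAFER_DIAMETER_MM / DIES_ACROSS   # ~11.76 mm
die_height = WAFER_DIAMETER_MM / DIES_DOWN    # ~15.38 mm
die_area = die_width * die_height             # a little over 180 mm^2

print(f"{die_width:.2f} x {die_height:.2f} mm -> {die_area:.1f} mm^2")
```

It is only as good as counting fractional dies off a photo, of course, but it lands right on the ~181mm^2 figure in the article.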
Posted on Reply
#9
amschip
I was playing a little bit with the DX10 SDK, and the way it works is that developers have a set of routines to perform a hardware check, be it DX9, 10 or 11. Since DX11 is an evolution of DX10 (or a superset), it might not support DX9 due to different routine syntax and coding (a DX9 routine invocation might have a different syntax than, say, a DX11 one, or be completely obsolete).
Also, about DX10 and 11 compatibility and conversion, take the tessellation unit, for example. The DX10 SDK requires the developer to have supporting hardware (which isn't there) or to create a software routine on his own, while DX11 will have both already. So let's say DX10 developers designed a software tessellation unit. Now the only thing they have to do is invoke the SDK routine instead. Simple, isn't it?
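The probe-then-fallback pattern described above can be sketched like this; every name here is illustrative rather than an actual SDK call:

```python
# Illustrative sketch of the capability-check pattern described above.
# None of these names are real DirectX SDK calls; they just model the idea:
# probe the hardware's feature level, then pick a hardware or software path.

def query_feature_level(device):
    """Stand-in for a driver query; returns the level the device reports."""
    return device.get("feature_level", 9)

def tessellate_hw(mesh):
    # The DX11 case: the API exposes the fixed-function tessellator.
    return f"hardware-tessellated({mesh})"

def tessellate_sw(mesh):
    # The DX10-era option: the developer ships their own software routine.
    return f"software-tessellated({mesh})"

def tessellate(device, mesh):
    # On DX11-class hardware the API provides the unit; otherwise fall back.
    if query_feature_level(device) >= 11:
        return tessellate_hw(mesh)
    return tessellate_sw(mesh)

print(tessellate({"feature_level": 11}, "quad"))
print(tessellate({"feature_level": 10}, "quad"))
```

The caller never changes; only the dispatch decides which path runs, which is exactly the "invoke the SDK routine instead" simplification being described.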
Posted on Reply
#10
Studabaker
Studabaker said:
End of the year for me = PII X4, DDR3, RAID0 SSD, DX11 Radeon, Win7 & daily MW2 beatdowns! :rockout:
That's what I said :D
Posted on Reply
#11
The Witcher
This is stupid; our current hardware already has a hard time running DX10 applications, and now they're going to release DX11. We all know that the original DX10 Crysis (from the first trailers) used DX10 fully (I guess), but they had to lower the graphics because the hardware couldn't handle it. So what's the point in releasing DX11, other than speeding up the release of new hardware that's going to be extremely expensive, just like the 8800 Ultra... maybe.

But to be honest I can't wait to see the first images of a DX11 game because I want to see the future of gaming :D
Posted on Reply
#12
ChaoticAtmosphere
This is very good news. I guess this news goes hand in hand with the announcement of AMD/ATI's development and late-fall release (I'm guessing) of the R800, RV870, RV840 and RV810 GPUs in the 5000 series.

I'll be working overtime to get my hands on one of those babies... I'm wondering if Crysis 2 will support DX11 fully.
Posted on Reply
#13
Valdez
The Witcher said:
This is stupid; our current hardware already has a hard time running DX10 applications, and now they're going to release DX11. We all know that the original DX10 Crysis (from the first trailers) used DX10 fully (I guess), but they had to lower the graphics because the hardware couldn't handle it. So what's the point in releasing DX11, other than speeding up the release of new hardware that's going to be extremely expensive, just like the 8800 Ultra... maybe.

But to be honest I can't wait to see the first images of a DX11 game because I want to see the future of gaming :D
There is only one DX10 game on the market (Stormrise). All of the other "DX10" titles were just DX9 ports. Yes, Crysis too. Crysis is a DX9 game with a DX10 executable that doesn't use DX10 features at all.
Posted on Reply
#14
PCpraiser100
OMG! Look at those shaders. Finally I can see the horizon on these DX11 cards; guess I don't need a second card after all.

I love tessellation; I find it to be a great alternative to anti-aliasing, with little performance drop. At least not as much.
Posted on Reply
#15
Sihastru
The Witcher said:
This is stupid; our current hardware already has a hard time running DX10 applications, and now they're going to release DX11. We all know that the original DX10 Crysis (from the first trailers) used DX10 fully (I guess), but they had to lower the graphics because the hardware couldn't handle it. So what's the point in releasing DX11, other than speeding up the release of new hardware that's going to be extremely expensive, just like the 8800 Ultra... maybe.

But to be honest I can't wait to see the first images of a DX11 game because I want to see the future of gaming :D
Features look good, but if it takes game studios as long to adopt them as it did with DX10/DX10.1, by the time a good DX11 video card appears on the market they will be ready for DX12. And I am not talking about porting. I want pure DX11 games that will blow my mind. I want to see these features that ATI "invented" and "pioneered", but that no one used until now, put to good use.

Anyway, I remember the Microsoft Flight Simulator X DX10.1 TARGET pre-rendered images. They don't come even close to the real thing. So, be happy they are moving forward; support them by buying the next king-of-the-hill video card. I know I will. And I also know I WILL be disappointed.

It's how the world works... They give you a prequel to a cookie (DX10.1); then, when you get so close to enjoying the actual cookie (DX10.1), you see that it isn't what you hoped it would be. But it's all OK, because a new cookie with hazelnuts (DX11) appears and you forget all about the last cookie (DX10.1) and all the other cookies you were forced to swallow (<= DX9) that left you with a sour aftertaste.
Posted on Reply
#16
1Kurgan1
The Knife in your Back
Sounds great; I just want to know what card they were using for DX11. The 5870? :D
Posted on Reply
#17
The Witcher
Valdez said:
There is only one dx10 game on the market (stormrise). All of the other "dx 10" titles were just dx9 ports. Yes, crysis too. Crysis is a dx9 game with a dx10 executable which doesn't use dx10 features at all.
Hmmm, then how did they make this:

http://www.youtube.com/watch?v=Utz8D5aSK84

Anyway, I won't buy any ATI video card until they change their VGA coolers and add PhysX support, or whatever it's called.
Posted on Reply
#18
MrAlex
The Witcher said:
Hmmm, then how did they make this:

http://www.youtube.com/watch?v=Utz8D5aSK84

Anyway, I won't buy any ATI video card until they change their VGA coolers and add PhysX support, or whatever it's called.
Yeah, it would be great if NVIDIA let them... :shadedshu
Posted on Reply
#19
Valdez
The Witcher said:
Hmmm, then how did they make this:

http://www.youtube.com/watch?v=Utz8D5aSK84

Anyway, I won't buy any ATI video card until they change their VGA coolers and add PhysX support, or whatever it's called.
How did they what?

I'm not a native English speaker, so I don't know what you don't understand about what I wrote about Crysis and DX10.
Posted on Reply
#20
Mussels
Moderprator
He doesn't get the point that Crysis was a DX9 game ported to DX10, instead of being made originally for DX10.

The Witcher: this is no different from how GTA IV was made for consoles and ported to PC. If it had been made for PC first, it wouldn't run so poorly or need such excessive hardware to run.

The same is true for all DX10 games out there: if they'd been made for DX10 from the start, they would be FASTER than DX9, not slower.
Posted on Reply
#22
The Witcher
Mussels said:
He doesn't get the point that Crysis was a DX9 game ported to DX10, instead of being made originally for DX10.

The Witcher: this is no different from how GTA IV was made for consoles and ported to PC. If it had been made for PC first, it wouldn't run so poorly or need such excessive hardware to run.

The same is true for all DX10 games out there: if they'd been made for DX10 from the start, they would be FASTER than DX9, not slower.
Damn... I sounded like an ignorant guy :(

Thanks for the explanation; now I get the idea :D
Posted on Reply
#24
BumbleBee
Here is a link to the presentation if you want to watch it.
Posted on Reply
#25
p3ngwin
Wile E said:
Or leaves, or any oft-repeated object in any game.
I think you guys are confusing Geometry TESSELLATION with GEOMETRY INSTANCING.

GEOMETRY INSTANCING From Wikipedia:
"In real-time computer graphics, geometry instancing refers to the practice of rendering multiple copies of the same mesh in a scene at once. This technique is primarily used for objects such as trees, grass, or buildings which can be represented as repeated geometry without appearing unduly repetitive, but may also be used for characters.

Although vertex data is duplicated across all instanced meshes, each instance may have other differentiating parameters (such as color, or skeletal animation pose) changed in order to reduce the appearance of repetition. By factoring out common data between instances to achieve lower memory usage, this technique is an example of the flyweight design pattern."
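The flyweight structure that quote describes (heavy vertex data stored once, light per-instance parameters) can be sketched in a few lines of illustrative Python; none of this is a real graphics API, it just models the memory layout:

```python
# Illustrative flyweight sketch of geometry instancing: the heavy vertex
# data is stored once, while each instance carries only its own small
# differentiating parameters (position, color, animation pose, ...).

class Mesh:
    def __init__(self, vertices):
        self.vertices = vertices  # shared geometry, stored exactly once

class Instance:
    def __init__(self, mesh, position, color):
        self.mesh = mesh          # reference to the shared mesh, not a copy
        self.position = position  # per-instance data stays tiny
        self.color = color

tree = Mesh(vertices=[(0, 0), (1, 0), (0.5, 1)])  # one triangle "tree"
forest = [Instance(tree, position=(x, 0), color="green") for x in range(1000)]

# 1000 instances, but only one copy of the vertex data exists in memory.
assert all(inst.mesh is tree for inst in forest)
print(len(forest), "instances sharing", len(tree.vertices), "vertices")
```

The point is that adding the thousandth tree costs only a tiny instance record, not another copy of the mesh.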

Geometry TESSELLATION From ExtremeTech:
The hull shader takes control points for a patch as an input. Note that this is the first appearance of patch-based data used in DirectX. The output of the hull shader essentially tells the tessellator stage how much to tessellate. The tessellator itself is a fixed function unit, taking the outputs from the hull shader and generating the added geometry. The domain shader calculates the vertex positions from the tessellation data, which is passed to the geometry shader.

It's important to recognize that the key primitive used in the tessellator is no longer a triangle: It's a patch. A patch represents a curve or region, and can be represented by a triangle, but the more common representation is a quad, used in many 3D authoring applications.

What all this means is that fully compliant DirectX 11 hardware can procedurally generate complex geometry out of relatively sparse data sets, improving bandwidth and storage requirements. This also affects animation, as changes in the control points of the patch can affect the final output in each frame.

The cool thing about hardware tessellation is that it's scalable. It's possible that low end hardware would simply generate less complex models than high-end hardware, while the actual data fed into the GPUs remains the same."




Much more efficient.
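The scalability point from the ExtremeTech quote is easy to see with a toy model: tessellating a single quad patch uniformly at factor N yields an N×N grid of quads, i.e. 2·N² triangles, from the same four control points. (A real DX11 tessellator also supports fractional and per-edge factors; this sketch covers only the uniform integer case.)

```python
# Toy model of uniform quad-patch tessellation: one patch (4 control points)
# expands into an N x N grid of quads, i.e. 2*N*N triangles. The input data
# stays the same; only the tessellation factor changes per hardware tier.

def tessellated_triangles(factor: int) -> int:
    """Triangle count for one quad patch at a uniform integer factor."""
    return 2 * factor * factor

def tessellated_vertices(factor: int) -> int:
    """Vertex count of the (factor+1) x (factor+1) grid the patch becomes."""
    return (factor + 1) ** 2

# Same four control points in, wildly different geometry out:
for n in (1, 4, 16, 64):
    print(f"factor {n:2d}: {tessellated_vertices(n):5d} vertices, "
          f"{tessellated_triangles(n):5d} triangles")
```

That quadratic growth from a fixed-size input is why the quote emphasizes improved bandwidth and storage requirements: low-end hardware can pick a small factor, high-end a large one, and the data fed to the GPU never changes.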

Shiny Entertainment did geometry subdivision in their engine for "Messiah" and "Sacrifice"; it was all done in software by the engine.

Now we get it hardware-accelerated as part of the DirectX API.


N.B.
I may be a little off on some parts, so anyone more knowledgeable feel free to set things straight for the greater good :)
Posted on Reply