
AMD Demonstrates World’s First Microsoft DirectX 11 Graphics Processor

Tessellation is a great feature.

ATI cards have supported it for a long time: there was a beta version that worked on ATI hardware in the original Far Cry. THAT long ago.

The reason it never took off (like DX10.1) is that Nvidia never adopted it.

Whether or not ATI's existing tessellation is compatible with what is used in DX11 is up for debate; no one really knows yet.
 
The Xbox 360 has a tessellation option as well. Just an FYI.
 
Whether or not ATI's existing tessellation is compatible with what is used in DX11 is up for debate; no one really knows yet.

They are not compatible.


Battlefield: Bad Company 2 is a DX10.1 title with ATI's tessellation support (2xxx, 3xxx, 4xxx series and the Xbox 360). They ported it to DX11 just to "import" DX11 tessellation, which can be used by GT300 (and RV870).
They don't use any other DX11 features, only DX11 tessellation.

Slides:
http://hardocp.com/news.html?news=Mzk5NTAsLCxoZW50aHVzaWFzdCwsLDE=
 
They are not compatible.


Battlefield: Bad Company 2 is a DX10.1 title with ATI's tessellation support (2xxx, 3xxx, 4xxx series and the Xbox 360). They ported it to DX11 just to "import" DX11 tessellation, which can be used by GT300 (and RV870).
They don't use any other DX11 features, only DX11 tessellation.

Slides:
http://hardocp.com/news.html?news=Mzk5NTAsLCxoZW50aHVzaWFzdCwsLDE=

I hope games use ATI's tessellation as well as the DX11 one. They could possibly offload some of it to GPGPU.
 
From what I've pieced together, anyone with DX9 hardware or above can run a DX11 title; unsupported features are just disabled (and will therefore hurt graphics, or performance).

So if you use a DX10 video card, you get DX10-level graphics and performance. If you're an ATI user, you get DX10.1 and the far better anti-aliasing. If you're a DX11 user... well, you get the whole thing.


As far as I know, DX11 is backward compatible with DX10, not with DX9.
So DX10.x cards could run DX11 titles (with some loss of eye candy and performance), but DX9 cards couldn't.
Just like today, you need a DX10 card to run DX10 titles.
Anyway, DX9-only cards are too weak for today's and future titles.


However, with software rendering you can render DX10/11 games on the CPU (even with a DX9 card), but it will be very slow.
http://msdn.microsoft.com/en-us/library/dd285359.aspx
 
That picture really doesn't impress me much. Show me the GPU, show me its specs, and show me a fully rendered DX11 scene already!

Nvidia's first DX11 GPU should be a creature of might too; can't wait for more than buzzwords to surface.
 
http://www.anandtech.com/video/showdoc.aspx?i=3573

We can ascertain from the wafer shot AMD provided us that there are about 19.5 dies vertically and 25.5 dies horizontally. As this is a 300mm wafer, we can sort of "guess" the dimensions of the chip at roughly 15.38mm x 11.76mm, resulting in a die area of a little over 180mm^2. This is much smaller than the 55nm RV770, which is a 260mm^2 part.

Compare this die size to RV740 that weighs in at 137mm^2 and 826 million transistors, and we can estimate very loosely that Evergreen could come in at something over 1 billion transistors.
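The die-size estimate quoted above can be reproduced with simple arithmetic (a sketch; the die counts are the article's eyeballed figures from the wafer shot, not official specs):

```python
# Rough die-size estimate from the wafer-shot figures quoted above.
# Assumption: a 300 mm wafer with ~19.5 dies fitting vertically and
# ~25.5 horizontally (eyeballed counts, not official numbers).
wafer_mm = 300.0
dies_vertical = 19.5
dies_horizontal = 25.5

die_height = wafer_mm / dies_vertical    # ~15.38 mm
die_width = wafer_mm / dies_horizontal   # ~11.76 mm
die_area = die_height * die_width        # a little over 180 mm^2

print(f"{die_height:.2f} mm x {die_width:.2f} mm = {die_area:.1f} mm^2")
```

This lands right at the "a little over 180mm^2" figure, comfortably below the 260mm^2 RV770.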
 
I was playing a little bit with the DX10 SDK, and the way it works is that developers have a set of routines to perform hardware checks, be it DX9, 10, or 11. Since DX11 is an evolution of DX10 (or a superset), it might not support DX9 due to different routine syntax and coding (a DX9 routine invocation might have a different syntax than, say, a DX11 one, or be completely obsolete).
Also, about DX10 and DX11 compatibility and conversion, take the tessellation unit as an example. The DX10 SDK requires the developer to have supporting hardware (which isn't there) or to create a software routine on his own, while DX11 will have both already. So let's say DX10 developers designed a software tessellation routine. Now the only thing they have to do is invoke the SDK routine instead. Simple, isn't it?
 
This is stupid. Our current hardware already has a hard time running DX10 applications, and now they are going to release DX11. We all know that the original DX10 Crysis (from the first trailers) used DX10 fully (I guess), but they had to lower the graphics because the hardware of the time couldn't handle it. So what's the point in releasing DX11, other than speeding up the release of new hardware that is going to be extremely expensive, just like the 8800 Ultra... maybe.

But to be honest, I can't wait to see the first images of a DX11 game, because I want to see the future of gaming :D
 
This is very good news. I guess this news goes hand in hand with the announcement of AMD/ATI's development and late-fall release (I'm guessing) of the R800, RV870, RV840 and RV810 GPUs in the 5000 series.

I'll be working overtime to get my hands on one of those babies... I'm wondering if Crysis 2 will fully support DX11.
 
This is stupid. Our current hardware already has a hard time running DX10 applications, and now they are going to release DX11. We all know that the original DX10 Crysis (from the first trailers) used DX10 fully (I guess), but they had to lower the graphics because the hardware of the time couldn't handle it. So what's the point in releasing DX11, other than speeding up the release of new hardware that is going to be extremely expensive, just like the 8800 Ultra... maybe.

But to be honest, I can't wait to see the first images of a DX11 game, because I want to see the future of gaming :D


There is only one DX10 game on the market (Stormrise). All of the other "DX10" titles were just DX9 ports. Yes, Crysis too. Crysis is a DX9 game with a DX10 executable that doesn't use DX10 features at all.
 
Omg! Look at those shaders. Finally I can see the horizon on these DX11 cards, guess I don't need a second card after all.

I love tessellation. I find it to be a great alternative to anti-aliasing, with little performance drop. At least not as much of one.
 
This is stupid. Our current hardware already has a hard time running DX10 applications, and now they are going to release DX11. We all know that the original DX10 Crysis (from the first trailers) used DX10 fully (I guess), but they had to lower the graphics because the hardware of the time couldn't handle it. So what's the point in releasing DX11, other than speeding up the release of new hardware that is going to be extremely expensive, just like the 8800 Ultra... maybe.

But to be honest, I can't wait to see the first images of a DX11 game, because I want to see the future of gaming :D

The features look good, but if it takes game studios as long to adopt them as it did with DX10/DX10.1, then by the time a good DX11 video card appears on the market they will be ready for DX12. And I am not talking about ports. I want pure DX11 games that will blow my mind. I want to see these features that ATI "invented" and "pioneered", but that no one used until now, put to good use.

Anyway, I remember the Microsoft Flight Simulator X DX10.1 TARGET pre-rendered images. The real thing never came even close to them. So be happy they are moving forward; support them by buying the next king-of-the-hill video card. I know I will. And I also know I WILL be disappointed.

It's how the world works... They give you a prequel to a cookie (DX10.1), then, when you get so close to enjoying the actual cookie (DX10.1), you see that it isn't what you hoped it would be, but it's all ok because a new cookie with hazelnuts (DX11) appears and you forget all about the last cookie (DX10.1) and all the other cookies that you were forced to swallow (<= DX9) and left you with a sour aftertaste.
 
Sounds great. I just want to know what card they were using for the DX11 demo. The 5870? :D
 
There is only one DX10 game on the market (Stormrise). All of the other "DX10" titles were just DX9 ports. Yes, Crysis too. Crysis is a DX9 game with a DX10 executable that doesn't use DX10 features at all.

hmmm, then how did they make this :

http://www.youtube.com/watch?v=Utz8D5aSK84

Anyway, I won't buy any ATI video card until they change their VGA coolers and add PhysX support, or whatever it's called.
 
He doesn't get the point that Crysis was a DX9 game ported to DX10, instead of being made originally for DX10.


The Witcher: this is no different from how GTA IV was made for consoles and then ported to PC. If it had been made for PC first, it wouldn't run so poorly, or need such excessive hardware to run.

The same is true for all the DX10 games out there: if they'd been made for DX10 from the start, they would be FASTER than DX9, not slower.
 
He doesn't get the point that Crysis was a DX9 game ported to DX10, instead of being made originally for DX10.


The Witcher: this is no different from how GTA IV was made for consoles and then ported to PC. If it had been made for PC first, it wouldn't run so poorly, or need such excessive hardware to run.

The same is true for all the DX10 games out there: if they'd been made for DX10 from the start, they would be FASTER than DX9, not slower.

Damn...I sounded like an ignorant guy :(

Thanks for the explanation, now I get the idea :D
 
here is a link to the presentation if you want to watch it.
 