Discussion in 'Graphics Cards' started by HalfAHertz, Mar 19, 2011.
Found an interesting read about what developers think of the current state of DirectX
I didn't see much from devs there; all I saw was a bunch of posturing from AMD's Richard Huddy. There's one bit at the end from Crytek, justifying Richard's perspective.
That said, I agree to a point, but maybe the approach taken by these guys isn't the best one out there; they look to be wanting to make things far more complex, not easier, whatever they might suggest.
Not one mention of the benefits that DirectX provides, either. Very one-sided article, for sure.
read that yesterday..i think it was on the [H].
id have to agree. directx is what allows us to play so many games on so many hardware configurations. its no secret that one of the few disadvantages of pc gaming is that multitude of hardware configurations. for pc gaming, directx is here to stay as long as we all build our own computers with different (and upgradeable!) parts.
From what I understand, some devs are not happy that the rendering path is entirely dependent on the CPU, meaning that for each object that needs to be rendered, the GPU has to wait for the CPU to feed it data. So they would like an alternative, something like a kernel that runs on the GPU itself and bypasses the CPU for the fetch and decode instructions.
Personally I find it kind of weird considering that the majority of games are written for Dx9 and very few games really stress your CPU.
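To make that concrete, here's a toy cost model (all numbers invented for illustration, not measured from any real driver) of what happens when every draw call has to be fed through the CPU first:

```python
# Toy model of a CPU-bound rendering pipeline (illustrative numbers only).
# Each draw call costs the CPU a fixed overhead (validation, state setup,
# user/kernel transitions); the GPU can't start an object until the CPU
# has submitted it, so the slower side of the pipeline sets the frame time.

def frame_time_ms(num_draw_calls, cpu_us_per_call=10.0, gpu_us_per_call=2.0):
    """Rough frame time when submission is serialized through the CPU."""
    cpu_ms = num_draw_calls * cpu_us_per_call / 1000.0
    gpu_ms = num_draw_calls * gpu_us_per_call / 1000.0
    # The bottleneck dominates; here it's the CPU, so a faster GPU changes nothing.
    return max(cpu_ms, gpu_ms)

# At a hypothetical 10 us of CPU overhead per call, 3,000 calls already eat
# ~30 ms -- a whole 30 fps frame budget, regardless of GPU speed.
print(frame_time_ms(3000))   # -> 30.0
```

The point of the sketch: once the CPU side of `frame_time_ms` dominates, throwing GPU horsepower at the problem does nothing, which is the complaint the devs are making.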
the real reason why we get crappy pc games is because the ps4 and the xbox 360 successor aren't out yet. that's the sad truth. pc gamers don't mean anything to developers. if they cared about us, all games would support dx11. i don't think anything is going to change until the next-gen consoles.
it's also clear that amd and nvidia are small-time companies compared to microsoft and sony.
the gpu manufacturers should pay more to the developers so that they design games that can really take advantage of the horsepower of the new video cards. that would encourage more people to upgrade their systems, and it would eventually end up in bigger sales for pc titles too.
but most developers are in it just for a quick buck, take crytek for example
Wow, great find there. I always knew that DirectX took a heavy performance hit, but didn't know it was quite this heavy. :shadedshu
I first noticed it a decade ago when comparing Unreal Tournament on my Voodoo 3: using Glide gave about twice the performance of DX7. Similar thing with Half-Life and OpenGL vs DX. It can be the difference between doing 30-45 fps and 60-80 fps, i.e. juddery and rubbish or liquid smooth.
Also, despite those consoles having a lower spec than PCs, I did wonder why, the few times I saw their games, they appeared to have more wow factor. Well, now I know: it's the difference between 3,000 draw calls on a PC and 20,000 on a console. That's a vast difference.
Unfortunately, as others have pointed out, you need the API for stability and compatibility across a wide range of hardware. So that's the price we pay, I guess.
I'd just love to see what a GTX 580 can do maxed out, without this overhead, wouldn't you?
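Out of curiosity, here's a back-of-envelope sketch of where those draw-call figures could come from. The per-call overheads are hypothetical, picked only to land near the 3,000 vs 20,000 ballpark quoted above:

```python
# Back-of-envelope: how many draw calls fit in one frame, given a fixed
# CPU cost per call? Overhead values are made up for illustration.

def draw_call_budget(frame_ms, overhead_us_per_call):
    """Max draw calls per frame if each call burns the given CPU time."""
    return int(frame_ms * 1000 / overhead_us_per_call)

frame_ms = 33.3                  # ~30 fps frame budget
pc_overhead_us = 11.0            # thick API/driver stack (hypothetical)
console_overhead_us = 1.7        # near-the-metal submission (hypothetical)

print(draw_call_budget(frame_ms, pc_overhead_us))        # ~3,000 calls
print(draw_call_budget(frame_ms, console_overhead_us))   # ~19,600 calls
```

Same frame budget, same GPU; only the per-call CPU tax differs, and the budget moves by almost an order of magnitude.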
or any card for that matter. it's really sad, but there's not enough money to push development at this point; there's no point when most of the money is elsewhere (not gonna say it).
imho the guy is right: there are only a few engines that games actually use, and it seems to me that most are done with the unreal engine. and there are only 2 choices in the gfx card dept, so how hard would it really be to optimise per gpu, ctm-style, then reuse that engine many times? they do it anyway, the lazy gets lol.
and/or couldn't microsoft rewrite the dx core with less reliance on the cpu, and possibly ditch all the old code kept in there for old-game compatibility instead of building shit on top of shit, and hence pass us back the performance we've all paid for?
i don't see why you can't run separate DX code paths in an OS.
example: DX9-11 could be legacy code. when a game runs, the OS detects the DX version needed, correct? run that specific code, then move on to DX12, which offers low-level support.
i see no reason the 2 can't run side by side and offer both sides of the coin. it's all coding and software; you can do anything with that if you're willing to put in the time and R&D. after all, no one wants to sacrifice full backwards compatibility, so again, have both code bases: 1 is CPU-dependent, aka the DX9-10-11 code path, and DX12 can be based on a kernel meant to run on GPUs. the solution doesn't seem all that hard to me; the hard part is getting every giant corporation to agree and go along with it.
I mean, think about it: we used to run Glide, OpenGL and Direct3D alongside each other, so why not 2 different versions of DirectX, 1 CPU-driven, the other with low-level hardware access to GPUs, as the article was stating? seems like a quick fix, but one that makes everyone happy.
PC devs can push the boundaries, console devs can still port their shit, and old games will still run. seems like a win-win from my perspective
Let's say they can code that way: would they need to optimize the low-level code to run on every architecture and their variations (G80, G92, etc., RV680, RV770, RV870, etc., etc.)? And don't forget about the mid- and low-end cards. And when a new card is out, they would need to release a patch to support it.
It would be awesome, but devs don't bother to optimize their games now; I don't see them doing lots of R&D so the game runs on a wide variety of VGAs.
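Roughly what that maintenance problem looks like in code (a sketch with made-up architecture names and renderers, not any real driver interface): every chip either gets its own tuned path or falls back to something generic until a patch ships.

```python
# Sketch of per-architecture code paths with a generic fallback -- the kind
# of dispatch table a "to the metal" engine would have to maintain itself.
# Architecture IDs and renderer behavior are purely illustrative.

OPTIMIZED_RENDERERS = {
    "G92":   lambda scene: f"G92-tuned path: {scene}",
    "RV770": lambda scene: f"RV770-tuned path: {scene}",
}

def render(gpu_arch, scene):
    # Any chip not in the table (e.g. a brand-new card) silently drops to a
    # slow generic path until the devs ship a patch that adds an entry.
    backend = OPTIMIZED_RENDERERS.get(
        gpu_arch, lambda s: f"generic fallback: {s}")
    return backend(scene)

print(render("RV770", "frame"))   # known chip: tuned path
print(render("GF100", "frame"))  # new chip: generic fallback
```

Every new GPU means another entry in that table, in every engine, forever — which is exactly the per-card patching burden described above.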
If DirectX is the problem, AMD, Nvidia and Microsoft should form a team to develop a better DirectX.
DirectX – You Can’t Live With it, You Can’t Live Without it
Ok, I read the article start to finish. It gives a rather skewed view. The "draw calls" thing...
totally this ^
DirectX and OpenGL work exactly the same here - rooted in the days when CPUs *were* much more powerful than graphics chips; the mid 1990s. The CPU actions all rendering tasks in batches. The GPU then crunches out textured polys. Hardware T&L shifted the work to GPUs a little (geometry to perspective-view calcs), as do shaders (maths-based effects), but CPU software still controls the pipeline, and feedback routines from the GPU are a major interruption.
Anyhow the solution is to build an engine that puts more work into fewer batches. Of course that's not always convenient or easy, but it does work. That is where the article is most misleading. A console game engine that relies on sending many small data sets - which is ok using some custom hardware route on PS3, XBOX - is gonna chug when the CPU must get involved in a PC environment. Really I think that's more of an argument against porting console engines without majorly modifying them.
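The "more work into fewer batches" idea boils down to sorting by render state and merging. A sketch (made-up objects and materials; only the call-count reduction matters):

```python
# Batching sketch: one draw call per object vs. one per material/state group.
from itertools import groupby

objects = [("rock", "stone_mat"), ("wall", "stone_mat"),
           ("tree", "leaf_mat"), ("bush", "leaf_mat"),
           ("crate", "wood_mat"), ("door", "wood_mat")]

# Naive path: every object is its own draw call (and its own CPU overhead).
naive_calls = len(objects)

# Batched path: sort by material so identical state is bound once, then
# issue a single call per group of merged geometry.
batched = [(mat, [name for name, _ in group])
           for mat, group in groupby(sorted(objects, key=lambda o: o[1]),
                                     key=lambda o: o[1])]
batched_calls = len(batched)

print(naive_calls, batched_calls)   # 6 draw calls shrink to 3
```

The per-call CPU overhead is paid per batch, so halving the call count roughly halves the CPU side of submission; that's the whole trick, and it's why engines designed around many tiny batches port so badly.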
Increasing the efficiency of DX and OpenGL would of course be a good thing, but to 'Make the API go away', as the AMD guy in the article says, would be insanity from my POV as a programmer. I mean, imagine COD:MW5 or something saying "works on GTX 7XX and Radeon 20XXX only" because it was written closely around those chips. Not to mention that devs don't have the resources to basically code drivers into their engines. Oh, and the security risk (i.e. crashes, memory corruption).
That was tried with DX10. nVidia didn't want to wait, and we got DX9c instead. Consoles use these chips, slightly modified, but you can almost literally blame the performance disparity between platforms entirely on nV pushing out capable hardware too early.
The flip-side, of course, is that consoles are still doing as well as they are thanks to nV pushing forward-looking technology earlier.
Plain and simple, nV officially stated that Microsoft doesn't know hardware, so what Microsoft wanted to do with DX10 just wasn't the right way to do things.
..and here we sit: exponential real-world hardware performance increases, while usable performance increases are down to outright mediocre.
nVidia then went on to exploit what Microsoft put forward in 10.1, said "screw you, Microsoft" once again, and that initiative failed yet again.
Now we have CUDA, and nVidia has effectively held back Microsoft from implementing what truly is the proper platform.
So now nVidia holds patents that prevent other makers from implementing the same sort of software-hardware integration with DX (the CUDA API library), so DX has to go, so that nVidia no longer holds so much control.
the problem is that the replacement...it's gonna be filled with the same problems, no matter what.
it's just propaganda.. amd is looking for a "directx 12" that they can use to promote their new products.
going back to close-to-the-metal development means going back to the times of DOS: every developer had to write their own engine, their own directx, their own hardware support. windows gaming & directx are successful because anybody with any budget can produce titles.
for every serious developer, the bottom line of this article is "screw the pc and develop for consoles" - which is what almost everybody is doing
They never tried to seriously develop a decent DirectX. All you said is true, but it would help all of them sell more. I'm not going to replace my 5850 anytime soon; I mean, a 3-4 year old 8800GT/GTS/GTX can run Crysis 2 quite well according to Nvidia (I have one and it's true). Now we have a new enemy to blame, apparently: DirectX and consoles.
But as I said before: DirectX – You Can't Live With it, You Can't Live Without it
oh, and has AMD/ATI ever succeeded in bringing any graphics API/interface/paradigm to market successfully? I can only think of the failures.
Making a new API would just give you new overhead.
It seems to me that either you write your own optimized API for each game engine or you suck it up and use DirectX.
Perhaps we should lay down a gauntlet for AMD's Richard Huddy: get a crack team of coders and some top-notch hardware in one fixed configuration, and see how soon AMD can create an engine which runs "on the metal".
I mean, that's basically what Sony, Nintendo, and MS "do" with consoles... and it takes them 5+ years between generations.
I kinda get the feeling the guy probably didn't mean exactly what he said
Yeah, AMD/ATi seemingly lack in the software department, yet at the same time nV truly excels at it. They've managed to leverage this ability in their staff time and again, and then we hear the calls to change it all, so that someone else might have a chance at the same success.
I kinda wish that Microsoft would leave the API development to nVidia only, nVidia would leave the hardware to AMD/ATI and Intel only (who license the API from nV), and everyone would end up happy.
Dunno how well that would work business-wise, but with Mr J.H. saying nV is a software company, it kinda seems like the direction it's headed anyway.
What I'd like to see is a stripped-down, modular API that can be customized according to what you need, so that every dev can make their own version and only add the things they need to realize their particular vision.
Hardware-accelerated upscaled video and a 3D title at the same time, using two sound devices, with the GPU rendering both items with no issues? Consoles can't do that; graphics cards can.
The API is old and puts the cart in front of the horse, but it allows for a lot that we want a PC to do.
an interesting read; it shows the eventual path GPUs will take.
When game developers treat PC games like consoles, is it any wonder you get articles such as these? The idea that programming directly to the hardware for PC games would be something "good" compared to DX makes the intent of such a statement suspect.
Does anyone really think, if it was this bad, that DX would have lasted this long? The only thing that makes sense to me as to why ATI (of all companies, only Intel has a worse track record with drivers) would suggest such a thing is to somehow help move OpenCL along. The nightmare that would ensue from having no DX API on the PC would end in a subsequent nightmare for the driver team.
One thing Bit-Tech fails to point out in the article is that console games are heavily CPU-intense because the GPU power is circa 2003 (roughly). When games get moved over to the PC, the excess of power available begs the question of why take the time to tweak the game (plus most sales are seen on consoles)? GTA IV was probably the most recent failure of this approach that I had the displeasure of taking part in when the game came out.
Not to defend ATI (because I agree with you) but nVidia (at best) buys APIs. Though at least they can maintain them.
I agree with Wiz; it's just marketing BS. They are trying to talk up a new wave of sales. Perhaps it's a DX12, or perhaps it's just consoles.
The problem that AMD and nVidia have is that only a small percentage of games really need much GPU power at all, and the majority of games will soon be easily available via the cloud. If the average Joe gamer can play 80% of his games using an integrated Intel unit and a subscription to a live service, where will the console sales go? Phones?
The GPU manufacturers need to get used to the idea that it might be only hardcore PC-based DX buyers left in the market in a few years' time, a thought I'm sure they dread. In the meantime, why not make up a story about DX11 being crap to discourage game developers from leaving the herd?
wow, just wow. DirectX is holding us back?? wtf lol