
The DirectX Performance Overhead

Discussion in 'Graphics Cards' started by HalfAHertz, Mar 19, 2011.

  1. HalfAHertz

    HalfAHertz

    Joined:
    May 4, 2009
    Messages:
    1,886 (0.97/day)
    Thanks Received:
    378
    Location:
    Singapore
    No, they'll only have to code it for each architecture. Nowadays both Nvidia and Ati use some form of scalar architecture, meaning the lower-performance cards are derived from the higher-performing ones. So if we say game devs need to support cards two generations back, that'd mean six profiles at the most. It sounds like a lot, but it's not unrealistic at all.
    And unlike on the consoles, you'd still have control over all the graphical settings and would be able to scale them up or down individually according to your needs.
    Still, I'm speculating now, and we won't know for certain until somebody tries it.
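    The per-architecture idea above could be sketched as a simple profile lookup: one hand-tuned code path per GPU family, selected at startup. Everything here is hypothetical for illustration; the profile names and family list are invented, not real engine code.

```python
# Hypothetical sketch: one renderer profile per GPU architecture
# family, roughly matching the "6 profiles at most" estimate above.
# All names are invented for illustration.

RENDER_PROFILES = {
    # (vendor, family) -> hand-tuned render backend id
    ("nvidia", "fermi"):   "profile_nv_fermi",
    ("nvidia", "tesla2"):  "profile_nv_tesla2",
    ("nvidia", "tesla1"):  "profile_nv_tesla1",
    ("amd",    "cayman"):  "profile_amd_cayman",
    ("amd",    "cypress"): "profile_amd_cypress",
    ("amd",    "rv770"):   "profile_amd_rv770",
}

def select_profile(vendor, family):
    """Pick the tuned code path for the detected GPU, or fail loudly."""
    try:
        return RENDER_PROFILES[(vendor.lower(), family.lower())]
    except KeyError:
        raise RuntimeError(f"unsupported GPU: {vendor} {family}")
```

    The point of the sketch is the size of the table: two vendors, two generations back, gives a handful of entries, not hundreds.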
  2. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,788 (3.93/day)
    Thanks Received:
    11,490
    Even within the same architecture there are lots of differences between GPU models; look at all the updates GPU-Z needs to support new cards.

    What developers would have to do is the combined job of ATI, Nvidia, Intel, S3 and Microsoft: write a graphics driver, and write a DirectX with the features they need.

    No developer can afford that.

    Also, there is very little public info on how to program modern GPUs. Look at the efforts of the Linux community to make an accelerated 3D driver; that's what every games studio would have to do.
    mastrdrver, yogurt_21 and qubit say thanks.
  3. HalfAHertz

    HalfAHertz

    Joined:
    May 4, 2009
    Messages:
    1,886 (0.97/day)
    Thanks Received:
    378
    Location:
    Singapore
    Keep in mind we're talking about an imaginary situation. I completely agree with you. Currently there is no info and no support for such a thing, but that's because there isn't a need (read: market) for it. However, if things were different and we imagine there was no DX, I bet there would be other languages, tools and kits to take over and speed the design process up, plus lots and lots of support from the hardware producers.
  4. inferKNOX

    inferKNOX

    Joined:
    Jul 17, 2009
    Messages:
    899 (0.48/day)
    Thanks Received:
    118
    Location:
    SouthERN Africa
  5. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    19,838 (6.19/day)
    Thanks Received:
    5,943
    Yeah, I stopped reading after that, because the idiot writing this article has no idea what he is talking about.

    Does he not realize that consoles render all their games at 720p and upscale? (Some games even lower; I'm looking at you, FFXIII on Xbox.) Shit, a 9800GTX can max out Crysis at 720p, and the graphics blow the doors off anything on a console.
  6. Captain.Abrecan

    Captain.Abrecan New Member

    Joined:
    Oct 21, 2010
    Messages:
    175 (0.12/day)
    Thanks Received:
    64
    Location:
    MA
    Yeah, he also contradicted himself. "They look better," then "don't beat console graphics"...
    what a jackass :shadedshu
  7. crazyeyesreaper

    crazyeyesreaper Chief Broken Rig

    Joined:
    Mar 25, 2009
    Messages:
    8,137 (4.10/day)
    Thanks Received:
    2,778
    Location:
    04578
    Well, he's semi-right. Look at most games today:

    Darksiders,

    almost any Unreal-engine-based game. Almost all games come with a texture upgrade, and you can tweak features. Sure, you get higher resolutions, but the change is tiny in many games now, because devs have a large install base happy with "good enough" graphics. Sure, we want more, and in some titles we get it, but PCs are such an afterthought now that most hardware is irrelevant. Crysis 2 can be maxed at 1920x1200 on a 9800GTX in DX9; sure, we get DX10 or 11 later, but not at release. That's more what it's targeting.

    We have the hardware today, and it can do amazing things, but how many games are actually amazing? List me some games that are phenomenal in both graphics AND gameplay, that use the hardware we have to its abilities, that showcase something new. He's not so much a jackass as being realistic.

    Sure, our graphics look better, but they don't truly beat consoles either.

    Look at the games these days that lack features like anti-aliasing from the get-go, or that remove features and lock them to certain vendors, and those features work console-side but not PC-side. Look at the bigger picture: side by side, PCs win, but owning both a console and a high-end rig, I can see what realistic settings are for most users, and in that situation the games play worse, look the same, and in general feel like shit ports.

    You can get mad about it or deal with it; the situation is what it is. The graphics world is stagnant.

    I pay to play with maxed-out effects, but I can tell you that in many titles my hardware is utterly wasted, and as newtekie knows, a fun experience is doable on much more modest means. And sure, we can raise the resolution, but higher resolutions don't cover up console ports' lacking textures, etc. Even the bigger games today start console-first, PC second if we are lucky.

    And besides, supposedly all this posturing in the article is words taken out of context. Granted, I don't buy that, but overall AMD isn't wrong, though they're not right either. There are exceptions to the rule, but for just a moment think about it: if you could write code and render graphics straight to the GPU, with no API in the way and no cumbersome overhead, how far could they raise the bar with near 90% efficiency instead of the maybe 30-40% we get if lucky?
    Last edited: Mar 24, 2011
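    One way to make that efficiency argument concrete is as a draw-call budget. The per-call overhead figure below is an assumption for illustration, not a measured number; the point is only how the fraction of the frame lost to API/driver overhead scales the number of draws you can afford.

```python
# Back-of-envelope draw-call budget for the efficiency claim above.
# The 30 us per-call overhead is a made-up illustrative figure.

FRAME_BUDGET_MS = 1000.0 / 60  # ~16.7 ms per frame at 60 fps

def max_draw_calls(per_call_overhead_us, budget_fraction):
    """Draw calls that fit if `budget_fraction` of the frame time
    is available for useful work rather than lost to overhead."""
    budget_us = FRAME_BUDGET_MS * 1000 * budget_fraction
    return int(budget_us // per_call_overhead_us)

# With a hypothetical 30 us cost per call:
api_calls    = max_draw_calls(30, 0.35)  # ~35% effective, through the API
direct_calls = max_draw_calls(30, 0.90)  # ~90% effective, direct access
```

    Under these made-up numbers the direct path fits roughly 2.5x as many draw calls into the same frame, which is the shape of the argument being made, whatever the true constants are.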
  8. ctrain New Member

    Joined:
    Jan 12, 2010
    Messages:
    393 (0.23/day)
    Thanks Received:
    72
    I don't know where people are getting these insane overhead numbers from; it's already possible to push right up against theoretical hardware limits in contained situations. Can you get into these insane every-unit-and-subunit-utilized-to-the-max-at-all-times scenarios on PC? No, and it's doubtful you ever will. It's just not feasible, low-level access or not.


    And incredibly enough, if you pay no attention whatsoever and try a naive port of an engine designed from the ground up for a completely different hardware architecture, it might run like shit for seemingly no good reason. It's not the API's fault; it's the developers' for simply not giving a shit.

    Look at what happens when it's the other way around. Valve games ran like fucking shit on the consoles and pull about 3 million fps on any remotely decent computer. Even 360 -> PS3 ports often don't fare well. Oh man, low-level access totally saved Black Ops from being 20+ fps lower at times vs the 360 version. Capcom's engine and games are ported from PC to consoles and it shows: the console versions struggle to maintain 30 fps (with the PS3 chugging, per usual, more than the 360), meanwhile I plow out over triple that framerate at far higher settings.
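    The usual engine-side answer to API overhead, as opposed to blaming the API, is batching: pay the per-call cost once per group of objects that share state instead of once per object. A minimal sketch, with invented data structures:

```python
# Sketch of draw-call batching: group draws that share the same
# material and mesh so each group can go out as one instanced call.
# The (material, mesh, transform) tuples are invented for illustration.

from collections import defaultdict

def batch_draws(draws):
    """Group draw requests by (material, mesh); each group can then be
    submitted as a single instanced draw instead of many small ones."""
    batches = defaultdict(list)
    for material, mesh, transform in draws:
        batches[(material, mesh)].append(transform)
    return batches

draws = [
    ("stone", "rock_mesh", (0, 0)),
    ("stone", "rock_mesh", (5, 2)),
    ("wood", "crate_mesh", (1, 1)),
]
batches = batch_draws(draws)  # 3 naive calls collapse into 2 batches
```

    This is the kind of per-platform tuning a careful port does and a naive port skips, which is the distinction the post above is drawing.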
