
(We introduce) CPUGFX - Using CPU Cores to improve visual experience

Discussion in 'Programming & Webmastering' started by minx, Aug 8, 2014.

  1. minx

    minx

    Joined:
    Aug 26, 2013
    Messages:
    38 (0.09/day)
    Thanks Received:
    40
    Location:
    Deutschland
    Hi!

    Currently, we're developing an experimental framework called CPUGFX. CPUGFX enables developers to use a custom number of CPU cores in exactly the same way as a GPU.

    An example configuration:

    [IMG]

    Assume a PC with an octa-core CPU runs a game that puts nearly 100% load on the GPU, but not on the CPU. You can then use the idle CPU cores to calculate additional, non-crucial effects that improve the player's visual experience. CPUGFX's performance (what a surprise) scales with the number of cores available.

    Shaders and effects that should be calculated on the CPU are simply passed to the CPUGFX framework, which takes care of translating the shader source (e.g. GLSL) into CPU instructions. CPUGFX adds no overhead once the application is compiled, because the shaders are precompiled.

    Small, fast shaders that rely on CPU-based physics can be offloaded to CPUGFX in order to eliminate the GPU-CPU communication overhead (except for APUs / integrated graphics).

    Here is a famous GLSL shader, compiled by CPUGFX, that creates a realistic iris (this is just a snapshot; the iris reacts to light exposure when used in-game):

    [IMG]

    The following is a sandbox test of the CPUGFX iris shader in a DirectX test environment. This test uses the CPU simultaneously for both smoke physics and the iris shader:

    [IMG]
    (we know that this is not a realistic scene :rolleyes: )

    The current state of the project: it works in most cases. :D That means we still have a lot to improve before we can create a serious tech demo, but we're not that far away. :cool:
     
  2. OneMoar

    OneMoar

    Joined:
    Apr 9, 2010
    Messages:
    3,391 (2.07/day)
    Thanks Received:
    873
    Location:
    Rochester area
    software rendering you aren't serious .... rofl
     
    hellrazor says thanks.
  3. minx

    minx

    Joined:
    Aug 26, 2013
    Messages:
    38 (0.09/day)
    Thanks Received:
    40
    Location:
    Deutschland
    We are. Using spare computing power to improve performance is a serious topic.
     
  4. OneMoar

    OneMoar

    Joined:
    Apr 9, 2010
    Messages:
    3,391 (2.07/day)
    Thanks Received:
    873
    Location:
    Rochester area
    Modern game engines such as Unreal 4 and Frostbite 3 already do this to an extent.
     
  5. minx

    minx

    Joined:
    Aug 26, 2013
    Messages:
    38 (0.09/day)
    Thanks Received:
    40
    Location:
    Deutschland
    I'm aware of that. This is just a new way to do it, and it's easy to include this functionality in all sorts of engines. Not only game engines, but every DirectX or OpenGL engine. We're developing at a lower level than I think you think we are: we are developing driver plugins, modifying the underlying mechanisms of every engine.
     
    AthlonX2 and Solaris17 say thanks.
  6. AthlonX2

    AthlonX2 HyperVtX™

    Joined:
    Sep 27, 2006
    Messages:
    7,179 (2.45/day)
    Thanks Received:
    1,664
    Don't mind OneMoar, he's a troll. :)
     
  7. minx

    minx

    Joined:
    Aug 26, 2013
    Messages:
    38 (0.09/day)
    Thanks Received:
    40
    Location:
    Deutschland
    Well I hope he isn't, judging by his post count :p </sarcasm>.

    We're looking forward to releasing a public tech demo, but as of now the running examples have far too many dependencies to run on another machine ;) . Plus: we're coders, not 3D artists, so it takes a bit longer to produce something visually appropriate :D
     
  8. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,279 (11.59/day)
    Thanks Received:
    9,563
    Until this actually works, you're mostly going to get trolls on the internet. Too many companies with genius ideas generate hype for investors, and then nothing ever comes of it.
     
    Pehla and OneMoar say thanks.
  9. minx

    minx

    Joined:
    Aug 26, 2013
    Messages:
    38 (0.09/day)
    Thanks Received:
    40
    Location:
    Deutschland
    As if money were involved in this :laugh: . This is rather a do-it-because-we-can project, and I don't think you'll ever see it officially supported by anyone or anything. We do this in our free time and we'll open-source the result. :)
     
    MxPhenom 216 says thanks.
  10. MxPhenom 216

    MxPhenom 216 Corsair Fanboy

    Joined:
    Aug 31, 2010
    Messages:
    10,001 (6.70/day)
    Thanks Received:
    2,244
    Location:
    Seattle, WA
    This sounds like something Nvidia would try to do. Cool.
     
  11. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,279 (11.59/day)
    Thanks Received:
    9,563

    Wasn't PhysX the exact OPPOSITE of this? XD
     
    minx says thanks.
  12. Aquinus

    Aquinus Resident Wat-man

    Joined:
    Jan 28, 2012
    Messages:
    6,352 (6.49/day)
    Thanks Received:
    2,118
    Location:
    Concord, NH
    I know I'm late to the party, but I felt I should comment.

    It's an interesting idea, but I'm not sure the added complexity will actually yield tangible benefits. Shaders and GPGPU work well because the shader cores are all identical, there are a ton of them, and they're very close together on the die. If you bring a CPU into the mix, you're adding coordination overhead, you're adding frame latency to compose the entire scene, and you're generally making the engine bigger (code-wise) and harder to change.

    I personally would advocate for a simpler 3D engine, because as it stands right now, programming OpenGL in any language other than C/C++ is a bear, and using XNA complicates Linux support.

    As a developer, I would like a 3D library that's simple, fast, and has an API that matches the paradigms of the language it's written in. Personally, I develop in Clojure, a JVM language with access to all the libraries the JVM has to offer in addition to Clojure libs. Unfortunately, there are very few OpenGL wrapper libraries for Java, and the ones that do exist are a one-for-one mapping of OpenGL C functions to Java, which is sub-optimal.

    Sometimes making the engine (or any library or piece of software) more complex is only a hindrance to the developer who has to work with it.

    With that all said, I hope you gain something out of this but I personally don't think it will yield the results you intend to gain. Good luck. :toast:
     
    Last edited: Aug 30, 2014
  13. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,279 (11.59/day)
    Thanks Received:
    9,563
    Any updates on this, Minx?


    Or is this turning into another vaporware thread...
     
  14. minx

    minx

    Joined:
    Aug 26, 2013
    Messages:
    38 (0.09/day)
    Thanks Received:
    40
    Location:
    Deutschland
    Nope. I'm in hospital right now because of a serious eye illness, so I can't stare at my screen as long as I'd like to. Sorry for any delays, but yes, we are still working on this.
     
  15. Suka

    Suka

    Joined:
    May 6, 2013
    Messages:
    54 (0.11/day)
    Thanks Received:
    7
    Location:
    Steam Servers
    Well, get well soon :peace: . I'm looking forward to the final product.
     
    minx says thanks.
