Thursday, March 7th 2013

NVIDIA Announces PhysX and APEX Support for Sony PlayStation 4

NVIDIA today announced support for Sony Computer Entertainment's PlayStation 4 with the popular NVIDIA PhysX and NVIDIA APEX software development kits (SDKs). Game designers use PhysX and APEX technologies for collision detection and simulation of rigid bodies, clothing, fluids, particle systems and more across a wide range of platforms, including desktop PCs, game consoles, and mobile and handheld devices.

NVIDIA PhysX technology is the world's most pervasive physics solution for designing real-time, real-world effects into interactive entertainment titles. The PhysX development environment gives developers unprecedented control over the look of their final in-game interactivity.

Taking PhysX technology content creation to the next level, NVIDIA APEX technology lets artists create intricate physics-enabled environments. They can expand the quantity and visual quality of destructible objects; make smoke and other particle-based fluids integral to game play; and create life-like clothing that interacts with the character's body to achieve more realism in their games.
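The clothing and particle effects described above boil down to classic mass-spring simulation. As an illustration only (not the APEX API), here is a minimal sketch of the Verlet-integration step and distance constraint that cloth solvers of this kind are built on, reduced to 1D for brevity:

```python
# Illustrative 1D mass-spring cloth primitives -- NOT the NVIDIA APEX API.

def verlet_step(pos, prev_pos, dt, accel=-9.81):
    """Advance one particle: x' = 2x - x_prev + a*dt^2 (Verlet integration)."""
    new_pos = 2 * pos - prev_pos + accel * dt * dt
    return new_pos, pos  # new position; current position becomes the previous one

def satisfy_constraint(p1, p2, rest_length):
    """Nudge two linked particles back toward their rest separation."""
    delta = p2 - p1
    sign = 1 if delta > 0 else -1
    correction = (abs(delta) - rest_length) * 0.5 * sign
    return p1 + correction, p2 - correction
```

A real cloth solver runs these two steps over a 2D grid of particles every frame, iterating the constraint pass several times for stiffness.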

"Great physics technology is essential for delivering a better gaming experience and multiplatform support is critical for developers," said Mike Skolones, product manager for PhysX at NVIDIA. "With PhysX and APEX support for PlayStation4, customers can look forward to better games."

NVIDIA PhysX and APEX technologies are designed to run on a variety of CPU architectures and can be accelerated by any CUDA architecture-enabled NVIDIA GPU, GeForce 8-series or higher.

102 Comments on NVIDIA Announces PhysX and APEX Support for Sony PlayStation 4

#1
Bunchies
Isn't the PS4 using AMD hardware? Wait, wtf, why do I even care lol

Just makes no sense
#2
newtekie1
Semi-Retired Folder
by: buildzoid
Yeah, because so far its implementations are limited to aesthetics. If it were used for highly detailed environment destruction (Red Faction style but much bigger, something like leveling a skyscraper) or something similar, then the requirements would start stacking up.
Yeah, but you're still not even coming close to approaching the 3 TFLOPS the GTX 680 is capable of pumping out, and in that scenario the amount of crap on the screen that would need to be rendered would bring any graphics setup to its knees, so the PhysX calculations would still be a minor part.
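For what it's worth, that 3 TFLOPS figure checks out as a theoretical peak, assuming the GTX 680's published specs (1536 CUDA cores at ~1006 MHz, two FLOPs per core per cycle via fused multiply-add):

```python
# Back-of-envelope peak single-precision FLOPS for a GTX 680 (published specs).
cores = 1536                  # CUDA cores
clock_hz = 1006e6             # base clock, ~1006 MHz
flops_per_cycle = 2           # one fused multiply-add = 2 FLOPs
peak_tflops = cores * clock_hz * flops_per_cycle / 1e12
print(round(peak_tflops, 2))  # about 3.09
```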


by: Phusius
Do we even want PhysX? When I tried out a GTX 680, Arkham City generally ran fine, but when PhysX was enabled it still dipped into the 20s and 30s FPS at times... and that was a 680...

You pretty much have to have a separate card for PhysX to enable it on High in games like Arkham City... so I dunno.
I'd love to see both major next-gen consoles supporting it; perhaps then we'd see developers actually start to use it for more than just useless smoke effects.

But in the end we're only going to see CPU-driven PhysX, so meh...

And the performance issues, at least on my GTX 670, were graphical, not actually caused by PhysX. There is just so much extra shit on the screen that has to be rendered that it bogs down the card. Dropping a dedicated GTX 470 into my system for PhysX didn't help.
#3
Fiendish
It seems some people need to go back and learn the actual history behind PhysX; it might help clarify things. http://physxinfo.com/wiki/Main_Page

It should also be remembered that when Nvidia created CUDA, their competitor, ATI, had already created and was pushing its own proprietary standard, Close to Metal.
#4
[H]@RD5TUFF
Doesn't matter the PS4 will still blow fat donkey dicks
#5
NeoXF
Holy crap, I find it amazing that you people still refuse to call the PS4 hardware what it is... an APU... meaning whatever the CPU does, the GPU can muscle it up and accelerate it. There is no "CPU and GPU", there is "APU".

There is no spoon


Edit: God this thread suuuuuuuuuuuuucks...
#6
oNyX
You mean to tell me that the Asus ARES II cannot render Mafia 2's attic dust and pre-rendered rocks that fall on the floor when you shoot a wall, just because it doesn't support PhysX? :nutkick:

I have two Nvidia cards and not once have I really needed PhysX. I still prefer CPU-based physics and/or engines like Havok. When you're high on Monster energy drinks at a LAN or playing online with your console, nobody cares about PhysX. Unless some kid high on Rockstar energy drinks happens to mention it. You'll probably get a few guys high on SCORE energy drinks ignorantly screaming "they chose Nvidia because it has PhysX" or "I chose Xbox 360 because it's got GOW3." Well, congrats to you, buddy. :rockout:

By the way, TressFX isn't physics like Nvidia's PhysX; it's just real-time hair and foliage rendering, and the rest could be Havok-powered. More and more games are falling under the AMD banner, including some that used to be Nvidia's, so it's clear that Nvidia is turning its attention away from the gaming industry and more towards those floor tiles known as tablets.

I've been an Nvidia user, but not for much longer. My next upgrade WILL be AMD Radeon, and I have reasons other than going PhysX-free, fanboyism, or ignorance for choosing AMD.
#7
Rebel333
What does Nvidia want with physics on the PS4? If I know it well, there isn't a single piece of Nvidia hardware in it. :D
#8
arnoo1
Wtf, PhysX on AMD hardware

I don't wanna live on this planet anymore xd
#9
Prima.Vera
by: arnoo1
Wtf, PhysX on AMD hardware

I don't wanna live on this planet anymore xd
On the contrary. Life is getting more and more interesting! ;):p
#10
TheLaughingMan
by: NeoXF
Holy crap, I find it amazing that you people still refuse to call the PS4 hardware what it is... an APU... meaning whatever the CPU does, the GPU can muscle it up and accelerate it. There is no "CPU and GPU", there is "APU".

There is no spoon

Edit: God this thread suuuuuuuuuuuuucks...
If your only comment is that the thread sucks, then why are you reading it? And that is not even remotely close to how GPU acceleration works. While AMD does want to move toward a heterogeneous processing system, we are far, far away from that being a possibility. So while a single chip houses both a GPU and a CPU, they are still separate in function.

by: Fiendish
It seems some people need to go back and learn the actual history behind PhysX; it might help clarify things. http://physxinfo.com/wiki/Main_Page

It should also be remembered that when Nvidia created CUDA, their competitor, ATI, had already created and was pushing its own proprietary standard, Close to Metal.
They were working on their own proprietary standard for physics calculations, a project that was never completed, as AMD decided instead to support a universal, close-to-metal physics system through open standards such as OpenCL. A noble move that to this day still has not produced anything universal, which is why everyone is still using Havok and PhysX where needed.

by: Bunchies
Isn't the PS4 using AMD hardware? Wait, wtf, why do I even care lol

Just makes no sense
You care because this is now an x86-64-based system like your computer, using AMD-designed hardware and DirectX 11+. This means that what runs on PS4 runs on PC with minor tweaks to expand CPU/GPU hardware support. This will make porting games between console and PC far easier for developers, leaving them no reason not to do so, with brand exclusives being the exception to that rule.

You care because this will result in our PC games (which are mostly ported from console) looking better, being less buggy, and giving us more titles.
#11
newtekie1
Semi-Retired Folder
by: NeoXF
Holy crap, I find it amazing that you people still refuse to call the PS4 hardware what it is... an APU... meaning whatever the CPU does, the GPU can muscle it up and accelerate it. There is no "CPU and GPU", there is "APU".

There is no spoon


Edit: God this thread suuuuuuuuuuuuucks...
I find it amazing that people have no clue how an APU works. Just because they put the GPU on the same die as the CPU doesn't mean that the GPU can magically start doing the same work as a CPU. They are still two very different pieces of hardware that operate in two very different ways, even if they are on the same piece of silicon.
#12
theoneandonlymrk
by: newtekie1
I find it amazing that people have no clue how an APU works. Just because they put the GPU on the same die as the CPU doesn't mean that the GPU can magically start doing the same work as a CPU. They are still two very different pieces of hardware that operate in two very different ways, even if they are on the same piece of silicon.
Whilst I appreciate what you're saying, and agree in principle when applied to PC APUs, I don't think it a sound statement when this is a next-gen APU with a unified IMC and memory. You can't say with certainty how Sony and devs will use it; this could well be the true beginning of the HSA revolution. After all, Sony isn't tied to using the hardware the way PCs, or PC OSes and APIs, do, so it's a bit early to say how much the GPU might generally be used. You might note that Sony is an HSA contributor, and there is still a founder spot free... or maybe there isn't.
#13
tokyoduong
by: theoneandonlymrk
Whilst I appreciate what you're saying, and agree in principle when applied to PC APUs, I don't think it a sound statement when this is a next-gen APU with a unified IMC and memory. You can't say with certainty how Sony and devs will use it; this could well be the true beginning of the HSA revolution. After all, Sony isn't tied to using the hardware the way PCs, or PC OSes and APIs, do, so it's a bit early to say how much the GPU might generally be used. You might note that Sony is an HSA contributor, and there is still a founder spot free... or maybe there isn't.
A GPU is always better at massively parallel work, and a CPU is better at low-latency, highly random, out-of-order instructions. You cannot change the physical design of a chip with software layers. Sony can dramatically optimize their code and compiler for this one specific design, but you will see that it will work the same way it currently works on the PC. The only definite conclusion we can reach right now is that the console version will be much more efficient, since it's a fixed spec.
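The CPU/GPU split described here can be seen in miniature: a data-parallel loop has no cross-iteration dependencies and splits cleanly across thousands of GPU threads, while a serial dependency chain cannot be split at all. A toy sketch:

```python
# Toy contrast between GPU-friendly and CPU-bound work -- illustration only.

def data_parallel(xs):
    """Each element is independent: trivially splittable across many cores."""
    return [x * x for x in xs]

def serial_chain(xs):
    """Each result depends on the previous one: the work cannot be split."""
    acc = 0.0
    out = []
    for x in xs:
        acc = acc * 0.5 + x  # carried dependency forces serial execution
        out.append(acc)
    return out
```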
#14
theoneandonlymrk
by: tokyoduong
A GPU is always better at massively parallel work, and a CPU is better at low-latency, highly random, out-of-order instructions. You cannot change the physical design of a chip with software layers. Sony can dramatically optimize their code and compiler for this one specific design, but you will see that it will work the same way it currently works on the PC. The only definite conclusion we can reach right now is that the console version will be much more efficient, since it's a fixed spec.
HSA is all about the integrated use of what's there. The key is unified memory, and the whole point is to use whatever is best at the job, which means exactly that: not moving all the work to the GPU, just what it's good at. That's easily possible with what they have.
Oh, and this one specific design is not so dissimilar from the phone you might own soon; some of the biggest players in mobile chips are into HSA as much as Sony and AMD are, so don't be blind to what's going on.
#15
TheHunter
Why are people still confused by this announcement?

It's only for the PS4 CPU SDK...
#16
theoneandonlymrk
by: TheHunter
Why are people still confused by this announcement?

It's only for the PS4 CPU SDK...

No, it's PR spin and "remember us"-manship.
#17
newtekie1
Semi-Retired Folder
by: theoneandonlymrk
Whilst I appreciate what you're saying, and agree in principle when applied to PC APUs, I don't think it a sound statement when this is a next-gen APU with a unified IMC and memory. You can't say with certainty how Sony and devs will use it; this could well be the true beginning of the HSA revolution. After all, Sony isn't tied to using the hardware the way PCs, or PC OSes and APIs, do, so it's a bit early to say how much the GPU might generally be used. You might note that Sony is an HSA contributor, and there is still a founder spot free... or maybe there isn't.
A next-gen APU with universal IMC and memory? That is how the current APUs work. There is one IMC that controls the system memory, and the GPU and CPU access what they need. The only thing new here is that the IMC is a GDDR5 controller instead of DDR3.

There is nothing revolutionary here that will suddenly allow a GPU to do work coded for a CPU. When the CPU load gets too high, the GPU can't just step in and help out like NeoXF is claiming.
#18
Slizzo
People need to remember that PhysX is a software AND hardware solution. When running on an AMD system, the game is still using a PhysX physics engine. Much like any game that uses Havok, it's a software physics engine.
#19
Prima.Vera
by: Slizzo
People need to remember that PhysX is a software AND hardware solution. When running on an AMD system, the game is still using a PhysX physics engine. Much like any game that uses Havok, it's a software physics engine.
While I agree with your statement, let's not forget that if you run PhysX in software mode, you have to be prepared for some serious FPS drops. Sure, you can run it without any hardware acceleration, but something tells me there's more to it than that with the PS4.
#20
Fluffmeister
by: Slizzo
People need to remember that PhysX is a software AND hardware solution. When running on an AMD system, the game is still using a PhysX physics engine. Much like any game that uses Havok, it's a software physics engine.
Exactly, at least someone gets it.
#21
newtekie1
Semi-Retired Folder
by: Prima.Vera
While I agree with your statement, let's not forget that if you run PhysX in software mode, you have to be prepared for some serious FPS drops. Sure, you can run it without any hardware acceleration, but something tells me there's more to it than that with the PS4.
No, you are still getting things confused. PhysX offers both a software and a hardware solution to developers. The software solution offers a lower level of effects; it basically brings PhysX down to what Havok can do. When developers use the software-only version, it runs on the CPU just fine. All games can use this mode, even ones that have the hardware-accelerated solution: when a game coded with the hardware solution in mind doesn't detect an nVidia PhysX-capable GPU, the game will drop down to software mode. That is why hardware-accelerated PhysX games are still completely playable on AMD hardware; the game just uses software mode. There is no FPS impact here. There are about 390 games that use the PhysX SDK, and most of them use the software version only.
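A hypothetical sketch of the dispatch behavior being described (the names are illustrative, not the real PhysX SDK API): the engine asks for hardware-accelerated effects and quietly falls back to the CPU path when no capable GPU is found.

```python
# Hypothetical backend selection -- illustrative names, not the PhysX SDK API.

def select_physx_backend(cuda_gpu_present, wants_hw_effects):
    """Return which physics path a game would run."""
    if wants_hw_effects and cuda_gpu_present:
        return "gpu-accelerated"  # full hardware effect set on the GPU
    return "cpu-software"         # baseline effects, runs fine everywhere

print(select_physx_backend(False, True))  # AMD-only system: "cpu-software"
```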

What you are talking about is the hardware solution being forced to run on the CPU, which nVidia gives you the option to do in their control panel; in that case, performance is crap. But that isn't what software PhysX is.
#22
theoneandonlymrk
by: newtekie1
Next gen APU with universal IMC and memory? That is how the current APUs work. There is one IMC that controls the system memory, and the GPU and CPU access what they need. The only thing new here is that the IMC is a GDDR5 controller instead of DDR3.

There is nothing revolutionary here that will suddenly allow a GPU to do work coded for a CPU. When the CPU load gets too high the GPU can't just step in and help out like NeoFX is claiming.
A CPU does what it does, and a GPU does what it does; this we agree on, and it needs no further mention.
I'm saying the OS and software that Sony chooses to build and use might well use the GPU (where appropriate) more than is presently done. How can you say that's wrong? Do you work for Sony? If not, you can't know.
And the revolution is via software, and via new instruction sets and languages made possible by slight but important hardware changes that allow them. Again, something that's too new to dismiss.
I don't need a lesson on PhysX or its use, as I've investigated it fully despite running Crossfire AMD graphics, and I have also looked into HSA etc.
Most chips that are made have circuits, functions, and whole instruction sets in them that are fused off; they do this to allow multiple end uses and to pre-test yield effects from new tech. Who's to say what's in there? Sony still hasn't disclosed every spec, IMHO.
#23
Bjorn_Of_Iceland
by: cadaveca
The funny thing about that response, to me, is that Sony highlighted porting physics calculation off of the CPU and onto the GPU in its premiere (1 million physics objects, or something). What Nvidia is offering, seemingly, is the exact opposite.

Puzzling.

I smell a $3-$5 per-copy licensing fee for devs that use it.
Or 2 cents for every in-game pebble.

by: Phusius
Do we even want PhysX? When I tried out a GTX 680, Arkham City generally ran fine, but when PhysX was enabled it still dipped into the 20s and 30s FPS at times... and that was a 680...

You pretty much have to have a separate card for PhysX to enable it on High in games like Arkham City... so I dunno.
TressFX hogs the framerate as well, even on an AMD GPU... not to mention that the performance penalty is very big given that it is only Lara's hair being rendered. I'd say we are pretty much stuck with those numbers in terms of framerate degradation for now (well... if they want persistent pebbles and floaty hair). At least nVidia gives you the option of a dedicated card to offload the computing... for PhysX, that is.
#24
newtekie1
Semi-Retired Folder
by: theoneandonlymrk
A CPU does what it does, and a GPU does what it does; this we agree on, and it needs no further mention.
I'm saying the OS and software that Sony chooses to build and use might well use the GPU (where appropriate) more than is presently done. How can you say that's wrong? Do you work for Sony? If not, you can't know.
And the revolution is via software, and via new instruction sets and languages made possible by slight but important hardware changes that allow them. Again, something that's too new to dismiss.
I don't need a lesson on PhysX or its use, as I've investigated it fully despite running Crossfire AMD graphics, and I have also looked into HSA etc.
Most chips that are made have circuits, functions, and whole instruction sets in them that are fused off; they do this to allow multiple end uses and to pre-test yield effects from new tech. Who's to say what's in there? Sony still hasn't disclosed every spec, IMHO.
Ok, I'll say it again: the GPU can't just magically do work designed for a CPU. That is what NeoXF is claiming.

Yes, Sony could program the OS to be more GPU hardware accelerated. And the game developers could leverage the GPU to do more work than just graphics rendering (but that would be stupid). However, the software would have to be designed to use the GPU architecture, and once it is designed to use the GPU architecture it likely wouldn't be able to use the CPU anymore. It is extremely difficult to design software that can use both an x86 architecture and a GPU architecture to do the same work at the same time. It might even be impossible; I've certainly never seen it.

And I'm not lecturing you on PhysX; I was talking to someone else. For someone who claims to know so much and be so smart, I'd think you'd be able to grasp that when I quote someone and then put a response under it, I'm talking to that person and not you. Every response in the thread isn't a response to you. I hate to burst your bubble, but you aren't the center of the universe; it does not revolve around you.
#25
nt300
Good luck, nv, it ain't happening. AMD has its own solutions with its own partners.
by: Bunchies
Isn't the PS4 using AMD hardware? Wait, wtf, why do I even care lol

Just makes no sense
Because AMD will have control and a strong say in gaming from now on; they run all the new consoles now.