HairWorks is literally only in 18 games (https://steamdb.info/tech/SDK/NVIDIA_HairWorks/), discounting the modkits, demos, and betas, and only three of those are of note: The Witcher 3, Final Fantasy XV, and Far Cry 4. There is absolutely no way it's a "standard." I've never even seen it discussed outside of The Witcher 3.
PhysX is just a physics engine, and it runs on the CPU in 99.9% of implementations. Nvidia even made it open source! There are far more games than you think that use PhysX: it's a solid physics engine, and even the default in Unity. The GPU-accelerated effects are entirely OPTIONAL, and they would not have been possible at the time (c. 2010) without the work Nvidia put into CUDA. The choice wasn't between PhysX effects and something else; the choice was between PhysX effects and having literally no other way to run those complex particle effects on the computers of that era.

AMD never bothered to create a unified compute language for their GPUs like CUDA, so it's entirely their fault that AMD GPUs couldn't be used for GPU-accelerated effects (or a bazillion professional programs, e.g. Blender). And again, GPU-accelerated effects are optional. If you're referring to Blackwell dropping 32-bit CUDA support (which broke GPU PhysX in old 32-bit games), blame the developers for never releasing a 64-bit binary of the game. Nvidia can't be bound to support antiquated code until the end of time just so you can have more smoke in your favorite Arkham game.
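To make the CUDA point concrete: a particle effect is basically the same independent update applied to every particle each frame, which is serial on a CPU but maps trivially onto thousands of GPU threads (one thread per particle). Here's a minimal sketch in Python; the structure and numbers are illustrative, not taken from PhysX itself:

```python
# Toy CPU-style particle update, the kind of per-frame loop a physics
# engine runs. Every particle is independent of every other one, which
# is exactly why the same math parallelizes so well on a GPU.

GRAVITY = -9.81  # m/s^2, applied on the y axis

def step_particles(particles, dt):
    """Advance every particle by one explicit-Euler step.

    particles: list of dicts with 'pos' and 'vel' as [x, y, z] lists.
    dt: timestep in seconds.
    """
    for p in particles:  # on a GPU this loop becomes one thread per particle
        p["vel"][1] += GRAVITY * dt              # accelerate downward
        for axis in range(3):
            p["pos"][axis] += p["vel"][axis] * dt
        if p["pos"][1] < 0.0:                    # crude ground bounce
            p["pos"][1] = 0.0
            p["vel"][1] = -0.5 * p["vel"][1]
    return particles

# One falling particle, one 60 Hz step:
ps = [{"pos": [0.0, 10.0, 0.0], "vel": [0.0, 0.0, 0.0]}]
ps = step_particles(ps, 1 / 60)
```

A 2010-era CPU choked once you asked for tens of thousands of these per frame on top of the actual game; CUDA made it practical to offload them, which is the whole reason those effects existed at all.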
Ray tracing is how literally all lighting is computed in video games, do you understand that? Baked lighting is just ray tracing done ahead of time by the developer, with lots of tricks and cheats developed over the years to make it look as realistic as possible once the player is in the game and changing the environment. Real-time ray tracing was always going to be the future of video games once GPUs were capable enough; it's entirely AMD's fault that they've been asleep at the wheel and still haven't bothered to build a full ray tracing core (much of the RT pipeline still runs on shaders, even in RDNA 4).
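To make the "baked lighting is just ray tracing" point concrete, here's a minimal shadow-ray visibility test in Python. The scene and names are made up for illustration; the point is that the exact same function can be evaluated once at build time and stored in a lightmap, or evaluated every frame at runtime — only *when* it runs differs:

```python
import math

def shadow_ray_hit(point, light, sphere_center, sphere_radius):
    """Return True if a sphere blocks the ray from `point` to `light`.

    This is the same visibility test whether it runs offline (baking a
    lightmap) or per-frame (real-time ray tracing).
    """
    # Normalized ray direction from the shaded point toward the light.
    dx, dy, dz = (light[i] - point[i] for i in range(3))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / dist, dy / dist, dz / dist
    # Classic ray-sphere intersection via the quadratic formula.
    cx, cy, cz = (point[i] - sphere_center[i] for i in range(3))
    b = 2 * (dx * cx + dy * cy + dz * cz)
    c = cx * cx + cy * cy + cz * cz - sphere_radius ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return False                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return 0 < t < dist                   # occluder sits between point and light

light = (0.0, 10.0, 0.0)
blocker = ((0.0, 5.0, 0.0), 1.0)  # sphere directly between the light and origin

# "Baking": run the test once ahead of time, store the result per texel.
lightmap = {(0.0, 0.0, 0.0): not shadow_ray_hit((0.0, 0.0, 0.0), light, *blocker)}
# Real-time RT evaluates the identical function every frame instead.
```

All the developer's "tricks and cheats" (lightmaps, light probes, shadow maps) are ways of precomputing or approximating exactly this query; RT cores just make it cheap enough to answer it live.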