Raytracing is so expensive that, even with Ampere's supposed 4x DXR performance increase, faking it is still looking like a decent option.
I always like to refer back to this video, from when BF5's raytracing was at its highest quality. DICE later improved performance by dialling the RTX quality back a bit, and the patched version was definitely worth the small fidelity loss for such a significant performance increase.
I mean, even when DICE set the raytracing settings unrealistically high - so high that a 2080Ti was required to hit 60fps at 1080p - the result was still only marginally better than faking it with shaders. Yes, if you stopped playing and just zoomed in on fine details, the DXR renderer looked better. It's just that the cost was too high for such a subtle improvement.
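For anyone wondering what "faking it with shaders" means here: BF5's fallback for reflections is screen-space raymarching, which can only reuse what's already on screen. Here's a toy sketch of the idea in C++ - the depth values, step sizes, and the 1D "screen" are made-up stand-ins for a real G-buffer, not DICE's actual implementation:

```cpp
#include <array>
#include <cstdio>

// Toy sketch of screen-space reflection raymarching: step a reflected ray
// across the screen, comparing its depth against the depth buffer. A "hit"
// means we reuse that pixel's already-shaded colour as the reflection.
// Anything off-screen or hidden simply has no data - which is exactly
// where this kind of fakery falls apart.

constexpr int kWidth = 16;

int main() {
    // Hypothetical 1D depth buffer: a flat floor (depth 5) with a box
    // sticking up closer to the camera (depth 2) at pixels 10-12.
    std::array<float, kWidth> depth{};
    for (int x = 0; x < kWidth; ++x)
        depth[x] = (x >= 10 && x <= 12) ? 2.0f : 5.0f;

    // March a reflected ray from pixel 2, moving right and away from camera.
    float ray_x = 2.0f, ray_z = 3.0f;
    const float step_x = 1.0f, step_z = 0.25f;

    for (int i = 0; i < kWidth; ++i) {
        ray_x += step_x;
        ray_z += step_z;
        int px = static_cast<int>(ray_x);
        if (px >= kWidth) {
            // Off-screen: no information at all, fall back to a cubemap.
            std::puts("ray left the screen: fall back to cubemap");
            break;
        }
        if (ray_z >= depth[px]) {
            // Ray passed behind stored geometry: call it an intersection.
            std::printf("hit at pixel %d: reuse its colour as the reflection\n", px);
            break;
        }
    }
}
```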
You only have to play Quake II RTX and experiment with the temporal filtering, GI ray count, and denoiser to get an idea of just how basic an approximation of raytracing current DXR implementations are. There's almost as much fakery and guesswork going on with DXR as there is with the shader-based fakery we're already used to.
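To make that concrete, here's a rough sketch of the temporal accumulation trick those settings are tuning: trace only a couple of noisy GI rays per pixel each frame, then blend the result into a history buffer with an exponential moving average. The numbers below are hypothetical, and the real pipeline layers a spatial denoiser on top, but it shows why so much of the final image is educated guesswork from previous frames:

```cpp
#include <cstdio>
#include <cstdlib>

// One pixel's worth of temporal GI accumulation. Each frame contributes a
// noisy estimate (standing in for 1-2 actual GI rays); the temporal filter
// blends it into a running history so the noise averages out over time.

float noisy_gi_sample() {
    // Hypothetical stand-in for a single bounce of path-traced GI:
    // the true value is 0.5, but any one frame's estimate is way off.
    return 0.5f + 0.4f * (static_cast<float>(std::rand()) / RAND_MAX - 0.5f);
}

int main() {
    float history = 0.0f;     // accumulated GI for this pixel
    const float alpha = 0.1f; // blend weight: lower = smoother but laggier

    for (int frame = 0; frame < 60; ++frame) {
        float sample = noisy_gi_sample();
        // Exponential moving average: ~90% of what you see is old frames.
        // Invalidate the history (camera cut, moving light) and you're back
        // to raw noise until it re-converges.
        history = alpha * sample + (1.0f - alpha) * history;
        std::printf("frame %2d: sample %.3f -> accumulated %.3f\n",
                    frame, sample, history);
    }
}
```

Cranking the GI ray count up effectively trades frame rate for less reliance on that history, and turning the denoiser off shows you just how little real information each frame actually contains.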