Tuesday, March 28th 2023

Forza Horizon 5 Gets DLSS 3 Update

NVIDIA has announced that Forza Horizon 5 has finally received its DLSS 3 and NVIDIA Reflex update. Last week, DLSS 2 arrived in Deceive Inc. and Tchia, and DLSS 2 is also coming to Gripper, The Last of Us Part I, and Smalland: Survive the Wilds this week.

According to NVIDIA, with the DLSS 3 update Forza Horizon 5 can now run at over 120 frames per second at 4K/UHD resolution on RTX 40 series graphics cards, including the RTX 4070 Ti. In NVIDIA's benchmark, the game was running on an Intel Core i9-12900K with 32 GB of RAM, using Super Resolution Performance Mode with maximum settings and the extreme ray tracing mode. With DLSS 3, even the RTX 4070 Ti hits an average of 125.2 frames per second.
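For context, DLSS Super Resolution's Performance Mode is commonly cited as rendering at half the output resolution per axis before upscaling (the exact factors can vary by game and DLSS version), which would put the internal render for the 4K result above at roughly 1920x1080. A quick sketch of that arithmetic, with the scale factors treated as assumptions rather than official figures:

# Commonly cited DLSS Super Resolution per-axis scale factors (assumed here;
# actual values can differ by game and DLSS version).
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    # Approximate internal render resolution for a given DLSS mode.
    factor = DLSS_SCALE[mode]
    return round(out_w * factor), round(out_h * factor)

# 4K output in Performance Mode, as used in NVIDIA's Forza Horizon 5 numbers:
print(internal_resolution(3840, 2160, "Performance"))  # -> (1920, 1080)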
The NVIDIA DLSS 3 and Reflex update for Forza Horizon 5 comes just in time for the Rally Adventure expansion, which launches tomorrow, March 29th. This is the second expansion for Forza Horizon 5, and it brings an entirely new location, the rugged and biome-diverse Sierra Nueva. The new expansion is included in the Forza Horizon 5 Premium Add-ons Bundle, the Premium Edition, and the Expansions Bundle, and is also available as a standalone purchase at $19.99.

In addition to the Forza Horizon 5 DLSS 3 update, NVIDIA is announcing DLSS 2 support for three more games: Gripper and Smalland: Survive the Wilds, both launching tomorrow, March 29th, and The Last of Us Part I, which launches today.

NVIDIA has also updated the Image Comparison and Analysis Tool (ICAT), a neat tool for analyzing image quality across screenshots and videos, which now supports HEVC videos and can export the analysis as a video file.



Source: NVIDIA

12 Comments on Forza Horizon 5 Gets DLSS 3 Update

#1
zo0lykas
Why do they compare DLSS off against the 3.0 version? Compare against the last gen and show your (nGridia) improvements!
#2
CyberCT
I just tried DLSS 3 for the first time with Cyberpunk on my 4090. While I did notice a minimal addition to input lag, having the game run at 4K maxed out at 120 FPS most of the time on the projector was awesome. I didn't notice any graphical artifacting. I use a controller for that setup. For a competitive game, IDK how well DLSS 3 would work with the slight increase in input lag, but so far I really like the DLSS 3 technology.
#3
rv8000
Have they fixed the camera swap stretching yet? Or the UI flicker? I can't remember if they fixed that in a previous update. If not, what's the point of another 10-20 FPS over the older DLSS version when there are visible artifacts that are way more distracting than an increase in perceived smoothness with no latency improvement?

#stillunimpressed
#4
Garrus
Except it is lying: it is not actually running at that frame rate, it is smoothed to appear like that frame rate, but not exactly.
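For anyone unsure what "smoothed" means here: frame generation synthesizes an extra frame between two rendered frames, so the display shows more frames than the game actually renders. A deliberately naive sketch of the concept follows; it just averages two frames, whereas DLSS 3 uses optical flow and a neural network, so treat it as a loose analogy rather than how NVIDIA actually does it.

import numpy as np

def naive_interpolated_frame(frame_a, frame_b):
    # Average two rendered frames into one synthetic in-between frame.
    # Concept illustration only: DLSS 3 Frame Generation estimates motion
    # (optical flow) instead of averaging pixels, which is how it avoids
    # the ghosting this blend would cause on moving objects.
    mixed = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
    return mixed.astype(np.uint8)

# Two fake 1080p RGB frames; every rendered pair yields one generated frame,
# doubling the presented frame rate without the game simulating more often.
frame_a = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_b = np.full((1080, 1920, 3), 255, dtype=np.uint8)
generated = naive_interpolated_frame(frame_a, frame_b)  # uniform mid-grey (127)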
#5
oxrufiioxo
CyberCT said: I just tried DLSS 3 for the first time with Cyberpunk on my 4090. While I did notice a minimal addition to input lag, having the game run at 4K maxed out at 120 FPS most of the time on the projector was awesome. I didn't notice any graphical artifacting. I use a controller for that setup. For a competitive game, IDK how well DLSS 3 would work with the slight increase in input lag, but so far I really like the DLSS 3 technology.
It's really good in both Witcher and CP2077; I don't care for it in Spiderman/MSFS/A Plague Tale because of the UI issues.
rv8000 said: Have they fixed the camera swap stretching yet? Or the UI flicker? I can't remember if they fixed that in a previous update. If not, what's the point of another 10-20 FPS over the older DLSS version when there are visible artifacts that are way more distracting than an increase in perceived smoothness with no latency improvement?

#stillunimpressed
I didn't notice any camera swap issues in any of the six games I've tried it in (Witcher, CP2077, Hitman, A Plague Tale, MSFS, Spiderman); you'd probably have to slow it down way below what my 4090 is capable of to notice any of that. I do think lower-end GPUs, which would benefit the most from this, will likely have the most issues... I did play around with caps via the NVIDIA Control Panel method in CP2077, and at least down to around 40 FPS native (80 FPS-ish with frame gen) any artifacts were hard to notice.

UI elements are still an issue in some games, which to me is a developer issue, because they are pretty much non-existent in CDPR games.
Garrus said: Except it is lying: it is not actually running at that frame rate, it is smoothed to appear like that frame rate, but not exactly.
I agree they shouldn't use it to compare performance; the actual technology is super impressive, though. This is coming from a person who hates the way TVs do it; this is on another level, especially in the two CDPR games that use it.

My biggest annoyance with it is not them using it in graphs, as long as it's clearly labeled DLSS 3; it's that they couldn't or didn't want to get it working on the 30 series.
#6
rv8000
oxrufiioxo said: It's really good in both Witcher and CP2077; I don't care for it in Spiderman/MSFS/A Plague Tale because of the UI issues.

I didn't notice any camera swap issues in any of the six games I've tried it in (Witcher, CP2077, Hitman, A Plague Tale, MSFS, Spiderman); you'd probably have to slow it down way below what my 4090 is capable of to notice any of that. I do think lower-end GPUs, which would benefit the most from this, will likely have the most issues... I did play around with caps via the NVIDIA Control Panel method in CP2077, and at least down to around 40 FPS native (80 FPS-ish with frame gen) any artifacts were hard to notice.

UI elements are still an issue in some games, which to me is a developer issue, because they are pretty much non-existent in CDPR games.

I agree they shouldn't use it to compare performance; the actual technology is super impressive, though. This is coming from a person who hates the way TVs do it; this is on another level, especially in the two CDPR games that use it.

My biggest annoyance with it is not them using it in graphs, as long as it's clearly labeled DLSS 3; it's that they couldn't or didn't want to get it working on the 30 series.
The camera swap stretching was strictly related to one of the F1 games. Flickering UI elements have been an issue across several games; some of that was fixed, to a degree, in a recent update for Cyberpunk.

UI artifacts are an immediate no for me, as they're very apparent and easy to notice: they're continuous static elements that should not change.

The last time I checked HWUB's coverage of DLSS 3, there was a significant amount of ghosting/smearing in the Spider-Man game, to the point where you have to ask why you'd buy a 4080 or above and a relatively high refresh rate monitor with good motion persistence, both of which help prevent ghosting, only to reintroduce it.

I think HWUB also did a good job of explaining that DLSS 3 theoretically gets worse the less performant the card is; lower-end cards wouldn't be able to cross the FPS threshold that helps hide many of the visual artifacts DLSS 3 introduces. This makes it a pointless feature on cards where you're already getting more than enough FPS (4070 Ti/4080/4090), because in many cases you're decreasing visual quality for no gain in true latency, while on lower-end cards, where in a perfect world DLSS 3 would be most useful, the artifacts end up being more noticeable.
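To put rough numbers on that last point (a simplified illustration that assumes one generated frame per rendered frame and ignores Reflex, frame-generation overhead, and display timing):

def frame_gen_numbers(native_fps):
    # Frame generation doubles the *presented* frame rate, but the game still
    # simulates and samples input at the native rate, so responsiveness keeps
    # tracking the native frame time rather than the displayed one.
    displayed_fps = native_fps * 2               # one synthesized frame per real frame
    native_frame_ms = 1000.0 / native_fps        # what input response still scales with
    displayed_frame_ms = 1000.0 / displayed_fps  # what the screen shows
    return displayed_fps, native_frame_ms, displayed_frame_ms

for fps in (40, 60, 120):
    shown, native_ms, shown_ms = frame_gen_numbers(fps)
    print(f"{fps:>3} FPS rendered -> {shown} FPS shown "
          f"({shown_ms:.1f} ms per displayed frame, ~{native_ms:.1f} ms between real updates)")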
#7
oxrufiioxo
rv8000 said: The camera swap stretching was strictly related to one of the F1 games. Flickering UI elements have been an issue across several games; some of that was fixed, to a degree, in a recent update for Cyberpunk.

UI artifacts are an immediate no for me, as they're very apparent and easy to notice: they're continuous static elements that should not change.

The last time I checked HWUB's coverage of DLSS 3, there was a significant amount of ghosting/smearing in the Spider-Man game, to the point where you have to ask why you'd buy a 4080 or above and a relatively high refresh rate monitor with good motion persistence, both of which help prevent ghosting, only to reintroduce it.

I think HWUB also did a good job of explaining that DLSS 3 theoretically gets worse the less performant the card is; lower-end cards wouldn't be able to cross the FPS threshold that helps hide many of the visual artifacts DLSS 3 introduces. This makes it a pointless feature on cards where you're already getting more than enough FPS (4070 Ti/4080/4090), because in many cases you're decreasing visual quality for no gain in true latency, while on lower-end cards, where in a perfect world DLSS 3 would be most useful, the artifacts end up being more noticeable.
Not sure if it's because I use a large OLED rather than the IPS monitor they likely tested on, but I haven't noticed any flicker in either CDPR game or in A Plague Tale. I haven't used it long enough in the other games to get a good enough feel, because I don't plan on using it in them unless they fix the UI issues to at least the quality of CP2077.

Spiderman is by far the worst game that uses the tech, and really the only game I've tried where I hate it, although unless you have a 4K 240 Hz monitor it's pretty useless in that game anyway.

The tech is definitely not suitable for every game type, and I'd personally only use it with games that have a massive amount of RT. I'm excited to try it out with the new Overdrive mode coming to Cyberpunk.

I still believe people need to actually try it out themselves while playing a game normally. I've tried to spin around in circles rapidly to break it in every game, but was unsuccessful other than in Spiderman, lol.

My own personal test has been to play with it on for at least an hour, then turn it off and decide which I like better.
#8
natr0n
I don't bother with the DLSS gimmick. Downsample to upsample to a mess... but it has AI.
#9
r9
Garrus said: Except it is lying: it is not actually running at that frame rate, it is smoothed to appear like that frame rate, but not exactly.
$100 4K TVs can do the same, 60 Hz to 120 Hz :D
#10
JimmyDoogs
I'm surprised by DLSS 3 frame generation. I didn't imagine that I would prefer it over DLSS 2 upscaling.
#11
Garrus
JimmyDoogs said: I'm surprised by DLSS 3 frame generation. I didn't imagine that I would prefer it over DLSS 2 upscaling.
It gives a poor frame rate and poor visuals in Forza 5; I'm sure you won't prefer it there.
#12
JimmyDoogs
Garrus said: It gives a poor frame rate and poor visuals in Forza 5; I'm sure you won't prefer it there.
Oof, okay. I'll check it out this weekend. Thanks for the heads up. FG in the Dead Space Remake and Hogwarts Legacy was really nice. The 1% lows were unbelievable and I saw no loss of quality.