
Forspoken: FSR 3

In Starfield, New Atlantis for example, it looks choppy even with FreeSync on at around 50-60 FPS. FG will bring that 120+ smoothness, and it will look smoother when running around, say, New Atlantis.
I'll try it as soon as I'm done with Kingdom Come: Deliverance.

Edit: Although, I've never seen a game that felt choppy to me at 50 FPS.
 
No Man's Sky feels choppy to me no matter what framerate I have. Well, at least it did the last time I played it.
 
AMD even said it runs better on RDNA3, I remember reading a quote a few days ago.

So yeah, not sure why it is being tested on Nvidia. Regardless, I think it looks great. I just watched Maxus' video at 4K... and I see none of the issues he talked about regarding the magic effects being blurry; it all looked the same to me. I think some people are just overly picky. I'll take those double frames, thanks.

Why use an Nvidia card in these tests?

Yeah, as visible as TPU is, you'd think an AIB partner could have sent over an eval 7900 XTX for testing.

There were a lot of unused RDNA 3 goodies (new WMMA instructions, etc.) on the compute die that led to speculation they'd be employed to hardware-accelerate FSR 3 and/or Anti-Lag+.

That might explain why Anti-Lag+ doesn't work on RDNA 2 and older, similar to how DLSS 3 won't work on anything older than Lovelace.
 
Frame rate is a metric that people focus on too much, so much so that the likes of Nvidia and AMD are generating fake frames to bump the number up. But does it translate to a better experience? So far most reputable reviewers have concluded that yes (with a big caveat), if the base framerate is high enough. So for those with a lower-end GPU looking for a DLSS/FSR moment where they can improve "actual" performance, frame generation is clearly not going to help. Yes, you will see a higher FPS on the counter, but latency is going to be a problem with a low base FPS. Asking for a base of 60 or 70 FPS is a high bar, which means you need to sacrifice graphics quality to see a fake FPS number. If that is the case, I might as well just run DLSS/FSR with a mid or high mix of graphical settings instead of using frame generation.
 
Exactly. Why bother with FG if you can't use it properly? FG was supposed to help the most on low-end cards, where it would have had the most impact. With the current 70 FPS minimum there is no point to this, with all the latency and lag lurking there for lower-end card users. Just lower the settings, use DLSS/FSR, and forget about FG.
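To put rough numbers on the latency point in these two posts, here is a minimal back-of-the-envelope sketch. The only assumptions are that interpolation-style frame generation holds back roughly one real frame and that render time dominates; real pipelines add other delays on top:

```python
# Rough illustration of why frame generation needs a healthy base framerate.
# Assumption: interpolation-style frame generation buffers one real frame,
# so it adds roughly one base frame time of latency on top of normal rendering.

def fg_estimate(base_fps: float) -> None:
    base_frame_ms = 1000.0 / base_fps     # time per real frame
    displayed_fps = base_fps * 2          # the counter shows doubled output
    added_latency_ms = base_frame_ms      # ~1 extra real frame held for interpolation
    print(f"base {base_fps:5.1f} FPS -> counter {displayed_fps:5.1f} FPS, "
          f"~{added_latency_ms:4.1f} ms extra latency")

for fps in (30, 45, 60, 70):
    fg_estimate(fps)

# base  30.0 FPS -> counter  60.0 FPS, ~33.3 ms extra latency
# base  70.0 FPS -> counter 140.0 FPS, ~14.3 ms extra latency
```

At a 30 FPS base the added delay alone is over 30 ms, which is why the 60-70 FPS recommendation keeps coming up.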
 
What is this article lol...
Is this serious?
The entirety of this can be summarised as "AMD bad" to the point that I'd think I've just read the Onion.

Framerate improvements? Nothing.
Motion vector quality/problems? Nothing.
Actual effect of the technology? NOTHING!

"You need VSync, that's bad"
"You need to use FSR 2 which IS TERRIBAD, THAT'S TERRIBAD"
"You have a sharpening filter" (how many times am I gonna read that?)

This is an actual published article where nothing at all is said about the technology, no mention of its actual value; it's just negativity and quibbles about how to make it work. Nothing at all is said about what it's like when it's on. Literally the Onion.

When 17-year-old YouTubers (Vex or Daniel Owen) make a more complete assessment than a supposedly professional website...

It's not even bias anymore, it's just a parody.
Oh and running on an Nvidia card of course.
 
Unfortunately, this is a very poor article. Running FSR on Nvidia hardware? This doesn't reflect the best-case scenario for FSR; it's the equivalent of attempting to run DLSS on AMD hardware and writing an article.
 
Unfortunately this is a very poor article.

Running FSR on Nvidia hardware?

Please try running DLSS on AMD hardware and write an article.
I don't think it is poor, and you have shown exactly what AMD is all about. FSR on NV hardware? YES.
DLSS on AMD hardware? NO.
That is exactly what this is all about. AMD is hardware agnostic, and that is the beauty of it. Of course, TPU could give FG and the other AMD features a go on AMD hardware to showcase how they work there and compare.
 
This article is really bad; I'm not sure what is going on with tech journalism these days. They don't even bother to use an AMD card to test this exciting new feature, which looks great IMO. By not using an AMD GPU, FSR 3 FG gets no support from important supporting tech such as Anti-Lag and Anti-Lag+ in AMD Software, and it misses out on the tech just generally working better on AMD cards. What a disappointment.
 
I don't think the article is bad, but when you compare two techs, it should be done with the best hardware possible for each:
4080 + DLSS vs. 7900 XTX + FSR.

The fact that FSR runs on a potato does not justify using an Nvidia or Intel GPU to get results and judge its quality.

In general, it's useful to know how much better FSR is on an RDNA GPU than on a competitor's.
 
Bad take. Frame gen is useful for people who have a high-refresh-rate monitor (144 Hz and up), currently get significantly less than that, and want to reach their monitor's refresh rate.
In this particular case I can go from ~100 FPS in Starfield to 165 FPS.

I'm not saying it's going to be useful to everyone, but I can certainly see some uses for it.
Agreed. I was also in the "FG bad" camp originally, but I've changed my mind a lot since.
FG is entirely pointless in a lot of games, and I'm pretty sure that neither Forspoken nor Aveum are good picks for it. The more action-packed a game is, the less you care about a tech that exchanges latency for a higher framerate.

However, in slower RPGs, strategy games, simulations (it might work in racing games, it might not), No Man's Sky-likes, a lot of games really, where it's about immersing yourself in a world, that kind of "max out your monitor's refresh rate and enjoy the smoothness" is actually a really, really great tech.
I'm kind of split on things like Witcher/Cyberpunk, where there are action parts but a lot of it is just traversing the city and enjoying the atmosphere. I'd say that depending on how sensitive you are to lag and/or possible dips in action scenes, the tech is still very much worth it.

I think FG has a bright future ahead of it, just like upscaling (we'll need that one for 6K/8K for sure). It's all about using the right tool for the right game.
 
Edit: Although, I've never seen a game that felt choppy to me at 50 FPS.

It's not about being choppy, it's about reducing persistence blur of sample-and-hold digital displays. The higher the framerate, the less motion blur.

If you could interpolate to 1000 FPS you'd basically eliminate persistence blur. That would be a drastically better experience even with 60 FPS input latency.
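A rough way to quantify that: on a sample-and-hold display, the smear you perceive while tracking motion is roughly the on-screen speed multiplied by how long each frame stays on screen. A minimal sketch, with the panning speed as an illustrative assumption:

```python
# Perceived motion blur on a sample-and-hold display is approximately:
#   blur (pixels) ~ motion speed (pixels/second) * frame persistence (seconds)
# Illustrative panning speed: 1920 px/s (one full 1080p screen width per second).

PAN_SPEED_PX_PER_S = 1920

for fps in (60, 120, 240, 1000):
    persistence_s = 1.0 / fps                   # each frame is held for the full refresh period
    blur_px = PAN_SPEED_PX_PER_S * persistence_s
    print(f"{fps:4d} FPS -> held {persistence_s * 1000:5.2f} ms/frame, ~{blur_px:5.1f} px of smear")

# 60 FPS gives ~32 px of smear, 1000 FPS only ~2 px, which is why interpolating to a very
# high framerate makes motion look far sharper even if input latency stays at the 60 FPS level.
```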
 
I tried capping my FPS to just below my monitor's refresh rate limit (as suggested in this article) and I still got judder. The only way I could remove the judder was to let VSync do the frame rate capping, like in the days before VRR. I used the Nvidia FPS cap option in their control panel. I've heard people have had success capping the frame rate without judder using RTSS (I don't have it and can't be bothered to download it).
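For context on what a framerate cap actually does (this is a generic sketch, not how the Nvidia limiter or RTSS is implemented): it just enforces a fixed frame time slightly longer than the display's refresh period, and any imprecision in that timing is what shows up as judder. The 141-on-144 Hz numbers below are an illustrative assumption:

```python
import time

# Minimal sketch of a sleep-based frame limiter. Real limiters (driver cap, RTSS)
# use far more precise timing; the point here is only to show the frame-time math.

REFRESH_HZ = 144
CAP_FPS = 141                         # a few FPS below refresh, to keep VRR engaged
TARGET_FRAME_S = 1.0 / CAP_FPS        # ~7.09 ms per frame

def run_frames(n: int) -> None:
    next_deadline = time.perf_counter()
    for _ in range(n):
        # ... render and present the frame here ...
        next_deadline += TARGET_FRAME_S
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)     # coarse sleep; timing jitter here is perceived as judder

run_frames(10)
print(f"target frame time: {TARGET_FRAME_S * 1000:.2f} ms on a {REFRESH_HZ} Hz display")
```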
 
I don't think you guys understand how AMD played this one. There was no prerelease info, briefing, or support. The tech suddenly appeared publicly in two games last week.

On Friday afternoon, right before everybody heads off for the weekend, their PR people emailed me: "new tech is live now, do you want keys?"

I'm in Paris right now for some family time between launches.

If you think you can do a better job, just let me know, I'll even pay you.
 

Then you delay until you have the right hardware on hand to do a proper test.

Better to deliver a good-quality article late than to deliver a low-quality article quickly, IMO.
 
However, in slower RPGs, strategy games, simulations (it might work in racing games, it might not), No Man's Sky-likes, a lot of games really, where it's about immersing yourself in a world, that kind of "max out your monitor's refresh rate and enjoy the smoothness" is actually a really, really great tech.
The last time I saw a review of "fake frames" applied to simulators, it turned the UI into an ugly garbled mess. Have they fixed that stuff yet?
 
Then you delay until you have the right hardware on hand to do a proper test.

Better to deliver a good-quality article late than to deliver a low-quality article quickly, IMO.
THIS

We wait for TPU's tests because of their seriousness... if we want instant BS reviews, there are a shitload of websites like that.

I've been reading TPU for more than a decade; after all those years I made a forum account just to say this... I'm quite bitter right now because I hold TPU in high regard. This is not the first time I've seen "questionable" things like this, and I find it quite sad; the only thing you gain is a loss of trust...
 
It's not about being choppy, it's about reducing persistence blur of sample-and-hold digital displays. The higher the framerate, the less motion blur.

If you could interpolate to 1000 FPS you'd basically eliminate persistence blur. That would be a drastically better experience even with 60 FPS input latency.
Persistence blur? This is the first time I've heard about this term. Or am I already too old for these things? :laugh:
 

That's been around since the beginning of the LCD era. LCDs are blurry in motion, in part because of pixel response time, but mainly because of how the image is presented. Each frame is displayed as a static image (it "persists" for 16.7 ms at 60 Hz), and it's our eyes that cause the illusion of motion blur, not the display itself.

OLED has almost instant pixel response time, but it uses the same sample-and-hold presentation, which is also why it's blurry in motion.

On a CRT the image is constantly in motion, because it's being drawn line by line; that's why there's no motion blur.
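To tie the CRT point back to numbers: what matters for blur is how long each frame is actually visible, not the refresh rate by itself. A small sketch, where the ~1.5 ms "visible flash" for the CRT/strobed case and the panning speed are illustrative assumptions:

```python
# Effective persistence = how long each frame is actually lit. Sample-and-hold panels
# (LCD/OLED) hold it for the full refresh period; a CRT or strobed backlight only
# flashes it briefly, so blur is far lower at the same 60 Hz.

PAN_SPEED_PX_PER_S = 1920            # same illustrative panning speed as earlier

cases = {
    "60 Hz sample-and-hold": 1.0 / 60,   # ~16.7 ms hold per frame
    "60 Hz CRT-like strobe": 0.0015,     # ~1.5 ms of visible flash (assumed figure)
}

for name, persistence_s in cases.items():
    blur_px = PAN_SPEED_PX_PER_S * persistence_s
    print(f"{name}: lit ~{persistence_s * 1000:.1f} ms -> ~{blur_px:.0f} px of smear")
```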
 
I get the techy bits, I just don't know where I'm supposed to see the blur, or why I should be bothered by it.
 
This is like doing an in-depth review of XeSS and running it on AMD, disregarding that it's meant to work best on Intel GPUs but can also run on other vendors' cards, albeit at lower quality.

Man.
 
I get the techy bits, I just don't know where I'm supposed to see the blur, or why I should be bothered by it.

You might just be used to it after ~20 years, so it seems normal to you. If you looked at a CRT right now, or an LCD with BFI/ULMB, you'd be astonished. ;)

I usually don't notice any TAA ghosting until people point it out. Sometimes not even then. I think some people do actually confuse TAA ghosting with LCD smearing, because they say they see something in a video, while I don't see it on an OLED, whether in motion or paused.
 
I don't think you guys understand how AMD played this one. There was no prerelease info, briefing, or support. The tech suddenly appeared publicly in two games last week.

On Friday afternoon, right before everybody heads off for the weekend, their PR people emailed me: "new tech is live now, do you want keys?"

I'm in Paris right now for some family time between launches.

If you think you can do a better job, just let me know, I'll even pay you.
I appreciate the hard work, I really do... but it's problematic when a site that has gained the public's trust over time releases tests with obvious errors and misinformation like I showed in my previous comment.
 
I don't think you guys understand how AMD played this one. There was no prerelease info, briefing, or support. The tech suddenly appeared publicly in two games last week.

On Friday afternoon, right before everybody heads off for the weekend, their PR people emailed me: "new tech is live now, do you want keys?"

I'm in Paris right now for some family time between launches.

If you think you can do a better job, just let me know, I'll even pay you.

I think most people's issue is with testing an AMD feature solely on Nvidia hardware, when other key features that work exclusively on RDNA 3 hardware aren't there to support FG. Same goes for the FSR reviews, tbh; just because AMD allows certain software features to be hardware agnostic doesn't mean testing with one brand of hardware will paint the entire, or proper, picture. It absolutely comes across as a bad way to review software features.
 