
AMD FSR 3 FidelityFX Super Resolution Technology Unveiled at GDC 2023

T0@st

News Editor
AMD issued briefing material earlier this month, teasing an upcoming reveal of its next-generation FidelityFX technology at GDC 2023. True to form, the hardware specialist has today announced that FidelityFX Super Resolution 3.0 is incoming. The company is playing catch-up with rival NVIDIA, which has already released version 3 of its DLSS graphics enhancer/upscaler for a small number of games. AMD says that FSR 3.0 is at an early stage of development, but it hopes that its work on temporal upscaling will result in a number of improvements over the previous generation.

The engineering team is aiming for a 2x frame performance improvement over the existing FSR 2.0 technique, which it claims is already capable of: "computing more pixels than we have samples in the current frame." This will be achieved by generating a greater number of pixels in the current frame, via the addition of interpolated frames. It is highly likely that the team will reach a point in development where at least one sample is created for every interpolated pixel. The team wants to prevent feedback loops from occurring: an interpolated frame will only be shown once, so any interpolation artifact would remain visible for only one frame.
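To make that single-presentation idea concrete, here is a minimal Python sketch of a frame loop with interpolation; render_frame, interpolate and present are hypothetical stand-ins, not AMD's API. Each generated frame is built only from two rendered frames and shown exactly once, which is how a feedback loop is avoided.

```python
# Minimal sketch (hypothetical API, not AMD's): every interpolated frame is
# derived only from two rendered frames and presented exactly once, so it
# never feeds back into later interpolation and any artifact lasts one frame.

def run_frame_loop(render_frame, interpolate, present, num_frames=5):
    previous = render_frame(0)                    # first real frame
    present(previous)
    for i in range(1, num_frames):
        current = render_frame(i)                 # next real frame
        present(interpolate(previous, current))   # in-between frame, shown once
        present(current)
        previous = current

if __name__ == "__main__":
    # Toy stand-ins: a "frame" is just a brightness value.
    shown = []
    run_frame_loop(
        render_frame=lambda i: float(i),
        interpolate=lambda a, b: (a + b) / 2.0,
        present=shown.append,
    )
    print(shown)  # [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
```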



However, a number of potential setbacks were noted: a reliance on color clamping to correct the color of outdated samples is not entirely feasible; it will be difficult to produce non-linear motion interpolation from 2D screen-space motion vectors; and because final frames are interpolated, all post-processing must be interpolated as well, including the user interface in the foreground. One of AMD's diagrams shows how native rendering stacks up against FSR 2.0 and 3.0.
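The screen-space motion vector limitation is easy to illustrate. The sketch below, using a hypothetical reproject_halfway helper, shows the straight-line assumption a single 2D per-pixel vector forces on the interpolator; motion that actually curved between the two frames is invisible to it.

```python
# Illustration only: a single 2D screen-space motion vector lets the
# interpolator place a pixel halfway along a straight line between frames.
# Any non-linear (curved) motion within that interval cannot be recovered,
# which is one of the setbacks noted above.

def reproject_halfway(position, motion_vector):
    """Estimate a pixel's screen position at the temporal midpoint,
    assuming purely linear motion along its motion vector."""
    x, y = position
    dx, dy = motion_vector
    return (x + 0.5 * dx, y + 0.5 * dy)

# A pixel that actually swept along an arc would sit somewhere else at t=0.5,
# but the linear estimate is all a single vector can express.
print(reproject_halfway((100.0, 40.0), (10.0, -4.0)))  # (105.0, 38.0)
```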



FSR 3.0 should enable a smoother overall gaming experience while simultaneously allowing developers to devote more GPU time to visual quality. Latency reduction is a key focus area for FSR 3.0: AMD has the gamer in mind, with high frame rates and the lowest achievable latency as basic requirements. The engineers are also aiming for a smooth upgrade path for titles that currently utilize FSR 2.0.
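As a rough back-of-the-envelope illustration of why latency matters here (illustrative numbers only, not AMD figures): interpolation doubles the displayed frame rate, but the newest rendered frame has to be held back while the in-between frame is shown first.

```python
# Back-of-the-envelope latency sketch with illustrative numbers (not AMD's).
rendered_fps = 60.0
frame_time_ms = 1000.0 / rendered_fps   # ~16.7 ms between real frames
displayed_fps = rendered_fps * 2        # one generated frame per real frame
# The newest real frame is delayed by roughly one displayed-frame interval
# while the interpolated frame is shown first.
added_delay_ms = 1000.0 / displayed_fps

print(f"Displayed: {displayed_fps:.0f} FPS, extra display latency ~{added_delay_ms:.1f} ms")
# Output: Displayed: 120 FPS, extra display latency ~8.3 ms
```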

View at TechPowerUp Main Site | Source
 
Interesting, and this points to the frame interpolation feature being just a reaction to Nvidia, as many of us supposed.
 
fake frames from amd too, great yaaay
fuckkkkk, i hate this trend
 
Does anyone know if FSR3 will be supported on 5000/6000 cards?
Also, is Fluid Motion back, so video playback will change to 60 fps?

If not, I guess I will be using Vega/older cards forever and will probably end up selling my 5000/6000 GPUs.
 
Does anyone know if FSR3 will be supported on 5000/6000 cards?
Also, is Fluid Motion back, so video playback will change to 60 fps?

If not, I guess I will be using Vega/older cards forever and will probably end up selling my 5000/6000 GPUs.
Should work on any GPU if it follows FSR2 logic
 
fake frames from amd too, great yaaay
fuckkkkk, i hate this trend


Regardless of whether they are fake, I still like the tech in Witcher 3/CP2077, and I hope AMD's variant is just as good. Don't get me wrong, Nvidia and AMD shouldn't be using this tech in benchmarks to try and sell GPUs, but at least with Nvidia's variant, even as a first-generation implementation, it's already light years better than DLSS 1.0.
 
Walking away from any NVIDIA presentation, many attendees will feel excited (or cautious) about future tech, but as for attendees coming out of any AMD presentation, I bet most just caught up on their sleep there.
 
I might be a villain here, but with FSR 3 the biggest performance boosts will probably go to RDNA 3 GPUs: this frame interpolation will likely need the AI accelerators that are part of the RDNA 3 architecture.
Previous-gen GPUs might only get an improvement on top of what FSR 2 can already offer. Just don't get your hopes up; I'm only giving you a realistic idea of what to expect from FSR 3.
If you recall, FSR 3 was introduced at the RDNA 3 launch, so the new features and performance uplift they were talking about were run on RDNA 3 GPUs.
That also explains why RDNA 3 GPUs are expensive: they added two AI accelerators and one RT accelerator per CU, partly to prepare the hardware for FSR 3.

Does anyone know if FSR3 will be supported for 5000/6000 cards?
also so Fluid motion is back so video playback will change to 60fps?

If not, guess I will be using Vega cards/older cards forever and proabldy end up selling my 5000/6000 GPU's.
You shouldn't; don't join this ray tracing bandwagon... just run your games without ray tracing and you will be good for a couple of years. I watched my son play games on his new computer with a Radeon 6000 card, ray tracing disabled, and still appreciated how crisp the graphics are. So yeah, you will be fine without ray tracing...
 
Nvidia is an utter scam at this point: insane GPU prices, and then they introduce fake frames to give you the delusion that you are playing at higher frame rates.

Now AMD is following suit with all of Nvidia's garbage!

How about creating cheaper and faster GPUs? Jesus Christ, how did we go from 25-50% faster performance for the same price generation over generation to 0-50% faster performance at a 50% higher price?
 
Regardless of whether they are fake, I still like the tech in Witcher 3/CP2077, and I hope AMD's variant is just as good. Don't get me wrong, Nvidia and AMD shouldn't be using this tech in benchmarks to try and sell GPUs, but at least with Nvidia's variant, even as a first-generation implementation, it's already light years better than DLSS 1.0.

I'd say DLSS 1.0 and DLSS frame generation are equally bad.

DLSS frame gen introduces latency and artifacts. HWUB states that the tech is only really useful between 70 - 120 FPS and even then, only in certain titles.

You are getting more frames, but you can't interact with any of those frames. It's nice to look at, but horribly disorienting if your latency is too high, as the disconnect between what you see and your inputs grows.
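A quick arithmetic sketch of that point (illustrative numbers, not from any benchmark): the displayed frame rate doubles, but the game still samples input only once per rendered frame, so responsiveness stays tied to the base frame rate.

```python
# Illustrative numbers only: frame generation raises the displayed rate, but
# generated frames take no new input, so input sampling stays at the base rate.
base_fps = 50.0                       # frames the game actually renders
displayed_fps = base_fps * 2          # with one generated frame in between
input_samples_per_second = base_fps   # unchanged by frame generation

print(f"Shown: {displayed_fps:.0f} FPS, "
      f"input sampled {input_samples_per_second:.0f} times per second")
```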
 
You shouldn't; don't join this ray tracing bandwagon... just run your games without ray tracing and you will be good for a couple of years. I watched my son play games on his new computer with a Radeon 6000 card, ray tracing disabled, and still appreciated how crisp the graphics are. So yeah, you will be fine without ray tracing...
Ray tracing? I was referring to AMD's Fluid Motion features.
 
I'd say DLSS 1.0 and DLSS frame generation are equally bad.

DLSS frame gen introduces latency and artifacts. HWUB states that the tech is only really useful between 70 - 120 FPS and even then, only in certain titles.

You are getting more frames, but you can't interact with any of those frames. It's nice to look at, but horribly disorienting if your latency is too high, as the disconnect between what you see and your inputs grows.

I've actually used it extensively; I don't need HUB or any other site to show me what it does. I like it in 2 of the 5 games I've tried it in, and the artifacts above 100 FPS are impossible to notice. Latency isn't noticeable, as I play both of those games with a controller. The games I don't care for it in are due to UI issues, but aside from that it's also pretty impressive. DLSS 1 wasn't even usable whatsoever in any game; it was pretty much as worthless as FSR.

People need to actually try it, not just watch videos of it, which to me are useless, especially with how YouTube handles anything above 60 FPS.

I don't actually need it for good performance in the two games I like it in, but I still choose to use it.

It is useless for MP games, which I do play a lot, though. But I will probably try it in D4 when it comes out, just to see the implementation.
 
I used to go to GDC for many years... until it got to the point where there was an incredible amount of spin doctoring going on around the products.

So, meh on the announcement. I'll wait until a real review is done.
 
I'm looking into my crystal ball and there's something that seems to be shaping up...

FSR 3 will land, and despite IQ shortcomings it will be reviewed as a DLSS 3/FG killer; virtually overnight a whole demographic of PC gamers will suddenly decide it's killer tech and, since it's acceptable IQ-wise and open, that DLSS 3/FG should go die in a fire.

Well, we already know it's a great and desirable tech, and the more competition the better; that's good for everyone.

Another massively important footnote: it's optional, so if the mere thought of 'fake frames' is a line you've decided should not be crossed, fear not, just don't enable it. Bonus point: all frames are fake frames.
 
I think the interesting thing with FSR 3 here is how aggressively it looks to be set up to drop artifacts, which, whilst that will reduce its upper frame rate, should improve image quality pretty handily. It also looks like it shouldn't increase latency all that much compared to DLSS 3 either.

Honestly, I'm somewhat excited for this one; the architecture looks like a meaningfully different take compared to DLSS 3, unlike FSR 2, which, whilst I applauded the competition, I didn't really care about as an Nvidia user already served by DLSS 2.
 
That also explains why RDNA 3 GPUs are expensive: they added two AI accelerators and one RT accelerator per CU, partly to prepare the hardware for FSR 3.
What? Tell me more about AI in the RDNA 3 architecture. :)
Maybe with GPU architectures we will be waiting until 2026 or later for AI to be added inside them. AI will be part of CPU architectures first.
 
fake frames from amd too, great yaaay
fuckkkkk, i hate this trend

But why, though? The concept makes sense as long as the base number of actually rendered frames is sufficient.
If you render 60 frames per second, the difference between frame 13 and frame 14 is very small; AI with motion vectors could easily make an educated guess at what the frame in between would look like, so really it is fine tech.
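A toy illustration of that "educated guess" (made-up numbers; a real interpolator would also use per-pixel motion vectors rather than a plain average): treat a frame as a tiny grid of brightness values and estimate frame 13.5 from frames 13 and 14.

```python
# Toy example: at 60 FPS consecutive frames differ only slightly, so an
# in-between frame can be estimated from them. Here a "frame" is a 2x2 grid
# of brightness values and the estimate is a plain per-pixel average.
frame_13 = [[0.10, 0.20], [0.30, 0.40]]
frame_14 = [[0.12, 0.22], [0.33, 0.41]]

frame_13_5 = [
    [(a + b) / 2.0 for a, b in zip(row_13, row_14)]
    for row_13, row_14 in zip(frame_13, frame_14)
]
print(frame_13_5)  # ≈ [[0.11, 0.21], [0.315, 0.405]]
```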

I just don't want, same as with the upscaling techs, game devs becoming lazy about optimization, or GPUs advancing less, because this tech will supposedly solve all the performance problems (while obviously costing the same, if not more...).
 
I just don't want, same as with the upscaling techs, game devs becoming lazy about optimization, or GPUs advancing less, because this tech will supposedly solve all the performance problems (while obviously costing the same, if not more...).
This has been the plague of game development for years: a lot of rushed games that rely on the evolution of the hardware instead of being well made.
 
What? Tell me more about AI in the RDNA 3 architecture. :)
Maybe with GPU architectures we will be waiting until 2026 or later for AI to be added inside them. AI will be part of CPU architectures first.
Lol what?

Nvidia has its Tensor cores, which are AI accelerators, in its GeForce GPUs. AMD has some AI hardware in RDNA 3 GPUs now too.

 
Lol what?

Nvidia has its Tensor cores, which are AI accelerators, in its GeForce GPUs. AMD has some AI hardware in RDNA 3 GPUs now too.

So let's wait for the results of that to be revealed to us, because I haven't seen it used in anything home-consumer related. Where is my 2.7x improvement between RDNA 2 and RDNA 3 in games, video playback, or the OS GUI? I know about the MMA; that was in an answer to my question, under a much older news post, from another colleague.
 
I'd say DLSS 1.0 and DLSS frame generation are equally bad.

DLSS frame gen introduces latency and artifacts. HWUB states that the tech is only really useful between 70 - 120 FPS and even then, only in certain titles.

You are getting more frames, but you can't interact with any of those frames. It's nice to look at, but horribly disorienting if your latency is too high, as the disconnect between what you see and your inputs grows.
So I assume you can't play anything on an AMD card, since latency on these things is dreadful compared to Nvidia. Igorslab tested just that: AMD cards on average have much higher latency than Nvidia cards, even with DLSS frame gen enabled, rofl.
 
What? Tell me more about AI in the RDNA 3 architecture. :)
Maybe with GPU architectures we will be waiting until 2026 or later for AI to be added inside them. AI will be part of CPU architectures first.
Sorry, Nvidia's Tensor cores have been part of its GPUs since the Turing generation.

From an architectural standpoint, today's ML cores have much more in common with GPU raster cores than CPU cores.

If you're waiting for AI to be included in CPU architecture, be patient because you may have to wait a long time. A really looooooong time.
 
So I assume you can't play anything on an AMD card, since latency on these things is dreadful compared to Nvidia. Igorslab tested just that: AMD cards on average have much higher latency than Nvidia cards, even with DLSS frame gen enabled, rofl.

Surely you'd be so kind as to link the Igorslab article you are misrepresenting, so I can shred your misinformation to pieces. Otherwise I'll take your comment as the nonsense it is.
 
Ray tracing? I was referring to AMD's Fluid Motion features.

Fluid Motion may still work with it; I haven't finished my build yet to see whether a 7900 series card and Win 11 will support it.
 