
FSR 4 Support Arriving Day One for All Current FSR 3.1 Game Titles According to Leak

It'll happen once there is no fruit left, and I think that'll be sooner rather than later
Time will tell, won't it. Till then we can enjoy the race to push the technology forward through competition, and perhaps a few 'I told you so's along the way from both, uhhh, 'sides'.
 
and possibly announce something to combat MFG.
Why would they invent something new when they already have AFMF and FSR frame generation, and those have been working for over a year? No sane person would combine them, but that is what NV proposes to do with their shiny new MFG garbage.
 
Not sure why people are going crazy for "Upscaling" & "Fake Frames" it's Ludacris!

I'll take quality over Blurry Fake Frames any day.

Destroying gaming quality...

Cheers
 
Hardware-dependent closed solutions are hurting gaming - they're only good for increasing mindshare and revenue of a single company.
Platform- and hardware-agnostic tech used to be a desired trait in the tech world, but for reasons I really don't understand, it is now treated as a con and ignored.

And agreed, it does hurt gaming. As a matter of fact, it is already doing a great deal of damage to it, and the overall openness that was a major point of the PC platform is being rapidly eroded in front of our eyes.
 
"Fake Frames" it's Ludacris!

You calling me a fake frame, b****?

Platform- and hardware-agnostic tech used to be a desired trait in the tech world, but for reasons I really don't understand, it is now treated as a con and ignored.

And agreed, it does hurt gaming. As a matter of fact, it is already doing a great deal of damage to it, and the overall openness that was a major point of the PC platform is being rapidly eroded in front of our eyes.
Nah, Nvidia doesn't have that power, and it certainly won't hinge on whether upscalers work or not, or whether they're any good.

It's a nice-to-have, and that is all it really is right now. People overvalue this shit waaay too much. It's a temporary thing.

Devs are never going to stop or go depending on whether X or Y upscaler might work or not. Same thing as RT: it's there by grace of the current motions, but it's totally not required to make a good game. As long as single devs in attics and small teams can release content that takes the world by storm (and they do, more than ever before in the history of gaming; it's been one surprise hit after another over the last 10 years), the PC is as secure as it'll ever be. Everything else is just dancing around that reality, trying to exercise control they can never truly gain.
 
You calling me a fake frame, b****?


I try to refrain from calling them fake frames and rather refer to them as interpolated frames, although I still slip up from time to time... Technically, everything you see in a game is visually faked; it just comes down to how you like your image faked lol.
 
I try to refrain from calling them fake frames and rather refer to them as interpolated frames, although I still slip up from time to time... Technically, everything you see in a game is visually faked; it just comes down to how you like your image faked lol.
By my definition:
"Normal" frames = ones generated using geometry data, physics and user input.
Fake frames = ones generated using data from another frame.
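To make the distinction concrete, here's a minimal sketch (plain Python/NumPy, not any vendor's actual frame-gen code; the function names are made up for illustration). The "normal" frames depend on game state and input, while the generated one is derived purely from its two neighbours:

import numpy as np

def render_frame(game_state: float) -> np.ndarray:
    # Stand-in for the real renderer: the output depends on state/input.
    return np.full((4, 4), game_state, dtype=np.float32)

def interpolate_frame(prev: np.ndarray, nxt: np.ndarray, t: float = 0.5) -> np.ndarray:
    # Generated frame: built only from data in other frames, no new input read.
    return (1.0 - t) * prev + t * nxt

frame_a = render_frame(game_state=0.0)            # "normal" frame
frame_b = render_frame(game_state=1.0)            # "normal" frame
frame_mid = interpolate_frame(frame_a, frame_b)   # generated/"fake" frame
print(frame_mid[0, 0])                            # 0.5: a halfway blend

Note that frame_mid can only be shown once frame_b already exists, which is where the extra latency of interpolation-based frame generation comes from.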
 
Wasn't open and democratized the name of the game with this swanky new cons00mer AI? ;)

It'll happen once there is no fruit left, and I think that'll be sooner rather than later. Look at how fast AMD is catching up on feature set now. If there is nothing that sets these upscalers apart, there's no point wasting money on three approaches anymore.
We might also end up with a GPGPU situation:
- Nvidia immediately went for a closed API.
- Apple collaborated with AMD, Intel, and Nvidia to make OpenCL, believing that open source was the way to democratize GPGPU.
- Nvidia made a massive sweep by going all-in on CUDA, while the other players stayed fairly passive and wished for the best.
- We now have CUDA/Metal/HIP/oneAPI, with HIP and oneAPI somehow being open source but only backed by the company that created each. And Apple had to make Nvidia persona non grata on macOS so that Metal could have a fighting chance.
Devs now have to work with 4 different APIs, with no unification in sight, and DirectX's DirectCompute seeming to be a fucking joke (OptiX merely being CUDA optimized for offline 3D RT/denoising).
 
By my definition:
"Normal" frames = ones generated using geometry data, physics and user input.
Fake frames = ones generated using data from another frame.

It's more like adaptive frames than fake. A truly fake one would be generated from no base data, in essence from scratch. AI text-to-image is more of a fake frame, and even that is based on data and training by extension, but the training algorithms certainly aren't perfect and can't fully guess what we intend or mean with our prompts. A hundred people can ask an AI to draw a bird, but they aren't all necessarily thinking of the same type of bird. Does it make the results wrong or fake if they differ or vary? No, the AI is just pulling at random from data on many types of birds and doing its best to come up with an acceptable bird of any type, since nobody specified whether it's an eagle, a hawk, a blue jay, or a purple flamingo with a beaver tail that breathes butterflies on fire.
 
By my definition:
"Normal" frames = ones generated using geometry data, physics and user input.
Fake frames = ones generated using data from another frame.

I get what you mean, but technically the frame is fake regardless of whether it's being generated locally or not. Developers have to fake all sorts of stuff just so we have a cohesive image to begin with. GPU makers are just helping them fake it more, input latency and image quality be damned...
 
We might also end up with a GPGPU situation:
- Nvidia immediately went for a closed API.
- Apple collaborated with AMD, Intel, and Nvidia to make OpenCL, believing that open source was the way to democratize GPGPU.
- Nvidia made a massive sweep by going all-in on CUDA, while the other players stayed fairly passive and wished for the best.
- We now have CUDA/Metal/HIP/oneAPI, with HIP and oneAPI somehow being open source but only backed by the company that created each. And Apple had to make Nvidia persona non grata on macOS so that Metal could have a fighting chance.
Devs now have to work with 4 different APIs, with no unification in sight, and DirectX's DirectCompute seeming to be a fucking joke (OptiX merely being CUDA optimized for offline 3D RT/denoising).
Are you suggesting that it's AMD/Intel/Apple's fault that Nvidia pushed their own standard so much and devs jumped on it?

It's more like adaptive frames than fake. A truly fake one would be generated from no base data, in essence from scratch. AI text-to-image is more of a fake frame, and even that is based on data and training by extension, but the training algorithms certainly aren't perfect and can't fully guess what we intend or mean with our prompts. A hundred people can ask an AI to draw a bird, but they aren't all necessarily thinking of the same type of bird. Does it make the results wrong or fake if they differ or vary? No, the AI is just pulling at random from data on many types of birds and doing its best to come up with an acceptable bird of any type, since nobody specified whether it's an eagle, a hawk, a blue jay, or a purple flamingo with a beaver tail that breathes butterflies on fire.
I get what you mean, but technically the frame is fake regardless of whether it's being generated locally or not. Developers have to fake all sorts of stuff just so we have a cohesive image to begin with. GPU makers are just helping them fake it more, input latency and image quality be damned...
That was my definition. I never said you have to agree with it, but I'm sticking to it. A game is based on user input-output. Anything the engine does outside of it is fake. It increases latency for no reason.
 
Fake it til you make it. The way it's meant to be played. It's really adapting previous frame data rather than faking it. FXAA is as much a 'fake frame' as frame interpolation or upscaling, which spatially creates additional pixels that didn't exist from the existing ones through clever quantized algorithm approaches.
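For what it's worth, here's a crude sketch of what "pixels that didn't exist" means for spatial upscaling (plain Python/NumPy bilinear blending, far simpler than FSR/DLSS/XeSS, but the same basic idea of deriving new samples from existing ones):

import numpy as np

def bilinear_upscale(img: np.ndarray, factor: int) -> np.ndarray:
    # Produce a larger image whose extra pixels are blended from the
    # nearest source pixels; none of them existed in the original.
    h, w = img.shape
    out = np.zeros((h * factor, w * factor), dtype=np.float32)
    for y in range(h * factor):
        for x in range(w * factor):
            sy, sx = y / factor, x / factor          # position in source space
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
            bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
            out[y, x] = (1 - fy) * top + fy * bot
    return out

low_res = np.array([[0.0, 1.0],
                    [1.0, 0.0]], dtype=np.float32)
print(bilinear_upscale(low_res, 2))   # 4x4 image full of invented in-between pixels

Every pixel in the output is computed from the four nearest source pixels; the real upscalers just use much smarter weighting (and, for the ML-based ones, a trained network) to decide those blends.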
 
That was my definition. I never said you have to agree with it, but I'm sticking to it. A game is based on user input-output. Anything the engine does outside of it is fake. It increases latency for no reason.

By your definition, using TAA would make a frame fake... Almost every game uses it...

TAA, which stands for "Temporal Anti-Aliasing", works by combining information from previous frames with the current frame to smooth out jagged edges on moving objects, essentially "blending" pixels from multiple frames to create a more refined image.

It also technically has a performance cost, increasing latency...
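Roughly, that temporal blend looks like this (a toy Python/NumPy sketch assuming simple exponential history accumulation; real TAA also reprojects the history with motion vectors and clamps it against the current frame):

import numpy as np

def taa_resolve(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    # Keep ~90% of the accumulated history and mix in ~10% of the new frame.
    return (1.0 - alpha) * history + alpha * current

history = np.zeros((4, 4), dtype=np.float32)
for _ in range(8):
    current = np.random.rand(4, 4).astype(np.float32)   # stand-in for a new jittered frame
    history = taa_resolve(history, current)
# 'history' now contains data blended from many past frames, which is both
# why edges look smoother and why fast motion can smear or ghost.
print(history.mean())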
 
By your definition, using TAA would make a frame fake... Almost every game uses it...

TAA, which stands for "Temporal Anti-Aliasing", works by combining information from previous frames with the current frame to smooth out jagged edges on moving objects, essentially "blending" pixels from multiple frames to create a more refined image.

It also technically has a performance cost, increasing latency...
I knew this would come up. ;)

That's probably why everybody hates it. It works with data from different frames that have nothing to do with the current one, and therefore isn't entirely accurate.
 
I knew this would come up. ;)

That's probably why everybody hates it. It works with data from different frames that have nothing to do with the current one, and therefore isn't entirely accurate.

I like TAA less than I like frame gen, and I currently don't even like frame gen in almost any scenario... On a monitor, anyway; on a TV it's fine with a controller from at least 10 feet away...


Personally, I hate the direction both technologies are going... especially now, with GPU makers including them in their benchmarks and gamers going "man, I love me some of dat"...
 
I think everybody should team up and create an industry standard like DirectX or OpenGL and call it OpenUpscale or something.

Hardware-dependent closed solutions are hurting gaming - they're only good for increasing mindshare and revenue of a single company.

There’s no money in open.

Proprietary = profitable. Evidence: Nvidia vs AMD
 
Are you suggesting that it's AMD/Intel/Apple's fault that Nvidia pushed their own standard so much and devs jumped on it?
What I've heard from devs in the 3D industry is that CUDA wasn't just artificially pushed by Nvidia; OpenCL was also a pain to work with by comparison, especially with Apple. Getting help to fix issues wasn't as easy, and devs hit bugs with OpenCL that weren't an issue with CUDA. Blender removed OpenCL support entirely once AMD provided HIP because the performance wasn't good, and Redshift3D only started to support AMD once HIP was available as well.
OpenCL is still popular in some areas, but that API is also scaring developers away in other fields. OpenCL isn't "dead"; they just decided to avoid using it, since weird stuff kept happening with it and no sign of improvement was in sight.
OpenCL rendering support was removed. The combination of the limited Cycles kernel implementation, driver bugs, and stalled OpenCL standard has made maintenance too difficult. We are working with hardware vendors to bring back GPU rendering support on AMD and Intel GPUs, using other APIs.
I have some serious reservations about the lack of an Nvidia GPU option, partially because of poor OpenCL support from Apple compared to CUDA support from Nvidia. So I did some digging and contacted the developers of Neat Image, a denoise plugin for images and video that supports both CUDA and OpenCL. They confirmed my impression that Apple has lackluster support compared to Nvidia's CUDA support for OS X or even AMD's OpenCL and Nvidia on Windows and Linux.
A pro with serious workstation needs reviews Apple’s 2013 Mac Pro - Ars Technica
Cycles - Blender Developer Documentation

It would be one thing if CUDA only made it because Nvidia gave people money, but CUDA also made it because it was easier to use. I refuse to believe that Nvidia poached all the talent in the US, meaning that if they aren't involved in something, it's doomed to fail.
 
And as a consumer, why would I need to support that?

That's something every consumer has to ask themselves, though, and neither conclusion is really wrong.

People just need to buy whatever gives them the most for their money. The thing is, that isn't going to be the same for every person: for me, that was the 4090 last generation; for someone else, that might have been a 4060 or 7800 XT. None of the choices are wrong, though.

People who love frame gen are gonna love MFG; it's basically just frame gen on crack, for better or worse.

AMD does not have a high bar this generation to make a compelling product. The GeForce products under $1,000 are 4060 Ti-like improvements over their Super counterparts, so if there was a gen to turn it around, it would be this one. Hopefully both FSR 4 and the 9000 series are awesome, not because I am likely to use or buy either, but because good products benefit everyone.
 
Because without profit companies don’t exist.
That's not a reason. You can buy products that work with open standards. For example, FSR and XeSS work on every modern GPU, but AMD and Intel are still making money on the GPUs themselves. We've got Linux for free, maintained by the community for free. The music industry isn't going tits up just because I'm not buying, er, renting my music on Amazon or iTunes.
 
There’s no money in open.
I'd argue there is some money in it; there are clearly people who will put open/hardware-agnostic over a solution that's proprietary but qualitatively superior. So much so that they will even decry and refuse to use the feature itself, yet see fit to draw a line in the sand over it.

And then there are the people to whom that facet of the argument is effectively below the waterline: they pay more to get the better feature(s), and they get access to the hardware-agnostic ones too.

I don't think either buyer is necessarily right or wrong, and I don't see a moral high ground to be claimed either. Those who innovate and have the most desirable product get to charge for it, then the open standards follow and eventually take over.
 
That's not a reason. You can buy products that work with open standards. For example, FSR and XeSS work on every modern GPU, but AMD and Intel are still making money on the GPUs themselves. We've got Linux for free, maintained by the community for free. The music industry isn't going tits up just because I'm not buying, er, renting my music on Amazon or iTunes.

Have you seen AMD’s and Intel’s financials? They are losing their asses on GPUs.

Linux for free? Take out all the for profit developers that work on it and you have nothing. Do you know who makes 75% of the kernel commits? Intel. They aren’t doing it out of the goodness of their heart. They do it because it makes them a profit.

Do you really think AMD is being altruistic when they open their software? They open their software with the hope that someone develops it for them, because they suck at software. Evidence: ROCm.
 
Have you seen AMD’s and Intel’s financials? They are losing their asses on GPUs.
They are right now because everyone's into Nvidia's proprietary technology.

Linux for free? Take out all the for profit developers that work on it and you have nothing. Do you know who makes 75% of the kernel commits? Intel. They aren’t doing it out of the goodness of their heart. They do it because it makes them a profit.
Which for-profit developers? We need Wine and Proton for compatibility with Windows software because there are so many for-profit organisations developing native Linux software, obviously. :rolleyes:

Do you really think AMD is being altruistic when they open their software?
No, it's a business plan just the same. You can develop an open technology and hope that customers will catch up. I'm just saying that I find this business plan more likeable than locking people into a closed standard.
 