
New idea to get MORE FPS in games, similar to WideScreen (flawlesswidescreen.org) but zoomed in

Hello,

The idea is similar to the flawlesswidescreen.org software.

The idea is to CUT the game's viewport so that it DOES NOT RENDER the top and bottom of the screen, replacing them with black bars and saving on pixels.

This would allow high and ultra settings while keeping FRAME TIMES low, with no GPU scaling (which adds latency and reduces quality).
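
As a rough sketch of the pixel savings being proposed here (the 4K panel size and the ~21:9 crop are my own illustrative numbers, not from the post):

    # Hypothetical numbers: crop a 16:9 4K frame to roughly 21:9 with black bars.
    NATIVE_W, NATIVE_H = 3840, 2160         # full 16:9 panel
    CROPPED_H = round(NATIVE_W / (21 / 9))  # image height at ~21:9

    native_pixels = NATIVE_W * NATIVE_H
    cropped_pixels = NATIVE_W * CROPPED_H
    savings = 1 - cropped_pixels / native_pixels

    print(f"Rendered area: {NATIVE_W}x{CROPPED_H}, black bars: {NATIVE_H - CROPPED_H} px total")
    print(f"{cropped_pixels:,} of {native_pixels:,} pixels rendered ({savings:.0%} fewer)")

That works out to roughly 24% fewer pixels, which only turns into lower frame times to the extent the game is shading- or fill-rate-bound; a CPU-bound game would gain little.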

This would require modding and software changes to the game, the driver, and maybe the OS.

If anyone here works with Nvidia, AMD, Intel, or the game engine makers, please send them this idea.

Also please share your thoughts on this. No negativity allowed.
 
That's part of the reason why variable rate shading (VRS) was invented, and it doesn't even need black borders. Look it up. Unfortunately, not many game devs are using it.
 
Create a new custom resolution and set your display/GPU not to scale to full screen. Voila! Problem solved. Like your proposal, it wastes screen space, but it will do what you want. I would use upscaling instead, but it looks like you want native resolution. Another solution: get a display with a lower native resolution.

Edit: Easiest way: select a lower-than-native resolution in the game and play in windowed mode. If native resolution is important, CRT displays are great for that. They are harder to find nowadays and probably more expensive new, but used ones can be had even for free.
 
Viewport dimensions/aspect ratios are part of the game's design. Cutting them to save performance would be no different than, say, modding out enemy NPCs because it would save some CPU cycles. It does improve performance, but you're sacrificing aspects core to the gameplay itself, not just some eye candy.

Besides, letterboxes are awful even in games that natively support them. The Evil Within shipped with one; people hated it so much that it got patched out.
 
Also please share your thoughts on this. No negativity allowed.


Two points:


1. The meaning of "forum":

a meeting or medium where ideas and views on a particular issue can be exchanged.
"we hope these pages act as a forum for debate"

Debate can include negativity and is perfectly valid if it is backed up with good rationale and reasoning.


2. We have the owner and moderators on the site to enforce the rules and ensure we know them... not you
 
What's important is choice for the end user.

I would like to NOT render a lot of pixels or other things during 4K max-settings gameplay of certain titles. I would certainly like to be able to turn on all sorts of variable rate shading and similar techniques to reduce load, while keeping 4K HDR crispness and latency variability below 1 ms.

The low milliseconds are important; every ms counts.

The sensitivity to latency comes from fast-action gameplay such as Quake Champions. For example, there are "flick shots" that only really land when FPS is above a certain number and do not really happen otherwise.
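
To put "every ms counts" into numbers, here is a quick frame-time sketch (the FPS values are just illustrative):

    # One frame takes 1000 / FPS milliseconds.
    for fps in (60, 120, 144, 240, 360):
        print(f"{fps:>4} FPS -> {1000 / fps:5.2f} ms per frame")

Going from 120 to 240 FPS only removes about 4.2 ms per frame, so keeping latency variability under 1 ms demands both very high and very consistent frame rates.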

If someone knows some technical details, please share some diagrams and facts.

Please do not post unless you can shed some light.
 
I would like to Not render a lot of frames during 4K max settings gameplay of certain titles ... while keeping 4k HDR crispness and the latency variability below 1ms.

What you are asking for is upscaling, and quite different from what your original post described. If you do not want to use DLSS, FSR or XeSS, you have to select a lower-than-4K resolution in your games and have it upscaled by the GPU. In the Adrenalin software, for example, there is a GPU scaling option with settings to preserve aspect ratio and to use integer scaling, and I bet NVIDIA has such options too. Not so sure about Intel, but if you are on Intel you probably have different problems.
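
On the integer scaling option mentioned above, a small sketch of which render resolutions scale cleanly into a 4K panel (the listed resolutions are my own examples):

    # Integer scaling maps each rendered pixel to an exact NxN block of display
    # pixels, which keeps the image sharp. It only works cleanly when the display
    # resolution is the same integer multiple of the render resolution on both axes.
    DISPLAY_W, DISPLAY_H = 3840, 2160

    for w, h in [(1920, 1080), (1280, 720), (2560, 1440), (960, 540)]:
        clean = (DISPLAY_W % w == 0 and DISPLAY_H % h == 0
                 and DISPLAY_W // w == DISPLAY_H // h)
        if clean:
            n = DISPLAY_W // w
            print(f"{w}x{h}: clean {n}x{n} integer scale")
        else:
            print(f"{w}x{h}: needs non-integer scaling (softer image)")

Non-integer factors (like 2560x1440 on a 4K panel) force interpolation, which is part of the sharpness cost mentioned below.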

But you will always pay a certain price: latency, FPS, sharpness, or some combination of them.

If you want maximum performance with best latency at Ultra settings: Buy an RTX 4090.


PS: About flawlesswidescreen.org:
"Flawless Widescreen was created in an effort to make it easier to craft fixes and patches to get games functioning correctly in UltraWide/Surround/Eyefinity gaming resolutions, often developers neglect these types of users leaving them to fend for themselves and find their own solutions, or in some unfortunate cases - live without the wonderful world of ultra-wide support."
That is actually the complete opposite of what you were asking for in the original posting.
 
No, not upscaling.

I'm unexcited by the 4090 because its ray tracing FPS is not impressive. I'm saving money for the next gen while using the previous gen.

I am excited about optimizations that will be relevant for every machine.

The idea is to improve the experience while keeping native 4K detail, so the look of the game world is more immersive, engaging and visually satisfying, while also having extremely low total end-to-end lag (input lag).

I do not know how to use DLSS with Quake Champions. I have tried turning down the in-game resolution and the render percentage. I have extensively tested all options and combinations.

I added a custom resolution which adds black bars top and bottom, but then the game zoomed out and rendered even more pixels for "widescreen", which is wrong.

It should keep the zoom and simply CUT the picture. That is the ideal result. This should be added as an option by the devs.

Similar to how 4:3 would work, keeping that zoom.
 
What you are asking for is upscaling, and quite different from what your original post described.
They are asking for a letterbox option.
Same trick some console games used back in the day to save performance (sometimes hiding behind "cinematic" BS. See The Order: 1886, The Evil Within).
The solution in your first post is probably as close as they can get to their goal...

I added a custom resolution which adds black bars top and bottom, but then the game zoomed out and rendered even more pixels for "widescreen", which is wrong.
A greater horizontal-to-vertical ratio automatically means a greater horizontal field of view, all other things being equal.

The game didn't "render more pixels." It rendered exactly the amount you set in the custom resolution. It drew the scene using a wider lens, perhaps, but that has nothing to do with output resolution. If your custom resolution was set to, say, 1920x900, then you are rendering fewer pixels than at 1920x1080, even if the camera shows more of the world.

You can play around with FoV settings (although not all titles expose them) to try to reach the visual results you want; this, however, won't change the number of pixels your computer is churning out.
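
For the "wider lens" point: with the Hor+ convention many modern engines use, the vertical FOV stays fixed and the horizontal FOV grows with aspect ratio, which is why a wider custom resolution shows more of the world while still drawing fewer pixels. A quick sketch (the 59-degree vertical FOV is just an example value):

    import math

    # Hor+ scaling: vertical FOV stays fixed; horizontal FOV follows the aspect ratio.
    def horizontal_fov(vertical_fov_deg, width, height):
        v = math.radians(vertical_fov_deg)
        return math.degrees(2 * math.atan(math.tan(v / 2) * (width / height)))

    for w, h in [(1920, 1080), (1920, 900), (2560, 1080)]:
        print(f"{w}x{h}: {w * h:,} pixels, hFOV = {horizontal_fov(59, w, h):.1f} deg")

1920x900 comes out with a wider horizontal FOV than 1920x1080 yet about 17% fewer pixels, exactly the behaviour described above.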

This should be added as an option by the devs.
While it shouldn't be technically difficult (reducing the vertical extent of the camera frustum/rect while maintaining output image size and horizontal FoV, with a black background fill, would do the trick), it would require level designers and directors to do double the work to ensure their scene composition fits both the normal and letterboxed modes, not to mention reworking UI, menus, etc. And that's not really worth doing for performance's sake when you have other options like lowering the render resolution or, for more granular control, independently reducing the resolution of shadows, effects, textures, and in some cases even meshes, none of which require much extra work on the level/game design side. Let's not even get into DLSS and VRS.
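
A minimal sketch of the viewport math described above (same output size and horizontal FoV, reduced height, black bars filling the rest); the 2.4:1 target aspect is my own example:

    # Letterboxed viewport inside a full-screen output.
    def letterbox_rect(out_w, out_h, target_aspect):
        view_h = round(out_w / target_aspect)   # cropped image height in pixels
        bar = (out_h - view_h) // 2             # height of each black bar
        return 0, bar, out_w, view_h            # x, y, width, height of the viewport

    x, y, w, h = letterbox_rect(3840, 2160, 2.4)   # e.g. a "cinematic" 2.4:1 crop
    print(f"viewport {w}x{h} at ({x},{y}); renders {w * h / (3840 * 2160):.0%} of native pixels")

The engine would render only into that rect and clear the rest to black; the UI rework and scene composition mentioned above are where the real effort lies.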
 
No need to fiddle around with driver hacks. Just select a screen mode with a wider aspect ratio than your monitor's. You may have to create a custom mode to do it, though, for example 1920 x 1000.

NVIDIA has a custom mode maker built into its driver control panel. Not sure about AMD.
 
This can basically be solved with a CRT screen. I can play games at any resolution I want and it'll still look good; I ran Cyberpunk at 800x600 and got 130 FPS, which looked good enough for me. No need for black bars or upscaling, plus no ghosting or lag.

If LCD had never been commercially viable we could have thicc widescreen CRTs, but noooo
 
If LCD had never been commercially viable we could have thicc widescreen CRTs, but noooo
I actually owned one of the few widescreen CRTs ever made. The thing even had an HDMI input. Frankly, it was a pain in the ass and I don't miss it.
 
I actually owned one of the few widescreen CRTs ever made. The thing even had an HDMI input. Frankly, it was a pain in the ass and I don't miss it.
ow why a pain in the a?? I'd love one
 
ow why a pain in the a?? I'd love one
Too big, and I never could get the alignment at the edges perfect.

The picture was fine otherwise, naturally. It's been gone for years now.
 
I actually owned one of the few widescreen CRTs ever made. The thing even had an HDMI input.
I don't like you any more. <qubit throws toys out of pram and sulks>
 
I don't like you any more. <qubit throws toys out of pram and sulks>
It's probably in a dump near Hawks Prairie in Washington. Just go digging?
 
The question is: why buy a 4K display in the first place if you want better frame times/latency/etc. and maybe even (ultra)wide resolutions? In that case you are better off buying a high-refresh-rate 2560x1080 display and playing games at native resolution.
 
Ideally, to help both consumers and creators, this would be built in and easy. Games would have all the pixel-reduction and similar features available and allowed by the graphics driver.

The game + driver would scale to the existing hardware first, and simply inform the user with a side-by-side picture, so the user can decide what to get while playing.

This makes sense because even high-end machines and combos have some limitations that need to be managed this way.

It is not quite correct to separate gaming things and gamers from normal computer users. It is better to think of gamers as computer users who enjoy some games a little bit more and are better at some games.

From an economics perspective there is no difference between a computer user and a gamer. Both want LOW LATENCY and quality in all apps.

I don't know if decision makers on the game and hardware side will read this.

In Quake Champions I achieved 2x+ results (KDR, kill/death ratio) by doing what is mentioned in post 3, plus putting the keyboard and mouse behind the display and moving the display closer to my eyes.

It works to get a better gaming experience with the 4K display I already have (a high-end laptop that I spent a lot on).

The same trick will work on some other machines that my friends still use. They do not want to buy anything. Maybe I can buy new friends? Lol.
 
It is not quite correct to separate gaming things and gamers from normal computer users. It is better to think of gamers as computer users who enjoy some games a little bit more and are better at some games.
Of course it is. Not everyone wants a 4K 240Hz monitor, or a motherboard full of RGB, or a tempered glass case; those are "gaming" things, not for us puny normal computer users who just want a thing that works.

I like video games and play them from time to time, but I'm not a gamer. An older term for me was "power user" or "geek", but the term geek has been so normified and overused by now that I don't like it anymore.

From an economics perspective there is no difference between a computer user and a gamer. Both want LOW LATENCY and quality in all apps.
What do you want low latency for in MS Word? Or while reading your e-mail? It's not much of a problem, and you don't need 4K to read an email or talk to your friends in a chat program. And what's with the HUGE monitors? I get that 27" or bigger is nice to play games on, but I assure you it's terrible to stare at for more than 30 minutes in an office environment, unless you're a "content creator" working in marketing, video editing, etc.
The company I work at has 19" monitors throughout, and being in the IT department, I haven't had any complaints about them. Oh, and 720p for everyone!
 
I have run some busy forums and sites in the past, as well as other community initiatives. Been there, done that.

Many people seem to work SLOWLY with computers. This is extremely wrong. Very strange to justify LOW PERFORMANCE.

From an evolutionary perspective, machines are tools and weapons to hit targets with, precisely.

I guess there is no point in arguing; when your users become smarter and know better, they will start BYODing their own stuff, unless other factors are also limiting them?

I hate limits and low performance.

I am so happy I can buy any low latency gear on the market. Thanks to everyone who understands low latency, quality and productivity.

Replace the displays with QD-OLEDs and watch productivity increase 2x+.
 