
Control Benchmark Test & RTX Performance Analysis

If my rig can't do what you say yours can, then you are clearly lying. Even with a 2080 Ti, you will only be getting an extra 5-10 fps over my 2080, and at the settings you claim, my rig sits in the 20-30 fps range at times.

So your system will only be in the mid-30s at best, and mid-30s fps is nobody's definition of 'smooth as silk'.

I've got my GPU set at +100, and the only thing I have dialled back is MSAA at 2x. I'm getting 49 to 60 fps with no stutter, and that looks good to me, which is all that matters.

Note the FPS counter in the attached screenshot (1.jpg).
 
This game seems to be badly optimised across the board (for both AMD and Nvidia), especially for the visuals that you get.

Anyway, I don't think there's anything wrong with Wizz's numbers. Other reviewers are seeing similar performance (taking into consideration the settings being used when testing).

OC3D: https://www.overclock3d.net/reviews/software/control_pc_performance_review_optimisation_guide/10
PCGamer: https://www.pcgamer.com/control-system-requirements-settings-benchmarks-and-performance-analysis/
 
Is there any chance that the AMD fanboys on this forum will finally learn how RT works and what it affects?
It's been months. How long will we be tortured with "mirror reflections worked in the 90s"?
:-(
Mirror reflection on blood is awesome :D
 
This game seems to be badly optimised across the board (for both AMD and Nvidia), especially for the visuals that you get.

Apparently it's even worse on consoles LOL.
 
Mirror reflection on blood is awesome :D
That's down to a poor decision by the devs to treat blood like water; it doesn't detract anything from RTRT. It's a new technology in games, and everyone is learning.
I for one am pretty eager to see what the 2nd generation of nvidia RTX cards, AMD's first RT-enabled cards and next-gen consoles can do.
 
I'm playing this game with the following settings on my RTX 2070 Super, at 1080p render resolution with DLSS upscaling to 4K:
- RTX: High
- Far Object Detail: High
- Global Reflections: Off
- SSR: Off
- Shadow Resolution: High
- Shadow Filtering: Medium
- SSAO: Off
- Texture Filtering: High
- Texture Resolution: High
- Volumetric Lighting: Medium

I am able to get 40-60 fps, which is completely acceptable to me. The DLSS does a pretty good job of cleaning up the image. It is definitely much sharper than 1080p native. I think the effects are great. There's something really cool about seeing a reflection of a light on the ceiling in a puddle, or catching a glimpse of glass on a picture reflecting what is in front of you. It truly is something that you have to see in motion to realize how realistic it makes the graphics feel. I had trained myself to ignore shadow and lighting errors in other games. When I started playing Control, it was very impressive to see the correct reflections and shadows everywhere.
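
Just to put the DLSS trade-off into rough numbers, here's a quick back-of-the-envelope Python sketch I threw together (my own math, not anything from Remedy or Nvidia; real frame rates don't scale perfectly with pixel count, so treat the ratio as an upper bound on the headroom upscaling buys):

# Back-of-the-envelope sketch (my numbers, not the game's): how many pixels
# the GPU shades at the internal render resolution vs. a native 4K target.
# Frame rate doesn't scale perfectly with pixel count, so the ratio is only
# a rough upper bound on the headroom DLSS upscaling buys.

def pixels(width, height):
    return width * height

render_1080p = pixels(1920, 1080)   # internal render resolution with DLSS
native_4k = pixels(3840, 2160)      # output resolution DLSS upscales to

print(f"Native 4K shades {native_4k / render_1080p:.1f}x the pixels of the 1080p render target")
# -> Native 4K shades 4.0x the pixels of the 1080p render target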

I know I would still enjoy it at 1440p or 4k with the RT off, but it is truly a sight to see and I'm more than happy to deal with some lower framerates and resolution to see something so clearly transformative. The motion blur does a good job of keeping things fluid even below 60 fps in my opinion. For those who say they don't see much of a difference, I'm guessing they're just looking at photos. It is truly a sight to see in person when you can explore the angles of the reflections and see how it all works exactly as you would expect it in real life.

Without all the reflective surfaces, it wouldn't be nearly as interesting. I think that's why Battlefield was a bit of a joke. So it's not a be-all, end-all technology. But I'm definitely not turning any of the RT features off because it's truly the first time I've felt like graphics have obviously improved in the last 10 years.

Also: LOL at the 5700 XT vs 2070 Super. We're looking at 14% better frames at 1080p, 17% better at 4k and 1440p. That should shut up the fanboys who insist the 5700 XT is just as fast as the 2070 Super.
 
Having RTX on makes this game next-level visually. I'm fighting across a hallway full of destructible glass panels, and shooting through glass and seeing the reflections of the enemy in the remaining panes just makes me feel giggity. Playing at 3440x1440 with everything maxed and DLSS, I'm getting 70 fps with a 2080 Ti, which is quite smooth actually, and I have no problem with aiming; the game probably has a built-in low latency mode.
 
Having RTX on makes this game next-level visually.

I actually turned on the driver-level low latency mode and it felt a bit better. "On," not "Ultra." I was initially running at 1440p render resolution and at 30 fps, the low latency mode made a difference.
 
I actually turned on the driver-level low latency mode and it felt a bit better. "On," not "Ultra." I was initially running at 1440p render resolution and at 30 fps, the low latency mode made a difference.

Ultra low latency mode only works in DX11. I'm using DX12, but it feels really responsive despite the FPS being lower than what I'm accustomed to.
 
Ultra low latency mode only works in DX11. I'm using DX12, but it feels really responsive despite the FPS being lower than what I'm accustomed to.

Huh, I forgot about that. Guess I fooled myself! I agree though, the game feels responsive and I think the motion blur helps reduce the feeling of judder or high input latency.
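
For anyone wondering why a low latency toggle feels more noticeable at 30 fps than at 60, here's some quick frame-time math in Python. The queue depths are just illustrative assumptions on my part, not what the driver actually does in Control:

# Quick frame-time math: every frame sitting in the render queue adds
# roughly one frame time of input latency, so the penalty grows as fps
# drops. The queue depths (2 vs 1) are illustrative assumptions only.

def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 144):
    ft = frame_time_ms(fps)
    print(f"{fps:>3} fps: {ft:5.1f} ms per frame | "
          f"~{2 * ft:5.1f} ms with a 2-frame queue vs ~{ft:5.1f} ms with 1")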
 
For those interested, I've gathered my lowest- and highest-FPS screenshots, as well as my settings. The only setting not turned on is film grain; everything else is as high as it will go. You can see from the OSD that usage is not really all that extreme on anything.
[Eight screenshots attached (control-screenshot-2019-09-02 through 2019-09-04), showing the settings and the in-game OSD.]
 
As in, the nvidia graphics cards? Someone had to do it first, and as the company with the most funds available, it had to be nvidia.
Ray tracing, particularly in this game, is next level amazing. The hardware just needs to get (much) better now so we can all take advantage of it.

I don't think it looks good at all. Everything looks a bit washed out in the background, and the foreground is just odd; I can't put words to it yet, but all that comes to mind right now is "uncanny valley", and not in the good sense. Referencing Metalfiber's screens above, those reflections are just wrong in intensity and in direction (not following the perspective of the camera's position or the character), but I feel like this was deliberate: since the camera is third-person, rendering reflections from the camera's perspective would be wrong, and rendering them from the character's perspective would probably be awkward. Then there's the carpet reflecting light.
 
As in, the nvidia graphics cards? Someone had to do it first, and as the company with the most funds available, it had to be nvidia.
Ray tracing, particularly in this game, is next level amazing. The hardware just needs to get (much) better now so we can all take advantage of it.


Thing is, doing it this generation before they had a die shrink meant they had to choose: sacrifice 4K/high-framerate gains or sacrifice limited-use-case RTX. They chose poorly. I'd have preferred they waited one more generation for the upcoming die shrink, solidified 4K/high-framerate performance across the 20-series product stack, used the die shrink to make plenty of room for new RTX hardware, and then differentiated the 30 series (or whatever) by varying levels of RTX performance in the next gen. Waiting one generation isn't really asking for much if it improves everyone's performance in EVERYTHING up to baseline levels to set the stage for RTX.

Instead, the majority of the stack can't hit 4K/high framerates, and RTX is bringing everyone down to 1080p, forcing people to consider upscaling as their only recourse. I hope nvidia fixes this with their next GPU.
 
I don't think it looks good at all.
Some of it comes down to artistic vision, some of it comes down to getting carried away (and it sort of being a showcase game for RTX/RTRT), like the ultra-reflective blood. Haven't really felt anything out of perspective myself, but it could be down to the camera (3rd person).

Thing is, doing it this generation before they had a die shrink meant they had to choose: sacrifice 4K/high-framerate gains or sacrifice limited-use-case RTX. They chose poorly.
That's the early adopter's fee (besides the obvious monetary fee as well). And this way, when the next cards come out, you already have some games available to try it out, some devs already have some experience with it and many others are developing games with it. So it's not all bad.
 
Also: LOL at the 5700 XT vs 2070 Super. We're looking at 14% better frames at 1080p, 17% better at 4k and 1440p. That should shut up the fanboys who insist the 5700 XT is just as fast as the 2070 Super.
Who is claiming that? They clearly have no idea of specs or pricing if they are claiming that!

  • 5700 at $350 was aimed as direct competition for the (original) $350 RTX2060;
    Nvidia haven't replaced the RTX2060 at the $350 price point, so that comparison still stands.

  • 5700XT at $400 was aimed at the (original) $529 RTX2070, and it's a close match;
    Nvidia damn-near rebranded the RTX 2070 into a 2060 Super to compete with the 5700XT.

The 2070 Super comes from 2080 silicon at almost double the price point. There is no fair comparison, and anyone making the comparison without understanding that it's an unfair $400 vs $700 matchup is an idiot. The fact that 17% more performance than a 5700 XT costs an extra $300 is laughable, but that's always the case at the high end because of diminishing returns and other bottlenecks.
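
To put that in plain numbers, here's a rough perf-per-dollar sketch in Python. It only uses the figures thrown around in this thread (roughly 17% faster, $400 vs the $500 street price or the $700 framing above), not measurements from any particular review:

# Rough perf-per-dollar sketch using only the numbers quoted in this thread,
# not measurements from any particular review.

def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

cards = [
    ("RX 5700 XT @ $400 (baseline)", 1.00, 400),
    ("RTX 2070 Super @ $500 MSRP",   1.17, 500),
    ("Same performance at $700",     1.17, 700),
]

for name, perf, price in cards:
    print(f"{name:<30} {perf_per_dollar(perf, price) * 1000:.2f} perf points per $1000")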
 
Not everyone wants to, or can, spend 1200 euros on a GPU alone.

Yep, some people have a life outside of gaming. Some people have a mate, some kids, some house payments, some car payments, etc,etc....I don't, how is that my problem?
 
Yep, some people have a life outside of gaming. Some people have a mate, some kids, some house payments, some car payments, etc,etc....I don't, how is that my problem?
The arrogance... In my case it's more a matter of want, since I have the money. But it's not cost-effective, meaning it's not money well spent, in my humble opinion. But everyone has their own opinion, of course.
 
The 2070 Super comes from 2080 silicon at almost double the price point. There is no fair comparison, and anyone making the comparison without understanding that it's an unfair $400 vs $700 matchup is an idiot. The fact that 17% more performance than a 5700 XT costs an extra $300 is laughable, but that's always the case at the high end because of diminishing returns and other bottlenecks.

Your whole argument is that we can't compare the $400 5700 XT to the $500 2070 Super because the discontinued 2080, where the TU104 debuted, is/was $700? Not much of an argument there. Actually, it's pretty deceitful to say "double the price point." Were you saying it wasn't fair to compare the 980 Ti to the Fury X because it "came from the Titan X" and was therefore a $1000 card? Places like r/amd, TechSpot and Guru3D were all selling the story that the 2070 Super was just 5% ahead. It's clearly another rung up the power ladder.
 
Thing is, doing it this generation before they had a die shrink meant they had to choose: sacrifice 4K/high-framerate gains or sacrifice limited-use-case RTX. They chose poorly.
No. The goal was to popularize RTRT among clients and game studios. It clearly worked.

Most big technological changes work that way.
The first gen product is usually worse than what it's trying to replace, as the old technology is always well controlled and polished.

There are a few excellent examples, but let's go for the automotive analogy, like we usually do here.
The first cars sold to the general public were slower, weaker, more difficult to operate and less safe than horse carriages.
But they stimulated changes in law and infrastructure. And in society as well: people learned to drive and slowly got used to cars.
This was crucial for the next generations to become as successful as they were.

One day RTRT can become as ubiquitous as anti-aliasing.
But for that to happen, we'll have to tolerate the products that we can get today. Even worse: we'll have to buy and use them. :-)
 
The arrogance... In my case it's more a matter of want, since I have the money. But it's not cost-effective, meaning it's not money well spent, in my humble opinion. But everyone has their own opinion, of course.
yep, having no life is real arrogant...wanna keep trying to get my goat or is that it?
 
Sorry, it sounded that way. My bad.

No problem, man. I got left in the dust in the early 2000s when graphics slots changed from AGP to PCIe. All I had to fall back on was consoles for over 10 years. So I try to future-proof my system just in case that time comes again... and it will at some point. :peace:
 