
Hitman 2 Benchmark Performance Analysis

It could still be applied to his armpits :D

I thought of the eyebrows... but otherwise he could be the first baldy to sport awesome RTX ray tracing... You could always see someone lurking from behind... like a mirror, lol. It would be hilarious.
 
I thought of the eyebrows... but otherwise he could be the first baldy to sport awesome RTX ray tracing... You could always see someone lurking from behind... like a mirror, lol. It would be hilarious.
He could also be the first HDR baldy with the sun bouncing off his top :P
 
Um, w1zzard, your results are wayyyy different from every other set of results out there
different test scene as mentioned before. go play the suburbia level
 
I don't get why some people think that if their GPU can't handle a resolution at absolute max settings, they should drop the resolution in the first place.
At least in my opinion, and from my experience on a 32" 4K screen, playing at a higher resolution with some dialed-back settings is far more visually beneficial than maxing out every single setting and playing at a lower resolution.
I recently upgraded from a GTX 970, and there are many 2017/2018 titles in which my overclocked GTX 1080 can't maintain a solid 60 FPS at 1440p with max settings, but I will never go back to 1080p.
 
I don't get why some people think that if their GPU can't handle a resolution at absolute max settings, they should drop the resolution in the first place.
At least in my opinion, and from my experience on a 32" 4K screen, playing at a higher resolution with some dialed-back settings is far more visually beneficial than maxing out every single setting and playing at a lower resolution.
I recently upgraded from a GTX 970, and there are many 2017/2018 titles in which my overclocked GTX 1080 can't maintain a solid 60 FPS at 1440p with max settings, but I will never go back to 1080p.
Great point right there. Reviews test using presets and/or max settings, because that's repeatable without investing too much time in each title.
The thing is, you can usually improve on that dramatically if you play with the settings (assuming there are settings to play with and the title isn't another dumb console port), often without losing any perceptible quality. But who looks into that, now that we're in an age where everyone demands quick answers/solutions? I mean, it was worth Nvidia's time to create an application that applies optimum settings on a per-title basis; that's how lazy/entitled gamers have become.
 
Liked the last one, hard pass on this one at least until a few patches from now... and possibly a price drop or 2.
 
Liked the last one, hard pass on this one at least until a few patches from now... and possibly a price drop or 2.
That's how you support franchises you enjoy. You go, girl!
 
I find it unacceptable that these new games are using DX11, at the end of 2018. Game engines now should be built from the ground up to support DX12 and/or Vulkan. It's lazy developers not wanting to push their engines to use the latest technology and improve performance for everyone: AAA devs are driven by greed and money over passion to make great games. Sad times for the entire gaming industry.

This isn't a game I will buy and play anyway. But DX11 needs to die.
 
An easy way to see if invisible tessellation is being used on AMD is to force it off in the Adrenalin control panel.

Or 8x & 16x at most.
 
It's so clear.

But we must remember that, for example, the RX 590 is a brand-new GPU, and a sky-high overclocked version at that, while the GTX 1060 is just the stock version, not overclocked, and it still beats the RX 590.

...and of course the GTX 1060 only needs 50% of the power to beat the RX 590!

It's shocking how lousy a GPU the RX 590 is, totally useless junk, and I wonder why AMD released it...
 
I find it unacceptable that these new games are using DX11, at the end of 2018. Game engines now should be built from the ground up to support DX12 and/or Vulkan. It's lazy developers not wanting to push their engines to use the latest technology and improve performance for everyone: AAA devs are driven by greed and money over passion to make great games. Sad times for the entire gaming industry.

This isn't a game I will buy and play anyway. But DX11 needs to die.
Besides DX11 being older than DX12, why do you think it needs to die? Do you understand the differences between the two? Have you ever asked why almost nobody claims OpenGL needs to die just because Vulkan is around?
 
Interesting. Because Hardware Unboxed got quite different results from you guys:

Basically the 570 matches the 1060 3GB, the 580 matches the 1060 6GB, and then the cards from the GTX 1070 up to the RTX 2080 are all within a few fps of each other in both minimum and average fps.

I don't own the game so I can't confirm, but some Reddit users mentioned the game now has a HairWorks option. Was it enabled? If so, that would explain the gimped performance on AMD cards.

I already laughed hard at this DX11 thing, and now a HairWorks option... oh, really? lol lol lol
 
AAA devs are driven by greed and money over passion to make great games. Sad times for the entire gaming industry.
Then you might as well condemn all the good indies too. No one is compelled to use DX12 yet, because until RTX came along, there was no visual improvement, only fewer cpu cycles. It’s not lazy, it’s cost effectiveness. All businesses try to do as much as they can with as little expense or use of resources possible.
 
ComputerBase got the RX 580's numbers higher than the 1060 FE, by around 20% (Ultra). So far TechPowerUp is the only one that got worse numbers for the RX 580.
 
More just the slippery slope of PC gaming being nothing but a console.
"MAKE GAMING GREAT AGAIN"?
 
The crap about anything below 60 FPS being unplayable is bullsh*t. Some games don't need to be at 60 FPS to be smooth.

Frame pacing. A 60 fps V-sync game with slight frame drops can feel worse than a locked 30.
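The frame-pacing point can be made concrete with a toy calculation. All frame times below are made-up illustrations, not measurements from the game:

```python
# Compare frame-time consistency ("pacing") of a mostly-60 fps stream
# with occasional dropped frames vs. a locked 30 fps stream.

def pacing_stats(frame_times_ms):
    """Return (average fps, worst frame-to-frame jump in ms)."""
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    worst_jump = max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    return avg_fps, worst_jump

# 60 fps with a few drops: a missed V-sync turns a 16.7 ms frame into 33.3 ms
choppy_60 = [16.7] * 8 + [33.3] + [16.7] * 8 + [33.3]
# Locked 30 fps: every frame takes the same 33.3 ms
locked_30 = [33.3] * 20

fps_a, jump_a = pacing_stats(choppy_60)
fps_b, jump_b = pacing_stats(locked_30)
print(f"choppy 60: {fps_a:.0f} fps avg, worst frame-to-frame jump {jump_a:.1f} ms")
print(f"locked 30: {fps_b:.0f} fps avg, worst frame-to-frame jump {jump_b:.1f} ms")
```

The choppy stream averages far more fps, but those 16.6 ms frame-to-frame jumps are what you perceive as stutter; the locked 30 stream has none.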
 
Game developers have enough moving parts in the air to worry about without having to write MORE low-level code, the whole bloody point of DirectX is that it's an abstraction layer to make developing 3D an easier task. So DX12 was pretty much doomed from the get-go, especially among smaller developers who don't have the resources to write two code paths. Maybe sanity will prevail in DirectX vNext (they won't call it 13 because that's an unlucky number in the West) and we will see Microsoft actually put in effort to make the DX runtime more intelligent in potentially compiling code down to machine level, instead of forcing devs to do it.

But DX11 needs to die.

So game devs must just stop making software for Windows 7, an OS that still has more than a third of the market? You really didn't think this one through, did you?
 
For those wondering about why these results are so different from most others, it's like Wizz said, it's based on the level, but more specifically, it's based on the LoD in any particular level.


LoD on Ultra is crushing performance on some levels, especially on AMD cards. Switching from Ultra to Medium nets a 94% increase on the 580 on the Another Life stage. TPU's benchmarks are accurate, though adjusting just one setting can result in far better performance than what's shown.
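For anyone wondering what a 94% uplift means in actual frame rates, here is a quick sketch. The "before" fps value is hypothetical; only the 94% figure comes from the testing described above:

```python
# Sanity-check what a "94% increase" from dropping LoD Ultra -> Medium
# translates to. The starting fps is a made-up example value.

def percent_gain(before_fps, after_fps):
    """Percentage increase going from before_fps to after_fps."""
    return (after_fps - before_fps) / before_fps * 100

before = 31.0           # hypothetical RX 580 fps with LoD on Ultra
after = before * 1.94   # applying the reported 94% uplift at LoD Medium
print(f"{before:.0f} fps -> {after:.1f} fps ({percent_gain(before, after):.0f}% gain)")
```

In other words, a 94% gain from one setting nearly doubles the frame rate, which is why the LoD slider alone can move a card from "unplayable" to "smooth" on affected levels.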
 
Dropping DX12 from a title/engine that previously supported it is a big ouch to DX12's adoption.
Also, 1060 faster than the newly released 590. That's an extra $50 well spent :D
And what about Battlefield V or CoD BO4 where AMD cards massively outperform their NV counterparts? Which games sell better? The latter two or the former?

Plus check Sokol94's link to HU (TechSpot). Massive difference. So you may even be wrong in this title.
 
And what about Battlefield V or CoD BO4 where AMD cards massively outperform their NV counterparts? Which games sell better? The latter two or the former?

Plus check Sokol94's link to HU (TechSpot). Massive difference. So you may even be wrong in this title.
What about them? You spend $50 more on a card, you expect $50 more worth of performance out of it. A two year old cheaper card shouldn't even come close.
 
So game devs must just stop making software for Windows 7, an OS that still has more than a third of the market? You really didn't think this one through, did you?

I NEVER THINK THINGS THROUGH. Who do you think I am? Some kind of think things througher?

Then you might as well condemn all the good indies too. No one is compelled to use DX12 yet, because until RTX came along, there was no visual improvement, only fewer cpu cycles. It’s not lazy, it’s cost effectiveness. All businesses try to do as much as they can with as little expense or use of resources possible.
Go away with your Logic and Reason.
 
Frame pacing. A 60 fps V-sync game with slight frame drops can feel worse than a locked 30.

That's why they give you the option to V-Sync to half the refresh rate. As long as you can maintain above 30FPS, you're smooth.
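A minimal sketch of what a half-refresh V-sync cap effectively does, assuming a 60 Hz display (the render times below are illustrative):

```python
# Half-refresh V-sync on a 60 Hz display paces every frame to a 1/30 s
# target: as long as the GPU renders each frame faster than 33.3 ms,
# every frame is displayed on exactly the same cadence.

REFRESH_HZ = 60
TARGET_S = 1 / (REFRESH_HZ / 2)   # half-refresh cap: ~33.3 ms per frame

def paced_frames(render_times_s):
    """Given raw per-frame render times, return the interval each frame
    is actually held on screen once paced to the half-refresh target."""
    # A frame finished early is held until the target interval elapses;
    # only a frame slower than the target would break the cadence.
    return [max(t, TARGET_S) for t in render_times_s]

raw = [0.020, 0.028, 0.031, 0.022]   # all faster than 33.3 ms
print(paced_frames(raw))             # every frame paced to the same ~0.0333 s
```

So "as long as you can maintain above 30 FPS" really means every raw frame finishes under the 33.3 ms budget, and the cap turns an uneven 32-50 fps output into a perfectly even 30.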
 