
Ghost Recon Wildlands: Performance Analysis

I wish TPU would also run a 3440x1440 benchmark... This resolution can be custom-forced on a 4K monitor in case a 21:9 one isn't available ;)
 
Ran the benchmark at Ultra across triple monitors at 7680x1440 and it sucks with three GTX Titan X (Maxwell) in SLI.

Overall performance sucked in the beta and continues to suck even outside of mega multi-monitor resolutions.
 
I like the idea that NVidia is using their own version of volumetrics with their GI rays, but the name "god rays" screams hype-gimmick. Is NVidia going to name GPU-rendered fire in GameWorks "Devil Fire" just to hype it up and bring in more $$$$? I face-palm. I face-palm hard.
"God rays" has been the term for years, long before GameWorks was ever a thing.
 
I've done some benchmark tests myself at Ultra and Very High. I don't understand why I got only 48 FPS (Ultra settings, 3440x1440, 2x GTX 1080, i7-3770 @ 4.8 GHz) with only 80% GPU usage. I ran the tests on the beta, and graphical bugs appeared in the image, so I wondered whether SLI was working at all. Either the driver or the game itself will need fixes.
The beta has expired for me; I played 20 minutes and ran the benchmark. So far the game is too expensive for this many bugs, and I won't pay for that. That's Ubisoft's attitude toward future buyers.
 
Even a GTX 970 is getting ~58 FPS on Very High settings @ 1080p while VRAM usage barely gets past 4 GB. How is the game "poorly optimized"? Unless you're expecting your mid-range GPU to do wonders on Ultra settings, in which case you're thinking unrealistically and overestimating its capability.
 
Ran the benchmark at Ultra across triple monitors at 7680x1440 and it sucks with three GTX Titan X (Maxwell) in SLI.

Overall performance sucked in the beta and continues to suck even outside of mega multi-monitor resolutions.
Screenshot of the results, please. It's not concrete enough, because it sounds like a lie... are you sure THREE GTX Titan X in triple SLI aren't fast enough for triple ultrawide monitors? Because, from what I know, that card barely breaks a sweat running 4K or 1440p G-Sync.
 
I'd be very curious to see CPU testing in it as well. My 5820k had some load on all 12 threads while playing the beta, would like to see just how well it scales with that.
"Some load" is not enough to judge scaling. If those cores aren't maxed out, the work could be batched together and handled by fewer cores just as well.
 
Alright, sorry, I'm not on the anti-Ubisoft bandwagon.



I'm saying that, based on the benchmarks, it doesn't look that bad when it comes to optimization, but maybe I'm missing something. The cards look to be where one would expect at these resolutions and settings. Please enlighten me on this so-called "bad optimization". Or is that just a blanket term people throw around because it's a game from Ubisoft? Or is it that because YOU don't have the hardware to run it, it's suddenly unoptimized?

I'm on no bandwagon. I've got nothing against Ubisoft. It's just that I feel the game doesn't look nearly good enough to warrant a fucking Titan for smooth, steady 60 FPS at 1080p Ultra. Bad coding. Maybe deliberate. They want people to buy Titans. Go ahead. I won't. Just like I won't be buying Mafia 3. I don't feel it's worth it. You do? Go ahead. I'm not a fanboy of anything. I make my decisions on a case-by-case basis.
 
The VRAM consumption really means nothing at all, it seems. Since the Fury has shown it scales so well with only 4 GB of VRAM, I wonder if there would be a performance hit with a 1080 Ti having only 6 GB. From what I have seen with the Fury, I would say no. These new games do great using dynamic memory.

It is interesting to see what some of these games use in VRAM "per megapixel" (x) and simply as "overhead" (y). 4K is 8.3 megapixels and 1080p is 2.07 MP.
For Ultra settings on this game: 8.3x + y = 6.19 GB and 2.07x + y = 5.07 GB, so y = 5.07 GB - 2.07x. Substituting: 8.3x + 5.07 - 2.07x = 6.19, {so x = 0.180, or 0.180 GB for every megapixel.} Then y = 5.07 - 2.07(0.180), {so y = 4.70 GB of "overhead".}

In other words, if you were unfortunate enough to play on a ONE PIXEL screen, the game would still eat up 4.70 GB of vRAM (y).

The math above can be verified using 3.69 MP for 1440p: 3.69(0.180) + 4.70 = 5.36 GB, which is pretty close to the 5.40 GB actual for that resolution.
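The two-point linear fit above can be sketched in a few lines of code. This is just a check of the poster's arithmetic, assuming the quoted Ultra-preset numbers (6.19 GB at 4K / 8.3 MP, 5.07 GB at 1080p / 2.07 MP); the function name is made up for illustration:

```python
# Linear VRAM model: usage_gb = x * megapixels + y
# Data points taken from the post above (Ultra preset):
#   6.19 GB at 4K (8.3 MP), 5.07 GB at 1080p (2.07 MP).

def fit_vram_model(mp1, gb1, mp2, gb2):
    """Solve gb = x*mp + y from two (megapixels, GB) samples."""
    x = (gb1 - gb2) / (mp1 - mp2)   # GB per megapixel (slope)
    y = gb1 - x * mp1               # fixed "overhead" in GB (intercept)
    return x, y

x, y = fit_vram_model(8.3, 6.19, 2.07, 5.07)
print(f"x = {x:.3f} GB/MP, y = {y:.2f} GB overhead")   # ~0.180 and ~4.70

# Predict 1440p (3.69 MP) and compare to the 5.40 GB measured figure
print(f"1440p prediction: {x * 3.69 + y:.2f} GB")      # ~5.36 GB
```

Two data points determine the line exactly, of course; the 1440p measurement is the only real sanity check, and it lands within ~0.04 GB of the prediction.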
 
Another EFFIN GameWorks title... shit optimization, demanding just for the sake of being able to say it's demanding. Do they realize people actually play these games, or do they think we just stand still in an open field and observe the graphics? For fuck's sake, no one likes GameWorks. Stop using it.
 
Screenshot of the results, please. It's not concrete enough, because it sounds like a lie... are you sure THREE GTX Titan X in triple SLI aren't fast enough for triple ultrawide monitors? Because, from what I know, that card barely breaks a sweat running 4K or 1440p G-Sync.
Lie? The only lie here is you telling yourself you know things. Chew on some concrete. Game performance sucks, buyer beware, it's still in beta.

 
24 FPS is enough. You can't perceive a difference from 45 FPS.

Is this a joke, or do you really think people's eyes work at 24 Hz? The frame rate of film has nothing to do with human perception. There is a big difference between 24 FPS and 45 FPS.
 
@Kal-EL it's IN beta. Read the fine print. Of course the performance isn't refined yet; it's uncharted territory. Also, you're pushing a 4K-ready GPU way past its limits. Just because you have 12 GB of VRAM doesn't mean you can push it. Also, SLI scaling is a little skittish, especially when you have more than 2 GPUs in sync.
 
GameWorks or not, it doesn't matter to me. As long as I can get 60 FPS @ Very High settings without compromising other graphics settings, it's a keeper. If you people complain about it, save up next time & invest in a faster Nvidia GPU instead of wasting money on nonsense.
 
This game looks very demanding considering how low the polygon count and texture details are…
 
This game looks very demanding considering how low the polygon count and texture details are…
GameWorks happened... just unneeded tessellation here and there, and off you go: the game is slow enough on nVidia and terrible on the other guy :)
 
GameWorks happened... just unneeded tessellation here and there, and off you go: the game is slow enough on nVidia and terrible on the other guy :)

Why do you hate without any evidence? Load the game in a frame analyzer, show that the tessellation level is unreasonably high, and then comment.

The GameWorks techniques make the game more beautiful. If you don't like them, turn them off in the menu and let the rest of us enjoy the max visuals.
 
Why do you hate without any evidence? Load the game in a frame analyzer, show that the tessellation level is unreasonably high, and then comment.

The GameWorks techniques make the game more beautiful. If you don't like them, turn them off in the menu and let the rest of us enjoy the max visuals.

Tessellation is just an example, mate. nVidia uses tessellation in many GameWorks features because they're the best at it. Also, not all GameWorks features can be turned off in certain games...

Let's try this perspective: your "max visuals" could be achieved with other approaches, though that would take more time and effort from the devs. GameWorks provides a working version of those visuals, with a biased performance penalty attached. So studios go with GameWorks to cut their budgets and force that penalty onto customers. Is this a good practice? I must say no.
 
Lie? The only lie here is you telling yourself you know things. Chew on some concrete. Game performance sucks, buyer beware, it's still in beta.

Because SLI is not working for this particular game ;)

24 FPS is enough. You can't perceive a difference from 45 FPS.
Only for movies, since those use frame interpolation and motion blur. For games, the blur would be madness if enabled at movie levels ;)
 
The VRAM consumption really means nothing at all, it seems. Since the Fury has shown it scales so well with only 4 GB of VRAM, I wonder if there would be a performance hit with a 1080 Ti having only 6 GB. From what I have seen with the Fury, I would say no. These new games do great using dynamic memory.

It is interesting to see what some of these games use in VRAM "per megapixel" (x) and simply as "overhead" (y). 4K is 8.3 megapixels and 1080p is 2.07 MP.
For Ultra settings on this game: 8.3x + y = 6.19 GB and 2.07x + y = 5.07 GB, so y = 5.07 GB - 2.07x. Substituting: 8.3x + 5.07 - 2.07x = 6.19, {so x = 0.180, or 0.180 GB for every megapixel.} Then y = 5.07 - 2.07(0.180), {so y = 4.70 GB of "overhead".}

In other words, if you were unfortunate enough to play on a ONE PIXEL screen, the game would still eat up 4.70 GB of vRAM (y).

The math above can be verified using 3.69 MP for 1440p: 3.69(0.180) + 4.70 = 5.36 GB, which is pretty close to the 5.40 GB actual for that resolution.

I am using the High setting with a few tweaks and it's using around 2700 MB; Very High uses the same. Ultra tops out at 3900 MB despite the game saying it's over budget, and it's basically unplayable. I play at 1440p and get an average of 50 or so; Very High and the settings in between all seem to amount to the same 50, give or take...
Edit: I just tested it at Medium and it was worse? 43 avg?? One thing I DID notice is that it's using a crap ton of system RAM. I monitor it as a percentage and it was using 66% of 16 GB? And that was on Medium.
 
Tessellation is just an example, mate. nVidia uses tessellation in many GameWorks features because they're the best at it. Also, not all GameWorks features can be turned off in certain games...

Let's try this perspective: your "max visuals" could be achieved with other approaches, though that would take more time and effort from the devs. GameWorks provides a working version of those visuals, with a biased performance penalty attached. So studios go with GameWorks to cut their budgets and force that penalty onto customers. Is this a good practice? I must say no.

lol, give me one example where a GameWorks effect cannot be turned off at all.
 