
Ghost Recon Wildlands: Performance Analysis

These guys get very similar VRAM usage numbers:
http://www.guru3d.com/articles_page..._graphics_performance_benchmark_review,9.html

It seems that over 4 GB of VRAM overhead (y) is the norm now. I find it fascinating that the amount of VRAM per megapixel does not change much, though. It is always 0.1 to 0.2 GB per megapixel.

----------------------------------------------------------

Go all the way back to Crysis:
http://hardforum.com/showthread.php?t=1456645
defaultluser reports 0.31 GB @ 0.48 MP, 0.36 GB @ 0.79 MP, 0.45 GB @ 1.31 MP,
and 0.575 GB at 1.9 megapixels.
Using 1.9x + y = 0.575 and 0.48x + y = 0.310
.....using substitution....
x = 0.187 (still the same GB per megapixel as today)
y = 0.22 GB. This was with 4xAA, btw.
Checking with 0.79 megapixels, i.e. 1024x768.....
0.79x + y = 0.368, compared to the 0.36 GB that they recorded
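Just to make the substitution explicit, here is a minimal Python sketch of the same two-point fit (fit_vram_model is my own illustrative name, not anything standard):

```python
# Fit vram = x * megapixels + y from two measured points.
# x is GB per megapixel, y is the resolution-independent overhead.
def fit_vram_model(p1, p2):
    (mp1, gb1), (mp2, gb2) = p1, p2
    x = (gb2 - gb1) / (mp2 - mp1)  # GB per megapixel
    y = gb1 - x * mp1              # fixed overhead in GB
    return x, y

# defaultluser's Crysis numbers (4xAA):
x, y = fit_vram_model((0.48, 0.310), (1.9, 0.575))
print(f"x = {x:.3f} GB/MP, y = {y:.2f} GB")           # x = 0.187, y = 0.22
print(f"predicted @ 0.79 MP: {0.79 * x + y:.3f} GB")  # ~0.368 vs 0.36 recorded
```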

---------------------------------------------------------------------

The most "efficient" (low y factor) modern game is Tomb Raider, which looks great.
Starting with 4k:
8.4x + y = 3.1 .... and now 1080p:
2.07x + y = 1.5 .... using subsitution,
8.4x + 1.5 - 2.07x = 3.1 ...reduce to x and
x=.253 or in other words .253 GB needed for each Megapixel (highest I have seen)
now we can solve for y:
2.07x + y = 1.5
y = 1.5 - 2.07x ....replace x with .253 and
y=.977 VERY NICE!
Now lets test with 1440p or 3.7 Megapixels which was said to use 1.94 GB
3.7x + y = 1.91 ---> pretty damn close!!
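And a quick Python check of the Tomb Raider fit, same linear model as above:

```python
# Tomb Raider: vram = x * megapixels + y, fitted from the
# 1080p (2.07 MP, 1.5 GB) and 4K (8.4 MP, 3.1 GB) points above.
x = (3.1 - 1.5) / (8.4 - 2.07)  # 0.253 GB per megapixel
y = 1.5 - 2.07 * x              # 0.977 GB base overhead
print(f"x = {x:.3f}, y = {y:.3f}")
print(f"predicted @ 1440p (3.7 MP): {3.7 * x + y:.2f} GB")  # ~1.91 vs 1.94 reported
```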
 
So... barely 60 FPS @ 1080p with my GPU, and that is assuming you had a 7700K, lol... which I don't :)

Now the thing is, above say 1080p-1440p I would have had no problem seeing whatever insane requirements they all fancied imposing on us... no problem at all. Extreme or high end should always cost more for the consumer, one way or the other. But at the modern-day minimum of 1080p? Needing the latest Intel CPU (overclocked, mind you, not stock) and a GTX 1080?

And there are honestly people claiming that this is, what? OK? The "price of progress"? Jesus, folks, honestly. I understand paying; I can even understand paying more often than the average user. This, however, is something else entirely.
With that kind of thinking we aren't going anywhere. Or nowhere nice, anyway; let's put it like that. Be careful with what you excuse, and why.
 
Project Cars PhysX

If you consider that GameWorks, then Hitman (2016), Deus Ex: Mankind Divided, and every game out there using UE4 and Unity are GameWorks titles.

Just because a game uses NVIDIA PhysX doesn't mean the game is a GameWorks title. Now, for real, give me an example of a GameWorks effect that cannot be turned off.
 
Well, PhysX is part of the GameWorks suite.
Project Cars is probably not the best example, since it's built to feel realistic and that's hard to achieve without physics. It is technically a title that uses part of the GameWorks suite without the ability to turn it off, but when it cannot be accelerated by the GPU, PhysX will just run on the CPU. You typically don't need to disable it anyway.
 
GPU PhysX might be part of GameWorks, but I don't think CPU-based PhysX is really included in that. Project Cars only uses CPU-based PhysX, so in that game NVIDIA PhysX will never affect AMD GPU performance in any way.
 
GameWorks happened... Just unneeded tessellation here and there, and off you go: the game is slow enough on NVIDIA and terrible on the other guy :)
That's not even remotely true.

When a game struggles to push these low model and texture details with a GTX Titan, something is wrong.
See the markings in these:
[Screenshots: ghost_recon_0.jpg, ghost_recon_1.jpg, ghost_recon_2.jpg]
Keep in mind that Titan X should be able to push several million polygons per frame at 60 FPS.
 

Jesus, you're just nitpicking at shit that doesn't matter. Would you like that orange tubing to be tessellated? Would that satisfy you and ultimately change your overall experience of the game?
 
Well, I think that is the whole point of tessellation. What the guy is trying to say is that this game exhibits the downside of tessellation (lower performance) but lacks the upside (nicer graphics).
 
Jesus, you're just nitpicking at shit that doesn't matter. Would you like that orange tubing to be tessellated? Would that satisfy you and ultimately change your overall experience of the game?
I'm not talking about tessellation at all. I'm talking about the amount of detail the game is able to deliver with this hardware. The polygon counts and texture details I've highlighted are not worthy of a game in 2017 with these extreme requirements.
 
I disagree; I think the game is so far the best-looking game that has released in 2017. Granted, there haven't been that many new games released since January.
 
Disagree all you want, but you can't simply deny the fact that many elements of the scenes are still very low detail, not worthy of a demanding game.
 
They are such minor elements in the scenes that it really shouldn't matter. Not every texture needs to be 4K resolution, not everything needs a million polys, etc., etc.
 
Minor elements like the landscape? (see the second picture I referenced)
Games should do better in 2017 with a decent LoD algorithm.
 
I'm sure if you apply, they will have you get right on that.
 
They are such minor elements in the scenes that it really shouldn't matter. Not every texture needs to be 4K resolution, not everything needs a million polys, etc., etc.

You are kind of defeating your own argument here. I do agree that it is ridiculous that a Titan X can't push a steady 1080p/60. There's just no excuse unless you pour something like 8x SSAA over it, which is not the case. NVIDIA has used some extremely expensive, partly physics-based effects in the game, some of which are brand new and extremely taxing. It is very similar to when they introduced HairWorks: all these 'new technologies' always, coincidentally, are extremely well timed with a new game + GPU/SKU launch. Back then they were launching Maxwell; now they're launching their Titan X for gamers.

The less informed would indeed conclude you'd need that 1080 Ti @ 1080p/60. How convenient.

About Project Cars: the main reason for the performance gaps was precisely the GPU-accelerated PhysX capability of NVIDIA cards versus non-GPU-accelerated PhysX on AMD cards, in an option you couldn't remove. AMD had to handle this at the driver level because running it on the CPU was too taxing by definition. NVIDIA was therefore ahead of the curve here.
 
Eh, back in the day you couldn't max out Quake. I think we needed to wait two generations of video cards before being able to max that title out. Far Cry and then Crysis were no better.
Great titles simply push the limits of current-generation hardware. Wannabe titles push some settings to ridiculous levels, trying to mimic that behaviour.

Edit: Just for clarity, I don't think this title looks as good as it could.
 
Well yeah, but GR Wildlands isn't really pushing the envelope now, is it? It is another version of the infinitely rehashed AnvilNext engine, already heavily overloaded with post FX and other nonsense that serves to hide the ugliness underneath, such as low poly counts and low-res texturing.

What's striking is perhaps a straight comparison with The Division, which is another Ubisoft title, but there Ubisoft is only the publisher. Now look at how optimized THAT game is compared to how it looks: with a crapload of post effects, but also with solid texturing across the board. Compare the NVIDIA driver improvements on that game too, and you can see how a fresh engine makes a difference. The Division has been rock solid in terms of performance since launch.

Post effects are cheaper to implement than high-quality textures but tax GPUs much more. Let's stop fooling each other here, please, and compare apples to apples. Far Cry and Quake really pushed the envelope, Crysis did too, and GR Wildlands looks remarkably similar to Crysis 1 both in setting and in the widespread use of physics (including physics on foliage!). Yet it still has lower-res texturing than Crysis 1, which at the time ALSO had volumetric clouds AND lighting... something NVIDIA is now pretending to 'invent' as an awesome feature. Look at their GameWorks trailer :P It's a joke.
 
I played both the closed and open beta, and despite previous bad experiences with Ubisoft releases I actually went all in and bought both the game and the season pass. It's simply a lot of fun to play with a couple of friends; not mind-blowing, but just good co-op gaming. It plays well on my R9 290 4GB / 16GB DDR4 / i5 6400 system, running at 30-40 FPS with some lowered quality settings in my Eyefinity setup with 3 monitors at 6000x1200. In the betas there were some issues with full-screen play and the placement of the minimap, but that seems smooth and fixed in the release. I'm currently hoping for an AIB RX Vega upgrade later this year.
 
Thought I would share while waiting for my 1080 Ti :P. The good old 980 Ti doesn't do too bad.

[Screenshot: Untitled_1.jpg]
 
I just noticed 2 things:

a) You can change objectives on the map. It is wonderful to do your own stuff.

b) W1zzard was right: co-op is more fun. I tried it, and yes, I like the game more now.
 
This game is terribly optimized. I can push 60 FPS no problem out of an E3-1270 v2 (clocks to 3.7 GHz turbo all day long) at around 50% CPU usage, and 90% GPU usage out of a GTX 1080 FTW Hybrid clocked to 2050 core / 5200 memory. Something is wrong with this game, to the point that even though it was free with my 1080, I feel like I want my money back. It's nothing groundbreaking in how pretty it looks; it's just another piece of garbage code from Ubisoft that can't take full advantage of the hardware you throw at it. GPU and CPU clock speed differences made little change to the FPS you get, too: clocking my FTW Hybrid up to 2200 core / 5400 memory under full boost made no statistically meaningful difference; it just made the GPU utilization go lower. It feels like they put in FPS caps based on your resolution and graphics hardware rather than a general FPS cap, since they don't want to make the console versions look too terribly bad, given that those are the bread and butter of their business right now.
 
I am enjoying this game more when I do not care about the "precooked missions". It is really nice for now. And the map is more than huge.
Just for those interested: constant 2000 MHz and nice FPS, although the fan at 80% gets a bit loud; headphones recommended at night. : )
[Screenshot: GRWL.jpg]
 
GTX 980 Ti SLI

[Screenshot: Untitled-1.jpg]
 