
Fable Legends DX12 Benchmarked

Bottom line, with a GTX 980, despite all the "AMD is going to be so much more awesome" talk, I just don't see it, unless you do something specific like that one game that was designed with AMD from day one. Not really any different from games that have been designed with NVIDIA from day one. Frankly, things don't really seem to be any different at all.
 
This is the benchmark, just in case people are wondering.


It's more of a tech-scenery benchmark/demo. They should have used something similar to what they had at E3 for Xbox One. Probably on two different development paths.

Bottom line, with a GTX 980, despite all the "AMD is going to be so much more awesome" talk, I just don't see it, unless you do something specific like that one game that was designed with AMD from day one. Not really any different from games that have been designed with NVIDIA from day one. Frankly, things don't really seem to be any different at all.

390X
$389.99 w/Rebate
$399.99

980
$449.99 w/Rebate
$479.99



Unreal Engine 4.9 Release Notes

New: Experimental DirectX 12 Support
DirectX 12 is now supported as an experimental feature! If you are using Windows 10, try it out by running the engine with "-DX12" on the command line.



Microsoft's engineers added support for DirectX 12 to UE4 and we have worked with them to integrate their changes into 4.9. The feature is still new and is considered experimental. DirectX 12 offers a much lower-level rendering API that is more efficient and allows for rendering commands to be submitted in parallel across many threads, a feature inspired by console rendering APIs. Going forward, we'll continue to improve support for DirectX 12 and look for ways to leverage the new API in upcoming versions of the engine.
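To make the "submitted in parallel across many threads" point concrete, here is a minimal C++ sketch against the raw D3D12 API (not UE4's wrapper; the fixed thread count and the scene-chunk comment are placeholders): each worker thread records its own command list, and one thread then submits them all in a single call.

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>

// Sketch only: each worker thread records draw calls for one slice of the
// scene into its own command list (each list needs its own allocator), then
// the main thread submits everything at once. Error handling omitted.
void RecordAndSubmitInParallel(ID3D12CommandQueue* queue,
                               ID3D12CommandAllocator* allocators[4],
                               ID3D12GraphicsCommandList* lists[4])
{
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i)
    {
        // Command-list recording is the expensive part, and D3D12 lets it
        // run in parallel; this was not possible under DX11.
        workers.emplace_back([&, i] {
            lists[i]->Reset(allocators[i], nullptr);
            // ... record draw/dispatch calls for scene chunk i here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submission itself is one cheap call from a single thread.
    ID3D12CommandList* batch[4] = { lists[0], lists[1], lists[2], lists[3] };
    queue->ExecuteCommandLists(4, batch);
}
```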


The Rendering Hardware Interface (RHI) now supports asynchronous compute (AsyncCompute) for Xbox One.

This feature was implemented by Lionhead Studios.

It will be up to developers to introduce async compute into the UE4 engine themselves.
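For anyone wondering what "introducing async" means in practice at the API level: in D3D12 it boils down to a second command queue of compute type alongside the normal direct (graphics) queue. A hedged sketch against raw D3D12, not the UE4 RHI the release notes refer to:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch: create a dedicated compute queue next to the usual direct queue.
// Dispatches submitted on this queue may overlap with rendering work on
// hardware that supports async compute; an ID3D12Fence is used to signal
// when the compute results are ready for the graphics queue to consume.
HRESULT CreateAsyncComputeQueue(ID3D12Device* device,
                                ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type  = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute-only queue
    desc.Flags = D3D12_COMMAND_QUEUE_FLAG_NONE;
    return device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
}
```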
 
Where can I get this benchmark?

You can't yet.

It will be up to developers to introduce async compute into the UE4 engine themselves.

From PCPer:

http://www.pcper.com/reviews/Graphi...-Benchmark-DX12-Performance-Testing-Continues

Compute shader simulation and culling is the cost of our foliage physics sim, collision and also per-instance culling, all of which run on the GPU. Again, this work runs asynchronously on supporting hardware.

It uses asynchronous compute. Wasn't sure if @Xzibit was referencing Lionhead saying they had made async work for Xbox One, or if the inference was also for PC. Just clarifying: it does use async compute in the PC benchmark.

And from Techreport:

http://techreport.com/review/29090/fable-legends-directx-12-performance-revealed

Lionhead Studios has made several additions to the engine to implement advanced visual effects, and has made use of several new DirectX 12 features, such as Async Compute, manual Resource Barrier tracking, and explicit memory management to help the game achieve the best possible performance.

[Chart: fable4k-fps.gif]
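Since the TechReport quote mentions "manual Resource Barrier tracking": under DX11 the driver tracked resource hazards automatically, while in DX12 the application has to declare each transition itself. A minimal C++ sketch of one such transition (the function name and the texture parameter are illustrative, not from the game):

```cpp
#include <d3d12.h>

// Sketch: transition a texture from render-target use to shader-readable
// use. In DX12 the app must declare this hazard itself; in DX11 the driver
// tracked it behind the scenes at some cost in overhead.
void TransitionToShaderResource(ID3D12GraphicsCommandList* cmdList,
                                ID3D12Resource* texture)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = texture;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    cmdList->ResourceBarrier(1, &barrier);
}
```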
 
Yeah, I was looking at the benches on Tech Report earlier. This game is being pushed by MS to showcase DX12 features. Naturally I would expect Hawaii and Fiji to have the edge, but apparently not by too much.

[Chart: fable-fps.gif]

[Chart: fable4k-fps.gif]

Source: TechReport
 
Am I missing something? How are these graphics any better than the DX11 ones?
 
Yeah, I was looking at the benches on Tech Report earlier. This game is being pushed by MS to showcase DX12 features. Naturally I would expect Hawaii and Fiji to have the edge, but apparently not by too much.


It shows that other than the Fury X, the 980 hangs with all the new AMD cards, and the 970 doesn't fall far behind. It looks like, for the most part, games will be an even playing field.

This should be good news for AMD to recover some market share, and it also means Maxwell cards, despite lacking async compute, are perfectly valid and likely to remain so.
 
Hmmmmm... so presumably a few frames better would be the result for the AMD cards? Is that what you're saying? If the newer drivers are better, then it makes a difference.
In DX12, pure power and architectural advances will win. Drivers won't be important anymore due to the nature of the API. Drivers matter up to DX11.
 
In DX12, pure power and architectural advances will win. Drivers won't be important anymore due to the nature of the API. Drivers matter up to DX11.

So, essentially, you are saying it didn't matter that they used older drivers on the AMD cards?
 
So, essentially, you are saying it didn't matter that they used older drivers on the AMD cards?
Drivers were never that important to begin with for overall performance. W1zz proved that a few years ago.
 
Drivers were never that important to begin with for overall performance. W1zz proved that a few years ago.

Yeah, I know he did. The whole line of questioning was predicated on @medi01 stating on page 1 that the results were suspect because Anand didn't use the newest AMD drivers.
 
Let me get this straight.
  • They're testing cards with out-of-date drivers
  • without drivers FOR DirectX 12
  • on an OS that's barely two months old
  • benchmarking an API that isn't even in production yet
  • using games that aren't even out yet
  • on cards that probably won't see proper DX12 games for another two years.
There's a reason this kind of crap doesn't make front-page news on TPU. It's a total waste of time.

EDIT: i7 + Fury X/980 Ti benchmarking at 720p. What's the point here? :P
 
So, essentially, you are saying it didn't matter that they used older drivers on the AMD cards?
Once they make the driver for a game, it will be at maximum performance right away, not at 70% and taking a year to reach 100%. Remember Battlefield 3? Over 30% better performance after 3-4 driver updates on the 7970, for example. Drivers will still be of use, but not to unlock performance; just to enable features depending on each GPU's architecture.
 
DX12 looks kinda lame.
 
There's a reason this kind of crap doesn't make front-page news on TPU. It's a total waste of time.

EDIT: i7 + Fury X/980 Ti benchmarking at 720p. What's the point here? :p

What was your gfx post thing that almost went viral last year??? :p

I posted the link as a kind of balancing act versus AoS. My intent was clear: DX12 is not relevant yet. FL simply redresses the fanboy pish that was being sprayed around. Neither AMD nor Nvidia will suffer with their current cards in the upcoming DX12 titles. And remember, Deus Ex (Feb 2016) is DX12. Admittedly, the future is still DX11 for a while, but it's nice to get the laundry aired.
 
And remember, Deus Ex (Feb 2016) is DX12. Admittedly, the future is still DX11 for a while, but it's nice to get the laundry aired.

I was under the impression Mankind Divided was going to be DX11, with Windows 7 64-bit as a minimum requirement. Is this not so anymore?
 

LOL, OK, I even commented in there, but forgot. It's all murky and foggy in that article as to whether it's a dual DX release or exclusive DX12. Just saying DX12 support doesn't mean anything really, just that it's SUPPORTED.

Anyway, enough off-topic, we will find out specifics as we get closer to February I guess! :)
 
It means, like most games, it will offer a choice of DX11 or DX12. AMD jumping behind Square Enix for that title means it's going to get all of the DX12 bells and whistles.
 
It will be auto-detected.

PCPerspective said:
UPDATE: It turns out that the game will have a fall-back DX11 mode that will be enabled if the game detects a GPU incapable of running DX12.

It would be silly not to, since UE4 isn't being developed with DX12 in mind. They will add things to the engine as they see fit. Developers will have to incorporate async compute themselves if they are using UE4.
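A plausible shape for that auto-detection, as a sketch only (not Lionhead's actual code): probe for a DX12-capable device first and fall back to DX11 if the probe fails.

```cpp
#include <d3d12.h>
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch: probe for a DX12-capable GPU/driver and fall back to DX11 if the
// probe fails. Passing nullptr as the adapter uses the default adapter.
bool PreferDX12()
{
    ComPtr<ID3D12Device> device12;
    if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                    IID_PPV_ARGS(&device12))))
        return true;  // DX12 path available

    // Otherwise create the DX11 device the game will actually use.
    ComPtr<ID3D11Device> device11;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION,
                      &device11, nullptr, nullptr);
    return false;     // DX11 fallback mode
}
```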
 
This is the benchmark, just in case people are wondering.

It's more of a tech-scenery benchmark/demo. They should have used something similar to what they had at E3 for Xbox One. Probably on two different development paths.

390X
$389.99 w/Rebate
$399.99

980
$449.99 w/Rebate
$479.99

Prices are subjective and vary by region, but here is something that stays consistent: average power draw during gaming.
GTX 980: 156W
390X: 344W

That's pretty darn awful if you decide to go CrossFire; a pair of 390Xs draws roughly 2 x 188W, close to 400W more than a pair of 980s. During a long gaming session, that extra power is going to turn your room into a sauna on a summer day. I know this because I was once a not-so-proud owner of a pair of GTX 480s. And never mind the few extra dollars a month on the energy bill. AMD can't possibly charge the same price for a 390X.

Back on topic: this stuff is pretty much pointless. We need actual games to see which cards are better. No one buys these cards to run half-assed demos on beta drivers.
 
Prices are subjective and vary by region, but here is something that stays consistent: average power draw during gaming.
GTX 980: 156W
390X: 344W

That's pretty darn awful if you decide to go CrossFire; a pair of 390Xs draws roughly 2 x 188W, close to 400W more than a pair of 980s. During a long gaming session, that extra power is going to turn your room into a sauna on a summer day. I know this because I was once a not-so-proud owner of a pair of GTX 480s. And never mind the few extra dollars a month on the energy bill. AMD can't possibly charge the same price for a 390X.

Back on topic: this stuff is pretty much pointless. We need actual games to see which cards are better. No one buys these cards to run half-assed demos on beta drivers.

Not sure if you'd want to double up on a 980 or a 390X for 1080p anyway.

Techpowerup said:
Average: Metro: Last Light at 1920x1080, representing a typical gaming power draw. Average of all readings (12 per second) while the benchmark was rendering (no title/loading screen). In order to heat up the card, we run the benchmark once without measuring power consumption.

[Chart: Power_02.png]


If your concern is power draw, you might want to stick to DX11. DX12 utilizes more of the CPU and GPU, so there is less opportunity for power-saving features like PowerTune and GPU Boost.
 
If your concern is power draw, you might want to stick to DX11. DX12 utilizes more of the CPU and GPU, so there is less opportunity for power-saving features like PowerTune and GPU Boost.
Um, last I checked, GPU Boost isn't a power-saving feature; it uses more power since it overclocks the GPU.
 