Wednesday, November 14th 2018

Battlefield V with RTX Initial Tests: Performance Halved

Having survived an excruciatingly slow patch update, we are testing "Battlefield V" with DirectX Raytracing (DXR) and NVIDIA RTX enabled, across the GeForce RTX 2070, RTX 2080, and RTX 2080 Ti, adding RTX-on test data to our Battlefield V Performance Analysis article. We began testing with a GeForce RTX 2080 Ti graphics card with GeForce 416.94 WHQL drivers on Windows 10 1809. Our initial test results are shocking. With RTX enabled at the "ultra" setting, frame-rates dropped by close to 50% at 1080p.

These numbers may look horrifying, given that at its highest setting even an RTX 2080 Ti isn't able to manage 1080p 120 Hz. But all is not lost. DICE added granularity to RTX: you can toggle between off, low, medium, high, and ultra as "degrees" of RTX level of detail, under the "DXR ray-traced reflections quality" setting. We are currently working on 27 new data-points (each of the RTX 20-series graphics cards, at each level of RTX, and at each of the three resolutions we tested).

Update: Our full performance analysis article is now live, including results for the RTX 2070, RTX 2080, and RTX 2080 Ti, each at RTX off/low/medium/high/ultra.

180 Comments on Battlefield V with RTX Initial Tests: Performance Halved

#126
Armagg3don
L33t said:
Try disabling SLI. I have a feeling RTX is not working with SLI, especially since DX12 never worked with SLI and DX12 is a requirement for RTX.

DICE will have to add support for DX12 Explicit Multi-GPU.
You are wrong; SLI support depends on the developer and not on the API (in this case DX12).
There are DX12 games that do support SLI, like GoW4 and RoTTR, and I must say that the support is quite good.
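To make the distinction concrete, here is a rough, untested C++ sketch of what "explicit multi-adapter" under DX12 actually asks of the developer; the helper name is made up for illustration, but the calls are the standard DXGI/D3D12 ones. The engine itself enumerates every GPU and creates a device per adapter; nothing happens automatically in the driver the way implicit SLI did under DX11.

#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

// Illustrative helper: under DX12 explicit multi-adapter the application,
// not the driver, discovers every hardware GPU and creates a device for each.
std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    // Splitting frames, copying resources between adapters, and fencing them
    // is all per-game engine work from this point on -- which is why multi-GPU
    // support in a DX12 title depends on the developer, not the API.
    return devices;
}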
#127
Vayra86
lexluthermiester said:
They provided the documentation, limited dev tools and consultation, sure.

But it's still the full responsibility for DICE to get things right. They are the dev house, not NVidia.

Wrong. This is exclusively the failure of DICE and EA because..

EA still green lit the release before proper testing was done or completed.

That first part is right. However, if NVidia's engineers were present, they were there only in an advisory capacity.

NVidia's engineers are like teachers, they can only teach so much, it's up to the students to pay attention, take the time to practice and work hard to get things right. DICE and EA didn't do so and it shows. This unpleasantness is not a failure of NVidia's tech at all. The ultimate accountability is on DICE and EA and ONLY those two.
Let's not split hairs; at least you already backed down to EA + DICE. I'm good with that. No matter the accountability though, these early tech projects are still collaborative projects. I'm quite sure even Nvidia's engineers still take away valuable info from these field tests and push improvements based on it. And as long as they provide a codebase, they have a responsibility, and there is no way it's perfect from the get-go.

I work with software suppliers too (SaaS applications) and I've also seen the first implementations of it in the field, and this is always how it works out. There is no single company arrogant enough to say their code is perfect, and that alone prompts change.
#128
Slizzo
Armagg3don said:
You are wrong; SLI support depends on the developer and not on the API (in this case DX12).
There are DX12 games that do support SLI, like GoW4 and RoTTR, and I must say that the support is quite good.
He may have been directly referencing Battlefield with his comment.
#129
theoneandonlymrk
XC Oj101 said:
Maybe you should read my full post before replying, such as:

You too are judging an entire technology based on one game which was not designed for ray tracing.

Calling RTX anything less than revolutionary is an insult to every engineer who worked on it. Just because you don't understand it or we don't have the software available to take full advantage of it doesn't take anything away from what it is.
You're sounding like you don't understand it. This is not full ray tracing; it's lighting and shadows based on ray paths, and an engine made for RTX will still be a rasterizing engine.
One game doesn't write it off, I agree.

The fact that it costs so much does. Most people buy a balanced PC, so a high-end card normally sits in a pretty decent PC attached to a decent monitor, able to play any game at 1440p or 4K. That PC was not made to run 1080p, in most owners' eyes.
#131
INSTG8R
lexluthermiester said:
They provided the documentation, limited dev tools and consultation, sure.

But it's still the full responsibility for DICE to get things right. They are the dev house, not NVidia.

Wrong. This is exclusively the failure of DICE and EA because..

EA still green lit the release before proper testing was done or completed.

That first part is right. However, if NVidia's engineers were present, they were there only in an advisory capacity.

NVidia's engineers are like teachers, they can only teach so much, it's up to the students to pay attention, take the time to practice and work hard to get things right. DICE and EA didn't do so and it shows. This unpleasantness is not a failure of NVidia's tech at all. The ultimate accountability is on DICE and EA and ONLY those two.
Except you're missing the fact that this is DXR, not an Nvidia tech, just a new DX12 feature, so of course it's all exposed to devs and anyone who cares to use it. AMD could use it too and have access to all the same libraries. It's a "standard" DX12 feature now; maybe DICE didn't get it perfect, but it's not Nvidia "special sauce".
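For what it's worth, a minimal sketch (assuming you already have an ID3D12Device pointer; the helper name is just for illustration) of how a game checks for ray-tracing support. The query goes through the standard D3D12 feature-support path, not a vendor extension, so any driver that reports DXR Tier 1.0 or higher can run the same code.

#include <d3d12.h>

// Returns true if the driver exposes DXR through the standard D3D12
// feature query -- no NVIDIA-specific API is involved.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}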
#132
lexluthermiester
Vayra86 said:
Let's not split hairs, at least you already backed down to EA + DICE.
You know what I meant though.
Vayra86 said:
I'm quite sure even Nvidia's engineers still take away valuable info from these field tests and push improvements based on it.
Very likely. Every failure can be a great opportunity to learn!

INSTG8R said:
Except you're missing the fact that this is DXR, not an Nvidia tech, just a new DX12 feature, so of course it's all exposed to devs and anyone who cares to use it. AMD could use it too and have access to all the same libraries. It's a "standard" DX12 feature now; maybe DICE didn't get it perfect, but it's not Nvidia "special sauce".
That's a very good point actually.
#133
GFalsella
Vayra86 said:
Oh but I have no doubts about ray tracing itself, it is like you say ALREADY an established tech for content creators. So then I say, what's new here? Nothing. RTX is vaporware and content creators get Quadro. Don't assume I don't know what it is...

What I have doubts about is doing it in a live scene and in fast paced gameplay where performance is key, and I have doubts about the timing and way RTX is implemented. There is no outlook on any kind of reasonable adoption rate in the near future. Consoles don't have it. Low-mid range doesn't have it. Only grossly overpriced, underperforming high end cards have it, from one single company. De facto monopolized tech with no market share is not something to put your bets on. This isn't CUDA or something.

Even still, RTRT in this implementation is just some low-quality passes tacked on to rasterized rendering. It is nowhere near the realism you'd want to extract from it. It's just a minor graphical upgrade with a massive performance hit - much like Hairworks ;)
OK, if you put it that way I can really agree with you. Too many people don't understand it and start to treat it like "silly rays, nice reflections" without knowing what they are talking about. I think you touched on a very important point: consoles. I really doubt the PS5 and others will be capable of ray tracing, and this will nullify all the advantages for software houses (turning it into extra work for PC ports), so we won't see the actual advantages I'm hoping for during this decade.
#134
Armagg3don
Slizzo said:
He may have been directly referencing Battlefield with his comment.
Maybe, but then it's too soon to talk about not supporting SLI in DX12 for this game, because it's only been a few days since it was released.
#135
INSTG8R
Armagg3don said:
Maybe, but then it's too soon to talk about not supporting SLI in DX12 for this game, because it's only been a few days since it was released.
Apparently it’s coming.
#136
btarunr
Editor & Senior Moderator
ikeke said:
That seems to suggest there's a memory bottleneck...
Memory overclock improves RTX performance more than non-RTX performance. We saw a 5% improvement for RTX-on vs. 2% improvement for RTX-off at a given overclocked memory speed. However, we can't go too far with RTX 2080 Ti memory OC without having to play with advanced settings/cooling.
#137
B-Real
Funny, this is. Barely 60 FPS with RTX on a $1,200 card, in probably the best-optimized game on the market.
#138
theoneandonlymrk
Armagg3don said:
Maybe, but then it's too soon to talk about not supporting SLI in DX12 for this game because it' s only a few days since it was released.
Head over to the Nvidia forums and you will learn just how great SLI on DX12 is, i.e. it is not really supported, aggressively or properly, and I doubt it's really SLI rather than explicit multi-GPU as it should be, too; what's in a name, after all.

Why everyone defends this shit is beyond me.

And it is not too soon: they worked on this for years (3-5), not fecking days. Whether it works or not is of no matter to 99% of RTX owners, who are not going to buy two.

How many devs work on a thing for approximately 0.0002% of their audience?

RTX just died for three more years imho, at best.
#139
Konceptz
INSTG8R said:
Nah man, we couldn't even do that. I mean, seeing as NV is still using the same ancient CP, you can still select what runs PhysX even now

I"m sorry but this is friggin hilarious LOL
#140
ikeke
btarunr said:
Memory overclock improves RTX performance more than non-RTX performance. We saw a 5% improvement for RTX-on vs. 2% improvement for RTX-off at a given overclocked memory speed. However, we can't go too far with RTX 2080 Ti memory OC without having to play with advanced settings/cooling.
Thanks!

Is there any difference in temperatures or power consumption with RTX on/off?

Any changes to core frequency?

Could it be power limited - so that in order to run the RTX cores, other areas of the chip are power throttled or lose some headroom?
#141
mouacyk
XC Oj101 said:
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.

So how exactly did NVIDIA introduce this that was so wrong in your opinion?
I honestly believe that customers shouldn't be paying for games that aren't made yet, either.
#142
londiste
ikeke said:
Is there any difference in temperatures or power consumption with RTX on/off?
Any changes to core frequency?
Could it be power limited - so that in order to run the RTX cores, other areas of the chip are power throttled or lose some headroom?
Same temperatures, same power consumption, same frequency.
It sounds logical that using the RT cores would throttle other parts of the GPU, but it's not directly visible in clocks or power.
#143
Majikaru
londiste said:
Horrifying?
So you mean it does 80+FPS at 1080p with DXR at highest possible setting?
Assuming the same (less than) 50% hit, it'll do 60+FPS at 1440p, and 40+FPS at UHD 4K.
What do you mean? There's a chart showing that with RTX turned on it goes from 158 FPS to 65 FPS at 1080p. You definitely aren't getting 60+ FPS at 1440p.
#144
londiste
Majikaru said:
What do you mean? There's a chart showing that with RTX turned on it goes from 158 FPS to 65 FPS at 1080p. You definitely aren't getting 60+ FPS at 1440p.
Well, the initial news post has a unique understanding of "close to 50%" in light of the results that were updated and published later :D
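For what it's worth, a quick back-of-the-envelope check on the 1080p figures quoted above (only the 158 and 65 FPS numbers come from the chart; the rest is extrapolation): (158 - 65) / 158 ≈ 0.59, i.e. roughly a 59% hit at ultra rather than "less than 50%", which is why the earlier 60+ FPS at 1440p and 40+ FPS at 4K estimates don't follow.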
#145
xorbe
Personally I don't think this feature is going to win over users in a fast-paced shoot 'em up game. It needs a single-player, slow-paced game like Portal 3 or something.
#146
DirtbagDave
I'm just curious if anyone was actually going to use this tech to begin with. I got myself an RTX card but was not even remotely looking forward to this technology; we all knew the performance in actual gameplay was going to be horrible.
#147
Xuper
Told ya. It's still too early to process DXR - probably 2025 or even later. We really need to bring DXR to mid-range cards that are roughly equal to the 2080 Ti's performance. I think 2025 is the year to go. I saw some screenshots; they look really nice.
#148
Prima.Vera
Just a thought: nobody would have bitched about the crappy RTRT performance of the cards in any of the games if the cards had been 40% cheaper, like they should normally be priced, if not 50% less. Asking a PREMIUM for something that doesn't work properly is completely disrespectful and insulting. But we already know we cannot expect anything better from a typical greedy and unscrupulous corporation like nGreedia.
#149
moproblems99
Prima.Vera said:
Just a thought: nobody would have bitched about the crappy RTRT performance of the cards in any of the games if the cards had been 40% cheaper, like they should normally be priced, if not 50% less. Asking a PREMIUM for something that doesn't work properly is completely disrespectful and insulting. But we already know we cannot expect anything better from a typical greedy and unscrupulous corporation like nGreedia.
Sadly, it didn't stop anyone from buying them. And honestly, the people that are complaining probably don't own one. And really, no one should be complaining because it was widely reported pre-launch that performance was going to tank hard.

Even if performance didn't fall through the floor, I am not sure I like how it looks. Perhaps with MOAR rays it will look better? Everything looks like it is made of glass and sort of comes out less realistic...

*Edited for grammar
#150
Th3pwn3r
It's as if the majority of people crapping on Turing don't know that you don't have to use ray tracing. You can still have a greatly improved 4K gaming experience by paying the premium price. If I bought a 2080 Ti I wouldn't care about ray tracing, but about the slight bump up in FPS at 4K resolution.