Tuesday, November 14th 2017

Battlefield V with RTX Initial Tests: Performance Halved

Having survived an excruciatingly slow patch update, we are testing "Battlefield V" with DirectX Raytracing and NVIDIA RTX enabled, across the GeForce RTX 2070, RTX 2080, and RTX 2080 Ti, adding the RTX-on test data to our Battlefield V Performance Analysis article. We began testing with a GeForce RTX 2080 Ti graphics card on GeForce 416.94 WHQL drivers and Windows 10 1809. Our initial test results are shocking: with RTX enabled at the "ultra" setting, frame rates dropped by close to 50% at 1080p.

These numbers may look horrifying, given that at its highest setting even an RTX 2080 Ti isn't able to manage 1080p at 120 Hz. But all is not lost: DICE added granularity to RTX. You can toggle between off, low, medium, high, and ultra as "degrees" of RTX level of detail, under the "DXR ray-traced reflections quality" setting. We are currently working on 27 new data points (each of the RTX 20-series graphics cards, at each level of RTX, at each of the three resolutions we tested).

Update: Our full performance analysis article is live now, including results for the RTX 2070, 2080, and 2080 Ti, each at RTX off/low/medium/high/ultra.

180 Comments on Battlefield V with RTX Initial Tests: Performance Halved

#101
Vayra86
GFalsella said:
Are you stupid, legally blind, or just trolling? I get all the skepticism towards ray tracing, I think it's way too early for it too, but comparing a totally game-changing tech with HairWorks and saying it's barely visible is not a statement one can take seriously. This is more like 2D->3D than HairWorks, but OK, I guess hating is easier. (Don't get me wrong, it's right to hate on NVIDIA given their shitty business practices, but RTX != NVIDIA.)
Got some in-game comparison screens to make your point, or did you specifically make an account here to shitpost without substance?

As always, I'm open to any source material you can provide, and ready to change my opinion. Proof is in the pudding and so far, I can't say it is more than I said it was.

So far what I have seen from live RTRT is that I really don't prefer the ray traced image to the 'old' rasterized image. I've seen Metro with scenes where you get blinded by sunlight and can't see half of the scene, I've seen some zoomed in RTX ON showcase material of a reflection in a car, a low-poly man on the moon tech demo, and I've seen some ultra-low FPS BFV content with low quality RTRT in it. If you have other examples, go at it.

Also... RTX =! Nvidia?! I think you're confused with DXR? RTX is literally the most blatantly visible shitty business practice Nvidia has employed in the last ten years.
#102
OneMoar
There is Always Moar
wow shocking COMPLETELY Shocked didn't see this coming at all ....

just buy it ... smh
#103
[XC] Oj101
INSTG8R said:
Well you're the one trying to justify it in its current state despite the fact that it's not the amazing miracle tech you continue to claim it is. What part of "it's not ready" do you need explained to you again? The numbers don't lie, and it's only going to get worse with the 2080 and 2070. How are you going to try to spin those numbers favourably?
Bottom line the hardware DOESN’T exist.
So please let me hear how you propose it would "be ready"? :)
#104
OneMoar
There is Always Moar
[XC] Oj101 said:
So please let me hear how you propose it would "be ready"? :)
not crash the system
not halve frame-rates
work as advertised ?

just those 3 ...
#105
INSTG8R
And also let's be clear: this is DXR, not any NV proprietary stuff. So it's the DX12 "Standard" RT that AMD could make use of. I wonder if my Vega could leverage its Compute units, for example.
#106
Aldain
[XC] Oj101 said:
So please let me hear how you propose it would "be ready"? :)
Because HUANG said

"It just works"

Sorry m8 you have no ground to stand on
#107
R0H1T
Vayra86 said:
Got some in-game comparison screens to make your point, or did you specifically make an account here to shitpost without substance?

As always, I'm open to any source material you can provide, and ready to change my opinion. Proof is in the pudding and so far, I can't say it is more than I said it was.

So far what I have seen from live RTRT is that I really don't prefer the ray traced image to the 'old' rasterized image. I've seen Metro with scenes where you get blinded by sunlight and can't see half of the scene, I've seen some zoomed in RTX ON showcase material of a reflection in a car and I've seen some ultra-low FPS BFV content with low quality RTRT in it. If you have other examples, go at it.

Also... RTX =! Nvidia?! I think you're confused with DXR?
Bling, bling, bling :roll:

It's obvious none have ever thought of the downsides of blinding shiny new god rays, yeah let's all wear gunnar shades while gaming. Next thing you know, we'll have to turn the brightness down to zero on all our shiny HDR monitors :banghead:
#108
[XC] Oj101
OneMoar said:
not crash the system
not halve frame-rates
work as advertised ?

just those 3 ...
1. That's on DICE, not NVIDIA.
2. We don't have a single native ray traced game available. How do you propose we get there without the hardware to back it?
3. It does - it allows real time ray tracing.

Aldain said:
Because HUANG said

"It just works"

Sorry m8 you have no ground to stand on
Are you purposefully being obtuse or do you genuinely not understand that the developers still have to implement it?
#109
INSTG8R
[XC] Oj101 said:
So please let me hear how you propose it would "be ready"? :)
I have said it many times: the hardware needs to be improved, because this iteration clearly isn't capable of even the simplest implementation. DX12 DXR requires more horsepower than the current hardware can provide, period, end of story. I just got an HDR-capable monitor; I'm quite enjoying the lighting differences I've seen, and it doesn't require me to sacrifice 40% of my performance to do it.
#110
Vayra86
R0H1T said:
Bling, bling, bling :roll:

It's obvious none have ever thought of the downsides of blinding shiny new god rays, yeah let's all wear gunnar shades while gaming. Next thing you know, we'll have to turn the brightness down to zero on all our shiny HDR monitors :banghead:
1000 nits in your face biatch! This takes griefing to a whole new level :) Flashlights are going to be top of the food chain.
#111
mouacyk
Please add some salt to the TPU review, in the form of relative 1080 Ti raster performance.
#112
[XC] Oj101
INSTG8R said:
I have said it many times: the hardware needs to be improved, because this iteration clearly isn't capable of even the simplest implementation. DX12 DXR requires more horsepower than the current hardware can provide, period, end of story. I just got an HDR-capable monitor; I'm quite enjoying the lighting differences I've seen, and it doesn't require me to sacrifice 40% of my performance to do it.
I have to ask which natively ray tracing capable engine you’re using to come to this conclusion? The hardware is capable, the software needs to catch up. This has happened with every generation of console, where the launch games look nowhere near as good as the games further down the line. The hardware doesn’t change, the software does. Or are you blaming NVIDIA for the developers not having the experience yet? You’re bashing NVIDIA’s technical merits based on the ineptitude of the developers.
#113
Aldain
[XC] Oj101 said:
1. That's on DICE, not NVIDIA.
2. We don't have a single native ray traced game available. How do you propose we get there without the hardware to back it?
3. It does - it allows real time ray tracing.



Are you purposefully being obtuse or do you genuinely not understand that the developers still have to implement it?
Nope.. It was marketing hype that fell flat on its face. There is no saving grace for Turing..
#114
ikeke
[XC] Oj101 said:
1. That's on DICE, not NVIDIA.
2. We don't have a single native ray traced game available. How do you propose we get there without the hardware to back it?
3. It does - it allows real time ray tracing.



Are you purposefully being obtuse or do you genuinely not understand that the developers still have to implement it?
Regarding #2, Claybook has been out for some time. So actually one can do RT without NVIDIA magic sauce...

[MEDIA=twitter]976132464337764352[/MEDIA]
#115
lexluthermiester
Seems like a bunch of cheese needs to be passed around..
Aldain said:
Nope.. It was marketing hype that fell flat on its face. There is no saving grace for Turing..
For the record, the problems being experienced with BFV are exclusively EA's fault, not NVidia's. Like anyone should be surprised that EA FUBAR's a game release...
#116
INSTG8R
[XC] Oj101 said:
I have to ask which natively ray tracing capable engine you’re using to come to this conclusion? The hardware is capable, the software needs to catch up. This has happened with every generation of console, where the launch games look nowhere near as good as the games further down the line. The hardware doesn’t change, the software does. Or are you blaming NVIDIA for the developers not having the experience yet? You’re bashing NVIDIA’s technical merits based on the ineptitude of the developers.
You keep saying software, but you've got it backwards. This DX12 DXR is a new "standard" API that anyone can leverage. This is a pure hardware limitation. They've already had different levels of DXR, and even the lowest still shows severe performance limitations. There's nothing wrong with the software when you can turn it all the way down and still see subpar performance. It's ALWAYS been the hardware that catches up to the software. Remember tessellation? How about the DX jumps: 9, 10, 11? The APIs haven't changed; the hardware has been improved to leverage the new APIs/features. It's always been that way, and it hasn't changed just because you say so.
Why you keep getting it wrong as an excuse for its current failure is getting quite humorous at this point.
#117
Aldain
lexluthermiester said:
Seems like a bunch of cheese needs to be passed around..

For the record, the problems being experienced with BFV are exclusively EA's fault, not NVidia's. Like anyone should be surprised that EA FUBAR's a game release...
wrong... nvidia has a major role to play in this.. it was nvidia that worked CLOSELY with dice on this.. this is as much an nvidia FAIL as DICE's.. Stop with the meat-shield mentality
#118
Vya Domus
[XC] Oj101 said:
The hardware is capable
Clearly it's not. The RT cores are capable of only so many rays per second; no software will ever change that, it's a hardware limit.

You seriously need to back down on the Nvidia koolaid.
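As a rough sanity check on that hardware limit, some quick arithmetic against NVIDIA's advertised ~10 Gigarays/s figure for the RTX 2080 Ti (a marketing claim, not an independent measurement) shows how thin the per-pixel ray budget actually is:

```python
# Back-of-envelope ray budget. The 10 Gigarays/s number is NVIDIA's own
# marketing figure for the RTX 2080 Ti, used here purely for illustration.
def rays_per_pixel(gigarays_per_s: float, width: int, height: int, fps: int) -> float:
    """Rays available per pixel per frame for a given throughput and target."""
    rays_per_frame = gigarays_per_s * 1e9 / fps
    return rays_per_frame / (width * height)

# At 1080p60 the entire budget is roughly 80 rays per pixel per frame.
# Offline path tracers spend hundreds to thousands of samples per pixel,
# which is why BFV traces only reflections and leans heavily on denoising.
print(round(rays_per_pixel(10, 1920, 1080, 60)))  # ~80
```

Halve the frame-rate target and the per-pixel budget doubles, which is one way to read the benchmark numbers in the article above.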
#119
Vayra86
lexluthermiester said:
Seems like a bunch of cheese needs to be passed around..

For the record, the problems being experienced with BFV are exclusively EA's fault, not NVidia's. Like anyone should be surprised that EA FUBAR's a game release...
Wow wow wow. Hold on. DICE is still not EA. Dev versus publisher. The problems experienced are entirely creditable to DICE's programming work and/or collaborative work with Nvidia engineers, of which I'm sure they have a couple walking around.

Let's be honest here, DICE has borked the last four Battlefield releases. BF3, BF4, Hardline and this one. Only BF1 had a somewhat smooth launch. The pattern here is with DICE, not EA. EA's only influence might have been planning/release date, but then again, wasn't it already delayed once?

Time will tell if this was purely a time constraint or a talent/coding problem. The DX12 build for BFV was already shaky as hell and DICE needed years to fix their netcode.

It's also too simple to say it's not Nvidia's fault, given that their engineers do help out with these projects. Apparently what Nvidia provides in terms of tooling is far from bug-free.
#120
GFalsella
Vayra86 said:
Got some in-game comparison screens to make your point, or did you specifically make an account here to shitpost without substance?

As always, I'm open to any source material you can provide, and ready to change my opinion. Proof is in the pudding and so far, I can't say it is more than I said it was.

So far what I have seen from live RTRT is that I really don't prefer the ray traced image to the 'old' rasterized image. I've seen Metro with scenes where you get blinded by sunlight and can't see half of the scene, I've seen some zoomed in RTX ON showcase material of a reflection in a car, a low-poly man on the moon tech demo, and I've seen some ultra-low FPS BFV content with low quality RTRT in it. If you have other examples, go at it.

Also... RTX =! Nvidia?! I think you're confused with DXR? RTX is literally the most blatantly visible shitty business practice Nvidia has employed in the last ten years.
There would be no need for proof if you were more informed about the subject. I suggest you read something about the different techniques of rendering, the differences between rasterization, ray tracing, and path tracing, and see not only what it means for the end customer (which, even now, with the sad tech demos we were shown, is more than visible) but also what it means for the creators. This is truly game changing and will have huge effects on the industry. The only problem is that NVIDIA pushed for it way too early in order to make up for the lack of a traditional performance upgrade, and many people now think it's just a gimmick (because, let's be honest, that's what it is for now). Give it 5 more years and even your untrained eye will see the difference. Sorry I was so aggressive in my formulation; in retrospect that was uncalled for. And by RTX I mean ray tracing, which is a technique that has existed for decades, not NVIDIA's GPU series or their temporarily exclusive support.
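For readers curious what "ray tracing" actually computes: at its core it is a per-pixel geometric intersection test. A toy ray-sphere intersection in Python (an illustrative sketch, nothing like a production DXR pipeline) makes the per-ray cost concrete:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance t along the ray to the first sphere hit, or None.

    Ray: P(t) = origin + t * direction (direction assumed normalized).
    Solves |P(t) - center|^2 = radius^2, a quadratic in t.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c           # quadratic coefficient a == 1 here
    if disc < 0.0:
        return None                  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None    # hit must be in front of the origin

# One ray per pixel means this runs width*height times per frame, before
# shadows, reflections, or bounces multiply the ray count further.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
print(ray_sphere_hit((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0))  # None
```

Every shadow ray, reflection bounce, or extra sample multiplies this per-pixel work, which is why even dedicated RT hardware struggles at full quality.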
#121
Vayra86
GFalsella said:
There would be no need for proof if you were more informed about the subject. I suggest you read something about the different techniques of rendering, the differences between rasterization, ray tracing, and path tracing, and see not only what it means for the end customer (which, even now, with the sad tech demos we were shown, is more than visible) but also what it means for the creators. This is truly game changing and will have huge effects on the industry. The only problem is that NVIDIA pushed for it way too early in order to make up for the lack of a traditional performance upgrade, and many people now think it's just a gimmick (because, let's be honest, that's what it is for now). Give it 5 more years and even your untrained eye will see the difference. Sorry I was so aggressive in my formulation; in retrospect that was uncalled for. And by RTX I mean ray tracing, which is a technique that has existed for decades, not NVIDIA's GPU series or their temporarily exclusive support.
Oh, but I have no doubts about ray tracing itself; it is, like you say, ALREADY an established tech for content creators. So then I say: what's new here? Nothing. RTX is vaporware and content creators get Quadro. Don't assume I don't know what it is...

What I have doubts about is doing it in a live scene and in fast paced gameplay where performance is key, and I have doubts about the timing and way RTX is implemented. There is no outlook on any kind of reasonable adoption rate in the near future. Consoles don't have it. Low-mid range doesn't have it. Only grossly overpriced, underperforming high end cards have it, from one single company. De facto monopolized tech with no market share is not something to put your bets on. This isn't CUDA or something.

Even still, RTRT in this implementation is just some low-quality passes tacked onto rasterized rendering. It is nowhere near the realism you'd want to extract from it. It's just a minor graphical upgrade with a massive performance hit - much like Hairworks ;)
#122
Slizzo
Vayra86 said:
Wow wow wow. Hold on. DICE is still not EA. Dev versus publisher. The problems experienced are entirely creditable to DICE's programming work and/or collaborative work with Nvidia engineers, of which I'm sure they have a couple walking around.

Let's be honest here, DICE has borked the last four Battlefield releases. BF3, BF4, Hardline and this one. Only BF1 had a somewhat smooth launch. The pattern here is with DICE, not EA. EA's only influence might have been planning/release date, but then again, wasn't it already delayed once?

Time will tell if this was purely a time constraint or a talent/coding problem. The DX12 build for BFV was already shaky as hell and DICE needed years to fix their netcode.

It's also too simple to say it's not Nvidia's fault, given that their engineers do help out with these projects. Apparently what Nvidia provides in terms of tooling is far from bug-free.
Wait, how was BFV's launch not smooth? I've been playing since last Thursday and it's been smooth as all hell for me. Haven't had any crashes, haven't had any bugs.

Granted I haven't enabled DX12, but I haven't for any game yet as it always has lowered performance on games, especially BF1 and now BFV (since it's using the same engine).
#123
Vayra86
Slizzo said:
Wait, how was BFV's launch not smooth? I've been playing since last Thursday and it's been smooth as all hell for me. Haven't had any crashes, haven't had any bugs.

Granted I haven't enabled DX12, but I haven't for any game yet as it always has lowered performance on games, especially BF1 and now BFV (since it's using the same engine).
DX11 is smooth, indeed. But for RT you do need DX12. I'm not surprised they finally got a DX11 release right; it's about time, isn't it? Even so, it was postponed at the very last minute.
#124
GFalsella
BigMack70 said:
I feel bad for anyone who got suckered by Nvidia's marketing and bought 20 series specifically for ray tracing.

Only one card in the 20 series - the 2080 Ti - has a real reason to be purchased, and that is specifically for maximum possible 4k rasterized gaming performance. All the 20 series cards are (and always were going to be) hot garbage for ray tracing.

If you want ray tracing, wait 1-3 years for 2nd and 3rd generation GPUs designed for it.
Actually, the 2080 makes a lot of sense to buy even for traditional purposes. Outside of the US, 1080 Tis go for around 100 bucks more than a 2080. It makes sense for the wrong reason, but it does.
#125
lexluthermiester
Aldain said:
wrong... nvidia has a major role to play in this..
They provided the documentation, limited dev tools and consultation, sure.
Aldain said:
it was nvidia that worked CLOSELY with dice on this..
But it's still fully DICE's responsibility to get things right. They are the dev house, not NVidia.
Aldain said:
this is as much an nvidia FAIL as DICE's.. Stop with the meat-shield mentality
Wrong. This is exclusively the failure of DICE and EA because..
Vayra86 said:
Wow wow wow. Hold on. DICE is still not EA. Dev versus publisher.
EA still greenlit the release before proper testing was completed.
Vayra86 said:
The problems experienced are entirely creditable to DICE's programming work and/or collaborative work with Nvidia engineers
That first part is right. However, if NVidia's engineers were present, they were there only in an advisory capacity.

NVidia's engineers are like teachers, they can only teach so much, it's up to the students to pay attention, take the time to practice and work hard to get things right. DICE and EA didn't do so and it shows. This unpleasantness is not a failure of NVidia's tech at all. The ultimate accountability is on DICE and EA and ONLY those two.