Tuesday, November 14th 2017

Battlefield V with RTX Initial Tests: Performance Halved

Having survived an excruciatingly slow patch update, we are testing "Battlefield V" with DirectX Raytracing and NVIDIA RTX enabled, across the GeForce RTX 2070, RTX 2080, and RTX 2080 Ti, adding the RTX-on test data to our Battlefield V Performance Analysis article. We began testing with a GeForce RTX 2080 Ti graphics card with GeForce 416.94 WHQL drivers on Windows 10 1809. Our initial test results are shocking. With RTX enabled at the "ultra" setting, frame rates dropped by close to 50% at 1080p.

These numbers may look horrifying, given that at its highest setting even an RTX 2080 Ti isn't able to manage 1080p 120 Hz. But all is not lost: DICE added granularity to RTX. You can toggle between off, low, medium, high, and ultra as "degrees" of RTX level of detail, under the "DXR ray-traced reflections quality" setting. We are currently working on 27 new data points (each of the RTX 20-series graphics cards, at each level of RTX, at each of the three resolutions we tested).

Update: Our full performance analysis article is live now, including results for RTX 2070, 2080, 2080 Ti, each at RTX off/low/medium/high/ultra.

180 Comments on Battlefield V with RTX Initial Tests: Performance Halved

#77
btarunr
Editor & Senior Moderator
ikeke said:
Could the RT cores be memory limited? They seem to be from 1080p teaser image..
Don't know about that yet. The teaser graph contains data from RTX 2080 Ti, which already has the highest bandwidth in the market. After all this is over, we'll see how memory OC affects this.
Posted on Reply
#78
moproblems99
Forget the performance; I don't even like how it looks in many areas. Anyone with half a brain saw this coming, because it was reported that it killed FPS, so no surprise. RTX is still some time away, but at least that $1,300 bought some 30% performance increase without the RTX that will have to be disabled, so it wasn't a complete waste of people's money.
Posted on Reply
#79
PerfectWave
"The more you buy, the more you save "......
Posted on Reply
#80
Gasaraki
I hate these "patches" that add technology to a game. If the game was not designed with this built in, the performance is not going to be good. Just look at all the games that have DX12 as a "patch" vs. games that are developed with DX12 natively.
Posted on Reply
#81
INSTG8R
RH92 said:
First of all, to make this conclusion you have to compare the graphical quality of 1080p with RTX enabled and 4K without RTX enabled. As far as I know we don't have such a comparison, so you can't make such a conclusion. Don't get me wrong, maybe in some early games the difference won't be worth the hassle, maybe in others it totally will, but as time goes on, real-time ray tracing will for sure show superior results to rasterized rendering, since the technology is simply superior.

You say that it is not ready for prime time, but how do you make it ready for prime time if nobody owns such a product? Do you think game developers will develop technology for a phantom product?
The answer is obvious!
Well, "regular" BF5 performance numbers are readily available, so I'm not sure why you're trying to split hairs. The 2080 Ti gets 83 FPS at 4K Ultra vs. the initial numbers of 65 FPS at 1080p Ultra, the only graphical difference being the addition of RT. How can I not form a conclusion from that? Do you not think the DICE dev team are all using RTX products for dev work? This is as good as it can be in its current form with the hardware available.
Let's be clear, I'm not hating on RT; I'm saying it's not ready for the consumer space with current hardware. Like all new tech, it takes at least a generation before it becomes readily accessible.
Posted on Reply
#82
stimpy88
[XC] Oj101 said:
Maybe you should read my full post before replying, such as:
You too are judging an entire technology based on one game which was not designed for ray tracing.

Calling RTX anything less than revolutionary is an insult to every engineer who worked on it. Just because you don't understand it or we don't have the software available to take full advantage of it doesn't take anything away from what it is.
So we wait 3 years, or maybe 5, for a completely new game engine to be built from the ground up, then for the game itself to be created using this engine? Yeah, the nGreedia is strong with this one.

Let’s face it, nGreedia knew this was going to happen, and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense. They must have been rolling around on their office floors pissing themselves with laughter while they were watching the sales figures counting up minute by minute.

And to help your comprehension, nobody has ever said that RTX technology is a bad idea. It’s the way nGreedia have gone about introducing it that has caused all this hatred. You see, people are not all totally stupid, some of us know when we are being manipulated and lied to.
Posted on Reply
#83
btarunr
Editor & Senior Moderator
RTX 2080 and 2070 data is a bloodbath. They post even bigger % losses than the 2080 Ti. ETA on our review is ~1 hr.
Posted on Reply
#84
[XC] Oj101
stimpy88 said:
So we wait 3 years, or maybe 5, for a completely new game engine to be built from the ground up, then for the game itself to be created using this engine? Yeah, the nGreedia is strong with this one.

Let’s face it, nGreedia knew this was going to happen, and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.

stimpy88 said:
And to help your comprehension, nobody has ever said that RTX technology is a bad idea. It’s the way nGreedia have gone about introducing it that has caused all this hatred. You see, people are not all totally stupid, some of us know when we are being manipulated and lied to.
So how exactly did NVIDIA introduce this that was so wrong in your opinion?
Posted on Reply
#85
Aldain
[XC] Oj101 said:
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.

So how exactly did NVIDIA introduce this that was so wrong in your opinion?
"It just works": a blatant, Huang-level marketing LIE.
Posted on Reply
#86
ikeke
btarunr said:
RTX 2080 and 2070 data is a bloodbath. They post even bigger % losses than the 2080 Ti. ETA on our review is ~1 hr.
That seems to suggest there's a memory bottleneck...
Posted on Reply
#87
Bytales
stimpy88 said:
I love to say it... I told you so!

nGreedia... The way you're meant to be played! Fools.

Hope you 2080 Ti owners enjoy the 1080p 30 to 60 FPS gaming on your shiny 4K 144 Hz monitors! And what about the experts here saying what value for money the 2070 is? Yeah, real value, right there! A 2070 at 15-35 FPS 1080p sounds great for a $600+ "value".

Can't wait to hear how wrong I am! nGreedia will be sending the software update out to the green NPC army right about now, so that should make this fun.
Yep, we are still 1 to 2 generations away from a ray-tracing implementation across the whole range of cards, with even the lowest-end cards doing 1080p@60 with RT enabled. AMD knows this; it's why they said they're going to take care of ray tracing only when all their cards will support it.

And this is not everything, since you can increase or decrease the number of rays that are to be processed. So even later on, with 5 times the performance we have now, put in enough rays and performance will still be 30 fps.

Question is, how many rays are enough? Perhaps an adaptive algorithm that increases and decreases ray counts to maintain a target frame rate would be in order? Thus lower-end cards would use fewer rays, and higher-end cards more rays, for the same performance level. I would rather have something smart and done right like this than the shitshow we have now: €1,400 cards doing 30 fps at 1080p. Pretty much outrageous. You can get an Xbox, a PS4, a Nintendo Switch, and maybe have leftovers for a laptop for that money.
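The adaptive idea suggested above can be sketched as a simple frame-time feedback loop. This is a hypothetical illustration only: the function names, thresholds, clamp values, and the fixed 8 ms raster cost are all made-up assumptions, not taken from any real engine.

```python
# Hypothetical sketch of an adaptive ray-count controller: nudge the
# rays-per-pixel budget up or down to hold a target frame time.
# All constants here are illustrative assumptions.

def adjust_ray_count(rays_per_pixel, frame_time_ms, target_ms=16.7,
                     min_rays=0.25, max_rays=4.0, step=1.1):
    """Return a new rays-per-pixel budget based on the last frame's cost.

    If the frame ran long, cut the ray budget; if it had headroom,
    raise it. Clamped so quality never collapses or explodes.
    """
    if frame_time_ms > target_ms * 1.05:      # over budget -> fewer rays
        rays_per_pixel /= step
    elif frame_time_ms < target_ms * 0.90:    # headroom -> more rays
        rays_per_pixel *= step
    return max(min_rays, min(max_rays, rays_per_pixel))

def simulate(ms_per_ray, frames=60):
    """Toy model: frame time = ray cost + a fixed 8 ms of raster work."""
    rays = 1.0
    for _ in range(frames):
        frame_time = ms_per_ray * rays + 8.0
        rays = adjust_ray_count(rays, frame_time)
    return rays

# A slow GPU settles on a low ray budget, a fast one on a high budget,
# both near the same frame time -- the behaviour Bytales describes.
print(f"slow GPU: {simulate(30.0):.2f} rays/px")
print(f"fast GPU: {simulate(2.0):.2f} rays/px")
```

Real engines effectively ended up doing something similar by varying the fraction of pixels that spawn reflection rays, rather than dropping whole quality presets.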
Posted on Reply
#88
stimpy88
[XC] Oj101 said:
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.

So how exactly did NVIDIA introduce this that was so wrong in your opinion?
I stopped spoon feeding my daughter when she was 2, tell me why I should be spoon feeding you?

Stop making arguments up. I never said anything of the sort. You really are not good with this, you really need to connect to the nGreedia server, and get your software updated... You said that we have to wait for game devs to write an engine that supports RTX from the ground up. I told you the timeframe that would happen in.

I guess you have AMD and Trump to bring up next, followed by a third course of white male privilege and homophobia baiting?
Posted on Reply
#89
[XC] Oj101
stimpy88 said:
I stopped spoon feeding my daughter when she was 2, tell me why I should be spoon feeding you?

Stop making arguments up. I never said anything of the sort. You really are not good with this, you really need to connect to the nGreedia server, and get your software updated... You said that we have to wait for game devs to write an engine that supports RTX from the ground up. I told you the timeframe that would happen in.

I guess you have AMD and Trump to bring up next, followed by a third course of white male privilege and homophobia baiting?
stimpy88 said:
and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.
So then what exactly are you saying here?
Posted on Reply
#90
Pruny
Bytales said:
Yep, we are still 1 to 2 generations away from a ray-tracing implementation across the whole range of cards, with even the lowest-end cards doing 1080p@60 with RT enabled. AMD knows this; it's why they said they're going to take care of ray tracing only when all their cards will support it.

And this is not everything, since you can increase or decrease the number of rays that are to be processed. So even later on, with 5 times the performance we have now, put in enough rays and performance will still be 30 fps.

Question is, how many rays are enough? Perhaps an adaptive algorithm that increases and decreases ray counts to maintain a target frame rate would be in order? Thus lower-end cards would use fewer rays, and higher-end cards more rays, for the same performance level. I would rather have something smart and done right like this than the shitshow we have now: €1,400 cards doing 30 fps at 1080p. Pretty much outrageous. You can get an Xbox, a PS4, a Nintendo Switch, and maybe have leftovers for a laptop for that money.
For me, 0 rays are enough; make separate cards for the RTX-only tensors.
Posted on Reply
#91
pky
btarunr said:
RTX 2080 and 2070 data is a bloodbath. They post even bigger %losses than 2080 Ti. ETA on our review is ~1 hr.
I was thinking... if you're using Ray Tracing, do the "regular" shadows/lighting/reflections settings affect visual quality, and do they affect FPS? If they don't affect quality (considering RT takes care of that), but they do affect FPS - maybe lowering them/turning them off would boost FPS without quality loss?
Posted on Reply
#92
Nioktefe
ikeke said:
That seems to suggest theres a memory bottleneck....
The RTX 2070 has more bandwidth per TFLOP than the RTX 2080 Ti.

It would be interesting to see clocks and power consumption along with performance
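Quick arithmetic backs this up. The sketch below uses commonly cited launch specs as assumptions (448 GB/s and roughly 7.5 FP32 TFLOPS for the 2070; 616 GB/s and roughly 13.4 TFLOPS for the 2080 Ti); exact TFLOPS figures vary with boost clocks.

```python
# Bandwidth per TFLOP, computed from assumed launch-spec figures.
specs = {
    "RTX 2070":    {"bandwidth_gbs": 448.0, "fp32_tflops": 7.5},
    "RTX 2080 Ti": {"bandwidth_gbs": 616.0, "fp32_tflops": 13.4},
}

def gbs_per_tflop(card):
    """Memory bandwidth available per unit of FP32 compute."""
    s = specs[card]
    return s["bandwidth_gbs"] / s["fp32_tflops"]

for card in specs:
    print(f"{card}: {gbs_per_tflop(card):.1f} GB/s per TFLOP")
# With these figures the 2070 comes out ahead (~59.7 vs ~46.0 GB/s
# per TFLOP), which is why a pure memory bottleneck wouldn't predict
# the 2070 losing a *larger* percentage than the 2080 Ti.
```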
Posted on Reply
#93
INSTG8R
[XC] Oj101 said:
So then what exactly are you saying here?
Well, you're the one trying to justify it in its current state despite the fact that it's not the amazing miracle tech you continue to claim it is. What part of "it's not ready" do you need explained to you again? The numbers don't lie, and it's only going to get worse with the 2080 and 2070. How are you going to spin those numbers favourably?
Bottom line: the hardware DOESN'T exist.
Posted on Reply
#95
AMX85
Looking at performance across all settings, this looks like a GPU engine bug; it isn't using all cores or something. Drivers and game patches will fix it, but I still expect a performance loss, maybe 20-30%.


greetings
Posted on Reply
#96
GFalsella
Vayra86 said:
Actually I'm more inclined to say this is Hairworks 2.0. Barely visible improvements at a major performance hit, but don't worry, you can make it super low quality to have decent performance. PhysX on GPU has almost always been pretty flawless and barely impacted FPS, if at all.
Are you stupid, legally blind, or just trolling? I get all the skepticism towards ray tracing, and I think it's way too early for it too, but comparing a totally game-changing tech with Hairworks and saying it's barely visible is not a statement one can take seriously. This is more like 2D→3D than Hairworks, but OK, I guess hating is easier. (Don't get me wrong, it's right to hate on NVIDIA given their shitty business practices, but RTX != NVIDIA.)
Posted on Reply
#97
kastriot
How about performance @720p? It should be OK, I guess, and next-gen cards will do 1080p, and after that 3rd-gen cards will do 1440p, and then...
Posted on Reply
#98
Steevo
Why didn't you buy two cards? Seriously, did you peons expect this to work with just one? If you can afford one $1,300 card, that means you can afford 2, and what's money anyway when it comes to having the best of the best so your epeen is huge...
Posted on Reply
#99
BigMack70
I feel bad for anyone who got suckered by Nvidia's marketing and bought 20 series specifically for ray tracing.

Only one card in the 20 series - the 2080 Ti - has a real reason to be purchased, and that is specifically for maximum possible 4k rasterized gaming performance. All the 20 series cards are (and always were going to be) hot garbage for ray tracing.

If you want ray tracing, wait 1-3 years for 2nd and 3rd generation GPUs designed for it.
Posted on Reply
#100
SIGSEGV
PerfectWave said:
"The more you buy, the more you save "......
quote of the day. :roll:

It's hilarious, seriously. TBH, I don't have any reason to buy such a... they call it a 'luxury card' with, let's say, 'RTX' technology (it just sounds oh-wow~ to me), but it sacrifices almost half the performance. Really?
It just doesn't make sense. I am so happy and glad about my second-hand 1080 Ti purchase.
Posted on Reply