
Battlefield V with RTX Initial Tests: Performance Halved

Wrong... NVIDIA has a major role to play in this.
They provided the documentation, limited dev tools and consultation, sure.
It was NVIDIA that worked CLOSELY with DICE on this.
But it's still fully DICE's responsibility to get things right. They are the dev house, not NVIDIA.
This is as much an NVIDIA fail as it is DICE's. Stop with the meat-shield mentality.
Wrong. This is exclusively the failure of DICE and EA because..
Wow wow wow. Hold on. DICE is still not EA. Dev versus publisher.
EA still green-lit the release before proper testing was completed.
The problems experienced are entirely attributable to DICE's programming work and/or its collaborative work with NVIDIA's engineers.
That first part is right. However, if NVidia's engineers were present, they were there only in an advisory capacity.

NVidia's engineers are like teachers, they can only teach so much, it's up to the students to pay attention, take the time to practice and work hard to get things right. DICE and EA didn't do so and it shows. This unpleasantness is not a failure of NVidia's tech at all. The ultimate accountability is on DICE and EA and ONLY those two.
 
Try disabling SLI. I have a feeling RTX is not working with SLI, especially since DX12 never worked with SLI and DX12 is a requirement for RTX.

DICE will have to add support for DX12 explicit multi-GPU.
You are wrong, SLI support depends on the developer and not on the API (in this case DX12).
There are DX12 games that do support SLI, like GoW4 and RoTTR, and I must say that the support is quite good.
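For anyone wondering what "explicit multi-GPU" actually means in practice: under DX12 the driver no longer hides the second card behind an SLI profile, so the game itself has to enumerate the adapters and decide how to split work across them. Here's a minimal sketch, assuming a standard D3D12/DXGI setup (the function name is just illustrative, not anything from Frostbite):

```cpp
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <vector>
// link with d3d12.lib and dxgi.lib

using Microsoft::WRL::ComPtr;

// Illustrative sketch: with DX12 explicit multi-GPU the application enumerates
// every hardware adapter and creates a device per GPU itself, then has to
// schedule work across them (e.g. alternate-frame rendering). Nothing happens
// automatically in the driver the way classic SLI profiles did.
std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    std::vector<ComPtr<ID3D12Device>> devices;

    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return devices;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices; // the engine decides how (or whether) to use more than one GPU
}
```

Once the devices exist, alternate-frame rendering, split-frame rendering, or simply ignoring the second GPU is entirely up to the developer, which is exactly why DX12 "SLI" support varies so much from game to game.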
 
NVidia's engineers are like teachers, they can only teach so much... The ultimate accountability is on DICE and EA and ONLY those two.

Let's not split hairs, at least you already backed down to EA + DICE. I'm good with that. No matter the accountability though, these early tech rollouts are still collaborative projects. I'm quite sure even Nvidia's engineers still take away valuable info from these field tests and push improvements based on it. And as long as they provide a codebase, they have a responsibility, and there is no way it's perfect from the get-go.

I work with software suppliers too (SaaS applications), I've seen first implementations in the field as well, and this is always how it works out. There is no single company arrogant enough to say their code is perfect, and that alone prompts change.
 
You are wrong, SLI support depends on the developer and not on the API (in this case DX12).
There are DX12 games that do support SLI, like GoW4 and RoTTR, and I must say that the support is quite good.

He may have been directly referencing Battlefield with his comment.
 
Maybe you should read my full post before replying, such as:
You too are judging an entire technology based on one game which was not designed for ray tracing.

Calling RTX anything less than revolutionary is an insult to every engineer who worked on it. Just because you don't understand it or we don't have the software available to take full advantage of it doesn't take anything away from what it is.
You're sounding like you don't understand it. This is not full ray tracing; it's lighting and shadows based on ray paths, and an engine made for RTX will still be a rasterizing engine.
One game doesn't write it off, I agree.

The fact it costs so much does, though. Most people buy a balanced PC, so a high-end build is normally a pretty decent PC attached to a decent monitor, now able to play any game at 1440p or 4K. That PC was not bought to run at 1080p in most owners' eyes.
 
This unpleasantness is not a failure of NVidia's tech at all. The ultimate accountability is on DICE and EA and ONLY those two.
Except you're missing the fact that this is DXR, not an NVIDIA tech, just a new DX12 feature, so of course it's all exposed to devs and anyone who cares to use it; AMD could use it too and have access to all the same libraries. It's a "standard" DX12 feature now. Maybe DICE didn't get it perfect, but it's not NVIDIA "special sauce".
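To illustrate that point: the capability check itself goes through the standard D3D12 feature-support query, not a vendor extension. A minimal sketch (the helper name is mine; the feature enum and struct are the standard DXR ones):

```cpp
#include <d3d12.h>
// requires a Windows 10 SDK recent enough to define D3D12_FEATURE_D3D12_OPTIONS5

// Sketch: DXR capability is queried via the regular D3D12 feature-support
// mechanism. Any GPU/driver that reports tier 1.0 or better can run the same
// DXR code path; there is no NVIDIA-specific entry point involved.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

Any vendor whose driver reports tier 1.0 or higher can run the same DXR path, which is what makes it a DX12 feature rather than NVIDIA special sauce.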
 
Let's not split hairs, at least you already backed down to EA + DICE.
You know what I meant though.
I'm quite sure even Nvidia's engineers still take away valuable info from these field tests and push improvements based on it.
Very likely. Every failure can be a great opportunity to learn!

Except you're missing the fact that this is DXR, not an NVIDIA tech, just a new DX12 feature, so of course it's all exposed to devs and anyone who cares to use it; AMD could use it too and have access to all the same libraries. It's a "standard" DX12 feature now. Maybe DICE didn't get it perfect, but it's not NVIDIA "special sauce".
That's a very good point actually.
 
Oh, but I have no doubts about ray tracing itself; it is, like you say, ALREADY an established tech for content creators. So then I say, what's new here? Nothing. RTX is vaporware and content creators get Quadro. Don't assume I don't know what it is...

What I have doubts about is doing it in a live scene and in fast paced gameplay where performance is key, and I have doubts about the timing and way RTX is implemented. There is no outlook on any kind of reasonable adoption rate in the near future. Consoles don't have it. Low-mid range doesn't have it. Only grossly overpriced, underperforming high end cards have it, from one single company. De facto monopolized tech with no market share is not something to put your bets on. This isn't CUDA or something.

Even still, RTRT in this implementation is just some low-quality passes tacked on to rasterized rendering. It is nowhere near the realism you'd want to extract from it. It's just a minor graphical upgrade with a massive performance hit - much like Hairworks ;)
OK, if you put it that way I can really agree with you. Too many people don't understand and start to treat it like "silly rays, nice reflections" without knowing what they are talking about. I think you touched a very important point: consoles. I really doubt the PS5 and others will be capable of ray tracing, and this will nullify all advantages for software houses (turning it into extra work for PC ports), so we won't see the actual advantages I'm hoping for during this decade.
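As a toy illustration of the "tacked on to rasterized rendering" point above (a simplification for clarity, not Frostbite's actual code): the frame is still rasterized as usual, and the ray-traced pass only contributes an extra per-pixel term, such as a reflection color, that gets blended in afterwards:

```cpp
#include <algorithm>

// Toy sketch of hybrid compositing: the rasterized shading result is blended
// with a ray-traced reflection sample, weighted by how mirror-like the surface
// is. Only pixels that actually spawn rays pay the ray-tracing cost.
struct Color { float r, g, b; };

Color CompositeHybridPixel(const Color& rasterized,    // normal raster/shading output
                           const Color& rayReflection, // output of the RT reflection pass
                           float reflectivity)         // 0 = fully rough, 1 = mirror
{
    const float w = std::clamp(reflectivity, 0.0f, 1.0f);
    return { rasterized.r + (rayReflection.r - rasterized.r) * w,
             rasterized.g + (rayReflection.g - rasterized.g) * w,
             rasterized.b + (rayReflection.b - rasterized.b) * w };
}
```

That per-pixel scaling is also consistent with reflection-heavy scenes showing the biggest frame-rate drops.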
 
He may have been directly referencing Battlefield with his comment.
Maybe, but then it's too soon to talk about this game not supporting SLI in DX12, because it's only been a few days since it was released.
 
Maybe, but then it's too soon to talk about this game not supporting SLI in DX12, because it's only been a few days since it was released.
Apparently it’s coming.
 
That seems to suggest there's a memory bottleneck...

Memory overclock improves RTX performance more than non-RTX performance. We saw a 5% improvement for RTX-on vs. 2% improvement for RTX-off at a given overclocked memory speed. However, we can't go too far with RTX 2080 Ti memory OC without having to play with advanced settings/cooling.
 
Funny this is. Barely 60 FPS with RTX on a $1,200 card in probably the best-optimized game on the market.
 
Maybe, but then it's too soon to talk about this game not supporting SLI in DX12, because it's only been a few days since it was released.
Head over to the NVIDIA forums and you will learn just how great SLI on DX12 is, i.e. it is not really supported, aggressively or properly, and I doubt it's really SLI rather than explicit multi-GPU as it should be; what's in a name, after all.

Why everyone defends this shit is beyond me.

And it is not too soon; they worked on this for years (3-5), not fecking days. Whether it supports SLI or not is of no matter to the 99% of RTX owners who are not going to buy two.

How many devs work on a thing for approximately 0.0002% of their audience?

RTX just died for three more years IMHO, at best.
 
Memory overclock improves RTX performance more than non-RTX performance. We saw a 5% improvement for RTX-on vs. 2% improvement for RTX-off at a given overclocked memory speed. However, we can't go too far with RTX 2080 Ti memory OC without having to play with advanced settings/cooling.
Thanks!

Is there any difference in temperatures or power consumption with RTX on/off?

Any changes to core frequency?

Could it be power limited, so that in order to run the RT cores other areas of the chip are power throttled or lose some headroom?
 
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.

So how exactly did NVIDIA introduce this in a way that was so wrong, in your opinion?
I honestly believe that customers shouldn't be paying for games that aren't made yet, either.
 
Is there any difference in temperatures or power consumption with RTX on/off?
Any changes to core frequency?
Could it be power limited, so that in order to run the RT cores other areas of the chip are power throttled or lose some headroom?
Same temperatures, same power consumption, same frequency.
It sounds logical that using the RT cores would throttle other parts of the GPU, but it's not directly visible in clocks or power.
 
Horrifying?
So you mean it does 80+ FPS at 1080p with DXR at the highest possible setting?
Assuming the same (less than) 50% hit, it'll do 60+ FPS at 1440p and 40+ FPS at UHD 4K.

What do you mean? There's a chart showing that turning RTX on drops it from 158 FPS to 65 FPS at 1080p. You definitely aren't getting 60+ FPS at 1440p.
 
What do you mean? There's a chart showing that turning RTX on drops it from 158 FPS to 65 FPS at 1080p. You definitely aren't getting 60+ FPS at 1440p.
Well, the initial news post has a unique understanding of "close to 50%" in light of the results that were updated and published later: 65 FPS out of 158 FPS is about 41% of the original frame rate, i.e. roughly a 59% hit. :D
 
Personally, I don't think this feature is going to win over users in a fast-paced shoot 'em up game. It needs a slow-paced single-player game like Portal 3 or something.
 
I'm just curious if anyone was actually going to use this tech to begin with. I got myself an RTX card but was not even remotely looking forward to this technology; we all knew the performance in actual gameplay was going to be horrible.
 
Told ya. It's still too early to process DXR. Probably 2025 or even later. We really need to bring DXR to mid-range cards with performance roughly equal to the 2080 Ti's. I think 2025 is the year to go. I saw some screenshots; it looks really nice.
 
Just a thought: nobody would have bitched about the crappy RTRT performance of the cards in any of the games if the cards had been 40% cheaper, like they should normally be priced. If not 50% less. Asking a PREMIUM for something that doesn't work properly is completely disrespectful and insulting. But we already know we cannot expect anything better from a typical greedy and unscrupulous corporation like nGreedia.
 
Just a thought: nobody would have bitched about the crappy RTRT performance of the cards in any of the games if the cards had been 40% cheaper, like they should normally be priced. If not 50% less. Asking a PREMIUM for something that doesn't work properly is completely disrespectful and insulting. But we already know we cannot expect anything better from a typical greedy and unscrupulous corporation like nGreedia.

Sadly, it didn't stop anyone from buying them. And honestly, the people that are complaining probably don't own one. And really, no one should be complaining because it was widely reported pre-launch that performance was going to tank hard.

Even if performance didn't fall through the floor, I am not sure I like how it looks. Perhaps with MOAR rays it will look better? Everything looks like it is made of glass and sort of comes out less realistic...

*Edited for grammar
 