
Battlefield V with RTX Initial Tests: Performance Halved

Joined
Apr 30, 2012
Messages
3,494 (1.24/day)
Many people said it was to be expected that RTX effects would come with a big performance hit. To be honest, I did not expect it. The RT cores are there specifically to process those RTX effects, so I would have expected a few percent lost to overhead, not double-digit drops. To me it looks like either DICE did a very bad job on the DXR optimization, or NVIDIA has balanced those RTX cards completely wrong. As TechSpot points out:

Source

While the RTX effects are impressive, to me this looks more like a tech demo. NVIDIA will have to increase the RT core count considerably (double or triple it), or the technique has to be optimized a lot, before this becomes a real game changer. And if I were a buyer of an RTX 2070 (which I am not), I would be rather disappointed, since its meager 36 RT cores make DXR almost useless.

Still, I am excited to see what other upcoming games and DXR patches will bring to the table.
RTX itself (a very low sample rate plus a denoiser) is exactly the optimization required for "real-time". It's the same as taking your favorite renderer (Blender) and rendering at 1 spp with a denoiser: very low quality, but it gets the job done as quickly as possible.
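To illustrate the idea (a toy 1-D sketch, not NVIDIA's actual denoiser — the names `shade`, `render`, `denoise` and the 3-tap box blur are my assumptions standing in for the real AI filter): 1 sample per pixel is fast but noisy, and a reconstruction filter trades that noise for a little blur.

```python
import random

random.seed(42)

N = 100  # pixels in our toy 1-D "image"

def shade(x):
    # Hypothetical ground-truth shading: a smooth gradient in [0, 1].
    return x / (N - 1)

def render(x, spp):
    # Monte Carlo estimate: average `spp` noisy samples per pixel.
    return sum(shade(x) + random.uniform(-0.3, 0.3) for _ in range(spp)) / spp

def denoise(img):
    # Toy stand-in for the AI denoiser: a 3-tap box blur that trades
    # random noise for a small amount of blur.
    out = []
    for i in range(len(img)):
        window = img[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

truth   = [shade(x) for x in range(N)]
noisy   = [render(x, spp=1) for x in range(N)]   # 1 spp: fast but noisy
cleaned = denoise(noisy)

def mae(a, b):
    # Mean absolute error against the ground-truth image.
    return sum(abs(u - v) for u, v in zip(a, b)) / len(a)

print(f"1 spp MAE:     {mae(noisy, truth):.3f}")
print(f"denoised MAE:  {mae(cleaned, truth):.3f}")
```

The denoised error lands noticeably below the raw 1 spp error, which is the whole trick: spend almost nothing on samples, then let the filter reconstruct a plausible image.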

The optimization has to come from making those RT cores better and adding more of them. Just adding more RT cores right now would take up too much die space for not much benefit, as the scaling from the 2070 to the 2080 Ti shows. They need to be multiple times better (10× or more) so they won't get bogged down when games use more than one RTX effect.
 

VPII

New Member
Joined
Nov 18, 2018
Messages
5 (0.01/day)
This is a case of software catching up with technology, not the other way around. DICE will eventually get it to work as it should. I mean, go and look at the Star Wars demo using the Unreal Engine, with real-time ray tracing running on one RTX 2080.
 
Joined
Feb 3, 2017
Messages
2,004 (1.84/day)
Processor i5-8400
Motherboard ASUS ROG STRIX Z370-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-3200 CL16
Video Card(s) Gainward GeForce RTX 2080 Phoenix
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Logitech G700
Keyboard Corsair K60
RT cores do not seem to be that expensive in terms of die space: 10% or a little below compared to Volta, and ~15% compared to Pascal.
Tensor cores and the other Volta-era hardware are what take up a lot of space on Turing.
 
RT cores do not seem to be that expensive in terms of die space: 10% or a little below compared to Volta, and ~15% compared to Pascal.
Tensor cores and the other Volta-era hardware are what take up a lot of space on Turing.
Turing has a 1:1 SM:RT ratio; they either need to make the RT cores better or raise the ratio to 1:2 or more.
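As a rough back-of-envelope (assuming the ~10% die-share figure quoted above and TU102's 754 mm² die; these are illustrative numbers, not NVIDIA's actual floorplan), here is what simply multiplying the RT hardware would cost in area:

```python
# Rough arithmetic: if RT cores take ~10% of the die (the figure quoted
# in this thread), what would 2x or 3x the RT hardware cost in area?
TU102_AREA_MM2 = 754      # RTX 2080 Ti (TU102) die size in mm^2
RT_SHARE = 0.10           # assumed RT-core share of that die

rt_area = TU102_AREA_MM2 * RT_SHARE
for factor in (2, 3):
    extra = rt_area * (factor - 1)
    total = TU102_AREA_MM2 + extra
    print(f"{factor}x RT cores: +{extra:.0f} mm^2 -> ~{total:.0f} mm^2 die")
```

This ignores yield, power, and routing entirely, but it shows why "just add more RT cores" gets expensive fast on a die that is already huge.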
 

VPII

In all honesty, as I stated before, I do not think the problem is that the current RTX cards cannot handle real-time ray tracing; it is more a case of the software not using the hardware properly, and that will be ironed out going forward.
 