Wednesday, September 5th 2018

DICE to Dial Back Ray-tracing Eye-candy in Battlefield V to Favor Performance

EA-DICE, in an interview with Tom's Hardware, put out some juicy under-the-hood details about the PC release of "Battlefield V." The most prominent of these is that the commercial release of the game will slightly dial back the ray-tracing eye-candy we saw at NVIDIA's GeForce RTX launch event demo. DICE is rather conservative about its implementation of ray-tracing, and assures players that the lack of it won't make a tangible difference to the game's production design, and will certainly not affect gameplay (e.g., you won't be at a competitive disadvantage just because a squeaky-clean car in the middle of a warzone won't reflect an enemy sniper's glint accurately).

"What I think that we will do is take a pass on the levels and see if there is something that sticks out," said Christian Holmquist, technical director at DICE. "Because the materials are not tweaked for ray tracing, but sometimes they may show off something that's too strong or something that was not directly intended. But otherwise we won't change the levels; they'll be as they are. And then we might need to change some parameters in the ray tracing engine itself to maybe tone something down a little bit," he added. Throughout the game's levels and maps, DICE identified objects and elements that could hit framerates hard when ray-tracing is enabled, and "dialed down" ray-tracing for those assets. For example, a wall in one level (probably a glass mosaic wall) hit performance too hard, and the developers had to tone down its level of detail.
At this time, only GeForce RTX series users have access to the ray-tracing features in Battlefield V, and can turn them off to improve performance. There are no DXR fallbacks for people with other graphics cards (GeForce GTX or Radeon). "…we only talk with DXR. Because we have been running only NVIDIA hardware, we know that we have optimized for that hardware. We're also using certain features in the compiler with intrinsics, so there is a dependency. That can be resolved as we get hardware from another potential manufacturer. But as we tune for a specific piece of hardware, dependencies do start to go in, and we'd need another piece of hardware in order to re-tune." DICE appears to be open to AMD sending hardware with its own DXR feature-set implementation, so it could add support at a later stage. The RTX features themselves will only arrive via a day-zero patch when the game releases in October, and won't feature in tomorrow's open beta. There's also no support for NVIDIA SLI. The interview also reveals that Battlefield V has been optimized for processors with up to 6 cores and 12 threads. Source: Tom's Hardware
Add your own comment

62 Comments on DICE to Dial Back Ray-tracing Eye-candy in Battlefield V to Favor Performance

#26
Naito
Vayra86 said:
We aren't ready for RT when it sets us back entire generations of graphics performance...
Someone has to get the ball rolling
Posted on Reply
#27
Gungar
RejZoR said:
Of course they like the ray-tracing approach. Because they are lazy bastards, and just blasting rays at things and recording their bounce is literally the most brute-force approach one can think of. And no, it's not art design and workarounds. I call BS on all of that; it's yet another NVIDIA attempt to make everything we had for years disappear from history, because now they have RTX. Just like they erased all the physics when they pushed PhysX.

Here's CryEngine from 2014 doing actual realistic real-time reflections:
<div class="youtube-embed" data-id="aTPQmPfMA2A"><img src="https://i.ytimg.com/vi/aTPQmPfMA2A/hqdefault.jpg" /><div class="youtube-play"></div><a href="https://www.youtube.com/watch?v=aTPQmPfMA2A" target="_blank" class="youtube-title"></a></div>

There is nothing pre-computed or fixed here. The engine simply has to do it in real-time, because you can actually see individual items from the world inside the reflections. FOUR YEARS AGO (at least the video was made then, which means the tech can be even older). Four years in the gaming industry is like an eternity. So, there's that... Sure, ray tracing looks more precise, but do we really need that kind of precision NOW, when they clearly can't deliver good enough performance? Frankly, no.

Another one...
<div class="youtube-embed" data-id="N6QmDspWgaI"><img src="https://i.ytimg.com/vi/N6QmDspWgaI/hqdefault.jpg" /><div class="youtube-play"></div><a href="https://www.youtube.com/watch?v=N6QmDspWgaI" target="_blank" class="youtube-title"></a></div>

...from 5 years ago.

Skyrim engine from 4 years ago (with some mods I'm assuming)...
<div class="youtube-embed" data-id="MeyTkjDtf0E"><img src="https://i.ytimg.com/vi/MeyTkjDtf0E/hqdefault.jpg" /><div class="youtube-play"></div><a href="https://www.youtube.com/watch?v=MeyTkjDtf0E" target="_blank" class="youtube-title"></a></div>

Perspective and distance correct reflections.
Yeap exactly what i was thinking.
Posted on Reply
#28
iO
Wouldn't be too surprising if the other RTX showcase games also get downgraded...
Posted on Reply
#29
eidairaman1
The Exiled Airman
Why? RTX a weak card?

kabarsa said:
Ray tracing is basically a geometrical optics approximation. It's a fairly straightforward solution that solves a lot of problems with reflective and transparent surfaces, soft shadows from area lights, ambient occlusion, etc. We have a lot of tricks now in forward/deferred rendering paths that make life a lot harder if you try to achieve even something similar to what ray tracing can do simply by design. The only problem is performance. It's really good that someone with market share and resources is pushing it. Look at OTOY and Brigade, and the hardware Imagination Technologies demonstrated with UE4 ray tracing a few years ago. It is possible if you are willing to build dedicated hardware. I hope that AMD and Intel will hop on that train soon and it'll end up in some next generation of consoles to speed up adoption.
Works cited please or you lie.
Posted on Reply
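kabarsa's point above, that reflections, shadows, and occlusion fall out of ray tracing "simply by design", is easy to see in miniature: a hard-shadow test is nothing more than one extra ray. The toy Python sketch below is purely illustrative (a hypothetical scene of spheres, not anything from Frostbite or any real engine):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Distance t along the ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    # direction is assumed normalised, so the quadratic's 'a' term is 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                    # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None     # ignore hits behind the ray origin

def in_shadow(point, light_pos, blockers):
    """Hard-shadow test: one ray from the surface point toward the light."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    # Shadowed iff any blocker sits between the point and the light.
    return any(
        (t := intersect_sphere(point, direction, c, r)) is not None and t < dist
        for c, r in blockers
    )

# A unit sphere at the origin sits between the point and the light:
print(in_shadow((0, 0, -3), (0, 0, 3), [((0, 0, 0), 1.0)]))  # True
# Shift everything up 5 units and the blocker is out of the way:
print(in_shadow((0, 5, -3), (0, 5, 3), [((0, 0, 0), 1.0)]))  # False
```

The same `intersect_sphere` primitive, pointed along a reflection or light direction, also gives mirror reflections and occlusion with no special-case tricks; that is the "simple by design" part. The cost is that every such effect multiplies the ray count.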
#30
londiste
iO said:
Wouldn't be too surprising if the other RTX showcase games also get downgraded...
Downgraded how, or why? Compared to what was shown in the RTX demos? Probably. On the other hand, we wouldn't want all the excessive effects, like highly reflective surfaces everywhere or badly performing assets, in actual games anyway.
Posted on Reply
#32
StrayKAT
eidairaman1 said:
Why? RTX a weak card?


Works cited please or you lie.
Doesn't seem "weak" so much as a "two steps forward, one step back" sort of thing. Real-time ray tracing sounds cool, but not if it cripples performance at 1080p.
Posted on Reply
#34
londiste
Quake Wars: Ray Traced - http://www.qwrt.de/ (other projects linked on the page)
Quake 2: http://amietia.com/q2pt.html
<div class="youtube-embed" data-id="x19sIltR0qU"><img src="https://i.ytimg.com/vi/x19sIltR0qU/hqdefault.jpg" /><div class="youtube-play"></div><a href="https://www.youtube.com/watch?v=x19sIltR0qU" target="_blank" class="youtube-title"></a></div>

hat said:
The Darkplaces quake (that's quake 1) engine also supports ray tracing
Are you sure? I could not find anything about that with a quick search.
Posted on Reply
#35
notb
cadaveca said:
Maybe you should look up QuakeRT or Pantaray. For people like me that remember weird stuff, this whole RTX thing and the discussions behind it are hilarious.

https://software.intel.com/en-us/articles/quake-wars-gets-ray-traced
Why hilarious?
2004: 40 Xeon CPUs doing RTRT in Quake3 at 512x512px @ 20fps - called a great achievement (I remember this one from the press).
2009: 4 Xeon CPUs doing RTRT in ET:QW at 720p @ 20-35fps - fantastic, finally usable without HPC.
2018: a single GPU doing RTRT in BF V at 4K @ 20-30fps - awful, pointless.

And even if you play ET on highest settings and BF on lowest, the difference in details (and minimal requirements) is just enormous.

Are RTX cards too early? Not for those that are fine with 1080p. Everyone else can wait until RTX cards match their requirements. What we have is a PoC, but also a consumer product that gaming studios can work with.
It really doesn't matter when RTRT would be included in GPUs, because it would always mean a significant performance drop from a level that we would be used to at that point.

"Amazingly, this happened in 2004, a time when most people rejected the concept of real-time ray tracing." - nothing changed.
Posted on Reply
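For scale, the raw pixel throughput implied by those three milestones can be compared with quick arithmetic (resolutions and frame rates as quoted above; "4K" is assumed to mean 3840x2160, and the low end of each fps range is used):

```python
# (width, height, fps) for each RTRT milestone quoted above;
# conservative (low-end) frame rates are used throughout.
milestones = {
    "2004: Quake 3, 40 Xeons": (512, 512, 20),
    "2009: ET:QW, 4 Xeons": (1280, 720, 20),
    "2018: BF V, one GPU": (3840, 2160, 20),
}

throughput = {name: w * h * fps for name, (w, h, fps) in milestones.items()}
for name, px in throughput.items():
    print(f"{name}: {px / 1e6:.1f} Mpixels/s ray-traced")
```

That works out to roughly 5.2, 18.4, and 165.9 Mpixels/s: the 2018 figure is 9x the 2009 one and over 30x the 2004 one, before accounting for far more complex scenes and shading.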
#36
cadaveca
My name is Dave
notb said:
Why hilarious?
Asks the question, yet gives the exact answer. :roll:

But really, ray tracing has been used for lighting in movies, for cinematic effect, for at least a couple of decades now, but not in real-time. Peter Jackson was quite the tech investor when he started on the Lord of the Rings trilogy, and those movies have been out forever. :p Anyway, with that said, the idea that ray-tracing would enter the gaming arena has been worked on extensively by all players in the CPU/GPU field, and the idea that such effects would potentially cripple a system isn't new either, as you've pointed out. So devs need to very carefully balance effects vs. performance for all GPUs, not just "in general" but now quite specifically, and that's a lot of work. We should be commending the effort that is being put into this tech now. If anyone thinks that DICE has delayed a month because of RTX alone, they are greatly mistaken... it's been part of the engine since day one. The only thing that has changed, maybe, is the specs of the cards that support it, giving them a clear line in the sand to optimize to. But given the huge size of this franchise, they should have had such data for a long time, and if they haven't, well, that's hilarious to me as well.
Posted on Reply
#37
Prince Valiant
GPUs are faster for graphics processing tasks, who knew? :p I don't find this all that amazing, since they had to portion off space on a humongous GPU for dedicated HW to handle RT, or some aspect of it. I'd have been amazed if they had implemented it without an enormous penalty.
Posted on Reply
#38
notb
Prince Valiant said:
GPUs are faster for graphics processing tasks, who knew? :p I don't find this all that amazing, since they had to portion off space on a humongous GPU for dedicated HW to handle RT, or some aspect of it. I'd have been amazed if they had implemented it without an enormous penalty.
To be honest, Ray Tracing isn't really a very efficient task for GPUs. RTRT is still far from playable on GPGPU - that's why Nvidia went for RT ASIC.

Actually, RT is more like the tasks that we usually run on CPUs. Each ray (or group of parallel rays) is traced in a single thread. It moves through different matter, it bounces and refracts. It's a complicated serial problem, and running it on GPGPU means a massive number of cycles.
CPU rendering workstations (high-core Xeons) are still doing pretty well against the best GPUs despite having 50 or 100 times fewer cores. :-)
Posted on Reply
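notb's "complicated serial problem" point can be shown with a deliberately tiny toy tracer: one ray bouncing between two mirrors, where each recursive step cannot start until the previous hit is known. This is an illustrative sketch only (a made-up 1-D scene, not any real renderer's code):

```python
def trace(y, dy, depth, max_depth=8, reflectivity=0.8):
    """Follow one ray bouncing between mirrors at y=0 and y=1.

    Each recursive call needs the previous bounce's hit point before it
    can do anything, so the chain is inherently serial: wide GPU
    parallelism helps across MANY rays, not within one.
    """
    if depth >= max_depth:
        return 0.0                      # ray terminated: contributes nothing
    y_hit = 1.0 if dy > 0 else 0.0      # which mirror the ray is heading for
    emitted = 0.1                       # each mirror glows a little
    # Reflected contribution: the serial dependency lives in this call.
    return emitted + reflectivity * trace(y_hit, -dy, depth + 1,
                                          max_depth, reflectivity)

print(trace(0.5, 1.0, 0))  # a geometric series of dimming bounces, ~0.416
```

Eight bounces of one ray take eight dependent steps no matter how wide the hardware is; the throughput win comes from tracing millions of such rays side by side, which is what dedicated RT hardware is built to schedule.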
#39
theoneandonlymrk
notb said:
To be honest, Ray Tracing isn't really a very efficient task for GPUs. RTRT is still far from playable on GPGPU - that's why Nvidia went for RT ASIC.

Actually, RT is more like the tasks that we usually run on CPUs. Each ray (or group of parallel rays) is traced in a single thread. It moves through different matter, it bounces and refracts. It's a complicated serial problem, and running it on GPGPU means a massive number of cycles.
CPU rendering workstations (high-core Xeons) are still doing pretty well against the best GPUs despite having 50 or 100 times fewer cores. :-)
So not because of the Caustic IP then, expert?

As cadaveca points out, Nvidia reinvents the wheel, we're not convinced; they buried Caustic over ten years ago AFAIK, and they had working ASICs that they said could scale well.

Nvidia bought and buried them, and now, ten-ish years on, look: a wheel.

Nope, sorry, it was Imagination Technologies that bought Caustic (I am stressed at work ATM, brain ache inc).

And they sell add-in cards, apparently.

Still, no thanks, Nvidia.
Posted on Reply
#40
Captain_Tom
LOL, so those hardly noticeable differences in the demo will be further toned down, probably so 1080p@60 is possible. PS4 Slim performance, brought to Nvidia cards!
Posted on Reply
#41
RejZoR
notb said:
Why hilarious?
2004: 40 Xeon CPUs doing RTRT in Quake3 at 512x512px @ 20fps - called a great achievement (I remember this one from the press).
2009: 4 Xeon CPUs doing RTRT in ET:QW at 720p @ 20-35fps - fantastic, finally usable without HPC.
2018: a single GPU doing RTRT in BF V at 4K @ 20-30fps - awful, pointless.

And even if you play ET on highest settings and BF on lowest, the difference in details (and minimal requirements) is just enormous.

Are RTX cards too early? Not for those that are fine with 1080p. Everyone else can wait until RTX cards match their requirements. What we have is a PoC, but also a consumer product that gaming studios can work with.
It really doesn't matter when RTRT would be included in GPUs, because it would always mean a significant performance drop from a level that we would be used to at that point.

"Amazingly, this happened in 2004, a time when most people rejected the concept of real-time ray tracing." - nothing changed.
The only reason why it was "impressive" back then is "holy shit, we can actually have this thing moving", because until then, the only time we could see such graphics was after they were rendered on render farms for hours or days (offline rendering). When you just have variations of existing tech, it's less impressive.

Make no mistake, I can't wait for the day when the entire scene will be ray traced in real-time at 60+ fps at any resolution, even 4K or more. From there on, we'll basically have reached peak realism; the only things left to build upon will be the number of rays and bounces used, and more work on textures and models. There won't have to be a "creative" part to achieve effects; they'll just happen naturally. But we're still far away from there.

Imagine current ray tracing maturity being somewhere at the level of the first pixel shaders, when everyone was obsessed with them and basically only used them to make water...
Posted on Reply
#42
Captain_Tom
notb said:
To be honest, Ray Tracing isn't really a very efficient task for GPUs. RTRT is still far from playable on GPGPU - that's why Nvidia went for RT ASIC.

Actually, RT is more like the tasks that we usually run on CPUs. Each ray (or group of parallel rays) is traced in a single thread. It moves through different matter, it bounces and refracts. It's a complicated serial problem, and running it on GPGPU means a massive number of cycles.
CPU rendering workstations (high-core Xeons) are still doing pretty well against the best GPUs despite having 50 or 100 times fewer cores. :)
In fact, the PS3's CELL was capable of real-time ray tracing on one of its daughter cores, and I believe it was used in a couple of games (it looked really good, though of course 720p@30Hz is all it ran at).

RejZoR said:
The only reason why it was "impressive" back then is "holy shit, we can actually have this thing moving", because until then, the only time we could see such graphics was after they were rendered on render farms for hours or days (offline rendering). When you just have variations of existing tech, it's less impressive.

Make no mistake, I can't wait for the day when the entire scene will be ray traced in real-time at 60+ fps at any resolution, even 4K or more. From there on, we'll basically have reached peak realism; the only things left to build upon will be the number of rays and bounces used, and more work on textures and models. There won't have to be a "creative" part to achieve effects; they'll just happen naturally. But we're still far away from there.

Imagine current ray tracing maturity being somewhere at the level of the first pixel shaders, when everyone was obsessed with them and basically only used them to make water...
Wake me up when we can do it in 8K@480Hz. IMO we should try to reach 8K and ultra-high framerates first, so we can actually notice the difference in quality.
Posted on Reply
#43
cadaveca
My name is Dave
At the SIGGRAPH 2008 event, NVIDIA demonstrated a fully interactive GPU-based ray-tracer, which featured real-time ray-tracing at 30 frames per second (fps) and a resolution of 1920 x 1080 pixels. The demo saw NVIDIA flex its muscle, using almost every element of ray-tracing for which technology had been developed so far: a two-million-polygon scene, an image-based paint shader, ray-traced shadows, reflections, and refractions.
https://www.techpowerup.com/68864/nvidia-demonstrates-real-time-interactive-ray-tracing


notb said:
To be honest, Ray Tracing isn't really a very efficient task for GPUs. RTRT is still far from playable on GPGPU - that's why Nvidia went for RT ASIC.
Not a new idea, either, which adds to the hilarity. Let's jump back to 2006:
Ray tracing has long been considered too expensive for mainstream rendering purposes. Movie production studios have only recently begun the transition to using it; however, the true cost of ray tracing has been very poorly understood until recently. It is now poised to replace raster graphics for mainstream rendering purposes. Its behavior is very well suited to CPU processors, and scales well with hyper threading and multi-processor configurations. The traditional cache hierarchy associated with CPUs is very effective at managing the external memory bandwidth requirements.
https://arstechnica.com/uncategorized/2006/08/7430/
Posted on Reply
#44
mouacyk
Check out the BFV RTX video by JackFrags. A soldier's reflection popped into the window glass when it entered the "tracing volume" (probably about 100 yards). How realistic... they could have done a simple blend or something to ease the transition. They likely will for the final release.
Posted on Reply
#45
notb
theoneandonlymrk said:

As cadaveca points out, Nvidia reinvents the wheel, we're not convinced; they buried Caustic over ten years ago AFAIK, and they had working ASICs that they said could scale well.
This is the first product that makes RTRT at home possible in real-life gaming. At a decent premium, sure, but you don't need a Xeon server or a cluster to do it. How is that "reinventing the wheel"?
Sorry to say this, but it seems like many of you just heard about RTRT for the first time...
cadaveca said:

Not a new idea, either, which adds to the hilarity. Let's jump back to 2006:
Of course the idea isn't new. But the consumer-friendly implementation is a first, and should remain unmatched for a while. That's the great part.
RTRT itself is nothing special... at least for me.

Again: it seems like some people here just had their first contact with this idea. But they quickly found info that someone did it already 10 years ago, so it's clearly nothing special.
I guess on this forum the only exciting thing about games is more fps. Booooring.
Posted on Reply
#46
CrAsHnBuRnXp
the54thvoid said:
But it highlights the price of the initial cards. It's a lot of cash for minimal return on effects.
The same can be said of the 10-series cards for 4K, but people weren't complaining too hard about that.
Posted on Reply
#47
theoneandonlymrk
notb said:
This is the first product that makes RTRT at home possible in real-life gaming. At a decent premium, sure, but you don't need a Xeon server or a cluster to do it. How is that "reinventing the wheel"?
Sorry to say this, but it seems like many of you just heard about RTRT for the first time...

Of course the idea isn't new. But the consumer-friendly implementation is a first, and should remain unmatched for a while. That's the great part.
RTRT itself is nothing special... at least for me.

Again: it seems like some people here just had their first contact with this idea. But they quickly found info that someone did it already 10 years ago, so it's clearly nothing special.
I guess on this forum the only exciting thing about games is more fps. Booooring.
Look, I genuinely think you're stuck in your own perspective here. You have a 1050, so you clearly game at 1080p; even console gamers are edging away from 1080p@60.

I will likely get one somehow at some point just to try, but I insist on high res and IQ, so it's not for me, I know this.
Still, I'm not saying it's rubbish, just that it'll be rubbish to me and many others.

And RTRT isn't new; you definitely are not the only one who knows what's what here. You talk a lot, but that doesn't mean it's worth reading.
And Nvidia invented a (computationally) cheaper algorithm and the hardware to run a less-than-full version of ray tracing (it is not fully ray-traced), yet it still heavily impacts performance. I'm not paying over the odds for that.
By the sound of it, a 2080 Ti should play pretty much any present game at 4K@60 without RTX, or some games with RTX on at 1080p@60.

Those two resolutions work just fine on my monitor, but do you really believe I, or any other 4K owner, ever go down to 1080p?
Also because of this, a 2080/2080 Ti owner is going to connect it to a pretty decent monitor; they're not then going to be happy playing the latest AAA games at 1080p and swapping back to 4K for everything else. So I expect dev adoption to die off, even with big NV money about.
Posted on Reply
#48
Captain_Tom
theoneandonlymrk said:
Look, I genuinely think you're stuck in your own perspective here. You have a 1050, so you clearly game at 1080p; even console gamers are edging away from 1080p@60.

I will likely get one somehow at some point just to try, but I insist on high res and IQ, so it's not for me, I know this.
Still, I'm not saying it's rubbish, just that it'll be rubbish to me and many others.

And RTRT isn't new; you definitely are not the only one who knows what's what here. You talk a lot, but that doesn't mean it's worth reading.
And Nvidia invented a (computationally) cheaper algorithm and the hardware to run a less-than-full version of ray tracing (it is not fully ray-traced), yet it still heavily impacts performance. I'm not paying over the odds for that.
By the sound of it, a 2080 Ti should play pretty much any present game at 4K@60 without RTX, or some games with RTX on at 1080p@60.

Those two resolutions work just fine on my monitor, but do you really believe I, or any other 4K owner, ever go down to 1080p?
Also because of this, a 2080/2080 Ti owner is going to connect it to a pretty decent monitor; they're not then going to be happy playing the latest AAA games at 1080p and swapping back to 4K for everything else. So I expect dev adoption to die off, even with big NV money about.
It's sad too, because RT sounds like it could be very cool if it could be done without a massive performance hit. However, I personally think Nvidia intentionally left RT performance horrible. This allows them to quadruple their RT cores with 7 nm Ampere and claim ridiculous things like "4x the performance of Turing!"
Posted on Reply
#49
notb
theoneandonlymrk said:
Look, I genuinely think you're stuck in your own perspective here. You have a 1050, so you clearly game at 1080p; even console gamers are edging away from 1080p@60.
You should keep using the 1050 card strategy. It's really going well. :)
I don't game on the PC anymore. 1050 wasn't bought for gaming anyway.
And RTRT isn't new, you definitely are not the only one who knows what is what here ,you talk a lot, doesn't mean it's worth reading.
Well, we've already established that my knowledge is limited by what I read and yours comes from owning. I do hope you'll get an RTX card at some point and your rendering knowledge will explode. It's worth it. :-)
And Nvidia invented a (computationally) cheaper algorithm and the hardware to run a less-than-full version of ray tracing (it is not fully ray-traced), yet it still heavily impacts performance
You have literally no idea what you're talking about. :-D
1) There's no such thing as "fully raytraced".
2) Nvidia's RT approaches are among the most advanced available today. Sure, RTRT is nowhere near professional renders or even movies (which Nvidia is famous for, BTW), but RT cores speed up serious rendering as well. It's a huge achievement. And it works in tandem with tensor cores. It's just a showcase of technological supremacy.
Whether these first RTX cards are successful is another story; we'll see some sales statistics after the launch. IMO the preorders look promising.
Nvidia can now build professional rendering machines around this new tech, so the R&D will pay off anyway.
Posted on Reply
#50
theoneandonlymrk
notb said:
You should keep using the 1050 card strategy. It's really going well. :)
I don't game on the PC anymore. 1050 wasn't bought for gaming anyway.

Well, we've already established that my knowledge is limited by what I read and yours comes from owning. I do hope you'll get an RTX card at some point and your rendering knowledge will explode. It's worth it. :-)

You have literally no idea what you're talking about. :-D
1) There's no such thing as "fully raytraced".
2) Nvidia's RT approaches are among the most advanced available today. Sure, RTRT is nowhere near professional renders or even movies (which Nvidia is famous for, BTW), but RT cores speed up serious rendering as well. It's a huge achievement. And it works in tandem with tensor cores. It's just a showcase of technological supremacy.
Whether these first RTX cards are successful is another story; we'll see some sales statistics after the launch. IMO the preorders look promising.
Nvidia can now build professional rendering machines around this new tech, so the R&D will pay off anyway.
Susan, I thought we were past the Emmerdale moments.

Is your money where your mouth is
Or
Is your mouth where your money's earned

You're like a written, ever-changing Nvidia advert with ultimate BS appeal.
Posted on Reply