Thursday, October 17th 2019

Intel and Wargaming Join Forces to Deliver Ray Tracing to World of Tanks

Intel has been very serious about its efforts in computer graphics lately, mainly because of its plans to launch a dedicated GPU lineup and bring new features to the graphics card market. Today, Intel and Wargaming, maker of MMO titles like World of Tanks, World of Warships, and World of Warplanes, announced a partnership to bring a ray tracing feature to Wargaming's "Core" graphics engine, used in perhaps the best-known MMO title of them all, World of Tanks.

The joint efforts of Intel and Wargaming developers have led to an implementation of ray tracing that uses only regular software techniques, with no need for special hardware. Being hardware agnostic, this implementation works on any graphics card that can run DirectX 11, as the "Core" engine is written against the DirectX 11 API. To achieve this, the developers built a solution that uses the CPU for fast, multi-threaded bounding volume hierarchy (BVH) construction, which then feeds the GPU's compute shaders for ray tracing, making the ray tracing workload entirely dependent on the GPU's shader cores. Several features were reworked, with emphasis put on shadow quality. In the images below you can see exactly what difference the new ray-tracing implementation makes, and you can use almost any graphics card to get it. Wargaming notes that "some FPS" will be sacrificed if ray tracing is turned on, but your GPU shouldn't struggle too much.
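To picture the division of labour described above: the CPU side's job is to organize the scene into a bounding volume hierarchy that the GPU can traverse quickly. Below is a minimal, single-threaded Python sketch of a median-split BVH build, purely illustrative — Wargaming's actual engine code is multi-threaded native code, and the traversal happens in DirectX 11 compute shaders:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AABB:
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

def union(a: AABB, b: AABB) -> AABB:
    """Smallest box enclosing both a and b."""
    return AABB(tuple(min(x, y) for x, y in zip(a.lo, b.lo)),
                tuple(max(x, y) for x, y in zip(a.hi, b.hi)))

@dataclass
class Node:
    box: AABB
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    prim: Optional[int] = None  # leaves store a primitive index

def build_bvh(prims: list, boxes: list) -> Node:
    """Median-split BVH over primitive indices `prims` with bounds `boxes`."""
    box = boxes[prims[0]]
    for i in prims[1:]:
        box = union(box, boxes[i])
    if len(prims) == 1:
        return Node(box, prim=prims[0])
    # Split along the widest axis of the combined bounds.
    axis = max(range(3), key=lambda a: box.hi[a] - box.lo[a])
    prims = sorted(prims, key=lambda i: boxes[i].lo[axis])
    mid = len(prims) // 2
    return Node(box, build_bvh(prims[:mid], boxes), build_bvh(prims[mid:], boxes))
```

In a real pipeline the tree would be flattened into a GPU buffer each frame, which is why a fast, multi-threaded CPU build matters for dynamic scenes like moving tanks.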

72 Comments on Intel and Wargaming Join Forces to Deliver Ray Tracing to World of Tanks

#26
Vya Domus
kings
The charts have DXR High and even DXR Ultra settings!
No, you must be joking by picking out only the numbers that you want in order to push your bizarre idea that RTX is a must-have if you want RT.

Have a look at this : https://www.computerbase.de/2019-10/world-of-tanks-encore-rt/

WoT : RTX 2070 RT Ultra vs RT off : you lose about 35% of the performance with RT on ultra

Metro Exodus : RTX 2070 RT Ultra vs RT off : you lose about 40% of the performance with RT on ultra

So, come again ? How much better is the dedicated hardware ?

These comparisons don't even matter at the end of the day because these are different games with different implementations, but it shows the world can survive without dedicated RT cores that eat die space and drive up the price of silicon, even if you want to have RT in games. And this isn't the first time we've seen this either; Crytek showed the same thing earlier this year.
Posted on Reply
#27
john_
Maths check.

Losing 100% performance sends you to 0 fps and losing 120% of performance to... negative fps. Right?

I think you are confusing performance gains with losses and probably that "100% loss" should be "50% loss"
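The confusion being called out here is the usual asymmetry between gains and losses, easy to check with two lines of arithmetic (a 2x difference between 100 fps and 50 fps as an example):

```python
fast, slow = 100.0, 50.0            # frames per second
gain = (fast - slow) / slow * 100   # the slow result needs +100% to catch up
loss = (fast - slow) / fast * 100   # but the fast result only lost 50%
print(gain, loss)
```

A 100% gain going one way is only a 50% loss going the other, so quoting one as the other doubles the apparent damage.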
Posted on Reply
#28
Vya Domus
john_
Maths check.

Losing 100% performance sends you to 0 fps and losing 120% of performance to... negative fps. Right?

I think you are confusing performance gains with losses and probably that "100% loss" should be "50% loss"
You're right, I saw his "50% is very different from 100%" and followed that flawed math; I wrote down the correct % above.
Posted on Reply
#29
ZoneDymo
kings
And these are just shadows.

Maybe now the conspiracy theorists can stop the nonsense that RTX cards are a scam and we don't need dedicated RT hardware for anything.

Without dedicated Ray Tracing hardware, the performance loss is even more depressing.
They are a scam due to the price tag, not so much the initial dipping of toes in the water with ray tracing.
And tbh, so far I have not seen a ray tracing comparison that actually impresses me, apart from maybe the Quake remake ;)

But on the topic at hand, yeah, it seems to just make the shadows that are already there sharper; how is that ray tracing?

john_
snip

So, what is the status of PhysX 10 years later? Everyone seems to be using software PhysX and no one is using hardware PhysX. Also, I was surprised to see last week, probably with a few years' delay, that Nvidia has unlocked PhysX. The Epic store was giving away older Batman titles that support hardware PhysX for free, so I had the chance to revisit those games. And guess what: with an HD 7870 as the primary card and a GT 620 as the secondary card, I have hardware PhysX unlocked and fully functioning. 10 years ago Nvidia was locking it. They would have been selling many more low-end GPUs today and all those years if they hadn't been so arrogant. Or maybe not, if software PhysX is as good as hardware PhysX, or at least good enough.

snip
Fully agree with your post here, just wanted to add: is software PhysX as good as hardware, though?
I mean, I fully agree Nvidia completely killed it by buying out Ageia and then making it proprietary, and that still makes me angry.
But remember all those demos with PhysX? With rocks falling through cloth, tearing part of it away, or those pillars all falling differently depending on where they were hit, etc.
Well... we still don't have that in our games...
Posted on Reply
#30
kings
john_
Why do you believe that RTX is something more than hardware PhysX? Nvidia was always trying to create proprietary standards to lock its customers into its own products. They did it with PhysX and they are also trying to do it with RTX.

So, what is the status of PhysX 10 years later? Everyone seems to be using software PhysX and no one is using hardware PhysX. Also, I was surprised to see last week, probably with a few years' delay, that Nvidia has unlocked PhysX. The Epic store was giving away older Batman titles that support hardware PhysX for free, so I had the chance to revisit those games. And guess what: with an HD 7870 as the primary card and a GT 620 as the secondary card, I have hardware PhysX unlocked and fully functioning. 10 years ago Nvidia was locking it. They would have been selling many more low-end GPUs today and all those years if they hadn't been so arrogant. Or maybe not, if software PhysX is as good as hardware PhysX, or at least good enough.

Nvidia is trying to find uses for its tensor and RT cores while locking its customer base to its products, and ray tracing is a good excuse. That doesn't mean that Nvidia's hardware implementation is a necessity. Maybe ray tracing will end up being the best selling point for HEDT and high-end mainstream processors. Because why buy an ultra-expensive 2080 Ti if you can have the same RT performance with a simple RTX 2070 and a 12-16 core CPU that uses Intel's solution?
I don't care how RT is done; that is irrelevant for the general consumer. What matters is the end result.

And just as the link Vya Domus pointed out shows, even with this Intel implementation the RTX 2070 loses less performance than the RX 5700 XT, so dedicated hardware for RT is maybe not as useless as some people would think.
Posted on Reply
#31
64K
More people will jump on the RT bandwagon when AMD releases their GPUs with part of the die dedicated to improving the handling of RT. They already have that ready for the PS5 when it arrives, and note that this will not be a software solution. It will be a hardware solution.

The thing is to not fully judge this tech until it's implemented more in the future. Right now they are only scratching the surface of RT because if it were implemented too much right now it would turn a game into a slideshow. We are at least a couple more generations away from full implementation of RTRT on affordable GPUs.

For the time being if you don't like RT or see any difference or it's killing performance of your GPU then just turn it off.

Here's a demo of RTRT using the Unreal Engine:

Posted on Reply
#32
john_
kings
I don't care how RT is done; that is irrelevant for the general consumer. What matters is the end result.

And just as the link Vya Domus pointed out shows, even with this Intel implementation the RTX 2070 loses less performance than the RX 5700 XT, so dedicated hardware for RT is maybe not as useless as some people would think.
The general consumer who can't see the difference between an RT implementation on the CPU and an RTX implementation in hardware, without pausing and zooming in on a frame, will not care whether Nvidia's RTX is in hardware and Intel's is not. What will matter in the end is how much money was paid for it. And in the case of Intel's, and probably AMD's, solution, that will be zero.

Also, you can't compare different games and different implementations without also comparing visual results before coming to conclusions. You also can't take ONE implementation in DX11 and come to conclusions about every possible software implementation we will see in the future. Not to mention that WoT's RT implementation will probably receive more optimizations in the near future.

As for the lower losses on the RTX card, that probably has nothing to do with the RTX hardware in the 2070. Do you think Intel paid them to use even one RT core in the Nvidia GPUs? They have more experience optimizing for Nvidia hardware, and the results show in this case too. WoT was never an AMD-friendly title. Why would it be in this case?

PS. That example about PhysX, which you happily chose to ignore, is there to explain to you that RTX could easily be nothing more than a new PhysX case.
Posted on Reply
#33
londiste
john_
Why do you believe that RTX is something more than hardware PhysX? Nvidia was always trying to create proprietary standards to lock its customers into its own products. They did it with PhysX and they are also trying to do it with RTX.
DXR primarily.
john_
Nvidia is trying to find uses for its tensor and RT cores while locking its customer base to its products, and ray tracing is a good excuse. That doesn't mean that Nvidia's hardware implementation is a necessity. Maybe ray tracing will end up being the best selling point for HEDT and high-end mainstream processors. Because why buy an ultra-expensive 2080 Ti if you can have the same RT performance with a simple RTX 2070 and a 12-16 core CPU that uses Intel's solution?
There are stages of RT that benefit from the CPU — in this thread's context, building the BVH — but at tracing actual rays the GPU has the CPU beat. The CPU is horribly underpowered for the operations that are implemented in hardware on the GPU. In fact, the same operations are very fast running on shaders. And specialized hardware has shaders beat.
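For a sense of what "tracing actual rays" actually computes: BVH traversal is dominated by ray-box intersection tests like the slab test sketched below — a small, fixed arithmetic kernel run per ray, per node, millions of times a frame, which is exactly the sort of work that shaders do well and fixed-function RT units do better (illustrative Python, not any engine's real code):

```python
def ray_hits_aabb(origin, inv_dir, lo, hi):
    """Slab test: does a ray hit the axis-aligned box [lo, hi]?

    `inv_dir` is the precomputed component-wise reciprocal of the ray
    direction, so the inner loop needs only multiplies and compares.
    """
    tmin, tmax = 0.0, float("inf")
    for o, inv, l, h in zip(origin, inv_dir, lo, hi):
        t0, t1 = (l - o) * inv, (h - o) * inv
        if inv < 0.0:
            t0, t1 = t1, t0          # ray travels toward -axis: swap slabs
        tmin, tmax = max(tmin, t0), min(tmax, t1)
    return tmin <= tmax              # the slab intervals overlap => hit
```

Each test is a handful of multiply/compare operations with no branching to speak of, which is why baking it into dedicated silicon pays off so directly.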
john_
As for the lower losses on the RTX card, that probably has nothing to do with the RTX hardware in the 2070. You think Intel payed them to use even one RT core in the Nvidia GPUs? They have more experience on optimizing with Nvidia hardware and the results are seen also in this case. WoT was never an AMD friendly title. Why be in this case?
It has absolutely nothing to do with RT Cores. As far as we know, these are not exposed in DX11, only in DX12 and Vulkan (and possibly an OpenGL extension). If there is a difference, it is because of some other architectural difference.
Posted on Reply
#34
kings
john_
PS. That example about PhysX that you happily chose to ignore, is there to explain to you that RTX could easily be nothing more than a new PhysX case.
I didn't ignore it, I just don't think it makes sense to compare the two. PhysX was a locked technology, while DXR is a feature included in DX12, open to all.

PS: Just to clarify, if you ask me whether the future is going to be dedicated hardware for processing ray tracing? I don't know, nor does anyone.

But anyone can use it and implement ray tracing on their hardware/software, and that's a big difference from what PhysX was.
Posted on Reply
#35
Chrispy_
kings
I didn't ignore it, I just don't think it makes sense to compare the two. PhysX was a locked technology, while DXR is a feature included in DX12, open to all.

PS: Just to clarify, if you ask me whether the future is going to be dedicated hardware for processing ray tracing? I don't know, nor does anyone.

But anyone can use it and implement ray tracing on their hardware/software, and that's a big difference from what PhysX was.
The true test will be how well Nvidia RTX cards run DXR games using the open DXR API.

With neither the PS5 nor XBOXn+1 having Geforce DNA, and every major game engine being designed for those platforms, does Nvidia RTX even matter to anyone? It'll just be another G-Sync/PhysX/Hairworks defeat that either doesn't get used much, or is abandoned in favour of the open standard.
Posted on Reply
#36
sepheronx
Maybe I am blind, but I can't tell the difference.
Posted on Reply
#37
ZoneDymo
sepheronx
Maybe I am blind, but I can't tell the difference.


Blue to show sharper or added shadows
Red to show that it's a vaguer shadow vs. Off

It's all about the shadows in this, just like Tomb Raider's RTX implementation so far.
Posted on Reply
#38
sutyi
john_
Maths check.

Losing 100% performance sends you to 0 fps and losing 120% of performance to... negative fps. Right?

I think you are confusing performance gains with losses and probably that "100% loss" should be "50% loss"
It's a framerate blackhole!

On a more serious note, though, I benchmarked it this morning with a Ryzen 5 1600 + GTX 1060 6GB in 1080p ULTRA:

ULTRA + RT OFF: ~18372p
ULTRA + RT ON (Ultra): ~11595p



...so about a 40% performance cost. Framerate is capped at 141 fps on my end, although that probably doesn't really matter, as it only hits that for a brief moment when the benchmark starts.

Mind you, this game was never really Radeon friendly; treat it as an nVIDIA title in your minds. Since this implementation technically runs mostly on the CPU, maybe WG can improve the performance on AMD GPUs, but I wouldn't hold my breath...

PS: Got home, so updated with actual figures.
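Plugging the quoted benchmark points into the loss formula gives a figure just under the rounded 40%:

```python
rt_off, rt_on = 18372, 11595             # benchmark scores quoted above
cost = (rt_off - rt_on) / rt_off * 100   # relative score lost with RT on
print(f"RT Ultra cost: {cost:.0f}%")
```

About a 37% hit on a GTX 1060 — broadly in line with the RTX 2070's ~35% loss in the computerbase.de numbers cited earlier in the thread.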
Posted on Reply
#39
Vayra86
Seriously if you need long looks at side-by-side comparisons while there is a noticeable FPS hit to a technology...

it's not worth bothering with. So far, that's what we've got.

Let's see if they get there someday. I'm not holding my breath... despite all the marketing around it.
Posted on Reply
#40
renz496
Vya Domus
What's depressing is that you should be aware that this level of performance hit is generally the same in most other DXR-enabled games with RTX cards. Which means, at least compared to this game, the dedicated hardware makes almost no difference. People have every right to question how much of a difference these things really make.
That's because in this game the RT shadows are only applied to tanks, not the entire scene. I think the developers already explained this. The performance hit is not as bad as in other games that use DXR because they limit the RT implementation to individual tanks only — and, if I remember correctly, only to non-damaged tanks.

Chrispy_
The true test will be how well Nvidia RTX cards run DXR games using the open DXR API.

With neither the PS5 nor XBOXn+1 having Geforce DNA, and every major game engine being designed for those platforms, does Nvidia RTX even matter to anyone? It'll just be another G-Sync/PhysX/Hairworks defeat that either doesn't get used much, or is abandoned in favour of the open standard.
Nvidia RTX already makes use of the open standards. It doesn't matter that Nvidia's RTX implementation is built differently from the one in the PS or Xbox (which will be based on AMD's implementation). Game developers don't need to know the hardware details in depth; the most important thing for them is that the implementation is compliant with MS DXR. It is no different than tessellation: AMD and Nvidia built theirs differently, but when the 8th-gen consoles were dominated by AMD, tessellation did not suddenly run faster on AMD, because games did not suddenly favor the way AMD does tessellation over the way Nvidia does it. The only thing that matters is the effectiveness of the hardware at handling the computation.
Posted on Reply
#41
ZoneDymo
Vayra86
Seriously if you need long looks at side-by-side comparisons while there is a noticeable FPS hit to a technology...

it's not worth bothering with. So far, that's what we've got.

Let's see if they get there someday. I'm not holding my breath... despite all the marketing around it.
This is often the case with graphical settings, though; you just get used to how the game looks and that is that.
This is, however, IMO as if you are running the game with shadows turned off vs. on, so I would think it's worth it — that is, if you play that zoomed in.
Posted on Reply
#42
Voluman
Has anyone noticed it is a CPU-bound implementation, not GPU-based like Nvidia's?
So drop in many cores, please, and let's see how that works :)
Posted on Reply
#43
m4dn355
One thing comes to my mind out of this joint feature:
World of Tracing
Posted on Reply
#44
Fiendish
Vya Domus
No, you must be joking by picking out only the numbers that you want in order to push your bizarre idea that RTX is a must-have if you want RT.

Have a look at this : https://www.computerbase.de/2019-10/world-of-tanks-encore-rt/

WoT : RTX 2070 RT Ultra vs RT off : you lose about 35% of the performance with RT on ultra

Metro Exodus : RTX 2070 RT Ultra vs RT off : you lose about 40% of the performance with RT on ultra

So, come again ? How much better is the dedicated hardware ?

These comparisons don't even matter at the end of the day because these are different games with different implementations, but it shows the world can survive without dedicated RT cores that eat die space and drive up the price of silicon, even if you want to have RT in games. And this isn't the first time we've seen this either; Crytek showed the same thing earlier this year.
One of those games is using RT for full global illumination, the other is using RT for limited shadow effects; the fact that the performance hit is similar should give you a very good idea of how much better dedicated hardware is. The Crytek demo ended up showing the same thing when they noted how much better performing RTX cards would be if the dedicated hardware were used.

This SIGGRAPH presentation showed some of the numbers regarding how much dedicated hardware boosts RT performance; it's substantial.



The dedicated hardware itself takes up a relatively tiny amount of die space; the Tensor cores and the RT cores together only account for ~9% of the die space on current RTX cards, and of that 9%, the RT cores represent only about a third. So, assuming no other bottlenecks, you're getting 2x-3x the performance for a ~3% die-space cost.
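Taking the poster's figures at face value (they are estimates, not confirmed die-shot measurements), the claimed trade-off works out as follows:

```python
combined_share = 0.09                 # claimed Tensor + RT core share of the die
rt_core_share = combined_share / 3.0  # RT cores alone: about a third of that
speedup = (2.0, 3.0)                  # claimed RT throughput gain from RT cores
print(f"~{rt_core_share:.0%} of the die for {speedup[0]:.0f}x-{speedup[1]:.0f}x RT throughput")
```

Under those assumptions, roughly 3% of the die buys a 2x-3x speedup on the ray tracing workload.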

A lot of people think RT is too expensive even WITH the performance boost from dedicated hardware. It seems quite clear that going forward, dedicated RT hardware is indeed a "must-have" for real-time RT, if it's ever going to be taken seriously at least.
Posted on Reply
#45
MuhammedAbdo
World of Tanks only applies RT shadows to intact tanks (30 tanks at most); environments and other objects don't receive RT shadows, and even damaged tanks don't cast RT shadows!

Despite that, the RX 5700 XT loses 75% of its performance just enabling a very few Ultra RT shadows on a select number of tanks in the game, while the 2070 loses 55% of its performance. We end up with a simple game running close to 70 fps @1440p using software RT!


https://www.computerbase.de/2019-10/world-of-tanks-encore-rt/

Meanwhile, we have Metro Exodus doing hardware RT, achieving a massive RT GI and shadows implementation at 50 fps @1440p with the same 2070. The 2070 dropped 57% going from no RTX to High RTX. That's the power of hardware RT: significantly more effects for less or about the same performance impact!

Posted on Reply
#46
Zubasa
MuhammedAbdo
A lot of idiots are comparing very limited RT shadow effects in World of Tanks to full-scale RT shadows and global lighting in Metro Exodus! What a bunch of nonsensical morons!

World of Tanks only applies RT shadows to intact tanks (30 tanks at most); environments and other objects don't receive RT shadows, and even damaged tanks don't cast RT shadows!

Despite that, the RX 5700 XT loses 75% of its performance just enabling a very few Ultra RT shadows on a select number of tanks in the game, while the 2070 loses 55% of its performance. We end up with a simple game running close to 70 fps @1440p using software RT!


https://www.computerbase.de/2019-10/world-of-tanks-encore-rt/

Meanwhile, we have Metro Exodus doing hardware RT, achieving a massive RT GI and shadows implementation at 50 fps @1440p with the same 2070. The 2070 dropped 57% going from no RTX to High RTX. That's the power of hardware RT: significantly more effects for less or about the same performance impact!


The power of RT is not being able to play at 60 fps in 2019 with a $500+ card; nice moral victory you have there.
All the % is nice in theory, but WoT is a PvP game that can get rather competitive, so no one will sacrifice even 10% fps, let alone 55%.
As for limited effects, all implementations of ray tracing are very minimal given the horrible performance of even hardware-based RT,
to the point where it isn't hard to see noise/artifacts in the lighting. Practically, the RT cores are just wasted die space for this generation of GPUs.
So here is a reality pill for you.

Funny thing is, you praise how there are 70 fps with RT on, but also state that the game doesn't use RT 95% of the time.
Yes, RT games perform normally when you don't use the RT; who would have thought?
Posted on Reply
#47
Vayra86
m4dn355
One thing comes to my mind out of this joint feature:
World of Tracing
Tanking performance!
Posted on Reply
#48
64K
It's embarrassing seeing people who normally embrace tech advancement fighting against RTRT so hard.

Of course hardware that improves handling RTRT is too expensive. It's supposed to be. Remember when cell phones were new or SSDs?

RTRT kills the performance of my GPU. It's supposed to. It's for the future. It's not for past GPUs.

Nvidia is taking control of RTRT with RTX GPUs. They may try but they will fail. AMD already has a hardware solution for handling RTRT better with the upcoming PS5. Intel has a solution as well.

Developers are slow to embrace RTRT and some implementations are sloppy. Time takes time.

To me it looks like a good thing for gamers eventually but there are of course some rough spots right now.
Posted on Reply
#49
GreiverBlade
Recus
100 fps.

RTX performance hit can be justified by increased visual fidelity.


well that's fantastic ... all 4 panels look the same ...


ok, more seriously ... RT in WoT? that's unneeded (unwanted, but that's general for RT in the end, given its usefulness for the moment). What does it give more than what we already have? Visual fidelity? Well, WoT is a competitive tank MOBA (ok, less fast paced than most ... if you have 60fps you are good, no need for 75+), so FPS matter more than visual fidelity, thus the justification is usually considered as ... not being one. Given how easy it is to run maxed out at 1620p/75Hz, which looks good enough and sustains around 75fps, it would not be beneficial to activate RT and lose some FPS for such a low difference between RT off and the higher values.

ZoneDymo


Blue so show sharper or added shadows
Red to show that its a more vague shadow vs Off

its all about the shadows in this, just like Tomb Raider's RTX implementation so far.
ah that's a little more visible on that example ... mmhhhh not worth the fps impact tho

64K
It's embarrassing seeing people who normally embrace tech advancement fighting against RTRT so hard.

Of course hardware that improves handling RTRT is too expensive. It's supposed to be. Remember when cell phones were new or SSDs?

RTRT kills the performance of my GPU. It's supposed to. It's for the future. It's not for past GPUs.

Nvidia is taking control of RTRT with RTX GPUs. They may try but they will fail. AMD already has a hardware solution for handling RTRT better with the upcoming PS5. Intel has a solution as well.

Developers are slow to embrace RTRT and some implementations are sloppy. Time takes time.

To me it looks like a good thing for gamers eventually but there are of course some rough spots right now.
well, RT for now ... (imho, mind you) is unimpressive and minor, and a freaking huge chunk of an RTX card is used to run it ... while it could be used for something else; that's one hell of a drawback ...

comparing that to a cell phone or an SSD is a bit ... optimistic ... at best it's comparable to a cartridge fountain pen coming from a quill pen, in terms of advancement and usefulness, and unfortunately ... it has a price to pay more akin to the former than the latter ...
Posted on Reply
#50
64K
GreiverBlade
well, RT for now ... (imho, mind you) is unimpressive and minor, and a freaking huge chunk of an RTX card is used to run it ... while it could be used for something else; that's one hell of a drawback ...

comparing that to a cell phone or an SSD is a bit ... optimistic ... at best it's comparable to a cartridge fountain pen coming from a quill pen, in terms of advancement and usefulness, and unfortunately ... it has a price to pay more akin to the former than the latter ...
My point is that it's expensive because the hardware is new, just like with past new hardware. You've been around long enough to have seen that many times over.

We don't go straight to this overnight:

Posted on Reply