
Battlefield V with RTX Initial Tests: Performance Halved

The negativity surrounding RTX is unbelievable. Technologically, this is probably the biggest step up we’ve had in the history of the GeForce cards.

When the Voodoo was released was there widespread support for Glide? No. When the GeForce 256 was released was there widespread support for hardware T&L? No. When the GeForce 3 came out was there widespread support for programmable shaders? No. Why do you expect any different from this?

So far we do not have any games using an engine built from the ground up for ray tracing support. Ray tracing has been patched onto rasterization engines which were never designed for ray tracing.

Software needs to catch up to the hardware. Look at any launch title on console and compare the graphics to something a few years down the line. The hardware didn’t change, but the software caught up.

A friend of mine gave me a brilliant analogy last night. When a baby takes its first steps, do the parents say "oh, that's crap, he's so slow and unstable", or are they blown away that their little one is making such good progress?

The same will be true of ray tracing. For the first time, we are looking at in-game image generation differently, and progress can only go one way. Wait until we have engines written with ray tracing in mind from the get-go.

If you wanted all-wheel ABS in 1978, Mercedes offered it for around $32,000 BACK THEN. That was your only option. Now it's found on almost every entry-level car regardless of price. Give ray tracing a bit of time to mature. There will never be a "right time" for the initial release, as without the initial release there will be no further progress.

Appreciate the technology for what it is and the revolutionary (as opposed to evolutionary) change it can bring.

So what if the RTX 2080 Ti gets 400 FPS instead of 180 FPS using traditional render methods? Both are beyond the level of perception of the human eye, so it becomes an arbitrary figure. What we need is a way to drastically increase image quality without the performance hit we would have had prior to the RTX cards, and that is now a reality. Don't judge the technology by the small increase in quality a badly patched rasterization engine delivers.

We now have the computational power that until not very long ago required a render farm, packaged into a single GPU with a price tag that high-end enthusiasts can afford. Show me one other graphics card that offers that.
Wow, you really like to write essays, don't you! Or is it a copy and paste from the nGreedia NPC software update?

Maybe you should work on trying to comprehend for yourself why people don’t like this situation before you write another marketing rant?
 
Could the RT cores be memory limited? They seem to be, judging from the 1080p teaser image...

Don't know about that yet. The teaser graph contains data from the RTX 2080 Ti, which already has the highest bandwidth on the market. After all this is over, we'll see how memory OC affects this.
 
Forget the performance, I don't even like how it looks in many areas. Anyone with half a brain saw this coming, because it was reported that it killed FPS, so no surprise. RTX is still some time away, but at least that $1,300 bought some 30% performance increase without the RTX that will have to be disabled, so it wasn't a complete waste of people's money.
 
"The more you buy, the more you save "......
 
I hate these "patches" that add technology to a game. If the game was not designed with this built in, the performance is not going to be good. Just look at all the games that have DX12 as a "patch" vs. games that are developed with DX12 natively.
 
First of all, to make this conclusion you have to compare the graphical quality of 1080p with RTX enabled and 4K without RTX enabled. As far as I know we don't have such a comparison, so you can't make such a conclusion. Don't get me wrong, maybe in some early games the difference won't be worth the hassle, maybe in others it totally will, but as time goes on, real-time ray tracing will for sure show superior results to rasterized rendering, since the technology is simply superior.

You say that it is not ready for prime time, but how do you make it ready for prime time if nobody owns such a product? Do you think game developers will develop technology for a phantom product?
The answer is obvious!
Well, "regular" BF5 performance numbers are readily available, so I'm not sure why you're trying to split hairs. The 2080 Ti gets 83 FPS at 4K Ultra vs. the initial numbers of 65 FPS at 1080p Ultra, the only graphical difference being the addition of RT (and remember that 4K pushes four times the pixels of 1080p). How can I not form a conclusion from that? Do you think the DICE dev team aren't all using RTX products for dev work? This is as good as it can be in its current form with the hardware available.
Let's be clear, I'm not hating on RT; I'm saying it's not ready for the consumer space with current hardware. Like all new tech, it takes at least a generation before it becomes readily accessible.
 
Maybe you should read my full post before replying, such as:

You too are judging an entire technology based on one game which was not designed for ray tracing.

Calling RTX anything less than revolutionary is an insult to every engineer who worked on it. Just because you don't understand it or we don't have the software available to take full advantage of it doesn't take anything away from what it is.
So we wait 3 years, or maybe 5, for a completely new game engine to be built from the ground up, then for the game itself to be created using this engine? Yeah, the nGreedia is strong with this one.

Let's face it, nGreedia knew this was going to happen, and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense. They must have been rolling around on their office floors pissing themselves with laughter while watching the sales figures tick up minute by minute.

And to help your comprehension, nobody has ever said that RTX technology is a bad idea. It’s the way nGreedia have gone about introducing it that has caused all this hatred. You see, people are not all totally stupid, some of us know when we are being manipulated and lied to.
 
RTX 2080 and 2070 data is a bloodbath. They post even bigger % losses than the 2080 Ti. ETA on our review is ~1 hr.
 
So we wait 3 years, or maybe 5, for a completely new game engine to be built from the ground up, then for the game itself to be created using this engine? Yeah, the nGreedia is strong with this one.

Let’s face it, nGreedia knew this was going to happen, and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.

Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.

And to help your comprehension, nobody has ever said that RTX technology is a bad idea. It’s the way nGreedia have gone about introducing it that has caused all this hatred. You see, people are not all totally stupid, some of us know when we are being manipulated and lied to.

So how exactly did NVIDIA introduce this that was so wrong in your opinion?
 
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.



So how exactly did NVIDIA introduce this that was so wrong in your opinion?

"It just works" a blatant marketing HUANG lvl of LIE
 
RTX 2080 and 2070 data is a bloodbath. They post even bigger % losses than the 2080 Ti. ETA on our review is ~1 hr.
That seems to suggest there's a memory bottleneck...
 
I love to say it... I told you so!

nGreedia... The way you're meant to be played! Fools.

Hope you 2080 Ti owners enjoy the 1080p 30 to 60 FPS gaming on your shiny 4K 144 Hz monitors! And what about the experts here saying what value for money the 2070 is? Yeah, real value, right there! A 2070 at 15-35 FPS at 1080p sounds great for a $600+ "value".

Can't wait to hear how wrong I am! nGreedia will be sending the software update out to the green NPC army right about now, so that should make this fun.

Yep, we are still 1 to 2 generations away from a ray tracing implementation across the whole range of cards, doing 1080p@60 with RT enabled even on the lowest-end cards. AMD knows this; it's why they said they're going to take care of ray tracing only when all their cards can support it.

And this is not everything, since you can increase or decrease the number of rays to be processed. So even later on, with 5 times the performance we have now, put in enough rays and performance will still be 30 FPS.

Question is, how many rays are enough? Perhaps an adaptive algorithm that raises and lowers the ray count to maintain a target frame rate would be in order? Thus lower-end cards would use fewer rays and higher-end cards more rays, for the same performance level. I would rather have something smart and done right like this than the shitshow we have now. A 1,400-euro card doing 30 FPS at 1080p is pretty much outrageous. You can get an Xbox, a PS4, a Nintendo Switch, and maybe have leftovers for a laptop, for that money.
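
To make the adaptive idea concrete, here is a minimal sketch in C++ of what I mean. Everything in it is hypothetical (the class name, thresholds, and ray counts are made up for illustration, not taken from any real engine API):

```cpp
#include <algorithm>

// Hypothetical adaptive ray-budget controller: nudges rays-per-pixel
// toward a target frame time. Illustrative only, not a real engine API.
class AdaptiveRayBudget {
public:
    AdaptiveRayBudget(double targetFrameMs, int minRays, int maxRays)
        : targetFrameMs_(targetFrameMs), minRays_(minRays),
          maxRays_(maxRays), raysPerPixel_(minRays),
          smoothedMs_(targetFrameMs) {}

    // Call once per frame with the measured frame time in milliseconds;
    // returns the ray count to use for the next frame.
    int update(double lastFrameMs) {
        // Exponential smoothing so a single hitch doesn't cause thrashing.
        smoothedMs_ = 0.9 * smoothedMs_ + 0.1 * lastFrameMs;

        if (smoothedMs_ > targetFrameMs_ * 1.05)
            --raysPerPixel_;   // over budget: shed rays
        else if (smoothedMs_ < targetFrameMs_ * 0.90)
            ++raysPerPixel_;   // clear headroom: add rays back slowly

        raysPerPixel_ = std::clamp(raysPerPixel_, minRays_, maxRays_);
        return raysPerPixel_;
    }

private:
    double targetFrameMs_;
    int    minRays_, maxRays_;
    int    raysPerPixel_;
    double smoothedMs_;
};

// Usage: AdaptiveRayBudget budget(16.7 /* ~60 FPS */, 1, 8);
//        int rays = budget.update(measuredFrameMs);
```

With something like this, a low-end card would converge to fewer rays and a high-end card to more, at the same target frame rate.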
 
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.



So how exactly did NVIDIA introduce this that was so wrong in your opinion?
I stopped spoon-feeding my daughter when she was 2; tell me why I should be spoon-feeding you?

Stop making up arguments. I never said anything of the sort. You really are not good at this; you really need to connect to the nGreedia server and get your software updated... You said that we have to wait for game devs to write an engine that supports RTX from the ground up. I told you the timeframe that would happen in.

I guess you have AMD and Trump to bring up next, followed by a third course of white male privilege and homophobia baiting?
 
I stopped spoon-feeding my daughter when she was 2; tell me why I should be spoon-feeding you?

Stop making up arguments. I never said anything of the sort. You really are not good at this; you really need to connect to the nGreedia server and get your software updated... You said that we have to wait for game devs to write an engine that supports RTX from the ground up. I told you the timeframe that would happen in.

I guess you have AMD and Trump to bring up next, followed by a third course of white male privilege and homophobia baiting?

and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.

So then what exactly are you saying here?
 
Yep, we are still 1 to 2 generations away from a ray tracing implementation across the whole range of cards, doing 1080p@60 with RT enabled even on the lowest-end cards. AMD knows this; it's why they said they're going to take care of ray tracing only when all their cards can support it.

And this is not everything, since you can increase or decrease the number of rays to be processed. So even later on, with 5 times the performance we have now, put in enough rays and performance will still be 30 FPS.

Question is, how many rays are enough? Perhaps an adaptive algorithm that raises and lowers the ray count to maintain a target frame rate would be in order? Thus lower-end cards would use fewer rays and higher-end cards more rays, for the same performance level. I would rather have something smart and done right like this than the shitshow we have now. A 1,400-euro card doing 30 FPS at 1080p is pretty much outrageous. You can get an Xbox, a PS4, a Nintendo Switch, and maybe have leftovers for a laptop, for that money.
For me, 0 rays are enough; make separate RTX-only cards for the tensor cores.
 
RTX 2080 and 2070 data is a bloodbath. They post even bigger % losses than the 2080 Ti. ETA on our review is ~1 hr.
I was thinking... if you're using ray tracing, do the "regular" shadows/lighting/reflections settings still affect visual quality, and do they affect FPS? If they don't affect quality (considering RT takes care of that) but they do affect FPS, maybe lowering them or turning them off would boost FPS without quality loss?
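
If that intuition is right, the logic would be something like this C++ sketch. To be clear, the setting names are made up for illustration, not BF5's actual settings API; and as far as I know BF5's DXR pass only covers reflections, so screen-space reflections are the obvious raster effect to drop:

```cpp
// Hypothetical settings reconciliation: if ray-traced reflections are
// enabled, screen-space reflections (SSR) are redundant, so turning
// them off should reclaim frame time with no visible quality loss.
// Field names are illustrative, not BF5's real settings.
struct GraphicsSettings {
    bool rtReflections = true;
    int  ssrQuality    = 3;   // screen-space reflections: 0 = off .. 3 = ultra
    int  shadowQuality = 3;   // still rasterized even with DXR on
};

void reconcile(GraphicsSettings& s) {
    if (s.rtReflections)
        s.ssrQuality = 0;     // RT supersedes SSR
    // Shadows and lighting keep their raster settings: the DXR pass
    // (in BF5, at least) does not replace them.
}
```

Whether the game actually skips the SSR work when DXR is on is exactly the kind of thing benchmarks would need to confirm.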
 
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.



So how exactly did NVIDIA introduce this that was so wrong in your opinion?
So then what exactly are you saying here?
Well, you're the one trying to justify it in its current state despite the fact that it's not the amazing miracle tech you continue to claim it is. What part of "it's not ready" do you need explained to you again? The numbers don't lie, and it's only going to get worse with the 2080 and 2070. How are you going to try to spin those numbers favourably?
Bottom line: the hardware DOESN'T exist.
 
Looking at the performance across all settings, it looks like a GPU/engine bug; it isn't using all cores or something. Drivers and the game will fix it, but I still expect a performance loss, maybe 20-30%.


greetings
 
Actually, I'm more inclined to say this is Hairworks 2.0. Barely visible improvements at a major performance hit, but don't worry, you can make it super low quality to have decent performance. PhysX on GPU has almost always been pretty flawless and barely impacted FPS, if at all.
Are you stupid, legally blind, or just trolling? I get all the skepticism towards ray tracing, I think it's way too early for it too, but comparing a totally game-changing tech with Hairworks and saying it's barely visible is not a statement one can take seriously. This is more like 2D->3D than Hairworks, but OK, I guess hating is easier. (Don't get me wrong, it's right to hate on NVIDIA given their shitty business practices, but RTX != NVIDIA.)
 
How about performance @720p? It should be OK, I guess, and next-gen cards will do 1080p, and after that 3rd-gen cards will do 1440p, and then...
 
Why didn't you buy two cards? Seriously, did you peons expect this to work with just one? If you can afford one $1,300 card, that means you can afford two, and what's money anyway when it comes to having the best of the best so your epeen is huge...
 
I feel bad for anyone who got suckered by Nvidia's marketing and bought 20 series specifically for ray tracing.

Only one card in the 20 series - the 2080 Ti - has a real reason to be purchased, and that is specifically for maximum possible 4K rasterized gaming performance. All the 20 series cards are (and always were going to be) hot garbage for ray tracing.

If you want ray tracing, wait 1-3 years for 2nd and 3rd generation GPUs designed for it.
 