Friday, August 16th 2019

Assetto Corsa Competizione Dumps NVIDIA RTX

Assetto Corsa Competizione, the big AAA racing simulator slated for a September release, will lack support for NVIDIA RTX real-time raytracing technology, not just at launch, but for the foreseeable future. Italian game studio Kunos Simulazioni, responding to a specific question on the Steam Community forums, confirmed that the game will not receive NVIDIA RTX support.

"Our priority is to improve, optimize, and evolve all aspects of ACC. If after our long list of priorities the level of optimization of the title, and the maturity of the technology, permits a full blown implementation of RTX, we will gladly explore the possibility, but as of now there is no reason to steal development resources and time for a very low frame rate implementation," said the developer, in response to a question about NVIDIA RTX support at launch. This is significant, as Assetto Corsa Competizione was one of the posterboys of RTX, and featured in the very first list by NVIDIA, of RTX-ready games under development.
Source: Darth Hippious (Steam Community)

91 Comments on Assetto Corsa Competizione Dumps NVIDIA RTX

#51
Dave65
Ray tracing will become a thing when it is a thing.
It's not a thing... yet!
#52
Vya Domus
The gist of all this is that developers are starting to realize "hey, this is a lot of work and we have zero incentive to do it". That's all.
#53
AnarchoPrimitiv
londiste: If and when competitors (AMD primarily) come out with an RT solution, this will benefit immensely from the work developers and Nvidia are doing right now.
I could be wrong, but I feel like that would be the first time in a very long time (if ever) that Nvidia will have done something to the benefit of the entire community and to advance the industry as a whole. While loyal fanboys will seek to frame the following statement as taking sides, I feel as though it's been AMD that has continually made such advances to everyone's benefit (and obviously to their own; the two are not mutually exclusive), e.g. HBM, Mantle/DX12, open source initiatives, etc.
#54
semantics
It's strange to see people actively argue against ray tracing in general.
chr0nos: EDIT: The performance hit when enabling RTX is just too great to be worth it with current HW. Maybe when it's down to a 5-10% hit, it may be appealing.
I wonder how much of that is the hardware vs. how much is bad implementation. Like in the days of Crysis 2, where you got jersey barriers with thousands of polygons in their meshes, and tessellated water rendered across the entire level whether you could see it or not.
#55
AnarchoPrimitiv
birdie: @btarunr
Why the sensationalist title? Why not use "drop" instead of "dump"? As a journalist you must remain neutral.
How a given individual takes a word's implicit suggestions is completely subjective; some, like myself, didn't see any implied suggestion in the use of the word "dump". Not an accusation, but arguably, the very fact that you perceived such "bias" is a bigger indicator of your own bias and preferred allegiances/loyalties than anything else, i.e. it implies you're acting as a self-appointed defender and apologist for Nvidia (again, not an accusation I am making, but a probable opinion and analysis I could imagine many commenters making).
#56
danbert2000
All the multiplatform games will have some sort of DXR path for lighting and/or shadows IF the new consoles come out with support for raytracing. Otherwise, the situation with the Assetto Corsa devs will probably keep happening for any game that's not a blockbuster. The feature will be a selling point IF the consoles provide a wide install base and the usual suspects compete for best looking game by using the new capabilities.

I bought a 2070 Super knowing I appreciated the 1080 Ti performance at rasterization. I did not buy it for RTX. But I am hoping that in the next three years, which is probably how long I'll have this card, enough games will have options for it that I can at least try them out. If it never happens, I'll still have a good graphics card that just happens to do more than I need. This is exactly why the RTX series was so disappointing in the beginning, because the price per performance stayed flat and we got a feature no one cares about. The SUPER line does provide the needed bump in raster performance and we get something extra.
#57
Vayra86
AnarchoPrimitiv: I could be wrong, but I feel like that would be the first time in a very long time (if ever) that Nvidia will have done something to the benefit of the entire community and to advance the industry as a whole. While loyal fanboys will seek to frame the following statement as taking sides, I feel as though it's been AMD that has continually made such advances to everyone's benefit (and obviously to their own; the two are not mutually exclusive), e.g. HBM, Mantle/DX12, open source initiatives, etc.
I disagree on that one.

Many people have complained about GameWorks and they still do. But let's get one thing straight: GameWorks technologies have trickled down into the 'used everywhere' space, and they still do. PhysX is the only notable one that really didn't - and yet CPU PhysX did, and the SDK is now open to all, I believe. But think about AA tech for a minute; Nvidia has a major share in this. Additionally, you can complain about G-Sync, but they were still first in offering it and creating a market for it. I was never a fan of the model they used, and vastly prefer the FreeSync approach, but it's still there. In some ways your enemy is your friend in this business, because all progress is progress. Also, think about the development of GPU Boost, or improved delta compression. Many of those were later copied by AMD, with their own sauce no less (what's really different is how they implement it on their own arch). While AMD was suffering HBM shortages, Nvidia simply produced Maxwell GPUs with 256-bit buses doing just fine - and still does that today - for high-end GPUs. That cost advantage echoed in the price of, say, a GTX 970, which took the market by storm with very good perf/dollar.

And what about all those open AMD initiatives? Only a few of them really stick, such as Vulkan, in which AMD is merely one of the players. Many things AMD portrays as 'open initiatives' are industry-wide movements they also have their own flavor of. Mantle/DX12/low-level access, for example, is much bigger than AMD and is not new either, even though some like to attribute all of that to the Mantle initiative (as if BF is of any major relevance in the market - see the lukewarm BFV RTX reception). I think if AMD contributed to the industry, their biggest achievements are in the CPU area instead.

In addition, the performance leader does push the industry forward. It's that simple: more performance means developers can optimize for a higher-performance mainstream. For PC gaming, I would definitely say Nvidia has injected more money into developers and overall graphics fidelity than AMD has in the last few decades. There is a LOT of support being provided under the hood through dev channels. That 65-80% market share means there is a lot of margin to spend that money and even give it away. There is even a form of responsibility, because if Nvidia doesn't take care of the content creators, they won't create nicer games that need higher-performance GPUs - the two feed each other. I'm not really complaining about that, to be honest, because dev money leads to in-game results, and those favor everyone.

Another point could be made for AMD though. The console performance level dictates what most big budget games get optimized around. Better AMD console GPUs mean better PC games. Unfortunately, consoles are not the cutting edge, so while this is progress, it won't usually 'feel' as such - it actually feels like slowing us down from a PC gamer perspective.

On topic - this RTX push. There is an equally large group explaining it as just that: pushing the industry forward, or Nvidia serving mostly itself. Both are true, really.

A matter of perspective ;)
#58
Xzibit
Imagination tried it in 2014 and targeted devs; it didn't get them very far. They didn't throw money at developers to include features, though.

Nvidia pours money into these dev houses as a partnership ($$$) to include RTX. Nvidia could just dump more money into KUNOS for them to find "the maturity of the technology" worth their while again.
#59
Fluffmeister
This isn't too surprising in the grand scheme of things; we are all still waiting for DX12 and Vulkan to rule the world, after all.
#60
notb
TPU runs almost no stories about RTRT development or new games that start implementing it, but - inevitably - we get a story about a game studio that delays RTRT implementation.

Normally I'd say @btarunr is pro-AMD (like many on this forum). However, AMD has already said RTRT hardware is coming soon, so what's going on? Are you guys just anti-RTRT? Why?

As for ACC itself: it is made by a tiny, highly specialized studio, Kunos Simulazioni. It's under 50 people who basically make a single game. Switching to RTRT will always be a huge cost for such small companies.
For similar reasons ACC still uses UE4 (today used mostly in indie games). They can't afford an in-house engine, and it seems they can't afford switching to something more modern. It's not that surprising, since graphics have always had lower priority in their games.

For large corporations, shifting to RTRT is going to be fairly easy. Yes, games have to be written differently. Yes, you have to employ different people. But it's really universal (just like RT rendering engines).
With the few mainstream games they release yearly, the cost of transition will be absorbed easily.
#61
Assimilator
Vayra86: Seriously, who gives a damn what the reasons are? The fact remains it looks and plays like an afterthought, and last I checked we didn't get a discount for beta testing. We are talking about very costly GPUs here that sell with a worse perf/dollar ratio than in 2016. Wake up already.
And yet AMD GPUs, which don't include ray-tracing hardware and are significantly smaller and less complex, sell for almost the same amount of money as NVIDIA's. Remind me who's ripping off customers again, hmmm?
#62
Vayra86
So that makes the price increase OK? :kookoo: AMD's price is completely unrelated here. And they are also selling costly GPUs at low margins; think about the HBM-equipped Fury, Vega, and RVII.
#63
notb
Vayra86: So that makes the price increase OK? :kookoo: AMD's price is completely unrelated here. And they are also selling costly GPUs at low margins; think about the HBM-equipped Fury, Vega, and RVII.
So... we have 2 dGPU makers and you say both are too expensive.
How exactly does that work? :-D
#64
BiggieShady
sutyi: As for tweaking the lighting, I don't think DXR really simplifies much... with RT you probably need to adjust any and all surfaces for light-bounce properties, otherwise you'll get pretty derpy lighting artifacts.
That's why most games in the last year use physically based shaders with an emission texture map, which makes this a non-issue for digital assets created by hand... it can look off with photogrammetry if it is not done right.
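For illustration, a minimal sketch (hypothetical names, C++) of the idea: a physically based material already carries an emissive slot alongside its albedo, normal, and roughness data, so a ray traced bounce pass can read light emission straight off the asset instead of requiring per-surface retuning.

struct TextureHandle { unsigned id; }; // stand-in for a real engine texture type

struct PbrMaterial {
    TextureHandle albedo;        // base color
    TextureHandle normal;        // surface detail
    TextureHandle metalRough;    // metallic + roughness, packed
    TextureHandle emissive;      // light the surface emits on its own
    float         emissiveScale; // artist-tuned emission intensity
};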
Aldain: Um dude, ATI TruForm is tessellation...
Exactly the point... how many years passed from the ATI Radeon 8000 until tessellation was used in games? I'd say about 10 years... even 15 if you look at widespread adoption.
Why that long? Fixed-function tessellation is useless; it only makes sense once it's programmable via shaders (hull/domain shaders in DX11).
Similarly, RTX as such, in its first incarnation, is evidently almost useless... but not as useless as tessellation on DX8.1 hardware. I'd say it's just as useful as tessellation on Radeon 5000-series DX11 GPUs: it's there, but performance tanks if you use it.
#65
bug
Vayra86: Irony of this story: if you put the ground rules and assets in place for a raster scene, it behaves exactly the same way. Many engines run simulations just the same, as RT is a simulation. And it does handle the code that's there much more efficiently, as it does not calculate all sorts of crap it won't use (no probing, culling).

It's potato-potahto material, and it all takes work, while Nvidia has provided zero proof that workflows magically require fewer man-hours for similar results. Just guesstimates induced by a healthy dose of marketing for the next best thing.

Nothing 'just works'; all those things RT does 'on its own' are useless as we lack the horsepower to push it anyway. So you end up spending an equal amount of time fixing all of that.

The only thing you need less of with RT is talented devs and designers. Raster takes more skill to get right, not more time. RT is just a lazy package brute-forcing it for you and passing the bill to end users.

I've seen it too often. New ways of working, new algorithms... and yet every half-serious dev squad has a backlog to keep going for years...
And yet we haven't had rasterization in movies since the first Jurassic Park. I wonder why that is ;)
#66
Tartaros
londiste: It is a chicken-and-egg problem.
It's not. It's a problem of trying to parcel out new tech for the sake of trying to squeeze the cow, while the rest do their thing more slowly because you don't want to collaborate. We already had PhysX and Havok as a case, or even G-Sync and FreeSync. Eventually we will get there, and everyone is working towards that, but Nvidia ALWAYS applies the walled garden, just to tear it down years later, slowing everyone else in the process. Walled gardens don't work on PC.

Look, I'm hyped over RT like no one else, and Cyberpunk 2077 got me really excited, but it would be better if Nvidia were more open about this.
#67
bug
Tartaros: It's not. It's a problem of trying to parcel out new tech for the sake of trying to squeeze the cow, while the rest do their thing more slowly because you don't want to collaborate. We already had PhysX and Havok as a case, or even G-Sync and FreeSync. Eventually we will get there, and everyone is working towards that, but Nvidia ALWAYS applies the walled garden, just to tear it down years later, slowing everyone else in the process. Walled gardens don't work on PC.

Look, I'm hyped over RT like no one else, and Cyberpunk 2077 got me really excited, but it would be better if Nvidia were more open about this.
DXR is not a walled garden. And neither is its Vulkan counterpart (which I'm not sure is finished yet).
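For what it's worth, a minimal sketch of how vendor-neutral that is: DXR capability is queried through plain D3D12 (a recent Windows 10 SDK assumed), with no vendor-specific entry point; any GPU whose driver reports a raytracing tier qualifies.

#include <d3d12.h>

// Returns true if the device exposes DXR, whoever made the GPU.
bool SupportsDxr(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false;
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}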
#68
ZoneDymo
birdie: @btarunr
Why the sensationalist title? Why not use "drop" instead of "dump"? As a journalist you must remain neutral.
It's the same thing; you're splitting hairs just because you yourself are too emotionally involved in this, or you see that 99% of the forum users are.
People are weird, man.

I just don't get it. Haven't we all wanted real-time ray tracing for, like... two decades now?
And it's finally getting worked on, with a good shot at becoming a thing... and people laugh when a company drops support?

I mean, I get it: it's RTX, it's Nvidia, the company who pushes tech... when it exclusively benefits them (cough PhysX cough cough G-Sync cough).
But we all want real-time global illumination to become a thing.

I don't know about you guys, but I get sad thinking about, for example, AMD's Mantle: how it was meant to introduce TrueAudio, which would finally give audio quality a much-needed kick in the rear, and that never panned out.
Or how DX12 was supposed to natively support us hooking up whatever GPUs we wanted and having them work together.
Or that Lucid Hydra chip before it.
#69
Steevo
Another example of a technology with walled-off support in limited fashion that is hard to implement correctly. There are other forms of open-source ray tracing that give superior results with lower computational overhead.

PhysX anyone?
#70
Vayra86
bug: And yet we haven't had rasterization in movies since the first Jurassic Park. I wonder why that is ;)
Because movies are not an interactive audiovisual experience, but a passive one, that can be rendered ahead of time, and not in realtime?

(obtuse on purpose, or just a momentary lapse?)
notb: So... we have 2 dGPU makers and you say both are too expensive.
How exactly does that work? :-D
Common sense, a look at the recent past, and at how a perf/dollar curve is supposed to slope - not suddenly spike because both companies decided to slow down on per-generation performance gains.

How this works: sales are lower than expected, so companies work harder to sell their next version at a better price or with better features; inventory won't sell, and old stock gets sold at a discount. Or we buy into the marketing nonsense/empty promises and confirm it's a price level we agree with. Commerce does indeed work like that, yes. These two dGPU makers have a history, and we have their GPUs already. We're upgrading, and we can choose not to; it's as simple as that. The only market share Nvidia is really capturing by storm with RTX is through the RTX 2060, for all those who didn't feel like paying around 500 bucks for a GTX 1080 three years ago. They got one instead for 350. The rest is just not interesting at all for anyone who already has a midrange or better GPU. Navi is more of the same, really, by the way; 5-10% perf gaps are too silly to even talk about. The upper midrange is simply massively overcrowded right now, and it's been more of the same ever since Pascal was released. So yes, shit's too expensive, disregarding two or three exceptions and the odd lucky deal.

Seriously, the above two responses from both of you are a clear sign of tunnel vision. You know these answers just as well as I do, but it's a reality you don't like to hear, so you choose to contest it. You can rest assured, it's futile. Whichever way it flies, the reality won't change, and that is: you're paying way too much for a baby step forward. If that is what you want to support or defend, be my guest, but I'm just going to exercise some more patience until products arrive that dó offer what I want at a reasonable price point. And they will appear, no worries.

Same goes for RT-enabled cards and games. When it's truly ready like 'Crysis' was ready - and we still use the familiar quote today, even though it was unplayable at launch ;) - these fruitless discussions won't even be on anyone's mind. Everyone will be amazed when a product offers something truly groundbreaking, and the momentum it gets will silence any naysayers; you can go back in history for proof of that. The current state of affairs is not that - and that is the point I've been making since RTX was announced. It's good if you can make the distinction between hit and miss, and for any new tech or innovation, timing is everything. AMD can write a book about that, and so can Nvidia.

As time passes and RTRT 'momentum' is still nonexistent, perhaps the timing here was completely wrong, and you'd be a complete idiot to pay a premium for that.
bug: DXR is not a walled garden. And neither is its Vulkan counterpart (which I'm not sure is finished yet).
You can build walls in many ways. One of them is having a high admission fee. Another is using custom hardware ahead of everyone else. It means you put most of the effort into your own flowers and trample the ones others are growing. Nvidia smelled an opportunity for cash and for satisfying shareholders after mining dipped, so it jumped the gun. They even rushed it so badly that there was barely anything to show us. And there still really isn't much.
#71
eidairaman1
The Exiled Airman
RTX is just like SLI and PhysX: a DEAD END.
Steevo: Another example of a technology with walled-off support in limited fashion that is hard to implement correctly. There are other forms of open-source ray tracing that give superior results with lower computational overhead.

PhysX anyone?
RadeonRays
#72
R-T-B
birdie: @btarunr
Why the sensationalist title? Why not use "drop" instead of "dump"? As a journalist you must remain neutral.
You know why. We all know why.
eidairaman1: RadeonRays
Anything with a brand attached to its name is a dead end. Open, brand-neutral standards are the way forward.
#73
bug
Vayra86: Because movies are not an interactive audiovisual experience, but a passive one that can be rendered ahead of time, not in real time?

(obtuse on purpose, or just a momentary lapse?)
No, I was just pointing out that, everything else being equal, ray tracing still yields better visuals. That's the primary reason RTRT is worth pursuing ;)
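To make that concrete, here's a toy sketch (plain C++, not anyone's real engine code) of why the ray tracing model appeals: the entire renderer reduces to one question per pixel, "what does this ray hit?", and effects like shadows and reflections reuse that same mechanism instead of needing separate raster tricks.

#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Distance along the ray to a sphere at c with radius r, or -1 on a miss.
static double hitSphere(Vec o, Vec d, Vec c, double r) {
    Vec oc{o.x - c.x, o.y - c.y, o.z - c.z};
    double a = dot(d, d), b = 2.0 * dot(oc, d), k = dot(oc, oc) - r * r;
    double disc = b * b - 4.0 * a * k;
    return disc < 0 ? -1.0 : (-b - std::sqrt(disc)) / (2.0 * a);
}

int main() {
    const int W = 48, H = 24;
    for (int y = 0; y < H; ++y) {      // the whole renderer is one loop:
        for (int x = 0; x < W; ++x) {  // one ray per pixel, camera at origin
            Vec dir{(x - W / 2) / double(W), (y - H / 2) / double(H), -1.0};
            std::putchar(hitSphere({0, 0, 0}, dir, {0, 0, -3}, 1.0) > 0 ? '#' : '.');
        }
        std::putchar('\n');
    }
}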
#74
Vya Domus
R-T-B: Open, brand-neutral standards are the way forward.
DXR is brand neutral; at the end of the day, that's what makes RT work, not RTX.
#75
Totally
notb: TPU runs almost no stories about RTRT development or new games that start implementing it, but - inevitably - we get a story about a game studio that delays RTRT implementation.

Normally I'd say @btarunr is pro-AMD (like many on this forum). However, AMD has already said RTRT hardware is coming soon, so what's going on? Are you guys just anti-RTRT? Why?
I believe people are just keeping their feet on the ground. We don't preorder games on promises of what's to come; why should that notion not apply to hardware?

As for things being newsworthy or not: until devs start getting games with RTX out the door, it doesn't matter who pledges to support it. There has been a large number of pledges for quite some time already, but the number of games that have shipped with RTX can be counted on one hand.