
Half-Life 2 RTX Demo Is Out!

Are you interested in Half-Life 2 with Ray Tracing?

  • Yes! Bring it on!

    Votes: 44 42.3%
  • Yes, worth a try.

    Votes: 26 25.0%
  • No, I like the original look more.

    Votes: 20 19.2%
  • Something else, comment below.

    Votes: 14 13.5%

  • Total voters
    104
Did you not notice any of the remastered assets? This game looks like how I remember Half-Life 2, but looking side by side today it's evident that, lighting aside, this version looks considerably better. Some of the assets are stunning.
Right, but those have nothing to do with the RT performance hit. If you just remastered Half-Life 2 with conventional lighting, you could have essentially the same game at 5-10x the FPS. 90% of the visual quality improvement is coming from raster/asset improvements, not from RT.

RT is a Trojan horse. Nvidia has always wanted to make AI/GPGPU chips (ever since Brook/CUDA) - they're pimping AI-boosted RT because it lets them cover the cost of developing their AI chips by selling them to gamers. But in reality, if you look at it as an objective gamer, RT is a minimal improvement to a game over, say, remastered assets or higher-quality raster.
 
Not to say that the look isn't nice, but I was never a big fan of Half-Life 2. It was a pretty good game, don't get me wrong. I, however, enjoyed Blue Shift and Opposing Force a lot more than the original Half-Life, and even more than Half-Life 2.

No amount of "Look at me! I'm so pretty!" will get me to come back and replay Half-Life 2. I have and will again, go back to replay Blue Shift and Opposing Force.
 
What nonsensical claim is this when literally every subsequent generation has got better at it?

Despite what you and other naysayers continue to claim, RT is not going anywhere, it's only going to get better, and it is going to become the standard graphics rendering technology.
Mhm, we'll see.
People said the same about VR.
And AR.

Multiple times in history, too. It's still nowhere.

Right, but those have nothing to do with the RT performance hit. If you just remastered Half-Life 2 with conventional lighting, you could have essentially the same game at 5-10x the FPS. 90% of the visual quality improvement is coming from raster/asset improvements, not from RT.

RT is a Trojan horse. Nvidia has always wanted to make AI/GPGPU chips (ever since Brook/CUDA) - they're pimping AI-boosted RT because it lets them cover the cost of developing their AI chips by selling them to gamers. But in reality, if you look at it as an objective gamer, RT is a minimal improvement to a game over, say, remastered assets or higher-quality raster.
I'm glad you see it too.

This is why I don't get all the kvetching about RT being slow; rasterisation used to be slow too, and guess what, the hardware did eventually catch up. Everyone seems to have forgotten how truly wretched it was trying to get big-name games to run at playable framerates on graphics hardware between 1995 and 2010; since then we've been truly spoiled by how powerful that hardware has become. RT is that decade-and-a-half cycle repeating: just like back then, people are frustrated at how slowly things seem to move, and just like back then, it is getting better and will keep getting better.

Yes, there is the Moore's Law wall to contend with this time around, but we also have other solutions like frame generation to help us overcome the limitations of physics. You may not like them, but they are solutions, they do work well enough, and they too are getting and will get better. And eventually the hardware will become powerful enough to not need them, and we will discard them.
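To make the frame-generation idea concrete, here's a toy sketch of my own (a hypothetical illustration, nothing like the real DLSS/FSR pipelines, which use motion vectors and learned models rather than a plain blend):

```python
# Toy frame interpolation: synthesise an in-between frame from two rendered
# ones by linear blending. A plain blend like this smears fast motion, which
# is exactly why real frame generation needs motion vectors and AI models.
def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames (2D lists of brightness values) at time t in [0, 1]."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# Two 1x2 'frames': the generated middle frame sits halfway between them.
mid = interpolate([[0.0, 0.2]], [[1.0, 0.4]])
print(mid)
```

Crude as it is, it shows where the "free" frames come from: they are synthesised from already-rendered data, not rendered themselves, which is why latency and artifacts come along for the ride.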
If we were still in the old paradigm of YoY GPU releases that kept offering substantial RT performance increases at the same price point - without fake frames, software trickery, or handicapping quality on the fly to keep some semblance of performance - then I would be completely on board with RT development.

But that paradigm is gone, and it ain't coming back, simply because we can't shrink chips further and the big efficiency wins have been implemented already.

RT is a brute-force approach as well, making it plainly inefficient - an economic shitstorm that is happening as we speak. Lower FPS translates directly into higher prices, because people want reasonable FPS at a reasonable price, and devs want that too: they're not managing it. GPU raw performance and price per shader have stalled completely, so the only way for prices to go is up.

The price of RT is the problem, not RT itself. And its primary marketing push is coming from a company that is single-handedly damaging the same market it says it wants to develop RT for. I don't know about you, but my alarm bells are ringing here: if you really cared about the tech's survival, you'd place far more and far better offerings in the market. But Nvidia doesn't give a shit - they have AI.
 
Goddamn slideshow of 25 FPS :D The 12-year-old Cinematic Mod owns this utterly unoptimised pile of ray-traced crap.
 
Well, it's finally downloaded and installed -- only the demo; when does the full game get activated?

But it seems to be working okay so far. Don't have time to play at the moment, but I'll post some results later, both for the 4090 and the 3070Ti.
 
How is it that every time some new RT tech demo/game comes along, people come in with the same boring arguments: "RT bad, raster FTW, let's go back to 2010"?
I was hoping that since AMD also clearly stated where their focus is, it would die down, but no.

RT, and even more so PT, is demanding. Yes, it requires some trickery to run in realtime, but come on - if somebody had told me 5 years ago that we would have PT running in realtime, I would have laughed.
Go into Blender, download some demo scenes, and run one render with Cycles; see how long a single frame takes. And even that has sped up massively thanks to OptiX and now HIP-RT from AMD, so make sure to use either the CPU alone or CUDA/HIP without RT acceleration.

But sure if not RT then please do tell what other techniques we could explore to get realtime lighting that is physically correct.
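To put a number on why path tracing is so demanding, here's a minimal toy of my own (an illustrative sketch, not anything from Cycles or RTX Remix): a path tracer estimates each pixel as a Monte Carlo integral, so noise falls only as 1/sqrt(N) in the samples per pixel - quadrupling the work merely halves the noise, which is why denoisers get leaned on so heavily.

```python
import random
import statistics

def estimate_pixel(spp, seed=0):
    """Toy pixel estimator: Monte Carlo average of f(x) = x^2 over [0, 1]
    (true value 1/3), standing in for the rendering equation's light integral."""
    rng = random.Random(seed)
    return sum(rng.random() ** 2 for _ in range(spp)) / spp

def noise(spp, trials=200):
    """Standard deviation of the estimate across many independent 'pixels'."""
    return statistics.pstdev(estimate_pixel(spp, seed=t) for t in range(trials))

n16, n256 = noise(16), noise(256)
print(f"noise at 16 spp: {n16:.4f}, at 256 spp: {n256:.4f}")
# 16x the samples only cuts the noise by roughly 4x (1/sqrt(16)).
```

That square-root law is the whole economics of realtime PT in one line: honest convergence is brutally expensive, so the realtime versions spend few samples and clean up with denoising and upscaling.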
 
But sure if not RT then please do tell what other techniques we could explore to get realtime lighting that is physically correct.
Why is this so crucial to gaming?
 
Right, but those have nothing to do with the RT performance hit. If you just remastered Half-Life 2 with conventional lighting, you could have essentially the same game at 5-10x the FPS. 90% of the visual quality improvement is coming from raster/asset improvements, not from RT.

RT is a Trojan horse. Nvidia has always wanted to make AI/GPGPU chips (ever since Brook/CUDA) - they're pimping AI-boosted RT because it lets them cover the cost of developing their AI chips by selling them to gamers. But in reality, if you look at it as an objective gamer, RT is a minimal improvement to a game over, say, remastered assets or higher-quality raster.
Glad to see someone else understands this. We’re simply squandering the limited transistor budget that each new node provides.

I love it when people act like they don't understand, comparing apples to oranges. From 1995 to 2010, lithography density increased by hundreds of times. But from the Turing era (2080 Ti) to today, we've only seen a 4-5x improvement. Fast forward half a decade, and it's unlikely we'll even double the performance of what 3nm tech delivers today. How can anyone realistically expect real-time ray tracing (RT) to become mainstream and popular under these conditions? And is it really reasonable for a $2,000-$3,000 product to consume 600 watts just to run at 30-40 FPS in a game with a low polygon count? The real problem is the tendency to label critical thinking as trolling. :p
 
Why is this so crucial to gaming?

Why are graphics even crucial to gaming? Like I said, it's the "let's go back to 2010" argument.
I mean, they might not matter to you, and that's fine, but since this is a tech enthusiast forum, and this is a thread about the new PT mod of HL2, it might not be the place to announce it.

Glad to see someone else understands this. We’re simply squandering the limited transistor budget that each new node provides.

I love it when people act like they don't understand, comparing apples to oranges. From 1995 to 2010, lithography density increased by hundreds of times. But from the Turing era (2080 Ti) to today, we've only seen a 4-5x improvement. Fast forward half a decade, and it's unlikely we'll even double the performance of what 3nm tech delivers today. How can anyone realistically expect real-time ray tracing (RT) to become mainstream and popular under these conditions? And is it really reasonable for a $2,000-$3,000 product to consume 600 watts just to run at 30-40 FPS in a game with a low polygon count? The real problem is the tendency to label critical thinking as trolling. :p
What would you rather spend it on? The transistors, I mean.
 
The real problem is the tendency to label critical thinking as trolling. :p
You pulled together some well-reasoned thoughts here and presented them rationally, irrespective of whether I agree with them or not. This is not to be confused with your trolling earlier, despite this attempt to claim it wasn't. Notice how others aren't being accused of the same despite having negative opinions - weird. Perhaps look into the differences.

Perhaps we can try to steer the topic back to the demo itself, as we're veering off into graphics technology in general more than discussing the actual game the thread is about. It's an interesting discussion too - it kind of deserves its own thread. (till they get shut down :rolleyes: )
 
Why are graphics even crucial to gaming? Like I said, it's the "let's go back to 2010" argument.
I mean, they might not matter to you, and that's fine, but since this is a tech enthusiast forum, and this is a thread about the new PT mod of HL2, it might not be the place to announce it.
No, it's an honest question, especially for those tech-minded and interested in gaming. It's not about going back to 2010. We all like prettier pictures, I think.

What many gamers do question - and what you also see in the wild in which games reach mass adoption (and keep being played) - is how much value they should attribute to prettier pictures: what is it worth to them, and how important is it relative to good gameplay? Ideally, games have it all. But that's not the practice - developers need to choose what to spend time on, and gamers need to choose what to spend money on. A lot of games with a hyper-focus on graphics tend to be limited in scope, mechanics, length, or overall quality of gameplay.

So again, what makes you value 'perfect accuracy' so much more highly than the 'almost perfect' you already had in raster at 2-3x the performance? The fact that it's new and promises a lot? Or has RT really transformed your gaming experiences?
 
No, it's an honest question, especially for those tech-minded and interested in gaming. It's not about going back to 2010. We all like prettier pictures, I think.

What many gamers do question - and what you also see in the wild in which games reach mass adoption (and keep being played) - is how much value they should attribute to prettier pictures: what is it worth to them, and how important is it relative to good gameplay? Ideally, games have it all. But that's not the practice - developers need to choose what to spend time on, and gamers need to choose what to spend money on. A lot of games with a hyper-focus on graphics tend to be limited in scope, mechanics, length, or overall quality of gameplay.

So again, what makes you value 'perfect accuracy' so much more highly than the 'almost perfect' you already had in raster at 2-3x the performance? The fact that it's new and promises a lot?

If we are talking about time and how it's spent, then RT can only help developers, not force them to sacrifice other elements to add RT. You can simply place a light and that's it: you don't need to wait for it to bake, and you don't need to carefully place fake lights in just the right spots to make it look good because raster cannot emulate this or that type of light.
Right now it's simply being held back because consoles have poor RT performance; hopefully that will change with the next-gen ones.

As for gameplay, well, gameplay is gameplay. It doesn't work like "game doesn't have RT = it has good gameplay". Games can be either good or bad; RT has nothing to do with it.
You can have all the time in the world and still make a bad game (Skull and Bones, anyone?).

And raster has its limits, which we have reached. If we want to go further, there must be some implementation of RT involved, whether it's PT, RTGI with ray-traced probes, Lumen with its SDFs, or CryEngine with its SVOGI that traces voxels. All are approximations, better or worse - even PT uses ReSTIR - and how accurate you want it to be depends on the developer.
Me personally, I like good lighting in games. I hate when shadows pop in and degrade in quality 3 m from my character, I hate when a game looks flat, and I like when I can interact with light and see it behave correctly.
And if I can choose between playing a game in pure raster at native 4K with 200 FPS or with RT/PT at 4K with DLSS and frame gen, I choose the second option.

The latest example is AC Shadows. As much as I probably won't play that game, the recent RTGI on/off comparison on consoles really shows how flat and incorrect raster lighting is and how much better RTGI looks.
 
Just tried it; at the lowest settings it looks horrid, and the performance does too. I'm sure with a proper GPU it will look great, but I'd rather play the original 2004 version on period-correct hardware (plus a CRT). Having compared the modern Source Engine HL2 with the original, I still prefer the original.

Here's how it looks. A static image is OK, but when it moves... ugh.

[Attached screenshot: rtx.jpg]
 
If we are talking about time and how it's spent, then RT can only help developers, not force them to sacrifice other elements to add RT. You can simply place a light and that's it: you don't need to wait for it to bake, and you don't need to carefully place fake lights in just the right spots to make it look good because raster cannot emulate this or that type of light.
Right now it's simply being held back because consoles have poor RT performance; hopefully that will change with the next-gen ones.

As for gameplay, well, gameplay is gameplay. It doesn't work like "game doesn't have RT = it has good gameplay". Games can be either good or bad; RT has nothing to do with it.
You can have all the time in the world and still make a bad game (Skull and Bones, anyone?).

And raster has its limits, which we have reached. If we want to go further, there must be some implementation of RT involved, whether it's PT, RTGI with ray-traced probes, Lumen with its SDFs, or CryEngine with its SVOGI that traces voxels. All are approximations, better or worse - even PT uses ReSTIR - and how accurate you want it to be depends on the developer.
Me personally, I like good lighting in games. I hate when shadows pop in and degrade in quality 3 m from my character, I hate when a game looks flat, and I like when I can interact with light and see it behave correctly.
And if I can choose between playing a game in pure raster at native 4K with 200 FPS or with RT/PT at 4K with DLSS and frame gen, I choose the second option.

The latest example is AC Shadows. As much as I probably won't play that game, the recent RTGI on/off comparison on consoles really shows how flat and incorrect raster lighting is and how much better RTGI looks.
This, so much this. Raster is a hack to emulate realism that has had three decades to get good at that emulation; the practices, tooling, and mindset of the entire graphics industry are focused on those hacks. RT isn't just fighting a massive performance disadvantage, it's fighting inertia, which is a far more formidable force.

This is why I get so annoyed when people complain that ray-traced lighting is bad. It's not bad, it's realistic - but years of exposure to raster have conditioned you to think that raster is right, solely because it's what you know. Stop using raster as the benchmark; use your EYES! Literally.
 
5-Minute Playtime Review!

Smearing and ghosting pile of hot cow pat.
 
So far, it runs like crap on the 3070Ti. If I enable DLSS, it's blurry and lower quality than the original. If I disable DLSS, it's an unplayable slide show. Guess it's going to be 4090 only.

Edit: fired up the original just to compare, and I ended up playing 15 minutes of Ravenholm and totally enjoying it. Uninstalled RTX on this (3070Ti) build; it just doesn't seem worth it on an older 8GB card.
 
IMO the dev tools here are actually cooler than the remastered visuals.

HL2 looks like it was released in 2014 instead of 2004 now, but runs like it was released in 2024.

If there were a method to bake the lighting into the textures and really reduce the RT overhead, this would be an amazing way to remaster a lot of older games where the only things really wrong are low-res textures and archaic lighting models.

It looks like there is some control over ray bounces and such, so if the improved visuals don't drop off too much while the performance is greatly improved, there is definitely something worth celebrating here.
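The baking idea above can be sketched in a few lines (a hypothetical toy of my own, not how RTX Remix or any real engine does it): pay for the expensive lighting once per texel offline, and runtime shading collapses into a texture lookup.

```python
import math

def expensive_light(u, v):
    # Stand-in for a costly lighting computation (imagine many ray bounces).
    return 0.5 + 0.5 * math.sin(6.28 * u) * math.cos(6.28 * v)

def bake_lightmap(size=64):
    # Offline pass: evaluate the lighting once at every texel.
    return [[expensive_light(x / size, y / size) for x in range(size)]
            for y in range(size)]

def shade(lightmap, u, v):
    # Runtime pass: nearest-texel lookup -- no lighting math at all.
    size = len(lightmap)
    return lightmap[min(int(v * size), size - 1)][min(int(u * size), size - 1)]

lm = bake_lightmap()
# At a texel centre the cheap lookup reproduces the expensive evaluation.
```

The trade-off is the classic one: the baked result is frozen, so dynamic lights and moving geometry don't affect it, which is exactly the limitation RT removes.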
 
This 'solves' problems. You just don't see everything. And then you can white knight more about cards 'having enough VRAM with 8GB in 2025' like you seem to love to do lately.

Enjoy your illusions, ignorance is bliss, they say.
What's with the personal attack? Good grief man..
Mhm, we'll see.
Oh ok, I see, you're not just being an ahole to me, you're doing it to everyone. Frustrated are we?

NEWS FLASH: 8GB of VRAM is still enough for gaming in 2025! Also, raytracing is here to stay. Your nay-saying (and being a jerk about it) matters not.
Why is this so crucial to gaming?
If you haven't figured out why raytracing is here to stay after 7 YEARS, you have the problem. See to that.
 
When you want to do that come over to the Win11 discussion thread and I'll walk you through a customized/debloated version, perfectly legit of course. It's really easy too.
I will in 6 months. I might start with LTSC, then debloat further - no WUD to screw up the OS either, lol
 
This, so much this. Raster is a hack to emulate realism that has had three decades to get good at that emulation; the practices, tooling, and mindset of the entire graphics industry are focused on those hacks. RT isn't just fighting a massive performance disadvantage, it's fighting inertia, which is a far more formidable force.

This is why I get so annoyed when people complain that ray-traced lighting is bad. It's not bad, it's realistic - but years of exposure to raster have conditioned you to think that raster is right, solely because it's what you know. Stop using raster as the benchmark; use your EYES! Literally.
See, I don't disagree on that fact, in isolation - but much like how engineers don't design in isolation, RT doesn't live in isolation. It deals with a world of problems and people and opinions.

But there is more to this than just the graphical change - it's not really about being conditioned to anything; I think that's in your head and yours alone. Graphics have been changing all the time, quite radically too. And I remember that every time, 'we' (gamers) judged what we got against the performance hit it cost. Many things we simply turned off - and still turn off today, if we can.

Also, I think you're easily glossing over the fact that an overwhelming majority of games are not made to be graphically realistic. They pursue various (artistic) styles.

What I've put forward as the counter-argument to the use of RT is an economic one. People keep saying 'time will fix this'; so far, what I'm seeing isn't encouraging. My anti-RT stance has been economic in nature the whole time, ever since the first announcement. Brute-forcing is expensive - it's really that simple. I do see a lot of value in @remekra's response, and I'm not blind to the progress on the various approaches to RT implementation. For example, the CryEngine SVOGI demo was impressive - it also had some small inaccuracies, but wasn't that quite the sweet spot between performance and graphics wins? Why are we pursuing a PT that no card can feasibly run? The answer is economic. Not for your economy, mind, but for that of a three-trillion-dollar company.

I'll readily admit RT can indeed look better. But I've also seen raster-based games deploy an equally good scene. Both are rare occurrences, and what sticks here is that it's the scene you're looking at that matters most, not how it is lit. So far, I've not been convinced RT is a unified method for improving lighting in games. It can indeed improve it, at a massive performance hit. But most RT implementations never even go there, and are a weaksauce alternative that shows inaccuracies just the same as raster does - the only, literally only, objective thing non-PT RT has going for it is the off-screen reflection. That's truly something raster graphics can't do. But is it something that will make or break a game's graphics? I beg to differ. This is all concluded using my own eyes, looking at many hours of RT content. I see the differences. I value them rather low.

Oh ok, I see, you're not just being an ahole to me, you're doing it to everyone. Frustrated are we?

NEWS FLASH, 8GB of VRAM is still enough for gaming in 2025! Also, raytracing is here to stay. You doing your nay-saying(and being a jerk about it) matters not.
Just pointing out how your 8GB is 'managing' to remain enough. No need for caps. Surely you're not frustrated. I do hope you see the immense irony of it: missing quality in textures and assets is no biggie, but those rays shall be cast or it's 'unrealistic' :roll:

Honestly, my stance is captured in my signature. It says it all: the utter ridiculousness of the money we spend on a few ray-cast pixels when graphics are, in fact, 99% done. The silly chase for realism in games built to escape from reality and to do things not really possible. Yes, graphics will barely improve from now on... live with it. RT isn't going to change that. The implementations that survive will be the ones that still use tricks to simulate reality - and still don't quite get there.

So that's why I think... I'll see it when it's actually worth a damn. I'm not in a rush... let them figure out their consistency and performance issues first.

What's with the personal attack? Good grief man..
Sorry, you get that when immovable object meets impenetrable wall.

If you haven't figured out why raytracing is here to stay after 7 YEARS, you have the problem. See to that.
It was an objective question, and I got an objective response to it. Tell me again who's frustrated here. I don't have any kind of FOMO or problems because of RT... all I do is save money. I'm not gaming a second less for it, nor is gaming any less fun without it.

Better graphics. That's the whole point of releasing new GPUs.
Except the new GPUs are the same thing you already had. And I think the sell for them wasn't better graphics, but more frames at the same graphics. An interesting nuance, I think.
 
Except the new GPUs are the same thing you already had. And I think the sell for them wasn't better graphics, but more frames at the same graphics. An interesting nuance, I think.
And that's why people (you included) are complaining about them, and that's why you are not buying one. But new GPUs exist and are bought so people can see better graphics, period, point blank. Bringing gameplay into the discussion is completely bonkers. It's not a GPU's job to make a game better. Never was, never will be.

Remastered GTA V is a good example of why RT is a leap in graphics. It would take too long to explain the whys and whats, though, and it's off-topic since this is about HL2, not RT in general.
 
And that's why people (you included) are complaining about them, and that's why you are not buying one. But new GPUs exist and are bought so people can see better graphics, period, point blank. Bringing gameplay into the discussion is completely bonkers. It's not a GPU's job to make a game better. Never was, never will be.

Remastered GTA V is a good example of why RT is a leap in graphics. It would take too long to explain the whys and whats, though, and it's off-topic since this is about HL2, not RT in general.
As always, it's all about the content, and that includes gameplay. It's no surprise RT gets added to already-built games - that way you can combine a well-built game with a new graphical sauce instead of having to juggle your resources and budget between both.

So gameplay is definitely related to RT, as much as gaming is. And I can certainly point at games where the investment in graphics has taken precedence over the investment in a complete game - or where there is simply less game and nicer graphics altogether.

But you're not wrong; I think we've come full circle on this subject, and steering it back to the HL2 demo and that alone would be good.
 
And that's why people (you included) are complaining about them, and that's why you are not buying one. But new GPUs exist and are bought so people can see better graphics, period, point blank. Bringing gameplay into the discussion is completely bonkers. It's not a GPU's job to make a game better. Never was, never will be.

Remastered GTA V is a good example of why RT is a leap in graphics. It would take too long to explain the whys and whats, though, and it's off-topic since this is about HL2, not RT in general.

We should see a lot more movement if the next consoles are even better than RDNA4 at it, on top of having a competent upscaler.

I do get the frustration of some, though. It still has a ton of issues three-quarters of a decade in, and after an abysmal generation when it comes to meaningful improvements where they actually matter - the $400-800 range - people are probably pretty tired of the RT this and RT that. On top of that, even with cutting-edge hardware we need frame generation that increases latency and introduces artifacts, plus poor implementations of DLSS that increase ghosting and disocclusion; and while RR is getting a lot better, even it still has a ton of issues.

Is RT the future? 100%. But progress has been slow, and even the best RT implementations don't wow in the way environmental complexity, physically based materials, or even bump mapping over 20 years ago once did.

When someone asks me to name 5 modern games where RT is clearly transformative, I have to think pretty hard, and even for most of those it depends on the scene. You'd think after 7 years that would be easier.

Also, some developers just implement it terribly - so as easy as it is compared to baked lighting, it still takes a lot of work to look good.

Hopefully, like I said above, once consoles are competent at it, things will move faster. After all, even with the RDNA2-based hardware, some developers have done a good job striking a balance with it.
 