
Half-Life 2 RTX Demo Is Out!

Are you interested in Half-Life 2 with Ray Tracing?

  • Yes! Bring it on!

    Votes: 44 42.3%
  • Yes, worth a try.

    Votes: 26 25.0%
  • No, I like the original look more.

    Votes: 20 19.2%
  • Something else, comment below.

    Votes: 14 13.5%

  • Total voters
    104
It's a little bit like...
1. Let's improve our image quality while crushing performance (RT).
2. Let's improve our now horrible performance while crushing image quality (upscaling).
Genius! :rolleyes:


I see your point, but the only problem is that a midrange GPU is still too slow to play with RT even in the first RT games.
The first RT games like BF1 run perfectly fine on midrange cards.

And claiming that upscaling crushes image quality is just... well, whatever, we've been through that.

Then they come back a few years later and they try it again, and still draw the same conclusions. It doesn't matter whether it's technically reasonable - that is also what I mean to say when I say RT doesn't live in isolation. I'm not saying you're wrong here though. Obviously a heavier workload runs slower; that's the whole core of the issue even - the workload is too heavy for the GPUs it's designed to run on.
To add to that, I think people haven't realised how expensive RT is. It's not "I enable RT and I get 10 fps heavy", it's "I enable RT and I get 100 spf heavy". Spf stands for seconds per frame. It's a marvel we can run what we do run at the performance we are running it, even though the heaviest of RT games are just a drop in the ocean of what the tech can do.
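To put rough numbers on that (every figure below is an assumption, just to show the order of magnitude, not a measurement):

# Back-of-the-envelope: why "proper" path tracing lands in seconds-per-frame
# territory. All numbers here are illustrative assumptions only.
width, height = 3840, 2160            # 4K frame
samples_per_pixel = 1024              # offline-quality sample count (assumed)
bounces = 4                           # rays per sample path (assumed)
rays_per_frame = width * height * samples_per_pixel * bounces

rays_per_second = 20e9                # assumed throughput, i.e. ~20 gigarays/s
seconds_per_frame = rays_per_frame / rays_per_second
print(f"{rays_per_frame / 1e9:.0f} billion rays per frame")       # 34
print(f"~{seconds_per_frame:.1f} seconds per frame at that quality")  # ~1.7
# Real-time RT only works because games trace a handful of samples per pixel
# and lean on denoising/upscaling, not the full offline workload.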

CP77 with PT will be the holy grail for midrange GPUs for the next 2 generations at least, so yeah, we are moving very slowly because the tech is just very heavy (and GPUs, especially at the midrange, have stagnated).
 
The only arguably exclusive stuff in HL2 RTX, considering you can run it just fine on RDNA 4 hardware for example, are things like DLSS, Ray Reconstruction etc. These are proprietary because they run on special hardware that doesn't exist in the competition.
False. They do not require special hardware. The problem is that Nvidia has implemented them in a way that only works with Nvidia GPUs, using CUDA instead of open standards.

These techniques are just matrix operations.

AMD GPUs can perform all the same calculations as Nvidia GPUs can; this is not a hardware issue. It could be a performance issue, but we will never know.
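To illustrate the "just matrix operations" point, here's a toy sketch of the kind of math a DLSS-style network boils down to. This is not Nvidia's actual network or weights, just the vendor-agnostic building block:

import numpy as np

def conv3x3(image, kernel):
    """Naive 3x3 convolution: nothing but multiply-accumulate over a window."""
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for y in range(h - 2):
        for x in range(w - 2):
            out[y, x] = np.sum(image[y:y+3, x:x+3] * kernel)
    return out

low_res = np.random.rand(270, 480)     # pretend low-res luminance buffer
weights = np.random.rand(3, 3)         # pretend learned filter weights
features = conv3x3(low_res, weights)   # the same math runs on any GPU's ALUs
print(features.shape)                  # (268, 478)

Whether it runs fast enough on a given GPU is the separate performance question.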
 
What the hell are you peeps talking about? Nvidia hasn't increased RT performance? This is a purely RT workload, explain it please... It's only the 5070 and the 5080 that are kinda meh compared to last gen; the 5070 Ti and 5090 are a decent jump.
[attached: RT performance benchmark chart]


EDIT: Posted a new one from the latest review to include the 5xxx series and the 9070.

[attached: updated RT performance chart (untitled-43.png)]
Yeah but now add the power back in... 3090 and 4090 used the same amount of power - 5090 is 600W. So performance / watt increase is minimal.
 
Well I don't know, one could'a thunk that in the past seven years we'd have made some actual progress, but people are still trying to get to grips with what is supposed to be a simpler workflow where you can just plonk your lighting into the scenes. Anything in motion is still a blurry mess despite four major version updates of DLSS, anywhere it's used without extra-special Nvidia TLC.

And people clearly don't like that a whole lot. They're still awaiting the promises made to be fulfilled. It is quite similar to what I've been seeing and saying about it.
I mean yea, exactly, who would'a thunk? Nothing's changed. I'm not excusing it though.

What you see here is people identifying stagnation, versus other people saying it's really happening 'but not quite there'.
The gist is: its not there yet, seven years in.
Forever in beta.
Which perfectly summarizes the RTX demos for me... come back in a generation or two's time and play it to see what the standard was like. I would honestly prefer if NVIDIA went back to making actual demos to demonstrate stuff. The RTX stuff has just been very... stale.

Very true. That's why I also think remekra's post was valuable. Those are tangible arguments to identify progress. But they're all attempts and none of them are really at a point where everyone says 'This is it!' And we've been looking at this for quite a while now.
I haven't seen their post, but I still stand by the belief that Lumen is progress in the right direction done in the right way. I think a raster/RT mix works best, even if the RT is very minimal, such as in the Resident Evil remakes.
 
Maybe try verifying game files? or even deleting a couple you think could be config related then verifying again before a complete reinstall.
I was able to fix it with Steam's 'verify integrity of game files' function. Afterward, I jacked the rendering up to Ultra and was able to play at around 30 fps with 78% VRAM usage on my 5950X + 4060 LP (8 GB VRAM) + 64 GB RAM. It did fairly well. I was expecting worse from all the negative RT comments here.
 
Now we need Doom 3 RTX and FarCry RTX to fully complete the “it’s 2004 again and my PC can’t run it” bingo card.

On the topic - zero interest to play this even when I get an RTX card soon-ish (it’s really time to put something newer into my personal rig), I have a huge backlog and that interests me more than playing HL2 for a thousandth time with a prettier layer of paint on it. HL2 is just not a very replayable game, I find, less so than its predecessor. It’s very much a rollercoaster ride and once you’ve experienced all its tricks there’s kind of nothing left and the shooting is quintessential early Source, so quite meh and cannot really support the game by itself. FEAR it is not. Hell, Dark Messiah it is not, if we want to stay in the Source swimming pool.
 
Now we need Doom 3 RTX and FarCry RTX to fully complete the “it’s 2004 again and my PC can’t run it” bingo card.

On the topic - zero interest to play this even when I get an RTX card soon-ish (it’s really time to put something newer into my personal rig), I have a huge backlog and that interests me more than playing HL2 for a thousandth time with a prettier layer of paint on it. HL2 is just not a very replayable game, I find, less so than its predecessor. It’s very much a rollercoaster ride and once you’ve experienced all its tricks there’s kind of nothing left and the shooting is quintessential early Source, so quite meh and cannot really support the game by itself. FEAR it is not. Hell, Dark Messiah it is not, if we want to stay in the Source swimming pool.
I find it replayable, probably because the theme suits my own rebellious nature nicely. :D

But I get what you mean. It's a fun game thanks to the physics, but kind of repetitive after a while. The first one was a more well-rounded experience. And if you add Black Mesa on top... Holy crap! :rolleyes:

Or maybe we're being a bit too nostalgic?
 
Now we need Doom 3 RTX and FarCry RTX to fully complete the “it’s 2004 again and my PC can’t run it” bingo card.

On the topic - zero interest to play this even when I get an RTX card soon-ish (it’s really time to put something newer into my personal rig), I have a huge backlog and that interests me more than playing HL2 for a thousandth time with a prettier layer of paint on it. HL2 is just not a very replayable game, I find, less so than its predecessor. It’s very much a rollercoaster ride and once you’ve experienced all its tricks there’s kind of nothing left and the shooting is quintessential early Source, so quite meh and cannot really support the game by itself. FEAR it is not. Hell, Dark Messiah it is not, if we want to stay in the Source swimming pool.
I'll take both of those now! lol. Also, about playing HL2 one more time... I'm in the same boat. This was cool to look at and is really great for newcomers, but for me, I want a new Half-Life, and right now that looks like it will be HLX.

As good as this HL2 RTX is, it's really nothing compared to Half-Life: Alyx. I'd say for any Half-Life fan that has not played Alyx yet, it's 100% worth grabbing a Quest 3/3S. It's mind-blowing how good of a game it is and how it really shows you how good VR can be.
 
But I get what you mean. It's a fun game thanks to the physics, but kind of repetitive after a while.
It’s not really repetitive as such; every chapter DOES have its own gimmick and the pacing is excellent. It’s just that the minute-to-minute gameplay itself (the combat, mostly) that connects the gimmicks is nothing to really write home about.

Or maybe we're being a bit too nostalgic?
I’d say finding flaws with old classics is kind of the opposite of nostalgia. HL2 has this cult image nowadays of a “perfect” game, which, I think, is kind of… true and not? As I said, it’s an outstanding ride for a playthrough or two, but it’s not Deus Ex/Bloodlines/Dark Messiah level of “reinstall and explore every year” or the infinite depth and replay value of something like HoMM3. It’s just a solid game.

As good as this HL2 RTX is, it's really nothing compared to Half-Life: Alyx. I'd say for any Half-Life fan that has not played Alyx yet, it's 100% worth grabbing a Quest 3/3S. It's mind-blowing how good of a game it is and how it really shows you how good VR can be.
Unfortunately, my bum eyes just don’t work with any sort of VR, so that avenue will remain unexplored by me.
 
Unfortunately, my bum eyes just don’t work with any sort of VR, so that avenue will remain unexplored by me.
You can try the No VR mod.
 
@Mindweaver
Sure, but, from what I gather, 90% of the experience that MAKES Alyx IS VR. So this would seem a bit pointless. Might as well watch the story on YT. Still, I probably will go down that route at some point, just to see the game, even in a diminished state, but I am in no hurry. I doubt HLX will come out before, like, 2030, knowing Valve.
 
Yeah but now add the power back in... 3090 and 4090 used the same amount of power - 5090 is 600W. So performance / watt increase is minimal.
The 3090 Ti was using 450 W, though, and the 5090 still nails it in performance per watt.
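Rough math, using the wattages mentioned above and an assumed performance ratio (swap in whatever the reviews actually show):

# Perf/watt comparison; perf_ratio is an assumption, not a measured number.
power_3090ti = 450        # W, as stated above
power_5090   = 600        # W, the figure used earlier in the thread
perf_ratio   = 2.0        # assumed: 5090 ~2x the 3090 Ti in RT workloads

perf_per_watt_gain = perf_ratio / (power_5090 / power_3090ti)
print(f"perf/watt change: {perf_per_watt_gain:.2f}x")    # 1.50x with these inputs
# Whether that counts as "minimal" depends entirely on the perf_ratio you plug in.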
 
Performance is better than in other games with path tracing, and the demo looks way, way better than the shitty Sony remastered games and Crapcom's RE2/3/4 remakes that they are selling as new games.
 
@Mindweaver
Sure, but, from what I gather, 90% of the experience that MAKES Alyx IS VR. So this would seem a bit pointless. Might as well watch the story on YT. Still, I probably will go down that route at some point, just to see the game, even in a diminished state, but I am in no hurry. I doubt HLX will come out before, like, 2030, knowing Valve.
True, and this was the hardest suggestion that I've ever made... lol. But if there is no other way... It was really hard for me to push Post reply... haha
 
Performance is better than in other games with path tracing, and the demo looks way, way better than the shitty Sony remastered games and Crapcom's RE2/3/4 remakes that they are selling as new games.
Did you try the other level yet? (Sorry, I forgot what it was called.) I noticed that if you look at glass with the embedded wire lattice and then walk backwards, the rendering seems to go haywire; from a distance the window renders with a very weird animation effect, like rapidly moving water. Have you come across that? Perhaps you can capture it in a video - it was a pretty interesting side effect.
 
I find it replayable, probably because the theme suits my own rebellious nature nicely. :D

But I get what you mean. It's a fun game thanks to the physics, but kind of repetitive after a while. The first one was a more well-rounded experience. And if you add Black Mesa on top... Holy crap! :rolleyes:

Or maybe we're being a bit too nostalgic?
No, the gameplay and particularly the gunplay in the original Half-Life were definitely superior to those of its sequel. The original's constraints in terms of hardware resources meant the maps were small, which meant a lot of very confined spaces, which created some very memorable combat sequences - particularly with the Marine AI that was optimised to deal with those confines. Of course, then you had Xen which was... ugh.

The sequel has much larger maps that were often much more open and exploratory, but in contrast the Combine soldiers never felt particularly smart nor threatening. Arguably HL2 was best as a shooter when it imposed similar constraints to the original - Nova Prospekt especially remains one of my favourite levels - but it's also fair to say that HL2 is a lot more of a game, with actual characterisation and world-building and plot, than the original which was mostly just a well-presented shooter. Plus the Gunship and Strider battles are still amazing all these years on.

For anyone who hasn't checked out the Minerva: Metastasis mod for HL2 yet, I'd highly recommend they do so. It's basically no plot, all shoot, in some very tight spaces.
 
Performance is better than in other games with path tracing, and the demo looks way, way better than the shitty Sony remastered games and Crapcom's RE2/3/4 remakes that they are selling as new games.
That light should be destructible.
 
There is no such thing as a purely RT workload.
Their point was to show the jump in RT performance exclusively, a point they succeeded in making. I was going to argue the very same point, but didn't want to jump down that rabbit hole.

Just that it's not an improvement of the RT cores per se.
But it is. It's not a huge or dramatic jump in IPC, but the improvements are not shabby either.

Ravenholm level, 3440 x 1440, low quality render, TAA-U Performance setting: 30-50 FPS depending on how heavy the scene is. It's not unplayable, and I'd even risk saying it's enjoyable, if not for the horrible TAA making it look like a bowl of mashed potato. :laugh:
Ok, turn off AA, turn down some of the RT FX, and what do you get? If anything, you can drop to 2560x1080. That res will still look good on that screen and should give a reasonable boost to performance for this version of the beta.
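For reference, the pixel math behind dropping the resolution (assuming cost scales roughly with pixel count, which it rarely does exactly):

native = 3440 * 1440      # 4,953,600 pixels
lower  = 2560 * 1080      # 2,764,800 pixels
print(f"{lower / native:.2f}")   # ~0.56, i.e. roughly 44% fewer pixels to shade and trace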
 
Arguably HL2 was best as a shooter when it imposed similar constraints to the original - Nova Prospekt especially remains one of my favourite levels - but it's also fair to say that HL2 is a lot more of a game, with actual characterisation and world-building and plot, than the original which was mostly just a well-presented shooter.
It’s funny how, when it released, HL was a paradigm shift and was hailed as bringing a story-driven approach to shooters (by people who didn’t play Marathon, I suppose), but in retrospect the story is just Doom. Like, almost one to one. It was the presentation and the unbroken first-person perspective that really stood out, but it IS just Doom. HL2, as you said, is much more of a cohesive, whole experience. It’s just honestly very different in most regards, to the point where, funnily enough, despite superficial similarities it’s one of the most dissimilar sequels compared to its predecessor among really high-profile mainstream games. I actually mentioned FEAR originally for that very reason - it’s more of a Half-Life sequel in terms of structure and moment-to-moment gameplay than the actual HL2.
 
But wasn't that always the case pre-RT as well? You launched a game, tried ultra settings, and if your GPU was too slow you just turned the settings down.
Yes, but now people are envisioning a future where RT is integrated in the core of the game without being able to turn it off.

We've already seen the mess that was made of forced upscaling/TAA. Now of course, games will gradually just keep trying stuff, and one failed game isn't an issue. But then I look at the box of UE5 games that have so little to offer for their supposed new gen of gaming, and I wonder WTF is going on. There's a weird corporate push underneath this whole movement, and it doesn't lead to better content or devs being left the freedom to make it.

I think companies are very often too busy with the amount of cool stickers they can put on a box (supports XYZ RTX DLSS etc.) than with whether the implementation of it is really worth anything. Other companies use their game as a playground for new RT technology, but forget to finish the game alongside it (Cyberpunk). It's like... do you guys WANT this to fail, or wtf are you doing here?!

It's so similar to VR. There isn't a single killer app that I MUST have it for. Honestly. My body is ready for that killer app. It ain't there.

You're missing my point.

5080 vs 4080 Super. Similar number of cores and frequency = similar performance. Even in RT. No advancement.

This is the first time stagnation shows, because this is the first time we're getting the same number of cores as last gen.
I think what you're looking for here is something akin to 'IPC'.

The performance per shader just isn't improving on the RT front, by much anyway. The majority of it is coming from higher clocking, too, if you see it happen. It's becoming clear that you simply need X shaders for Y performance. Shrinks and higher clocking can only fix so much, but it's the only fix for RT as far as we've seen. People have mentioned the balance within a shader (-unit); what's dedicated to what. You can obviously slowly tweak that to be more RT-focused... but then you still haven't won jack shit in terms of die space vs performance. You're just offering lower raster perf to give more RT perf (and most likely losing net FPS).
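A crude way to put that in numbers - the core counts and clocks below are made up, just to show the shape of the argument:

# If per-shader throughput ("IPC") barely moves, performance tracks cores x clock.
def perf(shaders, clock_ghz, ipc=1.0):
    return shaders * clock_ghz * ipc

last_gen = perf(shaders=10240, clock_ghz=2.5)     # assumed last-gen part
new_gen  = perf(shaders=10752, clock_ghz=2.6)     # assumed new part, same IPC
print(f"uplift from cores + clock alone: {new_gen / last_gen:.2f}x")   # ~1.09x
# Without an IPC gain, a similarly sized die at similar clocks lands at roughly
# similar performance - which is the stagnation being described above.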

That's why Nvidia is pushing the fake frame approach on an otherwise architectural standstill. And why we had a marketing line comparing x70 to x90 with MFG on.

This paradigm will change if AMD or Nvidia can bring an architecture to the market that actually does more work with fewer cores. Until then? RT is DOA and simply cannot survive the demise of Moore's Law. (This is exactly the prediction I made when Huang yelled 10 gigarays on stage 7+ years ago.) As much as you can't deny RT being a heavy workload that 'just is'... that automagically means it's going nowhere.

Unless, of course, someone has a wild future vision about ultra-cheap 1 nm 750 mm² chips being loaded into every consumer PC. The only other escape route is very focused, very streamlined, or toned-down RT that approaches reality but isn't quite it. We're seeing that development too.
 
If we are talking about time and spending it, then RT can only help developers, not sacrifice other elements to add RT, simply because you can place a light and that's it: you don't need to wait for it to bake, and you don't need to carefully place fake lights in the correct spots so that it looks good because raster cannot emulate this or that type of light.
Right now it's simply being held back because consoles have poor RT performance; that will hopefully change with the next-gen ones.
It's not quite like that. RT can save time in some areas. With ray tracing, you plop a light down, and it naturally handles reflections, soft shadows, and global illumination without needing to fake it. No baking lightmaps for hours, no fiddling with dozens of fake lights to mimic bounce or ambient effects—like in old-school rasterization where you’d add fill lights or reflection probes to cheat what RT does natively. For devs focused on realism, that’s a win. Ok.
A single RT light can theoretically replace a bunch of hacks, and you don’t have to wait for a bake to see results. That’s time saved, especially in dynamic scenes where baked lighting falls apart.
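A tiny, self-contained sketch of what "plop a light down" buys you: one shadow ray gives correct occlusion with no baked lightmaps or hand-placed fill lights. The occlusion test here is a stand-in for a real BVH ray query; this is illustration, not an engine implementation:

import math

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def normalize(v):
    length = math.sqrt(dot(v, v))
    return (v[0]/length, v[1]/length, v[2]/length)

def shade(point, normal, light_pos, light_intensity, occluded):
    """Direct lighting at one surface point from one light."""
    to_light = normalize(sub(light_pos, point))
    if occluded(point, to_light):            # shadow ray: blocked -> in shadow
        return 0.0
    return light_intensity * max(0.0, dot(normal, to_light))

# A point on a floor, light directly above, nothing in the way -> fully lit.
print(shade((0, 0, 0), (0, 1, 0), (0, 5, 0), 1.0, lambda p, d: False))   # 1.0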

But saying it “only helps” and doesn’t “sacrifice other elements” is off-base. RT isn’t a magic bullet—it’s a resource hog. Even if placing a light is simple, the performance cost scales fast. You have all the geometry and other game elements to implement and integrate with the super heavy lighting. If realism were the goal, all these factors would come into play, and you'd quickly realize that delivering a product focused on photorealism with decent performance is nearly impossible. In fact, it's irrational to pursue, considering that 70–80% of people are running hardware like the 4070, 4060, 6700XT, 7600, 6600, or lower. Don't hate me for saying this, but we won't see low-end GPUs with 4090/5090 performance this decade; that’s just the laws of physics at play.

In my view, traditional lighting’s simplicity is why it dominated games for decades—it’s fast, predictable, and doesn’t need a NASA rig to run. Therefore, we should focus on what’s feasible. All this raw power and the limited transistor budget should be directed toward strengthening rasterization across all GPUs, from low-end to high-end. Graphic quality will continue to advance regardless, but with more realistic goals.
 
It's not quite like that. RT can save time in some areas. With ray tracing, you plop a light down, and it naturally handles reflections, soft shadows, and global illumination without needing to fake it. No baking lightmaps for hours, no fiddling with dozens of fake lights to mimic bounce or ambient effects—like in old-school rasterization where you’d add fill lights or reflection probes to cheat what RT does natively. For devs focused on realism, that’s a win. Ok.
A single RT light can theoretically replace a bunch of hacks, and you don’t have to wait for a bake to see results. That’s time saved, especially in dynamic scenes where baked lighting falls apart.

But saying it “only helps” and doesn’t “sacrifice other elements” is off-base. RT isn’t a magic bullet—it’s a resource hog. Even if placing a light is simple, the performance cost scales fast. You have all the geometry and other game elements to implement and integrate with the super heavy lighting. If realism were the goal, all these factors would come into play, and you'd quickly realize that delivering a product focused on photorealism with decent performance is nearly impossible. In fact, it's irrational to pursue, considering that 70–80% of people are running hardware like the 4070, 4060, 6700XT, 7600, 6600, or lower. Don't hate me for saying this, but we won't see low-end GPUs with 4090/5090 performance this decade; that’s just the laws of physics at play.

In my view, traditional lighting’s simplicity is why it dominated games for decades—it’s fast, predictable, and doesn’t need a NASA rig to run. Therefore, we should focus on what’s feasible. All this raw power and the limited transistor budget should be directed toward strengthening rasterization across all GPUs, from low-end to high-end. Graphic quality will continue to advance regardless, but with more realistic goals.

Can't we get AI to just fake all that instead of brute-force calculating every photon?

Ah - I think we found the next 'feature' - DLRT - 100x faster than regular RT!
 
Can't we get AI to just fake all that instead of brute-force calculating every photon?

Ah - I think we found the next 'feature' - DLRT - 100x faster than regular RT!
It's already here, funnily enough, in HL2 RTX. The first bounces are calculated, but the next ones use AI to predict how they would look. AI is also being used for skin and subsurface scattering (RTX Skin).
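Very roughly, the control flow looks something like the sketch below: trace the first bounce or two for accuracy, then ask a learned cache for the remaining indirect light instead of tracing further. predict_radiance() stands in for a trained network (a Neural Radiance Cache style model); none of this is Nvidia's actual implementation, just the shape of the idea:

def trace(ray, intersect, predict_radiance, traced_bounces=2):
    """Trace a few real bounces, then let a learned cache estimate the rest."""
    radiance, throughput = 0.0, 1.0
    for _ in range(traced_bounces):
        hit = intersect(ray)                   # real ray/scene intersection
        if hit is None:
            return radiance                    # ray escaped the scene
        radiance += throughput * hit["emitted"]
        throughput *= hit["albedo"]
        ray = hit["next_ray"]                  # scattered direction
    hit = intersect(ray)
    if hit is not None:                        # stop tracing, query the cache
        radiance += throughput * predict_radiance(hit["position"])
    return radiance

# Toy usage with stub functions, purely to show the control flow:
stub_hit = {"emitted": 0.1, "albedo": 0.5, "next_ray": None, "position": (0, 0, 0)}
print(trace(None, lambda r: stub_hit, lambda p: 0.4))   # 0.25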

It's not quite like that. RT can save time in some areas. With ray tracing, you plop a light down, and it naturally handles reflections, soft shadows, and global illumination without needing to fake it. No baking lightmaps for hours, no fiddling with dozens of fake lights to mimic bounce or ambient effects—like in old-school rasterization where you’d add fill lights or reflection probes to cheat what RT does natively. For devs focused on realism, that’s a win. Ok.
A single RT light can theoretically replace a bunch of hacks, and you don’t have to wait for a bake to see results. That’s time saved, especially in dynamic scenes where baked lighting falls apart.

But saying it “only helps” and doesn’t “sacrifice other elements” is off-base. RT isn’t a magic bullet—it’s a resource hog. Even if placing a light is simple, the performance cost scales fast. You have all the geometry and other game elements to implement and integrate with the super heavy lighting. If realism were the goal, all these factors would come into play, and you'd quickly realize that delivering a product focused on photorealism with decent performance is nearly impossible. In fact, it's irrational to pursue, considering that 70–80% of people are running hardware like the 4070, 4060, 6700XT, 7600, 6600, or lower. Don't hate me for saying this, but we won't see low-end GPUs with 4090/5090 performance this decade; that’s just the laws of physics at play.

In my view, traditional lighting’s simplicity is why it dominated games for decades—it’s fast, predictable, and doesn’t need a NASA rig to run. Therefore, we should focus on what’s feasible. All this raw power and the limited transistor budget should be directed toward strengthening rasterization across all GPUs, from low-end to high-end. Graphic quality will continue to advance regardless, but with more realistic goals.
See above. We will probably move away from trying to brute-force RT. The future seems to be doing a minimal ray calculation and then letting AI take over - stuff like Neural Radiance Cache and, most importantly, neural shaders. It's not an Nvidia-only gimmick, as MS has integrated it into DX12, and AMD and Intel also claim support.
We will see.
 
Yes, but now people are envisioning a future where RT is integrated in the core of the game without being able to turn it off.

We've already seen the mess that was made of forced upscaling/TAA. Now of course, games will gradually just keep trying stuff, and one failed game isn't an issue. But then I look at the box of UE5 games that have so little to offer for their supposed new gen of gaming, and I wonder WTF is going on. There's a weird corporate push underneath this whole movement, and it doesn't lead to better content or devs being left the freedom to make it.

I think companies are very often too busy with the amount of cool stickers they can put on a box (supports XYZ RTX DLSS etc.) than with whether the implementation of it is really worth anything. Other companies use their game as a playground for new RT technology, but forget to finish the game alongside it (Cyberpunk). It's like... do you guys WANT this to fail, or wtf are you doing here?!

It's so similar to VR. There isn't a single killer app that I MUST have it for. Honestly. My body is ready for that killer app. It ain't there.
There is a killer app for VR. Half-Life Alyx. The problem is, I can't see myself paying hundreds for a headset only to be able to play one killer app.

I think what you're looking for here is something akin to 'IPC'.
Yes, something like that.

The performance per shader just isn't improving on the RT front, by much anyway. The majority of it is coming from higher clocking, too, if you see it happen. It's becoming clear that you simply need X shaders for Y performance. Shrinks and higher clocking can only fix so much, but it's the only fix for RT as far as we've seen. People have mentioned the balance within a shader (-unit); what's dedicated to what. You can obviously slowly tweak that to be more RT-focused... but then you still haven't won jack shit in terms of die space vs performance. You're just offering lower raster perf to give more RT perf (and most likely losing net FPS).

That's why Nvidia is pushing the fake frame approach on an otherwise architectural standstill. And why we had a marketing line comparing x70 to x90 with MFG on.

This paradigm will change if AMD or Nvidia can bring an architecture to the market that actually does more work with fewer cores. Until then? RT is DOA and simply cannot survive the demise of Moore's Law. (This is exactly the prediction I made when Huang yelled 10 gigarays on stage 7+ years ago.) As much as you can't deny RT being a heavy workload that 'just is'... that automagically means it's going nowhere.
AMD already made improvements with the 9070 XT, which is way faster than the 7900 XTX in most RT games while being a bit slower in raster. Did they sacrifice raster performance for better RT? Perhaps; we'll never know. All we know is that it's a bit more balanced an architecture than RDNA 3.
 