
Half-Life 2 RTX Demo Is Out!

Are you interested in Half-Life 2 with ray tracing?

  • Yes! Bring it on! (Votes: 44, 42.3%)
  • Yes, worth a try. (Votes: 26, 25.0%)
  • No, I like the original look more. (Votes: 20, 19.2%)
  • Something else, comment below. (Votes: 14, 13.5%)
  • Total voters: 104
just makes me want HL 3
 
You're thanking NVIDIA for AMD being too slow, with outdated hardware priorities prior to RDNA4? Should progress stop and stoop to the lowest common denominator?

Game looks nice and framerates are fine too.


Back when Half-Life 2 came out and GPUs were undergoing their fourth major API overhaul in as many years, the game had support tiers for each contemporary API step: DirectX 7, 8.1, and 9.0. Each had its own rendering mode, which kept the game more or less playable on many tiers of available hardware from 1999 onward.

It's interesting to see Half Life 2, of all games, used to demonstrate the opposite; a locked down render path with no flexible implementation, no broad support, and no thought for previous generations.

It's also mildly humorous to see that it's AMD this time taking the hit. Back in 2004 it was NVIDIA that took a 65% performance hit by enabling all the eye candy, where the GeForce FX series on its bespoke variant of DX9 had to expend multiple extra cycles to render the standard implementation the rest of the industry adopted.

Anyway the point is that progress doesn't have to stop to keep supporting what exists. Valve already proved over 20 years ago that the two are not mutually exclusive.
 
Last edited:
We're at the point where FG is required... no thank you.

I'll try this 10 years from now.
 
Are you kidding? I just posted a video above showing a 4060 getting those same frame rates at more or less the same settings. Wherever those graphs came from, they are absolute twaddle. Get out of here with that nonsense. :rolleyes:


Play with your settings, but you should have fun. If you don't have the full Half-Life 2 yet, this might be a good time to buy it.


Just watched that. I'm impressed. This could solve a few problems for cards with limited VRAM.
Are you inadvertently trolling? This video shows the 4060 running at 17 fps in native 1080p, and 40 fps with upscaling in performance mode, stretching 540p to 1080p.
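For anyone wondering where the "540p" comes from: upscalers render internally at a fraction of the output resolution per axis. A quick sketch (the scale factors below are the widely published per-axis DLSS ratios; the function name and dict are mine, for illustration only):

```python
# Sketch: internal render resolution for common upscaler quality modes.
# Per-axis scale factors are the commonly cited DLSS ratios; treat them
# as illustrative rather than authoritative for any specific game.
SCALE = {
    "quality": 2 / 3,          # ~0.667 per axis
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal (pre-upscale) resolution for a given output size."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1080p output in performance mode renders internally at 960x540 —
# the "540p stretched to 1080p" case described above.
print(render_resolution(1920, 1080, "performance"))  # (960, 540)
```

So the 4060's 40 fps figure is for a 960×540 internal render, a quarter of the pixels of native 1080p.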

Here's Nvidia's own data showing basically the same as the GameGPU graphs above.
 

Attachments

  • 1742325505812.png (143.9 KB)
Not gonna play the demo, will just buy the full remaster when it releases. Considering every man and his dog and their dog's fleas owns a copy of HL2 at this point, shelling out a few bucks more for an RT-enabled experience seems quite reasonable.
 
Bunny hopping died with CS:GO. :( I was sad when it no longer worked.
This is understandable. CS:GO was a competitive game so any heavy movement exploits should've been patched out for balancing.

Besides, you can still bunny hop in CS2. You use a variant of it for corner peeking and forward acceleration (instead of just running forward).
 
This bit has me worried though. Will make 1:1 benchmarking impossible if widely adopted:

View attachment 390402
If you didn't realize this was the goal all along you've been living under a rock. Make performance metrics worthless so the marketing can take over completely.
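To make the benchmarking worry concrete, here's a toy model (my own simplification, not any vendor's formula), assuming each rendered frame is followed by n-1 generated frames: the displayed counter scales with n, while input latency still tracks the rendered rate.

```python
# Toy model of frame generation's effect on benchmark numbers (illustrative only).

def presented_fps(rendered_fps: float, gen_factor: int) -> float:
    """Displayed FPS when each rendered frame yields gen_factor presented frames."""
    return rendered_fps * gen_factor

def frame_latency_ms(rendered_fps: float) -> float:
    """Input latency tracks the rendered rate, not the presented rate."""
    return 1000.0 / rendered_fps

# 30 rendered FPS with 4x generation reads as 120 FPS on the counter,
# yet the game still samples input roughly every 33 ms.
```

Two cards showing the same "FPS" can therefore feel completely different, which is why a bare frame counter stops being a 1:1 comparison once generation factors differ.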
 
If you didn't realize this was the goal all along you've been living under a rock. Make performance metrics worthless so the marketing can take over completely.
Nvidia - "All your frames are belong to us!"
 
Back when Half-Life 2 came out and GPUs were undergoing their fourth major API overhaul in as many years, the game had support tiers for each contemporary API step: DirectX 7, 8.1, and 9.0. Each had its own rendering mode, which kept the game more or less playable on many tiers of available hardware from 1999 onward.

It's interesting to see Half Life 2, of all games, used to demonstrate the opposite; a locked down render path with no flexible implementation, no broad support, and no thought for previous generations.

It's also mildly humorous to see that it's AMD this time taking the hit. Back in 2004 it was NVIDIA that took a 65% performance hit by enabling all the eye candy, where the GeForce FX series on its bespoke variant of DX9 had to expend multiple extra cycles to render the standard implementation the rest of the industry adopted.

Anyway the point is that progress doesn't have to stop to keep supporting what exists. Valve already proved over 20 years ago that the two are not mutually exclusive.
It's this or nothing.

This remaster isn't like a normal one, bankrolled by some AAA studio, resold, and targeted at the widest possible reach including consoles. It's essentially a community project made possible through RTX Remix, a completely free toolset developed by NVIDIA, as I'm sure you're aware. The fact that ~90% of cards are NVIDIA doesn't hurt the support, which is in fact broad, considering the most popular GPU tier, the xx60, can run it with no problem, and the RTX feature set has been around for almost a decade.

Do you have any idea how the RT/PT tech could support AMD cards that literally do not have the hardware to run it? It already struggles on older lower tier RTX cards.

Or perhaps you want HL2 RTX without the RTX features, so it's more compatible? Doesn't this essentially defeat the purpose of the project?

It's literally free, so the complaints are confusing to me.



The project wouldn't even be able to exist if the (again, free) toolset wasn't bankrolled and published by NVIDIA, for the benefit of anyone who wants to make old games look new.
 
Are you inadvertently trolling?
I started this thread, so no.

Only for nvidia? They can keep it.
Fairly certain this will play on RX7000 and RX9000 as well. At least players in the reviews are saying that the RX cards are working.
EDIT: Confirmed! Had someone I know test the demo with his RX7800XT and it runs fine with reasonable settings at 1440p.
 
I voted "worth a try". As an AMD user, yes.

If it looks and runs great, that's awesome, but if it doesn't, I'll just play the original, no problem.
 
If I didn't know you were joking, I would think you're daft in the head.
Don't feed the trolls, they can't help themselves and are compelled to be here.



Pretty neat for a free demo if you ask me, and the original remains there for anyone to play without the fun new features.
 
I voted "worth a try". As an AMD user, yes.

If it looks and runs great, that's awesome, but if it doesn't, I'll just play the original, no problem.
I was going to try it on my 7900 XTX and post my performance here for laughs, but it's nearly 90GB! :( My Starlink will catch fire trying to download it.

I think you should try it AusWolf, give us some Red representation in this thread :)
 
Don't feed the trolls, they can't help themselves and are compelled to be here.

View attachment 390426
Pretty neat for a free demo if you ask me, and the original remains there for anyone to play without the fun new features.
Stop being a flamer. I presented rational arguments, like always.
You have none, so instead of engaging in discussion about the subject, you resort to attacking people. In other words, you're the real troll here.

You know what, welcome to my ignore list.
 
The fact that ~90% of cards are NVIDIA doesn't hurt the support, which is in fact broad, considering the most popular GPU tier, the xx60, can run it with no problem, and the RTX feature set has been around for almost a decade.
It already struggles on older lower tier RTX cards.

Emphasis mine. I think you see the problem here.

You're right though, the hardware and feature set are quite pervasive by now, which raises the question of why it runs so terribly on that hardware and feature set. Filter through the negative reviews to get past the noise of people complaining just to complain, and see the first-hand reports from 20 and 30 series owners; they aren't impressed with the results. So on one hand the features are old and the support is broad by sheer market volume, but on the other hand those features are simultaneously so cutting edge that you can't expect them to run well on said broad-market hardware that claimed to support them.

Do you have any idea how the RT/PT tech could support AMD cards that literally do not have the hardware to run it? It already struggles on older lower tier RTX cards.

Sure, open software collaboration would allow exactly this. This has been said a billion times since TWIMTBP became a thing in the mid-aughts: it's not that AMD doesn't support *NVIDIA-specific code/feature*, it's that *NVIDIA-specific code/feature* doesn't support AMD. RDNA2/3 supports essentially the same RTRT capabilities that the 20 series does; the hardware functions are there to call on. What AMD lacked previously was properly dedicated hardware denoising/post-processing. NVIDIA absolutely could extend the olive branch on support, which won't happen because they want to sell hardware.

Or perhaps you want HL2 RTX without the RTX features, so it's more compatible? Doesn't this essentially defeat the purpose of the project?

That isn't what I said and I won't be addressing strawman arguments. You know better.

On topic, I've attempted to run HL2 RTX on Intel's Arc A770 16GB and can confirm that it does at least work, unlike Portal RTX at the time it released. It took 11 minutes to load the main menu, and opening a door led to another minute and 12 seconds of loading.

Here is what that looks like:

(two screenshots attached)

Reminder that Alchemist "has technological-parity with NVIDIA RTX [30 series]."
 
I was going to try it on my 7900 XTX and post my performance here for laughs, but it's nearly 90GB! :( My Starlink will catch fire trying to download it.

I think you should try it AusWolf, give us some Red representation in this thread :)
90 GB? Holy f! :eek:

That'll take 3 years to download on my medieval English DSL connection. Anyway, I'll have a go and report back. I hope I can get it running on Linux. I'm still missing some key files for Portal RTX to work properly, so fingers crossed. :D
 
I presented rational arguments, like always.

gloriously choppy 30-40 FPS, complete with ghosting and nausea-inducing artifacts as a bonus. Glorious, Jensen. :cool:
So rational! To you I say, if you want to come across as only rational, drop the sarcasm and Jensen jabs, and just be rational. Worth a try maybe?

Amazes me people get so defensive when all others do is respond in kind. Projection?

------------------------------------

It's downloaded but I won't get to try for a few hours yet, looking forward to it!
 
I presented rational arguments, like always.
No, you presented nonsense that was refuted before you posted it.
You know what, welcome to my ignore list.
Feel free to add me as well then. I agree with Wolf.

Answering a few of the points made by other users (not quoting or naming names, as I don't want to seem like it's an attack): remember folks, turning settings down and tweaking them is always an option. If you have a lower-tier GPU, adjust your settings accordingly and to your satisfaction. Lowering your settings or even your resolution to get a playable framerate is a perfectly acceptable solution.
 
Emphasis mine. I think you see the problem here.

You're right though, the hardware and feature set are quite pervasive by now, which raises the question of why it runs so terribly on that hardware and feature set. Filter through the negative reviews to get past the noise of people complaining just to complain, and see the first-hand reports from 20 and 30 series owners; they aren't impressed with the results. So on one hand the features are old and the support is broad by sheer market volume, but on the other hand those features are simultaneously so cutting edge that you can't expect them to run well on said broad-market hardware that claimed to support them.
Graphics settings exist; if 20 and 30 series owners expect ultra-settings path tracing, they're always going to be disappointed. The alternative is never improving graphics so everyone can always play maxed settings with great framerates. The fact that a $300 4060 owner can keep it above 60 FPS means the game itself isn't flawed, tech-wise, and 20/30 series owners are free to use DLSS and turn some things down. Again: FREE GAME.
Sure, open software collaboration would allow exactly this. This has been said a billion times since TWIMTBP became a thing in the mid-aughts: it's not that AMD doesn't support *NVIDIA-specific code/feature*, it's that *NVIDIA-specific code/feature* doesn't support AMD. RDNA2/3 supports essentially the same RTRT capabilities that the 20 series does; the hardware functions are there to call on. What AMD lacked previously was properly dedicated hardware denoising/post-processing. NVIDIA absolutely could extend the olive branch on support, which won't happen because they want to sell hardware.
"NVIDIA-specific code", i.e. the DX12 Ultimate feature set. The only arguably exclusive stuff in HL2 RTX, considering you can run it just fine on RDNA 4 hardware for example, are things like DLSS, Ray Reconstruction, etc. These are proprietary because they run on special hardware that doesn't exist in the competition. Again, the alternative is that you get nothing instead; there is no "equivalent" to this tech, because it isn't hardware-agnostic and cannot be. NVIDIA could have done what AMD did and made this supposedly wonderful open software collaboration that worked on everything (see FSR 1-3), which is universally worse than the DLSS alternative, but why would they do that? Now even AMD is doing hardware-specific software with the advent of FSR 4. Though I assume Sony likely had some say in that, considering their new upscaler PSSR turns out to be partially based on FSR 4, so obviously AMD's biggest GPU customer wasn't exactly thrilled with the "works on everything, badly" approach.

I'm curious what the AMD crowd reaction/justification will be now that FSR requires specific hardware and that reality check hits, as AMD essentially does the same thing NV does, three years later, again.

Perhaps people can point me towards the open-source/collaboration equivalent software developed by companies other than NVIDIA that allows similar levels of remastering with the ease that RTX Remix does. I don't believe it exists, but I could be wrong. I'm guessing the response will most likely be something like "CUDA 'alternatives' exist, you know."

I've emphasized "equivalent" because the same thing but much worse is not equivalent.
 
That'll take 3 years to download on my medieval English DSL connection.
Quite a bit later than it should have happened, we finally got Fibre to the Premises in my suburb here in Perth, Australia, and for the time being I'm treating myself to 1000/50. It makes the biggest difference with game/update downloads for sure. I hope that sort of thing isn't too far away for you!

Answering a few of the points made by other users (not quoting or naming names, as I don't want to seem like it's an attack): remember folks, turning settings down and tweaking them is always an option. If you have a lower-tier GPU, adjust your settings accordingly and to your satisfaction. Lowering your settings or even your resolution to get a playable framerate is a perfectly acceptable solution.
Just like almost every single other game out there really, think optimised settings rather than dialling it to 11 and being disappointed, it's almost always not worth the large performance hits for the smaller visual returns.
 
Just like almost every single other game out there really, think optimised settings rather than dialling it to 11 and being disappointed, it's almost always not worth the large performance hits for the smaller visual returns.
Bingo. Should CP2077 not exist because at the time you couldn't max out the graphics settings on AMD/Intel hardware and get playable framerates?

"Orbifold Studios is a collective of developers who love Half-Life, formed by the teams behind Half-Life 2: VR, Half-Life 2: Remade Assets, Project 17, and Raising the Bar: Redux. We're led by our passion for creation and desire to always raise the bar of quality.
Our goal with Half-Life 2 RTX is to deliver a new way to experience Valve's classic, using cutting-edge rendering tech to bring the game to life in a way never before possible. At the same time, we'll be remastering all of the game's assets, to be made available to the community for free. With Half-Life 2 RTX and the release of its assets, we hope to ignite new passion within the community to explore the potential of RTX in remastering classic mods, or making new mods with RTX in mind!"

Gamers/devs passionate about a game release an improved version of it for free.

Forumites: "this is terrible because I can't max out settings with my five-year-old AMD card and get playable frame rates".
 
I'm curious what the AMD crowd reaction/justification will be now that FSR requires specific hardware and that reality check hits, as AMD essentially does the same thing NV does, three years later, again.
It's not the same thing, though.

DLSS has always needed AI cores, that's why it's always been superior in image quality. FSR never did until now. Complaining that your 6700 XT doesn't run FSR 4 (ignoring the fact that it runs FSR 3) is like complaining that your 1070 Ti doesn't run DLSS.

If you want to present AMD as shady, then I'd say look at RDNA 3, and ask why the alleged "AI cores" can't run FSR 4. As far as I currently know (more reading is needed on this), RDNA 3 is missing some instructions that only RDNA 4 has and that FSR 4 needs. If that's true, it raises the question of whether RDNA 3's AI cores are true AI cores at all.
 
It's not the same thing, though.

DLSS has always needed AI cores, that's why it's always been superior in image quality. FSR never did until now. Complaining that your 6700 XT doesn't run FSR 4 (ignoring the fact that it runs FSR 3) is like complaining that your 1070 Ti doesn't run DLSS.

If you want to present AMD as shady, then I'd say look at RDNA 3, and ask why the alleged "AI cores" can't run FSR 4. As far as I currently know (more reading is needed on this), RDNA 3 is missing some instructions that only RDNA 4 has and that FSR 4 needs. If that's true, it raises the question of whether RDNA 3's AI cores are true AI cores at all.
You're missing the point. I'm not presenting AMD as anything; I'm saying there is a reason why the game runs badly on non-NVIDIA hardware and pre-RDNA 4 hardware: it requires specialised hardware to run. This isn't good or bad, it's just a fact. For those who don't have the hardware, or have weak hardware, go play the original HL2, or turn some settings down.

Whether RDNA 3 AI cores are good or not is debatable, yes. But despite the constant whinging about NVIDIA proprietary software/hardware since 2018, AMD has literally gone and done the exact same thing, and released ML trained AI upscaling that requires specific hardware, instantly making all their previous cards obsolete.
 