
Portal with RTX

Nice, RTX makes Portal look like the futuristic game it should be.
Hopefully they'll do Portal 2 RTX too.
 
The RTX 3070 and 3080 probably didn't have enough memory at 4K, which is why they only got 1 FPS. Even if memory weren't the issue, the 3070 couldn't exceed 3-4 FPS and the 3080 8-9 FPS. Thanks, Nvidia, I'll be getting 1 FPS.

Also, even current RT technology isn't enough for the future, and this conversion was made from an old game. In other words, if RT at this level were used in current games, even an RTX 4090 would get 10 FPS at most. It's a very exaggerated and overhyped technology.
 
Nvidia loves the brute-force approach so they can save the day with DLSS. Under 60 FPS at 1440p on a card that costs $1,600 minimum. I bet The Witcher remaster will be another slide-shyte show! :shadedshu:
 
Folks keep saying it's a gimmick; I call BS. Does it make games look outstanding when implemented properly? Hell yeah it does. Does it require a sh1t ton of graphical power to make it happen? Yes it does. In time, just like AA, RT will be processed with ease. Now, I don't like Nvidia's business practices at all, but I do appreciate them pushing new technologies.
 
Folks keep saying it's a gimmick; I call BS. Does it make games look outstanding when implemented properly? Hell yeah it does. Does it require a sh1t ton of graphical power to make it happen? Yes it does. In time, just like AA, RT will be processed with ease. Now, I don't like Nvidia's business practices at all, but I do appreciate them pushing new technologies.

I bet Portal with RTX looks way better than just running the old game at 16K 60 FPS.

After playing through plenty of shallow current-day RPGs, I kinda want to play remastered old games like Elder Scrolls Oblivion with RTX.
 
That'd just be ugly, and made so that all cards below the RTX 4080 and 4090 look bad.

Apparently it's also hard to play, since the textures for walls that can be targeted with portals and those that can't aren't differentiated clearly enough.
 
The performance makes no sense. In every other ray-traced game the 6900 XT equals the 3070, yet here it's 8x slower? Seems like Nvidia did everything in their power to make this a clown show just to demonstrate their superiority.
 
So: the GPUs that can run this game need DLSS 3.0 to push into the high-refresh-rate range, but at the same time that adds input latency, which defeats the purpose of high refresh rates. And then there's that $2,000 price tag.

:roll:

No, no no no no, nooooooooooooooooo.

@W1zzard

Thank you for the image slides and comparisons. However, Nvidia has complicated the whole story, and with it the work you might have to consider doing in the future. In the comparison slides we see DLSS, but not the version of said DLSS. Yes, I understand you're testing with an RTX 4090, but perhaps we could get the top-of-the-line DLSS 2 GPU in there too? You know, for peasants who can't upgrade every two seconds at $2,000 a pop.
 
Physically correct lighting, according to AMD fans, is a gimmick and shouldn't exist because it destroys the performance of their favourite company's cards. Fake lighting is also "better".

I've read sad, stupid things on the internet, but this trumps them all by a large margin.

This is exactly how 3D games should be lit. I'm also a huge fan of Quake II RTX which, with its real RTRT, makes the game look real and truly fantastic. I cannot wait to play Portal RTX on the RTX 6060, the first fast-enough card I'll be able to afford.

The performance makes no sense. In every other ray-traced game the 6900 XT equals the 3070, yet here it's 8x slower? Seems like Nvidia did everything in their power to make this a clown show just to demonstrate their superiority.

In games with actual ray tracing, AMD cards are quite slow:

[attached benchmark chart]
 
Oh boy, yet another RT sensation that a $2k 4090 can barely run. That's great. Actually, it can't run it at all. Well, that's even better, right? That means NV needs to buckle up and get a 5090 out that will be twice as fast in raster and RT. Wait, there's more: it will cost only $3k, guys. Now that's what I call good value, right? Hurray for NV and RT, this is the future.

Bleh, I'm really tired of this idiocy and the bullshit surrounding it.
 
Oh boy, yet another RT sensation that a $2k 4090 can barely run. That's great. Actually, it can't run it at all. Well, that's even better, right? That means NV needs to buckle up and get a 5090 out that will be twice as fast in raster and RT. Wait, there's more: it will cost only $3k, guys. Now that's what I call good value, right? Hurray for NV and RT, this is the future.

Bleh, I'm really tired of this idiocy and the bullshit surrounding it.

You know, when the time comes and they hide in their bunkers with their ill-gotten gains, it puts a smile on my face knowing that they're going to be juggled around inside that bunker, not unlike a theme park ride that pulls you here and there. Mayhap a blender? Or a trash crusher, as the plates move sideways... Anyway, there isn't going to be any stopping it; there will be no "on" or "off" button they control. Hide in space? No, no: asteroids will be their next problem. No place to hide.
 
You know, when the time comes and they hide in their bunkers with their ill-gotten gains, it puts a smile on my face knowing that they're going to be juggled around inside that bunker, not unlike a theme park ride that pulls you here and there. Mayhap a blender? Or a trash crusher, as the plates move sideways... Anyway, there isn't going to be any stopping it; there will be no "on" or "off" button they control. Hide in space? No, no: asteroids will be their next problem. No place to hide.
Honestly, if there are people cheering for the course of action we see now and justifying what these companies are doing, those people must work for the company, because I literally can't believe it; it's just impossible to be such a madman otherwise. Yet we have these lunatics everywhere, even here on this forum, and they think they're right. That's amazing and scary at the same time. People will believe anything.
 
Physically correct lighting, according to AMD fans, is a gimmick and shouldn't exist because it destroys the performance of their favourite company's cards. Fake lighting is also "better".

I've read sad, stupid things on the internet, but this trumps them all by a large margin.

This is exactly how 3D games should be lit. I'm also a huge fan of Quake II RTX which, with its real RTRT, makes the game look real and truly fantastic. I cannot wait to play Portal RTX on the RTX 6060, the first fast-enough card I'll be able to afford.



In games with actual ray tracing, AMD cards are quite slow:

[attached benchmark chart]
The true RT games (read: old, fully path-traced games that had crap lighting before) look absolutely amazing with RT and are totally worth it. But in the modern-engine ones that fake/half-ass it (Tomb Raider, CP2077, Metro, F1, Far Cry 6, Crysis Remastered), which also have great fake lighting, RT makes virtually no difference.

I've spent hours comparing RT on and off in every scenario I can think of in CP and Metro. At one point in Cyberpunk I had turned RT off and forgot; later I saw some lighting I thought was badass and went "oh OK, here I'll turn off RT and I bet it won't look this good"... it was already off. It's almost impossible to tell.

These types of older remakes are what make RT exciting; for the modern games with already great fake lighting, I'm definitely in the "not worth turning on" camp after playing them with a 4090 and cranking the settings.

IMO modern fake lighting is so good that if RT were the de facto lighting standard in all games and fake lighting didn't exist, Nvidia would invent fake lighting, call it DLLL or something stupid, and then promise 50% FPS boosts while you can hardly tell the lights are fake.

I can't wait till they remake the Fallout series with some RT. Morrowind looks like it's getting a remake... I can just imagine how awesome Vice City or Fallout: New Vegas RTX would be.
 
Man there are some salty takes on RT in this thread.

I think this is super cool and can't wait to play around with it myself; a glimpse into the future, perhaps. Play it, don't play it, upgrade, don't: make your choice. But despite some people's best efforts, you cannot convince me that this isn't bloody amazing.

That VRAM usage at 4K is insane. The 10-12GB cards from last generation are not aging well.
Shame that the 16GB cards from last generation can't muster playable framerates, though. I stand by the choice I made two years ago for the kind of gaming I'd like to do; from my perspective, the 16GB cards from last generation aren't ageing well, and the stronger feature set at the expense of a few extra gigs for textures was the better choice.
 
And there you have it: the 3060 is 300% faster than the 6900 XT, courtesy of proper AI and dedicated hardware.

:p
 
And there you have it: the 3060 is 300% faster than the 6900 XT, courtesy of proper AI and dedicated hardware.
Best slideshow ever.
 
That VRAM usage at 4K is insane. The 10-12GB cards from last generation are not aging well.
It's just RAM stuffing; there's no performance penalty for having a bit less (10-12 GB).

Some GPUs that are normally considered fine for 4K are brought to their knees by this fully path-traced RTX game. It just goes to show that this level of RTX isn't close to being ready for mainstream gamers. Maybe in a couple of generations, or possibly three.
Yep, full RTX implementation is the new Crysis.

So, can it run fRTX?

And there you have it: the 3060 is 300% faster than the 6900 XT, courtesy of proper AI and dedicated hardware.

Best slideshow ever.
Yep, 3 fps is sooo much better than 1.
:kookoo:

That VRAM usage at 4K is insane. The 10-12GB cards from last generation are not aging well.
The 16GB cards are doing even "worse", so no memory problem here, dude.

Oh boy, yet another RT sensation that a $2k 4090 can barely run. That's great. Actually, it can't run it at all. Well, that's even better, right? That means NV needs to buckle up and get a 5090 out that will be twice as fast in raster and RT. Wait, there's more: it will cost only $3k, guys. Now that's what I call good value, right? Hurray for NV and RT, this is the future.

Bleh, I'm really tired of this idiocy and the bullshit surrounding it.
Just DLSS 3 it.
There, all is good now.
 
So, ugly graphics by today's standards, and just 60 FPS on a 4090 with DLSS Balanced? That's why I hate this RT bullshit: zero effort to actually improve the graphics, and we end up with worse graphics than five-year-old games, with worse performance.
 
The performance makes no sense. In every other ray-traced game the 6900 XT equals the 3070, yet here it's 8x slower? Seems like Nvidia did everything in their power to make this a clown show just to demonstrate their superiority.
NV is not to blame; it just reveals the true advantage of NV RTX vs. AMD RX when fully unleashed.
Dedicated, specialized hardware (RT cores in this case) will always do better. See also ASICs for mining.
If and when AMD decides to spend some silicon area on that feature, they'll be back in the (RT) game.

Smells like a good, real-world (not just a tech demo) new RT benchmark title.
 
Man there are some salty takes on RT in this thread.

I think this is super cool and can't wait to play around with it myself; a glimpse into the future, perhaps. Play it, don't play it, upgrade, don't: make your choice. But despite some people's best efforts, you cannot convince me that this isn't bloody amazing.


Shame that the 16GB cards from last generation can't muster playable framerates, though. I stand by the choice I made two years ago for the kind of gaming I'd like to do; from my perspective, the 16GB cards from last generation aren't ageing well, and the stronger feature set at the expense of a few extra gigs for textures was the better choice.

The salt comes from the extreme price tags attached to this supposed "progress". You aren't moving forward if only 10% can afford the new features. I see you bring up the VRAM issue; well, guess what, we warned about that too: we're getting too little for the price we paid. But oh, look, we're supposed to be happy. Suffice it to say, there's a big cash grab and a middle finger from Nvidia to its customers, and we are not happy.

No one says ray tracing sucks; it looks amazing, just not at these price points and with these middle fingers from Nvidia.

NV is not to blame; it just reveals the true advantage of NV RTX vs. AMD RX when fully unleashed.
Dedicated, specialized hardware (RT cores in this case) will always do better. See also ASICs for mining.
If and when AMD decides to spend some silicon area on that feature, they'll be back in the (RT) game.

Smells like a good, real-world (not just a tech demo) new RT benchmark title.

You are correct, I can't wait to see what the new RX7000 series does on AMD's side.
 
Yet Doom 3 had proper mirrors 14 years before ray tracing appeared in consumer graphics.
I think they used portals? So not "proper"?

Edit: "portal" in this game-dev context means they render the whole (visible) game world a second time in the mirror, mirrored, separated by an invisible wall.
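For the curious, that mirrored second pass boils down to reflecting the geometry (or the camera) across the mirror plane. A minimal sketch of the underlying math in Python (my own illustration, not Doom 3's actual code):

```python
def reflect(point, normal, plane_point):
    """Mirror a 3D point across the plane through `plane_point`
    with unit `normal` -- the core of rendering the world a
    second time, mirrored, behind the mirror surface."""
    # signed distance from the point to the mirror plane
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, normal))
    # step back twice that distance along the normal
    return tuple(p - 2.0 * d * n for p, n in zip(point, normal))

# A point 1 unit in front of a mirror in the z=0 plane ends up
# 1 unit "behind" it, which is what the mirrored render draws:
print(reflect((0.5, 2.0, 1.0), (0, 0, 1), (0.0, 0.0, 0.0)))  # (0.5, 2.0, -1.0)
```

In practice an engine bakes this into a 4x4 reflection matrix, re-renders the visible scene with it, and uses the stencil buffer (the "invisible wall" above) to clip the mirrored copy to the mirror's surface.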
 
The performance makes no sense. In every other ray-traced game the 6900 XT equals the 3070, yet here it's 8x slower? Seems like Nvidia did everything in their power to make this a clown show just to demonstrate their superiority.
I'm sure that with a basic driver update (Game Ready style) AMD will gain big.
It's still only optimized for NV.
 
Thank you for the image slides and comparisons. However, Nvidia has complicated the whole story, and with it the work you might have to consider doing in the future. In the comparison slides we see DLSS, but not the version of said DLSS. Yes, I understand you're testing with an RTX 4090, but perhaps we could get the top-of-the-line DLSS 2 GPU in there too? You know, for peasants who can't upgrade every two seconds at $2,000 a pop.
The performance charts show various cards with DLSS Balanced; does that help?
 
Physically correct lighting, according to AMD fans, is a gimmick and shouldn't exist because it destroys the performance of their favourite company's cards. Fake lighting is also "better".

I've read sad, stupid things on the internet, but this trumps them all by a large margin.

This is exactly how 3D games should be lit. I'm also a huge fan of Quake II RTX which, with its real RTRT, makes the game look real and truly fantastic. I cannot wait to play Portal RTX on the RTX 6060, the first fast-enough card I'll be able to afford.
Erm... no. It's more like...

"Physically correct lighting according to people who can't or just don't want to afford the latest shiniest GeForce card is a gimmick because it adds little immersion compared to the performance hit it inflicts on everything except for said latest shiniest GeForce cards."

This is exactly how 3D games should be lit if the performance penalty with any remotely affordable GPU wasn't so great, and if Nvidia didn't push their anti-competition agenda that we see an example of in this review.

I think they used portals? So not "proper"?

Edit: "portal" in this game-dev context means they render the whole (visible) game world a second time in the mirror, mirrored, separated by an invisible wall.
You're right, not "proper". But it looked real because of what you said. :)
 