
[rant] RAGE is making me rage!

Is that framerate cap issue a dealbreaker for you?


  • Total voters
    19

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.77/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I bought RAGE when it came out back in 2011, but like so many other games I've got, I tend to take a long time getting round to playing them, what with so many other things competing for my time. This time it's RAGE's turn.

Most games with teething troubles at release pretty much have them ironed out after a few months, so waiting like this means I normally get to play a nicely working game, with few glitches. Not so with RAGE, and I'm regretting buying it.

It looks like a good story in a great sci-fi setting with really great visual effects, just the sort of thing I'd like to play and have built a high end rig for. However, it's badly let down by idiotic, simple technical issues with large consequences that I can't believe plague it to this day.

Yeah, John Carmack might know how to write a video game, just like many other hotshot developers, but I think he's overrated if he puts out sloppy work like this, with so many rough edges. It's not a fucking beta, and I paid good money for it that I want back.

So, what peeves me about it? Several things which either just don't work properly or are implemented badly, that's what.

The first one is the complete dealbreaker which inspired me to write this rant: it's the idiotic capped framerate.

On clicking the Play button in Steam, I get the option of running the 32-bit version (default option) or the 64-bit, unsupported version. Heck, from what I can tell, they're both unsupported anyway, so I can't see what the difference is.

So, the 32-bit version is capped at 60fps regardless. I've looked online, but nothing I did would lift it. The monitor refresh is 120Hz, so it's judder all the way and therefore completely unacceptable. I can set the monitor refresh to 60Hz to match it, but then the backlight strobing doesn't work and I get motion blur, plus there is much more lag, so I can't win.

I've spent thousands of pounds on a system which has enough power to get rid of the annoying judder in most games and LCD monitor motion blur. Smooth animation is the whole point, isn't it? However, my investment has been compromised by some dumb developer's policy.

I can unlock the framerate in the 64-bit version, but at the cost of the game doubling in playing speed at 120fps vsync locked - it's even faster with vsync unlocked. Again, completely unacceptable. It's possible to adjust the running and walking speeds by using commands in the config file, but all the effects and other animations would still happen at double speed, which isn't good enough. Of course, there doesn't appear to be a global speed setting command which would have mitigated this problem.
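The speed-doubling behaviour is exactly what you'd expect if the engine advances its simulation by one fixed 1/60s tick per rendered frame, rather than by real elapsed time. A toy sketch of the arithmetic (my own illustration, not actual id Tech 5 code):

```python
TICK_RATE = 60  # assumed: engine logic advances one fixed 60Hz tick per rendered frame

def game_speed_multiplier(fps):
    """How fast game time runs relative to wall time, if every rendered
    frame also advances exactly one 1/60s simulation tick."""
    return fps / TICK_RATE

# At the intended 60fps the game runs at normal speed:
print(game_speed_multiplier(60))    # 1.0
# Unlocked to 120fps, everything (movement, effects, animations) runs twice as fast:
print(game_speed_multiplier(120))   # 2.0
```

A decoupled engine would instead step the simulation by the measured frame time (or a fixed tick count derived from it), which is why per-command speed tweaks can't fix this: they only rescale some of the values being fed into a loop that is still running twice as often.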

Carmack, you've written/ported this game for the PC. Of course gamers are gonna want to run it at their chosen framerate and not at some console-limited rate. Every other decent PC game can do this, so it's idiotic how yours can't. What, don't you know how to write a game engine that can handle higher framerates properly? You should have seen some of the angry forum comments I came across about this issue...

- SLI is officially not supported! This one I find hard to believe, except that the driver defaults to single card for this game. Every other major game supports it, so why can't Carmack manage it? It makes such a big performance difference that a developer cannot afford to leave out support for it.

- GPU transcode (hardware compressed texture expansion using the GPU) cannot be enabled on my system. Seems like the game hasn't been updated to recognize my graphics cards, which were released two years later. Still, it's a poor show that it can't have wider compatibility, since later GPUs tend to support earlier instruction sets and features; or it could just allow the option with a compatibility warning and let the user see how it goes.

- To make advanced changes to graphics settings and enable the really large, hi-res textures, one has to mess around with config files and obscure, poorly documented commands rather than conveniently selecting the options from the menu. Seriously, how much development work does it take to make a proper user interface? No more than a day at the outside. This is so lazy.

I remember when Crysis 2 had only simple console style graphics settings at release and PC gamers rightly complained, loudly. EA listened and fixed this with a later update, making the game a real graphics powerhouse that is often used in graphics card reviews.

- The vsync setting in the game doesn't work. Not a huge deal, as I can override it in the driver, but it's still a stupid glitch which shouldn't be there after several patches.

- In an early version of the game, the command console could be enabled and changes made to the game on the fly. This ability was then disabled for some mysterious, unexplained reason in a later patch. While some people have been able to re-enable it, I haven't, in either version, likely due to further patches preventing it. Also, enabling it punishes you by disabling Steam achievements!

I spent several hours trying to get around some of these issues, but with the failure to get proper 120Hz animation out of it, I just said fuck it and gave up on this game. I might come back to it one day, but there are over 100 games in my Steam library, plus several in my Origin library that deserve my attention much more than this piece of crap.

If someone here knows of a fix for this framerate issue, then I'll look at the game again soon, otherwise I won't bother.

Rage over.
 
Wow, qubit, I'm sorry you're having such a hard time. No, I don't have an answer to SLI or the 120Hz monitor issues.

At first when I saw you said the issues had not been ironed out, I scratched my head, because I immediately thought of all the release problems. I waited 8 months after release to play it, and the multitude of other problems were indeed fixed. You should have seen it at release! For me, it turned out to be quite a fun game, except for a lackluster ending. No gameplay issues or bugs to speak of. I'm actually glad now I didn't make the move to a 120Hz monitor yet.
 
Answer to all your problems = ID TECH 5
 
64-bit version had problems with tearing for me so I played the 32-bit version. I would play 32-bit at 60 FPS and 60 Hz. Beat it twice doing so.
 
I remember there was a fix for the fps cap; once that was done, and as long as I did not minimize the game, it would be fine. However, it's been so long I've forgotten what had to be done, although I have a feeling it was just a config edit or the like.
 
I've had the same issue. And also came to post

"ID Tech 5"

I preordered Wolfenstein and was super upset about the cap. I was really looking forward to The Evil Within and read that it was using ID Tech 5...and I have yet to play it. I have not played Rage either.

It seems strange to me that such a PC focused developer would do this. Doom 3 was also locked to 60, until the BFG Edition came out quite recently.

I save games like these for when I travel. Just play them on my laptop where I am limited to 60 hz anyway.




"Remind me not to buy a 120 Hz monitor."

There's nothing quite like playing at a constant 120fps on a 120Hz monitor. You won't want to go back to 60Hz. Oh so buttery smooth. Just like the old days of CRTs. I recommend trying it at some point.
 
Answer to all your problems = ID TECH 5
It's not idtech5, it's retarded management (probably gone by now). Doom3 BFG Edition uses the very same idtech5 and does 120FPS from the options just fine. I heard stories about when Brian Harris (at Nvidia now) added motion blur to Doom3-BFG for the PC version, there was yelling and big drama in the office because QA wasn't notified... so you can imagine what kind of "PC gamers" ran the show there.

The engine runs fine on an Xbox 360, so I have no idea why you would need SLI tbh. It runs fluid on a G92 GPU; even two of those would be overkill. AMD GPUs are a different story sadly. Rage is a fun little shooter if you don't expect anything "deep" like Borderlands. I have no idea why they never released a high resolution version for the PC; I personally wouldn't care if it were 200GB if I could see that world's megatexture at 4 times the resolution on the PC. The weapons feel really good and the movement is still done like nobody else can do it......

.....but yes, if you look at things from a PC gaming enthusiast's perspective, then OP's rant is quite valid I'm afraid, and their last hope is not to f**k up the new Doom or they are done.
 
why would 60fps judder on 120hz? it should be perfect, it's aligned to the refresh

no need to make carmack a scapegoat, rage came out when bethesda/zenimax owned id, carmack was on his way out & eventually went to oculus, it's not like he's the only programmer manually making the engine+cuda detector+vsync+etc etc...

same for sli with today's cards since it's not some ultra demanding title (yes it's a curiosity, no it's not the only engine or game that cant have sli/cf)

funny thing about framerates & game speed or physics... remember doom3? same thing, i wonder why... also ETQW oh man, 30fps menus without a hardware mouse cursor, ouch
 
I played Rage back in the day, and for what it's worth, if you're still wasting time on this game, end it now.

This game is really just Borderlands the way it shouldn't be done, but without a nice backstory and containing about two fun weapons; the rest is just same old shooter crap against a really REALLY bad AI and in an extremely limited game world. The game tries to pull the wool over your eyes to hide a terrible lack of content. Not to mention the narrative, which doesn't try to take itself seriously but also completely fails at presenting, well, pretty much anything at all.

And in all honesty Motorstorm (2006) has a much better presentation of badlands and desert than Rage does. So much for ID Tech.

Sorry mate, Rage is just a steaming pile. Avoid at any cost
 
I liked it. Graphics are pretty beautiful. I only have 60hz so its not a big deal for me. Probably won't replay it though.
 
I liked it. Graphics are pretty beautiful. I only have 60hz so its not a big deal for me. Probably won't replay it though.

Yeah, while fun, replayability is not high. I did two playthroughs though, just to make sure I didn't miss anything. In its patched form, it's a decent game, only let down by a fizzle of an ending and the ILLUSION of open world. They billed it as open world, but it's really on rails, although you have a pretty big deviation you can make in reaching most of your goals...except underground.
 
I played Rage back in the day, and for what it's worth, if you're still wasting time on this game, end it now.

This game is really just Borderlands the way it shouldn't be done, but without a nice backstory and containing about two fun weapons; the rest is just same old shooter crap against a really REALLY bad AI and in an extremely limited game world. The game tries to pull the wool over your eyes to hide a terrible lack of content. Not to mention the narrative, which doesn't try to take itself seriously but also completely fails at presenting, well, pretty much anything at all.

And in all honesty Motorstorm (2006) has a much better presentation of badlands and desert than Rage does. So much for ID Tech.

Sorry mate, Rage is just a steaming pile. Avoid at any cost
Bait or not, I'm gonna bite on your post because it's just so wrong.


As I already wrote in my previous post, it's not Borderlands at all; it's not an open-world adventure game, nor is it an RPG, just a good old "id shooter". This is a certain game type you obviously can't appreciate, which is ofc a choice you are free to make. Doom or Wolfenstein had the very same kind of "simple" story and stupid enemies, and people loved those games as well. This is a game type where you're supposed to crunch through level after level while having great fun with weapons and bullets (preferably on nightmare if you are not some slow-brained console gamer, or if you don't want to get bored).

Rage has and had many problems (especially with the drivers, when AMD simply released the wrong one on launch day, which broke the engine). It was developed for too long, and realtime 3D engines advanced a lot during the delay, but still the world they created is pretty impressive (imho) and it runs very well even on low end systems. While the story/design sucks indeed (but that's true for any id game since Romero left tbh), the control is still fast and snappy, the movement is fluid and it's fun to shoot around imo. I bought it for $4, and I finished it on nightmare in one sitting. I bought the DLC for $1.25 later as well and also finished that in one sitting. Was it one of the best games? No, it wasn't at all, and I might never play it again, but it was fun and $6.25 very well spent, no regrets there. Some could argue whether it's 5/10, 6/10 or 7/10 or more, but it's nothing like 2/10 or crap like that, so it's certainly pretty far from a "steaming pile".
 
Rage gave new definition to rage quit.

To me it was a mixed bag. The AI was pretty good the way they would craftily weave and dodge, and it had some decent missions and levels, but there was a lot of disappointment.

The races I found to be pretty lackluster and repetitive. The megatextures limited a lot of features, static sky, no weather effects, fewer interactive doors, etc.

I also felt the graphics clearly didn't benefit from megatextures. There were some decent distant vista views, but for the most part megatextures = tradeoffs to the consumer, and only faster production time for the devs.

There clearly were visual before-and-after comparisons that did not live up to the pre-launch hype. Plus the game's world felt very smallish for a post-apocalyptic setting. Worse yet, you actually had to get an expanded underground via DLC. That should have been included in the game.

Perhaps the biggest letdown was the ending, which was horrible. For a game that took them so long to make, I def expected a better experience overall. John Carmack's lengthy keynote speech following Rage was full of apologies. It's not surprising that shortly thereafter he left game development to go to Oculus.

I'm kinda hoping Mad Max will fill the huge void that Rage left. For the most part it was a very disappointing and forgettable game.
 
I bought RAGE when it came out back in 2011, but like so many other games I've got, I tend to take a long time to getting round to playing them, what with so many other things competing for my time. This time it's RAGE's turn.

It probably still runs best on NV. ATI/AMD couldn't even run the game properly for about 4 months after release. It was stunning mismanagement at AMD. I don't think AMD bothered to continue optimization because Wolfenstein The New Order performance tests I've seen still show NV generally ahead.

-GPU transcode is NV only because it uses CUDA. It may not be a tangible benefit with modern CPUs anyway. Wolfenstein doesn't use it IIRC.
-VSync in idTech5 is "adaptive", meaning it disables if the engine thinks the framerate won't be 60 fps. Disable the game vsync and use your video card control panel vsync instead if you don't like this. The consoles use adaptive vsync and even adaptive resolution to keep framerate at 60, IIRC.
-60fps limit has been around since idTech4. For some reason that's what id chose to sync the various parts of their engine to.
-SLI is not necessary for this game. It isn't demanding enough to bother any recent video card. I used to run 2720x1536 on a 560 Ti and get 60fps most of the time.
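The "adaptive" vsync behaviour mentioned above can be modelled very roughly. This sketch is my own guess at the shape of the decision logic, not engine code; the function name and threshold are assumptions:

```python
def adaptive_vsync_on(frame_time_ms, target_hz=60):
    """Rough model of adaptive vsync: stay synced only while the renderer
    is meeting its frame budget; otherwise drop vsync (and accept tearing)
    rather than halve the effective framerate on every missed sync."""
    budget_ms = 1000.0 / target_hz  # ~16.7ms at 60Hz
    return frame_time_ms <= budget_ms

print(adaptive_vsync_on(14.0))  # True  -> fast enough, sync to refresh
print(adaptive_vsync_on(22.0))  # False -> tear instead of stuttering at 30fps
```

This is why overriding it with the driver's plain vsync behaves differently: the driver version always waits for the refresh, even when that drops the framerate to a divisor of the refresh rate.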


I wouldn't bother with any mystical config tweaks to get more detailed visuals because they probably don't do anything. The megatexture is simply rather low resolution, in order to keep data size reasonable. Notice the game is still huge for the day. Megatexture is neat but it has a data storage aspect that more traditional games seem to avoid with their repeating textures. There was a patch that added a texture "noise" feature to give the perception of higher detail but I thought it looked awful.

Another "interesting" quirk with megatexture is it limited maximum anisotropic filtering to 4x.
 
A tip once you do get it working, the combat shotgun is the best gun IMO. Especially with the rocket launcher ammunition.

I loved the graphics. I also played the DLC which was like $2 when I bought it and IIRC it added about 3 hours of content.
 
I played through this game back at release and had a blast.
 
Wow, qubit, I'm sorry you're having such a hard time. No, I don't have an answer to SLI or the 120Hz monitor issues. I'm actually glad now I didn't make the move to a 120Hz monitor yet.

Thanks man.

Yes, it's stupid that it doesn't support SLI, regardless of what graphics load it places on the system. It sounds like it wouldn't be a problem in practice though, if one decent card can keep the framerate up, which seems it can do easily.

Don't let my experience with this one game put you off getting a 120Hz monitor. The difference is nothing short of breathtaking and most games work fine at this rate. Make sure it's a model with a strobing backlight though, which eliminates the motion blur inherent to all LCD displays. I believe BenQ do the best ones for strobing displays as they're not tied to NVIDIA. UT2004, which I still play (and talk about!) a lot, is amazing at 120Hz, with the low input lag really making the game fast, twitchy and a lot more fun as you can aim and shoot much more accurately.

Maybe: http://www.dsogaming.com/news/here-...-framerate-in-rage-wolfenstein-the-new-order/

toggle com_synctotime

If it speeds up:

r_syncatendframe


Remind me not to buy a 120 Hz monitor.

Thanks, I'd already seen that website and tried that, but it made no difference - they'd also missed a couple of important details on how to actually implement the commands that I figured out from other websites. Perhaps I missed something though and will try again. I actually have several tabs still open with help and hints and tips pages for this game when I was trying to sort it out.

It's not idtech5, it's retarded management (probably gone by now). Doom3 BFG Edition uses the very same idtech5 and does 120FPS from the options just fine. I heard stories about when Brian Harris (at Nvidia now) added motion blur to Doom3-BFG for the PC version, there was yelling and big drama in the office because QA wasn't notified... so you can imagine what kind of "PC gamers" ran the show there.

The engine runs fine on an Xbox 360, so I have no idea why you would need SLI tbh. It runs fluid on a G92 GPU; even two of those would be overkill. AMD GPUs are a different story sadly. Rage is a fun little shooter if you don't expect anything "deep" like Borderlands. I have no idea why they never released a high resolution version for the PC; I personally wouldn't care if it were 200GB if I could see that world's megatexture at 4 times the resolution on the PC. The weapons feel really good and the movement is still done like nobody else can do it......

.....but yes, if you would look at things from a PC gaming enthusiast's perspective, then OP's rant is quite valid I'm afraid, and their last hope is not to f**k up the new Doom or they are done.
So, office politics screwed up the game and the framerate in particular. What's new? :rolleyes: Still, you'd think such a basic problem would be fixed after all this time with a little patch, wouldn't you?

For SLI, see my reply to rtwjunkie, above. Let's hope the latest Doom works properly.

why would 60fps judder on 120hz? it should be perfect, it's aligned to the refresh

no need to make carmack a scapegoat, rage came out when bethesda/zenimax owned id, carmack was on his way out & eventually went to oculus, it's not like he's the only programmer manually making the engine+cuda detector+vsync+etc etc...

same for sli with today's cards since it's not some ultra demanding title (yes it's a curiosity, no it's not the only engine or game that cant have sli/cf)

funny thing about framerates & game speed or physics... remember doom3? same thing, i wonder why... also ETQW oh man, 30fps menus without a hardware mouse cursor, ouch

Yeah, you do see the judder and it's not game specific, either. Whenever the same frame is shown twice or more, judder can be seen. This is why 24fps movies shown at the cinema and on TV judder, since each frame is displayed twice. You're literally seeing the animation move, stop, move, stop... etc. The only way to have smooth motion is for every frame to be shown once per monitor refresh.

This is achieved with gaming by locking to vsync and having no dropped frames. The latest adaptive sync technology is an improvement on that, but it does basically the same thing to eliminate judder: show the frame only once.

Same answer as above, for SLI.

It probably still runs best on NV. ATI/AMD couldn't even run the game properly for about 4 months after release. It was stunning mismanagement at AMD. I don't think AMD bothered to continue optimization because Wolfenstein The New Order performance tests I've seen still show NV generally ahead.

-GPU transcode is NV only because it uses CUDA. It may not be a tangible benefit with modern CPUs anyway. Wolfenstein doesn't use it IIRC.
-VSync in idTech5 is "adaptive", meaning it disables if the engine thinks the framerate won't be 60 fps. Disable the game vsync and use your video card control panel vsync instead if you don't like this. The consoles use adaptive vsync and even adaptive resolution to keep framerate at 60, IIRC.
-60fps limit has been around since idTech4. For some reason that's what id chose to sync the various parts of their engine to.
-SLI is not necessary for this game. It isn't demanding enough to bother any recent video card. I used to run 2720x1536 on a 560 Ti and get 60fps most of the time.


I wouldn't bother with any mystical config tweaks to get more detailed visuals because they probably don't do anything. The megatexture is simply rather low resolution, in order to keep data size reasonable. Notice the game is still huge for the day. Megatexture is neat but it has a data storage aspect that more traditional games seem to avoid with their repeating textures. There was a patch that added a texture "noise" feature to give the perception of higher detail but I thought it looked awful.

Another "interesting" quirk with megatexture is it limited maximum anisotropic filtering to 4x.
That sounds like a royal fuckup at AMD. I think they're somewhat better nowadays though.

If the GPU transcode is an NVIDIA tech, then it shouldn't be disabled for my cards. It looks to me like the game just disables it whenever it doesn't recognize the cards. Instead, it would have been easy to identify them as later NVIDIA cards, even if the game had no knowledge of their specifics, and just enable the feature, at least with a warning about potential problems.

According to Ikaruga above, this engine can also do 120fps, so it's just been intentionally disabled for an unknown reason. I saw on one of the sites a suggestion that physics didn't work properly at higher frame rates, which might be the reason. It's still not a good reason, though. Just fix the engine rather than forcing such an awful workaround.

Again, re SLI, same answer as above.
 
So, office politics screwed up the game and the framerate in particular. What's new? Still, you'd think such a basic problem would be fixed after all this time with a little patch, wouldn't you?
But it should work, to be honest. I have several mates who have the same machine as you and they had no problem (I actually asked for you; they had a 2600K, but that shouldn't make a difference). It might be a driver version thingy.


For SLI, see my reply to rtwjunkie, above. Let's hope the latest Doom works properly.
I understand, but you really don't need SLI for this game. If it were an 8400GS SLI setup that would perhaps make sense, but I wonder how many people have such a setup. Just think about it: this engine runs 1080p 60fps on the Xbox 360 with the same assets. This is not your average optimization you used to see with other PC engines, and SLI makes no difference.


Yeah, you do see the judder and it's not game specific, either. Whenever the same frame is shown twice or more, judder can be seen. This is why 24fps movies shown at the cinema and on TV judder, since each frame is displayed twice. You're literally seeing the animation move, stop, move, stop... etc. The only way to have smooth motion is for every frame to be shown once per monitor refresh.
Another indication that there is something "wrong" with your setup there. 60fps on a 120Hz monitor should be smooth without any kind of judder! The only difference you should notice is that you see each frame for twice as long, so for 16.667ms instead of 8.333ms. I watch movies at a (3x24) 72Hz custom resolution on my monitor and there is zero judder with any kind of judder test.


According to Ikaruga above, this engine can also do 120fps, so it's just been intentionally disabled for an unknown reason. I saw on one of the sites a suggestion that physics didn't work properly at higher frame rates, which might be the reason. It's still not a good reason, though. Just fix the engine rather than forcing such an awful workaround.
You are right, it's annoying that you need unofficial tweaks and whatnot to get more fps out of it, but iirc id did field tests with gamers and decided that 99% of players are satisfied with 60fps gameplay on the PC, and since you can't go above that on the consoles, they set it as a "goal". You can imagine that as a Doom and Quake fan I also find that very wrong, but since Doom3-BFG can run at more fps, it's more of a stupid design choice than a limitation.
If I were to play devil's advocate here (I don't really want to, since fps is more important, but let's try), I would remind you that idtech5 runs the control polling and everything else at 60fps too, and that's really awesome; it's quite refreshing when other AAA titles run the same parts at 20-25Hz at best, and even worse, some run those at 10-15Hz, which is handheld territory really.


I hope that your problems will go away with a new driver in the future, and you will be able to play the game one day.
 
Bait or not, I'm gonna bite on your post because it's just so wrong.

" I finished it on nightmare in one sitting"

Hate to break it to you, but Rage pretended to be much more than an id shooter like Doom, right? It has vehicles, it has an 'open world' (it was even marketed as such!), and then you start playing the actual game and discover the whole thing is a glorified on-rails shooterfest with very little meat on its bones. I mean, it doesn't even have the puzzle elements that Doom and Quake had. You just enter a map after driving around for a minute or two, you clear that map, and repeat that shit a few times and... 'oh. the game is finished'. And all this, in 2011, with a horrible AI to fight against. It could have been literally ported from legacy Doom.

In my book, the actual price of a game doesn't really have anything to do with the quality of the title. Price changes, quality (or lack thereof) remains. To me, Rage is a game with so many strange design choices on so many levels that you start questioning the sanity of its developer.
 
I love this game. It makes you angry. I like that.
 
Hate to break it to you, but Rage pretended to be much more than an id shooter like Doom, right? It has vehicles, it has an 'open world' (it was even marketed as such!), and then you start playing the actual game and discover the whole thing is a glorified on-rails shooterfest with very little meat on its bones. I mean, it doesn't even have the puzzle elements that Doom and Quake had. You just enter a map after driving around for a minute or two, you clear that map, and repeat that shit a few times and... 'oh. the game is finished'. And all this, in 2011, with a horrible AI to fight against. It could have been literally ported from legacy Doom.

In my book, the actual price of a game doesn't really have anything to do with the quality of the title. Price changes, quality (or lack thereof) remains. To me, Rage is a game with so many strange design choices on so many levels that you start questioning the sanity of its developer.
I did not debate or question your personal preference; as I said, you are ofc free to arrive at any conclusion you want. All I said is that it's not a "steaming pile", and you are definitely wrong about that. I agree that the design sucks on many levels, but it's a shooter (even the official trailer says "the game is easy, kill or be killed"), and the devs told the press several times that they were perplexed about people comparing it to Borderlands or Fallout 3, because they did not make such a game (Carmack said several times before launch that idtech5 can't handle open-world games, and that's the main reason why Skyrim didn't use the engine).
I personally think it looks great by 2011 standards, and the artists did a nice job imo, but again, you are free to think otherwise, just be objective and fair. The vast majority of professional reviewers (not including jokes like IGN ofc) agree on this, so it's definitely not a 2/10 or 3/10 "steaming pile", that's all I'm saying.
 
Yeah, you do see the judder and it's not game specific, either. Whenever the same frame is shown twice or more, judder can be seen. This is why 24fps movies shown at the cinema and on TV judder, since each frame is displayed twice. You're literally seeing the animation move, stop, move, stop... etc. The only way to have smooth motion is for every frame to be shown once per monitor refresh.
...
That sounds like a royal fuckup at AMD. I think they're somewhat better nowadays though.
that is not why movies judder... 24fps on 30 or 60 judders because it's not aligned, that's what judder means, misaligned movement that skips every X frames, it does not mean low framerate

so 60fps on 120hz cannot judder, double buffered vsync does not judder, the framerate is steady as long as it's a multiple of the refresh (30 on 60hz, 40 on 120hz, etc)
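that alignment argument is easy to check numerically. a small sketch (my own toy, nothing from the engine or drivers) counting how many refresh cycles each rendered frame occupies under vsync; uniform counts mean smooth pacing, alternating counts mean judder:

```python
def repeats_per_frame(fps, refresh_hz, frames=6):
    """Refresh-cycle counts assigned to each of the first `frames` frames,
    assuming vsync shows each frame until the next one's slot begins."""
    counts = []
    for i in range(frames):
        start = int(i * refresh_hz / fps)        # refresh cycle where frame i begins
        end = int((i + 1) * refresh_hz / fps)    # refresh cycle where frame i+1 begins
        counts.append(end - start)
    return counts

print(repeats_per_frame(60, 120))  # [2, 2, 2, 2, 2, 2] -> every frame shown twice: smooth
print(repeats_per_frame(24, 60))   # [2, 3, 2, 3, 2, 3] -> 3:2 pulldown: judder
print(repeats_per_frame(24, 72))   # [3, 3, 3, 3, 3, 3] -> smooth, since 72 is exactly 3x24
```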

that said, the game engine or drivers could suck... in rochard for example, i am seeing some kind of microstutter even at 60 vsync on my 570m, not sure why

as for AMD, i have heard that bethesda or id had bad feelings towards them due to some point in the ATI days where a pre-release game build leaked out via the driver team
 
If the GPU transcode is an NVIDIA tech, then it shouldn't be disabled for my cards. It looks to me like the game just disables it whenever it doesn't recognize the cards. Instead, it would have been easy to identify them as later NVIDIA cards, even if the game had no knowledge of their specifics, and just enable the feature, at least with a warning about potential problems.

According to Ikaruga above, this engine can also do 120fps, so it's just been intentionally disabled for an unknown reason. I saw on one of the sites a suggestion that physics didn't work properly at higher frame rates, which might be the reason. It's still not a good reason, though. Just fix the engine rather than forcing such an awful workaround.
Doom3/idtech4 can be run at >60fps too, but it does mess up parts of the engine. These engines are just synced to 60Hz by design. It's not uncommon to see this sort of thing, but it is admittedly more rigid than usual. Usually you'll have stuff like Bioshock or Halo with animations or some effects remaining at a lower speed like 30fps.
 