Tuesday, December 27th 2022

Alienware 500 Hz Gaming Monitor Leaks Ahead of CES Reveal

Based on a leak on Twitter, Alienware is planning to announce a 500 Hz-capable gaming monitor at CES. According to the leaker, the new monitor will carry the AW2524H model name, suggesting it's a 25-inch monitor. The general tech specs aren't likely to impress most people, as the display is, as one would expect, only a 1080p panel, although it's at least a fast IPS panel rather than a TN panel.

The AW2524H delivers 480 Hz natively over DisplayPort, but has an OC setting that lets it reach 500 Hz. Based on the leaked picture of the rear of the display, it'll have some RGB elements, as well as a height-adjustable stand, which most likely also allows the display to be rotated. The leaker didn't provide any pricing, but expect this to be a very expensive 1080p display.
Source: @g01d3nm4ng0 on Twitter

98 Comments on Alienware 500 Hz Gaming Monitor Leaks Ahead of CES Reveal

#51
InfernalAI
evernessinceAny 1440p or 4K LCD panel is going to beat a 1080p plasma in regards to image quality. I'm not sure what IPS or TN display you are comparing it to, but I don't need to see a plasma display to know that more pixels is going to result in higher image quality. On top of that, plasma TVs are required to have a glass front, which means glare is going to be hella bad, and the limited max brightness means they couldn't be used in every environment.

Another area of image clarity where plasmas are at a disadvantage is response times and refresh rate. The way plasma TVs generate different shades is by quickly turning subpixels on and off. By modulating the rate at which the subpixel is on or off, you change the perceived shade: to display a darker shade of blue, the plasma would just flicker the blue subpixel at a slower rate. This is where the "600 Hz" figure comes from. It's measuring the number of times a pixel and its subpixels can be flickered within a given refresh interval; in effect, each flicker is a portion of a pixel/subpixel transition. Most plasmas operate at a refresh rate of 60 Hz and would flicker 10 times each frame, thus you get your marketing "600 Hz".

Mind you, plasma displays have a minimum pixel transition time of 5 ms, as this is how long the phosphor takes to decay. On top of that, the pixel response time profile of plasma TVs is sub-optimal. Many LCD panels tend to push the pixels to the desired values early in the transition, sometimes at the cost of overshoot. This drastically reduces the time pixel transitions take, and many displays have settings that let users tune pixel response time (often called overdrive). This faster response time means that high-motion games and movies will be clearer.
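
For anyone curious, the arithmetic behind that marketing figure is simple. Here's a minimal Python sketch using the typical numbers quoted above; the values are illustrative, not measurements of any specific panel:
[CODE]
# Rough arithmetic behind the plasma "600 Hz" marketing figure.
# Actual sub-field drive schemes vary by model; these are the typical
# numbers quoted above, not measurements.
refresh_rate_hz = 60          # typical plasma panel refresh rate
subfield_flashes_hz = 600     # advertised "sub-field drive" figure
subfields_per_frame = subfield_flashes_hz / refresh_rate_hz
print(f"Sub-field flashes per frame: {subfields_per_frame:.0f}")  # 10

# Compare the ~5 ms phosphor decay floor with an LCD frame budget at 144 Hz
phosphor_decay_ms = 5.0
frame_budget_144hz_ms = 1000 / 144
print(f"Phosphor decay floor: {phosphor_decay_ms} ms vs 144 Hz frame budget: {frame_budget_144hz_ms:.1f} ms")
[/CODE]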

IPS panels are the go-to for color accuracy, and OLEDs are a close second (OLEDs do have the best contrast by far, though). You are likely referring to contrast, which is where plasmas are pretty good. That said, VA LCD panels typically achieve about the same contrast or better without the drawbacks of plasma, and with all the advantages of modern LCDs. Samsung's new Odyssey monitors are a good example of that.

As I pointed out earlier, even reviewers at the time were not blown away compared to existing products on the market. Your experience is noted, but that doesn't automatically dismiss the conclusions from existing reviews.

OLEDs have similar brightness issues to plasma, although fewer burn-in issues. That said, OLED burn-in is often permanent, but modern panels have done a good job of reducing the chance of it happening.

OLEDs have vastly superior contrast, as they can completely turn off pixels, whereas plasma pixels still retain some light. They can display a wider range of colors, have a higher peak brightness, and have better color accuracy. OLEDs are not as efficient as LCDs, but they are more efficient than plasma TVs.

Not really of importance to end consumers, but maybe another reason they shouldn't catch on: plasma TVs emit a lot of radio-frequency interference. We're talking up to half a mile away.
What I am referring to as image quality are the colors and the depth they give to the image. None of the LCD monitors I have are able to get close to my plasma in that regard. This may be the contrast advantage that the plasma has, and you noted VAs, which I don't use because I cannot stand image ghosting, and VAs are known to not have as good pixel response times as TN or IPS. But the new IPS panel I got also doesn't stack up to my plasma in terms of color; the IPS glow and backlight bleed are there and noticeable, and create an uneven image in terms of color and contrast. I'm sorry to say, but plasma has a lot of advantages that neither TN, IPS, nor VA are able to replicate. OLED is the only one that seems it might be able to match it, but as I said, I don't have one, so I can't say myself at the moment.

As I said, the 1080p resolution is the only downside, but that's nothing unique to plasma, because the point is that it will look better than any 1080p LCD. And yes, in many instances, like looking out at a beautiful sunset in Skyrim, the plasma actually makes that scene more beautiful than my 1440p TN does. The IPS gets close in terms of color, but the unevenness and the lack of deep blacks simply make the image lack in certain ways. This is why I have made a point to myself that I won't get any new monitor unless it's an OLED, or maybe mini-LED and the like, though I don't know how well those will stack up to the image I'm looking for. The plasma TV I have isn't perfect, but there is something to the picture that I just have not seen in any of the LCDs I have ever had or used.
Posted on Reply
#52
WhoDecidedThat
64KWhy the hostility?
Just poking fun at all the people arguing the human eye can't see beyond xyz fps :p

Obviously the human eye can see beyond 2 fps... it's the exaggeration that makes it funny
Posted on Reply
#53
AsRock
TPU addict
NRANMPretty much this.

I've had my 165Hz monitor for a few months now and I fail to see the attraction of the "high refresh rate". It is actually most noticeable when moving the cursor and dragging windows. In game, I've been able to see a difference only in Doom, and even then it wasn't huge. What I mean is, high refresh rate (120 and up) is pleasant and nice to have, but by no means essential to the gaming experience. If anything, it's a bit of a liability: it increases requirements towards the hardware, causes more coil whine, and potentially exposes various driver issues, especially when mismatching different monitors with different refresh rates (I'm looking at you, AMD!); all this for an increase in framerate that I'm barely noticing.
But to the manufacturer it's about keeping the cycle alive: higher resolutions need faster video cards, which then need higher refresh rates. It's the same thing they do with phones, and basically there are too many people who don't realize they're being taken for a ride.

Then at some point they'll bitch about pollution, which let's be honest is just there to make a buck too and push the price up. Same kind of shit with the eco crap.
Posted on Reply
#54
kapone32
sephiroth117That's maybe something for the 0.01% of competitive gamers who can actually use this... (and of course not those who pretend they are competitive but suck at 144 and 500 Hz alike).

I'd trade 250-300 Hz for a resolution bump to 1440p and OLED/mini-LED anytime... after all, OLED gaming monitors are coming en masse to CES and they have insane response times.
I'm done using outdated panels with mediocre light bleed; my long-awaited Q1 2023 upgrade is an OLED or mini-LED, period.
Don't be surprised to see these at the big-money video game tournaments. I can even see that community having the same argument that happened when 120 Hz became common enough to challenge and replace 60 Hz. I was one of those 60 Hz hangers-on (thanks to my console days), until I played The Division with my new 120 Hz monitor and could not believe that I could use the machine gun with a scope to take out enemies. At 60 Hz it was a shakefest just using it. That convinced me, but when I see legacy games running at these frame rates (PUBG, Sleeping Dogs), it remains to be seen if there is a noticeable difference.

I hear you on the panel technology that is available today. It is to the point where I watch my TV service through my smart TV app. I just can't do 720p after watching Disney Plus.
AsRockBut to the manufacturer it's about keeping the cycle alive: higher resolutions need faster video cards, which then need higher refresh rates. It's the same thing they do with phones, and basically there are too many people who don't realize they're being taken for a ride.

Then at some point they'll bitch about pollution, which let's be honest is just there to make a buck too and push the price up. Same kind of shit with the eco crap.
EA Play allows you to play Madden on PC. It is locked at 30 FPS. You can literally make a pass on every play and the game feels like molasses.
Posted on Reply
#55
ZoneDymo
Bomby569the question isn't so much what you deem, that's subjective af, but does it really matter at all or is it just pointless, like the mouse DPI mentioned.
again, up to the individual to decide instead of just pre-hating or pre-loving it.
qubitI wonder what kind of graphics card it takes to drive it at 500Hz just on the desktop.
depends on the game/settings.
DJ_CasSame story as with Mouse DPIs
I mean, mouse DPI matters as we go up in resolution; old mice can barely be used because they are so slow, and artificially upping the speed is not that great. But again, if individuals like higher DPI... well, that's fine, enjoy the progress in that area, I won't get upset over it.
Posted on Reply
#56
Bomby569
ZoneDymoagain, up to the individual to decide instead of just pre-hating or pre-loving it.
the individual can decide to jump off a bridge, but that's not what matters here; we should discuss whether jumping off a bridge is a good idea or not. We are talking about the tech, not what any individual wants to do.
Posted on Reply
#57
qubit
Overclocked quantum bit
ZoneDymodepends on the game/settings.
No it doesn't, because I'm talking about just generating a 500 Hz picture for the desktop. Lower-end, previous-gen cards may not be able to do it.
Posted on Reply
#58
WhoDecidedThat
NRANMIn game, I've been able to see a difference only in Doom, and even then it wasn't huge.
I agree with Doom (Eternal) being the only game in which I've noticed a difference but the difference, to me, was huge.

Going from 60 Hz to 144 Hz gave me a significant difference in fluidity and responsiveness. Switching back felt like a very noticeable downgrade.

For every other game in my library, going above 60 Hz has been like - the animation is a bit smoother... that's it?

Even with Doom (Eternal), I don't see it benefitting significantly from more than 120 Hz.
Posted on Reply
#59
Veseleil
64KWhy the hostility?
Probably because of all the ignorance, hate and malice on display. It's a product aimed at a niche market. You won't see me comment on a $10,000 headphones article in a similar fashion, and surely not with an automatic assumption that they are stupid and unnecessary. Why? Because I know there are people who will find the money and a proper use for them.
I'm not a competitive FPS gamer, but let's just pretend I am. Apex Legends has an fps cap of 300, and I have the GPU power to run the game at that fps at 1080p. I want a monitor that can translate each of those frames without any loss (skipped frames). I want things that may give an edge to my already top-of-the-line performance. Does that make me a fool, a gimmick sucker? SMH.
Posted on Reply
#60
evernessince
InfernalAIWhat I am referring to as image quality are the colors and the depth they give to the image. None of the LCD monitors I have are able to get close to my plasma in that regard. This may be the contrast advantage that the plasma has, and you noted VAs, which I don't use because I cannot stand image ghosting, and VAs are known to not have as good pixel response times as TN or IPS. But the new IPS panel I got also doesn't stack up to my plasma in terms of color; the IPS glow and backlight bleed are there and noticeable, and create an uneven image in terms of color and contrast. I'm sorry to say, but plasma has a lot of advantages that neither TN, IPS, nor VA are able to replicate. OLED is the only one that seems it might be able to match it, but as I said, I don't have one, so I can't say myself at the moment.
Samsung's high-end VA panels do not have black smearing (the ghosting you are referring to), in case you are interested in buying those. Pixel response for VAs was only poor during dark transitions, but it seems that disadvantage is going away given Samsung has cracked it.

You should RMA your unit if uniformity / backlight bleed is bad. That is not normal nor acceptable.

You could also choose to get an IPS panel with a mini-LED backlight. This will vastly exceed the contrast ratio of a Plasma TV. Alternatively you can get a good OLED TV or monitor at a reasonable price nowadays.

You say plasma has many advantages, but any LCD panel with a mini-LED backlight is going to beat it in all respects.

Heck, even comparing my 120 Hz OLED phone to my 240 Hz 1440p Acer Predator, the latest IPS panels have definitely stepped it up when it comes to contrast and color. The only time the OLED wins is in dark scenes; otherwise the color presentation is better on my IPS monitor.
InfernalAIAs I said, the 1080p resolution is the only downside, but that's nothing unique to plasma, because the point is that it will look better than any 1080p LCD. And yes, in many instances, like looking out at a beautiful sunset in Skyrim, the plasma actually makes that scene more beautiful than my 1440p TN does. The IPS gets close in terms of color, but the unevenness and the lack of deep blacks simply make the image lack in certain ways. This is why I have made a point to myself that I won't get any new monitor unless it's an OLED, or maybe mini-LED and the like, though I don't know how well those will stack up to the image I'm looking for. The plasma TV I have isn't perfect, but there is something to the picture that I just have not seen in any of the LCDs I have ever had or used.
Mini-LED is still LCD-based; it just adds local dimming. It'll get you the picture you are after, which is higher contrast.
Posted on Reply
#61
lightning70
To tell the truth, it's a monitor that wouldn't make sense without an RTX 4090. I think 240 Hz is fine.
Posted on Reply
#62
Chrispy_
Ugh, such snake oil.

If they'd said "OLED" I'd have been fine with this - a stupid-expensive 500Hz panel for eSports gamers with deep pockets.

But 500Hz is far beyond the capabilities of IPS technology, which is already struggling to get 100% refresh compliance above 144Hz without severe overshoot making the image quality useless.

Yes, 240Hz IPS monitors do exist, but no reviewers who do detailed pixel response testing are recommending them over the good 165-180Hz IPS monitors, because the extra refresh rate buys you only additional blur trails and/or insane overshoot ruining the image. Not even the best 180Hz IPS panels are overshoot-free, but 180Hz is close enough to the true capability of IPS that it doesn't turn the image into a complete mess - overshoot is within 25% or so, and refresh compliance is above 75%.

If I had to guess, a 500Hz IPS panel will have overshoot exceeding 200% for many transitions, and refresh compliance will be under 10%. If you don't understand what that means, it means the 500Hz claim is a total, easily provable lie: it accepts frames from a GPU at 500Hz, but the output result of each frame isn't even remotely close to what the GPU sent it.
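
If the overshoot percentages are unfamiliar: one common way review sites express overshoot is how far a pixel shoots past its target, relative to the size of the intended transition (exact definitions vary between sites). A minimal Python sketch with made-up RGB values, with a hypothetical helper name:
[CODE]
# One common way to express overdrive overshoot as a percentage of the
# intended transition. Definitions vary between review sites; the RGB
# values below are made up purely for illustration.
def overshoot_pct(start, target, peak):
    return 100 * (peak - target) / abs(target - start)

# A grey-to-grey step from RGB 50 to RGB 150 that briefly peaks at RGB 200
print(f"{overshoot_pct(50, 150, 200):.0f}% overshoot")  # 50%
[/CODE]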
Posted on Reply
#63
Steevo
I think we used to have a reaction time thread somewhere around here, where some special people were too fast for their lagging (insert hardware here) and only had the same average reaction time due to the hardware.
Posted on Reply
#64
RoutedScripter
I'm back again... why not just slightly blur the front-page preview of such articles, with a spoiler warning, enough for me to glance away and for others to see their snotty leaks? TPU could be the first, and I think other media could follow suit or at least commend it. I do not believe it would take that much traffic away. I already wrote a whole wall of text on this months ago, if not last year, so I'll skip that on this occasion of course.
Even an automated overlay that de-spoilers after 3-5 seconds, with a big enough countdown indication and some fancy border effects, would be fine, without leak-hungry users needing to click anything; those that are aware would look away. I'd still keep visiting and talking about other stuff on TPU even if all the news in a day were spoilers.

Or perhaps blur just the title, or a hybrid of the title and preview text, but excepting the first 3 characters of the title or similar, or minus the first word of the title.

Alienware
500 kHz Rec.2100 HDR-10K Gaming Monitor Leaks Ahead of CEESS Reveal, to be bundled with Call of Duty 2 Remastered, releasing on Windows XP2
Posted on Reply
#65
kapone32
lexluthermiesterYeah, it doesn't work that way. Most games sync over networks at a rate slower than display refresh rates anyway. But even if some of them are faster, the difference between 1/240th of a second and 1/500th of a second to the human eye is imperceptible. It is mathematically significant, but insignificant to the human eye.


That's a configuration error on your part. Locking vsync without first setting the default refresh rate will result in the problem you described. You must first set your default refresh rate and THEN set vsync. Whether you do it in the Windows display settings or the driver control panel is up to you.
My default rate was already set to 165 Hz.
Posted on Reply
#66
qubit
Overclocked quantum bit
Reading these comments, it surprises me just how controversial something as simple as a very high refresh rate monitor can be.

I say wait for the reviews before passing judgement on it, and I hope TPU reviews it. In particular, to those pointing out the artefacts on current high refresh rate monitors and saying it's gonna look like crap on this one due to overshoot: you don't actually know that, since it's quite possible that they've successfully addressed those issues in such an extreme refresh rate monitor. Again, let's wait for reviews before making assumptions about its performance.
Posted on Reply
#67
Chrispy_
qubitReading these comments, it surprises me just how controversial something as simple as a very high refresh rate monitor can be.
Because very few monitors can achieve good results at the optimistic refresh rate they claim.

VA and IPS technology has reached a stage where 75Hz "normal" monitors typically finish most or all of their pixel transitions before the next frame arrives in 13.3ms. It is a 75Hz display not just because the signal is fed to it 75 times a second, but because it can draw the whole frame within 1/75 of a second. At even 'just' 144Hz, the next frame arrives after 6.9ms, and the problem with both VA and IPS is that many transitions take more than 10ms, meaning that you never actually see all of the extra frame that your expensive GPU just spent your money making. They claim 144Hz, but in reality the pixels themselves are only capable of fully drawing a frame in 1/90th of a second. With aggressive overdrive, maybe some of these 'bad' 144Hz displays can draw 75% of the frame in 1/144th of a second, but the remaining 25% of the frame will either still be changing from the previous frame, or will have completely overshot the mark and be displaying something utterly unwanted and distracting.
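
A quick back-of-the-envelope version of that frame-budget argument, as a minimal Python sketch (the 10 ms worst-case transition is the illustrative figure used above, not a measurement of any specific panel):
[CODE]
# Frame-time budget per refresh rate vs. an assumed 10 ms worst-case
# pixel transition. The 10 ms figure is illustrative, as discussed above.
WORST_TRANSITION_MS = 10.0

for hz in (75, 144, 240, 500):
    budget_ms = 1000 / hz
    fits = "finishes" if WORST_TRANSITION_MS <= budget_ms else "does NOT finish"
    print(f"{hz:>3} Hz -> {budget_ms:5.1f} ms per frame; a 10 ms transition {fits} in time")
[/CODE]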

There is a very tangible benefit from moving beyond 60-75Hz non-gaming monitors. I know sensitivity varies from person to person, but for me the point at which individual frames turn into motion is about 50Hz, depending on how fast the content is moving, so I consider 60Hz smooth (and do a lot of gaming at 4K60). But true fluidity, the point at which higher refresh rates stop mattering* to me, is at about 105Hz.

[INDENT]*[/INDENT]
[INDENT]All of that last paragraph was based on CRT testing and more recently OLED+black-frame-insertion, which is the most CRT-like experience you can get these days, so for me (and I'm reasonably typical, I think) a good, strobing, 100% refresh compliant display at 105Hz is the point of diminishing returns. Yes, I can see some very minor gains from 120Hz to 144Hz, to 165Hz, but on a 240Hz monitor, I can barely perceive any difference at all between 240Hz and 144Hz unless trying to read fast-scrolling labels/text.[/INDENT]

The higher refresh rates serve one primary function beyond diminishing returns - and that's a reduction of sample-and-hold blur. My brain can barely process the extra information it's given at 240Hz, but if my eye is tracking an object moving across the screen, the reduction of the sample-and-hold "wiggle" is very noticeable at higher refresh rates. The caveat to this is that the monitor HAS to have completed its pixel transitions before the next frame arrives, otherwise all of that extra Hz and FPS is wasted.

Personally, I wish manufacturers would work on implementing better backlight strobing and black frame insertion. I would pick a 100Hz monitor with excellent strobing over the 240Hz monitor I currently have, any day of the week, because 100fps gaming is great, it means I don't need an RTX 4090 to reach 240fps, and (provided the pulses are typical 25% duration) I can track moving objects with the clarity I'd have using a hypothetical 100% refresh-compliant 400Hz display. The BFI on my G75T is mediocre, but I still prefer it at fixed 120Hz to 240Hz VRR without it.
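
That strobing trade-off comes down to persistence: perceived motion blur scales roughly with how long each frame stays lit. A minimal Python sketch of the arithmetic, using the 25% pulse duration mentioned above (the helper name and numbers are illustrative only):
[CODE]
# Persistence arithmetic behind "100 Hz strobed ~ hypothetical 400 Hz
# sample-and-hold". Perceived motion blur scales roughly with how long
# each frame stays lit. Numbers are illustrative only.
def persistence_ms(refresh_hz, duty_cycle=1.0):
    return (1000 / refresh_hz) * duty_cycle

print(persistence_ms(400))        # 2.5 ms: 400 Hz sample-and-hold
print(persistence_ms(100, 0.25))  # 2.5 ms: 100 Hz with 25%-duration strobe pulses
print(persistence_ms(240))        # ~4.2 ms: 240 Hz sample-and-hold
[/CODE]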

So, 120 or 144Hz gaming is a big upgrade over 60Hz, but beyond that, people who want higher refresh rates probably think they want higher refresh rates, when in reality they are trying to get smoother motion tracking, which would be better achieved with a good strobing/BFI implementation.
Posted on Reply
#68
TumbleGeorge
WhoDecidedThatJust poking fun at all the people arguing the human eye can't see beyond xyz fps :p

Obviously the human eye can see beyond 2 fps... it's the exaggeration that makes it funny
Someone must invent a way to game with your eyes without using your brain :)
Posted on Reply
#69
Vayra86
Veseleil). I want things that may give an edge to my already top-of-the-line performance. Does that make me a fool, a gimmick sucker? SMH.
Well, you are kind of saying it yourself right here: already top-of-the-line performance. We are solidly in the realm of diminishing returns at 240 Hz already. 360-500 Hz is absolutely in the category of idiocy and placebo benefit.

Everything has its limits. Simple, even if the human psyche disagrees. It's wisdom to be able to see the difference, and we are led by marketing if we fail to do so.
Posted on Reply
#70
qubit
Overclocked quantum bit
@Chrispy_ Very selective quoting there. As I suggested, why don't you wait for the reviews instead of making a judgement about this new monitor based on assumptions drawn from much older and slower designs? That's the objective way to look at it.

As far as where the laws of diminishing returns kick in, it'll be somewhat different for everyone. In my case, I can tell the difference between 120Hz and 144Hz, but it's subtle. I can't speak for higher refresh rates because I haven't seen them, but I suspect that I'll be able to see that too.

Finally, there's a killer reason for really high refresh rates: much reduced motion blur. It's inherent in all sample and hold displays, including OLED, partly because of the way human vision works with averaging out the motion between frames.

There's two ways to reduce this: 1) strobing, 2) increasing the framerate. Doing so at 500Hz makes the difference between them very small and hence much reduced blur. There's an article by Microsoft published a few years ago that explains this all in great detail. However, you can see the effect just by switching between 60/100/120/144Hz and noticing how the motion blur drops each time.

@Vayra86 you posted while I was creating this post. Please see my last paragraph for the benefit. It's absolutely not a placebo or idiocy.
Posted on Reply
#71
TumbleGeorge
qubitblur
What additional reasons are there to have blur in games?
Posted on Reply
#72
Chrispy_
qubit@Chrispy_ Very selective quoting there. As I suggested, why don't you wait for the reviews instead of making a judgement about this new monitor based on assumptions drawn from much older and slower designs? That's the objective way to look at it.
If you believe Alienware has made its own IPS display that is magically somehow 300% better than the panels that AUO, LG.Display, CMO et al. have spent the last two decades fighting for tiny incremental improvements on, then you truly are a more optimistic man than me.

This is only a guess, but I would put money on it: this will be more of the same shit we've seen from 240, 300 and 360Hz IPS. It doesn't work; multiple independent reviews have proved it. IPS itself is the limiting factor, so it's OLED or nothing beyond about 200Hz if you want anything approaching refresh compliance.

Go and read some RTINGS articles or watch HUB videos on the fastest IPS displays of the last two years. We're at the point where the combination of the best panels and the best overdrive algorithms can get 80% compliance in about 4.6ms, and 100% compliance in around 7ms on average. That means that IPS technology is perfect for (1000/7 ≈) 144Hz, and anything beyond that is a compromise of smear and overshoot.
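
Spelling out that last calculation as a minimal Python sketch (the 7 ms figure is the average quoted above, not a measurement of this particular Alienware panel):
[CODE]
# Back-of-the-envelope: the highest refresh rate a panel can fully draw,
# given its average time to reach ~100% refresh compliance. The 7 ms
# figure is the average quoted above, not a measured value for this panel.
avg_full_transition_ms = 7.0
max_compliant_hz = 1000 / avg_full_transition_ms
print(f"~{max_compliant_hz:.0f} Hz")  # ~143 Hz, i.e. roughly a 144 Hz panel
[/CODE]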
Posted on Reply
#73
Vayra86
qubit@Chrispy_ Very selective quoting there. As I suggested, why don't you wait for the reviews instead of making a judgement about this new monitor based on assumptions drawn from much older and slower designs? That's the objective way to look at it.

As far as where the laws of diminishing returns kick in, it'll be somewhat different for everyone. In my case, I can tell the difference between 120Hz and 144Hz, but it's subtle. I can't speak for higher refresh rates because I haven't seen them, but I suspect that I'll be able to see that too.

Finally, there's a killer reason for really high refresh rates: much reduced motion blur. It's inherent in all sample and hold displays, including OLED, partly because of the way human vision works with averaging out the motion between frames.

There's two ways to reduce this: 1) strobing, 2) increasing the framerate. Doing so at 500Hz makes the difference between them very small and hence much reduced blur. There's an article by Microsoft published a few years ago that explains this all in great detail. However, you can see the effect just by switching between 60/100/120/144Hz and noticing how the motion blur drops each time.

@Vayra86 you posted while I was creating this post. Please see my last paragraph for the benefit. It's absolutely not a placebo or idiocy.
Motion blur is generated from two sources: your brain/processing, and sample-and-hold. There is a benefit, I'm not denying that, but diminishing returns are at play. 120-165, sure, we are almost unanimous in seeing that benefit. Maybe 240 would still do as well, but only if you can maintain >200 FPS all the time, and even then it's sacrificing frame time variance (smoothness) for... what exactly? A minute difference in blur? It's not even quantifiable at this point; why is the upper limit more important than stability?

Point is, at a certain point the benefits no longer outweigh the sacrifices. And no, that's not a moving goal either. 120-144, and maybe 165, are more than sufficient to exceed that point. That was true in 1999, and it will be in 2030.

This article, though, isn't even about 240Hz. It's about 500. Similar principles apply to pixel density: 4K at very small diagonals is similarly pointless and just a result of commerce, not sense. Their purpose is as Veseleil accurately describes it: something to buy when you already have top-of-the-line performance.

It's money looking for a purpose, and there is a market for it, like there is for everything, no matter how silly.
Posted on Reply
#74
Chrispy_
qubitFinally, there's a killer reason for really high refresh rates: much reduced motion blur. It's inherent in all sample and hold displays, including OLED, partly because of the way human vision works with averaging out the motion between frames.

There's two ways to reduce this: 1) strobing, 2) increasing the framerate. Doing so at 500Hz makes the difference between them very small and hence much reduced blur. There's an article by Microsoft published a few years ago that explains this all in great detail. However, you can see the effect just by switching between 60/100/120/144Hz and noticing how the motion blur drops each time.
Did you not read the second half of my post? You're literally telling me what I wrote 20 minutes ago, though I guess you may not have F5'd and it was an edit to an older post...
Vayra86This article, though, isn't even about 240Hz. It's about 500.
I think the large number of criticisms in this thread are about 500Hz being a joke when not even 240Hz displays can really do what they claim. There is only one LCD display made to date (the Odyssey G7) that can actually draw a whole frame in 1/240th of a second, and that's with caveats. The fastest IPS panel is actually a 180Hz panel, because although the Viewsonic X2431 refreshes at 240Hz, its pixel transitions are slower.
Posted on Reply
#75
Vayra86
Chrispy_I think the large number of criticisms in this thread are about 500Hz being a joke when not even 240Hz displays can really do what they claim. There is only one LCD display made to date (the Odyssey G7) that can actually draw a whole frame in 1/240th of a second, and that's with caveats.
Yeah, the issues are everywhere: G2G, input frame info, server tick rates, frame time variance... and that's omitting the actual performance of your own PC to begin with. This is pure marketing BS, if not an outright lie.
Posted on Reply