
Old PC Vs new

  • Thread starter Deleted member 24505
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
I do, but like I said, I don't play competitive games, and so far I've yet to run into any issues in single-player games.
They are nicely locked to 74 with no frame time mess either (unless the game itself is a mess).

Both the Min and Max values are set to 74 under Chill, with every kind of sync disabled in-game where possible.

Most of the time I'm also using windowed borderless mode in games since I tend to tab out a lot.
Oh well, it seems that Chill doesn't work well for me.
 
Deleted member 24505

Guest
Just tried Cyberpunk; it runs very well at 1080p. Impressed. First time I have tried it since it released.
 
Joined
Jan 14, 2019
Messages
9,727 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Hardly. Those Athlons are like the last ones so unfortunate as to have no L3 cache and, on top of that, "not real" cores. I doubt that there is any modern CPU which could stutter as much while delivering a playable framerate. The lowest-end CPU that is able to play anything is the refreshed Pentium:

As you can see, it's not able to provide 60 fps in all titles, but it pretty much doesn't stutter.
You're still missing the point. I wanted to explain what I mean by "unstable but high framerate", and not talk about old and dual core CPUs. (the COD and Cyberpunk parts are quite stuttery, btw)

It's personal to me, but the CPU should never limit the graphics card's potential.
That I agree with. A GPU bottleneck gives you lower, but stable framerates, which is more desirable than a stuttery CPU bottleneck.

MP? What kind of FPS are we talking about, btw?
Even if I had the official version of the game, I couldn't be bothered to waste a single second of my life in multiplayer. Sorry. :ohwell:

The FPS was enough for smooth gameplay on both machines. Like I said, I didn't put a counter on, as it would have influenced my judgement. I was looking for stuttering and the overall experience, not performance numbers. Besides, my main computer has a 2070 and the small one has a 1050 Ti in it, so it wouldn't have been a fair comparison anyway.

Edit: Before someone says that my test is totally subjective - yes it is! Life is made up of subjective experiences, and not objective facts. ;)
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I wasn't arguing against that.
Well, either you were, or you were making a pointless platitude about "at some level, performance is unacceptably low", which ... yes. I think it's safe to assume we all know that. And in which case you misunderstood the point of the analogy you were responding to.
Oh well, it seems that Chill doesn't work well for me.
Chill is a bit of a case of a great solution without a matching problem. The idea is superficially great, but flawed in that it assumes that low user input = low framerates are acceptable. To work well, such a system at least needs to account for on-screen movement. After all, if you're camping in a corner with a sniper rifle, it's hardly ideal that the GPU slows you down to 30fps just because you aren't moving. I used Chill for a bit when playing Divinity: Original Sin, which on the surface seemed like an ideal use case for a system like that, but in the end resulted in it being far too aggressive, slowing down framerates annoyingly in relatively action-heavy parts of the game. In the end, in order to select parts of a game where it's okay to show things at a lower frame rate, you need to know what is going on in the game, not just whether or not the user is using the input devices. And even on-screen movement is difficult to judge - detailed facial animations look far better at 60fps than 30, after all, yet a cutscene close-up of someone talking is hardly a screen with much movement.
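To make the "low input = low framerate" heuristic concrete, here is a minimal sketch of a Chill-style limiter in Python. It is purely illustrative (the class, thresholds and ramp are made up; AMD's actual algorithm isn't public), but it shows why an input-only signal misjudges cutscenes and camping: nothing in it knows whether the scene itself is moving.

```python
import time

class ChillLikeLimiter:
    """Toy model of an input-driven dynamic FPS cap (not AMD's real algorithm)."""

    def __init__(self, fps_min=30, fps_max=144, idle_after_s=0.5):
        self.fps_min = fps_min            # cap used once the user goes "idle"
        self.fps_max = fps_max            # cap used during active input
        self.idle_after_s = idle_after_s  # how long input must be quiet first
        self.last_input = time.monotonic()

    def on_input(self):
        # Call from the mouse/keyboard event handler.
        self.last_input = time.monotonic()

    def target_fps(self):
        # Ramp the cap down once input has been quiet for idle_after_s,
        # fading from fps_max to fps_min over one further second.
        quiet = time.monotonic() - self.last_input
        if quiet <= self.idle_after_s:
            return self.fps_max
        t = min(1.0, quiet - self.idle_after_s)
        return self.fps_max - t * (self.fps_max - self.fps_min)

    def frame_budget_s(self):
        # The render loop would wait until this much time has passed per frame.
        return 1.0 / self.target_fps()
```

The flaw described above is visible in target_fps(): a scoped-in wait or a dialogue-heavy cutscene generates no input, so it gets throttled exactly like a pause menu.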
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
You're still missing the point. I wanted to explain what I mean by "unstable but high framerate", and not talk about old and dual core CPUs. (the COD and Cyberpunk parts are quite stuttery, btw)
I think that I have found what you mean:

Basically, you need a fast chip that is limited by cache or by core count. An overclocked i3-7350K is a good example of an otherwise fast CPU that simply doesn't have enough cores. And while its average performance was quite okay, in games like Far Cry Primal it had inconsistent frame times, whereas the i5 was doing a lot better.

Or you can just pair a decent CPU with poor RAM to simulate a weak cache; unfortunately, frame times were still fairly consistent:

In 2021 it's really hard to find a chip that performs well yet stutters or has an unstable framerate. So I found this:

To me that's an adequately inconsistent framerate, but even then it still performs somewhat predictably. And you really can't top those L3-less Athlons in terms of random stuttering and otherwise good but unstable performance:
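One way to put numbers on "high average fps but unstable" is to look at frame times rather than averages. A rough sketch with made-up sample captures (the numbers are invented purely to illustrate the metric):

```python
import statistics

def frame_stats(frame_times_ms):
    """Summarise a capture of per-frame render times in milliseconds."""
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    # "1% low" fps: average the slowest 1% of frames, then invert.
    worst = sorted(frame_times_ms, reverse=True)
    slice_len = max(1, len(worst) // 100)
    one_percent_low_fps = 1000.0 / statistics.mean(worst[:slice_len])
    jitter_ms = statistics.pstdev(frame_times_ms)  # frame-to-frame spread
    return avg_fps, one_percent_low_fps, jitter_ms

# Two invented captures with similar averages but very different pacing.
smooth  = [11.1] * 99 + [12.0]        # consistent ~90 fps
stutter = [9.0] * 95 + [45.0] * 5     # ~93 fps average, occasional 45 ms spikes

for name, data in (("smooth", smooth), ("stutter", stutter)):
    avg, low, jit = frame_stats(data)
    print(f"{name}: avg {avg:.0f} fps, 1% low {low:.0f} fps, stdev {jit:.1f} ms")
```

The averages barely differ between the two captures; the 1% lows and the frame-time spread are what separate "fast" from "fast but stuttery".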

That I agree with. A GPU bottleneck gives you lower, but stable framerates, which is more desirable than a stuttery CPU bottleneck.
Depends on how far back you want to go with such a statement. AGP cards with fewer pipelines (like 4) often had unstable framerates. Like this FX 5200:

In Doom it could get 50 fps in one area, while in others it was 15-25 fps. BTW, this is the 64-bit model, not the faster 128-bit model, although they all had rather unpredictable performance in many games. Since it was the GT 710-GT 730 of its time, I would expect new potatoes to have similar problems, particularly the DDR3 models.
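The 64-bit vs 128-bit difference mentioned here is simply memory bandwidth. A quick sketch of the calculation; the 400 MHz effective memory clock is an assumption for a typical FX 5200 board, since actual clocks varied between models:

```python
def mem_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth = bytes per transfer * transfers per second."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# Assumed 400 MHz effective DDR clock for both variants:
print(mem_bandwidth_gb_s(64, 400))   # ~3.2 GB/s (64-bit card)
print(mem_bandwidth_gb_s(128, 400))  # ~6.4 GB/s (128-bit card)
```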

Well, either you were, or you were making a pointless platitude about "at some level, performance is unacceptably low", which ... yes. I think it's safe to assume we all know that. And in which case you misunderstood the point of the analogy you were responding to.
My point is that some low-end hardware is quite okay and can be enjoyable if you play at potato settings, but there is also rubbish hardware that isn't good under any circumstances; it just sucks. I had one such unfortunate experience with the nVidia GeForce 6150 Go. It was insufferable garbage. Nothing was playable at 640x480, except for Unreal Tournament 2004, which averaged a very unstable 30-50 fps, and the resolution was so low that it was legitimately hard to see opponents. The infuriating thing is that the same laptop had a Turion X2 TL-60 (2GHz K8 cores) and 2x2GB of DDR2, so it was an otherwise very capable machine, ruined by its integrated graphics. Originally it had a single-core Sempron at 1.8GHz and 512MB of RAM, so it's not like it came with the Turion; I just decided to upgrade an old laptop.

Chill is a bit of a case of a great solution without a matching problem. The idea is superficially great, but flawed in that it assumes that low user input = low framerates are acceptable. To work well, such a system at least needs to account for on-screen movement. After all, if you're camping in a corner with a sniper rifle, it's hardly ideal that the GPU slows you down to 30fps just because you aren't moving. I used Chill for a bit when playing Divinity: Original Sin, which on the surface seemed like an ideal use case for a system like that, but in the end resulted in it being far too aggressive, slowing down framerates annoyingly in relatively action-heavy parts of the game. In the end, in order to select parts of a game where it's okay to show things at a lower frame rate, you need to know what is going on in the game, not just whether or not the user is using the input devices. And even on-screen movement is difficult to judge - detailed facial animations look far better at 60fps than 30, after all, yet a cutscene close-up of someone talking is hardly a screen with much movement.
Well, you can set a minimum and maximum Chill framerate for that. Unfortunately, I set Chill to 60-60 and tried to play CS:GO, and I got very unstable and even stuttery fps. It was mostly in the 50s, but sometimes dropped into the 40s. It clearly didn't work as advertised.
 
Joined
Jun 27, 2019
Messages
1,817 (1.05/day)
Location
Hungary
System Name I don't name my systems.
Processor i3-12100F 'power limit removed'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75 Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
Chill is a bit of a case of a great solution without a matching problem. The idea is superficially great, but flawed in that it assumes that low user input = low framerates are acceptable. To work well, such a system at least needs to account for on-screen movement. After all, if you're camping in a corner with a sniper rifle, it's hardly ideal that the GPU slows you down to 30fps just because you aren't moving. I used Chill for a bit when playing Divinity: Original Sin, which on the surface seemed like an ideal use case for a system like that, but in the end resulted in it being far too aggressive, slowing down framerates annoyingly in relatively action-heavy parts of the game. In the end, in order to select parts of a game where it's okay to show things at a lower frame rate, you need to know what is going on in the game, not just whether or not the user is using the input devices. And even on-screen movement is difficult to judge - detailed facial animations look far better at 60fps than 30, after all, yet a cutscene close-up of someone talking is hardly a screen with much movement.

I pretty much only use Chill as an fps limiter, and it doesn't drop frames based on what's on the screen, since both the min and max values are set to the same.
At least I have never experienced that happening because of Chill.
It's just to keep my monitor from going outside its FreeSync range; this way I don't see any tearing and don't have to use VSync in games.

[Attachment: Chill.jpg]

So when I game this is what it should look like:

[Attachment: MELEchill.jpg]

In that scene, and in that game in general, I could easily push 100+ frames, but I don't want that; this way it feels a lot smoother and more consistent to me.
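The idea of capping a couple of fps below the panel's maximum so FreeSync never disengages can be written as a tiny helper. A minimal sketch, assuming a hypothetical 48-75 Hz FreeSync window and the 74 fps min = max Chill cap mentioned earlier in the thread:

```python
def freesync_cap(vrr_min_hz, vrr_max_hz, margin_fps=1):
    """Pick an fps cap that keeps frame delivery inside the VRR window."""
    cap = vrr_max_hz - margin_fps        # stay just under the ceiling
    if cap < vrr_min_hz:
        raise ValueError("margin pushes the cap below the VRR floor")
    return cap

# Example: a 48-75 Hz FreeSync monitor -> cap at 74 fps,
# which matches setting Chill min = max = 74 as described earlier.
print(freesync_cap(48, 75))  # 74
```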
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I pretty much only use Chill as an fps limiter, and it doesn't drop frames based on what's on the screen, since both the min and max values are set to the same.
At least I have never experienced that happening because of Chill.
It's just to keep my monitor from going outside its FreeSync range; this way I don't see any tearing and don't have to use VSync in games.

View attachment 207163

So when I game this is what it should look like:

View attachment 207164

In that scene, and in that game in general, I could easily push 100+ frames, but I don't want that; this way it feels a lot smoother and more consistent to me.
Well, sure, but then you're not really using Chill, you're just using it as a way of gaining a secondary functionality. The "adjusts fps based on activity" feature is what makes Chill what it is.
Well, you can set a minimum and maximum Chill framerate for that. Unfortunately, I set Chill to 60-60 and tried to play CS:GO, and I got very unstable and even stuttery fps. It was mostly in the 50s, but sometimes dropped into the 40s. It clearly didn't work as advertised.
Well, yes, but the point of Chill is to save power and heat output by lowering rendered fps in situations where it isn't needed. Setting minimum and maximum fps very close renders it meaningless - you might as well just use an FPS limit or VSYNC. Your experience sounds pretty buggy though. Mine stayed within the set range, it just didn't work well for that use case, and I struggle to imagine one where it would.
My point is that some low-end hardware is quite okay and can be enjoyable if you play at potato settings, but there is also rubbish hardware that isn't good under any circumstances; it just sucks. I had one such unfortunate experience with the nVidia GeForce 6150 Go. It was insufferable garbage. Nothing was playable at 640x480, except for Unreal Tournament 2004, which averaged a very unstable 30-50 fps, and the resolution was so low that it was legitimately hard to see opponents. The infuriating thing is that the same laptop had a Turion X2 TL-60 (2GHz K8 cores) and 2x2GB of DDR2, so it was an otherwise very capable machine, ruined by its integrated graphics. Originally it had a single-core Sempron at 1.8GHz and 512MB of RAM, so it's not like it came with the Turion; I just decided to upgrade an old laptop.
That is exactly the platitude I was talking about. That's not what this thread is about, nor what anyone here is discussing. It's blindingly obvious that all hardware has its limitations, and some hardware simply isn't suited to some tasks. That's a given. The discussion here is on upgrade mania and chasing high benchmark numbers vs. what is actually a meaningful perceptible increase in enjoyment and game quality. When the discussion is on experiential differences within perfectly playable quality/fps ranges and how a lot of older hardware can still provide good gameplay experiences, saying "well, some hardware is just crap" hardly brings anything interesting or insightful to the table, or furthers the debate in any meaningful manner.
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Well, sure, but then you're not really using Chill, you're just using it as a way of gaining a secondary functionality. The "adjusts fps based on activity" feature is what makes Chill what it is.
That's not true. It's just a feature to lock your fps and save power by not rendering more frames than specified. Anyway, it doesn't work as intended.

Well, yes, but the point of Chill is to save power and heat output by lowering rendered fps in situations where it isn't needed. Setting minimum and maximum fps very close renders it meaningless - you might as well just use an FPS limit or VSYNC. Your experience sounds pretty buggy though. Mine stayed within the set range, it just didn't work well for that use case, and I struggle to imagine one where it would.
Well, if you have a clearly overpowered card for certain games, Chill could save a lot of power. Vsync doesn't work like Chill. As I understand it, with Vsync you may still double or triple buffer, so your card should be loaded just as much as with Vsync off. An fps limiter may work like Chill, but I'm not sure if it will work out exactly like that.


That is exactly the platitude I was talking about. That's not what this thread is about, nor what anyone here is discussing. It's blindingly obvious that all hardware has its limitations, and some hardware simply isn't suited to some tasks. That's a given. The discussion here is on upgrade mania and chasing high benchmark numbers vs. what is actually a meaningful perceptible increase in enjoyment and game quality. When the discussion is on experiential differences within perfectly playable quality/fps ranges and how a lot of older hardware can still provide good gameplay experiences, saying "well, some hardware is just crap" hardly brings anything interesting or insightful to the table, or furthers the debate in any meaningful manner.
But it's true. Lots of hardware is a bit underestimated by reviewers focusing on the latest and greatest, while some hardware is so bad that it's in no way good or enjoyable. I think that such information is still of value in this thread. The main idea of this thread is that what counts as good hardware is lower-end or older than expected, and that's what reviewers potentially miss. And despite this finding, some hardware is simply too slow to fit that description.
 
Joined
Jan 14, 2019
Messages
9,727 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
I think that I have found what you mean:

Basically, you need a fast chip that is limited by cache or by core count. An overclocked i3-7350K is a good example of an otherwise fast CPU that simply doesn't have enough cores. And while its average performance was quite okay, in games like Far Cry Primal it had inconsistent frame times, whereas the i5 was doing a lot better.

Or you can just pair a decent CPU with poor RAM to simulate a weak cache; unfortunately, frame times were still fairly consistent:

In 2021 it's really hard to find a chip that performs well yet stutters or has an unstable framerate. So I found this:

To me that's an adequately inconsistent framerate, but even then it still performs somewhat predictably. And you really can't top those L3-less Athlons in terms of random stuttering and otherwise good but unstable performance:
Yep, that's kind of what I mean. :) When overall CPU performance is good enough to feed the GPU with data, but lags behind when loading assets, or performing other temporary operations within the game, or running background tasks (for example, when Windows update kicks in).

Depends on how far back you want to go with such a statement. AGP cards with fewer pipelines (like 4) often had unstable framerates. Like this FX 5200:

In Doom it could get 50 fps in one area, while in others it was 15-25 fps. BTW, this is the 64-bit model, not the faster 128-bit model, although they all had rather unpredictable performance in many games. Since it was the GT 710-GT 730 of its time, I would expect new potatoes to have similar problems, particularly the DDR3 models.


My point is that some low-end hardware is quite okay and can be enjoyable if you play at potato settings, but there is also rubbish hardware that isn't good under any circumstances; it just sucks. I had one such unfortunate experience with the nVidia GeForce 6150 Go. It was insufferable garbage. Nothing was playable at 640x480, except for Unreal Tournament 2004, which averaged a very unstable 30-50 fps, and the resolution was so low that it was legitimately hard to see opponents. The infuriating thing is that the same laptop had a Turion X2 TL-60 (2GHz K8 cores) and 2x2GB of DDR2, so it was an otherwise very capable machine, ruined by its integrated graphics. Originally it had a single-core Sempron at 1.8GHz and 512MB of RAM, so it's not like it came with the Turion; I just decided to upgrade an old laptop.
The FX 5200 and 6150 Go are kind of extreme examples, as they were never meant to be more than display adapters in the first place (not to mention that the GeForce FX series was a failure altogether). I think what @Valantar meant was more like x50 or x60 series cards that offer decent gaming performance with low power consumption, and sometimes even come in small form factor. They usually serve you through several generations and can be used in a variety of systems, unlike bigger, more power-hungry "enthusiast" cards that require a large power supply, a heavy cooler (or water) and a big chassis with plenty of airflow.
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
The FX 5200 and 6150 Go are kind of extreme examples, as they were never meant to be more than display adapters in the first place (not to mention that the GeForce FX series was a failure altogether).
Display adapters or not, the FX 5200 was able to run some AAA games, which is certainly not bad. It was also totally respectable in UT 2004, running it at 1024x768, medium settings, at 50 fps. And it could run Quake 3 very well too. The latest game that I managed to run on it was Minecraft; it ran just fine at 1280x1024 with OptiFine, and I got around 45-60 fps. Just imagine a GTX 3010 coming out today that runs Cyberpunk at the lowest settings at 800x600, yet is able to max out the texture setting and gets 30 fps. That certainly wouldn't be bad, and because of that, I honestly don't think the FX 5200 was really as bad as people were saying. It's just that nVidia had never intentionally made such a low-end product before. Today it would be the equivalent of the GT line never existing and then a GT 3010 coming out one day.

In terms of other e-waste I used to game on, I tried out the ATi 3000 integrated chipset graphics, and it was able to run ETS 2 at 640x480, CS:GO at 800x600, and CoD MW2 at 1024x768. So it was able to run the latest AAA games a few years after it was made. I also used an A4 6300 with integrated graphics, and it was able to run Duke Nukem Forever at 1080p medium, DiRT 3 at medium-high 1600x900, WRC 4 at 640x480, and Far Cry 1 at 1080p Ultra. Again, it ran AAA games a few years after it was released, so that isn't bad for what was the cheapest APU you could buy. The only other insufferable garbage was a Pentium P6200 with unknown Intel graphics. All it was able to run was the original NFS MW at 640x480, and that's it. It was beyond disappointing.

I certainly don't recommend using things like these to play games on, but they can be used for that, and sometimes they aren't actually completely awful. To be honest, sometimes messing with weak hardware like this is a lot of fun. You never know if games will run or not, so it's always an adventure. It's a lot like buying a beater car and thrashing it offroad: it's cheap and you expect nothing, but often it works out just fine. I sometimes think it would be fun to buy a GT 710 GDDR5 model only to volt mod it, overclock it to the moon, and see where I end up. It seems that all low-end cards have quite good overclocking potential. I managed to overclock an FX 5200 as far as the Afterburner sliders went, and that was a solid 25% stable overclock. GT 710s also have at least 15% OC potential on any model.

I think what @Valantar meant was more like x50 or x60 series cards that offer decent gaming performance with low power consumption, and sometimes even come in small form factor. They usually serve you through several generations and can be used in a variety of systems, unlike bigger, more power-hungry "enthusiast" cards that require a large power supply, a heavy cooler (or water) and a big chassis with plenty of airflow.
That's true, they are always good value. In the past, Radeons were the hotter budget cards due to better bang for the buck, the Omega drivers, overall better tweakability, and cheap CrossFire (I know that you shouldn't, but to the ape part of the brain it still sounds cool). Polaris was a very good generation too, along with both Vegas, which later became cheap thanks to miners and are just as tweakable as Polaris. They are ridiculously easy to BIOS mod, decent undervolters, and so on. Sounds great, but I wouldn't buy one without the intention to tweak it, due to the awful heat output and monstrous power consumption.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
That's not true. It's just a feature to lock your fps and save power by not rendering more frames than specified. Anyway, it doesn't work as intended.
No, that's what a framerate limiter does. Radeon Chill is specifically a(n attempt at a) "smart" system that dynamically adjusts in-game FPS on-demand, with an upper ("high activity") and lower ("idle") level. The explicit goal of this is to both save power by rendering as few frames as possible when not needed, while also preserving responsiveness and performance when needed. So quite unlike a pure FPS limiter. It can still be used as one, but that isn't using it for what it's made for. AFAIK the most recent versions of Radeon Software brought back a pure FPS limiter alongside Chill, highlighting that these are separate things.

Well, if you have a clearly overpowered card for certain games, Chill could save a lot of power. Vsync doesn't work like Chill. As I understand it, with Vsync you may still double or triple buffer, so your card should be loaded just as much as with Vsync off. An fps limiter may work like Chill, but I'm not sure if it will work out exactly like that.
Whether VSYNC is single, double or triple buffered is at times adjustable, but often not, and in those cases games typically don't tell you which it is. Obviously triple buffering doesn't save you any power if you're for example playing on a 60Hz display and your GPU can render anything below 180fps, or double buffering if you're rendering 120fps or less, but if it's single buffered and you're at all above 60fps, then it lets your GPU stay idle for a while after each frame. Which does save power. I never said Vsync worked like Chill, I said it can (in certain situations) do something similar by capping framerates and thus limiting GPU power draw.
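As a rough back-of-the-envelope for why any cap below the GPU's uncapped rate saves power, here is a sketch that estimates the GPU's busy fraction from the uncapped framerate and the cap. It assumes power scales roughly with busy time and ignores buffering details entirely, so the numbers are illustrative only:

```python
def busy_fraction(uncapped_fps, capped_fps):
    """Fraction of each frame interval the GPU spends rendering when capped."""
    if capped_fps >= uncapped_fps:
        return 1.0                        # cap at or above capability: no idle time
    render_time_s = 1.0 / uncapped_fps    # time needed to draw one frame
    frame_interval_s = 1.0 / capped_fps   # time available per displayed frame
    return render_time_s / frame_interval_s

# Hypothetical card that could do 180 fps uncapped, capped to a 60 Hz refresh:
print(f"{busy_fraction(180, 60):.0%} busy")  # ~33% -> plenty of headroom to save power
print(f"{busy_fraction(70, 60):.0%} busy")   # ~86% -> little left to save
```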

As for "if you have a clearly overpowered card for certain games, Chill could save a lot of power" - no. That is what a frame rate limiter does. The purpose of Chill is "smart" power saving by throttling framerate when it's not needed, yet maintaining it when it is:

Radeon CHILL


The concept of CHILL is fairly simple. During periods of low user interaction or little action on screen, the CPU and GPU power limit is reduced, causing the hardware to slow down. This reduces the frame rate during periods where a high frame rate is not required, allowing the CPU/GPU to cool down (where the CHILL bit comes in). This also saves power on power-limited systems such as laptops. When CHILL detects more action or more strenuous movement by the user, the power limit is moved back up.
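Taken literally, that description is a power-limit ramp driven by some activity signal. A toy sketch of the mapping (the wattage endpoints and the activity score are made-up placeholders; how the driver actually measures activity is not public):

```python
def chill_power_limit_w(activity, idle_limit_w=90, active_limit_w=220):
    """Map an activity score in [0, 1] to a GPU power limit, per the description above."""
    activity = max(0.0, min(1.0, activity))  # clamp the score
    return idle_limit_w + activity * (active_limit_w - idle_limit_w)

print(chill_power_limit_w(0.0))   # idle: 90 W
print(chill_power_limit_w(0.25))  # light movement: 122.5 W
print(chill_power_limit_w(1.0))   # heavy action: 220 W
```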

But it's true. Lots of hardware is a bit underestimated by reviewers focusing on the latest and greatest, while some hardware is so bad that it's in no way good or enjoyable. I think that such information is still of value in this thread. The main idea of this thread is that what counts as good hardware is lower-end or older than expected, and that's what reviewers potentially miss. And despite this finding, some hardware is simply too slow to fit that description.
I disagree. The topic of this thread is how old but decent hardware can provide a gameplay experience so good it doesn't feel meaningfully different from brand-new expensive hardware, and the value of staying on the upgrade train. Saying "but some hardware is just unusably slow" is ... yes. Yes, it is true. And blindingly obvious. Nobody here is arguing otherwise. Thus it brings nothing of value to the discussion. I mean, what is that contributing? It's not what we're talking about. It doesn't negate that other older hardware still provides great gameplay experiences, nor change anything else. It's obvious that the OP didn't mean we might all as well just stick with 486es. Nobody here is arguing that a GT 710 or other "GPU in name only" display adapters with hardly any 3D acceleration provide anything like a good gameplay experience (though I guess they might if all you do is play retro emulators?). So ... it's a rather pointless addition to the discussion. If people are discussing how a lot of food is still good after its "best before" date, saying "but if your food is covered with mold or smells rotten you shouldn't eat it" is ... kind of missing the point of the discussion.
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
No, that's what a framerate limiter does. Radeon Chill is specifically a(n attempt at a) "smart" system that dynamically adjusts in-game FPS on-demand, with an upper ("high activity") and lower ("idle") level. The explicit goal of this is to both save power by rendering as few frames as possible when not needed, while also preserving responsiveness and performance when needed. So quite unlike a pure FPS limiter. It can still be used as one, but that isn't using it for what it's made for. AFAIK the most recent versions of Radeon Software brought back a pure FPS limiter alongside Chill, highlighting that these are separate things.
And you can turn Chill into a dumb fps limiter by setting the min and max values to the same.

(and I run enterprise drivers, so I don't care about gaming drivers)


As for "if you have a clearly overpowered card for certain games, Chill could save a lot of power" - no. That is what a frame rate limiter does. The purpose of Chill is "smart" power saving by throttling framerate when it's not needed, yet maintaining it when it is:
If you set min and max to the same value, it should in theory behave like an fps limiter, and yet it doesn't.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Yep, that's kind of what I mean. :) When overall CPU performance is good enough to feed the GPU with data, but lags behind when loading assets, or performing other temporary operations within the game, or running background tasks (for example, when Windows update kicks in).


The FX 5200 and 6150 Go are kind of extreme examples, as they were never meant to be more than display adapters in the first place (not to mention that the GeForce FX series was a failure altogether). I think what @Valantar meant was more like x50 or x60 series cards that offer decent gaming performance with low power consumption, and sometimes even come in small form factor. They usually serve you through several generations and can be used in a variety of systems, unlike bigger, more power-hungry "enthusiast" cards that require a large power supply, a heavy cooler (or water) and a big chassis with plenty of airflow.

The FX 5200 was the exact same as the Xbox. The reason I know this is that on a 9700 Pro you could crank the graphics in NFSU 1 & 2 to max, whereas the 5200 ran at minimum, and that was the exact same detail level as on the Xbox.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
And you can turn Chill into a dumb fps limiter by setting the min and max values to the same.

(and I run enterprise drivers, so I don't care about gaming drivers)

If you set min and max to the same value, it should in theory behave like an fps limiter, and yet it doesn't.
Again: You can do that, but at that point you're not really using Chill. You can tweak its parameters to make it a dumb framerate limiter, but at that point you've removed literally everything that makes Chill what it is. The entire point of Chill is to be more and smarter than a dumb fps limiter, allowing for a "best of both worlds" scenario of power savings and performance. Setting both numbers to the same removes that entirely. You still have the feature enabled, but at that point in name only. If you go to a burger shop and order a cheeseburger without cheese, you'll likely get some weird looks and questions from your server, but if you insist you'll likely get it. But when you then get it, are you then actually eating a cheeseburger? The receipt will say so, but I'd certainly argue for that not being the actual case.
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
The FX 5200 was the exact same as the Xbox. The reason I know this is that on a 9700 Pro you could crank the graphics in NFSU 1 & 2 to max, whereas the 5200 ran at minimum, and that was the exact same detail level as on the Xbox.
Sorry, but no. The architecture is completely different, and the TMU count is also different. Not to mention that the FX 5200 was released a lot later and had newer DirectX, OpenGL, pixel shader and vertex shader support. The OG Xbox GPU is closest to the 128-bit GeForce 4 Ti 4200, but even then they are different. There was no Xbox GPU available to desktop users; it was a custom chip made only for the Xbox. Considering its release date, it was likely a GeForce 3-derived variant.

Again: You can do that, but at that point you're not really using Chill. You can tweak its parameters to make it a dumb framerate limiter, but at that point you've removed literally everything that makes Chill what it is.
Not my problem that AMD removed FRTC.

The entire point of Chill is to be more and smarter than a dumb fps limiter, allowing for a "best of both worlds" scenario of power savings and performance.
I really don't think it's the best of both; it's more like a feature that nobody asked for.


You still have the feature enabled, but at that point in name only.
Um, no. Chill was made to save power by limiting frames and it still does that even if dumber.

Anyway, I found out something. With Chill set to 60 fps, for some reason the card isn't able to maintain 60 fps; it goes down to 59 or even 57. And I'm pretty sure that it messes with game latency, because the game feels jittery. Raising Chill to 90 fps almost fixes that issue, but at that point my power savings are small, only 40-50 watts, whereas with the 60 fps cap they were in the 90-100 watt range. I tested that in CoD 4. Forced Vsync sort of helps, but it makes latency really bad. If Chill is on, I can't use Anti-Lag, which might be helpful if I wanted to use Vsync. CoD 4's built-in Vsync is horrendous and adds a lot of input lag, probably because of triple buffering. I also found out that Chill wasn't meant to be supported in all titles; titles must support Chill for it to work correctly. That was the case in the past, so I don't know if it still applies, but it doesn't look good.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Not my problem that AMD removed FRTC.
No, but that doesn't make Chill into FRTC either.
I really don't think it's the best of both; it's more like a feature that nobody asked for.
I've been pretty clear on it not actually working as advertised in my experience, haven't I? You're mixing up the purpose/intention of a feature and whether it's actually capable of reaching that goal.
Um, no. Chill was made to save power by limiting frames and it still does that even if dumber.
Nope. It was made to save power by limiting frames when not needed while maintaining them when needed. A spork is not a fork, even if it kind of resembles one, can be partially used as one, and no doubt was (partially) derived from one.
Anyway, I found out something. With Chill set to 60 fps, for some reason the card isn't able to maintain 60 fps; it goes down to 59 or even 57. And I'm pretty sure that it messes with game latency, because the game feels jittery. Raising Chill to 90 fps almost fixes that issue, but at that point my power savings are small, only 40-50 watts, whereas with the 60 fps cap they were in the 90-100 watt range. I tested that in CoD 4. Forced Vsync sort of helps, but it makes latency really bad. If Chill is on, I can't use Anti-Lag, which might be helpful if I wanted to use Vsync. CoD 4's built-in Vsync is horrendous and adds a lot of input lag, probably because of triple buffering. I also found out that Chill wasn't meant to be supported in all titles; titles must support Chill for it to work correctly. That was the case in the past, so I don't know if it still applies, but it doesn't look good.
Yeah, the implementation is sadly quite lacking. I don't know what kind of game support it needs (I would guess some sort of tuning in how it interprets inputs and adjusts framerates, but really have no idea). In principle a "simple" implementation relying only on inputs could work regardless of the application, but that would be quite problematic as I've said before, as it would be too aggressive for a lot of games, or just plain unsuited to their style of gameplay. What I use for games requiring fast response times is a frame cap at either the screen refresh rate or an integer multiple of it (e.g. 120/180 for a 60Hz display), alongside anti-lag and enhanced sync. Chill isn't really suited for those types of games anyhow, as any time it would reduce framerates it would also increase response times and latencies.
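The "cap at the refresh rate or an integer multiple of it" rule above is easy to express. A small sketch (hypothetical helper; the 60 Hz and 200 fps figures are just examples) that lists the candidate caps a card can actually sustain:

```python
def cap_candidates(refresh_hz, max_gpu_fps, multiples=4):
    """Frame caps at integer multiples of the refresh rate that the GPU can sustain."""
    caps = [refresh_hz * n for n in range(1, multiples + 1)]
    return [c for c in caps if c <= max_gpu_fps]

# A 60 Hz display and a card that manages ~200 fps in a given game:
print(cap_candidates(60, 200))  # [60, 120, 180]
```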
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
No, but that doesn't make Chill into FRTC either.
Effectively, FRTC is now Chill. If you claim otherwise, provide reasons why you think so.

Nope. It was made to save power by limiting frames when not needed while maintaining them when needed.
Big deal, I can make it work as FRTC.

Yeah, the implementation is sadly quite lacking.
Sorry for being negative, but many of AMD's technologies are like that. AMD loves to hype up a feature and then doesn't seem to care whether it works or how well it works. AMD's RIS is exactly the same. Enhanced Sync is almost the same too. So basically half of the cool features that differentiate Radeon from GeForce plainly suck, don't work well, or aren't properly supported, and no proper compatibility list is provided anywhere. And what sucks the most is that those ideas aren't bad and could be quite wonderful if they were polished and updated. Oh, and things like Virtual Super Resolution and custom resolutions also barely work as advertised and have tons of issues, while on nVidia's side exactly the same things work just fine. Oh, and Wattman is still a sad joke, just like it has been since 2016 or so. It's still fucked up, with the "Wattman settings have been reset because Wattman hurt itself in confusion" routine. Sorry for the off-topic.


I don't know what kind of game support it needs (I would guess some sort of tuning in how it interprets inputs and adjusts framerates, but really have no idea). In principle a "simple" implementation relying only on inputs could work regardless of the application, but that would be quite problematic as I've said before, as it would be too aggressive for a lot of games, or just plain unsuited to their style of gameplay. What I use for games requiring fast response times is a frame cap at either the screen refresh rate or an integer multiple of it (e.g. 120/180 for a 60Hz display), alongside anti-lag and enhanced sync. Chill isn't really suited for those types of games anyhow, as any time it would reduce framerates it would also increase response times and latencies.
lol, I'm probably the only person on Earth trying to use Chill in CS:GO and CoD 4. I still don't get how AMD manages to fuck up Chill, and for that matter FRTC, so badly that it's worse than plain fps capping. Frame capping has been done for ages in video games, and by 3rd-party software at this point, and AMD still cannot figure out how to make something so simple actually functional. And yet they make graphics cards and should be rather knowledgeable about lots of things, especially about what happens in a graphics card's rendering pipeline. It's a shame that AMD is so dumb.
 