
Old PC Vs new

  • Thread starter Deleted member 24505
  • Start date
Joined
Jul 24, 2009
Messages
1,002 (0.19/day)
Got a 6+6 Xeon @ 4GHz and it's pretty good for gaming. I can actually compare it to Zen 2 (also 6+6), and honestly it's sometimes pretty obvious that certain games were made for Intel, not AMD. Despite AMD having that extra ~300MHz advantage, there isn't much of a difference in games, if any. That said, Zen 2 is waaay more efficient in computing power vs power consumption. The old Xeon with an OC is more like a 5-litre V8, while Zen 2 is a 1.8 VTEC.
 
D

Deleted member 24505

Guest
maybe i should do some tests with this setup i have
4790k @ 4.4
msi z87i gaming AC
2x8gb hyperX savage 2400 @ 2400
msi gtx980ti gaming 6gb

I have a fair few games to try it on, anything you fancy me testing on? it will be at 1080p max though.


Untitled.jpg
 
Joined
Mar 31, 2014
Messages
1,533 (0.42/day)
Location
Grunn
System Name Indis the Fair (cursed edition)
Processor 11900k 5.1/4.9 undervolted.
Motherboard MSI Z590 Unify-X
Cooling Heatkiller VI Pro, VPP755 V.3, XSPC TX360 slim radiator, 3xA12x25, 4x Arctic P14 case fans
Memory G.Skill Ripjaws V 2x16GB 4000 16-19-19 (b-die@3600 14-14-14 1.45v)
Video Card(s) EVGA 2080 Super Hybrid (T30-120 fan)
Storage 970EVO 1TB, 660p 1TB, WD Blue 3D 1TB, Sandisk Ultra 3D 2TB
Display(s) BenQ XL2546K, Dell P2417H
Case FD Define 7
Audio Device(s) DT770 Pro, Topping A50, Focusrite Scarlett 2i2, Røde VXLR+, Modmic 5
Power Supply Seasonic 860w Platinum
Mouse Razer Viper Mini, Odin Infinity mousepad
Keyboard GMMK Fullsize v2 (Boba U4Ts)
Software Win10 x64/Win7 x64/Ubuntu
I have a fair few games to try it on, anything you fancy me testing on? it will be at 1080p max though.
Anything newer than 2016 (x
6+6 Xeon @ 4GHz and its pretty good for gaming.
This is really subjective and relative. Like, yeah, go stick an RX 570 or something in one of those systems and play old games at 1440p, and I guess you're fine? I honestly think X58 is probably the most overrated hardware of the last three or so years; the IPC is miles behind even Sandy Bridge... On a good day you might be able to get somewhere around a 3700X in multicore performance?

Then you also have to get a big case, a big motherboard (which will also be very old), two coolers, loads of DDR3 sticks, and you need to deal with real NUMA as well.

The only reason you would not see an advantage with a 3700X in games is if you make the system entirely GPU-bound at a level well below what the 3700X is capable of.

The only attraction of those old Xeon systems is that you get a boatload of PCIe (2.0) lanes, but with 16+4 lanes of 4.0 on AM4 you get the bandwidth equivalent of 64+16 lanes of 2.0 just off the CPU.
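That lane arithmetic follows from per-lane bandwidth roughly doubling each PCIe generation. A minimal sketch (the GB/s-per-lane figures are approximate spec values, and `equivalent_lanes` is just an illustrative helper):

```python
# Per-lane PCIe bandwidth roughly doubles each generation, so a newer-gen
# lane count can be restated as an equivalent older-gen lane count.
# Approximate usable bandwidth per lane, in GB/s.
GBPS_PER_LANE = {2: 0.5, 3: 0.985, 4: 1.969}

def equivalent_lanes(lanes, gen, target_gen):
    """Express `lanes` of PCIe `gen` as a lane count of `target_gen`."""
    return lanes * GBPS_PER_LANE[gen] / GBPS_PER_LANE[target_gen]

# AM4's 16 GPU + 4 NVMe lanes of 4.0, restated in PCIe 2.0 lanes:
print(equivalent_lanes(16, 4, 2))  # ~63 lanes of 2.0
print(equivalent_lanes(4, 4, 2))   # ~15.8 lanes of 2.0
```

Which matches the 64+16 figure above to within rounding.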

Price breakdown:
2x X5690 2x $60 or so ($120)
2x CPU coolers ($80)
Dual 1366 motherboard ($80-100)
6x4GB DDR3 ($100+)

3700X ($220 used)
1x CPU cooler if you are unlucky and can't get the stock cooler of the 3700x ($30)
B450 Max motherboard ($80 new)
2x8 DDR4 ($70-80 new)
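Totting those up (taking the midpoint of each quoted range; these are the thread's rough guesses, not market data, and the dict names are just for illustration):

```python
# Rough totals for the two builds, using the midpoint of each price
# range quoted above. Forum ballpark prices, not real quotes.
xeon_build = {"2x X5690": 120, "2x coolers": 80,
              "dual 1366 board": 90, "6x4GB DDR3": 100}
ryzen_build = {"3700X (used)": 220, "cooler": 30,
               "B450 Max": 80, "2x8GB DDR4": 75}
print(sum(xeon_build.values()))   # 390
print(sum(ryzen_build.values()))  # 405
```

So the dual-Xeon route doesn't even end up meaningfully cheaper than the 3700X build.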

If you want something cheap, go look for used Zen/Zen+ Ryzens or Intel LGA1200 i3s... Maybe if you are lucky you can snipe okay deals on Z370/390 sets too... Those 1366 six-cores are at best equivalent to Skylake/Zen 2 quad cores in multicore, but have worse memory performance, caches, and single-thread speed, so they consistently CPU-bottleneck games.
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Anyway... :rolleyes: I only saw two cringeworthy moments in the FX 6300 vs i5 6400 vs i5 2400 comparison video. One of them is the stutterfest on the 2400 in BF5 at 0:46, which also presented itself on the 6400 at 0:55. The other one is when CP77 dipped below 30 fps on the 2400 at the end of the video. Neither of these are deal breakers, though, as the general gameplay experience is largely stutter-free. Also, I wonder what point the video's maker wanted to make by including the middle game (was it GTA5?). It looks buttery smooth on all 3 systems. Maybe that's the point? :wtf:
Because GTA 5 used to be heavy on CPUs. My own FX 6300 was only able to run it at 40-60 fps. As far as I know, no FX chip at stock can handle GTA 5 properly, except perhaps the 9370 and 9590. In the mountains it runs GTA fine, but not so well in the city, where you have lots of NPCs and lots of objects. One odd thing is that grass tends to reduce fps a lot, as it seemingly loads the CPU much more. I remember that in GTA 4, if I didn't turn off shadows (which, to be honest, looked a bit garbage anyway), the CPU was stuttering and dipping a lot, often into the 30s. Without shadows, it ran GTA 4 at nearly 60 fps all the time. Another fun fact: after GTA 5 loads up, if you kill Rockstar's Social Club task, you can gain around 5-10 fps. I didn't test that on the FX 6300, but I did on the Athlon 870K, which is 4 FX cores with less L2 cache and no L3 cache. It does break the game a little; as far as I know, the in-game stocks stop working. Anyway, if you really want GTA 5 to be actually smooth on FX stuff, you have to overclock those chips. I now run GTA 5 on an i5 10400F with VSync; it makes 60 fps feel like 120 fps on a high refresh rate monitor, and there's seemingly no input lag (I also use Radeon Anti-Lag, so maybe that's why).


For a trouble-free gaming experience, I agree. For any gaming experience, just 4 cores are still enough. That's still rather impressive, considering that the first 4-core/8-thread consumer-grade CPUs (Core i7-860 and 870) were released 12 years ago, and that you can get a modern 8-thread CPU brand new for about £80 delivered. That's £10 per thread, or £20 per core. :laugh: I remember the '90s and early 2000s, when you had to upgrade almost every year just to be able to run the newest games at any fps - and I was still rocking a Celeron MMX 300 MHz with an S3 ViRGE "3D decelerator" for 6 long years, dreaming about owning a 3DFX or GeForce 3 one day. This might have something to do with why I don't give a damn about super-high-framerate gaming. :D
Maybe; for me it's just a massive expense. I still remember when I had my retro rig. Initially I used an nVidia GeForce FX 5200 128MB 128-bit. It was one of the better models of that legendary potato, and it was able to run CoD 1 in DirectX 7 mode at 1024x768 at a decent framerate. It was also able to run Far Cry at lowest settings, 800x600, 16x aniso, maximum textures, at a quite stable 30 fps, and it ran GTA San Andreas okay at whatever settings I used. I later got an ATi X800 Pro, and it was an all-around better card: much faster, with more up-to-date DX compatibility. It ran games a lot better and at far more respectable settings. I quite liked that, and it more or less transformed the gaming experience. However, I later got an ATi X800 XT PE, and while it was even faster, it made me feel nothing. Sure, I got somewhat more fps and maybe a bit of an IQ bump, but it felt like money wasted, as it simply failed to do anything more than the X800 Pro. It taught me a good lesson for life: chasing the best isn't always the answer, but getting the most for your money surely is. And while that is nice, sometimes even the minimum spec can be very enjoyable (FX 5200). For these reasons, plus some others, I will probably never again buy the best card. nVidia's xx60 series is plenty, as are AMD's equivalents; sometimes even an xx50 Ti is good enough. The thing is that once you get a game running at some adequate framerate and a resolution high enough that you can see shit, any further improvements will do very little for your gaming experience.

And nowadays, getting a high-end card is about as much fun as getting a high-end car. You can buy an expensive sports car that is very fast and all, but what many people still don't understand is that those cars are high-maintenance affairs and awful for pretty much any daily driving. Same with the RTX 3090. It sure is fast, but it certainly runs hot, it certainly isn't quiet, and it consumes a lot of electricity. On top of that, it dumps a lot of heat into the room. Realistically, you need a beefy case, a decent cooling setup, maybe AC at home, and a set of headphones. It requires a lot of re-engineering of your whole setup, and in return you only get more frames and maybe a resolution bump (depending on your monitor). Meanwhile, something like the RTX 3060 barely requires anything else and is very practical for most people. Cards like the RTX 3060 are like a VW Golf GTi. Everybody knows it's a basic card without many of the downsides of high-end cards, and yet it delivers good performance. The Golf GTi didn't become a good seller because it was really fast, but because it remained cheap to buy, relatively inexpensive to own, and surprisingly quick. Same with graphics cards: the RTX 3060 is the Golf GTi of graphics cards now (or at least it should have been, had prices remained as intended). And that's why every GTX xx60 sold so well: it's the most suitable card for the masses. Some people may not consider this, but high-end cards are quite bad for us in unexpected ways:

And if you are completely cynical today, then they do harm us in a very predictable and annoying way:

I don't know, maybe. I'll try my best to test it out, though, as I'm genuinely interested. :)

Here's another video to illustrate what I mean by stuttering and inconsistency:


The Cyberpunk part is an excellent demonstration, but I really mean the GTA5 and AC:V parts where the average fps is quite alright in my opinion, but the stutters (shown as frametime spikes) just make it uncomfortable for the eye.
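For what it's worth, that "okay average, uncomfortable stutters" effect is exactly what 1% lows capture. A rough sketch with a made-up frame-time trace (the function and data here are illustrative, not from any of the videos):

```python
# Quantifying "stutter despite okay average fps": compare average fps
# with the 1% low computed from frame times.
def fps_stats(frame_times_ms):
    """Return (average fps, 1% low fps) from frame times in ms."""
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)
    one_percent = worst[: max(1, len(worst) // 100)]
    low_fps = 1000 * len(one_percent) / sum(one_percent)
    return avg_fps, low_fps

# 99 smooth frames at ~16.7 ms plus one 100 ms spike: the average still
# looks fine, but the 1% low exposes the hitch.
times = [16.7] * 99 + [100.0]
avg, low = fps_stats(times)
print(round(avg), round(low))  # 57 fps average, 10 fps 1% low
```

A frametime graph shows the same thing visually: the spikes are the frames that drag the 1% low down even when the average stays high.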
Well, that's easy: the Athlon X4 950 doesn't have any L3 cache at all, and its L2 cache is tiny. Once you need cache, you stutter. On top of that, those Athlons are dual-module FX chips, and FX cores are a bit complicated. Each core isn't independent; they work more like 2C4T parts, or more precisely something in between 2C4T and 4C4T parts. So you have a typical FX being an FX, and then you chop the cache and shit happens. The Athlon 950 was one of the latest FX derivatives and had improved IPC and support for newer instructions, so the cores themselves are better than Vishera's, but due to less cache and fewer cache levels, you get mostly better performance with the occasional poopy performance. What most people also don't know is that late FX derivatives - basically anything beyond Vishera and Richland - tend to run much hotter and generally don't have nearly as much overclocking headroom either (maybe because they get much more voltage from the factory). So the Athlon X4 950 ends up being a poor CPU which also can't be improved much. The only good thing about late FX derivatives is that their undervolting potential is still quite high. I own an Athlon X4 845, which is Carrizo; at stock it ran around 1.45 volts, and I managed to bring that down to 1.275 volts and a bit below. This doesn't mean much, because despite being FX derivatives, they still have too much FX soul in them and still consume a lot of power for what they are. The Athlon 200GE consumes pretty much half of what the Athlon 950 does and delivers slightly better performance. Those FX derivatives were quite useless; the Richland APUs at least were good overclockers. You can clock those to 5 GHz on modest cooling, but you still have the cut-down cache: 90% good performance and 10% bad performance. It wasn't the first time AMD cut down the cache, either - basically all Phenom-era Athlons had no L3 cache at all.
With K8-era Athlons, depending on your luck and the PR rating, you got either more cache or more clock speed, and clock speed was far more valuable than the bigger-cache models. I don't know much about Intel, particularly in their malaise era of Pentium Ds and Celeron Ds, but Intel generally messed with caches less. And to make matters worse, AMD often remodeled caches on the FM2+ platform, so the 760K has more L2 cache than the 870K, but the 760K's cache was worse. They also messed with the L1 size. And I don't remember this too well, but AMD may or may not have cut PCIe lanes down from 16 to 8. I know for sure that the AM1 stuff only had 4 PCIe lanes, and that was part of why they sucked so badly.
 
Joined
Mar 31, 2014
Messages
1,533 (0.42/day)
Location
Grunn
System Name Indis the Fair (cursed edition)
Processor 11900k 5.1/4.9 undervolted.
Motherboard MSI Z590 Unify-X
Cooling Heatkiller VI Pro, VPP755 V.3, XSPC TX360 slim radiator, 3xA12x25, 4x Arctic P14 case fans
Memory G.Skill Ripjaws V 2x16GB 4000 16-19-19 (b-die@3600 14-14-14 1.45v)
Video Card(s) EVGA 2080 Super Hybrid (T30-120 fan)
Storage 970EVO 1TB, 660p 1TB, WD Blue 3D 1TB, Sandisk Ultra 3D 2TB
Display(s) BenQ XL2546K, Dell P2417H
Case FD Define 7
Audio Device(s) DT770 Pro, Topping A50, Focusrite Scarlett 2i2, Røde VXLR+, Modmic 5
Power Supply Seasonic 860w Platinum
Mouse Razer Viper Mini, Odin Infinity mousepad
Keyboard GMMK Fullsize v2 (Boba U4Ts)
Software Win10 x64/Win7 x64/Ubuntu
AMD on FM2+ platform often remodeled caches, so 760K has more L2 cache, than 870K, but it was worse cache on 760K. They also messed with L1 size
L1 was one of the biggest weaknesses of the Bulldozer/Piledriver cores; I don't remember off the top of my head whether they fixed it in Excavator (I know they fixed the way-too-narrow decode by just duplicating it...)
 
Joined
Aug 12, 2019
Messages
1,717 (1.00/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 arous master
Cooling corsair h150i
Memory 4x8 3200mhz corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500gb Samsung 970 Evo PLus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
maybe i should do some tests with this setup i have
4790k @ 4.4
msi z87i gaming AC
2x8gb hyperX savage 2400 @ 2400
msi gtx980ti gaming 6gb

I have a fair few games to try it on, anything you fancy me testing on? it will be at 1080p max though.


View attachment 206936
lol just a few :p
 
Joined
Jan 14, 2019
Messages
9,822 (5.11/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
R6S and AC odyssey/origins are the ubisoft ones I know off the top of my head that struggle with those parts, and BF1/5 as I mentioned are also suspect.
BF5 tested. 1080p max settings on both PCs, ray tracing enabled on the main one. I played through the tutorial mission on each system with no fps counter to influence my judgement, just pure subjective feeling. The result is a buttery smooth experience on both machines. The 3770T produced a millisecond's worth of a hitch when loading assets at the beginning of the Tiger tank driving part, but that's the only thing that reminded me that I was playing on a 10-year-old, 45-watt CPU.

Well, that's easy: the Athlon X4 950 doesn't have any L3 cache at all, and its L2 cache is tiny. Once you need cache, you stutter. On top of that, those Athlons are dual-module FX chips, and FX cores are a bit complicated. Each core isn't independent; they work more like 2C4T parts, or more precisely something in between 2C4T and 4C4T parts. So you have a typical FX being an FX, and then you chop the cache and shit happens. The Athlon 950 was one of the latest FX derivatives and had improved IPC and support for newer instructions, so the cores themselves are better than Vishera's, but due to less cache and fewer cache levels, you get mostly better performance with the occasional poopy performance. What most people also don't know is that late FX derivatives - basically anything beyond Vishera and Richland - tend to run much hotter and generally don't have nearly as much overclocking headroom either (maybe because they get much more voltage from the factory). So the Athlon X4 950 ends up being a poor CPU which also can't be improved much. The only good thing about late FX derivatives is that their undervolting potential is still quite high. I own an Athlon X4 845, which is Carrizo; at stock it ran around 1.45 volts, and I managed to bring that down to 1.275 volts and a bit below. This doesn't mean much, because despite being FX derivatives, they still have too much FX soul in them and still consume a lot of power for what they are. The Athlon 200GE consumes pretty much half of what the Athlon 950 does and delivers slightly better performance. Those FX derivatives were quite useless; the Richland APUs at least were good overclockers. You can clock those to 5 GHz on modest cooling, but you still have the cut-down cache: 90% good performance and 10% bad performance. It wasn't the first time AMD cut down the cache, either - basically all Phenom-era Athlons had no L3 cache at all.
With K8-era Athlons, depending on your luck and the PR rating, you got either more cache or more clock speed, and clock speed was far more valuable than the bigger-cache models. I don't know much about Intel, particularly in their malaise era of Pentium Ds and Celeron Ds, but Intel generally messed with caches less. And to make matters worse, AMD often remodeled caches on the FM2+ platform, so the 760K has more L2 cache than the 870K, but the 760K's cache was worse. They also messed with the L1 size. And I don't remember this too well, but AMD may or may not have cut PCIe lanes down from 16 to 8. I know for sure that the AM1 stuff only had 4 PCIe lanes, and that was part of why they sucked so badly.
You missed my point. :ohwell: I just wanted to demonstrate what I meant by stuttering at playable framerates. The fact that it happens to be a present-day review of the Athlon X4 950 is pure coincidence. It could have been any other CPU.

Because GTA 5 used to be heavy on CPUs. My own FX 6300 was only able to run it at 40-60 fps. As far as I know, no FX chip at stock can handle GTA 5 properly, except perhaps the 9370 and 9590. In the mountains it runs GTA fine, but not so well in the city, where you have lots of NPCs and lots of objects. One odd thing is that grass tends to reduce fps a lot, as it seemingly loads the CPU much more. I remember that in GTA 4, if I didn't turn off shadows (which, to be honest, looked a bit garbage anyway), the CPU was stuttering and dipping a lot, often into the 30s. Without shadows, it ran GTA 4 at nearly 60 fps all the time. Another fun fact: after GTA 5 loads up, if you kill Rockstar's Social Club task, you can gain around 5-10 fps. I didn't test that on the FX 6300, but I did on the Athlon 870K, which is 4 FX cores with less L2 cache and no L3 cache. It does break the game a little; as far as I know, the in-game stocks stop working. Anyway, if you really want GTA 5 to be actually smooth on FX stuff, you have to overclock those chips. I now run GTA 5 on an i5 10400F with VSync; it makes 60 fps feel like 120 fps on a high refresh rate monitor, and there's seemingly no input lag (I also use Radeon Anti-Lag, so maybe that's why).
How does 40-60 fps mean "not handling the game properly"? It's a totally playable value, imo.


Maybe; for me it's just a massive expense. I still remember when I had my retro rig. Initially I used an nVidia GeForce FX 5200 128MB 128-bit. It was one of the better models of that legendary potato, and it was able to run CoD 1 in DirectX 7 mode at 1024x768 at a decent framerate. It was also able to run Far Cry at lowest settings, 800x600, 16x aniso, maximum textures, at a quite stable 30 fps, and it ran GTA San Andreas okay at whatever settings I used. I later got an ATi X800 Pro, and it was an all-around better card: much faster, with more up-to-date DX compatibility. It ran games a lot better and at far more respectable settings. I quite liked that, and it more or less transformed the gaming experience. However, I later got an ATi X800 XT PE, and while it was even faster, it made me feel nothing. Sure, I got somewhat more fps and maybe a bit of an IQ bump, but it felt like money wasted, as it simply failed to do anything more than the X800 Pro. It taught me a good lesson for life: chasing the best isn't always the answer, but getting the most for your money surely is. And while that is nice, sometimes even the minimum spec can be very enjoyable (FX 5200). For these reasons, plus some others, I will probably never again buy the best card. nVidia's xx60 series is plenty, as are AMD's equivalents; sometimes even an xx50 Ti is good enough. The thing is that once you get a game running at some adequate framerate and a resolution high enough that you can see shit, any further improvements will do very little for your gaming experience.
Same here. Interestingly, I also learned the same lesson (chasing the 1% doesn't lead you anywhere) with a Radeon X800. I had a HIS X800 XT IceQ - the first card with a dual-slot cooler that exhausted hot air outside your case with barely any noise - and I absolutely loved it. Then DirectX 9.0c came along, and I decided to upgrade to a GeForce 7800 GS. Sure, it handled DX 9.0c while the X800 only did 9.0b, but 1. it wasn't faster, 2. it had a loud cooler, 3. it overheated, 4. the cooler had unique mounting points, making changing it impossible, 5. it introduced me to microstutters, which I tried to demonstrate with the above video (no, it wasn't my CPU - the Athlon 64 3000+ was plenty for gaming back then, and the X800 XT worked fine), and 6. only a handful of games needed DX 9.0c support.

And nowadays, getting a high-end card is about as much fun as getting a high-end car. You can buy an expensive sports car that is very fast and all, but what many people still don't understand is that those cars are high-maintenance affairs and awful for pretty much any daily driving. Same with the RTX 3090. It sure is fast, but it certainly runs hot, it certainly isn't quiet, and it consumes a lot of electricity. On top of that, it dumps a lot of heat into the room. Realistically, you need a beefy case, a decent cooling setup, maybe AC at home, and a set of headphones. It requires a lot of re-engineering of your whole setup, and in return you only get more frames and maybe a resolution bump (depending on your monitor). Meanwhile, something like the RTX 3060 barely requires anything else and is very practical for most people. Cards like the RTX 3060 are like a VW Golf GTi. Everybody knows it's a basic card without many of the downsides of high-end cards, and yet it delivers good performance. The Golf GTi didn't become a good seller because it was really fast, but because it remained cheap to buy, relatively inexpensive to own, and surprisingly quick. Same with graphics cards: the RTX 3060 is the Golf GTi of graphics cards now (or at least it should have been, had prices remained as intended). And that's why every GTX xx60 sold so well: it's the most suitable card for the masses.
This I agree with. As a car review I watched put it: "slow car fast is often more fun than fast car fast".
 
Joined
Jun 29, 2009
Messages
1,875 (0.35/day)
Location
Heart of Eutopia!
System Name ibuytheusedstuff
Processor 5960x
Motherboard x99 sabertooth
Cooling old socket775 cooler
Memory 32 Viper
Video Card(s) 1080ti on morpheus 1
Storage raptors+ssd
Display(s) acer 120hz
Case open bench
Audio Device(s) onb
Power Supply antec 1200 moar power
Mouse mx 518
Keyboard roccat arvo
ahhh nice game collection Mr. Gruffalo, with good old Duke Nukem 3D - love it
i only have the box - sad!
DSC01458.JPG
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Am I the only one who has kind of grown tired of Duke 3D and feels more nostalgia for the older 2D Duke games? Those, plus the Commander Keen series, Lemmings, Lost Vikings and various other stuff, were my jam as a kid. 3D games came a bit later and were also cool, but I have far fonder memories of those older ones. Heck, I have a Commander Keen tattoo, so I guess that says enough about that.
 
Joined
Jun 29, 2009
Messages
1,875 (0.35/day)
Location
Heart of Eutopia!
System Name ibuytheusedstuff
Processor 5960x
Motherboard x99 sabertooth
Cooling old socket775 cooler
Memory 32 Viper
Video Card(s) 1080ti on morpheus 1
Storage raptors+ssd
Display(s) acer 120hz
Case open bench
Audio Device(s) onb
Power Supply antec 1200 moar power
Mouse mx 518
Keyboard roccat arvo
i have a similar system to yours - maybe a little showdown?
4790k \ giga z97soc \ 2933 2x4gb \ evga 980ti sc
 
D

Deleted member 24505

Guest
i have a similar system to yours - maybe a little showdown?
4790k \ giga z97soc \ 2933 2x4gb \ evga 980ti sc

Maybe so. I should see if my ddr3 2400 will go faster maybe. Cpu is at 4.4
 
Joined
Jun 29, 2009
Messages
1,875 (0.35/day)
Location
Heart of Eutopia!
System Name ibuytheusedstuff
Processor 5960x
Motherboard x99 sabertooth
Cooling old socket775 cooler
Memory 32 Viper
Video Card(s) 1080ti on morpheus 1
Storage raptors+ssd
Display(s) acer 120hz
Case open bench
Audio Device(s) onb
Power Supply antec 1200 moar power
Mouse mx 518
Keyboard roccat arvo
i have some 2400 too but only 2x4gb- maybe i go 4x4 to match your settings.
and the 2933 is hynix mfr which is good for high clocks but not very fast against dual sided 8gb sticks so we might be even
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
You missed my point. :ohwell: I just wanted to demonstrate what I meant by stuttering at playable framerates. The fact that it happens to be a present-day review of the Athlon X4 950 is pure coincidence. It could have been any other CPU.
Hardly - those Athlons are about the last ones unfortunate enough to have no L3 cache on top of "not real" cores. I doubt there is any modern CPU that could stutter as much while still delivering a playable framerate. The lowest-end CPU that can still play something is the refreshed Pentium:

As you can see, it's not able to provide 60 fps in all titles, but it pretty much doesn't stutter.

How does 40-60 fps mean "not handling the game properly"? It's a totally playable value, imo.
It's personal to me, but the CPU should never limit the graphics card's potential. The CPU should always be able to deliver more fps, and the graphics card should always be able to deliver around 60 fps. Sure, 40-60 fps is playable, but I don't really like it.

Same here. Interestingly, I also learned the same lesson (chasing the 1% doesn't lead you anywhere) with a Radeon X800. I had a HIS X800 XT IceQ - the first card with a dual-slot cooler that exhausted hot air outside your case with barely any noise, and I absolutely loved it. Then DirextX 9.0c came along, and I decided to upgrade to a GeForce 7800 GS. Sure, it handled DX 9.0c while the X800 only did 9.0b, but 1. it wasn't faster, 2. had a loud cooler, 3. overheated, 4. the cooler had unique mounting points, making changing it impossible, 5. introduced me to microstutters, which I tried to demonstrate with the above video (no, it wasn't my CPU - the Athlon 64 3000+ was plenty for gaming back then, and the X800 XT worked fine) and 6. only a handful of games needed DX 9.0c support.
Oh well :D. Anyway, I found out that those old ATi cards were super tweakable. Depending on the model, you could unlock 4 pipelines. You could buy an ATi Silencer 4 and overclock further. Since that was quite an overkill cooling solution, you could also volt-mod them rather easily. It was possible to tweak a lot and get some really big performance gains: just unlocking the pipes alone yields a 30% improvement, and with a voltmod and overclock you could likely gain another 20-30%. That's a total of around 60%, which is a lot. The main problem with the ATi X800 cards was that the reference cooler only cooled the core well; the memory chips were not cooled, and some were partially covered by the heatsink, so they had no ventilation and were additionally heated up. In one review, the memory chip temperatures were measured in the 80s-90s, with one hotspot above 100°C.
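Those gains compound multiplicatively rather than adding, which is how +30% and +20-30% land near the ~60% total. A quick sanity check (the helper function is just illustrative):

```python
# Combine fractional speedups multiplicatively: +30% then +20-30%
# gives roughly +56% to +69% overall, not a simple 50-60% sum.
def combined_gain(*gains):
    """Combine fractional speedups, e.g. 0.30 = +30%."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

print(round(combined_gain(0.30, 0.20), 2))  # 0.56 -> +56%
print(round(combined_gain(0.30, 0.30), 2))  # 0.69 -> +69%
```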

This was the reference cooler:



This I agree with. As a car review I watched put it: "slow car fast is often more fun than fast car fast".
Unfortunately, that doesn't apply to computer hardware. Slow hardware is just slow hardware, and there's nothing nice about that.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Unfortunately, that doesn't apply to computer hardware. Slow hardware is just slow hardware, and there's nothing nice about that.
You're missing the point here though: good performance is relative and depends on many factors, including but not limited to CPU+GPU performance, game settings, render resolution, display resolution, display refresh rate, display response times, input device response times, viewing distance, various sensory and physiological factors, etc.

The point being: it can be a lot more fun pushing older/cheaper hardware to deliver a good experience than just buying something overkill. And gaming at lower settings or lower resolutions can be just as enjoyable as higher ones, depending on the context.

The metaphor doesn't transfer perfectly - there's no such thing as a GPU version of a Golf GTI, as GPUs are far simpler in their functionality than a car. But the metaphor still kind of works in representing how getting a good experience out of something affordable and accessible can be just as fun - if not more fun - than chasing the ever-elusive "best possible performance" PC, which inevitably takes focus away from the experience and onto measuring and tweaking variables.
 

Deleted member 24505

Guest
I have some 2400 too, but only 2x4 GB - maybe I'll go 4x4 to match your settings.
And the 2933 is Hynix MFR, which is good for high clocks but not very fast against dual-sided 8 GB sticks, so we might be even.

These seem to be pretty good sticks; I got a good deal on them. They're worth a fair bit more on eBay.
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
You're missing the point here though: good performance is relative and depends on many factors, including but not limited to CPU+GPU performance, game settings, render resolution, display resolution, display refresh rate, display response times, input device response times, viewing distance, various sensory and physiological factors, etc.

The point being: it can be a lot more fun pushing older/cheaper hardware to deliver a good experience than just buying something overkill. And gaming at lower settings or lower resolutions can be just as enjoyable as higher ones, depending on the context.
No shit, Sherlock.
 
Joined
Mar 31, 2014
Messages
1,533 (0.42/day)
Location
Grunn
System Name Indis the Fair (cursed edition)
Processor 11900k 5.1/4.9 undervolted.
Motherboard MSI Z590 Unify-X
Cooling Heatkiller VI Pro, VPP755 V.3, XSPC TX360 slim radiator, 3xA12x25, 4x Arctic P14 case fans
Memory G.Skill Ripjaws V 2x16GB 4000 16-19-19 (b-die@3600 14-14-14 1.45v)
Video Card(s) EVGA 2080 Super Hybrid (T30-120 fan)
Storage 970EVO 1TB, 660p 1TB, WD Blue 3D 1TB, Sandisk Ultra 3D 2TB
Display(s) BenQ XL2546K, Dell P2417H
Case FD Define 7
Audio Device(s) DT770 Pro, Topping A50, Focusrite Scarlett 2i2, Røde VXLR+, Modmic 5
Power Supply Seasonic 860w Platinum
Mouse Razer Viper Mini, Odin Infinity mousepad
Keyboard GMMK Fullsize v2 (Boba U4Ts)
Software Win10 x64/Win7 x64/Ubuntu
Tested. 1080p max settings on both PCs, ray tracing enabled on the main one. I played through the tutorial mission on each system with no fps counter to influence my judgement - just pure subjective feeling. The result is a buttery-smooth experience on both machines. The 3770T produced a millisecond's worth of a hitch when loading assets at the beginning of the Tiger tank driving part, but that's the only thing that reminded me I was playing on a 10-year-old 45-watt CPU.
MP? What kind of FPS are we talking about, btw?

Sure 40-60 fps is playable, but I don't really like that.
This. I find it extremely jarring in any game with anything remotely fast-paced to drop down into the 60s; personally, in less competitive games and SP stuff I feel relatively comfortable around 90 or so.

Even when I played CoD4 on my Core 2 Duo and 8500GT rig, I really struggled to play well at 40, and I noticeably improved my gameplay when I played ProMod, which could run at more than twice the fps due to removing the light effects.

Flight simming, on the other hand, feels fine to me under 30 fps because the movements are so smooth anyway; only if I'm doing aerobatics or something does it really get jarring.

Like I said when I first posted on this thread: it depends on the game and it is very subjective.
 
This. I find it extremely jarring in any game with anything remotely fast-paced to drop down into the 60s; personally, in less competitive games and SP stuff I feel relatively comfortable around 90 or so.

Even when I played CoD4 on my Core 2 Duo and 8500GT rig, I really struggled to play well at 40, and I noticeably improved my gameplay when I played ProMod, which could run at more than twice the fps due to removing the light effects.

Flight simming, on the other hand, feels fine to me under 30 fps because the movements are so smooth anyway; only if I'm doing aerobatics or something does it really get jarring.

Like I said when I first posted on this thread: it depends on the game and it is very subjective.
For me it pretty much has to be 60 fps. I don't differentiate between game types - all games benefit from 60 fps. I only settle for less when a game is so poorly made that no one can properly reach 60 fps, or when I try to run a very demanding game that my hardware is too weak for. But ever since I upgraded to the RX 580, I never really had any problems reaching 60 fps, unless games are broken (like GTA 4 or Victoria 2).

I mostly don't look at the fps counter, but high fps always feels good. 50 fps is still okay, while 40 just doesn't feel so good. 30 fps is unacceptable to me; it feels choppy and messes too much with my perception. For that reason I can't stand consoles. When Horizon 2 released on the X360, the console clearly couldn't keep up, so MS locked the fps to 30, made it look very low-spec, and removed some game features like tune setup. And while it was sort of playable, it felt slow - like I was reacting to delayed information and basically had to plan my moves ahead to execute them well. If something unexpected happened, I couldn't really react in time. It was a poor experience. GTA 4 on the X360 was even worse: locked to 30 fps, and it looked poor. Unfortunately, the X360 often dropped into the 20s and sometimes even the 15s, so it was really more suffering than a truly playable experience. I also tried to play TDU on the X360; it was 30 fps only and had issues loading the map at high speeds. I remember driving some Ferrari on the highway at 280 km/h, I jumped, the game got stuck for a while, and when it unstuck I fishtailed and crashed. Races were a chore too, due to how unresponsive 30 fps felt. Ever since playing on the X360, 30 fps is unplayable to me. On PC, an unlocked 30 fps just feels bad due to inconsistent frame times, and with a mouse it makes aiming hard. 40 fps is the minimum spec for me, 50 fps is fine, and 60 fps is all I want. Anything more and I don't care, because my monitor is 60 Hz. Depending on the game, 40 fps can be a no-go too: in UT 2004, anything less than a 50 fps average is bad, as it messes too much with aiming and fast reactions.
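The "feel" differences above track frame time directly - here's a tiny sketch of how long each frame stays on screen at the fps levels mentioned:

```python
# Frame time at common fps targets. The gap between 30 and 60 fps is a full
# ~16.7 ms of extra delay on every single frame, which is a big part of why
# 30 fps feels like "reacting to delayed information".
for fps in (30, 40, 50, 60, 75, 144):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
```

At 30 fps every frame lingers ~33.3 ms versus ~16.7 ms at 60 fps, before any game, driver, or display latency is added on top.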

Also, in the old days, when we had single-core chips and pipelined graphics, average fps was an almost meaningless spec. In games like UT 2004, one level may run at 80 fps while another runs at 30, and certain parts of a level may also be high-fps or low-fps affairs. There's much less variation today, thanks to multithreading, but in games like that you can't really say that a certain average fps is good.

People seem not to use it, but VSync is sometimes nice. In GTA 5 it makes frame times a lot more consistent and makes the same fps feel a lot smoother. It's somewhat impractical, as it requires the hardware to deliver 60 fps or more at all times, or you dip to 30 fps. So you need powerful hardware or lower settings to make it work properly, but when it's on, it feels like my 60 Hz peasant monitor turns into a 90 Hz one. That's really nice, and there's no noticeable input lag. Then again, I must disclose that I use Radeon Anti-Lag, which probably affects my experience a bit.
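The frame-time smoothing described there can be modeled in a few lines. A toy sketch (the render costs are made-up numbers): with VSync, presentation snaps to the next refresh tick, so uneven render times still display at even intervals.

```python
import math

REFRESH_MS = 1000 / 60                     # one 60 Hz refresh tick: ~16.7 ms
render_ms = [9.1, 12.4, 7.8, 15.2, 11.0]   # hypothetical per-frame render costs

# With VSync, each frame waits for the next refresh boundary before display
display_ms = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_ms]
print(display_ms)  # all identical: every frame lands on a 16.7 ms boundary
```

It also shows the catch the post mentions: any frame that takes longer than ~16.7 ms snaps to ~33.3 ms instead, i.e. an instant dip to 30 fps.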

As for Radeon Anti-Lag itself, I have found that it's usually barely helpful and barely noticeable. Nonetheless, certain titles benefit a lot from it; a few of those are UT 2004 and Colin McRae Rally 2005, where there's a night-and-day difference with Anti-Lag on versus off.

There's a person on the net who investigated Anti-Lag (and whatever nVidia's version is called) in depth, and the general consensus is that you can reduce latency if the GPU isn't running at 100%:

In the end, a good experience depends on many factors, and you have to test things yourself to see how it all works out for you. I just naturally tend to end up at 50-60 fps, and as I discovered recently, latency matters too.
 
I play a lot of differently paced games. Flight sims, I think, are an outlier, but stuff like city simulators, for example, is also pretty playable at low fps, since most of the scene doesn't actually move. CS:GO, on the other hand - I find any stuttering in it extremely jarring. Going from a 7700k to a 3600 was a really noticeable upgrade there, even though the only practical difference on a 144 Hz screen was that the minimums are better. I don't run that game with an fps counter, but stutters in it are very noticeable, and the same goes for when I played R6 Siege and Warzone. But these are competitive, reaction-time-based shooters, and in Siege and CS:GO I've been reasonably high rank when I played them a lot.
In UT 2004, anything less than 50 fps average is bad, as it messes too much with aiming and fast reactions.
For me, in these games, below 144 fps is where I start to experience this.
 
Joined
Jun 27, 2019
Messages
1,848 (1.05/day)
Location
Hungary
System Name I don't name my systems.
Processor i3-12100F 'power limit removed/-130mV undervolt'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75 Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
Like I said when I first posted on this thread: it depends on the game and it is very subjective.

Yup, that's why I try to avoid such topics if possible, 'cause there's like never an agreement between people when it comes to fps and what's playable/enjoyable and what's not.

I'm also on the other end of this scale; to be honest, lower fps never bothered me, and for a long time I wasn't even checking my frames - I just played games at whatever they were running at.

Currently I'm using a 75 Hz FreeSync ultrawide monitor and have my max fps locked at 74 in the AMD driver, just so I always stay in my FreeSync range (40-75 Hz), which gives me the best experience.
That, and I also feel it's a waste to render more fps, generate more heat in my system, and draw more power when it makes no difference to me.
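The cap-just-below-refresh trick above amounts to a simple range check (the 40-75 Hz window is this monitor's; `in_vrr_range` is just an illustrative helper, not a real driver API):

```python
FREESYNC_MIN, FREESYNC_MAX = 40, 75   # this monitor's variable-refresh window

def in_vrr_range(fps: float) -> bool:
    """True if the panel can match this frame rate 1:1 with its refresh."""
    return FREESYNC_MIN <= fps <= FREESYNC_MAX

print(in_vrr_range(74))   # True  - the 74 fps cap stays inside the window
print(in_vrr_range(76))   # False - uncapped spikes would leave VRR behavior
```

Capping at 74 rather than 75 leaves a little headroom so frame-rate jitter never pokes above the window's top edge.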

That being said, I DO NOT play a single competitive game anymore, nor have any interest in doing so, and for singleplayer games I'm all good as long as I have a stable 40+; it genuinely does not bother me, and I can play games and have fun that way on a daily basis.
Playing Borderlands 3, for example, has some drops down to the low 40s or so, 'cause end-game builds and stuff can get crazy with effects, even with the tweaked settings from HW Unboxed. 'mix of medium/high and ultra'
Yet that does not stop me from soloing the hardest content in the game currently and having fun with the game in general. '~800 hours played so far'
 
Last edited:
Currently I'm using a 75 Hz FreeSync ultrawide monitor and have my max fps locked at 74 in the AMD driver, just so I always stay in my FreeSync range (40-75 Hz), which gives me the best experience.
That, and I also feel it's a waste to render more fps, generate more heat in my system, and draw more power when it makes no difference to me.
Do you use Radeon Chill for that? I have tried to use it, but it makes fps very unstable in CS:GO.
 
Do you use Radeon Chill for that? I have tried to use it, but it makes fps very unstable in CS:GO.

I do, but like I said, I don't play competitive games, and so far I'm yet to run into any issues in singleplayer games.
They are nicely locked to 74 with no frame-time mess either. 'unless the game itself is a mess'

Both the Min and Max values are set to 74 under Chill, with every kind of sync disabled in-game where possible.

Most of the time I'm also using borderless windowed mode in games, since I tend to tab out a lot.
 