
TPU's Nostalgic Hardware Club

Can subscribe - have two FX chips, a 4300 and an 8300 - both get crushed by a 2600K and a Phenom II X6 1055T. And that's with an R9 280 and 8GB of DDR3.
 
First time I have seen this guy with a beard o_O
I should start a YouTube retro hardware/gaming channel ... I have an awesome beard ...

not lying ...
IMG20230305223328.jpg

Swiss Viking retro hardware themed YouTube channel, anyone?

@Greenslade I'm not sure you wanna go down that path. ;)

I regretted not keeping my ASRock 970 Extreme3 and the FX 6300 I had ...
 
Is the FX series as bad as people say?
Yeah.
I had an FX 8150 (free) and later an 8300; both were pretty bleh. Neither stood a chance against my i7 920. I only used them because my X58 setup was just too fiddly and would BSOD almost every day.
 
I sidegraded my FX 6300 to an i5 4690K ... and I say sidegraded because in games the performance was pretty much the same ...

MAN, I miss my i7 920 + dual GTX 580 Matrix now ... AAAAAAAAAAAAAAAHHHHH, why did I sell all of that ... (oh, I know why ... dire times and dire need of money ... THE ROOT OF ALL EVIL!!!)
 
I still have my FX 8320 rig running stock under the Wraith air cooler these days. Its only purpose is running Folding@Home on a GTX 970.
Back when it was my primary gaming rig it ran at 4.75GHz under water cooling. I put a lot of work into modding the case & rigging extra fans to get enough cooling to the CPU, VRM & socket. My mobo is the Sabertooth 990FX R2.0.
 

Attachments: DSC01317.JPG, DSC01315.JPG, DSC01318.JPG, DSC01319.JPG
Nice! Congrats on that board. It should be fun to set up & play with.
 

There was so much misinformation about the FX; it was a chip that went wide (8 cores) rather than small and fast like Intel's designs. They released it in an era where single-core performance was still king. That's why it performed so badly in comparison to other models. But if you look at it today, it can still keep up: paired with an RX 580 it's capable of solid 60 FPS gaming.

I've had one for roughly 2 years; before that, a 1055T 6-core. I'd say the Vishera was overall faster due to its clock speed advantage and DDR3, but it needed to be tweaked. By tweaking I don't mean the average OC guide where you raise the multiplier and call it a day, nope. You dig into that platform and start working on the latencies, the FSB and everything around it.

You also make sure your cooling is up for it, and that your board is too. Many boards didn't exactly specify which VRM they had or whether it was capable of delivering 200W. That's why you see most FX overclocks "end" at around 4.5GHz on average. It's not the limit of the chip but the limit of what the board could do in terms of power.

5GHz wasn't unusual, but it ate power for breakfast at that speed. I've managed to run a 300MHz FSB with a 4.8GHz core clock and DDR3-2400, which was technically not supported (only achievable by OC). If you measured that thing against chips today it would fare well against a Ryzen 1700 or so (769+ points in CB15). The culprits are its shared resources (each pair of cores shares a module's front end and FPU) and its quite long pipeline (overcome by faster clocks).
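
To put rough numbers on that clock math, here's a minimal sketch (my own, not the exact settings from the post above): only the 300MHz reference clock, the 4.8GHz core clock and DDR3-2400 come from the post; the x16 CPU multiplier, the x8 CPU-NB/HT multipliers and the memory divider are assumed, typical values.
[CODE=python]
# Minimal sketch of how AM3+ clocks derive from the reference clock ("FSB").
# Assumed values: x16 CPU multiplier, x8 CPU-NB and HT multipliers, and the
# DDR3-1600 memory divider (MCLK = 4x ref). Only ref=300MHz, 4.8GHz core and
# DDR3-2400 are taken from the post above.

def am3plus_clocks(ref_mhz, cpu_mult=16, cpunb_mult=8, ht_mult=8, mem_div=4):
    return {
        "CPU core (MHz)":        ref_mhz * cpu_mult,     # 300 * 16 = 4800
        "CPU-NB / L3 (MHz)":     ref_mhz * cpunb_mult,   # 300 * 8  = 2400
        "HT link (MHz)":         ref_mhz * ht_mult,      # 300 * 8  = 2400
        "DDR3 effective (MT/s)": ref_mhz * mem_div * 2,  # 300 * 4 * 2 = 2400
    }

for label, value in am3plus_clocks(300).items():
    print(f"{label:>22}: {value}")
[/CODE]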

That's why the FX and Opteron kind of failed; they just didn't live up to expectations compared to the previous generation. But! If you know how and what to tweak, there's easily 40% performance on the table. And no, it doesn't consume 220W just playing games; 220W is a scenario where all cores are taxed to the max. These are great overclockers.

When I replaced it with a 2700X it was night and day. Most remarkable was the minimum FPS in games; with the 2700X that was 100% better than on the FX. The FX could be tweaked for games too: raising the CPU/NB speed, which indirectly set the L3 cache speed, greatly benefited games.

I've spent countless hours on it; 5GHz or even 5.2GHz was possible on just water, but it ate so much power that the cooling wasn't up for it. 4.8GHz is exactly where you wanted a chip like that. Back then it was a perfect cheap 8-core chip that could do better than an i7 in multithreaded things. Many people can attest to that.
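
As a rough illustration of why those last few hundred MHz "ate power for breakfast", here's a back-of-the-envelope sketch using the usual CMOS dynamic-power approximation (power roughly proportional to frequency x voltage squared). The 220W all-core figure is from the post above; the baseline clock and all the voltages are assumed, typical numbers.
[CODE=python]
# Back-of-the-envelope power scaling: dynamic power ~ frequency * voltage^2.
# Baseline: 220W all-core load (from the post), assumed at 4.5GHz / 1.40V.
# The overclocked frequency/voltage points are assumptions, not measurements.

def scaled_power(p0_w, f0_ghz, v0, f_ghz, v):
    return p0_w * (f_ghz / f0_ghz) * (v / v0) ** 2

baseline = (220.0, 4.5, 1.40)
for f, v in [(4.8, 1.45), (5.0, 1.50), (5.2, 1.55)]:
    print(f"{f} GHz @ {v:.2f} V -> ~{scaled_power(*baseline, f, v):.0f} W")
# Prints roughly 252W, 281W and 312W - which is why 4.8GHz was the sweet spot
# and 5GHz+ needed serious water cooling and a beefy VRM.
[/CODE]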

It was just good value!
 
I've been slowly upgrading a 2010 Mac Pro. Started with the base quad-core, which I replaced with an X5690. Then I found a cheap dual-CPU tray with E5620s in it. I have a second X5690 coming. Runs Windows 11 really well, despite the lack of TPM and it not being a supported CPU!
 
The X2 finally came in, and I'm having some strange issues where installing the GPU driver makes the sound disappear. What's stranger is that it happens over both the internal AC'97 jack and the external SB Audigy SE card.
 
Nice! Congrats on that board. It should be fun to set up & play with.
Thanks :) It is a shame I can't get one of the other three boards going, but there is nothing more I can do to get any of them to work :( That Core2Duo one is a pain, with the eight-pin so close to other stuff when getting the connectors out. :(

Thanks for all that info on it, Jism. :) Definitely give it a miss.
 
I have one Gigabyte 78LMT here; I think it's now with an FX 4300, a bit tuned, running W10. It took a few moments to iron out, but it worked well.
Not been online with it for a few months though; other things came along.
Dang, your post made me feel like I should try whether the FX 4300 still kicks.
 
I have a 78LMT-S2P w/ a 4300 as well. It does run snappy in Windows, but I shudder to think how badly the VRM would probably throttle the CPU down; a 3+1 phase design really has no place powering a 95W TDP chip.
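
For rough per-phase numbers, here's a quick sketch (mine, with assumed figures): only the 95W TDP and the 3 CPU power phases come from the post above; the ~1.30V core voltage and ~85% VRM efficiency are assumed, typical values.
[CODE=python]
# Estimate per-phase current and VRM heat for a 95W chip on a 3+1 phase board.
# Assumed: ~1.30V core voltage, ~85% VRM efficiency. Only the 95W TDP and the
# 3 CPU phases come from the post above.

def vrm_load(cpu_watts, vcore=1.30, cpu_phases=3, efficiency=0.85):
    output_amps = cpu_watts / vcore                   # current delivered to the CPU
    amps_per_phase = output_amps / cpu_phases         # each phase's share
    vrm_heat = cpu_watts * (1.0 / efficiency - 1.0)   # watts dissipated in the VRM
    return output_amps, amps_per_phase, vrm_heat

total, per_phase, heat = vrm_load(95)
print(f"~{total:.0f} A total, ~{per_phase:.0f} A per phase, ~{heat:.0f} W of VRM heat")
# -> ~73 A total, ~24 A per phase, plus ~17 W dissipated in a small,
#    often unheatsinked VRM - hence the throttling worries.
[/CODE]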
 
Here goes. The SATA 6 vs. 3 Gb/s thing is a slight waste, but it runs fine anyway.

gig_78lmt.jpg
 
Put a water block on the 290X and worst-case temps dropped from 94°C to ~70°C. I think the original cooler is somewhat defective; it's a Matrix Platinum card, so the stock cooler should be more than enough.
 
Infinity NF4 SLi.png

Finally got the Manchester X2 4600+ running. There are still a few audio things to fix (any audio output cuts out the second I enter Vista; XP x64 works fine), but so far it doesn't look too bad and it feels surprisingly snappy.

Specs:

- DFI Infinity nF4 SLi
- 4GB DDR400 (the maximum that will officially work and is attainable - I'm fairly sure there is unregistered DDR400, but no idea whether it would even POST on an nF4 chipset)
- Palit Geforce 8600GT Sonic+ 256MB DDR3 128bit
- 160GB WD Caviar SE
- Athlon 64 x2 4600+ Socket 939
- LC-Power LC-420H-12 PSU
 
^nice. The onboard Realtek ALC655 6-channel codec should work on Vista, IIRC.
 
It shows as working, but no audio plays over my Samsung LE32B350's speakers. I have a 3.5mm AUX patch cord and I get sound in XP but not in Vista.

I did an experiment and it turns out any audio input from the 3.5mm gets killed the moment it switches from the boot screen to the logon process. No idea why that happens - I tested an HD 4850 just to be sure, and it does not kill the onboard audio input when switching - it only happens with GeForce cards, except my PX8800GTX, which does seem to output audio on its own over the DVI-D to HDMI adapter I'm using.

Of course, standalone PC speakers work regardless of the GPU - it's only the TV's PC audio input that gets killed during Vista's logon - I know this because I specifically changed the startup and logon sound to the OG PSone startup sound, so it plays for longer.

The OSes I haven't tested yet are 7 SP1 and 10 Enterprise. While 7 might be a viable alternative to Vista SP2 (and that's a compromise I'm willing to make), there's no way in hell 10 is likely to run acceptably on it - it's more of a proof of concept than anything - even though I did run 10 Enterprise LTSC on Brisbane AM2-based units beforehand, with slightly mixed results.
 
I can't remember, but wasn't there some "audio" cable needed between the nvidia GPU and the motherboard in some older systems?
 
I think that would be SPDIF. There's a separate 2-pin header next to the SLI goldfingers, though I have zero clue what purpose it serves.

I do also have a Dell OEM breakout cable (which resembles a 7-pin S-Video connector, which seems to be present on the GPU), though it seems to only have one digital output, while my mobo features two SPDIF channels onboard.
 
Unfortunately it might be a wider issue than that - the same thing happens with an 8500GT and a Quadro FX3700 (both of which lack the SPDIF connector, as far as I remember), so there's clearly something that Vista SP2 does wrong at the very least.

I'm going to give 7 SP1 a go, just to rule out the OS and GPU drivers being the culprit.
 
I have had problems a couple of times getting the 3.5mm audio to play; fortunately it was possible to make it work via the vendor's audio control panel, but yes, I had to spend a couple of sessions on YouTube to see how others solved it.
 
Almost a perfect 939 rig. I'd just get a 4400+, 4800+ or FX-60 (a CPU with 2x 1MB L2) and an 8800 GT or an HD 3870. But it looks great already! :toast:

edit: though I'd swap that PSU ASAP. LC-crap is... crap.
 
Dang, your post made me feel like I should try whether the FX 4300 still kicks.
Go for it :)
 