
TPU's F@H Team

Yes, but can I squeeze out even more points if I pair the strongest GPUs with stronger CPUs?
On my test bench I am currently running a 1080 and a 1080 Ti side by side on a Phenom II 955, and I think that comes with a PPD penalty.
I was thinking about swapping the 1080 Ti for another plain 1080, but then I have to be sure the 1080 Ti will fit in the other PC's case.

And look out for @mstenholm: over the next couple of days he is going to make the Top 200 rank!
I've concluded that single-thread execution speed is the key factor when pairing a CPU with a folding GPU, along with, of course, enough cores available to feed and control the GPU.
I recently upgraded the CPUs in a dual-Xeon folder and saw a definite increase in folding efficiency.
I did a blog post about the transition:
 
I have a PC, basically an HP i5 business machine with a GTX 1080, getting more than double the PPD of my ex-gaming PC with a 1080 Ti (the one in my sig), on the same type of work units? :kookoo:
The 1080 Ti's core and memory clocks are running at maximum frequency, but the card isn't hitting its temperature or power limits.

On closer inspection, GPU-Z shows the memory controller load on the GTX 1080 is much higher (60% on the 1080 vs 20% on the 1080 Ti).
Shouldn't the 1080 Ti be better than a plain 1080 in absolute terms? It looks as if it just sits there, wasting cycles at high clock speeds.

The 1080 takes 3 h to finish a WU, while the 1080 Ti was on track to take 5 h.
After that WU finished, I decided to reboot the machine, because the next WU was also going to take too much time.
After the reboot, the memory controller load on the 1080 Ti went up to 50% and the TPF dropped considerably.

It's worth checking the TPF once in a while and rebooting if the GPU gets locked in an unfavourable state and isn't performing as expected.
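If anyone wants to keep an eye on this without staring at GPU-Z, here's a rough Python sketch using the nvidia-ml-py (pynvml) bindings that polls core and memory-controller load; the 30% memory-controller threshold is just my guess based on the 60%-vs-20% readings above, not anything official.

# Minimal sketch: poll core and memory-controller load with pynvml.
# The "looks stuck" heuristic (core pegged, memory controller idle) only
# illustrates the symptom described above; tune the numbers for your cards.
import time
import pynvml

pynvml.nvmlInit()
try:
    count = pynvml.nvmlDeviceGetCount()
    for _ in range(10):                                   # sample for ~5 minutes
        for i in range(count):
            h = pynvml.nvmlDeviceGetHandleByIndex(i)
            name = pynvml.nvmlDeviceGetName(h)
            if isinstance(name, bytes):                   # older pynvml returns bytes
                name = name.decode()
            util = pynvml.nvmlDeviceGetUtilizationRates(h)
            # util.gpu = core load %, util.memory = memory-controller load %
            flag = "  <- looks stuck?" if util.gpu > 90 and util.memory < 30 else ""
            print(f"GPU{i} {name}: core {util.gpu}%, memctrl {util.memory}%{flag}")
        time.sleep(30)
finally:
    pynvml.nvmlShutdown()

If the flag keeps showing up on a card that should be busy, that's roughly the "high clocks, low memory controller load" state that the reboot fixed for me.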
 
Check it out, I'm about to join the 1% :D

[attached screenshot]
 
With buying new or used GPUs now out of the picture, I am looking at replacing some systems that are still running 2011-era Athlon/Phenom processors.
I already have some lower-end Skylake i5 systems running with good results, so I am thinking about getting more of those.
 
That's a good approach. With dual CPU-attached PCIe x16 slots running at x8 you can get good performance, and it should give your single-thread execution speed a generous boost over that vintage gear.
The other approach would be eBay server-class systems and mining frames, which is what I did. Those systems can support several GPUs per frame on PCIe x16 or x8 lanes, and I now have surplus slot capacity for folding.
Edit:
I should add that I saw no advantage for PCIe 4.0 over PCIe 3.0 with the RTX 3070, in case anyone was wondering about a platform update.
The recently announced Nvidia Grace architecture strikes me as an effort to eliminate the kind of bottlenecks in current platforms that produce results like the ones I saw.
Perhaps NVLink and Grace-like GPU interconnects will be the next quantum leap in compute and gaming; PCIe 4.0 certainly wasn't it for folding.
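If you want to confirm what link a card has actually negotiated (x8 vs x16, gen 3 vs gen 4) without digging through the BIOS, a small pynvml sketch along these lines should do it; bear in mind that GPUs drop to a narrower/slower link at idle, so check while the card is folding.

# Minimal sketch: report negotiated vs. maximum PCIe link per GPU via pynvml.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(h)
        if isinstance(name, bytes):              # older pynvml returns bytes
            name = name.decode()
        cur_gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(h)
        cur_width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(h)
        max_gen = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(h)
        max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(h)
        print(f"GPU{i} {name}: running PCIe gen {cur_gen} x{cur_width} "
              f"(card max: gen {max_gen} x{max_width})")
finally:
    pynvml.nvmlShutdown()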
 
My folding experiences of late have left me... a bit puzzled, let's say. So, the suspects:

Sapphire RX 470 8G Mining Edition
XFX R9 270 DD
EVGA GT 1030 Passive
MSI R7 250
Powercolor HD 7750
Visiontek HD 7750

The 470's a friggin' champ (IMO). Sits there folding at 100% in a too-hot case, and the temp won't crack 60C. OK, fine. Let's go OC-ing. 1400@+18mV and still under 60. :confused: It doesn't seem to want to go higher without more volts, which I haven't tried yet. But this is either a killer chip, or the cooling solution is over-spec'ed, or F@H doesn't load it right. Or I'm an idiot. Can't rule that out.

Not much to say about the 270. It's underclocked and -volted to keep the temps around 70C with the fans at 40% (they don't like going faster). It's mostly a comparison point to the other three Radeons, which do NOT produce like I expected. Figured with each rocking half the shaders and ROPs of the 270, together they'd crank out ~150% of its output. Nope. Half. Combined. WTF? I yanked 'em, cuz that's just pointless.

The 1030, uh, just sits there and chugs away, knocking out just over 100K/d. Super boring, which I suppose is about what one would expect from a 1030. Semi-related, part of the reason I'd picked up the little Radeons was because the TPU performance chart had them only a few percent behind the 1030, lending support to my faulty hypothesis that I could maybe see 70-80K from each. *sigh*
 
Having another go at folding. I gave it a go when it turned cold late last year, but couldn't afford the electric on top of the heating bills. Now my costs have dropped with long days, solar panels and warmer weather, so I decided to fire up the GTX 1060. Looks like CUDA support has improved - the last couple of WUs have clocked in at around 600K PPD (it only earned 200K PPD before), so I hope I can afford the electric for a few months :)
 
Having another go at folding. I gave it a go when it turned cold late last year, but couldn't afford the electric on top of the heating bills. Now my costs have dropped with long days, solar panels and warmer weather, so I decided to fire up the GTX 1060. Looks like CUDA support has improved - the last couple of WUs have clocked in at around 600 ppd (it only earned 200 ppd before), so I hope I can afford the electric for a few months :)

600K PPD?
 
Collection server 140.163.4.210 seems not to be working. Watching my points drop significantly on WUs that can't upload when finished :(
 
I really hate that :( Did it manage to upload OK eventually, @debs3759?
 
It did eventually. Now I have another that isn't uploading, to a different server.
I hate it when that happens :( It costs you time and points!! :(
 
Oh well, two that aren't uploading at the moment. Over 200K points lost on just those two so far. At least the science benefits if they ever do upload :)
 
Everything seems to be working right now, and I decided to fold on my CPU as well, so my PPD by the weekend should be in the region of 650K to 700K. I hope to upgrade to a PCIe 4.0 system by the end of the year (probably AMD, as they give the most threads for the money) and then an RTX 4000 series GPU when they are available next year.
 
Everything seems to be working right now, and I decided to fold on my CPU as well, so my PPD by the weekend should be in the region of 650K to 700K. I hope to upgrade to a PCIe 4.0 system by the end of the year (probably AMD, as they give the most threads for the money) and then an RTX 4000 series GPU when they are available next year.
We recommend that you crunch on the CPU, that is, run World Community Grid on the CPU and F@H on the GPU.
 
Dammit. A WU worth 220K was at 99.98% when I opened Photoshop. It crashed the core and caused the WU to fail :( No bloody idea why that happened!

We recommend that you crunch on the CPU, that is, run World Community Grid on the CPU and F@H on the GPU.
Why?
 
HOW DARE YOU QUESTION THE BIG BLUE FUZZY!? Okay, just kidding. The biggest reason is just how nicely, in comparison to F@H, BOINC gets along with what else you're doing. I think that F@H is a prima donna, while BOINC/WCG is quite the gentleman. I also don't think F@H is that efficient running on the CPU in comparison to WCG.
 
We recommend that you crunch on the CPU, that is, run World Community Grid on the CPU and F@H on the GPU.

All of F@H's work is MUCH more efficiently run on GPUs, while most of the WUs on WCG will only run on a CPU. Put more simply, your CPU time is more valuable to WCG than to F@H.
 
HOW DARE YOU QUESTION THE BIG BLUE FUZZY!? Okay, just kidding. The biggest reason is just how nicely, in comparison to F@H, BOINC gets along with what else you're doing. I think that F@H is a prima donna, while BOINC/WCG is quite the gentleman. I also don't think F@H is that efficient running on the CPU in comparison to WCG.

All of F@H's work is MUCH more efficiently run on GPUs, while most of the WUs on WCG will only run on a CPU. Put more simply, your CPU time is more valuable to WCG than to F@H.

Thanks. I'll think about it. Another good thing I see is that I can opt to work on just COVID research.
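For what it's worth, here's one rough way that split could be scripted rather than clicked through: talk to the local FAHClient command socket and pause the CPU slot so BOINC/WCG gets the cores. The port (36330), the greeting banner, and the command names ("slot-info", "pause <slot>") are my assumptions about the client's local telnet interface, so verify them against your own install first.

# Rough sketch, not a recipe: pause F@H's CPU slot via the local FAHClient
# command socket so the CPU stays free for World Community Grid.
# Assumes the default local port 36330 and that localhost is allowed to connect.
import socket

HOST, PORT = "127.0.0.1", 36330

def send(sock, cmd):
    sock.sendall((cmd + "\n").encode())

with socket.create_connection((HOST, PORT), timeout=5) as s:
    s.recv(4096)                      # discard the greeting banner
    send(s, "slot-info")              # list slots; note the id of the CPU slot
    print(s.recv(65536).decode(errors="replace"))
    send(s, "pause 00")               # assumed: "00" is the CPU slot on this box
    send(s, "exit")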
 
I'm going to shut down for a couple of days due to a pending heatwave here in the SF Bay Area. Our temps will only get into the 80s (F), which, compared to the rest of the US and indeed the world, doesn't seem like much. But in a place where summer temps rarely break out of the 60s (F), it's downright blazing. Only the most modern of dwellings have A/C, and I live in what passes for a downright ancient apartment here in California, therefore no A/C. I don't like sweating while I'm just watching TV. Yeah, I'm spoiled, can't help it. I was born and raised here. :)
 
I'm going to shut down for a couple of days due to a pending heatwave here in the SF Bay Area. Our temps will only get into the 80s (F), which, compared to the rest of the US and indeed the world, doesn't seem like much. But in a place where summer temps rarely break out of the 60s (F), it's downright blazing. Only the most modern of dwellings have A/C, and I live in what passes for a downright ancient apartment here in California, therefore no A/C. I don't like sweating while I'm just watching TV. Yeah, I'm spoiled, can't help it. I was born and raised here. :)

Understandable. Heading for 115F/46C here (a few hours south of you) on Friday. Hoping the grid can keep up.
 
I'm going to shut down for a couple of days due to a pending heatwave here in the SF Bay Area. Our temps will only get into the 80s (F), which, compared to the rest of the US and indeed the world, doesn't seem like much. But in a place where summer temps rarely break out of the 60s (F), it's downright blazing. Only the most modern of dwellings have A/C, and I live in what passes for a downright ancient apartment here in California, therefore no A/C. I don't like sweating while I'm just watching TV. Yeah, I'm spoiled, can't help it. I was born and raised here. :)
Ahh, Oakland in mid-June. I was there and in Alameda last June 19th. I recommend finding a friend with a boat... the temps out on the bay are quite a bit cooler.
 
Looks like I'm going to have to dig out my watercooling kit and fire up the chiller. I've had 3 failed WUs on my GTX 1060 in the last 48 hours, with the GPU only at 76C and not overclocked. Just fired up Afterburner to set the fan at 100%, which dropped the temp to 69C. Hopefully I can find a waterblock for the 1060 before the next heatwave and keep temps below 40C.

Damn, another one failed. Looks like my GPU might be failing :(
 