Discussion in 'World Community Grid (WCG)' started by Chicken Patty, Feb 20, 2009.
CPUs only, although the GPUs also ran a hair longer - around 16 minutes.
QX9770 - 4 WUs in parallel [4 cores] in 9 hours each: 4 / 9 = 0.44 of a WU completed per hour
FX-8350 - 8 WUs in parallel [8 cores] in 13 hours each: 8 / 13 = 0.62 of a WU completed per hour
8350 wins by doing more work overall
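For anyone who wants to run the same comparison on their own rigs, the numbers above boil down to cores divided by hours per WU; a quick sketch using the figures from this post:

```python
# Throughput comparison: WUs completed per hour = cores / hours per WU
def throughput(cores, hours_per_wu):
    return cores / hours_per_wu

qx9770 = throughput(4, 9)    # 4 WUs in parallel, ~9 h each
fx8350 = throughput(8, 13)   # 8 WUs in parallel, ~13 h each

print(f"QX9770:  {qx9770:.2f} WU/h")   # ~0.44
print(f"FX-8350: {fx8350:.2f} WU/h")   # ~0.62
```

Per-WU the QX9770 is faster, but the extra cores win on total output.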
I have problems with FAAH on my FX-8320, too. They sometimes hang and "jerk in place" for hours. I even had to abort a couple of its WUs.
Mine aren't hanging.
They keep on going - 1/1000 of a percentage point at a time.
What do you mean by "jerk in place" though?
It sounds normal to me, since different WCG projects take different runtimes to complete.
If you run only HCC (CPU or GPU), the WUs are all roughly the same size, so you get very good estimates for completion times. Other projects can have both long and short runtimes (perhaps what you are seeing with FAAH?), and when you mix them all up it sort of ends up with a big soup of estimates. Check back in a day or so; the estimates will probably get better with time.
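A toy sketch (not BOINC's actual algorithm, just an illustration) of why the estimates settle after a few completed results: a running average of observed runtimes converges even when the starting guess is way off.

```python
# Toy illustration (assumed weights, not BOINC's real estimator):
# blend the previous estimate with each newly observed runtime.
def update_estimate(estimate, observed, weight=0.5):
    return (1 - weight) * estimate + weight * observed

estimate = 2.0  # initial guess: 2 hours
for observed in [10, 11, 10, 12, 11]:  # example long-runtime WUs (hours)
    estimate = update_estimate(estimate, observed)

print(f"estimate after 5 results: {estimate:.1f} h")  # 10.8 h
```

After a handful of results the estimate sits near the true runtime, which is why checking back in a day or so usually shows much saner completion times.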
The discrepancy of 10 to 13 hours for CEP WUs makes me think there's nothing wrong. Try comparing your average PPD over a few days with someone else on the team running a similar setup; that should be the best indicator that it's running fine.
CEP2 WUs generally take between ten and twelve hours to complete--they're about twelve hours on my Atom and Opty 4P but not that much shorter even on an OCed i7. I'm not sure why they behave like this.
With FAAH it's very different--thirty hours on the Opty 4P or maybe five/six on the 3930k.
I am going to be selling one of my VisionTek HD 7970's this weekend and wanted to give crunchers heads-up before I put it on a FS thread. PM me if interested. No discussion in this thread though please.
Correct your app_config.xml file.
Currently you have:
It should be:
max_concurrent indicates the maximum number of WUs (GPU and CPU combined). BOINC will always give GPU WUs higher priority.
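For reference, a typical app_config.xml using max_concurrent looks something like this (the app name and values here are placeholders, not your actual config; use the short app name from your client's event log):

```xml
<app_config>
  <app>
    <name>hcc1</name>                     <!-- placeholder short name -->
    <max_concurrent>4</max_concurrent>    <!-- cap on WUs running at once -->
    <gpu_versions>
      <gpu_usage>1.0</gpu_usage>          <!-- GPUs per WU -->
      <cpu_usage>0.5</cpu_usage>          <!-- CPU fraction reserved per GPU WU -->
    </gpu_versions>
  </app>
</app_config>
```

The file goes in the project's directory under the BOINC data folder, and the client needs a "read config files" (or restart) to pick it up.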
EDIT: NVM, I see you already corrected the situation.
I just brought an E8500 online--it's running on a Gigabyte EP45-UD3R under the Thermalcrap Pisswater that I was posting about earlier. With cable ties holding down the waterblock, temps aren't too bad--mid-40s C at stock speeds. I know this board is legendary for its ability to OC, so I'm going to try for 4 GHz this weekend.
That's epic [Ion] awesomeness right there.
Cable ties? I thought you were using 3 HDDs to hold it down?
When crunching, my E8400 stays in the low 40s on air, with a 120mm fan at <900 RPM (the heatsink is an Asus Triton 75). Although honesty demands I state that it's pretty cool here in the room, 16-19 Celsius (I don't feel like putting on the heating because I like the room temperature well enough).
But I seem to forget that your room temps are way higher with all those systems running, so the temps you mention are pretty good.
This is pretty much the original WC setup--like from S939 days. I feel like it will be more efficient if I can prop it up on something--as is, there isn't much space for the hot air to get out of the radiator.
It's certainly not 16-19C in here--I'd guess just shy of 80F or so.
And t_ski and manofthem, I did have HDDs holding the block in place, but that didn't seem very stable to me--now I'm using four huge cable ties, which hold it in place pretty well. It can still rotate a bit, but not too bad IMO.
lol, 4GiB of RAM was way more than enough for my second cruncher.
With 8 WUs running, the total memory usage of the whole system doesn't go past 800MiB.
Can't wait to see how many points that second cruncher will net me over a full Free-DC day.
Picked up another high-RPM PWM fan (BitFenix Spectre 120) for a push/pull config on the Hyper 212 Evo. Also grabbed a Philips 21.5" IPS LED http://www.philips.ca/c/pc-monitor/e-line-21.5-inch-54.6-cm-227e3qphsu_27/prd/en/ for the 24/7 cruncher (i5 2400/HD 7770).
** EDIT- Made it!
Would it be worth strapping a 2nd 80mm on my Hyper 101i? I'm kind of anal about things and I love the way the 212 EVO cools with 2 fans. Just wondering if it's worth it on a 101i.
Worth a shot if you already have the fan or can get one cheap... the 1-fan/2-fan setup choice is often trial and error.
It appears that Free-DC is evolving with new graphs and layout tweaks. RIP the old style pie charts.
Ahh, so it would seem.
Do you guys have any idea how much longer the GPU WUs are expected to last? I figure only a few days if that...
What's the most efficient crunching card? Cost, performance, power draw ratio wise?
Most likely would be the 7770 ($110 / 85 W / ~35k PPD) or the 7850 ($175 / 130 W / ~50k PPD). Cost/power/PPD scaling is nearly linear, so you can find a card within your budget.
However, the current gpu project is nearly finished AFAIK
*** EDIT- based on the WCG post announcing appr. 25 days left on 4/2/2013, this may be the last week of the project ***
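Plugging the figures quoted above into a quick efficiency comparison (prices, wattages, and PPD are the rough numbers from this post, not measurements):

```python
# Rough efficiency comparison from the quoted figures
cards = {
    "HD 7770": {"cost": 110, "watts": 85,  "ppd": 35_000},
    "HD 7850": {"cost": 175, "watts": 130, "ppd": 50_000},
}

for name, c in cards.items():
    print(f"{name}: {c['ppd'] / c['cost']:.0f} PPD/$, "
          f"{c['ppd'] / c['watts']:.0f} PPD/W")
```

The 7770 comes out slightly ahead on both PPD per dollar and PPD per watt, but the ratios are close enough that "pick the card that fits your budget" holds up.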
I've now crunched over a million work units
Not bad. I'm aiming for 1M for HCC1 alone, but I'm 55,691 short with around 6,500/day, so it will be close.
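For what it's worth, the gap works out like this (using the figures in the post):

```python
# Days left to reach the 1M HCC1 target at the current rate
remaining = 55_691
per_day = 6_500
print(f"~{remaining / per_day:.1f} days to go")  # about 8.6 days
```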
BTW, I'm shutting rigs down because they have now run for 60 days without any water being added to the loop. They will be back online when I return to them in x days from now.