Saturday, March 03, 2007

The cost of distributed computing projects like SETI and Folding at Home

Wonder no more how much it costs to run SETI, Folding at Home, or the other distributed computing projects. With the help of a simple yet powerful formula we can take the mystery out of how much F@H costs you.

Let's say you run BOINC at 100% CPU usage. By looking up your PC model number, checking the manual, or looking on the back of your power supply, you should be able to find the maximum number of watts the power supply is capable of drawing.

watts * hours per day * days / 1000 * cost per kWh = $ Total Cost

kW = kilowatt; 1 kW is equal to 1,000 watts. A kilowatt-hour (kWh) is 1 kW drawn for one hour, and it's the unit the power company bills you for, which is why the formula divides by 1,000.

This computer's power supply says it draws 240 watts max, and right now a kWh costs about 15 cents.

So the cost of F@H for 22 hours a day, 350 days out of the year:

240 * 22 * 350 / 1000 * .15 = $277.20 per year (not including taxes and fees)

Or to run BOINC at 100 percent CPU usage, 24 hours a day, for this month (March):

240 * 24 * 31 / 1000 * .15 = $26.78
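The formula above is easy to drop into a few lines of Python if you want to try your own numbers. This is just a sketch of the same arithmetic from the post; the 240 W and $0.15/kWh figures are the example values used above, not universal constants.

```python
def energy_cost(watts, hours_per_day, days, cost_per_kwh):
    """Total electricity cost in dollars.

    watts * hours = watt-hours; divide by 1000 for kWh,
    then multiply by the price per kWh.
    """
    return watts * hours_per_day * days / 1000 * cost_per_kwh

# 22 hours a day, 350 days a year at 240 W and $0.15/kWh
print(round(energy_cost(240, 22, 350, 0.15), 2))  # 277.2

# 24 hours a day for March (31 days)
print(round(energy_cost(240, 24, 31, 0.15), 2))   # 26.78
```

Swap in your own power supply's wattage and your utility's rate (it's on your bill) to get a figure for your setup.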

Personally I have Folding at Home set to only use 11% of the CPU and BOINC set to use 10%. I don't want the PC working hard all the time nor do I want to hear the fan running all the time trying to cool the CPU down.

Running distributed computing programs at low settings lets me keep the energy cost down, keep the noise levels down and keep the amount of heat from the PC to a minimum.

- Please let me know if my math is wrong :)


Lithia Chevrolet said...

Interesting thought, I didn't know those distributed computing things were still going. Haven't heard about them for years.

On the power, though: if your power supply says it's a 250 watt unit, that is the max sustained power it can put out.

That means you can keep adding components until you reach that level (although maxing out your supply is not recommended). It does not mean that's what the usage will be with your current components. When everything is running at full speed you are probably drawing 1/2 to 3/4 of that.

Me said...

You're right.

However, as you are probably aware, most of these projects default to using 100% of the CPU.

Given that, I think it's OK to assume the power usage is somewhere around the max wattage the power supply can handle.

But really this is for a general idea of the top end cost (worst case scenario) of running these programs.

Anonymous said...

If you're going to have the computer on anyway, it's not fair to count the whole power usage of the computer. I've actually measured this, and on my system, there's only about a 60 watt difference between max and idle CPU.