On Mon, Oct 10, 2016 at 01:07:54PM -0700, Joe Zeff wrote:
> On 10/10/2016 12:49 PM, Patrick O'Callaghan wrote:
> > Well, apparently the savings might not offset the price of the ammeter, but I suppose the advancement of knowledge always has a cost :-)
>
> I remember back in the late '80s seeing a study that found that the cost per hour of running a desktop (a 386 with a CRT monitor) was about $0.04/hr. Adjust that for inflation and you'll get a fairly good first approximation of the cost now, because while modern monitors use less power, today's more powerful computers use more, balancing things out to some extent.
I've missed the beginning of this thread, so I beg forgiveness should this response be off-base.
To measure the power my computer uses, I use a Kill-A-Watt meter; at last notice they were available from Amazon. The one I have cost something like 20 dollars a few years ago. It continuously displays the wattage being drawn, and it can report several other measurements as well (the meter isn't where I am right now, so I won't go out on a limb with a list that may be wrong).
You plug it into the wall outlet, then plug the computer into it, and press the appropriate button on the front to select the measurement you want.
It reports that my computer draws around 117 watts (plus or minus a couple) when running Folding@home on both a six-core AMD CPU and an Nvidia 750 Ti GPU. Without the FAH client it would probably be around half that (I haven't made that measurement).
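If you want to turn a meter reading like that into a running cost, the arithmetic is just kilowatts times your electricity rate. Here's a minimal sketch in Python; the $0.12/kWh rate is an assumption on my part (check your own utility bill), and the wattage is whatever your meter shows:

    # Rough running cost from a measured wattage.
    watts = 117.0        # reading from the Kill-A-Watt
    rate = 0.12          # assumed electricity rate in $/kWh
    cost_per_hour = watts / 1000.0 * rate
    cost_per_day = cost_per_hour * 24
    print(f"${cost_per_hour:.3f}/hr, ${cost_per_day:.2f}/day running 24/7")
    # -> $0.014/hr, $0.34/day

At that assumed rate, the FAH load costs about a third of a dollar a day; halve the wattage for an idle machine and the cost halves with it.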