
> It motivated me to put everything on a timer that only turns power on in the evenings, since that's the only time they get used.

I was surprised to learn that a timer itself also uses power. I borrowed a Kill-a-watt from the library and found that a two-decade-old timer draws 2.3 W while a newer one draws 0.6 W. That tells me I should just keep the old timer, since it only gets used on rare occasions.



I suppose you should consider the cost of the new timer vs the cost of the electricity at that point.

Say a new timer is $20 and you're paying the US national average for electricity of $0.165/kWh. The new one draws 0.6 W and the old one 2.3 W, a difference of 1.7 W.

Doing the math: $20 / $0.165 per kWh = 121.2 kWh, or 121,212 Wh / 1.7 W = 71,301 hours, and 71,301 / 8,760 hours in a year (ignoring leap years) = ~8.1 years for the device to pay for itself in savings.
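As a sanity check, here's a minimal sketch of that payback calculation in Python. The function name and the $20 price are just the figures from this thread, nothing standardized:

  def payback_years(device_cost_usd: float,
                    rate_usd_per_kwh: float,
                    watts_saved: float) -> float:
      """Years of continuous standby savings needed to recoup the purchase price."""
      kwh_bought = device_cost_usd / rate_usd_per_kwh   # kWh the purchase price buys
      hours = (kwh_bought * 1000) / watts_saved         # hours of savings to break even
      return hours / 8760                               # hours in a (non-leap) year

  # 0.6 W new timer vs 2.3 W old timer: 1.7 W saved
  print(payback_years(20, 0.165, 1.7))   # ~8.1 years

  # Hypothetical perfectly efficient (0 W standby) timer: 2.3 W saved
  print(payback_years(20, 0.165, 2.3))   # ~6.0 years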

If you're worried about the environment, then it's wasteful to take a moderately efficient system and upgrade it. If you're worried about cost, then you're not saving that much money. If you're worried about overall value for your effort, there are better things to focus on.

Even if the new timer were perfectly efficient (0 W standby), it would still take about 6 years to pay for itself at $20.




