efficiency and power consumption of computers
July 7, 2005 10:30 AM Subscribe
I have two computers, each with something like 350W power supplies. By keeping them on 24/7, am I consuming the equivalent of seven 100-watt light bulbs around the clock? What can I do (either software or hardware) to make my machines more power-efficient?
Also - while I'm on the subject - does an incandescent bulb consume less when it's on a dimmer and that dimmer is turned down?
I'm an electricity moron, obviously.
Your computers will not consume 350W each. They only consume what the components inside require; the 350W rating just tells you the upper limit for that particular power supply. The best way to reduce power usage is to set your hard drives to turn off after a set number of minutes. If you have a CRT monitor, that is probably the most important thing to turn off when you are not using your computer.
start -> settings -> control panel -> power options.
posted by Sonic_Molson at 10:42 AM on July 7, 2005
i think that's the max output rating - the most power the supply can deliver to your computer. it's not what the supply actually draws from the wall (it's not 100% efficient, so it draws a bit extra) even at maximum load. on the other hand, i doubt that the computer really draws 350 watts all the time (i would guess it's highest at startup).
so the answer is that "it depends", but probably they're using less. how much less? no idea. a bit of googling turns up this rant which seems to suggest 200 watts for a computer and 150 for a crt monitor (although if it's a modern monitor that will only be during use - and when a computer is unused it'll drop too).
so maybe 3 or 4 light bulbs on average?
as for dimmers - a modern dimmer works by switching the light on and off at 100Hz (i believe it delays turning back on after each zero crossing of the ac signal). so in theory yes, it's using less power. but it's doing so in a way that is probably not that friendly to the supply system - someone smarter than me would need to say whether the "saved" energy could be actually used by something else, or whether it's dissipated as heat somewhere.
posted by andrew cooke at 10:43 AM on July 7, 2005
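A minimal numeric sketch of the phase-cut idea described above, assuming a purely resistive load like an incandescent bulb: the dimmer holds off for a "firing angle" after each zero crossing, and the fraction of full power delivered is just the integral of sin^2 over what is left of the half-cycle. The firing angles and the 100 W bulb below are illustrative assumptions, not figures from this thread.

    import math

    # phase-cut dimming on a resistive (incandescent) load: the triac stays off
    # for a "firing angle" after each zero crossing, so only the tail of each
    # half-cycle reaches the bulb. illustrative numbers only.
    def dimmed_power_fraction(firing_angle_deg):
        """Fraction of full power delivered for a firing angle of 0-180 degrees."""
        a = math.radians(firing_angle_deg)
        # integral of sin^2 from a to pi, divided by the full half-cycle integral
        return (math.pi - a + math.sin(2 * a) / 2) / math.pi

    for angle in (0, 45, 90, 135):
        watts = 100 * dimmed_power_fraction(angle)  # hypothetical 100 W bulb
        print(f"firing angle {angle:3d} deg -> about {watts:3.0f} W delivered")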
Modern switching PSUs should be 70% - 80% efficient. Here's a table of the power that average PC components use. Those, you should know, are MAXIMUM numbers. Just as your stove could draw a maximum of over 6000 watts (probably -- I'm not an electrician), it normally doesn't, since you don't leave the broiler, the oven, all 4 burners, and a kettle plugged into each of its built-in outlets running at the same time. But it has to support that, lest someone do something silly like that.
I would think that 100 watts would be a reasonable average for a "normal" PC to use.
so in theory yes, it's using less power. but it's doing so in a way that is probably not that friendly to the supply system - someone smarter than me would need to say whether the "saved" energy could be actually used by something else, or whether it's dissipated as heat somewhere.
What would happen if you hooked a dimmer to a switching PSU, apart from probably damaging it, is that the PSU would draw power from its capacitors during the "outages" and replenish them when the power was available again. So, during the time the power is available, the load is MUCH higher, since it's recharging the capacitors *AND* powering the computer. It would average out to about the same (likely more with the dimmer, since the dimmer itself is invariably somewhat inefficient).
posted by shepd at 11:09 AM on July 7, 2005
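To put rough numbers on the efficiency point above: what a PC pulls from the wall is the DC load divided by the supply's efficiency, which is still nowhere near the number on the label. The 100 W load and 75% efficiency below are assumptions for illustration.

    # rated output vs. a plausible actual wall draw for a desktop PC.
    # the load and efficiency figures are assumptions for illustration.
    RATED_OUTPUT_W = 350     # what the power supply *can* deliver
    TYPICAL_DC_LOAD_W = 100  # a guess at an average load for a desktop
    EFFICIENCY = 0.75        # mid-range for a switching PSU (70-80%)

    wall_draw_w = TYPICAL_DC_LOAD_W / EFFICIENCY
    print(f"wall draw at a {TYPICAL_DC_LOAD_W} W load: about {wall_draw_w:.0f} W")
    print(f"that's about {wall_draw_w / 100:.1f} hundred-watt bulbs per machine,")
    print(f"not the {RATED_OUTPUT_W / 100:.1f} the label might suggest")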
That 350 watt figure is not only the maximum rating, but also very likely to be a lie in any case, as buyers want to see a big number regardless of their actual power needs. A modern PC rarely draws even 200 watts unless running at full tilt. An older PC will draw considerably less. A system with working power management features will draw quite a bit less over time.
The largest consumer of electricity in a typical PC setup is a CRT monitor. Run that as little as possible, or consider replacing it with an LCD.
posted by majick at 11:18 AM on July 7, 2005
Computer power supplies are somewhere between 60% and 80% efficient depending on manufacturer and the relative load (percentage of full load). Here is a tomshardware page on power supply efficiency.
There are a lot of factors affecting the dimmer circuit; someone else can probably explain it better than I can. Anyway, the details don't really matter. In my experience dimmer switches only ever get 'slightly warm', so the saved energy isn't being burned up in the dimmer itself; really, all you have to do is compare how much cooler the bulb runs when it's dimmed.
posted by Chuckles at 11:18 AM on July 7, 2005
1 - i wasn't thinking you'd use the dimmer with the computer - i was just answering the second part of the question.
2 - i wasn't expecting the heat to be dissipated in the dimmer (the switching process i described is used rather than a rheostat precisely because it doesn't waste heat directly). rather it might be dissipated in all connected devices (and some radiated as rf noise, i guess).
posted by andrew cooke at 11:21 AM on July 7, 2005
I read that the average PC consumes the equivalent of one 100-watt lightbulb. I did the math for myself and worked out about $5/month per PC based on my current electricity rates. This is assuming monitors are off.
posted by knave at 11:32 AM on July 7, 2005
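For anyone who wants to redo that estimate with their own numbers, the arithmetic is just watts times hours times the utility rate; the 7 cents/kWh below is an assumed rate, not necessarily the one knave used.

    # monthly cost of leaving one PC on 24/7, monitor off.
    # the electricity rate is an assumption; substitute your own.
    AVG_DRAW_W = 100          # roughly one 100-watt bulb
    HOURS_PER_MONTH = 24 * 30
    RATE_PER_KWH = 0.07       # assumed $/kWh

    kwh = AVG_DRAW_W * HOURS_PER_MONTH / 1000
    print(f"{kwh:.0f} kWh per month -> about ${kwh * RATE_PER_KWH:.2f} per PC")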
shepd, that dimmer on a computer power supply thing is pretty funny!
As for what would happen... The rectifier and bulk capacitors which make up the first stage in the power supply wouldn't even notice the 'outage' unless it lasted more than 90 deg of the voltage waveform. This is because the voltage on the bulk capacitors doesn't really change much over a cycle, so the conduction angle of the rectifier is very short, even under full load it should be less than 10 degrees I guess.
Err, to put this a different way, as long as the dimmer doesn't cut the peak of the voltage waveform the power supply won't care. Once the peak voltage is cut significantly it will probably quit working altogether (like a brown out condition).
posted by Chuckles at 11:32 AM on July 7, 2005
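Here's a rough numerical sketch of that rectifier-plus-bulk-capacitor behaviour: the capacitor holds the voltage up between peaks, so the diodes only conduct for a short slice near the top of each half-cycle. The line voltage, capacitance, and load below are assumed values, so the exact angle it prints is only illustrative.

    import math

    # bulk capacitor across a full-wave rectifier feeding a constant-power load.
    # all component values are illustrative assumptions, not from the thread.
    V_PEAK = 170.0    # peak of a nominal 120 V RMS line
    FREQ = 60.0       # line frequency, Hz
    C_BULK = 1000e-6  # bulk capacitance, F
    P_LOAD = 100.0    # load drawn from the capacitor, W

    DT = 1e-6
    PERIOD = 1.0 / FREQ

    v_cap = V_PEAK
    conduction_s = 0.0
    t = 0.0
    while t < 3 * PERIOD:                # run a few cycles so things settle
        v_line = abs(V_PEAK * math.sin(2 * math.pi * FREQ * t))
        if v_line >= v_cap:
            v_cap = v_line               # diodes on: capacitor tracks the line
            if t >= 2 * PERIOD:          # only count the last, settled cycle
                conduction_s += DT
        else:
            v_cap -= (P_LOAD / v_cap) / C_BULK * DT  # diodes off: cap feeds load
        t += DT

    # two conduction intervals per full cycle, so halve before converting
    angle_deg = 180.0 * (conduction_s / 2) / (PERIOD / 2)
    print(f"diodes conduct for roughly {angle_deg:.0f} degrees of each half-cycle")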
Since we're talking about dimmers anyway, does anyone know why you wouldn't/couldn't simply use a transformer (autotransformer?) for that? Seems a much simpler and more elegant solution, to my electronics-ignorant self.
posted by kickingtheground at 11:33 AM on July 7, 2005
Money. A 100W transformer without adjustment ability is worth a few dollars in manufacturing costs. Add the adjustment feature and you will at least double the cost, so maybe $5. It is also heavy, making packaging and shipping expensive too.
The two triacs cost less than a dollar, possibly less than $0.50...
posted by Chuckles at 11:43 AM on July 7, 2005
If you are really concerned with power usage and you have AMD chips, you might be able to easily modify them so your OS thinks you have laptop chips. You could then enable something like speed-step through your power management settings. Basically, you'd be underclocking the CPU for less power usage when not a lot of processing is needed.
kickingtheground: the typical dimmer uses a triac, a potentiometer, capacitor(s), and maybe resistors. These are a LOT cheaper than an autotransformer. Ballpark might be 5% of the cost of an autotransformer.
posted by 6550 at 11:45 AM on July 7, 2005
If you are really concerned with power usage and you have AMD chips, you might be able to easily modify them so your OS thinks you have laptop chips. You could then enable something like speed-step through your power management settings. Basically, you'd be underclocking the CPU for less power usage when not a lot of processing is needed.
on my AMD motherboard, it's simply an option in the BIOS--no modification necessary. it's a good idea though.
posted by Ziggy Zaga at 3:36 PM on July 7, 2005
MSI boards typically come with CoreCell (I think), which is a program for overclocking, and can be used to underclock, as well.
posted by angry modem at 3:46 PM on July 7, 2005
I should reiterate that you can do so while running Windows.
posted by angry modem at 3:47 PM on July 7, 2005
A friend of mine has an Athlon64 'Laptop' (Boat Anchor). Normally the battery would last 55 minutes. Underclocking it to just 90% of max increases the battery life to 2.5 hours. Underclocking to 50% gets him 5 hours or more.
Still works great for playing DVDs when dramatically underclocked, and sounds much less like a hyena screaming through a jet turbine.
posted by blasdelf at 12:58 AM on July 8, 2005
A friend of mine has an Athlon64 'Laptop' (Boat Anchor). Normally the battery would last 55 minutes. Underclocking it to just 90% of max increases the battery life to 2.5 hours. Underclocking to 50% gets him 5 hours or more.
Does this not make sense to anyone else? Assuming the same processor and duty cycle, wouldn't reducing the processor voltage be the only way to reduce power consumption? And while you can do this to an underclocked processor, there is a limited range of only a few percent. Even if by "underclocking" you mean "reducing the duty cycle", and the processor was the only thing using power on the laptop, a 50% reduction from a 55 minute battery would still only be 110 minutes. There must be something else at work here. Maybe that hyena fan really sucks the watts.
posted by trevyn at 2:22 AM on July 8, 2005
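One piece that helps square this: the usual rule of thumb is that a CPU's dynamic power scales roughly with frequency times voltage squared, so lowering the clock normally lets the voltage come down with it, and the two effects multiply (and a cooler chip also means the fan runs less). The scaling factors below are made-up illustrations, not the actual laptop's settings.

    # rule-of-thumb dynamic CPU power: P ~ C * V^2 * f.
    # the frequency and voltage scale factors are made up for illustration.
    def relative_power(freq_scale, volt_scale):
        return freq_scale * volt_scale ** 2

    print(f"full speed:               {relative_power(1.0, 1.0):.0%}")
    print(f"half clock, same voltage: {relative_power(0.5, 1.0):.0%}")
    print(f"half clock, voltage -20%: {relative_power(0.5, 0.8):.0%}")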
This thread is closed to new comments.
posted by Jairus at 10:40 AM on July 7, 2005