Re: how much electricity does it take?


Pinky <pinky14@...>
 

Hi Jim,

Here is an article that will answer your question.

Tested: Should You Unplug Chargers When You’re Not Using Them?

by Chris Hoffman on October 21st, 2015


Energy Savings From Turning Off Electrical Appliances

How much energy do your smartphone, laptop, and tablet chargers really use? Should you unplug them when you aren’t using them to save power and money?

We measured exactly how much power a variety of common chargers use, and how much keeping them plugged in will cost you each year.

You’ve probably heard of “vampire power” — the amount of power a device uses in standby mode when you aren’t using it. But just how much vampire power does a charger use, and is it worth the hassle of unplugging them when you aren’t using them?

How We Measured It — and How You Can Measure, Too

The How-To Geek Guide to Measuring Your Energy Use

We used a Kill A Watt electricity usage meter to measure the power usage of a variety of popular chargers. You can measure the electricity usage of your own devices and appliances if you buy such a meter, too. They’re currently under $24 on Amazon.

Plug the meter into an electrical socket and plug another device into the meter. The meter will sit between the two and tell you how much energy the device is using. This is very useful if you want to measure your energy use, allowing you to identify power-hungry appliances and devices that should be replaced or adjusted. Look up the rate your electricity company charges you and you’ll be able to figure out exactly how much that electricity will cost you, too.
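The rate lookup the paragraph above describes is simple enough to sketch in a few lines of Python. The wattage and rate in this example are made-up illustrations, not measurements from this test:

```python
# Minimal sketch: convert a plug-in meter's wattage reading into an
# estimated yearly cost. The example numbers are hypothetical, not
# taken from the article's measurements.

HOURS_PER_YEAR = 24 * 365  # 8760 hours in a year

def annual_cost_cents(watts, cents_per_kwh):
    """Yearly cost, in cents, of a device drawing `watts` continuously."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000.0  # watt-hours -> kWh
    return kwh_per_year * cents_per_kwh

# e.g. an appliance idling at a steady 5 W, billed at 12.98 cents/kWh:
print(round(annual_cost_cents(5, 12.98), 2))  # ~568.52 cents, about $5.68
```

Multiply the meter’s reading by the hours it stays plugged in, divide by 1000 to get kilowatt-hours, then multiply by your utility’s rate.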

So, with a meter in hand and a variety of chargers lying around, we got to work and tested them so you wouldn’t have to.

 

How Much Vampire Power Does a Charger Use?

Plugging in a variety of chargers — everything from iPhone, iPad, and MacBook chargers to Android phone and tablet chargers to Windows laptop chargers to Chromebook chargers to Nintendo’s 3DS charger — it was immediately obvious there was a problem with the very idea of our test. Having heard about the evils of vampire power and the need to unplug devices when we’re not using them, we were surprised to see that not a single charger used a detectable amount of vampire power when it was plugged into an outlet.

In other words, the meter’s display read a big 0.0 watts, no matter what charger we plugged into it.

[Photo: a charger plugged into the meter, which reads no detectable vampire power]

But Surely They’re Drawing Some Power!

It’s not entirely accurate to say that each charger was using 0 watts, of course. Each charger is using some fraction of a watt. And it should certainly be detectable at some point!

With that in mind, we had a new idea — plug a power strip into the meter and plug a variety of chargers into the power strip. We’ll see just how many chargers it takes for the meter to be able to measure some noticeable electrical draw.

The power strip itself — despite its red LED light — registered 0.0 watts when we plugged it in. We started plugging in chargers and watched the meter continue reading 0.0, even after several chargers were plugged in.

Eventually — with six separate chargers plugged in, filling up the power strip’s electrical outlets — we had a solid, measurable reading.

The combined total vampire power draw of this power bar, an iPhone 6 charger, an iPad Air charger, a MacBook Air (2013) charger, a Surface Pro 2 charger, a Samsung Chromebook charger, and a Nexus 7 charger read 0.3 watts.

[Photo: six chargers on the power strip, with the meter showing the measured energy usage]

Aha! How Much Money is That?

Finally, we have a measurement to work with: 0.3 watts.

We’ll assume these are all plugged in 24 hours a day, 7 days a week over an entire year. There are 8760 hours in a year, so a constant 0.3-watt draw equates to 2.628 kilowatt-hours (kWh).

According to the EIA, the average cost of electricity in the US is 12.98 cents per kWh. This means that those 2.628 kWh of electricity will cost about 34.1 cents over an entire year. Even using the most expensive electricity rates in the US — 30.04 cents per kWh in Hawaii — that’s only about 79 cents per year.

The real cost is actually lower, as you’ll be charging your devices with these chargers sometimes, so they won’t always be drawing vampire power. You’ll probably unplug them to take them with you sometimes, too.

But let’s use the highest number — 79 cents per year. Divide that by the six different chargers here — being charitable and ignoring the power strip — and you get about 13 cents per year for each charger in Hawaii. That’s about five and a half cents on the average US electrical bill.
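The arithmetic above can be rechecked with a quick Python sketch; every figure comes straight from the numbers in this article:

```python
# Rerunning the article's numbers: six chargers drawing a combined
# 0.3 W of vampire power, plugged in around the clock for a year.

watts = 0.3
hours = 24 * 365                  # 8760 hours in a year
kwh = watts * hours / 1000.0      # 2.628 kWh per year

us_avg = 12.98                    # cents/kWh, EIA US average
hawaii = 30.04                    # cents/kWh, highest US rate (Hawaii)

print(round(kwh * us_avg, 1))     # 34.1 cents/year at the US average
print(round(kwh * hawaii, 1))     # 78.9 cents/year in Hawaii
print(round(kwh * hawaii / 6, 1)) # 13.2 cents/year per charger in Hawaii
```

Dividing the US-average total by six gives roughly 5.7 cents per charger per year, which is where the “five and a half cents” figure comes from.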


This Isn’t Meant to Be Precise, But It Answers the Question

This isn’t meant to be a completely scientific or precise test, of course. Some of the chargers likely use more power than others, so the real cost to leave your smartphone charger plugged in for an entire year is probably below 13 cents.

Either way, this shows us that the amount of vampire power consumed by your chargers is extremely small. It isn’t worth worrying about. Just leave your chargers plugged in for convenience; don’t unplug them.

Yes, it’s true that you could save a tiny amount of electricity by unplugging your chargers, but you could save a much larger amount of electricity by looking to heating, cooling, lighting, laundry, your computer, and other more significant power drains. Don’t sweat the chargers.

These are all relatively modern chargers, of course — the oldest one here is from 2012 or so. Much older chargers might actually use a noticeable amount of vampire power. For example, if you still have a cell phone or other portable electronic device from the ’90s, its charger might continually use a noticeable amount of power if you leave it plugged in — but even that amount of vampire power probably won’t make a noticeable dent in your electricity bill.


 

Join main@TechTalk.groups.io to automatically receive all group messages.