Of all the electricity-powered household appliances, your refrigerator uses the second largest amount of energy, after the air conditioner. According to the U.S. Department of Energy, the average refrigerator uses approximately 725 watts of electricity (roughly 6 amps on a standard 120-volt circuit, though most refrigerators should be placed on a dedicated 15- to 20-amp circuit), which can account for 10 percent or more of your home's total energy usage.
Wattage, measured in watts, is the rate at which an appliance consumes electrical energy; one watt equals one joule per second. Your refrigerator should have come equipped with an energy consumption label stating, at the very least, its wattage. You can use this number to determine the unit's amperage.
Once you know your refrigerator's wattage, you can determine its specific amperage. Amperage is equal to the wattage of the unit divided by the voltage, or:
Amperage = Watts/Voltage.
Voltage, the difference in electrical potential that pushes current through a circuit, is standardized in the United States: standard household outlets supply 120 volts.
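The amperage formula above can be sketched as a short calculation; the 725-watt figure is the DOE average cited earlier, and 120 volts is the standard U.S. household voltage:

```python
def amps_from_watts(watts, volts=120):
    """Current draw in amps: wattage divided by voltage (120 V assumed for U.S. homes)."""
    return watts / volts

# The DOE's average refrigerator rating of 725 watts:
print(round(amps_from_watts(725), 2))  # about 6.04 amps
```

The same function works for any appliance whose label lists wattage but not amperage.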
Power companies typically bill for electricity at a rate per kilowatt-hour (kWh). Multiply your refrigerator's wattage by 24, the number of hours per day it is plugged in, then divide by 1,000 to estimate its daily energy consumption in kWh. Because the compressor cycles on and off, this figure overstates the actual average draw, but it gives a useful upper bound. Multiply the result by your power company's rate per kWh to estimate how much you pay, on average, to run your refrigerator each day.
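The daily-cost estimate described above can be written out the same way; the $0.13-per-kWh rate here is a hypothetical example, so substitute your own utility's rate:

```python
def daily_kwh(watts, hours_per_day=24):
    """Daily energy use in kilowatt-hours: watts x hours, divided by 1,000."""
    return watts * hours_per_day / 1000

def daily_cost(watts, rate_per_kwh):
    """Estimated daily running cost at a given utility rate (dollars per kWh)."""
    return daily_kwh(watts) * rate_per_kwh

kwh = daily_kwh(725)            # 17.4 kWh per day for the DOE average refrigerator
cost = daily_cost(725, 0.13)    # hypothetical rate of $0.13 per kWh
print(round(kwh, 1), round(cost, 2))
```

Remember that this is an upper-bound estimate, since the compressor does not run continuously.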
Because you cannot turn off a refrigerator when it is not in use, you can reduce your energy consumption by choosing the smallest refrigerator that will work for your home. Choose one without an ice maker, or one whose ice maker can be switched off when not in use, and replace older models with newer models that meet the ENERGY STAR standards maintained by the U.S. Environmental Protection Agency and the Department of Energy.