Digital Energy Consumption

The IT ecosystem is one of the largest consumers of electricity worldwide. It consumes about 1,500 TWh of electricity per year, roughly the combined annual generation of Germany and Japan, or almost 10% of the electricity generated worldwide. Within the sector, cloud computing alone accounts for 416 TWh [2, 3], with a carbon footprint roughly comparable to that of the entire aviation industry, and it is growing fast, doubling its energy consumption every four years. By 2020 it is expected to reach 1,400 TWh annually, and by 2030 it could surpass China and the US, the world's biggest electricity consumers.

Within the next decade, electricity might become a scarce resource, putting upward pressure on prices, if not globally then in certain places at certain times. The bottleneck lies in the grid rather than in power generation.

The fastest-growing application in cloud computing is cryptocurrency mining. The energy consumed by Bitcoin and Ethereum exploded within seven years, from virtually zero in 2010 to 19.2 TWh in 2017, matching the annual electricity output of Iceland or Puerto Rico. The energy efficiency of ASICs and GPUs has risen quickly, but it has been outpaced by the growth in transactions and market capitalization. While this exponential growth provides excellent opportunities for miners to earn rewards, the power consumed by the information technology ecosystem also intensifies competition for energy. Only those with secure access to affordable electricity can put their chips to work.
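
To make the doubling claim concrete, the following is a minimal back-of-the-envelope sketch that projects annual consumption under a constant four-year doubling time. The 416 TWh baseline and the 2017 baseline year are assumptions chosen for illustration (the article does not tie the 416 TWh figure to a specific year), so the resulting numbers are indicative only, not figures from the cited sources.

```python
# Back-of-the-envelope projection of cloud-computing electricity demand,
# assuming exponential growth with a fixed doubling period.
# The baseline value and baseline year below are illustrative assumptions.

BASELINE_TWH = 416.0    # assumed cloud-computing consumption at the baseline year
BASELINE_YEAR = 2017    # assumed baseline year (not stated in the article)
DOUBLING_YEARS = 4.0    # "doubles its energy consumption every four years"


def projected_consumption_twh(year: int) -> float:
    """Project annual consumption for a given year under constant doubling."""
    elapsed = year - BASELINE_YEAR
    return BASELINE_TWH * 2 ** (elapsed / DOUBLING_YEARS)


if __name__ == "__main__":
    for year in (2020, 2025, 2030):
        print(f"{year}: ~{projected_consumption_twh(year):,.0f} TWh")
```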