To grasp the realities of electricity use and demand, and therefore how, where, and with what to generate it, it is counterproductive to use any metric that ignores the growth of computing. Modern computing has upended the traditional "flat" electricity demand curve. Those days are over.
Windmills and solar panels struggle to keep up with a portion, about 21 percent, of today's demand, at the cost of hundreds of billions of dollars and millions of acres of wilderness destruction. Adding billions of dollars of 11th-century technology (windmills) or 19th-century technology (solar cells) will not provide what we need, let alone what we want, in the 21st century and beyond. It is fantasy or sheer ignorance to believe otherwise.
Energy companies have begun to raise concerns about the consumption of electricity by data centers. What few people discuss, or are aware of, is that the growth of cloud computing, virtual reality and AI is changing even the basic business of counting how much electricity is needed, let alone supplying it. We read about kilowatts (thousands of watts), along with megawatts, gigawatts, terawatts and petawatts, each of which is 1,000 times the preceding. We're just getting started.
The “giga” prefix for 1 billion (1,000 million) and “tera” (a trillion, or 1,000 billion) were both adopted in 1960. In 1975, we saw the official creation of the prefixes “peta” (1,000 tera) and “exa” (1,000 peta), and then “zetta” (1,000 exa) in 1991. Today’s cloud traffic is estimated to be roughly 50 zettabytes a year…
Until just over a year ago, there was only one remaining official prefix name for a number bigger than a zetta: the 1,000-times-bigger “yotta.” Given the AI-accelerated pace of data expansion, we’ll soon be in the yottabyte era. So now the bureaucrats at the Paris-based International Bureau of Weights and Measures have officially given names to even bigger numbers, because before long, data traffic will blow past the yottabyte scale. One thousand yottabytes? That’s a ronnabyte. Your children will be using such numbers.
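For readers who want the arithmetic behind the prefix ladder, here is a minimal sketch. The `SI_PREFIXES` table and `scale` helper are illustrative names invented for this example, not part of any standard library; the exponents are the official SI values.

```python
# Each SI prefix is 1,000x (i.e., 10^3 times) the one before it.
# Exponents are the official SI powers of ten; "ronna" and "quetta"
# were added in 2022.
SI_PREFIXES = {
    "kilo": 3, "mega": 6, "giga": 9, "tera": 12,
    "peta": 15, "exa": 18, "zetta": 21,
    "yotta": 24, "ronna": 27, "quetta": 30,
}

def scale(value: float, prefix: str) -> float:
    """Express a raw count (bytes, watts, ...) in the given prefix."""
    return value / 10 ** SI_PREFIXES[prefix]

# The article's ~50 zettabytes of annual cloud traffic, in yottabytes:
annual_traffic_bytes = 50 * 10**21
print(scale(annual_traffic_bytes, "yotta"))  # prints 0.05
```

At 50 zettabytes a year, in other words, we are already one-twentieth of the way to the yottabyte scale, which is why the new prefix names will see use sooner than most people expect.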
Such astonishing volumes of data being processed and moved will overwhelm the gains in energy efficiency that engineers will inevitably achieve. Already today, more capital is spent globally on expanding the energy-consuming cloud each year than all the world’s electric utilities combined spend to produce more electricity.
Read that last sentence again and see if it conjures up a mental picture of windmills.
Because we are producing tech at an ever-accelerating rate and will continue to do so – among the many uses for AI is to write more computer programs more quickly, using more electricity to write, test, deploy and for us to use – we will need electricity to power and cool that tech at a rate accelerating at least as quickly. Windmills and the sun aren’t going to cut it, no matter how many birds we kill, forests and cattle we destroy, bugs we eat, or how many 15-minute cities our overlords cram us into.
To generate anything like the kind of electricity necessary to meet our computing and living needs, we will need to double down on actually reliable sources: the hydrocarbon energy, oil and natural gas especially, which the environmentalists are so desperate to abolish, and ultimately nuclear, the safe, environmentally friendly energy source of the future. Anyone telling you anything different is either lying or has no idea what he's talking about.
Article tags: AI, electricity, electricity generation, Energy, environmentalism, the cloud, wind and solar