
Can clean energy handle the AI boom?

Vox

9m 21s · 1,566 words · ~8 min read · auto-generated transcript

[0:00]I spent some time recently reading through a big spreadsheet of questions submitted by Vox's audience members, and one of them caught my eye. It was from Cathy, a retired schoolteacher in New York City. What is the question that you wanted us to answer? So the question is: can green energy even begin to handle the increased demands that AI and crypto and cloud storage are going to put on our energy system? It's a good question. I've done some reporting on AI, but I've never thought much about the climate impact of all the AI products we're increasingly using, and all of our digital belongings, like photos and documents and emails, getting stockpiled in servers around the world. They need a lot of electricity, and the electricity has to come from somewhere. This is all happening while the climate crisis demands we use less energy, not invent new ways to use more of it. Our climate goals already feel pretty impossible to me, but now it's almost like we haven't just moved the goalposts, we've changed the entire game. So, let's get to the bottom of this.

[1:08]Within Cathy's big question is a more basic one about how much electricity our digital lives require. At first, I was thrown off that Cathy mentioned things like cloud storage and AI and cryptocurrency in one category. But then I realized that their electricity demands happen at the same place: data centers. Ultimately, you're talking about machines loaded up in large facilities that perform computations, and they need power, a lot of it; they need water; they need space. I spoke to Alex de Vries. He runs a research site called Digiconomist, where he's dug into this exact topic. Data centers are massive, often windowless warehouses that house thousands of servers running virtually nonstop. Some of the bigger data centers are as big as four football fields and use as much electricity at any given time as 80,000 households. There are more than 8,000 data centers around the world, and the US has more than any other country. In 2022, data centers, artificial intelligence, and cryptocurrencies made up about 2% of total global electricity demand. But by 2026, that number is expected to double, which is like adding the amount of electricity used by the entire country of Sweden. I'll explain why in a minute. Alright. Support for this video comes from Klaviyo. Klaviyo works with businesses to turn their data into meaningful connections with their customers through AI-powered email, text messages, and more. According to Klaviyo, over 150,000 brands trust their data and marketing platform to build smarter digital relationships with their customers, during the holiday season and beyond. Klaviyo has no editorial influence over our work, but they make videos like this possible. Learn more at the link below. That big jump from 2022 to 2026 is thanks to rising cloud storage and cryptocurrency electricity demands. But it's also because of the AI boom.
We know AI requires a ton of computational power, but it turns out that the amount of electricity it uses is a really difficult question to answer. AI is a huge umbrella term that includes everything from basic statistical models that detect patterns in data to generative AI that creates text and images and videos. That last kind is the most computationally intensive. The thing is, the handful of private tech companies that dominate the AI field don't really disclose how much of their energy use is dedicated to AI specifically. If you look at Google's latest environmental report, it clearly states that they don't want to make a distinction between regular workloads and AI-specific workloads. And these companies' AI models are mostly closed source, meaning no one knows exactly how they are built. This has left some researchers trying to piece it together on their own. Researchers looked at an open-source large language model called BLOOM, which has roughly the same number of parameters as GPT-3, and found that training something like GPT-3 required almost 1,300 MWh of electricity: about as much electricity as 130 homes in the US consume in one year. Today, large language models like GPT-4 have hundreds of billions of parameters, if not a trillion. And researchers say that the computational power required to train these models is expected to double every nine months. So far, it has mostly been large language models driving the AI energy boom. Of course, that could change going forward. Now we see AI on the rise for image generation and also specifically video generation. So far, we've talked about training a large language model. Researchers also looked at the energy used by people actually using one. It's been estimated by myself and others that a single ChatGPT interaction would take about 3 Wh, which is comparable to running a low-lumen LED bulb for one hour. So on its own, it doesn't sound like a whole lot, but of course, it's the volume that matters.
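The training comparison above is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch in Python, assuming an average US household uses about 10,500 kWh of electricity per year (a rough ballpark figure, not from the video):

```python
# Sanity-check: 1,300 MWh of training electricity vs. annual US household use.
TRAINING_MWH = 1_300          # estimated electricity to train a GPT-3-scale model (BLOOM study)
HOME_KWH_PER_YEAR = 10_500    # assumed average annual electricity use of one US home

# Convert MWh to kWh, then divide by one home's yearly consumption.
homes_for_one_year = TRAINING_MWH * 1_000 / HOME_KWH_PER_YEAR
print(f"Training ~= {homes_for_one_year:.0f} US homes powered for a year")
```

The result lands in the low 120s, consistent with the "about 130 homes" figure once you allow for a slightly different household-consumption assumption.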
This is 10 times more than a standard Google search, and of course, if you're talking about millions or billions of interactions, the numbers start to stack up quickly. Alex took another research approach by looking at the hardware used for AI training and use. Over 95% of the AI industry uses servers made by the company Nvidia, which could sell 1.5 million of its AI servers by 2027. He multiplied that by the information Nvidia publishes about each server's energy demand, and found that data centers devoted to AI alone could consume around 100 terawatt-hours of electricity per year, or about the same as his home country, the Netherlands. There's a big part of Cathy's question I haven't gotten to yet: can renewable energy meet the surging demand from the world's data centers? The good news is that using green energy is the stated goal of a lot of these companies. Both Google and Microsoft have made pledges to be net-zero by 2030, but there are signs that AI is disrupting those plans. That's because solar and wind energy can't produce electricity all of the time, and these data centers need to be running all of the time. In most cases, they will just have a backup connection to the power grid, which will have fossil fuels on it. It's not just that. Data centers are being built at a rate that renewable energy infrastructure can't keep up with. It can take a year to build a data center, but many more years to get a solar or wind farm onto an electrical grid. Google's 2024 sustainability report showed that the company's emissions rose by 48% from 2019 to 2023, in large part due to its data center energy consumption, suggesting that integrating AI into its products could make reducing its emissions challenging. There's already evidence in the US that coal plants that were meant to close are staying open because of data centers' electricity demands, and that state utilities are building new natural gas plants for the same reason.
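Both scaling arguments above can be reproduced as rough arithmetic. A sketch under stated assumptions: the 1 billion interactions per day is a hypothetical round number to show the volume effect, and the 7.6 kW average per-server draw is my assumption chosen to match the stated total (Nvidia's DGX-class systems peak near 10 kW, so this implies less-than-full utilization):

```python
HOURS_PER_YEAR = 24 * 365  # 8,760

# Volume effect of per-query energy: 3 Wh per interaction at a hypothetical
# 1 billion interactions per day.
WH_PER_QUERY = 3
QUERIES_PER_DAY = 1_000_000_000
query_twh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e12  # Wh -> TWh
print(f"Query load: ~{query_twh_per_year:.1f} TWh/year")

# Fleet-level estimate: 1.5 million Nvidia AI servers running around the clock.
SERVERS = 1_500_000
AVG_DRAW_KW = 7.6  # assumed average draw per server (not stated in the video)
fleet_twh_per_year = SERVERS * AVG_DRAW_KW * HOURS_PER_YEAR / 1e9  # kWh -> TWh
print(f"Fleet: ~{fleet_twh_per_year:.0f} TWh/year")
```

With those inputs the fleet figure comes out at roughly 100 TWh per year, matching the Netherlands comparison, while the query-volume figure is around 1 TWh per year even at a billion interactions daily.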
But even if these tech companies can look good on their sustainability reports and get to net zero, there's still a problem. The thing is that our renewable energy supply globally is limited, so if we're attributing an increasing part of it to the data center industry, the consequence is that there are fewer renewables available for everything else. That probably means that, on the whole, we will end up using more fossil fuels anyway. With all this context, the answer to Cathy's question is that, for right now, we aren't prepared for renewable energy to meet the increasing demand of the world's data centers. So what do we do about this? As users, it would be extremely difficult to opt out of backing up our data on the cloud, or even to refrain from using AI. I think AI is embedded in so many things that I'm not sure I will have the option to say, I'm not using it, you know, I'm out. Researchers like Alex say the best place to start is to force more transparency from these tech companies. In the EU, the AI Act doesn't really force tech companies to disclose anything with regard to the environment. And that's the EU; we're not even talking about the US yet, which is lagging behind a bit on this matter. Some environmental organizations and local communities are calling for moratoriums on data centers. And some researchers have proposed the idea of an energy efficiency rating, so companies and consumers can choose data centers that are the most sustainable. We could also hope that servers and data centers will keep getting more energy efficient. But more than anything, this issue emphasizes how desperately we need to be scaling up renewable energy, and fast. Not only to meet ever-increasing data center demands, but so there's plenty of renewable energy to go around. If you like this video, you'll love Vox's new podcast, Explain It to Me. I'm the host, JQ, and every week I call up a listener with a question, get them some answers, and we have some fun along the way.
You can find a link to the podcast in the description. While you're there, if you want to support Vox, you can check out the details on our membership program.
