A long time ago (like 15 years ago) I was working at Twitter and tried to compute the energy needed for a tweet. The answer was pretty low: 100 joules of energy, or, at the time, about 0.02 grams of CO2 emissions. As Fast Company wrote about it, “That means Twitter’s carbon footprint is relatively low–the 50 million tweets sent out daily emit one metric ton of CO2. In 2006, a single American family emitted an average of 24 metric tons of CO2 from home energy use and transportation combined.”
At the time, the energy needed to serve that one piece of information on the web was less than that of a Google search. But that was then. Now, presenting information is going to start costing a whole lot more energy. You guessed it: AI.
Look, all these numbers are approximate and inferred from the outside, so take it as a sketch: That Google search requires 0.3 watt-hours. The ChatGPT query? 2.9 watt-hours, nearly ten times as much. All this AI requires a lot of energy. Running these models, these LLMs, takes millions of matrix multiplications. But that’s just running them; training them takes even more energy. Numenta reported last year that “while energy usage has not been disclosed, it’s estimated that GPT-4 consumed between 51,773 MWh and 62,319 MWh, over 40 times higher than what its predecessor, GPT-3, consumed. This is equivalent to the energy consumption over 5 to 6 years of 1,000 average US households.”
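If you want to sanity-check those figures, the arithmetic fits in a few lines of Python. This is just a back-of-envelope sketch using the estimates above; the roughly 10,500 kWh per year for an average US household is my own assumed figure, and all of these numbers are outside guesses to begin with.

```python
# Back-of-envelope energy math using the (approximate, externally inferred)
# figures cited above. The ~10,500 kWh/year US-household average is an
# assumption for illustration, not a figure from the article.

GOOGLE_SEARCH_WH = 0.3                 # watt-hours per Google search (estimate)
CHATGPT_QUERY_WH = 2.9                 # watt-hours per ChatGPT query (estimate)
GPT4_TRAINING_MWH = (51_773, 62_319)   # estimated GPT-4 training energy range
US_HOUSEHOLD_KWH_PER_YEAR = 10_500     # assumed average annual consumption

# A ChatGPT query vs. a Google search: roughly a 10x difference.
print(f"ChatGPT / Google search: {CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH:.1f}x")

# GPT-4's training energy expressed in household-years (1 MWh = 1,000 kWh).
for mwh in GPT4_TRAINING_MWH:
    household_years = mwh * 1_000 / US_HOUSEHOLD_KWH_PER_YEAR
    print(f"{mwh:,} MWh is about {household_years:,.0f} household-years, "
          f"or {household_years / 1_000:.1f} years for 1,000 households")
```

Under that assumed household figure, the two ends of the range work out to roughly 5 and 6 years for 1,000 households, which is how Numenta frames it.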
The alarm has been sounding about the exponential increase in energy use looming over our AI future. Data centers, which tend to be built in areas with low energy costs and a low risk of natural disaster, such as Phoenix, are also notorious water-guzzlers. They need that water to keep the servers and buildings cool. Does Phoenix, an increasingly hot place with very little groundwater, really need to dedicate 16.5% of its energy use (and who knows how much of its water) to data centers, as is predicted by 2030?
It’s not yet decided which electricity source these future centers will use, but the good news is that owners are already developing and using water-recycling techniques and waterless cooling alternatives, and are looking for alternative energy sources for their backup generators. (You know, for when the grid goes down.) Then again, not all alternatives are good: Meta uses a cooling system that draws on outside air, but that air must be below 85 degrees for it to work. In Phoenix, summer days can easily top 100.
Okay, so this, coupled with climate change and an already-stressed electricity grid, seems bad. But there is some good news: State governments are starting to step in to limit the building of data centers. (Though Meta and Google will likely have their facilities built in time to avoid the Arizona governor’s limitations.) And on Wednesday, there was an Energy, Climate, and Grid Security Subcommittee hearing on how to keep up with AI’s energy needs on an already strained system. They are advocating for baseload reliable energy, a term you’ll be hearing more of. (Let’s not forget that AI is being used by energy companies for over 50 different applications, such as supporting the smart grids that generate massive amounts of data.)
I do believe that we can innovate our way around this problem. For example, the way power is used with AI is very different: It’s "spiky," meaning we need a lot of power while training a model and during bursts of inference, when people are feeding it tons of prompts, and much less in between. Could that training be timed to off hours, when the grid is less burdened, or could we use excess energy from renewable supplies to power search? Last season on the podcast, we also spoke with Keolu Fox, who is working on indigenous technology solutions to some of these problems, like using the ocean for cooling, or finding completely out-there ways to do storage (say, DNA) that require way less energy for long-term storage.
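To make the off-hours idea a bit more concrete, here’s a minimal sketch in Python. The hourly grid-stress scores below are hypothetical, and a real scheduler would span days and factor in renewable forecasts (or live carbon-intensity data), but it shows the shape of the idea: find the least-strained window and put the training job there.

```python
# Minimal sketch of timing a training job to off hours. The hourly scores are
# hypothetical stand-ins; in practice they might come from a utility forecast
# or a carbon-intensity feed.

# Relative grid-stress scores for hours 0-23 (higher = more strained grid).
HOURLY_GRID_STRESS = [
    3, 2, 2, 2, 2, 3, 5, 7, 8, 8, 9, 9,
    9, 9, 9, 8, 8, 9, 10, 9, 7, 6, 5, 4,
]

def best_window(stress: list[float], job_hours: int) -> tuple[int, float]:
    """Return (start_hour, total_stress) for the least-stressed contiguous window."""
    best_start, best_total = 0, float("inf")
    for start in range(len(stress) - job_hours + 1):
        total = sum(stress[start:start + job_hours])
        if total < best_total:
            best_start, best_total = start, total
    return best_start, best_total

start, total = best_window(HOURLY_GRID_STRESS, job_hours=6)
print(f"Schedule the 6-hour training run at {start:02d}:00 (stress score: {total})")
```

With these made-up numbers, the job lands in the overnight hours, exactly when the grid is least burdened.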
The fact that we have a couple of years before the AI energy requirements go through the roof means we can still sprint to find solutions. And regulators can demand transparency from AI developers as to the energy sources they’re using. “And if that transparency doesn’t come naturally, which it hasn’t so far,” data scientist Alex de Vries told Scientific American, “then we should think about giving it a little bit of a push.” Who wants to push with me?
Also, please listen to this week’s episode of the podcast, focusing on kids and data. I’d love to know what you think! Email me at us@technicallyoptimistic.com.
Worth the Read
Instagram and Facebook users who fell victim to scam ads sued Meta for failing to remove them. On Tuesday, the Ninth Circuit Court found that the tech industry’s immunity shield doesn’t completely, well, shield the company.
Timeline, the beloved Google Maps feature, is about to become more private. The catch? You won’t be able to use it on your desktop after Dec. 1.
The island of Anguilla is getting a boost from registering .ai domain names, generating $32 million in 2023 alone. A pretty funny side effect of the AI boom, and a clever way for Anguilla to make money off it.
One of my former professors, Lawrence Lessig, just wrote a fantastic opinion piece about empowering tech workers to sound the alarm on AI risks that aren’t being addressed.
This is something I was really missing in the podcast: so far you’ve talked about sustainable data centers and the options there, but energy consumption wasn’t really featured (I have to note that I’m only one season in and have yet to listen to the second season).
I would very much like to go in depth and learn how we can educate ourselves, each other, ... on when to properly use AI and when we shouldn’t.
It’s easy to state that ChatGPT is there for everyone and that people shouldn’t ask it for facts, because those are things they can Google. Yet John Doe in the street doesn’t care and uses it as he sees fit. So it’s kind of the Wild West here.
I understand that people want to experiment with it as well, but when I hear how much energy a single prompt costs, I really feel like I shouldn’t do that just yet. Do I stay away from AI while waiting for the sector to find better ways to reduce energy consumption? And in doing so, don’t I just risk missing the boat?
Or am I just inflating the problem in my head, when this is something that will get dealt with over time?