AI Thoughts 1

Published on 08 December 2025

An oil painting of a sun-drenched forest.

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” - Upton Sinclair

I want to go through some of the hot button questions surrounding artificial intelligence and do some thinking on the page for each one. Why? Primarily because I’m in the middle of my first true job transition. I’m going from writing software for planning the power grid to writing software to do power management for an AI lab’s datacenters. This has two consequences:

  1. I’ll be working on AI – so I’d like to be clear with myself and others about what future I think I might be helping bring about.
  2. I’ll be working at a prominent frontier lab – which means this is kind of my last chance to talk broadly, even philosophically, about AI without AI hype being linked with my financial wellbeing[^1] or dancing around any confidential information I might know. Basically, I want to preregister some stances and thoughts before people can levy the above quote against me.

Regardless, I’m publishing this in my downtime between jobs and, seriously, this reflects nothing but public data and discourse. It will likely range from pretty confident takes on things I have some experience with (power) to midwit speculation (sentience).

This is not pro-AI or anti-AI. I think you should be skeptical of anyone who makes such a profession of belief. And I get it, like you might hear me say I’m anti-car. What I’m really saying is “when I think of cars I think primarily about negative interactions with them: The ways my urban environment is tailored to them, the danger they present to me as a cyclist and a pedestrian, and the noise and pollution they produce”. But cars are super useful to me, I occasionally enjoy a drive, and I don’t want to rid society of cars. I understand most people’s professions for or against AI as expressions of something similar. But I think it’s a shortcut and I’d like to be more concrete.

I’ll order these topics roughly by how much I have run into them in conversations with my peers. If you got this blog post texted to you as a result of a conversation – hello! I hope you enjoyed your social interaction with Steven and are not too put off by him having a sort of cheatsheet prepared.

Is AI Bad for the Environment?

Confidence level: medium-high

The first thing I want to put forward is some common sense ground rules we can all agree on before we get to anything overly concrete.

One: everything consumes natural resources and energy[^2]. Some things use a lot. Some use relatively little. But it’s all nonzero. Which leads us to

Two: Because of One, everything becomes a balancing act between the resources something consumes and the value it provides.

Some examples for Two: Maybe my hobby is venting as much helium as I can get my hands on out of my window and directly into space where it’s lost forever. While I may enjoy this and probably can’t do it on too big of a scale, I think we can agree that on a societal level this is a raw deal for everyone else.

On the other hand, something like the production of ammonia via Haber-Bosch is quite resource intensive and “consumes 1 to 2% of total global energy and is responsible for approximately 3% of global carbon dioxide emissions”[^3]. Buuuuut without this ammonia we’d be unable to feed 8 billion people. Vaclav Smil writes “the crop harvest in the year 2000 would have required nearly four times more land and the cultivated area would have claimed nearly half of all ice-free continents, rather than under 15% of the total land area that is required today”. So, that’s a good deal! Good trade. Of course, one can always try to get a better deal by increasing the value or decreasing the resource usage, but I think it’s fair to say this is the basic calculation we’re all doing.

A corollary of One and Two is that if you think LLMs are an abomination, that no good can or will come of them, then the resource consumption will always be unjustified in your mind because the value is zero.

So there are two sorts of potential audiences here:

  1. I don’t think LLMs are useful or I think they are net harmful, and to make matters worse I think they’re bad for the environment.
  2. I think LLMs might have significant value, but I’m worried the environmental tradeoff isn’t worth it.

If you’re 1, I still would hope you find the following interesting! But let’s be real, the arguments about resource usage are playing second fiddle to the value question. We can talk about that in other sections. If you’re 2, you’re really the target audience here.[^5]

Three: the more dire and complicated a problem like the climate crisis, the worse it is to misidentify drivers of that crisis. I think it’s easy to react to fact-checking on environmental issues with “yeah but it is still consuming XYZ, and I don’t like that”, especially if you place a very low value on LLM use. But we’re all engaged in a constant effort to identify climate problems we can solve that will make a difference[^6], and it’s actually bad to elevate things that aren’t significant problems because it takes time, effort, and attention away from the real problems.

If we can agree on all these, and I don’t think that’s asking too much, then the following should mostly just be a matter of getting our facts straight.

The question of the environmental impact of AI, as I’ve encountered it, primarily deals with water use and electricity use. So let’s talk about each of those in turn.

Should I Be Worried About the Amount of Water AI Uses?

TL;DR each prompt probably uses ~2 ml of water, and only ~0.3 ml of that is used by the datacenter; the rest is just the water required to make electricity in the US. Buying one less pair of jeans probably mitigates a lifetime’s worth of personal AI water usage.

I am tempted to just say go and read Andy Masley’s post on AI water usage .

I draw extremely heavily from this post. In fact, segments of this will just be the most convincing quotes from that post. It’s well-researched and even-keeled. But I get that I would not be adding much by just sending you to a link, so I’ll walk through some highlights and call out where some of the key numbers come from, to help convince you that these are decent estimates.

It’s helpful to understand that the numbers that get thrown around for water usage per prompt can be wildly different just because they’re measuring completely different things. When we say “AI uses water”, we mean that the datacenters that train and host the models use water. And when we say datacenters, it’s worth noting that a lot of datacenters just run stuff on the internet. However, a lot of the growth in datacenters seems driven by AI, and I think it’s reasonable to assume an increasing fraction of datacenter racks are being dedicated to AI.

You can break down data center water usage into:

  • Onsite (direct, scope-1) water, used at the datacenter itself, mostly for evaporative cooling.
  • Offsite (indirect, scope-2) water, consumed to generate the electricity the datacenter uses.

Water usage can further be understood to be either withdrawn (taken from a source and largely returned) or consumed (evaporated or otherwise lost to the local source).

As a rough guide, Masley states that 80% of datacenter water usage is offsite usage to generate electricity. That’s scope-2 water, borrowing the parlance of emissions accounting (which is actually applied to water as well).

Here’s some sanity checking on that:

One estimate puts annual direct data center water consumption at 66 billion liters (17 billion gallons) and indirect consumption at 800 billion liters (211 billion gallons). So that’s roughly a 92:8 ratio of indirect:direct water consumption. Note, this is not just AI; AI still makes up a minority of data center compute usage.

A paper in the ACM by researchers at UC Riverside focuses on estimating the water usage for training GPT-3. “By taking the GPT-3 model with 175 billion parameters as an example, we show that training GPT-3 in Microsoft’s U.S. datacenters can consume a total of 5.4 million liters of water, including 700,000 liters of scope-1 onsite water consumption.” That’s an 87:13 indirect:direct ratio.

So I think 80% is a good, conservative estimate of how much of data center water usage is from indirect consumption.
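If you want to sanity check those ratios yourself, here’s a minimal sketch, using only the direct and total consumption figures quoted above (the variable names and structure are mine):

```python
# Sanity-checking the indirect:direct split using the two sources above.
# All inputs are liters; the GPT-3 indirect figure is total minus onsite.

estimates = {
    "US data centers (annual)": (66e9, 800e9),          # (direct, indirect)
    "GPT-3 training (total)":   (0.7e6, 5.4e6 - 0.7e6), # 5.4M total, 700k onsite
}

for name, (direct, indirect) in estimates.items():
    total = direct + indirect
    print(f"{name}: {indirect / total:.0%} indirect, {direct / total:.0%} direct")

# US data centers (annual): 92% indirect, 8% direct
# GPT-3 training (total): 87% indirect, 13% direct
```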

Both of those indirect numbers almost certainly include, as this points out, the evaporation of the reservoirs that feed hydroelectric power. Basically, since hydroelectric dams create giant lakes behind them with lots of surface area to evaporate, hydroelectric power is by far the most water intensive way to generate electricity, about 40 times more water consumptive than thermoelectric plants ().

It’s important to remember this point because including the evaporation of reservoirs is controversial. As far as I know, the evaporation numbers aren’t compared to a counterfactual where the reservoir didn’t exist. There would be some amount of evaporation even if there wasn’t that hydroelectric plant there. Additionally, the water sits there and evaporates regardless of who is using the power or whether the plant is even generating electricity. So it’s hard to attribute this evaporation to anyone’s power usage, much less datacenters. It’s just a cost of building this particularly clean, cheap, extremely reliable source of electricity[^7].

Okay so we’ve established a relative sense of where the bulk of data center water consumption comes from. It’s primarily electricity generation, and it’s partially inflated by hydro. But in absolute terms we haven’t talked about how impactful the consumption is.

Is That A Lot?

It’s hard for me to look at something like 17 billion gallons or 211 billion gallons and to know whether that’s bad or good!

Here, I’m just gonna play the hits. And really, you should . Or at least the .

“So AI consumes approximately 0.04% of America’s [annual] freshwater [usage] if you include onsite and offsite use, and only 0.008% if you include just the water in data centers.”

“All AI in all American data centers is collectively using 8 times as much water as the local water utility in my town provides to consumers. You should be exactly as worried about AI’s current national water usage as you would be if you found out that 8 additional towns of 16,000 people each were going to be built around the country.”

What about datacenter growth?

We can apply some of the alarming projections about datacenter growth:

“So the water all American data centers will consume onsite in 2030 is equivalent to:

8% of the water currently consumed by the U.S. golf industry.

8% of the water consumed by U.S. steel production.

The water usage of 260 square miles of irrigated corn farms, equivalent to 1% of America’s total irrigated corn.”

And what about your personal usage?

“Here’s a list of common objects you might own, and how many chatbot prompts’ worth of water they used to make (, and using the onsite + offsite water value):

  • Leather Shoes - 4,000,000 prompts’ worth of water
  • Smartphone - 6,400,000 prompts
  • Jeans - 5,400,000 prompts
  • T-shirt - 1,300,000 prompts
  • A single piece of paper - 2550 prompts
  • A 400 page book - 1,000,000 prompts

If you want to send 2500 ChatGPT prompts and feel bad about it, you can simply not buy a single additional piece of paper. If you want to save a lifetime supply’s worth of chatbot prompts, just don’t buy a single additional pair of jeans.”
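As a minimal sketch of the conversion behind that list, assuming the ~2 ml of water per prompt from the TL;DR above. Note the object footprints here are back-calculated from Masley’s prompt counts, not primary data:

```python
# Converting an object's water footprint into "prompts' worth" of water,
# assuming ~2 ml per prompt (onsite + offsite, per the TL;DR above).

ML_PER_PROMPT = 2.0  # assumed water per prompt, in milliliters

def prompts_worth(water_liters: float) -> int:
    """How many prompts' worth of water an object's footprint represents."""
    return round(water_liters * 1_000 / ML_PER_PROMPT)

print(prompts_worth(10_800))  # ~jeans (back-calculated) -> 5,400,000 prompts
print(prompts_worth(5.1))     # ~one sheet of paper (back-calculated) -> 2,550 prompts
```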

What about the water bottle stat?

From the Washington Post:

“A 100-word email generated by an AI chatbot using GPT-4 once requires 519 milliliters of water, a little more than 1 bottle.”

Which became, at least in my Instagram feed, “a water bottle a prompt”.

Masley links another post which attempts to reverse engineer how the Post got to that statistic, which seems to be everywhere. That breakdown is . In short, it’s a series of extraordinarily aggressive assumptions stacked on top of each other.

The author of that breakdown is less than generous to the Post on their motivations for publishing something that could rightly be construed as misleading. I am a bit more tempered on this. There has been a lot of bad reporting on this issue and the bar for journalism should be high[^8]. But I also think it’s just really hard to understand the context of any one water usage number–combine that with headlines optimized for clicks and it becomes easy to vastly misstate the magnitude of a problem. I think, in general, one’s bar for assuming ill intent or an agenda over incompetence should be really, really high. I genuinely feel that the former is much much rarer than the latter. It’s also a philosophy that encourages forgiveness and allows you to live your life less convinced that people around you are evil.

The above paints a picture, for me, that data centers indeed use water, but that there is little cause for alarm (again, on the water front). I have a general sense that increasing surface temperatures are making drier regions drier and wetter regions wetter, and that volatility may reduce our accessible freshwater. That absolutely strikes me as concerning. Data centers are just neither a big part of the consumption story nor an effective lever for affecting that issue.

What do people think who aren’t AI corporate stooges?

For some generally more critical coverage, check out this awesome , which also starts out from the same place: “most of the numbers you read in the news are the product of a bunch of assumptions and complexities that, without you the reader being aware of, make those numbers meaningless to you”. He plays all the hits! Evaporative cooling, the water use of electricity generation, corn!

And I think the conclusion that Hank comes to is one I completely agree with. Water is highly local–there is undoubtedly the potential for local stories of concern where a data center is being built in a water-constrained locale, and there’s probably, again, a local story about the economic value the town wants out of the data center being in tension with the wisdom of doing anything in that town that exacerbates water use. But in absolute terms, it is hard to make a case for general worry.

He does note that he is much more concerned about the electricity usage of AI.

Should I Be Worried About the Amount of Power AI Uses?

Confidence: medium

TL;DR Data centers right now are a small, relatively efficient part of national electricity usage and a minuscule fraction of your personal energy and emissions. The most alarmist narratives about aggregate data center demand growth rest on forecasts that a growing consensus holds to be overblown. It is a notable story in electricity demand growth, but one dwarfed by much larger electrification narratives like heating, cooling, and EVs.

This should be the sort of thing I’m more confident on, since I have more experience with power than with water. Unfortunately, a lot of the power issue is less about current power usage and more about where power usage might go. As , demand forecasts are notoriously upward biased. If you simply add up all the datacenter interest that different regional operators have gotten, you’ll overestimate due to duplication and a bunch of speculative interconnection requests. However, things like natural gas turbine purchase orders and Big Tech’s own statements do seem to indicate we’re in a bit of a regime change.

In short, the question is “will datacenter growth, possibly driven by AI, drive worrisome increases in our electricity consumption?”

I say possibly simply because the vast majority of things we do on the internet are not AI related. We scroll, shop, search[^9]; we write, date, download vast hours of entertainment, and livestream our coworkers’ bedrooms and offices; we spend somewhere on the order of in front of screens, consuming and producing and funneling bits through servers hosted in a datacenter somewhere. See this chart from the LBNL report:

Source: LBNL report

The non-AI processors (CPUs) still make up the majority of the power draw of datacenters. Obviously, this chart also guesses that by 2028 this will no longer be true. But it is for now, and it’s helpful to remember that.

So again, “will datacenter growth, possibly driven by AI, drive worrisome increases in our electricity consumption?”

Quickly, let’s start with the baseline. Per the LBNL report, in 2018, data centers consumed “76 TWh, representing 1.9% of total annual U.S. electricity consumption. U.S. data center energy use has continued to grow at an increasing rate, reaching 176 TWh by 2023, representing 4.4% of total U.S. electricity consumption.”[^10]

To further ground us, the following is a Sankey diagram of US energy use. It’s in quads; one quad is ~293 TWh.

I think data centers count as commercial, so we’re looking at 176 TWh out of 1,374 TWh, or about 13% of just commercial electricity usage.

I think it’s worth taking a moment to appreciate that this is amazingly efficient for an economy increasingly built on digital services. Your digital life is such a small part of your energy (and by extension your emissions) footprint. Here’s a great chart from Masley’s post:

So the baseline really isn’t alarming to anyone (other than our screen time being absurd; that’s alarming, but that’s for a different post). The projections are what worry us. LBNL projects data center power usage growing to “6.7% to 12.0% of total U.S. electricity consumption forecasted for 2028.”[^11] Equivalently, 325 to 580 TWh or, using a 50% average utilization rate, 74 GW to 132 GW of draw[^12]. The first thing to say is that it’s a big range! 6.7% would basically be steady growth since 2018, before which it was basically flat going back to the dot-com era of 2000-2005[^13].
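If you want to follow the footnote’s conversion yourself, here’s a minimal sketch, assuming the LBNL 325-580 TWh range and the 50% average utilization rate from the text:

```python
# Converting annual energy consumption (TWh) to the implied installed
# capacity (GW), assuming 50% average utilization as in the text.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours
UTILIZATION = 0.5          # assumed average utilization of installed capacity

def twh_to_gw(annual_twh: float) -> float:
    """Installed capacity (GW) implied by annual consumption at 50% utilization."""
    average_draw_gw = annual_twh * 1_000 / HOURS_PER_YEAR  # 1 TWh = 1,000 GWh
    return average_draw_gw / UTILIZATION

for twh in (325, 580):
    print(f"{twh} TWh/yr -> ~{twh_to_gw(twh):.0f} GW of capacity")

# 325 TWh/yr -> ~74 GW of capacity
# 580 TWh/yr -> ~132 GW of capacity
```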


Source: LBNL report, page 5

12% would be a lot more! And along with that might come

  1. Increased prices for other users of electricity (including you and me reee)
  2. Periods of reduced reliability as the grid adapts to so much load growth.

Let us examine the cases for the aggressive estimate and the conservative estimate.

Source: LBNL report, page 12

The Aggressive Power Case

If you aggregate utility 5-year load forecasts, you get very large numbers (drawing on work by Grid Strategies, which kicked off a lot of this discussion). Now this is for 2030, which doesn’t exactly line up with 2028, but these are very rough numbers anyway. That already gets us past the low estimate from the LBNL report. And this growth is not evenly shared across utilities. If these forecasts bear out, particular regions may see large peak load growth that strains their existing infrastructure and requires new build out whose costs would get passed on to consumers.

You can also just do the hard work of trying to track notable data center projects and add up the expected capacity. That’s what . The vast majority of this is still in the development phase, but I assume that means they more or less have their power secured. This is much more reliable since it’s less a forecast and more just counting up projects that will probably finish in 3-6 years. As more projects secure colocated generation or make it through the large load interconnection process, this number might go up, and then the datacenters might finish in time for the 2028-2030 period we’re looking at.

Idk.

I’m trying really hard here, but if you are finding this an unimpressive steelman I don’t blame you. The only way to see scary aggregate numbers is if you unquestioningly add up utility load forecasts or if you take certain proprietary analyses like those done by BCG and McKinsey at face value.

The Conservative Case

The problem is, with the exception of news outlets simply echoing them, no one serious reports utility load forecasts without big caveats. Take this warning from the Grid Strategies report, which calls out the 90GW number:

“Given lower estimates from other benchmarks, aggregate utility forecasts likely overstate 2030 national demand by roughly 25 GW.

- Data center load forecast for 2030 aggregates to about 90 GW, nearly 10% of forecast peak load, based on Grid Strategies’ analysis of utility and regional load forecast publications.
- Cleanview is tracking ~60 GW of data centers scheduled to begin operation before 2029 but may not track all projects.
- In mid-2024, based on anticipated shipments of processing chips for data centers, TD Cowen projected 65 GW of new power demand by 2030.
- Other data center market analysts also use utility-published data and information from data center developers to forecast data center and large load growth. Despite significant variations on scope and methods, all point to growth well below 90 GW.”

Aka if you try other methods like counting projects that we think will actually finish or aggregating all the chips that are being shipped for use in those data centers, you don’t get anywhere near 90GW!

Again, from the Grid Strategies report

“However, data center market analysts indicate that data center growth is unlikely to require much more than 65 GW through 2030. Similar growth is shown in one proprietary database of data center projects. This suggests that either the timing or the magnitude of FERC-submitted load forecasts collectively overstate data center-driven load growth by about 40%”

They’re taking the under on the low estimate from LBNL![^14]
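To line up the benchmarks quoted above (noting the 2028-vs-2030 mismatch the report itself acknowledges), here’s a quick sketch:

```python
# Comparing the 2030 data center demand benchmarks quoted above, all in GW.
utility_forecasts = 90  # aggregated utility load forecasts (Grid Strategies)
analyst_consensus = 65  # market analysts / TD Cowen chip-shipment estimate
lbnl_low = 74           # LBNL's low-end 2028 estimate, converted to GW earlier

overstatement = utility_forecasts - analyst_consensus
print(f"Utility forecasts exceed analysts by {overstatement} GW "
      f"({overstatement / analyst_consensus:.0%})")
print(f"Analyst consensus vs LBNL low estimate: {analyst_consensus} GW < {lbnl_low} GW")

# Utility forecasts exceed analysts by 25 GW (38%)
# Analyst consensus vs LBNL low estimate: 65 GW < 74 GW
```

That ~38% gap is the “about 40%” overstatement the Grid Strategies report describes, and 65 GW sitting below 74 GW is the sense in which analysts are taking the under.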

on why data centers are actually moving fairly slowly with regards to acquiring power.

He makes the point that, if markets expected data centers to drastically increase demand for electricity, you’d expect to see markets that track electricity futures rise, but we’re not seeing that.

Another way to look at this is to ask whether the existing data center buildout has impacted the prices that consumers like you and I pay. From :

“What about elsewhere? The Economist has adapted a model of state-level retail electricity prices from the Lawrence Berkeley National Laboratory to include data centres (see chart 2). We find no association between the increase in bills from 2019 to 2024 and data-centre additions. The state with the most new data centres, Virginia, saw bills rise by less than the model projected. The same went for Georgia. In fact, the model found that higher growth in electricity demand came alongside lower bills, reflecting the fact that a larger load lets a grid spread its fixed costs across more bill-payers. Still, problems may be coming. The clearest warning sign comes from PJM Interconnection, the largest grid operator in the country. Prices at auctions for future generation capacity there have soared, as data-centre growth has yanked up projected demand. That will hit households; PJM reckons the latest auction will lift bills by up to 5%.”

In other words, the data center growth we’ve already seen doesn’t seem to have hit ratepayers. On the other hand, ratepayers in PJM are about to be impacted by a particularly expensive future capacity market, where PJM buys guarantees from generators that they will be available to produce a certain amount of power. This is the primary market signal that PJM gives to producers to indicate what it thinks it will need to meet peak load in the future. How much power it needs to guarantee from the capacity markets is based on its load forecast. So while real data center construction doesn’t seem to have significantly affected ratepayers, the expectation of data center demand has. The question of whether this is justified once again falls on a regional operator’s load forecast (typically an amalgamation of its constituent utilities’ demand forecasts).
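To make the mechanism concrete, here’s a toy sketch of how an inflated peak-load forecast flows through capacity procurement into costs. The reserve margin and clearing price are hypothetical numbers of my own, not PJM’s actual figures:

```python
# A toy illustration (all numbers hypothetical) of why an inflated load
# forecast hits ratepayers: the operator procures capacity against its
# forecast peak plus a reserve margin, and that cost lands on bills.

RESERVE_MARGIN = 0.15         # hypothetical planning reserve margin
PRICE_PER_MW_YEAR = 50_000.0  # hypothetical capacity clearing price, $/MW-year

def capacity_cost(forecast_peak_mw: float) -> float:
    """Annual capacity procurement cost implied by a peak-load forecast."""
    procured_mw = forecast_peak_mw * (1 + RESERVE_MARGIN)
    return procured_mw * PRICE_PER_MW_YEAR

honest, inflated = 150_000, 160_000  # hypothetical peak forecasts, MW
extra = capacity_cost(inflated) - capacity_cost(honest)
print(f"Extra annual procurement from a {inflated - honest:,} MW over-forecast: ${extra:,.0f}")

# Extra annual procurement from a 10,000 MW over-forecast: $575,000,000
```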

Benoit Blanc voice: at the center of this hyeah case lies a big, fat question mark. Every concern, every dire warning, every consternation about datacenter demand seems to trace back, like Genesis, to an original sin: whether utility load forecasts are to be taken at their word.

I think once you believe, as I do, that utility data center load forecasts are heavily inflated, the data center demand growth question becomes much less alarming–still significant, still notable–but a minor player amongst much larger electrification narratives.

Let’s talk about those other narratives. It should give us a way to place these data center demand forecasts amongst all the other big drivers of electricity demand.

Hannah Ritchie, who writes Sustainability by the Numbers, has an excellent post on . She does a great job of summarizing a 2024 International Energy Agency report on sources of global electricity growth (note: switching contexts to global here).

Put into words: The energy transition demands electrification. We are already shifting exajoules of energy use away from inefficiently burning hydrocarbons to electric power sources. Electric vehicles will almost certainly force the grid to grow in new and unexpected ways; so will heat pumps and industrial electrification. The above numbers reflect not only net-new demand for warm homes and clean transport, but the transition to better, cleaner ways of providing the same quality of life. Mixed in with this momentous change, building on the same electrical infrastructure, is a new industry. Who knows if it will bear out; the economic narrative is not yet fully written. But the energy story, while notable, continues to be more sober and less relevant than we thought[^15].

Okay but what about my personal energy/emissions story?

I would point you to the .

TL;DR Much like water, you will not meaningfully impact your energy or emissions footprint regardless of what you do with an LLM. There are many useful comparisons to other things you do to emphasize this point. This is not a reasonable thing to be worried about regarding your personal AI usage.

Caveats

Power is local, less so than water, but it is still constrained by our ability to move it around via big transmission lines (which we need more of). There are absolutely thorny reliability problems if you have a bunch of new loads clustered on the network. There are tricky problems about cost sharing, about who bears the cost of an upgraded transformer. Ireland in particular seems to be having a bad time of it.

Furthermore, tech companies racing to power new chips for their AI products are increasingly turning to colocated natural gas generators to reduce their effect on peak demand. To me this is a grid failure–in the same way gigawatts of renewable energy sit fallow in interconnection queues, large loads face long delays and so are exploring ways around them. Regardless, this has an emissions impact. Significant enough that it seems even companies like Google, which at one point had eliminated its historical carbon footprint[^16] and literally invented stricter standards about being 24/7 carbon free (as opposed to simply offsetting the periods of the day when you couldn’t run exclusively on renewables), .


[^1]: I mean, any more than is already the case for anyone with general exposure to the American economy. See Kayla Scanlon’s . And of course, Google does AI, and I received equity in that company. So in a very real sense, my financial exposure to a generalized bet on AI probably hasn’t changed a ton. But I think you guys know what I mean: I’ll probably be a lot more emotionally invested in OpenAI’s success and the generalized bet on AI, and others will likely perceive me as such.

[^2]: This is true in a second law of thermodynamics way, but also in the way that, even if we limit our concern to the eras when the Earth is awash in seemingly unlimited solar energy, there is still the opportunity cost of our choices. Even a Type 1 Kardashev civilization, with mastery of all solar and planetary energy, suffers the cost of choosing one use over another.

[^3]: Which I got from hehe obviously

[^5]: Now if you’re some other combination, like I hate AI but I also hate the environment or you’re already convinced that the environmental concerns are overblown, congrats.

[^6]: Project Drawdown being a noble effort in this vein

[^7]: Hydro is really great. I think there are some concerns about the ecological impacts of damming rivers, but from a clean energy standpoint hydro is sweet if you’ve got it. Extremely reliable and it provides inertia to the grid. When you see stuff like , it’s often a combination of a good solid energy transition and a lot of hydropower that they were lucky enough to be able to take advantage of.

[^8]: Masley for his part has I think done a reasonable job reaching out and trying to correct or otherwise give context. See and .

[^9]: Those of you who are less fun at parties will at this point say “but your feed is determined by AI and every Google search has gemini”. That last point is true, but the first isn’t in the way that most people refer to AI. AFAIK recommender systems are still using machine learning systems that, while of course sharing machinery, do not resemble the LLMs people think of as AI.

[^10]: Page 5

[^11]: Page 6

[^12]: Draw meaning instantaneous power draw. It took me a while for power vs energy to click: a terawatt is a measure of power, an instantaneous rate; a terawatt-hour is a measure of energy, power sustained over time. Also, you can follow along with this conversion from terawatt-hours to gigawatts by taking the 325 TWh number, dividing it by the number of hours in a year (365 × 24) and then multiplying by 2 to account for 50% utilization during that year.

[^13]: Both big increases in efficiency and consolidation around big cloud providers meant increases in compute at basically no increased power draw.

[^14]: I am uncertain if there’s a large variance on what utilization factor everyone uses, that could have a large impact!

[^15]: Ritchie notes the IEA continually revising their data center numbers downward.

[^16]: Meaning that Google believes it offset the entire amount of carbon emitted in its history.

[^17]: I sort of imagine the average person as being like “huh ChatGPT sure is helpful sometimes but also I’m worried about some stuff”. I mean, that’s how I feel a lot of the time. I’m just a .