How environmentally harmful is working with AI? Everything we know so far

When it comes to figuring out the environmental impact of AI as a creative collaborator, it’s smoke and mirrors. We try to cut through the plume.

Light and Shade is a new series exploring the challenges at the heart of the AI-creative conversation. As AI becomes increasingly present across the creative industries, the series examines the opportunities and dilemmas our community grapples with. It is grounded in interviews with technologists, researchers, artists, designers, creative founders, writers, lecturers and environmental and computational experts, offering a fuller view of the many sides of the story of AI’s creative influence.

Most of us have heard by now that AI guzzles water like it’s just crossed a marathon finish line. Over the past year, headlines have warned of AI servers requiring vast quantities of water to keep cool, and simple AI queries generating many times more carbon than a standard Google search. In the creative industry, these stories only add to existing tremors of guilt and trepidation. Designers and artists who have begun to experiment with, or even rely on, generative AI now face a fresh catch-22: I know it’s bad for the environment, but soon I might have to use it, for the sake of my career.

“If you work as a designer, you need to have an iPhone and social media,” says Linda Dounia Rebeiz, an artist, designer and writer interested in the philosophical and environmental implications of technocapitalism. “So you’re fundamentally participating in a very unethical system, right? It’s the same with AI. This is a defining feature of capitalism. It makes you used to closing your eyes [in order] to enjoy little conveniences. But not just that, it locks you into conveniences, because it turns them into tools for production that you need to work.”

How deeply we are locked into the convenience of AI is already startling. In just a few short years, generative AI has gone from parlour trick to embedded tool. In personal work and projects for major agencies alike, it’s used iteratively, sometimes for hours at a time – just last year, Squarespace’s CCO David Lee described how his team uses AI to rapidly prototype or “validate” creative ideas prior to production. Most of us are aware that this comes with an environmental cost. The trouble is, even as these tools become part of daily practice, nobody is very sure what that cost actually is.

Panvel, India, is home to the data centre Yotta NM1, which occupies 76,180 sq. metres. It’s reported to have cost $1.1 billion to build.

The answer to this, and other questions about AI and the environment, isn’t easy to come by. Unlike Google Flights, which now displays an estimated carbon emission for each flight option to help guide user decisions, Midjourney and DALL-E don’t tell you how much carbon your usage emits, or how to tweak that usage accordingly. And it’s not just users who are left in the dark – it’s researchers too.

One currently working in this space is Alex de Vries, a PhD candidate at VU Amsterdam’s Institute for Environmental Studies: “It’s a struggle,” he says, “because the tech companies providing these applications aren’t giving us the actual numbers behind the scenes. [In this field], you’ll find very frustrated researchers who can’t give very concrete answers.”

Given how much is shrouded in secrecy, is it possible to get a clear picture of the environmental impact of AI? Well, sort of. Using all the available data, most of it sourced not from tech companies but from researchers, we’ve attempted to answer some of the most common questions creatives might have on this subject – from immediate inquiries, like “How much carbon does it cost to generate an image?”, to bigger-picture concerns about where responsibility lies in the supply chain.

We’d like to note that there are limitations to this data. Alex, one of the three experts we leaned on during this process – someone who has spent years tracking these estimates – warns that in this field, “there’s only blind spots”. Our own answers involve similar aggregation, combining findings from published studies with first-hand insights from researchers. In this spirit of expert collaboration, we’d like to leave any AI platform developers reading this with a parting invitation from Alex, who said during our interview:

“The tech companies are welcome to give us better numbers if we’re doing it wrong.”

How much carbon does a standard AI query cost?

In 2024, a peer-reviewed study by Alex de Vries found that a single ChatGPT query could use around ten times more energy than a traditional Google search.

The study took into account multiple sources, including a 2023 statement from Google parent company Alphabet’s chairman, who said that interacting with a large language model (LLM) could “likely cost 10 times more than a standard keyword search”. This aligns with a 2023 estimate from SemiAnalysis that ChatGPT’s electricity consumption per request could be as high as 2.9 Wh (watt-hours), compared to Google’s 0.3 Wh.
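To make those numbers concrete, here’s a back-of-envelope sketch using the per-request figures above. The 2.9 Wh and 0.3 Wh values are the estimates quoted in the study; the daily query count and the grid carbon intensity are our own illustrative assumptions, since real-world figures vary by provider and region.

```python
# Back-of-envelope sketch using the per-request figures quoted above.
# The 2.9 Wh (ChatGPT) and 0.3 Wh (Google) values come from the SemiAnalysis
# estimate cited in the text; the daily query count and the grid carbon
# intensity are illustrative assumptions, not reported data.

CHATGPT_WH_PER_QUERY = 2.9   # watt-hours per ChatGPT request (2023 estimate)
GOOGLE_WH_PER_QUERY = 0.3    # watt-hours per standard Google search
GRID_G_CO2_PER_KWH = 400     # assumed grid intensity, grams CO2e per kWh
QUERIES_PER_DAY = 50         # hypothetical daily usage by one person

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
daily_kwh = CHATGPT_WH_PER_QUERY * QUERIES_PER_DAY / 1000
daily_g_co2 = daily_kwh * GRID_G_CO2_PER_KWH

print(f"ChatGPT vs Google energy ratio: ~{ratio:.0f}x")              # ~10x
print(f"Energy for {QUERIES_PER_DAY} queries: {daily_kwh:.3f} kWh")  # 0.145 kWh
print(f"Approximate emissions: {daily_g_co2:.0f} g CO2e per day")    # ~58 g
```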

It’s worth noting that Alex had no direct access to ChatGPT’s or Google’s actual emissions data while conducting this study; he worked with modelling and available data.

Short answer: A ChatGPT search in 2023 could use roughly ten times more energy than a standard (non‑AI) Google search.

Image generation could use over 60 times more energy than text, based on a 2024 study from Hugging Face (not peer-reviewed).

How much carbon does it cost to code with AI?

In a 2025 study evaluating 14 LLMs across a range of subjects, researchers found that models with greater reasoning capacity could produce up to 50 times more CO2 emissions than models giving concise answers. Questions that required lengthy reasoning processes, like abstract algebra or philosophy, led to emissions up to six times higher than more straightforward subjects, like high-school history.

If your coding project requires lengthy AI responses, or requires the model to show its “workings out”, it could come with a higher carbon cost. “If you’re asking an LLM to write code, the typical output might be longer than a regular question,” says Alex. “Longer responses are more energy intensive than short responses. How much more, exactly? It depends on the baseline.” And if users run into mistakes that need debugging with the help of AI, this all adds to a lengthening interaction with the model.

Alex adds that coding is just one example of how users might work with AI: “You can also use LLMs to help (re)write essays or complete novels. Inputs may be longer in these cases, which affects energy costs as well.”
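There’s no published per-token figure to lean on here, but as a rough illustration of Alex’s point – longer inputs and outputs mean more energy – here’s a sketch that assumes energy scales roughly in line with the amount of text a model reads and writes. The 2.9 Wh baseline is the per-request estimate quoted earlier; the token counts and the “typical request” size are hypothetical.

```python
# Illustrative sketch only: it assumes energy scales roughly linearly with the
# tokens a model reads and writes, which is a simplification. The 2.9 Wh
# baseline is the per-request estimate quoted earlier; the token counts and
# the "typical request" size are hypothetical.

BASELINE_WH = 2.9       # estimated Wh for a typical short ChatGPT request
BASELINE_TOKENS = 500   # assumed tokens processed in that typical request

def estimated_wh(prompt_tokens: int, output_tokens: int) -> float:
    """Scale the baseline estimate by total tokens processed (a rough proxy)."""
    total_tokens = prompt_tokens + output_tokens
    return BASELINE_WH * total_tokens / BASELINE_TOKENS

# A quick question vs. an iterative coding session with long outputs
# and several debugging follow-ups.
print(estimated_wh(50, 200))           # quick question: ~1.5 Wh
print(estimated_wh(1_500, 4_000))      # one long code generation: ~32 Wh
print(5 * estimated_wh(2_000, 3_000))  # five debugging rounds: ~145 Wh
```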

Short answer: Unknown, but longer, more complex reasoning tasks could emit up to six times more CO2 than straightforward ones.

Covilhã Data Centre in Portugal sits at approximately 75,000 sq. metres and is estimated to have cost over €90 million.

How much carbon does it cost to generate an image?

Studies in this area are sparse, but there are some broad estimates. A 2024 study from open-source AI community Hugging Face found that image generation can use, on average, over 60 times more energy than text generation. The least efficient model they tested consumed as much energy as 52 smartphone charges per 100 images – roughly a half charge per image.

These figures haven’t been peer-reviewed and come from non-commercial, open-source models rather than major industry tools. Still, the baseline carbon intensity difference between image and text production has big implications for the creative industry, where batch image generation is common (DALL-E permits four images per prompt through its free tier) and AI is often used to “rapidly prototype” (produce lots of test images, fast).
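As a rough illustration of what that prototyping habit might add up to, here’s a sketch layered on the Hugging Face figure above. The 52-charges-per-100-images ratio comes from that (non-peer-reviewed) study; the 12 Wh per full smartphone charge and the session sizes are our own assumptions.

```python
# Rough sketch layered on the Hugging Face figure quoted above (least
# efficient model: ~52 smartphone charges per 100 images). The 12 Wh per full
# smartphone charge and the session sizes are assumptions for illustration.

CHARGES_PER_IMAGE = 52 / 100   # ~0.52 smartphone charges per image
WH_PER_CHARGE = 12             # assumed watt-hours for one full phone charge

def session_kwh(images: int) -> float:
    """Estimated energy (kWh) for generating a batch of images."""
    return images * CHARGES_PER_IMAGE * WH_PER_CHARGE / 1000

# A single four-image prompt vs. an afternoon of "rapid prototyping".
print(f"{session_kwh(4):.3f} kWh for one 4-image prompt")   # ~0.025 kWh
print(f"{session_kwh(200):.2f} kWh for 200 test images")    # ~1.25 kWh
```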

Then, there’s film. Anyone who follows platforms like Runway on Instagram has likely seen huge leaps in AI applications for motion design and experimental short filmmaking. “It gets a lot more complicated when you get to video generation,” says Alex, “because that is also a lot more energy intensive. But it also depends on what you’re generating. Are you generating a couple seconds, or are you generating very [long scenes]?” Either way, if this is the cost of generating images alone, “just think about what that means for generating video,” he says.

Short answer: Image generation could use over 60 times more energy than text generation.


How much carbon does it cost to train your own generative model?

Some creatives engage with AI by creating their own bespoke models – setting up personal AI systems that are then trained on their own artwork. Interestingly, Alex says this is an area where creatives could actually gain some control over their carbon footprint – if they take the step of building their own models and running them on their own computing hardware. Only this would allow users to track their energy use directly and independently.
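For anyone curious what direct tracking could look like in practice, below is a minimal sketch for a locally run model on an NVIDIA GPU, using the pynvml bindings to sample the card’s power draw over the course of a run. This is our suggestion rather than a method Alex prescribes; the sampling interval and the placeholder workload are illustrative, and open-source tools like CodeCarbon build a fuller version of the same idea.

```python
# Minimal sketch of direct energy tracking on a local NVIDIA GPU, using the
# pynvml bindings (pip install nvidia-ml-py). The sampling interval and the
# placeholder workload are illustrative, not a prescribed method.
import threading
import time

import pynvml

def measure_gpu_energy(run_workload, interval_s: float = 1.0) -> float:
    """Sample GPU power draw while run_workload() executes; return watt-hours."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    samples = []
    done = threading.Event()

    def sampler():
        while not done.is_set():
            watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # mW -> W
            samples.append(watts)
            time.sleep(interval_s)

    thread = threading.Thread(target=sampler)
    thread.start()
    run_workload()  # e.g. a fine-tuning loop or a batch of image generations
    done.set()
    thread.join()
    pynvml.nvmlShutdown()

    avg_watts = sum(samples) / max(len(samples), 1)
    return avg_watts * len(samples) * interval_s / 3600  # watt-hours

# energy_used_wh = measure_gpu_energy(my_training_run)  # hypothetical workload
```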

Of course, this too comes with a warning. “If everyone is going to develop their own applications, it’s also [going to be] extremely energy inefficient,” Alex says.

Short answer: Unknown. In theory, training a small-scale model could be more environmentally friendly than a large-scale model, but massively inefficient if everyone does it.

China Telecom’s data centre in Hohhot, Inner Mongolia, spans 994,062 sq. metres – that’s around 139 football pitches.

Does AI produce more carbon than traditional creative pipelines?

This one stopped us in our tracks. A February 2024 study gathering data from a range of published sources found that AI emits between 130 and 1,500 times less CO2e (carbon dioxide equivalent) than human writers (per page of text), and between 310 and 2,900 times less than human illustrators (per image). These figures compare AI’s per-task carbon estimates with the amount of carbon a human typically produces in the hours the same work would take, plus standard computational costs.

The carbon “wins” here largely come from the speed at which AI can produce content in comparison to humans. However, it’s this very efficiency that encourages creatives to use AI in a supplementary manner – alongside their own work, rather than instead of it.

This study does not account for working iteratively or in tandem with AI. It also notes that: “The freed human time [from using AI] may also incur new unexpected environmental costs.” We found no studies comparing AI to heavier production pipelines, like film.

Short answer: If you directly compare one image made by a human with one made by AI, and the AI-generated image involves no revisions, iterative editing, batch generation or human creative time to produce, and if the time saved by using AI isn’t spent on other carbon-intensive activities, then yeah, sure: an AI image produces less carbon.


What’s the environmental impact of AI beyond carbon?

So far, we’ve spoken mainly about carbon. But energy is only part of the cost. For communities living next to data centres, the impact can be felt far closer to home.

The pressure on data centre capacity has increased with the acceleration of AI, which means more pressure on the areas and communities where data centres are built. In Europe, they are normally sited in locations with cheap and abundant power, or in areas which offer tax incentives. The highest concentration of data centres is in the US. So high is the concentration in Loudoun County, Virginia, that it’s referred to as “Data Centre Alley”: home to 30 million square feet of data centres, or about 1.08 square miles.

The impacts on communities like Loudoun are significant. There is noise pollution from data centres affecting nearby neighbourhoods, as well as increased health risks caused by pollutants produced by generators, including particulate matter, nitrogen oxide, sulfur dioxide and carbon dioxide.

Data centres are a key research area for Noman Bashir, computing and climate impact fellow at MIT. “Once they are installed, data centres have very high fluctuations in their power demand, because AI workloads are not stable. This impacts the power grid, potentially causing more blackouts in the neighbouring communities,” Noman says. Then, “whenever the power goes down, the on-site gas and diesel generators that are used by data centres activate and release sulfur dioxide and nitrogen dioxide gases into the environment.”

In Loudoun, Noman says that the lack of community infrastructure in the areas directly surrounding data centres means some local workers also face a shortage of everyday amenities, with twenty-minute journeys to reach a grocery store.

As Noman puts it, “just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud.”

Short answer: Noise pollution, power cuts and increased health risks for local communities situated near data centres.

Costing an estimated $1.3 billion, Switch’s 668,901 sq. metre data centre in Nevada, USA, is known as The Citadel.

Who is responsible for AI’s environmental impact?

Both Noman and Alex believe that the onus for responsible AI use is too often placed at the user’s door. “Users often don’t even know the carbon footprint of their actions,” Noman says. Alex agrees, stating that without publicly available data, “It’s just not realistic to ask an end user to be more responsible.”

Shaolei Ren, an associate professor who researches AI’s thirstiness, puts it another way. “So for AI users, the direct carbon emissions from usage are essentially zero, since we are not the ones operating the machines or consuming the energy; those activities are carried out by upstream entities such as the technology companies running the models.” However, Shaolei adds, “When you ask who is responsible, it may be helpful to distinguish between direct and indirect responsibility. In a broader sense, all parties are involved, whether upstream or downstream in the entire value chain, which is why collective efforts from both the supply side and the user demand side are necessary.”

Noman adds another view: those who are responsible, and those who have the power to make any “real change”, are those who drive the pace of AI development. “It’s impossible to build sustainable data centres at this pace,” he says. “If you were to give me 10 years or 15 years, as opposed to the next two years, I would be able to build those sustainably.” This intense focus on performance and pace is set not by individual AI developers, but at market and stakeholder level. That is where agency lies, and therefore responsibility.

Short answer: AI companies and their stakeholders are directly responsible.


What can I do to minimise my AI carbon footprint?

It can be easy to feel like the only option for sustainable AI is going cold turkey. Surprisingly, Shaolei doesn’t think that AI abstinence is the way to go. “There are mostly two extremes [on AI],” says Shaolei. “One is there are no environmental problems, it’s perfect and the other is AI is using so much energy, it’s harming the planet, let’s stop using it. But it’s more complex than that.”

When it comes to what users can do to help, the options are limited. But users are not powerless. Currently, there is little incentive for tech companies to use low-carbon energy – “even a carbon tax wouldn’t impact tech companies significantly as their profit per ton of carbon is much higher than traditional industries,” says Noman – but Shaolei believes that if user demand for greener AI builds, it could create the critical mass needed to change that.

“If, let’s say, most users want to have some clean, greener version of AI models, then model developers and even upstream vendors would have stronger incentives to shift toward those greener options,” says Shaolei. Google Flights, in this instance, is a good model we can look to. Just as consumer pressure following the pandemic encouraged travel companies to offer more sustainable flight options, creatives can create their own microcosm of consumer demand, keeping their prompts to AI providers loud and constant.

In the meantime, adding “-ai” to your Google searches will at least remove the automatic AI summary feature. Small wins.

Short answer: Apply consumer pressure.

Mesa in Arizona houses Apple’s data centre, coming in at a tidy 120,773 sq. metres. For comparison, that’s 52 Covent Garden-sized Apple Stores.

