
Asking ChatGPT what's for dinner? Here's how much energy that costs

Most of us do it without thinking: you open an AI tool, type a quick question, and wait for the answer to roll in. Easy. Fast. Apparently harmless. But what if those everyday questions come with a much higher energy bill than you expect?

That’s the starting point of Lola Solovyeva’s research as a PhD candidate at UT. She studies something most people rarely consider: the energy use of large language models, the technology behind tools like ChatGPT, Copilot, and other new AI-powered search engines. “AI doesn’t run on magic. It runs on electricity.”

A single question can already be surprisingly expensive

When Lola talks about AI, she talks about it almost the way someone talks about a car engine: efficient here, wasteful there, smooth in some parts, clunky in others. Except this “engine” runs inside data centres. Every time we ask a question, that engine starts up. “People use AI for the smallest things now,” Lola says. “Each request activates large systems in a data centre somewhere. That costs energy, far more than most people think.” Often, the biggest surprise for new audiences is this: the energy cost isn’t only in training the model; it’s also in using it. Every answer you receive is the result of millions, and in larger models billions, of calculations. Sometimes we make it worse without realising.

The way you talk to AI

Something Lola sees in her research is how human behaviour affects energy use, much more than we assume. For example:

  • Asking vague questions → AI produces long answers → more energy.
  • Asking follow-up questions because the first one wasn’t clear → more energy.
  • Being polite (“thank you!”, “good morning!”) → extra responses → more energy.

Her personal favourite example? “There was a post online saying you should thank the model after it responds,” she laughs. “People don’t realise that even a quick ‘thanks’ triggers a whole new response and therefore more energy use.” Even the AI systems themselves add to the problem. Many tools reply with cheerful intros or overly long explanations simply because that’s how they were designed. “They were trained to be friendly and helpful,” Lola says. “But the energy depends on the number of words AI produces. As a result, more elaborate or highly enthusiastic responses generally require additional energy. And because we’re talking about millions of users worldwide, that adds up.”

Can we make AI less wasteful?

Lola studies two things: how to get AI to use less energy per answer, and how energy-efficient AI-generated code really is. Right now, she’s exploring why many models produce such long responses and how to reduce that without harming the quality of the answer. Shorter output = fewer calculations = less energy.

The second question matters because developers increasingly rely on tools like GitHub Copilot to write software, and LLM-written code isn’t always as clean or efficient as human-written code.

For popular languages like Python, AI produces correct code more often than not, and that code can sometimes be as efficient as human-written code. For Java or C++, however, AI produces correct code less frequently, and its efficiency is not yet close to that of human-written code. That means extra fixes, and extra energy. “If you need to ask the model again to repair the code, that’s another burst of energy,” she explains. “It all builds up.”
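The scaling Lola describes can be sketched with a toy calculation. This is a minimal illustration, not a figure from her research: the per-token energy number below is an invented assumption, and real costs vary enormously by model, hardware, and data centre.

```python
# Toy model: inference energy grows roughly with the number of
# generated tokens, since each new token requires another forward
# pass through the model.

ENERGY_PER_TOKEN_J = 3.0  # hypothetical joules per generated token (made up)

def response_energy_joules(output_tokens: int) -> float:
    """Rough, illustrative energy estimate for one AI answer."""
    return output_tokens * ENERGY_PER_TOKEN_J

# A chatty 400-token answer vs. the same content asked for briefly:
verbose = response_energy_joules(400)
brief = response_energy_joules(60)
print(f"verbose: {verbose:.0f} J, brief: {brief:.0f} J")
```

Whatever the true per-token figure turns out to be, the ratio is the point: asking for an answer that is six times shorter cuts the estimated inference energy by roughly the same factor.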

Should we be worried?

Lola is not pessimistic, but she is realistic. AI use is exploding. AI answers are appearing inside search engines. More people rely on them for work, study, and even entertainment. “The impact is definitely growing,” she says. “Not because AI is bad, but because we’re using it for everything, sometimes without thinking.” Still, she doesn’t argue for using less AI. She argues for using it smarter.

So what can regular users do? More than you’d expect. Her three tips:

  • Use AI when it genuinely helps you. Not for things you could easily do yourself.
  • Be specific in your question. Clear questions produce shorter, more accurate answers.
  • Say what you want. Add: “Answer briefly.” The model will follow.

“It sounds small,” Lola says, “but if lots of people do this, the energy savings really matter. Don’t be scared to ask AI a longer question if it leads to a more specific and shorter answer, because from an energy point of view, the length of the question matters far less than the length of the response.” Behind every AI-generated sentence sits hardware, electricity, and (whether renewable or not) a footprint. That’s something Lola hopes more people will realise. Her work is part of a bigger movement at UT to connect digital innovation with climate responsibility. As our world becomes increasingly digital, these invisible impacts become increasingly important.