Why Businesses Are Asking How Much Energy AI Systems Are Using
If you're using AI tools at work, you're probably not thinking about how much electricity they consume. But that question is getting louder, and for good reason.
AI isn't a future technology anymore. Many UK businesses already rely on it daily, for customer service, content, data analysis, and admin. As it embeds itself further into everyday operations, more business owners are asking what that means for energy demand, costs, and their sustainability commitments.
This guide explains how much energy AI systems actually use, why it matters for UK businesses, and what you can do to stay on top of your energy costs in a market that's changing quickly.

How Much Energy Does AI Actually Use?
On its own, a single AI query doesn't use a lot of electricity.
According to OpenAI CEO Sam Altman, a single ChatGPT query uses approximately 0.34 watt-hours of electricity, roughly the same as running an energy-efficient light bulb for a couple of minutes. Research from Epoch AI, published in February 2025, puts the figure for a typical GPT-4o query at a similar level, around 0.3 watt-hours, which is ten times less than older estimates that circulated widely a few years ago, thanks to more efficient hardware and model improvements.
It's worth knowing that these figures vary quite a bit depending on the tool and the complexity of the task. Estimates across studies range from 0.3 watt-hours at the lower end up to around 3 watt-hours, with differences reflecting variations in model size, hardware efficiency, and measurement methods. More computationally heavy tasks, like generating images or running reasoning models, sit toward the top of that range.
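If you want to sanity-check the light-bulb comparison yourself, here's a rough back-of-envelope sketch in Python. The 10 W bulb rating is our assumption for illustration; energy-efficient bulbs typically sit somewhere around 5 to 12 W.

```python
# Rough sanity check: how long would a 10 W LED bulb run on the
# energy of one typical AI query (~0.34 Wh, per OpenAI's figure)?
# The 10 W wattage is an illustrative assumption.
bulb_watts = 10
query_wh = 0.34

minutes = query_wh / bulb_watts * 60  # hours of runtime, converted to minutes
print(f"{minutes:.1f} minutes")       # ~2.0 minutes
```

That lines up with the "couple of minutes" comparison quoted above.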
One thing is consistent, though: a single query uses a small amount of electricity. That's not what businesses should be focused on. The more useful question is what happens when AI is used across a whole organisation, every day.
AI Adoption Is Growing and So Is the Energy That Comes With It
One ChatGPT query here and there is barely a rounding error on your electricity bill. But AI doesn't work like that anymore.
Most businesses now use it across multiple departments. Customer service teams run AI-powered chat tools around the clock. Marketing teams use generative AI for content. Finance, operations and HR teams rely on it for analysis, forecasting and automation. Each interaction uses a small amount of electricity. Multiply that across a team, then an organisation, then every organisation doing the same thing, and the numbers start to look very different.
Generative AI queries worldwide could hit 329 billion per day by 2030, according to Schneider Electric projections. All of it runs through data centres that operate continuously, requiring energy not just for computing, but for cooling and backup systems too.
According to Electric Insights, the data centres that power AI models helped push Britain's electricity demand up by 1.7% in 2025. That's a meaningful shift, and the trend is pointing firmly upwards.
Are AI Tools Contributing to Rising Business Energy Bills?
AI tools won't appear as a line item on your electricity bill. You won't see "ChatGPT usage" alongside your unit rate. But that doesn't mean AI has no bearing on what you pay.
According to NESO's Future Energy Scenarios 2025, data centres consumed an estimated 7.6 terawatt-hours in 2024, equivalent to around 2% of Great Britain's electricity demand. NESO's own modelling projects that figure rising to between 20 and 41 terawatt-hours by 2035.
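Those projections imply a steep growth curve. A quick calculation, using only the NESO figures cited above, shows the multiple involved:

```python
# How much could UK data-centre electricity demand grow by 2035?
# Figures are from NESO's Future Energy Scenarios 2025, cited above.
demand_2024_twh = 7.6          # estimated 2024 consumption
low_2035, high_2035 = 20.0, 41.0  # NESO's 2035 projection range

print(f"Low scenario:  {low_2035 / demand_2024_twh:.1f}x 2024 demand")
print(f"High scenario: {high_2035 / demand_2024_twh:.1f}x 2024 demand")
```

Even the low end of NESO's range is more than two and a half times today's consumption; the high end is over five times.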
More demand on the grid means more pressure on wholesale electricity prices. The electricity data centres consume is bought on the same wholesale market that your energy supplier uses to serve businesses like yours. When wholesale prices go up, variable tariffs feel it immediately, and fixed tariffs feel it at renewal.
As analysis submitted to Parliament has noted, AI data centres aren't just an electricity consumption issue. They're also a grid infrastructure issue, and the effect may show up not only in the wholesale unit rate but also through higher network and non-commodity charges, especially on contracts where third-party costs are passed through.
AI isn't the only driver of energy prices, and it'd be an overstatement to pin your bill on it. But it's now part of the picture, and it's worth understanding.
If you want to get a clearer view of how your tariff is structured and when it might be worth reviewing your deal, our business energy hub covers the different options available to UK businesses.
What Is the Carbon Footprint of AI?
The carbon footprint of AI is directly tied to how the electricity powering data centres is generated.
In August 2025, Google published a detailed breakdown showing the median Gemini text prompt consumes about 0.24 watt-hours and produces 0.03 grams of CO2 equivalent. On a per-query basis, that's a very small number. At scale, it adds up.
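To illustrate how "very small" becomes "adds up", here's a quick scaling sketch using Google's per-prompt figure. The daily prompt volume is a hypothetical assumption chosen for round numbers, not a reported figure.

```python
# Scaling Google's per-prompt emissions figure (0.03 g CO2e,
# cited above). The prompt volume is a hypothetical assumption.
G_CO2_PER_PROMPT = 0.03              # grams CO2e per median Gemini prompt
prompts_per_day = 1_000_000_000      # hypothetical: one billion prompts

tonnes_per_day = G_CO2_PER_PROMPT * prompts_per_day / 1_000_000  # g -> tonnes
print(f"{tonnes_per_day:.0f} tonnes CO2e per day")               # 30 tonnes
```

Thirty tonnes a day from a figure measured in hundredths of a gram: that's the scale effect in a nutshell.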
A December 2025 study in the journal Patterns estimated that AI systems running in data centres could produce between 32.6 and 79.7 million tonnes of CO2 in 2025, comparable at the lower end to Norway's annual emissions.
The direction of travel for the major tech companies isn't encouraging either. Google's own environmental report showed total greenhouse gas emissions up 51% since 2019, with AI a key driver, while its data centres consumed 30.8 million megawatt-hours of electricity in 2024, more than double the amount in 2020. Microsoft's emissions have risen 23.4% since 2020, despite pledges to be carbon negative by 2030, mainly due to the energy demands of AI and cloud computing.
For businesses with ESG commitments or sustainability targets, this matters. Digital operations are increasingly part of the overall carbon picture, and the energy suppliers you work with vary in their approach to renewables and fuel mix. If that's relevant to your business, it's worth understanding how different energy suppliers generate the electricity they sell.
How Can Businesses Stay on Top of Their Energy Costs?
AI is part of how businesses run now. The goal isn't to avoid it. It's to make sure your energy costs aren't working against you while you're using it.
That starts with knowing what you're on. A lot of businesses are still sitting on tariffs that made sense when they signed them but don't reflect the market anymore. Comparing your options takes minutes and could make a real difference, particularly if your contract is coming up for renewal.
With Love Energy Savings, you can compare tariffs across our panel of carefully selected suppliers without spending hours doing the research yourself. We've been helping UK businesses find better energy deals since 2007, with over 500,000 businesses helped and more than £150 million saved. We do the hard work so you don't have to.
See what you could save. It only takes a couple of minutes.
AI Energy Usage FAQs
How much energy does AI use per prompt?
A typical text-based AI prompt uses around 0.3 to 0.34 watt-hours of electricity, depending on the tool and the complexity of the task, according to figures from Epoch AI and OpenAI. More demanding tasks, like image generation or reasoning models, can use considerably more.
How much energy does a ChatGPT query use?
According to OpenAI CEO Sam Altman, a standard ChatGPT query uses around 0.34 watt-hours. That's roughly the same as running an energy-efficient light bulb for a couple of minutes. Estimates do vary across studies, and more complex prompts will use more.
How much more energy does AI use than a Google search?
The widely cited "10 times more" figure comes from earlier estimates: the Electric Power Research Institute (EPRI) put a standard Google search at approximately 0.3 watt-hours, against older ChatGPT estimates of around 3 watt-hours. More recent figures from Epoch AI and OpenAI put a typical ChatGPT query at roughly 0.3 watt-hours too, which would make the gap far smaller than the older comparison suggests.
Why does AI use so much energy?
AI relies on large data centres that run continuously. While each individual request uses a small amount of electricity, it's the scale of usage across millions of queries per day, combined with the energy needed for cooling and backup systems, that makes the cumulative demand significant. Schneider Electric projects generative AI queries worldwide could reach 329 billion per day by 2030.
What is the carbon footprint of AI?
It depends largely on how the electricity powering data centres is generated. Google has reported that a median Gemini text prompt produces around 0.03 grams of CO2 equivalent. At a global scale, a December 2025 study in Patterns estimated AI systems could produce between 32.6 and 79.7 million tonnes of CO2 in 2025. If sustainability is a priority for your business, it's worth looking at renewable energy tariffs when you next compare.
Does AI have a large carbon footprint?
Per query, the carbon footprint is very small. At scale, it's becoming increasingly significant. The bigger factor is how the electricity powering AI infrastructure is generated. Data centres running on renewable energy have a much lower impact than those relying on fossil fuels. Google's detailed August 2025 methodology is one of the most transparent disclosures from any major provider to date.
Is AI affecting business energy bills?
Not directly. AI tools don't appear on your electricity bill. But growing data centre demand puts pressure on wholesale electricity prices, which can feed through to the rates businesses pay, particularly on variable tariffs or at the point of renewal. NESO's Future Energy Scenarios project UK data centre demand could more than double by 2035.
How can businesses keep on top of their energy costs?
Regularly reviewing your tariff is the most straightforward step. Comparing business energy deals takes minutes and ensures your rates still reflect market conditions. Love Energy Savings makes it easy to compare tariffs across our panel of suppliers in one place.
How do I compare business electricity tariffs?
Head to our business energy comparison page to see what's available. We compare tariffs across our panel of suppliers so you can find the right deal for your usage and contract type.