This blog was authored by: Ben Jarvis, Data and AI CTO | 25 March 2025

 

Artificial Intelligence (AI) is set to transform industries by providing new capabilities and efficiencies. However, the market is noisy, with discussions about AI evolving on a weekly basis.

 

We have created a new thought leadership series, ‘Beyond The Hype’, to help organisations cut through this noise. Other members of the Telefónica Tech team and I will be sharing the topics we advise our customers on when developing their Data & AI strategy.

 

In this article, we explore the environmental implications of AI, focusing on the energy consumption of large AI models and strategies to mitigate their impact.

The Energy Consumption of AI Models

 

Running large AI models, such as GPT, requires significant computational power and energy. These models are trained on vast datasets, which involves spinning up numerous servers and consuming substantial amounts of electricity. This not only harms the environment but also incurs high costs for businesses if not controlled and managed effectively.

 

Ensuring this is discussed and considered during the model selection and deployment process helps to prevent unwelcome surprises.

 

Choosing the Right Model for the Use Case

 

When it comes to implementing AI, one of the most critical decisions is selecting the right model for the specific use case. This is something we stress with all organisations implementing an AI strategy. This choice can significantly impact the effectiveness, the return on investment, and the environmental footprint of the AI solution.

 

Here, we delve deeper into this topic, providing examples of different technologies from our key Data & AI partners, Microsoft and Databricks, that can be leveraged for various use cases.

 

General-Purpose vs. Targeted Models

 

General-purpose models like GPT (Generative Pre-trained Transformer) are designed to handle a wide range of tasks by being trained on extensive datasets from diverse sources. These models are incredibly powerful but also resource-intensive, both in terms of computational power and energy consumption. For instance, GPT models are trained on vast swathes of the internet, making them suitable for tasks requiring broad general intelligence.

 

However, not all use cases require such extensive capabilities. For example, summarising customer complaints does not need the general intelligence of GPT. Instead, smaller, more targeted models can be more efficient and environmentally friendly.
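To make the trade-off concrete, here is a minimal back-of-envelope sketch of how inference energy roughly scales with model size. All of the figures in it (parameter counts, token count, and the joules-per-parameter-token constant) are hypothetical assumptions chosen purely for illustration, not measurements of any real model.

```python
# Illustrative comparison of inference energy for a large general-purpose
# model vs a small targeted model. All constants below are hypothetical
# assumptions for illustration only, not measured values.

def inference_energy_wh(params_billions: float, tokens: int,
                        joules_per_param_token: float = 2e-9) -> float:
    """Rough estimate: energy scales with (parameters x tokens processed)."""
    joules = params_billions * 1e9 * tokens * joules_per_param_token
    return joules / 3600.0  # convert joules to watt-hours

# Summarising a 500-token customer complaint with two hypothetical models:
large = inference_energy_wh(params_billions=175, tokens=500)
small = inference_energy_wh(params_billions=7, tokens=500)
print(f"large model: {large:.2f} Wh, small model: {small:.2f} Wh")
print(f"ratio: {large / small:.0f}x")
```

Under these assumptions the energy use differs by the ratio of parameter counts (25x here), which is the intuition behind preferring a targeted model when the task does not need broad general intelligence.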

 

By carefully selecting the right model for each use case, businesses can achieve their AI goals more efficiently and sustainably. Leveraging the specialised tools and platforms from partners such as Microsoft and Databricks can help ensure that AI initiatives are both cost-effective and environmentally responsible.


 

Agentic AI: A Modular Approach

 

Agentic AI represents a shift from building monolithic AI systems to creating modular AI components. Instead of using a single large model to handle all tasks, Agentic AI involves developing specialised agents for specific functions. This modular approach allows for more efficient use of computational resources, reducing both costs and environmental impact. We will share more on Agentic AI in a follow-up article.

 

Integrating Other Technologies to Manage Cloud Spend

 

There are various approaches that help organisations get maximum value out of the cloud and avoid wasted spend. These approaches identify the underutilised resources driving consumption costs, create cost transparency so that teams take ownership of their cloud usage, and measure return on investment. Together, they help avoid spiralling cloud costs and inefficient energy usage.

 

FinOps and GreenOps are key approaches to optimising cloud use. FinOps tracks, optimises and attributes cloud spending to business areas, helping organisations assess the return on investment for AI and other workloads while continuously optimising run costs. GreenOps, on the other hand, promotes sustainability by encouraging practices which reduce the environmental footprint of cloud services. By adopting FinOps and GreenOps practices, businesses can optimise their cloud strategies for both cost efficiency and environmental responsibility.
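The cost-attribution idea at the heart of FinOps can be sketched in a few lines: group each billed resource by an owning-team tag so that every team can see, and own, its share of cloud spend. The billing records below are invented sample data, not a real billing export from any provider.

```python
# Minimal sketch of FinOps-style cost attribution: grouping cloud spend
# by an owning-team tag. The records are invented sample data.
from collections import defaultdict

billing_records = [
    {"resource": "gpu-cluster-01", "team": "data-science", "cost_gbp": 1200.0},
    {"resource": "sql-warehouse",  "team": "analytics",    "cost_gbp": 450.0},
    {"resource": "gpu-cluster-02", "team": "data-science", "cost_gbp": 300.0},
]

def cost_by_team(records):
    """Attribute each billing line item to the team named in its tag."""
    totals = defaultdict(float)
    for record in records:
        totals[record["team"]] += record["cost_gbp"]
    return dict(totals)

print(cost_by_team(billing_records))
```

In practice this grouping is done by the cloud provider's cost-management tooling against resource tags, but the principle is the same: no spend is left unattributed, so every pound of cloud cost has an owner.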

Conclusion

 

The environmental impact of AI is a critical issue that requires thoughtful strategies and frameworks. By choosing the right models, adopting modular AI approaches, and integrating other cloud management technologies, businesses can mitigate the environmental impact of their AI initiatives. As we continue to explore the potential of AI, it is essential to balance innovation with sustainability, ensuring that the benefits of AI are realised without compromising our planet’s health.

 

Stay tuned for more insights in the ‘Beyond The Hype’ series, where we will continue to explore the important topics to consider within your Data & AI Strategy.

 

With contributions from Daniel Rickards, Head of Strategic Cloud Consulting at Telefónica Tech UK&I