Frugal AI - how one UK university is laying the foundations to tackle AI’s environmental problems

Cath Everett, January 16, 2026
Summary:
Cambridge Frugal AI Hub is designing AI systems to be more resource-efficient, sustainable, accessible and inclusive than those based on traditional approaches.

It is well known that AI is creating significant environmental challenges. For instance, the equipment used in data centers consumes large amounts of raw materials. This, in turn, creates sizeable volumes of electronic waste, which often contains hazardous substances, such as mercury and lead.

But there are other problems too, including high levels of water and energy usage, not least in training and running large language models. Data centers may consume only around 1.5% of the world's electricity today, but as AI adoption rises, the International Energy Agency (IEA) expects their consumption to more than double to 945 TWh by the end of the decade.

Even more worryingly, a report by investment bank Goldman Sachs Research predicts that demand for energy to power data centers will increase by 165% by 2030, compared with 2023 levels.

Understanding Frugal AI

To address these thorny issues, work is under way on 'Frugal AI', a set of principles intended to provide a possible solution. The notion stems from the concept of 'Frugal Engineering', a term coined in 2006 by Carlos Ghosn, Chairman and CEO of the Renault-Nissan Alliance. He had been impressed by the ability of his Indian engineers to innovate more quickly and cost-effectively than those elsewhere, despite resource constraints.

They were able to do so by employing an approach known in Hindi as 'jugaad', a word that translates as 'quick fix' or 'hack'. It implies looking for solutions in difficult circumstances by being resourceful, bending the rules and thinking laterally to make something work.

In other words, the approach is based on the notion of 'necessity being the mother of invention' and doing more with less. This includes removing non-essential features to create products that offer value for money and are tailored to local needs and conditions.

The concept was subsequently taken up and reapplied by social entrepreneur and professor, Radha Basu. She co-founded the first Frugal Innovation Lab (now ‘Hub’) at Santa Clara University (SCU) with Godfrey Mungal, then Dean of SCU’s School of Engineering, in 2011. The aim here was to engage students in developing simple, affordable and sustainable solutions to tackle social challenges in underserved communities around the world.

After Navi Radjou, Jaideep Prabhu and Simone Ahuja published their book 'Jugaad Innovation' the following year, the concept took off more widely. But it was the pioneering work of Prabhu in particular, as a leading voice on 'frugal innovation', that laid the foundations for the principles to be applied to AI.

The case for Frugal AI

Prabhu, currently a Professor at the Cambridge Judge Business School, supported the creation of the University's Frugal AI Hub in April last year by Visiting Fellows Elizabeth Osta and Serish Venkata Gandikota. Formally launched last November, the Hub aims to design AI systems that are more resource-efficient, sustainable, accessible and inclusive than those based on traditional approaches, particularly for emerging markets and low-income countries. As the Hub's Chief Technology Officer Arjuna Sathiaseelan points out:

There are a number of problems with current AI. For example, if we look at where OpenAI’s current energy consumption is and how it expects this to grow over the next eight years, it’ll add up to more than India’s total electricity consumption. A second data point is that, according to a recent MIT study, 95% of AI projects fail as AI is so costly to deploy, it’s not energy-efficient and it’s not sustainable.

Tackling this situation is becoming ever more pressing globally. Osta explains:

Frugal AI is about the realization that we have constraints in terms of the grid, energy availability and capital, especially in Europe, which is limiting the region's ability to scale. Energy grid constraints and less available capital for start-ups to pay for compute power are resulting in higher levels of failure. However, the situation is very different in the global south. A lack of data centers and high costs mean it is difficult to access AI there at all. While there's a seven billion user opportunity worldwide, only about one billion people are currently using AI. But if we could expand into the global south and support local start-ups, it would make a huge difference in terms of ensuring economies there aren't left behind.

Key benefits of Frugal AI, Sathiaseelan believes, are that:

It turns AI from being an experimental cost center into a scalable capability by cutting waste in terms of compute power, energy and water across the entire stack. When you deliberately design for efficiency, you can deliver better outcomes with smaller models and cheaper infrastructure, which should accelerate AI adoption in future.

The most important advantage, though, he says, is the predictability it brings to development and deployment costs:

Once you measure and optimize, you can move from hoping a model is worth it to having clear unit economics, like cost per customer served or cost per uplift. It’s what’s required for executives to see what the trade-off is between latency, cost and investment. Currently most people have no idea where or what their costs are, but Frugal AI makes predictability visible.

The Frugal AI Hub

To make the vision a reality, the Frugal AI Hub is focusing on four main areas of work:

  1. Collaborating with partners on a Frugal AI technology stack, which includes infrastructure, data, and open APIs. The results will be published in a white paper
  2. Building standardized measurement frameworks in collaboration with the United Nations International Computing Centre to evaluate the total cost of ownership, return on investment, and social impact of AI use cases. The aim is to submit these frameworks to a standards body, such as ISO, to develop standards and benchmarks, which do not currently exist. This will ultimately enable users to compare options based not just on accuracy but on efficiency scores and cost per task too
  3. Creating a marketplace for developers and organizations to publish their Frugal AI tools, models, services and APIs. It will be launched this year, with a particular focus on involving start-up companies
  4. Promoting the Frugal AI approach to audiences ranging from enterprises to underserved communities.

Another point Sathiaseelan makes, meanwhile, is that individual, large models developed by a small number of players in the global north cannot hope to satisfy the diverse needs of developing countries, such as India. This means a more decentralized approach is required, he says:

A country like India has more than 2,500 languages but most of them aren't represented on the web. So, if you build a model, how do you frame it for ethnic purposes? In future, there'll be more decentralized frameworks, infrastructure, and models. This means communities will be able to develop their own AI models in their own languages to satisfy their own cultural needs. As a result, the Hub is trying to position itself towards this, so that people wanting to build a tech stack or model have the processes, frameworks and standards to do so.

An answer rather than the answer

But Sathiaseelan also acknowledges that on top of language and cultural barriers, the global south faces significant infrastructure, regulatory and skills challenges. Therefore, by 2030, the Hub intends to set up Frugal AI Labs, which will work with governments, non-governmental organizations, startups and corporates to build capacity. Osta explains:

From New York to agricultural communities in Thailand, we’ll co-create relevant use cases. We’ll upskill communities on Frugal AI principles, help developers, and share ideas on how to put systems together and scale them. Each Lab will be there for a specific point in time to build up capacity and capability. Funding will come from different sources based on audience, so perhaps corporates in a business setting, and grants or philanthropic funding for community projects.

However, Osta believes that Frugal AI is not the only answer to AI's environmental challenges, but simply one answer. Sathiaseelan agrees, indicating that Frugal AI is more likely to "complement and co-exist with traditional AI" than replace it:

Frugal AI is not a magic wand, but we like to believe it could be one of the most directly controllable levers as it addresses underlying demand for water, energy, compute infrastructure and the like. We can’t keep on designing larger and larger models that run inefficiently, and ‘green data centers’ aren’t going to change anything as the issue is one of workload. So, we believe Frugal AI is more environmentally friendly as you can create smaller, task-specific models, and use optimization techniques, such as intelligent caching. There are also lots of things in the stack to help you use less energy, less water for cooling and less infrastructure, which leads to a slower hardware replacement cycle.

Another driver for adoption, believes Sathiaseelan, is that company boards will soon start demanding that the same cost discipline be applied to AI as to other technologies. This, he says, will lead to requirements for lower-cost solutions:

So, in a similar way to ‘security by design’, the Frugal AI philosophy will over time become increasingly mandatory as boards demand better cost discipline and return on investment from their AI expenditure.
