Could electrical digital twins solve AI’s capacity absorption problems?
Summary:
The continued growth of AI is bumping up against twin problems of electrical supply and enterprise demand. Collaboration on electrical digital twins could help widen both bottlenecks.
Will electricity digital twins mature before the AI investment runway cuts short their takeoff? That’s a question I found myself pondering amid growing sentiment that the AI bubble might soon burst, in part because there may not be enough power or grid connections for all the new data centers in the works.
Last week, Gita Gopinath, the IMF’s former Chief Economist, cautioned that a burst AI bubble could wipe out $35 trillion in wealth. Meanwhile, Andrej Karpathy, who coined the term vibe coding, fell back on good old-fashioned programming skills for a new chat app, and subsequently extended his timeline for artificial general intelligence to a decade.
A fundamental problem underlying all of this is a lack of capacity on two fronts. There isn't enough electricity to go around for all the shiny new data centers, let alone all the electric cars and heat pumps everyone wants to build. And the notion of simply scaling Large Language Models (LLMs) to solve every problem is starting to run out of steam.
A moonshot collaboration on electrical and energy digital twin infrastructure might help widen both these bottlenecks. An essential aspect lies in building more comprehensive knowledge graphs, provenance chains, and interconnected physical models. Connecting the dots between disparate electrical models and control systems would lower the cost of wiring up the grid infrastructure. It would also accelerate the development of millions of interconnected AI, physics, and financial models better suited to the task than LLMs. And it could lend more credibility to the UK’s goal of creating 400,000 new jobs in the energy sector, not to mention similar goals elsewhere.
As it stands today, there are dozens of electrical modeling, simulation, and management tools, as well as specifications and open source software. But they mostly operate in silos at the level of individual domains and individual vendors. NVIDIA has made some progress in plugging a few electrical system vendors together to improve the efficiency and capacity within data center walls. But addressing the electrical infrastructure gap powering those data centers will require bridging a few more silos.
NVIDIA CEO Jensen Huang suggested the UK might need a few more gas plants to power all the new data centers, but these will require wires, gas pipes, and neighbors happy to live next to new power plants or pylons. This is also a country that essentially threw away over £1.3 billion in 2024 shutting down wind power and firing up gas plants because it did not have enough wires or batteries to absorb all the energy. And there is currently a 10-year queue of new power projects, representing 400 gigawatts, waiting for grid connections. The cost of upgrading all of this is estimated at £104 per household per year. Maybe ratepayers will be happy to help cover these costs, but it is not exactly a stellar marketing message for the advantages of more AI.
Off the top of my head, here are some of the ways that electrical digital twins might help solve the capacity gap on the infrastructure front:
- Demonstrate investor opportunities for innovations in powerlines, control systems, and battery storage.
- Simplify technical and financial modeling for new infrastructure.
- Support new market mechanics for decentralized generation and storage.
- Enhance community outreach, buy-in, and incentivization for energy projects.
- Streamline automated code compliance processes for regulatory approval.
- Optimize the development of more efficient infrastructure.
- Improve the learning and development of energy skills.
- Prototype more resilient and secure control and management systems.
Lots of digital silos
There are certainly lots of efforts to build digital twins of various aspects of energy infrastructure. The International Electrotechnical Commission wrote about how digital twins will revolutionize the power sector last year. It said government agencies, standards bodies, utilities, consumers, and researchers need to work collaboratively to gain consensus on standards and open data practices that address privacy and security constraints.
As it stands today, utilities operate with multiple models for planning, communications, operations, grid protection, and billing. Each comes with its own data formats, level of detail, and domain experts. Crossing these silos requires manual processes for maintaining data, reconciling models, and creating new analytics models for new studies. Streamlining these processes could help utilities and grid operators optimize renewable integration, plan for new data center loads or devise more efficient market mechanisms and algorithms.
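To make the reconciliation problem concrete, here is a minimal Python sketch with invented field names and a single transformer record that appears in both a planning model and a billing system under different schemas and units. Real utility reconciliation spans far more fields and far messier data; this only illustrates the mapping-and-checking pattern.

```python
# Hypothetical silo-specific records for the same transformer.
PLANNING = {"asset": "TX-0041", "rating_mva": 25.0, "voltage_kv": 33.0}
BILLING = {"equipment_id": "TX-0041", "capacity_kva": 27500.0}

# Declarative field mapping: source field -> (common field, unit conversion).
BILLING_MAP = {
    "equipment_id": ("asset", lambda v: v),
    "capacity_kva": ("rating_mva", lambda v: v / 1000.0),  # kVA -> MVA
}

def to_common(record, mapping):
    """Translate a silo-specific record into the shared vocabulary."""
    return {common: convert(record[src])
            for src, (common, convert) in mapping.items()}

def reconcile(a, b, tolerance=0.05):
    """Report fields where two views of the same asset disagree."""
    issues = []
    for key in a.keys() & b.keys():
        va, vb = a[key], b[key]
        if isinstance(va, float):
            if abs(va - vb) > tolerance * max(abs(va), 1e-9):
                issues.append((key, va, vb))
        elif va != vb:
            issues.append((key, va, vb))
    return sorted(issues)

common_billing = to_common(BILLING, BILLING_MAP)
print(reconcile(PLANNING, common_billing))
# -> [('rating_mva', 25.0, 27.5)]  (billing claims 10% more capacity)
```

In practice, the declarative mapping tables are the part worth standardizing: once every silo publishes one, the reconciliation loop itself is generic.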
Also, it seems like every week the news is filled with local councils pushing back on a new solar array here, a battery storage project there, or the new pylons expected to connect them. Digital twins could make it easier for utilities and data center operators to show how various design approaches and compensation schemes would play out, articulating their vision more clearly. That allows for adjustments earlier in the process to mitigate objections, rather than rolling the dice at the end and hoping for the best.
Building a structural blueprint
Creating an electrical digital twin that connects existing silos in a meaningful way will require an unprecedented level of collaboration across vendors and industries. This includes a blueprint for data exchange and simulation across lots of models, a semantic layer for translating context and enabling reasoning, and a trust layer to ensure the integrity, privacy, and security of data and models.
The IEC has been making progress on a Common Information Model that defines a common vocabulary for components like generators, transformers, wires, and meters. This has facilitated the creation of a Common Grid Model Exchange Standard that is already mandated in Europe. The next step is to improve co-simulation approaches across domains, such as predicting voltage transients in power networks or analyzing the dynamics of energy markets. Frameworks like the Hierarchical Engine for Large-scale Infrastructure Co-Simulation developed in the US show promise here.
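The co-simulation pattern frameworks like HELICS coordinate can be sketched in a few lines. The sketch below uses two toy models (a made-up linear market and a made-up price-responsive grid load, not the HELICS API or any real simulator) advancing in lockstep and exchanging load and price each interval:

```python
def market_step(load_mw):
    """Toy market model: price rises linearly with load (illustrative only)."""
    return 20.0 + 0.5 * load_mw  # $/MWh

def grid_step(price):
    """Toy grid model: flexible demand backs off as price rises."""
    return max(50.0, 120.0 - 0.8 * price)  # MW

# Lockstep co-simulation loop: each model sees the other's latest output.
load = 100.0
for hour in range(4):
    price = market_step(load)   # market clears against the grid's load
    load = grid_step(price)     # grid responds to the cleared price
    print(f"hour {hour}: price={price:.2f} $/MWh, load={load:.2f} MW")
```

A real co-simulation framework's job is exactly this handshake, plus the hard parts the toy omits: time synchronization across simulators with different step sizes, typed data exchange, and distributed execution.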
A more intelligent energy grid that supports automated reasoning and analysis also needs a semantic layer to provide context and define relationships across the data. Microsoft has done some interesting work with its Digital Twins Definition Language (DTDL) ontology for energy grids, creating graphs of interconnected assets. More work will likely be required to generate knowledge graphs that connect different types of sensors to models.
Building these large-scale co-simulation capabilities could have a knock-on effect in accelerating embodied AI infrastructure with different scaling properties than LLMs. For example, the active inference paradigm suggests the path toward better AI lies in training lots of decentralized domain experts that exchange information about their levels of uncertainty. One agent might be optimized for improving the performance of a substation, another for managing a battery array, and others for optimizing market decisions.
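The uncertainty-exchange idea can be sketched with standard precision-weighted (inverse-variance) fusion: each agent reports a local estimate plus a variance, and confident agents count for more in the combined view. This is textbook Bayesian fusion used only to illustrate the exchange, not a full active inference implementation, and the agent names and numbers are made up.

```python
def fuse(estimates):
    """Precision-weighted fusion of (mean, variance) reports.

    Each report's weight is its precision (1 / variance), so low-variance
    (confident) agents dominate the fused estimate.
    """
    precisions = [1.0 / var for _, var in estimates]
    total = sum(precisions)
    mean = sum(m / var for m, var in estimates) / total
    return mean, 1.0 / total  # fused mean and fused variance

# Three hypothetical agents estimating the same substation load (MW).
reports = [
    (98.0, 4.0),   # substation agent: moderately confident
    (105.0, 9.0),  # market agent: least confident
    (100.0, 1.0),  # battery agent: most confident, dominates the result
]
mean, var = fuse(reports)
print(f"fused load: {mean:.2f} MW (variance {var:.2f})")
# -> fused load: 100.04 MW (variance 0.73)
```

The appealing property for grid-scale AI is that agents only need to ship two numbers per quantity, rather than raw data or model weights, to keep a shared estimate coherent.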
My take
Progress in electrical digital twins will likely slog along for the foreseeable future. Vendors will make progress in improving processes within their own tools over the next few years.
A few things might add fuel to this process. Innovative startups could emerge that better connect the silos across some of the most valuable use cases, which might push legacy vendors to collaborate a bit faster. Just today, the Financial Times reported that Tesla was making more money off its battery storage than legacy energy vendors, thanks to a better energy trading algorithm – ouch.
It’s also possible that NVIDIA might strategically expand the scope of its Omniverse approach to support electrical digital twins across domains and vendors, as it's already doing for spatial and mechanical models and simulations. It has already shown some leadership here within the data center. Only time will tell whether it has the vision to extend this approach outside the data center walls before the AI investment frenzy runs out of oxygen. I certainly hope so - a lot more than slightly higher power bills will be at stake if the AI bubble bursts before then.