Why 2026 will be the year of infrastructure reckoning - energy, sovereignty, and AI reality

Patrick Smith and Fred Lherault, January 14, 2026
Summary:
Pure Storage's Patrick Smith and Fred Lherault share five big predictions spanning AI hype, energy, cyber resiliency and data sovereignty


As we enter 2026, more factors are impacting the business landscape than ever before. Global businesses will need flexibility built into their operations to navigate these issues and keep customers, shareholders and employees satisfied. Here are five trends we expect to heavily influence organizations this year.

Organizations will stop using one solution for cyber resilience 

The number of high-profile attacks over the last year has been staggering - not only in volume, but in the extent to which they have crippled major manufacturers and businesses. The historical approach - treating cyber resilience as a standalone issue, where one vendor can protect an entire company - will be put to bed. Organizations will move away from point solutions and embrace the wider ecosystem of options as understanding grows that they can't go it alone. An interconnected framework can help prevent a ripple effect when an attack happens - users should be able to identify and halt an attack in progress. The rate and scale of attacks will continue, and a properly integrated framework is vital to mitigate risk and speed up recovery.

Energy availability will seriously constrain data centers and Terabytes per Watt should become an industry standard measurement  

While efforts to reduce energy use have fallen down the political and business agenda in recent months, it remains a priority for some. We see three big trends:

  1. Energy availability will be a key criterion when building new data centers, and electricity scarcity will hold back construction. Data center architecture and location will now be based primarily on access to energy. We expect co-location of energy generation and data centers to avoid dependencies on an under-provisioned grid.
  2. District heating solutions (distributing the waste heat produced by data centers to nearby buildings) will become more prevalent. Providers will start doing something with the heat produced, be it diverting it to residential properties or to greenhouses for agriculture. However, until it is mandated by regulation it won't be taken seriously, and governments need to consider this.
  3. The way data storage efficiency is measured should also become an industry standard. Terabytes per Watt (TBe/W) measures the amount of data stored per unit of energy and should be introduced as that standard. It is a relevant and clear measurement that captures real-world energy use - a simple, vendor-neutral and accurate benchmark. This approach could reduce the impact of increases in energy prices, enhance energy security, and relieve pressure on overstretched infrastructure.
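The metric itself is simple division: effective usable capacity over power draw. A minimal sketch of how it might be computed and compared - the capacity and wattage figures below are hypothetical illustrations, not vendor specifications:

```python
def terabytes_per_watt(effective_capacity_tb: float, power_draw_w: float) -> float:
    """Effective usable capacity (after data reduction) per watt of power draw."""
    if power_draw_w <= 0:
        raise ValueError("power draw must be positive")
    return effective_capacity_tb / power_draw_w

# Hypothetical comparison of two arrays (illustrative numbers only):
legacy_tb_per_w = terabytes_per_watt(effective_capacity_tb=500, power_draw_w=2000)
dense_tb_per_w = terabytes_per_watt(effective_capacity_tb=3000, power_draw_w=1500)
# legacy_tb_per_w -> 0.25 TB/W; dense_tb_per_w -> 2.0 TB/W
```

Normalizing on effective (post-reduction) capacity rather than raw capacity is what makes the figure comparable across vendors with different data-reduction behavior.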

The AI hype has died and reality has set in - data readiness and inference are key for the Enterprise

While some organizations are still convincing themselves how essential AI is, most are now realistic about what they do - and, crucially, don't - deploy. The switch in focus from training to inference means that without a robust inference platform, and the ability to get data ready for AI pipelines, organizations are set to fail. As AI inference workloads become part of the production workflow, organizations will have to ensure their infrastructure supports not just fast access but high availability, security and non-disruptive operations. Failing to do so will be costly both in results and in operational terms of resource (GPU) utilization.

However, most organizations are still struggling with the data readiness challenge. Getting data AI-ready means working through many phases: data ingestion, curation, transformation, vectorization, indexing and serving. Each of these phases can take days or weeks and delay the point at which the AI project's results can be evaluated. Organizations that care about using AI with their own data will focus on streamlining and automating the whole data pipeline, not just for faster initial evaluation of results but also for continuous ingestion of newly created data and iteration.
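The phases above can be sketched as a chain of stages. A minimal, illustrative Python sketch - the stage bodies are toy placeholders (string length stands in for a real embedding model, a dict for a real vector index); the point is that the stages compose, so newly created data can flow through continuously rather than being pushed through each phase by hand:

```python
# Toy pipeline: each function is one phase; run_pipeline chains them.

def ingest(raw):        # ingestion: pull in raw records, drop empties
    return [r.strip() for r in raw if r.strip()]

def curate(docs):       # curation: filter out records too short to be useful
    return [d for d in docs if len(d) > 3]

def transform(docs):    # transformation: normalize for downstream stages
    return [d.lower() for d in docs]

def vectorize(docs):    # vectorization: stand-in for a real embedding model
    return [(d, [float(len(d))]) for d in docs]

def build_index(vecs):  # indexing: stand-in for a real vector index
    return {d: v for d, v in vecs}

def run_pipeline(raw, stages):
    data = raw
    for stage in stages:
        data = stage(data)  # each stage feeds the next
    return data

index = run_pipeline(
    ["  Quarterly report ", "", "ok", "Sales figures"],
    [ingest, curate, transform, vectorize, build_index],
)
# index -> {"quarterly report": [16.0], "sales figures": [13.0]}
```

Once the phases are expressed as a single automated chain, re-running it on newly ingested data is one call rather than a weeks-long manual effort - which is the iteration loop the paragraph above describes.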

AI and data sovereignty will drive massive cloud repatriation  

The dual issues of AI and data sovereignty are driving concerns about where data is stored, and how organizations can maintain trust and guarantee access in the event of any issues. In order to extract value from AI, it’s critical for organizations to know where their most important data is and that it’s ready for use.  

Adding to this, concerns about data sovereignty are driving more organizations to reconsider their cloud strategy. Rising geopolitical tensions and regulatory pressure will shape nations' data center strategies in 2026. Governments in particular want to minimize the risk that access to data could be used as a threat or negotiating tactic. Organizations should be similarly wary and prepare themselves.

KubeVirt will take off in production at large scale

Many customers are adopting alternatives to VMware. KubeVirt provides a single platform that covers both virtualization and containerization needs, and we expect it to take off in 2026. The KubeVirt offering has matured to the point where it is suitable for enterprise needs. For many, moving to another virtualization provider is a huge upheaval - and while a like-for-like switch may save money in the short term, it is unlikely to provide the features needed in the long term. KubeVirt provides better control, and organizations can quickly realize the value of a platform with the capabilities to manage, orchestrate and monitor containers alongside virtual machines.
