ServiceNow has beaten its guidance once again on the release of its Q1 2026 earnings, with AI fueling much of the growth. Subscription revenues came in at $3.671 billion, up twenty-two per cent year-on-year (nineteen per cent in constant currency). Total revenues were $3.77 billion. Current remaining performance obligations reached $12.64 billion, up twenty-two-and-a-half per cent, and the company ended the quarter with 630 customers spending more than $5 million in annual contract value, up around twenty-two per cent year-on-year.
The results suggest an expansion in enterprise appetite for AI adoption and spend, with ServiceNow's case - that governed, orchestrated agentic AI is a platform problem worth paying to solve - resonating with buyers. Notably, Now Assist customers spending over $1 million in annual contract value grew over 130 per cent year-on-year. We spoke to Amit Zavery, President, Chief Product Officer and Chief Operating Officer, ahead of the earnings release to dive into what is driving the AI spend acceleration, to better understand AI pricing, and to explore where the recent acquisitions take the platform.
The governance factor
We at diginomica have tracked ServiceNow's AI Control Tower positioning closely since it became central to the company's strategy through 2025. Upon release of the Q4 results in January, I noted that solving the governance bottleneck was starting to look like a genuine commercial differentiator. The Q1 numbers continue to support that view. Zavery said:
The comfort level in terms of using agentic as a way to automate, accelerate, and improve efficiency is high now. When we introduced AI Control Tower, it allowed customers to feel more confident with their agentic investments. It takes away the barrier to usage from the security, compliance, and governance perspective. Once we give them that comfort level and peace of mind, you suddenly see a bigger bet on our agentic AI.
When we do this with data and with devices, there's a lot of opportunity for them to use it in their broader context across the company - and that's driving the higher ticket price from an ASP perspective. The volume of customers is growing because of that as well.
This resonates with what we are hearing from the diginomica network. Our most recent survey of CIOs on agentic governance found that around half of respondents had basic frameworks in place but acknowledged gaps. The pitch that ServiceNow absorbs the governance burden, rather than leaving enterprises to build it themselves, is likely a compelling proposition for many.
The pricing shift
Earlier this month I wrote about ServiceNow ending its AI add-on era - baking AI capabilities into every commercial tier by default rather than selling them as a separate upsell. Today is the first chance I’ve had to speak to someone from ServiceNow to understand what that will mean from a revenue growth perspective, as well as the material impact for customers. Zavery said:
What we've done is not really just one thing. The idea was to rethink a lot of our products to be AI-native - how do we bring AI capabilities as a core element inside all of our SKUs, and standardize the products as well? Because we were all over the map with different workflows, upsell SKUs, and the like - which is a pain for the customer. So we simplified that.
We also moved to the hybrid pricing structure across the board - some element of guaranteed pricing, and then flexibility in terms of what you deploy and what value you create. The way we want to count our revenue is going to be all incremental revenue.
And interestingly, Zavery made an explicit comment that highlighted the impact of AI and the shift away from seat-based pricing. He said:
More than fifty per cent of our net-new is coming from non-seat licences. We're getting away from the idea that everything is a fixed amount for a particular seat regardless of usage. We're getting much more prescriptive and thoughtful about usage as well as value creation.
The hybrid model is a guaranteed floor with flexibility across consumption metrics including assets, devices, data volume, and connectors. On outcome-based pricing, which surfaces periodically as the logical next step for enterprise AI commercialisation, Zavery was firm:
I think we're at a good pricing structure. I don't think we want to tweak it a lot. This has been discussed with customers quite extensively, and people get it - and a lot of other vendors are copying it now; it's becoming the standardized way people think.
The problem with outcome-based pricing is it's so nebulous and hard to define. You end up not having any predictability for either the vendor or the customer. If an outcome doesn't happen - is it because somebody didn't implement it right? Is it because you didn't have the right data? Is it because you didn't have the right expertise on your side? Is it because you didn't define the right requirements? Or did the product fall short? How do you resolve those things as a vendor?
It's very hard to build software and have predictable revenues. And it's horrible for customers too, because they end up arguing - 'I got 30% value, not 50%, not 100%.' Where do you draw the line? And then the buyer or user changes company, as you can imagine. If you sold to someone and they said 'this is the value outcome I want to judge you on,' and a new CIO comes in two years later and says 'that's not what I want to agree on - my outcome should be this,' are you going to redo the contract?
If there was any doubt in Zavery’s comments, he added:
The conversation about value and outcomes has been around since software was built, but nobody's been able to make it work, because when you get into the reality of the world it's very different. It looks very good theoretically, practically it's a different story. Now you ask a real CIO - it's 'no way, I don't know what that would mean.'
I've seen a lot of vendors - particularly AI-native companies who have never built enterprise software in their life - say 'we'll do value-based pricing or outcome-based pricing.' Then after two years they realize they don't know what to do. They can't figure out how to price, how to sell, customers don't understand, and they go back to the ways customers know how to buy products.
No spare parts
The Autonomous Workforce launch earlier this month initially focused on the Level 1 Service Desk AI Specialist, which handles common IT support requests end-to-end without human intervention. Twenty customers went through a structured early adopter programme before general availability, and several are already in production. A further fifty were added at GA, with ServiceNow providing dedicated AI specialist support through the rollout.
Zavery said:
They don't want piecemeal AI agents, they don't want spare parts, they don't want to be the orchestrator, they don't want to deal with building an AI platform themselves. What they ultimately want is automation, great outcomes, improvement in efficiency, and faster resolution for customers and employees. If we can deliver the full end-to-end service at a lower price than they would build and maintain it themselves - it's a win-win.
Whether it delivers consistently across a broader range of enterprise environments is the question worth watching over the coming quarters. We will be looking closely at these stories at ServiceNow’s upcoming Knowledge conference in Las Vegas in a couple of weeks.
Who owns the control plane?
Google Cloud, Salesforce, ServiceNow, and others are all now attempting to claim the governance and control plane for agentic AI. Coincidentally, I’m at Google Cloud Next in Las Vegas today - and the framing in CEO Thomas Kurian's keynote felt strikingly similar to ServiceNow’s AI Control Tower positioning last year. Governance, security, management and control were at the center of Kurian’s announcements this week (keep an eye out for my write-up on this later today).
Zavery, for those who don't know, joined ServiceNow from Google Cloud, and he is friendly in his commentary about his previous employer. But he said:
The control plane cannot come from people who are also prescribing the full stack, because then they'll be prescribing a stack they own most of the time. And interoperability always exists in theory, but the discovery, understanding the domain, understanding the integration at the detail level gets much harder.
He added:
The world is going to be very broad, open, and multi-platform - and we become the glue connecting those things together while letting them shine in the areas they're great at.
The lock-in concern maps directly onto what CIOs raise with us regularly. ServiceNow's multi-cloud, multi-model neutrality pitch is differentiating in that context - as it has been for a decade - even if it requires ongoing proof at the integration level.
My take
The AI story and the financial story are now running together at ServiceNow, rather than one trailing the other. The Now Assist numbers validate a strategy that ServiceNow has been pushing for two years. The real test over the coming quarters is Autonomous Workforce: whether the L1 service desk claim holds up in production at scale, and whether what works in a controlled early adopter programme survives contact with broader enterprise reality. More on those customer proof points to come from Knowledge 2026 in May. In the meantime, ServiceNow may have dampened some of the ‘SaaSpocalypse’ concerns with this quarter’s earnings - it’s clear that customers aren’t retreating in any meaningful way. Quite the opposite. I’m sure the numbers will be put forward as evidence that the strategy is hitting the right note with buyers as customers gather in Las Vegas in a fortnight.