How ServiceNow Is Shaping the Future of Work With AI
Miku Jha left Google Cloud AI for ServiceNow because she saw something rare: a bold product vision paired with a culture of open and transparent collaboration.
“That combination — AI platform leverage plus execution culture — is why I came,” Jha said.
As ServiceNow’s vice president of applied AI forward deployed engineering, she leads an elite team of AI/ML engineers who co-develop high-value, agentic use cases with customers — taking them from concept to production — and then codify what works back into the company’s platform.
“That creates a flywheel where every deployment makes the platform smarter and faster for the next customer,” Jha said.
Jha’s team is one of many driving AI transformation at ServiceNow, leaning on years of industry expertise and diverse skill sets to develop a broad range of cutting-edge AI products.
Hugo Lafleur, senior engineering manager of AI services, guides the team responsible for building the company’s AI Voice Agents. These agents, now an integral part of ServiceNow’s platform, enable teams to understand, respond to, and act upon natural spoken interactions, providing support across voice-enabled channels, like mobile and web.
“Customers want to be able to reduce the pressure on their live agents to unlock the next level of scaling,” Lafleur said.
ServiceNow’s approach to AI hinges on reinventing its customers’ operations, which is what NowLLM is designed to do. Director of Machine Learning Engineering Ranga Prasad Chenna oversaw the creation of NowLLM, also known as the Apriel model: a specialized AI model focused on ServiceNow’s workflows, including IT service management and customer service management. Because it is trained on data relevant to these contexts, it can provide highly accurate, context-aware responses.
“Evidence suggests that when small language models, such as ServiceNow’s Apriel model, get fine-tuned on high-quality, diverse data sets, they can achieve performance levels comparable to large frontier models across a wide range of domain-specific tasks while delivering faster response times and substantially lower costs,” Chenna said.
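The details of Apriel’s training pipeline aren’t public, but the general recipe Chenna describes (a small open model fine-tuned on curated, domain-specific instruction data) can be sketched in a few lines. The base model, data file, and hyperparameters below are placeholders, not ServiceNow’s actual setup.

```python
# Illustrative sketch only: supervised fine-tuning of a small open model on
# domain-specific instruction data with LoRA adapters, using Hugging Face
# datasets, peft, and trl. Model name, data file, and hyperparameters are
# placeholders, not ServiceNow's actual recipe.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Expects a JSONL file where each record holds a chat-formatted "messages" list,
# e.g. [{"role": "user", ...}, {"role": "assistant", ...}].
train_data = load_dataset("json", data_files="itsm_instructions.jsonl", split="train")

trainer = SFTTrainer(
    model="Qwen/Qwen2.5-0.5B-Instruct",  # stand-in for any small base model
    train_dataset=train_data,
    args=SFTConfig(
        output_dir="domain-sft",
        num_train_epochs=1,
        per_device_train_batch_size=4,
    ),
    peft_config=LoraConfig(r=16, lora_alpha=32, target_modules="all-linear"),
)
trainer.train()
```

The point is less the tooling than the data: with enough high-quality, domain-specific examples, a small model can be competitive on narrow workflow tasks while remaining cheaper and faster to serve.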
With its ongoing AI initiatives, ServiceNow not only empowers technologists to shape the future of work globally, it also enables them to carve out enriching career journeys for themselves in the process. It’s been named one of 25 companies on Fortune’s “World’s Best Workplaces” list for three consecutive years, and for good reason.
About ServiceNow
ServiceNow’s AI platform enables enterprises to unify AI, data, and workflows through automation, so organizations can reinvent how people work across every corner of their businesses.
How Applied AI Forward Deployed Engineers Turn AI Into Real Business Impact
Applied AI Forward Deployed Engineering (FDE) is the bridge between advanced AI capabilities and real-world enterprise value. At ServiceNow, FDE engineers ensure that AI isn't just theoretical — it's production-ready, scalable, and solving real customer problems.
“ServiceNow sits at a unique intersection,” Jha said. “We're the system of action for the enterprise, which makes the platform an AI-native operating system for workflows. Models and data are governed centrally, embedded into real processes, and measured against business outcomes.”
Jha describes ServiceNow's FDEs as full-stack AI/ML engineers with a customer-first mindset — builders who not only understand the technology but also obsess over the outcome. They turn complex business requirements into production-ready solutions, handling everything from prompting and evaluations to enterprise integration, all while upholding the governance and guardrails that make AI trustworthy at scale.
“Every win becomes a reusable pattern in the ServiceNow AI Platform, shaping our roadmap and raising the baseline for what ‘production-ready AI’ means,” she said.
For Jha, one of the most energizing aspects of ServiceNow’s culture is the access and speed teams get. Leaders are hands-on, decisions are made quickly, and teams rally around customer value. This allows them to build governance in from the start and scale solutions faster.
“This shortens integration cycles and lets us move from pilot to scaled rollout with fewer handoffs,” Jha said.
Ownership is a core principle of ServiceNow’s culture. Every engineer on the Forward Deployed Engineering team owns a reusable asset — such as a pattern, tool, or capability — that they build, maintain, and ultimately aim to contribute back to the full platform.
“We celebrate measurable impact — like time-to-value, adoption, and improvements in service-level agreements — not just flashy demos,” Jha said. “Each build has clear acceptance criteria, and post-launch stewardship ensures ROI remains strong.”
As ServiceNow sharpens its AI-native operating system, Jha’s team members have limitless opportunities to drive business impact — and advance their careers.
“If you’re an engineer or product leader who wants to ship real impact at scale, applied AI FDE is where you’ll do the most important work of your career,” Jha said.
Traits That ServiceNow Looks For in Applied AI Forward Deployed Engineers
- Founder’s mindset and bias to ship
- Comfort with ambiguity
- Obsessed with customer outcomes
- Full-stack AI: knowledge of prompting and evals, integrations, data pipelines, and workflow design
- Empathy and communication: the ability to translate between executives, subject matter experts, and engineers
How AI Voice Agents Are Transforming Customer Support
According to Senior Engineering Manager of AI Services Hugo Lafleur, speech as a natural interface to systems is an age-old problem that technology providers have long been trying to solve. Businesses want to scale while freeing up their live agents. That’s where ServiceNow’s AI Voice Agents come in.
The application his team is building allows people to call in and get support from an AI agent. Ultimately, the goal is to fulfill increasingly complex user inquiries with agents that grow steadily more sophisticated and natural in conversation.
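ServiceNow hasn’t published the internals of its AI Voice Agents, but the general shape of such a system can be pictured as a turn loop: transcribe the caller’s speech, let an LLM-backed agent decide what to say and whether to trigger a workflow action, then speak the reply back. The sketch below uses stub interfaces; every name in it is hypothetical rather than part of ServiceNow’s actual API.

```python
# Hypothetical sketch of a single voice-agent turn; all interfaces below are
# placeholder stubs, not ServiceNow's AI Voice Agents API.
from dataclasses import dataclass

@dataclass
class AgentReply:
    text: str             # what the agent says back to the caller
    action: str | None    # optional workflow action, e.g. "reset_password"

def transcribe(audio_chunk: bytes) -> str:
    """Speech-to-text stub; a real system would stream audio to an ASR model."""
    raise NotImplementedError

def decide(transcript: str, context: dict) -> AgentReply:
    """LLM-backed policy stub: choose a reply and, optionally, a workflow action."""
    raise NotImplementedError

def synthesize(text: str) -> bytes:
    """Text-to-speech stub returning audio to play back over the call."""
    raise NotImplementedError

def handle_turn(audio_chunk: bytes, context: dict) -> bytes:
    transcript = transcribe(audio_chunk)
    reply = decide(transcript, context)
    if reply.action:
        # Complex or sensitive requests are handed off to the workflow engine.
        context["pending_action"] = reply.action
    return synthesize(reply.text)
```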
Spearheading this work has enabled Lafleur to tackle a variety of exciting technical challenges. However, it’s sometimes the leadership decisions that stand out as the most meaningful.
“As an engineering leader, one of my primary responsibilities is to develop my people to secure the future of the business. When choosing people for AI Voice Agents, I had a choice: Turn to the usual suspects, those with a proven track record of success in AI services, or consider an equally competent group of people who are ready and hungry to get it done. To me, the choice was obvious: Build your people. In the months that followed, that team has executed so well and built such a stellar reputation! Clearly, one of the best leadership decisions I’ve ever made.”
Despite a few challenges along the way, Lafleur said that strong leadership and cross-department collaboration made all the difference.
“In any large workstream, there are different kinds of personalities,” Lafleur said. “There are realists, there are optimists, there are visionaries — this creates tension, a healthy kind. And this tension is what leads to an outstanding outcome: a great product.”
NowLLM: Solving the 'Trust Gap' Between Enterprise Customers and Public AI Models
Director of Machine Learning Engineering Ranga Prasad Chenna said his team created NowLLM after learning that many customers were hesitant to adopt external frontier LLMs, such as OpenAI’s GPT or Google’s Gemini, due to data privacy concerns. ServiceNow tackled this challenge by adopting a “NowLLM+3P” model strategy, which offers the “best of both worlds” for its customers.
“They have the flexibility to pick and choose among a multitude of LLM models for their various use cases, both from a functional and non-functional perspective,” Chenna said.
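The “NowLLM+3P” idea, as Chenna describes it, amounts to routing each use case to whichever model suits it best on functional and non-functional grounds. A minimal way to picture that is a routing table keyed by use case; the model identifiers and use-case names below are illustrative, not the platform’s actual configuration.

```python
# Illustrative sketch of per-use-case model routing; model identifiers and
# use-case names are placeholders, not ServiceNow's actual configuration.
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelChoice:
    model: str    # which model serves this use case
    reason: str   # functional / non-functional rationale for the choice

ROUTING = {
    "case_summarization": ModelChoice(
        "nowllm-apriel", "domain accuracy, low latency, data stays in-platform"
    ),
    "open_ended_research": ModelChoice(
        "third-party-frontier-llm", "broad general knowledge"
    ),
}

def pick_model(use_case: str) -> ModelChoice:
    # Fall back to the in-house domain model when no explicit routing exists.
    return ROUTING.get(use_case, ModelChoice("nowllm-apriel", "default in-platform model"))
```

The appeal of this kind of setup is that the routing table, not the application code, decides which model serves which workflow, so a use case can move between NowLLM and a third-party model without rewriting the feature built on top of it.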
His team produces large-scale synthetic data and benchmarks AI models in a secure environment — work that fuels core platform capabilities like Text2Code, a generative AI feature in the ServiceNow suite, and Guardian, a built-in safety and security layer for ServiceNow’s generative AI platform, Now Assist.
To keep up with the industry’s rapid change, Chenna and other leaders created two new teams: one to evaluate LLMs using industry benchmarks, and one to test how different models perform on ServiceNow use cases using real customer datasets in a secure environment.
“This dual approach provided the executive team with the critical insights needed to define a clear model strategy for ServiceNow,” he said.
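A stripped-down version of the second team’s job, comparing how candidate models perform on the same task-specific test set, might look like the sketch below. The model clients and scoring function are placeholders for whatever secure harness ServiceNow actually uses.

```python
# Hypothetical comparison harness: run several candidate models over the same
# domain test set and report a mean score per model. The model callables and
# scoring function are placeholders, not ServiceNow's evaluation tooling.
from statistics import mean
from typing import Callable

def evaluate(
    models: dict[str, Callable[[str], str]],  # name -> prompt-to-completion client
    test_set: list[dict],                     # each item: {"prompt": ..., "reference": ...}
    score: Callable[[str, str], float],       # (completion, reference) -> quality score
) -> dict[str, float]:
    results = {}
    for name, generate in models.items():
        results[name] = mean(
            score(generate(example["prompt"]), example["reference"])
            for example in test_set
        )
    return results
```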
For Chenna, leading a team of scientists and engineers to execute on the company’s GenAI roadmap was a “particularly special moment” in his career, highlighting the impact his team has on shaping the future of ServiceNow.
“I feel that being part of ServiceNow’s exciting GenAI journey and contributing to our vision of transforming the platform into an AI-native platform is a learning experience I’ll value for a long time,” Chenna said.
Frequently Asked Questions
What kinds of AI problems do teams at ServiceNow work on?
Teams build real-world AI solutions that power enterprise workflows, including AI Voice Agents for customer support, agentic AI use cases developed with customers, and secure language models like NowLLM designed specifically for ServiceNow workflows.
What is Applied AI Forward Deployed Engineering (FDE) at ServiceNow?
Applied AI FDE engineers bridge advanced AI technology and real business impact by taking AI solutions from concept to production, integrating them into customer environments, and turning successful deployments into reusable platform capabilities.
How does ServiceNow support innovation and experimentation in AI?
Leadership encourages rapid experimentation by giving teams flexibility, budget, and access to tools, while also building governance and guardrails from the start to ensure AI solutions are secure, scalable, and production-ready.
What skills and traits does ServiceNow look for in AI engineers?
ServiceNow looks for engineers with a builder mindset who are comfortable with ambiguity, focused on customer outcomes, skilled across the full AI stack, and strong communicators who can translate between technical and business teams.
What is the culture like for engineers and technologists at ServiceNow?
The culture emphasizes ownership, speed, and measurable impact, with engineers owning reusable assets, collaborating closely across teams, and being evaluated on real outcomes like time-to-value, adoption, and customer success rather than demos alone.



