
IT Service Management (ITSM) trends in 2024


We sat down with OpenText experts to discuss the trends that will shape service management in 2024. GenAI dominates, but there’s more to unpack here. GenAI drives greater enterprise service management (ESM) maturity, scrupulous knowledge management, and democratization of information. The overarching trend is that ITSM is a journey to better user experiences and higher value delivery.

From creativity to productivity—GenAI reality sets in.

2024 is the year of GenAI reckoning as companies work out how to use GenAI for productivity gains and more meaningful work. We spent 2023 talking about and being impressed by GenAI’s creative capabilities. Now the conversation is shifting to, “How will GenAI help me/my company be productive?” To that end, companies will start with end user scenarios that involve knowledge summarization. Think how-to questions, service outage inquiries, and HR requests—areas where GenAI can easily synthesize answers from existing knowledge.

Next, agent-driven use cases will start taking hold. For now, GenAI will be prompted by support agents asking questions such as: “What high priority tickets should I work on today?” “What’s the best way to resolve this incident?” “Can you share similar tickets that were resolved successfully?”

But as GenAI gains our trust, it will gain autonomy too. This is not likely to happen in 2024, but that is the direction we are heading towards. In fact, this evolution is not unlike the way that change management processes mature. Initially, all changes require approval. But as an organization matures to be less reactive (emergency changes) and more proactive (planned changes), a growing trust in automation results in a lower number of approvals. GenAI will undergo a similar process. As a result, ITSM will become even more essential as the system of record for tracking GenAI’s actions.

Dean Clayton, Senior Product Marketing Manager, OpenText

GenAI, SaaS management, and GreenOps integration gain traction.

2024 is the year customers will launch their GenAI projects—either leveraging vendors’ solutions or building their own GenAI centers of excellence. Projects will focus on end-user and enterprise service management (ESM) scenarios, where GenAI can make the biggest impact to elevate user experiences across IT and non-IT enterprise services.

But GenAI will require a strong knowledge management strategy. Virtual agents, the clumsy chatbots of yesteryear, masked poor knowledge management: with them, the bottleneck was the technology, not the knowledge. Now GenAI delivers on the technology, but it needs a well-organized knowledge corpus. Without one, GenAI’s performance will always fall short.

Moving on from GenAI, there’s SaaS management—not a new trend, but one that will gain more traction in 2024. As organizations increasingly move to the cloud, traditional software asset management (SAM) is shifting from managing compliance to optimizing costs. The key questions are no longer about software compliance (in SaaS there’s no risk of noncompliance), but about where customers overspend, what their costs are by cost center, and whether they are oversubscribed. Customers will expect functionally rich products to help them manage and report on their software in the cloud.
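To make the cost-optimization questions above concrete, here is a minimal sketch in Python. The data, field names, and report shape are all illustrative assumptions, not any vendor’s product: it flags monthly overspend by comparing purchased seats against active users.

```python
from dataclasses import dataclass

@dataclass
class SaasSubscription:
    product: str
    cost_center: str
    seats_purchased: int
    seats_active: int
    cost_per_seat: float  # monthly cost per seat

def overspend_report(subs):
    """Return estimated monthly overspend on unused seats, per product."""
    report = {}
    for s in subs:
        unused = max(0, s.seats_purchased - s.seats_active)
        report[s.product] = unused * s.cost_per_seat
    return report

# Hypothetical subscriptions for two cost centers
subs = [
    SaasSubscription("CRM", "Sales", 500, 320, 30.0),
    SaasSubscription("Video", "IT", 1000, 980, 12.0),
]
print(overspend_report(subs))  # {'CRM': 5400.0, 'Video': 240.0}
```

A real SaaS management product would pull usage from vendor APIs and break results down by cost center; the point here is only that the core questions reduce to simple comparisons once the usage data is collected.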

Also gaining traction in 2024—the integration of ITSM with GreenOps practices. This integration is essential for tackling the two main sources of IT-related carbon emissions: IT-owned hardware, including data center equipment and workstation clients, and cloud services sourced from external vendors.

Jacques Conand, Senior Director, Product Management, OpenText

GenAI is really good at human-like conversations. Its technical language skills will get better too.

GenAI excels at understanding natural language and responding to user queries in a human-like way, so early GenAI projects focused primarily on ESM use cases. GenAI is also capable of handling “technical” language, such as writing scripts, creating automation workflows, and generating analytics reports. In 2024, IT-side use cases will gain more attention as GenAI becomes an IT power tool for greater efficiency.

But for IT to trust and act on its suggestions, GenAI’s technical language must be reliable and accurate. It must also work with the highest quality of domain-specific knowledge. We’re at the cusp of these developments in 2024 as early adopters develop and test proof of concepts.

David Baron, Service Management Automation Architect, OpenText

GenAI democratizes information and simplifies interfaces.

GenAI is a journey—because trust in it will build over time. GenAI is already enhancing user experiences and transforming Tier 1 support. According to our internal Tier 1 support benchmarks, a typical support request takes 8 to 12 exchanges between the user and the agent, with each response taking about 55 seconds. GenAI can resolve requests 10 times faster than that, saving time and resources.
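The arithmetic behind that benchmark can be restated directly. The sketch below only reworks the figures quoted above (8 to 12 exchanges at about 55 seconds each, and a tenfold speedup); it is not an OpenText tool.

```python
def handle_time_minutes(exchanges, seconds_per_response=55):
    """Total handle time for a support request, in minutes."""
    return exchanges * seconds_per_response / 60

low, high = handle_time_minutes(8), handle_time_minutes(12)
print(f"Tier 1 today:  {low:.1f} to {high:.1f} min per request")  # 7.3 to 11.0
print(f"10x faster:    {low/10:.1f} to {high/10:.1f} min")        # 0.7 to 1.1
```

In other words, a request that ties up an agent for roughly 7 to 11 minutes could be resolved in about a minute.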

In 2024, GenAI will pave the way towards even more democratization of information and the simplification of interfaces. Consider junior support agents who can tackle complex requests on their own thanks to GenAI enabling them with the right enterprise knowledge, guiding them through automation steps, or even taking the right action for resolution. With GenAI continuing to abstract complexity, agents will need only to ask GenAI, “Can you handle this user issue?”

Adam Luciano, Director of AI and Incubation, OpenText

To improve service quality, companies seek GenAI and discovery solutions.

Companies have a genuine interest in bringing GenAI to their ITSM practice, but why? Because they want a modern ITSM approach to improve their services and create greater value for their users. At the same time, they are cautious about the quality of GenAI’s responses. GenAI is not magic—or without risks. It relies on carefully curated domain-specific enterprise knowledge for accurate answers and the protection of proprietary data. Without good knowledge management, companies cannot have good GenAI services.

That’s why we’ll increasingly see companies investing in their knowledge management practices and turning to out-of-the-box GenAI solutions that can work with their private knowledge repositories. Additionally, companies will show greater interest in discovery—for the same reason as their interest in GenAI. In both cases, it boils down to one thing: improving the quality of their services.

Isabelle Roth, ITSM EMEA Practice Lead, OpenText

AITSM and DEX continue to mature—with user experience at the core.

In 2024, we will see a lot of GenAI, of course. But we will also see AI-driven ITSM (AITSM). Although AITSM is not new, most organizations have not yet reached AITSM maturity. Today, ITSM is mostly workflow-based, and detecting dependencies between workflows is still manual work. Companies need a holistic understanding of their systems to reduce complexity and prevent outages. Without AI, analyzing events, converting incidents into problems, and then fixing those problems takes time. With the power of AITSM, companies can predict how services will evolve—across events, changes, and seasons—and proactively take the actions needed to achieve the stability they seek.

Digital employee experience (DEX) will continue to mature too, supported by GenAI and integrations with endpoint management tools. On that last point, what better way to improve user experiences than by understanding users’ ultimate productivity tools—PCs, laptops, and smartphones!

Soumajit Das, Senior Product Manager, OpenText

Private, indexed LLMs gain favor in the enterprise.

ITSM vendors have taken three approaches to incorporating GenAI into their platforms: integrating with a public model like ChatGPT, combining data sets from multiple customers in a shared large language model (LLM), or offering a private LLM that runs on company-specific data.

In 2024, more companies will opt for private LLMs, which protect their proprietary data or intellectual property from leaking into the public sphere. Technology-wise, private LLMs that use retrieval augmented generation (RAG) will be favored over LLMs that rely on vast amounts of data for training. That’s because RAG models, which index and retrieve data from a customer’s private data store, will prove to be more scalable and cost-effective.
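The RAG pattern described above can be sketched in a few lines. This toy version uses naive keyword overlap in place of a real vector index, and every name in it is illustrative rather than any vendor’s API: the point is only that the model answers from retrieved private documents, not from training data.

```python
def tokenize(text):
    """Lowercase and split text into a set of words."""
    return set(text.lower().split())

def retrieve(query, documents, k=2):
    """Rank private documents by keyword overlap with the query (toy retriever)."""
    q = tokenize(query)
    ranked = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, documents):
    """Augment the user query with retrieved context before calling an LLM."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# A hypothetical private knowledge base
kb = [
    "VPN outages: restart the client, then re-authenticate with SSO.",
    "Printer setup: install the driver from the IT portal.",
    "Expense reports are filed in the HR portal by the 5th.",
]
print(build_prompt("How do I fix a VPN outage?", kb))
```

A production system would replace `retrieve` with embedding search over a vector store, but the scalability argument in the paragraph above holds either way: only the index grows with the customer’s data, while the underlying model stays untouched.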

Travis Greene, Senior Director of Product Marketing, OpenText

