What are the most important best practices to follow when evaluating and implementing an ITSM solution, especially one with AI capabilities? What sorts of challenges should organizations prepare for?

Chief Information Technology Officer in IT Services, 21 days ago

Prioritize data quality, integration, and governance, as AI relies on clean, structured inputs. Focus on user adoption, with strong change management and training.

Director of Infrastructure and Operations in Services (non-Government), 21 days ago

Let me sum up our journey and best practices:

Choose a mature platform; don't reinvent the wheel. We selected a well-established ITSM platform (ServiceNow in our case) rather than building custom solutions. These platforms offer extensive out-of-the-box capabilities that are hard to replicate or justify through internal development.

Stick to standards and minimize customization. We aimed to use the platform as close to out-of-the-box as possible. Customizations often introduce performance issues, complicate upgrades, and inflate costs. Standardization ensures smoother implementation and long-term maintainability.

Start with reliable internal data sources. Instead of relying on ITOM auto-discovery, which generated excessive noise and manual cleanup, we integrated data from trusted systems like VMware and Microsoft Defender. This gave us a solid foundation for our CMDB and service processes. Auto-discovery may be revisited later as a complementary feature.

Phase AI adoption deliberately. AI was intentionally scoped into a third phase, after a year of stable operations. We first focused on getting the core ITSM processes right. Our AI roadmap includes use cases like automated ticket communication, incident correlation, and anomaly detection in monitoring metrics. The goal is to reduce repetitive tasks and shift from reactive to proactive operations.

Push data into ITSM, but retain ownership at the source. We designed our architecture to feed data into the ITSM platform while keeping it in the original systems. This avoids vendor lock-in and allows for parallel migration paths if needed. Should the platform become cost-prohibitive or misaligned, we can reroute integrations with minimal disruption.
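To make the "push, but keep ownership at the source" pattern concrete, here is a minimal sketch of mapping records from a source-of-truth inventory (e.g. VMware) into CI payloads for a ServiceNow-style CMDB Table API. The field names, the correlation-key convention, and the endpoint mentioned in the comment are illustrative assumptions, not a specific implementation we used.

```python
# Sketch: map source-inventory records to CMDB CI payloads.
# Keeping a stable correlation key means later syncs update existing CIs
# instead of duplicating them, and the source system stays authoritative,
# so the integration can be rerouted to another platform if needed.
import json


def build_ci_payload(host: dict) -> dict:
    """Map one record from the source inventory to a CMDB CI payload.

    'correlation_id' encodes the source system and its native ID, so the
    CMDB record can always be traced back to (and rebuilt from) the source.
    """
    return {
        "name": host["hostname"],
        "ip_address": host["ip"],
        "os": host.get("os", "unknown"),
        "correlation_id": f"vmware:{host['uuid']}",
    }


# Illustrative data; real payloads would be POSTed to a table endpoint
# such as /api/now/table/cmdb_ci_server on the ITSM platform.
hosts = [{"hostname": "app01", "ip": "10.0.0.5", "uuid": "abc-123"}]
payloads = [build_ci_payload(h) for h in hosts]
print(json.dumps(payloads[0], indent=2))
```

The design choice worth noting is that the mapping function is the only platform-specific piece; swapping ITSM vendors means rewriting this one function, not the inventory systems behind it.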

Treat ITSM as a program, not a project. We categorized this initiative as a Large to Extra-Large effort using T-shirt sizing. It encompasses service desk consolidation, hardware inventory, Azure integration, CMDB setup, service requests, event handling, communication workflows, and release management. Each component is a mini-project within the overarching ITSM program.

Budget realistically: think beyond MVP. We coined this a "PI project," referencing the ratio of MVP effort to full implementation: roughly 3.1415 times the initial scope. This helped stakeholders understand the true scale and complexity from the outset.

Track costs per stream and validate AI ROI. Cost control is critical. We monitor each track individually and assess the long-term financial impact of activating AI features. Our business case for AI is centered on cost reduction, not additive spend, ensuring that automation delivers measurable value.

Challenges to Prepare For:
- Underestimating the scope and interdependencies of ITSM components
- Managing stakeholder expectations around AI timelines and outcomes
- Avoiding over-customization that hinders scalability and upgrades
- Ensuring data quality and consistency across integrated systems
- Balancing innovation with cost control and vendor flexibility

Implementing ITSM with AI is a transformative journey. Our approach has been to build a stable foundation first, then layer in intelligence where it drives efficiency and insight.

Chief Information Officer, 21 days ago

The ITSM scope of work must be defined in alignment with the organization's structure and the service portfolio of its technology stack. If this is identified clearly, you will have the best-fit tools for the need. The challenge is ensuring the team is well trained and consistent in usage and updates.

VP of Information Technology in Services (non-Government), 22 days ago

For us, it's important to be able to segment different stakeholders or groups/agents using different forms or conditional questions in the request process.

It needs to have an easy-to-navigate UX.

Good metrics and reporting matter too, along with the ability to easily build custom reports.

Ideally, integration with an SSO provider like Okta, where user provisioning is automated for internal use cases, is vital.
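Automated provisioning from an SSO provider like Okta is typically done over SCIM 2.0. As a hedged illustration (attribute names follow the SCIM core schema, RFC 7643; the helper function and sample values are invented for this sketch), here is the shape of the user resource such an integration exchanges:

```python
# Sketch: a minimal SCIM 2.0 User resource, the payload an identity
# provider sends to an ITSM tool's provisioning endpoint to create users
# automatically instead of by hand.
def scim_user(email: str, given: str, family: str, active: bool = True) -> dict:
    """Build a minimal SCIM 2.0 User resource (RFC 7643 core schema)."""
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": email,
        "name": {"givenName": given, "familyName": family},
        "emails": [{"value": email, "primary": True}],
        "active": active,
    }


# Example: a user record as it would be POSTed to the tool's /scim/v2/Users endpoint.
user = scim_user("jdoe@example.com", "Jane", "Doe")
```

A tool that accepts this standard payload can plug into Okta, Entra ID, or any other SCIM-capable identity provider without custom connectors.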

CIO, 23 days ago

First of all, it is important to understand what business problem you need to solve and what represents value for money. Arrange a Proof of Concept with the vendor to determine whether their AI can actually help you solve the problem.

Gain a good understanding of how the AI works and how your data is being used. You want to protect your data from leaks and secure it well. You need to know what guardrails and data governance must be in place to safeguard access to and use of your data.
