Making AI Search Work: Practical Lessons from the Field

Article in ServiceNow Community · Apr 29, 2025

Introduction

This article continues our series from the AI Center of Excellence Team at ServiceNow.

It focuses on solving a specific set of customer challenges related to AI Search, based on real-world project experience. Please note: this is not a comprehensive guide, but rather a focused and honest overview — highlighting what tends to work well, common pitfalls, and important aspects to consider.

Common Implementation Challenges

  • Understanding existing capabilities
    Many teams struggle to get a clear picture of what AI Search actually offers, how to connect external data sources, and how to make use of AI Search Analytics.
  • Desired results not being returned
    A frequent concern is that the search doesn't return the expected results, often pointing to configuration gaps or issues with content quality.
  • Lack of early architectural planning
    Decisions around external source integration (e.g., connector configuration) or scaling are sometimes overlooked at the beginning of the project, leading to costly rework later on and a loss of user trust.

Let’s dive into each of these in more detail.

Understanding Existing Capabilities

AI Search in ServiceNow is a hybrid model, combining semantic and keyword-based search techniques.

  • Semantic search understands the intent and contextual meaning behind user queries, enabling it to return relevant results even if the exact words aren’t present.
  • Keyword search focuses on direct matches with the user’s typed input.
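
To make the hybrid idea concrete, here is a minimal Python sketch of blending a semantic score with a keyword score. It is purely illustrative: ServiceNow does not expose its ranking internals, and the weighting parameter is invented for the example.

```python
# Toy hybrid ranking: blend a semantic-similarity score with a
# keyword-overlap score. Purely conceptual; not the platform's algorithm.

def keyword_score(query: str, doc: str) -> float:
    """Fraction of query terms that literally appear in the document."""
    terms = query.lower().split()
    text = doc.lower()
    return sum(t in text for t in terms) / len(terms)

def hybrid_score(semantic: float, keyword: float, alpha: float = 0.6) -> float:
    """Weighted blend; alpha is a made-up tuning knob, not a platform setting."""
    return alpha * semantic + (1 - alpha) * keyword

doc = "Reset your VPN password from the self-service portal."
print(keyword_score("vpn password reset", doc))  # 1.0, every term matches
print(hybrid_score(semantic=0.82, keyword=1.0))  # ~0.892
```

A document that scores well on either signal still surfaces, which is why semantic search can rescue queries whose exact wording never appears in the content.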

Furthermore, AI Search has expanded its reach by supporting external data sources. This means organisations can now index and retrieve content not only from within ServiceNow but also from platforms such as Atlassian Confluence Cloud and Microsoft SharePoint Online.

This enhancement makes it possible to surface external information in Now Assist Q&A Genius Results, giving users a more comprehensive and integrated knowledge experience.

Now Assist Q&A Genius Results is a feature within ServiceNow’s AI Search that provides users with concise, actionable answers derived from knowledge articles. It understands user intent and delivers relevant content directly — without needing users to click through multiple results.

This improves self-service, increases knowledge utilisation, and reduces the time needed to find information.

ServiceNow also provides robust out-of-the-box analytics, including the User Search Analyzer dashboard, part of Now Assist Analytics. This dashboard helps track key metrics like:

  • Total search queries
  • Most common search phrases
  • Queries that returned no results

These insights are crucial for identifying knowledge gaps, optimising search relevance, and improving user satisfaction. Once insights are gathered, it's vital to act on them. Incorporate Continual Service Improvement (CSI) practices to evolve your AI Search setup over time:

  • Use “no result” queries to identify missing or unclear content (see the sketch after this list).
  • Regularly update stop words and synonyms based on actual user behaviour.
  • Adjust chunking or boosting rules to match changing usage patterns.
  • Review promoted content quarterly to ensure it remains relevant.
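
For the first point, a small script over exported analytics data can speed up triage of dead-end searches. The sketch below assumes a hypothetical CSV export with query and result_count columns; adapt it to however you actually extract your search data.

```python
# Hypothetical triage of "no result" queries from an exported CSV.
# The file name and column names are illustrative, not a ServiceNow
# export format; map them to your actual extract.
import csv
from collections import Counter

no_result = Counter()
with open("search_queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if int(row["result_count"]) == 0:
            no_result[row["query"].strip().lower()] += 1

# The most frequent dead-end queries are the strongest candidates
# for new or clearer knowledge articles.
for query, hits in no_result.most_common(10):
    print(f"{hits:4d}  {query}")
```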

By embedding CSI into your operational rhythm, your AI Search implementation becomes a living system — continuously adapting to meet evolving user needs.

Challenge: AI Search Doesn't Return Expected Results

Inconsistent Results Across Portals

If results differ between portals (e.g., Employee Center and Service Portal), ensure their search profiles are configured consistently:

  • Same search sources
  • Same genius results
  • Unified stop words and rules
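
A simple way to spot drift between portals is to list each profile's sources and diff them. The sketch below uses invented source names purely to illustrate the check.

```python
# Quick consistency check between two portals' search profiles.
# Source names are made up; transcribe your own profiles' sources.
employee_center = {"Knowledge", "Catalog Items", "SharePoint Docs"}
service_portal = {"Knowledge", "Catalog Items"}

missing = employee_center - service_portal
if missing:
    print("Sources missing from Service Portal:", missing)
```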

Desired Results Not Being Returned

In the Search Profile configuration, you can define Result Improvement Rules to promote, boost, or block certain results. Actions include:

  • Boost: Increases the relevance score of targeted results.
  • Promote: Forces specific items to the top, regardless of score.
  • Block: Hides unwanted results (e.g., deprecated services or outdated policies).

Also configure stop words: common terms like “the” and “is”, or company-specific acronyms that add no value to search logic. Excluding them sharpens result accuracy and improves processing speed.

Example: If internal terms like “XYZ” are used frequently but don’t help with search intent, they should be excluded as stop words.
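
A minimal sketch of what stop-word filtering does to a query is shown below; the real list is maintained in the AI Search configuration, and "xyz" stands in for the hypothetical internal acronym above.

```python
# Minimal illustration of stop-word filtering. The real stop-word
# list lives in the AI Search configuration; "xyz" is the hypothetical
# internal acronym from the example above.
STOP_WORDS = {"the", "is", "a", "an", "xyz"}

def effective_terms(query: str) -> list[str]:
    """Terms that actually contribute to matching after filtering."""
    return [t for t in query.lower().split() if t not in STOP_WORDS]

print(effective_terms("What is the XYZ onboarding process"))
# ['what', 'onboarding', 'process']
```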

Another key to improving relevancy is using an effective chunking strategy.

Instead of indexing long documents, break them into smaller, semantically meaningful pieces. Choose the "Passage" strategy under the Semantic Index configuration. You can define chunking based on word count, sentence boundaries, or custom logic.

Smaller, cleaner chunks make it easier for the model to return accurate responses — particularly for specific or partial queries — and they boost overall performance.
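
Conceptually, passage chunking groups sentences until a size cap is reached, as in the sketch below. The platform's "Passage" strategy is configured declaratively in the Semantic Index, so treat this only as an illustration of the behaviour.

```python
# Conceptual passage chunking: group sentences until a word-count cap.
# Illustrative only; the "Passage" strategy itself is configured in the
# Semantic Index, not scripted.
import re

def chunk_passages(text: str, max_words: int = 60) -> list[str]:
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    chunks, current, count = [], [], 0
    for sentence in sentences:
        words = len(sentence.split())
        if current and count + words > max_words:
            chunks.append(" ".join(current))  # close the full chunk
            current, count = [], 0
        current.append(sentence)
        count += words
    if current:
        chunks.append(" ".join(current))
    return chunks
```

Tuning the size cap is a trade-off: very large chunks dilute the signal for specific questions, while very small ones strip away the context the model needs.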

Challenge: Lack of Architectural Planning

A common pitfall in AI Search projects is neglecting architecture early on. Without a solid foundation, teams often face performance issues, unnecessary costs, or security risks. Here are two major areas to focus on:

Indexing Scope & Cost Optimization

Define what data truly needs indexing:

  • Apply filters (e.g., only active records, or recent updates)
  • Avoid duplication between profiles
  • Align search profiles to user roles and business functions

If left unchecked, indexing everything leads to bloated data volumes, longer processing times, and increased storage or licensing costs.
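
As a mental model, an indexing-scope filter is just a predicate over records, as the sketch below shows for "active and updated within the last year". In practice this is defined as a condition on the search source, not in code.

```python
# Conceptual indexing-scope filter: "active records updated within the
# last year". In the product this is a declarative source condition.
from datetime import datetime, timedelta

CUTOFF = datetime.now() - timedelta(days=365)

def should_index(record: dict) -> bool:
    return bool(record["active"]) and record["updated_on"] >= CUTOFF

recent = {"active": True, "updated_on": datetime.now() - timedelta(days=30)}
print(should_index(recent))  # True
```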

External Content Connectors

If you’re integrating external sources (e.g., SharePoint, Confluence), pay special attention to:

  • Security Model: AI Search respects original system permissions. Validate they align with your organisation’s access policies.
  • Crawling Configuration: Define start points, inclusion/exclusion filters, and update intervals to avoid unnecessary load.
  • Volume Planning: One connector can index up to 1 million documents. For larger libraries, split the content across multiple connectors or apply filtering to stay efficient.

Failure to plan external connector use leads to incomplete indexing, mismatched access rights, or performance degradation — all expensive to fix later.
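
One way to plan the multi-connector split mentioned above is to partition the library by path prefix and use those prefixes as inclusion filters, one set per connector. The sketch below is hypothetical; filter syntax and document limits are connector-specific, so verify them against the product documentation for your release.

```python
# Hypothetical split of a large SharePoint library across two connectors
# using path-prefix inclusion filters. Paths and connector names are
# invented for illustration.
CONNECTOR_FILTERS = {
    "connector_a": ["/sites/hr/", "/sites/finance/"],
    "connector_b": ["/sites/engineering/", "/sites/it/"],
}

def connector_for(path: str) -> str | None:
    """Which connector (if any) would pick up a given document path."""
    for name, prefixes in CONNECTOR_FILTERS.items():
        if any(path.startswith(p) for p in prefixes):
            return name
    return None  # outside every inclusion filter: not indexed

print(connector_for("/sites/hr/policies/leave.docx"))  # connector_a
print(connector_for("/sites/archive/old.docx"))        # None
```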

Data quality is key when enabling intelligent features like Genius Results.

Activating Genius Results can significantly enhance the user experience by delivering concise, intent-aware answers. However, rolling it out across all knowledge articles without assessing content quality can lead to vague, misleading, or outright incorrect responses — which undermines user trust and adoption.

To mitigate this, a best-practice approach is to segment your knowledge sources using Search Sources and enable Genius Results only for high-quality, curated content. This allows you to:

  • Limit Q&A functionality to thoroughly reviewed or newly written KB articles
  • Exclude older or inconsistent content until it’s updated or rewritten
  • Test and optimise Q&A responses before deploying them more widely

In parallel, maintaining a well-structured catalog plays a vital role. Following best practices — such as using clear, consistent naming conventions and properly configured metadata — dramatically improves both the searchability and usability of your content. This ensures that users can find what they need quickly, leading to greater efficiency and a more satisfying experience.

As part of your Continual Service Improvement (CSI) efforts, consider gradually expanding your Genius Results coverage:

  • Use AI Search Analytics to identify high-impact articles for improvement
  • Prioritise knowledge areas with frequent user queries and weak engagement
  • Migrate more search sources into the Q&A-enabled group as their quality improves

Conclusion

AI Search in ServiceNow offers powerful capabilities — but like any enterprise feature, it requires strategic planning, ongoing optimisation, and a clear understanding of how users interact with it.

By addressing early architecture, understanding capabilities, and continuously tuning the experience using analytics, organisations can dramatically improve both search quality and user satisfaction.

PS: Views are my own, and do not represent my team, employer, partners, or customers.

Original source: https://www.servicenow.com/community/now-assist-articles/making-ai-search-work-practical-lessons-from-the-field/ta-p/3250487