Using external LLMs with Now Assist
ServiceNow Community article · Mar 26, 2025
When using Now Assist products, you may encounter a scenario where you want to change the large language model (LLM) that provides generative AI capabilities. This guide walks you through that process, showing how to set up a connection to an external LLM and what options you have for changing the provider for both out-of-the-box (OOTB) and custom Now Assist skills.
Please note that the selection of an LLM depends entirely on your use case and the constraints you face around topics such as data sovereignty and data handling. We cannot provide guidance on which LLM to select.
Scenario 1: You want to modify the LLM for an OOTB Now Assist skill
We offer the ability to change the LLM for certain OOTB Now Assist skills using Now Assist Skill Kit. To see which of your skills are eligible, navigate to the sn_nowassist_skill_config table and find the skills where is_template = true (a query sketch follows the table below). As of March 2025, the skills this can be done for are as follows:
| Application | Skill Name |
|---|---|
| GRC Common GenAI | Issue Summarization |
| Now Assist for IT Service Management (ITSM) | Change request summarization |
| Now Assist for Field Service Management (FSM) | Work Order Task Summarization |
| Now Assist for Customer Service Management (CSM) | Resolution notes generation |
| Now Assist for IT Service Management (ITSM) | Incident summarization |
| Now Assist for HR Service Delivery (HRSD) | Case summarization |
| Now Assist for Customer Service Management (CSM) | Case summarization |
If you wish to modify the provider for a skill not on this list, please let your account representative know.
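If you prefer to check the list directly on your instance, the sketch below is one way to do it from a background script. This is a minimal example, assuming admin access and the table and field names described above; it simply prints the display value of each eligible skill record.

```javascript
// Minimal sketch (assumes admin access): list the OOTB skills whose LLM
// provider can be changed, i.e. records on sn_nowassist_skill_config
// where is_template is true.
var skill = new GlideRecord('sn_nowassist_skill_config');
skill.addQuery('is_template', true);
skill.query();
while (skill.next()) {
    gs.info('Eligible skill: ' + skill.getDisplayValue());
}
```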
To modify the OOTB skill’s LLM, you can follow the steps in the video below.
Scenario 2: Selecting an LLM for a custom Now Assist skill
When you are looking to build your own generative AI functionality, you can use Now Assist Skill Kit. During the process of creating a skill, you will be asked to select from a list of providers. This list is sourced from the records with external=true on the sys_generative_ai_model_config table.
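To preview which providers will appear in that list on your instance, you can query the same table. This is a sketch only, assuming the table and field named above; the list shown in Now Assist Skill Kit is driven by these records.

```javascript
// Sketch: list the external LLM providers available to custom skills,
// i.e. records on sys_generative_ai_model_config where external is true.
var model = new GlideRecord('sys_generative_ai_model_config');
model.addQuery('external', true);
model.query();
while (model.next()) {
    gs.info('External provider: ' + model.getDisplayValue());
}
```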
How to connect to external LLMs
For both scenarios, when you select an LLM other than Now LLM, you are using an LLM that is not managed by ServiceNow, so you must provide your own key. There are two options: connecting to an LLM that we have a spoke for, or bringing your own LLM (BYO LLM). We advise using an LLM that we have a spoke for where possible, as the connection process is more streamlined and easier to manage.
Connecting to an LLM spoke
We provide a number of spokes to external LLMs, including (as of March 2025):
- Azure OpenAI
- OpenAI
- Aleph Alpha
- IBM's WatsonX
- Amazon Bedrock
- Google's Vertex AI
- Google's Gemini AI Studio
To connect to a spoke, you need to procure your own license and provide ServiceNow with the key and any additional credentials that the API may need to verify the connection.
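Before you hand the key over, it can help to confirm the credentials work against the provider directly. The sketch below is a hypothetical pre-check run outside your instance (Node.js 18+), assuming an OpenAI-style provider; substitute your provider's endpoint and authentication headers.

```javascript
// Optional pre-check, run outside ServiceNow (Node.js 18+ has a global fetch).
// Assumption: an OpenAI-style API; replace the URL and headers for your provider.
const apiKey = process.env.LLM_API_KEY;

fetch('https://api.openai.com/v1/models', {
    headers: { 'Authorization': 'Bearer ' + apiKey }
}).then(function (res) {
    // A 200 status means the key was accepted; 401 means it was rejected.
    console.log('Key check status: ' + res.status);
}).catch(function (err) {
    console.error('Could not reach the provider endpoint: ' + err);
});
```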
For providers that offer multiple LLMs, such as Amazon Bedrock and Vertex AI, please note that you can use only one LLM at this time.
To see how to connect to one of our spokes, you can view the recording below:
If you are connecting to IBM’s WatsonX spoke, you can follow the steps outlined from 8:00 to 13:00 in the AI Academy session below. Note that the build shown after the 13-minute mark uses an outdated method of creating generative AI functionality within ServiceNow; we instead recommend using Now Assist Skill Kit.
Connecting to an LLM that does not have a spoke (Custom LLM)
Instances on the Washington DC release or later can use the generic LLM connector to connect to any external LLM not listed above, i.e. BYO LLM. This process requires a fair amount of technical acumen. To integrate with an LLM that does not have a spoke, you need:
- An API key from the provider
- The endpoint for the LLM
- Access to the API documentation for that LLM, to assist with writing the transformation scripts that translate the input and response into an acceptable format (see the sketch after this list)
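To give a sense of what the transformation scripts involve, here is a minimal sketch, assuming the provider exposes an OpenAI-style chat completions API. The function names, payload shapes, and model identifier are placeholders rather than the connector's actual interface; the exact inputs and outputs expected by the generic LLM connector depend on your release, so follow the API documentation for your provider and the connector.

```javascript
// Hedged sketch only: function names and payload shapes are placeholders.
// The real transformation scripts must match what the generic LLM connector
// passes in and expects back on your release.

// Translate the prompt from Now Assist into the provider's request format
// (assumption: an OpenAI-style chat completions endpoint).
function buildProviderRequest(prompt) {
    return JSON.stringify({
        model: 'your-model-name', // placeholder model identifier
        messages: [{ role: 'user', content: prompt }],
        temperature: 0.2
    });
}

// Translate the provider's response back into the plain text Now Assist expects
// (assumption: the reply is at choices[0].message.content).
function parseProviderResponse(responseBody) {
    var parsed = JSON.parse(responseBody);
    return parsed.choices[0].message.content;
}
```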
An example demonstrating the process of connecting to an external LLM using the generic LLM connector can be seen in the video below: