
Generative AI Controller FAQ

Sep 29, 2023 · article


(Washington DC release)

1. What is the Generative AI Controller?

The Generative AI Controller is our intelligence connection layer to the Large Language Models (LLMs) that produce AI-generated content. It serves as the foundation for our generative AI products but can also be leveraged to create your own custom generative AI applications and workflows within Flow Designer, Virtual Agent Designer, and scripts.

You can find an example of building and deploying a custom workflow here.

2. Where can the Generative AI Controller be used in the Now Platform?

You can leverage the Generative AI Controller to integrate generative AI functionality within Flow Designer, Virtual Agent Designer, and general platform scripting.

(Screenshot in the original article: example usage within Virtual Agent Designer.)

3. What out-of-box use cases are provided by the Generative AI Controller app?

The use cases available are below:

  • Generate Content: Generate content with generative AI to create bodies of text quickly, such as proposed responses agents can use to reply to incoming emails.
  • QnA: Help users find the answers to their questions, freeing up time for agents.
  • Summarize: Summarize large bodies of text or complicated exchanges to make the transfer of information easier, such as during handoffs between virtual and live agents.
  • Generic Prompt: Use generative AI to generate ideas and brainstorm on any topic, such as ideas for increasing engagement; for example, you can ask the LLM to create a marketing plan.
  • Sentiment Analysis: Calculate the sentiment of an input utterance as Positive, Neutral, or Negative.
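Conceptually, each of these actions wraps the user's input in a task-specific prompt before sending it to the LLM. The templates below are hypothetical illustrations of that idea only; ServiceNow's actual prompts are read-only and not published here:

```javascript
// Illustrative only: hypothetical prompt templates showing how actions
// like Summarize or Sentiment Analysis are conceptually built on top of
// a plain text-completion call. These are NOT ServiceNow's real prompts.
var templates = {
  summarize: function (text) {
    return "Summarize the following text in a few sentences:\n\n" + text;
  },
  sentiment: function (utterance) {
    return "Classify the sentiment of this utterance as Positive, " +
           "Neutral, or Negative:\n\n" + utterance;
  }
};

// Build a prompt for a single utterance.
var prompt = templates.sentiment("The new portal is fantastic!");
```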

4. Which Generative AI providers are currently supported?

OpenAI, Azure OpenAI, Aleph Alpha, IBM watsonx, and Google Cloud (Vertex AI and MakerSuite). As of the Washington release, you can also connect with any external LLM using the generic LLM connector.

5. What are the requirements to get started with the Generative AI Controller?

To begin using the Generative AI Controller, you need the following:

  • An existing Pro Plus/Enterprise Plus entitlement
  • A license and API key from the external LLM provider you wish to connect to
  • An instance on Vancouver Patch 2 or later

Store app: Note that you do not need to install the controller store application yourself; it is bundled with the Now Assist for [x] plugins.

6. Does the Generative AI Controller use Integration Hub and consume transactions?

The Controller uses embedded Integration Hub spokes to connect to the third-party LLM service providers, but does not incur Integration Hub transaction costs. Instead, it consumes "Assists" as part of the Pro Plus/Enterprise Plus SKU.

7. Does my ServiceNow data leave my instance when using 3rd party Generative AI providers?

Yes. Please be aware that data and queries you send from the app go to the party that hosts the external LLM you are leveraging. Review their data privacy policies and decide what usage policy best fits your organization.

You can learn more about Now Assist data handling and security here.

8. I have sensitive data that I don't want to leave my instance. How can I prevent this?

You can use the Sensitive Data Handler plugin. It is not active by default and must be installed and configured by an admin to detect and mask data considered sensitive for your business. To configure it, define the regex patterns that match what you consider to be sensitive information.
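The masking idea can be sketched in plain JavaScript. This is an illustrative stand-in, not the plugin's implementation; the pattern names and regexes are hypothetical examples of what an admin might configure:

```javascript
// Sketch only: the real Sensitive Data Handler plugin is configured
// declaratively in the instance. Pattern names below are hypothetical.
function maskSensitive(text, patterns) {
  // Replace every match of each configured regex with a mask token.
  return patterns.reduce(function (masked, p) {
    return masked.replace(p.regex, "[" + p.name + " MASKED]");
  }, text);
}

// Hypothetical examples of patterns an admin might define.
var patterns = [
  { name: "SSN", regex: /\b\d{3}-\d{2}-\d{4}\b/g },
  { name: "EMAIL", regex: /\b[\w.+-]+@[\w-]+\.[\w.]+\b/g }
];

var out = maskSensitive("Contact jane@example.com, SSN 123-45-6789.", patterns);
// out: "Contact [EMAIL MASKED], SSN [SSN MASKED]."
```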

9. Is the generative AI output moderated for unsafe or harmful content?

This depends on how you configure your account/instance of the external LLM. Each provider has its own content moderation capabilities, which you can review to determine whether they are fit for purpose.

10. Is there a possibility that Generative AI output is inaccurate (e.g. hallucination)?

While we have minimized the possibility of hallucinations through selective use cases and prompt engineering, there is always a risk of inaccurate information in generated content. Thus, it is always recommended to employ human-in-the-loop review for such content.

11. Is it possible to modify/create my own prompt, temperature, or number of tokens?

Currently no. ServiceNow provides a read-only initial prompt, which you can expand upon in your workflows. The default temperature is 0, and the default max_tokens is 500, which is the number of tokens reserved for the response.
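To make the two defaults concrete, here is the shape of a generic OpenAI-style chat request. The field names follow the public OpenAI chat-completions API, not ServiceNow internals, and the model name is just an example:

```javascript
// Sketch of an OpenAI-style request body, illustrating what the
// controller's fixed defaults correspond to on the provider side.
function buildRequest(userPrompt) {
  return {
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: userPrompt }],
    temperature: 0,   // deterministic output; the controller's fixed default
    max_tokens: 500   // token budget reserved for the model's response
  };
}

var req = buildRequest("Summarize this incident in two sentences.");
```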

12. What models are currently supported out-of-box for OpenAI?

GPT-3, GPT-3.5 Turbo, and GPT-4.

13. What is the external cost per transaction calling the API?

It depends on the provider and the length of the output. We recommend reaching out to the providers to understand their pricing models.
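As a rough sizing aid, provider pricing is usually quoted per 1,000 prompt and completion tokens, so a simple estimate looks like this. The rates below are hypothetical placeholders, not real prices; always check each provider's current pricing page:

```javascript
// Back-of-the-envelope cost estimate. Rates are HYPOTHETICAL.
function estimateCostUSD(promptTokens, completionTokens, rates) {
  // Providers typically price prompt and completion tokens separately,
  // quoted in USD per 1,000 tokens.
  return (promptTokens / 1000) * rates.promptPer1k +
         (completionTokens / 1000) * rates.completionPer1k;
}

// Hypothetical rate card, for illustration only.
var rates = { promptPer1k: 0.001, completionPer1k: 0.002 };

// 800 prompt tokens + 500 completion tokens: roughly $0.0018 at these rates.
var cost = estimateCostUSD(800, 500, rates);
```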

14. When using Azure OpenAI, I’m running into an error: "The API deployment for this resource does not exist. If you create the deployment within the last 5 minutes, please wait a moment and try again."

Ensure that your deployment name is the same as the model name in the Azure admin portal.


15. Is it possible to maintain context between multiple queries?

Not today.
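For background, the usual way chat applications simulate memory over a stateless LLM API is to resend prior turns with every call. The sketch below illustrates that generic pattern; it is not a Generative AI Controller feature:

```javascript
// Generic chat-history pattern: the client resends prior turns with
// every request. The Generative AI Controller does not do this for you
// today; each call stands alone.
function withHistory(history, newUserTurn) {
  // Return the message list a context-aware call would need to send.
  // concat() leaves the original history array unmodified.
  return history.concat([{ role: "user", content: newUserTurn }]);
}

var history = [
  { role: "user", content: "What is our VPN policy?" },
  { role: "assistant", content: "Summarized policy..." }
];

// The next request would carry both prior turns plus the new question.
var nextRequestMessages = withHistory(history, "And for contractors?");
```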

16. Can I use the Generative AI Controller to connect to Now LLM for my custom use cases?

Currently no. You can only connect to external LLM providers with the Generative AI Controller.

View original source

https://www.servicenow.com/community/ai-intelligence-articles/generative-ai-controller-faq/ta-p/2686478