GQL (Glide Query Language) Part 6: The Processor

Oct 26, 2016

Last time, we derived the JSAN (JavaScript Array Notation) data format from JSON and saw how the GQL class from the Script Include can return result sets in either JSON or JSAN data format. This time, we'll look at the processor and see how it handles the requests coming from the browser and consumes the result sets from the Script Include, encoding them for transmission.

We have a choice between two ServiceNow tools when it comes to processing inbound web requests:

  1. Scoped GlideScriptedProcessor
  2. Scripted REST APIs

While Scripted REST APIs provide a nice UI with nifty features, let's use Scoped GlideScriptedProcessor since our processor will be simple. The processor handles the following tasks:

  1. Receive GET requests from the browser or other sources.
  2. Extract parameters from the query string.
  3. Instantiate and execute the GQL class from the Script Include.
  4. Receive the result set from the GQL class in JSAN or JSON data format (the return value is a JavaScript object and it's not yet encoded in JSON).
  5. Based on the requested format, encode (stringify) the result set into JSON or convert to CSV or HTML table.
  6. Transmit the result via response.

Please note that JSAN is an object data format and is still encoded as JSON for transmission.
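To make that distinction concrete, here is a hedged sketch of the same two-record result set in both formats. The labels/records shape matches what the processor code below consumes; the field values, and the exact shape of the JSON-format variant, are illustrative.

```javascript
// Hypothetical result sets for: SELECT number, short_description FROM incident LIMIT 2

// JSON-style format: field names are repeated in every record (illustrative shape)
var jsonResult = {
    records: [
        { number: 'INC0000055', short_description: 'SAP Sales app is not accessible' },
        { number: 'INC0000054', short_description: 'SAP Materials Management is slow' }
    ]
};

// JSAN format: labels appear once; each record is a plain array
var jsanResult = {
    labels: ['number', 'short_description'],
    records: [
        ['INC0000055', 'SAP Sales app is not accessible'],
        ['INC0000054', 'SAP Materials Management is slow']
    ]
};

// Both are plain JavaScript objects at this point; both are stringified
// to JSON for transmission, but JSAN stringifies to a shorter payload
console.log(JSON.stringify(jsanResult).length < JSON.stringify(jsonResult).length);   // → true
```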

Below is the processor script that captures the above:

/**
 * GQL (Glide Query Language) processor
 */
(function process(g_request, g_response, g_processor) {
    try {
        var gql = g_request.getParameter('gql');
        var format = g_request.getParameter('format');
        var result = new GQL().query(gql, format);
        g_response.setHeader('cache-control', 'no-cache');   // disable page caching to avoid stale results
        if (/CSV/i.test(format)) {
            var filename = result.query && result.query.table || 'gql';
            g_response.setHeader('Content-Disposition', 'attachment;filename=' + filename + '.csv');
            g_processor.writeOutput('text/csv', getCSV());
        }
        else if (/HTML/i.test(format)) g_processor.writeOutput('text/html', getHTML());
        else g_processor.writeJSON(result);
    } catch (e) { g_processor.writeOutput('text/plain', 'ERROR: ' + e + '\r' + e.stack); }

    function getCSV() {
        var columns = [], rows = [];
        result.labels.forEach(function(label) { columns.push(escapeCSV(label)); });
        rows.push(columns.join(','));
        for (var i = 0; i < result.records.length; i++) {
            columns = [];
            result.records[i].forEach(function(column) { columns.push(escapeCSV(column)); });
            rows.push(columns.join(','));
        }
        return rows.join('\r\n');

        /**
         * Takes a raw field value and returns a CSV-escaped value
         * based on RFC 4180, Common Format and MIME Type for CSV Files,
         * October 2005, http://tools.ietf.org/html/rfc4180
         *
         * @param {string} raw - raw field value
         * @return {string} escaped CSV field value, if applicable per RFC 4180
         */
        function escapeCSV(raw) {
            var out = (raw + '').replace(/"/g, '""');   // escape all double quotes
            if (/[\r\n",]/.test(out)) out = '"' + out + '"';   // if it has special characters, enclose in double quotes
            return out;
        }
    }

    function getHTML() {
        var columns = [], rows = ['<table><tr>'];
        result.labels.forEach(function(label) { columns.push('<th>' + escapeHTML(label) + '</th>'); });
        rows.push(columns.join(''), '</tr>');
        for (var i = 0; i < result.records.length; i++) {
            columns = [];
            result.records[i].forEach(function(column) { columns.push('<td>' + escapeHTML(column) + '</td>'); });
            rows.push('<tr>' + columns.join('') + '</tr>');
        }
        rows.push('</table>');
        return rows.join('');

        function escapeHTML(raw) {   // escape reserved HTML characters
            var MAP = { '&': '&amp;', '<': '&lt;', '>': '&gt;' };
            return (raw + '').replace(/[&<>]/g, function(c) { return MAP[c]; });
        }
    }
})(g_request, g_response, g_processor);

The first two statements extract the two parameters gql and format from the request object g_request.

The script then instantiates the GQL class, executes its query() method with the two extracted parameters, and saves the result set.

The if/else chain encodes the result set as CSV, HTML, or JSON (for both the JSAN and JSON formats) and transmits it back to the requester; notice that the content type is set to match the encoding. The writeJSON() method handles the JSON encoding and also sets the content type.

This sums up what the processor does; it doesn't get much simpler. The remaining functions, getCSV() and getHTML(), are the CSV and HTML encoders. These don't really need to be part of the processor, but I wanted to illustrate how easily JSAN data can be turned into other formats. The arrays used in JSAN can be quickly converted to formatted rows using the forEach() array method. The CSV and HTML encoders work very similarly, iterating over records and columns while inserting row and column delimiters. HTML tables can just as easily be generated on the client side from JSAN data using AngularJS, for example; this may be the preferred approach if you want tighter interaction with the tables in the browser.
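As a standalone illustration of that point (runnable outside ServiceNow, with a made-up result set; the third record is hypothetical, added to show the quoting rule firing), the same JSAN-to-CSV conversion can be sketched like this:

```javascript
// Hypothetical JSAN result set, shaped like the one GQL returns (labels + records)
var result = {
    labels: ['Number', 'Title'],
    records: [
        ['INC0000055', 'SAP Sales app is not accessible'],
        ['INC0000031', 'When can we get off Remedy? UI is killing us'],
        ['INC0000001', 'Printer offline, 3rd floor']   // made-up record containing a comma
    ]
};

// Same RFC 4180 escaping rule used by escapeCSV() in the processor
function escapeCSV(raw) {
    var out = (raw + '').replace(/"/g, '""');          // double every double quote
    if (/[\r\n",]/.test(out)) out = '"' + out + '"';   // quote fields containing CR, LF, quotes, or commas
    return out;
}

var rows = [result.labels.map(escapeCSV).join(',')];
result.records.forEach(function(record) { rows.push(record.map(escapeCSV).join(',')); });
console.log(rows.join('\r\n'));
// Header row: Number,Title
// Last row:   INC0000001,"Printer offline, 3rd floor"
```

Because each JSAN record is already an array, the whole conversion is just map/join over rows and columns.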

Let's now take a look at the Processor configuration page whose screenshot is shown below:

[Screenshot: Processor configuration page]

I'd like to point out a few things:

  1. Application is read-only and automatically set to "gql".
  2. Set Type to "script" since we're using JavaScript in the Script section.
  3. Set Path to "proc", short for "processor".
  4. Path Endpoint is read-only and is automatically set to "x_64935_gql_proc". Here, "64935" is the "Vendor prefix", which, in this case, is a numeric id automatically assigned based on the personal developer instance being used. "gql" is the application name and "proc" is the Path.

Using the Path Endpoint, a web service call can be made using a URL similar to (replace "instance" with your own instance)

https://instance.service-now.com/x_64935_gql_proc.do?gql=SELECT%20number[Number],short_description[Title],dv_state[State],caller_id.email[Caller%20Email],dv_cmdb_ci[CI],cmdb_ci%20FROM%20incident%20WHERE%20active=true^priority<2EQ%20ORDER%20BY%20priority%20ORDER%20BY%20DESC%20number%20LIMIT%2010&format=HTML

This returns an HTML table similar to this (from Incident demo data):

| Number | Title | State | Caller Email | CI | cmdb_ci |
| --- | --- | --- | --- | --- | --- |
| INC0000055 | SAP Sales app is not accessible | In Progress | carol.coughlin@example.com | SAP Sales and Distribution | 26e494480a0a0bb400ad175538708ad9 |
| INC0000054 | SAP Materials Management is slow or there is an outage | On Hold | christen.mitchell@example.com | SAP Materials Management | 26e44e8a0a0a0bb40095ff953f9ee520 |
| INC0000053 | The SAP HR application is not accessible | In Progress | margaret.gray@example.com | SAP Human Resources | 26e51a2f0a0a0bb4008628d2254c42db |
| INC0000052 | SAP Financial Accounting application appears to be down | In Progress | bud.richman@example.com | SAP Financial Accounting | 26e426be0a0a0bb40046890d90059eaa |
| INC0000051 | Manager can't access SAP Controlling application | In Progress | employee@example.com | SAP Controlling | 26e46e5b0a0a0bb4005d1146846c429c |
| INC0000050 | Can't access Exchange server - is it down? | In Progress | jerrod.bennett@example.com | EXCH-SD-05 | 281190e3c0a8000b003f593aa3f20ca6 |
| INC0000031 | When can we get off Remedy? UI is killing us | In Progress | employee@example.com | | |
| INC0000025 | I need more memory | In Progress | don.goodliffe@example.com | IBM-T42-DLG | 469facd7a9fe1981015830c43428ca2d |
| INC0000018 | Sales forecast spreadsheet is READ ONLY | In Progress | taylor.vreeland@example.com | | |
| INC0000017 | How do I create a sub-folder | On Hold | employee@example.com | | |

At the end of the URL is the format parameter, which is set to "HTML" in the above example. This can be changed to "JSAN" or "JSON" to get the JSON encoded object, which opens up as a file in the browser that you can inspect. When the format is set to "CSV", a CSV-formatted file downloads and opens up in Microsoft Excel, if you have it set as the default application for CSV files.
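Rather than hand-escaping spaces as %20, the query string can be built programmatically. A minimal sketch, where the instance name is a placeholder and the GQL statement is an arbitrary example:

```javascript
// Build the processor URL from a readable GQL statement; encodeURIComponent
// takes care of spaces, brackets, commas, and other reserved characters
var base = 'https://instance.service-now.com/x_64935_gql_proc.do';   // replace "instance" with your own
var gql = 'SELECT number[Number], short_description[Title] FROM incident WHERE active=true LIMIT 10';
var url = base + '?gql=' + encodeURIComponent(gql) + '&format=JSAN';
console.log(url);
```

The same URL then works from a browser address bar, an AJAX call, or any HTTP client.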

I'm sure you can think of some use cases where a simple GET command via URL returns data you can easily consume. How about an email notification template with an embedded table listing stale tickets assigned to the individual recipients? Again, GQL can be used as a web service, as we just saw, or in any server-side scripts, just like what the above processor is doing. Either way, you pass a GQL statement and get back a result set without having to work with GlideRecords directly.

With this, we have now completed what we set out to do in Part 2:

  1. GQL syntax - define GQL syntax to be used
  2. Processor - handle bidirectional HTTP traffic
  3. Parser - parse GQL query for processing
  4. Retriever - query and retrieve result set
  5. Encoder - encode result set for output

addressing all of the requirements from Part 1:

  1. Focus on SELECT for now.
  2. Don't need complex Subqueries or CTEs (Common Table Expressions) for now.
  3. Select the raw and/or display value of a field.
  4. Be able to rename column headings.
  5. Support dot walking of reference fields.
  6. Be able to filter the result set.
  7. Be able to sort by one or more columns.

I haven't covered the use of the asterisk or calculated columns in the SELECT clause; these will be covered in the future.

Next time, we'll look at GQL Pad, an interactive GQL statement editor and executor built using a very simple UI Page with a touch of AngularJS; this will be a small "app" that's built on top of what we have seen so far. I'll also show you how to get the entire working app from GitHub in another blog. We're almost there, so please stay tuned!

Please feel free to connect, follow, post feedback / questions / comments, share, like, bookmark, endorse.

John Chun, PhD PMP see John's LinkedIn profile

visit snowaid

Original source: https://www.servicenow.com/community/developer-blog/gql-glide-query-language-part-6-the-processor/ba-p/2282797