Data Sources - Learn Integrations on the Now Platform
In this video we're going to go into a few more features of import sets. If you haven't watched the previous video in this series, I recommend you watch it to become familiar with import sets, data sources, and transform maps. These are pretty big topics, so I'm breaking this into three smaller sections to keep it from getting too long and to make it easier for you to reference specific pieces later. In this video we'll cover data sources; the next video goes into the details around transform maps, and the one after that covers transform map field mapping.

Now, you may have noticed that the examples used for the import sets are Excel spreadsheets. Regardless of the data source, most of the import set concepts are the same; I'll point out exceptions where spreadsheets differ from other data source types.

Here's our requirement for this video: our communications team has asked that we import the spreadsheet comdevices.xlsx into the CMDB table cmdb_ci_comm. However, when we compare the spreadsheet with the values in the choice fields, we notice there are some differences that need to be accounted for. If we just import the spreadsheet the way it is, the system will automatically create new options in those fields to account for the values in the spreadsheet. This could lead to data inconsistency, confusion, and poor reporting.

Unlike the previous video, where the system built the data source for us automatically when we used the Load Data page, we're going to do everything by hand to really understand what's going on. First we'll navigate to System Import Sets > Administration > Data Sources and click New. The data source defines the what and the how of our import: the import set format, protocols, and credentials if needed. We'll give the data source a name, Communication Devices, give the import set table a label, Import Comms, and choose Excel as our file type. We'll tell it to use the data on sheet number one, with the header in row one. Next we'll save the record and attach the spreadsheet to it.

Let's pause a moment and look at some of the other options on the data source. Under the Type field, if we select JDBC, the form layout changes accordingly. We would use JDBC to connect to a JDBC-compliant database, like MySQL, to retrieve records. The MID Server field is used if our database is behind a corporate firewall; we'll talk more about MID Servers later. For now, think of one as an agent that helps us access on-prem resources. The Format field lists several popular database formats. If we do a JDBC connection, we may need to talk to our database administrator to get the other field values, like database name, port, username, password, and more, to set up the connection properly.

Now let's change the type to LDAP, which stands for Lightweight Directory Access Protocol. As an example, we may want to import our user and group information from our Active Directory server using LDAP, to allow users to sign in with their familiar login and password without us having to set up a single sign-on identity provider. The LDAP Target field allows us to select a record from the LDAP OU Definitions table. Similar to JDBC, we would work with our administrators to understand which OU, server, and other parameters are required if we need to fill that in.

The type OIDC indicates we're using an OpenID Connect provider, such as Google or Amazon, to act as our identity provider. If we've configured an SSO provider to use OpenID, the system will automatically create a data source configured with type OIDC that we can modify later if needed.

Choosing the type REST (IntegrationHub) allows us to select a REST web service action to do our import for us. Commonly, IntegrationHub actions are used in flows built with Flow Designer; however, we can use them in data sources and server-side scripts too. The REST type also exposes the Format field to help the system understand the format of the incoming data; we'll talk about those options in an upcoming episode. The Data Stream (IntegrationHub) option is similar to
the REST (IntegrationHub) option, but it uses a data stream action to retrieve large payloads, say greater than 10 megabytes, and it implicitly uses pagination to retrieve the data in a continuous stream. This is more efficient for the remote system and for your instance when large amounts of data are being transferred. Choosing the data stream type gives us two options: we can either create individual fields on the import set based on the JSON format, or put each record's JSON in a single data field and act on it in the transform map. And of course, if we choose Data Stream (IntegrationHub) as the type, we need to select which data stream to use. Note that both the REST and Data Stream options require IntegrationHub licensing; if you have questions about licensing, please consult your account team.

The final option is to load the data from a custom script. This obviously requires you to write some JavaScript, and in my opinion it isn't used all that frequently, but it allows you to specify exactly how the data is retrieved and stored in the import set table. The focus of the data loader script should be simply to retrieve data and store it in the import set table, not to manipulate or transform it; leave that to the transform map. We'll get to that soon.

Let's go back to the File type and check the options we have with that. The Format field has several to choose from, each with its own sub-options. For our case we chose the File attachment option, but we could also have used FTP, HTTP, or several other similar methods. Again, we would need to get server and credential information from our respective administrators to use these other methods. For now, it's enough to recognize that we have a lot of options to choose from when configuring a data source.

As we build our data source, it's a good idea to do a test load to create an import set table and validate that our raw data can be read. After all, just because a file ends with .xlsx doesn't mean it's really an Excel file. We do this by clicking the related link Load 20 Test Records. The screen should look very familiar if you watched the previous video; it indicates that the records were imported. If we have an error, we want to correct it before continuing.

In the section marked Next Steps, the Import Sets link brings us to a table of import sets related to this data source. Since this is the first time, we'll only have one record; however, if we ran this import several times, there would be an entry for each time the spreadsheet was imported. The State field tells us that the data was loaded, and opening the record gives us a few more details. We can also see this table from System Import Sets > Advanced > Import Sets. And here's a handy tip: rather than re-importing all the data from our data source over and over, possibly costing time and money, we can open a processed import set and use the Reprocess UI action to change its state back to Loaded. This can be helpful as we develop and debug transform scripts.

Back on that status page, we can also take a look at the actual import set records that were imported. This is a good time to ensure that the field types and lengths derived from our sample data will hold up for the rest of our import. Let's say we've imported people, and none of our initial 20 records had an address longer than 15 characters. The system defaults string fields to a minimum length of 40 characters, and that's as much room as we'll have to store any subsequent data in that field. If we later come along and import 1,500 users and one has an address longer than 40 characters, that address is going to be truncated.

Let's open up one of the import set records for our communication devices. As in earlier videos, the system makes some assumptions based on the data in the spreadsheet, so you won't see choice fields and reference fields here, mostly strings and dates. When we right-click on the Name label and select Show - 'u_name', we can see it's a string of length 40.
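To picture why those auto-sized columns matter, here's a small standalone sketch, in plain JavaScript rather than ServiceNow API code, of how a 40-character string column silently cuts off longer incoming values. The field length and the sample addresses are invented for illustration:

```javascript
// Simulate an import set string column that was auto-created
// with the platform's default minimum length of 40 characters.
const FIELD_MAX_LENGTH = 40;

// Values longer than the column length are cut off when stored.
function storeInField(value, maxLength = FIELD_MAX_LENGTH) {
  return value.length > maxLength ? value.slice(0, maxLength) : value;
}

// Our first 20 sample rows had short addresses, so 40 looked fine...
console.log(storeInField('12 Main St')); // stored unchanged

// ...but a later row with a longer address gets truncated.
const longAddress = '1600 Pennsylvania Avenue NW, Washington, DC 20500';
console.log(storeInField(longAddress)); // only the first 40 characters survive
```

The lost characters aren't recoverable from the import set, which is why it's worth checking field lengths against realistic data before the full import.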
That should do it for us. If we need to, we can always right-click and choose Configure Dictionary to adjust the field length up; this is only needed on string fields. Remember, once we increase a field size, we can't reduce it later. Okay, that takes care of the options for the data source and doing a sample import. We'll click this link and continue developing the transform map in the next video.
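As a closing aside on the custom "load by script" option covered earlier: the loader's only job is to fetch rows and insert them into the import set table. Below is a minimal sketch in plain JavaScript. In a real instance the platform supplies the import set table object to the script; here a tiny mock stands in for it so the shape of the script is visible, and the row data, field names (u_name, u_type), and payload are all invented for illustration:

```javascript
// Stand-in for the import set table object the platform would provide;
// insert() adds one raw row to the staging (import set) table.
const importSetTable = {
  rows: [],
  insert(row) { this.rows.push(row); }
};

// The loader should only retrieve and store raw data; any cleanup or
// mapping belongs in the transform map, not here.
function loadData(import_set_table) {
  // Pretend this payload came from a remote system
  // (normally it would be fetched via an API call).
  const payload = [
    { name: 'Desk Phone 100', type: 'IP Phone' },
    { name: 'Softphone Client', type: 'Soft Phone' }
  ];
  payload.forEach((record) => {
    import_set_table.insert({
      u_name: record.name,
      u_type: record.type
    });
  });
}

loadData(importSetTable);
console.log(importSetTable.rows.length); // 2 rows staged for the transform map
```

Keeping the loader this dumb pays off later: the transform map gets one consistent place to handle choice-value mismatches like the ones in our spreadsheet.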
https://www.youtube.com/watch?v=-Kg9FCdSp9I