

Demo - Uplift Your CMDB Experience

Import · Sep 23, 2020 · video

To effectively manage and improve your systems, you need to know exactly what assets are in your IT environment and have current, accurate configuration data. With an accurate CMDB, it's easy to understand your organization's IT environment, particularly in the areas of service impact analysis, asset management, compliance, and configuration management. In this demo, we'll show you how ServiceNow helps you populate the CMDB worry free, monitor your CMDB integrations from a central dashboard, and then gain insights quickly and easily to support critical infrastructure and service decisions to uplift your CMDB.

Let's start at, well, the beginning: populating the CMDB using IntegrationHub ETL. The landing page shows all the existing ETL transform maps. If you haven't completed all the steps to create an ETL transform map, the map status will show as Draft on this page. Let's create a new transform map. Here you're guided through the process in an assistant experience, showing you the steps you've completed and the next steps to follow. Our only unlocked step is "Import source data and provide basic details," so we'll start there.

On this screen, the user picks a CMDB application or can create a new one. A CMDB application is simply the application that produced the data to be brought into the CMDB. For discovery source, we'll select Manual Entry; new discovery sources must be created in the platform, and you can't create them here in IH-ETL. We'll name our new ETL transform map and pick the Computer data source, which our administrator already created for us. Note that you can only create one ETL transform map for each data source. We'll click the Mark as Complete button to save the basic details.

Our next step is now unlocked, so we'll move on to "Preview and prepare data." IH-ETL displays all the columns and rows for the source data we're working with. In this step, we'll create all the necessary entries for the Robust Transform Engine, the CMDB import traffic cop, to create our CMDB data. On this screen, you'll
perform operations to transform your data. The u_last_inventory_date column needs to be converted to a date, so we'll start there. To create a new transform on a column, click the context menu icon on the column and select New Transform. On the New Transform panel, we'll click the Transform Type drop-down and search for Convert to Date. In the Source Timestamp Format input, we'll change the format using standard date notation and click Apply to save this operation.

Next, we'll need to perform some concatenations, creating new columns by combining two or more columns together. We'll start with u_operating_system_name and select the Concatenation operation for this transform. Input 1 defaults to the column we initially selected. We want to concatenate the operating system version to this, so we'll put it in input column 2. In the Concatenate With String input, we'll hit the spacebar to add a space between the data inputs. IH-ETL automatically creates the output column name based on the input columns, but let's rename this output column to something more meaningful, Operating System, and click Apply to save. We'll also do additional concatenations on u_computer_name and u_file_system_name.

Some transform types help maintain CMDB data integrity by cleaning source data using lookups against current CMDB data. Cleanse Hardware Model is one of them, which returns existing or new core company and hardware model records. Let's perform Cleanse Hardware Model on u_make. Our transform type is Cleanse Hardware Model. We need both the make and model for this transform, so in the Input Model Name field we'll search for u_model. We rename the output field and click Apply. The result is a string that has the manufacturer sys_id, manufacturer name, model sys_id, and model name concatenated with a triple pipe. Now that we have these four clean values for a model, we'll split the Cleanse Hardware Model column into four separate clean columns.

Let's say we're done with preparing the data, so we'll click the Mark as Complete
button at the top. In this next step, we'll specify the CMDB classes that the source data will be mapped to. From the source data, we can see that some configuration items are Linux servers, some are Windows servers, and the rest are computers. We'll create a conditional class for this scenario. On the Add Conditional Class modal, we'll create conditions to be met to map the source data to different CMDB classes. First, we'll select the column to filter on, in this case Operating System. If Operating System contains "Windows," then our class will be Windows Server. We'll continue this process for Linux servers, and data that doesn't satisfy these conditions will automatically be mapped to the Computer class. Since all the rows contain a file system, we'll add that as well; since we don't need a condition here, we'll just click Add Class and select File System.

Now that we've specified our CI classes, it's time to map our source data to attributes for each class. Let's start with Windows Server 1 and click the Set Up Mapping button. Here IH-ETL shows the required fields to create a Windows Server CI. You can drag and drop the columns into the required fields, or use the keyboard and list to search for the data columns to map the attributes. We'll map the remaining attributes quickly. You can also add new attributes, which are considered optional. We've completed the mappings for the Linux Server, Computer, and File System classes behind the scenes, so we can now mark this step as complete.

Next, we'll define the relationships between the classes. First, we'll add a relationship. We select the parent, Linux Server 1, then the child, File System 1, and finally select the relationship, in this case Contains::Contained by, and repeat for the remaining computer classes.

Once the transforms, classes, and mappings are done, you can create sample results. This allows you to see whether the data is being pushed to the CMDB as you would expect. IH-ETL displays key metrics for the preview run. It also displays tabs for each CMDB CI class created,
relationships, and error and activity logs. If the results are not what you expect and you want to make revisions, you can click Edit Mapping on any of the CMDB class tabs, or Edit Relationships on the Relationships tab. This automatically rolls back the results and provides a clean slate to re-run the test as needed. Since we're happy with the results, we'll click Retain Data.

Our last step is to schedule the actual import to the CMDB. We'll click Set Import Schedule and then click the Set Schedules button, which opens a new tab. Click New to create a new schedule. We'll give it a name and select Computer as the data source. When done, we'll go back to the browser tab where IH-ETL is and click the Mark as Complete button. After the import runs, you can open the list view of the cmdb_ci table and confirm that all the new CIs are added. This is a brief example of how IntegrationHub ETL manages the import of data to your CMDB, enforcing rules to maintain data integrity while allowing flexibility to test and revise your imports if needed.

The Integration Commons for CMDB store app is the new central management point for integrations into your CMDB. Integration Commons contains a set of transforms and script includes to standardize the values stored in the CMDB by data integrations or changes. Integration Commons for CMDB provides a CMDB Integrations dashboard with a central view of status, processing results, and processing errors for all installed CMDB integrations. You can see metrics for all integration runs, or filter the view to a specific CMDB integration, a specific time duration, or a specific integration run. The CMDB Execution Status tab displays metrics such as the total number of integrations and processed rows, integration runs actively running, daily statistics, and details about the classes that were updated. Click the CMDB Integration Errors tab to see metrics such as the number of import and integration errors and the number of erroneous imported records. You can narrow down the
scope of the integration runs included in the metrics on the dashboard by configuring filters on the right-hand side of the dashboard. Set any of the following filters and then click Apply; the filter settings apply to any metric with a filter icon in its upper-left corner.

CMDB Query Builder provides an interface through which a user can query the CMDB based on CI relationships without having to write scripts. Queries can be saved and scheduled, enabling dynamic reports to be included on dashboards for the IT team and other teams. We already have a few queries built using CMDB Query Builder; let's explore the query "Tickets on servers running PostgreSQL." The CMDB Query Builder consists of a canvas where users can draw the query. The left panel of the canvas includes CI classes that can be dragged onto the canvas, and multiple classes on the canvas can be linked by lines representing relationships. As you can see, this query contains two CI classes and a non-CMDB table, the Task table. It will display all the servers that host PostgreSQL, as well as any tasks, such as incidents, problems, and changes, on those servers. This query can help an IT team that may be planning a patching activity get an overall status of PostgreSQL.

There's an option to create a schedule, which enables the IT team to get email notifications at scheduled dates and times with the results of this query. You also need to create a schedule if you wish to use the dynamic reporting feature of CMDB Query Builder, which we'll explore in a moment. Let's run the query. We can see that the results contain several tasks along with the servers that host PostgreSQL. We will now create a dynamic report to visually represent the breakdown of tickets. We've selected a bar graph for the report type and grouped it by task type. The report is ready, and the IT team may add it to their home page or dashboard.

Let's now explore another type of query, called a service mapping query, which can also be built using CMDB Query
Builder. A service mapping query allows teams to draw a pattern consisting of classes and relationships between those classes, and then returns the services that contain that pattern within their service map. Here we can see the query "Services using PostgreSQL." This query will return all services that have PostgreSQL instances in their service map. This type of query is useful for an IT team to identify the dependencies that an individual CI may have. CMDB Query Builder supports domain separation; this means that a query created in one domain cannot be shared with other domains. So we've shown that the CMDB Query Builder is a great tool for teams to navigate the CMDB for insights that help IT plan for changes and upgrades, without having to write complex scripts.

In this demo, you've seen how ServiceNow helps you easily add CMDB resources while maintaining data integrity with IntegrationHub ETL, monitor and maintain CMDB integrations from the CMDB Integrations dashboard, and build infrastructure and service queries easily to improve decisions with CMDB Query Builder. For more information on CMDB, check out the CMDB solution page at servicenow.com, as well as the additional resources on this demo page.
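To make the data-preparation steps from the demo concrete, here is a minimal sketch in plain Python of what the three transform types do to one source row: Convert to Date, Concatenation with a space, and splitting the triple-pipe output of Cleanse Hardware Model into four clean columns. This is illustrative only, not ServiceNow code; the column names (u_last_inventory_date, u_operating_system_name, and so on) come from the demo, while the date format and sample values are assumptions.

```python
from datetime import datetime

def transform_row(row):
    """Sketch of the demo's IH-ETL transforms on one source row (a dict)."""
    out = dict(row)

    # Convert to Date: parse the source timestamp using a standard date
    # notation (the "%m/%d/%Y" format here is an assumption).
    out["u_last_inventory_date"] = datetime.strptime(
        row["u_last_inventory_date"], "%m/%d/%Y"
    ).date()

    # Concatenation: join OS name and version with a space into a renamed
    # output column, as done for "Operating System" in the demo.
    out["operating_system"] = (
        row["u_operating_system_name"] + " " + row["u_operating_system_version"]
    )

    # Cleanse Hardware Model returns four values joined by a triple pipe;
    # split them into four separate clean columns.
    (
        out["manufacturer_sys_id"],
        out["manufacturer_name"],
        out["model_sys_id"],
        out["model_name"],
    ) = row["cleanse_hardware_model"].split("|||")

    return out

# Hypothetical sample row for illustration.
row = {
    "u_last_inventory_date": "09/23/2020",
    "u_operating_system_name": "Windows Server",
    "u_operating_system_version": "2019",
    "cleanse_hardware_model": "a1b2c3|||Dell Inc.|||d4e5f6|||PowerEdge R740",
}
result = transform_row(row)
```

In the actual product these operations run inside the Robust Transform Engine against every source row before class mapping; the sketch just shows the shape of each transformation.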

View original source

https://www.youtube.com/watch?v=-2DzEGRzLWY