IntegrationHub Data Stream for Import Data Source - Live Coding Happy Hour for 2021-01-15

Import · Jan 16, 2021 · video

Hello and welcome to Live Coding Happy Hour. My name is Brad Tilton and I am excited to be here today on this beautiful Friday, January 15th. We've got a special guest today, and Jared, it is not his first time joining us, but I don't know how many times it has been now. Jared, do you know? Probably five or six, maybe. Oh nice, nice. Well, go ahead and introduce yourself. Yeah, I'm Jared Montz, the newly promoted VP of Engineering at Clear Skye Inc. We put identity, excuse me, identity governance and administration apps into the ServiceNow Store and help our customers manage their risk and IGA processes. Thanks, Jared. How about you, Chuck? Hey, I'm Chuck Tomasi, Senior Developer Advocate at ServiceNow, and I am getting older by the day. I've been with the company for about ten and a half years, oh my gosh, it's coming up on 11, where does the time go? I was a customer for a couple of years before that, and I started tinkering around with computers back in the early 80s, so that's coming up on 40 years. A lot of fun, love every minute of it, and we've got some fun stuff to show you today. All right, and my name is Brad Tilton. I'm a Developer Advocate here with the Developer Program at ServiceNow. I've been developing on the platform since about 2008 and have spent time as a customer, partner, and employee over the last four-ish years. So before we get into it today, we're going to introduce what we'll be drinking, and at the end of the show we will all rate our drinks. We'll go ahead and start in the same order, with Jared. Today I have, once again, fancy sparkling water. Nice, fancy-looking water. I am a little embarrassed, so I'm keeping it covered up, but I think the only thing left is either the grapefruit Radler that I had last week or a Corona Premier, which always makes you feel like you just got done snorkeling. Yeah, that's what they hand you in Mexico, that's right. Well, I have a fun one today. So last week I
drank a Negative Space, it's from one of my favorite breweries, and today I have a Positive Space, which is a spin on the Negative Space where they basically double all of the spices that they put into the Negative Space. It's funny, I've had this sitting in my fridge for about a month and a half, because my brother gave it to me, you can't find it up here, and he told me I had to drink it on the show, and that I had to drink it after I had had a Negative Space the previous week. So I'm kind of proud of myself that I was able to resist for a month and a half, but we'll see how it is. Does the Negative Space balance out the Positive Space, and then you get a Zero Space? I'm not sure. White Space? I don't think it works like that, but that would have been nice. Maybe if you combine them something happens, I don't know, maybe it does, I haven't done that yet. I've got to put that on my list of potential beer names if I ever start a brewery. White Space, there you go. That's going to go all right with my rest IPA, I like that. All right, well, today I'm excited about the show. It's something that I don't have a ton of experience with, but I'm excited to learn about. We're going to work with data stream actions as a data source, and Chuck is going to walk us through that. So Chuck, do you want to describe what we're doing today, and then we can drop into a screen share? Yeah, you bet. The backdrop for this is we're migrating the TechNow data and app off of that instance. This sounds real familiar, because we did something similar, or tried to do something similar, with the community live stream app: bring it into our Artifact Manager application, which is on a different instance. The Artifact Manager already has all the blog entries, all of the Live Coding Happy Hours, that's where the stuff that we manage lives, except for TechNow, which predated a lot of that. So we're bringing it in, and the architecture has changed. Instead of one
record for an episode that contains links to the community and all the YouTube stats and whatnot, we've broken it out into different articles and different record types in Artifact Manager. That being said, I've got a many-to-many table for the attendees. After we get done with one of the webinars for TechNow, we get this spreadsheet from the vendor and we import it into the instance: sheet 1 is your attendees, sheet 2 is the questions and answers, sheet 3 is the survey results, and you kind of get the idea. We get some statistics out of there that we bring into the instance, and then we can start getting some reports, compared to how we did last month, etc., etc. I've already gone through and imported the attendees. That's just the straight first name, last name, email address. I've constructed a name field much like sys_user does, a concatenation of first name and last name. So I bring those in, and those are their own table, it's not in the sys_user table, I made the conscious decision not to do that. These are the webinar attendees, or now they'll be artifact attendees, because we could potentially use them for something else. The other side is the webinar record. So I have, excuse me, I have a many-to-many table, both on the legacy system and the new system, that joins those together, because one person may go to several TechNow episodes. You kind of get the idea: like users and groups, there's that many-to-many table in the middle that joins them. And now we can do even more statistics, like who's been to a webinar in the last six months, or who hasn't and maybe we need to invite them, all kinds of ways you can slice and dice this. Well, the goal is to take advantage of one of the Paris features: you can use a data stream from IntegrationHub as a data source. Normally, when we import things using data sources and import sets, we attach a spreadsheet, or do a JDBC query or something of that nature. Well, now you can make the
source a data stream, which is really freaking cool, because I just love data streams after building a number of them last summer for getting paginated data. That's what we're up to. I did one for the attendees; there were over 11,000 records in the attendees table. We're going to try to keep our eye out for privacy and not expose anybody but ServiceNow email addresses. And I've got the episodes imported, that was yesterday's mission and partly this morning's mission, so that's done. There's a little work left to do on cleaning up the data, but I've got the attendees and I've got the episodes in Artifact Manager. Now we're going to go import the many-to-many. Well, in a many-to-many table there are at least two fields, and they're both sys_ids. The sys_ids are great for the other system; now we need a way to map them into the new system, because it's not an episode anymore, it's an artifact: different table, different sys_id. The attendees all came in with new sys_ids when those records were created. So you can see there's a bit of a crazy hook in here. If I use the name field as the display value and say, hey, just go set the display value to Jared Montz or Brad Tilton or Chuck Tomasi, there may be two Jared Montzes out there with different email addresses; the email is the unique factor. So temporarily I've changed the display value on the TechNow legacy system to the email address, and I'm guaranteed to get unique records out of there. If there are two... I actually saw there was one person that had an underscore in their email address and another with a dot, and I swear it's the same person, because the domain was the same. Maybe they changed the way their corporate email system worked after a while. So we're going to hit things like that. I'm sure it's the same person, and it might not matter in the long run, but I don't want to take that chance with more common names. So let's get started on that. Any questions before we begin? No, I'm excited to see it. Okay, screen share. We've got to get the right screen here. That one,
and go. All right, so I am on the dev program dev instance, and this is my table that I'm after. I've got artifacts and I've got attendees. There's nothing in here right now. If I look at attendees, I've filtered out the 11,000 records so that it's just ServiceNow people: there are 170 people who have attended TechNow over the last couple of years, which is always intriguing and appreciated. So I want to first start by building... well, let me show you what it looks like baked. Let's go to data sources, and underneath System Import Sets, Data Sources, I've got Import Attendees, and here you can see... here's my control thing there... I don't know why it does that; you've got to be careful when you rename things, because it defaults to the first field. You get the focus, but it's always a tad later than you want it to be. Data Stream is my type, and, you know, we've done File in the past, if you've been through any implementation boot camps or anything. And over here it says, great, tell me what data stream action you're going to use. This is an ordinary data stream action that I built. You can also do REST (IntegrationHub), but you can't use any old REST action; in fact, REST is a step, there is no REST action, you have to build it specifically through here. Now, the cool thing is I can just click on this and it'll bring me right to Flow Designer, to that data stream action, and I can tell it's a data stream action when it gets done loading because up here in the upper left it says Data Stream Action. I've got my inputs; there are no inputs, you don't give inputs to data streams, just go. I'm not doing any pre-processing on this, and I do have a little pagination. Now, for communicating with another ServiceNow instance, we've got to use offset and limit, and I'll get into that in just a little bit. But the template you use... you can say, hey, I can use a template, and it will finish this script for me. Only it doesn't quite get
it right, because there's a critical variable called getNextPage, which says "is there still more to do," which is how it does the pagination, and the template doesn't set any of that for you. I got this script off the docs, only it's a picture, so I had to type it myself, which is disappointing in a way. Then it drops into the REST step, and it does your typical REST stuff: do a GET against this table, use the credentials that I told you about, and here's my sysparm_limit, which is set in the variables, and here's my sysparm_offset, also set in the variables. And if you've ever looked at some of these URLs... we should do this sometime, just take apart a URL and go through all the sysparm parameters on it. I'm also using a sysparm_query, but instead of using it as a filter I'm using it as an order. I say order by name, so that I know what order these records are coming in. I get all the people by name, and I know when I'm done when I hit the end; if I'm still on G, it's not done. There's a splitter step that says, how do you want me to parse up the JSON that comes in, to know one record from the next? That could be different depending on who your source is; fortunately, for ServiceNow it's real easy. Then the script parser step, which takes that JSON apart into its individual records and tells me how I want to represent them. And finally, of course, our output: what does one of those target records look like? This is what you look like, and you'd better have attributes that match these names. Let's go build one of those, because we already know this one works, but we'll keep this one up as a reference, because I want to copy that script. So this one that you just showed us, what is it importing? This is the one that's attached to the data source that imported the attendees. Oh, and once it's attached in the data source, then you just go build yourself a regular old transform map, and you get all the benefits that transforms give you, like coalescing and whether I should run business rules or
not. This one is pretty simple: I've got email mapped to email, first name, last name, coalescing on email. Nothing real surprising there. Nice. So, I have not done this yet, so this is all a great mystery, but I thought about that display value thing this morning. Let's start with a data stream, and I'm going to call this Import TechNow Episode Attendees, because I already did the attendee information, that was the simple table; this is the many-to-many. Artifact Manager, blah blah blah, looks good to me, and it goes and fleshes this out. So it put in inputs; again, I have no inputs to this one. I'm not going to be doing any pre-processing scripts, so let's jump right into the request. How will you get data? REST step. Enable pagination? Most definitely; there are 16,998 records. Two more people and it'd be an even 17K. Run a script before each request? I don't think so, at this point. And as you see, when you check these boxes, it just turns on extra steps in this template, which is really handy. The pagination: here's where this template thing comes in handy. I wish there was more than one kind of template; obviously this is geared drastically towards ServiceNow. But it doesn't fill in all of the parameters; there's nothing in here about getNextPage. Again, if you go to the data source page in the docs and look at pagination, you'll see that it gives you a script. So I'm going to copy it, and let me digest that for you. Okay, it says change the limit above to configure the results per page. First we turn things into integers: assume these variables are like many other JavaScript things, and you're probably not getting the data type you want, so I run them through parseInt. I get the offset and the limit. This is to say: start at zero and get me 10 records, or 100 records; start at 10 and get me another 10.
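The limit/offset paging just described amounts to requesting successive windows of the table. A minimal sketch, with a placeholder instance host and table name (not the real ones from the episode):

```javascript
// Successive Table API page requests under limit/offset paging.
// The host and table name below are placeholders for illustration.
const base = "https://example.service-now.com/api/now/table/u_m2m_episode_attendee";

function pageUrl(offset, limit) {
  const p = new URLSearchParams({
    sysparm_limit: String(limit),    // page size
    sysparm_offset: String(offset),  // where this page starts
    sysparm_query: "ORDERBYname"     // stable order, so pages don't shuffle
  });
  return base + "?" + p.toString();
}
// pageUrl(0, 100) asks for the first 100 records; pageUrl(100, 100) for the next 100.
```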
That's what tells it where to start. Now, the total count actually comes out of... and I was wondering this when I was playing around yesterday: what is this pageResponse thing? pageResponse gives you access to things like the response headers. When you go into the REST API Explorer and do a test request, you see all these things: how long it took, how many records there were, who the signing certificate authority is, lots of headers. You can get at those headers from in here, and the important one that I want is the one called X-Total-Count, which is going to say 16,998 in my case. Well, now I know where I am, where I need to be, and where my next offset will be, so I can say, are we done yet? That's really what that line is saying. If we aren't done yet, then get another page. This is the critical getNextPage thing, and probably the biggest thing you'll run into when you do pagination is "I either got one page or I got an infinite loop," and most likely it's because of the way you're handling variables.getNextPage. That one cost me days last summer, but now I have the utmost respect for it. If it's true, then I go update the offset and bump it to the next page, so the next time I come around it's right. And then, for whatever reason, the example on the page turns it back into a string, which I thought was interesting. Yeah, for the next offset, let's say we've got 16,998 records and I'm going by 100, and it says your next offset is 17,000.
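The page-advance logic just walked through can be pulled out into a standalone function. In the real action, offset and limit are data stream variables and the total comes from the X-Total-Count header via pageResponse; this sketch simply takes them as arguments.

```javascript
// Decide whether the data stream should fetch another page, mirroring
// the docs' pagination template: parseInt the (possibly string-typed)
// variables, compare the next offset against the total record count,
// and hand the offset back as a string, like the template does.
function nextPage(offset, limit, totalCount) {
  offset = parseInt(offset, 10);
  limit = parseInt(limit, 10);
  const nextOffset = offset + limit;
  return {
    getNextPage: nextOffset < totalCount, // false here is what stops the stream
    offset: String(nextOffset)
  };
}
// With 16,998 records and pages of 100: at offset 16,900 the next offset
// is 17,000, getNextPage goes false, and the stream stops.
```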
Well, obviously this is now going to evaluate to false and say, don't get any more. And this is really what drives the pagination. If I were in Flow Designer and I were consuming this, I'd just drop in my data stream action, and it kind of makes that action like a For Each, so there's a built-in branch under it, and I can say, great, for every record... and it just goes through one, two, three, four, five, six, seven, eight, nine, oh hey, it's time to get another page of information. Behind the scenes the data stream does this chunking for you, but out in front it just looks like a record, then another record, kind of like a GlideRecord query does. But that's the pagination setup. I lost my cursor on some other screen, in another county. In this room... let's go get the REST step. Pretty easy. Yeah, to clarify on that: we are not going through a MID Server for this example, but if you do go through a MID Server, I think there are some fun things to do with pagination as well. Oh, do you have an example or use case? I haven't used this in production or anything like that, but as I understand it, normally when you're paginating through a MID Server, it goes from the instance to the MID Server to your destination and back up to the instance; you increment, and it sends another thing through the ECC Queue down to the MID Server. With this, there's a way to configure it, again as I understand the release notes, where you send one command down to the MID Server, the MID Server then does a hundred pages to the resource, compiles your JSON or whatever, and sends that back up to the instance for processing, so you save a whole lot of round trips. Oh nice, well, that makes a whole lot of sense, rather than one record at a time. Okay, let's use the appropriate credentials. I already set this up; this is basic auth from instance to instance. You can use whatever you want, but somewhere in here I have Artifact Manager TechNow credentials. That looks right. Chuck, can
you pull the right part of your browser over a little bit? We can see to about the right side of the Properties button right now. You tell me when. That looks good, yeah, I think that's good. With my drop-downs and whatnot, I'll lose my data pill picker if I go any further. Okay, so my variables are up here on the data panel, and I've got TechNow as my instance. I'm going to configure this manually. The resource path is now going to be the Table API, and I honestly don't remember what the structure was for the Table API, so I'm going to cheat and go over here. You can get this out of the REST API Explorer as well; in fact, I often test there first. The resource path, instead of this, is going to be my m2m episode attendees table. That one is plural; I spelled it completely wrong, didn't I? It might be "episodes," come to think of it. Okay, now I have to go look, darn it. I hate it when you don't remember names. I usually name my many-to-manys very consistently, but this one predates my standardization. What do you go with first, out of the two tables? I go with the top table, whatever the main record is, and then the related one I want to see. And I guess that's arbitrary, because it's many-to-many... it's whichever side I'm going to be using it from most. Like on the Jeopardy game, I have a game record with players, so it was game_player, singular, because it's one record per row. Let's see what this one is: let's go to any old episode that has some attendees. Phil has a good question: he says, do you use plurals in the m2m table names or not? I don't anymore; at least, I try not to. I try never to use plurals in table names. Yeah, agreed. Oops, so much for my privacy, but here you can actually... let's go to Studio, might be a little more privacy there. There we go... TechNow... oh darn it, wrong instance. Let's go to the TechNow dev instance. You can't open Studio on production, by the way.
Yeah, if it was, it was pushed by the repo last time. All right, try that again. Loading Studio... and my source table is called... what app am I in? I'm in Jeopardy; switch to TechNow. Tables: episode attendees. That is not the way I would name it if I were to do it again today. The only tables that I'm aware of in ServiceNow that are plural are sys_properties and the PA tables, and why the PA people didn't read the manual before they got started, I don't know. But I make it a general practice not to make plural table names, because it's already going to pluralize the table label for you, and once you're off that standardization it gets really confusing, like, was this one a plural or not? So this is x_orb_technow_m2m_attendees_episode... okay, this really predates my standardization, and I'm glad I copied and pasted it, because I wasn't even close. All right, query parameters: we know we need sysparm_limit, to tell it how many we'll get, and we'll feed it from the variable. And we know we need, and this is going to vary from API to API, by the way, how you do the pagination and how you get a group of records, sysparm_offset. When you do YouTube, you just pass it a page token. I think there is a limit on there, though: you say I want 50 records, and their API says you can get up to 50 records. Limit-and-offset, and then the cursor-based approach that you just described, are the big two, so once you figure those out you can integrate with just about anything. Yep. And obviously sysparm_limit and sysparm_offset are ServiceNow's names; another API's query parameters could be called anything, that's up to them. There is one other one that I want to put in here, for privacy, just for this test: I'm going to put in a sysparm_query that mimics this filter that I have... not that one, this one... so I just get the records where email contains "servicenow." For privacy. I'll take it out later, honest, I swear I will, I'll remember. We believe you; how many times have you said that, right? And these are pretty small records, so I'm
going to get pages of 100. Something you may want to keep in mind, that I discovered yesterday... last night... is when you go through and do this parsing, there's a built-in JSON parser that does this for you, and if your payload is greater than 16,384 bytes, which many of you will recognize as 2 to the 14th, 16K... that's as big a record as it can handle. I was importing the actual TechNow records, which have blog articles in a field, and one record exceeds that, let alone a page of records. There is a system property you can change to extend how big a payload that JSON parser will handle, and with that my page size, my limit, is one, but I can get these records, and it's totally transparent to Flow Designer. Okay, so keep that in mind. Are we finished? We said we'll get 100 records, we've got our REST step. I think it would also be good practice to put in Content-Type application/json, and Accept. I think that's the default, but I always feel better putting them in the headers, to say here's what I'm sending and here's what I'll take. All right, then we get down to the parsing. How will you identify each record? I'll use the JSON/XML splitter, of course. Great, how will you parse them? With the script parser, because I don't have much of a choice with that. I assume at some point the XML parser or the JSON parser will show up in here... I don't know, what if you said I don't want to? It's a mandatory field; you only have one choice. Why... you can tell there's something on the roadmap. Source format: JSON. Item path: if I were doing XML, I would give it an XPath, slash this, slash that, to get down to the array of stuff. Now, I don't have my sample payload on here, but if you've ever looked at... oh, the heck with it, let's just go get a sample payload. I want to go back to the REST API Explorer, because it's always fun to understand what it is you're getting. Waiting for those tables to load, and
let's get x_orb_technow_m2m... attendees, there we go. We'll do the sysparm_query that I just had; fortunately it was still in the copy-paste buffer. We'll just get one record, to see what the payload looks like, and we don't really need a limit or offset on this. But what I do want is this display value; that's the other one I should set. I must set it, otherwise you're just going to get sys_ids back. In fact, let's leave it off, just to see what happens if we forget it, and go... send... get me a record from this table. Here are those headers that I've got to look at, and total count was zero. That wasn't any fun. Oh, you know why? Because there is no email field; it's going to be attendee.email. Ah, okay. I still got zero; let's take the filter out for a moment, and I'm probably just going to get a sys_id... send... I got one, I got one, okay. And here's what you would normally get. This is the many-to-many record: my attendee field has a link, because it's a reference field, and the value is a lovely sys_id, and none of that does any good on the target system. Same thing with the episode: there's no episode table over there, and this sys_id is completely bogus. So what I would be better off doing is getting specific information about this; the display values would really help. And to make sure that I don't expose some private information... I really wish I had done that in a pop-out window. Let's go and get... here... the table I want, just the table name, episodes. How are we doing on time? We're doing good. I'll come back to the REST API Explorer... that, dot, list... we can either do a privacy screen or I can tear this off and put it on another screen; let's do that instead. Okay, you stare at that Studio picture for a while while I figure out... there's only one record on here, that's not right. Oh, I'm in dev. What, I only have one record? Okay, that was one record and only one record. Why do I only have one record in that table? Because we're pulling from production. I
mean, this is dev. Don't rely on your dev data; how's that for a testimonial? Let's go back to production, because it will have the data that I'm really after, and again, I'm going to take this and put it on another screen, then do a privacy filter so I just have the ServiceNow people in there. Again: list, filter, show related fields, email. So Brad, did we mention that if you don't see this data stream option on your PDI, you might not have IntegrationHub Enterprise activated? Can you mention that? That's a great call-out. When I was first looking for this, I ended up going back to the documentation, back to that request IntegrationHub document that lists everything out. In the spirit of live coding and celebrating our missteps as learning experiences, a related story: I was in our company's development instance, I had, through HI, activated IntegrationHub Enterprise, and I was happily building my flow, my IntegrationHub stuff, and I went to use the JSON Parser and it wasn't there. I looked around and couldn't find it; the XML Parser was there, Data Stream and everything for Enterprise was there, except the JSON Parser, and I couldn't figure it out. I got frustrated and put in a HI ticket, and the person from HI politely says, your dev instance is on Orlando, and JSON Parser is not available until Paris. So it's not just the occasional JavaScript mistake or Studio mistake; it's even documentation mistakes and HI ticket mistakes that even the veterans make occasionally. Missing JSON Parser, does that sound familiar, Brad? We had one of those on live coding earlier this year, when we all thought it came out in Orlando and tried to find it. We're like, wait a minute, yeah, it didn't come out yet. My Slack message last night said, wait, dev program dev doesn't have a JSON Parser, what happened? I missed that episode, and I learned the hard way, so yeah, that'll teach you to miss an episode. No, that's a good point.
I mean, lots of times it's a plugin not activated, or, if you're working with a bunch of different instances, they're not on the same version, all that stuff. Oh, they can have it... I think I just put away the wrong table. I lost track of where I was. Where was I? I was on the item path. What I wanted was the REST API Explorer with this query, to see the payload. Yes, that's what it was. So, REST API Explorer on my TechNow instance; I want to see the legacy data according to the legacy system. Loading tables... x_orb_technow_m2m_attendees_episodes, okay. Let's put in our sysparm_query with "servicenow," and I should get one record. That looks like it... where's my total? Total count 191, okay, but it only gives you one, because there's an implied sysparm_limit. Okay, it's in a result array: I've got my top-level object, I've got my result, and I've got those crazy sys_ids that aren't going to do anybody any good. So what I want to do is go up here and set sysparm_display_value. You can either return the actual values, which are sys_ids, with false; get the display values with true; or both, by saying "all." Now, "all" is great for reference fields, but it makes every other field an object too. Yeah, like a choice field: here's your display value, here's a string, it's the same value and display value, but I'm going to put it in an object for you. So I guess there may be a case where you want to go through the properties in the object and get all the display values; that would guarantee everything has an attribute called display_value. All right, so if I put in "all," I get sys_id, display_value, and value. That's helpful; let me zoom in a little bit on that. Nice, huh? When would you ever use that? Again, maybe if you wanted to just specify a list of fields and have everything guaranteed to have a display_value attribute inside it.
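The three shapes just compared can be put side by side. These sample records are made up, but follow the Table API's sysparm_display_value behavior; the small helper shows the "everything has a display_value" idea.

```javascript
// One reference field, three sysparm_display_value settings:
const asFalse = { attendee: "6d5f0a1b2c3d..." };  // raw sys_id string
const asTrue  = { attendee: { display_value: "user@example.com",
                              link: "https://example.service-now.com/..." } };
const asAll   = { attendee: { display_value: "user@example.com",
                              value: "6d5f0a1b2c3d..." } };

// Recover something usable regardless of which shape came back:
function displayOf(field) {
  return (field && typeof field === "object") ? field.display_value : field;
}
```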
I could see that. But here, the attendee display value is the email address, and the other stuff isn't going to help me on the other side. When you get to the episode, here's the title, and here's the stuff that's not going to be any good on the other side. That's if I said "all." Now, if I set it to just true (it defaults to false, by the way), and try again, I now have objects for just the reference fields, with the display value and their link. I mean, you could always tease out the sys_id if you needed it for that reference, but it's not going to do any good on the other system. So think about that when you're interchanging information: what does the other system know, and what do they have to work with? But given this, I can now determine whether a record exists or not. So the structure is noted; that's a long way to go to get our little parser going. The item path is $.result; that's what you put in there to say where the array of stuff can be found. Now, the script parser step has two parts to it: inputs and outputs. The inputs has an implicit source_item, so source_item is built in, and target_object is the implied output. So if I... whoa, not sure where I went on that... go to the bottom of the script. The fastest way I know to do this is to say outputs... actually, I have not had a lot of good luck when I assign directly to outputs, so I'm going to say var obj = JSON.parse(inputs.source_item): just take the whole thing you got back in that payload and parse it for me. And then outputs.target_object = obj. It might be me, it might just be a bad experience I had way back, oh, six months ago, but this also lets me introspect it. If I wanted to, I could do a stringify, ask "did I really get it," put it in the log, or do something else. Yeah, just assigning it assumes it's going to work the first time, too, so I would never do that. I'm going to assume it's not going to
work, and I'm going to need to debug it. So: I've made my request, I've split it up, I've parsed it; now I've got to assign it to the outputs. And my output is going to be... naming the variable is always the worst part, right? Attendee... artifact... it's one object, you're going to get them one at a time, and it's going to be called artifact attendee. Now here's where it gets fun. We're going to define an object, and, you know what, I don't want to do the simplified one, because it's not a flat object. If it were, I could do this real easy, like if the attendee just had a name, a first name, a last name, and an email, very flat; these reference fields kind of muck this up a little bit. So first I want to parse it, then assign these explicitly to outputs.target_object... it's already an object, so I don't need to define anything, I can just start populating it. I have attendee_email = obj.attendee... right? Is that what I got out of the payload? Let's go look; this is where it's really handy to have the REST API Explorer: attendee.display_value. Keep this sample payload with you at all times until you're done. attendee.display_value, and then... are we looking at target_object.attendee_email? Yes, thank you. Ah, I had two equal signs in there; that would have been fun. obj.attendee... thank you, Jared. And then the other one is going to be the artifact title: obj.episode.display_value? We don't know, so we're going to go look... episode.display_value, okay, good. Sometimes you have to construct these things yourself, and sometimes you get lucky and can just parse the input object straight across. That's going to be my output. Now, when you define these, go back to edit outputs: I've got my object, and these variables must match the ones in your script. You can label them anything you want, call it Attendee, but this part has to be attendee_email, and that is going to be a string. This one is labeled artifact title, and it
must match artifact_title. All right, what do you think? Think we have half a chance? Go test it. We're so eager to hit that big Test button. Will it run? Really run it. It ran, it did something, and I think I get one page of 20 records. I don't see any output: total item count zero, page count one. All right, what happened? Where is our request... step configuration? Step configuration... that's the splitter step. Where's my REST... did we get an output? Results, body... oh yeah, result: nothing. Where's my... can I see the request? Request payload size 557 bytes, response payload 1160. What is this? I want to see what I tried sending. There we go, it went to /api... it's not a permissions thing, because I'm admin on the other system and it's using my credentials. All right, well, we got a 200, so it didn't bomb out too bad, but we got no response payload. What was in the response header? Did I have zero as the total record count? Response header... you read that, I can't. Throw it in a JSON beautifier. Total count... server transaction ID... thank you, there it is: count zero. Okay, that didn't do us any good. What happened? It wasn't an error like we got the wrong table name. So, query parameters: sysparm_limit 100, fine; sysparm_offset zero, fine; sysparm_query, email... oh, I spelled it wrong. That's right, it was... was it attendee? You caught that earlier and you were... [Laughter] No, I said, is that what I promised? I've got to change it because it doesn't work. You'll update it immediately. Let's get that attendee.email in there, so go back to our REST step, put in our sysparm_query there. Yeah, cursor out somewhere else so I can get a Save button. Now we'll get some data. We've got to get this rocking and rolling, because we still need to put it into the data source, and there are only 12 minutes left. I can do that, I can do that; it's all going to come down to that transform anyway. Now do we get something? We got an object, total item count 20.
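The output assignment being written and debugged above can be modeled in plain JavaScript. This is a sketch outside ServiceNow: `obj` is a hand-built stand-in for one record parsed out of the Table API payload, and `target_object` is just an ordinary object here, standing in for the one the Data Stream script step gives you. The field names `attendee`, `episode`, `attendee_email`, and `artifact_title` follow the episode.

```javascript
// Hypothetical parsed record, shaped the way a Table API reference
// field comes back when display values are requested (sys_ids made up).
const obj = {
  attendee: { display_value: "aaron@example.com", value: "a1b2c3d4" },
  episode:  { display_value: "Extend IntegrationHub with Custom Spokes", value: "e5f6a7b8" }
};

// In the Data Stream script step, target_object is provided for you;
// here we just build a plain object to show the assignments.
const target_object = {};
target_object.attendee_email = obj.attendee.display_value; // one "=" only, not "=="
target_object.artifact_title = obj.episode.display_value;

console.log(target_object.attendee_email); // "aaron@example.com"
console.log(target_object.artifact_title); // "Extend IntegrationHub with Custom Spokes"
```

As in the video, the variable names on the left must match the output variable names defined in Edit Outputs exactly; only the labels are free-form.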
Oh no, it's got to be something in that script parser that I'm doing. What was empty there? Did it have an error or anything in the parser? I don't think so. We're going to look at the execution details again. You want to see the page details? Yeah, let's go look... any errors? So there's stuff in the results, the response body looks good, and it knew there were 20 records in there, because that's what it should do, get a sample size of that. attendee... display... we don't have... yeah, we don't have display_value here. What we have is a link. You know why? Because we never put it into the request. We did it in the REST API Explorer; we never put it in here to say display value. Now, is it just display true? I think it's just this part, display, right? It's like... oh no, I'm way wrong. I was with Brad on that one; it didn't look quite right. No, it doesn't look quite right. Test run. These are the things you'll trip up on, people, and I'm doing this a little off kilter because I've never done a many-to-many table and outputs. Look at that! Hey, all right, ServiceNow people in here. Now that we know this is working and it will give us some records, a hundred at a time, so we'll get a whole two pages for this query, let's go back to our data source and make a new data source. So, System Import Sets, Data Sources, New. TechNow... come on... Episode Attendees, close enough. Import set table name: import episode... let's name it like the other one just so I don't go insane. Tip for import set table labeling: I always include "imp" or the word "import", because I have definitely named tables the same as the target, and that doesn't go well. I don't even give them good labels. [Laughter] I want them to jump out at me and go, hey. All right, now we can pick the data stream, and if we look in here we should have Import TechNow Episode, that's my other data stream. I should have named the action with a DS. I can relabel it, I suppose. Heck no. Attendees... save.
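The two bugs chased in this stretch, the misspelled dot-walked `sysparm_query` and the missing display-value setting on the request, can be seen in a plain sketch. The instance, table, and query values below are made up for illustration; the payload shapes follow the Table API's documented behavior for `sysparm_display_value`:

```javascript
// Building the kind of Table API request the Data Stream action sends.
const params = new URLSearchParams({
  sysparm_query: "attendee.emailLIKEservicenow", // dot-walked; "attendee" alone silently matched nothing
  sysparm_display_value: "all",                  // without this, reference fields carry no display_value
  sysparm_limit: "100",                          // page size; the action pages onward via the offset
  sysparm_offset: "0"
});
const url = "https://example.service-now.com/api/now/table/x_example_m2m_attendee?" + params;

// How the same reference field serializes under each display-value setting:
const asDefault = { attendee: { value: "a1b2c3d4" } };   // sys_id only (plus a link URL in real payloads)
const asTrue    = { attendee: "aaron@example.com" };     // display string only
const asAll     = { attendee: { value: "a1b2c3d4", display_value: "aaron@example.com" } };

console.log(asDefault.attendee.display_value); // undefined: the empty-outputs symptom from the episode
console.log(asAll.attendee.display_value);     // "aaron@example.com"
```

Parsing code written against one shape quietly returns nothing on another, which is why keeping the sample payload from the REST API Explorer next to you, as suggested earlier, pays off.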
Where is it? Where is it? I only see Episodes... do you have to publish it? I do have to publish it, that is a very good point. If you ever doubt yourself, you can always do the good old show-the-reference-qualifier and look: published equals true. There we go, good call. Yeah, I have so much faith in it that I'm going to... okay, one more time. And I thought we might have some spare time to go chasing... I was thinking about that. Always when you think this is going to be quick... never, never. Okay, well, relative to the old world of doing it all in scripts and scheduled jobs, if we have time to spare going this way... it's true, we are a little spoiled with Flow Designer and IntegrationHub. I can do that in 45 minutes. When I was doing the episode import, and each one of those mapped to two or three different records on the target system, man, that is so much easier to do in Flow Designer than with onBefore and onAfter scripts in the transform map; it was just butt ugly over there. Okay, let's import 20 test records just to make sure. Running, running, running, running... thank you, Chrome, but I'll remember that password, I'm sure. I got 20 inserts, and let's go create a transform map now that it knows my import set structure. Call this TechNow Episode Attendees; the target table is going to be x_... the... artifact... yeah, the m2m artifact attendee table, that's where we go. We don't need to run business rules, there's nothing to be gained by that here. Save. Now here's the fun part: how to do the field mapping. So what we have is artifact_title and attendee_email, and all we have over here is attendee and artifact, no other way around. Yep, let's not do that, that would be bad. I've never taken two fields out at the same time. Now these have to be driven by display value, because remember, I'm not getting a sys_id here. Do I want to coalesce? Logic would tell me that they're already going to be unique. I need to make sure that my display value on this target table is also
the email address. So forgive me a moment while I jump into Studio and look at... why do I have a table named Episode Attendees? Whatever... oh, those must be my... yeah, okay, threw me off on that one. Attendee is my many-to-many... after you create your many-to-many... oh look, it's already in front of my face, how convenient. Who's the display value on... oh, I need the display value on the attendee table, not on the many-to-many. This one has email. Good, good, good. And then I think artifact... the title, is that the... tell me, it's your app. I'm pretty sure it is. I think it is, but I think we have numbers too. We do, and for whatever reason it's not generating numbers on this instance, but that's a problem for another time. Not number; title. Title is decent, yeah. Okay, I'm ready... except when you do a test load, watch this, you do a transform, it does success... no it doesn't do success. It doesn't do success because... go ahead, Jared, you had something to say. I did want to take another look at that coalesce value and confirm that we don't want to set both to true. Yeah, I think it would either be both or none, right? But if we set them both to true, then you will only ever have one many-to-many record... no, it means both have to match. Yeah, yeah, and that's what we want: on every import we don't want to keep creating records, we want to find it and update it. It's a good point. But what was the rationale, you don't want to set both to true? No, I do want both set to true. Oh, right, right, okay. And I was saying either both or none; you don't want to set just one to true on a many-to-many. Yeah, no, okay, I get that totally. I'm probably going to run this import like a thousand times before I actually make it to production. Let's go back to our data source. Data source... we're almost there, we're almost there. At least we're almost to a point where we can admit
we're a success or a failure. Let's load all records, and it loaded 191. That's good. Let's run the transform on our latest import set. Run that... and it doesn't tell you on this one, so let's go to the transform history: 191 inserts. Oh, sweetness, right at the end of the show, look at that. Nice. Just so I am certain: what is one of the episodes? Extend IntegrationHub with Custom Spokes. Let's see how many ServiceNow people attended that one. So, my related list of attendees: three ServiceNow people were on here, and Aaron... if you go to Aaron's record... oh, I never put the related list on, so let's put the related list on now. You can see how many artifacts he's associated with: one. He only cares about IntegrationHub for now. Well then he should be watching this video. Okay, so that was the fun of importing a many-to-many table, which I'd never done before, using a data source. Any final comments? I've got to remember to remove that sysparm_query. [Laughter] That's right, the 1000, whatever it is. All right, you want to drop us out of screen share here? Anything in the chat we need to know about? I don't think so, I think people enjoyed it. Yeah, we had a scheduled job question earlier: you could do a scheduled job data source, and the standard stuff really applies after that. Anything in that transform with the onBefores and onAfters and onStart, onComplete, all of that is available in the integration series, up to episode 15, which released on December 1st. We forgot to do our plugs; are we doing that at the end? We're going to do it after we rate. I just plugged mine, and that's fine. It's available on the ServiceNow Developer channel, on the playlist Learn Integrations on the Now Platform. There are more coming out: I've got three more on email coming up, and then I go into web services, so I'm sort of in this limbo. I've got hardware on order so that I can finish the video production on the next chapter, so stick with me, it's coming. While I'm waiting for the equipment, I write scripts.
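The coalesce decision Jared and Chuck settled on above, coalescing on both fields of the many-to-many, can be modeled in a few lines. This is a plain-JavaScript sketch of the transform-map behavior, not ServiceNow API; the field names follow the episode's import set:

```javascript
// An m2m record only matches when BOTH coalesce fields match;
// a match updates in place, anything else inserts a new record.
const existing = [
  { attendee_email: "aaron@example.com", artifact_title: "Extend IntegrationHub with Custom Spokes" }
];

function upsert(records, row) {
  const match = records.find(r =>
    r.attendee_email === row.attendee_email &&
    r.artifact_title === row.artifact_title);
  if (match) {
    Object.assign(match, row); // coalesce hit: update the existing record
  } else {
    records.push(row);         // no hit: insert a new m2m record
  }
}

// Re-importing the same pairing does not duplicate it...
upsert(existing, { attendee_email: "aaron@example.com", artifact_title: "Extend IntegrationHub with Custom Spokes" });
console.log(existing.length); // 1

// ...while a new attendee/artifact pairing is inserted.
upsert(existing, { attendee_email: "aaron@example.com", artifact_title: "Email Notifications" });
console.log(existing.length); // 2
```

This is also why coalescing on only one of the two fields would be wrong: every attendee (or every artifact) would collapse onto a single m2m record, exactly the "only ever have one record" trap discussed in the video.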
All right, well, let's rate our drinks. Jared, we'll start with you, what do you give it? Fancy sparkling water: I'm going to go with a four again. I've got my... it's Corona Premium, or Special, or Premier... Corona Premier: 3.75, with the quarter-point success factor. Nice. I've got my Positive Space, which is a spin on my Negative Space from last week, and I'm going to give it a 4.75, it was excellent. Oh, awesome. You did 4.75 last week, if I remember. I did. Well, they're pretty similar beers, so yeah. All right, well, quick reminder: January 26th, 8 a.m. Pacific time is TechNow. We will be covering the Quebec features. I don't have the link for the registration page at my fingertips this time, but I believe we put it in the chat last time. If you really want it, I'll put it in the comments on the video, so if you're looking for it, look in the comments, not the chat. That's going to be the Quebec release, so tons of good stuff coming out, you don't want to miss that one. Last week we did an interview with Paige Duffy; that will be coming out on Break Point next Wednesday, which I believe is the 19th... no, 20th... the 20th of January. And then we're going to be talking to Andrew Barnes right after that about Quebec features, and getting some of his insights on what's hot and what's not, his favorite stuff, and where we're going to take things. So lots and lots more great content you don't want to miss, but the big one is Quebec on the 26th, so mark your calendars, for whatever time zone that's in, 8 a.m. Pacific time on the 26th, for Quebec platform features. Yeah, and we may be doing something Quebec-related on next week's show, so stay tuned for that, we'll have to see. Jared, anything you want to plug? Nothing actively, but I'll see you out on the sndevs Slack if you want to catch up around midweek. PDIs: you'll be able to upgrade them to Quebec starting Thursday the 21st, I
think. That was the last thing I heard... I don't know that we can commit to that day, but I think that's probably about right. Safe harbor, safe harbor. Safe harbor Thursday or Friday: look for that option so you can start upgrading your PDI to Quebec. Last I checked it took about an hour to upgrade for the previous releases, so then you can watch the TechNow and understand what you got. It's like getting a package from Amazon and going, I'm not going to open it for a couple of days. Thank you, Brad. Thank you, Jared. All right, well, thanks everybody, thanks everybody who joined us in the chat, and we will talk to everyone next week. Bye guys, take care. Bye. [Music]

View original source

https://www.youtube.com/watch?v=DEe0N7IGzXM