Demo - Sweagle Product Overview (configuration element management)
So this is a demo of Sweagle and an introduction to its core functionality and capabilities. In case you haven't seen the intro presentation, this is just a quick reminder of some of the things that Sweagle does. Imagine you've got configuration data that sits in a series of engineering and development tools, in infrastructure tools, all over the estate. What we want to do is collect that and consolidate it into a graph-based data model, so it is very good at automatically consolidating those things into an understandable format. It applies version control, audit control and security, and allows you to export to lots of different targets. As part of that there's a validation engine that is essentially constantly watching that configuration, making sure that the data stored in Git is of as high a quality as the data being used for Terraform or Azure, or as part of the ServiceNow CMDB, or just in plain files; there's a huge array of different data types and places they can be used. Then you've got a series of tools that want to go and consume that data: configuration tools, pipeline tools, application releases, be that Azure DevOps, be that Jenkins, be that your own custom configurations or some manual scripting you've been doing, whatever it might be. Sweagle sits in the middle, drawing in data from one tool and providing it to another, on demand, all driven through APIs, and this demo is just going to show some of the core functionality there.

Okay, so to follow on from the slides: with Sweagle we have the concept of data coming in, and we track all of those things as part of this Incoming tab; a data model, where we turn all of those files and data types into a structured format; and then a library of assets that you can go and use, deploy, manipulate and work with. If you think of the
Incoming tab as collecting all of the data from lots of different places, Jenkins, GitHub, lots of different repositories, maybe some databases, CMDBs, that type of stuff, it's essentially all being brought into Sweagle. Sometimes data is being added, sometimes it's being removed, depending on what changes people are making, and what we do is build a data model out of that. The reason we're building this data model is that we want to simplify the process around configuration data, apply some structure to it, and make it nice and easy to understand what's going on. So in this example, El Dorado is my application, and my application has a series of components that sit behind it. There might be different UIs, and depending on what version they are I have different settings; I'm storing Spring settings, secrets. I've got some microservices, dashboards, token services; whatever my application component settings are, that's where I'm storing these types of things. Then I've got a series of environments doing very similar things: dev, test, UAT and prod, and the settings being stored depend on the environment. Those might come from a mixture of different places, so we could have some stuff stored in Git, some stuff stored in Terraform, other things as part of a CMDB, but it's all amalgamated into one view as part of this data model. Then there's the infrastructure it's running on: if we're doing some infrastructure-as-code work, I might have some Terraform data stored here, I may have some Ansible, data from lots of different places that makes up my infrastructure, from things as simple as Windows Server or Linux server settings through to actual configurable items. I've got my integrations, so in this case I'm primarily using Ansible, but as an organization maybe we've got access to a few different types of
integrations that we need to provide data for. And then releases: the releases that we're going through. What we're doing is applying a level of structure and standardization to this data, so we store things as key-value pairs within this data model. The data model is typically automatically generated, so we'll try to automate its structure; there's a one-time process of making sure the data is in the right format and that we're doing the right kind of things with it, but generally the data model just gets built based on the files, and you can then apply a series of reworks, applying standardization and simplification to these types of things. In this case, with environments, what we're trying to do is make sure we're avoiding drift when it comes to the configuration settings: the data that exists in test potentially shouldn't ever end up in prod, depending on certain settings, certain allowances and so on, and Sweagle gives you that ability to separate, to segregate, but equally to share if you need to. Depending on your user roles, what you've got access to determines what you can do. What we're storing is essentially key-value pairs: these could have come in as a JSON file, an XML file, a properties file, or something completely different, but everything gets stored in the same way, and then we can manipulate this data to get it out in lots of different formats depending on who needs to use it. So you're not rebuilding the same assets lots of times; you're giving yourself different ways to use the same data depending on what's going on. As part of this we might be including things like infrastructure settings, load balancers, firewalls, a bit of cloud stuff, maybe some legacy data as well; the list is almost endless, but it all comes into the data model, it all gets loaded in
there, and we apply a series of structures to it. Once something's in the data model we start to track its history, so over time I can see that originally about 30 items were loaded as part of my test environment and it's grown: releases, new components, new infrastructure; stuff gets added, stuff gets removed, and Sweagle tracks the full history of what's going on there. At any point in time, if you decide you want to see what happened between version 47 and 48, we can just come in and understand what's been added and deleted: a password was added, a last name was added, and the first name was changed from "test" to "me". It's very easy to understand that change, and why that's so significant is that if things stop working at ten past two, or things start working again at two o'clock, I've at least got the ability to come and see what changed; equally, if something was working and then stopped, it shows me what's changed, what the differences are in this data. We also give a really nice graphical way of understanding what is currently deployed and what's assigned to a particular environment. You could look from an environment's perspective or from an application's perspective and understand what the assigned release is, what the makeup of that release is, what the currently assigned UI is, and which settings are assigned there. Why this is important is that, especially with applications and infrastructure, the infrastructure is potentially owned by a different team to the one running the application, and equally the environments. So if somebody, for example, were to change some of these host details, the impact of that could be across different teams, different environments, different settings and so on. It just gives you visibility: if I change something here, who
do I need to potentially go and update, or equally, have they given me permission to do that automatically, because I'm aware of all of the different things that exist? So it's quite a nice way of understanding changes, tracking changes, tracking dependencies and understanding the impact of change: what's changing, why is it changing, and so on. One of the big things we've also got inside Sweagle is the ability to check for data quality. I've got a library of different tests here that I can either choose to assign or leave unassigned, and they will search for different types of data and tell me whether I'm meeting my standard or not. In this case I've got an error when I'm running my database check, which is essentially telling me that I'm running localhost against a series of my environments. Now, I've turned this off for my test environment because I'm saying that for test this is okay, but if I were to run this against production I would want to make sure we're not just using localhost, that we've put fully qualified domain names in there. Accuracy is searching for a particular database name that's not been found. These are the types of errors that quite often slip through the net, especially if you haven't got a process collecting configuration, and there are loads of different examples; it's an extensible library, so "no HTTP" is going to check that we're using HTTPS only, and there are various others, password checkers and that sort of stuff, making sure we're checking the right things. What I can also see here is that there's an approval system within Sweagle, the ability to approve change, so you get the idea of prevention rather than deploying and then fixing issues as they arise. I can see here that I've got three changes coming in: a server name that's been updated, and some passwords that
are being added, that type of stuff. I can see visually that that password is not encrypted, but Sweagle is also going to tell me about that: any of the validations that we run, or that we build, can be chosen to automatically run against every single set of data, so you very quickly get an understanding of what's broken the rules. So what we're going to do is come in and fix some of these problems, and I'll take you through the process of modifying data in Sweagle. You could, of course, just modify this back in a source directory, run it through a pipeline and automate it; this is a more manual, granular way of doing it, I suppose. If I take this password here, which was the one that was added, let's go and encrypt that particular value. Okay, so it locks it away from human eyes and opens it up to a more structured way of dealing with it. I can add in new keys, so if I wanted to add in, I don't know, maybe a new URL, it will just put in a correct URL, maybe pointing at an AWS server, something along those lines. I've already got a URL down here, so it may be "url 2.0", I don't know, a terrible example. So you can add keys in, you can add values in. Generally you do this through files, and there's an automatic upload process you can use; if you do it that way, Sweagle will automatically structure how it thinks the data should look, and you can manipulate it after that. So we've just made a couple of changes. What we could also do is have a look at some of the machine learning elements of what Sweagle does. If we take a look at some of these AWS servers, for example: what I've got here is an S3 server, and Sweagle has automatically typed the server, which means it's looking at these values and it's applying
logic and consistency to them. So this particular set of data has been encrypted automatically, and details that are required have been flagged, so if I were to come in and remove some of these details, that would flag as part of a change set and tell me that's not something we want to do; so let's set this back to "public-read". If we take a look at this RDS instance, for example, we can see we're running a small database class, and therefore rather than having 12 in here it should have a database size of less than 10. My Oracle EE value looks like a spelling mistake to me; that should probably be "oracle-ee". So you can programmatically start to enforce different standards automatically using Sweagle, rather than having to rely on people not making mistakes, which, if they're anything like me, is a hard thing to do. All of these changes get tracked in the change set. I'm going to approve them, and that's going to submit them to Sweagle to check and double-check for me. As part of that process we can see this is the eldorado.test we've been working on, updated a few seconds ago; I can see that my currently stored status has no warnings and errors, which is good, and the data I'm adding, the modification data coming in, doesn't have any problems either. If there were warnings and errors, it would tell you here; these, again, are the types of things that are typically automatically notified to teams to go and work out. Then, as part of that, I can choose what I want to do with it: does this become my latest stored snapshot, my latest thing to go and update? I think it absolutely does, and I'm just going to put in a little tag and say "fixed password and URL", trying to add in some good details here. What that means is that that now becomes my latest snapshot
that I can go and export. So there are tools, Chef, Puppet, Ansible and so on, that need to be able to get data to work with. If you imagine I want to feed Ansible the latest version of the config, I'll give it version 1.49, which is the snapshot we just took, and say it wants it as a JSON file; you're essentially giving it this JSON as an API call it can use to come in and retrieve the latest snapshot. If you're working with an earlier version, again, you can just pick and choose the data you want, and what you ask for determines what data is returned to you. So for 45, maybe somebody wants just one release back, so then 48, and you'll notice that the data is changing, often quite subtle changes, the URL and so on; it's all essentially a library, a history library of config. Equally, if you've got other tools that need it in different formats, you can just transform it into whatever is required: applications typically want properties and INI files, configuration tools sometimes YAML and JSON, but it's all the same data being represented. It's not that you have to maintain multiple versions of that data; it's all about having that same data reused and reworked, and that reusability is a big win for our customers. You can then apply your own custom exporters, so if you just want to return smaller, very particular data: outputting the whole thing is fine for an automation script, but maybe a particular tool only wants to retrieve a single value, or ten values, and doesn't need the full stack of config. It's highly malleable, highly configurable for these types of things. So, just a final summary: data comes in from lots of different places, your Bitbucket and so on; we structure it as a data model; you apply logic, and you can then start to improve upon that data model as you wish; we
validate it, so we go and make sure we're checking things and getting lots of green ticks rather than errors; and then this data is used as part of a deployment, to actually go and physically deploy and be used by someone or something at the pointy end, as it were. So hopefully that makes sense; any questions, feel free to reach out. That was an introduction to Sweagle and its core capabilities.
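The version comparison described in the walkthrough, seeing what was added, deleted and changed between snapshot 47 and 48, boils down to a set difference over two key-value maps. A minimal Python sketch, with made-up data standing in for the Sweagle snapshots:

```python
def diff_snapshots(old, new):
    """Report keys added, deleted, and changed between two config snapshots."""
    added = {k: new[k] for k in new.keys() - old.keys()}
    deleted = {k: old[k] for k in old.keys() - new.keys()}
    changed = {k: (old[k], new[k]) for k in old.keys() & new.keys()
               if old[k] != new[k]}
    return added, deleted, changed

# Hypothetical snapshots mirroring the demo: a password and a last name
# are added, and first_name changes from "test" to "me".
v47 = {"first_name": "test", "db.host": "localhost"}
v48 = {"first_name": "me", "db.host": "localhost",
       "last_name": "smith", "password": "s3cr3t"}

added, deleted, changed = diff_snapshots(v47, v48)
# added contains last_name and password; changed shows first_name "test" -> "me"
```

This is the kind of answer that lets you correlate "it broke at ten past two" with the exact keys that moved.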
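The per-environment quality rules in the demo (localhost allowed in test but not prod, "no HTTP" meaning HTTPS only) can be pictured as small predicate functions run over every candidate change set. This is a hedged sketch of the idea, not Sweagle's actual rule syntax:

```python
def check_no_localhost(env, config):
    """Fail on localhost values outside test, like the demo's database check."""
    if env == "test":
        return []          # rule deliberately disabled for test environments
    return [k for k, v in config.items() if "localhost" in str(v)]

def check_https_only(env, config):
    """Fail on any plain-http URL ("no HTTP" = HTTPS only)."""
    return [k for k, v in config.items() if str(v).startswith("http://")]

# Invented prod data with two deliberate rule violations.
prod = {"db.host": "localhost", "api.url": "http://api.example.com"}
failures = {rule.__name__: rule("prod", prod)
            for rule in (check_no_localhost, check_https_only)}
# failures flags db.host (localhost) and api.url (plain http) in prod
```

Running every rule against every incoming change set is what gives the "prevention rather than fixing after deployment" effect described above.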
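Storing everything as key-value pairs and rendering it per consumer, JSON for a pipeline or Ansible, a .properties file for an application, is easy to picture in a few lines. The keys below are invented for illustration:

```python
import json

# One set of key-value pairs, exported in whichever format a consumer needs.
config = {"app.name": "eldorado", "db.host": "db01.example.com",
          "db.port": "5432"}

def to_properties(kv):
    """Render key-value pairs as a Java-style .properties file."""
    return "\n".join(f"{k}={v}" for k, v in sorted(kv.items()))

as_json = json.dumps(config, indent=2)   # e.g. for a pipeline or Ansible
as_props = to_properties(config)         # e.g. for an application
```

The point is the one the transcript makes: the same data is reused, not duplicated per format.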
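Pulling a specific snapshot ("give Ansible version 1.49 as JSON") is an API call with the node, version and output format as parameters. The endpoint and parameter names below are hypothetical; the real Sweagle API paths may differ:

```python
from urllib.parse import urlencode

BASE = "https://sweagle.example.com/api/v1/data"   # hypothetical endpoint

def snapshot_url(node, version=None, fmt="json"):
    """Build a retrieval URL for a node; omit version to get the latest snapshot."""
    params = {"node": node, "format": fmt}
    if version is not None:
        params["version"] = version
    return f"{BASE}?{urlencode(params)}"

url = snapshot_url("eldorado.test", version="1.49")
```

Requesting an earlier version (45, 48, ...) is just a different `version` value against the same history library of config.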
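A custom exporter that returns only the one or ten values a particular tool needs, rather than the full stack of config, is essentially a filter over the snapshot. The key names here are invented:

```python
# A hypothetical stored snapshot of key-value pairs.
snapshot = {"db.host": "db01.example.com", "db.port": "5432",
            "api.url": "https://api.example.com", "log.level": "info"}

def export_subset(snapshot, wanted):
    """A minimal custom exporter: keep only the keys a given tool asked for."""
    return {k: v for k, v in snapshot.items() if k in wanted}

export_subset(snapshot, {"db.host", "db.port"})
# → {'db.host': 'db01.example.com', 'db.port': '5432'}
```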
https://www.youtube.com/watch?v=jY2Mg-2c7VY