More Osquery Data Modeling in Snowflake

Data Modeling osquery Data in Snowflake

We have been working on many projects over the past few months, and our osquery data pipeline, built with the help of FleetDM, has been one of our constant works in progress. Before I dive into the data modeling, I want to share how we scaled this data out. We ended up choosing FleetDM’s SaaS offering over self-hosting the app, but we still stream query results from FleetDM cloud to Snowflake over AWS Kinesis Firehose, at high volume and with low latency. We typically collect most osquery data every hour, with a smaller set of queries that run every 12 hours or once a day.
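To give a feel for the shape of this data, here is a minimal sketch of flattening one osquery snapshot result into per-row records, the kind of shape you would then model in a Snowflake view. The field names (`name`, `hostIdentifier`, `unixTime`, `snapshot`) are illustrative assumptions about the payload, not FleetDM's exact schema, and `flatten_result` is a hypothetical helper.

```python
import json

# A hypothetical osquery snapshot result, roughly as a log pipeline
# might forward it to Kinesis Firehose. Field names are illustrative.
raw_record = json.dumps({
    "name": "pack/hourly/installed_apps",
    "hostIdentifier": "laptop-042",
    "unixTime": 1700000000,
    "snapshot": [
        {"name": "Slack", "bundle_version": "4.35.0"},
        {"name": "Firefox", "bundle_version": "119.0"},
    ],
})

def flatten_result(record: str) -> list:
    """Flatten one osquery snapshot payload into one dict per result row,
    carrying the query name, host, and collection time alongside each row."""
    payload = json.loads(record)
    return [
        {
            "query_name": payload["name"],
            "host": payload["hostIdentifier"],
            "collected_at": payload["unixTime"],
            **row,
        }
        for row in payload["snapshot"]
    ]

rows = flatten_result(raw_record)
print(rows[0]["query_name"])  # pack/hourly/installed_apps
```

In practice this flattening happens on the Snowflake side (e.g. with a view over the raw VARIANT column), but the transformation is the same idea.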

Working with osquery Data in Snowflake

I recently told my dog, Booms, all about how you can use osquery data in Snowflake. Needless to say, he was quite impressed with all the use cases and in-depth data you can model about your fleet of laptops using osquery data in Snowflake.

/img/booms-snow1.jpg

Booms came to the office with me, and he even got a bit of company swag! As we spent the day working, I told him all about the work my team was doing with osquery. As anyone can see in this pic, he is very impressed with the speed and scale at which one can leverage osquery data in Snowflake.

ServiceNow Asset Data in Snowflake

Inventory and asset data is extremely important to any organization. It is also some of the most difficult data to keep accurate and up to date. We have been shipping ServiceNow data to Snowflake for a good while now, and over the years that asset data has proven, time after time, to be extremely valuable. Just by writing some simple queries and joining ServiceNow data with other data sources, you can filter out tons of noise and focus on specific asset data at any given time. I recently created a new view in Snowflake to help us filter the data down to more useful specifics.
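The value of that join is easiest to see with a toy example. This sketch left-joins ServiceNow asset rows to osquery host rows on serial number, which is the same thing a Snowflake view would do with a `JOIN ... ON` clause. The column names (`serial_number`, `hardware_serial`, `os_version`, etc.) and the `join_assets` helper are assumptions for illustration, not the actual ServiceNow or osquery schemas.

```python
# Hypothetical sample rows; column names are illustrative only.
servicenow_assets = [
    {"serial_number": "C02XYZ1", "assigned_to": "alice", "location": "NYC"},
    {"serial_number": "C02XYZ2", "assigned_to": "bob", "location": "SEA"},
]
osquery_hosts = [
    {"hardware_serial": "C02XYZ1", "os_version": "macOS 14.1"},
]

def join_assets(assets, hosts):
    """Left-join ServiceNow assets to osquery host rows on serial number.
    Assets with no matching osquery host get os_version=None, which is
    itself useful signal (a device we own but are not observing)."""
    by_serial = {h["hardware_serial"]: h for h in hosts}
    return [
        {**a, "os_version": by_serial.get(a["serial_number"], {}).get("os_version")}
        for a in assets
    ]

joined = join_assets(servicenow_assets, osquery_hosts)
```

The unmatched side of the join is often the interesting part: assets ServiceNow knows about that osquery has never reported on, or vice versa.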

Diversifying Your IT Tools: Integrating Munki

Hello everyone, it has been a while since my last blog post. I have been busy with work and with some other things outside of work as well. However, we finally released Munki to our fleet last summer. This was a milestone project for us, and while we aren’t the first Org to do this, I wanted to share with the community how and why we did it. Everything I write here has probably already been done by many other Orgs that currently use Munki, and we aren’t really doing anything radically different. So, if you have deployed a server-less Munki setup before, this is probably not going to be new to you. That being said, server-less is the key word for our vision here.

Complete Asset Oversight with the Data Cloud

Gain All the Insights You Never Had

Asset and inventory controls are difficult, and many Orgs spend tons of time and effort trying to solve problems around this very subject. They hire people and spend lots of money just to get a hold of their inventory of assets. The problem is still very difficult to solve, but the Data Cloud at least makes it solvable. I have been in tech for over 20 years, and I have had jobs where asset inventory was one of my duties. I had a local database I built on a Linux desktop and a barcode scanner, and with them I could manage my own little shop of computers and computer parts. I would manually enter data, I would scan barcodes, and I even had a sign-out sheet for people to sign for things they took, so I could balance my books later on. Systems like this do not scale, they are very error-prone, and they take a lot of labor. Oftentimes I had to play asset detective and dig through our IT software, my home-grown database, and the various paperwork from other systems to get a feel for what our inventory actually was. Also, I was very new to Linux and databases at the time, so I highly doubt my little pet project was any good at all. It was at least something, I suppose.

Data-Enabled: Post Nudge Campaign Results

A lot of organizations have adopted frameworks like the CIS Controls, and part of those controls are about patching and vulnerability management. Depending on which version of the CIS Controls you are using, this could be CIS 3 or CIS 7. This can already be confusing, as the controls are very similar between versions but have moved places in the CIS framework. One thing I think most tech and security professionals can agree upon is that patching promptly to the latest operating system versions is a great way to keep your fleet of devices from being exploited. Known vulnerabilities have a shelf life, and bad actors know that some organizations do not patch, or patch extremely slowly, so they develop attacks against known published vulnerabilities.