
Kusto Detective Agency: Hints and my experience


So, what is the Kusto Detective Agency?

This set of challenges is an amazing, gamified way to learn the Kusto Query Language (KQL), which is the language used by several Azure services including Azure Monitor, Sentinel, M365 Defender and Azure Data Explorer (ADX) to name a few. Using your skills, you will help the citizens of Digitown solve mysteries and crimes to make the city a better place!

How do I get started?

The challenges are available here: https://detective.kusto.io/. To get started, create a free ADX cluster here: https://aka.ms/kustofree and copy the Cluster URI; you will need it as part of the onboarding answer.

Now answer a simple question using KQL: calculate the sum of the “Score” column.

If you are just getting started learning KQL, check out Rod Trent’s ‘Must Learn KQL’ series!

https://aka.ms/MustLearnKQL https://azurecloudai.blog/2021/11/17/must-learn-kql-part-1-tools-and-resources/

as well as these cool resources

Watch this basic explainer on how the query language works: http://aka.ms/StartKqlVideo
Check out the documentation here: Kusto Query Language (KQL) overview - Azure Data Explorer | Microsoft Docs

For help with the first query, click the spoiler tag below.

Onboarding Query

Onboarding //This is the name of the table we will be running our query against
| summarize sum(Score) //the sum() aggregation adds up all the values in the Score column

General advice

Each challenge has up to three hints that can be accessed through the hints section of your Detective UI. The hints are quite useful, and I would recommend using them if you get stuck, especially as some of them include information that is important for confirming assumptions. There are also different ways to get to the answers, which shows the power of creative thinking.

Challenge 1: The rarest book is missing!

The first challenge is quite interesting: you are tasked with finding a rare missing book. Most people I’ve spoken to have figured out the method but get stuck on the KQL query, so I’ve included an extra hint below to get you started.

Query Hint
In order to solve this you’ll need to work with the weights of the books on the shelves.
KQL functions and operators that will be helpful are sum() and join.
The rarest book is missing: challenge 1 text

This was supposed to be a great day for Digitown’s National Library Museum and all of Digitown.
The museum has just finished scanning more than 325,000 rare books, so that history lovers around the world can experience the ancient culture and knowledge of the Digitown Explorers.
The great book exhibition was about to re-open, when the museum director noticed that he can’t locate the rarest book in the world:
“De Revolutionibus Magnis Data”, published 1613, by Gustav Kustov.
The mayor of the Digitown herself, Mrs. Gaia Budskott – has called on our agency to help find the missing artifact.

Luckily, everything is digital in the Digitown library:

  • Each book has its parameters recorded: number of pages, weight.
  • Each book has an RFID sticker attached (RFID: radio-transmitter with ID).
  • Each shelf in the Museum sends data: what RFIDs appear on the shelf and also measures the actual total weight of books on the shelf.

Unfortunately, the RFID of the “De Revolutionibus Magnis Data” was found on the museum floor – detached and lonely.
Perhaps, you will be able to locate the book on one of the museum shelves and save the day?

Query challenge 1

//This query calculates the weight of the books registered on each shelf and compares it to the weight reported by the sensor; find the shelf with extra weight and we’ll find our book!
Shelves
| mv-expand rf_ids // one row per RFID on the shelf
| extend RID = tostring(rf_ids)
| join kind=inner (Books) on $left.RID == $right.rf_id // match each RFID to its book record
| summarize sum(weight_gram) by shelf, total_weight // expected weight of the registered books per shelf
| extend diff = total_weight - sum_weight_gram // surplus weight the sensor sees but the RFIDs can't explain
| order by diff desc // the shelf with the largest surplus (roughly one book's worth) is hiding our book


I will be talking about the rest of the challenges in a later series, so be sure to check back soon. In the meantime, good luck, Detective!


Azure Monitor: VM insights now supports Azure Monitor agent

It’s been on my wish list for a while, but it looks like the Azure Monitor team has a present for us: you can now enable VM insights using the Azure Monitor agent (AMA). Note: this is a public preview.

With this release these are the key features:

  • Easy configuration using data collection rules (DCR) to collect VM performance counters and specific data types.
  • Option to enable/disable the processes and dependencies data that powers the Map view, thus optimizing costs.
  • Enhanced security and performance that comes with using Azure Monitor agent and managed identity.

For those not familiar with VM insights, here is a fantastic overview. In short, VM insights gives you a standardized way of measuring and managing the performance and health of your virtual machines and virtual machine scale sets, including their running processes and dependencies on other resources.

Changes for Azure Monitor agent

Be aware of the following differences when using AMA for VM Insights

Workspace configuration. VM insights no longer needs to be enabled on the Log Analytics workspace

Data collection rule. Azure Monitor agent uses data collection rules (DCR) to configure its data collection. VM insights creates a data collection rule that is automatically deployed if you enable your machine using the Azure portal.

Agent deployment. There are minor changes to the process for onboarding virtual machines to VM insights in the Azure portal. You must now select which agent you want to use, and you must select a data collection rule for Azure Monitor agent.

How do I configure VM Insights with AMA?

This is a fairly easy process; just note the dependency on data collection rules. Here is the official documentation, and below is a scripted sketch of the association step.
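If you would rather script the association, here is a minimal sketch using the Az.Monitor module; the resource IDs are placeholders, and the MSVMI-&lt;WORKSPACE&gt; rule name is an assumption based on what the portal-created DCR looks like:

# Associate an existing VM insights data collection rule with a VM (IDs are placeholders)
Import-Module Az.Monitor

$vmId  = "/subscriptions/<SUB ID>/resourceGroups/<RG>/providers/Microsoft.Compute/virtualMachines/<VM NAME>"
$dcrId = "/subscriptions/<SUB ID>/resourceGroups/<RG>/providers/Microsoft.Insights/dataCollectionRules/MSVMI-<WORKSPACE>"

New-AzDataCollectionRuleAssociation -TargetResourceId $vmId `
    -AssociationName "VMInsights-DCR-Association" `
    -RuleId $dcrId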

Enjoy and happy monitoring!


Azure Monitor Basic Logs

What are Basic Logs?

Relatively new and still in preview, Basic Logs are a way to save costs when working with high-volume logs typically associated with debugging, troubleshooting and auditing. They should not be used where analytics and alerts are important.

How do I configure one?

Firstly, it is important to note that tables created with the Data Collector API do not support Basic Logs.
The following are supported:

  • Logs created via data collection rules (DCR)
  • ContainerLogV2 (used by Container insights)
  • AppTraces

All tables are set to the Analytics plan by default. To change this, navigate to Log Analytics workspaces and select the workspace containing the table you want to change. Choose Tables from the navigation rail, select the table and choose Manage Table on the right.

Change the table plan to Basic. Note that the default retention changes from 30 days to 8 days. This can of course also be done through the API or CLI, as sketched below.
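As a sketch of the REST route, using Invoke-AzRestMethod from Az PowerShell; the table name, resource IDs and API version are assumptions, so check the current documentation before running:

# Switch a table to the Basic plan via the Tables API (placeholders throughout)
$path = "/subscriptions/<SUB ID>/resourceGroups/<RG>/providers/" +
        "Microsoft.OperationalInsights/workspaces/<WORKSPACE>/tables/AppTraces" +
        "?api-version=2021-12-01-preview"

Invoke-AzRestMethod -Path $path -Method PATCH -Payload (@{
    properties = @{ plan = "Basic" }   # set plan = "Analytics" to switch back
} | ConvertTo-Json -Depth 3)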


How can I query Basic Logs?

Log queries against Basic Logs are optimized for simple data retrieval and support only a subset of the KQL operators.

There are also some other limitations: a time range cannot be specified in the query, and purge is not available. For the full list of limitations, refer to the official documentation.
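As a sketch of what a retrieval-style query looks like, here is a simple example run through Az PowerShell; the table name, the search term and whether your tooling version supports querying Basic tables during the preview are all assumptions:

# Simple retrieval-style query against a Basic Logs table (sketch; preview tooling support may vary)
Import-Module Az.OperationalInsights

$workspaceId = "<WORKSPACE GUID>"
$query = @"
AppTraces
| where Message has "timeout"
| project TimeGenerated, Message
"@

(Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query).Results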

How much cheaper is it?

Basic Logs: $0.615 per GB of data ingested.
Analytics (standard pay-as-you-go): $2.76 per GB (5 GB free per month), with discounts for purchasing a commitment tier of up to 5,000 GB per day.

During the preview there is no charge for querying Basic Logs; however, there will be a small charge once the feature reaches GA, based on the amount of data the query scans, not just the amount of data the query returns. At this time the expected cost is $0.007 per GB of data scanned. As a worked example, ingesting 100 GB per day would cost $61.50 per day on the Basic plan versus $276 per day at the standard pay-as-you-go rate.

Basic Logs Architecture Reference


SCOM 2019: Update rollup 4 has arrived!

UR4 is a fairly big update with a host of improvements and fixes. As always, it can be downloaded from the catalog here. Now let’s dive right in.

Overall there are some expected improvements to support later operating systems and versions of .NET, plus a ton of fixes, mostly correcting minor issues, which are too numerous to list here but can be found on the KB page.

Improvements

  • Support for Windows 11
  • Enabled .NET4.8 support
  • UI improvements in Operations console:
    • Support for sort option by column, in Overrides Summary.
    • For Monitors, Rules, Tasks and Discoveries, the Management Pack label text is selectable in the workflow Properties window.
    • Added new fields for Class Technical Name in the State Views. Added the same in the wizard for creating a new Alert, Event, Performance or State View.
    • Added Target Class Display Name to help identify the target of a rule while selecting rules during the creation of a new Performance View.
    • Added 3 new columns Management Pack, Sealed and Members in the Authoring pane > Groups.


Azure Managed Grafana

Recently announced, the preview for Azure Managed Grafana is now available. For those who don’t know, Grafana is an observability platform that lets you create mixed-data dashboards from a variety of sources.

And now you can run it in Azure!

Let’s get started

First you need to create a Grafana workspace: in the Azure portal, search for Azure Managed Grafana, select it and click +Create. Fill out all the usual suspects, choosing your subscription, resource group, location and workspace name.

On the following tab create a managed identity as this is the way Grafana will be able to access data from your resources. Then create your workspace.

Next we need access

Grafana needs access to the resources you want to build dashboards for. You can easily grant this with Azure RBAC, and it can be done at the resource group or subscription level as well.

Using Access Control (IAM), give Monitoring Reader access to the managed identity you created as part of your Grafana workspace.
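If you prefer to script the role assignment, here is a minimal sketch with Az PowerShell; the object ID of the Grafana managed identity and the subscription ID are placeholders:

# Grant the Grafana managed identity Monitoring Reader over a subscription (IDs are placeholders)
New-AzRoleAssignment -ObjectId "<MANAGED IDENTITY OBJECT ID>" `
    -RoleDefinitionName "Monitoring Reader" `
    -Scope "/subscriptions/<SUB ID>"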

Let’s make some dashboards

First, let’s open our Grafana instance: navigate to the workspace created previously and click on the endpoint address on the Overview blade.

On the landing page there’s a notification to configure a data source. We’ll be using Azure Monitor: simply click Add data source and choose Azure Monitor from among all of the available options. We can see here that there are plenty to choose from, which is part of the power of Grafana.

Name your connection, choose managed identity and select the relevant subscription. Then click Save & test.

A nice feature is the ability to access pre-built dashboards out of the box, clicking on the Dashboards tab shows us several options which we can import with the click of a button.

And we’re all set! Below is an example of the Azure Storage Insights dashboard, which I was able to configure from start to finish in less than 5 minutes.

Overall, Azure Managed Grafana is very cool and offers an alternative approach to mixed data dashboards from a variety of sources. Of course, you can also create customized visuals and there are plenty of options to ensure you end up with something meaningful and perfect for your needs. I’m looking forward to seeing this go GA.

Happy dashboarding!



SCOM 2022: GA Hooray!

The latest and dare I say greatest version of SCOM has just been released. Let’s unpack some of the exciting new features; a full list and details are available here!

Overall this is a fantastic update and really shows the investment in SCOM that Microsoft is making.

Enhanced Operations Manager RBAC

This one is a great improvement which has been a long time coming: the addition of a Read-only Administrator role makes it much easier for, say, auditors to view the environment without any risk. It’s also now possible to create custom user roles, so you can give someone access to just install agents.

Improved Alert closure experience

One of the more controversial changes in SCOM 2019 was that alerts could not be closed unless the underlying monitor was in a healthy state. While this does have some benefits, you now have the option to revert to the old alert closure experience.

Support for gMSA from the word go

It’s now possible to use a group managed service account (gMSA) right from the installer in SCOM 2022 RTM.

Native Teams Notifications
We now have the ability to send notifications directly to Teams natively which opens up some really interesting ways to use notifications to handle our alert lifecycle.

Many Quality of Life updates

  • Specific registry keys are retained during UR updates
  • Custom agent install paths are retained during upgrade
  • .NET 4.8 support
  • A new column in the alert view shows whether the source is a monitor or a rule
  • Change tracking reports have their own folder
  • Source (FQDN) is now viewable in the management pack tuning view
  • Support for newer browsers
  • Support for newer Linux versions
  • Certificates are now encrypted with SHA256
  • Install on SQL Always On without the need for post-configuration changes

That’s it for now, happy monitoring!


SCOM 2019: Which management server is my gateway paired to?

Sometimes, for a variety of reasons, it becomes necessary to figure out which gateways are paired to which management servers, and unfortunately this is a configuration that can often slip under the radar when documenting a management group.

Luckily there is a simple way to figure this out without having to log on to each server and trawl through the registry.

PowerShell to the rescue!

Get-SCOMGatewayManagementServer | where {$_.Name -eq "<GATEWAY SERVER>"} | Get-SCOMParentManagementServer

Note: this command has changed slightly from past versions of SCOM
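And if you want to document every pairing in one go, here is a quick sketch that tabulates all gateways against their parent management servers:

# List every gateway alongside its parent management servers
Get-SCOMGatewayManagementServer | ForEach-Object {
    [pscustomobject]@{
        Gateway                 = $_.Name
        ParentManagementServers = ($_ | Get-SCOMParentManagementServer).Name -join ", "
    }
}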


Calling a Logic App from an Azure Monitor workbook!

Workbooks have a couple of new action types that let you do some very cool things. The one I’m going to focus on now is called ARM actions, and it is some amazing stuff. If you thought workbooks were powerful before, then watch this space!

ARM Actions

First, ARM actions can be used to call various Azure actions against a resource. In the example workbook you can Start and Stop a website, which is quite useful as you can do it directly from the workbook without having to navigate to the resource blade.

This uses a parameter to fetch the site name and pipes it into an ARM action of Start; the action path looks like the example below.
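For reference, the ARM action path for the Start action follows the same pattern as the Logic App path shown further down; the {Site} parameter name and the api-version here are my assumptions, so adjust them to your workbook:

/subscriptions/{Subscription:id}/resourceGroups/{RG}/providers/Microsoft.Web/sites/{Site}/start?api-version=2019-08-01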

Calling a Logic App

Super cool and very useful. Now let’s look at how we can up our game a little bit. Using this same method you can actually call a Logic App. This is slightly more complex, as you need to have the ARM action path to said Logic App, which looks like this:

/subscriptions/{Subscription:id}/resourceGroups/{RG}/providers/Microsoft.Logic/workflows/<LogicApp Name>/triggers/manual/run?api-version=2016-06-01

Note the various parameters; you can also parameterize the Logic App name, though I have it hardcoded in this example. Also note that in this case the trigger type is manual: this is because the Logic App trigger is “When an HTTP request is received” and I am sending a JSON payload from the workbook to the Logic App.

You can also specify other trigger types for Request, Recurrence and API Connection.
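A handy way to sanity-check the path and payload before wiring them into the workbook is to fire the same ARM route from PowerShell. This is a sketch with a hypothetical payload; match it to the schema your trigger expects:

# Fire the Logic App's manual trigger through ARM, the same path the workbook ARM action uses
$path = "/subscriptions/<SUB ID>/resourceGroups/<RG>/providers/" +
        "Microsoft.Logic/workflows/<LogicApp Name>/triggers/manual/run?api-version=2016-06-01"

Invoke-AzRestMethod -Path $path -Method POST -Payload (@{
    AppName = "MyApp"   # hypothetical payload property
} | ConvertTo-Json)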

Now what can you do with this? Well, as you might imagine, the possibilities are endless. In my case, I’m calling the Logic App to populate a secondary set of app data into Log Analytics to add more scope to the original workbook.

Once the Logic App has run, the App Info column changes to Populated and the GetAppDetails prompt changes to Refresh; the data is then made visible in a second grid below.

Conclusion

I’m very excited by the world that has opened up with this type of advanced workbook, essentially turning workbooks from an awesome visual tool into an awesome manageability tool.

If you’ve made use of this functionality, I’d love to hear from you.

That’s all for now, happy workbooking!


Monitoring data from an API with Azure Monitor aka. Monitoring Endpoints with Sentinel

I was recently afforded a very interesting opportunity to help extend the reporting capability of Microsoft Defender. The end result used a combination of a logic app and a workbook to achieve something that is quite awesome (even if I do say so myself). Huge thanks to Jason Baxter and Hesham SaaD for their part in this.

It is worth noting that while this particular case used Sentinel you can achieve the same with Azure Monitor and a standard logic app, the choice will come down to whether or not the data is security related.

While the full details of the solution can be found here, I wanted to take a brief moment to talk about the power of the framework behind it, which can be broken down into key components.

  1. An API – lots of applications have easily accessible monitoring data; however, some don’t, and there is often a wealth of information to be found with a web call to an API endpoint.
  2. A logic app – getting data from an API into Log Analytics may seem complex, but a logic app offers a low-code approach that can meet most solutions’ needs. Where scaling demands it, you can also use PowerShell and Azure Functions for a more robust result (see the sketch after this list).
  3. A workbook – as some of you may know, I am a huge proponent of workbooks: they offer interactivity and flexibility while being quick and easy to create (more on that here), and you can of course also alert on the data once it’s in your Log Analytics workspace, or even use Power BI to further enrich your visuals.
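As a rough sketch of the PowerShell route mentioned in point 2, here is the classic HTTP Data Collector API pattern; the API endpoint and log type are placeholders, and the signature construction follows Microsoft’s documented sample:

# Pull JSON from an API and push it into Log Analytics via the HTTP Data Collector API (sketch)
$workspaceId = "<WORKSPACE ID>"
$sharedKey   = "<WORKSPACE PRIMARY KEY>"
$logType     = "AppHealth"   # lands in Log Analytics as the AppHealth_CL table

# 1. Fetch the data (hypothetical endpoint)
$data = Invoke-RestMethod -Uri "https://example.com/api/health"
$body = [Text.Encoding]::UTF8.GetBytes(($data | ConvertTo-Json))

# 2. Build the HMAC-SHA256 signature the Data Collector API requires
$date         = [DateTime]::UtcNow.ToString("r")
$stringToHash = "POST`n$($body.Length)`napplication/json`nx-ms-date:$date`n/api/logs"
$hmac         = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key     = [Convert]::FromBase64String($sharedKey)
$signature    = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToHash)))

# 3. Post it to the workspace
Invoke-RestMethod -Method Post -ContentType "application/json" -Body $body `
    -Uri "https://$workspaceId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01" `
    -Headers @{ Authorization = "SharedKey ${workspaceId}:$signature"; "Log-Type" = $logType; "x-ms-date" = $date }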

The possibilities are literally endless; in the last week alone I’ve been asked to adapt this method to monitor elements of other products such as Microsoft Teams. This is a great method to keep in your back pocket, and I’d love to hear from anyone who’s using it or something similar.



SCOM vs. Azure Monitor

Recently I was invited to speak at Silect MP University about SCOM vs Azure Monitor, the session is available for viewing below.

Join me as I discuss the pros and cons of both tools, as well as how to leverage them both individually and together for a variety of scenarios.
