Applies to: ✅ Microsoft Fabric ✅ Azure Data Explorer ✅ Azure Monitor ✅ Microsoft Sentinel
Use Kusto Query Language (KQL) to write queries in Azure Data Explorer, Azure Monitor Log Analytics, Microsoft Sentinel, and more. This tutorial introduces the essential KQL operators you can use to access and analyze your data.
For more specific guidance on how to query logs in Azure Monitor, see Get started with log queries.
Note
Can't find what you're looking for? This article was recently split, as follows:
- Learn common operators (this article)
- Use aggregation functions
- Join data from multiple tables
- Create geospatial visualizations
The examples in this tutorial use the StormEvents table, which is publicly available in the help cluster. To explore with your own data, create your own free cluster.
Prerequisites
To run the following queries, you need a query environment with access to the sample data. You can use one of the following options:
- A Microsoft account or Microsoft Entra user identity to sign in to the help cluster
- A Fabric workspace with a Microsoft Fabric-enabled capacity
Count rows
Begin by using the count operator to find the number of storm records in the StormEvents table.
StormEvents
| count
Output
| Count |
|---|
| 59066 |
See a sample of data
To get a sense of the data, use the take operator to view a sample of records. This operator returns a specified number of arbitrary rows from the table, which can be useful for previewing the general data structure and contents.
StormEvents
| take 5
The following table shows only five of the 22 returned columns. To see the full output, run the query.
| StartTime | EndTime | EpisodeId | EventId | State | EventType | ... |
|---|---|---|---|---|---|---|
| 2007-09-20T21:57:00Z | 2007-09-20T22:05:00Z | 11078 | 60913 | FLORIDA | Tornado | ... |
| 2007-12-20T07:50:00Z | 2007-12-20T07:53:00Z | 12554 | 68796 | MISSISSIPPI | Thunderstorm Wind | ... |
| 2007-12-30T16:00:00Z | 2007-12-30T16:05:00Z | 11749 | 64588 | GEORGIA | Thunderstorm Wind | ... |
| 2007-09-29T08:11:00Z | 2007-09-29T08:11:00Z | 11091 | 61032 | ATLANTIC SOUTH | Waterspout | ... |
| 2007-09-18T20:00:00Z | 2007-09-19T18:00:00Z | 11074 | 60904 | FLORIDA | Heavy Rain | ... |
Select a subset of columns
Use the project operator to simplify the view and select a specific subset of columns. Using project is often more efficient and easier to read than viewing all columns.
StormEvents
| take 5
| project State, EventType, DamageProperty
Output
| State | EventType | DamageProperty |
|---|---|---|
| ATLANTIC SOUTH | Waterspout | 0 |
| FLORIDA | Heavy Rain | 0 |
| FLORIDA | Tornado | 6200000 |
| GEORGIA | Thunderstorm Wind | 2000 |
| MISSISSIPPI | Thunderstorm Wind | 20000 |
List unique values
The results of the previous query show that there are multiple types of storms. Use the distinct operator to list all of the unique storm types.
StormEvents
| distinct EventType
The table contains 46 types of storms. Here's a sample of 10 of them.
| EventType |
|---|
| Thunderstorm Wind |
| Hail |
| Flash Flood |
| Drought |
| Winter Weather |
| Winter Storm |
| Heavy Snow |
| High Wind |
| Frost/Freeze |
| Flood |
| ... |
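The distinct operator also accepts multiple columns. As a small variant of the previous query, this sketch lists each unique combination of state and event type instead of event types alone:

```kusto
StormEvents
| distinct State, EventType
```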
Sort results
To find the Texas floods that caused the most property damage, use the sort operator to arrange the rows in descending order by the DamageProperty column. Descending is the default sort order; to sort in ascending order, specify asc.
StormEvents
| where State == 'TEXAS' and EventType == 'Flood'
| sort by DamageProperty
| project StartTime, EndTime, State, EventType, DamageProperty
Output
| StartTime | EndTime | State | EventType | DamageProperty |
|---|---|---|---|---|
| 2007-08-18T21:30:00Z | 2007-08-19T23:00:00Z | TEXAS | Flood | 5000000 |
| 2007-06-27T00:00:00Z | 2007-06-27T12:00:00Z | TEXAS | Flood | 1200000 |
| 2007-06-28T18:00:00Z | 2007-06-28T23:00:00Z | TEXAS | Flood | 1000000 |
| 2007-06-27T00:00:00Z | 2007-06-27T08:00:00Z | TEXAS | Flood | 750000 |
| 2007-06-26T20:00:00Z | 2007-06-26T23:00:00Z | TEXAS | Flood | 750000 |
| ... | ... | ... | ... | ... |
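To reverse the order, append asc to the sort expression. This variant of the previous query lists the Texas floods with the least property damage first:

```kusto
StormEvents
| where State == 'TEXAS' and EventType == 'Flood'
| sort by DamageProperty asc
| project StartTime, EndTime, State, EventType, DamageProperty
```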
Filter by condition
The where operator filters rows of data based on certain criteria.
The following query finds storm events that match a specific State and a specific EventType.
StormEvents
| where State == 'TEXAS' and EventType == 'Flood'
| project StartTime, EndTime, State, EventType, DamageProperty
There are 146 events that match these conditions. Here's a sample of five of them.
| StartTime | EndTime | State | EventType | DamageProperty |
|---|---|---|---|---|
| 2007-01-13T08:45:00Z | 2007-01-13T10:30:00Z | TEXAS | Flood | 0 |
| 2007-01-13T09:30:00Z | 2007-01-13T21:00:00Z | TEXAS | Flood | 0 |
| 2007-01-13T09:30:00Z | 2007-01-13T21:00:00Z | TEXAS | Flood | 0 |
| 2007-01-15T22:00:00Z | 2007-01-16T22:00:00Z | TEXAS | Flood | 20000 |
| 2007-03-12T02:30:00Z | 2007-03-12T06:45:00Z | TEXAS | Flood | 0 |
| ... | ... | ... | ... | ... |
Filter by date and time range
Use the between operator to filter data based on a specific time range.
The following query finds all storm events between August 1, 2007, and August 30, 2007, along with their states, event types, start times, and end times. The query returns results sorted in ascending order by start time.
StormEvents
| where StartTime between (datetime(2007-08-01 00:00:00) .. datetime(2007-08-30 23:59:59))
| project State, EventType, StartTime, EndTime
| sort by StartTime asc
Output
| State | EventType | StartTime | EndTime |
|---|---|---|---|
| GEORGIA | Excessive Heat | 2007-08-01 00:00:00 | 2007-08-27 23:59:00 |
| TENNESSEE | Drought | 2007-08-01 00:00:00 | 2007-08-31 23:59:00 |
| TENNESSEE | Drought | 2007-08-01 00:00:00 | 2007-08-31 23:59:00 |
| SOUTH CAROLINA | Drought | 2007-08-01 00:00:00 | 2007-08-31 23:59:00 |
| TENNESSEE | Drought | 2007-08-01 00:00:00 | 2007-08-31 23:59:00 |
| GEORGIA | Excessive Heat | 2007-08-01 00:00:00 | 2007-08-27 23:59:00 |
| TENNESSEE | Drought | 2007-08-01 00:00:00 | 2007-08-31 23:59:00 |
| MINNESOTA | Drought | 2007-08-01 00:00:00 | 2007-08-31 23:59:00 |
| WISCONSIN | Drought | 2007-08-01 00:00:00 | 2007-08-31 23:59:00 |
| GEORGIA | Excessive Heat | 2007-08-01 00:00:00 | 2007-08-27 23:59:00 |
| ... | ... | ... | ... |
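Note that between is inclusive on both endpoints. As an alternative sketch, explicit comparisons express a half-open range that covers all of August 2007 without hard-coding 23:59:59:

```kusto
StormEvents
| where StartTime >= datetime(2007-08-01) and StartTime < datetime(2007-09-01)
| project State, EventType, StartTime, EndTime
| sort by StartTime asc
```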
Get the top n rows
The top operator returns the first n rows sorted by the specified column.
The following query returns the five Texas floods that caused the most property damage.
StormEvents
| where State == 'TEXAS' and EventType == 'Flood'
| top 5 by DamageProperty
| project StartTime, EndTime, State, EventType, DamageProperty
Output
| StartTime | EndTime | State | EventType | DamageProperty |
|---|---|---|---|---|
| 2007-08-18T21:30:00Z | 2007-08-19T23:00:00Z | TEXAS | Flood | 5000000 |
| 2007-06-27T00:00:00Z | 2007-06-27T12:00:00Z | TEXAS | Flood | 1200000 |
| 2007-06-28T18:00:00Z | 2007-06-28T23:00:00Z | TEXAS | Flood | 1000000 |
| 2007-06-27T00:00:00Z | 2007-06-27T08:00:00Z | TEXAS | Flood | 750000 |
| 2007-06-26T20:00:00Z | 2007-06-26T23:00:00Z | TEXAS | Flood | 750000 |
Note
The order of the operators is important. If you put top before where in this example, you get different results. Each operator transforms the data in order. For more information, see tabular expression statements.
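For example, this reversed version first takes the five rows with the highest DamageProperty across the entire table, and only then filters them. Because the five most damaging events overall aren't necessarily Texas floods, the query might return fewer than five rows, or none:

```kusto
StormEvents
| top 5 by DamageProperty
| where State == 'TEXAS' and EventType == 'Flood'
| project StartTime, EndTime, State, EventType, DamageProperty
```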
Create calculated columns
You can use the project and extend operators to create calculated columns.
Use project to specify only the columns you want to view. Use extend to add the calculated column to the end of the table.
The following query creates a calculated Duration column containing the difference between StartTime and EndTime. Since you want to view only a few select columns, project is the better choice in this case.
StormEvents
| where State == 'TEXAS' and EventType == 'Flood'
| top 5 by DamageProperty desc
| project StartTime, EndTime, Duration = EndTime - StartTime, DamageProperty
Output
| StartTime | EndTime | Duration | DamageProperty |
|---|---|---|---|
| 2007-08-18T21:30:00Z | 2007-08-19T23:00:00Z | 1.01:30:00 | 5000000 |
| 2007-06-27T00:00:00Z | 2007-06-27T12:00:00Z | 12:00:00 | 1200000 |
| 2007-06-28T18:00:00Z | 2007-06-28T23:00:00Z | 05:00:00 | 1000000 |
| 2007-06-27T00:00:00Z | 2007-06-27T08:00:00Z | 08:00:00 | 750000 |
| 2007-06-26T20:00:00Z | 2007-06-26T23:00:00Z | 03:00:00 | 750000 |
Looking at the computed Duration column, notice that the flood that caused the most damage was also the longest flood.
Use extend to view the calculated Duration column along with all of the other columns. The Duration column is added as the last column.
StormEvents
| where State == 'TEXAS' and EventType == 'Flood'
| top 5 by DamageProperty desc
| extend Duration = EndTime - StartTime
Output
| StartTime | EndTime | ... | Duration |
|---|---|---|---|
| 2007-08-18T21:30:00Z | 2007-08-19T23:00:00Z | ... | 1.01:30:00 |
| 2007-06-27T00:00:00Z | 2007-06-27T12:00:00Z | ... | 12:00:00 |
| 2007-06-28T18:00:00Z | 2007-06-28T23:00:00Z | ... | 05:00:00 |
| 2007-06-27T00:00:00Z | 2007-06-27T08:00:00Z | ... | 08:00:00 |
| 2007-06-26T20:00:00Z | 2007-06-26T23:00:00Z | ... | 03:00:00 |
Map values from one set to another
Static mapping is a useful technique for changing the presentation of your results. In KQL, you can perform static mapping by using a dynamic dictionary and accessors to map values from one set to another.
let sourceMapping = dynamic(
{
"Emergency Manager" : "Public",
"Utility Company" : "Private"
});
StormEvents
| where Source == "Emergency Manager" or Source == "Utility Company"
| project EventId, Source, FriendlyName = sourceMapping[Source]
Output
| EventId | Source | FriendlyName |
|---|---|---|
| 68796 | Emergency Manager | Public |
| ... | ... | ... |
| 72609 | Utility Company | Private |
| ... | ... | ... |
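As an alternative sketch, the case() function expresses the same mapping inline, without a let statement, and lets you supply a fallback value for unmatched sources:

```kusto
StormEvents
| where Source == "Emergency Manager" or Source == "Utility Company"
| project EventId, Source, FriendlyName = case(
    Source == "Emergency Manager", "Public",
    Source == "Utility Company", "Private",
    "Other")
```

A dynamic dictionary is easier to maintain when the mapping is large or reused across queries; case() is convenient for a handful of inline values.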
Next step
Now that you're familiar with the essentials of writing Kusto queries, go on to the next tutorial and learn how to use aggregation functions to gain deeper insight into your data.