Adx Hol 062022

This document outlines the steps for a hands-on workshop on Azure Synapse Data Explorer (ADX). The workshop will take around 2 hours and cover: creating an ADX cluster; ingesting sample data using one-click ingestion; exploring basic queries; creating an update policy to transform data; ingesting additional data and running advanced queries; building materialized views; and exploring a sample dashboard.


Azure/Synapse Data Explorer

- Upskilling Workshop: Day 2

Hands-on Workshop
Henning Rauch
Kåre Rasmussen

28th June, 2022


ADX in 2 hours – overview
• Step 1 – Download the PDF: https://aka.ms/adx.partner.hol
• Step 2 – Create a Kusto/ADX free cluster
• Step 3 – Ingest sample data using one-click ingestion; play with simple queries
• Step 4 – Create an update policy to transform data
• Step 5 – Clean up raw data; do a full load of the sample SQL Server metrics and the Servers location dimension data
• Step 6 – Run commonly used advanced KQL queries (joins, aggregations, etc.)
• Step 7 – Build a materialized view
• Step 8 – Build and explore a sample dashboard (import the exported dashboard)
ADX in 2 hours
• Step 1 – Create an ADX free cluster & database – https://aka.ms/kustofree
• Prerequisites: an AAD identity or another Microsoft identity (*@outlook.com, etc.)
• Link to create an MS account
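Once the free cluster and database exist, a quick sanity check in the ADX Web UI confirms everything is wired up. This is an optional sketch, not part of the lab script; the free cluster provisions a database for you, so no specific names are assumed:

```kusto
// List the databases on the new free cluster.
.show databases

// Confirm the cluster answers queries.
print Status = "cluster is up", Now = now()
```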
ADX in 2 hours
• Step 2 – Generate DDL and ingest data using one-click ingestion
• Use this dataset with the ‘from blob’ option in one-click ingestion
• Use the “MULTIJSON” format and name the table “RawMetrics”
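Behind the scenes, one-click ingestion generates commands similar to the sketch below. The column schema here is inferred from the Step 4 transformation (which reads fields, name, timestamp, and tags), so treat it as an illustration rather than the exact DDL the wizard produces; the blob URL is a placeholder for the dataset link above:

```kusto
// Sketch of what one-click generates: table schema inferred from the Step 4 transform.
.create table RawMetrics (fields: dynamic, name: string, timestamp: datetime, tags: dynamic)

// Direct ingestion from the sample blob (placeholder URL) in MULTIJSON format.
.ingest into table RawMetrics (h'https://<storage-account>.blob.core.windows.net/<container>/<blob>?<SAS>')
    with (format = 'multijson')
```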
ADX in 2 hours
• Step 3 – Play with the data
• Using this BasicKQLQueries script
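The BasicKQLQueries script is the reference here; as a hedged sketch, warm-up queries along these lines work against the raw table (the `timestamp` column name is an assumption based on the Step 4 transformation):

```kusto
// Peek at a few raw records.
RawMetrics
| take 10

// Total record count.
RawMetrics
| count

// Records per hour, assuming the raw records carry a 'timestamp' column.
RawMetrics
| summarize Events = count() by bin(timestamp, 1h)
| order by timestamp asc
```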
ADX in 2 hours
• Step 4 – Create the transformation logic and apply it via an update policy
• Use the ‘AdvancedKQLQueries’ script to copy the commands below
//Create a function with the transformation logic
.create-or-alter function Transform_RawMetrics() {
    RawMetrics
    | mv-apply kv = fields on
    (
        mv-expand kind=array kv
        | extend SQLMetrics = tostring(kv[0]), Value = todouble(kv[1])
    )
    | project SQLMetrics, Value, MetricType = tostring(split(name, "kube_")[1]), Timestamp = timestamp, Host = tostring(tags.host), MeasurementDbType = tostring(tags.measurement_db_type), SQLInstance = tostring(tags.sql_instance), tags
}

//Create the destination table schema from the (empty) result of the function

.set-or-append TransformedMetrics <| Transform_RawMetrics() | limit 0

//Apply the update policy on the destination table

.alter table TransformedMetrics policy update
@'[{"IsEnabled": true, "Source": "RawMetrics", "Query": "Transform_RawMetrics()", "IsTransactional": true, "PropagateIngestionProperties": false}]'

Copy the commands above from the ‘AdvancedKQLQueries’ script. If copying text from the PDF, leave a blank line between commands and keep each query on a single line
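After running the commands, it is worth confirming the policy is in place. This verification sketch is not in the lab script; note that an update policy only fires on new ingestions into the source table, so the destination stays empty until Step 5 reloads the data:

```kusto
// Verify the update policy is attached to the destination table.
.show table TransformedMetrics policy update

// Expect zero rows until new data lands in RawMetrics (Step 5).
TransformedMetrics
| count
```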
ADX in 2 hours
• Step 5 – Clean up & do a full load of the data
.clear table RawMetrics data
• Ingest the entire dataset from this ‘blob container’ into RawMetrics using one-click ingestion
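As an optional sanity check after the full load (not part of the lab script), both tables should now report rows: RawMetrics from the ingestion itself, and TransformedMetrics populated by the update policy:

```kusto
// Row counts per table after the full load.
union withsource=TableName RawMetrics, TransformedMetrics
| summarize Rows = count() by TableName
```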
ADX in 2 hours
• Step 6 – Advanced queries
• Ingest the dimension table ‘ServersLocation’ from this blob container in ‘MULTIJSON’ format
• Use the ‘AdvancedKQLQueries’ script
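The AdvancedKQLQueries script contains the actual lab queries. As a hedged sketch of the kind of join/aggregation it covers, the example below enriches aggregated metrics with the dimension table; the ServersLocation column names (Host as the join key) are assumptions to verify against the ingested data:

```kusto
// Average metric value per host, enriched with location data.
// Assumes ServersLocation has a Host column matching TransformedMetrics.Host.
TransformedMetrics
| summarize AvgValue = avg(Value) by Host, SQLMetrics
| lookup kind=leftouter ServersLocation on Host
| order by AvgValue desc
```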
ADX in 2 hours
• Step 7 – create materialized views
• Deduplication
• Downsampling
• Last Known Value
//dedup
.create async materialized-view with(backfill=true) TransformedMetricsDedup on table TransformedMetrics
{
    TransformedMetrics
    | summarize take_any(*) by SQLMetrics, Value, MetricType, Timestamp, Host, MeasurementDbType, SQLInstance
}

//downsampling
.create async materialized-view with(backfill=true) TransformedMetricsDownSampling on materialized-view TransformedMetricsDedup
{
    TransformedMetricsDedup
    | summarize Value_avg = avg(Value), Value_min = min(Value), Value_max = max(Value) by SQLMetrics, MetricType, Host, MeasurementDbType, SQLInstance, bin(Timestamp, 1h)
}

//last known value
.create async materialized-view with(backfill=true) TransformedMetricsLastKnownValue on table TransformedMetrics
{
    TransformedMetrics
    | summarize arg_max(Timestamp, Value, tags) by SQLMetrics, MetricType, Host, MeasurementDbType, SQLInstance
}
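Once the views are created, they can be queried in two ways; this usage sketch is not in the lab script. Querying a view by name combines the materialized part with the not-yet-materialized delta (fully up to date), while the `materialized_view()` function reads only the already-materialized part (faster, possibly slightly stale):

```kusto
// Materialized part only: fast, may lag behind the latest ingestions.
materialized_view('TransformedMetricsDedup')
| count

// By name: materialized part plus delta, fully up to date.
TransformedMetricsLastKnownValue
| take 10

// Check materialization health and lag for all views in the database.
.show materialized-views
```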
ADX in 2 hours
• Step 8 – Build and explore the sample dashboard using the exported dashboard
• Navigate to dashboards
• Import this exported dashboard
Advanced
Live Connection Experience
• Step 1 - Create a paid ADX cluster
• Step 2 - Use the data simulator with a destination of your choice (storage or Event Hubs)
• https://github.com/microsoft/TelemetryLogsGeneratorAndBenchmark
• kusto-high-scale-ingestion/README.md at master · Azure-Samples/kusto-high-scale-ingestion (github.com)
• Step 3 - Ingest data into ADX using one-click ingestion or a pre-created data connection
• If using the data-connection route, pre-create the table and mapping (optional)
• Ingest data into a raw staging table
• Adjust the ingestion batching policy (optional):
.alter database SampleLogsHol policy ingestionbatching @'{"MaximumBatchingTimeSpan": "00:00:30", "MaximumNumberOfItems": 500, "MaximumRawDataSizeMB": 1024}'
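For the data-connection route, the table and an optional JSON ingestion mapping need to exist before the connection is created. The sketch below is a hedged example only: the table name, column set, and JSON paths loosely mirror the log schema of the benchmark generator linked above and should be verified against that repo before use:

```kusto
// Pre-create the raw staging table (schema is an assumption; verify against the generator).
.create table RawLogs (Timestamp: datetime, Source: string, Node: string, Level: string, Component: string, ClientRequestId: string, Message: string, Properties: dynamic)

// Optional JSON ingestion mapping the data connection can reference by name.
.create table RawLogs ingestion json mapping 'RawLogsMapping'
'[{"column":"Timestamp","path":"$.Timestamp"},{"column":"Source","path":"$.Source"},{"column":"Level","path":"$.Level"},{"column":"Message","path":"$.Message"},{"column":"Properties","path":"$.Properties"}]'
```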
• Step 4 - Transform using an update policy; land the parsed data in a destination table
• Step 5 - Query the data
• Use commonly used KQL queries
• Build a materialized view (optional)
• Step 6 - Build and explore a sample dashboard
