Azure Developer Intro
Azure supports the most popular programming languages in use today, including
Python, JavaScript, Java, .NET, and Go. With a comprehensive SDK library and extensive
support in tools you already use like VS Code, Visual Studio, IntelliJ, and Eclipse, Azure is
designed to take advantage of skills you already have and make you productive right
away.
Application hosting on Azure - Azure can host your entire application stack from
web applications and APIs to databases to storage services. Azure supports a
variety of hosting models from fully managed services to containers to virtual
machines. When using fully managed Azure services, your applications can take
advantage of the scalability, high availability, and security built into Azure.
While Azure contains over 100 services, this article outlines the Azure services you'll use
most frequently as a developer. For a comprehensive list of all Azure services, see the
Azure documentation hub page.
Azure App Service - Host .NET, Java, Node.js, and Python web applications and APIs in a
fully managed Azure service. You only need to deploy your code to Azure. Azure takes
care of all the infrastructure management like high availability, load balancing, and
autoscaling.

Azure Static Web Apps - Host static web apps built using frameworks like Gatsby, Hugo,
or VuePress, or modern web apps built using Angular, React, Svelte, or Vue. Static web
apps automatically build and deploy based on code changes and feature API integration
with Azure Functions.

Azure Functions - A serverless compute platform for creating small, discrete segments
of code that can be triggered from a variety of different events. Common applications
include building serverless APIs or orchestrating event-driven architectures.

Azure Kubernetes Service - Quickly deploy a production-ready Kubernetes cluster to the
cloud and offload the operational overhead to Azure. Azure handles critical tasks, like
health monitoring and maintenance. You only need to manage and maintain the agent
nodes.

Azure Virtual Machines - Host your app using virtual machines in Azure when you need
more control over your computing environment. Azure VMs offer a flexible, scalable
computing environment for both Linux and Windows virtual machines.
Data

Storage

Azure Blob Storage is a popular service that manages the storage, retrieval, and security
of unstructured blob data.

Azure Blob Storage - Allows your applications to store and retrieve files in the cloud.
Azure Storage is highly scalable to store massive amounts of data, and data is stored
redundantly to ensure high availability.

Azure Data Lake Storage - Designed to support big data analytics by providing scalable,
cost-effective storage for structured, semi-structured, or unstructured data.
Messaging

Here's a list of the most popular services that manage sending, receiving, and routing of
messages from and to apps.

Azure Service Bus - A fully managed enterprise message broker supporting both
point-to-point and publish-subscribe integrations. It's ideal for building decoupled
applications, queue-based load leveling, or facilitating communication between
microservices.

Azure Event Hubs - A managed service that can ingest and process massive data
streams from websites, apps, or devices.

Azure Queue Storage - A simple and reliable queue that can handle large workloads.
Cognitive Services

Azure Cognitive Services is a collection of cloud-based services that allow you to add AI-
based capabilities to your application. Here's a list of popular Cognitive Services.

Speech - Transcribe audible speech into readable, searchable text or convert text to
lifelike speech for more natural interfaces.

Form Recognizer - A document extraction service that understands your forms, allowing
you to quickly extract text and structure from documents.

Cognitive Service for Language - Use natural language processing (NLP) to identify key
phrases and conduct sentiment analysis on text.

QnA Maker - Build a chat bot experience by distilling information into easy-to-navigate
questions and answers.
Other

And finally, here's a list of popular services that support a wide range of workflows,
methodologies, functionalities, and industries.

Azure Key Vault - Every application has secrets, like connection strings and API keys,
that it must store. Azure Key Vault helps you store and access those secrets securely, in
an encrypted vault with restricted access, to make sure your secrets and your
application aren't compromised.
Azure provides a variety of different ways to host your app depending on your needs.
Azure App Service automatically patches and maintains the OS and language
frameworks for you. App Service also supports autoscaling, high availability and
deployment slots so you can spend your time building great apps rather than worrying
about infrastructure concerns.
Azure App Service also supports running containerized web apps. Customized
containers give apps hosted in app service full access to the underlying operating
system and make it possible to host web apps using any application stack while still
taking advantage of features like autoscaling and high availability provided by Azure
App Service.
Static web apps are commonly built using libraries and frameworks like Angular, React,
Svelte, Vue, or Blazor where server-side rendering isn't required. In addition, Azure Static
Web Apps supports the use of a serverless API architecture, either through an
integrated Azure Functions API or by linking to an existing Azure Functions app.
Azure Functions

Azure Functions is a "serverless"-style offering that lets you write just the code you need
to respond to events or run on a schedule. Rather than worrying about building out and
managing a whole application or the infrastructure to run your code, you write just the
code you need to handle the event. With Functions, you can trigger code execution with
HTTP requests, webhooks, cloud service events, or on a schedule. You can code in your
development language of choice, such as C#, F#, Node.js, Python, or PHP. With
consumption-based billing, you pay only for the time that your code executes, and
Azure scales as needed.
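As a sketch of this programming model, here's a minimal HTTP-triggered function using the in-process C# model; the function name and greeting logic are illustrative, not part of the original article.

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class HelloFunction
{
    // Runs whenever an HTTP GET request arrives at /api/Hello.
    [FunctionName("Hello")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Processing an HTTP-triggered request.");
        string name = req.Query["name"];
        return new OkObjectResult($"Hello, {(string.IsNullOrEmpty(name) ? "world" : name)}");
    }
}
```

Swapping the HttpTrigger attribute for, say, a TimerTrigger or ServiceBusTrigger changes the event that invokes the same code.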
Azure Kubernetes Service

Azure Kubernetes Service (AKS) allows you to build and run modern, portable,
microservices-based applications, with support for both stateless and stateful
applications, as teams progress through the adoption of microservices.
Azure Batch
Azure Batch is used to run large-scale parallel and high-performance computing (HPC)
jobs in Azure. Azure Batch creates and manages a pool of compute nodes (virtual
machines), installs the applications you want to run, and schedules jobs to run on the
nodes. There's no cluster or job scheduler software to install, manage, or scale. Instead,
you use Batch APIs and tools, command-line scripts, or the Azure portal to configure,
manage, and monitor your jobs.
Because of the level of control that you have with VMs, you can run a wide range of
server workloads on Azure that don't fit into a PaaS model. For more information, see
the Virtual Machines documentation.
Connect your app to Azure Services
Article • 10/18/2022
Azure offers a variety of services that applications can take advantage of regardless of
whether they are hosted in Azure or on-premises. For example, you could:
Use Azure Blob Storage to store and retrieve files in the cloud.
Add full text searching capability to your application using Azure Cognitive Search.
Use Azure Service Bus to handle messaging between different components of a
microservices architecture.
Use Text Analytics to identify and redact sensitive data in a document.
Azure services offer the benefit of being fully managed by Azure. You can access them in
two ways:
Azure SDK - Available for .NET, Java, JavaScript, Python and Go.
Azure REST API - Available from all languages.
When possible, it is recommended to use the Azure SDK to access Azure services from
application code. Advantages of using the Azure SDK include:
Accessing Azure services is just like using any other library. You import the
appropriate SDK package into your application, create a client object, and then call
methods on the client object to communicate with your Azure resource.
Simplified authentication. When creating an SDK client object, you include the right
credentials and the SDK takes care of authenticating your calls to Azure.
Simplified programming model. Internally, the Azure SDK calls the Azure REST API.
However, the Azure SDK has built-in error handling, retry logic, and result pagination,
making programming against the SDK simpler than calling the REST API directly.
Azure SDK
The Azure SDK allows programmatic access to Azure services from .NET, Java, JavaScript,
Python, and Go applications. Applications install the necessary packages from their
respective package manager and then call methods to programmatically access Azure
resources.
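As an illustration of that pattern, here's a minimal C# sketch; the storage account URL is a placeholder, and this assumes the Azure.Identity and Azure.Storage.Blobs packages are installed.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Storage.Blobs;

class Program
{
    static async Task Main()
    {
        // 1. Create a client object for the Azure resource (URL is a placeholder).
        var client = new BlobServiceClient(
            new Uri("https://<your-account>.blob.core.windows.net"),
            new DefaultAzureCredential());

        // 2. Call methods on the client to work with the resource.
        await foreach (var container in client.GetBlobContainersAsync())
        {
            Console.WriteLine(container.Name);
        }
    }
}
```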
More information about the Azure SDK for each language can be found in each
language's developer center.
.NET - Azure SDK for .NET overview; Azure SDK for .NET package list
Java - Azure SDK for Java overview; Azure SDK for Java package list
JavaScript - Azure SDK for JavaScript overview; Azure SDK for JavaScript package list
Python - Azure SDK for Python overview; Azure SDK for Python package list
Azure provides a variety of tools to create and manage the Azure resources used by
your application.
Different tools are designed to support different use cases, and most Azure developers
use a combination of different tools depending on the job they need to perform. For
example, you might:
Use a GUI tool like the Azure portal or the Azure Tools extension for VS Code
when prototyping Azure resources for a new application. GUI tools guide you
through the process of creating new services and let you review and select the
options for a service using drop-down menus and other graphical elements.
Write a script using the Azure CLI or Azure PowerShell to automate a common
task. For example, you might create a script that creates a basic dev environment
for a new web application consisting of an Azure App Service, a database, and blob
storage. Writing a script ensures the resources are created the same way each time
and is faster to run than clicking through a UI.
Use Infrastructure as Code (IaC) tools to declaratively deploy and manage Azure
resources. Tools like Terraform, Ansible, or Bicep allow you to codify the Azure
resources needed for a solution in declarative syntax, ensuring the consistent
deployment of Azure resources across environments and preventing
environmental drift.
Azure portal
The Azure portal is a web-based interface designed for managing Azure resources.
VS Code

With the Azure Tools extension for VS Code, you can:

Create, manage, and deploy code to web sites using Azure App Service.
Create, browse, and query Azure databases.
Create, debug, and deploy Azure Functions directly from VS Code.
Deploy containerized applications from VS Code.
Azure CLI
The Azure CLI is a cross-platform command-line tool that runs on Windows, Linux, and
macOS. Azure CLI commands are easily incorporated into popular scripting languages
like Bash, giving you the ability to script common tasks.
Azure CLI

LOCATION='eastus'
RESOURCE_GROUP_NAME='msdocs-expressjs-mongodb-tutorial'
WEB_APP_NAME='msdocs-expressjs-mongodb-123'
APP_SERVICE_PLAN_NAME='msdocs-expressjs-mongodb-plan-123'
RUNTIME='NODE|14-lts'

# Create the resource group, App Service plan, and web app
# (the B1 SKU is shown for illustration).
az group create --name $RESOURCE_GROUP_NAME --location $LOCATION
az appservice plan create --name $APP_SERVICE_PLAN_NAME --resource-group $RESOURCE_GROUP_NAME --sku B1 --is-linux
az webapp create --name $WEB_APP_NAME --resource-group $RESOURCE_GROUP_NAME --plan $APP_SERVICE_PLAN_NAME --runtime $RUNTIME
Azure PowerShell
Azure PowerShell is a set of cmdlets for managing Azure resources directly from
PowerShell. Azure PowerShell is installed as a PowerShell module and works with
PowerShell 7.0.6 LTS and PowerShell 7.1.3 or higher on all platforms including Windows,
macOS, and Linux. It's also compatible with Windows PowerShell 5.1.
Azure PowerShell is tightly integrated with the PowerShell language. Commands follow
a verb-noun format and data is returned as PowerShell objects. If you are already
familiar with PowerShell scripting, Azure PowerShell is a natural choice.
Azure PowerShell

$location = 'eastus'
$resourceGroupName = 'msdocs-blob-storage-demo-azps'
$storageAccountName = 'stblobstoragedemo999'

# Create the resource group and storage account
# (the Standard_LRS SKU is shown for illustration).
New-AzResourceGroup -Name $resourceGroupName -Location $location
New-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName -Location $location -SkuName Standard_LRS
For more information on choosing between Azure CLI and Azure PowerShell, see the
article Choose the right command-line tool.
For infrastructure deployments that are automated, repeated, and reliable, Azure
supports a variety of Infrastructure as Code tools.
Bicep
Bicep is a domain-specific language (DSL) that uses declarative syntax to deploy Azure
resources. It provides concise syntax, reliable type safety, and support for code reuse.
Bicep
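The original sample wasn't preserved here; as a stand-in, this short Bicep file (the storage account name is illustrative) shows the declarative syntax:

```bicep
param location string = resourceGroup().location

resource storageAccount 'Microsoft.Storage/storageAccounts@2022-09-01' = {
  name: 'stblobstoragedemo999'
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}
```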
Terraform
Hashicorp Terraform is an open-source tool for provisioning and managing cloud
infrastructure. It codifies infrastructure in configuration files that describe the topology
of cloud resources. The Terraform CLI provides a simple mechanism to deploy and
version configuration files to Azure.
Terraform

provider "azurerm" {
  features {}
}

# Reconstructed around the surviving sku and site_config fragments;
# resource names and the resource group are illustrative.
resource "azurerm_app_service_plan" "example" {
  name                = "example-plan"
  location            = "eastus"
  resource_group_name = "example-rg"
  kind                = "Linux"
  reserved            = true
  sku {
    tier = "Standard"
    size = "S1"
  }
}

resource "azurerm_app_service" "example" {
  name                = "example-app"
  location            = "eastus"
  resource_group_name = "example-rg"
  app_service_plan_id = azurerm_app_service_plan.example.id
  site_config {
    linux_fx_version = "NODE|10.14"
  }
}
Ansible
Ansible is an open-source product that automates cloud provisioning, configuration
management, and application deployments. Using Ansible you can provision virtual
machines, containers, and network and complete cloud infrastructures. Also, Ansible
allows you to automate the deployment and configuration of resources in your
environment.
yml

- hosts: localhost
  connection: local
  vars:
    resource_group: myResourceGroup
    webapp_name: myfirstWebApp
    plan_name: myAppServicePlan
    location: eastus
  tasks:
    - name: Create a resource group
      azure_rm_resourcegroup:
        name: "{{ resource_group }}"
        location: "{{ location }}"
Before you get too far in designing your application to run on Azure, chances are you'll
need to do a little planning ahead of time. As you get started, there are some basic
Azure concepts that you need to understand to make the best decisions for your
scenario. Considerations include:
Azure regions
A region is a set of datacenters deployed within a latency-defined perimeter and
connected through a dedicated regional low-latency network. Azure gives you the
flexibility to deploy applications where you need to, including across multiple regions to
deliver cross-region resiliency when necessary.
Typically, you want all of the resources for a solution to be in the same region to
minimize latency between different components of your application. This means if your
solution consists of an Azure App Service, a database, and Azure Blob storage, all of
these resources should be created in the same Azure region.
Not every Azure service is available in every region. The Products available by region
page can help you find a region where the Azure services needed by your app are
available.
Resource groups

Resource groups are most often used to group together all of the Azure resources
needed for a solution in Azure. For example, say you have a web application deployed to
Azure App Service that uses a SQL database, Azure Storage, and also Azure Key Vault.
It's common practice to put all of the Azure resources needed for this solution into a
single resource group.
This makes it easier to tell what resources are needed for the application to run and
what resources are related to each other. As such, the first step in creating resources for
an app in Azure is usually creating the resource group that will serve as a container for
the app's resources.
Environments
If you've developed on-premises, you are familiar with promoting your code through
dev, test, and production environments. In Azure, to create separate environments you
would create a separate set of Azure resources for each environment you need.
Since it's important that each environment be an exact copy, it's recommended to either
script the creation of resources needed for an environment or use Infrastructure as Code
(IaC) tools to declaratively specify the configuration of each environment. This makes
sure that the environment creation process is repeatable and also gives you the ability to
spin up new environments on demand, for example for performance or security testing
of your application.
DevOps Support
Whether it's publishing your apps to Azure with continuous integration or provisioning
resources for a new environment, Azure integrates with most of the popular DevOps
tools. You can work with the tools that you already have and maximize your existing
experience with support for tools like:
GitHub Actions
Azure DevOps
Octopus Deploy
Jenkins
Terraform
Ansible
Chef
How am I billed?
Article • 10/18/2022
When creating applications that use Azure, you need to understand the factors that
influence the cost of the solutions you create. You will also want to understand how you
can estimate the cost of a solution, how you're billed, and how you can monitor the
costs incurred in your Azure subscriptions.
If you're using an Azure account from your workplace or school, your organization's
Azure administrators have likely assigned different groups and roles to your account
that govern what you can and cannot do in Azure. If you can't create a certain type of
resource, check with your Azure administrator on the permissions assigned to your
account.
Organizations often create multiple Azure subscriptions for billing and management
purposes. For example, an organization may choose to create one subscription for each
department in the organization such that each department pays for their own Azure
resources. When creating Azure resources, it's important to pay attention to what
subscription you're creating the resources in because the owner of that subscription will
pay for those resources.
If you have an individual Azure account tied to your Microsoft account, it's also possible
to have multiple subscriptions. For example, a user might have both a Visual Studio
Enterprise subscription that provides monthly Azure credits and a Pay-as-you-go
subscription which bills to their credit card. In this scenario, you again want to be sure
and choose the right subscription when creating Azure resources to avoid an
unexpected bill for Azure services.
Compute power - Compute power refers to the amount of CPU and memory
assigned to a resource. The more compute power allocated to a resource, the
higher the cost will be. Many Azure services include the ability to elastically scale,
allowing you to ramp up compute power when demand is high but scale back and
save money when demand is low.
Storage amount - Most storage services are billed based on the amount of data
you want to store.
Storage hardware - Some storage services provide options on the type of
hardware your data will be stored on. Depending on the type of data you're
storing, you may want a more long-term storage option with slower read and write
speeds, or you may be willing to pay for low latency read and writes for highly
transactional operations.
Bandwidth - Most services bill ingress and egress separately. Ingress is the amount
of bandwidth required to handle incoming requests. Egress is the amount of
bandwidth required to handle outgoing data that satisfies those requests.
Per use - Some services bill based on the number of times the service is used or a
count of the number of requests that are handled or the number of some entity
(such as Azure Active Directory user accounts) that have been configured.
Per service - Some services simply charge a straight monthly fee.
Region - Sometimes, services have different prices depending on the region (data
center) where it's hosted.
To access billing information in the Azure portal, sign in to the Azure portal and follow
these steps.
You will be taken to the Cost Management + Billing Overview page. On this
page you can:
1. Use the left-hand menu to review Invoices and Payment methods for
your subscriptions.
2. View a list of your subscriptions and their current charges. Selecting a
subscription from the table will take you to detailed cost information
about that subscription.
You can also access the Cost Management + Billing overview page directly.
Azure provides two main tools for monitoring and controlling costs. The first is cost
alerts, which lets you set spending thresholds and receive notifications as your bill nears
those thresholds.
The second is Azure Cost Management which helps you plan for and control your
costs, providing cost analysis, budgets, recommendations, and allows you to
export cost management data for analysis in Excel or your own custom reporting.
Most Azure services let you programmatically control and manage their resources with
REST APIs. Services evolve through new published versions of their APIs with different
contracts that add new features and/or modify their behaviors.
This article outlines the policy that the Azure service, SDK, and CLI teams use for
versioning the Azure REST APIs. While Azure teams make every effort to adhere to this
policy, deviations may occasionally occur.
Service versioning
Each published version of an API is identified by a date value in YYYY-MM-DD format,
called the api-version. Newer versions have later dates.

All API operations require clients to specify a valid API version for the service via the
api-version query string parameter in the URL. For example:
https://management.azure.com/subscriptions?api-version=2020-01-01. Client SDKs and
tools include the api-version value automatically. For more considerations, see the
Client SDKs and service versions section later in this article.
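For example, you can call a versioned endpoint directly with the Azure CLI's az rest command, which attaches your current credentials automatically; this assumes you're already signed in with az login.

```shell
# List subscriptions using an explicit api-version on the REST endpoint.
az rest --method get \
  --url "https://management.azure.com/subscriptions?api-version=2020-01-01"
```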
Usually, published service versions remain available and supported for many years, even
as newer versions become available. In most cases, the only time you should adopt a
new service version within existing code is to take advantage of new features.
Stable versions
Most service versions published are stable versions. Stable versions are backwards
compatible, meaning that any code you write that relies on one version of a service can
adopt a newer stable version without requiring any code changes to maintain
correctness or existing functionality.
Preview versions
Occasionally, Microsoft publishes a preview version of a service to gather feedback
about proposed changes and new features. Preview service versions are identified with
the -preview suffix in their api-version; for example, 2022-07-07-preview.
Unless explicitly intended to introduce a breaking change from the previous stable
version, new preview versions include all the features of the most recent stable version
and add new preview features. However, between preview versions, a service may break
any of the newly added preview features.
Previews aren't intended for long-term use. Anytime a new stable or preview version of
a service becomes available, existing preview versions may become unavailable as early
as 90 days from the availability of the new version. Use preview versions only in
situations where you're actively developing against new service features and you're
prepared to adopt a new, non-preview version soon after it's released. If some features
from a preview version are released in a new stable version, remaining features still in
preview will typically be published in a new preview version.
When you use an SDK to access an Azure service, taking advantage of new versions and
features typically requires upgrading the client library version used by the application.
New stable versions of services are accompanied by new point releases of client
libraries. For new breaking change versions, new client libraries are published as either
point release versions or major release versions. The type of release depends on the
nature of the service's change and the way the library is able to accommodate it. Only
beta-version client libraries use preview service versions.
SDK client libraries support manual overriding of the service version. Overriding a client
library's default service version is an advanced scenario and may lead to unexpected
behavior. If you make use of this feature, test your application thoroughly to ensure it
works as desired.
The Azure command line tools may occasionally expose preview features. These
commands are marked with a Preview label and will output a warning indicating limited
support and potential changes in future tool versions.
Next steps
Azure REST API specifications
Microsoft REST API guidelines
Azure SDK general guidelines
Passwordless connections for Azure
services
Article • 06/02/2023
This article describes the security challenges with passwords and introduces
passwordless connections for Azure services.
Embedding passwords in an application itself presents a huge security risk for many
reasons, including discovery through a code repository. Many developers externalize
such passwords using environment variables so that applications can load them from
different environments. However, this only shifts the risk from the code itself to an
execution environment. Anyone who gains access to the environment can steal
passwords, which in turn, increases your data exfiltration risk.
The following code example demonstrates how to connect to Azure Storage using a
storage account key. Many developers gravitate towards this solution because it feels
familiar to options they've worked with in the past, even though it isn't an ideal solution.
If your application currently uses access keys, consider migrating to passwordless
connections.
C#
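The original snippet wasn't preserved here; the following sketch shows the general shape of key-based access with the Azure.Storage.Blobs library. The account name and key are placeholders.

```csharp
using System;
using Azure.Storage;
using Azure.Storage.Blobs;

// Connecting with an account key: anyone who obtains this key has full
// access to the storage account, which is why passwordless connections
// are preferred.
var client = new BlobServiceClient(
    new Uri("https://<your-account>.blob.core.windows.net"),
    new StorageSharedKeyCredential("<your-account>", "<your-account-key>"));
```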
You can configure passwordless connections to Azure services using Service Connector
or you can configure them manually. Service Connector enables managed identities in
app hosting services like Azure Spring Apps, Azure App Service, and Azure Container
Apps. Service Connector also configures backend services with passwordless
connections using managed identities and Azure RBAC, and hydrates applications with
necessary connection information.
The following video illustrates passwordless connections from apps to Azure services,
using Java applications as an example. Similar coverage for other languages is
forthcoming.
https://www.youtube-nocookie.com/embed/X6nR3AjIwJw
Introducing DefaultAzureCredential
Passwordless connections to Azure services through Azure AD and Role Based Access
control (RBAC) can be implemented using DefaultAzureCredential from the Azure
Identity client libraries.
Important

The order and locations in which DefaultAzureCredential searches for credentials varies
between languages (.NET, C++, Go, Java, JavaScript, and Python).
For example, when working locally with .NET, DefaultAzureCredential will generally
authenticate using the account the developer used to sign in to Visual Studio, the Azure
CLI, or Azure PowerShell. When the app is deployed to Azure, DefaultAzureCredential
will automatically discover and use the managed identity of the associated hosting
service, such as Azure App Service. No code changes are required for this transition.
The following code example demonstrates how to connect to Service Bus using
passwordless connections. Other documentation describes how to migrate to this setup
for a specific service in more detail. A .NET app can pass an instance of
DefaultAzureCredential into the constructor of a service client class.
C#
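A minimal sketch of that pattern with the Azure.Messaging.ServiceBus library; the namespace is a placeholder.

```csharp
using Azure.Identity;
using Azure.Messaging.ServiceBus;

// DefaultAzureCredential resolves a developer credential locally and a
// managed identity when deployed to Azure; no secrets appear in code
// or configuration.
var client = new ServiceBusClient(
    "<your-namespace>.servicebus.windows.net",
    new DefaultAzureCredential());
```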
See also
For a more detailed explanation of passwordless connections, see the developer guide
Configure passwordless connections between multiple Azure apps and services.
Configure passwordless connections
between multiple Azure apps and
services
Article • 02/14/2023
You can read more about best practices and when to use system-assigned identities
versus user-assigned identities in the identities best practice recommendations.
Explore DefaultAzureCredential
Managed identities are generally implemented in your application code through a class
called DefaultAzureCredential from the Azure.Identity client library.
DefaultAzureCredential supports multiple authentication methods and automatically
determines which should be used at runtime. You can read more about this approach in
the DefaultAzureCredential overview.
This tutorial applies to the following architectures, though it can be adapted to many
other scenarios as well through minimal configuration changes.
3. Toggle the Status setting to On to enable a system assigned managed identity for
the service.
5. On the Add role assignment screen, for the Assign access to option, select
Managed identity. Then choose +Select members.
6. In the flyout, search for the managed identity you created by entering the name of
your app service. Select the system assigned identity, and then choose Select to
close the flyout menu.
7. Select Next a couple times until you're able to select Review + assign to finish the
role assignment.
8. Repeat this process for the other services you would like to connect to.
Local development considerations
You can also enable access to Azure resources for local development by assigning roles
to a user account the same way you assigned roles to your managed identity.
1. After assigning the Storage Blob Data Contributor role to your managed identity,
under Assign access to, this time select User, group or service principal. Choose +
Select members to open the flyout menu again.
2. Search for the user@domain account or Azure AD security group you would like to
grant access to by email address or name, and then select it. This should be the
same account you use to sign in to your local development tooling, such as
Visual Studio or the Azure CLI.
Note
You can also assign these roles to an Azure Active Directory security group if you
are working on a team with multiple developers. You can then place any developer
inside that group who needs access to develop the app locally.
Inside of your project, add a reference to the Azure.Identity NuGet package. This
library contains all of the necessary entities to implement DefaultAzureCredential .
You can also add any other Azure libraries that are relevant to your app. For this
example, the Azure.Storage.Blobs and Azure.Security.KeyVault.Keys packages are added
in order to connect to Blob Storage and Key Vault.
.NET CLI
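The package list below assumes the Blob Storage and Key Vault scenario described above:

```shell
dotnet add package Azure.Identity
dotnet add package Azure.Storage.Blobs
dotnet add package Azure.Security.KeyVault.Keys
```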
At the top of your Program.cs file, add the following using statements:
C#
using Azure.Identity;
using Azure.Storage.Blobs;
using Azure.Security.KeyVault.Keys;
In the Program.cs file of your project code, create instances of the necessary
services your app will connect to. The following examples connect to Blob Storage
and Key Vault using the corresponding SDK classes.
C#
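A sketch of the client setup, matching the using statements above; both endpoint URIs are placeholders.

```csharp
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Keys;
using Azure.Storage.Blobs;

// A single DefaultAzureCredential instance can be shared across clients.
var credential = new DefaultAzureCredential();

var blobServiceClient = new BlobServiceClient(
    new Uri("https://<your-storage-account>.blob.core.windows.net"),
    credential);

var keyClient = new KeyClient(
    new Uri("https://<your-key-vault-name>.vault.azure.net"),
    credential);
```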
When this application code runs locally, DefaultAzureCredential will search down a
credential chain for the first available credentials. If the Managed_Identity_Client_ID is
null locally, it will automatically use the credentials from your local Azure CLI or Visual
Studio sign-in. You can read more about this process in the Azure Identity library
overview.
This overall process ensures that your app can run securely locally and in Azure without
the need for any code changes.
To configure this setup in your code, make sure your application registers separate
services to connect to each storage account or database. Make sure to pull in the
correct managed identity client IDs for each service when configuring
DefaultAzureCredential . The following code example configures the following service
connections:
C#
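The original code wasn't preserved; the sketch below shows one way to target a different user-assigned identity per service by setting ManagedIdentityClientId. The client IDs and endpoints are placeholders.

```csharp
using System;
using Azure.Identity;
using Azure.Messaging.ServiceBus;
using Azure.Storage.Blobs;

// Each client authenticates with the user-assigned identity that was
// granted access to that specific resource.
var blobServiceClient = new BlobServiceClient(
    new Uri("https://<your-storage-account>.blob.core.windows.net"),
    new DefaultAzureCredential(new DefaultAzureCredentialOptions
    {
        ManagedIdentityClientId = "<storage-identity-client-id>"
    }));

var serviceBusClient = new ServiceBusClient(
    "<your-namespace>.servicebus.windows.net",
    new DefaultAzureCredential(new DefaultAzureCredentialOptions
    {
        ManagedIdentityClientId = "<service-bus-identity-client-id>"
    }));
```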
These types of scenarios are explored in more depth in the identities best practice
recommendations.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Managed identities for Azure resources is a feature of Azure Active Directory. Each of
the Azure services that support managed identities for Azure resources is subject to
its own timeline. Make sure you review the availability status of managed identities for
your resource and known issues before you begin.
User-assigned identities can be used by multiple resources, and their life cycles are
decoupled from the life cycles of the resources with which they're associated. Read
which resources support managed identities.
This life cycle allows you to separate your resource creation and identity administration
responsibilities. User-assigned identities and their role assignments can be configured in
advance of the resources that require them. Users who create the resources only require
access to assign a user-assigned identity, without the need to create new identities
or role assignments.
As system-assigned identities are created and deleted along with the resource, role
assignments can't be created in advance. This sequence can cause failures while
deploying infrastructure if the user creating the resource doesn't also have access to
create role assignments.
If multiple resources in your infrastructure require access to the same services, a
single user-assigned identity can be assigned to all of them. Administration overhead
is reduced, because there are fewer distinct identities and role assignments to
manage.
If you require that each resource has its own identity, or have resources that require a
unique set of permissions and want the identity to be deleted as the resource is deleted,
then you should use a system-assigned identity.
Scenario: Replicated resources/applications
Recommendation: User-assigned identity
Notes: Resources that carry out the same task – for example, duplicated web servers, or identical functionality running in an app service and in an application on a virtual machine – typically require the same permissions.

Scenario: Audit logging
Recommendation: System-assigned identity
Notes: If you need to log which specific resource carried out an action, rather than which identity, use a system-assigned identity.

Scenario: Permissions lifecycle management
Recommendation: System-assigned identity
Notes: If you require that the permissions for a resource be removed along with the resource, use a system-assigned identity.
The diagram shows four virtual machines with system-assigned identities. Each virtual
machine has the same role assignments that grant them access to two storage
accounts.
When a user-assigned identity is associated with the four virtual machines, only two role
assignments are required, compared to eight with system-assigned identities. If the
virtual machines' identity requires more role assignments, they'll be granted to all the
resources associated with this identity.
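The arithmetic behind this comparison (four virtual machines and two storage accounts, per the diagram described above) can be spelled out:

```python
# Four system-assigned identities each need a role assignment on each
# storage account; one shared user-assigned identity needs only one
# assignment per storage account.
vms = 4
storage_accounts = 2

system_assigned_role_assignments = vms * storage_accounts
user_assigned_role_assignments = storage_accounts

print(system_assigned_role_assignments, user_assigned_role_assignments)  # 8 2
```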
Security groups can also be used to reduce the number of role assignments that are
required. This diagram shows four virtual machines with system-assigned identities,
which have been added to a security group, with the role assignments added to the
group instead of the system-assigned identities. While the result is similar, this
configuration doesn't offer the same Resource Manager template capabilities as user-
assigned identities.
This model provides the flexibility to both use a shared user-assigned identity and apply
granular permissions when needed.
In the example below, “Virtual Machine 3” and “Virtual Machine 4” can access both
storage accounts and key vaults, depending on which user-assigned identity they use
while authenticating.
In the example below, “Virtual Machine 4” has both a user-assigned identity and a
system-assigned identity. The user-assigned identity gives it access to both storage
accounts and key vaults, depending on which identity is used while authenticating.
The role assignments for the system-assigned identity are specific to that virtual
machine.
Limits
View the limits for managed identities and for custom roles and role assignments.
For example, if a managed identity (ClientId = 1234) has been granted read/write access
to StorageAccount7755 and has been assigned to LogicApp3388, then Alice, who does
not have any direct permissions over the managed identity or the storage account but
has permission to execute code within LogicApp3388, can also read/write data to/from
StorageAccount7755 by executing the code that uses the managed identity.
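The transitive access in this example can be modeled with a short Python sketch. The principals and permission strings below are made up for illustration:

```python
# Toy model of the transitive-access example above: Alice has no direct
# permission on the storage account, but can run code inside the logic
# app, which authenticates with the managed identity that does have access.
direct_permissions = {
    "alice": {"execute:LogicApp3388"},
    "managed-identity-1234": {"readwrite:StorageAccount7755"},
}

def effective_permissions(principal):
    perms = set(direct_permissions.get(principal, set()))
    # Executing code in the logic app lets the caller act as its identity.
    if "execute:LogicApp3388" in perms:
        perms |= direct_permissions["managed-identity-1234"]
    return perms

print(sorted(effective_permissions("alice")))
```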
In general, when granting a user administrative access to a resource that can execute
code (such as a logic app) and has a managed identity, consider whether the role being
assigned to the user can install or run code on the resource; if so, only assign that
role if the user really needs it.
Maintenance
System-assigned identities are automatically deleted when the resource is deleted, while
the lifecycle of a user-assigned identity is independent of any resources with which it's
associated.
You'll need to manually delete a user-assigned identity when it's no longer required,
even if no resources are associated with it.
Role assignments that are associated with deleted managed identities will be displayed
with “Identity not found” when viewed in the portal. Read more.
Role assignments that are no longer associated with a user or service principal
appear with an ObjectType value of Unknown. To remove them, you can pipe
several Azure PowerShell commands together: first get all the role assignments, filter
to only those with an ObjectType value of Unknown, and then remove those role
assignments from Azure.
Azure PowerShell
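As an illustration of the get-filter-remove logic just described, here is a Python sketch with made-up sample data (the real workflow uses Azure PowerShell cmdlets against your subscription):

```python
# Sample role assignments, as if returned by a "get all assignments" call.
role_assignments = [
    {"ObjectType": "User", "RoleDefinitionName": "Reader"},
    {"ObjectType": "Unknown", "RoleDefinitionName": "Contributor"},
    {"ObjectType": "ServicePrincipal", "RoleDefinitionName": "Owner"},
]

# Keep only the orphaned assignments (ObjectType is Unknown)...
orphaned = [a for a in role_assignments if a["ObjectType"] == "Unknown"]

# ...and remove each of them (here, from the local list).
remaining = [a for a in role_assignments if a not in orphaned]
print(len(orphaned), len(remaining))  # 1 2
```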
In both cases, for non-human identities such as Azure AD applications and managed
identities, the mechanism by which this authorization information is presented to
the application is not ideal today. The current implementation with Azure AD and
Azure role-based access control (Azure RBAC) uses access tokens issued by Azure AD
for authentication of each identity. If the identity is added to a group or role, this is
expressed as claims in the access token issued by Azure AD. Azure RBAC uses these
claims to further evaluate the authorization rules for allowing or denying access.
Given that the identity's groups and roles are claims in the access token, any
authorization changes do not take effect until the token is refreshed. For a human user
that's typically not a problem, because a user can acquire a new access token by logging
out and in again (or waiting for the token lifetime to expire, which is 1 hour by default).
Managed identity tokens, on the other hand, are cached by the underlying Azure
infrastructure for performance and resiliency purposes: the back-end services for
managed identities maintain a cache per resource URI for around 24 hours. This means
that it can take several hours for changes to a managed identity's group or role
membership to take effect. Today, it is not possible to force a managed identity's token
to be refreshed before its expiry. If you change a managed identity’s group or role
membership to add or remove permissions, you may therefore need to wait several
hours for the Azure resource using the identity to have the correct access.
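The effect of this caching can be modeled with a simple TTL-cache sketch in Python. The 24-hour figure comes from the paragraph above; everything else is illustrative:

```python
CACHE_TTL_HOURS = 24  # approximate cache lifetime described above

class TokenCache:
    """Toy model: claims captured at issue time stay fixed until the
    cache entry for that resource URI expires."""
    def __init__(self):
        self._cache = {}  # resource URI -> (issued_at_hour, claims)

    def get_token(self, resource, now_hours, current_claims):
        entry = self._cache.get(resource)
        if entry and now_hours - entry[0] < CACHE_TTL_HOURS:
            return entry[1]  # stale claims until the entry expires
        self._cache[resource] = (now_hours, set(current_claims))
        return self._cache[resource][1]

cache = TokenCache()
cache.get_token("https://database.windows.net/", 0, {"Reader"})
# A role added at hour 1 doesn't show up: the cached token is reused.
stale = cache.get_token("https://database.windows.net/", 1, {"Reader", "Writer"})
# After the entry expires, the new claims finally appear.
fresh = cache.get_token("https://database.windows.net/", 25, {"Reader", "Writer"})
print("Writer" in stale, "Writer" in fresh)  # False True
```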
If this delay is not acceptable for your requirements, consider alternatives to using
groups or roles in the token. To ensure that changes to permissions for managed
identities take effect quickly, we recommend that you group Azure resources using a
user-assigned managed identity with permissions applied directly to the identity,
instead of adding to or removing managed identities from an Azure AD group that has
permissions. A user-assigned managed identity can be used like a group because it can
be assigned to one or more Azure resources. The assignment operation can be
controlled using the Managed Identity Contributor and Managed Identity Operator roles.
Migrate a .NET application to use
passwordless connections with Azure
SQL Database
Article • 06/01/2023
Application requests to Azure SQL Database must be authenticated. Although there are
multiple options for authenticating to Azure SQL Database, you should prioritize
passwordless connections in your applications when possible. Traditional authentication
methods that use passwords or secret keys create security risks and complications. Visit
the passwordless connections for Azure services hub to learn more about the
advantages of moving to passwordless connections. The following tutorial explains how
to migrate an existing application to connect to Azure SQL Database to use
passwordless connections instead of a username and password solution.
For this migration guide, ensure you have an Azure AD admin assigned to your Azure
SQL Database.
3. In the Azure Active Directory flyout menu, search for the user you want to assign
as admin.
Sign in to Azure
For local development, make sure you're signed in with the same Azure AD account you
want to use to access Azure SQL Database. You can authenticate via popular
development tools, such as the Azure CLI or Azure PowerShell. The development tools
with which you can authenticate vary across languages.
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
1. In the Azure portal, browse to your SQL database and select Query editor (preview).
2. Select Continue as <your-username> on the right side of the screen to sign into the
database using your account.
SQL
Running these commands assigns the SQL DB Contributor role to the account
specified. This role allows the identity to read, write, and modify the data and
schema of your database. For more information about the roles assigned, see
Fixed-database roles.
Your existing application code continues to work with passwordless connections.
However, you must update your database connection string to use the passwordless
format. For example, the following code works with both SQL authentication and
passwordless connections:
C#
string connectionString =
app.Configuration.GetConnectionString("AZURE_SQL_CONNECTIONSTRING")!;
1. Locate your connection string. For local development with .NET applications, this is
usually stored in one of the following locations:
2. Replace the connection string value with the following passwordless format.
Update the <database-server-name> and <database-name> placeholders with your
own values:
JSON
Server=tcp:<database-server-name>.database.windows.net,1433;Initial Catalog=<database-name>;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";
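As a sanity check, the format above can be assembled programmatically. This Python helper is purely illustrative; the server and database names below are placeholders, not real resources:

```python
def passwordless_connection_string(server_name: str, database: str) -> str:
    """Build an Azure SQL connection string in the passwordless
    ('Active Directory Default') format shown above."""
    return (
        f"Server=tcp:{server_name}.database.windows.net,1433;"
        f"Initial Catalog={database};"
        "Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
        'Authentication="Active Directory Default";'
    )

print(passwordless_connection_string("contoso-server", "contoso-db"))
```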
Azure portal
1. At the top of the Azure portal, search for Managed identities. Select the
Managed Identities result.
2. Select + Create at the top of the Managed Identities overview page.
3. On the Basics tab, enter the following values:
After the resource is created, select Go to resource to view the details of the
managed identity.
Azure portal
Complete the following steps in the Azure portal to associate the user-assigned
managed identity with your app. These same steps apply to the following Azure
services:
3. Select + Add to open the Add user assigned managed identity flyout.
5. Search for the MigrationIdentity by name and select it from the search results.
1. In the Azure portal, browse to your SQL database and select Query editor
(preview).
2. Select Continue as <username> on the right side of the screen to sign into the
database using your account.
SQL
Running these commands assigns the SQL DB Contributor role to the user-
assigned managed identity. This role allows the identity to read, write, and modify
the data and schema of your database.
Important
You can read more about configuring database roles and security on the following
resources:
1. Navigate to the configuration page of your App Service instance and locate the
Azure SQL Database connection string.
2. Select the edit icon and update the connection string value to match the following
format. Replace the <database-server-name> and <database-name> placeholders
with the values of your own service.
JSON
Server=tcp:<database-server-name>.database.windows.net,1433;Initial Catalog=<database-name>;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";
3. Save your changes and restart the application if it does not do so automatically.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Passwordless overview
Managed identity best practices
Tutorial: Secure a database in Azure SQL Database
Authorize database access to SQL Database
Migrate a Java application to use
passwordless connections with Azure
SQL Database
Article • 05/30/2023
This article explains how to migrate from traditional authentication methods to more
secure, passwordless connections with Azure SQL Database.
Azure AD authentication
Microsoft Azure AD authentication is a mechanism for connecting to Azure SQL
Database using identities defined in Azure AD. With Azure AD authentication, you can
manage database user identities and other Microsoft services in a central location, which
simplifies permission management.
Although it's possible to connect to Azure SQL Database with passwords, you should
use them with caution. You must be diligent to never expose the passwords in an
unsecure location. Anyone who gains access to the passwords is able to authenticate.
For example, there's a risk that a malicious user can access the application if a
connection string is accidentally checked into source control, sent through an unsecure
email, pasted into the wrong chat, or viewed by someone who shouldn't have
permission. Instead, consider updating your application to use passwordless
connections.
Many Azure services support passwordless connections, for example via Azure Managed
Identity. These techniques provide robust security features that you can implement
using DefaultAzureCredential from the Azure Identity client libraries. In this tutorial,
you'll learn how to update an existing application to use DefaultAzureCredential
instead of alternatives such as connection strings.
DefaultAzureCredential supports multiple authentication methods and determines
which should be used at runtime. This approach enables your app to use different
authentication methods in different environments (local dev vs. production)
without implementing environment-specific code.
The order and locations in which DefaultAzureCredential searches for credentials can
be found in the Azure Identity library overview. For example, when working locally,
DefaultAzureCredential will generally authenticate using the account the developer
used to sign in to Visual Studio. When the app is deployed to Azure,
DefaultAzureCredential will automatically switch to use a managed identity. No code
changes are required for this transition.
To ensure that connections are passwordless, you must take into consideration both
local development and the production environment. If a connection string is required in
either place, then the application isn't passwordless.
In your local development environment, you can authenticate with Azure CLI, Azure
PowerShell, Visual Studio, or Azure plugins for Visual Studio Code or IntelliJ. In this case,
you can use that credential in your application instead of configuring properties.
Note
Since the JDBC driver for Azure SQL Database doesn't support passwordless
connections from local environments yet, this article will focus only on applications
deployed to Azure hosting environments and how to migrate them to use
passwordless connections.
Bash
export AZ_RESOURCE_GROUP=<YOUR_RESOURCE_GROUP>
export AZ_DATABASE_SERVER_NAME=<YOUR_DATABASE_SERVER_NAME>
export AZ_DATABASE_NAME=demo
export CURRENT_USERNAME=$(az ad signed-in-user show --query userPrincipalName --output tsv)
export CURRENT_USER_OBJECTID=$(az ad signed-in-user show --query id --output tsv)
Replace the placeholders with the following values, which are used throughout this
article:
<YOUR_RESOURCE_GROUP> : The name of the resource group your resources are in.
If you're using Azure CLI, run the following command to make sure it has sufficient
permission:
Bash
Azure CLI
This command will set the Azure AD admin to the current signed-in user.
Note
You can only create one Azure AD admin per Azure SQL Database server. Selection
of another one will overwrite the existing Azure AD admin configured for the
server.
Java
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.5.4</version>
</dependency>
Java
String url = "jdbc:sqlserver://$AZ_DATABASE_SERVER_NAME.database.windows.net:1433;databaseName=$AZ_DATABASE_NAME;authentication=ActiveDirectoryMSI;";
Connection con = DriverManager.getConnection(url);
In this section, you'll execute two steps to enable your application to run in an Azure
hosting environment in a passwordless way:
Note
Azure also provides Service Connector, which can help you connect your hosting
service with SQL server. If you use Service Connector to configure your hosting
environment, you can omit the step of assigning roles to your managed identity,
because Service Connector does it for you. The following section describes how
to configure your Azure hosting environment in two ways: via Service Connector,
or by configuring each hosting environment directly.
Important
App Service
1. On the main overview page of your Azure App Service instance, select Identity
from the navigation pane.
2. On the System assigned tab, make sure to set the Status field to on. A system
assigned identity is managed by Azure internally and handles administrative
tasks for you. The details and IDs of the identity are never exposed in your
code.
You can also assign managed identity on an Azure hosting environment using the Azure
CLI.
App Service
You can assign a managed identity to an Azure App Service instance with the az
webapp identity assign command, as shown in the following example:
Azure CLI
Next, grant permissions to the managed identity you created to access your SQL
database.
Service Connector
If you connected your services using Service Connector, the previous step's
commands already assigned the role, so you can skip this step.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Authorize access to blob data with managed identities for Azure resources.
Authorize access to blobs using Azure Active Directory
Migrate a Node.js application to use
passwordless connections with Azure
SQL Database
Article • 06/08/2023
Application requests to Azure SQL Database must be authenticated. Although there are
multiple options for authenticating to Azure SQL Database, you should prioritize
passwordless connections in your applications when possible. Traditional authentication
methods that use passwords or secret keys create security risks and complications. Visit
the passwordless connections for Azure services hub to learn more about the
advantages of moving to passwordless connections.
For this migration guide, ensure you have an Azure AD admin assigned to your Azure
SQL Database.
3. In the Azure Active Directory flyout menu, search for the user you want to assign
as admin.
Sign in to Azure
For local development, make sure you're signed in with the same Azure AD account you
want to use to access Azure SQL Database. You can authenticate via popular
development tools, such as the Azure CLI or Azure PowerShell. The development tools
with which you can authenticate vary across languages.
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
1. In the Azure portal, browse to your SQL database and select Query editor (preview).
2. Select Continue as <your-username> on the right side of the screen to sign into the
database using your account.
SQL
Running these commands assigns the SQL DB Contributor role to the account
specified. This role allows the identity to read, write, and modify the data and
schema of your database. For more information about the roles assigned, see
Fixed-database roles.
ini
AZURE_SQL_SERVER=<YOURSERVERNAME>.database.windows.net
AZURE_SQL_DATABASE=<YOURDATABASENAME>
AZURE_SQL_PORT=1433
2. Existing application code that connects to Azure SQL Database using the Node.js
SQL Driver - tedious continues to work with passwordless connections with minor
changes. To use a user-assigned managed identity, pass the authentication.type
and options.clientId properties.
Node.js
import sql from 'mssql';

// Passwordless configuration
const config = {
  server,
  port,
  database,
  authentication: {
    type: 'azure-active-directory-default',
  },
  options: {
    encrypt: true,
    clientId: process.env.AZURE_CLIENT_ID // user-assigned managed identity
  }
};
export default class Database {
  config = {};
  poolconnection = null;
  connected = false;

  constructor(config) {
    this.config = config;
    console.log(`Database: config: ${JSON.stringify(config)}`);
  }

  async connect() {
    try {
      console.log(`Database connecting...${this.connected}`);
      if (this.connected === false) {
        this.poolconnection = await sql.connect(this.config);
        this.connected = true;
        console.log('Database connection successful');
      } else {
        console.log('Database already connected');
      }
    } catch (error) {
      console.error(`Error connecting to database: ${JSON.stringify(error)}`);
    }
  }

  async disconnect() {
    try {
      this.poolconnection.close();
      console.log('Database connection closed');
    } catch (error) {
      console.error(`Error closing database connection: ${error}`);
    }
  }

  async executeQuery(query) {
    await this.connect();
    const request = this.poolconnection.request();
    const result = await request.query(query);
    return result.rowsAffected[0];
  }
}
Passwordless overview
Managed identity best practices
Managed identities in Azure AD for Azure SQL
Azure portal
1. At the top of the Azure portal, search for Managed identities. Select the
Managed Identities result.
2. Select + Create at the top of the Managed Identities overview page.
3. On the Basics tab, enter the following values:
After the resource is created, select Go to resource to view the details of the
managed identity.
Azure portal
Complete the following steps in the Azure portal to associate the user-assigned
managed identity with your app. These same steps apply to the following Azure
services:
3. Select + Add to open the Add user assigned managed identity flyout.
5. Search for the MigrationIdentity by name and select it from the search results.
1. In the Azure portal, browse to your SQL database and select Query editor
(preview).
2. Select Continue as <username> on the right side of the screen to sign into the
database using your account.
SQL
Running these commands assigns the SQL DB Contributor role to the user-
assigned managed identity. This role allows the identity to read, write, and modify
the data and schema of your database.
Important
You can read more about configuring database roles and security on the following
resources:
Node.js
const config = {
  server,
  port,
  database,
  authentication: {
    type: 'azure-active-directory-default'
  },
  options: {
    encrypt: true
  }
};
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Passwordless overview
Managed identity best practices
Tutorial: Secure a database in Azure SQL Database
Authorize database access to SQL Database
Migrate a Python application to use
passwordless connections with Azure
SQL Database
Article • 06/01/2023
Application requests to Azure SQL Database must be authenticated. Although there are
multiple options for authenticating to Azure SQL Database, you should prioritize
passwordless connections in your applications when possible. Traditional authentication
methods that use passwords or secret keys create security risks and complications. Visit
the passwordless connections for Azure services hub to learn more about the
advantages of moving to passwordless connections. The following tutorial explains how
to migrate an existing Python application to connect to Azure SQL Database to use
passwordless connections instead of a username and password solution.
For this migration guide, ensure you have an Azure AD admin assigned to your Azure
SQL Database.
3. In the Azure Active Directory flyout menu, search for the user you want to assign
as admin.
Sign in to Azure
For local development, make sure you're signed in with the same Azure AD account you
want to use to access Azure SQL Database. You can authenticate via popular
development tools, such as the Azure CLI or Azure PowerShell. The development tools
with which you can authenticate vary across languages.
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
1. In the Azure portal, browse to your SQL database and select Query editor (preview).
2. Select Continue as <your-username> on the right side of the screen to sign into the
database using your account.
SQL
Running these commands assigns the SQL DB Contributor role to the account
specified. This role allows the identity to read, write, and modify the data and
schema of your database. For more information about the roles assigned, see
Fixed-database roles.
Python
import os
import struct

import pyodbc
from azure import identity

connection_string = os.environ["AZURE_SQL_CONNECTIONSTRING"]

def get_all():
    with get_conn() as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT * FROM Persons")
        # Do something with the data
        return

def get_conn():
    credential = identity.DefaultAzureCredential(exclude_interactive_browser_credential=False)
    token_bytes = credential.get_token("https://database.windows.net/.default").token.encode("UTF-16-LE")
    token_struct = struct.pack(f'<I{len(token_bytes)}s', len(token_bytes), token_bytes)
    SQL_COPT_SS_ACCESS_TOKEN = 1256  # This connection option is defined by Microsoft in msodbcsql.h
    conn = pyodbc.connect(connection_string, attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct})
    return conn
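The struct packing in get_conn prefixes the UTF-16-LE token bytes with their 4-byte little-endian length, which is the layout the ODBC driver expects for the access-token connection attribute. You can verify the packing in isolation with a dummy token (no real credential involved):

```python
import struct

# "dummy-token" is 11 characters, so 22 bytes in UTF-16-LE.
token_bytes = "dummy-token".encode("UTF-16-LE")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

# The first 4 bytes hold the byte length of the token, little-endian;
# the token bytes follow immediately after.
length = struct.unpack_from("<I", token_struct)[0]
print(length)                           # 22
print(token_struct[4:] == token_bytes)  # True
```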
Tip
Passwordless overview
Managed identity best practices
Azure portal
1. At the top of the Azure portal, search for Managed identities. Select the
Managed Identities result.
2. Select + Create at the top of the Managed Identities overview page.
3. On the Basics tab, enter the following values:
After the resource is created, select Go to resource to view the details of the
managed identity.
Azure portal
Complete the following steps in the Azure portal to associate the user-assigned
managed identity with your app. These same steps apply to the following Azure
services:
3. Select + Add to open the Add user assigned managed identity flyout.
5. Search for the MigrationIdentity by name and select it from the search results.
1. In the Azure portal, browse to your SQL database and select Query editor
(preview).
2. Select Continue as <username> on the right side of the screen to sign into the
database using your account.
SQL
Running these commands assigns the SQL DB Contributor role to the user-
assigned managed identity. This role allows the identity to read, write, and modify
the data and schema of your database.
Important
You can read more about configuring database roles and security on the following
resources:
Note
The example connection code shown in this migration guide uses the
DefaultAzureCredential class when deployed. Specifically, it uses the
DefaultAzureCredential without passing the user-assigned managed identity client
ID to the constructor. In this scenario, the fallback is to check for the
AZURE_CLIENT_ID environment variable. If the AZURE_CLIENT_ID environment
variable doesn't exist, a system-assigned managed identity will be used if
configured.
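The selection order the note describes can be summarized as a small helper. This is a sketch of the documented fallback behavior, not Azure Identity code; AZURE_CLIENT_ID is the real environment variable name, everything else is illustrative:

```python
def resolve_managed_identity(explicit_client_id=None, env=None):
    """Mimic the fallback order described above: an explicitly passed
    client ID wins, then the AZURE_CLIENT_ID environment variable, and
    finally the system-assigned identity (represented here as None)."""
    env = env or {}
    if explicit_client_id:
        return explicit_client_id
    return env.get("AZURE_CLIENT_ID")  # None means system-assigned

print(resolve_managed_identity(None, {"AZURE_CLIENT_ID": "11111111-aaaa"}))
```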
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Passwordless overview
Managed identity best practices
Tutorial: Secure a database in Azure SQL Database
Authorize database access to SQL Database
Migrate an application to use
passwordless connections with Azure
Cosmos DB for NoSQL
Article • 06/01/2023
Azure CLI
az cosmosdb sql role definition create \
    --account-name <cosmosdb-account-name> \
    --resource-group <resource-group-name> \
    --body '{
        "RoleName": "PasswordlessReadWrite",
        "Type": "CustomRole",
        "AssignableScopes": ["/"],
        "Permissions": [{
            "DataActions": [
                "Microsoft.DocumentDB/databaseAccounts/readMetadata",
                "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/*",
                "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
            ]
        }]
    }'
2. When the command completes, copy the ID value from the name field and paste it
somewhere for later use.
3. Assign the role you created to the user account or service principal that will
connect to Cosmos DB. During local development, this will generally be your own
account that's logged into a development tool like Visual Studio or the Azure CLI.
Retrieve the details of your account using the az ad user command.
Azure CLI
4. Copy the value of the id property out of the results and paste it somewhere for
later use.
5. Assign the custom role you created to your user account using the az cosmosdb
sql role assignment create command and the IDs you copied previously.
Azure CLI
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
.NET
.NET CLI
C#
using Azure.Identity;
C#
Azure portal
1. At the top of the Azure portal, search for Managed identities. Select the
Managed Identities result.
2. Select + Create at the top of the Managed Identities overview page.
3. On the Basics tab, enter the following values:
You need to configure your web app to use the managed identity you created. Assign
the identity to your app using either the Azure portal or the Azure CLI.
Azure portal
Complete the following steps in the Azure portal to associate an identity with your
app. These same steps apply to the following Azure services:
4. Select + Add to open the Add user assigned managed identity flyout.
6. Search for the MigrationIdentity by name and select it from the search results.
To assign a role at the resource level using the Azure CLI, you first must retrieve the
resource ID using the az cosmosdb show command. You can filter the output properties
using the --query parameter.
Azure CLI
az cosmosdb show \
--resource-group '<resource-group-name>' \
--name '<cosmosdb-name>' \
--query id
Copy the output ID from the preceding command. You can then assign roles using the
az role assignment command of the Azure CLI.
Azure CLI
1. On the managed identity overview page, copy the client ID value to your clipboard.
.NET
C#
3. Redeploy your code to Azure after making this change in order for the
configuration updates to be applied.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
The following tutorial explains how to migrate an existing application to connect using
passwordless connections. These same migration steps should apply whether you're
using access keys, connection strings, or another secrets-based approach.
The following example assigns the Azure Event Hubs Data Sender and Azure Event
Hubs Data Receiver roles to your user account. These roles grant read and write
access to event hub messages.
Azure portal
1. In the Azure portal, locate your event hub using the main search bar or left
navigation.
2. On the event hub overview page, select Access control (IAM) from the left-
hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Event Hubs Data Sender and select the matching result and
then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
9. Repeat these steps for the Azure Event Hubs Data Receiver role to allow the
account to send and receive messages.
Important
In most cases, it will take a minute or two for the role assignment to propagate in
Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and try
again.
Azure CLI
Sign in to Azure through the Azure CLI by using the following command:
Azure CLI
az login
.NET
.NET CLI
C#
using Azure.Identity;
C#
4. Make sure to update the Event Hubs namespace in the URI of your
EventHubProducerClient or EventProcessorClient objects. You can find the
namespace name on the overview page in the Azure portal.
Passwordless Overview
Managed identity best practices
Azure portal
1. At the top of the Azure portal, search for Managed identities. Select the
Managed Identities result.
2. Select + Create at the top of the Managed Identities overview page.
3. On the Basics tab, enter the following values:
After the resource is created, select Go to resource to view the details of the
managed identity.
Complete the following steps in the Azure portal to associate an identity with your
app. These same steps apply to the following Azure services:
4. Select + Add to open the Add user assigned managed identity flyout.
6. Search for the MigrationIdentity by name and select it from the search results.
Azure portal
1. Navigate to your event hub overview page and select Access Control (IAM)
from the left navigation.
3. In the Role search box, search for Azure Event Hubs Data Sender, which is a
common role used to send data to an event hub. You can assign
whatever role is appropriate for your use case. Select Azure Event Hubs
Data Sender from the list and choose Next.
4. On the Add role assignment screen, for the Assign access to option, select
Managed identity. Then choose +Select members.
5. In the flyout, search for the managed identity you created by name and select
it from the results. Choose Select to close the flyout menu.
6. Select Next a couple of times until you're able to select Review + assign to finish
the role assignment.
7. Repeat these steps for the Azure Event Hubs Data Receiver role.
1. On the managed identity overview page, copy the client ID value to your clipboard.
.NET
client ID.
C#
3. Redeploy your code to Azure after making this change in order for the
configuration updates to be applied.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
This article explains how to migrate from traditional authentication methods to more
secure, passwordless connections with Azure Event Hubs for Kafka.
Application requests to Azure Event Hubs for Kafka must be authenticated. Azure Event
Hubs for Kafka provides different ways for apps to connect securely. One of the ways is
to use a connection string. However, you should prioritize passwordless connections in
your applications when possible.
Passwordless connections have been supported since Spring Cloud Azure 4.3.0. This article is a
migration guide for removing credentials from Spring Cloud Stream Kafka applications.
Azure AD authentication
Microsoft Azure AD authentication is a mechanism for connecting to Azure Event Hubs
for Kafka using identities defined in Azure AD. With Azure AD authentication, you can
manage service principal identities and other Microsoft services in a central location,
which simplifies permission management.
SAS authentication
Event Hubs also provides Shared Access Signatures (SAS) for delegated access to Event
Hubs for Kafka resources.
Although it's possible to connect to Azure Event Hubs for Kafka with SAS, it should be
used with caution. You must be diligent to never expose the connection strings in an
unsecure location. Anyone who gains access to the connection strings is able to
authenticate. For example, there's a risk that a malicious user can access the application
if a connection string is accidentally checked into source control, sent through an
unsecure email, pasted into the wrong chat, or viewed by someone who shouldn't have
permission. Instead, authorizing access using the OAuth 2.0 token-based mechanism
provides superior security and ease of use over SAS. Consider updating your application
to use passwordless connections.
Many Azure services support passwordless connections, for example via Azure Managed
Identity. These techniques provide robust security features that you can implement
using DefaultAzureCredential from the Azure Identity client libraries. In this tutorial,
you'll learn how to update an existing application to use DefaultAzureCredential
instead of alternatives such as connection strings.
The order and locations in which DefaultAzureCredential searches for credentials can
be found in the Azure Identity library overview. For example, when working locally,
DefaultAzureCredential will generally authenticate using the account the developer
used to sign in to Visual Studio. When the app is deployed to Azure,
DefaultAzureCredential will automatically switch to use a managed identity. No code
changes are required for this transition.
In your local development environment, you can authenticate with Azure CLI, Azure
PowerShell, Visual Studio, or Azure plugins for Visual Studio Code or IntelliJ. In this case,
you can use that credential in your application instead of configuring properties.
Note
Bash
export AZ_RESOURCE_GROUP=<YOUR_RESOURCE_GROUP>
export AZ_EVENTHUBS_NAMESPACE_NAME=<YOUR_EVENTHUBS_NAMESPACE_NAME>
export AZ_EVENTHUB_NAME=<YOUR_EVENTHUB_NAME>
Replace the placeholders with the following values, which are used throughout this
article:
<YOUR_RESOURCE_GROUP> : The name of the resource group your resources are in.
<YOUR_EVENTHUBS_NAMESPACE_NAME> : The name of the Event Hubs namespace you'll use.
<YOUR_EVENTHUB_NAME> : The name of the event hub you'll use.
Azure portal
1. In the Azure portal, locate your Event Hubs namespace using the main search
bar or left navigation.
2. On the Event Hubs overview page, select Access control (IAM) from the left-
hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Event Hubs Data Sender and Azure Event Hubs Data Receiver,
select the matching results, and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
For more information about granting access roles, see Authorize access to Event Hubs
resources using Azure Active Directory.
Azure CLI
Sign in to Azure through the Azure CLI by using the following command:
Azure CLI
az login
Next, use the following steps to update your Spring Kafka application to use
passwordless connections. Although conceptually similar, each framework uses different
implementation details.
Java
1. Inside your project, open the pom.xml file and add the following reference:
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.6.0</version>
</dependency>
2. After migration, implement AuthenticateCallbackHandler and
OAuthBearerToken in your project for OAuth2 authentication, as shown in
the following example.
Java
@Override
public void configure(Map<String, ?> configs, String mechanism,
List<AppConfigurationEntry> jaasConfigEntries) {
TokenRequestContext request =
buildTokenRequestContext(configs);
this.resolveToken = tokenCredential ->
tokenCredential.getToken(request).map(OAuthBearerTokenImp::new);
}
@SuppressWarnings("unchecked")
private URI buildEventHubsServerUri(Map<String, ?> configs) {
    String bootstrapServer = Arrays.asList(configs.get(BOOTSTRAP_SERVERS_CONFIG)).get(0).toString();
    bootstrapServer = bootstrapServer.replaceAll("\\[|\\]", "");
    URI uri = URI.create("https://" + bootstrapServer);
    return uri;
}
@Override
public void close() {
// NOOP
}
}
Java
@Override
public String value() {
return accessToken.getToken();
}
@Override
public Long startTimeMs() {
return claims.getIssueTime().getTime();
}
@Override
public long lifetimeMs() {
return claims.getExpirationTime().getTime();
}
@Override
public Set<String> scope() {
    // Referring to https://docs.microsoft.com/azure/active-directory/develop/access-tokens#payload-claims,
    // the scp claim is a String which is presented as a space separated list.
    return Optional.ofNullable(claims.getClaim("scp"))
            .map(s -> Arrays.stream(((String) s).split(" "))
                    .collect(Collectors.toSet()))
            .orElse(null);
}
@Override
public String principalName() {
return (String) claims.getClaim("upn");
}
3. When you create your Kafka producer or consumer, add the configuration
needed to support the SASL/OAUTHBEARER mechanism. The following
examples show what your code should look like before and after migration. In
both examples, replace the <eventhubs-namespace> placeholder with the name
of your Event Hubs namespace.
Before migration, your code should look like the following example:
Java
String.format("org.apache.kafka.common.security.plain.PlainLoginModule required username=\"$ConnectionString\" password=\"%s\";", connectionString));
return new KafkaProducer<>(properties);
After migration, your code should look like the following example. In this
example, replace the <path-to-your-KafkaOAuth2AuthenticateCallbackHandler>
placeholder with the full class name for your implemented
KafkaOAuth2AuthenticateCallbackHandler .
Java
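After migration, the producer configuration replaces the connection-string JAAS setting with the SASL/OAUTHBEARER properties. The following is a minimal sketch using only java.util.Properties; the namespace and callback handler class are placeholder parameters you supply, and the property keys are the standard Apache Kafka SASL settings:

```java
import java.util.Properties;

public class EventHubsOAuthConfigSketch {
    // Builds Kafka producer properties for SASL/OAUTHBEARER against
    // Azure Event Hubs for Kafka. Both arguments are placeholders:
    // namespace is your Event Hubs namespace, and callbackHandlerClass is
    // the full class name of your KafkaOAuth2AuthenticateCallbackHandler.
    public static Properties oauthBearerProperties(String namespace, String callbackHandlerClass) {
        Properties props = new Properties();
        // Event Hubs for Kafka listens on port 9093 of the namespace endpoint.
        props.put("bootstrap.servers", namespace + ".servicebus.windows.net:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "OAUTHBEARER");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;");
        props.put("sasl.login.callback.handler.class", callbackHandlerClass);
        return props;
    }
}
```

You would then pass the returned Properties to new KafkaProducer<>(...), mirroring the pre-migration example.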
After making these code changes, run your application locally. The new configuration
should pick up your local credentials, assuming you're logged into a compatible IDE or
command line tool, such as the Azure CLI, Visual Studio, or IntelliJ. The roles you
assigned to your local dev user in Azure will allow your app to connect to the Azure
service locally.
In this section, you'll execute two steps to enable your application to run in an Azure
hosting environment in a passwordless way:
Azure also provides Service Connector, which can help you connect your hosting
service with Event Hubs. When you use Service Connector to configure your hosting
environment, you can omit the step of assigning roles to your managed identity
because Service Connector does it for you. The following section describes how
to configure your Azure hosting environment in two ways: one via Service
Connector and the other by configuring each hosting environment directly.
Important
The following steps show you how to assign a system-assigned managed identity for
various web hosting services. The managed identity can securely connect to other Azure
Services using the app configurations you set up previously.
App Service
1. On the main overview page of your Azure App Service instance, select Identity
from the navigation pane.
2. On the System assigned tab, make sure to set the Status field to on. A system
assigned identity is managed by Azure internally and handles administrative
tasks for you. The details and IDs of the identity are never exposed in your
code.
You can also assign a managed identity to an Azure hosting environment by using the
Azure CLI.
App Service
You can assign a managed identity to an Azure App Service instance with the az
webapp identity assign command, as shown in the following example.
Azure CLI
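As a sketch, the command looks like the following; both values are placeholders for your own resource group and App Service names:

```azurecli
az webapp identity assign \
    --resource-group <resource-group-name> \
    --name <app-service-name>
```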
Next, grant permissions to the managed identity you created to access your Event Hubs
namespace. You can grant permissions by assigning a role to the managed identity, just
like you did with your local development user.
Service Connector
If you connected your services using Service Connector, you don't need to
complete this step. The following necessary configurations were handled for you:
If you chose to use a connection string, the connection string was added as an
app environment variable.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Authorize access to blob data with managed identities for Azure resources
Authorize access to blobs using Azure Active Directory
Migrate an application to use
passwordless connections with Azure
Database for MySQL
Article • 05/30/2023
This article explains how to migrate from traditional authentication methods to more
secure, passwordless connections with Azure Database for MySQL.
Azure AD authentication
Microsoft Azure AD authentication is a mechanism for connecting to Azure Database for
MySQL using identities defined in Azure AD. With Azure AD authentication, you can
manage database user identities and other Microsoft services in a central location, which
simplifies permission management.
Although it's possible to connect to Azure Database for MySQL with passwords, you
should use them with caution. You must be diligent to never expose the passwords in an
unsecure location. Anyone who gains access to the passwords is able to authenticate.
For example, there's a risk that a malicious user can access the application if a
connection string is accidentally checked into source control, sent through an unsecure
email, pasted into the wrong chat, or viewed by someone who shouldn't have
permission. Instead, consider updating your application to use passwordless
connections.
Many Azure services support passwordless connections, for example via Azure Managed
Identity. These techniques provide robust security features that you can implement
using DefaultAzureCredential from the Azure Identity client libraries. In this tutorial,
you'll learn how to update an existing application to use DefaultAzureCredential
instead of alternatives such as connection strings.
DefaultAzureCredential supports multiple authentication methods and
determines which should be used at runtime. This approach enables your app to use
different authentication methods in different environments (local dev vs. production)
without implementing environment-specific code.
The order and locations in which DefaultAzureCredential searches for credentials can
be found in the Azure Identity library overview. For example, when working locally,
DefaultAzureCredential will generally authenticate using the account the developer
used to sign in to Visual Studio. When the app is deployed to Azure,
DefaultAzureCredential will automatically switch to use a managed identity. No code
changes are required for this transition.
To ensure that connections are passwordless, you must take into consideration both
local development and the production environment. If a connection string is required in
either place, then the application isn't passwordless.
In your local development environment, you can authenticate with Azure CLI, Azure
PowerShell, Visual Studio, or Azure plugins for Visual Studio Code or IntelliJ. In this case,
you can use that credential in your application instead of configuring properties.
Note
Bash
export AZ_RESOURCE_GROUP=<YOUR_RESOURCE_GROUP>
export AZ_DATABASE_SERVER_NAME=<YOUR_DATABASE_SERVER_NAME>
export AZ_DATABASE_NAME=demo
export AZ_MYSQL_AD_NON_ADMIN_USERNAME=<YOUR_AZURE_AD_NON_ADMIN_USER_DISPLAY_NAME>
export AZ_MYSQL_AD_MI_USERNAME=<YOUR_AZURE_AD_MI_DISPLAY_NAME>
export AZ_USER_IDENTITY_NAME=<YOUR_USER_ASSIGNED_MANAGED_IDENTITY_NAME>
export CURRENT_USERNAME=$(az ad signed-in-user show --query userPrincipalName --output tsv)
export CURRENT_USER_OBJECTID=$(az ad signed-in-user show --query id --output tsv)
Replace the placeholders with the following values, which are used throughout this
article:
<YOUR_RESOURCE_GROUP> : The name of the resource group your resources are in.
<YOUR_DATABASE_SERVER_NAME> : The name of your MySQL server, which should be
unique across Azure.
<YOUR_AZURE_AD_NON_ADMIN_USER_DISPLAY_NAME> : The display name of your Azure AD
non-admin user. Make sure the name is a valid user in your Azure AD tenant.
<YOUR_AZURE_AD_MI_DISPLAY_NAME> : The display name of the Azure AD user for your
managed identity. Make sure the name is a valid user in your Azure AD tenant.
<YOUR_USER_ASSIGNED_MANAGED_IDENTITY_NAME> : The name of your user-assigned
managed identity, which should be unique across Azure.
If you're using Azure CLI, run the following command to make sure it has sufficient
permission:
Bash
Run the following command to create the user-assigned identity:
Azure CLI
az identity create \
--resource-group $AZ_RESOURCE_GROUP \
--name $AZ_USER_IDENTITY_NAME
Important
Run the following command to assign the identity to the MySQL server for creating the
Azure AD admin:
Azure CLI
Azure CLI
This command will set the Azure AD admin to the current signed-in user.
Note
You can only create one Azure AD admin per MySQL server. Selecting another
one will overwrite the existing Azure AD admin configured for the server.
You can skip this step if you're using Bash because the flexible-server create
command already detected your local IP address and set it on the MySQL server.
If you're connecting to your MySQL server from Windows Subsystem for Linux (WSL) on
a Windows computer, you need to add the WSL host ID to your firewall. Obtain the IP
address of your host machine by running the following command in WSL:
Bash
cat /etc/resolv.conf
Copy the IP address following the term nameserver , then use the following command to
set an environment variable for the WSL IP address:
Bash
AZ_WSL_IP_ADDRESS=<the-copied-IP-address>
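The two manual steps above can also be combined into a single command. The following sketch assumes a standard resolv.conf layout with a nameserver <ip> line, and is demonstrated against a sample file so the extracted value is visible:

```shell
# first_nameserver prints the IP of the first "nameserver" entry in a file.
first_nameserver() {
    awk '/^nameserver/ {print $2; exit}' "$1"
}

# Demonstrate against a sample file; in WSL you would pass /etc/resolv.conf.
printf 'nameserver 172.22.0.1\n' > /tmp/resolv_sample.conf
AZ_WSL_IP_ADDRESS=$(first_nameserver /tmp/resolv_sample.conf)
echo "$AZ_WSL_IP_ADDRESS"
```

In WSL, you would run AZ_WSL_IP_ADDRESS=$(first_nameserver /etc/resolv.conf) instead of copying the address by hand.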
Then, use the following command to open the server's firewall to your WSL-based app:
Azure CLI
Create a SQL script called create_ad_user.sql for creating a non-admin user. Add the
following contents and save it locally:
Bash
Then, use the following command to run the SQL script to create the Azure AD non-
admin user:
Bash
Now use the following command to remove the temporary SQL script file:
Bash
rm create_ad_user.sql
Note
You can read more detailed information about creating MySQL users in Create
users in Azure Database for MySQL.
Azure CLI
Sign in to Azure through the Azure CLI by using the following command:
Azure CLI
az login
Next, use the following steps to update your code to use passwordless connections.
Although conceptually similar, each language uses different implementation details.
Java
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity-extensions</artifactId>
<version>1.0.0</version>
</dependency>
2. Enable the Azure MySQL authentication plugin in the JDBC URL. Identify the
locations in your code that currently create a java.sql.Connection to connect
to Azure Database for MySQL. Update url and user in your
application.properties file to match the following values:
properties
url=jdbc:mysql://$AZ_DATABASE_SERVER_NAME.mysql.database.azure.com:3306/$AZ_DATABASE_NAME?serverTimezone=UTC&sslMode=REQUIRED&defaultAuthenticationPlugin=com.azure.identity.extensions.jdbc.mysql.AzureMysqlAuthenticationPlugin&authenticationPlugins=com.azure.identity.extensions.jdbc.mysql.AzureMysqlAuthenticationPlugin
user=$AZ_MYSQL_AD_NON_ADMIN_USERNAME
Note
properties
url=jdbc:mysql://$AZ_DATABASE_SERVER_NAME.mysql.database.azure.com:3306/$AZ_DATABASE_NAME?serverTimezone=UTC&sslMode=REQUIRED&authenticationPlugins=com.azure.identity.extensions.jdbc.mysql.AzureMysqlAuthenticationPlugin
user=$AZ_MYSQL_AD_NON_ADMIN_USERNAME
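If you configure the connection in code rather than in application.properties, the same URL can be assembled programmatically. The helper below is purely illustrative (the class and method names are not part of any Azure library); it only concatenates the pieces shown above:

```java
public class MySqlPasswordlessUrlSketch {
    private static final String PLUGIN =
            "com.azure.identity.extensions.jdbc.mysql.AzureMysqlAuthenticationPlugin";

    // Builds a JDBC URL with the Azure MySQL authentication plugin enabled,
    // matching the url property shown above.
    public static String buildUrl(String serverName, String databaseName) {
        return "jdbc:mysql://" + serverName + ".mysql.database.azure.com:3306/" + databaseName
                + "?serverTimezone=UTC&sslMode=REQUIRED"
                + "&authenticationPlugins=" + PLUGIN;
    }
}
```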
In this section, you'll execute two steps to enable your application to run in an Azure
hosting environment in a passwordless way:
Note
Azure also provides Service Connector, which can help you connect your hosting
service with MySQL. When you use Service Connector to configure your hosting
environment, you can omit the step of assigning roles to your managed identity
because Service Connector does it for you. The following section describes how
to configure your Azure hosting environment in two ways: one via Service
Connector and the other by configuring each hosting environment directly.
Important
App Service
1. On the main overview page of your Azure App Service instance, select Identity
from the navigation pane.
2. On the System assigned tab, make sure to set the Status field to on. A system
assigned identity is managed by Azure internally and handles administrative
tasks for you. The details and IDs of the identity are never exposed in your
code.
You can also assign a managed identity to an Azure hosting environment by using the
Azure CLI.
App Service
You can assign a managed identity to an Azure App Service instance with the az
webapp identity assign command, as shown in the following example:
Azure CLI
Next, grant permissions to the managed identity you assigned to access your MySQL
instance.
These steps will create an Azure AD user for the managed identity and grant all
permissions for the database $AZ_DATABASE_NAME to it. You can change the database
name $AZ_DATABASE_NAME to fit your needs.
First, create a SQL script called create_ad_user.sql for creating the Azure AD user for
your managed identity. Add the following contents and save it locally:
Bash
Then, use the following command to run the SQL script to create the Azure AD non-
admin user:
Bash
Now use the following command to remove the temporary SQL script file:
Bash
rm create_ad_user.sql
Java
Update your code to use the user created for the managed identity:
Java
properties.put("user", "$AZ_MYSQL_AD_MI_USERNAME");
After making these code changes, you can build and redeploy the application. Then,
browse to your hosted application in the browser. Your app should be able to connect
to the MySQL database successfully. Keep in mind that it may take several minutes for
the role assignments to propagate through your Azure environment. Your application is
now configured to run both locally and in a production environment without the
developers having to manage secrets in the application itself.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Authorize access to blob data with managed identities for Azure resources.
Authorize access to blobs using Azure Active Directory
Migrate an application to use
passwordless connections with Azure
Database for PostgreSQL
Article • 05/30/2023
This article explains how to migrate from traditional authentication methods to more
secure, passwordless connections with Azure Database for PostgreSQL.
Azure AD authentication
Microsoft Azure AD authentication is a mechanism for connecting to Azure Database for
PostgreSQL using identities defined in Azure AD. With Azure AD authentication, you can
manage database user identities and other Microsoft services in a central location, which
simplifies permission management.
Although it's possible to connect to Azure Database for PostgreSQL with passwords, you
should use them with caution. You must be diligent to never expose the passwords in an
unsecure location. Anyone who gains access to the passwords is able to authenticate.
For example, there's a risk that a malicious user can access the application if a
connection string is accidentally checked into source control, sent through an unsecure
email, pasted into the wrong chat, or viewed by someone who shouldn't have
permission. Instead, consider updating your application to use passwordless
connections.
Many Azure services support passwordless connections, for example via Azure Managed
Identity. These techniques provide robust security features that you can implement
using DefaultAzureCredential from the Azure Identity client libraries. In this tutorial,
you'll learn how to update an existing application to use DefaultAzureCredential
instead of alternatives such as connection strings.
determines which should be used at runtime. This approach enables your app to use
different authentication methods in different environments (local dev vs. production)
without implementing environment-specific code.
The order and locations in which DefaultAzureCredential searches for credentials can
be found in the Azure Identity library overview. For example, when working locally,
DefaultAzureCredential will generally authenticate using the account the developer
used to sign in to Visual Studio. When the app is deployed to Azure,
DefaultAzureCredential will automatically switch to use a managed identity. No code
changes are required for this transition.
To ensure that connections are passwordless, you must take into consideration both
local development and the production environment. If a connection string is required in
either place, then the application isn't passwordless.
In your local development environment, you can authenticate with Azure CLI, Azure
PowerShell, Visual Studio, or Azure plugins for Visual Studio Code or IntelliJ. In this case,
you can use that credential in your application instead of configuring properties.
Note
Bash
export AZ_RESOURCE_GROUP=<YOUR_RESOURCE_GROUP>
export AZ_DATABASE_SERVER_NAME=<YOUR_DATABASE_SERVER_NAME>
export AZ_DATABASE_NAME=demo
export AZ_POSTGRESQL_AD_NON_ADMIN_USERNAME=<YOUR_AZURE_AD_NON_ADMIN_USER_DISPLAY_NAME>
export AZ_LOCAL_IP_ADDRESS=<YOUR_LOCAL_IP_ADDRESS>
export CURRENT_USERNAME=$(az ad signed-in-user show --query userPrincipalName --output tsv)
Replace the placeholders with the following values, which are used throughout this
article:
<YOUR_RESOURCE_GROUP> : The name of the resource group your resources are in.
<YOUR_DATABASE_SERVER_NAME> : The name of your PostgreSQL server, which should
be unique across Azure.
<YOUR_AZURE_AD_NON_ADMIN_USER_DISPLAY_NAME> : The display name of your Azure AD
non-admin user. Make sure the name is a valid user in your Azure AD tenant.
<YOUR_LOCAL_IP_ADDRESS> : The IP address of your local computer, from which you'll
run your Spring Boot application. One convenient way to find it is to open
whatismyip.akamai.com.
To set up an Azure AD administrator after creating the server, follow the steps in
Manage Azure Active Directory roles in Azure Database for PostgreSQL - Flexible Server.
Note
Azure Database for PostgreSQL instances are secured by default. They have a firewall
that doesn't allow any incoming connection. To be able to use your database, you need
to add a firewall rule that will allow the local IP address to access the database server.
Because you configured your local IP address at the beginning of this article, you can
open the server's firewall by running the following command:
Azure CLI
Obtain the IP address of your host machine by running the following command in WSL:
Bash
cat /etc/resolv.conf
Copy the IP address following the term nameserver , then use the following command to
set an environment variable for the WSL IP Address:
Bash
AZ_WSL_IP_ADDRESS=<the-copied-IP-address>
Then, use the following command to open the server's firewall to your WSL-based app:
Azure CLI
Create a SQL script called create_ad_user_local.sql for creating a non-admin user. Add
the following contents and save it locally:
Bash
Bash
psql "host=$AZ_DATABASE_SERVER_NAME.postgres.database.azure.com user=$CURRENT_USERNAME dbname=postgres port=5432 password=$(az account get-access-token --resource-type oss-rdbms --output tsv --query accessToken) sslmode=require" < create_ad_user_local.sql
Now use the following command to remove the temporary SQL script file:
Bash
rm create_ad_user_local.sql
Note
You can read more detailed information about creating PostgreSQL users in Create
users in Azure Database for PostgreSQL.
Azure CLI
Sign in to Azure through the Azure CLI by using the following command:
Azure CLI
az login
Next, use the following steps to update your code to use passwordless connections.
Although conceptually similar, each language uses different implementation details.
Java
1. Inside your project, add the following reference to the azure-identity-
extensions package. This library contains all of the entities necessary to
implement passwordless connections.
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity-extensions</artifactId>
<version>1.0.0</version>
</dependency>
2. Enable the Azure PostgreSQL authentication plugin in the JDBC URL. Identify the
locations in your code that currently create a java.sql.Connection to connect
to Azure Database for PostgreSQL. Update url and user in your
application.properties file to match the following values:
properties
url=jdbc:postgresql://$AZ_DATABASE_SERVER_NAME.postgres.database.azure.com:5432/$AZ_DATABASE_NAME?sslmode=require&authenticationPluginClassName=com.azure.identity.extensions.jdbc.postgresql.AzurePostgresqlAuthenticationPlugin
user=$AZ_POSTGRESQL_AD_NON_ADMIN_USERNAME
In this section, you'll execute two steps to enable your application to run in an Azure
hosting environment in a passwordless way:
Note
Azure also provides Service Connector, which can help you connect your hosting
service with PostgreSQL. When you use Service Connector to configure your hosting
environment, you can omit the step of assigning roles to your managed identity
because Service Connector does it for you. The following section describes how
to configure your Azure hosting environment in two ways: one via Service
Connector and the other by configuring each hosting environment directly.
Important
The following steps show you how to assign a system-assigned managed identity for
various web hosting services. The managed identity can securely connect to other Azure
Services using the app configurations you set up previously.
App Service
1. On the main overview page of your Azure App Service instance, select Identity
from the navigation pane.
2. On the System assigned tab, make sure to set the Status field to on. A system
assigned identity is managed by Azure internally and handles administrative
tasks for you. The details and IDs of the identity are never exposed in your
code.
You can also assign a managed identity to an Azure hosting environment by using the
Azure CLI.
App Service
You can assign a managed identity to an Azure App Service instance with the az
webapp identity assign command, as shown in the following example:
Azure CLI
Next, grant permissions to the managed identity you assigned to access your
PostgreSQL instance.
Service Connector
If you connected your services using Service Connector, the previous step's
commands already assigned the role, so you can skip this step.
Java
Update your code to use the user created for the managed identity:
Note
Java
properties.put("user", "$AZ_POSTGRESQL_AD_MI_USERNAME");
After making these code changes, you can build and redeploy the application. Then,
browse to your hosted application in the browser. Your app should be able to connect
to the PostgreSQL database successfully. Keep in mind that it may take several minutes
for the role assignments to propagate through your Azure environment. Your
application is now configured to run both locally and in a production environment
without the developers having to manage secrets in the application itself.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Authorize access to blob data with managed identities for Azure resources.
Authorize access to blobs using Azure Active Directory
Migrate an application to use
passwordless connections with Azure
Service Bus
Article • 06/12/2023
Application requests to Azure Service Bus must be authenticated using either account
access keys or passwordless connections. However, you should prioritize passwordless
connections in your applications when possible. This tutorial explores how to migrate
from traditional authentication methods to more secure, passwordless connections.
.NET
C#
Connection strings should be used with caution. Developers must be diligent to never
expose the keys in an insecure location. Anyone who gains access to the key is able to
authenticate. For example, if an account key is accidentally checked into source control,
sent through an insecure email, pasted into the wrong chat, or viewed by someone who
shouldn't have permission, there's a risk of a malicious user accessing the application.
Instead, consider updating your application to use passwordless connections.
Managed identities are typically implemented through the DefaultAzureCredential class,
which supports multiple authentication methods and automatically determines which
should be used at runtime. This approach enables your app to use different
authentication methods in different environments (local dev vs. production)
without implementing environment-specific code.
The order and locations in which DefaultAzureCredential searches for credentials are
described in the Azure Identity library overview and vary between languages. For
example, when working locally with .NET, DefaultAzureCredential will generally
authenticate using the account the developer used to sign in to Visual Studio, the Azure
CLI, or Azure PowerShell. When the app is deployed to Azure, DefaultAzureCredential will
automatically discover and use the managed identity of the associated hosting service,
such as Azure App Service. No code changes are required for this transition.
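Conceptually, DefaultAzureCredential behaves like a chain that tries each credential source in order and returns the first one that succeeds. The following sketch (in Java, with hypothetical source names) illustrates only the chaining idea, not the actual Azure Identity implementation:

```java
import java.util.List;
import java.util.Optional;
import java.util.function.Supplier;

public class CredentialChain {
    // Try each credential source in order; a source returns an empty
    // Optional when it is unavailable in the current environment.
    public static Optional<String> resolve(List<Supplier<Optional<String>>> sources) {
        for (Supplier<Optional<String>> source : sources) {
            Optional<String> credential = source.get();
            if (credential.isPresent()) {
                return credential; // first available credential wins
            }
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        // Locally there is no managed identity, so the chain falls
        // through to the developer's CLI sign-in.
        Optional<String> credential = resolve(List.of(
                Optional::empty,                      // managed identity: unavailable
                () -> Optional.of("azure-cli-token")  // Azure CLI sign-in
        ));
        System.out.println(credential.orElse("no credential found")); // prints azure-cli-token
    }
}
```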
Note
The following code example demonstrates how to connect to Service Bus using
passwordless connections. The next section describes how to migrate to this setup for a
specific service in more detail.
C#
In this scenario, you'll assign permissions to your user account scoped to a specific
Service Bus namespace, to follow the Principle of Least Privilege. This practice gives
users only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Azure Service Bus Data Owner role to your user
account, which allows you to send and receive data.
Important
In most cases it will take a minute or two for the role assignment to propagate in
Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and try
again.
Azure portal
1. In the Azure portal, locate your Service Bus namespace using the main search
bar or left navigation.
2. On the Service Bus overview page, select Access control (IAM) from the left-
hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result and
then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
For local development, make sure you're authenticated with the same Azure AD account
you assigned the role to. You can authenticate via popular development tools, such as
the Azure CLI or Azure PowerShell. The development tools with which you can
authenticate vary across languages.
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
.NET
.NET CLI
C#
using Azure.Identity;
C#
var fullyQualifiedNamespace = "<your-service-bus-namespace>.servicebus.windows.net";
ServiceBusClient client = new(
    fullyQualifiedNamespace,
    new DefaultAzureCredential());
The following steps demonstrate how to create a system-assigned managed identity for
various web hosting services. The managed identity can securely connect to other Azure
Services using the app configurations you set up previously.
Service Connector
Some app hosting environments support Service Connector, which helps you
connect Azure compute services to other backing services. Service Connector
automatically configures network settings and connection information. You can
learn more about Service Connector and which scenarios are supported on the
overview page.
For this migration guide you'll use App Service, but the steps are similar on Azure
Spring Apps and Azure Container Apps.
Note
Azure Spring Apps currently only supports Service Connector using connection
strings.
1. On the main overview page of your App Service, select Service Connector
from the left navigation.
2. Select + Create from the top menu and the Create connection panel will
open. Enter the following values:
4. Leave the default values selected, and then choose Next: Review + Create.
Alternatively, you can also enable managed identity on an Azure hosting environment
using the Azure CLI.
Service Connector
You can use Service Connector to create a connection between an Azure compute
hosting environment and a target service using the Azure CLI. The CLI automatically
handles creating a managed identity and assigns the proper role, as explained in
the portal instructions.
If you're using an Azure App Service, use the az webapp connection command:
Azure CLI
If you're using Azure Spring Apps, use the az spring connection command:
Azure CLI
Azure CLI
Next, you need to grant permissions to the managed identity you created to access your
Service Bus. You can do this by assigning a role to the managed identity, just like you
did with your local development user.
Service Connector
If you connected your services using Service Connector, you don't need to
complete this step. The necessary configurations were handled for you:
If you selected a managed identity while creating the connection, a system-
assigned managed identity was created for your app and assigned the Azure
Service Bus Data Owner role on the Service Bus.
If you selected connection string, the connection string was added as an app
environment variable.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
Migrate an application to use
passwordless connections with Azure
Blob Storage
Article • 05/10/2023
The following tutorial explains how to migrate an existing application to connect using
passwordless connections. These same migration steps should apply whether you're
using access keys, connection strings, or another secrets-based approach.
In this scenario, you'll assign permissions to your user account, scoped to the storage
account, to follow the Principle of Least Privilege. This practice gives users only the
minimum permissions needed and creates more secure production environments.
The following example will assign the Storage Blob Data Contributor role to your user
account, which provides both read and write access to blob data in your storage
account.
Important
In most cases it will take a minute or two for the role assignment to propagate in
Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and try
again.
Azure portal
1. In the Azure portal, locate your storage account using the main search bar or
left navigation.
2. On the storage account overview page, select Access control (IAM) from the
left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Storage Blob Data Contributor and select the matching result and
then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
.NET
.NET CLI
C#
using Azure.Identity;
4. Make sure to update the storage account name in the URI of your
BlobServiceClient. You can find the storage account name on the overview page
of the Azure portal.
Azure portal
1. At the top of the Azure portal, search for Managed identities. Select the
Managed Identities result.
2. Select + Create at the top of the Managed Identities overview page.
3. On the Basics tab, enter the following values:
After the resource is created, select Go to resource to view the details of the
managed identity.
Azure portal
Complete the following steps in the Azure portal to associate an identity with your
app. These same steps apply to the following Azure services:
6. Search for the MigrationIdentity by name and select it from the search results.
Azure portal
1. Navigate to your storage account overview page and select Access Control
(IAM) from the left navigation.
3. In the Role search box, search for Storage Blob Data Contributor, which is a
common role used to manage data operations for blobs. You can assign
whatever role is appropriate for your use case. Select the Storage Blob Data
Contributor from the list and choose Next.
4. On the Add role assignment screen, for the Assign access to option, select
Managed identity. Then choose +Select members.
5. In the flyout, search for the managed identity you created by name and select
it from the results. Choose Select to close the flyout menu.
6. Select Next a couple times until you're able to select Review + assign to finish
the role assignment.
1. On the managed identity overview page, copy the client ID value to your clipboard.
.NET
C#
3. Redeploy your code to Azure after making this change in order for the
configuration updates to be applied.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Authorize access to blobs using Azure Active Directory
To learn more about .NET Core, see Get started with .NET in 10 minutes .
Migrate an application to use
passwordless connections with Azure
Queue Storage
Article • 05/10/2023
The following tutorial explains how to migrate an existing application to connect using
passwordless connections. These same migration steps should apply whether you're
using access keys, connection strings, or another secrets-based approach.
The following example assigns the Storage Queue Data Contributor role to your user
account. This role grants read and write access to queue data in your storage account.
Azure portal
1. In the Azure portal, locate your storage account using the main search bar or
left navigation.
2. On the storage account overview page, select Access control (IAM) from the
left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Storage Queue Data Contributor and select the matching result and
then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Important
In most cases, it will take a minute or two for the role assignment to propagate in
Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and try
again.
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
.NET
Go
Java
Node.js
Python
.NET
1. To use DefaultAzureCredential in a .NET application, install the
Azure.Identity package:
.NET CLI
C#
using Azure.Identity;
3. Identify the locations in your code that create a QueueClient object to connect
to Azure Queue Storage. Update your code to match the following example:
C#
4. Make sure to update the storage account name in the URI of your QueueClient
object. You can find the storage account name on the overview page of the Azure
portal.
Run the app locally
After making these code changes, run your application locally. The new configuration
should pick up your local credentials, such as your Azure CLI, Visual Studio, or IntelliJ
sign-in. The roles you assigned to your user in Azure allow your app to connect to the
Azure service locally.
Passwordless Overview
Managed identity best practices
Azure portal
1. At the top of the Azure portal, search for Managed identities. Select the
Managed Identities result.
2. Select + Create at the top of the Managed Identities overview page.
3. On the Basics tab, enter the following values:
You need to configure your web app to use the managed identity you created. Assign
the identity to your app using either the Azure portal or the Azure CLI.
Azure portal
Complete the following steps in the Azure portal to associate an identity with your
app. These same steps apply to the following Azure services:
4. Select + Add to open the Add user assigned managed identity flyout.
6. Search for the MigrationIdentity by name and select it from the search results.
Azure portal
1. Navigate to your storage account overview page and select Access Control
(IAM) from the left navigation.
3. In the Role search box, search for Storage Queue Data Contributor, which is a
common role used to manage data operations for queues. You can assign
whatever role is appropriate for your use case. Select the Storage Queue Data
Contributor from the list and choose Next.
4. On the Add role assignment screen, for the Assign access to option, select
Managed identity. Then choose +Select members.
5. In the flyout, search for the managed identity you created by name and select
it from the results. Choose Select to close the flyout menu.
6. Select Next a couple times until you're able to select Review + assign to finish
the role assignment.
1. On the managed identity overview page, copy the client ID value to your clipboard.
.NET
C#
3. Redeploy your code to Azure after making this change in order for the
configuration updates to be applied.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Authorize access to blobs using Azure Active Directory
To learn more about .NET, see Get started with .NET in 10 minutes .
Tutorial: Create a passwordless
connection to a database service via
Service Connector
Article • 08/02/2023
Passwordless connections use managed identities to access Azure services. With this
approach, you don't have to manually track and manage secrets for managed identities.
These tasks are securely handled internally by Azure.
Service Connector enables managed identities in app hosting services like Azure Spring
Apps, Azure App Service, and Azure Container Apps. Service Connector also configures
database services, such as Azure Database for PostgreSQL, Azure Database for MySQL,
and Azure SQL Database, to accept managed identities.
In this tutorial, you use the Azure CLI to complete the following tasks:
Prerequisites
Azure CLI version 2.48.1 or higher.
An Azure account with an active subscription. Create an Azure account for free .
An app deployed to Azure App Service in a region supported by Service
Connector.
Set up environment
Account
Sign in with the Azure CLI via az login . If you're using Azure Cloud Shell or are already
logged in, confirm your authenticated account with az account show .
Network connectivity
If your database server is in a virtual network, make sure that the environment running
the Azure CLI commands can access the server in that virtual network.
Azure CLI
If you use:
Azure Spring Apps, use az spring connection create instead. For more examples,
see Connect Azure Spring Apps to the Azure database.
Azure Container Apps, use az containerapp connection create instead. For more
examples, see Create and connect a PostgreSQL database with identity
connectivity.
Note
If you use the Azure portal, go to the Service Connector blade of Azure App
Service, Azure Spring Apps, or Azure Container Apps, and select Create to create
a connection. The Azure portal will automatically compose the command for you
and trigger the command execution on Cloud Shell.
The following Azure CLI commands use a --client-type parameter. Run az webapp
connection create postgres-flexible -h to get the supported client types, and choose
the one that matches your application.
Azure CLI
This Service Connector command completes the following tasks in the background:
Enable the system-assigned managed identity, or assign a user-assigned managed
identity, for the app $APPSERVICE_NAME hosted by Azure App Service, Azure Spring
Apps, or Azure Container Apps.
Set the Azure Active Directory admin to the current signed-in user.
Add a database user for the system-assigned managed identity, user-assigned
managed identity, or service principal. Grant all privileges of the database
$DATABASE_NAME to this user. The username can be found in the connection string
Troubleshooting
Permission
If you encounter any permission-related errors, confirm the Azure CLI signed-in user
with the command az account show . Make sure you log in with the correct account.
Next, confirm that you have the permissions required to create a passwordless
connection with Service Connector.
In some cases, not all of these permissions are required. For example, if the Azure CLI-
authenticated user is already an Active Directory administrator on the SQL server, you
don't need to have the Microsoft.Sql/servers/administrators/write permission.
Service Connector needs to access Azure Active Directory to get information about your
account and the managed identity of the hosting service. You can use the following
command to check whether your device can access Azure Active Directory:
Azure CLI
az ad signed-in-user show
If you aren't logged in interactively, you might also get the error Interactive
authentication is needed. To resolve the error, log in with the az login command.
Connect to database with Azure Active
Directory authentication
After creating the connection, you can use the connection string in your application to
connect to the database with Azure Active Directory authentication. For example, you
can use the following solutions to connect to the database with Azure Active Directory
authentication.
Java
XML
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.3.6</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity-extensions</artifactId>
<version>1.1.5</version>
</dependency>
2. Get the connection string from environment variables and add the plugin
name to connect to the database:
Java
import java.sql.*;
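As an illustrative sketch of that step, the snippet below appends the authentication plugin parameter to a JDBC URL taken from an environment variable. Both the AZURE_POSTGRESQL_CONNECTIONSTRING variable name and the plugin class name are assumptions based on the azure-identity-extensions dependency shown above; verify them against your Service Connector configuration:

```java
public class PasswordlessJdbcUrl {
    // Plugin class provided by azure-identity-extensions (assumed name;
    // check the exact class for your library version).
    public static final String PLUGIN =
            "com.azure.identity.extensions.jdbc.postgresql.AzurePostgresqlAuthenticationPlugin";

    // Append the plugin parameter, respecting any existing query string.
    public static String withPlugin(String jdbcUrl) {
        String separator = jdbcUrl.contains("?") ? "&" : "?";
        return jdbcUrl + separator + "authenticationPluginClassName=" + PLUGIN;
    }

    public static void main(String[] args) {
        String url = System.getenv().getOrDefault(
                "AZURE_POSTGRESQL_CONNECTIONSTRING",
                "jdbc:postgresql://<server>.postgres.database.azure.com:5432/<database>?sslmode=require");
        // A real app would then pass withPlugin(url) to DriverManager.getConnection.
        System.out.println(withPlugin(url));
    }
}
```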
App Service
For Azure App Service, you can deploy the application code via the az webapp
deploy command. For more information, see Quickstart: Deploy an ASP.NET web
app.
Then you can check the log or call the application to see if it can connect to the
database on Azure successfully.
Next steps
For more information about Service Connector and passwordless connections, see the
following resources:
This page shows all the supported compute services, clients, and authentication types to
connect services to Azure SQL Database instances, using Service Connector. This page
also shows the default environment variable names and application properties needed
to create service connections. You might still be able to connect to an Azure SQL
Database instance using other programming languages, without using Service
Connector. Learn more about the Service Connector environment variable naming
conventions.
Supported client types:
.NET
Go
Java
Java - Spring Boot
Node.js
PHP
Python
Python - Django
Ruby
None
The authentication types compared for each client type are system-assigned managed
identity, user-assigned managed identity, secret/connection string, and service principal.
Note
Default environment variable names and connection strings are defined per client type:
.NET (SqlClient), Go (go-mssqldb), Node.js, Python (pyodbc), and Ruby. For example,
managed identity connections use Authentication=ActiveDirectoryManagedIdentity (with
the identity's client ID for a user-assigned identity), and Azure_SQL_CLIENTSECRET holds
the Azure SQL Database client secret for service principal connections.
Next steps
Follow the tutorial listed below to learn more about Service Connector.
This page shows the supported authentication types and client types of Azure Database
for MySQL - Flexible Server using Service Connector. You might still be able to connect
to Azure Database for MySQL in other programming languages without using Service
Connector. This page also shows default environment variable names and values (or
Spring Boot configuration) you get when you create the service connection. You can
learn more about Service Connector environment variable naming convention.
Important
Azure Database for MySQL - Single Server is on the retirement path. We strongly
recommend that you upgrade to Azure Database for MySQL - Flexible Server. For
more information about migrating to Azure Database for MySQL - Flexible Server,
see What's happening to Azure Database for MySQL Single Server?
Supported client types:
.NET
Go (go-sql-driver for mysql)
Java (JDBC)
Java - Spring Boot (JDBC)
Node.js (mysql)
Python (mysql-connector-python)
Python - Django
PHP (MySQLi)
Ruby (mysql2)
None
The authentication types compared for each client type are system-assigned managed
identity, user-assigned managed identity, secret/connection string, and service principal.
Note
The default environment variables cover the Azure Database for MySQL name, username,
password, server host, and port for each client type, including .NET (MySqlConnector),
Python (mysql-connector-python), PHP (MySQLi), and Ruby (mysql2).
Next steps
Follow the tutorials listed below to learn more about Service Connector.
This page shows the supported authentication types and client types of Azure Database for
PostgreSQL using Service Connector. You might still be able to connect to Azure Database
for PostgreSQL in other programming languages without using Service Connector. This page
also shows default environment variable names and values (or Spring Boot configuration)
you get when you create the service connection. You can learn more about Service
Connector environment variable naming convention.
Supported client types:
.NET
Go (pg)
Java (JDBC)
Java - Spring Boot (JDBC)
Node.js (pg)
PHP (native)
Python (psycopg2)
Python - Django
Ruby (ruby-pg)
None
Note
The default environment variables cover the PostgreSQL connection values, including the
server host, database name, username, and password, for each client type: .NET
(ADO.NET), Go (pg), Java (JDBC), Node.js (pg), PHP (native), Python, and Ruby (ruby-pg).
For example, the JDBC connection string takes the form
jdbc:postgresql://<server host>:<port>/<database name>?sslmode=require&user=<username>.
Next steps
Follow the tutorials listed below to learn more about Service Connector.
You can read more about best practices and when to use system-assigned identities
versus user-assigned identities in the identities best practice recommendations.
Explore DefaultAzureCredential
Managed identities are generally implemented in your application code through a class
called DefaultAzureCredential from the Azure.Identity client library.
DefaultAzureCredential supports multiple authentication methods and automatically
determines which should be used at runtime. You can read more about this approach in
the DefaultAzureCredential overview.
This tutorial applies to the following architectures, though it can be adapted to many
other scenarios as well through minimal configuration changes.
3. Toggle the Status setting to On to enable a system assigned managed identity for
the service.
5. On the Add role assignment screen, for the Assign access to option, select
Managed identity. Then choose +Select members.
6. In the flyout, search for the managed identity you created by entering the name of
your app service. Select the system assigned identity, and then choose Select to
close the flyout menu.
7. Select Next a couple times until you're able to select Review + assign to finish the
role assignment.
8. Repeat this process for the other services you would like to connect to.
Local development considerations
You can also enable access to Azure resources for local development by assigning roles
to a user account the same way you assigned roles to your managed identity.
1. After assigning the Storage Blob Data Contributor role to your managed identity,
under Assign access to, this time select User, group or service principal. Choose +
Select members to open the flyout menu again.
2. Search for the user@domain account or Azure AD security group you would like to
grant access to by email address or name, and then select it. This should be the
same account you use to sign in to your local development tooling, such as
Visual Studio or the Azure CLI.
Note
You can also assign these roles to an Azure Active Directory security group if you
are working on a team with multiple developers. You can then place any developer
inside that group who needs access to develop the app locally.
C#
Inside of your project, add a reference to the Azure.Identity NuGet package. This
library contains all of the necessary entities to implement DefaultAzureCredential .
You can also add any other Azure libraries that are relevant to your app. For this
example, the Azure.Storage.Blobs and Azure.KeyVault.Keys packages are added in
order to connect to Blob Storage and Key Vault.
.NET CLI
At the top of your Program.cs file, add the following using statements:
C#
using Azure.Identity;
using Azure.Storage.Blobs;
using Azure.Security.KeyVault.Keys;
In the Program.cs file of your project code, create instances of the necessary
services your app will connect to. The following examples connect to Blob Storage
and Key Vault using the corresponding SDK classes.
C#
When this application code runs locally, DefaultAzureCredential searches a credential
chain and uses the first available credential. If Managed_Identity_Client_ID is null
locally, it automatically uses the credentials from your local Azure CLI or Visual
Studio sign-in. You can read more about this process in the Azure Identity library
overview.
This overall process ensures that your app can run securely locally and in Azure without
the need for any code changes.
To configure this setup in your code, make sure your application registers separate
services to connect to each storage account or database. Make sure to pull in the
correct managed identity client IDs for each service when configuring
DefaultAzureCredential . The following code example configures the following service
connections:
C#
C#
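The pattern of pairing each service connection with its own identity client ID can be sketched as follows (shown in Java, with hypothetical service names and client IDs; a real app would read these values from configuration):

```java
import java.util.Map;

public class ServiceIdentityConfig {
    // Each downstream service is paired with its own user-assigned
    // identity client ID (hypothetical values for illustration).
    private static final Map<String, String> CLIENT_IDS = Map.of(
            "blob-storage", "11111111-1111-1111-1111-111111111111",
            "service-bus",  "22222222-2222-2222-2222-222222222222");

    public static String credentialFor(String service) {
        String clientId = CLIENT_IDS.get(service);
        if (clientId == null) {
            throw new IllegalArgumentException("No identity configured for " + service);
        }
        // With the Azure Identity library for Java, this would be roughly:
        // new DefaultAzureCredentialBuilder().managedIdentityClientId(clientId).build()
        return "managed-identity:" + clientId;
    }

    public static void main(String[] args) {
        System.out.println(credentialFor("blob-storage")); // prints managed-identity:11111111-1111-1111-1111-111111111111
    }
}
```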
These types of scenarios are explored in more depth in the identities best practice
recommendations.
Next steps
In this tutorial, you learned how to migrate an application to passwordless connections.
You can read the following resources to explore the concepts discussed in this article in
more depth:
Managed identities for Azure resources is a feature of Azure Active Directory. Each of
the Azure services that support managed identities for Azure resources are subject to
their own timeline. Make sure you review the availability status of managed identities for
your resource and known issues before you begin.
Managed identities for Azure resources provides Azure services with an automatically
managed identity in Azure Active Directory. You can use this identity to authenticate to
any service that supports Azure AD authentication, without having credentials in your
code.
In this article, you learn how to enable and disable system and user-assigned managed
identities for an Azure Virtual Machine (VM), using the Azure portal.
Prerequisites
If you're unfamiliar with managed identities for Azure resources, check out the
overview section.
If you don't already have an Azure account, sign up for a free account before
continuing.
1. Sign in to the Azure portal using an account associated with the Azure
subscription that contains the VM.
If you have a Virtual Machine that no longer needs system-assigned managed identity:
1. Sign in to the Azure portal using an account associated with the Azure
subscription that contains the VM.
3. Under System assigned, Status, select Off and then click Save:
Currently, the Azure portal does not support assigning a user-assigned managed
identity during the creation of a VM. Instead, refer to one of the following VM creation
Quickstart articles to first create a VM, and then proceed to the next section for details
on assigning a user-assigned managed identity to the VM:
1. Sign in to the Azure portal using an account associated with the Azure
subscription that contains the VM.
2. Navigate to the desired VM and click Identity, User assigned and then +Add.
3. Click the user-assigned identity you want to add to the VM and then click Add.
1. Sign in to the Azure portal using an account associated with the Azure
subscription that contains the VM.
2. Navigate to the desired VM and click Identity, User assigned, the name of the
user-assigned managed identity you want to delete and then click Remove (click
Yes in the confirmation pane).
Next steps
Using the Azure portal, give an Azure VM's managed identity access to another
Azure resource.
Connect to and query Azure SQL
Database using .NET and Entity
Framework Core
Article • 05/11/2023
Prerequisites
An Azure subscription .
A SQL database configured with Azure Active Directory (Azure AD) authentication.
You can create one using the Create database quickstart.
.NET 7.0 or later.
Visual Studio with the ASP.NET and web development workload.
The latest version of the Azure CLI.
The latest version of the Entity Framework Core tools:
Visual Studio users should install the Package Manager Console tools for Entity
Framework Core.
.NET CLI users should install the .NET CLI tools for Entity Framework Core.
1. For local development connections, make sure your logical server is configured to
allow your local machine IP address and other Azure services to connect:
Select Add your client IPv4 address(xx.xx.xx.xx) to add a firewall rule that
will enable connections from your local machine IPv4 address. Alternatively,
you can also select + Add a firewall rule to enter a specific IP address of your
choice.
Make sure the Allow Azure services and resources to access this server
checkbox is selected.
Warning
Enabling the Allow Azure services and resources to access this server
setting is not a recommended security practice for production scenarios.
Real applications should implement more secure approaches, such as
stronger firewall restrictions or virtual network configurations.
2. The server must also have Azure AD authentication enabled with an Azure Active
Directory admin account assigned. For local development connections, the Azure
Active Directory admin account should be an account you can also log into Visual
Studio or the Azure CLI with locally. You can verify whether your server has Azure
AD authentication enabled on the Azure Active Directory page.
3. If you're using a personal Azure account, make sure you have Azure Active
Directory setup and configured for Azure SQL Database in order to assign your
account as a server admin. If you're using a corporate account, Azure Active
Directory will most likely already be configured for you.
Visual Studio
1. In the Visual Studio menu bar, navigate to File > New > Project...
2. In the dialog window, enter ASP.NET into the project template search box and
select the ASP.NET Core Web API result. Choose Next at the bottom of the
dialog.
3. For the Project Name, enter DotNetSQL. Leave the default values for the rest
of the fields and select Next.
4. For the Framework, select .NET 7.0 and uncheck Use controllers (uncheck to
use minimal APIs). This quickstart uses a Minimal API template to streamline
endpoint creation and configuration.
5. Choose Create. The new project opens inside the Visual Studio environment.
2. In the resulting window, search for EntityFrameworkCore. Locate and install the
following packages:
Alternatively, you can also run the Install-Package cmdlet in the Package Manager
Console window:
PowerShell
Install-Package Microsoft.EntityFrameworkCore
Install-Package Microsoft.EntityFrameworkCore.SqlServer
Install-Package Microsoft.EntityFrameworkCore.Design
which to use at runtime. This approach enables your app to use different authentication
methods in different environments (local vs. production) without implementing
environment-specific code. The Azure Identity library overview explains the order and
locations in which DefaultAzureCredential looks for credentials.
Complete the following steps to connect to Azure SQL Database using Entity Framework
Core and the underlying DefaultAzureCredential class:
1. Add a ConnectionStrings section to the appsettings.Development.json file so that
it matches the following code. Remember to update the <your-database-server-name>
and <your-database-name> placeholders.
Note
JSON
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "ConnectionStrings": {
    "AZURE_SQL_CONNECTIONSTRING": "Data Source=passwordlessdbserver.database.windows.net; Initial Catalog=passwordlessdb; Authentication=Active Directory Default; Encrypt=True;"
  }
}
2. Add the following code to the Program.cs file above the line of code that reads
var app = builder.Build(); . This code performs the following configurations:
Registers the Entity Framework Core DbContext class with the .NET
dependency injection container.
C#
var connection = String.Empty;
if (builder.Environment.IsDevelopment())
{
    builder.Configuration.AddEnvironmentVariables().AddJsonFile("appsettings.Development.json");
    connection = builder.Configuration.GetConnectionString("AZURE_SQL_CONNECTIONSTRING");
}
else
{
    connection = Environment.GetEnvironmentVariable("AZURE_SQL_CONNECTIONSTRING");
}

builder.Services.AddDbContext<PersonDbContext>(options =>
    options.UseSqlServer(connection));
3. Add the following endpoints to the bottom of the Program.cs file above app.Run()
to retrieve and add entities in the database using the PersonDbContext class.
C#
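The endpoint bodies aren't shown here. The following is a minimal sketch of what these Minimal API endpoints might look like, assuming PersonDbContext exposes a DbSet<Person> property named Person (that property name is an assumption):

```csharp
// Sketch only: assumes PersonDbContext exposes a DbSet<Person> named Person.
app.MapGet("/Person", (PersonDbContext context) => context.Person.ToList())
    .WithName("GetPersons")
    .WithOpenApi();

app.MapPost("/Person", (Person person, PersonDbContext context) =>
{
    // Add the posted entity and persist it to the database
    context.Add(person);
    context.SaveChanges();
})
.WithName("CreatePerson")
.WithOpenApi();
```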
Finally, add the Person and PersonDbContext classes to the bottom of the
Program.cs file. The Person class represents a single record in the database's
Persons table. The PersonDbContext class represents the Person database and
allows you to perform operations on it through code. You can read more about
DbContext in the Getting Started documentation for Entity Framework Core.
C#
public class Person
{
public int Id { get; set; }
public string FirstName { get; set; }
public string LastName { get; set; }
}
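The PersonDbContext class referenced above isn't shown. The following is a minimal sketch following Entity Framework Core conventions; the DbSet property name Person is an assumption:

```csharp
// Sketch only: requires the Microsoft.EntityFrameworkCore namespace.
public class PersonDbContext : DbContext
{
    public PersonDbContext(DbContextOptions<PersonDbContext> options)
        : base(options) { }

    // Maps to the Persons table used by the endpoints
    public DbSet<Person> Person { get; set; }
}
```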
2. Run the following command to generate an initial migration that can create the
database:
Visual Studio
PowerShell
Add-Migration InitialCreate
3. A Migrations folder should appear in your project directory, along with a file
called InitialCreate with unique numbers prepended. Run the migration to
create the database using the following command:
Visual Studio
PowerShell
Update-Database
The Entity Framework Core tooling creates the database schema in Azure, as
defined by the PersonDbContext class.
1. Press the run button at the top of Visual Studio to launch the API project.
2. On the Swagger UI page, expand the POST method and select Try it.
3. Modify the sample JSON to include values for the first and last name. Select
Execute to add a new record to the database. The API returns a successful
response.
4. Expand the GET method on the Swagger UI page and select Try it. Select Execute,
and the person you just created is returned.
3. In the publishing dialog, select Azure as the deployment target, and then select
Next.
4. For the specific target, select Azure App Service (Windows), and then select Next.
5. Select the green + icon to create a new App Service to deploy to and enter the
following values:
Resource group: Select New and create a new resource group called msdocs-
dotnet-sql.
Hosting Plan: Select New to open the hosting plan dialog. Leave the default
values and select OK.
Select Create to close the original dialog. Visual Studio creates the App
Service resource in Azure.
6. Once the resource is created, make sure it's selected in the list of app services, and
then select Next.
7. On the API Management step, select the Skip this step checkbox at the bottom
and then select Finish.
8. Select Publish in the upper right of the publishing profile summary to deploy the
app to Azure.
When the deployment finishes, Visual Studio launches the browser to display the hosted
app, but at this point the app doesn't work correctly on Azure. You still need to
configure the secure connection between the App Service and the SQL database to
retrieve your data.
for you.
Azure CLI
You can verify the changes made by Service Connector on the App Service settings.
1. Navigate to the Identity page for your App Service. Under the System
assigned tab, the Status should be set to On. This value means that a system-
assigned managed identity was enabled for your app.
2. Navigate to the Configuration page for your App Service. Under the
Connection strings tab, you should see a connection string called
AZURE_SQL_CONNECTIONSTRING. Select the Click to show value text to
view the generated passwordless connection string. The name of this
connection string aligns with the one you configured in your app, so it will be
discovered automatically when running in Azure.
Important
Although this solution provides a simple approach for getting started, it is not a
best practice for enterprise production environments. In those scenarios the app
should not perform all operations using a single, elevated identity. You should try
to implement the principle of least privilege by configuring multiple identities with
specific permissions for specific tasks.
You can read more about configuring database roles and security on the following
resources:
Azure portal
1. In the Azure portal search bar, search for Azure SQL and select the matching
result.
4. On the Are you sure you want to delete... page that opens, type the name
of your database to confirm, and then select Delete.
Next steps
Tutorial: Secure a database in Azure SQL Database
Authorize database access to SQL Database
An overview of Azure SQL Database security capabilities
Azure SQL Database security best practices
Connect to and query Azure SQL
Database using .NET and the
Microsoft.Data.SqlClient library
Article • 07/11/2023
Prerequisites
An Azure subscription.
An Azure SQL database configured with Azure Active Directory (Azure AD)
authentication. You can create one using the Create database quickstart.
The latest version of the Azure CLI.
Visual Studio or later with the ASP.NET and web development workload.
.NET 7.0 or later.
1. For local development connections, make sure your logical server is configured to
allow your local machine IP address and other Azure services to connect:
Select Add your client IPv4 address (xx.xx.xx.xx) to add a firewall rule that
enables connections from your local machine IPv4 address. Alternatively,
you can select + Add a firewall rule to enter a specific IP address of your
choice.
Make sure the Allow Azure services and resources to access this server
checkbox is selected.
Warning
Enabling the Allow Azure services and resources to access this server
setting is not a recommended security practice for production scenarios.
Real applications should implement more secure approaches, such as
stronger firewall restrictions or virtual network configurations.
2. The server must also have Azure AD authentication enabled with an Azure Active
Directory admin account assigned. For local development connections, the Azure
Active Directory admin account should be an account you can also log into Visual
Studio or the Azure CLI with locally. You can verify whether your server has Azure
AD authentication enabled on the Azure Active Directory page.
3. If you're using a personal Azure account, make sure you have Azure Active
Directory set up and configured for Azure SQL Database in order to assign your
account as a server admin. If you're using a corporate account, Azure Active
Directory will most likely already be configured for you.
Visual Studio
1. In the Visual Studio menu, navigate to File > New > Project...
2. In the dialog window, enter ASP.NET into the project template search box and
select the ASP.NET Core Web API result. Choose Next at the bottom of the
dialog.
3. For the Project Name, enter DotNetSQL. Leave the default values for the rest
of the fields and select Next.
4. For the Framework, select .NET 7.0 and uncheck Use controllers (uncheck to
use minimal APIs). This quickstart uses a Minimal API template to streamline
endpoint creation and configuration.
5. Choose Create. The new project opens inside the Visual Studio environment.
Note
additional capabilities.
Visual Studio
For local development with passwordless connections to Azure SQL Database, add
the following ConnectionStrings section to the appsettings.json file. Replace the
<database-server-name> and <database-name> placeholders with your own values.
JSON
"ConnectionStrings": {
"AZURE_SQL_CONNECTIONSTRING": "Server=tcp:<database-server-
name>.database.windows.net,1433;Initial Catalog=<database-
name>;Encrypt=True;TrustServerCertificate=False;Connection
Timeout=30;Authentication=\"Active Directory Default\";"
}
For example, when the app runs locally, DefaultAzureCredential authenticates via
the user you're signed into Visual Studio with, or other local tools like the Azure CLI.
Once the app deploys to Azure, the same code discovers and applies the managed
identity that is associated with the hosted app, which you'll configure later. The
Azure Identity library overview explains the order and locations in which
DefaultAzureCredential looks for credentials.
Note
C#
using Microsoft.Data.SqlClient;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

var app = builder.Build();
app.UseSwagger();
app.UseSwaggerUI();
app.UseHttpsRedirection();

string connectionString =
    app.Configuration.GetConnectionString("AZURE_SQL_CONNECTIONSTRING")!;

try
{
    // Table would be created ahead of time in production
    using var conn = new SqlConnection(connectionString);
    conn.Open();
}
catch (Exception e)
{
    Console.WriteLine(e.Message);
}
app.MapGet("/Person", () => {
var rows = new List<string>();
if (reader.HasRows)
{
while (reader.Read())
{
rows.Add($"{reader.GetInt32(0)}, {reader.GetString(1)},
{reader.GetString(2)}");
}
}
return rows;
})
.WithName("GetPersons")
.WithOpenApi();
command.Parameters.Clear();
command.Parameters.AddWithValue("@firstName", person.FirstName);
command.Parameters.AddWithValue("@lastName", person.LastName);
app.Run();
Finally, add the Person class to the bottom of the Program.cs file. This class represents a
single record in the database's Persons table.
C#
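The class body isn't included above. The following is a minimal sketch consistent with the Id, FirstName, and LastName columns the endpoints read:

```csharp
// Sketch only: matches the three columns read by the GET endpoint.
public class Person
{
    public int Id { get; set; }
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
}
```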
1. Press the run button at the top of Visual Studio to launch the API project.
2. On the Swagger UI page, expand the POST method and select Try it.
3. Modify the sample JSON to include values for the first and last name. Select
Execute to add a new record to the database. The API returns a successful
response.
4. Expand the GET method on the Swagger UI page and select Try it. Choose
Execute, and the person you just created is returned.
3. In the publishing dialog, select Azure as the deployment target, and then select
Next.
4. For the specific target, select Azure App Service (Windows), and then select Next.
5. Select the + icon to create a new App Service to deploy to and enter the following
values:
Resource group: Select New and create a new resource group called msdocs-
dotnet-sql.
Hosting Plan: Select New to open the hosting plan dialog. Leave the default
values and select OK.
Select Create to close the original dialog. Visual Studio creates the App
Service resource in Azure.
6. Once the resource is created, make sure it's selected in the list of app services, and
then select Next.
7. On the API Management step, select the Skip this step checkbox at the bottom
and then choose Finish.
8. On the Finish step, select Close if the dialog does not close automatically.
9. Select Publish in the upper right of the publishing profile summary to deploy the
app to Azure.
When the deployment finishes, Visual Studio launches the browser to display the hosted
app, but at this point the app doesn't work correctly on Azure. You still need to
configure the secure connection between the App Service and the SQL database to
retrieve your data.
The following steps are required to create a passwordless connection between the
App Service instance and Azure SQL Database:
1. Create a managed identity for the App Service. The Microsoft.Data.SqlClient
library included in your app will automatically discover the managed identity,
just like it discovered your local Visual Studio user.
2. Create a SQL database user and associate it with the App Service managed
identity.
3. Assign SQL roles to the database user that allow for read, write, and
potentially other permissions.
Azure CLI
You can verify the changes made by Service Connector on the App Service
settings.
1. Navigate to the Identity page for your App Service. Under the System
assigned tab, the Status should be set to On. This value means that a
system-assigned managed identity was enabled for your app.
2. Navigate to the Configuration page for your App Service. Under the
Connection strings tab, you should see a connection string called
AZURE_SQL_CONNECTIONSTRING. Select the Click to show value text to
view the generated passwordless connection string. The name of this
connection string matches the one you configured in your app, so it will
be discovered automatically when running in Azure.
Important
Although this solution provides a simple approach for getting started, it's not a
best practice for production-grade environments. In those scenarios, the app
shouldn't perform all operations using a single, elevated identity. You should
try to implement the principle of least privilege by configuring multiple
identities with specific permissions for specific tasks.
You can read more about configuring database roles and security on the
following resources:
2. Append the /swagger/index.html path to the URL to load the same Swagger test
page you used locally.
3. Execute test GET and POST requests to verify that the endpoints work as expected.
Tip
If you receive a 500 Internal Server error while testing, it may be due to your
database networking configurations. Verify that your logical server is configured
with the settings outlined in the Configure the database section.
Congratulations! Your application is now connected to Azure SQL Database in both local
and hosted environments.
Azure portal
1. In the Azure portal search bar, search for Azure SQL and select the matching
result.
4. On the Are you sure you want to delete... page that opens, type the name
of your database to confirm, and then select Delete.
Quickstart: Azure Cosmos DB for NoSQL
client library for .NET
Article • 03/08/2023
Get started with the Azure Cosmos DB client library for .NET to create databases,
containers, and items within your account. Follow these steps to install the package and
try out example code for basic tasks.
Note
Prerequisites
An Azure account with an active subscription.
No Azure subscription? You can try Azure Cosmos DB free with no credit card
required.
.NET 6.0 or later
Azure Command-Line Interface (CLI) or Azure PowerShell
Prerequisite check
In a terminal or command window, run dotnet --version to check that the .NET
SDK is version 6.0 or later.
Run az --version (Azure CLI) or Get-Module -ListAvailable Az (Azure
PowerShell) to check that you have the appropriate Azure command-line tools
installed.
Setting up
This section walks you through creating an Azure Cosmos DB account and setting up a
project that uses Azure Cosmos DB for NoSQL client library for .NET to manage
resources.
Create an Azure Cosmos DB account
Tip
No Azure subscription? You can try Azure Cosmos DB free with no credit card
required. If you create an account using the free trial, you can safely skip ahead to
the Create a new .NET app section.
This quickstart will create a single Azure Cosmos DB account using the API for NoSQL.
Portal
Tip
For this quickstart, we recommend using the resource group name msdocs-
cosmos-quickstart-rg .
2. From the Azure portal menu or the Home page, select Create a resource.
3. On the New page, search for and select Azure Cosmos DB.
4. On the Select API option page, select the Create option within the NoSQL
section. Azure Cosmos DB has six APIs: NoSQL, MongoDB, PostgreSQL,
Apache Cassandra, Apache Gremlin, and Table. Learn more about the API for
NoSQL.
5. On the Create Azure Cosmos DB Account page, enter the following
information:
Setting | Value | Description
Subscription | Subscription name | Select the Azure subscription that you wish to use for this Azure Cosmos account.
Location | The region closest to your users | Select a geographic location to host your Azure Cosmos DB account. Use the location that is closest to your users to give them the fastest access to the data.
Apply Azure Cosmos DB free tier discount | Apply or Do not apply | Enable Azure Cosmos DB free tier. With Azure Cosmos DB free tier, you'll get the first 1000 RU/s and 25 GB of storage for free in an account. Learn more about free tier.
Note
You can have up to one free tier Azure Cosmos DB account per Azure
subscription and must opt in when creating the account. If you don't see
the option to apply the free tier discount, another account in the
subscription has already been enabled with free tier.
6. Select Review + create.
7. Review the settings you provide, and then select Create. It takes a few minutes
to create the account. Wait for the portal page to display Your deployment is
complete before moving on.
9. From the API for NoSQL account page, select the Keys navigation menu
option.
10. Record the values from the URI and PRIMARY KEY fields. You'll use these
values in a later step.
.NET CLI
.NET CLI
.NET CLI
dotnet build
Make sure that the build was successful with no errors. The expected output from the
build should look something like this:
Output
Build succeeded.
0 Warning(s)
0 Error(s)
Windows
PowerShell
$env:COSMOS_ENDPOINT = "<cosmos-account-URI>"
$env:COSMOS_KEY = "<cosmos-account-PRIMARY-KEY>"
Object model
Before you start building the application, let's look into the hierarchy of resources in
Azure Cosmos DB. Azure Cosmos DB has a specific object model used to create and
access resources. Azure Cosmos DB creates resources in a hierarchy that consists of
accounts, databases, containers, and items.
(Diagram: an Azure Cosmos DB account contains databases; each database contains containers; each container contains items.)
For more information about the hierarchy of different resources, see working with
databases, containers, and items in Azure Cosmos DB.
You'll use the following .NET classes to interact with these resources:
CosmosClient - This class provides a client-side logical representation for the Azure
Cosmos DB service. The client object is used to configure and execute requests
against the service.
Database - This class is a reference to a database that may, or may not, exist in the
service yet. The database is validated server-side when you attempt to access it or
perform an operation against it.
Container - This class is a reference to a container that also may not exist in the
service yet. The container is validated server-side when you attempt to work with
it.
QueryDefinition - This class represents a SQL query and any query parameters.
FeedIterator<> - This class represents an iterator that can track the current page of
results and get a new page of results.
FeedResponse<> - This class represents a single page of responses from the
iterator. This type can be iterated over using a foreach loop.
Code examples
Authenticate the client
Create a database
Create a container
Create an item
Get an item
Query items
The sample code described in this article creates a database named cosmicworks with a
container named products . The products container is designed to contain product details
such as name, category, quantity, and a sale indicator. Each product also contains a
unique identifier.
For this sample code, the container will use the category as a logical partition key.
You can also authorize requests to Azure services using passwords, connection strings,
or other credentials directly. However, this approach should be used with caution.
Developers must be diligent and never expose these secrets in an insecure location.
Anyone who gains access to the password or secret key is able to authenticate.
DefaultAzureCredential offers improved management and security benefits over the
account key to allow passwordless authentication. Both options are demonstrated in the
following example.
Passwordless (Recommended)
For example, your app can authenticate using your Visual Studio sign-in credentials
when developing locally, and then use a managed identity once it has been
deployed to Azure. No code changes are required for this transition.
When developing locally with passwordless authentication, make sure the user
account that connects to Cosmos DB is assigned a role with the correct permissions
to perform data operations. Currently, Azure Cosmos DB for NoSQL doesn't include
built-in roles for data operations, but you can create your own using the Azure CLI
or PowerShell.
Azure CLI
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/item
s/*",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
]
}]
}'
2. When the command completes, copy the ID value from the name field and
paste it somewhere for later use.
3. Assign the role you created to the user account or service principal that will
connect to Cosmos DB. During local development, this will generally be your
own account that's logged into a development tool like Visual Studio or the
Azure CLI. Retrieve the details of your account using the az ad user
command.
Azure CLI
4. Copy the value of the id property out of the results and paste it somewhere
for later use.
5. Assign the custom role you created to your user account using the az
cosmosdb sql role assignment create command and the IDs you copied
previously.
Azure CLI
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
.NET CLI
From the project directory, open the Program.cs file. In your editor, add using
directives for the Microsoft.Azure.Cosmos and Azure.Identity namespaces.
C#
using Microsoft.Azure.Cosmos;
using Azure.Identity;
Define a new instance of the CosmosClient class using the constructor, and
Environment.GetEnvironmentVariable to read the COSMOS_ENDPOINT environment
variable you created earlier.
C#
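The constructor call isn't shown above. The following is a sketch, assuming the COSMOS_ENDPOINT environment variable is set and the signed-in identity has been granted data-plane access:

```csharp
// Sketch only: passwordless CosmosClient using DefaultAzureCredential.
CosmosClient client = new(
    accountEndpoint: Environment.GetEnvironmentVariable("COSMOS_ENDPOINT"),
    tokenCredential: new DefaultAzureCredential());
```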
For more information on different ways to create a CosmosClient instance, see Get
started with Azure Cosmos DB for NoSQL and .NET.
The Azure CLI approach is used in this example. Use the az cosmosdb sql database
create and az cosmosdb sql container create commands to create a Cosmos DB
NoSQL database and container.
Azure CLI
After the resources have been created, use classes from the
Microsoft.Azure.Cosmos client libraries to connect to and query the database.
C#
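The call itself isn't shown above. The following is a sketch using CreateDatabaseIfNotExistsAsync with the cosmicworks name used in this article:

```csharp
// Creates the database if it doesn't exist; returns the existing database otherwise.
Database database = await client.CreateDatabaseIfNotExistsAsync(id: "cosmicworks");
```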
Console.WriteLine($"New database:\t{database.Id}");
Get the container
The Database.GetContainer method returns a reference to the specified container.
C#
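The call isn't shown above; a sketch, assuming the products container was created earlier with the Azure CLI:

```csharp
// Gets a reference to the existing products container.
Container container = database.GetContainer(id: "products");
```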
Console.WriteLine($"New container:\t{container.Id}");
Create an item
The easiest way to create a new item in a container is to first build a C# class or record
type with all of the members you want to serialize into JSON. In this example, the C#
record has a unique identifier, a categoryId field for the partition key, and extra
categoryName, name, quantity, and sale fields.
C#
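The record definition isn't shown above. The following is a sketch of a record matching the fields described in the text:

```csharp
// Sketch only: a record with an id, the categoryId partition key,
// and the extra categoryName, name, quantity, and sale fields.
public record Product(
    string id,
    string categoryId,
    string categoryName,
    string name,
    int quantity,
    bool sale);
```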
C#
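The create-item code isn't shown above. The following is a sketch; the id and partition key values are illustrative only, and the Product fields assume the record described in the previous section:

```csharp
// Sketch only: sample id and partition key values are hypothetical.
Product newItem = new(
    id: "70b63682-b93a-4c77-aad2-65501347265f",
    categoryId: "61dba35b-4f02-45c5-b648-c6badc0cbd79",
    categoryName: "gear-surf-surfboards",
    name: "Yamba Surfboard",
    quantity: 12,
    sale: false);

// Creates the item in the container, scoped to its partition key
Product createdItem = await container.CreateItemAsync<Product>(
    item: newItem,
    partitionKey: new PartitionKey("61dba35b-4f02-45c5-b648-c6badc0cbd79"));
```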
Console.WriteLine($"Created
item:\t{createdItem.id}\t[{createdItem.categoryName}]");
For more information on creating, upserting, or replacing items, see Create an item in
Azure Cosmos DB for NoSQL using .NET.
Get an item
In Azure Cosmos DB, you can perform a point read operation by using both the unique
identifier ( id ) and partition key fields. In the SDK, call Container.ReadItemAsync<>
passing in both values to return a deserialized instance of your C# type.
C#
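The point-read call isn't shown above. The following is a sketch; the id and partition key values are illustrative only:

```csharp
// Sketch only: point read using the item's id and its partition key value.
Product readItem = await container.ReadItemAsync<Product>(
    id: "70b63682-b93a-4c77-aad2-65501347265f",
    partitionKey: new PartitionKey("61dba35b-4f02-45c5-b648-c6badc0cbd79"));
```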
For more information about reading items and parsing the response, see Read an item
in Azure Cosmos DB for NoSQL using .NET.
Query items
After you insert an item, you can run a query to get all items that match a specific filter.
This example runs the SQL query: SELECT * FROM products p WHERE p.categoryId =
"61dba35b-4f02-45c5-b648-c6badc0cbd79" . This example uses the QueryDefinition type
and a parameterized query expression for the partition key filter. Once the query is
defined, call Container.GetItemQueryIterator<> to get a result iterator that will manage
the pages of results. Then, use a combination of while and foreach loops to retrieve
pages of results and then iterate over the individual items.
C#
var query = new QueryDefinition(
    query: "SELECT * FROM products p WHERE p.categoryId = @categoryId")
    .WithParameter("@categoryId", "61dba35b-4f02-45c5-b648-c6badc0cbd79");

using FeedIterator<Product> feed = container.GetItemQueryIterator<Product>(
    queryDefinition: query);

while (feed.HasMoreResults)
{
    FeedResponse<Product> response = await feed.ReadNextAsync();
    foreach (Product item in response)
    {
        Console.WriteLine($"Found item:\t{item.name}");
    }
}
To run the app, use a terminal to navigate to the application directory and run the
application.
.NET CLI
dotnet run
Output
Clean up resources
When you no longer need the API for NoSQL account, you can delete the corresponding
resource group.
Portal
1. Navigate to the resource group you previously created in the Azure portal.
Tip
3. On the Are you sure you want to delete dialog, enter the name of the
resource group, and then select Delete.
Next steps
In this quickstart, you learned how to create an Azure Cosmos DB for NoSQL account,
create a database, and create a container using the .NET SDK. You can now dive deeper
into a tutorial where you manage your Azure Cosmos DB for NoSQL resources and data
using a .NET console application.
Tutorial: Develop a .NET console application with Azure Cosmos DB for NoSQL
Quickstart: Send events to and receive
events from Azure Event Hubs using
.NET
Article • 08/08/2023
In this quickstart, you learn how to send events to an event hub and then receive those
events from the event hub using the Azure.Messaging.EventHubs .NET library.
Note
Quickstarts are for you to quickly ramp up on the service. If you are already familiar
with the service, you may want to see .NET samples for Event Hubs in our .NET SDK
repository on GitHub: Event Hubs samples on GitHub , Event processor samples
on GitHub .
Prerequisites
If you're new to Azure Event Hubs, see Event Hubs overview before you go through this
quickstart.
Microsoft Azure subscription. To use Azure services, including Azure Event Hubs,
you need a subscription. If you don't have an existing Azure account, you can sign
up for a free trial or use your MSDN subscriber benefits when you create an
account.
Microsoft Visual Studio 2022. The Azure Event Hubs client library makes use of
new features that were introduced in C# 8.0. You can still use the library with
previous C# language versions, but the new syntax isn't available. To make use of
the full syntax, we recommend that you compile with the .NET Core SDK 3.0 or
higher and language version set to latest . If you're using Visual Studio, versions
before Visual Studio 2022 aren't compatible with the tools needed to build C# 8.0
projects. Visual Studio 2022, including the free Community edition, can be
downloaded here.
Create an Event Hubs namespace and an event hub. The first step is to use the
Azure portal to create an Event Hubs namespace and an event hub in the
namespace. Then, obtain the management credentials that your application needs
to communicate with the event hub. To create a namespace and an event hub, see
Quickstart: Create an event hub using Azure portal.
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to an Event Hubs namespace. You
don't need to worry about having hard-coded connection strings in your code, in a
configuration file, or in secure storage like Azure Key Vault.
The second option shows you how to use a connection string to connect to an Event
Hubs namespace. If you're new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. Learn more
about the available scopes for role assignments on the scope overview page.
The following example assigns the Azure Event Hubs Data Owner role to your user
account, which provides full access to Azure Event Hubs resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Event Hubs Data Owner: Enables full data access to an Event Hubs
namespace and its entities (event hubs and consumer groups).
Azure Event Hubs Data Sender: Use this role to give the sender access to
Event Hubs namespace and its entities.
Azure Event Hubs Data Receiver: Use this role to give the receiver access to
Event Hubs namespace and its entities.
If you want to create a custom role, see Rights required for Event Hubs operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your Event Hubs namespace using the main
search bar or left navigation.
2. On the overview page, select Access control (IAM) from the left-hand
menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Azure Event Hubs Data Owner and select the matching
result. Then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
1. Launch Visual Studio. If you see the Get started window, select the Continue
without code link in the right pane.
2. On the Create a new project dialog box, do the following steps: If you don't see
this dialog box, select File on the menu, select New, and then select Project.
Passwordless (Recommended)
1. Select Tools > NuGet Package Manager > Package Manager Console from
the menu.
PowerShell
Install-Package Azure.Messaging.EventHubs
Install-Package Azure.Identity
Passwordless (Recommended)
1. Replace the existing code in the Program.cs file with the following sample
code. Then, replace <EVENT_HUB_NAMESPACE> and <HUB_NAME> placeholder
values for the EventHubProducerClient parameters with the names of your
Event Hubs namespace and the event hub. For example:
"spehubns0309.servicebus.windows.net" and "spehub" .
C#
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;
using System.Text;
// The Event Hubs client types are safe to cache and use as a singleton for the lifetime
// of the application, which is best practice when events are being published or read regularly.

// TODO: Replace the <EVENT_HUB_NAMESPACE> and <HUB_NAME> placeholder values
EventHubProducerClient producerClient = new EventHubProducerClient(
    "<EVENT_HUB_NAMESPACE>.servicebus.windows.net",
    "<HUB_NAME>",
    new DefaultAzureCredential());

// Number of events to be sent to the event hub
int numOfEvents = 3;

// Create a batch of events
using EventDataBatch eventBatch = await producerClient.CreateBatchAsync();

for (int i = 1; i <= numOfEvents; i++)
{
    if (!eventBatch.TryAdd(new EventData(Encoding.UTF8.GetBytes($"Event {i}"))))
    {
        // The event is too large for the batch and cannot be sent
        throw new Exception($"Event {i} is too large for the batch and cannot be sent.");
    }
}

try
{
    // Use the producer client to send the batch of events to the event hub
    await producerClient.SendAsync(eventBatch);
    Console.WriteLine($"A batch of {numOfEvents} events has been published.");
}
finally
{
    await producerClient.DisposeAsync();
}
C#
4. On the Event Hubs Namespace page in the Azure portal, you see three incoming
messages in the Messages chart. Refresh the page to update the chart if needed. It
may take a few seconds for it to show that the messages have been received.
Note
For the complete source code with more informational comments, see this file
on GitHub.
Follow these recommendations when using Azure Blob Storage as a checkpoint store:
Use a separate container for each processor group. You can use the same storage
account, but use one container per group.
Don't use the container for anything else, and don't use the storage account for
anything else.
The storage account should be in the same region as the deployed application. If
the application is on-premises, choose the closest possible region.
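The one-container-per-consumer-group recommendation can be applied with a small
naming helper. The following is an illustrative sketch only; the ContainerFor
helper and the "checkpoints-" prefix are hypothetical conventions, not part of
the quickstart:

```csharp
using System;
using System.Linq;

class CheckpointNaming
{
    // One container per consumer group: blob container names allow only
    // lowercase letters, digits, and hyphens, so any other character is
    // replaced with a hyphen. (Hypothetical helper for illustration; the
    // quickstart itself uses a single hard-coded container name.)
    public static string ContainerFor(string consumerGroup) =>
        "checkpoints-" + new string(consumerGroup
            .ToLowerInvariant()
            .Select(c => char.IsLetterOrDigit(c) ? c : '-')
            .ToArray());

    static void Main()
    {
        Console.WriteLine(ContainerFor("$Default"));    // checkpoints--default
        Console.WriteLine(ContainerFor("OrdersGroup")); // checkpoints-ordersgroup
    }
}
```

Deriving the container name from the consumer group keeps checkpoints for
different processor groups from colliding in one container.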
On the Storage account page in the Azure portal, in the Blob service section, ensure
that the following settings are disabled.
Hierarchical namespace
Blob soft delete
Versioning
Passwordless (Recommended)
When developing locally, make sure that the user account that is accessing blob
data has the correct permissions. You'll need Storage Blob Data Contributor to
read and write blob data. To assign yourself this role, you'll need to be assigned the
User Access Administrator role, or another role that includes the
Microsoft.Authorization/roleAssignments/write action. You can assign Azure RBAC
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn
more about the available scopes for role assignments on the scope overview page.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Blob Data Contributor role to your
user account, which provides both read and write access to blob data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
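The propagation delay described above can be absorbed in code with a simple
retry loop. This is a minimal sketch under stated assumptions: the WithRetries
helper and the simulated client call are hypothetical, for illustration only.

```csharp
using System;
using System.Threading;

class RetryDemo
{
    // Retries an action up to maxAttempts times, sleeping between tries.
    // Useful when a freshly assigned role hasn't propagated yet and the
    // first requests fail with authentication errors.
    public static T WithRetries<T>(Func<T> action, int maxAttempts, TimeSpan delay)
    {
        for (int attempt = 1; ; attempt++)
        {
            try { return action(); }
            catch (Exception) when (attempt < maxAttempts)
            {
                Thread.Sleep(delay);
            }
        }
    }

    static void Main()
    {
        int calls = 0;
        // Simulated call that fails twice before succeeding, standing in
        // for a request made before the role assignment has propagated.
        string result = WithRetries(() =>
        {
            calls++;
            if (calls < 3) throw new UnauthorizedAccessException("not propagated yet");
            return "authenticated";
        }, maxAttempts: 5, delay: TimeSpan.FromMilliseconds(10));

        Console.WriteLine($"{result} after {calls} attempts"); // authenticated after 3 attempts
    }
}
```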
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Blob Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Passwordless (Recommended)
1. Select Tools > NuGet Package Manager > Package Manager Console from
the menu.
PowerShell
Install-Package Azure.Messaging.EventHubs
Install-Package Azure.Messaging.EventHubs.Processor
Install-Package Azure.Identity
Passwordless (Recommended)
1. Replace the existing code in the Program.cs file with the following sample
code. Then, replace the <STORAGE_ACCOUNT_NAME> and <BLOB_CONTAINER_NAME>
placeholder values for the BlobContainerClient URI. Replace the
<EVENT_HUB_NAMESPACE> and <HUB_NAME> placeholder values for the
EventProcessorClient as well.
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.EventHubs.Processor;
using Azure.Storage.Blobs;
using System.Text;
// Create a blob container client that the event processor will use
// TODO: Replace <STORAGE_ACCOUNT_NAME> and <BLOB_CONTAINER_NAME> with actual names
BlobContainerClient storageClient = new BlobContainerClient(
    new Uri("https://<STORAGE_ACCOUNT_NAME>.blob.core.windows.net/<BLOB_CONTAINER_NAME>"),
    new DefaultAzureCredential());
Note
For the complete source code with more informational comments, see this file
on GitHub.
4. You should see a message that the events have been received.
Bash
These events are the three events you sent to the event hub earlier by running the
sender program.
5. In the Azure portal, you can verify that there are three outgoing messages, which
Event Hubs sent to the receiving application. Refresh the page to update the chart.
It may take a few seconds for it to show that the messages have been received.
Schema validation for Event Hubs SDK based
applications
You can use Azure Schema Registry to perform schema validation when you stream data
with your Event Hubs SDK-based applications. Azure Schema Registry, a feature of
Event Hubs, provides a centralized repository for managing schemas, and you can
seamlessly connect your new or existing applications with Schema Registry.
Clean up resources
Delete the resource group that has the Event Hubs namespace or delete only the
namespace if you want to keep the resource group.
Next steps
See the following tutorial:
Tutorial: Visualize data anomalies in real-time events sent to Azure Event Hubs
Quickstart: Azure Key Vault certificate
client library for .NET
Article • 01/13/2023
Get started with the Azure Key Vault certificate client library for .NET. Azure Key Vault is a
cloud service that provides a secure store for certificates. You can securely store keys,
passwords, certificates, and other secrets. Azure key vaults may be created and
managed through the Azure portal. In this quickstart, you learn how to create,
retrieve, and delete certificates from an Azure key vault using the .NET client library.
Prerequisites
An Azure subscription - create one for free
.NET 6 SDK or later
Azure CLI
A Key Vault - you can create one using Azure portal, Azure CLI, or Azure
PowerShell.
Setup
This quickstart uses the Azure Identity library with the Azure CLI to authenticate
the user to Azure services. Developers can also use Visual Studio or Visual Studio
Code to authenticate their calls. For more information, see Authenticate the client
with Azure Identity client library.
Sign in to Azure
1. Run the login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-in
page.
Azure CLI
.NET CLI
.NET CLI
dotnet build
Console
Build succeeded.
0 Warning(s)
0 Error(s)
.NET CLI
For this quickstart, you'll also need to install the Azure Identity client library:
.NET CLI
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Windows PowerShell
PowerShell
$Env:KEY_VAULT_NAME="<your-key-vault-name>"
macOS or Linux
Bash
export KEY_VAULT_NAME=<your-key-vault-name>
Object model
The Azure Key Vault certificate client library for .NET allows you to manage certificates.
The Code examples section shows how to create a client, set a certificate, retrieve a
certificate, and delete a certificate.
Code examples
Add directives
Add the following directives to the top of Program.cs:
C#
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Certificates;
In this example, the name of your key vault is expanded to the key vault URI, in the
format https://<your-key-vault-name>.vault.azure.net . For more information about
authenticating to key vault, see Developer's Guide.
C#
string keyVaultName = Environment.GetEnvironmentVariable("KEY_VAULT_NAME");
var kvUri = "https://" + keyVaultName + ".vault.azure.net";
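Because a missing KEY_VAULT_NAME environment variable yields a null name and a
malformed URI, it can help to fail fast before creating a client. A small sketch
(the BuildVaultUri helper and the "contoso-vault" name are hypothetical, for
illustration only):

```csharp
using System;

class KvUriDemo
{
    // Expands a key vault name into its vault URI, failing fast when the
    // name is missing rather than producing "https://.vault.azure.net".
    public static string BuildVaultUri(string keyVaultName)
    {
        if (string.IsNullOrWhiteSpace(keyVaultName))
            throw new InvalidOperationException("Set the KEY_VAULT_NAME environment variable first.");
        return $"https://{keyVaultName}.vault.azure.net";
    }

    static void Main()
    {
        // Simulate the environment variable being set for this demo.
        Environment.SetEnvironmentVariable("KEY_VAULT_NAME", "contoso-vault");
        string kvUri = BuildVaultUri(Environment.GetEnvironmentVariable("KEY_VAULT_NAME"));
        Console.WriteLine(kvUri); // https://contoso-vault.vault.azure.net
    }
}
```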
Save a certificate
In this example, for simplicity, you can use a self-signed certificate with the
default issuance policy. For this task, use the StartCreateCertificateAsync method.
The method accepts a certificate name and the certificate policy.
C#
Note
If a certificate with that name already exists, the code above creates a new
version of that certificate.
Retrieve a certificate
You can now retrieve the previously created certificate with the GetCertificateAsync
method.
C#
Delete a certificate
Finally, let's delete and purge the certificate from your key vault with the
StartDeleteCertificateAsync and PurgeDeletedCertificateAsync methods.
C#
// You only need to wait for completion if you want to purge or recover the certificate.
await operation.WaitForCompletionAsync();
Sample code
Modify the .NET console app to interact with the Key Vault by completing the following
steps:
C#
using System;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Certificates;
namespace key_vault_console_app
{
class Program
{
static async Task Main(string[] args)
{
const string certificateName = "myCertificate";
var keyVaultName = Environment.GetEnvironmentVariable("KEY_VAULT_NAME");
var kvUri = $"https://{keyVaultName}.vault.azure.net";
.NET CLI
dotnet build
Console
Next steps
In this quickstart, you created a key vault, stored a certificate, and retrieved that
certificate.
To learn more about Key Vault and how to integrate it with your apps, see the following
articles:
Get started with the Azure Key Vault key client library for .NET. Azure Key Vault is a cloud
service that provides a secure store for cryptographic keys. You can securely store
cryptographic keys, passwords, certificates, and other secrets. Azure key vaults may be
created and managed through the Azure portal. In this quickstart, you learn how to
create, retrieve, and delete keys from an Azure key vault using the .NET key client library.
Prerequisites
An Azure subscription - create one for free
.NET 6 SDK or later
Azure CLI
A Key Vault - you can create one using Azure portal, Azure CLI, or Azure
PowerShell.
Setup
This quickstart uses the Azure Identity library with the Azure CLI to authenticate
the user to Azure services. Developers can also use Visual Studio or Visual Studio
Code to authenticate their calls. For more information, see Authenticate the client
with Azure Identity client library.
Sign in to Azure
1. Run the login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-in
page.
Azure CLI
.NET CLI
.NET CLI
dotnet build
Console
Build succeeded.
0 Warning(s)
0 Error(s)
.NET CLI
For this quickstart, you'll also need to install the Azure Identity client library:
.NET CLI
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Windows PowerShell
PowerShell
$Env:KEY_VAULT_NAME="<your-key-vault-name>"
macOS or Linux
Bash
export KEY_VAULT_NAME=<your-key-vault-name>
Object model
The Azure Key Vault key client library for .NET allows you to manage keys. The Code
examples section shows how to create a client, set a key, retrieve a key, and delete a key.
Code examples
Add directives
Add the following directives to the top of Program.cs:
C#
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Keys;
In this example, the name of your key vault is expanded to the key vault URI, in the
format https://<your-key-vault-name>.vault.azure.net . For more information about
authenticating to key vault, see Developer's Guide.
C#
var keyVaultName = Environment.GetEnvironmentVariable("KEY_VAULT_NAME");
var kvUri = $"https://{keyVaultName}.vault.azure.net";
Save a key
For this task, use the CreateKeyAsync method. The method accepts a key name and
the key type.
C#
Note
If a key with that name already exists, this code creates a new version of that key.
Retrieve a key
You can now retrieve the previously created key with the GetKeyAsync method.
C#
Delete a key
Finally, let's delete and purge the key from your key vault with the StartDeleteKeyAsync
and PurgeDeletedKeyAsync methods.
C#
// You only need to wait for completion if you want to purge or recover the key.
await operation.WaitForCompletionAsync();
C#
using System;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Keys;
namespace key_vault_console_app
{
class Program
{
static async Task Main(string[] args)
{
const string keyName = "myKey";
var keyVaultName = Environment.GetEnvironmentVariable("KEY_VAULT_NAME");
var kvUri = $"https://{keyVaultName}.vault.azure.net";
.NET CLI
dotnet build
.NET CLI
dotnet run
Console
Next steps
In this quickstart, you created a key vault, stored a key, and retrieved that key.
To learn more about Key Vault and how to integrate it with your apps, see the following
articles:
Get started with the Azure Key Vault secret client library for .NET. Azure Key Vault is a
cloud service that provides a secure store for secrets. You can securely store keys,
passwords, certificates, and other secrets. Azure key vaults may be created and
managed through the Azure portal. In this quickstart, you learn how to create,
retrieve, and delete secrets from an Azure key vault using the .NET client library.
Prerequisites
An Azure subscription - create one for free
.NET 6 SDK or later
Azure CLI or Azure PowerShell
A Key Vault - you can create one using Azure portal, Azure CLI, or Azure PowerShell
Setup
Azure CLI
This quickstart uses the Azure Identity library with the Azure CLI to authenticate
the user to Azure services. Developers can also use Visual Studio or Visual Studio
Code to authenticate their calls. For more information, see Authenticate the client
with Azure Identity client library.
Sign in to Azure
1. Run the az login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-
in page.
Azure CLI
.NET CLI
.NET CLI
dotnet build
Build succeeded.
0 Warning(s)
0 Error(s)
.NET CLI
For this quickstart, you'll also need to install the Azure Identity client library:
.NET CLI
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Windows PowerShell
PowerShell
$Env:KEY_VAULT_NAME="<your-key-vault-name>"
macOS or Linux
Bash
export KEY_VAULT_NAME=<your-key-vault-name>
Object model
The Azure Key Vault secret client library for .NET allows you to manage secrets. The
Code examples section shows how to create a client, set a secret, retrieve a secret, and
delete a secret.
Code examples
Add directives
Add the following directives to the top of Program.cs:
C#
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;
In this example, the name of your key vault is expanded to the key vault URI, in the
format https://<your-key-vault-name>.vault.azure.net . For more information about
authenticating to key vault, see Developer's Guide.
C#
string keyVaultName = Environment.GetEnvironmentVariable("KEY_VAULT_NAME");
var kvUri = "https://" + keyVaultName + ".vault.azure.net";
Save a secret
Now that the console app is authenticated, add a secret to the key vault. For this task,
use the SetSecretAsync method. The method's first parameter accepts a name for the
secret—"mySecret" in this sample.
C#
Note
If a secret with that name already exists, the code creates a new version of that secret.
Retrieve a secret
You can now retrieve the previously set value with the GetSecretAsync method.
C#
Delete a secret
Finally, let's delete the secret from your key vault with the StartDeleteSecretAsync and
PurgeDeletedSecretAsync methods.
C#
await client.PurgeDeletedSecretAsync("mySecret");
Sample code
Modify the .NET console app to interact with the Key Vault by completing the following
steps:
C#
using System;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;
namespace key_vault_console_app
{
class Program
{
static async Task Main(string[] args)
{
const string secretName = "mySecret";
var keyVaultName = Environment.GetEnvironmentVariable("KEY_VAULT_NAME");
var kvUri = $"https://{keyVaultName}.vault.azure.net";
.NET CLI
dotnet run
Console
Next steps
To learn more about Key Vault and how to integrate it with your apps, see the following
articles:
Note
Prerequisites
If you're new to the service, see Service Bus overview before you do this quickstart.
Azure subscription. To use Azure services, including Azure Service Bus, you need a
subscription. If you don't have an existing Azure account, you can sign up for a free
trial .
Visual Studio 2022. The sample application makes use of new features that were
introduced in C# 10. You can still use the Service Bus client library with previous C#
language versions, but the syntax may vary. To use the latest syntax, we
recommend that you install .NET 6.0 or higher and set the language version to
latest. If you're using Visual Studio, versions before Visual Studio 2022 aren't
compatible with the tools required to build C# 10 projects.
To create a namespace:
1. Sign in to the Azure portal
2. In the left navigation pane of the portal, select All services, select Integration from
the list of categories, hover the mouse over Service Bus, and then select Create on
the Service Bus tile.
3. In the Basics tab of the Create namespace page, follow these steps:
b. For Resource group, choose an existing resource group in which the namespace
will live, or create a new one.
c. Enter a name for the namespace. The namespace name should adhere to the
following naming conventions:
The name must be unique across Azure. The system immediately checks to
see if the name is available.
The name length is at least 6 and at most 50 characters.
The name can contain only letters, numbers, and hyphens ("-").
The name must start with a letter and end with a letter or number.
The name must not end with "-sb" or "-mgmt".
d. For Location, choose the region in which your namespace should be hosted.
e. For Pricing tier, select the pricing tier (Basic, Standard, or Premium) for the
namespace. For this quickstart, select Standard.
Important
If you selected the Premium pricing tier, specify the number of messaging
units. The premium tier provides resource isolation at the CPU and memory
level so that each workload runs in isolation. This resource container is called a
messaging unit. A premium namespace has at least one messaging unit. You
can select 1, 2, 4, 8 or 16 messaging units for each Service Bus Premium
namespace. For more information, see Service Bus Premium Messaging.
5. You see the home page for your service bus namespace.
3. Enter a name for the queue, and leave the other values with their defaults.
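The namespace naming rules listed earlier can be sanity-checked locally before
you submit the form. The following is a sketch only; the IsValidNamespaceName
helper is hypothetical, and the portal performs the authoritative check:

```csharp
using System;
using System.Text.RegularExpressions;

class NamespaceNameCheck
{
    // Mirrors the rules above: 6-50 characters, letters/digits/hyphens
    // only, starts with a letter, ends with a letter or digit, and must
    // not end with the reserved suffixes -sb or -mgmt.
    public static bool IsValidNamespaceName(string name) =>
        Regex.IsMatch(name, "^[A-Za-z][A-Za-z0-9-]{4,48}[A-Za-z0-9]$")
        && !name.EndsWith("-sb", StringComparison.OrdinalIgnoreCase)
        && !name.EndsWith("-mgmt", StringComparison.OrdinalIgnoreCase);

    static void Main()
    {
        Console.WriteLine(IsValidNamespaceName("contoso-bus-01")); // True
        Console.WriteLine(IsValidNamespaceName("1contoso"));       // False: starts with a digit
        Console.WriteLine(IsValidNamespaceName("contoso-sb"));     // False: reserved suffix
    }
}
```

The regex encodes the length bound as 1 leading character + 4-48 middle
characters + 1 trailing character, which is exactly 6 to 50 in total.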
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to a Service Bus namespace. You don't
need to worry about having a hard-coded connection string in your code, in a
configuration file, or in secure storage like Azure Key Vault.
The second option shows you how to use a connection string to connect to a Service
Bus namespace. If you are new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless (Recommended)
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. Learn more
about the available scopes for role assignments on the scope overview page.
The following example assigns the Azure Service Bus Data Owner role to your user
account, which provides full access to Azure Service Bus resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Service Bus Data Owner: Enables data access to Service Bus namespace
and its entities (queues, topics, subscriptions, and filters). A member of this
role can send and receive messages from queues or topics/subscriptions.
Azure Service Bus Data Sender: Use this role to grant send access to a Service
Bus namespace and its entities.
Azure Service Bus Data Receiver: Use this role to grant receive access to a
Service Bus namespace and its entities.
If you want to create a custom role, see Rights required for Service Bus operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
1. If you don't have the Service Bus Namespace page open in the Azure portal,
locate your Service Bus namespace using the main search bar or left
navigation.
2. On the overview page, select Access control (IAM) from the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result. Then
choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Launch Visual Studio and sign in to Azure
You can authorize access to the service bus namespace using the following steps:
1. Launch Visual Studio. If you see the Get started window, select the Continue
without code link in the right pane.
Note
Passwordless
1. Select Tools > NuGet Package Manager > Package Manager Console from
the menu.
PowerShell
Install-Package Azure.Messaging.ServiceBus
PowerShell
Install-Package Azure.Identity
Add code to send messages to the queue
1. Replace the contents of Program.cs with the following code. The important steps
are outlined below, with additional information in the code comments.
Passwordless
Important
C#
using Azure.Messaging.ServiceBus;
using Azure.Identity;
// The Service Bus client types are safe to cache and use as a singleton
// for the lifetime of the application, which is best practice when
// messages are being published or read regularly.
//
// Set the transport type to AmqpWebSockets so that the ServiceBusClient
// uses port 443. If you use the default AmqpTcp, ensure that ports 5671
// and 5672 are open.
var clientOptions = new ServiceBusClientOptions
{
TransportType = ServiceBusTransportType.AmqpWebSockets
};
// TODO: Replace the "<NAMESPACE-NAME>" and "<QUEUE-NAME>" placeholders.
client = new ServiceBusClient(
    "<NAMESPACE-NAME>.servicebus.windows.net",
    new DefaultAzureCredential(),
    clientOptions);
sender = client.CreateSender("<QUEUE-NAME>");
// create a batch
using ServiceBusMessageBatch messageBatch = await sender.CreateMessageBatchAsync();
try
{
    // Use the producer client to send the batch of messages to the Service Bus queue
    await sender.SendMessagesAsync(messageBatch);
    Console.WriteLine($"A batch of {numOfMessages} messages has been published to the queue.");
}
finally
{
    // Calling DisposeAsync on client types is required to ensure that network
    // resources and other unmanaged objects are properly cleaned up.
    await sender.DisposeAsync();
    await client.DisposeAsync();
}
Console.WriteLine("Press any key to end the application");
Console.ReadKey();
Bash
Important
In most cases, it will take a minute or two for the role assignment to
propagate in Azure. In rare cases, it may take up to eight minutes. If you
receive authentication errors when you first run your code, wait a few
moments and try again.
The Active message count value for the queue is now 3. Each time you run
this sender app without retrieving the messages, this value increases by 3.
The current size of the queue increments each time the app adds messages
to the queue.
In the Messages chart in the bottom Metrics section, you can see that there
are three incoming messages for the queue.
Note
Passwordless
1. Select Tools > NuGet Package Manager > Package Manager Console from
the menu.
PowerShell
Install-Package Azure.Messaging.ServiceBus
PowerShell
Install-Package Azure.Identity
Passwordless
C#
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Messaging.ServiceBus;
// the client that owns the connection and can be used to create senders and receivers
ServiceBusClient client;

// the processor that reads and processes messages from the queue
ServiceBusProcessor processor;
C#
3. Append the following code to the end of the Program class. The important steps
are outlined below, with additional information in the code comments.
Passwordless
Important
C#
// The Service Bus client types are safe to cache and use as a singleton
// for the lifetime of the application, which is best practice when
// messages are being published or read regularly.
//
// Set the transport type to AmqpWebSockets so that the ServiceBusClient
// uses port 443. If you use the default AmqpTcp, make sure that ports
// 5671 and 5672 are open.
try
{
    // add handler to process messages
    processor.ProcessMessageAsync += MessageHandler;

    // add handler to process any errors
    processor.ProcessErrorAsync += ErrorHandler;

    // start processing
    await processor.StartProcessingAsync();

    // stop processing
    Console.WriteLine("\nStopping the receiver...");
    await processor.StopProcessingAsync();
    Console.WriteLine("Stopped receiving messages");
}
finally
{
    // Calling DisposeAsync on client types is required to ensure that network
    // resources and other unmanaged objects are properly cleaned up.
    await processor.DisposeAsync();
    await client.DisposeAsync();
}
Passwordless
C#
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Azure.Identity;
// the client that owns the connection and can be used to create senders and receivers
ServiceBusClient client;

// the processor that reads and processes messages from the queue
ServiceBusProcessor processor;

// The Service Bus client types are safe to cache and use as a singleton
// for the lifetime of the application, which is best practice when
// messages are being published or read regularly.
//
// Set the transport type to AmqpWebSockets so that the ServiceBusClient
// uses port 443. If you use the default AmqpTcp, make sure that ports
// 5671 and 5672 are open.
try
{
    // add handler to process messages
    processor.ProcessMessageAsync += MessageHandler;

    // start processing
    await processor.StartProcessingAsync();

    // stop processing
    Console.WriteLine("\nStopping the receiver...");
    await processor.StopProcessingAsync();
    Console.WriteLine("Stopped receiving messages");
}
finally
{
    // Calling DisposeAsync on client types is required to ensure that network
    // resources and other unmanaged objects are properly cleaned up.
    await processor.DisposeAsync();
    await client.DisposeAsync();
}
6. Run the receiver application. You should see the received messages. Press any key
to stop the receiver and the application.
Console
Wait for a minute and then press any key to end the processing
Received: Message 1
Received: Message 2
Received: Message 3
7. Check the portal again. Wait for a few minutes and refresh the page if you don't
see 0 for Active messages.
The Active message count and Current size values are now 0.
In the Messages chart in the bottom Metrics section, you can see that there
are three incoming messages and three outgoing messages for the queue.
Clean up resources
Navigate to your Service Bus namespace in the Azure portal, and select Delete to
delete the namespace and the queue in it.
See also
See the following documentation and samples:
Next steps
Get started with Azure Service Bus topics and subscriptions (.NET)
Get started with Azure Service Bus
topics and subscriptions (.NET)
Article • 01/03/2023
This quickstart shows how to send messages to a Service Bus topic and receive
messages from a subscription to that topic by using the Azure.Messaging.ServiceBus
.NET library.
Note
This quickstart shows you two ways of connecting to Azure Service Bus:
connection string and passwordless. The first option shows you how to use a
connection string to connect to a Service Bus namespace. The second option
shows you how to use your security principal in Azure Active Directory and
role-based access control (RBAC) to connect to a Service Bus namespace.
You don't need to worry about having a hard-coded connection string in your
code, in a configuration file, or in secure storage like Azure Key Vault. If
you're new to Azure, you may find the connection string option easier to follow.
We recommend using the passwordless option in real-world applications and
production environments. For more information, see Authentication and
authorization.
Prerequisites
If you're new to the service, see Service Bus overview before you do this quickstart.
Azure subscription. To use Azure services, including Azure Service Bus, you need a
subscription. If you don't have an existing Azure account, you can sign up for a free
trial .
Visual Studio 2022. The sample application makes use of new features that were
introduced in C# 10. You can still use the Service Bus client library with previous C#
language versions, but the syntax may vary. To use the latest syntax, we
recommend that you install .NET 6.0 or higher and set the language version to
latest. If you're using Visual Studio, versions before Visual Studio 2022 aren't
compatible with the tools required to build C# 10 projects.
To create a namespace:
2. In the left navigation pane of the portal, select All services, select Integration from
the list of categories, hover the mouse over Service Bus, and then select Create on
the Service Bus tile.
3. In the Basics tab of the Create namespace page, follow these steps:
b. For Resource group, choose an existing resource group in which the namespace
will live, or create a new one.
c. Enter a name for the namespace. The namespace name should adhere to the
following naming conventions:
The name must be unique across Azure. The system immediately checks to
see if the name is available.
The name length is at least 6 and at most 50 characters.
The name can contain only letters, numbers, and hyphens ("-").
The name must start with a letter and end with a letter or number.
The name must not end with "-sb" or "-mgmt".
d. For Location, choose the region in which your namespace should be hosted.
e. For Pricing tier, select the pricing tier (Basic, Standard, or Premium) for the
namespace. For this quickstart, select Standard.
Important
If you selected the Premium pricing tier, specify the number of messaging
units. The premium tier provides resource isolation at the CPU and memory
level so that each workload runs in isolation. This resource container is called a
messaging unit. A premium namespace has at least one messaging unit. You
can select 1, 2, 4, 8 or 16 messaging units for each Service Bus Premium
namespace. For more information, see Service Bus Premium Messaging.
5. You see the home page for your service bus namespace.
3. Enter a name for the topic. Leave the other options with their default values.
4. Select Create.
Create a subscription to the topic
1. Select the topic that you created in the previous section.
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to a Service Bus namespace. You don't
need to worry about having a hard-coded connection string in your code, in a
configuration file, or in secure storage like Azure Key Vault.
The second option shows you how to use a connection string to connect to a Service
Bus namespace. If you are new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless (Recommended)
The following example assigns the Azure Service Bus Data Owner role to your user
account, which provides full access to Azure Service Bus resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Service Bus Data Owner: Enables data access to Service Bus namespace
and its entities (queues, topics, subscriptions, and filters). A member of this
role can send and receive messages from queues or topics/subscriptions.
Azure Service Bus Data Sender: Use this role to grant send access to a Service
Bus namespace and its entities.
Azure Service Bus Data Receiver: Use this role to grant receive access to a
Service Bus namespace and its entities.
If you want to create a custom role, see Rights required for Service Bus operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
1. If you don't have the Service Bus Namespace page open in the Azure portal,
locate your Service Bus namespace using the main search bar or left
navigation.
2. On the overview page, select Access control (IAM) from the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result. Then
choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
1. Launch Visual Studio. If you see the Get started window, select the Continue
without code link in the right pane.
Passwordless
1. Select Tools > NuGet Package Manager > Package Manager Console from
the menu.
PowerShell
Install-Package Azure.Messaging.ServiceBus
PowerShell
Install-Package Azure.Identity
Passwordless
C#
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Azure.Identity;

// the client that owns the connection and can be used to create senders and receivers
ServiceBusClient client;

// The Service Bus client types are safe to cache and use as a singleton for the lifetime
// of the application, which is best practice when messages are being published or read
// regularly.

// create a batch
using ServiceBusMessageBatch messageBatch = await sender.CreateMessageBatchAsync();

try
{
    // Use the producer client to send the batch of messages to the Service Bus topic
    await sender.SendMessagesAsync(messageBatch);
    Console.WriteLine($"A batch of {numOfMessages} messages has been published to the topic.");
}
finally
{
    // Calling DisposeAsync on client types is required to ensure that network
    // resources and other unmanaged objects are properly cleaned up.
    await sender.DisposeAsync();
    await client.DisposeAsync();
}
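The snippet above references a sender and a numOfMessages count that are created in earlier steps not shown here. For orientation, here is a hedged, self-contained sketch of the whole sender flow; the topic name mytopic matches the example used later in this article, and <NAMESPACE-NAME> is a placeholder:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Messaging.ServiceBus;

// Placeholder: replace <NAMESPACE-NAME> with your Service Bus namespace name.
ServiceBusClient client = new ServiceBusClient(
    "<NAMESPACE-NAME>.servicebus.windows.net",
    new DefaultAzureCredential());

// The sender is scoped to the topic the messages are published to.
ServiceBusSender sender = client.CreateSender("mytopic");

int numOfMessages = 3;

// Batch up the messages; TryAddMessage fails if a message would exceed the batch size.
using ServiceBusMessageBatch messageBatch = await sender.CreateMessageBatchAsync();
for (int i = 1; i <= numOfMessages; i++)
{
    if (!messageBatch.TryAddMessage(new ServiceBusMessage($"Message {i}")))
    {
        throw new Exception($"The message {i} is too large to fit in the batch.");
    }
}

try
{
    await sender.SendMessagesAsync(messageBatch);
    Console.WriteLine($"A batch of {numOfMessages} messages has been published to the topic.");
}
finally
{
    // Dispose network resources deterministically.
    await sender.DisposeAsync();
    await client.DisposeAsync();
}
```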
Bash
Important
In most cases, it will take a minute or two for the role assignment to
propagate in Azure. In rare cases, it may take up to eight minutes. If you
receive authentication errors when you first run your code, wait a few
moments and try again.
b. On the Overview page, in the bottom-middle pane, switch to the Topics tab,
and select the Service Bus topic. In the following example, it's mytopic .
c. On the Service Bus Topic page, in the Messages chart in the bottom Metrics
section, you can see that there are three incoming messages for the topic. If you
don't see the value, wait a few minutes, and refresh the page to see the
updated chart.
d. Select the subscription in the bottom pane. In the following example, it's S1. On
the Service Bus Subscription page, you see the Active message count as 3. The
subscription has received the three messages that you sent to the topic, but no
receiver has picked them up yet.
Note
Passwordless
1. Select Tools > NuGet Package Manager > Package Manager Console from
the menu.
PowerShell
Install-Package Azure.Messaging.ServiceBus
PowerShell
Install-Package Azure.Identity
1. Replace the existing contents of Program.cs with the following properties and
methods:
Passwordless
C#
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Azure.Identity;

// the client that owns the connection and can be used to create senders and receivers
ServiceBusClient client;
Passwordless
Important
C#
// The Service Bus client types are safe to cache and use as a singleton for the lifetime
// of the application, which is best practice when messages are being published or read
// regularly.
//
// Create the clients that we'll use for sending and processing messages.
// TODO: Replace the <NAMESPACE-NAME> placeholder
client = new ServiceBusClient(
    "<NAMESPACE-NAME>.servicebus.windows.net",
    new DefaultAzureCredential());

try
{
    // add handler to process messages
    processor.ProcessMessageAsync += MessageHandler;

    // start processing
    await processor.StartProcessingAsync();

    // stop processing
    Console.WriteLine("\nStopping the receiver...");
    await processor.StopProcessingAsync();
    Console.WriteLine("Stopped receiving messages");
}
finally
{
    // Calling DisposeAsync on client types is required to ensure that network
    // resources and other unmanaged objects are properly cleaned up.
    await processor.DisposeAsync();
    await client.DisposeAsync();
}
Passwordless
C#
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Azure.Identity;

// the client that owns the connection and can be used to create senders and receivers
ServiceBusClient client;

// The Service Bus client types are safe to cache and use as a singleton for the lifetime
// of the application, which is best practice when messages are being published or read
// regularly.
//
// Create the clients that we'll use for sending and processing messages.
// TODO: Replace the <NAMESPACE-NAME> placeholder
client = new ServiceBusClient(
    "<NAMESPACE-NAME>.servicebus.windows.net",
    new DefaultAzureCredential());

try
{
    // add handler to process messages
    processor.ProcessMessageAsync += MessageHandler;

    // start processing
    await processor.StartProcessingAsync();

    // stop processing
    Console.WriteLine("\nStopping the receiver...");
    await processor.StopProcessingAsync();
    Console.WriteLine("Stopped receiving messages");
}
finally
{
    // Calling DisposeAsync on client types is required to ensure that network
    // resources and other unmanaged objects are properly cleaned up.
    await processor.DisposeAsync();
    await client.DisposeAsync();
}
5. Run the receiver application. You should see the received messages. Press any key
to stop the receiver and the application.
Console
Wait for a minute and then press any key to end the processing
Received: Message 1 from subscription: S1
Received: Message 2 from subscription: S1
Received: Message 3 from subscription: S1
On the Service Bus Topic page, in the Messages chart, you see three
incoming messages and three outgoing messages. If you don't see these
numbers, wait for a few minutes, and refresh the page to see the updated
chart.
On the Service Bus Subscription page, you see the Active message count as
zero, because a receiver has received the messages from this subscription and
completed them.
Next steps
See the following documentation and samples:
Get started with the Azure Blob Storage client library for .NET. Azure Blob Storage is
Microsoft's object storage solution for the cloud. Follow these steps to install the
package and try out example code for basic tasks. Blob storage is optimized for storing
massive amounts of unstructured data.
Prerequisites
Azure subscription - create one for free
Azure storage account - create a storage account
Current .NET SDK for your operating system. Be sure to get the SDK and not the
runtime.
Setting up
This section walks you through preparing a project to work with the Azure Blob Storage
client library for .NET.
1. At the top of Visual Studio, navigate to File > New > Project...
2. In the dialog window, enter console app into the project template search box
and select the first result. Choose Next at the bottom of the dialog.
3. For the Project Name, enter BlobQuickstart. Leave the default values for the
rest of the fields and select Next.
4. For the Framework, ensure .NET 6.0 is selected. Then choose Create. The new
project will open inside the Visual Studio environment.
C#
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System;
using System.IO;
You can also authorize requests to Azure Blob Storage by using the account access key.
However, this approach should be used with caution. Developers must be diligent to
never expose the access key in an unsecure location. Anyone who has the access key is
able to authorize requests against the storage account, and effectively has access to all
the data. DefaultAzureCredential offers improved management and security benefits
over the account key to allow passwordless authentication. Both options are
demonstrated in the following example.
Passwordless (Recommended)
For example, your app can authenticate using your Visual Studio sign-in credentials
when developing locally. Your app can then use a managed identity once it has
been deployed to Azure. No code changes are required for this transition.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Blob Data Contributor role to your
user account, which provides both read and write access to blob data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Blob Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
1. For local development, make sure you're authenticated with the same Azure
AD account you assigned the role to. You can authenticate via popular
development tools, such as the Azure CLI or Azure PowerShell. The
development tools with which you can authenticate vary across languages.
Azure CLI
Sign-in to Azure through the Azure CLI using the following command:
Azure CLI
az login
Visual Studio
3. Update your Program.cs code to match the following example. When the code
is run on your local workstation during development, it will use the developer
credentials of the prioritized tool you're logged into to authenticate to Azure,
such as the Azure CLI or Visual Studio.
C#
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System;
using System.IO;
using Azure.Identity;

// TODO: Replace <storage-account-name> with your actual storage account name
var blobServiceClient = new BlobServiceClient(
    new Uri("https://<storage-account-name>.blob.core.windows.net"),
    new DefaultAzureCredential());
4. Make sure to update the storage account name in the URI of your
BlobServiceClient . The storage account name can be found on the overview
page of the Azure portal.
Note
Object model
Azure Blob Storage is optimized for storing massive amounts of unstructured data.
Unstructured data doesn't adhere to a particular data model or definition, such as text
or binary data. Blob storage offers three types of resources:
The storage account
A container in the storage account
A blob in the container
Code examples
The sample code snippets in the following sections demonstrate how to perform basic
data operations with the Azure Blob Storage client library for .NET.
Important
Make sure you have installed the correct NuGet packages and added the necessary
using statements in order for the code samples to work, as described in the setting
up section.
Create a container
Decide on a name for the new container. The code below appends a GUID value to the
container name to ensure that it is unique.
Important
C#
To learn more about creating a container, and to explore more code samples, see Create
a blob container with .NET.
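The code for this step isn't shown above; a minimal sketch, assuming the blobServiceClient created earlier, could look like this:

```csharp
// Append a GUID value to the container name to ensure that it is unique.
string containerName = "quickstartblobs" + Guid.NewGuid().ToString();

// Create the container and return a client object for further operations on it.
BlobContainerClient containerClient =
    await blobServiceClient.CreateBlobContainerAsync(containerName);
```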
C#
To learn more about uploading blobs, and to explore more code samples, see Upload a
blob with .NET.
C#
Console.WriteLine("Listing blobs...");
To learn more about listing blobs, and to explore more code samples, see List blobs with
.NET.
Download a blob
Download the previously created blob by calling the DownloadToAsync method. The
example code adds a suffix of "DOWNLOADED" to the file name so that you can see
both files in your local file system.
C#
To learn more about downloading blobs, and to explore more code samples, see
Download a blob with .NET.
Delete a container
The following code cleans up the resources the app created by deleting the entire
container by using DeleteAsync. It also deletes the local files created by the app.
The app pauses for user input by calling Console.ReadLine before it deletes the blob,
container, and local files. This is a good chance to verify that the resources were actually
created correctly, before they are deleted.
C#
// Clean up
Console.Write("Press any key to begin clean up");
Console.ReadLine();
Console.WriteLine("Done");
To learn more about deleting a container, and to explore more code samples, see Delete
and restore a blob container with .NET.
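The upload, list, download, and clean-up snippets described above aren't shown in full; a combined sketch, assuming the containerClient from the create-container step (file and path names are illustrative), could look like this:

```csharp
// Create a local file in the ./data/ directory to upload and download.
string localPath = "data";
Directory.CreateDirectory(localPath);
string fileName = "quickstart" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
await File.WriteAllTextAsync(localFilePath, "Hello, World!");

// Upload: get a blob client for the new blob and upload the file's contents.
BlobClient blobClient = containerClient.GetBlobClient(fileName);
await blobClient.UploadAsync(localFilePath, overwrite: true);

// List all blobs in the container.
Console.WriteLine("Listing blobs...");
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
{
    Console.WriteLine("\t" + blobItem.Name);
}

// Download with a "DOWNLOADED" suffix so both files are visible locally.
string downloadFilePath = localFilePath.Replace(".txt", "DOWNLOADED.txt");
await blobClient.DownloadToAsync(downloadFilePath);

// Clean up: delete the container and the local files.
await containerClient.DeleteAsync();
File.Delete(localFilePath);
File.Delete(downloadFilePath);
```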
Passwordless (Recommended)
C#
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Identity;
Console.WriteLine("Listing blobs...");
// Clean up
Console.Write("Press any key to begin clean up");
Console.ReadLine();
Console.WriteLine("Done");
If you're using Visual Studio, press F5 to build and run the code and interact with the
console app. If you're using the .NET CLI, navigate to your application directory, then
build and run the application.
Console
dotnet build
Console
dotnet run
Output
https://mystorageacct.blob.core.windows.net/quickstartblobs60c70d78-8d93-43ae-954d-8322058cfd64/quickstart2fe6c5b4-7918-46cb-96f4-8c4c5cb2fd31.txt
Listing blobs...
quickstart2fe6c5b4-7918-46cb-96f4-8c4c5cb2fd31.txt
Downloading blob to
./data/quickstart2fe6c5b4-7918-46cb-96f4-8c4c5cb2fd31DOWNLOADED.txt
Before you begin the clean up process, check your data folder for the two files. You can
open them and observe that they are identical.
After you've verified the files, press the Enter key to delete the test files and finish the
demo.
Next steps
In this quickstart, you learned how to upload, download, and list blobs using .NET.
To learn more, see the Azure Blob Storage client libraries for .NET.
For tutorials, samples, quick starts and other documentation, visit Azure for .NET
developers.
To learn more about .NET, see Get started with .NET in 10 minutes .
Quickstart: Azure Queue Storage client
library for .NET
Article • 06/29/2023
Get started with the Azure Queue Storage client library for .NET. Azure Queue Storage is
a service for storing large numbers of messages for later retrieval and processing. Follow
these steps to install the package and try out example code for basic tasks.
Use the Azure Queue Storage client library for .NET to:
Create a queue
Add messages to a queue
Peek at messages in a queue
Update a message in a queue
Get the queue length
Receive messages from a queue
Delete messages from a queue
Delete a queue
Prerequisites
Azure subscription - create one for free
Azure Storage account - create a storage account
Current .NET SDK for your operating system. Be sure to get the SDK and not the
runtime.
Setting up
This section walks you through preparing a project to work with the Azure Queue
Storage client library for .NET.
1. In a console window (such as cmd, PowerShell, or Bash), use the dotnet new
command to create a new console app with the name QueuesQuickstart . This
command creates a simple "hello world" C# project with a single source file named
Program.cs.
Console
Console
cd QueuesQuickstart
Console
The Azure Identity client library package is also needed for passwordless connections to
Azure services.
Console
C#
using Azure;
using Azure.Identity;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;
using System;
using System.Threading.Tasks;
Console.WriteLine("Azure Queue Storage client library - .NET quickstart sample");
Authenticate to Azure
Application requests to most Azure services must be authorized. Using the
DefaultAzureCredential class provided by the Azure Identity client library is the
recommended approach for implementing passwordless connections to Azure services in
your code.
You can also authorize requests to Azure services using passwords, connection strings,
or other credentials directly. However, this approach should be used with caution.
Developers must be diligent to never expose these secrets in an unsecure location.
Anyone who gains access to the password or secret key is able to authenticate.
DefaultAzureCredential offers improved management and security benefits over the
account key to allow passwordless authentication. Both options are demonstrated in the
following example.
Passwordless (Recommended)
For example, your app can authenticate using your Visual Studio sign-in credentials
when developing locally, and then use a managed identity once it has been
deployed to Azure. No code changes are required for this transition.
When developing locally, make sure that the user account that is accessing the
queue data has the correct permissions. You'll need Storage Queue Data
Contributor to read and write queue data. To assign yourself this role, you'll need
to be assigned the User Access Administrator role, or another role that includes the
Microsoft.Authorization/roleAssignments/write action. You can assign Azure RBAC
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn
more about the available scopes for role assignments on the scope overview page.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Queue Data Contributor role to your
user account, which provides both read and write access to queue data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Queue Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Object model
Azure Queue Storage is a service for storing large numbers of messages. A queue
message can be up to 64 KB in size. A queue may contain millions of messages, up to
the total capacity limit of a storage account. Queues are commonly used to create a
backlog of work to process asynchronously. Queue Storage offers three types of
resources:
Storage account: All access to Azure Storage is done through a storage account.
For more information about storage accounts, see Storage account overview
Queue: A queue contains a set of messages. All messages must be in a queue.
Note that the queue name must be all lowercase. For information on naming
queues, see Naming Queues and Metadata.
Message: A message, in any format, of up to 64 KB. A message can remain in the
queue for a maximum of 7 days. For version 2017-07-29 or later, the maximum
time-to-live can be any positive number, or -1 indicating that the message doesn't
expire. If this parameter is omitted, the default time-to-live is seven days.
Code examples
These example code snippets show you how to perform the following actions with the
Azure Queue Storage client library for .NET:
Passwordless (Recommended)
Azure CLI
Sign-in to Azure through the Azure CLI using the following command:
Azure CLI
az login
Once authenticated, you can create and authorize a QueueClient object using
DefaultAzureCredential to access queue data in the storage account.
DefaultAzureCredential automatically discovers and uses the account you signed in
with.
C#
using Azure.Identity;
Next, decide on a name for the queue and create an instance of the QueueClient
class, using DefaultAzureCredential for authorization. We use this client object to
create and interact with the queue resource in the storage account.
Important
Queue names may only contain lowercase letters, numbers, and hyphens, and
must begin with a letter or a number. Each hyphen must be preceded and
followed by a non-hyphen character. The name must also be between 3 and 63
characters long. For more information, see Naming queues and metadata.
Add the following code to the end of the Program.cs file. Make sure to replace the
<storage-account-name> placeholder value:
C#
Note
Messages sent using the QueueClient class must be in a format that can be
included in an XML request with UTF-8 encoding. You can optionally set the
MessageEncoding option to Base64 to handle non-compliant messages.
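As a sketch, the Base64 option mentioned in the note can be set through QueueClientOptions when constructing the client (the queue name myqueue and storage account placeholder are illustrative):

```csharp
// Optionally configure the client to Base64-encode message bodies so they are
// always valid in the XML-based request payload.
var queueClientOptions = new QueueClientOptions
{
    MessageEncoding = QueueMessageEncoding.Base64
};

// Placeholder: replace <storage-account-name> with your storage account name.
var queueClient = new QueueClient(
    new Uri("https://<storage-account-name>.queue.core.windows.net/myqueue"),
    new DefaultAzureCredential(),
    queueClientOptions);
```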
Create a queue
Using the QueueClient object, call the CreateAsync method to create the queue in your
storage account.
C#
C#
C#
C#
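The snippets for creating the queue, adding messages, and peeking aren't shown above. A hedged sketch using the queueClient created earlier (the message text is illustrative; the third send's receipt is saved because the update step uses it):

```csharp
// Create the queue if it doesn't already exist.
await queueClient.CreateAsync();

Console.WriteLine("Adding messages to the queue...");
await queueClient.SendMessageAsync("First message");
await queueClient.SendMessageAsync("Second message");

// Save the receipt so this message can be updated or deleted later.
SendReceipt receipt = await queueClient.SendMessageAsync("Third message");

// Peek at messages without removing them or changing their visibility.
Console.WriteLine("\nPeek at the messages in the queue...");
PeekedMessage[] peekedMessages = await queueClient.PeekMessagesAsync(maxMessages: 10);
foreach (PeekedMessage peekedMessage in peekedMessages)
{
    Console.WriteLine($"Message: {peekedMessage.Body}");
}
```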
// Update a message using the saved receipt from sending the message
await queueClient.UpdateMessageAsync(receipt.MessageId, receipt.PopReceipt,
"Third message has been updated");
C#
QueueProperties properties = queueClient.GetProperties();
C#
You can optionally specify a value for maxMessages , which is the number of messages to
retrieve from the queue. The default is 1 message and the maximum is 32 messages.
You can also specify a value for visibilityTimeout , which hides the messages from
other operations for the timeout period. The default is 30 seconds.
The app pauses for user input by calling Console.ReadLine before it processes and
deletes the messages. Verify in your Azure portal that the resources were created
correctly, before they're deleted. Any messages not explicitly deleted eventually become
visible in the queue again for another chance to process them.
C#
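A sketch of the receive-and-delete step described above, using the same queueClient:

```csharp
Console.WriteLine("\nReceiving messages from the queue...");
// Receive previously sent messages; visibilityTimeout hides them from other
// clients while this app processes them.
QueueMessage[] messages = await queueClient.ReceiveMessagesAsync(maxMessages: 10);

Console.WriteLine("\nPress Enter key to 'process' messages and delete them from the queue...");
Console.ReadLine();

foreach (QueueMessage message in messages)
{
    Console.WriteLine($"Message: {message.Body}");
    // Delete the message once it has been processed.
    await queueClient.DeleteMessageAsync(message.MessageId, message.PopReceipt);
}
```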
Delete a queue
The following code cleans up the resources the app created by deleting the queue using
the DeleteAsync method.
C#
// Clean up
Console.WriteLine($"Deleting queue: {queueClient.Name}");
await queueClient.DeleteAsync();
Console.WriteLine("Done");
In your console window, navigate to your application directory, then build and run the
application.
Console
dotnet build
Console
dotnet run
The output of the app is similar to the following example:
Output
Press Enter key to 'process' messages and delete them from the queue...
When the app pauses before receiving messages, check your storage account in the
Azure portal . Verify the messages are in the queue.
Press the Enter key to receive and delete the messages. When prompted, press the
Enter key again to delete the queue and finish the demo.
Next steps
In this quickstart, you learned how to create a queue and add messages to it using
asynchronous .NET code. Then you learned to peek, retrieve, and delete messages.
Finally, you learned how to delete a message queue.
This tutorial shows you how to configure Azure Functions to connect to Azure Service Bus queues using
managed identities instead of secrets stored in the function app settings. The tutorial is a continuation of
the Create a function app without default storage secrets in its definition tutorial. To learn more about
identity-based connections, see Configure an identity-based connection.
While the procedures shown work generally for all languages, this tutorial currently supports C# class library
functions on Windows specifically.
Prerequisite
Complete the previous tutorial: Create a function app with identity-based connections.
3. On the Basics page, use the following table to configure the Service Bus namespace settings. Use the
default values for the remaining options.
Subscription: Your subscription. The subscription under which your resources are created.
Resource group: myResourceGroup. The resource group you created with your function app.
Namespace name: A globally unique name. The namespace of your instance from which to trigger your function. Because the namespace is publicly accessible, you must use a name that is globally unique across Azure. The name must also be between 6 and 50 characters in length, contain only alphanumeric characters and dashes, and can't start with a number.
Location: myFunctionRegion. The region where you created your function app.
Now that you have a queue, you'll add a role assignment to the managed identity of your function app.
Note
Role requirements for using identity-based connections vary depending on the service and how you
are connecting to it. Needs vary across triggers, input bindings, and output bindings. For more details
on specific role requirements, please refer to the trigger and binding documentation for the service.
1. In your service bus namespace that you just created, select Access Control (IAM). This is where you
can view and configure who has access to the resource.
3. Search for Azure Service Bus Data Receiver, select it, and click Next.
4. On the Members tab, under Assign access to, choose Managed Identity
6. Confirm that the Subscription is the one in which you created the resources earlier.
7. In the Managed identity selector, choose Function App from the System-assigned managed identity
category. The label "Function App" may have a number in parentheses next to it, indicating the
number of apps in the subscription with system-assigned identities.
8. Your app should appear in a list below the input fields. If you don't see it, you can use the Select box
to filter the results with your app's name.
9. Click on your application. It should move down into the Selected members section. Click Select.
10. Back on the Add role assignment screen, click Review + assign. Review the configuration, and then
click Review + assign.
You've granted your function app access to the service bus namespace using managed identities.
3. In Application settings, select + New application setting to create the new setting in the following
table.
4. After you create the two settings, select Save > Confirm.
Note
When using Azure App Configuration or Key Vault to provide settings for Managed Identity
connections, setting names should use a valid key separator such as : or / in place of the __ to
ensure names are resolved correctly.
Now that you've prepared the function app to connect to the service bus namespace using a managed
identity, you can add a new function that uses a Service Bus trigger to your local project.
C#
Console
cd LocalFunctionProj
This replaces the default version of the Service Bus extension package with a version that supports
managed identities.
4. Run the following command to add a Service Bus triggered function to the project:
C#
This adds the code for a new Service Bus trigger and a reference to the extension package. You need
to add a service bus namespace connection setting for this trigger.
5. Open the new ServiceBusTrigger.cs project file and replace the ServiceBusTrigger class with the
following code:
C#
This code sample updates the queue name to myinputqueue , which is the same name as the queue
you created earlier. It also sets the name of the Service Bus connection to ServiceBusConnection . This
is the Service Bus namespace used by the identity-based connection
ServiceBusConnection__fullyQualifiedNamespace you configured in the portal.
Note
If you try to run your functions now using func start you'll receive an error. This is because you don't
have an identity-based connection defined locally. If you want to run your function locally, set the app
setting ServiceBusConnection__fullyQualifiedNamespace in local.settings.json as you did in the
previous section. In addition, you'll need to assign the role to your developer identity. For more
details, please refer to the local development with identity-based connections documentation.
Note
When using Azure App Configuration or Key Vault to provide settings for Managed Identity
connections, setting names should use a valid key separator such as : or / in place of the __ to
ensure names are resolved correctly.
For example, ServiceBusConnection:fullyQualifiedNamespace .
Console
2. Browse to the \bin\Release\netcoreapp3.1\publish subfolder and create a .zip file from its contents.
3. Publish the .zip file by running the following command, replacing the FUNCTION_APP_NAME ,
RESOURCE_GROUP_NAME , and PATH_TO_ZIP parameters as appropriate:
Azure CLI
Now that you have updated the function app with the new trigger, you can verify that it works using the
identity.
4. Keep the previous tab open, and open the Azure portal in a new tab. In your new tab, navigate to your
Service Bus namespace, select Queues from the left blade.
8. Select your open Live Metrics tab and see the Service Bus queue execution.
Congratulations! You have successfully set up your Service Bus queue trigger with a managed identity!
Clean up resources
In the preceding steps, you created Azure resources in a resource group. If you don't expect to need these
resources in the future, you can delete them by deleting the resource group.
From the Azure portal menu or Home page, select Resource groups. Then, on the Resource groups page,
select myResourceGroup.
On the myResourceGroup page, make sure that the listed resources are the ones you want to delete.
Select Delete resource group, type myResourceGroup in the text box to confirm, and then select Delete.
Next steps
In this tutorial, you created a function app with identity-based connections.
Use the following links to learn more Azure Functions with identity-based connections:
Azure Key Vault provides a way to store credentials and other secrets with increased
security. But your code needs to authenticate to Key Vault to retrieve them. Managed
identities for Azure resources help to solve this problem by giving Azure services an
automatically managed identity in Azure Active Directory (Azure AD). You can use this
identity to authenticate to any service that supports Azure AD authentication, including
Key Vault, without having to display credentials in your code.
In this tutorial, you'll create and deploy an Azure web application to Azure App Service.
You'll use a managed identity to authenticate your Azure web app with an Azure key
vault using the Azure Key Vault secret client library for .NET and the Azure CLI. The same
basic principles apply when you use the development language of your choice, Azure
PowerShell, and/or the Azure portal.
For more information about Azure App service web applications and deployment
presented in this tutorial, see:
Prerequisites
To complete this tutorial, you need:
If you already have your web application deployed in Azure App Service, you can skip to
configure web app access to a key vault and modify web application code sections.
Create a .NET Core app
In this step, you'll set up the local .NET Core project.
In a terminal window on your machine, create a directory named akvwebapp and make it
the current directory:
Bash
mkdir akvwebapp
cd akvwebapp
Create a .NET Core app by using the dotnet new web command:
Bash
Run the application locally so you know how it should look when you deploy it to Azure:
Bash
dotnet run
You'll see the "Hello World!" message from the sample app displayed on the page.
For more information about creating web applications for Azure, see Create an ASP.NET
Core web app in Azure App Service.
Bash
git init --initial-branch=main
git add .
git commit -m "first commit"
You can use FTP and local Git to deploy an Azure web app by using a deployment user.
After you configure your deployment user, you can use it for all your Azure
deployments. Your account-level deployment user name and password are different
from your Azure subscription credentials.
To configure the deployment user, run the az webapp deployment user set command.
Choose a user name and password that adhere to these guidelines:
The user name must be unique within Azure. For local Git pushes, it can't contain
the at sign symbol (@).
The password must be at least eight characters long and contain two of the
following three elements: letters, numbers, and symbols.
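As a quick illustration of these two rules, here's a small pre-check sketch (a hypothetical helper written in Python for illustration; Azure performs the authoritative validation server-side):

```python
import re

def meets_deployment_user_rules(username, password):
    """Pre-check the deployment-user guidelines described above.

    Username: must not contain '@' (required for local Git pushes);
    uniqueness within Azure can only be verified by the service itself.
    Password: at least eight characters, containing at least two of the
    three categories (letters, numbers, symbols).
    """
    if "@" in username:
        return False
    if len(password) < 8:
        return False
    categories = (
        bool(re.search(r"[A-Za-z]", password)),     # letters
        bool(re.search(r"[0-9]", password)),        # numbers
        bool(re.search(r"[^A-Za-z0-9]", password)), # symbols
    )
    return sum(categories) >= 2
```

For example, `abc12345` passes (letters plus numbers), while `abcdefgh` fails because it contains only one category.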
Azure CLI
The JSON output shows the password as null. If you get a 'Conflict'. Details: 409
error, change the user name. If you get a 'Bad Request'. Details: 400 error, use a
stronger password.
Record your user name and password so you can use them to deploy your web apps.
Azure CLI
Azure CLI
When the App Service plan is created, the Azure CLI displays information similar to what
you see here:
{
"adminSiteName": null,
"appServicePlanName": "myAppServicePlan",
"geoRegion": "West Europe",
"hostingEnvironmentProfile": null,
"id": "/subscriptions/0000-
0000/resourceGroups/myResourceGroup/providers/Microsoft.Web/serverfarms/myAp
pServicePlan",
"kind": "app",
"location": "West Europe",
"maximumNumberOfWorkers": 1,
"name": "myAppServicePlan",
< JSON data removed for brevity. >
"targetWorkerSizeId": 0,
"type": "Microsoft.Web/serverfarms",
"workerTierName": null
}
Important
Like a key vault, an Azure web app must have a unique name. Replace <your-
webapp-name> with the name of your web app in the following examples.
Azure CLI
When the web app is created, the Azure CLI shows output similar to what you see here:
Local git is configured with url of 'https://<username>@<your-webapp-name>.scm.azurewebsites.net/<your-webapp-name>.git'
{
"availabilityState": "Normal",
"clientAffinityEnabled": true,
"clientCertEnabled": false,
"clientCertExclusionPaths": null,
"cloningInfo": null,
"containerSize": 0,
"dailyMemoryTimeQuota": 0,
"defaultHostName": "<your-webapp-name>.azurewebsites.net",
"deploymentLocalGitUrl": "https://<username>@<your-webapp-
name>.scm.azurewebsites.net/<your-webapp-name>.git",
"enabled": true,
< JSON data removed for brevity. >
}
The URL of the Git remote is shown in the deploymentLocalGitUrl property, in the
format https://<username>@<your-webapp-name>.scm.azurewebsites.net/<your-webapp-
name>.git . Save this URL. You'll need it later.
Now configure your web app to deploy from the main branch:
Azure CLI
Go to your new app by using the following command. Replace <your-webapp-name> with
your app name.
Bash
https://<your-webapp-name>.azurewebsites.net
You'll see the default webpage for a new Azure web app.
Bash
git remote add azure <deploymentLocalGitUrl-from-create-step>
Use the following command to push to the Azure remote to deploy your app. When Git
Credential Manager prompts you for credentials, use the credentials you created in the
Configure the local Git deployment section.
Bash
git push azure main
This command might take a few minutes to run. While it runs, it displays information
similar to what you see here:
Bash
http://<your-webapp-name>.azurewebsites.net
You'll see the "Hello World!" message you saw earlier when you visited
http://localhost:5000.
For more information about deploying web applications using Git, see Local Git
deployment to Azure App Service.
In the Azure CLI, to create the identity for the application, run the az webapp identity
assign command:
Azure CLI
JSON
{
"principalId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"tenantId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"type": "SystemAssigned"
}
To give your web app permission to do get and list operations on your key vault, pass
the principalId to the Azure CLI az keyvault set-policy command:
Azure CLI
From the terminal window, install the Azure Key Vault secret client library for .NET and
Azure Identity client library packages:
Console
dotnet add package Azure.Security.KeyVault.Secrets
dotnet add package Azure.Identity
C#
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;
using Azure.Core;
Add the following lines before the app.UseEndpoints call (.NET 5.0 or earlier) or
app.MapGet call (.NET 6.0), updating the URI to reflect the vaultUri of your key vault.
This code uses DefaultAzureCredential() to authenticate to Key Vault, which uses a token
from the managed identity to authenticate. For more information about authenticating to
Key Vault, see the Developer's Guide. The code also uses exponential backoff for retries
in case Key Vault is being throttled. For more information about Key Vault transaction
limits, see Azure Key Vault throttling guidance.
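As background on the retry behavior mentioned here, exponential backoff can be sketched like this (illustrative Python, not the actual retry policy the .NET SDK applies):

```python
import random

def backoff_delays(max_retries=5, base=0.8, cap=60.0):
    """Compute illustrative exponential-backoff delays with jitter - the
    general strategy a client can use when Key Vault responds with
    HTTP 429 (throttling). The delay doubles on each retry, capped at
    `cap` seconds; random jitter spreads out retries from many clients
    so they don't all hit the service again at the same instant."""
    delays = []
    for attempt in range(max_retries):
        delay = min(cap, base * (2 ** attempt))
        delays.append(delay + random.uniform(0, delay / 2))
    return delays
```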
C#
C#
await context.Response.WriteAsync(secretValue);
.NET 6.0
Update the line app.MapGet("/", () => "Hello World!"); to look like this line:
C#
Now that you've updated your code, you can redeploy it to Azure by using these Git
commands:
Bash
git add .
git commit -m "Updated web app to access my key vault"
git push azure main
Go to your completed web app
Bash
http://<your-webapp-name>.azurewebsites.net
Where before you saw "Hello World!", you should now see the value of your secret
displayed.
Next steps
Use Azure Key Vault with applications deployed to a virtual machine in .NET
Learn more about managed identities for Azure resources
View the Developer's Guide
Secure access to a key vault
How to use managed identities for App
Service and Azure Functions
Article • 06/27/2023
This article shows you how to create a managed identity for App Service and Azure
Functions applications and how to use it to access other resources.
Important
Note
Managed identities are not available for apps deployed in Azure Arc.
A managed identity from Azure Active Directory (Azure AD) allows your app to easily
access other Azure AD-protected resources such as Azure Key Vault. The identity is
managed by the Azure platform and does not require you to provision or rotate any
secrets. For more about managed identities in Azure AD, see Managed identities for
Azure resources.
1. In the left navigation of your app's page, scroll down to the Settings group.
2. Select Identity.
3. Within the System assigned tab, switch Status to On. Click Save.
Azure portal
2. In the left navigation for your app's page, scroll down to the Settings group.
3. Select Identity.
5. Search for the identity you created earlier, select it, and select Add.
Once you select Add, the app restarts.
Important
The back-end services for managed identities maintain a cache per resource URI for
around 24 hours. If you update the access policy of a particular target resource and
immediately retrieve a token for that resource, you may continue to get a cached
token with outdated permissions until that token expires. There's currently no way
to force a token refresh.
HTTP GET
HTTP
GET /MSI/token?resource=https://vault.azure.net&api-version=2019-08-01
HTTP/1.1
Host: localhost:4141
X-IDENTITY-HEADER: 853b9a84-5bfa-4b22-a3f3-0b9a43d9ad8a
HTTP
HTTP/1.1 200 OK
Content-Type: application/json
{
"access_token": "eyJ0eXAi…",
"expires_on": "1586984735",
"resource": "https://vault.azure.net",
"token_type": "Bearer",
"client_id": "5E29463D-71DA-4FE0-8E69-999B57DB23B0"
}
This response is the same as the response for the Azure AD service-to-service
access token request. To access Key Vault, you will then add the value of
access_token to a client connection with the vault.
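To illustrate consuming this response, here's a minimal parsing sketch (Python; field names as in the 2019-08-01 response shown above):

```python
import json

# Sample body matching the response format shown above.
sample_response = """{
  "access_token": "eyJ0eXAi...",
  "expires_on": "1586984735",
  "resource": "https://vault.azure.net",
  "token_type": "Bearer",
  "client_id": "5E29463D-71DA-4FE0-8E69-999B57DB23B0"
}"""

def parse_token_response(body):
    """Parse the managed identity token response. Note that expires_on
    is a Unix epoch timestamp (seconds) returned as a *string* by the
    2019-08-01 API version, so convert it before comparing to the
    current time."""
    token = json.loads(body)
    token["expires_on"] = int(token["expires_on"])
    return token
```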
For more information on the REST endpoint, see REST endpoint reference.
Remove an identity
When you remove a system-assigned identity, it's deleted from Azure Active Directory.
System-assigned identities are also automatically removed from Azure Active Directory
when you delete the app resource itself.
Azure portal
1. In the left navigation of your app's page, scroll down to the Settings group.
2. Select Identity. Then follow the steps based on the identity type:
Note
The IDENTITY_ENDPOINT is a local URL from which your app can request tokens. To get
a token for a resource, make an HTTP GET request to this endpoint, including the
following parameters:
Parameter name | In | Description
resource | Query | The Azure AD resource URI of the resource for which a token should be obtained. This could be one of the Azure services that support Azure AD authentication or any other resource URI.
api-version | Query | The version of the token API to be used. Use 2019-08-01.
Important
If you are attempting to obtain tokens for user-assigned identities, you must
include one of the optional properties. Otherwise the token service will attempt to
obtain a token for a system-assigned identity, which may or may not exist.
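The request shape above can be sketched as follows (illustrative Python; the fallback endpoint and header values are placeholders only - App Service injects the real IDENTITY_ENDPOINT and IDENTITY_HEADER values into the app's environment):

```python
import os
from urllib.parse import urlencode

def build_token_request(resource, client_id=None):
    """Build the URL and header for a token request against the local
    identity endpoint, per the parameters described above. The fallback
    values are illustrative placeholders; at runtime App Service supplies
    the real endpoint and header secret via environment variables."""
    endpoint = os.environ.get("IDENTITY_ENDPOINT",
                              "http://localhost:4141/MSI/token")
    secret = os.environ.get("IDENTITY_HEADER", "<identity-header-value>")
    params = {"resource": resource, "api-version": "2019-08-01"}
    if client_id is not None:
        # Selects a user-assigned identity; omit for system-assigned.
        params["client_id"] = client_id
    return f"{endpoint}?{urlencode(params)}", {"X-IDENTITY-HEADER": secret}

url, headers = build_token_request("https://vault.azure.net")
```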
Next steps
Tutorial: Connect to SQL Database from App Service without secrets using a
managed identity
Access Azure Storage securely using a managed identity
Call Microsoft Graph securely using a managed identity
Connect securely to services with Key Vault secrets
Assign an Azure role for access to blob
data
Article • 04/03/2023
Azure Active Directory (AAD) authorizes access rights to secured resources through
Azure role-based access control (Azure RBAC). Azure Storage defines a set of Azure
built-in roles that encompass common sets of permissions used to access blob data.
When an Azure role is assigned to an Azure AD security principal, Azure grants access to
those resources for that security principal. An Azure AD security principal may be a user,
a group, an application service principal, or a managed identity for Azure resources.
To learn more about using Azure AD to authorize access to blob data, see Authorize
access to blobs using Azure Active Directory.
Note
This article shows how to assign an Azure role for access to blob data in a storage
account. To learn about assigning roles for management operations in Azure
Storage, see Use the Azure Storage resource provider to access management
resources.
Azure portal
To access blob data in the Azure portal with Azure AD credentials, a user must have
the following role assignments:
A data access role, such as Storage Blob Data Reader or Storage Blob Data
Contributor
The Azure Resource Manager Reader role, at a minimum
To learn how to assign these roles to a user, follow the instructions provided in
Assign Azure roles using the Azure portal.
The Reader role is an Azure Resource Manager role that permits users to view
storage account resources, but not modify them. It does not provide read
permissions to data in Azure Storage, but only to account management resources.
The Reader role is necessary so that users can navigate to blob containers in the
Azure portal.
For example, if you assign the Storage Blob Data Contributor role to user Mary at
the level of a container named sample-container, then Mary is granted read, write,
and delete access to all of the blobs in that container. However, if Mary wants to
view a blob in the Azure portal, then the Storage Blob Data Contributor role by
itself will not provide sufficient permissions to navigate through the portal to the
blob in order to view it. The additional permissions are required to navigate
through the portal and view the other resources that are visible there.
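The combined requirement can be summarized in a small sketch (the role names are real built-in roles; the check itself is illustrative, not how Azure actually evaluates access):

```python
# Built-in data access roles for blob data.
DATA_ROLES = {"Storage Blob Data Reader", "Storage Blob Data Contributor",
              "Storage Blob Data Owner"}

def can_view_blob_in_portal(assigned_roles):
    """Per the guidance above: viewing a blob in the portal with Azure AD
    credentials requires BOTH a data access role and an Azure Resource
    Manager role (Reader, at a minimum) to navigate to the container.
    Illustrative sketch only."""
    has_data_role = bool(DATA_ROLES & set(assigned_roles))
    has_arm_reader = "Reader" in assigned_roles
    return has_data_role and has_arm_reader
```

So Mary, with only Storage Blob Data Contributor, fails the check until she is also assigned Reader.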
A user must be assigned the Reader role to use the Azure portal with Azure AD
credentials. However, if a user has been assigned a role with
Microsoft.Storage/storageAccounts/listKeys/action permissions, then the user can
use the portal with the storage account keys, via Shared Key authorization. To use
the storage account keys, Shared Key access must be permitted for the storage
account. For more information on permitting or disallowing Shared Key access, see
Prevent Shared Key authorization for an Azure Storage account.
You can also assign an Azure Resource Manager role that provides additional
permissions beyond the Reader role. Assigning the least possible permissions
is recommended as a security best practice. For more information, see Best
practices for Azure RBAC.
Note
Prior to assigning yourself a role for data access, you will be able to access
data in your storage account via the Azure portal because the Azure portal can
also use the account key for data access. For more information, see Choose
how to authorize access to blob data in the Azure portal.
Keep in mind the following points about Azure role assignments in Azure Storage:
When you create an Azure Storage account, you are not automatically assigned
permissions to access data via Azure AD. You must explicitly assign yourself an
Azure role for Azure Storage. You can assign it at the level of your subscription,
resource group, storage account, or container.
If the storage account is locked with an Azure Resource Manager read-only lock,
then the lock prevents the assignment of Azure roles that are scoped to the
storage account or a container.
If you have set the appropriate allow permissions to access data via Azure AD but
are unable to access the data (for example, you're getting an
"AuthorizationPermissionMismatch" error), be sure to allow enough time for the
permissions changes you have made in Azure AD to replicate, and be sure that you
do not have any deny assignments that block your access. For more information, see
Understand Azure deny assignments.
Note
You can create custom Azure RBAC roles for granular access to blob data. For more
information, see Azure custom roles.
Next steps
What is Azure role-based access control (Azure RBAC)?
Best practices for Azure RBAC
Managed identities in Azure Container
Apps
Article • 03/22/2023
A managed identity from Azure Active Directory (Azure AD) allows your container app to
access other Azure AD-protected resources. For more about managed identities in
Azure AD, see Managed identities for Azure resources.
A system-assigned identity is tied to your container app and is deleted when your
container app is deleted. An app can only have one system-assigned identity.
A user-assigned identity is a standalone Azure resource that can be assigned to
your container app and other resources. A container app can have multiple user-
assigned identities. These identities exist until you delete them.
Your app connects to resources with the managed identity. You don't need to
manage credentials in your container app.
You can use role-based access control to grant specific permissions to a managed
identity.
System-assigned identities are automatically created and managed. They're
deleted when your container app is deleted.
You can add and delete user-assigned identities and assign them to multiple
resources. They're independent of your container app's life cycle.
You can use managed identity to authenticate with a private Azure Container
Registry without a username and password to pull containers for your Container
App.
You can use managed identity to create connections for Dapr-enabled applications
via Dapr components.
Limitations
Using managed identities in scale rules isn't supported. You'll still need to include the
connection string or key in the secretRef of the scaling rule.
Note
When adding a managed identity to a container app deployed before April 11,
2022, you must create a new revision.
Azure portal
1. In the left navigation of your container app's page, scroll down to the Settings
group.
2. Select Identity.
3. Within the System assigned tab, switch Status to On. Select Save.
Add a user-assigned identity
Configuring a container app with a user-assigned identity requires that you first create
the identity then add its resource identifier to your container app's configuration. You
can create user-assigned identities via the Azure portal or the Azure CLI. For information
on creating and managing user-assigned identities, see Manage user-assigned managed
identities.
Azure portal
2. In the left navigation for your container app's page, scroll down to the
Settings group.
3. Select Identity.
5. Search for the identity you created earlier and select it. Select Add.
Configure a target resource
For some resources, you'll need to configure role assignments for your app's managed
identity to grant access. Otherwise, calls from your app to services, such as Azure Key
Vault and Azure SQL Database, will be rejected even if you use a valid token for that
identity. To learn more about Azure role-based access control (Azure RBAC), see What is
RBAC?. To learn more about which resources support Azure Active Directory tokens, see
Azure services that support Azure AD authentication.
Important
The back-end services for managed identities maintain a cache per resource URI for
around 24 hours. If you update the access policy of a particular target resource and
immediately retrieve a token for that resource, you may continue to get a cached
token with outdated permissions until that token expires. There's currently no way
to force a token refresh.
Note
When using the Azure Identity client library, the client ID of the user-assigned
managed identity must be specified.
.NET
Note
When connecting to Azure SQL data sources with Entity Framework Core,
consider using Microsoft.Data.SqlClient, which provides special connection
strings for managed identity connectivity.
For .NET apps, the simplest way to work with a managed identity is through the
Azure Identity client library for .NET. See the respective documentation headings of
the client library for information:
The linked examples use DefaultAzureCredential. It's useful for most scenarios
because the same pattern works in Azure (with managed identities) and on your
local machine (without managed identities).
Azure CLI
az containerapp identity show --name <APP_NAME> --resource-group
<GROUP_NAME>
Azure portal
1. In the left navigation of your app's page, scroll down to the Settings group.
2. Select Identity. Then follow the steps based on the identity type:
Next steps
Monitor an app
Configure role-based access control with
Azure Active Directory for your Azure
Cosmos DB account
Article • 07/12/2023
Note
This article is about role-based access control for data plane operations in Azure Cosmos
DB. If you are using management plane operations, see role-based access control
applied to your management plane operations article.
Azure Cosmos DB exposes a built-in role-based access control system that lets you:
Authenticate your data requests with an Azure Active Directory (Azure AD) identity.
Authorize your data requests with a fine-grained, role-based permission model.
Concepts
The Azure Cosmos DB data plane role-based access control is built on concepts that are
commonly found in other role-based access control systems like Azure role-based access
control:
The permission model is composed of a set of actions; each of these actions maps to
one or multiple database operations. Some examples of actions include reading an item,
writing an item, or executing a query.
Azure Cosmos DB users create role definitions containing a list of allowed actions.
Role definitions get assigned to specific Azure AD identities through role assignments. A
role assignment also defines the scope that the role definition applies to; currently, three
scopes are exposed:
An Azure Cosmos DB account,
An Azure Cosmos DB database,
An Azure Cosmos DB container.
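To make the action model concrete, here's an illustrative sketch of how a role definition's list of allowed actions covers individual database operations, including the wildcard form used later in this article (this is not the service's actual evaluation code):

```python
from fnmatch import fnmatch

# Actions from a hypothetical read-only role definition.
READ_ONLY_ACTIONS = [
    "Microsoft.DocumentDB/databaseAccounts/readMetadata",
    "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/read",
]

def action_allowed(allowed_actions, action):
    """Return True if `action` matches any allowed action, treating a
    trailing '*' as a wildcard (e.g. .../containers/items/* covers every
    item-level action). Illustrative sketch only."""
    return any(fnmatch(action, pattern) for pattern in allowed_actions)
```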
Permission model
Important
This permission model covers only database operations that involve reading and writing
data. It does not cover any kind of management operations on management resources,
including:
Create/Replace/Delete Database
Create/Replace/Delete Container
Read/Replace Container Throughput
Create/Replace/Delete/Read Stored Procedures
Create/Replace/Delete/Read Triggers
Create/Replace/Delete/Read User Defined Functions
You cannot use any Azure Cosmos DB data plane SDK to authenticate management
operations with an Azure AD identity. Instead, you must use Azure role-based access
control through one of the following options:
Read Database and Read Container are considered metadata requests. Access to these
operations can be granted as stated in the following section.
This table lists all the actions exposed by the permission model.

Name | Corresponding database operation(s)
Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/read | Read an individual item by its ID and partition key (point-read).
Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/replace | Replace an existing item.
Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/upsert | "Upsert" an item. This operation creates the item if it doesn't already exist, or replaces it if it does.
Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/executeStoredProcedure | Execute a stored procedure.
Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/manageConflicts | Manage conflicts for multi-write region accounts (that is, list and delete items from the conflict feed).
Note
Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/readChangeFeed
Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*
Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/*
Metadata requests
The Azure Cosmos DB SDKs issue read-only metadata requests during initialization and to
serve specific data requests. These requests fetch various configuration details such as:
The global configuration of your account, which includes the Azure regions the account
is available in.
The partition key of your containers or their indexing policy.
The list of physical partitions that make up a container and their addresses.
They do not fetch any of the data that you've stored in your account.
To ensure the best transparency of our permission model, these metadata requests are
explicitly covered by the Microsoft.DocumentDB/databaseAccounts/readMetadata action. This
action should be allowed in every situation where your Azure Cosmos DB account is accessed
through one of the Azure Cosmos DB SDKs. It can be assigned (through a role assignment) at
any level in the Azure Cosmos DB hierarchy (that is, account, database, or container).
Important
The term role definitions here refers to Azure Cosmos DB specific role definitions. These
are distinct from Azure role-based access control role definitions.
/dbs/<database-name> (database-level),
/dbs/<database-name>/colls/<container-name> (container-level).
Note
PowerShell
$resourceGroupName = "<myResourceGroup>"
$accountName = "<myCosmosAccount>"
New-AzCosmosDBSqlRoleDefinition -AccountName $accountName `
-ResourceGroupName $resourceGroupName `
-Type CustomRole -RoleName MyReadOnlyRole `
-DataAction @( `
'Microsoft.DocumentDB/databaseAccounts/readMetadata', `
'Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/read', `
'Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/executeQuery', `
'Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/readChangeFeed') `
-AssignableScope "/"
PowerShell
PowerShell
Output
RoleName : MyReadWriteRole
Id :
/subscriptions/<mySubscriptionId>/resourceGroups/<myResourceGroup>/providers/Micr
osoft.DocumentDB/databaseAcc
ounts/<myCosmosAccount>/sqlRoleDefinitions/<roleDefinitionId>
Type : CustomRole
Permissions : {Microsoft.Azure.Management.CosmosDB.Models.Permission}
AssignableScopes :
{/subscriptions/<mySubscriptionId>/resourceGroups/<myResourceGroup>/providers/Mic
rosoft.DocumentDB/databaseAc
counts/<myCosmosAccount>}
RoleName : MyReadOnlyRole
Id :
/subscriptions/<mySubscriptionId>/resourceGroups/<myResourceGroup>/providers/Micr
osoft.DocumentDB/databaseAcc
ounts/<myCosmosAccount>/sqlRoleDefinitions/<roleDefinitionId>
Type : CustomRole
Permissions : {Microsoft.Azure.Management.CosmosDB.Models.Permission}
AssignableScopes :
{/subscriptions/<mySubscriptionId>/resourceGroups/<myResourceGroup>/providers/Mic
rosoft.DocumentDB/databaseAc
counts/<myCosmosAccount>}
JSON
{
"RoleName": "MyReadOnlyRole",
"Type": "CustomRole",
"AssignableScopes": ["/"],
"Permissions": [{
"DataActions": [
"Microsoft.DocumentDB/databaseAccounts/readMetadata",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/read",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/executeQuery",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/readChangeFeed"
]
}]
}
Azure CLI
resourceGroupName='<myResourceGroup>'
accountName='<myCosmosAccount>'
az cosmosdb sql role definition create --account-name $accountName --resource-
group $resourceGroupName --body @role-definition-ro.json
Create a role named MyReadWriteRole that contains all actions in a file named role-definition-
rw.json:
JSON
{
"RoleName": "MyReadWriteRole",
"Type": "CustomRole",
"AssignableScopes": ["/"],
"Permissions": [{
"DataActions": [
"Microsoft.DocumentDB/databaseAccounts/readMetadata",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/*",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
]
}]
}
Azure CLI
Azure CLI
JSON
[
{
"assignableScopes": [
"/subscriptions/<mySubscriptionId>/resourceGroups/<myResourceGroup>/providers/Mic
rosoft.DocumentDB/databaseAccounts/<myCosmosAccount>"
],
"id":
"/subscriptions/<mySubscriptionId>/resourceGroups/<myResourceGroup>/providers/Mic
rosoft.DocumentDB/databaseAccounts/<myCosmosAccount>/sqlRoleDefinitions/<roleDefi
nitionId>",
"name": "<roleDefinitionId>",
"permissions": [
{
"dataActions": [
"Microsoft.DocumentDB/databaseAccounts/readMetadata",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/*",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
],
"notDataActions": []
}
],
"resourceGroup": "<myResourceGroup>",
"roleName": "MyReadWriteRole",
"sqlRoleDefinitionGetResultsType": "CustomRole",
"type": "Microsoft.DocumentDB/databaseAccounts/sqlRoleDefinitions"
},
{
"assignableScopes": [
"/subscriptions/<mySubscriptionId>/resourceGroups/<myResourceGroup>/providers/Mic
rosoft.DocumentDB/databaseAccounts/<myCosmosAccount>"
],
"id":
"/subscriptions/<mySubscriptionId>/resourceGroups/<myResourceGroup>/providers/Mic
rosoft.DocumentDB/databaseAccounts/<myCosmosAccount>/sqlRoleDefinitions/<roleDefi
nitionId>",
"name": "<roleDefinitionId>",
"permissions": [
{
"dataActions": [
"Microsoft.DocumentDB/databaseAccounts/readMetadata",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/read",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/executeQuery",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/readChangeFeed"
],
"notDataActions": []
}
],
"resourceGroup": "<myResourceGroup>",
"roleName": "MyReadOnlyRole",
"sqlRoleDefinitionGetResultsType": "CustomRole",
"type": "Microsoft.DocumentDB/databaseAccounts/sqlRoleDefinitions"
}
]
Using Azure Resource Manager templates
For a reference and examples of using Azure Resource Manager templates to create role
definitions, see Microsoft.DocumentDB databaseAccounts/sqlRoleDefinitions.
The principal ID of the identity that the role definition should be assigned to.
/dbs/<database-name> (database-level)
/dbs/<database-name>/colls/<container-name> (container-level)
The scope must match or be a subscope of one of the role definition's assignable scopes.
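The match-or-subscope rule can be sketched as follows (illustrative only, not the service's implementation):

```python
def is_within_scope(assignable_scope, assignment_scope):
    """Check that a role assignment's scope matches or sits under an
    assignable scope from the role definition. "/" (account level)
    covers everything; "/dbs/mydb" covers itself and any
    "/dbs/mydb/colls/..." container scope below it. Illustrative sketch."""
    if assignable_scope == "/":
        return True
    return (assignment_scope == assignable_scope
            or assignment_scope.startswith(assignable_scope + "/"))
```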
Note
If you want to create a role assignment for a service principal, make sure to use its Object
ID as found in the Enterprise applications section of the Azure Active Directory portal
blade.
Note
PowerShell
$resourceGroupName = "<myResourceGroup>"
$accountName = "<myCosmosAccount>"
$readOnlyRoleDefinitionId = "<roleDefinitionId>" # as fetched above
# For Service Principals make sure to use the Object ID as found in the
Enterprise applications section of the Azure Active Directory portal blade.
$principalId = "<aadPrincipalId>"
New-AzCosmosDBSqlRoleAssignment -AccountName $accountName `
-ResourceGroupName $resourceGroupName `
-RoleDefinitionId $readOnlyRoleDefinitionId `
-Scope "/" `
-PrincipalId $principalId
Azure CLI
resourceGroupName='<myResourceGroup>'
accountName='<myCosmosAccount>'
readOnlyRoleDefinitionId='<roleDefinitionId>' # as fetched above
# For Service Principals make sure to use the Object ID as found in the
Enterprise applications section of the Azure Active Directory portal blade.
principalId='<aadPrincipalId>'
az cosmosdb sql role assignment create --account-name $accountName --resource-
group $resourceGroupName --scope "/" --principal-id $principalId --role-
definition-id $readOnlyRoleDefinitionId
The way you create a TokenCredential instance is beyond the scope of this article. There are
many ways to create such an instance depending on the type of Azure AD identity you want
to use (user principal, service principal, group etc.). Most importantly, your TokenCredential
instance must resolve to the identity (principal ID) that you've assigned your roles to. You can
find examples of creating a TokenCredential instance:
In .NET
In Java
In JavaScript
In Python
In .NET
The Azure Cosmos DB role-based access control is currently supported in the .NET SDK V3.
C#
In Java
The Azure Cosmos DB role-based access control is currently supported in the Java SDK V4.
Java
In JavaScript
The Azure Cosmos DB role-based access control is currently supported in the JavaScript SDK
V3.
JavaScript
In Python
The Azure Cosmos DB role-based access control is supported in the Python SDK versions
4.3.0b4 and higher.
Python
aad_credentials = ClientSecretCredential(
tenant_id="<azure-ad-tenant-id>",
client_id="<client-application-id>",
client_secret="<client-application-secret>")
client = CosmosClient("<account-endpoint>", aad_credentials)
type=aad&ver=1.0&sig=<token-from-oauth>
Note
The data explorer exposed in the Azure portal does not support the Azure Cosmos DB
role-based access control yet. To use your Azure AD identity when exploring your data,
you must use the Azure Cosmos DB Explorer instead.
When you access the Azure Cosmos DB Explorer with the
?feature.enableAadDataPlane=true query parameter and sign in, the following logic is
used to access your data:
1. A request to fetch the account's primary key is attempted on behalf of the identity
signed in. If this request succeeds, the primary key is used to access the account's data.
2. If the identity signed in isn't allowed to fetch the account's primary key, this identity is
directly used to authenticate data access. In this mode, the identity must be assigned
with proper role definitions to ensure data access.
Audit data requests
Diagnostic logs get augmented with identity and authorization information for each data
operation when using Azure Cosmos DB role-based access control. This augmentation lets
you perform detailed auditing and retrieve the Azure AD identity used for every data request
sent to your Azure Cosmos DB account.
This additional information flows in the DataPlaneRequests log category and consists of two
extra columns:
aadPrincipalId_g shows the principal ID of the Azure AD identity that was used to
authenticate the request.
JSON
"resources": [
{
"type": "Microsoft.DocumentDB/databaseAccounts",
"properties": {
"disableLocalAuth": true,
// ...
},
// ...
},
// ...
]
Limits
You can create up to 100 role definitions and 2,000 role assignments per Azure Cosmos
DB account.
You can only assign role definitions to Azure AD identities belonging to the same Azure
AD tenant as your Azure Cosmos DB account.
Azure AD group resolution isn't currently supported for identities that belong to more
than 200 groups.
The Azure AD token is currently passed as a header with each individual request sent to
the Azure Cosmos DB service, increasing the overall payload size.
Use managed identities for Azure resources to run code in Azure Container Instances
that interacts with other Azure services - without maintaining any secrets or credentials
in code. The feature provides an Azure Container Instances deployment with an
automatically managed identity in Azure Active Directory.
In this article, you learn more about managed identities in Azure Container Instances
and:
Adapt the examples to enable and use identities in Azure Container Instances to access
other Azure services. These examples are interactive. However, in practice your container
images would run code to access Azure services.
Azure Container Instances supports both types of managed Azure identities: user-
assigned and system-assigned. On a container group, you can enable a system-assigned
identity, one or more user-assigned identities, or both types of identities. If you're
unfamiliar with managed identities for Azure resources, see the overview.
Prerequisites
Use the Bash environment in Azure Cloud Shell. For more information, see
Quickstart for Bash in Azure Cloud Shell.
If you prefer to run CLI reference commands locally, install the Azure CLI. If you're
running on Windows or macOS, consider running Azure CLI in a Docker container.
For more information, see How to run the Azure CLI in a Docker container.
If you're using a local installation, sign in to the Azure CLI by using the az login
command. To finish the authentication process, follow the steps displayed in
your terminal. For other sign-in options, see Sign in with the Azure CLI.
When you're prompted, install the Azure CLI extension on first use. For more
information about extensions, see Use extensions with the Azure CLI.
Run az version to find the version and dependent libraries that are installed. To
upgrade to the latest version, run az upgrade.
This article requires version 2.0.49 or later of the Azure CLI. If using Azure Cloud
Shell, the latest version is already installed.
First, create a resource group named myResourceGroup in the eastus location with the
following az group create command:
Azure CLI
az group create --name myResourceGroup --location eastus
Use the az keyvault create command to create a key vault. Be sure to specify a unique
key vault name.
Azure CLI
az keyvault create \
--name mykeyvault \
--resource-group myResourceGroup \
--location eastus
Store a sample secret in the key vault using the az keyvault secret set command:
Azure CLI
az keyvault secret set \
  --name SampleSecret \
  --vault-name mykeyvault \
  --description "ACIsecret" \
  --value "Hello Container Instances"
Continue with the following examples to access the key vault using either a user-
assigned or system-assigned managed identity in Azure Container Instances.
Create an identity
First create an identity in your subscription using the az identity create command. You
can use the same resource group used to create the key vault, or use a different one.
Azure CLI
az identity create \
--resource-group myResourceGroup \
--name myACIId
To use the identity in the following steps, use the az identity show command to store
the identity's service principal ID and resource ID in variables.
Azure CLI
SP_ID=$(az identity show \
  --resource-group myResourceGroup \
  --name myACIId \
  --query principalId --output tsv)
RESOURCE_ID=$(az identity show \
  --resource-group myResourceGroup \
  --name myACIId \
  --query id --output tsv)
Azure CLI
az keyvault set-policy \
--name mykeyvault \
--resource-group myResourceGroup \
--object-id $SP_ID \
--secret-permissions get
Azure CLI
az container create \
--resource-group myResourceGroup \
--name mycontainer \
--image mcr.microsoft.com/azure-cli \
--assign-identity $RESOURCE_ID \
--command-line "tail -f /dev/null"
Within a few seconds, you should get a response from the Azure CLI indicating that the
deployment has completed. Check its status with the az container show command.
Azure CLI
az container show \
--resource-group myResourceGroup \
--name mycontainer
The identity section in the output looks similar to the following, showing that the identity is set in the container group. The principalId under userAssignedIdentities is the service principal ID of the identity you created in Azure Active Directory:
Output
[...]
"identity": {
"principalId": null,
"tenantId": "xxxxxxxx-f292-4e60-9122-xxxxxxxxxxxx",
"type": "UserAssigned",
"userAssignedIdentities": {
"/subscriptions/xxxxxxxx-0903-4b79-a55a-xxxxxxxxxxxx/resourcegroups/danlep1018/providers/Microsoft.ManagedIdentity/userAssignedIdentities/myACIId": {
"clientId": "xxxxxxxx-5523-45fc-9f49-xxxxxxxxxxxx",
"principalId": "xxxxxxxx-f25b-4895-b828-xxxxxxxxxxxx"
}
}
},
[...]
Azure CLI
az container exec \
--resource-group myResourceGroup \
--name mycontainer \
--exec-command "/bin/bash"
Run the following commands in the bash shell in the container. To get an access token from Azure Active Directory for authenticating to the key vault, run the following command:
Bash
client_id="xxxxxxxx-5523-45fc-9f49-xxxxxxxxxxxx"
curl "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Ffanyv88.com%3A443%2Fhttps%2Fvault.azure.net&client_id=$client_id" -H Metadata:true -s
Output:
Bash
{"access_token":"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx1QiLCJhbGciOiJSUzI1NiIsIng1dCI6Imk2bEdrM0ZaenhSY1ViMkMzbkVRN3N5SEpsWSIsImtpZCI6Imk2bEdrM0ZaenhSY1ViMkMzbkVRN3N5SEpsWSJ9......xxxxxxxxxxxxxxxxx","refresh_token":"","expires_in":"28799","expires_on":"1539927532","not_before":"1539898432","resource":"https://vault.azure.net/","token_type":"Bearer"}
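If you parse this response in code instead of with jq, note that the numeric fields are returned as strings. A minimal Python sketch using an abridged copy of the payload above:

```python
import json

# Abridged copy of the IMDS token response shown above.
imds_response = (
    '{"access_token":"eyJ0eXAi...","refresh_token":"",'
    '"expires_in":"28799","expires_on":"1539927532",'
    '"resource":"https://vault.azure.net/","token_type":"Bearer"}'
)

token = json.loads(imds_response)
# Build the Authorization header value for subsequent requests.
auth_header = f"{token['token_type']} {token['access_token']}"
# expires_on is Unix-epoch seconds, delivered as a string.
expires_on = int(token["expires_on"])
```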
Bash
TOKEN=$(curl 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Ffanyv88.com%3A443%2Fhttps%2Fvault.azure.net' -H Metadata:true | jq -r '.access_token')
Now use the access token to authenticate to key vault and read a secret. Be sure to
substitute the name of your key vault in the URL (https://mykeyvault.vault.azure.net/...):
Bash
curl "https://mykeyvault.vault.azure.net/secrets/SampleSecret/?api-version=7.4" -H "Authorization: Bearer $TOKEN"
The response looks similar to the following, showing the secret. In your code, you would
parse this output to obtain the secret. Then, use the secret in a subsequent operation to
access another Azure resource.
Bash
{"value":"Hello Container Instances","contentType":"ACIsecret","id":"https://mykeyvault.vault.azure.net/secrets/SampleSecret/xxxxxxxxxxxxxxxxxxxx","attributes":{"enabled":true,"created":1539965967,"updated":1539965967,"recoveryLevel":"Purgeable"},"tags":{"file-encoding":"utf-8"}}
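In your own code, extracting the secret is standard JSON parsing. A minimal Python sketch using an abridged copy of the response above:

```python
import json

# Abridged copy of the Key Vault secret response shown above;
# in practice you'd read this from the HTTP response body.
response_body = (
    '{"value":"Hello Container Instances",'
    '"contentType":"ACIsecret",'
    '"attributes":{"enabled":true}}'
)

secret = json.loads(response_body)["value"]
print(secret)  # Hello Container Instances
```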
Azure CLI
Within a few seconds, you should get a response from the Azure CLI indicating that the
deployment has completed. Check its status with the az container show command.
Azure CLI
az container show \
--resource-group myResourceGroup \
--name mycontainer
The identity section in the output looks similar to the following, showing that a
system-assigned identity is created in Azure Active Directory:
Output
[...]
"identity": {
"principalId": "xxxxxxxx-528d-7083-b74c-xxxxxxxxxxxx",
"tenantId": "xxxxxxxx-f292-4e60-9122-xxxxxxxxxxxx",
"type": "SystemAssigned",
"userAssignedIdentities": null
},
[...]
Set a variable to the value of principalId (the service principal ID) of the identity, to use
in later steps.
Azure CLI
SP_ID=$(az container show \
  --resource-group myResourceGroup \
  --name mycontainer \
  --query identity.principalId --out tsv)
Azure CLI
az keyvault set-policy \
--name mykeyvault \
--resource-group myResourceGroup \
--object-id $SP_ID \
--secret-permissions get
Azure CLI
az container exec \
--resource-group myResourceGroup \
--name mycontainer \
--exec-command "/bin/bash"
Run the following commands in the bash shell in the container. First log in to the Azure
CLI using the managed identity:
Azure CLI
az login --identity
From the running container, retrieve the secret from the key vault:
Azure CLI
az keyvault secret show \
  --name SampleSecret \
  --vault-name mykeyvault \
  --query value
Output
User-assigned identity
A user-assigned identity is a resource ID of the form:
"/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ManagedIdentity/userAssignedIdentities/{identityName}"
JSON
"identity": {
"type": "UserAssigned",
"userAssignedIdentities": {
"myResourceID1": {
}
}
}
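The resource ID pattern above can be assembled programmatically. A minimal Python sketch (the helper name is illustrative):

```python
def user_assigned_identity_id(subscription_id, resource_group, name):
    """Build the documented resource ID for a user-assigned
    managed identity from its subscription, resource group, and name."""
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.ManagedIdentity"
        f"/userAssignedIdentities/{name}"
    )
```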
System-assigned identity
JSON
"identity": {
"type": "SystemAssigned"
}
JSON
"identity": {
"type": "SystemAssigned, UserAssigned",
"userAssignedIdentities": {
"myResourceID1": {
}
}
}
...
'/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ManagedIdentity/userAssignedIdentities/{identityName}'
YAML
identity:
type: UserAssigned
userAssignedIdentities:
{'myResourceID1':{}}
System-assigned identity
YAML
identity:
type: SystemAssigned
YAML
identity:
type: SystemAssigned, UserAssigned
userAssignedIdentities:
{'myResourceID1':{}}
Next steps
In this article, you learned about managed identities in Azure Container Instances and
how to:
See an Azure Go SDK example of using a managed identity to access a key vault
from Azure Container Instances.
Azure Service Bus JMS 2.0 developer
guide
Article • 05/03/2023
This guide contains detailed information to help you succeed in communicating with
Azure Service Bus using the Java Message Service (JMS) 2.0 API.
As a Java developer, if you're new to Azure Service Bus, consider reading the following articles.
Note
Azure Service Bus Premium tier supports JMS 1.1 and JMS 2.0.
Azure Service Bus Standard tier supports limited JMS 1.1 functionality. For more details, refer to this documentation.
Note
This guide has been adapted from the Oracle Java EE 6 Tutorial for Java Message Service (JMS).
Connection factory
The connection factory object is used by the client to connect with the JMS provider.
The connection factory encapsulates a set of connection configuration parameters that
are defined by the administrator.
Important
Java applications using the JMS 2.0 API can connect to Azure Service Bus with a connection string, or with a TokenCredential for Azure Active Directory (AAD) backed authentication. When using AAD-backed authentication, be sure to assign the necessary roles and permissions to the identity.
Create a system-assigned managed identity on Azure, and use this identity to create a TokenCredential.
Java
The connection factory can then be instantiated with the parameters below. The token credential and host are required; the other properties are optional.
Java
JMS destination
A destination is the object a client uses to specify the target of the messages it produces
and the source of the messages it consumes.
Destinations map to entities in Azure Service Bus: queues (in point-to-point scenarios) and topics (in pub/sub scenarios).
Connections
A connection encapsulates a virtual connection with a JMS provider. With Azure Service
Bus, this represents a stateful connection between the application and Azure Service Bus
over AMQP.
Java
Sessions
A session is a single-threaded context for producing and consuming messages. It can be
utilized to create messages, message producers and consumers, but it also provides a
transactional context to allow grouping of sends and receives into an atomic unit of
work.
Java
Note
The JMS API doesn't support receiving messages from Service Bus queues or topics that have message sessions enabled.
Session modes
JMSContext
Note
JMSContext combines the functionality provided by the connection and session object.
It can be created from the connection factory object.
Java
JMSContext modes
Just like the Session object, the JMSContext can be created with the same acknowledge
modes as mentioned in Session modes.
Java
JMSContext context =
connectionFactory.createContext(JMSContext.AUTO_ACKNOWLEDGE);
When the mode isn't specified, JMSContext.AUTO_ACKNOWLEDGE is used by default.
Java
Java
context.createProducer().send(destination, message);
Java
The message consumer provides a synchronous way to receive messages from the
destination through the receive() method.
Java
Message m = consumer.receive();
Message m = consumer.receive(0);
Calling receive() with no argument, or with an argument of 0, blocks indefinitely until a message arrives. When a non-zero positive timeout (in milliseconds) is provided, the consumer blocks until a message arrives or that timer expires, returning null on timeout.
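The blocking semantics can be illustrated with Python's standard queue.Queue — this is only an analogy for the receive/timeout contract, not a Service Bus client:

```python
import queue

q = queue.Queue()
q.put("message-1")

# Like consumer.receive(): blocks until a message is available.
m = q.get()

# Like consumer.receive(timeout): waits, then gives up when the timer
# expires (JMS returns null; the Queue analogy raises queue.Empty).
try:
    m2 = q.get(timeout=0.1)
except queue.Empty:
    m2 = None
```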
Java
Java
Consumers on queues are simply client-side objects that live in the context of the Session (and Connection) between the client application and Azure Service Bus. Consumers on topics, by contrast, are composed of two parts:
A client-side object that lives in the context of the Session (or JMSContext), and
A subscription, which is an entity on Azure Service Bus.
The subscriptions are documented here and can be one of the following:
Java
Note
This is because the topic itself doesn't store the messages. As soon as the message
is sent to the topic, it is forwarded to the appropriate subscriptions.
Output
Summary
This developer guide showcased how Java client applications using Java Message
Service (JMS) can connect with Azure Service Bus.
Next steps
For more information on Azure Service Bus and details about Java Message Service (JMS) entities, check out the following links.
This article shows you how to use passwordless connections to Azure databases in
Spring Boot applications deployed to Azure Spring Apps.
In this tutorial, you complete the following tasks using the Azure portal or the Azure CLI.
Both methods are explained in the following procedures.
Note
Prerequisites
An Azure subscription. If you don't already have one, create a free account
before you begin.
Azure CLI 2.45.0 or higher required.
The Azure Spring Apps extension. You can install the extension by using the
command: az extension add --name spring .
Java Development Kit (JDK), version 8, 11, or 17.
A Git client.
cURL or a similar HTTP utility to test functionality.
MySQL command-line client if you choose to run Azure Database for MySQL. You can connect to your server from Azure Cloud Shell using the popular mysql command-line tool. Alternatively, you can use the mysql command line in your local environment.
ODBC Driver 18 for SQL Server if you choose to run Azure SQL Database.
Bash
export AZ_RESOURCE_GROUP=passwordless-tutorial-rg
export AZ_DATABASE_SERVER_NAME=<YOUR_DATABASE_SERVER_NAME>
export AZ_DATABASE_NAME=demodb
export AZ_LOCATION=<YOUR_AZURE_REGION>
export AZ_SPRING_APPS_SERVICE_NAME=<YOUR_AZURE_SPRING_APPS_SERVICE_NAME>
export AZ_SPRING_APPS_APP_NAME=hellospring
export AZ_DB_ADMIN_USERNAME=<YOUR_DB_ADMIN_USERNAME>
export AZ_DB_ADMIN_PASSWORD=<YOUR_DB_ADMIN_PASSWORD>
export AZ_USER_IDENTITY_NAME=<YOUR_USER_ASSIGNED_MANAGED_IDENTITY_NAME>
Replace the placeholders with the following values, which are used throughout this
article:
1. Update Azure CLI with the Azure Spring Apps extension by using the following
command:
Azure CLI
Azure CLI
az login
az account list --output table
az account set --subscription <name-or-ID-of-subscription>
3. Use the following commands to create a resource group to contain your Azure
Spring Apps service and an instance of the Azure Spring Apps service:
Azure CLI
az group create \
--name $AZ_RESOURCE_GROUP \
--location $AZ_LOCATION
az spring create \
--resource-group $AZ_RESOURCE_GROUP \
--name $AZ_SPRING_APPS_SERVICE_NAME
Azure CLI
2. The SQL server is empty, so create a new database by using the following
command:
Azure CLI
az sql db create \
--resource-group $AZ_RESOURCE_GROUP \
--server $AZ_DATABASE_SERVER_NAME \
--name $AZ_DATABASE_NAME
Azure CLI
Azure CLI
Note
Make sure the Azure CLI uses 64-bit Python; 32-bit Python has a compatibility issue with the command's dependency pyodbc. You can check the Azure CLI's Python information with the command az --version. If it shows [MSC v.1929 32 bit (Intel)], the CLI is using 32-bit Python. The solution
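As a quick check, a Python interpreter's bitness can be read from its pointer size using only the standard library:

```python
import struct

# Pointer size in bits: 64 on a 64-bit interpreter, 32 on a 32-bit one.
bits = struct.calcsize("P") * 8
print(f"Running {bits}-bit Python")
```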
Azure CLI
az spring connection create sql \
--resource-group $AZ_RESOURCE_GROUP \
--service $AZ_SPRING_APPS_SERVICE_NAME \
--app $AZ_SPRING_APPS_APP_NAME \
--target-resource-group $AZ_RESOURCE_GROUP \
--server $AZ_DATABASE_SERVER_NAME \
--database $AZ_DATABASE_NAME \
--system-identity
This Service Connector command does the following tasks in the background:
Note
If you see the error message The subscription is not registered to use Microsoft.ServiceLinker, run the command az provider register --namespace Microsoft.ServiceLinker, and then run the connection command again.
Bash
git clone https://github.com/Azure-Samples/quickstart-spring-data-jdbc-sql-server passwordless-sample
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.5.4</version>
</dependency>
There's currently no Spring Cloud Azure starter for Azure SQL Database, but
the azure-identity dependency is required.
Bash
logging.level.org.springframework.jdbc.core=DEBUG
spring.sql.init.mode=always
EOF
Bash
cd passwordless-sample
./mvnw clean package -DskipTests
6. Query the app status after deployment by using the following command:
Azure CLI
Bash
JSON
Bash
curl https://${AZ_SPRING_APPS_SERVICE_NAME}-hellospring.azuremicroservices.io
This command returns the list of "todo" items, including the item you've created, as
shown in the following example:
JSON
Clean up resources
To clean up all resources used during this tutorial, delete the resource group by using
the following command:
Azure CLI
az group delete \
--name $AZ_RESOURCE_GROUP \
--yes
Next steps
Spring Cloud Azure documentation
Use Spring Data JDBC with Azure
Database for MySQL
Article • 02/28/2023
This tutorial demonstrates how to store data in an Azure Database for MySQL database using Spring Data JDBC.
In this tutorial, we include two authentication methods: Azure Active Directory (Azure
AD) authentication and MySQL authentication. The Passwordless tab shows the Azure
AD authentication and the Password tab shows the MySQL authentication.
MySQL authentication uses accounts stored in MySQL. If you choose to use passwords
as credentials for the accounts, these credentials will be stored in the user table.
Because these passwords are stored in MySQL, you need to manage the rotation of the
passwords by yourself.
Prerequisites
An Azure subscription - create one for free .
Apache Maven .
Azure CLI.
If you don't have a Spring Boot application, create a Maven project with the Spring
Initializr . Be sure to select Maven Project and, under Dependencies, add the
Spring Web, Spring Data JDBC, and MySQL Driver dependencies, and then select
Java version 8 or higher.
If you don't have one, create an Azure Database for MySQL Flexible Server instance
named mysqlflexibletest . For instructions, see Quickstart: Use the Azure portal to
create an Azure Database for MySQL Flexible Server. Then, create a database
named demo . For instructions, see Create and manage databases for Azure
Database for MySQL Flexible Server.
To be able to use your database, open the server's firewall to allow the local IP address
to access the database server. For more information, see Manage firewall rules for Azure
Database for MySQL - Flexible Server using the Azure portal.
If you're connecting to your MySQL server from Windows Subsystem for Linux (WSL) on
a Windows computer, you need to add the WSL host IP address to your firewall.
Passwordless (Recommended)
You can use the following method to create a non-admin user that uses a
passwordless connection.
Azure CLI
az extension add --name serviceconnector-passwordless --upgrade
Azure CLI
When the command completes, take note of the username in the console
output.
To install the Spring Cloud Azure Starter JDBC MySQL module, add the following
dependencies to your pom.xml file:
XML
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-dependencies</artifactId>
<version>4.8.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
Note
XML
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter-jdbc-mysql</artifactId>
</dependency>
Note
Passwordless (Recommended)
properties
logging.level.org.springframework.jdbc.core=DEBUG
spring.datasource.url=jdbc:mysql://mysqlflexibletest.mysql.database.azure.com:3306/demo?serverTimezone=UTC
spring.datasource.username=<your_mysql_ad_non_admin_username>
spring.datasource.azure.passwordless-enabled=true
spring.sql.init.mode=always
Warning
SQL
3. Create a new Todo Java class. This class is a domain model mapped onto the todo
table that will be created automatically by Spring Boot. The following code ignores
the getters and setters methods.
Java
import org.springframework.data.annotation.Id;

public class Todo {
    public Todo() {
    }

    // other fields, constructor, getters, and setters omitted
    @Id
    private Long id;
}
Java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.context.annotation.Bean;
import org.springframework.data.repository.CrudRepository;
import java.util.stream.Stream;
@SpringBootApplication
public class DemoApplication {
@Bean
ApplicationListener<ApplicationReadyEvent>
basicsApplicationListener(TodoRepository repository) {
return event->repository
.saveAll(Stream.of("A", "B", "C").map(name->new
Todo("configuration", "congratulations, you have set up correctly!",
true)).toList())
.forEach(System.out::println);
    }
}
Tip
5. Start the application. The application stores data into the database. You'll see logs
similar to the following example:
shell
Next steps
Azure for Spring developers Spring Cloud Azure MySQL Samples
Use Spring Data JDBC with Azure
Database for PostgreSQL
Article • 02/28/2023
This tutorial demonstrates how to store data in an Azure Database for PostgreSQL
database using Spring Data JDBC .
In this tutorial, we include two authentication methods: Azure Active Directory (Azure
AD) authentication and PostgreSQL authentication. The Passwordless tab shows the
Azure AD authentication and the Password tab shows the PostgreSQL authentication.
Prerequisites
An Azure subscription - create one for free .
Apache Maven .
Azure CLI.
If you don't have a Spring Boot application, create a Maven project with the Spring
Initializr . Be sure to select Maven Project and, under Dependencies, add the
Spring Web, Spring Data JDBC, and PostgreSQL Driver dependencies, and then
select Java version 8 or higher.
If you don't have one, create an Azure Database for PostgreSQL Flexible Server
instance named postgresqlflexibletest and a database named demo . For
instructions, see Quickstart: Create an Azure Database for PostgreSQL - Flexible
Server in the Azure portal.
To be able to use your database, open the server's firewall to allow the local IP address
to access the database server. For more information, see Firewall rules in Azure
Database for PostgreSQL - Flexible Server.
If you're connecting to your PostgreSQL server from Windows Subsystem for Linux (WSL) on a Windows computer, you need to add the WSL host IP address to your firewall.
Passwordless (Recommended)
You can use the following method to create a non-admin user that uses a
passwordless connection.
Azure CLI
az extension add --name serviceconnector-passwordless --upgrade
Azure CLI
When the command completes, take note of the username in the console
output.
To install the Spring Cloud Azure Starter JDBC PostgreSQL module, add the following
dependencies to your pom.xml file:
XML
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-dependencies</artifactId>
<version>4.8.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
Note
XML
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter-jdbc-postgresql</artifactId>
</dependency>
Note
Passwordless (Recommended)
properties
logging.level.org.springframework.jdbc.core=DEBUG
spring.datasource.url=jdbc:postgresql://postgresqlflexibletest.postgres.database.azure.com:5432/demo?sslmode=require
spring.datasource.username=<your_postgresql_ad_non_admin_username>
spring.datasource.azure.passwordless-enabled=true
spring.sql.init.mode=always
Warning
SQL
3. Create a new Todo Java class. This class is a domain model mapped onto the todo
table that will be created automatically by Spring Boot. The following code ignores
the getters and setters methods.
Java
import org.springframework.data.annotation.Id;

public class Todo {
    public Todo() {
    }

    // other fields, constructor, getters, and setters omitted
    @Id
    private Long id;
}
4. Edit the startup class file to show the following content.
Java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.context.annotation.Bean;
import org.springframework.data.repository.CrudRepository;
import java.util.stream.Stream;
@SpringBootApplication
public class DemoApplication {
@Bean
ApplicationListener<ApplicationReadyEvent>
basicsApplicationListener(TodoRepository repository) {
return event->repository
.saveAll(Stream.of("A", "B", "C").map(name->new
Todo("configuration", "congratulations, you have set up correctly!",
true)).toList())
.forEach(System.out::println);
    }
}
Tip
5. Start the application. The application stores data into the database. You'll see logs
similar to the following example:
shell
Next steps
Azure for Spring developers Spring Cloud Azure PostgreSQL Samples
Configure passwordless database
connections for Java apps on Oracle
WebLogic Servers
Article • 05/30/2023
This article shows you how to configure passwordless database connections for Java
apps on Oracle WebLogic Server offers with the Azure portal.
The offers support passwordless connections for PostgreSQL, MySQL and Azure SQL
databases.
Prerequisites
If you don't have an Azure subscription, create a free account before you begin.
Use the Bash environment in Azure Cloud Shell; make sure the Azure CLI version is 2.43.0 or higher.
If you prefer, install Azure CLI 2.43.0 or higher locally to run Azure CLI commands.
If you're using a local install, sign in with Azure CLI by using the az login
command. To finish the authentication process, follow the steps displayed in
your terminal. See Sign in with Azure CLI for other sign-in options.
When you're prompted, install Azure CLI extensions on first use. For more
information about extensions, see Use extensions with Azure CLI.
Run az version to find the version and dependent libraries that are installed. To
upgrade to the latest version, run az upgrade.
Ensure the Azure identity you use to sign in and complete this article has either the
Owner role in the current subscription or the Contributor and User Access
Administrator roles in the current subscription. For an overview of Azure roles, see
What is Azure role-based access control (Azure RBAC)? For details on the specific
roles required by Oracle WebLogic marketplace offer, see Azure built-in roles.
Azure CLI
RESOURCE_GROUP_NAME="abc1228rg"
az group create \
--name ${RESOURCE_GROUP_NAME} \
--location eastus
Create a flexible server with the az mysql flexible-server create command. This
example creates a flexible server named mysql20221201 with admin user azureuser
and admin password Secret123456 . Replace the password with yours. For more
information, see Create an Azure Database for MySQL Flexible Server using Azure
CLI.
Azure CLI
MYSQL_NAME="mysql20221201"
MYSQL_ADMIN_USER="azureuser"
MYSQL_ADMIN_PASSWORD="Secret123456"
Azure CLI
DATABASE_NAME="contoso"
When the command completes, you should see output similar to the following
example:
Output
For information on how MySQL Flexible Server interacts with managed identities,
see Use Azure Active Directory for authentication with MySQL.
The following example configures the current Azure CLI user as an Azure AD administrator account. To enable Azure AD authentication, you must assign an identity to the MySQL Flexible Server.
First, create a managed identity with az identity create and assign the identity to
MySQL server with az mysql flexible-server identity assign.
Azure CLI
MYSQL_UMI_NAME="id-mysql-aad-20221205"
# create a User Assigned Managed Identity for MySQL to be used for AAD
authentication
az identity create \
--resource-group $RESOURCE_GROUP_NAME \
--name $MYSQL_UMI_NAME
Then, set the current Azure CLI user as the Azure AD administrator account with az
mysql flexible-server ad-admin create.
Azure CLI
Azure CLI
az identity create \
--resource-group ${RESOURCE_GROUP_NAME} \
--name myManagedIdentity
To configure the identity in the following steps, use the az identity show command to
store the identity's client ID in a shell variable.
Azure CLI
Now, connect as the Azure AD administrator user to your MySQL database, and create a MySQL user for your managed identity.
First, you must create a firewall rule to allow access to the MySQL server from your CLI client. Run the following commands to get your current IP address.
Bash
MY_IP=$(curl http://whatismyip.akamai.com)
If you're working on Windows Subsystem for Linux (WSL) with VPN enabled, the
following command may return an incorrect IPv4 address. One way to get your IPv4
address is by visiting whatismyipaddress.com . In any case, set the environment
variable MY_IP as the IPv4 address from which you want to connect to the database.
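If you script this step, it can help to validate that the value really is a literal IPv4 address before using it in a firewall rule. A minimal sketch with Python's standard ipaddress module (the helper name is illustrative):

```python
import ipaddress

def is_ipv4(value):
    """Return True only for a literal IPv4 address such as 203.0.113.7."""
    try:
        return isinstance(ipaddress.ip_address(value.strip()),
                          ipaddress.IPv4Address)
    except ValueError:
        # Not an IP literal at all (e.g. a hostname or an error page).
        return False
```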
Azure CLI
Then, prepare an .sql file to create a database user for the managed identity. The
following example adds a user with login name identity-contoso and grants the
user privileges to access database contoso .
Bash
IDENTITY_LOGIN_NAME="identity-contoso"
Execute the .sql file with the command az mysql flexible-server execute. You can get
your access token with the command az account get-access-token.
Azure CLI
Output
If the .sql file executes successfully, you'll see output similar to the following example:
Output
The managed identity myManagedIdentity now has access to the database when
authenticating with the username identity-contoso .
If you no longer want to access the server from this IP address, you can remove the
firewall rule by using the following command.
Azure CLI
Finally, use the following command to get the connection string that you use in the
next section.
Azure CLI
CONNECTION_STRING="jdbc:mysql://${MYSQL_NAME}.mysql.database.azure.com:3306/${DATABASE_NAME}?useSSL=true"
echo ${CONNECTION_STRING}
Fill in the required information in the Basics pane and other panes if you want to enable
the features. When you reach the Database pane, fill in the passwordless configuration
as shown in the following steps.
The Connection settings section should look like the following screenshot, which
uses Oracle WebLogic Server Cluster on VMs as an example.
You've now finished configuring the passwordless connection. You can continue to fill in
the following panes or select Review + create, then Create to deploy the offer.
Continuing to take Oracle WebLogic Server Cluster on VMs as an example, after the
deployment completes, follow these steps in the Azure portal to find the Admin console
URL.
1. Sign in to the WebLogic Administration Console with the username and password
you provided on the Basics pane.
3. Select the Monitoring tab, where the state of the data source is Running, as shown
in the following screenshot.
4. Select the Testing tab, then select the radio button next to the desired server.
5. Select Test Data Source. You should see a message indicating a successful test, as
shown in the following screenshot.
Clean up resources
If you no longer need these resources, you can delete them by running the following commands:
Azure CLI
Next steps
Learn more about running WLS on AKS or virtual machines by following these links:
WLS on AKS
This article shows you how to use passwordless connections to Azure databases in
Spring Boot applications deployed to Azure Spring Apps.
In this tutorial, you complete the following tasks using the Azure portal or the Azure CLI.
Both methods are explained in the following procedures.
7 Note
Prerequisites
An Azure subscription. If you don't already have one, create a free account
before you begin.
Azure CLI 2.45.0 or higher required.
The Azure Spring Apps extension. You can install the extension by using the
command: az extension add --name spring .
Java Development Kit (JDK), version 8, 11, or 17.
A Git client.
cURL or a similar HTTP utility to test functionality.
MySQL command line client if you choose to run Azure Database for MySQL. You
can connect to your server with Azure Cloud Shell using a popular client tool, the
mysql.exe command-line tool. Alternatively, you can use the mysql command
line in your local environment.
ODBC Driver 18 for SQL Server if you choose to run Azure SQL Database.
Bash
export AZ_RESOURCE_GROUP=passwordless-tutorial-rg
export AZ_DATABASE_SERVER_NAME=<YOUR_DATABASE_SERVER_NAME>
export AZ_DATABASE_NAME=demodb
export AZ_LOCATION=<YOUR_AZURE_REGION>
export AZ_SPRING_APPS_SERVICE_NAME=<YOUR_AZURE_SPRING_APPS_SERVICE_NAME>
export AZ_SPRING_APPS_APP_NAME=hellospring
export AZ_DB_ADMIN_USERNAME=<YOUR_DB_ADMIN_USERNAME>
export AZ_DB_ADMIN_PASSWORD=<YOUR_DB_ADMIN_PASSWORD>
export AZ_USER_IDENTITY_NAME=<YOUR_USER_ASSIGNED_MANAGED_IDENTITY_NAME>
Replace the placeholders with the following values, which are used throughout this
article:
1. Update Azure CLI with the Azure Spring Apps extension by using the following
command:
Azure CLI
Azure CLI
az login
az account list --output table
az account set --subscription <name-or-ID-of-subscription>
3. Use the following commands to create a resource group to contain your Azure
Spring Apps service and an instance of the Azure Spring Apps service:
Azure CLI
az group create \
--name $AZ_RESOURCE_GROUP \
--location $AZ_LOCATION
az spring create \
--resource-group $AZ_RESOURCE_GROUP \
--name $AZ_SPRING_APPS_SERVICE_NAME
Azure CLI
2. The SQL server is empty, so create a new database by using the following
command:
Azure CLI
az sql db create \
--resource-group $AZ_RESOURCE_GROUP \
--server $AZ_DATABASE_SERVER_NAME \
--name $AZ_DATABASE_NAME
Azure CLI
Azure CLI
Note
Make sure the Azure CLI uses 64-bit Python; 32-bit Python has a compatibility
issue with the command's dependency pyodbc. You can check which Python build
the Azure CLI uses by running az --version. If the output shows
[MSC v.1929 32 bit (Intel)], the Azure CLI is using 32-bit Python and you should
reinstall it with a 64-bit build.
Azure CLI
az spring connection create sql \
--resource-group $AZ_RESOURCE_GROUP \
--service $AZ_SPRING_APPS_SERVICE_NAME \
--app $AZ_SPRING_APPS_APP_NAME \
--target-resource-group $AZ_RESOURCE_GROUP \
--server $AZ_DATABASE_SERVER_NAME \
--database $AZ_DATABASE_NAME \
--system-identity
This Service Connector command does the following tasks in the background:
Note
If you see the error message The subscription is not registered to use
Microsoft.ServiceLinker, run the command az provider register --namespace
Microsoft.ServiceLinker to register the resource provider, and then run the
connection command again.
Bash
git clone https://github.com/Azure-Samples/quickstart-spring-data-jdbc-sql-server passwordless-sample
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.5.4</version>
</dependency>
There's currently no Spring Cloud Azure starter for Azure SQL Database, but
the azure-identity dependency is required.
Bash
logging.level.org.springframework.jdbc.core=DEBUG
spring.sql.init.mode=always
EOF
Bash
cd passwordless-sample
./mvnw clean package -DskipTests
6. Query the app status after deployment by using the following command:
Azure CLI
Bash
JSON
Bash
curl https://${AZ_SPRING_APPS_SERVICE_NAME}-hellospring.azuremicroservices.io
This command returns the list of "todo" items, including the item you've created, as
shown in the following example:
JSON
Clean up resources
To clean up all resources used during this tutorial, delete the resource group by using
the following command:
Azure CLI
az group delete \
--name $AZ_RESOURCE_GROUP \
--yes
Next steps
Spring Cloud Azure documentation
Use Spring Data JPA with Azure
Database for MySQL
Article • 02/28/2023
This tutorial demonstrates how to store data in Azure Database for MySQL by using
Spring Data JPA .
The Java Persistence API (JPA) is the standard Java API for object-relational mapping.
In this tutorial, we include two authentication methods: Azure Active Directory (Azure
AD) authentication and MySQL authentication. The Passwordless tab shows the Azure
AD authentication and the Password tab shows the MySQL authentication.
MySQL authentication uses accounts stored in MySQL. If you choose to use passwords
as credentials for the accounts, these credentials will be stored in the user table.
Because these passwords are stored in MySQL, you need to manage the rotation of the
passwords by yourself.
Prerequisites
An Azure subscription - create one for free .
Apache Maven .
Azure CLI.
If you don't have a Spring Boot application, create a Maven project with the Spring
Initializr . Be sure to select Maven Project and, under Dependencies, add the
Spring Web, Spring Data JPA, and MySQL Driver dependencies, and then select
Java version 8 or higher.
If you don't have one, create an Azure Database for MySQL Flexible Server instance
named mysqlflexibletest . For instructions, see Quickstart: Use the Azure portal to
create an Azure Database for MySQL Flexible Server. Then, create a database
named demo . For instructions, see Create and manage databases for Azure
Database for MySQL Flexible Server.
Important
To use passwordless connections, create an Azure AD admin user for your Azure
Database for MySQL instance. For instructions, see the Configure the Azure AD
Admin section of Set up Azure Active Directory authentication for Azure
Database for MySQL - Flexible Server.
To be able to use your database, open the server's firewall to allow the local IP address
to access the database server. For more information, see Manage firewall rules for Azure
Database for MySQL - Flexible Server using the Azure portal.
If you're connecting to your MySQL server from Windows Subsystem for Linux (WSL) on
a Windows computer, you need to add the WSL host IP address to your firewall.
Passwordless (Recommended)
You can use the following method to create a non-admin user that uses a
passwordless connection.
Service Connector (Recommended)
Azure CLI
Azure CLI
When the command completes, take note of the username in the console
output.
To install the Spring Cloud Azure Starter JDBC MySQL module, add the following
dependencies to your pom.xml file:
XML
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-dependencies</artifactId>
<version>4.8.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
XML
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter-jdbc-mysql</artifactId>
</dependency>
Passwordless (Recommended)
properties
logging.level.org.hibernate.SQL=DEBUG
spring.datasource.azure.passwordless-enabled=true
spring.datasource.url=jdbc:mysql://mysqlflexibletest.mysql.database.azure.com:3306/demo?serverTimezone=UTC
spring.datasource.username=<your_mysql_ad_non_admin_username>
spring.jpa.hibernate.ddl-auto=create-drop
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL8Dialect
2. Create a new Todo Java class. This class is a domain model mapped onto the todo
table, which will be created automatically by JPA. The following code omits the
getter and setter methods.
Java
package com.example.demo;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
@Entity
public class Todo {
    public Todo() {
    }
    public Todo(String description, String details, boolean done) {
        this.description = description;
        this.details = details;
        this.done = done;
    }
    @Id
    @GeneratedValue
    private Long id;
    private String description;
    private String details;
    private boolean done;
}
Java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.context.annotation.Bean;
import org.springframework.data.jpa.repository.JpaRepository;
import java.util.stream.Collectors;
import java.util.stream.Stream;
@SpringBootApplication
public class DemoApplication {
@Bean
ApplicationListener<ApplicationReadyEvent>
basicsApplicationListener(TodoRepository repository) {
return event->repository
.saveAll(Stream.of("A", "B", "C").map(name->new
Todo("configuration", "congratulations, you have set up correctly!",
true)).collect(Collectors.toList()))
.forEach(System.out::println);
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
4. Start the application. You'll see logs similar to the following example:
shell
Next steps
Azure for Spring developers Spring Cloud Azure MySQL Samples
Use Spring Data JPA with Azure
Database for PostgreSQL
Article • 02/28/2023
This tutorial demonstrates how to store data in Azure Database for PostgreSQL using
Spring Data JPA .
The Java Persistence API (JPA) is the standard Java API for object-relational mapping.
In this tutorial, we include two authentication methods: Azure Active Directory (Azure
AD) authentication and PostgreSQL authentication. The Passwordless tab shows the
Azure AD authentication and the Password tab shows the PostgreSQL authentication.
Prerequisites
An Azure subscription - create one for free .
Apache Maven .
Azure CLI.
If you don't have a Spring Boot application, create a Maven project with the Spring
Initializr . Be sure to select Maven Project and, under Dependencies, add the
Spring Web, Spring Data JPA, and PostgreSQL Driver dependencies, and then
select Java version 8 or higher.
If you don't have one, create an Azure Database for PostgreSQL Flexible Server
instance named postgresqlflexibletest and a database named demo . For
instructions, see Quickstart: Create an Azure Database for PostgreSQL - Flexible
Server in the Azure portal.
Important
To use passwordless connections, configure the Azure AD admin user for your
Azure Database for PostgreSQL Flexible Server instance. For more information, see
Manage Azure Active Directory roles in Azure Database for PostgreSQL - Flexible
Server.
To be able to use your database, open the server's firewall to allow the local IP address
to access the database server. For more information, see Firewall rules in Azure
Database for PostgreSQL - Flexible Server.
If you're connecting to your PostgreSQL server from Windows Subsystem for Linux
(WSL) on a Windows computer, you need to add the WSL host IP address to your firewall.
Passwordless (Recommended)
You can use the following method to create a non-admin user that uses a
passwordless connection.
Service Connector (Recommended)
Azure CLI
Azure CLI
When the command completes, take note of the username in the console
output.
To install the Spring Cloud Azure Starter JDBC PostgreSQL module, add the following
dependencies to your pom.xml file:
XML
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-dependencies</artifactId>
<version>4.8.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
XML
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter-jdbc-postgresql</artifactId>
</dependency>
Passwordless (Recommended)
properties
logging.level.org.hibernate.SQL=DEBUG
spring.datasource.url=jdbc:postgresql://postgresqlflexibletest.postgres.database.azure.com:5432/demo?sslmode=require
spring.datasource.username=<your_postgresql_ad_non_admin_username>
spring.datasource.azure.passwordless-enabled=true
spring.jpa.hibernate.ddl-auto=create-drop
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
2. Create a new Todo Java class. This class is a domain model mapped onto the todo
table, which will be created automatically by JPA. The following code omits the
getter and setter methods.
Java
package com.example.demo;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
@Entity
public class Todo {
    public Todo() {
    }
    public Todo(String description, String details, boolean done) {
        this.description = description;
        this.details = details;
        this.done = done;
    }
    @Id
    @GeneratedValue
    private Long id;
    private String description;
    private String details;
    private boolean done;
}
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.context.annotation.Bean;
import org.springframework.data.jpa.repository.JpaRepository;
import java.util.stream.Collectors;
import java.util.stream.Stream;
@SpringBootApplication
public class DemoApplication {
@Bean
ApplicationListener<ApplicationReadyEvent>
basicsApplicationListener(TodoRepository repository) {
return event->repository
.saveAll(Stream.of("A", "B", "C").map(name->new
Todo("configuration", "congratulations, you have set up correctly!",
true)).collect(Collectors.toList()))
.forEach(System.out::println);
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
4. Start the application. You'll see logs similar to the following example:
shell
Next steps
Azure for Spring developers Spring Cloud Azure PostgreSQL Samples
Use Spring Kafka with Azure Event Hubs
for Kafka API
Article • 07/24/2023
This tutorial shows you how to configure a Java-based Spring Cloud Stream Binder to
send and receive messages with Azure Event Hubs for Kafka. For more information, see
Use Azure Event Hubs from Apache Kafka applications.
In this tutorial, we'll include two authentication methods: Azure Active Directory (Azure
AD) authentication and Shared Access Signatures (SAS) authentication. The
Passwordless tab shows the Azure AD authentication and the Connection string tab
shows the SAS authentication.
Azure AD authentication is a mechanism for connecting to Azure Event Hubs for Kafka
using identities defined in Azure AD. With Azure AD authentication, you can manage
user identities and access to Microsoft services in a central location, which
simplifies permission management.
SAS authentication uses the connection string of your Azure Event Hubs namespace for
the delegated access to Event Hubs for Kafka. If you choose to use Shared Access
Signatures as credentials, you need to manage the connection string by yourself.
Prerequisites
An Azure subscription - create one for free .
An Azure event hub. If you don't have one, create an event hub by using the Azure portal.
A Spring Boot application. If you don't have one, create a Maven project with the
Spring Initializr . Be sure to select Maven Project and, under Dependencies, add
the Spring Web, Spring for Apache Kafka, and Cloud Stream dependencies, then
select Java version 8 or higher.
Important
Spring Boot version 2.5 or higher is required to complete the steps in this tutorial.
Prepare credentials
Passwordless (Recommended)
Azure Event Hubs supports using Azure Active Directory (Azure AD) to authorize
requests to Event Hubs resources. With Azure AD, you can use Azure role-based
access control (Azure RBAC) to grant permissions to a security principal, which may
be a user or an application service principal.
If you want to run this sample locally with Azure AD authentication, be sure your
user account has authenticated via Azure Toolkit for IntelliJ, Visual Studio Code
Azure Account plugin, or Azure CLI. Also, be sure the account has been granted
sufficient permissions.
Note
When using passwordless connections, you need to grant your account access
to resources. In Azure Event Hubs, assign the Azure Event Hubs Data Receiver
and Azure Event Hubs Data Sender role to the Azure AD account you're
currently using. For more information about granting access roles, see Assign
Azure roles using the Azure portal and Authorize access to Event Hubs
resources using Azure Active Directory.
To install the Spring Cloud Azure Starter module, add the following dependencies to
your pom.xml file:
XML
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-dependencies</artifactId>
<version>4.9.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
XML
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter</artifactId>
</dependency>
1. Configure the Event hub credentials by adding the following properties to your
application.properties file.
Passwordless (Recommended)
properties
spring.cloud.stream.kafka.binder.brokers=${AZ_EVENTHUBS_NAMESPACE_NAME}.servicebus.windows.net:9093
spring.cloud.function.definition=consume;supply
spring.cloud.stream.bindings.consume-in-0.destination=${AZ_EVENTHUB_NAME}
spring.cloud.stream.bindings.consume-in-0.group=$Default
spring.cloud.stream.bindings.supply-out-0.destination=${AZ_EVENTHUB_NAME}
Tip
The related auto-configuration class is
com.azure.spring.cloud.autoconfigure.kafka.AzureKafkaSpringCloudStreamConfiguration.
Note
If you enable automatic topic creation, be sure to add the configuration item
spring.cloud.stream.kafka.binder.replicationFactor , with the value set to at
least 1. For more information, see Spring Cloud Stream Kafka Binder
Reference Guide .
2. Edit the startup class file to show the following content.
Java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.GenericMessage;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Sinks;
import java.util.function.Consumer;
import java.util.function.Supplier;
@SpringBootApplication
public class EventHubKafkaBinderApplication implements CommandLineRunner {

    private static final Logger LOGGER = LoggerFactory.getLogger(EventHubKafkaBinderApplication.class);

    private final Sinks.Many<Message<String>> many = Sinks.many().unicast().onBackpressureBuffer();

    public static void main(String[] args) {
        SpringApplication.run(EventHubKafkaBinderApplication.class, args);
    }

    @Bean
public Supplier<Flux<Message<String>>> supply() {
return ()->many.asFlux()
.doOnNext(m->LOGGER.info("Manually sending
message {}", m))
.doOnError(t->LOGGER.error("Error encountered",
t));
}
@Bean
public Consumer<Message<String>> consume() {
return message->LOGGER.info("New message received: '{}'",
message.getPayload());
}
@Override
public void run(String... args) {
many.emitNext(new GenericMessage<>("Hello World"),
Sinks.EmitFailureHandler.FAIL_FAST);
}
}
Tip
DefaultAzureCredential supports multiple authentication methods and
determines which method to use at runtime. This approach enables your app
to use different authentication methods in different environments (such as
local and production environments) without implementing environment-
specific code. For more information, see DefaultAzureCredential.
3. Start the application. Messages like the following example will be posted in your
application log:
Output
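The supplier/consumer pair above decouples producing from consuming through the
binder and the event hub. The flow can be pictured with a plain in-memory queue
standing in for the topic; this is a conceptual sketch with hypothetical names, not
the real binder:

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Consumer;
import java.util.function.Supplier;

public class BinderFlowSketch {
    // Push one message from the supplier through the "topic" to the consumer.
    public static String roundTrip(Supplier<String> supply, Queue<String> topic) {
        topic.add(supply.get());                     // supply-out-0 -> destination
        StringBuilder received = new StringBuilder();
        Consumer<String> consume = received::append; // consume-in-0 handler
        while (!topic.isEmpty()) {
            consume.accept(topic.poll());
        }
        return received.toString();
    }

    public static void main(String[] args) {
        String got = roundTrip(() -> "Hello World", new ArrayDeque<>());
        System.out.println("New message received: '" + got + "'");
    }
}
```

In the real application the queue is an Event Hubs topic, and the binder invokes
the consumer asynchronously as messages arrive.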
Next steps
Azure for Spring developers
This quickstart guide explains how to build a Java app to manage an Azure Cosmos DB
for NoSQL account. You create the Java app using the SQL Java SDK, and add resources
to your Azure Cosmos DB account by using the Java application.
First, create an Azure Cosmos DB for NoSQL account using the Azure portal. Azure
Cosmos DB is a multi-model database service that lets you quickly create and query
document, table, key-value, and graph databases with global distribution and horizontal
scale capabilities. You can try Azure Cosmos DB for free, without a credit card
or an Azure subscription.
Important
This quickstart is for Azure Cosmos DB Java SDK v4 only. For more information, see
the release notes, Maven repository , performance tips, and troubleshooting
guide. If you currently use an older version than v4, see the Migrate to Azure
Cosmos DB Java SDK v4 guide for help upgrading to v4.
Tip
If you work with Azure Cosmos DB resources in a Spring application, consider using
Spring Cloud Azure as an alternative. Spring Cloud Azure is an open-source project
that provides seamless Spring integration with Azure services. To learn more about
Spring Cloud Azure, and to see an example using Cosmos DB, see Access data with
Azure Cosmos DB NoSQL API.
Prerequisites
An Azure account with an active subscription. If you don't have an Azure
subscription, you can try Azure Cosmos DB free with no credit card required.
Java Development Kit (JDK) 8 . Point your JAVA_HOME environment variable to the
folder where the JDK is installed.
A Maven binary archive . On Ubuntu, run apt-get install maven to install Maven.
Git . On Ubuntu, run sudo apt-get install git to install Git.
Introductory notes
The structure of an Azure Cosmos DB account: For any API or programming language,
an Azure Cosmos DB account contains zero or more databases, a database (DB) contains
zero or more containers, and a container contains zero or more items, as shown in the
following diagram:
For more information, see Databases, containers, and items in Azure Cosmos DB.
A few important properties are defined at the level of the container, including
provisioned throughput and partition key. The provisioned throughput is measured in
request units (RUs), which have a monetary price and are a substantial determining
factor in the operating cost of the account. Provisioned throughput can be selected at
per-container granularity or per-database granularity, however container-level
throughput specification is typically preferred. To learn more about throughput
provisioning, see Introduction to provisioned throughput in Azure Cosmos DB.
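As a rough illustration of why throughput granularity matters for cost, the following
sketch compares dedicated container-level throughput with a shared database-level
budget. The unit price here is hypothetical and exists only to make the arithmetic
concrete; real Azure pricing varies by region and offer:

```java
public class ThroughputCostSketch {
    // Hypothetical unit price, for illustration only -- real Azure
    // pricing varies by region and offer.
    static final double PRICE_PER_100_RU_PER_HOUR = 0.008;

    // Approximate monthly cost of a steady provisioned RU/s figure.
    public static double monthlyCost(int provisionedRu) {
        return provisionedRu / 100.0 * PRICE_PER_100_RU_PER_HOUR * 730; // ~730 hours/month
    }

    public static void main(String[] args) {
        // Dedicated: each of 5 containers gets its own 400 RU/s.
        double dedicated = 5 * monthlyCost(400);
        // Shared: the same 5 containers share one 1000 RU/s database budget.
        double shared = monthlyCost(1000);
        System.out.printf("dedicated=%.2f shared=%.2f%n", dedicated, shared);
    }
}
```

The trade-off the sketch hints at: shared database-level throughput can be cheaper
when containers rarely peak simultaneously, while dedicated container-level
throughput guarantees each container its own capacity.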
As items are inserted into an Azure Cosmos DB container, the database grows
horizontally by adding more storage and compute to handle requests. Storage and
compute capacity are added in discrete units known as partitions, and you must choose
one field in your documents to be the partition key that maps each document to a
partition. Partitions are managed such that each partition is assigned a roughly equal
slice out of the range of partition key values. Therefore, you're advised to choose a
partition key that's relatively random or evenly distributed. Otherwise, some partitions
see substantially more requests (hot partition) while other partitions see substantially
fewer requests (cold partition). To learn more, see Partitioning and horizontal scaling in
Azure Cosmos DB.
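The effect of partition key choice can be sketched with a toy placement model. This
is only an illustration of hot versus even distribution; Cosmos DB's actual
partitioning uses its own hash ranges, and the class and method names here are
hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

public class PartitionSketch {
    // Toy model: a document lands on partition hash(partitionKey) mod N.
    public static int partitionFor(String partitionKey, int partitionCount) {
        return Math.floorMod(partitionKey.hashCode(), partitionCount);
    }

    public static void main(String[] args) {
        int partitions = 4;

        // A high-cardinality key (e.g. a unique value per document)
        // typically spreads documents across partitions...
        Map<Integer, Integer> load = new HashMap<>();
        for (int i = 0; i < 10_000; i++) {
            load.merge(partitionFor("user-" + i, partitions), 1, Integer::sum);
        }
        System.out.println("Per-partition load: " + load);

        // ...while a low-cardinality key sends every document to the
        // same partition, creating a hot partition.
        Map<Integer, Integer> hot = new HashMap<>();
        for (int i = 0; i < 10_000; i++) {
            hot.merge(partitionFor("personal", partitions), 1, Integer::sum);
        }
        System.out.println("Hot-partition spread: " + hot.keySet());
    }
}
```

The second loop always maps to a single partition, which is exactly the hot-partition
situation the paragraph above warns about.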
Create a database account
Before you can create a document database, you need to create an API for NoSQL
account with Azure Cosmos DB.
1. From the Azure portal menu or the Home page, select Create a resource.
2. Search for Azure Cosmos DB. Select Create > Azure Cosmos DB.
3. On the Create an Azure Cosmos DB account page, select the Create option within
the Azure Cosmos DB for NoSQL section.
To learn more about the API for NoSQL, see Welcome to Azure Cosmos DB.
4. In the Create Azure Cosmos DB Account page, enter the basic settings for the new
Azure Cosmos DB account.
Subscription: Select the Azure subscription that you want to use for this Azure
Cosmos DB account.
Resource Group: Select a resource group, or select Create new, then enter a
unique name for the new resource group.
Location: Select a geographic location to host your Azure Cosmos DB account.
Use the location that is closest to your users to give them the fastest access
to the data.
Apply Azure Cosmos DB free tier discount: Apply or Do not apply. With Azure
Cosmos DB free tier, you get the first 1000 RU/s and 25 GB of storage for free
in an account. Learn more about free tier .
Limit total account throughput: Limit the total amount of throughput that can be
provisioned on this account. This limit prevents unexpected charges related to
provisioned throughput. You can update or remove this limit after your account
is created.
You can have up to one free tier Azure Cosmos DB account per Azure subscription
and must opt in when creating the account. If you don't see the option to apply
the free tier discount, another account in the subscription has already been
enabled with free tier.
Note
The following options are not available if you select Serverless as the Capacity
mode:
5. In the Global Distribution tab, configure the following details. You can leave the
default values for this quickstart:
Multi-region Writes: Disable. Multi-region writes capability allows you to take
advantage of the provisioned throughput for your databases and containers across
the globe.
Availability Zones: Disable. Availability Zones help you further improve the
availability and resiliency of your application.
Note
The following options are not available if you select Serverless as the Capacity
mode in the previous Basics page:
Geo-redundancy
Multi-region Writes
Add a container
You can now use the Data Explorer tool in the Azure portal to create a database and
container.
2. In the Add container page, enter the settings for the new container.
Database ID: ToDoList. Enter ToDoList as the name for the new database. Database
names must contain from 1 through 255 characters, and they cannot contain /, \\,
#, ?, or a trailing space. Check the Share throughput across containers option;
it allows you to share the throughput provisioned on the database across all the
containers within the database. This option also helps with cost savings.
Container ID: Items. Enter Items as the name for your new container. Container
IDs have the same character requirements as database names.
Don't add Unique keys or turn on Analytical store for this example. Unique keys
let you add a layer of data integrity to the database by ensuring the uniqueness of
one or more values per partition key. For more information, see Unique keys in
Azure Cosmos DB. Analytical store is used to enable large-scale analytics against
operational data without any impact to your transactional workloads.
Select OK. The Data Explorer displays the new database and container.
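As a side note on the unique-key scoping mentioned above: uniqueness is enforced per
partition key value, not globally. A toy in-memory model can illustrate that scoping;
the class and method names are hypothetical, not Cosmos DB API:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class UniqueKeySketch {
    // Unique-key constraint is enforced per partition key value: the same
    // value may appear under different partition keys, but not twice
    // under the same one.
    private final Map<String, Set<String>> seen = new HashMap<>();

    public boolean insert(String partitionKey, String uniqueValue) {
        return seen.computeIfAbsent(partitionKey, k -> new HashSet<>())
                   .add(uniqueValue);
    }

    public static void main(String[] args) {
        UniqueKeySketch container = new UniqueKeySketch();
        System.out.println(container.insert("personal", "groceries")); // true
        System.out.println(container.insert("personal", "groceries")); // false: duplicate in partition
        System.out.println(container.insert("work", "groceries"));     // true: different partition
    }
}
```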
JSON
{
"id": "1",
"category": "personal",
"name": "groceries",
"description": "Pick up apples and strawberries.",
"isComplete": false
}
3. Once you've added the JSON to the Documents tab, select Save.
4. Create and save one more document where you insert a unique value for the id
property, and change the other properties as you see fit. Your new documents can
have any structure you want as Azure Cosmos DB doesn't impose any schema on
your data.
1. At the top of the Items tab in Data Explorer, review the default query SELECT *
FROM c . This query retrieves and displays all documents from the container ordered
by ID.
2. To change the query, select Edit Filter, replace the default query with ORDER BY
c._ts DESC , and then select Apply Filter.
The modified query displays the documents in descending order based on their
timestamp ( _ts ), so now your second document is listed first.
If you're familiar with SQL syntax, you can enter any supported SQL queries in the query
predicate box. You can also use Data Explorer to create stored procedures, user defined
functions, and triggers for server-side business logic.
Data Explorer provides easy access in the Azure portal to all of the built-in
programmatic data access features available in the APIs. You can also use the Azure
portal to scale throughput, get keys and connection strings, and review metrics and
SLAs for your Azure Cosmos DB account.
Run the following command to clone the sample repository. This command creates a
copy of the sample app on your computer.
Bash
To learn more about DefaultAzureCredential , see Azure authentication with Java
and Azure Identity. DefaultAzureCredential supports multiple authentication
methods and determines which method should be used at runtime. This approach
enables your app to use different authentication methods in different environments
(local vs. production) without implementing environment-specific code.
For example, your app can authenticate using your Visual Studio sign-in credentials
when developing locally, and then use a managed identity once it has been
deployed to Azure. No code changes are required for this transition.
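The fallback behavior described above can be pictured as a chain of token providers
tried in order until one succeeds. This is a conceptual sketch only, not the
azure-identity implementation; all names and token values are hypothetical:

```java
import java.util.List;
import java.util.Optional;
import java.util.function.Supplier;

public class CredentialChainSketch {
    // Each provider either yields a token or an empty Optional,
    // representing "this auth method isn't available here".
    public static Optional<String> firstAvailable(List<Supplier<Optional<String>>> chain) {
        for (Supplier<Optional<String>> provider : chain) {
            Optional<String> token = provider.get();
            if (token.isPresent()) {
                return token;
            }
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        // Locally the developer-tool credential resolves first; in Azure it
        // would be empty and the managed-identity provider would win instead.
        Optional<String> token = firstAvailable(List.of(
                Optional::empty,                          // environment credential: not configured
                () -> Optional.of("dev-tool-token"),      // Visual Studio / Azure CLI sign-in
                () -> Optional.of("managed-identity")));  // managed identity (in Azure)
        System.out.println(token.get());
    }
}
```

The point of the chain is that the application code never changes; only which
provider succeeds differs between environments.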
When developing locally with passwordless authentication, make sure the user
account that connects to Cosmos DB is assigned a role with the correct permissions
to perform data operations. Currently, Azure Cosmos DB for NoSQL doesn't include
built-in roles for data operations, but you can create your own using the Azure CLI
or PowerShell.
Azure CLI
az cosmosdb sql role definition create \
--account-name <cosmosdb-account-name> \
--resource-group <resource-group-name> \
--body '{
"RoleName": "PasswordlessReadWrite",
"Type": "CustomRole",
"AssignableScopes": ["/"],
"Permissions": [{
"DataActions": [
"Microsoft.DocumentDB/databaseAccounts/readMetadata",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/item
s/*",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
]
}]
}'
2. When the command completes, copy the ID value from the name field and
paste it somewhere for later use.
3. Assign the role you created to the user account or service principal that will
connect to Cosmos DB. During local development, this will generally be your
own account that's logged into a development tool like Visual Studio or the
Azure CLI. Retrieve the details of your account using the az ad user
command.
Azure CLI
4. Copy the value of the id property out of the results and paste it somewhere
for later use.
5. Assign the custom role you created to your user account using the az
cosmosdb sql role assignment create command and the IDs you copied
previously.
Azure CLI
Azure CLI
Sign-in to Azure through the Azure CLI using the following command:
Azure CLI
az login
Java
Azure CLI
Azure CLI
Java
Java
try {
    CosmosItemResponse<Family> item = container.readItem(family.getId(),
        new PartitionKey(family.getLastName()), Family.class);
    double requestCharge = item.getRequestCharge();
    Duration requestLatency = item.getDuration();
    logger.info("Item successfully read with id {} with a charge of {} and within duration {}",
        item.getItem().getId(), requestCharge, requestLatency);
} catch (CosmosException e) {
    logger.error("Read Item failed with", e);
}
SQL queries over JSON are performed using the queryItems method.
Java
CosmosPagedIterable<Family> familiesPagedIterable = container.queryItems(
    "SELECT * FROM Family WHERE Family.lastName IN ('Andersen', 'Wakefield', 'Johnson')",
    queryOptions, Family.class);

familiesPagedIterable.iterableByPage(10).forEach(cosmosItemPropertiesFeedResponse -> {
    logger.info("Got a page of query result with {} items(s) and request charge of {}",
        cosmosItemPropertiesFeedResponse.getResults().size(),
        cosmosItemPropertiesFeedResponse.getRequestCharge());
});
Bash
cd azure-cosmos-java-getting-started
2. In the git terminal window, use the following command to install the required
Java packages.
Bash
mvn package
3. In the git terminal window, use the following command to start the Java
application. Replace SYNCASYNCMODE with sync-passwordless or async-passwordless ,
depending on which sample code you'd like to run.
Bash
The terminal window displays a notification that the FamilyDB database was
created.
4. The app references the database and container you created via Azure CLI
earlier.
5. The app performs point reads using object IDs and partition key value (which
is lastName in our sample).
6. The app queries items to retrieve all families with last name (Andersen,
Wakefield, Johnson).
7. The app doesn't delete the created resources. Switch back to the portal to
clean up the resources from your account so that you don't incur charges.
There are several advanced scenarios that benefit from client-side throughput control:
Different operations and tasks have different priorities. There can be a need to
prevent normal transactions from being throttled due to data ingestion or copy
activities. Some operations or tasks aren't sensitive to latency and are more
tolerant to being throttled than others.
Warning
Throughput control is not yet supported for gateway mode.
Currently, for serverless Azure Cosmos DB accounts, attempting to use
targetThroughputThreshold to define a percentage will result in failure. You can
only provide an absolute value for target throughput/RU using targetThroughput .
If you don't already have the throughput control container as defined in the examples
above, you can create it as follows. Here we name the container ThroughputControl :
Sync API
Java
Note
The throughput control container must be created with a partition key /groupId
and must have a ttl value set, or throughput control will not function correctly.
Then, to enable the container object used by the current client to use a shared global
control group, we need to create two sets of configuration. The first defines the control
group's groupName and the targetThroughputThreshold or targetThroughput for that
group. If the group doesn't already exist, an entry for it is created in the throughput
control container:
Java
ThroughputControlGroupConfig groupConfig =
new ThroughputControlGroupConfigBuilder()
.groupName("globalControlGroup")
.targetThroughputThreshold(0.25)
.targetThroughput(100)
.build();
The second config you need to create will reference the throughput container you
created earlier, and define some behaviours for it using two parameters:
Java
GlobalThroughputControlConfig globalControlConfig =
    this.client.createGlobalThroughputControlConfigBuilder("ThroughputControlDatabase", "ThroughputControl")
        .setControlItemRenewInterval(Duration.ofSeconds(5))
        .setControlItemExpireInterval(Duration.ofSeconds(11))
        .build();
Now we're ready to enable global throughput control for this container object. Other Cosmos clients running in other JVMs can share the same throughput control group, as long as they reference the same throughput control metadata container and the same throughput control group name.
Java
container.enableGlobalThroughputControlGroup(groupConfig,
globalControlConfig);
Note
Throughput control doesn't do RU pre-calculation of each operation. Instead, it tracks RU usage after the operation based on the response header. As such, throughput control is based on an approximation and doesn't guarantee that amount of throughput will be available for the group at any given time. This means that if the configured RU is so low that a single operation can use it all, throughput control can't prevent the RU from exceeding the configured limit. Therefore, throughput control works best when the configured limit is higher than any single operation that can be executed by a client in the given control group. With that in mind, when reading via query or change feed, you should configure the page size to be a modest amount, so that client throughput control can be recalculated with higher frequency and therefore reflected more accurately at any given time. However, when using throughput control for a write job using bulk, the number of documents executed in a single request is automatically tuned based on the throttling rate to allow throughput control to kick in as early as possible.
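As one illustration of the page-size guidance above, a query can be routed through the control group and drained in modest pages. This is a sketch only: the container handle, query text, and Family item class are assumptions:

```java
// Route the query through the throughput control group and read modest pages,
// so client-side throughput control can re-evaluate RU usage frequently.
CosmosQueryRequestOptions options = new CosmosQueryRequestOptions()
    .setThroughputControlGroupName("globalControlGroup");

container.queryItems("SELECT * FROM c", options, Family.class)
    .iterableByPage(100) // modest preferred page size
    .forEach(page -> System.out.println("Read " + page.getResults().size() + " items"));
```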
Java
ThroughputControlGroupConfig groupConfig =
new ThroughputControlGroupConfigBuilder()
.groupName("localControlGroup")
.targetThroughputThreshold(0.1)
.build();
container.enableLocalThroughputControlGroup(groupConfig);
2. Select a tab such as Latency, and select a timeframe on the right. Compare the
Actual and SLA lines on the charts.
Clean up resources
When you're done with your app and Azure Cosmos DB account, you can delete the
Azure resources you created so you don't incur more charges. To delete the resources:
1. In the Azure portal Search bar, search for and select Resource groups.
2. From the list, select the resource group you created for this quickstart.
Next steps
In this quickstart, you learned how to create an Azure Cosmos DB for NoSQL account,
create a document database and container using Data Explorer, and run a Java app to
do the same thing programmatically. You can now import additional data into your
Azure Cosmos DB account.
Are you capacity planning for a migration to Azure Cosmos DB? You can use information
about your existing database cluster for capacity planning.
If all you know is the number of vcores and servers in your existing database
cluster, read about estimating RUs using vCores or vCPUs.
If you know typical request rates for your current database workload, learn how to
estimate RUs using Azure Cosmos DB capacity planner.
Use Java to send events to or receive
events from Azure Event Hubs
Article • 06/16/2023
This quickstart shows how to send events to and receive events from an event hub using
the azure-messaging-eventhubs Java package.
Tip
If you're new to Azure Event Hubs, see Event Hubs overview before you do this
quickstart.
Prerequisites
Microsoft Azure subscription. To use Azure services, including Azure Event Hubs,
you need a subscription. If you don't have an existing Azure account, you can sign
up for a free trial or use your MSDN subscriber benefits when you create an
account .
A Java development environment. This quickstart uses Eclipse . Java Development
Kit (JDK) with version 8 or above is required.
Create an Event Hubs namespace and an event hub. The first step is to use the
Azure portal to create a namespace of type Event Hubs, and obtain the
management credentials your application needs to communicate with the event
hub. To create a namespace and an event hub, follow the procedure in this article.
Then, get the connection string for the Event Hubs namespace by following
instructions from the article: Get connection string. You use the connection string
later in this quickstart.
Send events
This section shows you how to create a Java application to send events to an event hub.
Passwordless (Recommended)
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-messaging-eventhubs</artifactId>
<version>5.15.0</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.8.0</version>
<scope>compile</scope>
</dependency>
Note
Update the version to the latest version published to the Maven repository.
The following example assigns the Azure Event Hubs Data Owner role to your user
account, which provides full access to Azure Event Hubs resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Event Hubs Data Owner: Enables data access to Event Hubs namespace
and its entities (queues, topics, subscriptions, and filters)
Azure Event Hubs Data Sender: Use this role to give the sender access to
Event Hubs namespace and its entities.
Azure Event Hubs Data Receiver: Use this role to give the receiver access to
Event Hubs namespace and its entities.
If you want to create a custom role, see Rights required for Event Hubs operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your Event Hubs namespace using the main
search bar or left navigation.
2. On the overview page, select Access control (IAM) from the left-hand
menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Azure Event Hubs Data Owner and select the matching
result. Then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Write code to send messages to the event hub
Passwordless (Recommended)
Add a class named Sender, and add the following code to the class:
Important
Update <NAMESPACE NAME> with the name of your Event Hubs namespace.
Update <EVENT HUB NAME> with the name of your event hub.
Java
package ehubquickstart;

import com.azure.messaging.eventhubs.*;
import com.azure.identity.*;

public class Sender {
    // Replace <NAMESPACE NAME> with the name of your Event Hubs namespace.
    private static final String namespaceName = "<NAMESPACE NAME>.servicebus.windows.net";
    // Replace <EVENT HUB NAME> with the name of your event hub.
    // Example: private static final String eventHubName = "ordersehub";
    private static final String eventHubName = "<EVENT HUB NAME>";

    public static void main(String[] args) {
        // Build a producer that authenticates with DefaultAzureCredential (passwordless).
        EventHubProducerClient producer = new EventHubClientBuilder()
            .credential(namespaceName, eventHubName, new DefaultAzureCredentialBuilder().build())
            .buildProducerClient();

        // create a batch, add an event to it, and send it
        EventDataBatch eventDataBatch = producer.createBatch();
        eventDataBatch.tryAdd(new EventData("First event"));
        producer.send(eventDataBatch);
        producer.close();
    }
}
Build the program, and ensure that there are no errors. You'll run this program after you
run the receiver program.
Receive events
The code in this tutorial is based on the EventProcessorClient sample on GitHub ,
which you can examine to see the full working application.
Follow these recommendations when using Azure Blob Storage as a checkpoint store:
Use a separate container for each processor group. You can use the same storage
account, but use one container per group.
Don't use the container for anything else, and don't use the storage account for
anything else.
The storage account should be in the same region as the deployed application. If
the application is on-premises, try to choose the closest region possible.
On the Storage account page in the Azure portal, in the Blob service section, ensure
that the following settings are disabled.
Hierarchical namespace
Blob soft delete
Versioning
Passwordless (Recommended)
When developing locally, make sure that the user account that is accessing blob
data has the correct permissions. You'll need Storage Blob Data Contributor to
read and write blob data. To assign yourself this role, you'll need to be assigned the
User Access Administrator role, or another role that includes the
Microsoft.Authorization/roleAssignments/write action. You can assign Azure RBAC
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn
more about the available scopes for role assignments on the scope overview page.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Blob Data Contributor role to your
user account, which provides both read and write access to blob data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Blob Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Passwordless (Recommended)
azure-messaging-eventhubs
azure-messaging-eventhubs-checkpointstore-blob
azure-identity
XML
<dependencies>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-messaging-eventhubs</artifactId>
<version>5.15.0</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-messaging-eventhubs-checkpointstore-blob</artifactId>
<version>1.16.1</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.8.0</version>
<scope>compile</scope>
</dependency>
</dependencies>
Passwordless (Recommended)
1. Add the following import statements at the top of the Java file.
Java
import com.azure.messaging.eventhubs.*;
import com.azure.messaging.eventhubs.checkpointstore.blob.BlobCheckpointStore;
import com.azure.messaging.eventhubs.models.*;
import com.azure.storage.blob.*;
import java.util.function.Consumer;
import com.azure.identity.*;
2. Create a class named Receiver , and add the following string variables to the
class. Replace the placeholders with the correct values.
Important
Replace <EVENT HUB NAME> with the name of your event hub in the namespace.
Java
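A minimal sketch of those string variables; the exact variable names are assumptions:

```java
// Replace the placeholders before running.
private static final String fullyQualifiedNamespace = "<NAMESPACE NAME>.servicebus.windows.net";
private static final String eventHubName = "<EVENT HUB NAME>";
```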
Important
Replace <CONTAINER NAME> with the name of the blob container in the storage account.
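A sketch of the credential and blob container client that the processor builder below relies on; the endpoint format and variable names here are assumptions:

```java
// DefaultAzureCredential picks up your Azure CLI sign-in locally
// and a managed identity when deployed to Azure.
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

// Async client for the blob container used as the checkpoint store.
BlobContainerAsyncClient blobContainerAsyncClient = new BlobContainerClientBuilder()
    .credential(credential)
    .endpoint("https://<STORAGE ACCOUNT NAME>.blob.core.windows.net/<CONTAINER NAME>")
    .buildAsyncClient();
```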
Java
EventProcessorClient eventProcessorClient = new EventProcessorClientBuilder()
    .fullyQualifiedNamespace("<NAMESPACE NAME>.servicebus.windows.net")
    .eventHubName("<EVENT HUB NAME>")
    .consumerGroup(EventHubClientBuilder.DEFAULT_CONSUMER_GROUP_NAME)
    .processEvent(PARTITION_PROCESSOR)
    .processError(ERROR_HANDLER)
    .checkpointStore(new BlobCheckpointStore(blobContainerAsyncClient))
    .credential(credential)
    .buildEventProcessorClient();
System.out.println("Exiting process");
Java
public static final Consumer<EventContext> PARTITION_PROCESSOR = eventContext -> {
    PartitionContext partitionContext = eventContext.getPartitionContext();
    EventData eventData = eventContext.getEventData();
    // ... handle the event and update the checkpoint ...
};
3. In the Receiver application window, confirm that you see the events that were
published by the Sender application.
Next steps
See the following samples on GitHub:
azure-messaging-eventhubs samples
azure-messaging-eventhubs-checkpointstore-blob samples .
Quickstart: Stream data with Azure
Event Hubs and Apache Kafka
Article • 08/08/2023
This quickstart shows you how to stream data into and from Azure Event Hubs using the Apache Kafka protocol. You won't change any code in the sample Kafka producer or consumer apps; you just update the configurations that the clients use to point to an Event Hubs namespace, which exposes a Kafka endpoint. You also don't build and use a Kafka cluster on your own. Instead, you use the Event Hubs namespace with the Kafka endpoint.
Prerequisites
To complete this quickstart, make sure you have the following prerequisites:
Note
Event Hubs for Kafka isn't supported in the basic tier.
1. Enable a system-assigned managed identity for the virtual machine. For more
information about configuring managed identity on a VM, see Configure
managed identities for Azure resources on a VM using the Azure portal.
Managed identities for Azure resources provide Azure services with an
automatically managed identity in Azure Active Directory. You can use this
identity to authenticate to any service that supports Azure AD authentication,
without having credentials in your code.
2. Using the Access control page of the Event Hubs namespace you created,
assign Azure Event Hubs Data Owner role to the VM's managed identity.
Azure Event Hubs supports using Azure Active Directory (Azure AD) to
authorize requests to Event Hubs resources. With Azure AD, you can use Azure
role-based access control (Azure RBAC) to grant permissions to a security
principal, which may be a user, or an application service principal.
a. In the Azure portal, navigate to your Event Hubs namespace. Go to "Access
Control (IAM)" in the left navigation.
c. In the Role tab, select Azure Event Hubs Data Owner, and select the Next
button.
d. In the Members tab, select the Managed Identity in the Assign access to
section.
e. Select the +Select members link.
5. Navigate to azure-event-hubs-for-
kafka/tutorials/oauth/java/managedidentity/consumer .
Properties
bootstrap.servers=NAMESPACENAME.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=OAUTHBEARER
sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;
sasl.login.callback.handler.class=CustomAuthenticateCallbackHandler;
Note
You can find all the OAuth samples for Event Hubs for Kafka here .
7. Switch back to the Consumer folder where the pom.xml file is, and run the
consumer code to process events from the event hub using your Kafka clients:
Java
10. Switch back to the Producer folder where the pom.xml file is, and run the
producer code to stream events into Event Hubs:
shell
You should see messages about events sent in the producer window. Now,
check the consumer app window to see the messages that it receives from the
event hub.
Schema validation for Kafka with Schema
Registry
You can use Azure Schema Registry to perform schema validation when you stream data
with your Kafka applications using Event Hubs. Azure Schema Registry of Event Hubs
provides a centralized repository for managing schemas and you can seamlessly
connect your new or existing Kafka applications with Schema Registry.
To learn more, see Validate schemas for Apache Kafka applications using Avro.
Next steps
In this article, you learned how to stream into Event Hubs without changing your
protocol clients or running your own clusters. To learn more, see Apache Kafka
developer guide for Azure Event Hubs.
Quickstart: Azure Key Vault Certificate
client library for Java (Certificates)
Article • 02/15/2023
Get started with the Azure Key Vault Certificate client library for Java. Follow the steps
below to install the package and try out example code for basic tasks.
Tip
If you're working with Azure Key Vault Certificates resources in a Spring application,
we recommend that you consider Spring Cloud Azure as an alternative. Spring
Cloud Azure is an open-source project that provides seamless Spring integration
with Azure services. To learn more about Spring Cloud Azure, and to see an
example using Key Vault Certificates, see Enable HTTPS in Spring Boot with Azure
Key Vault certificates.
Additional resources:
Source code
API reference documentation
Product documentation
Samples
Prerequisites
An Azure subscription - create one for free .
Java Development Kit (JDK) version 8 or above
Apache Maven
Azure CLI
This quickstart assumes you are running Azure CLI and Apache Maven in a Linux
terminal window.
Setting up
This quickstart uses the Azure Identity library with Azure CLI to authenticate the user to Azure services. Developers can also use Visual Studio or Visual Studio Code to authenticate their calls. For more information, see Authenticate the client with Azure Identity client library.
Sign in to Azure
1. Run the login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-in
page.
Console
The output from generating the project will look something like this:
Console
[INFO] ---------------------------------------------------------------------
-------
[INFO] Using following parameters for creating project from Archetype:
maven-archetype-quickstart:1.4
[INFO] ---------------------------------------------------------------------
-------
[INFO] Parameter: groupId, Value: com.keyvault.certificates.quickstart
[INFO] Parameter: artifactId, Value: akv-certificates-java
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: package, Value: com.keyvault.certificates.quickstart
[INFO] Parameter: packageInPathFormat, Value: com/keyvault/quickstart
[INFO] Parameter: package, Value: com.keyvault.certificates.quickstart
[INFO] Parameter: groupId, Value: com.keyvault.certificates.quickstart
[INFO] Parameter: artifactId, Value: akv-certificates-java
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Project created from Archetype in dir: /home/user/quickstarts/akv-
certificates-java
[INFO] ---------------------------------------------------------------------
---
[INFO] BUILD SUCCESS
[INFO] ---------------------------------------------------------------------
---
[INFO] Total time: 38.124 s
[INFO] Finished at: 2019-11-15T13:19:06-08:00
[INFO] ---------------------------------------------------------------------
---
Console
cd akv-certificates-java
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-security-keyvault-certificates</artifactId>
<version>4.1.3</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.2.0</version>
</dependency>
Alternatively, you can simply run the Azure CLI or Azure PowerShell commands below.
Azure CLI
Azure CLI
Create an access policy for your key vault that grants certificate permissions to your user
account.
Azure CLI
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Windows PowerShell
PowerShell
$Env:KEY_VAULT_NAME="<your-key-vault-name>"
macOS or Linux
export KEY_VAULT_NAME=<your-key-vault-name>
Object model
The Azure Key Vault Certificate client library for Java allows you to manage certificates.
The Code examples section shows how to create a client, create a certificate, retrieve a
certificate, and delete a certificate.
Code examples
Add directives
Add the following directives to the top of your code:
Java
import com.azure.core.util.polling.SyncPoller;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.security.keyvault.certificates.CertificateClient;
import com.azure.security.keyvault.certificates.CertificateClientBuilder;
import com.azure.security.keyvault.certificates.models.CertificateOperation;
import com.azure.security.keyvault.certificates.models.CertificatePolicy;
import com.azure.security.keyvault.certificates.models.DeletedCertificate;
import com.azure.security.keyvault.certificates.models.KeyVaultCertificate;
import com.azure.security.keyvault.certificates.models.KeyVaultCertificateWithPolicy;
In this example, the name of your key vault is expanded to the key vault URI, in the
format https://<your-key-vault-name>.vault.azure.net . For more information about
authenticating to key vault, see Developer's Guide.
Java
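A sketch of the client creation this paragraph describes, assuming the KEY_VAULT_NAME environment variable set earlier:

```java
String keyVaultName = System.getenv("KEY_VAULT_NAME");
String keyVaultUri = "https://" + keyVaultName + ".vault.azure.net";

// DefaultAzureCredential authenticates via your Azure CLI sign-in.
CertificateClient certificateClient = new CertificateClientBuilder()
    .vaultUrl(keyVaultUri)
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```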
Create a certificate
Now that your application is authenticated, you can create a certificate in your key vault using the certificateClient.beginCreateCertificate method. This requires a name for the certificate and a certificate policy; we've assigned the value "myCertificate" to the certificateName variable in this sample and use a default policy.
Certificate creation is a long-running operation, for which you can poll its progress or wait for it to complete.
Java
SyncPoller<CertificateOperation, KeyVaultCertificateWithPolicy>
certificatePoller =
certificateClient.beginCreateCertificate(certificateName,
CertificatePolicy.getDefault());
certificatePoller.waitForCompletion();
You can obtain the certificate once creation has completed via the following call:
Java
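A sketch of that call, using the certificatePoller created above:

```java
// Blocks until the operation finishes, then returns the created certificate.
KeyVaultCertificateWithPolicy certificate = certificatePoller.getFinalResult();
```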
Retrieve a certificate
You can now retrieve the previously created certificate with the
certificateClient.getCertificate method.
Java
KeyVaultCertificate retrievedCertificate =
certificateClient.getCertificate(certificateName);
You can now access the details of the retrieved certificate with operations like retrievedCertificate.getName, retrievedCertificate.getProperties, and so on, as well as its contents with retrievedCertificate.getCer.
Delete a certificate
Finally, let's delete the certificate from your key vault with the certificateClient.beginDeleteCertificate method, which is also a long-running operation.
Java
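A sketch of the deletion call; like creation, it returns a poller you can wait on:

```java
SyncPoller<DeletedCertificate, Void> deletionPoller =
    certificateClient.beginDeleteCertificate(certificateName);
deletionPoller.waitForCompletion();
```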
Clean up resources
When no longer needed, you can use the Azure CLI or Azure PowerShell to remove your
key vault and the corresponding resource group.
Azure CLI
Azure PowerShell
Remove-AzResourceGroup -Name "myResourceGroup"
Sample code
Java
package com.keyvault.certificates.quickstart;
import com.azure.core.util.polling.SyncPoller;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.security.keyvault.certificates.CertificateClient;
import com.azure.security.keyvault.certificates.CertificateClientBuilder;
import com.azure.security.keyvault.certificates.models.CertificateOperation;
import com.azure.security.keyvault.certificates.models.CertificatePolicy;
import com.azure.security.keyvault.certificates.models.DeletedCertificate;
import com.azure.security.keyvault.certificates.models.KeyVaultCertificate;
import com.azure.security.keyvault.certificates.models.KeyVaultCertificateWithPolicy;
SyncPoller<CertificateOperation, KeyVaultCertificateWithPolicy>
certificatePoller =
certificateClient.beginCreateCertificate(certificateName,
CertificatePolicy.getDefault());
certificatePoller.waitForCompletion();
System.out.print("done.");
System.out.println("Retrieving certificate from " + keyVaultName +
".");
KeyVaultCertificate retrievedCertificate =
certificateClient.getCertificate(certificateName);
System.out.print("done.");
}
}
Next steps
In this quickstart you created a key vault, created a certificate, retrieved it, and then
deleted it. To learn more about Key Vault and how to integrate it with your applications,
continue on to the articles below.
Get started with the Azure Key Vault Key client library for Java. Follow these steps to
install the package and try out example code for basic tasks.
Additional resources:
Source code
API reference documentation
Product documentation
Samples
Prerequisites
An Azure subscription - create one for free .
Java Development Kit (JDK) version 8 or above
Apache Maven
Azure CLI
This quickstart assumes you're running Azure CLI and Apache Maven in a Linux
terminal window.
Setting up
This quickstart uses the Azure Identity library with Azure CLI to authenticate the user to Azure services. Developers can also use Visual Studio or Visual Studio Code to authenticate their calls. For more information, see Authenticate the client with Azure Identity client library.
Sign in to Azure
1. Run the login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-in
page.
Console
The output from generating the project will look something like this:
Console
[INFO] ---------------------------------------------------------------------
-------
[INFO] Using following parameters for creating project from Archetype:
maven-archetype-quickstart:1.4
[INFO] ---------------------------------------------------------------------
-------
[INFO] Parameter: groupId, Value: com.keyvault.keys.quickstart
[INFO] Parameter: artifactId, Value: akv-keys-java
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: package, Value: com.keyvault.keys.quickstart
[INFO] Parameter: packageInPathFormat, Value: com/keyvault/quickstart
[INFO] Parameter: package, Value: com.keyvault.keys.quickstart
[INFO] Parameter: groupId, Value: com.keyvault.keys.quickstart
[INFO] Parameter: artifactId, Value: akv-keys-java
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Project created from Archetype in dir: /home/user/quickstarts/akv-
keys-java
[INFO] ---------------------------------------------------------------------
---
[INFO] BUILD SUCCESS
[INFO] ---------------------------------------------------------------------
---
[INFO] Total time: 38.124 s
[INFO] Finished at: 2019-11-15T13:19:06-08:00
[INFO] ---------------------------------------------------------------------
---
Console
cd akv-keys-java
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-security-keyvault-keys</artifactId>
<version>4.2.3</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.2.0</version>
</dependency>
Alternatively, you can simply run the Azure CLI or Azure PowerShell commands below.
Azure CLI
Azure CLI
Azure CLI
This application uses your key vault name as an environment variable called KEY_VAULT_NAME.
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Windows PowerShell
PowerShell
$Env:KEY_VAULT_NAME="<your-key-vault-name>"
macOS or Linux
export KEY_VAULT_NAME=<your-key-vault-name>
Object model
The Azure Key Vault Key client library for Java allows you to manage keys. The Code
examples section shows how to create a client, create a key, retrieve a key, and delete a
key.
Code examples
Add directives
Add the following directives to the top of your code:
Java
import com.azure.core.util.polling.SyncPoller;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.security.keyvault.keys.KeyClient;
import com.azure.security.keyvault.keys.KeyClientBuilder;
import com.azure.security.keyvault.keys.models.DeletedKey;
import com.azure.security.keyvault.keys.models.KeyType;
import com.azure.security.keyvault.keys.models.KeyVaultKey;
In this example, the name of your key vault is expanded to the key vault URI, in the
format https://<your-key-vault-name>.vault.azure.net . For more information about
authenticating to key vault, see Developer's Guide.
Java
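A sketch of the key client creation, assuming the KEY_VAULT_NAME environment variable set earlier:

```java
String keyVaultUri = "https://" + System.getenv("KEY_VAULT_NAME") + ".vault.azure.net";

// DefaultAzureCredential authenticates via your Azure CLI sign-in.
KeyClient keyClient = new KeyClientBuilder()
    .vaultUrl(keyVaultUri)
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```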
Create a key
Now that your application is authenticated, you can create a key in your key vault using the keyClient.createKey method. This requires a name for the key and a key type. We've assigned the value "myKey" to the keyName variable and use an RSA KeyType in this sample.
Java
keyClient.createKey(keyName, KeyType.RSA);
You can verify that the key has been set with the az keyvault key show command:
Azure CLI
Retrieve a key
You can now retrieve the previously created key with the keyClient.getKey method.
Java
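A sketch of the retrieval call, reusing the keyName variable from the creation step:

```java
KeyVaultKey retrievedKey = keyClient.getKey(keyName);
```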
You can now access the details of the retrieved key with operations like
retrievedKey.getProperties , retrievedKey.getKeyOperations , etc.
Delete a key
Finally, let's delete the key from your key vault with the keyClient.beginDeleteKey
method.
Key deletion is a long-running operation, for which you can poll its progress or wait for it to complete.
Java
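A sketch of the deletion call and the wait for it to complete:

```java
SyncPoller<DeletedKey, Void> deletionPoller = keyClient.beginDeleteKey(keyName);
deletionPoller.waitForCompletion();
```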
You can verify that the key has been deleted with the az keyvault key show command:
Azure CLI
Clean up resources
When no longer needed, you can use the Azure CLI or Azure PowerShell to remove your
key vault and the corresponding resource group.
Azure CLI
Azure PowerShell
Sample code
Java
package com.keyvault.keys.quickstart;
import com.azure.core.util.polling.SyncPoller;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.security.keyvault.keys.KeyClient;
import com.azure.security.keyvault.keys.KeyClientBuilder;
import com.azure.security.keyvault.keys.models.DeletedKey;
import com.azure.security.keyvault.keys.models.KeyType;
import com.azure.security.keyvault.keys.models.KeyVaultKey;
keyClient.createKey(keyName, KeyType.RSA);
System.out.print("done.");
System.out.println("Retrieving key from " + keyVaultName + ".");
System.out.print("done.");
}
}
Next steps
In this quickstart, you created a key vault, created a key, retrieved it, and then deleted it.
To learn more about Key Vault and how to integrate it with your applications, continue
on to these articles.
Get started with the Azure Key Vault Secret client library for Java. Follow these steps to
install the package and try out example code for basic tasks.
Tip
If you're working with Azure Key Vault Secrets resources in a Spring application, we
recommend that you consider Spring Cloud Azure as an alternative. Spring Cloud
Azure is an open-source project that provides seamless Spring integration with
Azure services. To learn more about Spring Cloud Azure, and to see an example
using Key Vault Secrets, see Load a secret from Azure Key Vault in a Spring Boot
application.
Additional resources:
Source code
API reference documentation
Product documentation
Samples
Prerequisites
An Azure subscription - create one for free .
Java Development Kit (JDK) version 8 or above
Apache Maven
Azure CLI
This quickstart assumes you're running Azure CLI and Apache Maven in a Linux
terminal window.
Setting up
This quickstart uses the Azure Identity library with Azure CLI to authenticate the user to Azure services. Developers can also use Visual Studio or Visual Studio Code to authenticate their calls. For more information, see Authenticate the client with Azure Identity client library.
Sign in to Azure
1. Run the login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-in
page.
Console
The output from generating the project will look something like this:
Console
[INFO] ---------------------------------------------------------------------
-------
[INFO] Using following parameters for creating project from Archetype:
maven-archetype-quickstart:1.4
[INFO] ---------------------------------------------------------------------
-------
[INFO] Parameter: groupId, Value: com.keyvault.secrets.quickstart
[INFO] Parameter: artifactId, Value: akv-secrets-java
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: package, Value: com.keyvault.secrets.quickstart
[INFO] Parameter: packageInPathFormat, Value: com/keyvault/quickstart
[INFO] Parameter: package, Value: com.keyvault.secrets.quickstart
[INFO] Parameter: groupId, Value: com.keyvault.secrets.quickstart
[INFO] Parameter: artifactId, Value: akv-secrets-java
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Project created from Archetype in dir: /home/user/quickstarts/akv-
secrets-java
[INFO] ---------------------------------------------------------------------
---
[INFO] BUILD SUCCESS
[INFO] ---------------------------------------------------------------------
---
[INFO] Total time: 38.124 s
[INFO] Finished at: 2019-11-15T13:19:06-08:00
[INFO] ---------------------------------------------------------------------
---
Console
cd akv-secrets-java
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-security-keyvault-secrets</artifactId>
<version>4.2.3</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.2.0</version>
</dependency>
Alternatively, you can run the Azure CLI or Azure PowerShell commands below.
Important
Azure CLI
Azure CLI
Create an access policy for your key vault that grants secret permissions to your user
account.
Azure CLI
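The access-policy command isn't shown here; a sketch using az keyvault set-policy (the --upn value is a placeholder for your sign-in name) would be:

```azurecli
az keyvault set-policy --name <your-key-vault-name> \
    --upn <your-user-principal-name> \
    --secret-permissions delete get list set
```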
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Windows PowerShell
PowerShell
$Env:KEY_VAULT_NAME="<your-key-vault-name>"
macOS or Linux
export KEY_VAULT_NAME=<your-key-vault-name>
Object model
The Azure Key Vault Secret client library for Java allows you to manage secrets. The
Code examples section shows how to create a client, set a secret, retrieve a secret, and
delete a secret.
Code examples
Add directives
Add the following directives to the top of your code:
Java
import com.azure.core.util.polling.SyncPoller;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.security.keyvault.secrets.SecretClient;
import com.azure.security.keyvault.secrets.SecretClientBuilder;
import com.azure.security.keyvault.secrets.models.DeletedSecret;
import com.azure.security.keyvault.secrets.models.KeyVaultSecret;
In this example, the name of your key vault is expanded to the key vault URI, in the
format https://<your-key-vault-name>.vault.azure.net . For more information about
authenticating to key vault, see Developer's Guide.
Java
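The client-creation code isn't shown here; a sketch consistent with the imports above and the URI format just described (reading the key vault name from the KEY_VAULT_NAME environment variable set earlier) would be:

```java
String keyVaultName = System.getenv("KEY_VAULT_NAME");
String keyVaultUri = "https://" + keyVaultName + ".vault.azure.net";

// authenticate with DefaultAzureCredential, which picks up the Azure CLI login
SecretClient secretClient = new SecretClientBuilder()
    .vaultUrl(keyVaultUri)
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```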
Save a secret
Now that your application is authenticated, you can put a secret into your key vault
using the secretClient.setSecret method. This requires a name for the secret—we've
assigned the value "mySecret" to the secretName variable in this sample.
Java
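The setSecret call isn't shown here; a sketch consistent with the console prompts in the full sample (assuming a java.io.Console instance named con obtained from System.console()) would be:

```java
System.out.print("Input the value of your secret > ");
String secretValue = con.readLine(); // con is a java.io.Console from System.console()

System.out.println("Creating a secret in " + keyVaultName + " called '" + secretName
    + "' with value '" + secretValue + "' ...");
secretClient.setSecret(new KeyVaultSecret(secretName, secretValue));
System.out.println("done.");
```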
You can verify that the secret has been set with the az keyvault secret show command:
Azure CLI
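The verification command isn't shown here; a sketch using az keyvault secret show (the secret name mySecret matches the value assigned to secretName above) would be:

```azurecli
az keyvault secret show --vault-name <your-key-vault-name> --name mySecret --query "value"
```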
Retrieve a secret
You can now retrieve the previously set secret with the secretClient.getSecret method.
Java
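The getSecret call isn't shown here; a sketch consistent with the sample's console output would be:

```java
System.out.println("Retrieving your secret from " + keyVaultName + ".");
KeyVaultSecret retrievedSecret = secretClient.getSecret(secretName);
System.out.println("Your secret's value is '" + retrievedSecret.getValue() + "'.");
```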
Delete a secret
Finally, let's delete the secret from your key vault with the
secretClient.beginDeleteSecret method.
Secret deletion is a long-running operation. You can poll its progress or wait for it
to complete.
Java
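The deletion code isn't shown here; a sketch using the SyncPoller imported at the top of the file would be:

```java
System.out.println("Deleting your secret from " + keyVaultName + " ...");
SyncPoller<DeletedSecret, Void> deletionPoller = secretClient.beginDeleteSecret(secretName);
// block until the long-running delete operation completes
deletionPoller.waitForCompletion();
System.out.println("done.");
```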
You can verify that the secret has been deleted with the az keyvault secret show
command:
Azure CLI
Clean up resources
When no longer needed, you can use the Azure CLI or Azure PowerShell to remove your
key vault and the corresponding resource group.
Azure CLI
Azure PowerShell
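The cleanup commands aren't shown here; with the Azure CLI, a sketch (the resource group name is a placeholder) would be:

```azurecli
az group delete --resource-group myResourceGroup
```

With Azure PowerShell, Remove-AzResourceGroup -Name myResourceGroup performs the same cleanup.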
Sample code
Java
package com.keyvault.secrets.quickstart;
import java.io.Console;
import com.azure.core.util.polling.SyncPoller;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.security.keyvault.secrets.SecretClient;
import com.azure.security.keyvault.secrets.SecretClientBuilder;
import com.azure.security.keyvault.secrets.models.DeletedSecret;
import com.azure.security.keyvault.secrets.models.KeyVaultSecret;
System.out.println("done.");
System.out.println("Forgetting your secret.");
secretValue = "";
System.out.println("Your secret's value is '" + secretValue + "'.");
System.out.println("done.");
}
}
Next steps
In this quickstart, you created a key vault, stored a secret, retrieved it, and then deleted
it. To learn more about Key Vault and how to integrate it with your applications,
continue on to these articles.
Important
Azure Database for MySQL - Single Server is on the retirement path. We strongly
recommend that you upgrade to Azure Database for MySQL - Flexible Server. For
more information about migrating to Azure Database for MySQL - Flexible Server,
see What's happening to Azure Database for MySQL Single Server?
This article demonstrates creating a sample application that uses Java and JDBC to
store and retrieve information in Azure Database for MySQL.
In this article, we'll include two authentication methods: Azure Active Directory (Azure
AD) authentication and MySQL authentication. The Passwordless tab shows the Azure
AD authentication and the Password tab shows the MySQL authentication.
MySQL authentication uses accounts stored in MySQL. If you choose to use passwords
as credentials for the accounts, these credentials will be stored in the user table.
Because these passwords are stored in MySQL, you'll need to manage the rotation of
the passwords by yourself.
Prerequisites
An Azure account. If you don't have one, get a free trial .
Azure Cloud Shell or Azure CLI. We recommend Azure Cloud Shell so you'll be
logged in automatically and have access to all the tools you'll need.
A supported Java Development Kit, version 8 (included in Azure Cloud Shell).
The Apache Maven build tool.
MySQL command line client. You can connect to your server using the mysql.exe
command-line tool with Azure Cloud Shell. Alternatively, you can use the mysql
command line in your local environment.
Passwordless (Recommended)
Bash
export AZ_RESOURCE_GROUP=database-workshop
export AZ_DATABASE_SERVER_NAME=<YOUR_DATABASE_SERVER_NAME>
export AZ_DATABASE_NAME=demo
export AZ_LOCATION=<YOUR_AZURE_REGION>
export AZ_MYSQL_AD_NON_ADMIN_USERNAME=demo-non-admin
export AZ_LOCAL_IP_ADDRESS=<YOUR_LOCAL_IP_ADDRESS>
export CURRENT_USERNAME=$(az ad signed-in-user show --query
userPrincipalName -o tsv)
export CURRENT_USER_OBJECTID=$(az ad signed-in-user show --query id -o
tsv)
Replace the placeholders with the following values, which are used throughout this
article:
<YOUR_AZURE_REGION> : The Azure region you'll use. You can use the default
region, but we recommend that you configure a region closer to where you
live. You can see the full list of available regions by entering az account list-
locations .
Azure CLI
az group create \
--name $AZ_RESOURCE_GROUP \
--location $AZ_LOCATION \
--output tsv
Note
You can read more detailed information about creating MySQL servers in
Quickstart: Create an Azure Database for MySQL server by using the Azure
portal.
Passwordless (Recommended)
If you're using Azure CLI, run the following command to make sure it has sufficient
permission:
Azure CLI
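The permission check isn't shown here; a sketch (signing in with a scope that allows the CLI to query Azure AD) would be:

```azurecli
az login --scope https://graph.microsoft.com/.default
```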
Azure CLI
Next, run the following command to set the Azure AD admin user:
Azure CLI
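The admin-assignment command isn't shown here; a sketch using az mysql server ad-admin create with the environment variables defined earlier would be:

```azurecli
az mysql server ad-admin create \
    --resource-group $AZ_RESOURCE_GROUP \
    --server-name $AZ_DATABASE_SERVER_NAME \
    --display-name $CURRENT_USERNAME \
    --object-id $CURRENT_USER_OBJECTID
```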
Important
When setting the administrator, a new user is added to the Azure Database for
MySQL server with full administrator permissions. You can only create one
Azure AD admin per MySQL server. Selection of another user will overwrite the
existing Azure AD admin configured for the server.
This command creates a small MySQL server and sets the Active Directory admin to
the signed-in user.
Because you configured your local IP address at the beginning of this article, you can
open the server's firewall by running the following command:
Azure CLI
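The firewall command isn't shown here; a sketch using az mysql server firewall-rule create (the rule name is a placeholder) would be:

```azurecli
az mysql server firewall-rule create \
    --resource-group $AZ_RESOURCE_GROUP \
    --name $AZ_DATABASE_SERVER_NAME-database-allow-local-ip \
    --server $AZ_DATABASE_SERVER_NAME \
    --start-ip-address $AZ_LOCAL_IP_ADDRESS \
    --end-ip-address $AZ_LOCAL_IP_ADDRESS
```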
If you're connecting to your MySQL server from Windows Subsystem for Linux (WSL) on
a Windows computer, you'll need to add the WSL host ID to your firewall.
Obtain the IP address of your host machine by running the following command in WSL:
Bash
cat /etc/resolv.conf
Copy the IP address following the term nameserver , then use the following command to
set an environment variable for the WSL IP Address:
Bash
AZ_WSL_IP_ADDRESS=<the-copied-IP-address>
Then, use the following command to open the server's firewall to your WSL-based app:
Azure CLI
Azure CLI
az mysql db create \
--resource-group $AZ_RESOURCE_GROUP \
--name $AZ_DATABASE_NAME \
--server-name $AZ_DATABASE_SERVER_NAME \
--output tsv
Note
You can read more detailed information about creating MySQL users in Create
users in Azure Database for MySQL.
Passwordless (Recommended)
Create a SQL script called create_ad_user.sql for creating a non-admin user. Add the
following contents and save it locally:
Bash
export AZ_MYSQL_AD_NON_ADMIN_USERID=$CURRENT_USER_OBJECTID

cat << EOF > create_ad_user.sql
SET aad_auth_validate_oids_in_tenant = OFF;
CREATE AADUSER '$AZ_MYSQL_AD_NON_ADMIN_USERNAME' IDENTIFIED BY '$AZ_MYSQL_AD_NON_ADMIN_USERID';
GRANT ALL PRIVILEGES ON $AZ_DATABASE_NAME.* TO '$AZ_MYSQL_AD_NON_ADMIN_USERNAME'@'%';
FLUSH privileges;
EOF
Then, use the following command to run the SQL script to create the Azure AD
non-admin user:
Bash
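The mysql invocation isn't shown here; a sketch that authenticates with an Azure AD access token (obtained through az account get-access-token) would be:

```bash
mysql -h $AZ_DATABASE_SERVER_NAME.mysql.database.azure.com \
    --user $CURRENT_USERNAME@$AZ_DATABASE_SERVER_NAME \
    --enable-cleartext-plugin \
    --password=$(az account get-access-token --resource-type oss-rdbms --output tsv --query accessToken) \
    < create_ad_user.sql
```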
Now use the following command to remove the temporary SQL script file:
Bash
rm create_ad_user.sql
Passwordless (Recommended)
XML
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId>
<artifactId>demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>demo</name>
<properties>
<java.version>1.8</java.version>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>8.0.30</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity-extensions</artifactId>
<version>1.0.0</version>
</dependency>
</dependencies>
</project>
This file is an Apache Maven file that configures your project to use Java 8 and a
recent MySQL driver for Java.
Passwordless (Recommended)
Bash
Note
Bash
Note
Bash
touch src/main/resources/schema.sql
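The schema file's contents aren't shown here; a minimal schema consistent with the Todo fields used by insertData (the column types are an assumption) would be:

```sql
DROP TABLE IF EXISTS todo;
CREATE TABLE todo (id BIGINT PRIMARY KEY, description VARCHAR(255), details VARCHAR(4096), done BOOLEAN);
```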
Java
package com.example.demo;
import com.mysql.cj.jdbc.AbandonedConnectionCleanupThread;
import java.sql.*;
import java.util.*;
import java.util.logging.Logger;
static {
    System.setProperty("java.util.logging.SimpleFormatter.format", "[%4$-7s] %5$s %n");
    log = Logger.getLogger(DemoApplication.class.getName());
}

properties.load(DemoApplication.class.getClassLoader().getResourceAsStream("database.properties"));
This Java code will use the database.properties and the schema.sql files that you created
earlier. After connecting to the MySQL server, you can create a schema to store your
data.
In this file, you can see that we've commented out the methods to insert, read, update,
and delete data. You'll implement those methods in the rest of this article, and you'll be
able to uncomment them one after another.
Note
The database credentials are stored in the user and password properties of the
database.properties file. Those credentials are used when executing
DriverManager.getConnection(properties.getProperty("url"), properties); , as the
properties file is passed as an argument.
You can now execute this main class with your favorite tool:
Using your IDE, you should be able to right-click on the DemoApplication class and
execute it.
Using Maven, you can run the application with the following command: mvn
exec:java -Dexec.mainClass="com.example.demo.DemoApplication" .
The application should connect to the Azure Database for MySQL, create a database
schema, and then close the connection. You should see output similar to the following
example in the console logs:
Output
Java
package com.example.demo;
public Todo() {
}
@Override
public String toString() {
return "Todo{" +
"id=" + id +
", description='" + description + '\'' +
", details='" + details + '\'' +
", done=" + done +
'}';
}
}
This class is a domain model mapped on the todo table that you created when
executing the schema.sql script.
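The Todo class above is shown only partially; a complete version consistent with the toString output and the getters used by insertData (the all-arguments constructor is an inference, and the class belongs in the com.example.demo package alongside DemoApplication) would be:

```java
public class Todo {

    private Long id;
    private String description;
    private String details;
    private boolean done;

    public Todo() {
    }

    public Todo(Long id, String description, String details, boolean done) {
        this.id = id;
        this.description = description;
        this.details = details;
        this.done = done;
    }

    public Long getId() { return id; }
    public String getDescription() { return description; }
    public String getDetails() { return details; }
    public boolean isDone() { return done; }

    @Override
    public String toString() {
        return "Todo{" +
                "id=" + id +
                ", description='" + description + '\'' +
                ", details='" + details + '\'' +
                ", done=" + done +
                '}';
    }
}
```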
Java
private static void insertData(Todo todo, Connection connection) throws SQLException {
    log.info("Insert data");
    PreparedStatement insertStatement = connection
            .prepareStatement("INSERT INTO todo (id, description, details, done) VALUES (?, ?, ?, ?);");
    insertStatement.setLong(1, todo.getId());
    insertStatement.setString(2, todo.getDescription());
    insertStatement.setString(3, todo.getDetails());
    insertStatement.setBoolean(4, todo.isDone());
    insertStatement.executeUpdate();
}
You can now uncomment the two following lines in the main method:
Java
Executing the main class should now produce the following output:
Output
Java
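The readData method referenced below isn't shown; a sketch consistent with the Todo fields (assuming Todo has an all-arguments constructor taking id, description, details, and done) would be:

```java
private static Todo readData(Connection connection) throws SQLException {
    log.info("Read data");
    PreparedStatement readStatement = connection.prepareStatement("SELECT * FROM todo;");
    ResultSet resultSet = readStatement.executeQuery();
    if (!resultSet.next()) {
        log.info("There is no data in the database!");
        return null;
    }
    Todo todo = new Todo(resultSet.getLong("id"),
            resultSet.getString("description"),
            resultSet.getString("details"),
            resultSet.getBoolean("done"));
    log.info("Data read from the database: " + todo.toString());
    return todo;
}
```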
You can now uncomment the following line in the main method:
Java
todo = readData(connection);
Executing the main class should now produce the following output:
Output
Still in the src/main/java/DemoApplication.java file, after the readData method, add the
following method to update data inside the database:
Java
private static void updateData(Todo todo, Connection connection) throws SQLException {
    log.info("Update data");
    PreparedStatement updateStatement = connection
            .prepareStatement("UPDATE todo SET description = ?, details = ?, done = ? WHERE id = ?;");
    updateStatement.setString(1, todo.getDescription());
    updateStatement.setString(2, todo.getDetails());
    updateStatement.setBoolean(3, todo.isDone());
    updateStatement.setLong(4, todo.getId());
    updateStatement.executeUpdate();
    readData(connection);
}
You can now uncomment the two following lines in the main method:
Java
Executing the main class should now produce the following output:
Output
Java
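The deleteData method referenced below isn't shown; a sketch that deletes by primary key and then re-reads to confirm would be:

```java
private static void deleteData(Todo todo, Connection connection) throws SQLException {
    log.info("Delete data");
    PreparedStatement deleteStatement = connection.prepareStatement("DELETE FROM todo WHERE id = ?;");
    deleteStatement.setLong(1, todo.getId());
    deleteStatement.executeUpdate();
    readData(connection);
}
```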
You can now uncomment the following line in the main method:
Java
deleteData(todo, connection);
Executing the main class should now produce the following output:
Output
Clean up resources
Congratulations! You've created a Java application that uses JDBC to store and retrieve
data from Azure Database for MySQL.
To clean up all resources used during this quickstart, delete the resource group using
the following command:
Azure CLI
az group delete \
--name $AZ_RESOURCE_GROUP \
--yes
Next steps
Migrate your MySQL database to Azure Database for MySQL using dump and
restore
Quickstart: Use Java and JDBC with
Azure Database for PostgreSQL
Article • 03/29/2023
Important
This article demonstrates how to create a sample application that uses Java and JDBC
to store and retrieve information in Azure Database for PostgreSQL.
In this article, we'll include two authentication methods: Azure Active Directory (Azure
AD) authentication and PostgreSQL authentication. The Passwordless tab shows the
Azure AD authentication and the Password tab shows the PostgreSQL authentication.
Prerequisites
An Azure account. If you don't have one, get a free trial .
Azure Cloud Shell or Azure CLI 2.37.0 or above. We recommend Azure Cloud
Shell so you'll be logged in automatically and have access to all the tools you'll
need.
A supported Java Development Kit, version 8 (included in Azure Cloud Shell).
The Apache Maven build tool.
Passwordless (Recommended)
Bash
export AZ_RESOURCE_GROUP=database-workshop
export AZ_DATABASE_SERVER_NAME=<YOUR_DATABASE_SERVER_NAME>
export AZ_DATABASE_NAME=demo
export AZ_LOCATION=<YOUR_AZURE_REGION>
export AZ_POSTGRESQL_AD_NON_ADMIN_USERNAME=
<YOUR_POSTGRESQL_AD_NON_ADMIN_USERNAME>
export AZ_LOCAL_IP_ADDRESS=<YOUR_LOCAL_IP_ADDRESS>
export CURRENT_USERNAME=$(az ad signed-in-user show --query
userPrincipalName -o tsv)
export CURRENT_USER_OBJECTID=$(az ad signed-in-user show --query id -o
tsv)
Replace the placeholders with the following values, which are used throughout this
article:
<YOUR_AZURE_REGION> : The Azure region you'll use. You can use the default
region, but we recommend that you configure a region closer to where you
live. You can see the full list of available regions by entering az account list-
locations .
<YOUR_POSTGRESQL_AD_NON_ADMIN_USERNAME> : The username of your PostgreSQL
database server. Make sure the username is a valid user in your Azure AD
tenant.
<YOUR_LOCAL_IP_ADDRESS> : The IP address of your local computer, from which
you'll run your Spring Boot application. One convenient way to find it is to
open whatismyip.akamai.com .
Important
Azure CLI
az group create \
--name $AZ_RESOURCE_GROUP \
--location $AZ_LOCATION \
--output tsv
Note
You can read more detailed information about creating PostgreSQL servers in
Create an Azure Database for PostgreSQL server by using the Azure portal.
Passwordless (Recommended)
If you're using Azure CLI, run the following command to make sure it has sufficient
permission:
Azure CLI
Azure CLI
Now run the following command to set the Azure AD admin user:
Azure CLI
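The admin-assignment command isn't shown here; a sketch using az postgres server ad-admin create with the environment variables defined earlier would be:

```azurecli
az postgres server ad-admin create \
    --resource-group $AZ_RESOURCE_GROUP \
    --server-name $AZ_DATABASE_SERVER_NAME \
    --display-name $CURRENT_USERNAME \
    --object-id $CURRENT_USER_OBJECTID
```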
Important
When setting the administrator, a new user is added to the Azure Database for
PostgreSQL server with full administrator permissions. Only one Azure AD
admin can be created per PostgreSQL server and selection of another one will
overwrite the existing Azure AD admin configured for the server.
This command creates a small PostgreSQL server and sets the Active Directory
admin to the signed-in user.
Because you configured your local IP address at the beginning of this article, you can
open the server's firewall by running the following command:
Azure CLI
Obtain the IP address of your host machine by running the following command in WSL:
Bash
cat /etc/resolv.conf
Copy the IP address following the term nameserver , then use the following command to
set an environment variable for the WSL IP Address:
Bash
AZ_WSL_IP_ADDRESS=<the-copied-IP-address>
Then, use the following command to open the server's firewall to your WSL-based app:
Azure CLI
Azure CLI
az postgres db create \
--resource-group $AZ_RESOURCE_GROUP \
--name $AZ_DATABASE_NAME \
--server-name $AZ_DATABASE_SERVER_NAME \
--output tsv
Note
You can read more detailed information about creating PostgreSQL users in Create
users in Azure Database for PostgreSQL.
Passwordless (Recommended)
Create a SQL script called create_ad_user.sql for creating a non-admin user. Add the
following contents and save it locally:
Bash
Then, use the following command to run the SQL script to create the Azure AD
non-admin user:
Bash
psql "host=$AZ_DATABASE_SERVER_NAME.postgres.database.azure.com
user=$CURRENT_USERNAME@$AZ_DATABASE_SERVER_NAME dbname=$AZ_DATABASE_NAME
port=5432 password=$(az account get-access-token --resource-type oss-
rdbms --output tsv --query accessToken) sslmode=require" <
create_ad_user.sql
Now use the following command to remove the temporary SQL script file:
Bash
rm create_ad_user.sql
XML
<properties>
<java.version>1.8</java.version>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.3.6</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity-extensions</artifactId>
<version>1.0.0</version>
</dependency>
</dependencies>
</project>
This file is an Apache Maven file that configures your project to use Java 8 and a
recent PostgreSQL driver for Java.
Passwordless (Recommended)
Bash
cat << EOF > src/main/resources/application.properties
url=jdbc:postgresql://${AZ_DATABASE_SERVER_NAME}.postgres.database.azure
.com:5432/${AZ_DATABASE_NAME}?
sslmode=require&authenticationPluginClassName=com.azure.identity.extensi
ons.jdbc.postgresql.AzurePostgresqlAuthenticationPlugin
user=${AZ_POSTGRESQL_AD_NON_ADMIN_USERNAME}@${AZ_DATABASE_SERVER_NAME}
EOF
Note
The configuration property url has ?sslmode=require appended to tell the JDBC
driver to use TLS (Transport Layer Security ) when connecting to the database.
Using TLS is mandatory with Azure Database for PostgreSQL, and it's a good
security practice.
Bash
Java
package com.example.demo;
import java.sql.*;
import java.util.*;
import java.util.logging.Logger;
static {
    System.setProperty("java.util.logging.SimpleFormatter.format", "[%4$-7s] %5$s %n");
    log = Logger.getLogger(DemoApplication.class.getName());
}

properties.load(DemoApplication.class.getClassLoader().getResourceAsStream("application.properties"));
This Java code will use the application.properties and the schema.sql files that you
created earlier in order to connect to the PostgreSQL server and create a schema that
will store your data.
In this file, you can see that we've commented out the methods to insert, read, update,
and delete data. You'll code those methods in the rest of this article, and you'll be able
to uncomment them one after another.
Note
The database credentials are stored in the user and password properties of the
application.properties file. Those credentials are used when executing
DriverManager.getConnection(properties.getProperty("url"), properties); , as the
properties file is passed as an argument.
You can now execute this main class with your favorite tool:
Using your IDE, you should be able to right-click on the DemoApplication class and
execute it.
Using Maven, you can run the application by using the following command: mvn
exec:java -Dexec.mainClass="com.example.demo.DemoApplication" .
The application should connect to the Azure Database for PostgreSQL, create a database
schema, and then close the connection, as you should see in the console logs:
Output
Java
package com.example.demo;
@Override
public String toString() {
return "Todo{" +
"id=" + id +
", description='" + description + '\'' +
", details='" + details + '\'' +
", done=" + done +
'}';
}
}
This class is a domain model mapped on the todo table that you created when
executing the schema.sql script.
Java
private static void insertData(Todo todo, Connection connection) throws SQLException {
    log.info("Insert data");
    PreparedStatement insertStatement = connection
            .prepareStatement("INSERT INTO todo (id, description, details, done) VALUES (?, ?, ?, ?);");
    insertStatement.setLong(1, todo.getId());
    insertStatement.setString(2, todo.getDescription());
    insertStatement.setString(3, todo.getDetails());
    insertStatement.setBoolean(4, todo.isDone());
    insertStatement.executeUpdate();
}
You can now uncomment the two following lines in the main method:
Java
Executing the main class should now produce the following output:
Output
Java
You can now uncomment the following line in the main method:
Java
todo = readData(connection);
Executing the main class should now produce the following output:
Output
Java
private static void updateData(Todo todo, Connection connection) throws SQLException {
    log.info("Update data");
    PreparedStatement updateStatement = connection
            .prepareStatement("UPDATE todo SET description = ?, details = ?, done = ? WHERE id = ?;");
    updateStatement.setString(1, todo.getDescription());
    updateStatement.setString(2, todo.getDetails());
    updateStatement.setBoolean(3, todo.isDone());
    updateStatement.setLong(4, todo.getId());
    updateStatement.executeUpdate();
    readData(connection);
}
You can now uncomment the two following lines in the main method:
Java
Executing the main class should now produce the following output:
Output
Java
You can now uncomment the following line in the main method:
Java
deleteData(todo, connection);
Executing the main class should now produce the following output:
Output
Clean up resources
Congratulations! You've created a Java application that uses JDBC to store and retrieve
data from Azure Database for PostgreSQL.
To clean up all resources used during this quickstart, delete the resource group using
the following command:
Azure CLI
az group delete \
--name $AZ_RESOURCE_GROUP \
--yes
Next steps
Migrate your database using Export and Import
Send messages to and receive messages
from Azure Service Bus queues (Java)
Article • 04/13/2023
In this quickstart, you create a Java app to send messages to and receive messages from
an Azure Service Bus queue.
Note
This quickstart provides step-by-step instructions for a simple scenario of sending
messages to a Service Bus queue and receiving them. You can find pre-built Java
samples for Azure Service Bus in the Azure SDK for Java repository on GitHub .
Tip
Prerequisites
An Azure subscription. To complete this tutorial, you need an Azure account. You
can activate your MSDN subscriber benefits or sign up for a free account .
Install Azure SDK for Java. If you're using Eclipse, you can install the Azure Toolkit
for Eclipse that includes the Azure SDK for Java. You can then add the Microsoft
Azure Libraries for Java to your project. If you're using IntelliJ, see Install the Azure
Toolkit for IntelliJ.
To create a namespace:
1. Sign in to the Azure portal
2. In the left navigation pane of the portal, select All services, select Integration from
the list of categories, hover the mouse over Service Bus, and then select Create on
the Service Bus tile.
3. In the Basics tab of the Create namespace page, follow these steps:
b. For Resource group, choose an existing resource group in which the namespace
will live, or create a new one.
c. Enter a name for the namespace. The namespace name should adhere to the
following naming conventions:
The name must be unique across Azure. The system immediately checks to
see if the name is available.
The name length is at least 6 and at most 50 characters.
The name can contain only letters, numbers, and hyphens ("-").
The name must start with a letter and end with a letter or number.
The name can't end with "-sb" or "-mgmt".
d. For Location, choose the region in which your namespace should be hosted.
e. For Pricing tier, select the pricing tier (Basic, Standard, or Premium) for the
namespace. For this quickstart, select Standard.
Important
If you selected the Premium pricing tier, specify the number of messaging
units. The premium tier provides resource isolation at the CPU and memory
level so that each workload runs in isolation. This resource container is called a
messaging unit. A premium namespace has at least one messaging unit. You
can select 1, 2, 4, 8 or 16 messaging units for each Service Bus Premium
namespace. For more information, see Service Bus Premium Messaging.
5. You see the home page for your Service Bus namespace.
3. Enter a name for the queue, and leave the other values with their defaults.
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to a Service Bus namespace. With this
option, you don't need to worry about having a hard-coded connection string in your
code, in a configuration file, or in a secure storage location like Azure Key Vault.
The second option shows you how to use a connection string to connect to a Service
Bus namespace. If you are new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless (Recommended)
You can assign roles to a user by using the Azure portal, Azure CLI, or Azure
PowerShell. Learn more about the available scopes for role assignments on the scope
overview page.
The following example assigns the Azure Service Bus Data Owner role to your user
account, which provides full access to Azure Service Bus resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Service Bus Data Owner: Enables data access to Service Bus namespace
and its entities (queues, topics, subscriptions, and filters). A member of this
role can send and receive messages from queues or topics/subscriptions.
Azure Service Bus Data Sender: Use this role to give send access to a Service
Bus namespace and its entities.
Azure Service Bus Data Receiver: Use this role to give receive access to a
Service Bus namespace and its entities.
If you want to create a custom role, see Rights required for Service Bus operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
1. If you don't have the Service Bus Namespace page open in the Azure portal,
locate your Service Bus namespace using the main search bar or left
navigation.
2. On the overview page, select Access control (IAM) from the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result. Then
choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Send messages to a queue
In this section, you create a Java console project, and add code to send messages to the
queue that you created earlier.
If you're using Eclipse and created a Java console application, convert your Java project
to a Maven: right-click the project in the Package Explorer window, select Configure ->
Convert to Maven project. Then, add dependencies to these two libraries as shown in
the following example.
Passwordless (Recommended)
Update the pom.xml file to add dependencies to Azure Service Bus and Azure
Identity packages.
XML
<dependencies>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-messaging-servicebus</artifactId>
<version>7.13.3</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.8.0</version>
<scope>compile</scope>
</dependency>
</dependencies>
Java
import com.azure.messaging.servicebus.*;
import com.azure.identity.*;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.Arrays;
import java.util.List;
2. In the class, define variables to hold connection string and queue name.
Passwordless (Recommended)
Java
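In the passwordless option, only the queue name is needed, since credentials come from DefaultAzureCredential rather than a connection string; a sketch (the placeholder stands for the queue created earlier) would be:

```java
// name of the Service Bus queue created earlier; replace the placeholder with your queue name
static String queueName = "<QUEUE NAME>";
```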
Important
3. Add a method named sendMessage in the class to send one message to the queue.
Passwordless (Recommended)
Important
Java
static void sendMessage()
{
    DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();
    // create a sender client for the queue and send one message
    ServiceBusSenderClient senderClient = new ServiceBusClientBuilder()
        .fullyQualifiedNamespace("NAMESPACENAME.servicebus.windows.net")
        .credential(credential)
        .sender()
        .queueName(queueName)
        .buildClient();
    senderClient.sendMessage(new ServiceBusMessage("Hello, World!"));
    senderClient.close();
}
Java
Passwordless (Recommended)
Important
Java
static void sendMessageBatch()
{
    // create a token using the default Azure credential
    DefaultAzureCredential credential = new DefaultAzureCredentialBuilder()
        .build();

    // create a Service Bus Sender client for the queue
    ServiceBusSenderClient senderClient = new ServiceBusClientBuilder()
        .fullyQualifiedNamespace("NAMESPACENAME.servicebus.windows.net")
        .credential(credential)
        .sender()
        .queueName(queueName)
        .buildClient();

    // create a list of messages to send as a batch
    List<ServiceBusMessage> messages = Arrays.asList(
        new ServiceBusMessage("First message"),
        new ServiceBusMessage("Second message"),
        new ServiceBusMessage("Third message"));

    // add the messages to a batch, sending the batch whenever it fills up
    ServiceBusMessageBatch messageBatch = senderClient.createMessageBatch();
    for (ServiceBusMessage message : messages) {
        if (!messageBatch.tryAddMessage(message)) {
            senderClient.sendMessages(messageBatch);
            messageBatch = senderClient.createMessageBatch();
            messageBatch.tryAddMessage(message);
        }
    }

    if (messageBatch.getCount() > 0) {
        senderClient.sendMessages(messageBatch);
        System.out.println("Sent a batch of messages to the queue: " + queueName);
    }

    // close the client
    senderClient.close();
}
1. Add a method named receiveMessages to receive messages from the queue. This
method creates a ServiceBusProcessorClient for the queue by specifying a
handler for processing messages and another one for handling errors. Then, it
starts the processor, waits for few seconds, prints the messages that are received,
and then stops and closes the processor.
Passwordless (Recommended)
Important
Java
static void receiveMessages() throws InterruptedException
{
    CountDownLatch countdownLatch = new CountDownLatch(1);
    DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

    // create a processor client for the queue with message and error handlers
    ServiceBusProcessorClient processorClient = new ServiceBusClientBuilder()
        .fullyQualifiedNamespace("NAMESPACENAME.servicebus.windows.net")
        .credential(credential)
        .processor()
        .queueName(queueName)
        .processMessage(QueueTest::processMessage)
        .processError(context -> processError(context, countdownLatch))
        .buildProcessorClient();

    System.out.println("Starting the processor");
    processorClient.start();

    TimeUnit.SECONDS.sleep(10);
    System.out.println("Stopping and closing the processor");
    processorClient.close();
}
2. Add the processMessage method to process a message received from the Service
Bus subscription.
Java
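The processMessage handler isn't shown here; a sketch that prints the identifying fields of the received message would be:

```java
private static void processMessage(ServiceBusReceivedMessageContext context) {
    ServiceBusReceivedMessage message = context.getMessage();
    System.out.printf("Processing message. Session: %s, Sequence #: %s. Contents: %s%n",
        message.getMessageId(), message.getSequenceNumber(), message.getBody());
}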
Java
private static void processError(ServiceBusErrorContext context, CountDownLatch countdownLatch) {
    System.out.printf("Error when receiving messages from namespace: '%s'. Entity: '%s'%n",
        context.getFullyQualifiedNamespace(), context.getEntityPath());

    if (!(context.getException() instanceof ServiceBusException)) {
        System.out.printf("Non-ServiceBusException occurred: %s%n", context.getException());
        return;
    }

    ServiceBusException exception = (ServiceBusException) context.getException();
    ServiceBusFailureReason reason = exception.getReason();

    if (reason == ServiceBusFailureReason.MESSAGING_ENTITY_DISABLED
        || reason == ServiceBusFailureReason.MESSAGING_ENTITY_NOT_FOUND
        || reason == ServiceBusFailureReason.UNAUTHORIZED) {
        System.out.printf("An unrecoverable error occurred. Stopping processing with reason %s: %s%n",
            reason, exception.getMessage());
        countdownLatch.countDown();
    } else if (reason == ServiceBusFailureReason.MESSAGE_LOCK_LOST) {
        System.out.printf("Message lock lost for message: %s%n", context.getException());
    } else if (reason == ServiceBusFailureReason.SERVICE_BUSY) {
        try {
            // Choosing an arbitrary amount of time to wait until trying again.
            TimeUnit.SECONDS.sleep(1);
        } catch (InterruptedException e) {
            System.err.println("Unable to sleep for period of time");
        }
    } else {
        System.out.printf("Error source %s, reason %s, message: %s%n", context.getErrorSource(),
            reason, context.getException());
    }
}
Java
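A main method tying the steps together isn't shown here; a sketch that calls the three methods defined above in order would be:

```java
public static void main(String[] args) throws InterruptedException
{
    sendMessage();
    sendMessageBatch();
    receiveMessages();
}
```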
1. If you're using Eclipse, right-click the project, select Export, expand Java,
select Runnable JAR file, and follow the steps to create a runnable JAR file.
2. If you're signed in to the machine using a user account that's different from
the user account added to the Azure Service Bus Data Owner role, follow
these steps. Otherwise, skip this step and move on to run the JAR file in the
next step.
b. Run the following CLI command to sign in to Azure. Use the same user
account that you added to the Azure Service Bus Data Owner role.
Azure CLI
az login
Java
Console
On the Overview page for the Service Bus namespace in the Azure portal, you can see
incoming and outgoing message count. You may need to wait for a minute or so and
then refresh the page to see the latest values.
Select the queue on this Overview page to navigate to the Service Bus Queue page.
You see the incoming and outgoing message count on this page too. You also see other
information such as the current size of the queue, maximum size, active message
count, and so on.
Next steps
See the following documentation and samples:
In this quickstart, you write Java code using the azure-messaging-servicebus package to
send messages to an Azure Service Bus topic and then receive messages from
subscriptions to that topic.
Note
This quickstart provides step-by-step instructions for a simple scenario of sending
a batch of messages to a Service Bus topic and receiving those messages from a
subscription of the topic. You can find pre-built Java samples for Azure Service Bus
in the Azure SDK for Java repository on GitHub.
Tip
Prerequisites
An Azure subscription. To complete this tutorial, you need an Azure account. You
can activate your Visual Studio or MSDN subscriber benefits or sign up for a free
account.
Install Azure SDK for Java. If you're using Eclipse, you can install the Azure Toolkit
for Eclipse that includes the Azure SDK for Java. You can then add the Microsoft
Azure Libraries for Java to your project. If you're using IntelliJ, see Install the Azure
Toolkit for IntelliJ.
To create a namespace:
2. In the left navigation pane of the portal, select All services, select Integration from
the list of categories, hover the mouse over Service Bus, and then select Create on
the Service Bus tile.
3. In the Basics tab of the Create namespace page, follow these steps:
b. For Resource group, choose an existing resource group in which the namespace
will live, or create a new one.
c. Enter a name for the namespace. The namespace name should adhere to the
following naming conventions:
The name must be unique across Azure. The system immediately checks to
see if the name is available.
The name must be between 6 and 50 characters long.
The name can contain only letters, numbers, and hyphens ("-").
The name must start with a letter and end with a letter or number.
The name must not end with "-sb" or "-mgmt".
d. For Location, choose the region in which your namespace should be hosted.
e. For Pricing tier, select the pricing tier (Basic, Standard, or Premium) for the
namespace. For this quickstart, select Standard.
Important
If you selected the Premium pricing tier, specify the number of messaging
units. The premium tier provides resource isolation at the CPU and memory
level so that each workload runs in isolation. This resource container is called a
messaging unit. A premium namespace has at least one messaging unit. You
can select 1, 2, 4, 8 or 16 messaging units for each Service Bus Premium
namespace. For more information, see Service Bus Premium Messaging.
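As a side note, the namespace naming rules listed above can be checked locally before you submit the form. The following validator is not part of the quickstart; it's a small sketch that encodes those rules in Java (the uniqueness check, of course, can only be done by Azure itself):

```java
import java.util.regex.Pattern;

public class NamespaceNameCheck {
    // 6-50 chars, letters/digits/hyphens only, starts with a letter,
    // ends with a letter or digit ({4,48} covers the middle characters).
    private static final Pattern SHAPE =
        Pattern.compile("^[A-Za-z][A-Za-z0-9-]{4,48}[A-Za-z0-9]$");

    static boolean isValidNamespaceName(String name) {
        if (!SHAPE.matcher(name).matches()) {
            return false;
        }
        String lower = name.toLowerCase();
        // The portal also rejects names ending with "-sb" or "-mgmt".
        return !lower.endsWith("-sb") && !lower.endsWith("-mgmt");
    }
}
```

For example, `mybus-01` passes, while `short` (too few characters), `demobus-sb` (reserved suffix), and `1demobus` (starts with a digit) are all rejected.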
5. You see the home page for your Service Bus namespace.
3. Enter a name for the topic. Leave the other options with their default values.
4. Select Create.
Create a subscription to the topic
1. Select the topic that you created in the previous section.
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to a Service Bus namespace. You don't
need to worry about having hard-coded connection strings in your code, in a
configuration file, or in secure storage like Azure Key Vault.
The second option shows you how to use a connection string to connect to a Service
Bus namespace. If you are new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless (Recommended)
The following example assigns the Azure Service Bus Data Owner role to your user
account, which provides full access to Azure Service Bus resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Service Bus Data Owner: Enables data access to Service Bus namespace
and its entities (queues, topics, subscriptions, and filters). A member of this
role can send and receive messages from queues or topics/subscriptions.
Azure Service Bus Data Sender: Use this role to give the send access to Service
Bus namespace and its entities.
Azure Service Bus Data Receiver: Use this role to give the receive access to
Service Bus namespace and its entities.
If you want to create a custom role, see Rights required for Service Bus operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
1. If you don't have the Service Bus Namespace page open in the Azure portal,
locate your Service Bus namespace using the main search bar or left
navigation.
2. On the overview page, select Access control (IAM) from the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result. Then
choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Passwordless (Recommended)
Update the pom.xml file to add dependencies to Azure Service Bus and Azure
Identity packages.
XML
<dependencies>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-messaging-servicebus</artifactId>
<version>7.13.3</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.8.0</version>
<scope>compile</scope>
</dependency>
</dependencies>
Passwordless (Recommended)
Java
import com.azure.messaging.servicebus.*;
import com.azure.identity.*;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.Arrays;
import java.util.List;
2. In the class, define variables to hold the connection string (not needed for the
passwordless scenario), topic name, and subscription name.
Passwordless (Recommended)
Java
Important
Replace <TOPIC NAME> with the name of the topic, and <SUBSCRIPTION
NAME> with the name of the topic's subscription.
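For reference, the variable declarations this step refers to might look like the following. The values shown are placeholders that you replace as described above; the class name `ServiceBusTopicTest` matches the one used later in the receiver code:

```java
public class ServiceBusTopicTest {
    // Placeholder values; replace with your actual topic and
    // subscription names before running.
    static final String topicName = "<TOPIC NAME>";
    static final String subName = "<SUBSCRIPTION NAME>";
}
```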
3. Add a method named sendMessage in the class to send one message to the topic.
Passwordless (Recommended)
Important
Java
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

ServiceBusSenderClient senderClient = new ServiceBusClientBuilder()
    .fullyQualifiedNamespace("NAMESPACENAME.servicebus.windows.net")
    .credential(credential)
    .sender()
    .topicName(topicName)
    .buildClient();
Java
Passwordless (Recommended)
Important
Java
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

ServiceBusSenderClient senderClient = new ServiceBusClientBuilder()
    .fullyQualifiedNamespace("NAMESPACENAME.servicebus.windows.net")
    .credential(credential)
    .sender()
    .topicName(topicName)
    .buildClient();

if (messageBatch.getCount() > 0) {
    senderClient.sendMessages(messageBatch);
    System.out.println("Sent a batch of messages to the topic: " + topicName);
}
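The `messageBatch` sent above is filled by repeatedly calling `tryAddMessage` on a `ServiceBusMessageBatch`, flushing and starting a new batch whenever one fills up. Since that loop needs a live Service Bus connection, here is a self-contained sketch of the same batching pattern using a hypothetical size-limited batch in place of the SDK type:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for ServiceBusMessageBatch: accepts messages
// only while the running byte count stays under maxBytes.
class SizeLimitedBatch {
    private final int maxBytes;
    private int used = 0;
    final List<String> messages = new ArrayList<>();

    SizeLimitedBatch(int maxBytes) { this.maxBytes = maxBytes; }

    boolean tryAdd(String msg) {
        if (used + msg.length() > maxBytes) {
            return false;
        }
        used += msg.length();
        messages.add(msg);
        return true;
    }
}

public class BatchDemo {
    // Splits messages into batches the same way the quickstart's
    // tryAddMessage loop does: flush a full batch, start a new one.
    static List<List<String>> toBatches(List<String> msgs, int maxBytes) {
        List<List<String>> out = new ArrayList<>();
        SizeLimitedBatch batch = new SizeLimitedBatch(maxBytes);
        for (String m : msgs) {
            if (!batch.tryAdd(m)) {
                if (!batch.messages.isEmpty()) {
                    out.add(batch.messages);
                }
                batch = new SizeLimitedBatch(maxBytes);
                if (!batch.tryAdd(m)) {
                    throw new IllegalArgumentException("message too large for a batch");
                }
            }
        }
        if (!batch.messages.isEmpty()) {
            out.add(batch.messages);
        }
        return out;
    }
}
```

With a limit of 8 bytes, the messages `"aaaa"`, `"bbbb"`, `"cc"` split into two batches: the first two messages exactly fill one batch, and the third starts a new one.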
Passwordless (Recommended)
Important
Replace ServiceBusTopicTest in ServiceBusTopicTest::processMessage with the name of
your class.
Java
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

ServiceBusProcessorClient processorClient = new ServiceBusClientBuilder()
    .fullyQualifiedNamespace("NAMESPACENAME.servicebus.windows.net")
    .credential(credential)
    .processor()
    .topicName(topicName)
    .subscriptionName(subName)
    .processMessage(ServiceBusTopicTest::processMessage)
    .processError(context -> processError(context, countdownLatch))
    .buildProcessorClient();

System.out.println("Starting the processor");
processorClient.start();

TimeUnit.SECONDS.sleep(10);
System.out.println("Stopping and closing the processor");
processorClient.close();
}
2. Add the processMessage method to process a message received from the Service
Bus subscription.
Java
Java
private static void processError(ServiceBusErrorContext context, CountDownLatch countdownLatch) {
    // Assumes the exception is a ServiceBusException; production code
    // would also handle non-ServiceBus exceptions.
    ServiceBusException exception = (ServiceBusException) context.getException();
    ServiceBusFailureReason reason = exception.getReason();

    if (reason == ServiceBusFailureReason.MESSAGING_ENTITY_DISABLED
        || reason == ServiceBusFailureReason.MESSAGING_ENTITY_NOT_FOUND
        || reason == ServiceBusFailureReason.UNAUTHORIZED) {
        System.out.printf("An unrecoverable error occurred. Stopping processing with reason %s: %s%n",
            reason, exception.getMessage());
        countdownLatch.countDown();
    } else if (reason == ServiceBusFailureReason.MESSAGE_LOCK_LOST) {
        System.out.printf("Message lock lost for message: %s%n", context.getException());
    } else if (reason == ServiceBusFailureReason.SERVICE_BUSY) {
        try {
            // Choosing an arbitrary amount of time to wait until trying again.
            TimeUnit.SECONDS.sleep(1);
        } catch (InterruptedException e) {
            System.err.println("Unable to sleep for period of time");
        }
    } else {
        System.out.printf("Error source %s, reason %s, message: %s%n",
            context.getErrorSource(), reason, context.getException());
    }
}
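The `countdownLatch` used in processError is how the receiver stops itself on an unrecoverable error: the main thread blocks on the latch, and the error handler releases it from the processor's callback thread. The pattern in isolation, with a hypothetical helper thread standing in for the error handler:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class LatchDemo {
    // Blocks until another thread signals the latch, or the timeout
    // elapses; returns true if the signal arrived in time.
    static boolean waitForShutdownSignal(long timeoutSeconds) {
        CountDownLatch countdownLatch = new CountDownLatch(1);
        // Simulates processError calling countdownLatch.countDown()
        // from the processor's error-handling thread.
        new Thread(countdownLatch::countDown).start();
        try {
            return countdownLatch.await(timeoutSeconds, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("released=" + waitForShutdownSignal(5));
    }
}
```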
Java
Passwordless (Recommended)
1. If you're using Eclipse, right-click the project, select Export, expand Java,
select Runnable JAR file, and follow the steps to create a runnable JAR file.
2. If you're signed in to the machine using a user account that's different from
the user account added to the Azure Service Bus Data Owner role, follow
these steps. Otherwise, skip this step and move on to run the JAR file in the
next step.
b. Run the following CLI command to sign in to Azure. Use the same user
account that you added to the Azure Service Bus Data Owner role.
Azure CLI
az login
3. Run the JAR file using the following command.
Java
Console
On the Overview page for the Service Bus namespace in the Azure portal, you can see
incoming and outgoing message count. You may need to wait for a minute or so and
then refresh the page to see the latest values.
Switch to the Topics tab in the middle-bottom pane, and select the topic to see the
Service Bus Topic page for your topic. On this page, you should see four incoming and
four outgoing messages in the Messages chart.
If you comment out the receiveMessages call in the main method and run the app
again, on the Service Bus Topic page you see eight incoming messages (four new) but
still four outgoing messages.
On this page, if you select a subscription, you get to the Service Bus Subscription page.
You can see the active message count, dead-letter message count, and more on this
page. In this example, there are four active messages that the receiver hasn't received
yet.
Next steps
See the following documentation and samples:
Get started with the Azure Blob Storage client library for Java to manage blobs and
containers. Follow these steps to install the package and try out example code for basic
tasks.
Tip
Prerequisites
Azure account with an active subscription - create an account for free.
Azure Storage account - create a storage account.
Java Development Kit (JDK) version 8 or above.
Apache Maven.
Setting up
This section walks you through preparing a project to work with the Azure Blob Storage
client library for Java.
PowerShell
mvn archetype:generate `
--define interactiveMode=n `
--define groupId=com.blobs.quickstart `
--define artifactId=blob-quickstart `
--define archetypeArtifactId=maven-archetype-quickstart `
--define archetypeVersion=1.4
2. The output from generating the project should look something like this:
Console
Console
cd blob-quickstart
4. Inside the blob-quickstart directory, create another directory called data. This
folder is where the blob data files will be created and stored.
Console
mkdir data
Add azure-sdk-bom to take a dependency on the latest version of the library. In the
following snippet, replace the {bom_version_to_target} placeholder with the version
number. Using azure-sdk-bom keeps you from having to specify the version of each
individual dependency. To learn more about the BOM, see the Azure SDK BOM
README .
XML
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-sdk-bom</artifactId>
<version>{bom_version_to_target}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
Then add the following dependency elements to the group of dependencies. The
azure-identity dependency is needed for passwordless connections to Azure services.
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-storage-blob</artifactId>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
</dependency>
Java
package com.blobs.quickstart;
/**
* Azure Blob Storage quickstart
*/
import com.azure.identity.*;
import com.azure.storage.blob.*;
import com.azure.storage.blob.models.*;
import java.io.*;
Code examples
These example code snippets show you how to perform the following actions with the
Azure Blob Storage client library for Java:
Important
Make sure you have the correct dependencies in pom.xml and the necessary
directives for the code samples to work, as described in the setting up section.
You can also authorize requests to Azure Blob Storage by using the account access key.
However, this approach should be used with caution. Developers must be diligent to
never expose the access key in an unsecure location. Anyone who has the access key is
able to authorize requests against the storage account, and effectively has access to all
the data. DefaultAzureCredential offers improved management and security benefits
over the account key to allow passwordless authentication. Both options are
demonstrated in the following example.
Passwordless (Recommended)
The order and locations in which DefaultAzureCredential looks for credentials can
be found in the Azure Identity library overview.
For example, your app can authenticate using your Visual Studio Code sign-in
credentials when developing locally. Your app can then use a managed identity
once it's deployed to Azure. No code changes are required for this
transition.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Blob Data Contributor role to your
user account, which provides both read and write access to blob data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Blob Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
You can authorize access to data in your storage account using the following steps:
1. Make sure you're authenticated with the same Azure AD account you assigned
the role to on your storage account. You can authenticate via the Azure CLI,
Visual Studio Code, or Azure PowerShell.
Azure CLI
Sign-in to Azure through the Azure CLI using the following command:
Azure CLI
az login
2. To use DefaultAzureCredential , make sure that the azure-identity
dependency is added in pom.xml :
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
</dependency>
3. Add this code to the Main method. When the code runs on your local
workstation, it authenticates to Azure using the developer credentials of the
prioritized tool you're signed in to, such as the Azure CLI or Visual Studio
Code.
Java
/*
 * The default credential first checks environment variables for configuration.
 * If environment configuration is incomplete, it will try managed identity.
 */
DefaultAzureCredential defaultCredential = new DefaultAzureCredentialBuilder().build();
4. Make sure to update the storage account name in the URI of your
BlobServiceClient . The storage account name can be found on the overview
page of your storage account in the Azure portal.
Create a container
Decide on a name for the new container. The code below appends a UUID value to the
container name to ensure that it's unique.
Important
Next, create an instance of the BlobContainerClient class, then call the create method to
actually create the container in your storage account.
Java
// Create a unique name for the container
String containerName = "quickstartblobs" + java.util.UUID.randomUUID();

// Create the container and return a container client object
BlobContainerClient blobContainerClient = blobServiceClient.createBlobContainer(containerName);
To learn more about creating a container, and to explore more code samples, see Create
a blob container with Java.
Java
To learn more about uploading blobs, and to explore more code samples, see Upload a
blob with Java.
Java
System.out.println("\nListing blobs...");
To learn more about listing blobs, and to explore more code samples, see List blobs with
Java.
Download blobs
Download the previously created blob by calling the downloadToFile method. The
example code adds a suffix of "DOWNLOAD" to the file name so that you can see both
files in local file system.
Java
// Append the string "DOWNLOAD" before the .txt extension for comparison purposes
String downloadFileName = fileName.replace(".txt", "DOWNLOAD.txt");
blobClient.downloadToFile(localPath + downloadFileName);
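The file-name suffix logic can be seen in isolation; this small helper is not part of the quickstart, just a sketch of the string manipulation the sample performs:

```java
public class DownloadNameDemo {
    // Mirrors the quickstart's suffix trick: insert "DOWNLOAD"
    // before the .txt extension of the uploaded file's name.
    static String downloadName(String fileName) {
        return fileName.replace(".txt", "DOWNLOAD.txt");
    }

    public static void main(String[] args) {
        // prints quickstartDOWNLOAD.txt
        System.out.println(downloadName("quickstart.txt"));
    }
}
```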
To learn more about downloading blobs, and to explore more code samples, see
Download a blob with Java.
Delete a container
The following code cleans up the resources the app created by removing the entire
container using the delete method. It also deletes the local files created by the app.
The app pauses for user input by calling System.console().readLine() before it deletes
the blob, container, and local files. This is a good chance to verify that the resources
were created correctly, before they're deleted.
Java
// Clean up resources
System.out.println("\nPress the Enter key to begin clean up");
System.console().readLine();
System.out.println("Done");
To learn more about deleting a container, and to explore more code samples, see Delete
and restore a blob container with Java.
1. Navigate to the directory containing the pom.xml file and compile the project by
using the following mvn command:
Console
mvn compile
Console
mvn package
Console
To simplify the run step, you can add exec-maven-plugin to pom.xml and configure
as shown below:
XML
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.4.0</version>
<configuration>
<mainClass>com.blobs.quickstart.App</mainClass>
<cleanupDaemonThreads>false</cleanupDaemonThreads>
</configuration>
</plugin>
With this configuration, you can execute the app with the following command:
Console
mvn exec:java
The output of the app is similar to the following example (UUID values omitted for
readability):
Output
https://mystorageacct.blob.core.windows.net/quickstartblobsUUID/quickstartUUID.txt
Listing blobs...
quickstartUUID.txt
Downloading blob to
./data/quickstartUUIDDOWNLOAD.txt
Before you begin the cleanup process, check your data folder for the two files. You can
compare them and observe that they're identical.
Clean up resources
After you've verified the files and finished testing, press the Enter key to delete the test
files along with the container you created in the storage account. You can also use Azure
CLI to delete resources.
Next steps
In this quickstart, you learned how to upload, download, and list blobs using Java.
To learn more, see the Azure Blob Storage client libraries for Java.
For tutorials, samples, quickstarts, and other documentation, visit Azure for Java
developers.
Quickstart: Azure Queue Storage client
library for Java
Article • 06/29/2023
Get started with the Azure Queue Storage client library for Java. Azure Queue Storage is
a service for storing large numbers of messages for later retrieval and processing. Follow
these steps to install the package and try out example code for basic tasks.
Use the Azure Queue Storage client library for Java to:
Create a queue
Add messages to a queue
Peek at messages in a queue
Update a message in a queue
Get the queue length
Receive messages from a queue
Delete messages from a queue
Delete a queue
Prerequisites
Java Development Kit (JDK) version 8 or above
Apache Maven
Azure subscription - create one for free
Azure Storage account - create a storage account
Setting up
This section walks you through preparing a project to work with the Azure Queue
Storage client library for Java.
PowerShell
PowerShell
mvn archetype:generate `
--define interactiveMode=n `
--define groupId=com.queues.quickstart `
--define artifactId=queues-quickstart `
--define archetypeArtifactId=maven-archetype-quickstart `
--define archetypeVersion=1.4
2. The output from generating the project should look something like this:
Console
Console
cd queues-quickstart
Add azure-sdk-bom to take a dependency on the latest version of the library. In the
following snippet, replace the {bom_version_to_target} placeholder with the version
number. Using azure-sdk-bom keeps you from having to specify the version of each
individual dependency. To learn more about the BOM, see the Azure SDK BOM
README .
XML
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-sdk-bom</artifactId>
<version>{bom_version_to_target}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
Then add the following dependency elements to the group of dependencies. The
azure-identity dependency is needed for passwordless connections to Azure services.
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-storage-queue</artifactId>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
</dependency>
Java
package com.queues.quickstart;
/**
* Azure Queue Storage client library quickstart
*/
import com.azure.identity.*;
import com.azure.storage.queue.*;
import com.azure.storage.queue.models.*;
import java.io.*;
Authenticate to Azure
Application requests to most Azure services must be authorized. Using the
DefaultAzureCredential class provided by the Azure Identity client library is the
recommended approach for implementing passwordless connections to Azure services
in your code.
You can also authorize requests to Azure services using passwords, connection strings,
or other credentials directly. However, this approach should be used with caution.
Developers must be diligent to never expose these secrets in an unsecure location.
Anyone who gains access to the password or secret key is able to authenticate.
DefaultAzureCredential offers improved management and security benefits over the
account key to allow passwordless authentication. Both options are demonstrated in the
following example.
Passwordless (Recommended)
For example, your app can authenticate using your Azure CLI sign-in credentials
when developing locally, and then use a managed identity once it has been
deployed to Azure. No code changes are required for this transition.
When developing locally, make sure that the user account that is accessing the
queue data has the correct permissions. You'll need Storage Queue Data
Contributor to read and write queue data. To assign yourself this role, you'll need
to be assigned the User Access Administrator role, or another role that includes the
Microsoft.Authorization/roleAssignments/write action. You can assign Azure RBAC
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn
more about the available scopes for role assignments on the scope overview page.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Queue Data Contributor role to your
user account, which provides both read and write access to queue data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Queue Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
Object model
Azure Queue Storage is a service for storing large numbers of messages. A queue
message can be up to 64 KB in size. A queue may contain millions of messages, up to
the total capacity limit of a storage account. Queues are commonly used to create a
backlog of work to process asynchronously. Queue Storage offers three types of
resources:
Storage account: All access to Azure Storage is done through a storage account.
For more information about storage accounts, see Storage account overview
Queue: A queue contains a set of messages. All messages must be in a queue.
Note that the queue name must be all lowercase. For information on naming
queues, see Naming Queues and Metadata.
Message: A message, in any format, of up to 64 KB. A message can remain in the
queue for a maximum of 7 days. For version 2017-07-29 or later, the maximum
time-to-live can be any positive number, or -1 indicating that the message doesn't
expire. If this parameter is omitted, the default time-to-live is seven days.
Passwordless (Recommended)
Azure CLI
Sign-in to Azure through the Azure CLI using the following command:
Azure CLI
az login
Once authenticated, you can create and authorize a QueueClient object using
DefaultAzureCredential to access queue data in the storage account.
DefaultAzureCredential automatically discovers and uses the account you signed in
with.
Java
import com.azure.identity.*;
Decide on a name for the queue and create an instance of the QueueClient class,
using DefaultAzureCredential for authorization. We use this client object to create
and interact with the queue resource in the storage account.
Important
Queue names may only contain lowercase letters, numbers, and hyphens, and
must begin with a letter or a number. Each hyphen must be preceded and
followed by a non-hyphen character. The name must also be between 3 and 63
characters long. For more information about naming queues, see Naming
queues and metadata.
Add this code inside the main method, and make sure to replace the
<storage-account-name> placeholder value:
Java
// Instantiate a QueueClient
// We'll use this client object to create and interact with the queue
// TODO: replace <storage-account-name> with the actual name
QueueClient queueClient = new QueueClientBuilder()
    .endpoint("https://<storage-account-name>.queue.core.windows.net/")
.queueName(queueName)
.credential(new DefaultAzureCredentialBuilder().build())
.buildClient();
Note
Messages sent using the QueueClient class must be in a format that can be
included in an XML request with UTF-8 encoding. You can optionally set the
QueueMessageEncoding option to BASE64 to handle non-compliant messages.
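The queue naming rules stated in the note above can also be checked locally before calling the service. This validator is not part of the quickstart; it's a sketch encoding those rules:

```java
import java.util.regex.Pattern;

public class QueueNameCheck {
    // Lowercase letters, digits, and hyphens only; begins and ends
    // with a letter or digit.
    private static final Pattern SHAPE =
        Pattern.compile("^[a-z0-9]([a-z0-9-]*[a-z0-9])?$");

    static boolean isValidQueueName(String name) {
        // 3-63 characters, and each hyphen must be surrounded by
        // non-hyphen characters (so no consecutive hyphens).
        return name.length() >= 3 && name.length() <= 63
            && SHAPE.matcher(name).matches()
            && !name.contains("--");
    }
}
```

For example, `quickstartqueues-123` passes, while `My-Queue` (uppercase), `a--b` (consecutive hyphens), and `ab` (too short) are rejected.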
Create a queue
Using the QueueClient object, call the create method to create the queue in your
storage account.
Java
Java
Java
Java
The getProperties method returns several values including the number of messages
currently in a queue. The count is only approximate because messages can be added or
removed after your request. The getApproximateMessageCount method returns the last
value retrieved by the call to getProperties , without calling Queue Storage.
Java
Java
When calling the receiveMessages method, you can optionally specify a value for
maxMessages , which is the number of messages to retrieve from the queue. The default is
1 message and the maximum is 32 messages. You can also specify a value for
visibilityTimeout , which hides the messages from other operations for the timeout
period.
Delete a queue
The following code cleans up the resources the app created by deleting the queue using
the delete method.
Java
System.out.println("\nPress Enter key to delete the queue...");
System.console().readLine();
// Clean up
System.out.println("Deleting queue: " + queueClient.getQueueName());
queueClient.delete();
System.out.println("Done");
In your console window, navigate to your application directory, then build and run the
application.
Console
mvn compile
Console
mvn package
Console
Output
Press Enter key to receive messages and delete them from the queue...
When the app pauses before receiving messages, check your storage account in the
Azure portal. Verify the messages are in the queue.
Press the Enter key to receive and delete the messages. When prompted, press the
Enter key again to delete the queue and finish the demo.
Next steps
In this quickstart, you learned how to create a queue and add messages to it using Java
code. Then you learned to peek, retrieve, and delete messages. Finally, you learned how
to delete a message queue.
For related code samples using deprecated Java version 8 SDKs, see Code samples
using Java version 8.
For more Azure Queue Storage sample apps, see Azure Queue Storage client
library for Java - samples .
Tutorial: Deploy a Spring application to
Azure Spring Apps with a passwordless
connection to an Azure database
Article • 04/24/2023
This article shows you how to use passwordless connections to Azure databases in
Spring Boot applications deployed to Azure Spring Apps.
In this tutorial, you complete the following tasks using the Azure portal or the Azure CLI.
Both methods are explained in the following procedures.
Note
Prerequisites
An Azure subscription. If you don't already have one, create a free account
before you begin.
Azure CLI 2.45.0 or higher is required.
The Azure Spring Apps extension. You can install the extension by using the
command: az extension add --name spring .
Java Development Kit (JDK), version 8, 11, or 17.
A Git client.
cURL or a similar HTTP utility to test functionality.
MySQL command-line client if you choose to run Azure Database for MySQL. You
can connect to your server with Azure Cloud Shell using the mysql.exe
command-line tool. Alternatively, you can use the mysql command line in your
local environment.
ODBC Driver 18 for SQL Server if you choose to run Azure SQL Database.
Bash
export AZ_RESOURCE_GROUP=passwordless-tutorial-rg
export AZ_DATABASE_SERVER_NAME=<YOUR_DATABASE_SERVER_NAME>
export AZ_DATABASE_NAME=demodb
export AZ_LOCATION=<YOUR_AZURE_REGION>
export AZ_SPRING_APPS_SERVICE_NAME=<YOUR_AZURE_SPRING_APPS_SERVICE_NAME>
export AZ_SPRING_APPS_APP_NAME=hellospring
export AZ_DB_ADMIN_USERNAME=<YOUR_DB_ADMIN_USERNAME>
export AZ_DB_ADMIN_PASSWORD=<YOUR_DB_ADMIN_PASSWORD>
export AZ_USER_IDENTITY_NAME=<YOUR_USER_ASSIGNED_MANAGED_IDENTITY_NAME>
Replace the placeholders with the following values, which are used throughout this
article:
1. Update Azure CLI with the Azure Spring Apps extension by using the following
command:
Azure CLI
Azure CLI
az login
az account list --output table
az account set --subscription <name-or-ID-of-subscription>
3. Use the following commands to create a resource group to contain your Azure
Spring Apps service and an instance of the Azure Spring Apps service:
Azure CLI
az group create \
--name $AZ_RESOURCE_GROUP \
--location $AZ_LOCATION
az spring create \
--resource-group $AZ_RESOURCE_GROUP \
--name $AZ_SPRING_APPS_SERVICE_NAME
1. Create an Azure Database for MySQL server by using the following command:
Azure CLI
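The creation command takes a form like the following sketch, which uses the variables defined earlier; adjust options such as tier and version for your scenario:

```
az mysql flexible-server create \
    --resource-group $AZ_RESOURCE_GROUP \
    --name $AZ_DATABASE_SERVER_NAME \
    --location $AZ_LOCATION \
    --admin-user $AZ_DB_ADMIN_USERNAME \
    --admin-password $AZ_DB_ADMIN_PASSWORD \
    --yes
```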
Then, use the following command to create a user-assigned managed identity for
Azure Active Directory authentication. For more information, see Set up Azure
Active Directory authentication for Azure Database for MySQL - Flexible Server.
Azure CLI
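A sketch of the identity creation step, using the variables defined earlier:

```
az identity create \
    --resource-group $AZ_RESOURCE_GROUP \
    --name $AZ_USER_IDENTITY_NAME
```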
Azure CLI
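The Service Connector command takes a form like the following sketch. The variable $AZ_IDENTITY_RESOURCE_ID is a hypothetical name for the resource ID of the user-assigned managed identity created earlier:

```
az spring connection create mysql-flexible \
    --resource-group $AZ_RESOURCE_GROUP \
    --service $AZ_SPRING_APPS_SERVICE_NAME \
    --app $AZ_SPRING_APPS_APP_NAME \
    --target-resource-group $AZ_RESOURCE_GROUP \
    --server $AZ_DATABASE_SERVER_NAME \
    --database $AZ_DATABASE_NAME \
    --system-identity mysql-identity-id=$AZ_IDENTITY_RESOURCE_ID
```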
This Service Connector command does the following tasks in the background:
Set the Azure Active Directory admin to the current signed-in user.
Note
If you see the error message The subscription is not registered to use
Microsoft.ServiceLinker , run the command az provider register --namespace
Microsoft.ServiceLinker to register the Service Connector resource provider, and
then run the connection command again.
Bash
XML
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter-jdbc-mysql</artifactId>
</dependency>
This dependency adds Spring Cloud Azure support for passwordless connections to MySQL.
Note
For more information about how to manage Spring Cloud Azure library
versions by using a bill of materials (BOM), see the Getting started
section of the Spring Cloud Azure developer guide.
3. Use the following command to update the application.properties file:
Bash
cat <<EOF >> src/main/resources/application.properties
logging.level.org.springframework.jdbc.core=DEBUG
spring.datasource.azure.passwordless-enabled=true
spring.sql.init.mode=always
EOF
Bash
cd passwordless-sample
./mvnw clean package -DskipTests
Azure CLI
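A deployment command along these lines fits here; the artifact path shown is a hypothetical example and depends on how the sample project names its build output:

```
az spring app deploy \
    --resource-group $AZ_RESOURCE_GROUP \
    --service $AZ_SPRING_APPS_SERVICE_NAME \
    --name $AZ_SPRING_APPS_APP_NAME \
    --artifact-path target/demo-0.0.1-SNAPSHOT.jar
```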
6. Query the app status after deployment by using the following command:
Azure CLI
Bash
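A curl request such as the following sketch creates the item; the payload mirrors the item returned later in this section:

```
curl --header "Content-Type: application/json" \
     --request POST \
     --data '{"description":"configuration","details":"congratulations, you have set up JDBC correctly!","done":true}' \
     https://${AZ_SPRING_APPS_SERVICE_NAME}-hellospring.azuremicroservices.io
```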
This command returns the created item, as shown in the following example:
JSON
Bash
curl https://${AZ_SPRING_APPS_SERVICE_NAME}-hellospring.azuremicroservices.io
This command returns the list of "todo" items, including the item you've created, as
shown in the following example:
JSON
[{"id":1,"description":"configuration","details":"congratulations, you have set up JDBC correctly!","done":true}]
Clean up resources
To clean up all resources used during this tutorial, delete the resource group by using
the following command:
Azure CLI
az group delete \
--name $AZ_RESOURCE_GROUP \
--yes
Next steps
Spring Cloud Azure documentation
Use a managed identity to connect
Azure SQL Database to an app deployed
to Azure Spring Apps
Article • 04/19/2023
Note
Azure Spring Apps is the new name for the Azure Spring Cloud service. Although
the service has a new name, you'll see the old name in some places for a while as
we work to update assets such as screenshots, videos, and diagrams.
This article shows you how to create a managed identity for an app deployed to Azure
Spring Apps and use it to access Azure SQL Database.
Azure SQL Database is the intelligent, scalable, relational database service built for the
cloud. It’s always up to date, with AI-powered and automated features that optimize
performance and durability. Serverless compute and Hyperscale storage options
automatically scale resources on demand, so you can focus on building new applications
without worrying about storage size or resource management.
Prerequisites
An Azure account with an active subscription. Create an account for free .
Azure CLI version 2.45.0 or higher.
Follow the Spring Data JPA tutorial to provision an Azure SQL Database and get it
working with a Java app locally.
Follow the Azure Spring Apps system-assigned managed identity tutorial to
provision an app in Azure Spring Apps with managed identity enabled.
Manual configuration
SQL
Azure CLI
Update the value of the spring.datasource.url property in your configuration, as
shown in the following example. Be sure to use the correct value for the
$AZ_DATABASE_NAME variable.
properties
spring.datasource.url=jdbc:sqlserver://$AZ_DATABASE_NAME.database.windows.net:1433;database=demo;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;Authentication=ActiveDirectoryMSI;
Build and deploy the app to Azure Spring Apps
Rebuild the app and deploy it to the Azure Spring Apps instance provisioned in the
second bullet point under Prerequisites. You now have a Spring Boot application,
authenticated by a managed identity, that uses JPA to store and retrieve data from
Azure SQL Database in Azure Spring Apps.
Next steps
How to access Storage blob with managed identity in Azure Spring Apps
How to enable system-assigned managed identity for applications in Azure Spring
Apps
Learn more about managed identities for Azure resources
Authenticate Azure Spring Apps with Key Vault in GitHub Actions
Connect an Azure Database for MySQL
instance to your application in Azure
Spring Apps
Article • 05/10/2023
Note
Azure Spring Apps is the new name for the Azure Spring Cloud service. Although
the service has a new name, you'll see the old name in some places for a while as
we work to update assets such as screenshots, videos, and diagrams.
With Azure Spring Apps, you can connect selected Azure services to your applications
automatically, instead of having to configure your Spring Boot application manually.
This article shows you how to connect your application to your Azure Database for
MySQL instance.
Prerequisites
An application deployed to Azure Spring Apps. For more information, see
Quickstart: Deploy your first application to Azure Spring Apps.
An Azure Database for MySQL Flexible Server instance.
Azure CLI version 2.45.0 or higher.
XML
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter-jdbc-mysql</artifactId>
</dependency>
3. Update the current app by running az spring app deploy , or create a new
deployment for this change by running az spring app deployment create .
Follow these steps to configure your Spring app to connect to an Azure Database
for MySQL Flexible Server with a system-assigned managed identity.
Azure CLI
Next steps
In this article, you learned how to connect an application in Azure Spring Apps to an
Azure Database for MySQL instance. To learn more about connecting services to an
application, see Connect an Azure Cosmos DB database to an application in Azure
Spring Apps.
Bind an Azure Database for PostgreSQL
to your application in Azure Spring
Apps
Article • 06/01/2023
Note
Azure Spring Apps is the new name for the Azure Spring Cloud service. Although
the service has a new name, you'll see the old name in some places for a while as
we work to update assets such as screenshots, videos, and diagrams.
With Azure Spring Apps, you can bind selected Azure services to your applications
automatically, instead of having to configure your Spring Boot application manually.
This article shows you how to bind your application to your Azure Database for
PostgreSQL instance.
In this article, we include two authentication methods: Azure Active Directory (Azure AD)
authentication and PostgreSQL authentication. The Passwordless tab shows the Azure
AD authentication and the Password tab shows the PostgreSQL authentication.
Prerequisites
An application deployed to Azure Spring Apps. For more information, see
Quickstart: Deploy your first application to Azure Spring Apps.
An Azure Database for PostgreSQL Single Server instance.
Azure CLI version 2.45.0 or higher.
XML
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>spring-cloud-azure-starter-jdbc-postgresql</artifactId>
</dependency>
3. Update the current app by running az spring app deploy , or create a new
deployment for this change by running az spring app deployment create .
1. Install the Service Connector passwordless extension for the Azure CLI:
Azure CLI
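As a sketch, the extension install typically looks like this:

```
az extension add --name serviceconnector-passwordless --upgrade
```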
Azure CLI
az spring connection create postgres \
--resource-group $AZ_SPRING_APPS_RESOURCE_GROUP \
--service $AZ_SPRING_APPS_SERVICE_INSTANCE_NAME \
--app $APP_NAME \
--deployment $DEPLOYMENT_NAME \
--target-resource-group $POSTGRES_RESOURCE_GROUP \
--server $POSTGRES_SERVER_NAME \
--database $DATABASE_NAME \
--system-identity
Next steps
In this article, you learned how to bind an application in Azure Spring Apps to an Azure
Database for PostgreSQL instance. To learn more about binding services to an
application, see Bind an Azure Cosmos DB database to an application in Azure Spring
Apps.
Tutorial: Connect to a PostgreSQL
Database from Java Tomcat App Service
without secrets using a managed
identity
Article • 03/09/2023
Azure App Service provides a highly scalable, self-patching web hosting service in Azure.
It also provides a managed identity for your app, which is a turn-key solution for
securing access to Azure Database for PostgreSQL and other Azure services. Managed
identities in App Service make your app more secure by eliminating secrets from your
app, such as credentials in the environment variables. In this tutorial, you will learn how
to:
If you don't have an Azure subscription, create an Azure free account before you
begin.
Prerequisites
Git
Java JDK
Maven
Azure CLI version 2.45.0 or higher.
Bash
1. Sign into the Azure CLI, and optionally set your subscription if you have more than
one connected to your login credentials.
Azure CLI
az login
az account set --subscription <subscription-ID>
Azure CLI
RESOURCE_GROUP=<resource-group-name>
LOCATION=eastus
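A resource group to hold these resources can then be created; for example:

```
az group create --name $RESOURCE_GROUP --location $LOCATION
```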
3. Create an Azure Database for PostgreSQL server. The server is created with an
administrator account, but it won't be used because we'll use the Azure Active
Directory (Azure AD) admin account to perform administrative tasks.
Flexible Server
Azure CLI
POSTGRESQL_ADMIN_USER=azureuser
# PostgreSQL admin access rights won't be used because Azure AD authentication is leveraged to administer the database.
POSTGRESQL_ADMIN_PASSWORD=<admin-password>
POSTGRESQL_HOST=<postgresql-host-name>
Flexible Server
Azure CLI
DATABASE_NAME=checklist
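As a sketch of the server and database creation steps, assuming Flexible Server and that $POSTGRESQL_HOST holds the server name:

```
az postgres flexible-server create \
    --resource-group $RESOURCE_GROUP \
    --name $POSTGRESQL_HOST \
    --location $LOCATION \
    --admin-user $POSTGRESQL_ADMIN_USER \
    --admin-password $POSTGRESQL_ADMIN_PASSWORD \
    --yes

az postgres flexible-server db create \
    --resource-group $RESOURCE_GROUP \
    --server-name $POSTGRESQL_HOST \
    --database-name $DATABASE_NAME
```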
1. The sample app contains a pom.xml file that can generate the WAR file. Run the
following command to build the app.
Bash
Azure CLI
APPSERVICE_PLAN=<app-service-plan>
APPSERVICE_NAME=<app-service-name>
# Create an App Service plan
az appservice plan create \
--resource-group $RESOURCE_GROUP \
--name $APPSERVICE_PLAN \
--location $LOCATION \
--sku B1 \
--is-linux
Azure CLI
az webapp deploy \
--resource-group $RESOURCE_GROUP \
--name $APPSERVICE_NAME \
--src-path target/app.war \
--type war
Install the Service Connector passwordless extension for the Azure CLI:
Azure CLI
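A sketch of the install command:

```
az extension add --name serviceconnector-passwordless --upgrade
```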
Flexible Server
Azure CLI
az webapp browse \
    --resource-group $RESOURCE_GROUP \
    --name $APPSERVICE_NAME
Clean up resources
In the preceding steps, you created Azure resources in a resource group. If you don't
expect to need these resources in the future, delete the resource group by running the
following command in the Cloud Shell:
Azure CLI
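For example, assuming the $RESOURCE_GROUP variable set earlier:

```
az group delete --name $RESOURCE_GROUP --yes
```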
Next steps
Learn more about running Java apps on App Service on Linux in the developer guide.
Learn how to secure your app with a custom domain and certificate.
Azure Container Apps provides a managed identity for your app, which is a turn-key
solution for securing access to Azure Database for PostgreSQL and other Azure services.
Managed identities in Container Apps make your app more secure by eliminating
secrets from your app, such as credentials in the environment variables.
This tutorial walks you through the process of building, configuring, deploying, and
scaling Java container apps on Azure. At the end of this tutorial, you'll have a Quarkus
application storing data in a PostgreSQL database with a managed identity running on
Container Apps.
Configure a Quarkus app to authenticate using Azure Active Directory (Azure AD)
with a PostgreSQL Database.
Create an Azure container registry and push a Java app image to it.
Create a Container App in Azure.
Create a PostgreSQL database in Azure.
Connect to a PostgreSQL Database with managed identity using Service Connector.
If you don't have an Azure subscription, create an Azure free account before you
begin.
Prerequisites
Azure CLI version 2.45.0 or higher.
Git
Java JDK
Maven
Docker
GraalVM
The following example creates a resource group named myResourceGroup in the East US
Azure region.
Azure CLI
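For example:

```
az group create --name myResourceGroup --location eastus
```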
Create an Azure container registry instance using the az acr create command. The
registry name must be unique within Azure and contain 5-50 alphanumeric characters,
all lowercase. In the following example, mycontainerregistry007 is used. Update this
to a unique value.
Azure CLI
az acr create \
--resource-group myResourceGroup \
--name mycontainerregistry007 \
--sku Basic
Run the following commands in your terminal to clone the sample repo and set up the
sample app environment.
git
Add the azure-identity-providers-jdbc-postgresql dependency to your pom.xml file:
XML
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity-providers-jdbc-postgresql</artifactId>
<version>1.0.0-beta.1</version>
</dependency>
Delete the existing content in application.properties and replace with the following
to configure the database for dev, test, and production modes:
Flexible Server
properties
quarkus.package.type=uber-jar
quarkus.hibernate-orm.database.generation=drop-and-create
quarkus.datasource.db-kind=postgresql
quarkus.datasource.jdbc.max-size=8
quarkus.datasource.jdbc.min-size=2
quarkus.hibernate-orm.log.sql=true
quarkus.hibernate-orm.sql-load-script=import.sql
quarkus.datasource.jdbc.acquisition-timeout = 10
%dev.quarkus.datasource.username=${AZURE_CLIENT_NAME}
%dev.quarkus.datasource.jdbc.url=jdbc:postgresql://${DBHOST}.postgres.database.azure.com:5432/${DBNAME}?\
authenticationPluginClassName=com.azure.identity.providers.postgresql.AzureIdentityPostgresqlAuthenticationPlugin\
&sslmode=require\
&azure.clientId=${AZURE_CLIENT_ID}\
&azure.clientSecret=${AZURE_CLIENT_SECRET}\
&azure.tenantId=${AZURE_TENANT_ID}
%prod.quarkus.datasource.username=${AZURE_MI_NAME}
%prod.quarkus.datasource.jdbc.url=jdbc:postgresql://${DBHOST}.postgres.database.azure.com:5432/${DBNAME}?\
authenticationPluginClassName=com.azure.identity.providers.postgresql.AzureIdentityPostgresqlAuthenticationPlugin\
&sslmode=require
%dev.quarkus.class-loading.parent-first-artifacts=com.azure:azure-core::jar,\
com.azure:azure-core-http-netty::jar,\
io.projectreactor.netty:reactor-netty-core::jar,\
io.projectreactor.netty:reactor-netty-http::jar,\
io.netty:netty-resolver-dns::jar,\
io.netty:netty-codec::jar,\
io.netty:netty-codec-http::jar,\
io.netty:netty-codec-http2::jar,\
io.netty:netty-handler::jar,\
io.netty:netty-resolver::jar,\
io.netty:netty-common::jar,\
io.netty:netty-transport::jar,\
io.netty:netty-buffer::jar,\
com.azure:azure-identity::jar,\
com.azure:azure-identity-providers-core::jar,\
com.azure:azure-identity-providers-jdbc-postgresql::jar,\
com.fasterxml.jackson.core:jackson-core::jar,\
com.fasterxml.jackson.core:jackson-annotations::jar,\
com.fasterxml.jackson.core:jackson-databind::jar,\
com.fasterxml.jackson.dataformat:jackson-dataformat-xml::jar,\
com.fasterxml.jackson.datatype:jackson-datatype-jsr310::jar,\
org.reactivestreams:reactive-streams::jar,\
io.projectreactor:reactor-core::jar,\
com.microsoft.azure:msal4j::jar,\
com.microsoft.azure:msal4j-persistence-extension::jar,\
org.codehaus.woodstox:stax2-api::jar,\
com.fasterxml.woodstox:woodstox-core::jar,\
com.nimbusds:oauth2-oidc-sdk::jar,\
com.nimbusds:content-type::jar,\
com.nimbusds:nimbus-jose-jwt::jar,\
net.minidev:json-smart::jar,\
net.minidev:accessors-smart::jar,\
io.netty:netty-transport-native-unix-common::jar
Run the following command to build the Quarkus app image. You must tag it with
the fully qualified name of your registry login server. The login server name is in
the format <registry-name>.azurecr.io (must be all lowercase), for example,
mycontainerregistry007.azurecr.io. Replace the name with your own registry name.
Bash
./mvnw quarkus:add-extension -Dextensions="container-image-jib"
./mvnw clean package -Pnative -Dquarkus.native.container-build=true -Dquarkus.container-image.build=true -Dquarkus.container-image.registry=mycontainerregistry007.azurecr.io -Dquarkus.container-image.name=quarkus-postgres-passwordless-app -Dquarkus.container-image.tag=v1
Before pushing container images, you must sign in to the registry by using the az
acr login command. Specify only the registry resource name when signing in with the
Azure CLI; don't use the fully qualified login server name.
Azure CLI
Bash
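A sketch of the sign-in and push steps, using the example registry and image names from this article:

```
az acr login --name mycontainerregistry007

docker push mycontainerregistry007.azurecr.io/quarkus-postgres-passwordless-app:v1
```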
Azure CLI
RESOURCE_GROUP="myResourceGroup"
LOCATION="eastus"
CONTAINERAPPS_ENVIRONMENT="my-environment"
2. Create a container app with your app image by running the following command.
Replace the placeholders with your values. To find the container registry admin
account details, see Authenticate with an Azure container registry
Azure CLI
CONTAINER_IMAGE_NAME=quarkus-postgres-passwordless-app:v1
REGISTRY_SERVER=mycontainerregistry007
REGISTRY_USERNAME=<REGISTRY_USERNAME>
REGISTRY_PASSWORD=<REGISTRY_PASSWORD>
az containerapp create \
--resource-group $RESOURCE_GROUP \
--name my-container-app \
--image $CONTAINER_IMAGE_NAME \
--environment $CONTAINERAPPS_ENVIRONMENT \
--registry-server $REGISTRY_SERVER \
--registry-username $REGISTRY_USERNAME \
--registry-password $REGISTRY_PASSWORD
Flexible Server
Azure CLI
DB_SERVER_NAME='msdocs-quarkus-postgres-webapp-db'
ADMIN_USERNAME='demoadmin'
ADMIN_PASSWORD='<admin-password>'
The following parameters are used in the above Azure CLI command:
resource-group → Use the same resource group name in which you created the
web app, for example msdocs-quarkus-postgres-webapp-rg .
name → The PostgreSQL database server name. This name must be unique across
all Azure (the server endpoint becomes
https://<name>.postgres.database.azure.com ). Allowed characters are A - Z , 0 - 9 ,
and - . A good pattern is to use a combination of your company name and server
identifier. ( msdocs-quarkus-postgres-webapp-db )
location → Use the same location used for the web app.
admin-user → The administrator username for the server; demoadmin is okay.
public-access → None, which sets the server in public access mode with no firewall
rules. Rules will be created in a later step.
sku-name → The name of the pricing tier and compute configuration, for example
GP_Gen5_2 . For more information, see Azure Database for PostgreSQL pricing .
1. Create a database named fruits within the PostgreSQL service with this
command:
Flexible Server
Azure CLI
2. Install the Service Connector passwordless extension for the Azure CLI:
Azure CLI
Flexible Server
Azure CLI
When the new webpage shows your list of fruits, your app is connecting to the database
using the managed identity. You should now be able to edit the fruit list as before.
Clean up resources
In the preceding steps, you created Azure resources in a resource group. If you don't
expect to need these resources in the future, delete the resource group by running the
following command in the Cloud Shell:
Azure CLI
Next steps
Learn more about running Java apps on Azure in the developer guide.
Prerequisites
An Azure subscription
A database in Azure SQL Database configured with Azure Active Directory (Azure
AD) authentication. You can create one using the Create database quickstart.
Bash-enabled shell
Node.js LTS
Visual Studio Code
Visual Studio Code App Service extension
The latest version of the Azure CLI
1. For local development connections, make sure your logical server is configured to
allow your local machine IP address and other Azure services to connect:
Select Add your client IPv4 address (xx.xx.xx.xx) to add a firewall rule that
enables connections from your local machine's IPv4 address. Alternatively, you
can select + Add a firewall rule to enter a specific IP address of your choice.
Make sure the Allow Azure services and resources to access this server
checkbox is selected.
Warning
Enabling the Allow Azure services and resources to access this server
setting is not a recommended security practice for production scenarios.
Real applications should implement more secure approaches, such as
stronger firewall restrictions or virtual network configurations.
2. The server must also have Azure AD authentication enabled with an Azure Active
Directory admin account assigned. For local development connections, the Azure
Active Directory admin account should be an account you can also log into Visual
Studio or the Azure CLI with locally. You can verify whether your server has Azure
AD authentication enabled on the Azure Active Directory page.
3. If you're using a personal Azure account, make sure you have Azure Active
Directory setup and configured for Azure SQL Database in order to assign your
account as a server admin. If you're using a corporate account, Azure Active
Directory will most likely already be configured for you.
1. Create a new directory for the project and navigate into it.
Bash
npm init -y
3. Install the required packages used in the sample code in this article:
Bash
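Based on the files this article describes (an Express server, the mssql driver, the OpenAPI explorer, and .env support), the install step can be sketched as follows; the package list is an assumption:

```
npm install express mssql swagger-ui-express yamljs dotenv
```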
4. Install the development package used in the sample code in this article:
Bash
Bash
code .
6. Open the package.json file and add the following property and value after the
name property to configure the project for ESM modules.
JSON
"type": "module",
Create Express.js application code
To create the Express.js OpenAPI application, you'll create several files:
File Description
index.js Main application file, which starts the Express.js app on port 3000.
openapi.js Express.js /api-docs route for OpenAPI explorer UI. Root redirects to this
route.
database.js Database class to handle Azure SQL CRUD operations using the mssql
npm package.
JavaScript
JavaScript
// DELETE a person by ID (route shape follows the OpenAPI spec shown later)
app.delete('/persons/:id', async (req, res) => {
  try {
    const personId = +req.params.id;
    if (!personId) {
      res.status(404);
    } else {
      const rowsAffected = await database.delete(personId);
      res.status(204).json({ rowsAffected });
    }
  } catch (err) {
    res.status(500).json({ error: err?.message });
  }
});
3. Create an openapi.js route file and add the following code for the OpenAPI UI
explorer:
JavaScript
yml
openapi: 3.0.0
info:
version: 1.0.0
title: Persons API
paths:
/persons:
get:
summary: Get all persons
responses:
'200':
description: OK
content:
application/json:
schema:
type: array
items:
$ref: '#/components/schemas/Person'
post:
summary: Create a new person
requestBody:
required: true
content:
application/json:
schema:
$ref: '#/components/schemas/Person'
responses:
'201':
description: Created
content:
application/json:
schema:
$ref: '#/components/schemas/Person'
/persons/{id}:
parameters:
- name: id
in: path
required: true
schema:
type: integer
get:
summary: Get a person by ID
responses:
'200':
description: OK
content:
application/json:
schema:
$ref: '#/components/schemas/Person'
'404':
description: Person not found
put:
summary: Update a person by ID
requestBody:
required: true
content:
application/json:
schema:
$ref: '#/components/schemas/Person'
responses:
'200':
description: OK
content:
application/json:
schema:
$ref: '#/components/schemas/Person'
'404':
description: Person not found
delete:
summary: Delete a person by ID
responses:
'204':
description: No Content
'404':
description: Person not found
components:
schemas:
Person:
type: object
properties:
id:
type: integer
readOnly: true
firstName:
type: string
lastName:
type: string
Passwordless (recommended)
1. In Visual Studio Code, create a config.js file and add the following mssql
configuration code to authenticate to Azure SQL Database.
JavaScript
text
AZURE_SQL_SERVER=<YOURSERVERNAME>.database.windows.net
AZURE_SQL_DATABASE=<YOURDATABASENAME>
AZURE_SQL_PORT=1433
AZURE_SQL_AUTHENTICATIONTYPE=azure-active-directory-default
4. Add the following to your .vscode/settings.json file to ignore environment
variables and dependencies during the zip deployment.
JSON
{
"appService.zipIgnorePattern": ["./.env*","node_modules{,/**}"]
}
JavaScript
import sql from 'mssql';

// Column names (firstName, lastName) follow the Person schema in the OpenAPI spec.
export default class Database {
  config = {};
  poolconnection = null;
  connected = false;

  constructor(config) {
    this.config = config;
    console.log(`Database: config: ${JSON.stringify(config)}`);
  }

  async connect() {
    try {
      console.log(`Database connecting...${this.connected}`);
      if (this.connected === false) {
        this.poolconnection = await sql.connect(this.config);
        this.connected = true;
        console.log('Database connection successful');
      } else {
        console.log('Database already connected');
      }
    } catch (error) {
      console.error(`Error connecting to database: ${JSON.stringify(error)}`);
    }
  }

  async disconnect() {
    try {
      this.poolconnection.close();
      console.log('Database connection closed');
    } catch (error) {
      console.error(`Error closing database connection: ${error}`);
    }
  }

  async executeQuery(query) {
    await this.connect();
    const request = this.poolconnection.request();
    const result = await request.query(query);
    return result.rowsAffected[0];
  }

  async create(data) {
    await this.connect();
    const request = this.poolconnection.request();
    request.input('firstName', sql.NVarChar(255), data.firstName);
    request.input('lastName', sql.NVarChar(255), data.lastName);
    const result = await request.query(
      `INSERT INTO Person (firstName, lastName) VALUES (@firstName, @lastName)`
    );
    return result.rowsAffected[0];
  }

  async readAll() {
    await this.connect();
    const request = this.poolconnection.request();
    const result = await request.query(`SELECT * FROM Person`);
    return result.recordsets[0];
  }

  async read(id) {
    await this.connect();
    const request = this.poolconnection.request();
    const result = await request
      .input('id', sql.Int, +id)
      .query(`SELECT * FROM Person WHERE id = @id`);
    return result.recordset[0];
  }

  async update(id, data) {
    await this.connect();
    const request = this.poolconnection.request();
    request.input('id', sql.Int, +id);
    request.input('firstName', sql.NVarChar(255), data.firstName);
    request.input('lastName', sql.NVarChar(255), data.lastName);
    const result = await request.query(
      `UPDATE Person SET firstName=@firstName, lastName=@lastName WHERE id = @id`
    );
    return result.rowsAffected[0];
  }

  async delete(id) {
    await this.connect();
    const request = this.poolconnection.request();
    const result = await request
      .input('id', sql.Int, +id)
      .query(`DELETE FROM Person WHERE id = @id`);
    return result.rowsAffected[0];
  }
}
Bash
The Person table is created in the database when you run this application.
3. On the Swagger UI page, expand the POST method and select Try it.
4. Modify the sample JSON to include values for the properties. The ID property is
ignored.
5. Select Execute to add a new record to the database. The API returns a successful
response.
6. Expand the GET method on the Swagger UI page and select Try it. Select Execute,
and the person you just created is returned.
2. Sign in to Azure, if you haven't already, by selecting the Azure: Sign In to Azure
Cloud command in the Command Palette ( Ctrl + Shift + P )
3. In Visual Studio Code's Azure Explorer window, right-click on the App Services
node and select Create New Web App (Advanced).
Prompt Value
Enter a globally unique name for the new web app. Enter a name such as azure-sql-passwordless appended with a unique string such as 123.
Select a resource group for new resources. Select +Create a new resource group, then select the default name.
Select a Linux App Service plan. Select Create new App Service plan, then select the default name.
5. Wait for the notification that your app was created before continuing.
6. In the Azure Explorer, expand the App Services node and right-click your new
app.
When the deployment finishes, the app doesn't work correctly on Azure. You still need
to configure the secure connection between the App Service and the SQL database to
retrieve your data.
Connect the App Service to Azure SQL
Database
Passwordless (recommended)
The following steps are required to connect the App Service instance to Azure SQL
Database:
Azure CLI
1. In Visual Studio Code, in the Azure explorer, right-click your App Service
and select Open in portal.
2. Navigate to the Identity page for your App Service. Under the System
assigned tab, the Status should be set to On. This value means that a
system-assigned managed identity was enabled for your app.
3. Navigate to the Configuration page for your App Service. Under the
Application Settings tab, you should see several environment variables,
which were already in the mssql configuration object.
AZURE_SQL_SERVER
AZURE_SQL_DATABASE
AZURE_SQL_PORT
AZURE_SQL_AUTHENTICATIONTYPE
The person you created locally should display in the browser. Congratulations! Your
application is now connected to Azure SQL Database in both local and hosted
environments.
Tip
If you receive a 500 Internal Server error while testing, it may be due to your
database networking configurations. Verify that your logical server is configured
with the settings outlined in the Configure the database section.
Azure portal
1. In the Azure portal search bar, search for Azure SQL and select the matching
result.
4. On the Are you sure you want to delete... page that opens, type the name
of your database to confirm, and then select Delete.
Sample code
The sample code for this application is available on GitHub .
Next steps
Tutorial: Secure a database in Azure SQL Database
Authorize database access to SQL Database
An overview of Azure SQL Database security capabilities
Azure SQL Database security best practices
Quickstart - Azure Cosmos DB for
NoSQL client library for Node.js
Article • 05/08/2023
Get started with the Azure Cosmos DB client library for JavaScript to create databases,
containers, and items within your account. Follow these steps to install the package and
try out example code for basic tasks.
Prerequisites
An Azure account with an active subscription.
No Azure subscription? You can try Azure Cosmos DB free with no credit card
required.
Node.js LTS
Azure Command-Line Interface (CLI) or Azure PowerShell
Prerequisite check
In a terminal or command window, run node --version to check that the Node.js
version is one of the current long term support (LTS) versions.
Run az --version (Azure CLI) or Get-Module -ListAvailable AzureRM (Azure
PowerShell) to check that you have the appropriate Azure command-line tools
installed.
Setting up
This section walks you through creating an Azure Cosmos account and setting up a
project that uses Azure Cosmos DB SQL API client library for JavaScript to manage
resources.
No Azure subscription? You can try Azure Cosmos DB free with no credit card
required. If you create an account using the free trial, you can safely skip ahead to
the Create a new JavaScript project section.
This quickstart will create a single Azure Cosmos DB account using the API for NoSQL.
Portal
Tip
For this quickstart, we recommend using the resource group name msdocs-
cosmos-quickstart-rg .
2. From the Azure portal menu or the Home page, select Create a resource.
3. On the New page, search for and select Azure Cosmos DB.
4. On the Select API option page, select the Create option within the NoSQL
section. Azure Cosmos DB has six APIs: NoSQL, MongoDB, PostgreSQL,
Apache Cassandra, Apache Gremlin, and Table. Learn more about the API for
NoSQL.
5. On the Create Azure Cosmos DB Account page, enter the following
information:
Setting Value Description
Subscription Subscription name Select the Azure subscription that you wish to use for this Azure Cosmos account.
Location The region closest to your users Select a geographic location to host your Azure Cosmos DB account. Use the location that is closest to your users to give them the fastest access to the data.
Apply Azure Cosmos DB free tier discount Apply or Do not apply Enable Azure Cosmos DB free tier. With Azure Cosmos DB free tier, you'll get the first 1000 RU/s and 25 GB of storage for free in an account. Learn more about free tier.
Note
You can have up to one free tier Azure Cosmos DB account per Azure
subscription and must opt-in when creating the account. If you do not
see the option to apply the free tier discount, this means another account
in the subscription has already been enabled with free tier.
6. Select Review + create.
7. Review the settings you provide, and then select Create. It takes a few minutes
to create the account. Wait for the portal page to display Your deployment is
complete before moving on.
9. From the API for NoSQL account page, select the Keys navigation menu
option.
10. Record the values from the URI and PRIMARY KEY fields. You'll use these
values in a later step.
Bash
npm init -y
2. Edit the package.json file to use ES6 modules by adding the "type": "module",
entry. This setting allows your code to use modern async/await syntax.
JSON
{
"name": "azure-cosmos-db-sql-api-quickstart",
"version": "1.0.0",
"description": "Azure Cosmos DB for Core (SQL) quickstart with
JavaScript",
"main": "index",
"type": "module",
"scripts": {
"start": "node index.js",
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [
"cosmos",
"quickstart"
],
"author": "diberry",
"license": "MIT",
"dependencies": {
"@azure/cosmos": "^3.17.0",
"dotenv": "^16.0.2"
}
}
Install packages
Passwordless (Recommended)
Bash
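For the passwordless option, install the Cosmos DB client library together with the Azure identity library; a sketch:

```
npm install @azure/cosmos @azure/identity
```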
2. Add the dotenv npm package to read environment variables from a .env
file.
Bash
Windows
PowerShell
$env:COSMOS_ENDPOINT = "<cosmos-account-URI>"
$env:COSMOS_KEY = "<cosmos-account-PRIMARY-KEY>"
Object model
Before you start building the application, let's look at the hierarchy of resources in
Azure Cosmos DB. Azure Cosmos DB has a specific object model used to create and
access resources, organized in a hierarchy that consists of accounts, databases,
containers, and items.
(Hierarchy diagram: an account contains one or more databases, each database contains containers, and each container holds items.)
For more information about the hierarchy of different resources, see working with
databases, containers, and items in Azure Cosmos DB.
You'll use the following JavaScript classes to interact with these resources:
CosmosClient - This class provides a client-side logical representation for the Azure
Cosmos DB service. The client object is used to configure and execute requests
against the service.
Database - This class is a reference to a database that may, or may not, exist in the
service yet. The database is validated server-side when you attempt to access it or
perform an operation against it.
Container - This class is a reference to a container that also may not exist in the
service yet. The container is validated server-side when you attempt to work with
it.
SqlQuerySpec - This interface represents a SQL query and any query parameters.
QueryIterator<> - This class represents an iterator that can track the current page
of results and get a new page of results.
FeedResponse<> - This class represents a single page of responses from the
iterator.
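To make the chaining of these classes concrete, here is a minimal sketch with a mock client. The method names database() and container() mirror the real SDK, but the returned objects are plain stand-ins, not SDK classes, and the path format is shown only for illustration:

```javascript
// How the object model chains: client -> database -> container.
// The mock mirrors the real method names (database, container), but the
// references it returns are plain objects, not @azure/cosmos classes.
function makeMockCosmosClient() {
  return {
    database(databaseId) {
      return {
        id: databaseId,
        container(containerId) {
          // Like the SDK, this is only a reference; nothing is
          // validated until an operation is attempted server-side.
          return {
            id: containerId,
            path: `/dbs/${databaseId}/colls/${containerId}`,
          };
        },
      };
    },
  };
}

const client = makeMockCosmosClient();
const container = client.database("cosmicworks").container("products");
console.log(container.path);
```

The key point the mock illustrates is that Database and Container are lazy references: creating them is cheap and local, and existence is only checked when you perform an operation.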
Code examples
Authenticate the client
Create a database
Create a container
Create an item
Get an item
Query items
The sample code described in this article creates a database named cosmicworks with a
container named products . The products container is designed to contain product details
such as name, category, quantity, and a sale indicator. Each product also contains a
unique identifier.
For this sample code, the container uses the category ID as a logical partition key.
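To illustrate what a logical partition key does, the following standalone sketch groups items by a categoryId field, the way Cosmos DB groups items into logical partitions. The sample IDs are made up and no SDK calls are involved:

```javascript
// Group items by their partition key value (categoryId) to show which
// logical partition each item would land in. Plain JavaScript; no SDK calls.
function groupByPartitionKey(items, keyField) {
  const partitions = new Map();
  for (const item of items) {
    const key = item[keyField];
    if (!partitions.has(key)) partitions.set(key, []);
    partitions.get(key).push(item.id);
  }
  return partitions;
}

// Two touring bikes share a categoryId, so they share a logical partition;
// the mountain bike has its own.
const sample = [
  { id: "t-50", categoryId: "touring" },
  { id: "t-46", categoryId: "touring" },
  { id: "m-42", categoryId: "mountain" },
];
const partitions = groupByPartitionKey(sample, "categoryId");
console.log(partitions.get("touring"));
```

Items that share a partition key value live in the same logical partition, which is why point reads and efficient queries need both the id and the partition key value.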
You can also authorize requests to Azure services using passwords, connection strings,
or other credentials directly. However, this approach should be used with caution.
Developers must be diligent to never expose these secrets in an unsecured location.
Anyone who gains access to the password or secret key is able to authenticate.
DefaultAzureCredential offers improved management and security benefits over account
keys by enabling passwordless authentication. Both options are demonstrated in the
following example.
Passwordless (Recommended)
For example, your app can authenticate using your Visual Studio sign-in credentials
when developing locally, and then use a managed identity once it has been
deployed to Azure. No code changes are required for this transition.
When developing locally with passwordless authentication, make sure the user
account that connects to Cosmos DB is assigned a role with the correct permissions
to perform data operations. Currently, Azure Cosmos DB for NoSQL doesn't include
built-in roles for data operations, but you can create your own using the Azure CLI
or PowerShell.
Azure CLI
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/item
s/*",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
]
}]
}'
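The role-definition JSON above is truncated. Assuming the standard body shape accepted by az cosmosdb sql role definition create, a complete definition looks roughly like the following (the role name is a placeholder):

```json
{
  "RoleName": "MyReadWriteRole",
  "Type": "CustomRole",
  "AssignableScopes": ["/"],
  "Permissions": [{
    "DataActions": [
      "Microsoft.DocumentDB/databaseAccounts/readMetadata",
      "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/*",
      "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
    ]
  }]
}
```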
2. When the command completes, copy the ID value from the name field and
paste it somewhere for later use.
3. Assign the role you created to the user account or service principal that will
connect to Cosmos DB. During local development, this will generally be your
own account that's logged into a development tool like Visual Studio or the
Azure CLI. Retrieve the details of your account using the az ad user
command.
Azure CLI
4. Copy the value of the id property out of the results and paste it somewhere
for later use.
5. Assign the custom role you created to your user account using the az
cosmosdb sql role assignment create command and the IDs you copied
previously.
Azure CLI
az cosmosdb sql role assignment create \
--account-name <cosmosdb-account-name> \
--resource-group <resource-group-name> \
--scope "/" \
--principal-id <your-user-id> \
--role-definition-id <your-custom-role-id>
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
From the project directory, open the index.js file. In your editor, add npm packages
to work with Cosmos DB and authenticate to Azure. You'll authenticate to Cosmos
DB for NoSQL using DefaultAzureCredential from the @azure/identity package.
DefaultAzureCredential automatically discovers and uses the account you
signed in with previously.
JavaScript
JavaScript
JavaScript
Create a new client instance of the CosmosClient class constructor with the
DefaultAzureCredential object and the endpoint.
JavaScript
Create a database
Passwordless (Recommended)
The @azure/cosmos client library enables you to perform data operations using
Azure RBAC. However, to authenticate management operations, such as creating
and deleting databases, you must use RBAC through one of the following options:
The Azure CLI approach is used for this quickstart and passwordless access. Use
the az cosmosdb sql database create command to create a Cosmos DB for NoSQL
database.
Azure CLI
The command line to create a database is for PowerShell, shown on multiple lines
for clarity. For other shell types, change the line continuation characters as
appropriate. For example, for Bash, use backslash ("\"). Or, remove the continuation
characters and enter the command on one line.
Create a container
Passwordless (Recommended)
The Azure CLI approach is used in this example. Use the az cosmosdb sql container
create command to create a Cosmos DB container.
Azure CLI
The command line to create a container is for PowerShell, on multiple lines for
clarity. For other shell types, change the line continuation characters as appropriate.
For example, for Bash, use backslash ("\"). Or, remove the continuation characters
and enter the command on one line. For Bash, you'll also need to add
MSYS_NO_PATHCONV=1 before the command so that Bash deals with the partition key
parameter correctly.
After the resources have been created, use classes from the
@azure/cosmos client library to connect to and query the database.
Create an item
Add the following code to provide your data set. Each product has a unique ID, a name,
a category ID (used as the partition key), and other fields.
JavaScript
// Data items
const items = [
  {
    "id": "08225A9E-F2B3-4FA3-AB08-8C70ADD6C3C2",
    "categoryId": "75BF1ACB-168D-469C-9AA3-1FD26BB4EA4C",
    "categoryName": "Bikes, Touring Bikes",
    "sku": "BK-T79U-50",
    "name": "Touring-1000 Blue, 50",
    "description": "The product called \"Touring-1000 Blue, 50\"",
    "price": 2384.0700000000002,
    "tags": [
      {
        "_id": "27B7F8D5-1009-45B8-88F5-41008A0F0393",
        "name": "Tag-61"
      }
    ]
  },
  {
    "id": "2C981511-AC73-4A65-9DA3-A0577E386394",
    "categoryId": "75BF1ACB-168D-469C-9AA3-1FD26BB4EA4C",
    "categoryName": "Bikes, Touring Bikes",
    "sku": "BK-T79U-46",
    "name": "Touring-1000 Blue, 46",
    "description": "The product called \"Touring-1000 Blue, 46\"",
    "price": 2384.0700000000002,
    "tags": [
      {
        "_id": "4E102F3F-7D57-4CD7-88F4-AC5076A42C59",
        "name": "Tag-91"
      }
    ]
  },
  {
    "id": "0F124781-C991-48A9-ACF2-249771D44029",
    "categoryId": "56400CF3-446D-4C3F-B9B2-68286DA3BB99",
    "categoryName": "Bikes, Mountain Bikes",
    "sku": "BK-M68B-42",
    "name": "Mountain-200 Black, 42",
    "description": "The product called \"Mountain-200 Black, 42\"",
    "price": 2294.9899999999998,
    "tags": [
      {
        "_id": "4F67013C-3B5E-4A3D-B4B0-8C597A491EB6",
        "name": "Tag-82"
      }
    ]
  }
];
JavaScript
Get an item
In Azure Cosmos DB, you can perform a point read operation by using both the unique
identifier ( id ) and partition key fields. In the SDK, call container.item(id, partitionKey).read(),
passing in both values, to return an item.
The partition key is specific to a container. In this Contoso Products container, the
category ID, categoryId , is used as the partition key.
JavaScript
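The SDK call itself is elided above; its shape can be sketched with a mock container standing in for the @azure/cosmos Container class. The mock and its data are illustrative only:

```javascript
// Point read: container.item(id, partitionKeyValue).read()
// A tiny mock container stands in for the @azure/cosmos Container class
// so the call shape can be shown without a live account.
const mockContainer = {
  _store: {
    "0F124781-C991-48A9-ACF2-249771D44029|56400CF3-446D-4C3F-B9B2-68286DA3BB99": {
      id: "0F124781-C991-48A9-ACF2-249771D44029",
      name: "Mountain-200 Black, 42",
    },
  },
  item(id, partitionKey) {
    const store = this._store;
    return {
      async read() {
        // The real SDK returns { resource, statusCode, ... }.
        return { resource: store[`${id}|${partitionKey}`] };
      },
    };
  },
};

async function pointRead() {
  // Both the id and the partition key value are required for a point read.
  const { resource: product } = await mockContainer
    .item(
      "0F124781-C991-48A9-ACF2-249771D44029",
      "56400CF3-446D-4C3F-B9B2-68286DA3BB99"
    )
    .read();
  return product;
}
```

A point read is the cheapest way to fetch a single item precisely because the service can route directly to one logical partition using the key pair.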
Query items
Add the following code to query for all items that match a specific filter. Create a
parameterized query expression, then call the container.items.query method. This
method returns a QueryIterator that manages the pages of results. Use a
combination of while and for loops: fetchNext returns each page of results as a
FeedResponse, and you then iterate over the individual data objects.
The query is programmatically composed to SELECT * FROM products p WHERE
p.categoryName = 'Bikes, Touring Bikes' .
JavaScript
// Get items
const { resources } = await container.items.query(querySpec).fetchAll();
If you want to use an object returned from the FeedResponse as an item, read it back
as an item by using the container.item().read method.
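The while/for paging pattern described above can be sketched without a live account. The makeMockIterator helper below mimics QueryIterator.fetchNext and is purely illustrative:

```javascript
// Page through results the way the SDK's QueryIterator does:
// an outer while loop fetches pages, an inner for loop walks each page.
// makeMockIterator mimics QueryIterator.fetchNext for illustration only.
function makeMockIterator(pages) {
  let i = 0;
  return {
    async fetchNext() {
      const resources = pages[i] ?? [];
      i += 1;
      // Each call returns one "FeedResponse": a page plus a continuation flag.
      return { resources, hasMoreResults: i < pages.length };
    },
  };
}

async function collectAll(iterator) {
  const names = [];
  let hasMoreResults = true;
  while (hasMoreResults) {
    const response = await iterator.fetchNext(); // one page of results
    hasMoreResults = response.hasMoreResults;
    for (const item of response.resources) {
      names.push(item.name);
    }
  }
  return names;
}

const iterator = makeMockIterator([
  [{ name: "Touring-1000 Blue, 50" }, { name: "Touring-1000 Blue, 46" }],
  [{ name: "Mountain-200 Black, 42" }],
]);
```

The same loop shape works against the real iterator returned by container.items.query; fetchAll, shown in the snippet above, is a convenience that drains every page for you.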
Delete an item
Add the following code to delete an item you need to use the ID and partition key to
get the item, then delete it. This example uses the Container.Item.delete method to
delete the item.
JavaScript
// Delete item
const { statusCode } = await container.item(items[2].id, items[2].categoryId).delete();
console.log(`${items[2].id} ${statusCode === 204 ? `Item deleted` : `Item not deleted`}`);
To run the app, use a terminal to navigate to the application directory and run the
application.
Bash
node index.js
Output
Clean up resources
When you no longer need the API for NoSQL account, you can delete the corresponding
resource group.
Portal
1. Navigate to the resource group you previously created in the Azure portal.
Tip
3. On the Are you sure you want to delete dialog, enter the name of the
resource group, and then select Delete.
Next steps
In this quickstart, you learned how to create an Azure Cosmos DB SQL API account,
create a database, and create a container using the JavaScript SDK. You can now dive
deeper into the SDK to import more data, perform complex queries, and manage your
Azure Cosmos DB SQL API resources.
This quickstart shows how to send events to and receive events from an event hub using
the @azure/event-hubs npm package.
Prerequisites
If you are new to Azure Event Hubs, see Event Hubs overview before you do this
quickstart.
Microsoft Azure subscription. To use Azure services, including Azure Event Hubs,
you need a subscription. If you don't have an existing Azure account, you can sign
up for a free trial or use your MSDN subscriber benefits when you create an
account .
Node.js LTS. Download the latest long-term support (LTS) version .
Visual Studio Code (recommended) or any other integrated development
environment (IDE).
Create an Event Hubs namespace and an event hub. The first step is to use the
Azure portal to create a namespace of type Event Hubs, and obtain the
management credentials your application needs to communicate with the event
hub. To create a namespace and an event hub, follow the procedure in this article.
Passwordless (Recommended)
shell
Passwordless (Recommended)
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. Learn more
about the available scopes for role assignments on the scope overview page.
The following example assigns the Azure Event Hubs Data Owner role to your user
account, which provides full access to Azure Event Hubs resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
If you want to create a custom role, see Rights required for Event Hubs operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your Event Hubs namespace using the main
search bar or left navigation.
2. On the overview page, select Access control (IAM) from the left-hand
menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Azure Event Hubs Data Owner and select the matching
result. Then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Send events
In this section, you create a JavaScript application that sends events to an event hub.
2. Create a file called send.js, and paste the following code into it:
Passwordless (Recommended)
JavaScript
// Event hubs
const eventHubsResourceName = "EVENT HUBS RESOURCE NAME";
const fullyQualifiedNamespace = `${eventHubsResourceName}.servicebus.windows.net`;
const eventHubName = "EVENT HUB NAME";

main().catch((err) => {
  console.log("Error occurred: ", err);
});
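The body of main() is elided in the listing above. In the real sample it creates a producer, fills a batch, and sends it; the sketch below shows that flow against a mock whose createBatch/tryAdd/sendBatch methods mirror EventHubProducerClient but touch no network:

```javascript
// Batch-and-send flow, with a mock standing in for EventHubProducerClient.
// createBatch / tryAdd / sendBatch mirror the real client's method names;
// the mock itself is illustrative only.
function makeMockProducer() {
  const sent = [];
  return {
    sent,
    async createBatch() {
      const events = [];
      return {
        events,
        tryAdd(event) {
          events.push(event);
          return true; // real batches return false when the batch is full
        },
      };
    },
    async sendBatch(batch) {
      sent.push(...batch.events);
    },
    async close() {},
  };
}

async function main(producer) {
  const batch = await producer.createBatch();
  for (const body of ["First event", "Second event", "Third event"]) {
    batch.tryAdd({ body });
  }
  await producer.sendBatch(batch);
  await producer.close();
  return producer.sent.length;
}

const producer = makeMockProducer();
```

With the real client, you would construct EventHubProducerClient with the fully qualified namespace, the event hub name, and a DefaultAzureCredential, then call the same three methods.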
3. Run node send.js to execute this file. This command sends a batch of three events
to your event hub.
4. In the Azure portal, verify that the event hub has received the messages. Refresh
the page to update the chart. It might take a few seconds for it to show that the
messages have been received.
Note
Receive events
In this section, you receive events from an event hub by using an Azure Blob storage
checkpoint store in a JavaScript application. The application checkpoints metadata
about received messages at regular intervals to an Azure Storage blob. This approach
makes it easy to continue receiving messages later from where you left off.
Follow these recommendations when using Azure Blob Storage as a checkpoint store:
Use a separate container for each consumer group. You can use the same storage
account, but use one container per group.
Don't use the container for anything else, and don't use the storage account for
anything else.
The storage account should be in the same region as the deployed application. If
the application is on-premises, try to choose the closest region possible.
On the Storage account page in the Azure portal, in the Blob service section, ensure
that the following settings are disabled.
Hierarchical namespace
Blob soft delete
Versioning
Passwordless (Recommended)
When developing locally, make sure that the user account that is accessing blob
data has the correct permissions. You'll need Storage Blob Data Contributor to
read and write blob data. To assign yourself this role, you'll need to be assigned the
User Access Administrator role, or another role that includes the
Microsoft.Authorization/roleAssignments/write action. You can assign Azure RBAC
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn
more about the available scopes for role assignments on the scope overview page.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Blob Data Contributor role to your
user account, which provides both read and write access to blob data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Blob Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
shell
2. Create a file called receive.js, and paste the following code into it:
Passwordless (Recommended)
JavaScript
// Event hubs
const eventHubsResourceName = "EVENT HUBS RESOURCE NAME";
const fullyQualifiedNamespace = `${eventHubsResourceName}.servicebus.windows.net`;
const eventHubName = "EVENT HUB NAME";
const consumerGroup = "$Default"; // name of the default consumer group

// Azure Storage
const storageAccountName = "STORAGE ACCOUNT NAME";
const storageContainerName = "STORAGE CONTAINER NAME";
const baseUrl = `https://${storageAccountName}.blob.core.windows.net`;

main().catch((err) => {
  console.log("Error occurred: ", err);
});
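The subscription handlers are elided in the listing above. A processEvents handler typically checkpoints the last event of each batch so a restart resumes after it; the sketch below shows that shape with a mock context standing in for the SDK's PartitionContext:

```javascript
// Shape of an Event Hubs processEvents handler that checkpoints the last
// event in each batch. The mock context stands in for PartitionContext;
// updateCheckpoint mirrors the real method name.
const checkpoints = [];
const mockContext = {
  partitionId: "0",
  async updateCheckpoint(event) {
    checkpoints.push(event.sequenceNumber);
  },
};

async function processEvents(events, context) {
  for (const event of events) {
    // Process each event here (the real sample just logs the body).
    console.log(`Received event: ${event.body}`);
  }
  if (events.length > 0) {
    // Checkpoint only the last event so a restart resumes after it.
    await context.updateCheckpoint(events[events.length - 1]);
  }
}
```

In the real application this handler is passed to consumerClient.subscribe, and the checkpoint is persisted to the blob container configured above rather than to an in-memory array.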
3. Run node receive.js in a command prompt to execute this file. The window
should display messages about received events.
Note
Congratulations! You have now received events from your event hub. The receiver
program will receive events from all the partitions of the default consumer group in the
event hub.
Next steps
Check out these samples on GitHub:
JavaScript samples
TypeScript samples
Quickstart: Azure Key Vault certificate
client library for JavaScript
Article • 02/26/2023
Get started with the Azure Key Vault certificate client library for JavaScript. Azure Key
Vault is a cloud service that provides a secure store for certificates. You can securely
store keys, passwords, certificates, and other secrets. Azure key vaults may be created
and managed through the Azure portal. In this quickstart, you learn how to create,
retrieve, and delete certificates from an Azure key vault using the JavaScript client library.
Prerequisites
An Azure subscription - create one for free .
Current Node.js LTS .
Azure CLI
An existing Key Vault - you can create one using:
Azure CLI
Azure portal
Azure PowerShell
Sign in to Azure
1. Run the login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-in
page.
terminal
terminal
npm init -y
terminal
terminal
Azure CLI
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Code example
This code uses the following Key Vault Certificate classes and methods:
DefaultAzureCredential class
CertificateClient class
beginCreateCertificate
getCertificate
getCertificateVersion
updateCertificateProperties
updateCertificatePolicy
beginDeleteCertificate
PollerLike interface
getResult
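beginDeleteCertificate is a long-running operation: it returns a poller, and pollUntilDone resolves once the service finishes. The sketch below illustrates that pattern with a mock poller; it is not the SDK's PollerLike implementation, and the result value is made up:

```javascript
// Long-running-operation pattern used by beginDeleteCertificate:
// the begin* call returns a poller; pollUntilDone resolves with the result.
// makeMockPoller is illustrative, not the SDK's PollerLike.
function makeMockPoller(finalResult, pollsNeeded) {
  let polls = 0;
  return {
    isDone: () => polls >= pollsNeeded,
    async poll() {
      polls += 1; // the real poller waits between service round-trips
    },
    async pollUntilDone() {
      while (!this.isDone()) {
        await this.poll();
      }
      return finalResult;
    },
    // getResult returns undefined until the operation completes.
    getResult: () => (polls >= pollsNeeded ? finalResult : undefined),
  };
}

async function deleteFlow() {
  const deletePoller = makeMockPoller({ recoveryId: "mock-recovery-id" }, 3);
  const deletedCertificate = await deletePoller.pollUntilDone();
  return deletedCertificate.recoveryId;
}
```

The same poller shape applies to the other begin* methods in this quickstart, such as beginCreateCertificate.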
JavaScript
// delete certificate
const deletePoller = await client.beginDeleteCertificate(certificateName);
const deletedCertificate = await deletePoller.pollUntilDone();
console.log("Recovery Id: ", deletedCertificate.recoveryId);
console.log("Deleted Date: ", deletedCertificate.deletedOn);
console.log("Scheduled Purge Date: ", deletedCertificate.scheduledPurgeDate);
}

main().catch((error) => {
  console.error("An error occurred:", error);
  process.exit(1);
});
terminal
node index.js
2. The create and get methods return a full JSON object for the certificate:
JSON
{
  "keyId": undefined,
  "secretId": undefined,
  "name": "YOUR-CERTIFICATE-NAME",
  "policy": {
    "reuseKey": false,
    "keyCurveName": undefined,
    "exportable": true,
    "issuerName": 'Self',
    "certificateType": undefined,
    "certificateTransparency": undefined
  },
  "properties": {
    "createdOn": 2021-11-29T20:17:45.000Z,
    "updatedOn": 2021-11-29T20:17:45.000Z,
    "expiresOn": 2022-11-29T20:17:45.000Z,
    "id": "https://YOUR-KEY-VAULT-NAME.vault.azure.net/certificates/YOUR-CERTIFICATE-NAME/YOUR-CERTIFICATE-VERSION",
    "enabled": false,
    "notBefore": 2021-11-29T20:07:45.000Z,
    "recoveryLevel": "Recoverable+Purgeable",
    "name": "YOUR-CERTIFICATE-NAME",
    "vaultUrl": "https://YOUR-KEY-VAULT-NAME.vault.azure.net",
    "version": "YOUR-CERTIFICATE-VERSION",
    "tags": undefined,
    "x509Thumbprint": undefined,
    "recoverableDays": 90
  }
}
Next steps
In this quickstart, you created a key vault, stored a certificate, and retrieved that
certificate. To learn more about Key Vault and how to integrate it with your applications,
continue on to these articles.
Get started with the Azure Key Vault key client library for JavaScript. Azure Key Vault is a
cloud service that provides a secure store for cryptographic keys. You can securely store
keys, passwords, certificates, and other secrets. Azure key vaults may be created and
managed through the Azure portal. In this quickstart, you learn how to create, retrieve,
and delete keys from an Azure key vault using the JavaScript key client library.
Prerequisites
An Azure subscription - create one for free .
Current Node.js LTS .
Azure CLI
An existing Key Vault - you can create one using:
Azure CLI
Azure portal
Azure PowerShell
Sign in to Azure
1. Run the login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-in
page.
terminal
terminal
npm init -y
terminal
terminal
Azure CLI
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Code example
The code samples below show you how to create a client, create a key, retrieve a
key, and delete a key.
This code uses the following Key Vault key classes and methods:
DefaultAzureCredential class
KeyClient class
createKey
createEcKey
createRsaKey
getKey
listPropertiesOfKeys
updateKeyProperties
beginDeleteKey
getDeletedKey
purgeDeletedKey
JavaScript
// Delete the key - the key is soft-deleted but not yet purged
const deletePoller = await client.beginDeleteKey(keyName);
await deletePoller.pollUntilDone();

main().catch((error) => {
  console.error("An error occurred:", error);
  process.exit(1);
});
Run the sample application
1. Run the app:
terminal
node index.js
2. The create and get methods return a full JSON object for the key:
JSON
"key": {
"key": {
"kid": "https://YOUR-KEY-VAULT-NAME.vault.azure.net/keys/YOUR-KEY-
NAME/YOUR-KEY-VERSION",
"kty": "YOUR-KEY-TYPE",
"keyOps": [ ARRAY-OF-VALID-OPERATIONS ],
... other properties based on key type
},
"id": "https://YOUR-KEY-VAULT-NAME.vault.azure.net/keys/YOUR-KEY-
NAME/YOUR-KEY-VERSION",
"name": "YOUR-KEY-NAME",
"keyOperations": [ ARRAY-OF-VALID-OPERATIONS ],
"keyType": "YOUR-KEY-TYPE",
"properties": {
"tags": undefined,
"enabled": true,
"notBefore": undefined,
"expiresOn": undefined,
"createdOn": 2021-11-29T18:29:11.000Z,
"updatedOn": 2021-11-29T18:29:11.000Z,
"recoverableDays": 90,
"recoveryLevel": "Recoverable+Purgeable",
"exportable": undefined,
"releasePolicy": undefined,
"vaultUrl": "https://YOUR-KEY-VAULT-NAME.vault.azure.net",
"version": "YOUR-KEY-VERSION",
"name": "YOUR-KEY-VAULT-NAME",
"managed": undefined,
"id": "https://YOUR-KEY-VAULT-NAME.vault.azure.net/keys/YOUR-KEY-
NAME/YOUR-KEY-VERSION"
}
}
Next steps
In this quickstart, you created a key vault, stored a key, and retrieved that key. To learn
more about Key Vault and how to integrate it with your applications, continue on to
these articles.
Get started with the Azure Key Vault secret client library for JavaScript. Azure Key Vault is
a cloud service that provides a secure store for secrets. You can securely store keys,
passwords, certificates, and other secrets. Azure key vaults may be created and
managed through the Azure portal. In this quickstart, you learn how to create, retrieve,
and delete secrets from an Azure key vault using the JavaScript client library.
Prerequisites
An Azure subscription - create one for free .
Current Node.js LTS .
Azure CLI
An existing Key Vault - you can create one using:
Azure CLI
Azure portal
Azure PowerShell
Sign in to Azure
1. Run the login command.
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-in
page.
terminal
terminal
npm init -y
terminal
terminal
Azure CLI
Windows
set KEY_VAULT_NAME=<your-key-vault-name>
Code example
The code samples below show you how to create a client, set a secret, retrieve a
secret, and delete a secret.
This code uses the following Key Vault Secret classes and methods:
DefaultAzureCredential
SecretClient class
setSecret
getSecret
updateSecretProperties
beginDeleteSecret
JavaScript
// Create a secret
// The secret can be a string of any kind. For example,
// a multiline text block such as an RSA private key with newline characters,
// or a stringified JSON object, like `JSON.stringify({ mySecret: 'MySecretValue' })`.
const uniqueString = new Date().getTime();
const secretName = `secret${uniqueString}`;
const result = await client.setSecret(secretName, "MySecretValue");
console.log("result: ", result);

main().catch((error) => {
  console.error("An error occurred:", error);
  process.exit(1);
});
terminal
node index.js
2. The create and get methods return a full JSON object for the secret:
JSON
{
  "value": "MySecretValue",
  "name": "secret1637692472606",
  "properties": {
    "createdOn": "2021-11-23T18:34:33.000Z",
    "updatedOn": "2021-11-23T18:34:33.000Z",
    "enabled": true,
    "recoverableDays": 90,
    "recoveryLevel": "Recoverable+Purgeable",
    "id": "https://YOUR-KEYVAULT-NAME.vault.azure.net/secrets/secret1637692472606/YOUR-VERSION",
    "vaultUrl": "https://YOUR-KEYVAULT-NAME.vault.azure.net",
    "version": "YOUR-VERSION",
    "name": "secret1637692472606"
  }
}
JSON
"createdOn": "2021-11-23T18:34:33.000Z",
"updatedOn": "2021-11-23T18:34:33.000Z",
"enabled": true,
"recoverableDays": 90,
"recoveryLevel": "Recoverable+Purgeable",
"id": "https: //YOUR-KEYVAULT-
NAME.vault.azure.net/secrets/secret1637692472606/YOUR-VERSION",
"vaultUrl": "https: //YOUR-KEYVAULT-NAME.vault.azure.net",
"version": "YOUR-VERSION",
"name": "secret1637692472606"
Next steps
In this quickstart, you created a key vault, stored a secret, and retrieved that secret. To
learn more about Key Vault and how to integrate it with your applications, continue on
to the articles below.
Note
This quickstart provides step-by-step instructions for a simple scenario of sending
messages to a Service Bus queue and receiving them. You can find pre-built
JavaScript and TypeScript samples for Azure Service Bus in the Azure SDK for
JavaScript repository on GitHub .
Prerequisites
If you're new to the service, see Service Bus overview before you do this quickstart.
An Azure subscription. To complete this tutorial, you need an Azure account. You
can activate your MSDN subscriber benefits or sign up for a free account .
Node.js LTS
Passwordless
To use this quickstart with your own Azure account, you need:
Use the same account when you add the appropriate data role to your
resource.
Run the code in the same terminal or command prompt.
Note down your queue name for your Service Bus namespace. You'll need that
in the code.
Note
This tutorial works with samples that you can copy and run using Node.js. For
instructions on how to create a Node.js application, see Create and deploy a
Node.js application to an Azure Website, or Node.js cloud service using Windows
PowerShell.
To create a namespace:
2. In the left navigation pane of the portal, select All services, select Integration from
the list of categories, hover the mouse over Service Bus, and then select Create on
the Service Bus tile.
3. In the Basics tab of the Create namespace page, follow these steps:
b. For Resource group, choose an existing resource group in which the namespace
will live, or create a new one.
c. Enter a name for the namespace. The namespace name should adhere to the
following naming conventions:
The name must be unique across Azure. The system immediately checks to
see if the name is available.
The name length is at least 6 and at most 50 characters.
The name can contain only letters, numbers, and hyphens ("-").
The name must start with a letter and end with a letter or number.
The name can't end with "-sb" or "-mgmt".
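Those rules can be captured in a small validation function. This is a sketch of the documented constraints only; uniqueness across Azure can't be checked locally, so the portal's availability check remains authoritative:

```javascript
// Validate a Service Bus namespace name against the documented rules:
// 6-50 characters; letters, numbers, hyphens only; starts with a letter;
// ends with a letter or number; doesn't end with "-sb" or "-mgmt".
function isValidNamespaceName(name) {
  if (name.length < 6 || name.length > 50) return false;
  if (!/^[A-Za-z][A-Za-z0-9-]*[A-Za-z0-9]$/.test(name)) return false;
  if (name.endsWith("-sb") || name.endsWith("-mgmt")) return false;
  return true;
}

console.log(isValidNamespaceName("my-servicebus-ns")); // meets all the rules
console.log(isValidNamespaceName("1badname"));         // must start with a letter
```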
d. For Location, choose the region in which your namespace should be hosted.
e. For Pricing tier, select the pricing tier (Basic, Standard, or Premium) for the
namespace. For this quickstart, select Standard.
Important
If you want to use topics and subscriptions, choose either Standard or
Premium. Topics/subscriptions aren't supported in the Basic pricing tier.
If you selected the Premium pricing tier, specify the number of messaging
units. The premium tier provides resource isolation at the CPU and memory
level so that each workload runs in isolation. This resource container is called a
messaging unit. A premium namespace has at least one messaging unit. You
can select 1, 2, 4, 8 or 16 messaging units for each Service Bus Premium
namespace. For more information, see Service Bus Premium Messaging.
3. Enter a name for the queue, and leave the other values with their defaults.
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to a Service Bus namespace. You don't
need to worry about having hard-coded connection strings in your code, in a
configuration file, or in secure storage like Azure Key Vault.
The second option shows you how to use a connection string to connect to a Service
Bus namespace. If you are new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless (Recommended)
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. Learn more
about the available scopes for role assignments on the scope overview page.
The following example assigns the Azure Service Bus Data Owner role to your user
account, which provides full access to Azure Service Bus resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Service Bus Data Owner: Enables data access to a Service Bus namespace
and its entities (queues, topics, subscriptions, and filters). A member of this
role can send and receive messages from queues or topics/subscriptions.
Azure Service Bus Data Sender: Use this role to give send access to a Service
Bus namespace and its entities.
Azure Service Bus Data Receiver: Use this role to give receive access to a
Service Bus namespace and its entities.
If you want to create a custom role, see Rights required for Service Bus operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
1. If you don't have the Service Bus Namespace page open in the Azure portal,
locate your Service Bus namespace using the main search bar or left
navigation.
2. On the overview page, select Access control (IAM) from the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result. Then
choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Use Node Package Manager (NPM) to install
the package
Passwordless
1. To install the required npm package(s) for Service Bus, open a command
prompt that has npm in its path, change the directory to the folder where you
want to have your samples and then run this command.
Bash
Passwordless
You must have signed in with the Azure CLI's az login in order for your local
machine to provide the passwordless authentication required in this code.
2. Create a file called send.js and paste the below code into it. This code sends
the names of scientists as messages to your queue.
JavaScript
const { ServiceBusClient } = require("@azure/service-bus");
const { DefaultAzureCredential } = require("@azure/identity");

// Passwordless credential
const credential = new DefaultAzureCredential();

// fully qualified namespace and name of the queue
const fullyQualifiedNamespace = "<SERVICE BUS NAMESPACE>.servicebus.windows.net";
const queueName = "<QUEUE NAME>";

const messages = [
  { body: "Albert Einstein" },
  { body: "Werner Heisenberg" },
  { body: "Marie Curie" },
  { body: "Stephen Hawking" },
  { body: "Isaac Newton" },
  { body: "Niels Bohr" },
  { body: "Michael Faraday" },
  { body: "Galileo Galilei" },
  { body: "Johannes Kepler" },
  { body: "Nikolaus Kopernikus" }
];

async function main() {
  const sbClient = new ServiceBusClient(fullyQualifiedNamespace, credential);
  const sender = sbClient.createSender(queueName);
  try {
    // Tries to send all messages in a single batch.
    // Will fail if the messages cannot fit in a batch.
    await sender.sendMessages(messages);
    console.log(`Sent a batch of messages to the queue: ${queueName}`);
    await sender.close();
  } finally {
    await sbClient.close();
  }
}

main().catch((err) => {
  console.log("Error occurred: ", err);
  process.exit(1);
});
Console
node send.js
You must have signed in with the Azure CLI's az login in order for your local
machine to provide the passwordless authentication required in this code.
1. Open your favorite editor, such as Visual Studio Code.
2. Create a file called receive.js and paste the following code into it.
JavaScript
const { ServiceBusClient } = require("@azure/service-bus");
const { DefaultAzureCredential } = require("@azure/identity");

// Passwordless credential
const credential = new DefaultAzureCredential();
const sbClient = new ServiceBusClient("<SERVICE BUS NAMESPACE>.servicebus.windows.net", credential);
const receiver = sbClient.createReceiver("<QUEUE NAME>");

async function main() {
  // receive up to 10 messages, waiting at most 5 seconds for them to arrive
  const messages = await receiver.receiveMessages(10, { maxWaitTimeInMs: 5000 });
  for (const message of messages) {
    console.log(`Received message: ${message.body}`);
    await receiver.completeMessage(message);
  }
  await receiver.close();
  await sbClient.close();
}

// call the main function
main().catch((err) => {
  console.log("Error occurred: ", err);
  process.exit(1);
});
Console
node receive.js
On the Overview page for the Service Bus namespace in the Azure portal, you can see
incoming and outgoing message count. You may need to wait for a minute or so and
then refresh the page to see the latest values.
Select the queue on this Overview page to navigate to the Service Bus Queue page.
You see the incoming and outgoing message count on this page too. You also see other
information such as the current size of the queue, maximum size, active message
count, and so on.
Troubleshooting
If you receive one of the following errors when running the passwordless version of the
JavaScript code, make sure you're signed in via the Azure CLI command az login and
that the appropriate role is assigned to your Azure user account:
Clean up resources
Navigate to your Service Bus namespace in the Azure portal, and select Delete to
delete the namespace and the queue in it.
Next steps
See the following documentation and samples:
Note
This quick start provides step-by-step instructions for a simple scenario of sending
a batch of messages to a Service Bus topic and receiving those messages from a
subscription of the topic. You can find pre-built JavaScript and TypeScript samples
for Azure Service Bus in the Azure SDK for JavaScript repository on GitHub .
Prerequisites
An Azure subscription. To complete this tutorial, you need an Azure account. You
can activate your MSDN subscriber benefits or sign up for a free account .
Node.js LTS
Follow steps in the Quickstart: Use the Azure portal to create a Service Bus topic
and subscriptions to the topic. You will use only one subscription for this
quickstart.
Passwordless
To use this quickstart with your own Azure account, you need:
Note
This tutorial works with samples that you can copy and run using Node.js.
For instructions on how to create a Node.js application, see Create and
deploy a Node.js application to an Azure Website, or Node.js Cloud Service
using Windows PowerShell.
To create a namespace:
2. In the left navigation pane of the portal, select All services, select Integration from
the list of categories, hover the mouse over Service Bus, and then select Create on
the Service Bus tile.
3. In the Basics tab of the Create namespace page, follow these steps:
b. For Resource group, choose an existing resource group in which the namespace
will live, or create a new one.
c. Enter a name for the namespace. The namespace name should adhere to the
following naming conventions:
The name must be unique across Azure. The system immediately checks to
see if the name is available.
The name length is at least 6 and at most 50 characters.
The name can contain only letters, numbers, and hyphens ("-").
The name must start with a letter and end with a letter or number.
The name doesn't end with "-sb" or "-mgmt".
d. For Location, choose the region in which your namespace should be hosted.
e. For Pricing tier, select the pricing tier (Basic, Standard, or Premium) for the
namespace. For this quickstart, select Standard.
Important
If you want to use topics and subscriptions, choose either Standard or
Premium. Topics/subscriptions aren't supported in the Basic pricing tier.
If you selected the Premium pricing tier, specify the number of messaging
units. The premium tier provides resource isolation at the CPU and memory
level so that each workload runs in isolation. This resource container is called a
messaging unit. A premium namespace has at least one messaging unit. You
can select 1, 2, 4, 8 or 16 messaging units for each Service Bus Premium
namespace. For more information, see Service Bus Premium Messaging.
3. Enter a name for the topic. Leave the other options with their default values.
4. Select Create.
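The namespace naming rules listed above can also be checked client-side before submitting the form. The following is an illustrative sketch; the helper name is my own and not part of any Azure SDK, and the portal remains the authoritative check (including global uniqueness):

```javascript
// Sketch: validate a Service Bus namespace name against the documented rules.
function isValidNamespaceName(name) {
  // 6-50 characters, letters/digits/hyphens only,
  // starts with a letter, ends with a letter or number
  if (!/^[A-Za-z][A-Za-z0-9-]{4,48}[A-Za-z0-9]$/.test(name)) {
    return false;
  }
  // must not end with "-sb" or "-mgmt"
  return !/-(sb|mgmt)$/.test(name);
}

console.log(isValidNamespaceName("my-servicebus-01")); // true
console.log(isValidNamespaceName("ns1"));              // false: too short
console.log(isValidNamespaceName("contoso-sb"));       // false: ends with "-sb"
```

The regex enforces length and allowed characters in one pass; the trailing-suffix rule is simpler to express as a second test.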
Create a subscription to the topic
1. Select the topic that you created in the previous section.
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to a Service Bus namespace. You don't
need to worry about hard-coded connection strings in your code, in a
configuration file, or in secure storage like Azure Key Vault.
The second option shows you how to use a connection string to connect to a Service
Bus namespace. If you are new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless (Recommended)
You can assign Azure RBAC roles to a user using the Azure portal, Azure CLI, or
Azure PowerShell. Learn more about the available scopes for role assignments on
the scope overview page.
The following example assigns the Azure Service Bus Data Owner role to your user
account, which provides full access to Azure Service Bus resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Service Bus Data Owner: Enables data access to Service Bus namespace
and its entities (queues, topics, subscriptions, and filters). A member of this
role can send and receive messages from queues or topics/subscriptions.
Azure Service Bus Data Sender: Use this role to give send access to a Service
Bus namespace and its entities.
Azure Service Bus Data Receiver: Use this role to give receive access to a
Service Bus namespace and its entities.
If you want to create a custom role, see Rights required for Service Bus operations.
Add Azure AD user to Azure Service Bus Data Owner role
Add your Azure AD user name to the Azure Service Bus Data Owner role at the
Service Bus namespace level. This allows an app running in the context of your user
account to send messages to a queue or a topic, and to receive messages from a
queue or a topic's subscription.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
1. If you don't have the Service Bus Namespace page open in the Azure portal,
locate your Service Bus namespace using the main search bar or left
navigation.
2. On the overview page, select Access control (IAM) from the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result. Then
choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
1. To install the required npm package(s) for Service Bus, open a command
prompt that has npm in its path, change the directory to the folder where you
want to have your samples and then run this command.
Bash
npm install @azure/service-bus @azure/identity
You must have signed in with the Azure CLI's az login in order for your local
machine to provide the passwordless authentication required in this code.
JavaScript
const { ServiceBusClient } = require("@azure/service-bus");
const { DefaultAzureCredential } = require("@azure/identity");

// Passwordless credential
const credential = new DefaultAzureCredential();

// fully qualified namespace and name of the topic
const fullyQualifiedNamespace = "<SERVICE BUS NAMESPACE>.servicebus.windows.net";
const topicName = "<TOPIC NAME>";

const messages = [
  { body: "Albert Einstein" },
  { body: "Werner Heisenberg" },
  { body: "Marie Curie" },
  { body: "Stephen Hawking" },
  { body: "Isaac Newton" },
  { body: "Niels Bohr" },
  { body: "Michael Faraday" },
  { body: "Galileo Galilei" },
  { body: "Johannes Kepler" },
  { body: "Nikolaus Kopernikus" }
];

async function main() {
  const sbClient = new ServiceBusClient(fullyQualifiedNamespace, credential);
  const sender = sbClient.createSender(topicName);
  try {
    // Tries to send all messages in a single batch.
    // Will fail if the messages cannot fit in a batch.
    await sender.sendMessages(messages);
    console.log(`Sent a batch of messages to the topic: ${topicName}`);
    await sender.close();
  } finally {
    await sbClient.close();
  }
}

main().catch((err) => {
  console.log("Error occurred: ", err);
  process.exit(1);
});
Console
node sendtotopic.js
Console
Sent a batch of messages to the topic: mytopic
You must have signed in with the Azure CLI's az login in order for your local
machine to provide the passwordless authentication required in this code.
JavaScript
// Passwordless credential
const credential = new DefaultAzureCredential();
await receiver.close();
await sbClient.close();
}
5. Replace <SUBSCRIPTION NAME> with the name of the subscription to the topic.
Console
node receivefromsubscription.js
In the Azure portal, navigate to your Service Bus namespace, switch to Topics in the
bottom pane, and select your topic to see the Service Bus Topic page for your topic. On
this page, you should see 10 incoming and 10 outgoing messages in the Messages
chart.
If you run only the send app next time, on the Service Bus Topic page, you see 20
incoming messages (10 new) but 10 outgoing messages.
On this page, if you select a subscription in the bottom pane, you get to the Service Bus
Subscription page. You can see the active message count, dead-letter message count,
and more on this page. In this example, there are 10 active messages that haven't been
received by a receiver yet.
Troubleshooting
If you receive an error about required claims when running the passwordless version
of the JavaScript code, make sure you're signed in via the Azure CLI command az
login and that the appropriate role is assigned to your Azure user account.
Clean up resources
Navigate to your Service Bus namespace in the Azure portal, and select Delete to
delete the namespace and the resources in it.
Next steps
See the following documentation and samples:
Get started with the Azure Blob Storage client library for Node.js to manage blobs and
containers. Follow these steps to install the package and try out example code for basic
tasks.
Prerequisites
An Azure account with an active subscription. Create an account for free .
An Azure Storage account. Create a storage account.
Node.js LTS .
1. In a console window (such as cmd, PowerShell, or Bash), create a new directory for
the project.
Console
mkdir blob-quickstart
Console
cd blob-quickstart
3. Create a package.json.
Console
npm init -y
code .
Console
npm install @azure/storage-blob @azure/identity
2. Copy the following code into the file. More code will be added as you go through
this quickstart.
JavaScript
} catch (err) {
console.error(`Error: ${err.message}`);
}
}
main()
.then(() => console.log("Done"))
.catch((ex) => console.log(ex.message));
Object model
Azure Blob storage is optimized for storing massive amounts of unstructured data.
Unstructured data is data that doesn't adhere to a particular data model or definition,
such as text or binary data. Blob storage offers three types of resources:
Code examples
These example code snippets show you how to do the following tasks with the Azure
Blob Storage client library for JavaScript:
You can also authorize requests to Azure Blob Storage by using the account access key.
However, this approach should be used with caution. Developers must be diligent to
never expose the access key in an unsecure location. Anyone who has the access key is
able to authorize requests against the storage account, and effectively has access to all
the data. DefaultAzureCredential offers improved management and security benefits
over the account key to allow passwordless authentication. Both options are
demonstrated in the following example.
Passwordless (Recommended)
DefaultAzureCredential supports multiple authentication methods and determines
which method should be used at runtime. This approach enables your app to use
different authentication methods in different environments (local vs. production)
without implementing environment-specific code.
The order and locations in which DefaultAzureCredential looks for credentials can
be found in the Azure Identity library overview.
For example, your app can authenticate using your Azure CLI sign-in credentials
when developing locally. Your app can then use a managed identity once it has
been deployed to Azure. No code changes are required for this transition.
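Conceptually, this fallback behavior resembles a chained credential: try each source in order and use the first one that can produce a token. The following is a simplified sketch of that idea only; it is not the actual @azure/identity implementation, and the credential objects here are hypothetical stand-ins:

```javascript
// Sketch of credential chaining: each source either returns a token or
// throws, and the chain falls through to the next source on failure.
class ChainedCredentialSketch {
  constructor(...sources) {
    this.sources = sources;
  }
  async getToken(scope) {
    for (const source of this.sources) {
      try {
        return await source.getToken(scope);
      } catch {
        // this source is unavailable; fall through to the next one
      }
    }
    throw new Error("No credential source could provide a token");
  }
}

// Hypothetical stand-ins for a local CLI credential and a managed identity.
const cliCredential = {
  getToken: async () => { throw new Error("az login has not been run"); },
};
const managedIdentityCredential = {
  getToken: async (scope) => ({ token: `mi-token:${scope}` }),
};

const cred = new ChainedCredentialSketch(cliCredential, managedIdentityCredential);
cred.getToken("https://storage.azure.com/.default").then((t) => console.log(t.token));
```

In this sketch the CLI credential fails (as it would on a machine with no az login session), so the chain falls back to the managed identity source, mirroring the local-to-Azure transition described above.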
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Blob Data Contributor role to your
user account, which provides both read and write access to blob data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Blob Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
You can authorize access to data in your storage account using the following steps:
1. Make sure you're authenticated with the same Azure AD account you assigned
the role to on your storage account. You can authenticate via the Azure CLI,
Visual Studio Code, or Azure PowerShell.
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
2. To use DefaultAzureCredential , make sure that the @azure/identity package
is installed, and the class is imported:
JavaScript
3. Add this code inside the try block. When the code runs on your local
workstation, DefaultAzureCredential uses the developer credentials of the
prioritized tool you're signed in to (such as the Azure CLI or Visual Studio
Code) to authenticate to Azure.
JavaScript
Note
When deployed to Azure, this same code can be used to authorize
requests to Azure Storage from an application running in Azure. However,
you'll need to enable managed identity on your app in Azure. Then
configure your storage account to allow that managed identity to
connect. For detailed instructions on configuring this connection between
Azure services, see the Auth from Azure-hosted apps tutorial.
Create a container
1. Decide on a name for the new container. Container names must be lowercase.
For more information about naming containers and blobs, see Naming and
Referencing Containers, Blobs, and Metadata.
JavaScript
console.log('\nCreating container...');
console.log('\t', containerName);
To learn more about creating a container, and to explore more code samples, see Create
a blob container with JavaScript.
JavaScript
To learn more about uploading blobs, and to explore more code samples, see Upload a
blob with JavaScript.
JavaScript
console.log('\nListing blobs...');
The preceding code calls the listBlobsFlat method. In this case, only one blob has been
added to the container, so the listing operation returns just that one blob.
To learn more about listing blobs, and to explore more code samples, see List blobs with
JavaScript.
Download blobs
1. Add the following code to the end of the main function to download the
previously created blob into the app runtime.
JavaScript
2. Copy the following code after the main function to convert a stream back into a
string.
JavaScript
Delete a container
Add this code to the end of the main function to delete the container and all its blobs:
JavaScript
// Delete container
console.log('\nDeleting container...');
The preceding code cleans up the resources the app created by removing the entire
container using the delete method. You can also delete the local files, if you like.
To learn more about deleting a container, and to explore more code samples, see Delete
and restore a blob container with JavaScript.
Console
node index.js
Output
Creating container...
quickstart4a0780c0-fb72-11e9-b7b9-b387d3c488da
Listing blobs...
quickstart4a3128d0-fb72-11e9-b7b9-b387d3c488da.txt
Downloaded blob content...
Hello, World!
Deleting container...
Done
Step through the code in your debugger and check your Azure portal throughout the
process. Check to see that the container is being created. You can open the blob inside
the container and view the contents.
Clean up
1. When you're done with this quickstart, delete the blob-quickstart directory.
2. If you're done using your Azure Storage resource, use the Azure CLI to remove the
Storage resource.
Next steps
In this quickstart, you learned how to upload, download, and list blobs using JavaScript.
To learn more, see the Azure Blob Storage client libraries for JavaScript.
For tutorials, samples, quickstarts, and other documentation, visit Azure for
JavaScript and Node.js developers.
Quickstart: Azure Queue Storage client
library for JavaScript
Article • 06/29/2023
Get started with the Azure Queue Storage client library for JavaScript. Azure Queue
Storage is a service for storing large numbers of messages for later retrieval and
processing. Follow these steps to install the package and try out example code for basic
tasks.
Use the Azure Queue Storage client library for JavaScript to:
Create a queue
Add messages to a queue
Peek at messages in a queue
Update a message in a queue
Get the queue length
Receive messages from a queue
Delete messages from a queue
Delete a queue
Prerequisites
Azure subscription - create one for free
Azure Storage account - create a storage account
Current Node.js for your operating system.
Setting up
This section walks you through preparing a project to work with the Azure Queue
Storage client library for JavaScript.
1. In a console window (such as cmd, PowerShell, or Bash), create a new directory for
the project:
Console
mkdir queues-quickstart
Console
cd queues-quickstart
Console
npm init -y
Console
code .
Console
npm install @azure/storage-queue
Console
npm install @azure/identity
Console
npm install uuid dotenv
3. Create the structure for the program, including basic exception handling
JavaScript
Authenticate to Azure
Application requests to most Azure services must be authorized. Using the
DefaultAzureCredential class provided by the Azure Identity client library is the
recommended approach for implementing passwordless connections to Azure services.
You can also authorize requests to Azure services using passwords, connection strings,
or other credentials directly. However, this approach should be used with caution.
Developers must be diligent to never expose these secrets in an unsecure location.
Anyone who gains access to the password or secret key is able to authenticate.
DefaultAzureCredential offers improved management and security benefits over the
account key to allow passwordless authentication. Both options are demonstrated in the
following example.
Passwordless (Recommended)
For example, your app can authenticate using your Azure CLI sign-in credentials
when developing locally, and then use a managed identity once it has been
deployed to Azure. No code changes are required for this transition.
When developing locally, make sure that the user account that is accessing the
queue data has the correct permissions. You'll need Storage Queue Data
Contributor to read and write queue data. To assign yourself this role, you'll need
to be assigned the User Access Administrator role, or another role that includes the
Microsoft.Authorization/roleAssignments/write action. You can assign Azure RBAC
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn
more about the available scopes for role assignments on the scope overview page.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Queue Data Contributor role to your
user account, which provides both read and write access to queue data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Queue Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Object model
Azure Queue Storage is a service for storing large numbers of messages. A queue
message can be up to 64 KB in size. A queue may contain millions of messages, up to
the total capacity limit of a storage account. Queues are commonly used to create a
backlog of work to process asynchronously. Queue Storage offers three types of
resources:
Storage account: All access to Azure Storage is done through a storage account.
For more information about storage accounts, see Storage account overview
Queue: A queue contains a set of messages. All messages must be in a queue.
Note that the queue name must be all lowercase. For information on naming
queues, see Naming Queues and Metadata.
Message: A message, in any format, of up to 64 KB. A message can remain in the
queue for a maximum of 7 days. For version 2017-07-29 or later, the maximum
time-to-live can be any positive number, or -1 indicating that the message doesn't
expire. If this parameter is omitted, the default time-to-live is seven days.
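Given the 64 KB limit described above, a client-side size check can reject oversized payloads before a call to the service fails. This is an illustrative sketch; the constant and helper names are my own, not part of @azure/storage-queue:

```javascript
// A queue message body can be at most 64 KB (65,536 bytes).
const MAX_QUEUE_MESSAGE_BYTES = 64 * 1024;

// Returns true when the UTF-8 encoded message fits within the limit.
function fitsInQueueMessage(text) {
  return Buffer.byteLength(text, "utf8") <= MAX_QUEUE_MESSAGE_BYTES;
}

console.log(fitsInQueueMessage("hello"));           // true
console.log(fitsInQueueMessage("x".repeat(70000))); // false
```

Note that Buffer.byteLength is used rather than string length, because multi-byte UTF-8 characters make the two differ; and if you Base64-encode the body (as discussed later for binary content), the encoded size is what counts against the limit.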
Code examples
These example code snippets show you how to do the following actions with the Azure
Queue Storage client library for JavaScript:
Passwordless (Recommended)
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
Once authenticated, you can create and authorize a QueueClient object using
DefaultAzureCredential to access queue data in the storage account.
JavaScript
Decide on a name for the queue and create an instance of the QueueClient class,
using DefaultAzureCredential for authorization. We use this client object to create
and interact with the queue resource in the storage account.
Important
Queue names may only contain lowercase letters, numbers, and hyphens, and
must begin with a letter or a number. Each hyphen must be preceded and
followed by a non-hyphen character. The name must also be between 3 and 63
characters long. For more information about naming queues, see Naming
queues and metadata.
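The queue naming rules above can be validated client-side before creating the client. The following is a sketch; the helper is illustrative and not part of @azure/storage-queue, and the service remains the authoritative check:

```javascript
// Sketch: validate a queue name against the documented rules.
function isValidQueueName(name) {
  if (name.length < 3 || name.length > 63) return false;
  // lowercase letters, digits, and hyphens only; starts and ends with a
  // letter or digit; every hyphen is surrounded by non-hyphen characters
  return /^[a-z0-9]+(-[a-z0-9]+)*$/.test(name);
}

console.log(isValidQueueName("orders-queue-1")); // true
console.log(isValidQueueName("My-Queue"));       // false: uppercase letters
console.log(isValidQueueName("a--b"));           // false: consecutive hyphens
```

The repetition group `(-[a-z0-9]+)*` is what enforces the hyphen rule: each hyphen must be immediately followed by at least one letter or digit, which also rules out leading and trailing hyphens.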
Add the following code inside the main method, and make sure to replace the
<storage-account-name> placeholder value:
JavaScript
Note
Messages sent using the QueueClient class must be in a format that can be
included in an XML request with UTF-8 encoding. To include markup in the
message, the contents of the message must either be XML-escaped or Base64-
encoded.
Queue messages are stored as strings. If you need to send a different data type, you
must serialize that data type into a string when sending the message and deserialize the
string format when reading the message.
To convert JSON to a string format and back again in Node.js, use the following helper
functions:
JavaScript
function jsonToBase64(jsonObj) {
const jsonString = JSON.stringify(jsonObj)
return Buffer.from(jsonString).toString('base64')
}
function encodeBase64ToJson(base64String) {
const jsonString = Buffer.from(base64String,'base64').toString()
return JSON.parse(jsonString)
}
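For example, a JSON payload survives the round trip through these helpers unchanged. The helpers are repeated here so the snippet runs standalone; the order object is just sample data:

```javascript
function jsonToBase64(jsonObj) {
  const jsonString = JSON.stringify(jsonObj);
  return Buffer.from(jsonString).toString("base64");
}

function encodeBase64ToJson(base64String) {
  const jsonString = Buffer.from(base64String, "base64").toString();
  return JSON.parse(jsonString);
}

// Round trip: encode an object for the queue, then decode it back.
const encoded = jsonToBase64({ orderId: 42, sku: "widget" });
const decoded = encodeBase64ToJson(encoded);
console.log(decoded.orderId, decoded.sku); // 42 widget
```

Because the Base64 output contains no XML-significant characters, the encoded string is always safe to place in the queue message body.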
Create a queue
Using the QueueClient object, call the create method to create the queue in your
storage account.
JavaScript
console.log("\nCreating queue...");
console.log("\t", queueName);
JavaScript
JavaScript
JavaScript
JavaScript
JavaScript
When calling the receiveMessages method, you can optionally specify values in
QueueReceiveMessageOptions to customize message retrieval. You can specify a value
for numberOfMessages , which is the number of messages to retrieve from the queue. The
default is 1 message and the maximum is 32 messages. You can also specify a value for
visibilityTimeout , which hides the messages from other operations for the timeout
period.
Delete messages by calling the deleteMessage method. Any messages not explicitly
deleted eventually become visible in the queue again for another chance to process
them.
JavaScript
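The lifecycle just described (a received message stays hidden for the visibility timeout and reappears if it is never deleted) can be modeled with a tiny in-memory sketch. This is purely illustrative and not how the service itself is implemented:

```javascript
// In-memory model of visibility-timeout semantics: receiving a message
// hides it until `visibleAt`; deleting removes it; otherwise it reappears.
class VisibilityQueueSketch {
  constructor() {
    this.messages = [];
  }
  send(body) {
    this.messages.push({ body, visibleAt: 0 });
  }
  receive(now, visibilityTimeout) {
    const msg = this.messages.find((m) => m.visibleAt <= now);
    if (msg) msg.visibleAt = now + visibilityTimeout;
    return msg;
  }
  delete(msg) {
    this.messages = this.messages.filter((m) => m !== msg);
  }
}

const q = new VisibilityQueueSketch();
q.send("task-1");
q.receive(0, 30);                    // received; hidden until t = 30
console.log(q.receive(10, 30));      // undefined: still invisible
console.log(q.receive(40, 30).body); // task-1: reappeared, never deleted
```

This is why processing code should delete a message only after the work succeeds: a crash before the delete simply lets the message become visible again for another attempt.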
Delete a queue
The following code cleans up the resources the app created by deleting the queue using
the delete method.
Add this code to the end of the main function and save the file:
JavaScript
In your console window, navigate to the directory containing the index.js file, then use
the following node command to run the app.
Console
node index.js
Output
Azure Queue Storage client library - JavaScript quickstart sample
Creating queue...
quickstart<UUID>
Queue created, requestId: 5c0bc94c-6003-011b-7c11-b13d06000000
Deleting queue...
Queue deleted, requestId: 5c0bca05-6003-011b-1e11-b13d06000000
Done
Step through the code in your debugger and check your Azure portal throughout the
process. Check your storage account to verify messages in the queue are created and
deleted.
Next steps
In this quickstart, you learned how to create a queue and add messages to it using
JavaScript code. Then you learned to peek, retrieve, and delete messages. Finally, you
learned how to delete a message queue.
To learn more, see the Azure Queue Storage client library for JavaScript .
For more Azure Queue Storage sample apps, see Azure Queue Storage client
library for JavaScript - samples .
Connect to and query Azure SQL
Database using Python and the pyodbc
driver
Article • 05/31/2023
This article was partially created with the help of AI. An author reviewed and revised
the content as needed. Read more.
Prerequisites
An Azure subscription .
An Azure SQL database configured with Azure Active Directory (Azure AD)
authentication. You can create one using the Create database quickstart.
The latest version of the Azure CLI.
Visual Studio Code with the Python extension .
Python 3.7 or later.
1. For local development connections, make sure your logical server is configured to
allow your local machine IP address and other Azure services to connect:
Select Add your client IPv4 address (xx.xx.xx.xx) to add a firewall rule that
enables connections from your local machine's IPv4 address. Alternatively,
you can select + Add a firewall rule to enter a specific IP address of your
choice.
Make sure the Allow Azure services and resources to access this server
checkbox is selected.
Warning
Enabling the Allow Azure services and resources to access this server
setting is not a recommended security practice for production scenarios.
Real applications should implement more secure approaches, such as
stronger firewall restrictions or virtual network configurations.
2. The server must also have Azure AD authentication enabled with an Azure Active
Directory admin account assigned. For local development connections, the Azure
Active Directory admin account should be an account you can also log into Visual
Studio or the Azure CLI with locally. You can verify whether your server has Azure
AD authentication enabled on the Azure Active Directory page.
3. If you're using a personal Azure account, make sure you have Azure Active
Directory set up and configured for Azure SQL Database in order to assign your
account as a server admin. If you're using a corporate account, Azure Active
Directory will most likely already be configured for you.
1. Open Visual Studio Code and create a new folder for your project and change
directory into it.
Cmd
mkdir python-sql-azure
cd python-sql-azure
Windows
Cmd
py -m venv .venv
.venv\scripts\activate
For details and specific instructions for installing the pyodbc driver on all operating
systems, see Configure development environment for pyodbc Python development.
pyodbc
fastapi
uvicorn[standard]
pydantic
azure-identity
Console
Interactive Authentication
Bash
You can get the details to create your connection string from the Azure portal:
1. Go to the Azure SQL Server, select the SQL databases page to find your database
name, and select the database.
Note
If you've installed Azure Arc and associated it with your Azure subscription, you can
also use the managed identity approach shown for the app deployed to App
Service.
Python
import os
import pyodbc, struct
from azure import identity

from typing import Union
from fastapi import FastAPI
from pydantic import BaseModel

class Person(BaseModel):
    first_name: str
    last_name: Union[str, None] = None

connection_string = os.environ["AZURE_SQL_CONNECTIONSTRING"]

app = FastAPI()

@app.get("/")
def root():
    print("Root of Person API")
    try:
        conn = get_conn()
        cursor = conn.cursor()
        conn.commit()
    except Exception as e:
        # Table may already exist
        print(e)
    return "Person API"

@app.get("/all")
def get_persons():
    rows = []
    with get_conn() as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT * FROM Persons")

@app.get("/person/{person_id}")
def get_person(person_id: int):
    with get_conn() as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT * FROM Persons WHERE ID = ?", person_id)
        row = cursor.fetchone()
        return f"{row.ID}, {row.FirstName}, {row.LastName}"

@app.post("/person")
def create_person(item: Person):
    with get_conn() as conn:
        cursor = conn.cursor()
        cursor.execute("INSERT INTO Persons (FirstName, LastName) VALUES (?, ?)", item.first_name, item.last_name)
        conn.commit()
    return item

def get_conn():
    credential = identity.DefaultAzureCredential(exclude_interactive_browser_credential=False)
    token_bytes = credential.get_token("https://database.windows.net/.default").token.encode("UTF-16-LE")
    token_struct = struct.pack(f'<I{len(token_bytes)}s', len(token_bytes), token_bytes)
    SQL_COPT_SS_ACCESS_TOKEN = 1256  # This connection option is defined by Microsoft in msodbcsql.h
    conn = pyodbc.connect(connection_string, attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct})
    return conn
Warning
The sample code shows raw SQL statements, which shouldn't be used in
production code. Instead, use an Object Relational Mapper (ORM) package like
SqlAlchemy that generates a more secure object layer to access your database.
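Whichever ORM you choose, the underlying protection is parameter binding. A minimal stdlib sqlite3 sketch (standing in for Azure SQL) contrasts spliced SQL with a bound parameter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Persons (ID INTEGER PRIMARY KEY, FirstName TEXT, LastName TEXT)")
conn.execute("INSERT INTO Persons (FirstName, LastName) VALUES ('Ada', 'Lovelace')")

# Unsafe: user input is spliced directly into the SQL text and can change its shape.
user_input = "Ada' OR '1'='1"
unsafe = f"SELECT * FROM Persons WHERE FirstName = '{user_input}'"
print(len(conn.execute(unsafe).fetchall()))  # matches every row

# Safe: the driver binds the value, so it is compared literally.
safe = conn.execute("SELECT * FROM Persons WHERE FirstName = ?", (user_input,)).fetchall()
print(len(safe))  # 0 - no one is literally named "Ada' OR '1'='1"
```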
Console
You can also try /redoc to see another form of generated documentation for the API.
3. Modify the sample JSON to include values for the first and last name. Select
Execute to add a new record to the database. The API returns a successful
response.
4. Expand the GET method on the Swagger UI page and select Try it. Choose
Execute, and the person you just created is returned.
Deploy to Azure App Service
The app is ready to be deployed to Azure.
1. Create a start.sh file so that gunicorn in Azure App Service can run uvicorn. The start.sh file has one line:
Console
2. Use the az webapp up command to deploy the code to App Service. (You can use the --dryrun option to see what the command does without creating the resource.)
Azure CLI
az webapp up \
--resource-group <resource-group-name> \
--name <web-app-name>
3. Use the az webapp config set command to configure App Service to use the
start.sh file.
Azure CLI
To run these commands, you can use any tool or IDE that can connect to Azure SQL Database, including SQL Server Management Studio (SSMS), Azure Data Studio, and Visual Studio Code with the mssql extension. You can also use the Azure portal as described in Quickstart: Use the Azure portal query editor to query Azure SQL Database.
1. Add a user to the Azure SQL Database with SQL commands to create a user and
role for passwordless access.
SQL
For more information, see Contained Database Users - Making Your Database
Portable. For an example that shows the same principle but applied to Azure VM,
see Tutorial: Use a Windows VM system-assigned managed identity to access
Azure SQL. For more information about the roles assigned, see Fixed-database
Roles.
If you disable and then re-enable the App Service system-assigned managed identity, drop the user and re-create it: run DROP USER [<web-app-name>], and then rerun the CREATE and ALTER commands. To see the users, use SELECT * FROM sys.database_principals .
2. Use the az webapp config appsettings set command to add an app setting for the
connection string.
Azure CLI
HTTP
https://<web-app-name>.azurewebsites.net
Append /docs to the URL to see the Swagger UI and test the API methods.
Congratulations! Your application is now connected to Azure SQL Database in both local
and hosted environments.
Next steps
Migrate a Python application to use passwordless connections with Azure SQL
Database - Shows user-assigned managed identity.
Passwordless connections for Azure services
Managed identity best practice recommendations
Quickstart: Azure Cosmos DB for NoSQL
client library for Python
Article • 03/08/2023
Get started with the Azure Cosmos DB client library for Python to create databases,
containers, and items within your account. Follow these steps to install the package and
try out example code for basic tasks.
Note
Prerequisites
An Azure account with an active subscription.
No Azure subscription? You can try Azure Cosmos DB free with no credit card
required.
Python 3.7 or later
Ensure the python executable is in your PATH.
Azure Command-Line Interface (CLI) or Azure PowerShell
Prerequisite check
In a command shell, run python --version to check that the version is 3.7 or later.
Run az --version (Azure CLI) or Get-Module -ListAvailable AzureRM (Azure
PowerShell) to check that you have the appropriate Azure command-line tools
installed.
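The prerequisite check can also be done from Python itself; a small sketch verifying the 3.7 floor:

```python
import sys

REQUIRED = (3, 7)

def meets_requirement(version_info=sys.version_info) -> bool:
    """Return True when the interpreter is at least Python 3.7."""
    return tuple(version_info[:2]) >= REQUIRED

print(meets_requirement())           # True on any supported interpreter
print(meets_requirement((3, 6, 9)))  # False
```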
Setting up
This section walks you through creating an Azure Cosmos DB account and setting up a
project that uses the Azure Cosmos DB for NoSQL client library for Python to manage
resources.
Create an Azure Cosmos DB account
Tip
No Azure subscription? You can try Azure Cosmos DB free with no credit card
required. If you create an account using the free trial, you can safely skip ahead to
the Create a new Python app section.
This quickstart will create a single Azure Cosmos DB account using the API for NoSQL.
Portal
Tip
For this quickstart, we recommend using the resource group name msdocs-
cosmos-quickstart-rg .
2. From the Azure portal menu or the Home page, select Create a resource.
3. On the New page, search for and select Azure Cosmos DB.
4. On the Select API option page, select the Create option within the NoSQL
section. Azure Cosmos DB has six APIs: NoSQL, MongoDB, PostgreSQL,
Apache Cassandra, Apache Gremlin, and Table. Learn more about the API for
NoSQL.
5. On the Create Azure Cosmos DB Account page, enter the following
information:
Subscription: Select the Azure subscription that you wish to use for this Azure Cosmos DB account.
Location: Select the geographic region closest to your users to give them the fastest access to the data.
Apply Azure Cosmos DB free tier discount: Choose Apply or Do not apply. With Azure Cosmos DB free tier, you'll get the first 1000 RU/s and 25 GB of storage for free in an account. Learn more about free tier.
Note
You can have up to one free tier Azure Cosmos DB account per Azure
subscription and must opt-in when creating the account. If you do not
see the option to apply the free tier discount, this means another account
in the subscription has already been enabled with free tier.
6. Select Review + create.
7. Review the settings you provide, and then select Create. It takes a few minutes
to create the account. Wait for the portal page to display Your deployment is
complete before moving on.
9. From the API for NoSQL account page, select the Keys navigation menu
option.
10. Record the values from the URI and PRIMARY KEY fields. You'll use these
values in a later step.
Install packages
Use the pip install command to install packages you'll need in the quickstart.
Passwordless (Recommended)
Add the azure-cosmos and azure-identity PyPI packages to the Python app.
Bash
Windows
PowerShell
$env:COSMOS_ENDPOINT = "<cosmos-account-URI>"
$env:COSMOS_KEY = "<cosmos-account-PRIMARY-KEY>"
Object model
Before you start building the application, let's look into the hierarchy of resources in
Azure Cosmos DB. Azure Cosmos DB has a specific object model used to create and
access resources. Azure Cosmos DB creates resources in a hierarchy that consists of accounts, databases, containers, and items.
(Diagram: an Azure Cosmos DB account contains databases; each database contains containers, and each container holds items.)
For more information about the hierarchy of different resources, see working with
databases, containers, and items in Azure Cosmos DB.
You'll use the following Python classes to interact with these resources:
CosmosClient - This class provides a client-side logical representation for the Azure
Cosmos DB service. The client object is used to configure and execute requests
against the service.
DatabaseProxy - This class is a reference to a database that may, or may not, exist
in the service yet. The database is validated server-side when you attempt to
access it or perform an operation against it.
ContainerProxy - This class is a reference to a container that also may not exist in
the service yet. The container is validated server-side when you attempt to work
with it.
Code examples
Authenticate the client
Create a database
Create a container
Create an item
Get an item
Query items
The sample code described in this article creates a database named cosmicworks with a container named products. The products container is designed to contain product details such as name, category, quantity, and a sale indicator. Each product also contains a unique identifier.
For this sample code, the container will use the category as a logical partition key.
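Logical partitioning can be pictured without the SDK: items that share a partition key value belong to the same logical partition. A plain-Python sketch of that grouping (sample values are illustrative):

```python
from collections import defaultdict

items = [
    {"id": "1", "categoryId": "surfboards", "name": "Yamba Surfboard"},
    {"id": "2", "categoryId": "surfboards", "name": "Kiama Classic"},
    {"id": "3", "categoryId": "wetsuits", "name": "Bondi Wetsuit"},
]

# Items with the same partition key value land in the same logical partition.
partitions = defaultdict(list)
for item in items:
    partitions[item["categoryId"]].append(item["id"])

print(dict(partitions))  # {'surfboards': ['1', '2'], 'wetsuits': ['3']}
```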
You can also authorize requests to Azure services using passwords, connection strings,
or other credentials directly. However, this approach should be used with caution.
Developers must be diligent to never expose these secrets in an unsecure location.
Anyone who gains access to the password or secret key is able to authenticate.
DefaultAzureCredential offers improved management and security benefits over the
account key to allow passwordless authentication. Both options are demonstrated in the
following example.
Passwordless (Recommended)
For example, your app can authenticate using your Azure CLI sign-in credentials
when developing locally, and then use a managed identity once it has been
deployed to Azure. No code changes are required for this transition.
When developing locally with passwordless authentication, make sure the user
account that connects to Cosmos DB is assigned a role with the correct permissions
to perform data operations. Currently, Azure Cosmos DB for NoSQL doesn't include
built-in roles for data operations, but you can create your own using the Azure CLI
or PowerShell.
Azure CLI
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/item
s/*",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
]
}]
}'
2. When the command completes, copy the ID value from the name field and
paste it somewhere for later use.
3. Assign the role you created to the user account or service principal that will
connect to Cosmos DB. During local development, this will generally be your
own account that's logged into a development tool like Visual Studio or the
Azure CLI. Retrieve the details of your account using the az ad user
command.
Azure CLI
4. Copy the value of the id property out of the results and paste it somewhere
for later use.
5. Assign the custom role you created to your user account using the az
cosmosdb sql role assignment create command and the IDs you copied
previously.
Azure CLI
For local development, make sure you're authenticated with the same Azure AD
account you assigned the role to. You can authenticate via popular development
tools, such as the Azure CLI or Azure PowerShell. The development tools with which
you can authenticate vary across languages.
Azure CLI
Sign-in to Azure through the Azure CLI using the following command:
Azure CLI
az login
From the project directory, open the app.py file. In your editor, add modules to
work with Cosmos DB and authenticate to Azure. You'll authenticate to Cosmos DB
for NoSQL using DefaultAzureCredential from the azure-identity package.
DefaultAzureCredential will automatically discover and use the account you signed in with earlier.
Sync
Python
import os
import json
from azure.cosmos import CosmosClient
from azure.identity import DefaultAzureCredential
Sync / Async
Python
endpoint = os.environ["COSMOS_ENDPOINT"]
Sync / Async
Python
DATABASE_NAME = "cosmicworks"
CONTAINER_NAME = "products"
Create a new client instance using the CosmosClient class constructor and the
DefaultAzureCredential object.
Sync
Python
credential = DefaultAzureCredential()
client = CosmosClient(url=endpoint, credential=credential)
Create a database
Passwordless (Recommended)
The azure-cosmos client library enables you to perform data operations using Azure RBAC. However, to authenticate management operations, such as creating and deleting databases, you must use Azure RBAC through one of the following options:
The Azure CLI approach is used for this quickstart and passwordless access. Use the az cosmosdb sql database create command to create an Azure Cosmos DB for NoSQL database.
Azure CLI
The command line to create a database is for PowerShell, shown on multiple lines
for clarity. For other shell types, change the line continuation characters as
appropriate. For example, for Bash, use backslash ("\"). Or, remove the continuation
characters and enter the command on one line.
Create a container
Passwordless (Recommended)
Azure CLI
The command line to create a container is for PowerShell, on multiple lines for
clarity. For other shell types, change the line continuation characters as appropriate.
For example, for Bash, use backslash ("\"). Or, remove the continuation characters
and enter the command on one line. For Bash, you'll also need to add
MSYS_NO_PATHCONV=1 before the command so that Bash deals with the partition key
parameter correctly.
After the resources have been created, use classes from the azure-cosmos client library to connect to and query the database.
Create an item
Create a new item in the container by first creating a new variable ( new_item ) with a
sample item defined. In this example, the unique identifier of this item is 70b63682-b93a-
4c77-aad2-65501347265f . The partition key value is derived from the /categoryId path,
so it would be 61dba35b-4f02-45c5-b648-c6badc0cbd79 .
Sync / Async
Python
new_item = {
    "id": "70b63682-b93a-4c77-aad2-65501347265f",
    "categoryId": "61dba35b-4f02-45c5-b648-c6badc0cbd79",
    "categoryName": "gear-surf-surfboards",
    "name": "Yamba Surfboard",
    "quantity": 12,
    "sale": False,
}
Tip
The remaining fields are flexible and you can define as many or as few as you want.
You can even combine different item schemas in the same container.
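The partition key value is resolved from the item using the container's partition key path (/categoryId here). A hypothetical helper, not part of the SDK, makes that lookup explicit:

```python
def partition_key_value(item: dict, key_path: str):
    """Resolve a Cosmos-style partition key path (e.g. '/categoryId') against an item."""
    value = item
    for part in key_path.strip("/").split("/"):
        value = value[part]
    return value

new_item = {
    "id": "70b63682-b93a-4c77-aad2-65501347265f",
    "categoryId": "61dba35b-4f02-45c5-b648-c6badc0cbd79",
}
print(partition_key_value(new_item, "/categoryId"))
# 61dba35b-4f02-45c5-b648-c6badc0cbd79
```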
Sync
Python
container.create_item(new_item)
Get an item
In Azure Cosmos DB, you can perform a point read operation by using both the unique
identifier ( id ) and partition key fields. In the SDK, call ContainerProxy.read_item passing
in both values to return an item as a dictionary of strings and values ( dict[str, Any] ).
Sync
Python
existing_item = container.read_item(
    item="70b63682-b93a-4c77-aad2-65501347265f",
    partition_key="61dba35b-4f02-45c5-b648-c6badc0cbd79",
)
print("Point read\t", existing_item["name"])
Query items
After you insert an item, you can run a query to get all items that match a specific filter.
This example runs the SQL query: SELECT * FROM products p WHERE p.categoryId =
"61dba35b-4f02-45c5-b648-c6badc0cbd79" . This example uses query parameterization to
construct the query. The query uses a string of the SQL query, and a dictionary of query
parameters.
Sync / Async
Python
This example dictionary includes the @categoryId query parameter and the corresponding value 61dba35b-4f02-45c5-b648-c6badc0cbd79 .
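As a sketch, the query string and parameter list passed to query_items look like the following; parameter names carry the @ prefix, and the values ride in a separate list of name/value dictionaries so they never touch the query text:

```python
category_id = "61dba35b-4f02-45c5-b648-c6badc0cbd79"

QUERY = "SELECT * FROM products p WHERE p.categoryId = @categoryId"
params = [{"name": "@categoryId", "value": category_id}]

# The service substitutes @categoryId at execution time.
print(QUERY)
print(params[0]["name"], "->", params[0]["value"])
```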
Once the query is defined, call ContainerProxy.query_items to run the query and return
the results as a paged set of items ( ItemPage[Dict[str, Any]] ).
Sync / Async
Python
results = container.query_items(
    query=QUERY, parameters=params, enable_cross_partition_query=False
)
Finally, use a for loop to iterate over the results in each page and perform various
actions.
Sync
Python
In this example, json.dumps is used to print the item to the console in a human-readable
way.
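A stdlib-only sketch of that pretty-printing step, using an abbreviated item:

```python
import json

item = {
    "id": "70b63682-b93a-4c77-aad2-65501347265f",
    "name": "Yamba Surfboard",
    "sale": False,
}

# indent=4 produces the multi-line, human-readable form shown in the sample output.
formatted = json.dumps(item, indent=4)
print(formatted)  # note that Python's False serializes as JSON's false
```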
Run the code
This app creates an API for NoSQL database and container. The example then creates an item and reads the same item back. Finally, the example issues a query that should return only that single item and outputs it to the console.
Use a terminal to navigate to the application directory and run the application.
Bash
python app.py
Output
Database cosmicworks
Container products
Point read Yamba Surfboard
Result list [
{
"id": "70b63682-b93a-4c77-aad2-65501347265f",
"categoryId": "61dba35b-4f02-45c5-b648-c6badc0cbd79",
"categoryName": "gear-surf-surfboards",
"name": "Yamba Surfboard",
"quantity": 12,
"sale": false,
"_rid": "KSsMAPI2fH0BAAAAAAAAAA==",
"_self": "dbs/KSsMAA==/colls/KSsMAPI2fH0=/docs/KSsMAPI2fH0BAAAAAAAAAA==/",
"_etag": "\"48002b76-0000-0200-0000-63c85f9d0000\"",
"_attachments": "attachments/",
"_ts": 1674076061
}
]
Note
The fields assigned by Azure Cosmos DB will vary from this sample output.
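The system-assigned _ts field is a Unix timestamp in seconds (UTC). A quick stdlib sketch converts the value from the sample output above into a readable time:

```python
from datetime import datetime, timezone

ts = 1674076061  # the _ts value from the sample output
modified = datetime.fromtimestamp(ts, tz=timezone.utc)
print(modified.isoformat())  # 2023-01-18T21:07:41+00:00
```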
Clean up resources
When you no longer need the API for NoSQL account, you can delete the corresponding
resource group.
Portal
1. Navigate to the resource group you previously created in the Azure portal.
Tip
3. On the Are you sure you want to delete dialog, enter the name of the
resource group, and then select Delete.
Next steps
In this quickstart, you learned how to create an Azure Cosmos DB for NoSQL account,
create a database, and create a container using the Python SDK. You can now dive
deeper into guidance on how to import your data into the API for NoSQL.
Send events to or receive events from
event hubs by using Python
Article • 06/16/2023
This quickstart shows how to send events to and receive events from an event hub using
the azure-eventhub Python package.
Prerequisites
If you're new to Azure Event Hubs, see Event Hubs overview before you do this
quickstart.
Microsoft Azure subscription. To use Azure services, including Azure Event Hubs,
you need a subscription. If you don't have an existing Azure account, sign up for a
free trial .
Python 3.7 or later, with pip installed and updated.
Visual Studio Code (recommended) or any other integrated development
environment (IDE).
Create an Event Hubs namespace and an event hub. The first step is to use the
Azure portal to create an Event Hubs namespace, and obtain the management
credentials that your application needs to communicate with the event hub. To
create a namespace and an event hub, follow the procedure in this article.
Passwordless (Recommended)
shell
Passwordless (Recommended)
The following example assigns the Azure Event Hubs Data Owner role to your user
account, which provides full access to Azure Event Hubs resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Event Hubs Data Owner: Enables full access to the Event Hubs namespace and its entities.
Azure Event Hubs Data Sender: Use this role to give the sender access to
Event Hubs namespace and its entities.
Azure Event Hubs Data Receiver: Use this role to give the receiver access to
Event Hubs namespace and its entities.
If you want to create a custom role, see Rights required for Event Hubs operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your Event Hubs namespace using the main
search bar or left navigation.
2. On the overview page, select Access control (IAM) from the left-hand
menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Azure Event Hubs Data Owner and select the matching
result. Then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Send events
In this section, create a Python script to send events to the event hub that you created
earlier.
2. Create a script called send.py. This script sends a batch of events to the event hub
that you created earlier.
Passwordless (Recommended)
EVENT_HUB_FULLY_QUALIFIED_NAMESPACE
EVENT_HUB_NAME
Python
import asyncio

from azure.eventhub import EventData
from azure.eventhub.aio import EventHubProducerClient
from azure.identity.aio import DefaultAzureCredential

EVENT_HUB_FULLY_QUALIFIED_NAMESPACE = "EVENT_HUB_FULLY_QUALIFIED_NAMESPACE"
EVENT_HUB_NAME = "EVENT_HUB_NAME"

credential = DefaultAzureCredential()

async def run():
    producer = EventHubProducerClient(
        fully_qualified_namespace=EVENT_HUB_FULLY_QUALIFIED_NAMESPACE,
        eventhub_name=EVENT_HUB_NAME,
        credential=credential,
    )
    async with producer:
        # Create a batch.
        event_data_batch = await producer.create_batch()
        # Add an event to the batch and send the batch of events.
        event_data_batch.add(EventData("First event"))
        await producer.send_batch(event_data_batch)
        # Close the credential when no longer needed.
        await credential.close()

asyncio.run(run())
Note
Receive events
This quickstart uses Azure Blob storage as a checkpoint store. The checkpoint store is
used to persist checkpoints (that is, the last read positions).
Follow these recommendations when using Azure Blob Storage as a checkpoint store:
Use a separate container for each processor group. You can use the same storage
account, but use one container per each group.
Don't use the container for anything else, and don't use the storage account for
anything else.
The storage account should be in the same region as the deployed application. If the application is on-premises, choose the closest possible region.
On the Storage account page in the Azure portal, in the Blob service section, ensure
that the following settings are disabled.
Hierarchical namespace
Blob soft delete
Versioning
Be sure to record the connection string and container name for later use in the receive
code.
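Conceptually, a checkpoint store is a durable map from a consumer group and partition to the last processed position. This toy in-memory sketch (the real BlobCheckpointStore persists the same information to blobs) illustrates the idea:

```python
class InMemoryCheckpointStore:
    """Toy stand-in for a checkpoint store: maps a partition to its last read offset."""

    def __init__(self):
        self._checkpoints = {}

    def update_checkpoint(self, consumer_group: str, partition_id: str, offset: int):
        self._checkpoints[(consumer_group, partition_id)] = offset

    def get_checkpoint(self, consumer_group: str, partition_id: str):
        # None means "no checkpoint yet": start from the configured starting_position.
        return self._checkpoints.get((consumer_group, partition_id))

store = InMemoryCheckpointStore()
store.update_checkpoint("$Default", "0", 42)
print(store.get_checkpoint("$Default", "0"))  # 42
print(store.get_checkpoint("$Default", "1"))  # None
```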
Passwordless (Recommended)
When developing locally, make sure that the user account that is accessing blob
data has the correct permissions. You'll need Storage Blob Data Contributor to
read and write blob data. To assign yourself this role, you'll need to be assigned the
User Access Administrator role, or another role that includes the
Microsoft.Authorization/roleAssignments/write action. You can assign Azure RBAC
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn
more about the available scopes for role assignments on the scope overview page.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Blob Data Contributor role to your
user account, which provides both read and write access to blob data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Blob Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
Passwordless (Recommended)
shell
Passwordless (Recommended)
BLOB_STORAGE_ACCOUNT_URL
BLOB_CONTAINER_NAME
EVENT_HUB_FULLY_QUALIFIED_NAMESPACE
EVENT_HUB_NAME
Python
import asyncio
BLOB_STORAGE_ACCOUNT_URL = "BLOB_STORAGE_ACCOUNT_URL"
BLOB_CONTAINER_NAME = "BLOB_CONTAINER_NAME"
EVENT_HUB_FULLY_QUALIFIED_NAMESPACE =
"EVENT_HUB_FULLY_QUALIFIED_NAMESPACE"
EVENT_HUB_NAME = "EVENT_HUB_NAME"
credential = DefaultAzureCredential()
fully_qualified_namespace=EVENT_HUB_FULLY_QUALIFIED_NAMESPACE,
eventhub_name=EVENT_HUB_NAME,
consumer_group="$Default",
checkpoint_store=checkpoint_store,
credential=credential,
)
async with client:
# Call the receive method. Read from the beginning of the
partition
# (starting_position: "-1")
await client.receive(on_event=on_event,
starting_position="-1")
if __name__ == "__main__":
# Run the main method.
asyncio.run(main())
Note
For examples of other options for receiving events from Event Hub
asynchronously using a connection string, see the GitHub
recv_with_checkpoint_store_async.py page . The patterns shown there are
also applicable to receiving events passwordless.
Bash
python recv.py
Bash
python send.py
The receiver window should display the messages that were sent to the event hub.
Troubleshooting
If you don't see events in the receiver window or the code reports an error, try the
following troubleshooting tips:
If you don't see results from recv.py, run send.py several times.
If you see errors about "coroutine" when using the passwordless code (with credentials), make sure you're importing from azure.identity.aio .
If you see "Unclosed client session" with passwordless code (with credentials),
make sure you close the credential when finished. For more information, see Async
credentials.
If you see authorization errors with recv.py when accessing storage, make sure you
followed the steps in Create an Azure storage account and a blob container and
assigned the Storage Blob Data Contributor role to the service principal.
If you receive events with different partition IDs, this result is expected. Partitions
are a data organization mechanism that relates to the downstream parallelism
required in consuming applications. The number of partitions in an event hub
directly relates to the number of concurrent readers you expect to have. For more
information, see Learn more about partitions.
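When events are sent with a partition key, Event Hubs routes each key to a partition by hashing it. This conceptual sketch (not the service's actual hash function) shows why a given key always lands on the same partition while many keys spread across all of them:

```python
import hashlib

def assign_partition(partition_key: str, partition_count: int) -> int:
    """Stable toy assignment: hash the key, then take it modulo the partition count."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % partition_count

# The same key is always routed to the same partition...
print(assign_partition("device-17", 4) == assign_partition("device-17", 4))  # True

# ...while many distinct keys typically spread over the available partitions.
keys = [f"device-{i}" for i in range(100)]
print(sorted({assign_partition(k, 4) for k in keys}))
```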
Next steps
In this quickstart, you've sent and received events asynchronously. To learn how to send
and receive events synchronously, go to the GitHub sync_samples page .
For all the samples (both synchronous and asynchronous) on GitHub, go to Azure Event
Hubs client library for Python samples .
Quickstart: Azure Key Vault certificate
client library for Python
Article • 03/15/2023
Get started with the Azure Key Vault certificate client library for Python. Follow these
steps to install the package and try out example code for basic tasks. By using Key Vault
to store certificates, you avoid storing certificates in your code, which increases the
security of your app.
Prerequisites
An Azure subscription - create one for free .
Python 3.7+
Azure CLI
This quickstart assumes you're running Azure CLI or Azure PowerShell in a Linux terminal
window.
Sign in to Azure
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-
in page.
terminal
Azure CLI
az keyvault create --name <your-unique-keyvault-name> --resource-
group myResourceGroup
Console
export KEY_VAULT_NAME=<your-unique-keyvault-name>
Azure CLI
Python
import os
from azure.keyvault.certificates import CertificateClient, CertificatePolicy
from azure.identity import DefaultAzureCredential
keyVaultName = os.environ["KEY_VAULT_NAME"]
KVUri = "https://" + keyVaultName + ".vault.azure.net"
credential = DefaultAzureCredential()
client = CertificateClient(vault_url=KVUri, credential=credential)
policy = CertificatePolicy.get_default()
poller = client.begin_create_certificate(certificate_name=certificateName,
policy=policy)
certificate = poller.result()
print(" done.")
retrieved_certificate = client.get_certificate(certificateName)
poller = client.begin_delete_certificate(certificateName)
deleted_certificate = poller.result()
print(" done.")
terminal
python kv_certificates.py
If you encounter permissions errors, make sure you ran the az keyvault set-policy
or Set-AzKeyVaultAccessPolicy command.
Rerunning the code with the same certificate name may produce the error, "(Conflict) Certificate <name> is currently in a deleted but recoverable state." Use a different certificate name.
Code details
In the example code, the name of your key vault is expanded to the key vault URI, in the format https://<your-key-vault-name>.vault.azure.net .
Python
credential = DefaultAzureCredential()
client = CertificateClient(vault_url=KVUri, credential=credential)
Save a certificate
Once you've obtained the client object for the key vault, you can create a certificate
using the begin_create_certificate method:
Python
policy = CertificatePolicy.get_default()
poller = client.begin_create_certificate(certificate_name=certificateName,
policy=policy)
certificate = poller.result()
When Azure handles the request, it authenticates the caller's identity (the service
principal) using the credential object you provided to the client.
Retrieve a certificate
To read a certificate from Key Vault, use the get_certificate method:
Python
retrieved_certificate = client.get_certificate(certificateName)
You can also verify that the certificate has been set with the Azure CLI command az keyvault certificate show or the Azure PowerShell cmdlet Get-AzKeyVaultCertificate.
Delete a certificate
To delete a certificate, use the begin_delete_certificate method:
Python
poller = client.begin_delete_certificate(certificateName)
deleted_certificate = poller.result()
You can verify that the certificate is deleted with the Azure CLI command az keyvault
certificate show or the Azure PowerShell cmdlet Get-AzKeyVaultCertificate.
Once deleted, a certificate remains in a deleted but recoverable state for a time. If you
run the code again, use a different certificate name.
Clean up resources
If you want to also experiment with secrets and keys, you can reuse the Key Vault
created in this article.
Otherwise, when you're finished with the resources created in this article, use the
following command to delete the resource group and all its contained resources:
Azure CLI
Next steps
Overview of Azure Key Vault
Secure access to a key vault
Azure Key Vault developer's guide
Key Vault security overview
Authenticate with Key Vault
Quickstart: Azure Key Vault keys client
library for Python
Article • 03/15/2023
Get started with the Azure Key Vault client library for Python. Follow these steps to
install the package and try out example code for basic tasks. By using Key Vault to store
cryptographic keys, you avoid storing such keys in your code, which increases the
security of your app.
Prerequisites
An Azure subscription - create one for free .
Python 3.7+
Azure CLI
This quickstart assumes you're running Azure CLI or Azure PowerShell in a Linux terminal
window.
Sign in to Azure
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-
in page.
terminal
Azure CLI
az keyvault create --name <your-unique-keyvault-name> --resource-
group myResourceGroup
Console
export KEY_VAULT_NAME=<your-unique-keyvault-name>
Azure CLI
Python
import os
from azure.keyvault.keys import KeyClient
from azure.identity import DefaultAzureCredential
keyVaultName = os.environ["KEY_VAULT_NAME"]
KVUri = "https://" + keyVaultName + ".vault.azure.net"
credential = DefaultAzureCredential()
client = KeyClient(vault_url=KVUri, credential=credential)
print(" done.")
retrieved_key = client.get_key(keyName)
poller = client.begin_delete_key(keyName)
deleted_key = poller.result()
print(" done.")
terminal
python kv_keys.py
If you encounter permissions errors, make sure you ran the az keyvault set-policy
or Set-AzKeyVaultAccessPolicy command.
Rerunning the code with the same key name may produce the error, "(Conflict) Key
<name> is currently in a deleted but recoverable state." Use a different key name.
Code details
Authenticate and create a client
Application requests to most Azure services must be authorized. Using the
DefaultAzureCredential class provided by the Azure Identity client library is the
recommended approach for implementing passwordless connections to Azure services
in your code. DefaultAzureCredential supports multiple authentication methods and
determines which method should be used at runtime. This approach enables your app
to use different authentication methods in different environments (local vs. production)
without implementing environment-specific code.
In the example code, the name of your key vault is expanded using the value of the
KVUri variable, in the format: "https://<your-key-vault-name>.vault.azure.net".
Python
credential = DefaultAzureCredential()
client = KeyClient(vault_url=KVUri, credential=credential)
Save a key
Once you've obtained the client object for the key vault, you can store a key using the
create_rsa_key method:
Python
Calling a create method generates a call to the Azure REST API for the key vault.
When Azure handles the request, it authenticates the caller's identity (the service
principal) using the credential object you provided to the client.
Retrieve a key
To read a key from Key Vault, use the get_key method:
Python
retrieved_key = client.get_key(keyName)
You can also verify that the key has been set with the Azure CLI command az keyvault
key show or the Azure PowerShell cmdlet Get-AzKeyVaultKey.
Delete a key
To delete a key, use the begin_delete_key method:
Python
poller = client.begin_delete_key(keyName)
deleted_key = poller.result()
The begin_delete_key method is asynchronous and returns a poller object. Calling the
poller's result method waits for its completion.
You can verify that the key is deleted with the Azure CLI command az keyvault key show
or the Azure PowerShell cmdlet Get-AzKeyVaultKey.
Once deleted, a key remains in a deleted but recoverable state for a time. If you run the
code again, use a different key name.
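One simple way to sidestep that conflict is to generate a fresh key name on each run. This is an illustrative pattern, not part of the sample code:

```python
import uuid

# Append a short random suffix so reruns never collide with a soft-deleted key
keyName = f"quickstart-key-{uuid.uuid4().hex[:8]}"
print(keyName)
```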
Clean up resources
If you want to also experiment with certificates and secrets, you can reuse the Key Vault
created in this article.
Otherwise, when you're finished with the resources created in this article, use the
following command to delete the resource group and all its contained resources:
Azure CLI
Get started with the Azure Key Vault secret client library for Python. Follow these steps
to install the package and try out example code for basic tasks. By using Key Vault to
store secrets, you avoid storing secrets in your code, which increases the security of your
app.
Prerequisites
An Azure subscription - create one for free.
Python 3.7+.
Azure CLI or Azure PowerShell.
This quickstart assumes you're running Azure CLI or Azure PowerShell in a Linux terminal
window.
Sign in to Azure
Azure CLI
az login
If the CLI can open your default browser, it will do so and load an Azure sign-
in page.
terminal
Azure CLI
az keyvault create --name <your-unique-keyvault-name> --resource-
group myResourceGroup
Console
export KEY_VAULT_NAME=<your-unique-keyvault-name>
Azure CLI
Python
import os
from azure.keyvault.secrets import SecretClient
from azure.identity import DefaultAzureCredential
keyVaultName = os.environ["KEY_VAULT_NAME"]
KVUri = f"https://{keyVaultName}.vault.azure.net"
credential = DefaultAzureCredential()
client = SecretClient(vault_url=KVUri, credential=credential)
client.set_secret(secretName, secretValue)
print(" done.")
retrieved_secret = client.get_secret(secretName)
poller = client.begin_delete_secret(secretName)
deleted_secret = poller.result()
print(" done.")
terminal
python kv_secrets.py
If you encounter permissions errors, make sure you ran the az keyvault set-policy
or Set-AzKeyVaultAccessPolicy command.
Rerunning the code with the same secret name may produce the error, "(Conflict)
Secret <name> is currently in a deleted but recoverable state." Use a different
secret name.
Code details
In the example code, the name of your key vault is expanded using the value of the
KVUri variable, in the format: "https://<your-key-vault-name>.vault.azure.net".
Python
credential = DefaultAzureCredential()
client = SecretClient(vault_url=KVUri, credential=credential)
Save a secret
Once you've obtained the client object for the key vault, you can store a secret using the
set_secret method:
Python
client.set_secret(secretName, secretValue)
Calling set_secret generates a call to the Azure REST API for the key vault.
When Azure handles the request, it authenticates the caller's identity (the service
principal) using the credential object you provided to the client.
Retrieve a secret
To read a secret from Key Vault, use the get_secret method:
Python
retrieved_secret = client.get_secret(secretName)
You can also retrieve a secret with the Azure CLI command az keyvault secret show or
the Azure PowerShell cmdlet Get-AzKeyVaultSecret.
Delete a secret
To delete a secret, use the begin_delete_secret method:
Python
poller = client.begin_delete_secret(secretName)
deleted_secret = poller.result()
You can verify that the secret has been removed with the Azure CLI command az
keyvault secret show or the Azure PowerShell cmdlet Get-AzKeyVaultSecret.
Once deleted, a secret remains in a deleted but recoverable state for a time. If you run
the code again, use a different secret name.
Clean up resources
If you want to also experiment with certificates and keys, you can reuse the Key Vault
created in this article.
Otherwise, when you're finished with the resources created in this article, use the
following command to delete the resource group and all its contained resources:
Azure CLI
az group delete --resource-group myResourceGroup
Next steps
Overview of Azure Key Vault
Azure Key Vault developer's guide
Key Vault security overview
Authenticate with Key Vault
Send messages to and receive messages
from Azure Service Bus queues (Python)
Article • 01/20/2023
Note
This quickstart provides step-by-step instructions for a simple scenario of sending
messages to a Service Bus queue and receiving them. You can find pre-built
Python samples for Azure Service Bus in the Azure SDK for Python repository on
GitHub.
Prerequisites
If you're new to the service, see Service Bus overview before you do this quickstart.
An Azure subscription. To complete this tutorial, you need an Azure account. You
can activate your MSDN subscriber benefits or sign up for a free account.
Passwordless (Recommended)
Use the same account when you add the appropriate data role to your
resource.
Run the code in the same terminal or command prompt.
Note the queue name for your Service Bus namespace. You'll need that in the
code.
Note
This tutorial works with samples that you can copy and run using Python. For
instructions on how to create a Python application, see Create and deploy a
Python application to an Azure Website. For more information about installing
packages used in this tutorial, see the Python Installation Guide.
To create a namespace:
2. In the left navigation pane of the portal, select All services, select Integration from
the list of categories, hover the mouse over Service Bus, and then select Create on
the Service Bus tile.
3. In the Basics tab of the Create namespace page, follow these steps:
b. For Resource group, choose an existing resource group in which the namespace
will live, or create a new one.
c. Enter a name for the namespace. The namespace name should adhere to the
following naming conventions:
The name must be unique across Azure. The system immediately checks to
see if the name is available.
The name length is at least 6 and at most 50 characters.
The name can contain only letters, numbers, and hyphens ("-").
The name must start with a letter and end with a letter or number.
The name doesn't end with "-sb" or "-mgmt".
d. For Location, choose the region in which your namespace should be hosted.
e. For Pricing tier, select the pricing tier (Basic, Standard, or Premium) for the
namespace. For this quickstart, select Standard.
Important
If you want to use topics and subscriptions, choose either Standard or
Premium. Topics/subscriptions aren't supported in the Basic pricing tier.
If you selected the Premium pricing tier, specify the number of messaging
units. The premium tier provides resource isolation at the CPU and memory
level so that each workload runs in isolation. This resource container is called a
messaging unit. A premium namespace has at least one messaging unit. You
can select 1, 2, 4, 8 or 16 messaging units for each Service Bus Premium
namespace. For more information, see Service Bus Premium Messaging.
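The naming rules above can be checked locally before you submit the form. The following validator is an unofficial sketch of those rules, not an Azure API:

```python
import re

def is_valid_namespace_name(name: str) -> bool:
    """Check a candidate Service Bus namespace name against the documented rules
    (length 6-50; letters, numbers, hyphens only; starts with a letter; ends with
    a letter or number; doesn't end with -sb or -mgmt). Uniqueness across Azure
    can only be checked by the service itself."""
    if not (6 <= len(name) <= 50):
        return False
    if not re.fullmatch(r"[A-Za-z][A-Za-z0-9-]*[A-Za-z0-9]", name):
        return False
    if name.lower().endswith(("-sb", "-mgmt")):
        return False
    return True

print(is_valid_namespace_name("contoso-bus-01"))  # True
print(is_valid_namespace_name("my-sb"))           # False: too short and ends with -sb
```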
3. Enter a name for the queue, and leave the other values with their defaults.
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to a Service Bus namespace. You don't
need to worry about hard-coded connection strings in your code, in a configuration
file, or in secure storage such as Azure Key Vault.
The second option shows you how to use a connection string to connect to a Service
Bus namespace. If you are new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless (Recommended)
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. Learn more
about the available scopes for role assignments on the scope overview page.
The following example assigns the Azure Service Bus Data Owner role to your user
account, which provides full access to Azure Service Bus resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Service Bus Data Owner: Enables data access to Service Bus namespace
and its entities (queues, topics, subscriptions, and filters). A member of this
role can send and receive messages from queues or topics/subscriptions.
Azure Service Bus Data Sender: Use this role to grant send access to a Service
Bus namespace and its entities.
Azure Service Bus Data Receiver: Use this role to grant receive access to a
Service Bus namespace and its entities.
If you want to create a custom role, see Rights required for Service Bus operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
1. If you don't have the Service Bus Namespace page open in the Azure portal,
locate your Service Bus namespace using the main search bar or left
navigation.
2. On the overview page, select Access control (IAM) from the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result. Then
choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Use pip to install packages
Passwordless (Recommended)
1. To install the required Python packages for this Service Bus tutorial, open a
command prompt that has Python in its path, change the directory to the
folder where you want to have your samples.
shell
Passwordless (Recommended)
Python
import asyncio
from azure.servicebus.aio import ServiceBusClient
from azure.servicebus import ServiceBusMessage
from azure.identity.aio import DefaultAzureCredential
Python
FULLY_QUALIFIED_NAMESPACE = "FULLY_QUALIFIED_NAMESPACE"
QUEUE_NAME = "QUEUE_NAME"
credential = DefaultAzureCredential()
Important
Python
The sender is an object that acts as a client for the queue you created. You'll
create it later and pass it as an argument to this function.
Python
Python
            batch_message.add_message(ServiceBusMessage("Message inside a ServiceBusMessageBatch"))
        except ValueError:
            # ServiceBusMessageBatch object reaches max_size.
            # New ServiceBusMessageBatch object can be created here to send more data.
            break
    # Send the batch of messages to the queue
    await sender.send_messages(batch_message)
    print("Sent a batch of 10 messages")
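Independent of the SDK, the fill-until-full batching pattern in this loop can be sketched with a toy batch class. The class and max_size value here are illustrative stand-ins, not the real ServiceBusMessageBatch:

```python
class MessageBatch:
    """Toy stand-in for ServiceBusMessageBatch: rejects messages past max_size."""
    def __init__(self, max_size):
        self.max_size = max_size
        self.messages = []
        self.size = 0

    def add_message(self, msg):
        if self.size + len(msg) > self.max_size:
            raise ValueError("batch full")
        self.messages.append(msg)
        self.size += len(msg)

batch = MessageBatch(max_size=50)
sent_batches = []
for i in range(10):
    try:
        batch.add_message(f"message {i}")
    except ValueError:
        sent_batches.append(batch)          # "send" the full batch
        batch = MessageBatch(max_size=50)   # start a new one
        batch.add_message(f"message {i}")
sent_batches.append(batch)                  # send whatever remains
print(sum(len(b.messages) for b in sent_batches))  # 10
```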
6. Create a Service Bus client and then a queue sender object to send messages.
Python
Python
asyncio.run(run())
print("Done sending messages")
print("-----------------------")
Passwordless (Recommended)
1. Similar to the send sample, add import statements, define constants that you
should replace with your own values, and define a credential.
Python
import asyncio
FULLY_QUALIFIED_NAMESPACE = "FULLY_QUALIFIED_NAMESPACE"
QUEUE_NAME = "QUEUE_NAME"
credential = DefaultAzureCredential()
2. Create a Service Bus client and then a queue receiver object to receive
messages.
Python
Python
asyncio.run(run())
shell
Console
In the Azure portal, navigate to your Service Bus namespace. On the Overview page,
verify that the incoming and outgoing message counts are 16. If you don't see the
counts, refresh the page after waiting for a few minutes.
Select the queue on this Overview page to navigate to the Service Bus Queue page.
You can also see the incoming and outgoing message count on this page. You also see
other information such as the current size of the queue and active message count.
Next steps
See the following documentation and samples:
Note
Prerequisites
An Azure subscription .
Python 3.7 or higher, with the Azure Python SDK package installed.
Note
This tutorial works with samples that you can copy and run using Python. For
instructions on how to create a Python application, see Create and deploy a
Python application to an Azure Website. For more information about installing
packages used in this tutorial, see the Python Installation Guide.
To create a namespace:
2. In the left navigation pane of the portal, select All services, select Integration from
the list of categories, hover the mouse over Service Bus, and then select Create on
the Service Bus tile.
3. In the Basics tab of the Create namespace page, follow these steps:
b. For Resource group, choose an existing resource group in which the namespace
will live, or create a new one.
c. Enter a name for the namespace. The namespace name should adhere to the
following naming conventions:
The name must be unique across Azure. The system immediately checks to
see if the name is available.
The name length is at least 6 and at most 50 characters.
The name can contain only letters, numbers, and hyphens ("-").
The name must start with a letter and end with a letter or number.
The name doesn't end with "-sb" or "-mgmt".
d. For Location, choose the region in which your namespace should be hosted.
e. For Pricing tier, select the pricing tier (Basic, Standard, or Premium) for the
namespace. For this quickstart, select Standard.
Important
If you selected the Premium pricing tier, specify the number of messaging
units. The premium tier provides resource isolation at the CPU and memory
level so that each workload runs in isolation. This resource container is called a
messaging unit. A premium namespace has at least one messaging unit. You
can select 1, 2, 4, 8 or 16 messaging units for each Service Bus Premium
namespace. For more information, see Service Bus Premium Messaging.
5. You see the home page for your Service Bus namespace.
3. Enter a name for the topic. Leave the other options with their default values.
4. Select Create.
Create a subscription to the topic
1. Select the topic that you created in the previous section.
The first option shows you how to use your security principal in Azure Active Directory
and role-based access control (RBAC) to connect to a Service Bus namespace. You don't
need to worry about hard-coded connection strings in your code, in a configuration
file, or in secure storage such as Azure Key Vault.
The second option shows you how to use a connection string to connect to a Service
Bus namespace. If you are new to Azure, you may find the connection string option
easier to follow. We recommend using the passwordless option in real-world
applications and production environments. For more information, see Authentication
and authorization. You can also read more about passwordless authentication on the
overview page.
Passwordless (Recommended)
The following example assigns the Azure Service Bus Data Owner role to your user
account, which provides full access to Azure Service Bus resources. In a real
scenario, follow the Principle of Least Privilege to give users only the minimum
permissions needed for a more secure production environment.
Azure Service Bus Data Owner: Enables data access to Service Bus namespace
and its entities (queues, topics, subscriptions, and filters). A member of this
role can send and receive messages from queues or topics/subscriptions.
Azure Service Bus Data Sender: Use this role to grant send access to a Service
Bus namespace and its entities.
Azure Service Bus Data Receiver: Use this role to grant receive access to a
Service Bus namespace and its entities.
If you want to create a custom role, see Rights required for Service Bus operations.
Important
In most cases, it will take a minute or two for the role assignment to propagate
in Azure. In rare cases, it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
1. If you don't have the Service Bus Namespace page open in the Azure portal,
locate your Service Bus namespace using the main search bar or left
navigation.
2. On the overview page, select Access control (IAM) from the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this example,
search for Azure Service Bus Data Owner and select the matching result. Then
choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the dialog, search for your Azure AD username (usually your user@domain
email address) and then choose Select at the bottom of the dialog.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Code setup
Passwordless (Recommended)
To follow this quickstart using passwordless authentication and your own Azure
account:
Use the same account when you add the appropriate role to your resource
later in the tutorial.
Run the tutorial code in the same terminal or command prompt.
Important
Make sure you sign in with az login. The DefaultAzureCredential class in the
passwordless code uses the Azure CLI credentials to authenticate with Azure
Active Directory (Azure AD).
Passwordless (Recommended)
1. To install the required Python packages for this Service Bus tutorial, open a
command prompt that has Python in its path. Change the directory to the
folder where you want to have your samples.
2. Install packages:
shell
Open your favorite editor, such as Visual Studio Code , create a file send.py, and add
the following code into it.
Passwordless (Recommended)
Python
import asyncio
from azure.servicebus.aio import ServiceBusClient
from azure.servicebus import ServiceBusMessage
from azure.identity.aio import DefaultAzureCredential
Python
FULLY_QUALIFIED_NAMESPACE = "FULLY_QUALIFIED_NAMESPACE"
TOPIC_NAME = "TOPIC_NAME"
credential = DefaultAzureCredential()
Important
In the preceding code, you used the Azure Identity client library's
DefaultAzureCredential class. When the app runs locally during development,
Python
Python
Python
            batch_message.add_message(ServiceBusMessage("Message inside a ServiceBusMessageBatch"))
        except ValueError:
            # ServiceBusMessageBatch object reaches max_size.
            # New ServiceBusMessageBatch object can be created here to send more data.
            break
    # Send the batch of messages to the topic
    await sender.send_messages(batch_message)
    print("Sent a batch of 10 messages")
6. Create a Service Bus client and then a topic sender object to send messages.
Python
asyncio.run(run())
print("Done sending messages")
print("-----------------------")
Open your favorite editor, such as Visual Studio Code , create a file recv.py, and add
the following code into it.
Passwordless (Recommended)
1. Similar to the send sample, add import statements, define constants that you
should replace with your own values, and define a credential.
Python
import asyncio
from azure.servicebus.aio import ServiceBusClient
from azure.identity.aio import DefaultAzureCredential
FULLY_QUALIFIED_NAMESPACE = "FULLY_QUALIFIED_NAMESPACE"
SUBSCRIPTION_NAME = "SUBSCRIPTION_NAME"
TOPIC_NAME = "TOPIC_NAME"
credential = DefaultAzureCredential()
2. Create a Service Bus client and then a subscription receiver object to receive
messages.
Python
Python
asyncio.run(run())
shell
Console
Sent a single message
Sent a list of 5 messages
Sent a batch of 10 messages
Done sending messages
-----------------------
Received: Single Message
Received: Message in list
Received: Message in list
Received: Message in list
Received: Message in list
Received: Message in list
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
Received: Message inside a ServiceBusMessageBatch
In the Azure portal, navigate to your Service Bus namespace. On the Overview page,
verify that the incoming and outgoing message counts are 16. If you don't see the
counts, refresh the page after waiting for a few minutes.
Select the topic in the bottom pane to see the Service Bus Topic page for your topic. On
this page, you should see three incoming and three outgoing messages in the Messages
chart.
On this page, if you select a subscription, you get to the Service Bus Subscription page.
You can see the active message count, dead-letter message count, and more on this
page. In this example, all the messages have been received, so the active message count
is zero.
If you comment out the receive code, you'll see the active message count as 16.
Next steps
See the following documentation and samples:
Get started with the Azure Blob Storage client library for Python to manage blobs and
containers. Follow these steps to install the package and try out example code for basic
tasks in an interactive console app.
Prerequisites
Azure account with an active subscription - create an account for free
Azure Storage account - create a storage account
Python 3.6+
Setting up
This section walks you through preparing a project to work with the Azure Blob Storage
client library for Python.
1. In a console window (such as PowerShell or Bash), create a new directory for the
project:
Console
mkdir blob-quickstart
Console
cd blob-quickstart
Console
Python
try:
print("Azure Blob Storage Python quickstart sample")
Object model
Azure Blob Storage is optimized for storing massive amounts of unstructured data.
Unstructured data is data that doesn't adhere to a particular data model or definition,
such as text or binary data. Blob storage offers three types of resources:
Code examples
These example code snippets show you how to do the following tasks with the Azure
Blob Storage client library for Python:
You can also authorize requests to Azure Blob Storage by using the account access key.
However, this approach should be used with caution. Developers must be diligent to
never expose the access key in an unsecure location. Anyone who has the access key is
able to authorize requests against the storage account, and effectively has access to all
the data. DefaultAzureCredential offers improved management and security benefits
over the account key to allow passwordless authentication. Both options are
demonstrated in the following example.
Passwordless (Recommended)
The order and locations in which DefaultAzureCredential looks for credentials can
be found in the Azure Identity library overview.
For example, your app can authenticate using your Azure CLI sign-in credentials
when developing locally. Your app can then use a managed identity once it has
been deployed to Azure. No code changes are required for this transition.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Blob Data Contributor role to your
user account, which provides both read and write access to blob data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Blob Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
You can authorize access to data in your storage account using the following steps:
1. Make sure you're authenticated with the same Azure AD account you assigned
the role to on your storage account. You can authenticate via the Azure CLI,
Visual Studio Code, or Azure PowerShell.
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
Python
3. Add this code inside the try block. When the code runs on your local
workstation, DefaultAzureCredential uses the developer credentials of the
first available tool you're signed in to, such as the Azure CLI or Visual
Studio Code, to authenticate to Azure.
Python
account_url = "https://<storageaccountname>.blob.core.windows.net"
default_credential = DefaultAzureCredential()
4. Make sure to update the storage account name in the URI of your
BlobServiceClient object. The storage account name can be found on the
Create a container
Decide on a name for the new container. The code below appends a UUID value to the
container name to ensure that it's unique.
Important
Call the create_container method to actually create the container in your storage
account.
Python
# Create a unique name for the container
container_name = str(uuid.uuid4())
To learn more about creating a container, and to explore more code samples, see Create
a blob container with Python.
Python
# Create a blob client using the local file name as the name for the blob
blob_client = blob_service_client.get_blob_client(container=container_name,
blob=local_file_name)
Python
print("\nListing blobs...")
To learn more about listing blobs, and to explore more code samples, see List blobs with
Python.
Download blobs
Download the previously created blob by calling the download_blob method. The
example code adds a suffix of "DOWNLOAD" to the file name so that you can see both
files in local file system.
Python
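The snippet isn't reproduced above; the naming pattern it describes can be sketched as follows, with a hypothetical local file name:

```python
import os

# Hypothetical file name from the upload step
local_file_name = "quickstartdemo.txt"

# Add a DOWNLOAD suffix so the downloaded copy sits next to the original
download_file_path = os.path.join(".", "data", local_file_name.replace(".txt", "DOWNLOAD.txt"))
print(download_file_path)
```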
Delete a container
The following code cleans up the resources the app created by removing the entire
container using the delete_container method. You can also delete the local files, if you
like.
The app pauses for user input by calling input() before it deletes the blob, container,
and local files. Verify that the resources were created correctly before they're deleted.
Python
# Clean up
print("\nPress the Enter key to begin clean up")
input()
print("Done")
To learn more about deleting a container, and to explore more code samples, see Delete
and restore a blob container with Python.
Navigate to the directory containing the blob-quickstart.py file, then execute the
following python command to run the app:
Console
python blob-quickstart.py
The output of the app is similar to the following example (UUID values omitted for
readability):
Output
Listing blobs...
quickstartUUID.txt
Downloading blob to
./data/quickstartUUIDDOWNLOAD.txt
Before you begin the cleanup process, check your data folder for the two files. You can
compare them and observe that they're identical.
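You can automate that comparison with the standard library. This sketch simulates the uploaded file and its downloaded copy locally rather than touching the storage account:

```python
import filecmp
import os
import tempfile

# Simulate the uploaded file and its downloaded copy with identical contents
with tempfile.TemporaryDirectory() as data_dir:
    uploaded = os.path.join(data_dir, "quickstart.txt")
    downloaded = os.path.join(data_dir, "quickstartDOWNLOAD.txt")
    for path in (uploaded, downloaded):
        with open(path, "w") as f:
            f.write("Hello, World!")
    identical = filecmp.cmp(uploaded, downloaded, shallow=False)
print(identical)  # True
```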
Clean up resources
After you've verified the files and finished testing, press the Enter key to delete the test
files along with the container you created in the storage account. You can also use Azure
CLI to delete resources.
Next steps
In this quickstart, you learned how to upload, download, and list blobs using Python.
To learn more, see the Azure Blob Storage client libraries for Python.
For tutorials, samples, quickstarts, and other documentation, visit Azure for Python
Developers.
Quickstart: Azure Queue Storage client
library for Python
Article • 06/29/2023
Get started with the Azure Queue Storage client library for Python. Azure Queue Storage
is a service for storing large numbers of messages for later retrieval and processing.
Follow these steps to install the package and try out example code for basic tasks.
Use the Azure Queue Storage client library for Python to:
Create a queue
Add messages to a queue
Peek at messages in a queue
Update a message in a queue
Get the queue length
Receive messages from a queue
Delete messages from a queue
Delete a queue
Prerequisites
Azure subscription - create one for free
Azure Storage account - create a storage account
Python 3.6+
Setting up
This section walks you through preparing a project to work with the Azure Queue
Storage client library for Python.
1. In a console window (such as cmd, PowerShell, or Bash), create a new directory for
the project.
Console
mkdir queues-quickstart
Console
cd queues-quickstart
Console
3. Create the structure for the program, including basic exception handling
Python
try:
print("Azure Queue storage - Python quickstart sample")
# Quickstart code goes here
except Exception as ex:
print('Exception:')
print(ex)
4. Save the new file as queues-quickstart.py in the queues-quickstart directory.
Authenticate to Azure
Application requests to most Azure services must be authorized. Using the
DefaultAzureCredential class provided by the Azure Identity client library is the
recommended approach for implementing passwordless connections to Azure services
in your code.
You can also authorize requests to Azure services using passwords, connection strings,
or other credentials directly. However, this approach should be used with caution.
Developers must be diligent to never expose these secrets in an unsecure location.
Anyone who gains access to the password or secret key is able to authenticate.
DefaultAzureCredential offers improved management and security benefits over the
account key to allow passwordless authentication. Both options are demonstrated in the
following example.
Passwordless (Recommended)
For example, your app can authenticate using your Visual Studio Code sign-in
credentials when developing locally, and then use a managed identity once it has
been deployed to Azure. No code changes are required for this transition.
When developing locally, make sure that the user account that is accessing the
queue data has the correct permissions. You'll need Storage Queue Data
Contributor to read and write queue data. To assign yourself this role, you'll need
to be assigned the User Access Administrator role, or another role that includes the
Microsoft.Authorization/roleAssignments/write action. You can assign Azure RBAC
roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn
more about the available scopes for role assignments on the scope overview page.
In this scenario, you'll assign permissions to your user account, scoped to the
storage account, to follow the Principle of Least Privilege. This practice gives users
only the minimum permissions needed and creates more secure production
environments.
The following example will assign the Storage Queue Data Contributor role to your
user account, which provides both read and write access to queue data in your
storage account.
Important
In most cases it will take a minute or two for the role assignment to propagate
in Azure, but in rare cases it may take up to eight minutes. If you receive
authentication errors when you first run your code, wait a few moments and
try again.
Azure portal
1. In the Azure portal, locate your storage account using the main search
bar or left navigation.
2. On the storage account overview page, select Access control (IAM) from
the left-hand menu.
3. On the Access control (IAM) page, select the Role assignments tab.
4. Select + Add from the top menu and then Add role assignment from the
resulting drop-down menu.
5. Use the search box to filter the results to the desired role. For this
example, search for Storage Queue Data Contributor and select the
matching result and then choose Next.
6. Under Assign access to, select User, group, or service principal, and then
choose + Select members.
7. In the Select members dialog, search for your user account, select it, and
then choose Select.
8. Select Review + assign to go to the final page, and then Review + assign
again to complete the process.
Object model
Azure Queue Storage is a service for storing large numbers of messages. A queue
message can be up to 64 KB in size. A queue may contain millions of messages, up to
the total capacity limit of a storage account. Queues are commonly used to create a
backlog of work to process asynchronously. Queue Storage offers three types of
resources:
Storage account: All access to Azure Storage is done through a storage account.
For more information about storage accounts, see Storage account overview.
Queue: A queue contains a set of messages. All messages must be in a queue.
Note that the queue name must be all lowercase. For information on naming
queues, see Naming Queues and Metadata.
Message: A message, in any format, of up to 64 KB. A message can remain in the
queue for a maximum of 7 days. For version 2017-07-29 or later, the maximum
time-to-live can be any positive number, or -1 indicating that the message doesn't
expire. If this parameter is omitted, the default time-to-live is seven days.
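As an illustration of the 64 KB limit, a payload's encoded size can be checked locally before sending. The helper below is our own sketch, not part of the Azure SDK:

```python
# Hypothetical helper (not part of the Azure SDK): check whether a string
# fits within the 64 KB queue message limit described above.
MAX_MESSAGE_BYTES = 64 * 1024

def fits_in_queue_message(content: str) -> bool:
    # The limit applies to the stored message text; note that Base64-encoding
    # binary payloads inflates them by roughly 4/3.
    return len(content.encode("utf-8")) <= MAX_MESSAGE_BYTES
```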
Code examples
These example code snippets show you how to do the following actions with the Azure
Queue Storage client library for Python:
Passwordless (Recommended)
Azure CLI
Sign in to Azure through the Azure CLI using the following command:
Azure CLI
az login
Once authenticated, you can create and authorize a QueueClient object using
DefaultAzureCredential to access queue data in the storage account.
DefaultAzureCredential automatically discovers and uses the account you signed in with.
Python
Decide on a name for the queue and create an instance of the QueueClient class,
using DefaultAzureCredential for authorization. We use this client object to create
and interact with the queue resource in the storage account.
) Important
Queue names may only contain lowercase letters, numbers, and hyphens, and
must begin with a letter or a number. Each hyphen must be preceded and
followed by a non-hyphen character. The name must also be between 3 and 63
characters long. For more information about naming queues, see Naming
queues and metadata.
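The naming rules above can be captured in a small validation helper. This is our own sketch for illustration, not a function from the Azure SDK:

```python
import re

# Sketch of the queue naming rules: 3-63 characters; lowercase letters,
# numbers, and hyphens only; starts with a letter or number; every hyphen
# is surrounded by non-hyphen characters (so no leading, trailing, or
# consecutive hyphens).
_QUEUE_NAME = re.compile(r"[a-z0-9](?:-?[a-z0-9])*")

def is_valid_queue_name(name: str) -> bool:
    return 3 <= len(name) <= 63 and _QUEUE_NAME.fullmatch(name) is not None
```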
Add the following code inside the try block, and make sure to replace the
<storage-account-name> placeholder value:
Python
# queue_name holds the name you decided on above
account_url = "https://<storage-account-name>.queue.core.windows.net"
default_credential = DefaultAzureCredential()
queue_client = QueueClient(account_url, queue_name=queue_name, credential=default_credential)
Queue messages are stored as text. If you want to store binary data, set up Base64
encoding and decoding functions before putting a message in the queue.
You can configure Base64 encoding and decoding functions when creating the client
object:
Python
from azure.storage.queue import BinaryBase64EncodePolicy, BinaryBase64DecodePolicy

queue_client = QueueClient(
    account_url,
    queue_name=queue_name,
    credential=default_credential,
    message_encode_policy=BinaryBase64EncodePolicy(),
    message_decode_policy=BinaryBase64DecodePolicy(),
)
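The encoding itself is a standard Base64 round-trip, which you can sketch locally with Python's standard library (this snippet doesn't touch Azure at all):

```python
import base64

# Queue messages are stored as text, so binary payloads must be Base64-
# encoded before sending and decoded after receiving. A local sketch of
# that round-trip:
payload = b"\x00\x01 binary payload"
encoded = base64.b64encode(payload).decode("ascii")  # text safe to enqueue
decoded = base64.b64decode(encoded)                  # original bytes back
```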
Create a queue
Using the QueueClient object, call the create_queue method to create the queue in your
storage account.
Python
# Create the queue in the storage account
print("Creating queue: " + queue_name)
queue_client.create_queue()
Python
properties = queue_client.get_queue_properties()
count = properties.approximate_message_count
print("Message count: " + str(count))
The result is approximate since messages can be added or removed after the service
responds to your request.
When calling the receive_messages method, you can optionally specify a value for
max_messages , which is the number of messages to retrieve from the queue. The default
is 1 message and the maximum is 32 messages. You can also specify a value for
visibility_timeout , which hides the messages from other operations for the specified
timeout period.
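To make the visibility_timeout semantics concrete, here is a toy in-memory model (our own illustration, not the Azure SDK): receiving a message doesn't remove it from the queue, it only hides the message until the timeout elapses or the message is deleted.

```python
import time

# Toy in-memory queue (illustration only, not the Azure SDK) modeling
# visibility_timeout: receiving hides a message instead of removing it.
class ToyQueue:
    def __init__(self):
        self._messages = []  # list of [content, visible_at_timestamp]

    def send(self, content):
        self._messages.append([content, 0.0])

    def receive(self, max_messages=1, visibility_timeout=30.0):
        now = time.monotonic()
        batch = []
        for msg in self._messages:
            if len(batch) == max_messages:
                break
            if msg[1] <= now:                      # message is visible
                msg[1] = now + visibility_timeout  # hide it for the timeout
                batch.append(msg[0])
        return batch

q = ToyQueue()
q.send("First message")
q.send("Second message")
first = q.receive(max_messages=2, visibility_timeout=60.0)  # both received
second = q.receive(max_messages=2)                          # nothing visible now
```

If the hidden messages are never deleted, they become visible again after the timeout, which is exactly the re-delivery behavior the app below relies on.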
The app pauses for user input by calling input before it processes and deletes the
messages. Verify in your Azure portal that the resources were created correctly, before
they're deleted. Any messages not explicitly deleted eventually become visible in the
queue again for another chance to process them.
Python
# Receive the messages from the queue
messages = queue_client.receive_messages()

print("\nPress Enter key to 'process' messages and delete them from the queue...")
input()
for msg_batch in messages.by_page():
for msg in msg_batch:
# "Process" the message
print(msg.content)
# Let the service know we're finished with
# the message and it can be safely deleted.
queue_client.delete_message(msg)
Delete a queue
The following code cleans up the resources the app created by deleting the queue using
the delete_queue method.
Add this code to the end of the try block and save the file:
Python
# Clean up
print("Deleting queue...")
queue_client.delete_queue()
print("Done")
To run the app, use the following command:
Console
python queues-quickstart.py
Output
Azure Queue Storage client library - Python quickstart sample
Creating queue: quickstartqueues-<UUID>
Press Enter key to 'process' messages and delete them from the queue...
First message
Second message
Third message has been updated
Deleting queue...
Done
When the app pauses before receiving messages, check your storage account in the
Azure portal. Verify the messages are in the queue.
Press the Enter key to receive and delete the messages. When prompted, press the
Enter key again to delete the queue and finish the demo.
Next steps
In this quickstart, you learned how to create a queue and add messages to it using
Python code. Then you learned to peek, retrieve, and delete messages. Finally, you
learned how to delete a message queue.
For related code samples using deprecated Python version 2 SDKs, see Code
samples using Python version 2.
To learn more, see the Azure Storage libraries for Python.
For more Azure Queue Storage sample apps, see Azure Queue Storage client library for Python - samples.
Azure developer documentation
Find the languages and tools you need to develop on Azure.
OVERVIEW: Host applications on Azure
OVERVIEW: Connect apps to Azure services
CONCEPT: Create resources in Azure
ARCHITECTURE: Key concepts for building Azure apps
CONCEPT: Understand Azure billing
CONCEPT: Versioning policy for Azure services, SDKs, and CLI tools
Python
Deploy serverless Python apps to Azure Functions
Deploy Python apps to Azure App Service
Manage storage blobs with the Azure SDK for Python
Use Python to query Azure SQL Database
Create an Azure Data Factory using Python
See more in the Python developer center

JavaScript
Develop a static website
Deploy to Azure App Service
Deploy a serverless application
Deploy Docker containers
Migrate a MongoDB app to Azure Cosmos DB
See more in the JavaScript developer center
Java
Install the JDK for Azure and Azure Stack
Deploy an app to Azure Spring Apps by using the Azure portal
Create a Java app in Azure App Service
Use Spring Boot Starter for Azure Active Directory
Migrate Java Applications to Azure
See more in the Java developer center

.NET
Introduction to Azure and .NET
Configure your .NET development environment for Azure
Deploy an ASP.NET web app
Build a serverless function
Azure SDK for .NET
See more in the .NET developer center
Go
Install the Azure SDK for Go
Authenticate your app
Use Blob storage
Azure SDK for Go code samples
See more in the Go developer center

Azure PowerShell
What is the new Az module?
Migrate from AzureRM to Az
Install
Sign in
Persist credential contexts
See more in the Azure PowerShell developer center
Developer Tools
Use your favorite development tools when working with Azure
Maven
Use Maven to automate the way you build and
manage Java projects. It works with other
languages too.
Ansible
Use Ansible to automate cloud provisioning, configuration management, and
application deployments.

Chef
Use Chef to transform your virtual machine infrastructure on Azure into code.
Azure for .NET developers
Learn to use the Azure SDK for .NET. Browse API reference, sample code, tutorials, quickstarts,
conceptual articles and more. Know that .NET 💜 Azure.
OVERVIEW: Introduction to Azure and .NET
QUICKSTART: Create an ASP.NET Core web app in Azure
QUICKSTART: Build a serverless function
TUTORIAL: ASP.NET Core and Docker
DEPLOY: Deploy a .NET app with Azure DevOps
TUTORIAL: Authentication end-to-end in App Service
Featured content
Learn to develop .NET apps leveraging a variety of Azure services.
Are you interested in contributing to the .NET docs? For more information, see our contributor guide.
Azure for Java developer documentation
Get started developing apps for the cloud with these tutorials and tools for Java developers.
Tools
Maven
Gradle
Azure CLI
Jenkins on Azure
Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.
Azure for JavaScript & Node.js developers
Explore the power of JavaScript on Azure through Quickstarts, How-To Guides, code
samples, and more.
New to Azure?
b GET STARTED
What is Azure?
Azure Fundamentals
Install Node.js
g TUTORIAL
Migrate to serverless
g TUTORIAL
Web + Data
f QUICKSTART
Storage on Azure
Databases on Azure
GraphQL on Azure
Serverless API + DB
AI/ML
f QUICKSTART
Language detection
Speech to text
Text to speech
Image analysis
Use Azure client libraries (SDK)
b GET STARTED
Samples browser
Developer Guides
b GET STARTED
Developer tools
b GET STARTED
Windows Terminal
Serverless functions
c HOW-TO GUIDE
Web apps
g TUTORIAL
GitHub Actions
CI/CD pipeline
Logs
Containers
g TUTORIAL
f QUICKSTART
SQL databases
Machine learning
c HOW-TO GUIDE
Create an ML experiment
Create ML pipelines
b GET STARTED
Get started
Developer tools
Data
f QUICKSTART
Virtual Machines
f QUICKSTART
Serverless
f QUICKSTART
Create a Go serverless function in Azure
Containers
f QUICKSTART
Open source
i REFERENCE