Real Time Use Case Scenario in Azure Data Factory

The document discusses migrating tables with different schemas from an on-premises AdventureWorks2016 database to Azure SQL Cloud. It provides a six-step process to configure the data factory to: 1) use Lookup and ForEach activities to retrieve schema and table names dynamically from the source database; 2) use a Copy Data activity inside the ForEach loop to migrate each table to the destination Azure SQL database; and 3) configure the source and sink datasets dynamically, using parameters for the schema and table names, so that all tables from the different schemas are migrated to Azure SQL Cloud with their schema names retained.

Uploaded by Piyush Raj

Migrate all the tables with different schemas to Azure SQL Cloud, keeping the schema name.


Below is the list of tables that need to be migrated from the on-premises AdventureWorks2016 database to Azure SQL Cloud. The database has several different schemas, such as HumanResources, Person, and Production.

1. Go to Azure Data Factory and open the Author tab, which allows you to develop your data factory pipelines.
2. Take a Lookup activity from the General section of the Activities pane and configure it just as below.
In the Dataset, configure the source dataset with a linked service that uses a self-hosted integration runtime and points to the on-premises database. The lookup should return the schema name and table name of every table in the database.
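A query along the following lines (a sketch; the exact filter can vary) can serve as the Lookup activity's source query to list every user table with its schema:

```sql
-- Returns one row per user table, with its schema name and table name.
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';
```

Remember to untick "First row only" on the Lookup activity so that all rows, not just the first, are returned to the ForEach activity.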
3. Then drag and drop the ForEach activity from the Iteration & conditionals section and connect the Lookup activity to the ForEach activity so that the schema names and table names can be accessed inside the ForEach activity.
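In the ForEach activity's Items setting, the Lookup output is referenced with an expression like the one below. The activity name `LookupTableList` is an assumption for this sketch; use whatever you named your Lookup activity:

```json
{
    "items": {
        "value": "@activity('LookupTableList').output.value",
        "type": "Expression"
    }
}
```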
4. Now take the Copy Data activity and configure it inside the ForEach activity so that it loops over all the tables with different schemas inside the AdventureWorks database.

In the Copy Data activity, configure the source and sink with dynamic datasets.

While configuring the source dataset dynamically, we need to define two parameters, just as below, to access all the tables with different schemas dynamically.

We can then use these parameters to build the table name dynamically, just as below.
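As a sketch, a parameterized source dataset for SQL Server could look like the JSON below. The dataset, linked-service, and parameter names (`DS_Source_Dynamic`, `LS_OnPrem_SqlServer`, `SchemaName`, `TableName`) are assumptions chosen for this example:

```json
{
    "name": "DS_Source_Dynamic",
    "properties": {
        "type": "SqlServerTable",
        "linkedServiceName": {
            "referenceName": "LS_OnPrem_SqlServer",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "SchemaName": { "type": "string" },
            "TableName": { "type": "string" }
        },
        "typeProperties": {
            "schema": { "value": "@dataset().SchemaName", "type": "Expression" },
            "table": { "value": "@dataset().TableName", "type": "Expression" }
        }
    }
}
```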
5. Now configure the Sink (destination). Its linked service should use the AutoResolve integration runtime, because we are moving the data to Azure SQL Database.
6. Here, too, we need two parameters for the dataset, one pointing to the schema name and the other to the table name, just as below.
After that, we can use them to configure the sink dataset dynamically.
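Inside the ForEach loop, the Copy Data activity passes the current item's columns into the dataset parameters. Assuming the lookup returned columns named TABLE_SCHEMA and TABLE_NAME, and a sink dataset named `DS_Sink_Dynamic` (both names are assumptions), the sink dataset reference would look roughly like this:

```json
{
    "referenceName": "DS_Sink_Dynamic",
    "type": "DatasetReference",
    "parameters": {
        "SchemaName": "@item().TABLE_SCHEMA",
        "TableName": "@item().TABLE_NAME"
    }
}
```

The source dataset reference is configured the same way, so each iteration of the loop copies one schema-qualified table.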

The final sink configuration looks like this.

With this process we can move all the mentioned tables in the AdventureWorks2016 database, across their different schemas, to the Azure SQL Cloud database, just as below, with the schema names retained as well.
