Real Time Use Case Scenario in Azure Data Factory
1. Go to Azure Data Factory and open the Author tab, which allows you to develop your data factory pipelines.
2. Take a Lookup activity from the General section of the Activities pane and configure it as shown below.
In the Lookup activity settings, configure the source dataset along with a linked service that uses the self-hosted integration runtime and points to the on-premises database. The Lookup should return the schema name and table name of every table in the database.
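For reference, here is a minimal sketch of what that Lookup activity can look like in pipeline JSON, assuming a hypothetical source dataset named OnPremSqlDataset that reaches the on-premises SQL Server through the self-hosted integration runtime; the query against sys.tables is one common way to list every schema and table name:

    {
        "name": "LookupTableList",
        "type": "Lookup",
        "typeProperties": {
            "source": {
                "type": "SqlServerSource",
                "sqlReaderQuery": "SELECT s.name AS SchemaName, t.name AS TableName FROM sys.tables t JOIN sys.schemas s ON t.schema_id = s.schema_id"
            },
            "dataset": {
                "referenceName": "OnPremSqlDataset",
                "type": "DatasetReference"
            },
            "firstRowOnly": false
        }
    }

Setting firstRowOnly to false makes the Lookup return the full list of tables instead of only the first row.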
3. Drag and drop the ForEach activity from the Iteration & conditionals section and connect the Lookup activity to the ForEach activity, so that the schema and table names returned by the Lookup are available inside the ForEach activity.
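A sketch of that wiring, assuming the Lookup activity above is named LookupTableList; the items expression hands the Lookup output array to the ForEach, so each iteration exposes one item with SchemaName and TableName:

    {
        "name": "ForEachTable",
        "type": "ForEach",
        "dependsOn": [
            {
                "activity": "LookupTableList",
                "dependencyConditions": [ "Succeeded" ]
            }
        ],
        "typeProperties": {
            "items": {
                "value": "@activity('LookupTableList').output.value",
                "type": "Expression"
            },
            "activities": []
        }
    }

The Copy Data activity from the next step goes inside the activities array of this ForEach.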
4. Now place a Copy Data activity inside the ForEach activity so the pipeline loops over all the tables, across their different schemas, in the AdventureWorks database.
In the Copy Data activity, configure the source and sink with dynamic (parameterized) datasets.
While configuring the source dataset dynamically, define two parameters, as shown below, so that every table in every schema can be accessed dynamically.
These parameters are then used to build the schema and table name of the dataset at runtime, as shown below.
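A sketch of such a parameterized source dataset, assuming a hypothetical linked service named OnPremSqlServerLinkedService; the two parameters, here called SchemaName and TableName, are consumed by @dataset() expressions for the schema and table:

    {
        "name": "DynamicSourceDataset",
        "properties": {
            "type": "SqlServerTable",
            "linkedServiceName": {
                "referenceName": "OnPremSqlServerLinkedService",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "SchemaName": { "type": "String" },
                "TableName": { "type": "String" }
            },
            "typeProperties": {
                "schema": { "value": "@dataset().SchemaName", "type": "Expression" },
                "table": { "value": "@dataset().TableName", "type": "Expression" }
            }
        }
    }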
5. Now configure the sink (destination). Its linked service should use the AutoResolve integration runtime, because the data is being moved into Azure SQL Database.
6. Here as well, define two dataset parameters: one for the schema name and the other for the table name, as shown below.
These parameters are then used to configure the sink dataset dynamically.
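A sketch of the parameterized sink dataset, assuming a hypothetical Azure SQL Database linked service named AzureSqlDatabaseLinkedService that runs on the AutoResolve integration runtime:

    {
        "name": "DynamicSinkDataset",
        "properties": {
            "type": "AzureSqlTable",
            "linkedServiceName": {
                "referenceName": "AzureSqlDatabaseLinkedService",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "SchemaName": { "type": "String" },
                "TableName": { "type": "String" }
            },
            "typeProperties": {
                "schema": { "value": "@dataset().SchemaName", "type": "Expression" },
                "table": { "value": "@dataset().TableName", "type": "Expression" }
            }
        }
    }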
With this process, all of the tables in the AdventureWorks 2016 database, across their different schemas, are moved to the Azure SQL Database in the cloud with their schema names preserved, as shown below.
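To tie the pieces together, here is a sketch of the Copy Data activity that sits inside the ForEach, passing the current item's SchemaName and TableName into both parameterized datasets (the dataset names are the hypothetical ones used in the earlier sketches); setting tableOption to autoCreate is one way to have the destination tables created in Azure SQL Database with matching schema and table names:

    {
        "name": "CopyTableToAzureSql",
        "type": "Copy",
        "typeProperties": {
            "source": { "type": "SqlServerSource" },
            "sink": {
                "type": "AzureSqlSink",
                "tableOption": "autoCreate"
            }
        },
        "inputs": [
            {
                "referenceName": "DynamicSourceDataset",
                "type": "DatasetReference",
                "parameters": {
                    "SchemaName": "@item().SchemaName",
                    "TableName": "@item().TableName"
                }
            }
        ],
        "outputs": [
            {
                "referenceName": "DynamicSinkDataset",
                "type": "DatasetReference",
                "parameters": {
                    "SchemaName": "@item().SchemaName",
                    "TableName": "@item().TableName"
                }
            }
        ]
    }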