On-premises MySQL Database to Azure SQL Database

PROJECT OBJECTIVE:

To migrate structured data from an on-premises MySQL database to Azure SQL Database using a
secure, dynamic, and reusable pipeline architecture built with Azure Data Factory (ADF).

PROJECT ARCHITECTURE:

RESOURCES REQUIRED:

• MySQL Workbench – Client for the source MySQL database.

• Azure Data Factory (ADF) – Core orchestration tool.

• Self-hosted Integration Runtime (IR) – Secure connection to the on-premises MySQL server.

• Azure SQL Database – Target cloud data store.

CREATE A RESOURCE GROUP:


CREATE ADF:

CREATING THE IR:


INSTALL MICROSOFT INTEGRATION RUNTIME:
USECASE1

• To copy 1 table, or to copy tables one by one.

• To copy 3 tables (ele_store, bike, sales) we need 6 datasets (one source and one sink per table).

Example:
DATASET CREATION:

SOURCE:

SINK:
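For reference, a static source dataset of the kind created above can be sketched in ADF JSON like this (the dataset and linked-service names mirror the ones used later in this document, Inputdata and Mysql2adf_ls; the table name and exact property layout are illustrative):

```json
{
  "name": "Inputdata",
  "properties": {
    "type": "MySqlTable",
    "linkedServiceName": {
      "referenceName": "Mysql2adf_ls",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "tableName": "ele_store"
    }
  }
}
```

Each table also needs a matching sink dataset on the Azure SQL side, which is why copying 3 tables costs 6 datasets.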
USECASE2

PROBLEM:

• With the use case 1 approach, migrating 50 tables would require 100 datasets.

• Solution: data migration using dataset parameters.


WE NEED:

• 1 pipeline

• 2 datasets (but during the pipeline run we need to give the table name manually)

IMPLEMENTATION STEPS:

STEP1: CREATE NEW PIPELINE: COPY ACTIVITY

o SOURCE:

▪ Dataset: Inputdata

▪ Linked service: Mysql2adf_ls

▪ Table name: car_data

o SINK:

▪ Dataset: Outputdata

▪ Linked service: Adf2sql_ls

▪ Table name: don't give any table name now.

o Now go to the Table option: select Auto create table.

o Now open Outputdata → enable Enter manually → give the table name as Car_tbl (but this is the same as usecase1).

o So we need to implement the parameters in the source dataset and the sink dataset.
STEP2: CREATING PARAMETERS:

o Input Data:

▪ Parameters → Name: OnpremTableName

o Output Data:

▪ Parameters → Name: TargetTableName
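With the parameter added, the source dataset can reference it in its table name. A sketch of what this looks like in the dataset JSON (property layout is illustrative):

```json
{
  "name": "Inputdata",
  "properties": {
    "type": "MySqlTable",
    "linkedServiceName": {
      "referenceName": "Mysql2adf_ls",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "OnpremTableName": { "type": "string" }
    },
    "typeProperties": {
      "tableName": {
        "value": "@dataset().OnpremTableName",
        "type": "Expression"
      }
    }
  }
}
```

The sink dataset (Outputdata) gets the same treatment with its TargetTableName parameter.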

STEP3: UPDATING THE PIPELINE

o Input Data:

▪ Remove the manually given car_data.

▪ Add dynamic content.

o Output Data:

▪ Remove the manually given car_tbl.

▪ Add dynamic content.
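In the copy activity, the dataset references then carry the parameter values. A sketch (activity name and property layout are illustrative):

```json
{
  "name": "CopyTable",
  "type": "Copy",
  "inputs": [{
    "referenceName": "Inputdata",
    "type": "DatasetReference",
    "parameters": { "OnpremTableName": "car_data" }
  }],
  "outputs": [{
    "referenceName": "Outputdata",
    "type": "DatasetReference",
    "parameters": { "TargetTableName": "car_tbl" }
  }]
}
```

At run time you overwrite these parameter values per table, which is what the three runs in step 4 do.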

STEP 4: WHILE RUNNING THE PIPELINE

o 1. Move Car_Data: now we need to move the table.

▪ Source:

▪ Sink:

▪ Output:

o 2. Move Electronic_Data:

▪ Source:

▪ Sink:

o 3. Move Sales_Data:

▪ Source:

▪ Sink:
USECASE3:

• We can create parameters at the pipeline level, along with the dataset parameters.

SOURCE:
SINK:

Run the pipeline:
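A sketch of how pipeline-level parameters can be declared once and passed down to the dataset parameters (the pipeline and parameter names SourceTable and TargetTable are illustrative; the dataset parameter names follow the earlier steps):

```json
{
  "name": "CopyTablePipeline",
  "properties": {
    "parameters": {
      "SourceTable": { "type": "string" },
      "TargetTable": { "type": "string" }
    },
    "activities": [{
      "name": "CopyTable",
      "type": "Copy",
      "inputs": [{
        "referenceName": "Inputdata",
        "type": "DatasetReference",
        "parameters": { "OnpremTableName": "@pipeline().parameters.SourceTable" }
      }],
      "outputs": [{
        "referenceName": "Outputdata",
        "type": "DatasetReference",
        "parameters": { "TargetTableName": "@pipeline().parameters.TargetTable" }
      }]
    }]
  }
}
```

When the pipeline is triggered, ADF prompts once for SourceTable and TargetTable instead of asking for each dataset parameter separately.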


USECASE-4

• If my database has 50 tables, 1 copy activity should run 50 times because of the 50 tables.

• We need:

o 1 copy activity.

Let's see how to do it:

• In MySQL we have 22 tables; we need to copy all of them in a single go.

• To see all the table names in MySQL:

SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'amazon_db'
AND table_type = 'BASE TABLE';

• To give the table names dynamically instead of giving them manually one by one.

• Follow the steps:


STEP1: ADD THE LOOKUP ACTIVITY

CREATE DATASET:
CREATE LS:

WRITE A QUERY IN THE LOOKUP ACTIVITY:


Output:

• We are getting the table name of the first table, but we need the names of all the tables.

• To do this, update the pipeline → in the Lookup activity, disable the "First row only" checkbox:
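Put together, the Lookup activity then looks roughly like this (the activity and dataset names are illustrative; note firstRowOnly set to false so all 22 rows come back):

```json
{
  "name": "LookupTableNames",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "MySqlSource",
      "query": "SELECT table_name FROM information_schema.tables WHERE table_schema = 'amazon_db' AND table_type = 'BASE TABLE'"
    },
    "dataset": {
      "referenceName": "LookupDataset",
      "type": "DatasetReference"
    },
    "firstRowOnly": false
  }
}
```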


RUN THE PIPELINE NOW:

Output: every activity generates a JSON document as its output.

Table count: 22

Step2: we need to pass all the table names one by one to the copy activity; for that we need a ForEach activity.

If we connect the Lookup directly to the Copy activity, only the first table name will be passed, but we need all the table names.

Iteration → ForEach

Double-click the ForEach, then add the copy activity inside it.

Now connect the Lookup and the ForEach.

We need to send the output of the Lookup to the ForEach:


Lookup → Settings:

You can select any of the above options.
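In the ForEach activity's Settings, the Items field takes the Lookup output array. Assuming the Lookup activity is named LookupTableNames, the expression is:

```
@activity('LookupTableNames').output.value
```

output.value is the array of rows the Lookup returned (one object per table name), so the ForEach iterates once per table.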

Inside the foreach we have copy activity.


Copy activity:

Source:

We will get the table name from the Lookup activity.


So the query should be:
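Assuming the Lookup returned rows with a table_name column (as in the information_schema query above), the dynamic source query inside the ForEach can be built from the current item:

```
@{concat('SELECT * FROM ', item().table_name)}
```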

Target: Sink:

In the output data:

Sink:
A hard-coded table name will fail here, so we need to give it as follows.
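One way to do this (assuming the sink dataset's TargetTableName parameter from the earlier steps, with Auto create table enabled): pass the current item's table name into the parameter with the expression

```
@item().table_name
```

so each iteration of the ForEach writes to a table of the same name in Azure SQL Database.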
