Databricks Project

This document outlines a project flow to migrate banking application data from an on-premises file system to the cloud. The data is landed in Azure Data Lake Storage Gen2 using Azure Data Factory, then processed with data quality checks on Azure Databricks before the clean data is loaded into Azure Synapse Analytics for analysis. The required resources are an on-premises file system, Azure Data Factory, Azure Data Lake Storage Gen2, Azure Synapse Analytics, and Azure Databricks.


Project Flow

• Suppose we have a banking application whose data is stored in an on-premises file system, and these files need to be moved to the cloud.
• Azure Data Factory (ADF) copies the files into Azure Data Lake Storage Gen2. From there, Azure Databricks runs data quality checks, and the clean data is loaded into Azure Synapse Analytics for analysis (a sketch of the Databricks step follows below).

Required Resources

• On-premises file system
• Azure Data Factory
• Azure Data Lake Storage Gen2
• Azure Synapse Analytics
• Azure Databricks
