
Medisetty Vineela

JOB EXPERIENCE SUMMARY

Over 3 years of experience as an Azure Data Engineer at Stentor Technologies Limited, from July 2018 to date.
3+ years of hands-on experience with Azure Data Factory v2, Azure Data Lake, Azure Blob Storage, Azure Synapse Analytics and SQL Server.

Azure Data Factory:


Vast knowledge of Data Factory core concepts.
Worked extensively on the Copy Data activity.
Worked on Get Metadata, Lookup, Stored Procedure, ForEach, If Condition, Delete and Execute Pipeline activities.
Knowledge of processing streaming data through Stream Analytics (IoT Hub, Event Hub and Blob Storage).
Good exposure to Azure SQL Database and Azure Synapse Analytics.
Involved in designing Logic Apps for alerts.
Followed naming conventions for all resources and activities.
Familiar with pipeline execution methods (Debug vs. Triggers).
Created automated schedules for pipeline execution using Triggers.
Sound knowledge of Mapping Data Flow activities.
Experience in development, production support, defect fixing and unit testing.
Experience working with Data Warehouse Units (DWU).
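The résumé contains no code; purely as an illustration of how the activities listed above compose, a minimal ADF v2 pipeline (Get Metadata listing files, then a ForEach running a Copy per file) can be sketched as a Python dict mirroring the ADF JSON schema. All names here (pl_copy_files, ds_source, ds_sink) are hypothetical, not taken from the projects described.

```python
import json

# Hypothetical ADF v2 pipeline definition: Get Metadata enumerates files,
# a ForEach iterates the result and runs a Copy activity for each item.
# Dataset/pipeline names are illustrative only.
pipeline = {
    "name": "pl_copy_files",
    "properties": {
        "activities": [
            {
                "name": "GetFileList",
                "type": "GetMetadata",
                "typeProperties": {
                    "dataset": {"referenceName": "ds_source", "type": "DatasetReference"},
                    "fieldList": ["childItems"],
                },
            },
            {
                "name": "CopyEachFile",
                "type": "ForEach",
                # Runs only after Get Metadata succeeds.
                "dependsOn": [
                    {"activity": "GetFileList", "dependencyConditions": ["Succeeded"]}
                ],
                "typeProperties": {
                    "items": {
                        "value": "@activity('GetFileList').output.childItems",
                        "type": "Expression",
                    },
                    "activities": [
                        {
                            "name": "CopyFile",
                            "type": "Copy",
                            "typeProperties": {
                                "source": {"type": "DelimitedTextSource"},
                                "sink": {"type": "ParquetSink"},
                            },
                        }
                    ],
                },
            },
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

In the ADF portal this JSON is what the visual authoring canvas generates behind the scenes; the Debug button runs it ad hoc, while a Trigger runs the published version on a schedule.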

SKILL SET

Databases: Azure SQL, Snowflake Database

ETL Tools: Azure Data Factory

Operating System: Windows

EXPERIENCE

Project 1:
Client: Stanley Black & Decker
Tools: Azure Data Factory, Microsoft SQL Server Management Studio, Snowflake Database
Duration: Jan 2021 to date
Project Overview:
Stanley Black & Decker has 120 plants (sources); their data is migrated to the cloud platform (Blob Storage) in multiple file formats (CSV, Parquet) using Azure Data Factory pipelines. Databricks then applies Spark transformations to this data and loads it into the Snowflake database.
Roles & Responsibilities:
Created the pipeline from file share to Azure Blob Storage.
Implemented the pipeline to move data from on-premises to Azure Data Lake.
Developed stored procedures in the database.
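The Databricks transformation step in this project is not shown in the résumé; as a simplified stand-in in plain Python (the real job used Spark, and all column names below are hypothetical), a row-level clean-and-filter pass over a plant CSV extract might look like:

```python
import csv
import io

# Hypothetical plant extract; the actual pipeline read CSV/Parquet from Blob
# Storage and applied Spark transformations before loading Snowflake.
raw = """plant_id,units,status
P001,120,OK
P002,,OK
P003,75,FAIL
"""

def clean(rows):
    """Drop rows with missing units, cast units to int, keep OK status only."""
    for row in rows:
        if row["units"] and row["status"] == "OK":
            yield {"plant_id": row["plant_id"], "units": int(row["units"])}

cleaned = list(clean(csv.DictReader(io.StringIO(raw))))
print(cleaned)  # [{'plant_id': 'P001', 'units': 120}]
```

In the real pipeline the equivalent logic would run as Spark DataFrame filters and casts on a Databricks cluster, with the result written to Snowflake via a connector.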

Project 2:

Client: Heathrow Airport Limited


Tools: Azure Data Lake, Azure Data Factory v2, Microsoft SQL Server Management Studio
Duration: Sep 2018 to Nov 2019
Project Overview:
Heathrow uses Azure to pull operational data out of back-end business systems and push it to Power BI, giving employees (baggage handlers) an at-a-glance, real-time view for tracking baggage. Heathrow uses services such as Azure Data Factory, Azure Stream Analytics, and Azure SQL Database to extract, clean, and prepare real-time data about flight movements and baggage status before sending it to Power BI.
Roles & Responsibilities:
Performed ETL operations: extracted data from Azure Data Lake, integrated it with static reference data by applying the necessary transformations, and loaded data from the database (Azure SQL) to the data warehouse (Azure Synapse Analytics) using ADF pipelines.
Created ADF pipelines and implemented logic using activities: Copy, Lookup, ForEach, Get Metadata, Execute Pipeline, Stored Procedure, If Condition, Wait, Delete, etc.
Loaded data between Azure SQL and Azure DW using Azure Data Factory (ADF v2).
Created Pipelines, Datasets, Linked Services, etc. in ADF v2 for the ETL process.
Triggered and monitored ETL runs and re-ran activities after failures as required.
Created database objects such as stored procedures and views in SQL Server to implement business logic and logging.
Created and documented unit test cases and evidence documents for data movement between the DB and the DW.

Project 3:

Client: GSK-GP-T360 R1 AZURE-O
Tools: Azure Data Lake, Azure Data Factory v2, Microsoft SQL Server Management Studio
Duration: October 2019 to November 2020

Project Overview:
The Territory T360 project is an integrated BI system for Sales Force Effectiveness, aligned to the Sales Business Planning Framework. It supports the business planning definition and monitors correct execution through a specific reporting framework.

Roles & Responsibilities:

Designed and implemented databases in SQL Server.
Created pipelines in Azure Data Factory to import data from NAS to SQL DW.
Created Azure Databricks notebooks to validate the files.
Designed the table configuration in Azure Storage Explorer.
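The file-validation notebook mentioned above can be sketched in plain Python (the actual work used an Azure Databricks notebook; the validation rules and file names here are assumptions for illustration):

```python
import os

# Hypothetical validation rules mirroring a Databricks file-validation
# notebook: accept only non-empty CSV/Parquet files before ingestion.
ALLOWED_EXTENSIONS = {".csv", ".parquet"}

def validate_file(path: str, size_bytes: int) -> bool:
    """Return True if the file has an allowed extension and is non-empty."""
    _, ext = os.path.splitext(path.lower())
    return ext in ALLOWED_EXTENSIONS and size_bytes > 0

# Example: a small manifest of (path, size) pairs, as Get Metadata might list.
manifest = [
    ("plant01/loads_2021.csv", 10_240),
    ("plant02/loads_2021.parquet", 55_000),
    ("plant03/notes.txt", 2_000),   # wrong format -> rejected
    ("plant04/empty.csv", 0),       # empty file  -> rejected
]

valid = [path for path, size in manifest if validate_file(path, size)]
print(valid)  # ['plant01/loads_2021.csv', 'plant02/loads_2021.parquet']
```

In a pipeline, rejected files would typically be routed to a quarantine folder and surfaced through an alert (e.g. a Logic App), while valid files continue to the load step.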

Project 4:
Client: Nolan Transportation Group
Tools: Microsoft Azure Cloud
Duration: December 2020 to Jan 2021
Project Overview:
This application tracks the details of the loads for each lane and the corresponding pickup city, destination city, customer price, negotiation price, suitable carrier match, etc.
Roles & Responsibilities:
Used ADF for data movement and pipeline scheduling.
Defect analysis and fixing.
Documentation (design document).
Created ADF pipelines and SQL queries.

Previous Experience:
Worked as a Relationship Executive at Tech Mahindra from April 2016 to May 2018.

Education:
Bachelor of Electronics and Communication Engineering (ECE) from JNTU Kakinada (2011-2015) with 68%.
Intermediate from Narayana Junior College (2011) with 70%.
Higher secondary education from Nava Baharat Public School (1999-2009) with 60%.

Declaration:
I hereby confirm that the details furnished above are true to the best of my knowledge.

Vineela M
