
Kalyan Yalla

Mobile: +91 8121099515 | Email: [email protected]


Education Summary

• Completed B.Tech from Sri Vasavi Engineering College in 2016.

Professional Summary
• 6+ years of experience in Data Warehousing and Microsoft Business Intelligence tools such as Azure Data Factory, Azure Databricks, SQL Server, and Power BI.
• Microsoft Certified: Azure Data Engineer Associate.
• Experienced in building ADF pipelines to connect to different sources and load data into the cloud environment.
• Experienced in connecting from ADF to different sources such as Azure SQL Database, Data Lake Storage, and Blob Storage (a PySpark connectivity sketch follows this list).
• Designed pipelines with various control-flow activities (Copy, Execute Pipeline, Get Metadata, If Condition, Lookup, Set Variable, Filter, ForEach) for on-cloud ETL processing.
• Designed external tables over Parquet files using the PolyBase mechanism.
• Created linked services for various source systems and target destinations.
• Created datasets for various source systems and target destinations.
• Good knowledge of SQL, with scripting experience coding stored procedures, functions, triggers, and views in MS SQL Server.
• Expertise in importing data from multiple data sources into Power BI.
• Experience in data preparation and shaping with Power Query (Edit Queries): add column, split column, pivot and unpivot columns, change data types, and merge and append queries.
• Experience creating DAX expressions, hierarchies, filters, slicers, bookmarks, and selection panes in Power BI Desktop.
• Expertise in creating data visualizations using matrix, table, area chart, treemap, line and stacked column chart, donut chart, waterfall chart, KPI, bar, and pie visuals.
• Worked with the on-premises data gateway to refresh data sources and schedule data refreshes as per business needs.
• Implemented row-level security in the Power BI Service.
• Working knowledge of Azure Databricks and Azure Synapse Analytics.
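
Illustrative only: a minimal PySpark sketch of the source connectivity listed above, reading from Azure SQL Database over JDBC and from Data Lake / Blob Storage paths in a Databricks-style session. Every server, account, container, table, and credential value here is a hypothetical placeholder, not a detail from the projects below.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("source-connectivity-sketch").getOrCreate()

# Read a table from Azure SQL Database over JDBC (server, database,
# credentials, and table name are placeholders).
jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb"
sales_df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.Sales")
    .option("user", "etl_user")
    .option("password", "<secret>")
    .load()
)

# Read Parquet files from ADLS Gen2 and CSV files from Blob Storage
# (storage account and container names are placeholders; authentication
# is assumed to be configured on the cluster).
parquet_df = spark.read.parquet("abfss://raw@mydatalake.dfs.core.windows.net/sales/")
csv_df = (
    spark.read.option("header", "true")
    .csv("wasbs://landing@myblobaccount.blob.core.windows.net/sales/")
)
```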

Employment History

• Working as Big Data Engineer at Tiger Analytics Pvt Ltd from March 2024 to date.

• Worked as Software Engineer at Wipro Pvt Ltd from March 2022 to March 2024.

• Worked as Associate Software Engineer at Tecsolvent Software Technologies Pvt Ltd from December 2017 to March 2022.

Technical Skills

Operating Systems : Windows Family
Scripting Tools   : MS SQL Server and PySpark
ETL Tools         : Azure Databricks, Azure Data Factory (ADF)
Reporting Tools   : Power BI
Project Profile

Project: #1
Client      : Costco (Jan 2018 - Dec 2020)
Role        : Data Engineer
Environment : SQL Server, Azure Data Factory, and Azure Databricks

Project Description:
Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eight countries. We are the
recognized leader in our field, dedicated to quality in every area of our business and respected for our outstanding business
ethics. Despite our large size and explosive international expansion, we continue to provide a family atmosphere in which
employees thrive and succeed. We are proud to have been named by Washington CEO Magazine as one of the top three
companies to work for in the state of Washington.

Roles & Responsibilities:


• Primarily involved in data migration using SQL, Azure SQL, Azure Data Lake, and Azure Data Factory.
• Proficient in creating data warehouses, designing extraction and loading functions, testing designs, data modelling, and ensuring the smooth running of applications.
• Responsible for extracting data from OLTP and OLAP systems into the Data Lake using Azure Data Factory and Databricks.
• Developed pipelines in Databricks that extract data from various sources and merge it into single-source datasets in the Data Lake.
• Created linked services for source and target connectivity based on the requirement.
• Once created, pipelines and datasets are triggered based on the load operation (HISTORY/DELTA).
• Created mount points for the Data Lake and extracted data in different formats such as CSV and Parquet.
• Created DataFrames and transformed them using PySpark.
• Implemented SCD Type 1 in Delta Lake to handle incremental loads and wrote data back to Azure SQL Server (a minimal MERGE sketch follows this list).
• Actively participated in the four Scrum ceremonies: sprint planning, daily scrum, sprint review, and sprint retrospective.
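
Illustrative only: a minimal PySpark sketch of the SCD Type 1 pattern mentioned above, upserting an incremental batch into a Delta table with MERGE and writing the result back to Azure SQL Server over JDBC. Paths, key columns, and connection details are hypothetical placeholders, not the project's actual names.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("scd1-sketch").getOrCreate()

# Incremental (delta) batch extracted earlier in the pipeline; the path is a placeholder.
updates_df = spark.read.parquet("/mnt/datalake/staging/customers_delta/")

# SCD Type 1: overwrite matched rows, insert new ones (no history is kept).
target = DeltaTable.forPath(spark, "/mnt/datalake/curated/customers")
(
    target.alias("t")
    .merge(updates_df.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Write the refreshed dimension back to Azure SQL Server (connection details are placeholders).
(
    target.toDF().write.format("jdbc")
    .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=dwh")
    .option("dbtable", "dbo.DimCustomer")
    .option("user", "etl_user")
    .option("password", "<secret>")
    .mode("overwrite")
    .save()
)
```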

Project: #2
Client Name : Acosta (Feb 2021 - March 2022)
Role        : Azure Data Engineer
Environment : ADF, MS SQL Server, and Power BI

Project Description:
Acosta is a sales and marketing agency in Florida. Acosta’s channels are: club, convenience, drug, e-commerce,
electronics, foodservice, grocery, home improvement, mass merchandisers, military, natural/specialty
foods, telecom, and value.
Acosta has developed and maintains long-standing relationships with leading companies. In fact, Acosta works with
25 of the top 30 CPG companies. Several of the relationships originated over 50 years ago. Examples of the
Company’s longstanding relationships include the Clorox Company and Coca-Cola North America which have been
clients since the 1930s and 1950s, respectively.

Responsibilities:
• Extracted data from different source systems; transformed, cleansed, and validated it; and loaded it into the destination.
• Developed pipelines in Azure Data Factory to fetch data from different sources and load it into Azure SQL Database and ADLS Gen2.
• Created the entire data flow from Blob Storage to Data Lake Storage.
• Involved in the design and development of Azure Data Factory pipelines.
• Created Azure Data Factory pipelines to move data from the Data Lake store to Azure SQL Server.
• Created jobs and scheduled packages using linked services, datasets, and pipelines in ADF.
• Developed Azure Data Factory pipelines to move data from staging to the data warehouse using an incremental data load process (a watermark-based sketch follows this list).
• Implemented Copy, Execute Pipeline, Get Metadata, If Condition, Lookup, Set Variable, Filter, and ForEach activities for on-cloud ETL processing.
• Expertise in importing data from multiple data sources into Power BI.
• Applied transformations to edit data and created multiple slicers, filters, charts, and graphs on each dashboard view.
• Used calculated columns and new column fields to create visualizations with multiple measures and dimensions.
• Created multiple dashboards in Power BI Desktop for data visualization using various charts such as bar, line, pie, packed bubble, and custom charts.
• Created calculated measures and columns.
• Interacted with customers to understand requirements and implemented various business changes in existing dashboards.
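
Illustrative only: a minimal PySpark sketch of a watermark-based incremental load from a staging table to a warehouse table, as referenced above. The watermark column, table names, and connection details are assumptions for illustration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-load-sketch").getOrCreate()

# Shared JDBC connection settings (server, database, and credentials are placeholders).
jdbc_opts = {
    "url": "jdbc:sqlserver://myserver.database.windows.net:1433;database=dwh",
    "user": "etl_user",
    "password": "<secret>",
}

# Read the current high-water mark from the warehouse table.
wm_df = (
    spark.read.format("jdbc").options(**jdbc_opts)
    .option("query", "SELECT MAX(modified_date) AS wm FROM dbo.FactOrders")
    .load()
)
last_loaded = wm_df.collect()[0]["wm"]

# Pull only rows changed since the last load from the staging table.
incremental_query = (
    f"SELECT * FROM stg.Orders WHERE modified_date > '{last_loaded}'"
    if last_loaded is not None
    else "SELECT * FROM stg.Orders"
)
new_rows = (
    spark.read.format("jdbc").options(**jdbc_opts)
    .option("query", incremental_query)
    .load()
)

# Append the new or changed rows to the warehouse table.
(
    new_rows.write.format("jdbc").options(**jdbc_opts)
    .option("dbtable", "dbo.FactOrders")
    .mode("append")
    .save()
)
```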

Project: #3
Client Name : Shell India (March 2022 - March 2024)
Role : Azure Data Engineer
Environment : ADF, SQL and Power BI

Project Description:
Shell is one of India's most diversified international energy companies, with over 10,000 employees, a presence across upstream, integrated gas, downstream, and renewable energy, and deep capabilities in Research & Development, digitization, and business operations. With over 350 retail stations across eight states, Shell India is expanding its fuel station network and launched Shell Recharge, its EV charging service, in September 2022, which is growing rapidly in the EV infrastructure space.

Responsibilities:

• Extracted data from CSV, Excel, and SQL Server sources into staging tables dynamically using ADF pipelines (a config-driven sketch follows this list).
• Implemented control-flow activities (Copy, Execute Pipeline, Get Metadata, If Condition, Lookup, Set Variable, Filter, ForEach) for on-cloud ETL processing.
• Created linked services for various source systems and target destinations.
• Created datasets for various source systems and target destinations.
• Implemented an incremental load strategy for daily loads.
• Parameterized the datasets and linked services using parameters and variables.
• Connected to different data sources from Power BI.
• Developed dashboard reports using Power BI.
• Created new calculated columns and calculated measures using DAX expressions in Power BI Desktop.
• Created different visualizations such as line, bar, histogram, scatter, waterfall, bullet, heat map, and treemap charts.
• Configured gateways in the Power BI Service.
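
Illustrative only: a minimal PySpark sketch of config-driven extraction of CSV, Excel, and SQL Server sources into staging tables, as referenced in the first bullet. The source list, paths, and connection details are hypothetical, and reading Excel in Spark is assumed to rely on the com.crealytics spark-excel connector.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dynamic-staging-sketch").getOrCreate()

# Hypothetical source configuration; in ADF this would typically come from a
# Lookup activity feeding a ForEach loop, here it is just a Python list.
sources = [
    {"name": "products", "type": "csv", "path": "/mnt/landing/products.csv"},
    {"name": "stations", "type": "excel", "path": "/mnt/landing/stations.xlsx"},
    {"name": "orders", "type": "sqlserver", "table": "dbo.Orders"},
]

jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=src"

for src in sources:
    if src["type"] == "csv":
        df = spark.read.option("header", "true").csv(src["path"])
    elif src["type"] == "excel":
        # Assumes the com.crealytics:spark-excel connector is installed on the cluster.
        df = (
            spark.read.format("com.crealytics.spark.excel")
            .option("header", "true")
            .load(src["path"])
        )
    else:  # SQL Server source over JDBC
        df = (
            spark.read.format("jdbc")
            .option("url", jdbc_url)
            .option("dbtable", src["table"])
            .option("user", "etl_user")
            .option("password", "<secret>")
            .load()
        )

    # Land each source as a staging Delta table, overwritten on every run
    # (assumes a 'staging' database/schema already exists).
    df.write.format("delta").mode("overwrite").saveAsTable(f"staging.{src['name']}")
```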
