
Cloud Engineer / Data Engineer / Data Integration

Mandar Narendra Kulkarni
Mobile: (+91) 9096894547


Email id: [email protected]

Profile Summary:
 14.4 years of professional IT experience in Azure cloud computing and data warehouse application development using ETL pipelines.
 4+ years of experience with the Snowflake cloud data warehouse on Azure/AWS.
 Snowflake Advanced Architect certified.
 Hands-on experience with Azure Data Factory (ADF) V2; managed databases, Azure data platform services (Azure Data Lake Storage (ADLS), Azure Blob Storage, Azure SQL DW, Azure Logic Apps), and SQL Server.
 Created ETL pipelines using Azure PaaS services: Azure Data Factory and Databricks.
 Experienced in real-time data streaming using Azure Event Hubs and Stream Analytics jobs.
 Migrated on-premises databases to Azure using Azure Data Factory, Azure SQL DB, Data Lake, and Blob Storage.
 Hands-on experience with Azure Logic Apps.
 Implemented CI/CD using Azure DevOps for the Azure cloud platform.
 Experience with dimensional modeling using star and snowflake schemas.
 Implemented a monitoring dashboard for Azure using Azure Metrics and Azure Log Analytics (Kusto queries); a query sketch follows this list.
 Strong experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake cloud database.
 Worked in the Retail, Insurance, Banking, and Marketing domains.
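
For illustration, a minimal sketch of the Log Analytics monitoring pattern mentioned above, using the azure-monitor-query Python package; the workspace ID and the Kusto query are placeholder assumptions, not taken from any specific dashboard:

    # Sketch: run a Kusto query against Azure Log Analytics from Python.
    from datetime import timedelta
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())

    # Hypothetical query: failed ADF pipeline runs, bucketed hourly.
    KUSTO = """
    ADFPipelineRun
    | where Status == 'Failed'
    | summarize failures = count() by PipelineName, bin(TimeGenerated, 1h)
    """

    response = client.query_workspace(
        workspace_id="<log-analytics-workspace-id>",  # placeholder
        query=KUSTO,
        timespan=timedelta(days=1),
    )
    for table in response.tables:
        for row in table.rows:
            print(row)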

Technical Skills:
Cloud: Azure services – Azure Data Factory (ADF), Azure Data Lake, Data Lake Analytics, Databricks, Azure SQL DB, Azure AD, Event Hubs, Blob Storage, Azure Logic Apps, Azure DevOps, Stream Analytics, Azure SQL Data Warehouse (Azure DWH)
Certifications: SnowPro Core Certified (2023); Snowflake Advanced Architect Certified (2023); IICS Informatica Cloud Lakehouse Data Management Foundation; IICS Informatica Data Governance & Data Privacy Foundation; Accenture Data Architect Certification (2022)
Databases: Teradata, SQL Server, MySQL, Snowflake
Languages: SQL
Scripting: Unix shell scripting, Unix commands
Systems: UNIX, Linux, Windows XP, Windows 7-10
Tools: SQL Developer, Teradata SQL Assistant, PuTTY, WinSCP, Cygwin, Control-M, dbt
Office Automation Tools: MS Word, MS Excel, MS Access, Outlook, PowerPoint
ETL Tools: Informatica PowerCenter 9.x/10.x, Informatica Cloud, IDQ
Ticket Tracking Tools: Jira, ServiceNow, Kanban boards

Employment Details:
1) LTIMindtree - Principal Architect 05/2024 - Present
2) Accenture - Associate Data Engineering Manager 06/2021 - 05/2024
3) Tieto Software Pvt. Ltd., Pune, India - Lead Software Engineer 02/2016 - 06/2021
4) Cognizant Technology Solutions, Pune, India - Associate Software Engineer 07/2013 - 02/2016
5) Capgemini, Pune, India - Software Engineer 03/2010 - 07/2013

1) Projects Undertaken:
Project Title: Global Part Library 05/2024 – Present
Description:
Carrier wants to build a global part library from different PLM systems such as Windchill and Teamcenter, and to show a dashboard of the current status of engineering parts.
Responsibilities:
• Implemented ETL and data movement solutions using Dell Boomi and Snowflake.
• Created connections and extracted data from different Azure data stores and databases using copy activities and control flows, and loaded the data into Azure DW using Azure Data Factory.
• Extracted data from API and SharePoint sources.
• Ingested, transformed, and stored data from multiple source systems into Snowflake (a loading sketch follows this project).
• Built a dashboard to view engineering team part status.
Technical Skills:
AWS, Dell Boomi, Snowflake, JIRA.
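
For illustration, a minimal sketch of the Snowflake ingestion step described above, using the snowflake-connector-python package; the stage, table, and connection details are placeholder assumptions:

    # Sketch: bulk-load staged CSV files into a Snowflake table.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholders throughout
        user="<user>",
        password="<password>",
        warehouse="LOAD_WH",
        database="PARTS_DB",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # COPY INTO pulls files from a named stage into the raw parts table.
        cur.execute("""
            COPY INTO RAW.ENGINEERING_PARTS
            FROM @PARTS_STAGE/daily/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
        print(cur.fetchall())  # one result row per loaded file
    finally:
        conn.close()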

2) Projects Undertaken:
Project Title: Enterprise Data Warehouse - Associate Data Engineering Manager 06/2021 – 04/2024
Description:
EDW is designed as a consolidated data warehouse to store Canada-based organizational data, retail transactional data, and monthly segmentation data for reporting purposes, including migration of an on-premises Oracle DB to Snowflake.
Responsibilities:
 Implemented ETL and data movement solutions using Azure Data Factory (ADF) and Snowflake.
 Created connections and extracted data from different Azure data stores and databases using copy activities and control flows, and loaded the data into Azure DW using Azure Data Factory.
 Created an Azure service principal for authenticating to Azure Data Lake and Azure SQL DB.
 Extracted real-time POS messages from stores using Event Hubs and Stream Analytics jobs (a consumer sketch follows this project).
 Ingested, transformed, and stored data from multiple source systems into Data Lake and Azure SQL DB.
 Migrated on-premises databases to the cloud using Azure Data Factory with linked services.
 Migrated applications to the Azure cloud using Azure Data Factory, Azure SQL DB, and Azure Data Lake.
 Created analytics jobs for analyzing raw data in the data lake using Azure Data Lake Analytics.
 Dynamically loaded multiple flat files into Azure SQL Database using Azure Data Factory.
 Migrated data from traditional database systems such as Teradata and Oracle to Azure databases.
 Migrated a large-scale Teradata data warehouse onto the Snowflake cloud database using SSIS, Informatica, and ADF.
 Deployed code using Azure DevOps and managed continuous integration and continuous delivery (CI/CD) for apps and the platform.
 Worked with on-premises ETL tools: Informatica PowerCenter, IDQ.
Technical Skills:
Azure Data Factory, Azure Data Lake, Azure Blob Storage, Azure DevOps, Informatica, Unix, SQL Server, Snowflake, JIRA.
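
For illustration, a minimal sketch of the real-time POS ingestion pattern above, using the azure-eventhub Python package; the connection string, hub name, and event shape are placeholder assumptions:

    # Sketch: consume POS events from Azure Event Hubs.
    from azure.eventhub import EventHubConsumerClient

    client = EventHubConsumerClient.from_connection_string(
        conn_str="<event-hubs-connection-string>",  # placeholder
        consumer_group="$Default",
        eventhub_name="pos-transactions",  # hypothetical hub name
    )

    def on_event(partition_context, event):
        # Each event body carries one POS message (e.g. JSON).
        print(partition_context.partition_id, event.body_as_str())
        # With a checkpoint store configured, this persists progress.
        partition_context.update_checkpoint(event)

    with client:
        # Blocks and dispatches events from all partitions to on_event.
        client.receive(on_event=on_event, starting_position="-1")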

3) Projects Undertaken:
Project Title: Enterprise Data Warehouse - Senior Software Engineer 02/2016 – 06/2021
Description:
EDW is designed as a consolidated data warehouse to store Kesko organizational data, retail transactional data, and monthly segmentation data for reporting purposes.
Responsibilities:
 Implemented ETL and data movement solutions using Azure Data Factory (ADF).
 Created connections and extracted data from different Azure data stores and databases using copy activities and control flows, and loaded the data into Azure DW using Azure Data Factory.
 Created an Azure service principal for authenticating to Azure Data Lake and Azure SQL DB.
 Extracted real-time POS messages from stores using Event Hubs and Stream Analytics jobs.
 Ingested, transformed, and stored data from multiple source systems into Data Lake and Azure SQL DB.
 Migrated on-premises databases to the cloud using Azure Data Factory with linked services.
 Migrated applications to the Azure cloud using Azure Data Factory, Azure SQL DB, and Azure Data Lake.
 Created analytics jobs for analyzing raw data in the data lake using Azure Data Lake Analytics.
 Dynamically loaded multiple flat files into Azure SQL Database using Azure Data Factory (a pipeline-trigger sketch follows this project).
 Migrated data from traditional database systems such as Teradata and Oracle to Azure databases.
 Migrated a large-scale Teradata data warehouse onto the Snowflake cloud database using SSIS, Informatica, and ADF.
 Deployed code using Azure DevOps and managed continuous integration and continuous delivery (CI/CD) for apps and the platform.
 Worked with on-premises ETL tools: Informatica PowerCenter, IDQ.
Technical Skills:
Azure Data Factory, Azure Data Lake, Azure Blob Storage, Azure Stream Analytics, Azure SQL DW, Azure DevOps, Informatica, Unix, Teradata, SQL Server, Snowflake, Control-M, JIRA.
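
For illustration, a minimal sketch of triggering the kind of parameterized ADF pipeline used for dynamic flat-file loads, via the azure-mgmt-datafactory Python package; the resource names, pipeline name, and parameters are placeholder assumptions:

    # Sketch: start a parameterized ADF pipeline run from Python.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    adf = DataFactoryManagementClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",  # placeholder
    )

    # A dynamic flat-file load is typically one parameterized pipeline
    # invoked per folder; the parameter names here are hypothetical.
    run = adf.pipelines.create_run(
        resource_group_name="rg-edw",
        factory_name="adf-edw",
        pipeline_name="pl_load_flat_files",
        parameters={"sourceFolder": "incoming/2024-05", "targetTable": "stg.Sales"},
    )
    print(run.run_id)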

4) Projects Undertaken:
Project Title: BRASS 10/2015 – 02/2016
Description:
The objective of this project, the Reusable Authentication Shared Service (BRASS), is to deliver a reusable authentication service. The ultimate goal is for the BRASS authentication process to be used across other new or existing processes, providing a common authentication layer for consistency of experience for customers and centralization of processes for Barclays. The source system is parsed using Informatica and loaded into Hadoop using Flume; from there, Elasticsearch is used for reporting (a query sketch follows this project).
Technical Skills:
Informatica Power Center 9.5, Teradata, UNIX Shell Scripting
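
For illustration, a minimal sketch of the Elasticsearch reporting step described above, using the elasticsearch Python client; the endpoint, index, and field names are placeholder assumptions:

    # Sketch: aggregate authentication events for reporting.
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

    # Hypothetical query: authentication events per outcome over a day.
    resp = es.search(
        index="brass-auth-events",  # hypothetical index
        query={"range": {"@timestamp": {"gte": "now-1d"}}},
        aggs={"by_outcome": {"terms": {"field": "outcome.keyword"}}},
        size=0,
    )
    print(resp["aggregations"]["by_outcome"]["buckets"])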

5) Projects Undertaken:
Project Title: Contact Tool 08/2013 – 10/2015
Description:
A collective contact history enables users from across the bank to view the last or ongoing contact with the customer, together with an updated 360-degree view of the customer's product holdings alongside satisfaction and complaints data. New, more granular levels of MI are designed to support the customer contact strategy, and a number of new contact events are provisioned for display. The project utilises a new campaign management tool with increased capacity and flexibility to manage scheduled and ad hoc change, enabling more relevant messages to be delivered.
Responsibilities:
 Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
 Created mappings using the Designer, extracted data from various sources, and transformed the data according to requirements.
 Involved in extracting data from flat files and relational databases into the staging area.
 Promoted mappings, sessions, and workflows from Development to Test and then to the UAT environment.
 Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router, and Aggregator to create robust mappings in the Informatica PowerCenter Designer.
 Extracted data from a web service using the Informatica web service transformation (Personator - Melissa Data).
Technical Skills:
Informatica PowerCenter 9.x, Oracle, UNIX, Control-M, SQL, PL/SQL, Unix Scripting, Informatica Admin.

6) Projects Undertaken:
Project Title: Customer Insight 08/2010 – 07/2013
Description:
Customer Insight is a product developed by Capgemini for a banking/financial DWH.
Responsibilities:
 Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
 Created mappings using the Designer, extracted data from various sources, and transformed the data according to requirements.
 Involved in extracting data from flat files and relational databases into the staging area.
 Promoted mappings, sessions, and workflows from Development to Test and then to the UAT environment.
 Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router, and Aggregator to create robust mappings in the Informatica PowerCenter Designer.
 Extracted data from a web service using the Informatica web service transformation.

Education Details:
Bachelor of Engineering (B.E.) from Shivaji University, India - Passing Year: 2009.

Personal Details:
Nationality: Indian
Place of Birth: Maharashtra
Sex: Male
Languages Known: English, Hindi, Marathi, German
