
Vikas Patil

Data Engineer
[email protected] | 8698346497 | Pune, Maharashtra

Professional Experience
TCS 08/2022 – present
Data Engineer
Responsibilities:
1. Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Python and SQL.
2. Cloud Integration: Implement and manage data storage and processing solutions on AWS. Utilize AWS services such as S3, Redshift, Glue, and Lambda to build robust data infrastructure.
3. Data Warehousing: Develop and optimize data warehousing solutions using Snowflake. Implement ETL processes to ensure data is clean, accurate, and accessible.
4. Containerization and Deployment: Use Docker to containerize applications for consistent and reproducible deployment.
5. Version Control and Collaboration: Use Git for version control to manage the codebase and collaborate with team members.
6. Performance Tuning: Optimize SQL queries and database performance to handle large-scale data efficiently. Monitor and troubleshoot data pipeline performance issues.

Cognizant 02/2022 – 07/2022
Junior Data Engineer, Pune, India
Responsibilities:
1. Support Data Pipeline Development: Assist in designing and maintaining data pipelines using Python and SQL under the guidance of senior engineers.
2. Cloud Integration: Support the implementation and management of data storage and processing solutions on AWS. Utilize AWS services such as S3 and Lambda, with supervision, to build robust data infrastructure.
3. Data Warehousing: Assist in developing and optimizing data warehousing solutions using Snowflake. Participate in ETL process development to ensure data is clean and accurate.
4. Performance Tuning: Optimize basic SQL queries and support performance tuning to handle large-scale data efficiently.


Education
Master of Computer Application 02/2021 – 06/2022
North Maharashtra University, Jalgaon, India

Bachelor of Computer Science 06/2017 – 11/2020
North Maharashtra University, Jalgaon, India

Skills
Python | AWS | Snowflake | SQL | Docker | ETL | Git | Power BI

Projects
Nissan Motor (Japan Based) at TCS
Developed a data analytics platform to enhance vehicle performance analysis and derive customer insights using Python, AWS, Snowflake, SQL, and Docker. Designed scalable data pipelines, implemented cloud-based data storage and processing solutions, and developed a Snowflake-based data warehousing solution. Utilized Docker for containerization and SQL for database management and optimization, leading to improved data-driven decision-making for Nissan Motor.

Fiserv at Cognizant
Developed a data integration and analytics platform to enhance financial services insights using Python, AWS, Snowflake, SQL, and Docker. Designed and implemented scalable data pipelines for processing transactional data, leveraged AWS for cloud storage and computing, and utilized Snowflake for efficient data warehousing. Employed Docker for consistent application deployment and SQL for optimizing database queries, resulting in improved financial data analysis and reporting capabilities for Fiserv.

Certificates
Python | SQL | Snowflake | Git

Objective
Seeking a challenging role where I can utilize my skills and experience to contribute to organizational success,
while continuously learning and growing in a dynamic environment.
