Resume 3
SNOWFLAKE DEVELOPER
Mindtree
Mobile: +91-xxxxxxxxx
EXP: 6+ YEARS
Email: [email protected]
PROFESSIONAL SUMMARY
Overall 6+ years of experience in the IT industry, including 1.7 years with Snowflake, working as a
Data Engineer; a highly motivated individual with a proven ability to learn fast and work well under pressure.
Proficient in the design and development of processes to extract, transform, and load data
into the data warehouse using the Snowflake cloud database and AWS S3.
Expertise in building and migrating data warehouses on the Snowflake cloud database.
Expertise in working with Snowflake Multi-Cluster Warehouses, Snowpipes,
Internal/External Stages, Stored Procedures, Cloning, Tasks, and Streams.
Involved in Zero Copy cloning – cloning databases for Dev and QA environments.
Storage considerations for staging and permanent databases/tables.
Resource Monitor setup on warehouses.
Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from
multiple source systems, including loading data into Snowflake tables.
Experience in building Snowpipe.
In-depth knowledge of Snowflake Database, Schema and Table structures.
Experience in various methodologies like Waterfall and Agile.
Extensive experience in extracting, transforming, and loading data directly from
heterogeneous source systems such as flat files, Oracle, and complex files.
Quick to learn, implement, and adopt new technologies.
Good team player, very proactive, and willing to keep up with new developments in technology.
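For illustration, a minimal sketch of the Snowflake objects mentioned above: an external stage over S3, a Snowpipe for continuous loading, and a zero-copy clone for a QA environment. All object and bucket names here are hypothetical, and running these statements requires a live Snowflake session with valid AWS credentials.

```python
# Hypothetical Snowflake DDL for the stage / Snowpipe / zero-copy clone setup
# described above. Names (RAW_DB, SALES_STAGE, SALES_PROD, ...) are
# placeholders; execution requires a Snowflake connection.

CREATE_STAGE = """
CREATE OR REPLACE STAGE RAW_DB.PUBLIC.SALES_STAGE
  URL = 's3://example-bucket/sales/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
"""

CREATE_PIPE = """
CREATE OR REPLACE PIPE RAW_DB.PUBLIC.SALES_PIPE AUTO_INGEST = TRUE AS
  COPY INTO RAW_DB.PUBLIC.SALES_RAW
  FROM @RAW_DB.PUBLIC.SALES_STAGE;
"""

CLONE_FOR_QA = """
CREATE DATABASE SALES_QA CLONE SALES_PROD;  -- zero-copy clone for QA
"""

if __name__ == "__main__":
    for stmt in (CREATE_STAGE, CREATE_PIPE, CLONE_FOR_QA):
        print(stmt.strip())
```

With AUTO_INGEST enabled, S3 event notifications trigger the pipe so new files load without a scheduled job.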
PROFESSIONAL EXPERIENCE
Worked as Module Lead at Mindtree Ltd, Bangalore, from Apr 2020 to Aug 2021.
Worked as Software Engineer at SLK Software Services India Pvt Ltd, Bangalore,
from Jul 2014 to Mar 2020.
EDUCATIONAL QUALIFICATION
B.TECH (INFORMATION TECHNOLOGY) from JNTU
TECHNICAL SKILLS
Project Experience
Organization: Mindtree
Project # 1
Project Profile:
Danske Bank is the largest bank in Denmark and a leading player in the Scandinavian financial markets.
Its recent acquisition of Sampo Bank, which serves customers from 125 branches in Finland and 33 branches in
the Baltic States, is a major achievement for Danske. The IT department of Danske has started migrating
Sampo Bank technology to Danske Bank technology; as part of this project, I was involved in the migration
of Sampo Bank data to Danske Bank.
Contribution / Highlights:
Created internal and external stages and transformed data during load.
Designed Staging, Integration, Information, and Archive areas on S3.
Used temporary and transient tables on different datasets.
Shared sample data with the customer for UAT by granting access.
Created S3 events for job triggering.
Used Zero Copy cloning to clone databases for Dev and QA environments.
Configured Time Travel and data retention settings for crucial tables.
Set up data sharing from Prod to Stage and Dev environments.
Worked on Snowflake streams to process incremental records.
Designed the update and insert strategy for merging changes into existing dimension and fact tables.
Unloaded data from Snowflake to send to downstream systems.
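The stream-based incremental load pattern above can be sketched as a single MERGE that consumes the stream's delta. Table, stream, and column names below are hypothetical, and executing the statement requires a Snowflake session (e.g. via snowflake-connector-python).

```python
# Hypothetical MERGE consuming a Snowflake stream, as in the incremental
# dimension load described above. All identifiers are placeholders.

MERGE_FROM_STREAM = """
MERGE INTO DW.PUBLIC.DIM_CUSTOMER AS tgt
USING DW.PUBLIC.CUSTOMER_STREAM AS src  -- stream exposes only new/changed rows
  ON tgt.CUSTOMER_ID = src.CUSTOMER_ID
WHEN MATCHED THEN UPDATE SET
  tgt.NAME = src.NAME,
  tgt.CITY = src.CITY
WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, NAME, CITY)
  VALUES (src.CUSTOMER_ID, src.NAME, src.CITY);
"""

if __name__ == "__main__":
    print(MERGE_FROM_STREAM.strip())
```

A successful DML statement that reads the stream advances its offset, so the next run sees only records that changed afterwards.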
Project Experience
Organization: SLK Software Services India Pvt Ltd
Project # 1
Client Description: Mercedes-Benz USA LLC (MBUSA) keeps a portion of the enterprise level operational
data from various applications in an enterprise data warehouse for reporting and analysis using COGNOS
software. Data is gathered from MBUSA Finance, Customer Call Center, Customer, Warranty, Vehicle,
Parts, Strategic Retail Development, and Learning and Performance systems, as well as third-party sources
such as POLK, Mercedes-Benz Financial System, Peers, and vehicle registrations.
MBUSA is evaluating migration of enterprise-wide ETL applications to meet the following objectives:
Reduce costs for development and support of COBOL based ETL applications.
Achieve the improvements provided by the IBM DataStage ETL package.
Utilize Change Data Capture and optimize the ETL processing in each subject area.
Responsibilities
Understood the business logic code for the relevant subject areas of the Benz project.
Effectively involved in reviewing various development activities/tasks with the offshore team on a daily basis.
Extracted data from sources such as flat files and Excel files and loaded it into staging and target
tables.
Executed unit test cases and SIT documents to analyze the results.
Used most of the DataStage stages, such as Sequential File, Data Set, File Set, Filter, Switch, Change
Capture, Copy, Remove Duplicates, Sort, Aggregator, Lookup, Join, Funnel, and Transformer, for
application-related enhancements.
Supported UAT and PREPROD environment issues.
Created reject handling in a generic way, so that one script handles rejects for all subject areas
and sends a notification mail.
Created a script for daily table count validations from source; it can be used as a generic script
for all applications by supplying a single input file with the details.
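The generic count-validation idea above can be sketched as follows. This is an illustrative, self-contained version: table names and counts are made up, and in the real job the counts would come from source and target database queries driven by the input file.

```python
# Illustrative sketch of a generic daily table count validation: compare
# source vs. target row counts and report mismatches. Data is hard-coded
# here; the real script would query the databases listed in an input file.

def validate_counts(rows):
    """rows: iterable of (table_name, source_count, target_count).
    Returns the list of tables whose counts do not match."""
    return [table for table, src, tgt in rows if src != tgt]

if __name__ == "__main__":
    checks = [
        ("CUSTOMER", 1000, 1000),
        ("ORDERS", 5230, 5228),  # mismatch: 2 rows missing in target
        ("PARTS", 87, 87),
    ]
    mismatches = validate_counts(checks)
    print("Mismatched tables:", mismatches)  # → ['ORDERS']
```

Because the check is driven entirely by the input rows, the same script serves every application: only the input file changes.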
Project # 2
Client Description:
Legrand is part of the manufacturing domain; data related to sales order, purchase order, and warehouse
management systems forms the client's legacy systems.
Project Objective:
Middleware to listen for files from the Active MQ server, perform technical validations, and place the files on the
FTP server for the Microsoft Dynamics team; then receive files back from them and place them in Active MQ after
applying the required business transformations.
Processing XML files from Active MQ to the FTP server after the required technical validations and transformations.
Creation of route jobs in the mediation environment using ESB components that listen for files in Active MQ.
Creation of route jobs in the mediation environment using ESB components that listen for files on the FTP server,
with the use of FTP components.
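The queue-to-FTP flow above can be sketched in miniature. This is only an illustration of the routing logic: an in-memory queue.Queue stands in for Active MQ and a dict stands in for the FTP drop location, whereas the real route jobs were built with ESB and FTP components.

```python
# Illustrative sketch of the middleware route: consume XML messages from a
# queue, apply a technical validation (well-formed XML), and deliver valid
# files to an FTP drop location. The queue and drop are simulated stand-ins.

import queue
import xml.etree.ElementTree as ET

def route(msg_queue, ftp_drop):
    """Move well-formed XML messages from msg_queue into ftp_drop (a dict
    standing in for the FTP server); return the count of rejected messages."""
    rejected = 0
    while not msg_queue.empty():
        name, payload = msg_queue.get()
        try:
            ET.fromstring(payload)  # technical validation: well-formed XML
            ftp_drop[name] = payload
        except ET.ParseError:
            rejected += 1
    return rejected

if __name__ == "__main__":
    q = queue.Queue()
    q.put(("order1.xml", "<order><id>1</id></order>"))
    q.put(("bad.xml", "<order><id>2</order>"))  # malformed: mismatched tags
    drop = {}
    print(route(q, drop), sorted(drop))  # 1 rejected, order1.xml delivered
```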