Abdul
Snowflake Developer
Professional Summary:
Around 5 years of professional experience working with SQL and the Snowflake cloud data warehouse
on AWS/Azure cloud.
Deep understanding of Snowflake architecture, caching, and virtual warehouse scaling.
Expertise in bulk loading data into Snowflake tables using the COPY command (a brief SnowSQL sketch appears at the end of this summary).
Experience in data migration from SQL Server to the Snowflake cloud data warehouse. Experience in
implementing data lakes on AWS/Azure and data warehouses on the Snowflake Data Cloud.
Experienced in creating Snowflake objects such as databases, schemas, tables, stages, sequences,
views, procedures, and file formats using SnowSQL.
Expertise in working with Snowflake multi-cluster warehouses, Snowpipe, internal/external
stages, stored procedures, cloning, tasks, and streams.
Experience in bulk loading data into and unloading data from Snowflake tables.
Experienced in query performance tuning, cloning, and Time Travel.
Good exposure to Snowflake cloud architecture and Snowpipe for continuous data ingestion.
Experienced with Snowflake warehouses, databases, schemas, and table structures.
Good knowledge of advanced Snowflake concepts such as resource monitors, virtual
warehouse sizing, and RBAC controls.
Experienced in loading semi-structured data.
Experienced in writing SQL queries to implement business rules and validation.
Involved in Zero Copy cloning – Cloning databases for Dev and QA environments.
Experienced with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data
from multiple source systems, including loading data into Snowflake tables.
Good knowledge of building and migrating data warehouses on the Snowflake cloud database.
Consistency in meeting deadlines while maintaining/exceeding quality expectations.
Flexibility to adapt to and leverage any new technology.
Hands-on experience on both development and support project environments.
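Illustrative example: the following is a minimal SnowSQL sketch of the bulk-load setup described in this summary; the file format, stage, storage integration, bucket path, and table names are hypothetical placeholders, not objects from any client project.
-- Reusable CSV file format (hypothetical names throughout)
CREATE OR REPLACE FILE FORMAT etl.csv_ff
  TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1 NULL_IF = ('', 'NULL');
-- External stage pointing at an S3 location (placeholder URL and integration)
CREATE OR REPLACE STAGE etl.s3_src_stage
  URL = 's3://example-bucket/source/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = etl.csv_ff;
-- Bulk load the staged files into a target table with the COPY command
COPY INTO etl.customer_raw
  FROM @etl.s3_src_stage
  PATTERN = '.*customer.*[.]csv'
  ON_ERROR = 'CONTINUE';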
Skills:
Programming Languages: SQL, Python
Database: Microsoft SQL Server, Snowflake, Oracle
AWS: AWS IAM, S3, Athena, RDS, EC2
Cloud Data Warehouse: Snowflake
ETL Tools: Matillion, Azure Data Factory, dbt
Scheduling Tools: Airflow, Control-M
Work Experience:
Worked as Senior Software Engineer at xxxxxxxxxxxxxxx, Bangalore from April 2019 till date
Educational Qualification:
Master of Information Science from the University of Southern Queensland, Sydney
Project Experience:
Project # 1
Project : Bank Data Mart
Client : NatWest Bank
Project Profile:
National Westminster Bank Plc, trading as NatWest, is a major retail and commercial bank in the United
Kingdom based in London, England. It was established in 1968 by the merger of National Provincial Bank
and Westminster Bank.
Role and Responsibilities:
Responsible for end-to-end Data Migration from RDBMS to Snowflake Cloud Data Warehouse
Bulk loading data from the external stage (AWS S3) into Snowflake using the COPY command.
Creating views and materialized views in the Snowflake cloud data warehouse for business analysis and
reporting.
Responsible for all activities related to the development, implementation, administration, and
support of ETL processes for the large-scale Snowflake cloud data warehouse.
Created task flows to automate data loading and unloading between the Snowflake data warehouse and
AWS S3.
Created internal and external stages and transformed data during load.
Created file formats, functions, views, etc.
Cloned databases for Dev and QA environments using Zero Copy Cloning.
Worked on Snowflake streams to process incremental records and for CDC (see the sketch below).
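A brief sketch of the cloning and stream-based CDC pattern described above; the database, schema, table, stream, and column names are illustrative placeholders rather than the actual project objects.
-- Zero-copy clone of the production database for a Dev environment
CREATE DATABASE dev_bankmart CLONE prod_bankmart;
-- Stream that records incremental changes (CDC) on a source table
CREATE OR REPLACE STREAM stg.customer_stream ON TABLE stg.customer_raw;
-- Consume only newly inserted rows from the stream into a downstream table
INSERT INTO dw.customer_delta (customer_id, customer_name, updated_at)
SELECT customer_id, customer_name, updated_at
FROM stg.customer_stream
WHERE METADATA$ACTION = 'INSERT';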
Project # 2
Project : Impacts on EDW
Client : Ally Financial, USA
Project Profile:
Ally Financial is a United States financial holding company. It provides financing and leasing capital to its
middle-market clients and their customers across more than 30 industries.
Role: Snowflake Data Engineer
• Implemented a Snowflake data pipeline (Snowpipe) to configure auto-ingestion from AWS S3 into Snowflake tables.
• The EDW consists of a landing layer, a staging layer, and downstream layers.
• In the EDW extract phase, data is extracted from upstream sources (i.e., file systems and Oracle DB) into the Snowflake DB.
• Used COPY statements to ingest data from external stages to database tables.
• Transformed and cleaned the data by removing null and duplicate values and applying data
qualifiers, then loaded it into the staging layer.
• Worked on Snowflake warehouses, roles, schemas, databases, and resource monitors.
• Performed continuous data transfer from the staging layer to the ODS layer, also called a delta load.
• Created file formats, stages and pipes in Snowflake database.
• Used complex SQL/PL-SQL to write stored procedures with MERGE statements that capture
insert, update, and delete operations on the DB (a MERGE sketch follows this list).
• Data Stage uses an SFTP file system for transferring data.
• As a software engineer, handled data cleaning and data manipulation by writing complex SQL
queries.
• Worked on Zero-Copy Cloning, Time Travel, and Fail-safe in Snowflake.
• Communicated with clients to baseline the requirements and assess the impact on other applications.
• Implemented ETL logic to transform data fetched from multiple tables using
joins.
• Worked closely with the testing team to prepare test data.
• Developed comprehensive test plans, test cases, and test scripts based on detailed analysis of business
requirements, functional specifications, and user stories.
• Implemented testing methodologies, including regression testing, user acceptance testing (UAT), and
performance testing, to validate the functionality, reliability, and performance of telecom systems.
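A minimal sketch of the auto-ingest pipe and the MERGE logic referred to in the bullets above, assuming illustrative stage, pipe, table, and column names (not the actual EDW objects).
-- Snowpipe that auto-ingests new files arriving in the S3 external stage
CREATE OR REPLACE PIPE edw.txn_pipe AUTO_INGEST = TRUE AS
  COPY INTO edw.landing_transactions
  FROM @edw.s3_landing_stage/transactions/
  FILE_FORMAT = (FORMAT_NAME = 'edw.csv_ff');
-- MERGE from the staging layer into the ODS layer, capturing insert/update/delete
MERGE INTO ods.customer AS tgt
USING stg.customer_delta AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED AND src.op_type = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET tgt.name = src.name, tgt.updated_at = src.load_ts
WHEN NOT MATCHED THEN INSERT (customer_id, name, updated_at)
  VALUES (src.customer_id, src.name, src.load_ts);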
Project # 3
Project : Western Union EDW
Client : Western Union.
Project Profile:
The Western Union Company is an American multinational financial services company headquartered in
Denver, Colorado. It is mainly focused on money transfer services. The main services are wire transfer, Western
Union Mobile, and Western Union Connect.
Role and Responsibilities:
Created Snowflake objects such as databases, schemas, tables, stages, sequences, views,
file formats, and procedures using SnowSQL and the Snowflake UI.
Loaded data from files staged in an external (Amazon S3) stage, copied the data into the target
tables, and queried the DW data. Worked on Amazon S3 components.
Experience in writing complex subqueries, complex joins, and views.
Cloned database schemas, tables, and procedures from the QA to the Prod environment.
Experience with Snowflake features such as Time Travel, data sharing, data cloning, secure views, and
clustering.
Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, and Snowflake streams to
implement SCD.
Developed SnowSQL scripts to load data from flat files into Snowflake tables.
Reduced Snowflake space used by adding transient tables where appropriate and ensuring
optimum clustering column definitions (see the sketch after this list).
Ensuring zero-defect and timely deployment of all the components.
Documenting all the deliverables for a smooth handover to the production team.
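A short sketch of the transient-table and Time Travel usage mentioned above; the table, columns, and clustering key are hypothetical examples only.
-- Transient staging table (no Fail-safe storage) with an explicit clustering key
CREATE OR REPLACE TRANSIENT TABLE stg.orders_stg (
  order_id   NUMBER,
  order_date DATE,
  amount     NUMBER(12,2)
)
CLUSTER BY (order_date);
-- Time Travel: query the table as it existed one hour ago (offset in seconds)
SELECT COUNT(*) FROM stg.orders_stg AT (OFFSET => -3600);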
Project # 4
Client : Otsuka Pharmaceuticals
Project Profile:
Otsuka Pharmaceutical Co., Ltd. is a pharmaceutical company headquartered in Tokyo, Osaka, and Naruto, Japan.
Their product specialties include neuroscience, digital innovation, nephrology, oncology, and medical devices.
Responsibilities: