SANDEEP | ETL Informatica Developer

SUMMARY:
 Results-oriented IT professional with over 11 years of comprehensive experience in Data Warehousing
technology and all phases of the Software Development Life Cycle (SDLC). Proven expertise in designing,
developing, implementing, and testing Data Warehousing applications in Banking, Financial, and Insurance
domains.
 Successfully managed end-to-end testing processes for Data Warehousing applications in Banking and Healthcare Insurance domains.
 Designed and implemented ETL components using Informatica PowerCenter, ensuring efficient data
movement from multiple sources to targets, Data Marts, and Data Warehouses.
 Expertise in performance tuning of Informatica mappings, identifying bottlenecks in source and target
systems.
 Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and historical data loads using Informatica.
 Applied data analysis and ETL techniques, including MD5-based change-detection logic for CDC loads, with a keen eye for data quality (a SQL sketch of this pattern appears after this summary).
 Proficient in SQL, with the ability to write and interpret complex SQL statements, and experience in SQL
optimization.
 Adept at working in Agile environments, contributing to the development of ETL Specification Documents
and maintaining project artifacts.
 Led multi-resource projects in an Onsite-Offshore model, serving as a mentor for junior team members.
 Collaborated effectively with onshore, offshore, and end-to-end teams to ensure project commitments were met.
 Demonstrated strong problem-solving, time management, and communication skills, working both
independently and cooperatively in a team environment.
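
A minimal SQL sketch of the MD5-driven CDC and SCD Type 2 pattern referenced above. Table and column names (stg_customer, dim_customer, row_md5) are illustrative, and in practice this hash comparison and versioning logic lived inside Informatica mappings rather than hand-written SQL; STANDARD_HASH assumes Oracle 12c or later.

```sql
-- Step 1: expire the current dimension row when the staged MD5 differs.
-- NVL guards keep NULL attributes from breaking the concatenated hash input.
UPDATE dim_customer d
   SET d.eff_end_date = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (
         SELECT 1
           FROM stg_customer s
          WHERE s.customer_id = d.customer_id
            AND STANDARD_HASH(NVL(s.name,'~') || '|' || NVL(s.address,'~'), 'MD5')
                <> d.row_md5);

-- Step 2: insert a fresh current version for new and just-expired customers.
INSERT INTO dim_customer
      (customer_id, name, address, row_md5,
       eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.name, s.address,
       STANDARD_HASH(NVL(s.name,'~') || '|' || NVL(s.address,'~'), 'MD5'),
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (
         SELECT 1
           FROM dim_customer d
          WHERE d.customer_id = s.customer_id
            AND d.current_flag = 'Y');
```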

TECHNICAL SKILLS

ETL TECHNOLOGY Informatica PowerCenter 10.4/9.6.1/9.0.1, Informatica Intelligent Cloud Services (IICS)
DATA WAREHOUSE Star Schema, Snowflake Schema
DATA MODELLING OLTP/OLAP system study, E-R modeling, Star Schema, Snowflake Schema

DATABASES Oracle 19c/12c/11g/10g, SQL Server 2017/2014/2012, MySQL
APPLICATIONS MS Office, TOAD 9.2/8.6
OTHERS Quest, SQL*Plus, SQL*Loader, WinSCP, PuTTY, Rally, JIRA
CHANGE DATA CAPTURE (CDC) Methodology using Informatica PowerCenter 10.x/9.x
TESTING Data Analysis, Data Validation, Data Verification, SQL optimization, and
performance tuning
DOCUMENTATION ETL Specification, Use Cases, source-to-target mapping, Impact Assessment
SCRIPTING Unix Shell Scripts
METHODOLOGIES Agile, Waterfall
PROFESSIONAL EXPERIENCE

Client: Anthem, Inc Oct 2022 to Present


Role: Informatica Developer
Description: Anthem, Inc. operates as a health benefits company in the United States, providing health benefit plans to large
and small groups, individuals, Medicaid, and Medicare markets. The project involved the integration of relevant
data from various source systems, including QNXT, Leon, and EZCAP, utilizing ETL Informatica tools. The
implementation of Reports and Interactive dashboards facilitated the company in tracking payments for
enhanced reporting capabilities. This comprehensive Analytics solution contributed to a more robust system for
managing Medicaid and Medicare programs, aligning with the company's commitment to quality health benefit
plans and anti-fraud measures.

Responsibilities:

 Worked with business users, stakeholders, and SMEs to gather requirements and actively participated in the
complete Software Development Life Cycle (SDLC) of the project.
 Analyzed source data and gathered business requirements, creating Technical Design Documents from
Business Requirements Documents (BRD).
 Analyzed business and system requirements to identify system impacts.
 Prepared source-to-target mappings and conducted meetings with the business to understand
data/transformation logic.
 Created Detail Technical Design Documents outlining ETL technical specifications.
 Analyzed existing mapping logic for code reusability.
 Created Mapping Parameters, Session parameters, Mapping Variables, and Session Variables.
 Conducted extensive performance tuning to enhance session performance by identifying bottlenecks at
various points like targets, sources, mappings, sessions, or systems.
 Created unit test plans and conducted unit testing using different scenarios for each process.
 Involved in system and regression testing, and supported UAT for the client.
 Managed ETL and database code migrations across environments using deployment groups.
 Populated business rules using mappings into the target tables.
 Involved in end-to-end system testing, performance and regression testing, and data validations.
 Performed performance tuning of SQL queries and addressed slow-running queries in production (see the diagnostic sketch after this list).
 Created batch scripts for automated database build deployment.
 Coordinated with various business users, stakeholders, and SMEs for functional expertise, design, business
test scenario review, UAT participation, and validation of data from multiple sources.
 Performed detailed data investigation and analysis of known data quality issues in related databases
through SQL.
 Actively involved in the analysis phase of business requirements and the design of Informatica mappings.
 Performed data validation, data auditing, and data cleansing activities to ensure high-quality deliveries.
 Used various transformations in Informatica, such as Source Qualifier, Expression, Look-up, Update Strategy,
Filter, Router, Joiner, etc., for developing mappings.
 Developed Informatica mappings for TYPE 1 and 2 Slowly Changing Dimensions.
 Created sessions and workflows for Informatica mappings.
 Created mappings with connected and unconnected lookups.
 Created various Mapplets as part of mapping design.
 Created effective test cases and performed unit and integration testing to ensure the successful execution of
the data loading process.
 Documented mappings, transformations, and Informatica sessions.
 Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
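
A minimal sketch of the slow-query diagnosis workflow referenced above, assuming Oracle and illustrative object names (claims, members); the actual fix varied per query, and a missing index is only one common outcome.

```sql
-- Capture and inspect the optimizer plan for a slow production query.
EXPLAIN PLAN FOR
SELECT c.claim_id, c.paid_amount
  FROM claims c
  JOIN members m ON m.member_id = c.member_id
 WHERE c.service_date >= DATE '2023-01-01';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- A full table scan on CLAIMS against a selective date predicate
-- typically points to a missing index:
CREATE INDEX idx_claims_service_date ON claims (service_date);
```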

Environment: Informatica PowerCenter 10.4, SQL, Oracle 12c, TOAD, SQL Server 2017, Unix, Rally, JIRA,
Control M

Client: JP Morgan Chase Oct 2020 to Sep 2022


Role: Informatica Developer (REMOTE)
Description: The client is an American multinational investment bank and financial services holding company. The
GCIB Data and Analytics project was developed to handle ESG and deal-related data and to generate reports for
clients.

Responsibilities:
 Contributed across all stages of the Software Development Life Cycle (SDLC), playing a pivotal role in
requirements gathering, design, development, testing, production, user training, and ongoing support for
the production environment.
 Actively engaged with business users, meticulously recording user requirements, and conducting
comprehensive Business Analysis to ensure alignment with project objectives.
 Illustrated the entire process flow, meticulously documenting data conversion, integration, and load
mechanisms, validating specifications for the project's success.
 Translated intricate high-level design specifications into straightforward ETL coding and mapping
standards, ensuring clarity and efficiency in the development process.
 Leveraged PowerCenter Designer tools to craft mappings for extracting and loading data from diverse
sources, including flat files and SQL server databases.
 Upheld warehouse metadata, adhering to naming standards and warehouse norms to lay a solid foundation
for future application development.
 Crafted detailed design and technical specifications, providing a blueprint for the successful execution of the
ETL process.
 Utilized Informatica as the primary ETL tool, developing source/target definitions, mappings, and sessions
to orchestrate the seamless extraction, transformation, and loading of data into staging tables from various
sources.
 Orchestrated the mapping and transformation of existing feeds into new data structures and standards
using Router, Lookups (Connected, Unconnected), Expression, Aggregator, Update Strategy, and stored
procedure transformations.
 Designed and implemented various complex mappings, specializing in Slowly Changing Dimension Type 1
and Type 2 transformations.
 Executed precise performance tuning at the mapping, session, source, and target levels, addressing criteria
and creating partitions to resolve performance issues.
 Engineered workflows encompassing command, email, session, decision, and a diverse range of tasks to
streamline data orchestration.
 Identified and rectified bugs in existing mappings by conducting comprehensive data flow analysis,
evaluating transformations, and implementing bug fixes.
 Conducted rigorous tuning of mappings based on criteria, creating partitions when necessary to optimize
performance.
 Implemented thorough data validation after successful end-to-end tests, incorporating robust error-handling
mechanisms into ETL processes (a reconciliation sketch follows this list).
 Developed Parameter files to facilitate dynamic value passing to mappings as per project requirements.
 Scheduled batches and sessions within Informatica using the Informatica scheduler, complemented by
customized pre-written shell scripts for job scheduling.
 Proficiently composed Shell Scripts for job scheduling in Autosys, enhancing automation and efficiency.
 Demonstrated extensive expertise in designing and implementing continuous integration, continuous
delivery, and continuous deployment through Jenkins.
 Facilitated code migration using CI/CD methodologies, employing GIT and Bitbucket for streamlined version
control.
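
A minimal sketch of the kind of post-load validation referenced above; stg_deals, dw_deals, deal_amount, and load_date are hypothetical names. A mismatch in counts or totals would route the load to the error-handling path.

```sql
-- Reconcile row counts and amount totals between the staged feed
-- and the warehouse target for today's load.
SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(deal_amount) AS total_amt
  FROM stg_deals
UNION ALL
SELECT 'TARGET', COUNT(*), SUM(deal_amount)
  FROM dw_deals
 WHERE load_date = TRUNC(SYSDATE);
```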

Environment: Informatica PowerCenter 10.5, Oracle 19c, Toad, Linux, Autosys, GIT, Bitbucket, Jira

Client: US Foods Oct 2017 to Sep 2020


Role: Informatica Developer (REMOTE)
Description: US Foods is one of America's great food companies and a leading foodservice distributor,
partnering with approximately 300,000 restaurants and foodservice operators to help their businesses
succeed. US Foods sells its products through an e-commerce channel, and this project integrates multiple
applications, from the time an order is created to the time the goods and services are delivered and billed.
Informatica is used to integrate data into an e-commerce order data mart; the reporting tools are Cognos and
Tableau. The objective of these data marts is to enable improved decision-making, strategic plans, support, and
solutions that favorably impact costs, quality, outcomes, and customer satisfaction through an
information-driven environment that leverages integrated data assets for competitive advantage.
Responsibilities:
 Orchestrated collaboration among diverse business users, stakeholders, and Subject Matter Experts (SMEs)
to gather functional expertise, conduct design reviews, scrutinize business test scenarios, participate in User
Acceptance Testing (UAT), and validate data from multiple sources.
 Proficiently navigated PowerCenter Designer tools, including Source Analyzer, Warehouse Designer,
Mapping Designer, Mapplet Designer, and Transformation Developer, to craft comprehensive data
integration solutions.
 Utilized the Debugger within Mapping Designer to meticulously test data flow between sources and targets,
troubleshooting any irregularities in mappings.
 Engineered intricate mappings using Mapping Designer, employing various transformations such as Source
Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and
Sorter transformations to load data from diverse sources.
 Applied SQL expertise to analyze source data, conduct data analysis, and validate data integrity.
 Scheduled Informatica Jobs seamlessly through the Control M scheduling tool, ensuring the timely execution
of critical data integration processes.
 Conducted thorough reviews of the existing system, providing valuable insights to establish a unified view of
the program.
 Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables, and Session
Parameters, ensuring dynamic and adaptable data processing.
 Crafted shell scripts to validate source files and monitor system space, and used the cron utility to schedule
these scripts.
 Engaged in the creation of Informatica mappings, Mapplets, worklets, and workflows, orchestrating the
seamless flow of data from various sources to the data warehouse.
 Played a pivotal role in facilitating load testing and benchmarking the developed product against set
performance standards, ensuring optimal performance.
 Tested databases using complex SQL scripts and effectively handled performance issues, demonstrating a
comprehensive understanding of database functionality (sample validation queries follow this list).
 Contributed to onsite and offshore coordination, ensuring the completeness of deliverables and fostering
effective communication between different teams.
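
Illustrative examples of the validation SQL referenced above; fact_orders and dim_customer are assumed names, not the project's actual schema.

```sql
-- Duplicate natural keys in the order fact table.
SELECT order_id, COUNT(*) AS dup_cnt
  FROM fact_orders
 GROUP BY order_id
HAVING COUNT(*) > 1;

-- Orphan check: fact rows pointing at a customer key missing from the dimension.
SELECT f.order_id
  FROM fact_orders f
  LEFT JOIN dim_customer d ON d.customer_key = f.customer_key
 WHERE d.customer_key IS NULL;
```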

Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL, SQL Server 2012, Cognos, JIRA

Client: ADP India Pvt Ltd Sept 2006 to Feb 2017


Role: Sr. Functional Consultant
Description: ADP is a global leader in HR technology, offering the latest AI- and machine-learning-enhanced
payroll, tax, HR, benefits, and much more. The role involved conducting data analytics, creating KPIs, designing
business solution workflows, and creating and executing associated test cases for various internal and external
client-facing applications and solutions. It spanned all aspects of the software development life cycle (SDLC),
working with clients, product managers, analysts, architects, and engineers to plan, design, develop, test, and
implement solutions consistent with the business objective.
Data Analytics:
 Design and develop analytical data models and KPIs to support static and ad-hoc reports, analytics and
research, dashboards, data mining, trending, predictive modeling, etc. (a sample trending query follows this
list).
 Deliver BI/Data content via portals, data visualization (Tableau/Power BI), etc.
 Design regression analytics and data trending charts using complex datasets.
 Research and determine links between disparate data sets.
 Develop complex queries against Mainframe, Oracle, DB2, etc.
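
A minimal sketch of a trending KPI query of the sort described above, assuming Oracle and a hypothetical payroll_runs table; month-over-month change is computed with an analytic LAG over the monthly aggregate.

```sql
-- Monthly gross-pay totals with month-over-month percentage change.
SELECT TRUNC(run_date, 'MM') AS run_month,
       SUM(gross_pay)        AS total_gross,
       ROUND(100 * (SUM(gross_pay)
                    - LAG(SUM(gross_pay)) OVER (ORDER BY TRUNC(run_date, 'MM')))
             / NULLIF(LAG(SUM(gross_pay)) OVER (ORDER BY TRUNC(run_date, 'MM')), 0),
             2)               AS pct_change_mom
  FROM payroll_runs
 GROUP BY TRUNC(run_date, 'MM')
 ORDER BY run_month;
```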

Data Testing:
 Design and develop data extractions and transformations using ETL tools to complete data research and
testing.
 Conduct data testing using both manual and automated tests.
 Develop independent process and control test scripts and procedures.
 Document testing phases and defects and report bugs and errors to appropriate teams.
 Provide feedback and support to developers to solve findings (help troubleshoot issues).
 Collaborate with the business team to design, develop, test, and refine deliverables.

Problem Solving:
 Comfortable working with complex data scenarios and with technology not used before, finding solutions as
inevitable challenges arise.
 Expert problem-solving skills, interpreting complex information and presenting the best way to move
forward.
 Technical and Business acumen to participate in the analysis, design, testing, and implementation of new
and existing business processes.
 Innovative, with a demonstrated ability to "think outside the box."

Communication & Teamwork:
 Communication skills with the ability to clearly explain complex issues and drive consensus.
 Ability to work cooperatively as part of a team, as well as independently.

Environment: IICS, SQL, AWS Aurora (PostgreSQL), Informatica ETL pipelines, data migrations, work-effort
estimation.
