Vindhya Pulicharla: Email: PH: 919-655-9520
7+ years of professional experience and a comprehensive technical skill set, with expertise in the development and implementation of Data Warehouses and Data Marts using DataStage 8.x/9.x/11.x and Informatica PowerCenter 7.x/8.x/9.x
Involved in all phases of the Software Development Life Cycle (SDLC): requirement gathering, analysis, design, development, testing, production support, and maintenance. Worked on projects following both Waterfall and Agile/Scrum methodologies.
Good knowledge of logical and physical data models, data warehouse concepts, and dimensional modeling using Star and Snowflake schemas.
Developed DataStage jobs, reusable sequences, and activities for daily processes that load heterogeneous data into the data warehouse.
Extensively used stages like Sequential File, Data Set, DB connectors (ODBC, Oracle, Netezza), Aggregator, Join, Transformer, Lookup, Change Capture, Remove Duplicates, Peek, Copy, and Pivot.
Executed sequences, both sequential and concurrent, for efficient execution of jobs, and used activities such as Job Activity, Set Variables, Execute Command, Email Notification, Sequencer, Nested Condition, Start Loop, and End Loop.
Experience working with Informatica components such as workflows, mappings, mapplets, sessions, tasks, the Debugger, partitioning, and reusable components; extensively worked with session and workflow logs for error handling and troubleshooting mapping failures.
Good experience developing Informatica mappings, mapplets, reusable transformations, tasks, sessions, and workflows for daily processes that load heterogeneous data into the data warehouse.
Sources include delimited flat files, fixed-width files, XML files, and DB2 & Oracle tables.
Extensively used transformations like Router, Aggregator, Joiner, Expression, Lookup, Update Strategy, Union, Normalizer, and Sequence Generator.
Executed sessions, both sequential and concurrent, for efficient execution of mappings, and used tasks such as Event Wait, Event Raise, Email, and Command.
Good experience working with various database systems such as DB2, Oracle, and SQL Server.
Identified and streamlined the commonly used logic into mapplets and reusable sessions.
Proficient in writing data clean-up scripts using SQL queries and UNIX scripts.
Good experience setting up new jobs and job dependencies using Control-M.
Added appropriate dependencies at the job and batch-process level for each job, and quantitative resources for each database/resource pool in the scheduling tool, to avoid deadlock, timeout, and connection issues.
Maintained the integrity of UNIX components by checking code in and out of StarTeam/RTC code-versioning tools.
Experience working in an onsite-offshore model.
Excellent communication, interpersonal, problem-solving, and analytical skills; flexible, self-directed, able to work with minimal supervision, and a team player.
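Several bullets above mention data clean-up scripts written with SQL and UNIX scripts. A minimal, hypothetical sketch of one such clean-up pass (table and column names are invented; Python's sqlite3 stands in for the actual database):

```python
import sqlite3

# Hypothetical staging table; names are illustrative, not from an actual project.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customer (cust_id INT, name TEXT, load_ts TEXT)")
conn.executemany(
    "INSERT INTO stg_customer VALUES (?, ?, ?)",
    [(1, "Ann", "2016-01-01"), (1, "Ann", "2016-01-02"), (2, "Bob", "2016-01-01")],
)

# Keep only the latest row per cust_id, deleting earlier duplicates --
# a common pattern for pre-load clean-up scripts.
conn.execute("""
    DELETE FROM stg_customer
    WHERE load_ts < (SELECT MAX(s2.load_ts)
                     FROM stg_customer s2
                     WHERE s2.cust_id = stg_customer.cust_id)
""")
rows = conn.execute(
    "SELECT cust_id, load_ts FROM stg_customer ORDER BY cust_id"
).fetchall()
print(rows)  # [(1, '2016-01-02'), (2, '2016-01-01')]
```

In practice the same statement would be run against DB2/Oracle from a UNIX shell wrapper rather than from Python.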
EDUCATION
MASTER OF COMPUTER APPLICATIONS, SRI VENKATESWARA UNIVERSITY, TIRUPATI, AP, INDIA - MAY 2006
SKILLS
ETL Tools – DataStage 8.1/9.x/11.x, Informatica PowerCenter 7.x/8.x/9.x
Database Systems – IBM DB2, SQL Server 2005/2008/2014, Oracle, Postgres
Databases Tools – Squirrel, Oracle SQL developer, SQL Server Management Studio
BI Tools – Tableau 10.x
Version Control – StarTeam, IBM RTC (Rational Team Concert)
Languages – UNIX Shell Scripting
Scheduling Tools – Control-M
Other Tools – Putty, WinSCP, Visio
TRAININGS
Data warehousing & Informatica Training – Mphasis, Chennai
DataStage (Self-paced learning)
Agile & Scrum (Self-Paced learning)
Data Visualization with Tableau – University of California, Davis & Coursera
Alteryx – (Alteryx Academy- In Progress)
EXPERIENCE
United Services Automobile Association, San Antonio, TX Oct 2016 – Present
USAA, ranked #1 in Best Places to Work in 2010, 2011 & 2012 by Computerworld and among the top 50 Fortune 500 companies, with total assets of $22 billion and 23,400 employees, is one of the nation's top banking, insurance, investment & advice organizations. As a lead, I was responsible for projects across domains such as Auto, P&C, Banking & Investments, and Life Insurance campaign executions.
Responsibilities:
Collaborated with project managers to prepare designs for the project scope and performed risk analysis on the same.
Collected, organized, and generated new documentation such as high-level technical documents, low-level functional documents, and data flow diagrams.
Designed and developed medium to complex DataStage jobs using stages such as the Oracle & DB2
connectors, Aggregator, Sequential File stage, Dataset stage, Transformer, Lookup, Filter, Remove
Duplicates, Change Capture, FTP stage & Sort stages.
Used DataStage sequences for creating, validating, testing, and running DataStage jobs in a sequential/parallel process flow to process and load full and incremental data into the target system.
Created numerous simple to complex queries involving self joins and correlated sub-queries.
Designed and developed several SQL processes to extensively test the ETL process across environments.
Worked with DBAs to prepare improvement plans for the extraction & load processes.
Collaborated with cross-functional teams to execute all project deployments.
Maintained the integrity of DataStage/UNIX components by checking code in and out of RTC (IBM Rational Team Concert).
Created ETL overview & code walkthrough documents for maintenance teams.
Documented all unit test cases for all tables across the hops and uploaded them to the project site for future reference.
Provided support during implementation and rollout activities.
Added appropriate dependencies at the job and batch-process level for each job, and quantitative resources for each database/resource pool in the scheduling tool, to avoid deadlock, timeout, and connection issues.
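The self-join and correlated sub-query work above can be illustrated with a small hypothetical example (table and column names are invented; sqlite3 stands in for Oracle/DB2):

```python
import sqlite3

# Illustrative load-audit data; schema and values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE load_audit (job TEXT, run_dt TEXT, row_cnt INT)")
conn.executemany("INSERT INTO load_audit VALUES (?, ?, ?)", [
    ("dim_acct", "d1", 100), ("dim_acct", "d2", 500),
    ("fact_txn", "d1", 900), ("fact_txn", "d2", 300),
])

# Correlated sub-query: find runs whose row count exceeds the average
# row count for that same job -- the inner query re-runs per outer row.
sql = """
    SELECT job, run_dt, row_cnt
    FROM load_audit a
    WHERE row_cnt > (SELECT AVG(b.row_cnt)
                     FROM load_audit b
                     WHERE b.job = a.job)
    ORDER BY job
"""
rows = conn.execute(sql).fetchall()
print(rows)  # [('dim_acct', 'd2', 500), ('fact_txn', 'd1', 900)]
```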
Tools & Technologies: DataStage 9.x/11.3, UNIX, Oracle, DB2, Netezza, Control-M, SQL Developer, RTC (IBM Rational Team Concert), HDFS, Hive
Responsibilities:
Prepared High-Level and Low-Level Designs based on the project's functional and business requirement documents.
Interacted with the requirements and architecture teams to gain an understanding of the business logic.
Conducted review sessions with SMEs and business users for a better understanding of the requirements.
Extensively used ETL processes to load data from flat files into the target database, applying business logic in transformation mappings to insert and update records during load.
Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
Extensively used various transformations like Filter, Router, Sequence Generator, Lookups, Update
Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
Extensively used Mapping Variables, Mapping Parameters, and Parameter Files for capturing delta loads.
Worked with Slowly Changing Dimensions, Type 1 and Type 2.
Created and executed unit test cases.
Kept track of reported defects and supported other teams in resolving them.
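The SCD Type 2 work mentioned above boils down to expiring the current row and inserting a new current version on each change. A minimal Python sketch of that pattern; the record layout and field names are invented for illustration:

```python
from datetime import date

def scd2_upsert(dim, key, attrs, today):
    """SCD Type 2: close out the current row when attributes changed,
    then insert a new current row. `dim` is a list of dicts standing in
    for the dimension table; names are illustrative."""
    current = next((r for r in dim if r["key"] == key and r["curr"]), None)
    if current and current["attrs"] == attrs:
        return dim                      # no change: nothing to do
    if current:                         # change: expire the old version
        current["curr"] = False
        current["end_dt"] = today
    dim.append({"key": key, "attrs": attrs, "eff_dt": today,
                "end_dt": None, "curr": True})
    return dim

dim = []
scd2_upsert(dim, 101, {"city": "Austin"}, date(2016, 1, 1))
scd2_upsert(dim, 101, {"city": "Dallas"}, date(2016, 6, 1))
print(len(dim), dim[0]["curr"], dim[1]["curr"])  # 2 False True
```

Type 1 is the simpler case: overwrite the attributes in place and keep no history.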
Tools & Technologies: Informatica Power Center 9.1, UNIX, DB2, TOAD, SQL*Loader
Responsibilities:
Developed Informatica mappings to load data into various dimensions and fact tables from various
source systems.
Created and managed Source-to-Target mapping documents for all fact and dimension tables.
Designed and developed mappings involving different sources, including flat files and relational tables from heterogeneous databases such as Oracle, SQL Server, and DB2.
Designed and developed medium to complex Informatica mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, Stored Procedure, and Update Strategy.
Used Workflow Manager for creating, validating, testing, and running sequential and parallel sessions that perform full and incremental loads to the target system.
Designed and developed medium to complex DataStage jobs using stages such as the Oracle & DB2
connectors, Aggregator, Sequential File stage, Dataset stage, Transformer, Lookup, Filter, Remove
Duplicates, Change Capture, FTP stage & Sort stages.
Used DataStage sequences for creating, validating, testing, and running DataStage jobs in a sequential/parallel process flow to process and load full and incremental data into the target system.
Extensively worked with Slowly Changing Dimensions (SCD) Type 1 & Type 2 for data loads.
Created numerous simple to complex queries involving self joins and correlated sub-queries.
Identified and created various test scenarios for unit testing the data loaded in the target.
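The full vs. incremental loads above rest on the same idea regardless of tool: persist a watermark between runs and extract only rows newer than it. A hypothetical sketch (function and field names are invented):

```python
# Delta-load sketch: a persisted watermark plays the role that mapping
# variables / parameter files play in Informatica. Names are illustrative.
def extract_delta(rows, last_ts):
    """Return rows newer than the watermark, plus the advanced watermark."""
    delta = [r for r in rows if r["upd_ts"] > last_ts]
    new_ts = max((r["upd_ts"] for r in delta), default=last_ts)
    return delta, new_ts

source = [{"id": 1, "upd_ts": 10}, {"id": 2, "upd_ts": 20}, {"id": 3, "upd_ts": 30}]
delta, wm = extract_delta(source, 15)   # first incremental run
print([r["id"] for r in delta], wm)     # [2, 3] 30
delta2, wm = extract_delta(source, wm)  # nothing new on the next run
print(delta2, wm)                       # [] 30
```

A full load is the degenerate case: run with the watermark reset to its minimum value.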
Tools & Technologies: Informatica 8.6, DataStage 8.0, Unix, Oracle, Db2, Control-m, SQL Developer,
StarTeam
Responsibilities:
Developed Informatica mappings to load data into various dimensions and fact tables from various
source systems.
Created and managed Source-to-Target mapping documents for all fact and dimension tables.
Designed and developed mappings involving different sources, including flat files and relational tables from heterogeneous databases such as Oracle, SQL Server, and DB2.
Designed and developed medium to complex Informatica PowerCenter mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, Stored Procedure, and Update Strategy.
Used Workflow Manager for creating, validating, testing, and running sequential and parallel sessions that perform full and incremental loads to the target system.
Extensively worked with Slowly Changing Dimensions (SCD) Type 1 & Type 2 for data loads.
Created pre- and post-session SQL commands in sessions and mappings on the target instance.
Created numerous simple to complex queries involving self joins and correlated sub-queries.
Developed and tested all the backend programs, Informatica mappings, sessions, and workflows.
Tools & Technologies: Informatica 8.6, UNIX shell scripting, Oracle, SQL Developer, SQL Server 2008, TFS, SQL Server Management Studio
Mphasis
Client: General Motors, Chennai, India Jul 2006 - Mar 2009
General Motors, more commonly known as GM, is a multinational automotive company operating in the USA and Canada. It produces cars and trucks and sells them through many brands, such as Chevrolet, GMC, Cadillac, Opel, and Vauxhall, among many more.
Responsibilities:
Assisted in gathering business requirements and worked closely with various application and business teams to develop the data model and ETL procedures for the data warehouse design.
Extensively used the Informatica ETL tool to extract data stored in MS SQL 2003, CSV files, and flat files, and load it into a data mart.
Used various active and passive transformations such as Aggregator, Expression, Sorter, Router,
Joiner, connected/unconnected Lookup, and Update Strategy transformations for data control,
cleansing, and data movement.
Designed and developed Mapplets for faster development, standardization and reusability purposes.
Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating target tables to maintain history.
Used the Debugger to validate transformations by creating breakpoints to analyze and monitor data flow.
Tuned the performance of Informatica sessions by increasing block size, data cache size, sequence buffer length, and target-based commit interval, and of mappings by dropping and recreating indexes.
Involved in pre-and post-session migration planning for optimizing data load performance.
Performed Unit testing during the mapping phase to ensure proper and efficient implementation of
the transformations.
Worked with the QA team and provided warranty support.
Tools & Technologies: Informatica Power Center 7.x, Oracle, SQL developer, PL/SQL, UNIX Shell Scripting,
SQL Server 2003