VINDHYA PULICHARLA

Email: [email protected] PH: 919-655-9520

7+ years of professional experience with a comprehensive technical skill set and expertise in the
development and implementation of Data Warehouses and Data Marts using DataStage 8.x/9.x/11.x and
Informatica PowerCenter 7.x/8.x/9.x

 Involved in all phases of the Software Development Life Cycle (SDLC): requirement gathering, analysis,
design, development, testing, production support, and maintenance. Worked on both Waterfall and
Agile/Scrum projects.
 Good knowledge of logical data models, physical data models, data warehouse concepts, and
dimensional modeling using Star schema and Snowflake schema.
 Developed DataStage jobs, sequences, reusable jobs/sequences, and activities for daily processes that
load heterogeneous data into the data warehouse.
 Extensively used stages such as Sequential File, Dataset, DB connectors (ODBC, Oracle, Netezza),
Aggregator, Join, Transformer, Lookup, Change Capture, Remove Duplicates, Peek, Copy, and
Pivot.
 Executed sequences, both sequential and concurrent, for efficient execution of jobs, and used
activities such as Job Activity, Set Variables, Exec Command, Email Notification, Sequencer, Nested
Condition, Start Loop, and End Loop.
 Experience working with Informatica components such as workflows, mappings, mapplets, sessions,
tasks, the Debugger, partitioning, and reusable components; extensively worked with session logs and
workflow logs for error handling and troubleshooting mapping failures.
 Good experience developing Informatica mappings, mapplets, reusable transformations, tasks,
sessions, and workflows for daily processes that load heterogeneous data into the data warehouse.
Sources include delimited flat files, fixed-width files, XML files, and DB2 & Oracle tables.
 Extensively used transformations such as Router, Aggregator, Joiner, Expression, Lookup, Update
Strategy, Union, Normalizer, and Sequence Generator.
 Executed sessions, both sequential and concurrent, for efficient execution of mappings, and used
tasks such as Event Wait, Event Raise, Email, and Command.
 Good experience working with database systems such as DB2, Oracle, and SQL Server.
 Identified and streamlined the commonly used logic into mapplets and reusable sessions.
 Proficient in writing data clean-up scripts using SQL queries and UNIX scripts.
 Good experience setting up new jobs and job dependencies in Control-M.
 Added appropriate dependencies at the job and batch-process level for each job, and configured
quantitative resources for each database/resource pool in the scheduling tool to avoid deadlocks,
timeouts, and connection issues.
 Maintained the integrity of UNIX components by checking code in and out of
StarTeam/RTC, code versioning tools.
 Experience working in an onsite-offshore model.
 Excellent communication, interpersonal, problem-solving, and analytical skills; flexible, self-directed,
able to work with minimum supervision, and a team player
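The data clean-up scripting mentioned above can be illustrated with a small UNIX script. This is only a sketch of the idea; the file names, delimiter, and sample records are hypothetical, not from any actual project:

```shell
#!/bin/sh
# Hypothetical sample of a raw pipe-delimited extract (invented for illustration).
printf 'ACCT001 |  Smith \n\nACCT002|Jones\nACCT001 |  Smith \n' > raw_extract.dat

# Clean-up: trim whitespace around delimiters and at line ends,
# drop blank lines, then remove exact duplicate records.
sed -e 's/[[:space:]]*|[[:space:]]*/|/g' \
    -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//' raw_extract.dat \
  | awk 'NF' | sort -u > clean_extract.dat

cat clean_extract.dat
```

Scripts of this shape are typically run as a pre-load step so the downstream ETL jobs receive consistent, de-duplicated input.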
EDUCATION
MASTERS IN COMPUTER APPLICATIONS, SRI VENKATESWARA UNIVERSITY, TIRUPATI, AP,
INDIA - MAY 2006

BACHELORS IN COMPUTER APPLICATIONS, SRI VENKATESWARA UNIVERSITY, TIRUPATI, AP, INDIA -
MAY 2001

SKILLS
ETL Tools – DataStage 8.1/9.x/11.x, Informatica PowerCenter 7.x/8.x/9.x
Database Systems – IBM DB2, SQL Server 2005/2008/2014, Oracle, Postgres
Databases Tools – Squirrel, Oracle SQL developer, SQL Server Management Studio
BI Tools – Tableau 10.x
Version Control – StarTeam, IBM RTC (Rational Team Concert)
Languages – UNIX Shell Scripting
Scheduling Tools – Control-M
Other Tools – Putty, WinSCP, Visio

TRAININGS
 Data warehousing & Informatica Training – Mphasis, Chennai
 DataStage (Self-paced learning)
 Agile & Scrum (Self-Paced learning)
 Data Visualization with Tableau – University of California -Davis & Coursera
 Alteryx – (Alteryx Academy- In Progress)

EXPERIENCE
United Services Automobile Association, San Antonio, TX Oct 2016 – Present
USAA, ranked #1 in Best Places to Work in 2010, 2011 & 2012 by Computerworld and among the top 50
Fortune 500 companies, has total assets of $22 billion and 23,400 employees, and is one of the nation’s
top banking, insurance, investment & advice organizations. As a lead, I was responsible for projects
across domains such as Auto, P&C, Banking & Investments, and Life Insurance

Role: DataStage Developer


Project: Social Media Analytics.
The Operations Excellence (OE) team utilizes a 3rd-party platform to monitor and maintain its social media
presence. 3rd-party tools do a good job of filtering all posts from the various social media sources (e.g.,
Twitter, Facebook, G+, Instagram, etc.) into a subset that is manageable for the Operations team, but this has
resulted in a social media process that exists in a silo. One of the chief goals of nearly all organizations today
is to enable data-driven decisions and actions. The goal is to enable users to push beyond canned reports
and limited spreadsheet views and take advantage of more advanced data visualization and analytics, so that
users can accelerate the exploration and discovery of valuable insights and apply them for business advantage

Project: Anticipate, Detect & Respond (ADR) – Predictive Analytics.


Marketing the right offers to the right member at the right time is the major task for the marketing business.
This project provides predictive analytics capability for the marketing business to identify eligible
members at the right time for a specific offer. Separate models are designed to identify the eligible members
for each offer. Members are scored for each major offer through analytical modeling tools, and the model
score data is stored on Hadoop servers. Model score data is extracted through Hive, processed, and loaded
through an ETL batch process into the Marketing & Sales environment for campaign executions

Project: Enterprise Relation Sales Metrics


The project is aimed at deriving Enterprise KPI metrics for Member Service Representatives based on
their ability to sell different insurance, banking, and finance products. The ETL work involved sourcing data
from the operational database, cleansing the data, deriving different metrics, and loading the data into the
mart tables. All the infrastructure code required for ETL was developed on the UNIX platform. Different
categories of reports are generated using SAP BusinessObjects for the end users.

Project: Marketing Channel Analytics


The objective of this project is to automate Marketing campaign channel analysis, reducing manual
intervention and the use of multiple data stores. The Marketing Campaign Channel Analysis application
helps business users understand the differences between the projections and the actuals of the
campaigns. (For example, 100,000 customers were expected to be reached through the email channel, but
the actual was 75,000; the business then analyzes the reasons behind the difference.)

Responsibilities:
 Collaborated with project managers to prepare designs for the project scope and performed risk
analysis on them
 Collected, organized, and generated new documentation such as high-level technical documents,
low-level functional documents, and data flow diagrams
 Designed and developed medium to complex DataStage jobs using stages such as the Oracle & DB2
connectors, Aggregator, Sequential File stage, Dataset stage, Transformer, Lookup, Filter, Remove
Duplicates, Change Capture, FTP stage & Sort stages.
 Used DataStage sequences for creating, validating, testing, and running DataStage jobs in a
sequential/parallel process flow to process and load full and incremental data into the target system.
 Created numerous simple to complex queries involving self-joins and correlated sub-queries.
 Designed and developed several SQL processes to extensively test the ETL process across the
environments
 Worked with DBAs to prepare plans for improving extraction & load processes.
 Collaborated with cross-functional teams to execute all project deployments.
 Maintained the integrity of DataStage/UNIX components by checking code in and out of
RTC (IBM Rational Team Concert)
 Created ETL overview & Code walk through documents for maintenance teams.
 Documented all unit test cases for all tables across the hops and uploaded them to the project site for
future reference
 Provided support during implementation and roll-out activities
 Added appropriate dependencies at the job and batch-process level for each job, and configured
quantitative resources for each database/resource pool in the scheduling tool to avoid deadlocks,
timeouts, and connection issues.
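The job-dependency handling described above can be illustrated with a simple flag-file check between a predecessor and a successor job. This is only a sketch of the concept; Control-M manages such in/out conditions itself, and the job and flag names here are hypothetical:

```shell
#!/bin/sh
# Hypothetical dependency check: the fact load may only start once the
# dimension load has signalled completion. Names are illustrative only.
FLAG_DIR=./flags
mkdir -p "$FLAG_DIR"
touch "$FLAG_DIR/load_dims.ok"        # simulate: predecessor job finished

if [ -f "$FLAG_DIR/load_dims.ok" ]; then
    echo "predecessor complete - starting fact load"
    # ... the fact-load DataStage job would run here ...
    touch "$FLAG_DIR/load_facts.ok"   # signal successor jobs
else
    echo "predecessor not complete - fact load held" >&2
    exit 1
fi
```

Expressing dependencies this way (rather than relying on fixed start times) is what prevents two jobs from contending for the same database resources and deadlocking.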
Tools & Technologies: DataStage 9.x/11.3, UNIX, Oracle, DB2, Netezza, Control-M, SQL Developer, RTC
(Rational Team Concert), HDFS, Hive

State Farm Insurance, Bloomington, IL Feb 2016 – Sep 2016


State Farm was founded in 1922 as a mutual automobile insurance company owned by its policyholders.
The firm specialized in auto insurance for farmers and later expanded services into other types of insurance,
such as homeowners and life insurance, and then to banking and financial services.

Role: Informatica Developer


Project: Marketing Data Mart Enhancements
This project is to enhance customer interaction and experience by using a state-of-the-art campaign
execution process through IBM Unica Campaign. Prior to this project, the business team was using SAS to
manage the campaign execution process. The SAS programs are complex and hard to maintain, so the
business chose IBM Unica as the marketing platform to ease campaign execution. Historical data
needed to be loaded into the marketing data mart, and a process needed to be built using Informatica to
load the incremental data.

Responsibilities:
 Prepared High-Level Design and Low-Level Design documents based on the Functional and Business
Requirement documents of the project.
 Interacted with the requirements team and the architecture team to gain an understanding of the
business logic.
 Conducted review sessions with SMEs and business users for a better understanding of the
requirements.
 Extensively used ETL processes to load data from flat files into the target database, applying
business logic in transformation mappings for inserting and updating records during the load.
 Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner
transformation.
 Extensively used various transformations like Filter, Router, Sequence Generator, Lookups, Update
Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
 Extensively used Mapping Variables, Mapping Parameters, and Parameter Files for capturing
delta loads.
 Worked with Slowly Changing Dimensions, Type 1 and Type 2
 Created and executed unit test cases
 Kept track of reported defects and supported other teams in resolving them
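The use of parameter files for delta loads can be sketched with a small wrapper script that maintains a last-run bookmark and writes the workflow's parameter file before each run. The folder/workflow names, parameter names, and bookkeeping file are hypothetical; the `[folder.WF:workflow]` header follows the usual PowerCenter parameter-file layout:

```shell
#!/bin/sh
# Hypothetical driver for an incremental (delta) load: each run extracts
# only rows changed since the previous successful run.
LAST_RUN_FILE=last_run.txt
[ -f "$LAST_RUN_FILE" ] || echo '1970-01-01 00:00:00' > "$LAST_RUN_FILE"
LAST_RUN=$(cat "$LAST_RUN_FILE")
NOW=$(date '+%Y-%m-%d %H:%M:%S')

# Write the parameter file the workflow reads at start-up.
cat > wf_load_mart.param <<EOF
[MKT_FOLDER.WF:wf_load_mart]
\$\$LAST_EXTRACT_TS=$LAST_RUN
\$\$CURRENT_EXTRACT_TS=$NOW
EOF

# ... pmcmd would launch the workflow here ...

# After a successful run, advance the bookmark for the next cycle.
echo "$NOW" > "$LAST_RUN_FILE"
```

The mapping would then filter its source with a condition like `updated_ts > $$LAST_EXTRACT_TS`, so only the delta flows through.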
Tools & Technologies: Informatica Power Center 9.1, UNIX, DB2, TOAD, SQL*Loader

United Services Automobile Association, San Antonio, TX Sep 2010 – Oct 2011


USAA, ranked #1 in Best Places to Work in 2010 & 2011 by Computerworld and among the top 50
Fortune 500 companies, has total assets of $22 billion and 23,400 employees, and is one of the nation’s
top banking, insurance, investment & advice organizations. As a lead, I was responsible for projects
across domains such as Auto, P&C, Banking & Investments, and Life Insurance

Role: ETL Developer (Informatica/DataStage)


Project: EIA Retirement (Bank Pre-Approvals & Mortgage Feeds)
This is an Informatica-to-DataStage migration project. The Enterprise Integration Area (EIA) application is
being retired as part of modernization and to improve the member experience. EIA is the main data
source for major applications such as Marketing, Bank Pre-Approvals, Mortgage, and P&C member data.
Marketing applications moved off EIA as part of modernizing the marketing & sales process and
influenced other applications to move off EIA to cut the huge infrastructure costs. This phase of the
project redesigns and redevelops the Bank Pre-Approval and Mortgage processes using DataStage

Project: Reporting Sandbox


Reporting Sandbox is a banking solution that gathers and processes credit & debit information about
customers on a periodic basis and loads it into a data mart for reporting and analytics. Based on a
customer's financial flexibility, solutions are tailored and marketed to that customer. The sandbox
environment has live data for ad-hoc reporting, identifying potential solutions, and estimating ROI

Responsibilities:
 Developed Informatica mappings to load data into various dimensions and fact tables from various
source systems.
 Created and managed Source-to-Target mapping documents for all fact and dimension tables
 Designed and developed mappings involving different sources, including flat files and
relational tables from heterogeneous databases such as Oracle, SQL Server, and DB2.
 Designed and developed medium to complex Informatica mappings using transformations such as
Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator,
Stored Procedure, and Update Strategy.
 Used Workflow Manager for creating, validating, testing, and running sequential and parallel
sessions that perform full and incremental loads to the target system.
 Designed and developed medium to complex DataStage jobs using stages such as the Oracle & DB2
connectors, Aggregator, Sequential File, Dataset, Transformer, Lookup, Filter, Remove
Duplicates, Change Capture, FTP, and Sort stages.
 Used DataStage sequences for creating, validating, testing, and running DataStage jobs in a
sequential/parallel process flow to process and load full and incremental data into the target system.
 Extensively worked with Slowly Changing Dimensions (SCD) Type1 & Type2 for Data Loads.
 Created numerous simple to complex queries involving self-joins and correlated sub-queries.
 Identified and created various test scenarios for Unit testing the data loaded in target.
Tools & Technologies: Informatica 8.6, DataStage 8.0, UNIX, Oracle, DB2, Control-M, SQL Developer,
StarTeam

Frost Bank, San Antonio, TX Aug 2009 – Sep 2010


Frost Bank is a Texas-chartered bank founded in 1868 and based in San Antonio, with 139 branches
across the state. Frost is one of the largest Texas-based banks. Frost offers a full range of commercial and
consumer banking products, investment and brokerage services and insurance products to customers
throughout Texas.

Role: Informatica Developer


Project: Marketing Channel Analytics
The objective of this project is to automate Marketing campaign channel analysis, reducing manual
intervention and the use of multiple data stores. The Marketing Campaign Channel Analysis application
helps business users understand the differences between the projections and the actuals of the campaigns.
(For example, 100,000 customers were expected to be reached through the email channel, but the actual
was 75,000; the business then analyzes the reasons behind the difference.)

Project: Customer Segmentation


The purpose of this initiative is to expand customer profiles with additional information gathered as part of
multiple interactive campaigns and surveys. This will enable business team to have a more in-depth
understanding of customer preferences, habits and demographic information. As a result, future campaigns
can be better focused in terms of which customers to target, when they should be targeted and the type of
message that would be most persuasive.

Project: Credit Risk Analytics


The objective of this program is to capture and provide the most accurate risk information in a timely
manner to various stakeholders in the Global Credit Risk Management (GCRM) group

Responsibilities:
 Developed Informatica mappings to load data into various dimensions and fact tables from various
source systems.
 Created and managed Source to Target mapping documents for all Facts and Dimension tables
 Designed and developed mappings involving different sources, including flat files and
relational tables from heterogeneous databases such as Oracle, SQL Server, and DB2.
 Designed and developed medium to complex Informatica PowerCenter mappings using
transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank,
Sequence Generator, Stored Procedure, and Update Strategy.
 Used Workflow Manager for creating, validating, testing, and running sequential and parallel
sessions that perform full and incremental loads to the target system.
 Extensively worked with Slowly Changing Dimensions (SCD) Type1 & Type2 for Data Loads.
 Created Pre/Post Session SQL commands in sessions and mappings on the target instance.
 Created numerous simple to complex queries involving self-joins and correlated sub-queries.
 Developed and tested all the backend programs, Informatica mappings, sessions and workflows
Tools & Technologies: Informatica 8.6, UNIX shell scripting, Oracle, SQL Developer, SQL Server 2008, TFS,
SQL Server Management Studio

Mphasis
Client: General Motors, Chennai, India Jul 2006 - Mar 2009
General Motors – more commonly known as GM – is a multinational automotive company headquartered in
the USA. It produces cars and trucks and sells them through many brands such as Chevrolet, GMC,
Cadillac, Opel, and Vauxhall, among others.

Role: Informatica Developer


Project: Operational Risk Management (ORM)
The ORM (Operational Risk Management) project is to build a data mart for operational reporting. The data
in the ORM data mart will allow business teams to align their risk management systems with the One Risk
Strategic Plan objectives: maintain capital adequacy, maintain market confidence, deliver stable
earnings growth, and ensure stable and efficient access to funding and liquidity

Project: Financial Analytics for Fiscal Year IT Spends


This project is aimed at building a financial analytics solution for the company’s IT expenditure. It will help
the Program Managers and Planning Liaisons analyze IT execution and plan financials at different
levels of the organization and help them devise strategies for optimizing future costs and plans.

Project: Lead Time Accuracy (LTA)


LTA is a Web Intelligence application that allows business groups to leverage historical
lead-time information in order to improve replenishment accuracy and responsiveness to changes that
occur within the supply chain. The data from dispatch of goods (international/domestic vendors) to
receipt of the goods at stores is stored in 9 source systems (each segment's data is stored in a separate
source system). Data from these source systems has to be extracted, transformed according to the
requirements, and loaded into the replenishment mart

Project: Test Data Management and Sanitization (TDMS)


The objective of the project is to mask client-sensitive data from the production environment and load it
into non-production environments for various management purposes. The sensitive data includes
personally identifiable information (PII) such as employee ID, account number, SSN, and passport number.
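The masking step can be sketched with a short UNIX command over a delimited extract. The file layout is hypothetical (column 2 = account number, column 3 = SSN), as is the sample record; a real TDMS process would use the project's actual masking rules:

```shell
#!/bin/sh
# Hypothetical production extract: name|account_number|ssn
printf 'Smith|900123456|123-45-6789\n' > prod_extract.dat

# Overwrite the PII columns with fixed-format mask values before the
# file is copied into a non-production environment.
awk -F'|' 'BEGIN{OFS="|"} { $2="XXXXXXXXX"; $3="XXX-XX-XXXX"; print }' \
    prod_extract.dat > masked_extract.dat

cat masked_extract.dat
```

Fixed-format masks keep field widths and shapes intact, so downstream test loads behave like production without exposing real identifiers.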

Responsibilities:
 Assisted in gathering business requirements and worked closely with various application and
business teams to develop the data model and ETL procedures for designing the data warehouse.
 Extensively used the Informatica ETL tool to extract data stored in MS SQL 2003, CSV files, and flat
files, and loaded it into a data mart.
 Used various active and passive transformations such as Aggregator, Expression, Sorter, Router,
Joiner, connected/unconnected Lookup, and Update Strategy transformations for data control,
cleansing, and data movement.
 Designed and developed Mapplets for faster development, standardization and reusability purposes.
 Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating target tables
to maintain history.
 Used Debugger to validate transformations by creating break points to analyze, and monitor Data
flow.
 Tuned the performance of Informatica sessions by increasing block size, data cache size, sequence
buffer length, and target-based commit interval, and tuned mappings by dropping and recreating indexes.
 Involved in pre-and post-session migration planning for optimizing data load performance.
 Performed Unit testing during the mapping phase to ensure proper and efficient implementation of
the transformations.
 Worked along with the QA Team and provided warranty support

Tools & Technologies: Informatica Power Center 7.x, Oracle, SQL developer, PL/SQL, UNIX Shell Scripting,
SQL Server 2003
