
Vinod Kumar Kairamkonda

309-363-0147
[email protected]

Professional Summary:
➢ 15+ years of experience in system analysis, design, development, and implementation of
Relational Database and Data Warehousing systems using IBM Cloud Pak for Data DataStage (CP4D)
and IBM DataStage 11.7, 11.5, 11.3, 9.1, 8.5, 8.1/7.x/6.x/5.x (Information Server, WebSphere,
Ascential DataStage).
➢ Experienced in migrating ETL jobs from version 11.5 to version 11.7 using a CI/CD process.
➢ Experienced in designing, developing, documenting, and testing of ETL jobs and mappings in
Server and Parallel jobs using DataStage to populate tables in Data Warehouse and Data marts.
➢ Extensive knowledge in Data Modeling using IBM Infosphere Data Architect.
➢ Proficient in developing strategies for Extraction, Transformation and Loading (ETL)
mechanisms.
➢ Expert in designing parallel jobs using various stages like Join, Merge, Lookup, Remove Duplicates,
Filter, XML stage, Hierarchical stage, Dataset, Lookup File Set, Complex Flat File, Modify, and
Aggregator.
➢ Designed web UI application layouts and forms using HTML, CSS, and JavaScript.
➢ Hands-on experience in creating indexed views, complex stored procedures, effective functions,
and appropriate triggers to support efficient data manipulation and data consistency.
➢ Extensive experience in programming Stored Procedures, Triggers, Views, SQL queries
specializing in DB2 for LUW.
➢ Expert in designing Server jobs using various types of stages like Sequential file, ODBC, Hashed
file, Aggregator, Transformer, Sort, Link Partitioner and Link Collector.
➢ Excellent analytical skills to understand business processes and functionality; experienced in
developing functional specifications for business process refinement and automation, data modeling,
system architecture, and conducting feasibility studies.
➢ Experienced with IBM InfoSphere Information Governance Catalog.
➢ Experienced in integration of various data sources (DB2 UDB, SQL Server, Sybase, Oracle,
Snowflake cloud database, Teradata, XML, and MS Access) into the data staging area.
➢ In-depth knowledge of relational and dimensional data modeling for creating logical and
physical database designs and ER diagrams using multiple data modeling tools such as IDA 9.1,
Erwin 7.1, and Visio.
➢ Proficient in performance tuning for IDA: implementing proper model management, creating
multiple sub-packages in each DB2 LUW logical model, minimizing dependencies, and avoiding
cross-referencing data models.
➢ Integrated back-end data with the UI using AJAX, RESTful APIs, and data-binding
techniques.
➢ Experience in configuring and managing linked servers and data transfer between SQL Server and
other heterogeneous relational databases such as DB2.
➢ Responsible for optimizing all indexes, SQL queries, and stored procedures to improve the quality of
software.
➢ Expert in working with DataStage Manager, Designer, Administrator, and Director.
➢ Experienced with Microsoft DTS packages.
➢ Expertise in using UI frameworks like React, Angular, or Vue.js for building
interactive user interfaces.
➢ Experienced in handling Facets in health care systems.
➢ Involved in system transaction training on Claims, Utilization Management, Benefit Plans, Billing,
Commissions, Capitation, Customer Service, and Security.
➢ Experienced in analyzing performance problems in depth and providing recommendations on how
to gain better performance in a specific environment.
➢ Experienced in data modeling as well as reverse engineering using tools such as Erwin, Oracle
Designer, and MS Visio.
➢ Expert in unit testing, system integration testing, implementation, maintenance, and performance
tuning of database jobs.
➢ Worked on scheduling tools like Autosys for automating job runs.
➢ Proficiency in UI technologies such as HTML, CSS, JavaScript, React.js, Angular,
Vue.js, or other front-end frameworks.

Education Qualifications:
Master's in Engineering Technology (Connecticut, USA, 2009).

Skillset:
IBM Cloud Pak for Data DataStage (CP4D), IBM InfoSphere Information Server 11.7/11.5/9.1/8.5 (DataStage,
Information Analyzer, Business Glossary), IBM InfoSphere Information Server 8.1, Ascential DataStage
Enterprise Edition and DataStage 7.5.2/7.0/6.0 (Administrator, Designer, Manager, and Director),
Informatica PowerCenter 9.0, InfoSphere Data Architect 9.1, Teradata SQL Assistant, Oracle Designer,
Erwin, ER/Studio, TOAD, C, C++, Visual Basic, SQL, PL/SQL, Oracle 11i/10g/9i/8i/7.x, SQL Server
2000/2005/2008, MS Access, Snowflake cloud database, Teradata, DB2, UNIX, Linux, IBM AIX
5.1/5.2/7.1/9.1, Sun Solaris V8.0, HP-UX V11.0, Windows XP/NT/2000/98, DataStage Version Control.

Project Summary:

New York Department of Taxation and Finance: Albany (NY)


Apr/2024-Present
Sr. DataStage Developer:
The project modernizes the New York tax filing systems, migrating mainframe data to XML and
JSON files and claims data into relational databases. The project extracts data from different tax
filing systems to process claim status for reporting analysis into DB2 tables from XML and JSON data
using IBM Cloud Pak for Data DataStage (CP4D) and InfoSphere DataStage 11.5.

Hardware/Software:

IBM Cloud Pak for Data DataStage (CP4D), IBM InfoSphere Information Server 11.5 (DataStage,
QualityStage, Information Analyzer, Thin Client, Business Glossary), AQT, XML, JSON data, DB2, Linux,
WinSCP, Stonebranch, PuTTY.

Responsibilities:

➢ Involved in gathering business requirements from business users and architects.


➢ Worked with developers on ETL development to implement business rules.
➢ Modernized the current ETL infrastructure to XML and JSON files using IBM Cloud Pak for Data DataStage.
➢ Reviewed the data models with business users, the ETL team, DBAs, and the testing team to provide
information about the data model and business requirements.
➢ Developed and maintained front-end UI web applications using HTML5 and JavaScript.
➢ Processed JSON files from different tax filing systems using the Hierarchical stage in IBM Cloud Pak
for Data DataStage to load DB2 tables.
➢ Processed different tax files using the XML stage on IBM Cloud Pak to create datasets for different
form IDs and load them into DB2 tables.
➢ Developed ETL jobs using IBM DataStage 11.5 with stages such as Transformer, Web Services
Transformer, Join, Lookup, Remove Duplicates, and Aggregator.
➢ Developed parallel jobs to extract data from CSV files and XML data into a cloud data warehouse
database. Created reusable UI components to ensure consistency and maintainability across the
application.
➢ Responsible for daily verification that all scripts, downloads, and file copies were executed as
planned, troubleshooting any steps that failed, and providing both immediate and long-term
problem resolution.
➢ Used Python and Perl scripting to process monthly files into data warehousing tables.
➢ Used shell and Perl scripting to process 300 tables in a single execution for load-status tables on
different servers (see the sketch after this list).
➢ Participated in daily Scrum calls for status updates and presented demos at the end of each sprint.
➢ Involved in migrating ETL jobs from Development to Production Environment.
➢ Created Job Sequencer to execute a set of jobs.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Involved in unit testing, implementation, maintenance and performance tuning.
➢ Debugged ETL jobs and identified root causes so that jobs executed successfully.
➢ Migrated ETL jobs from the development to the UAT environment.
➢ Created and documented ETL test plans, test cases, test scripts, expected results, assumptions,
and validations. Created dashboards and reports using UI development tools that integrate
DataStage outputs, facilitating real-time data analysis for stakeholders.

➢ Involved in Performance tuning of the jobs and debugging the jobs.


➢ Extensively tested the ETL jobs scheduled in the Stonebranch tool for scheduling, timing, and
data integrity.
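
Below is a minimal Python sketch, in the spirit of the multi-table load scripting described above, that drives one DataStage load job per table through the dsjob command-line interface. The dsjob CLI ships with InfoSphere DataStage; the project, job, and parameter names here are illustrative assumptions, not the actual project's.

import subprocess

PROJECT = "NYTAX_PROJ"      # assumed project name
JOB = "LoadStatusTable"     # assumed load job, parameterized by table

def run_load(table_name: str) -> int:
    """Run the load job for one table and return the dsjob exit code."""
    cmd = [
        "dsjob", "-run",
        "-param", f"pTableName={table_name}",  # assumed job parameter
        "-jobstatus",                          # wait and reflect job status
        PROJECT, JOB,
    ]
    return subprocess.run(cmd, capture_output=True, text=True).returncode

if __name__ == "__main__":
    # Read the table list (one name per line) and run the loads one by one.
    with open("tables.txt") as fh:
        tables = [line.strip() for line in fh if line.strip()]
    failures = [t for t in tables if run_load(t) != 0]
    print(f"{len(tables) - len(failures)} succeeded, {len(failures)} failed: {failures}")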

Portland General Electric: Fort Worth (TX)


Oct/2022-Apr/2024.
Data Analyst/Sr. DataStage Developer:
Portland General Electric (PGE) is an electrical utility based in Portland in the U.S. state of Oregon. It
distributes electricity to customers in parts of Multnomah, Clackamas, Marion, Yamhill, and Washington
counties. The ERP integration project develops new ETL jobs for each interface to load data into Maximo
using IBM InfoSphere DataStage 11.5.

Hardware/Software:

IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Thin Client,
Business Glossary), Erwin 9.7, CI/CD process, Oracle, Snowflake cloud database, Azure SQL, WinSCP,
Stonebranch, PuTTY.

Responsibilities:

➢ Involved in gathering business requirements from business users and architects.


➢ Worked with offshore developers on ETL development to implement business rules.
➢ Modeled the star schema data marts by identifying the fact and dimension tables using the Erwin
data modeling tool (a minimal DDL sketch follows this list).
➢ Analyzed IBM data models that provide data warehouse design models to accelerate the
development of business applications.
➢ Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide
information about the data model and business requirements.
➢ Involved in creating entity-relational and dimensional-relational data models using Erwin.
➢ Involved in designing, developing, and implementing data warehousing systems using IBM
InfoSphere DataStage 11.5.
➢ Developed ETL jobs using IBM DataStage 11.5 with stages such as Transformer, Web Services
Transformer, Join, Lookup, Remove Duplicates, and Aggregator.
➢ Developed parallel jobs to extract data from CSV files and Oracle data into the Snowflake cloud
data warehouse database.
➢ Responsible for daily verification that all scripts, downloads, and file copies were executed as
planned, troubleshooting any steps that failed, and providing both immediate and long-term
problem resolution.
➢ Participated in daily Scrum calls for status updates and presented demos at the end of each sprint.
➢ Involved in migrating ETL jobs from Development to Production Environment.
➢ Created Job Sequencer to execute a set of jobs.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Uploaded ETL job DSX files into Subversion.
➢ Involved in unit testing, implementation, maintenance and performance tuning.
➢ Debugged ETL jobs and identified root causes so that jobs executed successfully.
➢ Involved in scheduling ETL jobs using the Stonebranch scheduler.
➢ Migrated ETL jobs from the development to the UAT environment.
➢ Created and documented ETL test plans, test cases, test scripts, expected results, assumptions,
and validations.
➢ Involved in Performance tuning of the jobs and debugging the jobs.
➢ Extensively tested the ETL jobs scheduled in the Stonebranch tool for scheduling, timing, and
data integrity.
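
As a minimal illustration of the fact/dimension split behind a star schema data mart, the sketch below builds a toy model in an in-memory SQLite database. Table and column names are hypothetical, not the project's actual Erwin model.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_interface (
    interface_key INTEGER PRIMARY KEY,
    interface_name TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,          -- e.g. 20231031
    calendar_date TEXT NOT NULL
);
CREATE TABLE fact_load (
    interface_key INTEGER NOT NULL REFERENCES dim_interface(interface_key),
    date_key INTEGER NOT NULL REFERENCES dim_date(date_key),
    rows_loaded INTEGER NOT NULL
);
""")
# A typical star-schema query: measures from the fact, labels from dimensions.
rows = conn.execute("""
    SELECT d.interface_name, t.calendar_date, SUM(f.rows_loaded)
    FROM fact_load f
    JOIN dim_interface d ON d.interface_key = f.interface_key
    JOIN dim_date t ON t.date_key = f.date_key
    GROUP BY d.interface_name, t.calendar_date
""").fetchall()
print(rows)  # empty here; the shape of the join is the point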

Albertsons Companies: Plano (TX)


Aug/2021-Oct/2022
Sr. DataStage Developer:
Albertsons Companies, Inc. is an American grocery company founded and headquartered in Idaho. The
project was to migrate data from an Oracle database to an Azure SQL database at the enterprise level,
developing and testing ETL jobs on IIS 11.7 and deploying ETL jobs from IIS 11.5 to IIS 11.7 using a
CI/CD process.

Hardware/Software:

IBM InfoSphere Information Server 11.7 (DataStage, QualityStage, Information Analyzer, Thin Client,
Business Glossary), CI/CD process, Snowflake cloud database, Oracle, Azure SQL, WinSCP, PuTTY.

Responsibilities:
➢ Involved in gathering technical requirements through Agile Scrum calls and from architects.
➢ Deployed ETL jobs from version 11.5 to version 11.7 using a CI/CD process.
➢ Created manifest files for each business app code across multiple projects, covering each ETL job's
UNIX files and property files.
➢ Provisioned the manifest files to create a baseline ID for each interface.
➢ Deployed each interface with its respective baseline ID to IIS 11.7 and updated the status in Jira.
➢ Verified and tested the ETL jobs after the post-deployment process.
➢ Debugged ETL job failures and identified root causes so the jobs executed successfully.
➢ Created a status document for each app code deployment and uploaded it to the Teams server.
➢ Worked with offshore developers on ETL development to implement business rules.
➢ Involved in ongoing development of technical best practices for data movement and data quality.
➢ Developed DataStage jobs with required transformations such as Aggregator, Filter, Funnel, Join,
Lookup, Merge, Remove Duplicates, Sort, and Transformer.
➢ Worked within the Enterprise Data Warehouse Application Development team, developing
and maintaining Data Models in the Enterprise Data Warehouse.
➢ Designed the mappings between sources (external files and databases) and Operational staging
targets.
➢ Involved in Project management from analysis and design to implementation and construction.
➢ Extensively used DataStage tools such as InfoSphere DataStage Designer and InfoSphere DataStage
Director.
➢ Used Python scripting to migrate ETL DataStage jobs from version 11.5 to DataStage 11.7
(a minimal sketch follows this list).
➢ Replaced the Oracle Connector with Microsoft Azure SQL via a Python script run on each app code.
➢ Received the source data in the form of Oracle tables, sequential files, flat files, and SQL Server,
and loaded it into the Snowflake cloud database.
➢ Involved in migrating ETL jobs from the development to the production environment.
➢ Created Job Sequencers to execute sets of jobs.
➢ Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the
source information before it was delivered for further processing.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Involved in performance tuning and debugging of the jobs.
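
A hedged sketch of the kind of Python pass described above for preparing exported .dsx files for the 11.5-to-11.7 migration. DataStage .dsx exports are plain text, but the stage-type tokens and directory layout below are illustrative assumptions, not verified identifiers.

from pathlib import Path

# Assumed mapping of stage-type identifiers inside the export (hypothetical names).
REPLACEMENTS = {
    "OracleConnectorPX": "AzureSQLConnectorPX",
}

def migrate_dsx(src: Path, dst: Path) -> None:
    """Rewrite one .dsx export, applying the connector substitutions."""
    text = src.read_text(encoding="utf-8", errors="replace")
    for old, new in REPLACEMENTS.items():
        text = text.replace(old, new)
    dst.write_text(text, encoding="utf-8")

if __name__ == "__main__":
    export_dir = Path("exports_11_5")   # assumed layout: one .dsx per job
    out_dir = Path("imports_11_7")
    out_dir.mkdir(exist_ok=True)
    for dsx in export_dir.glob("*.dsx"):
        migrate_dsx(dsx, out_dir / dsx.name)
        print(f"migrated {dsx.name}")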

Whataburger: Plano (TX)


May/2019-Aug/2021
Sr. DataStage Developer:
Whataburger is an American privately held, regional fast food restaurant chain, headquartered in
San Antonio, Texas, that specializes in hamburgers. The project implements a process to extract data
from flat files into staging tables and then load the data into the data warehouse.

Hardware/Software:

IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Thin Client,
Business Glossary), Netezza, Datasets, ActiveBatch, Aginity Workbench.

Responsibilities:
➢ Involved in gathering business requirements from business users and architects.
➢ Worked with offshore developers on ETL development to implement business rules.
➢ Involved in ongoing development of technical best practices for data movement and data quality.
➢ Developed DataStage jobs with required transformations such as Aggregator, Filter, Funnel, Join,
Lookup, Merge, Remove Duplicates, Sort, and Transformer.
➢ Worked within the Enterprise Data Warehouse Application Development team, developing
and maintaining Data Models in the Enterprise Data Warehouse.
➢ Profiled legacy systems using column analysis, drill-through reports, and custom filters in
Information Analyzer.
➢ Designed, coded, tested, and documented complete customized application solutions for Data
Quality integration.
➢ Performed data profiling, standardization, matching, and cleansing activities through the data
quality process using QualityStage.
➢ Created projects, added data sources, and wrote, configured, and executed rules and rule sets
within Information Analyzer.
➢ Developed data profiling solutions, ran analysis jobs and viewed results, and created and managed
data quality controls using Information Analyzer (see the profiling sketch after this list).
➢ Performed column analysis, rule analysis, primary key analysis, natural key analysis, foreign-key
analysis, and cross-domain analysis.
➢ Worked with team members, BI developers, and the business to further enhance data models
and architecture to support business intelligence and corporate reporting needs.
➢ Developed and maintained audit and validation processes to detect data integrity problems
and worked with developers internally and externally to solve data integrity issues.
➢ Participated in the Agile process and met goals as accepted.
➢ Participated in ETL Applications Tools/Servers system and database upgrades.
➢ Extensively used DataStage tools such as InfoSphere DataStage Designer and InfoSphere DataStage
Director.
➢ Involved in migrating ETL jobs from the development to the production environment.
➢ Created Job Sequencers to execute sets of jobs.
➢ Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the
source information before it was delivered for further processing.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Involved in performance tuning and debugging of the jobs.
➢ Scheduled and monitored ETL jobs on a daily basis.
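
As an illustrative stand-in for the column-analysis step, the sketch below computes per-column null rates and distinct counts from a delimited file. It mimics the kind of result Information Analyzer reports, not its implementation; the input file name is a placeholder.

import csv
from collections import defaultdict

def profile(path: str) -> None:
    """Print a simple column profile: percent null and distinct count per column."""
    nulls = defaultdict(int)
    distinct = defaultdict(set)
    total = 0
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            total += 1
            for col, val in row.items():
                if val is None or val.strip() == "":
                    nulls[col] += 1
                else:
                    distinct[col].add(val)
    for col in sorted(set(nulls) | set(distinct)):
        pct_null = 100.0 * nulls[col] / total if total else 0.0
        print(f"{col}: {pct_null:.1f}% null, {len(distinct[col])} distinct")

profile("source_extract.csv")  # placeholder input file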

Humana: Louisville (KY)


July/2017-May/2019
Sr. DataStage Developer / IGC Consultant:

The project implements a process to consume files from different health care systems, load them into
a staging database after passing through data quality rules, and capture the rejections in an exception
console for business verification.

Hardware/Software:

IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Thin Client,
Business Glossary), IBM Master Data Management, IMAM, IGC, DB2, Oracle 11g, Dollar Universe ($U), UNIX.

Responsibilities:
➢ Involved in gathering business requirements from business users and architects.
➢ Worked with offshore developers on ETL development to implement business rules.
➢ Developed and managed data quality rules using Information Analyzer (an illustrative rule-check
sketch follows this list).
➢ Imported metadata through InfoSphere Metadata Asset Manager (IMAM), published it to the InfoSphere
Information Governance Catalog (IGC), and performed data profiling through Information Analyzer (IA).
➢ Developed DataStage jobs with required transformations such as Aggregator, Filter, Funnel, Join,
Lookup, Merge, Remove Duplicates, Sort, and Transformer.
➢ Involved in scheduling ETL jobs using UNIX shell scripting and a scheduler.
➢ Developed Linux, Java, and Oracle solutions on the Master Data Management (MDM) product
supplied by IBM.
➢ Supported the MDM production implementation to assure system health.
➢ Resolved technical incidents and issues with the MDM product in production and test.
➢ Designed and coded technical solutions for business enhancements to the MDM database
implementation integrated with health care eligibility systems.
➢ Identified and consulted on product improvements to the MDM implementation and its integrations.
➢ Developed data profiling solutions, ran analysis jobs and viewed results, and created and managed
data quality controls using Information Analyzer.
➢ Extensively used DataStage tools such as InfoSphere DataStage Designer and InfoSphere DataStage
Director.
➢ Involved in migrating ETL jobs from the development to the production environment.
➢ Created Job Sequencers to execute sets of jobs.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Involved in performance tuning and debugging of the jobs.
➢ Scheduled and monitored ETL jobs on a daily basis.
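
An illustrative sketch of a data quality rule pass in which records failing a validity rule are diverted to a reject set for review, analogous to the exception-console capture described above. The rule and field names are assumptions.

import re

MEMBER_ID = re.compile(r"^\d{9}$")  # assumed format rule for a member ID

def apply_rules(records):
    """Split records into (valid, rejected) based on simple validity rules."""
    valid, rejected = [], []
    for rec in records:
        if MEMBER_ID.match(rec.get("member_id", "")) and rec.get("dob"):
            valid.append(rec)
        else:
            rejected.append({**rec, "reject_reason": "failed member_id/dob rule"})
    return valid, rejected

good, bad = apply_rules([
    {"member_id": "123456789", "dob": "1980-01-01"},
    {"member_id": "ABC", "dob": ""},
])
print(len(good), "passed;", len(bad), "captured for exception review")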

UNUM Corporation: Chattanooga (TN)


OCT/2016-July/2017
Data Architect:
UNUM Life Insurance was incorporated in 1848 and offers insurance products such as accident, critical
illness, and life insurance. The project involves maintaining the application that supports all of its
insurance and dependent-care products, with modules for enrollments, claims, cards, and transactions.
The primary responsibility was to create DataStage jobs to extract, transform, and load data into data
marts from various sources such as RDBMS and flat files.

Hardware/Software:
IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), InfoSphere Data Architect (IDA) 9.1, DB2, Oracle 11g, UC4, Autosys, Linux AIX, UNIX.

Responsibilities:

➢ Interacted with the end-user community to understand the business requirements and identify
data sources.
➢ Designed the Atomic Warehouse data models using IDA.
➢ Created physical and logical models using IDA.
➢ Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide
information about the data model and business requirements.
➢ Involved in creating entity-relational and dimensional-relational data models using IDA.
➢ Designed the mappings between sources (external files and databases) and operational staging
targets.
➢ Designed, coded, tested, and documented complete customized application solutions for data
quality integration.
➢ Configured IDA to connect to DB2 server databases and generated entity relationship diagrams
for the business data models and the Atomic Warehouse Model.
➢ Generated and analyzed the existing views in the BDM_C schema.
➢ Fixed errors in views in the BDM_C schema through impact analysis on each attribute and
entity in the business data model and submitted DDL.
➢ Performed performance-tuning analysis on stored procedures and discussed migrating to the ITest
environment for further tuning before moving to UAT.
➢ Received the source data in the form of Oracle tables, DB2, sequential files, flat files, Excel
sheets, and SQL Server.
➢ Responsible for developing, validating, and communicating data modeling solutions, including
both relational and dimensional models.

Carilion Clinic: Roanoke (VA)


MAR/2016-OCT/2016
Sr. DataStage Developer / Data Architect / IGC Consultant:
Carilion Clinic is a health care organization based in Virginia with more than 650 physicians in 70
specialties, operating 7 non-profit hospitals and 220 clinics. The project was to extract data from the
Clarity database to ODS tables and then into data warehousing systems, applying business
transformations.

Hardware/Software:
IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), IDA 9.1 (InfoSphere Data Architect), IBM InfoSphere Data Models, Oracle 11g, Netezza 7.2,
DBeaver, Talend, Aginity Workbench, UC4, Linux AIX, UNIX.

Responsibilities:
➢ Interacted with the end-user community to understand the business requirements and identify
data sources.
➢ Modeled the star schema data marts by identifying the fact and dimension tables using the Erwin
data modeling tool.
➢ Analyzed IBM data models that provide data warehouse design models to accelerate the
development of business applications.
➢ Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide
information about the data model and business requirements.
➢ Involved in creating entity-relational and dimensional-relational data models using IDA.
➢ Worked on different source analytical systems: Scorecard Reports, Flowsheet, SSI, Access Logging,
and Professional Billing data.
➢ Designed the mappings between sources (external files and databases) and operational staging
targets.
➢ Demonstrated knowledge of Information Analyzer administrative tasks such as managing logs,
schedules, active sessions, and security roles.
➢ Gathered and articulated data issues with clients in areas such as data privacy, sensitivity, and
security.
➢ Provided expertise in data profiling concepts, issues, and activities.
➢ Performed data profiling tasks, analyzing and annotating data issues.
➢ Involved in project management from analysis and design through implementation and construction.
➢ Extensively used DataStage tools such as InfoSphere DataStage Designer and InfoSphere DataStage
Director.
➢ Received the source data in the form of Oracle tables, sequential files, flat files, Excel sheets,
and SQL Server.
➢ Involved in development of DataStage jobs for integrated eligibility systems with required
transformations such as Aggregator, Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Sort,
and Transformer.
➢ Used various UNIX commands to start and stop jobs, access log files, and list jobs.
➢ Used shell scripting for scheduling and the FTP process.
➢ Used shell scripting to read data directly from the input directory.
➢ Involved in performance tuning and debugging of the jobs.
➢ Involved in business sales meetings with the IBM, Hortonworks, and Talend teams.
➢ Extensively worked on a POC for the Talend ETL tool.
➢ Scheduled and monitored ETL jobs on a daily basis.

Mercy: St. Louis (MO)


Jul/2015-Feb/2016
Sr. DataStage Developer / QualityStage Developer / Data Architect:
Mercy, a non-profit organization, is committed to providing perfect service in the health and wellness of
its customers, providing quality health, dental, vision, pharmacy, and behavioral health coverage for
customers and their families. The project objective was to collect, organize, and store data from different
data sources into staging tables, applying standardization rules to load into target tables.

Hardware/Software:
IBM InfoSphere Information Server 8.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), Oracle 11g, TOAD, UC4, sequential files, AIX UNIX, Linux AIX, Autosys.

Responsibilities:
➢ Actively participated in SCRUM meetings for daily updates and iteration planning meetings.
➢ Interacted with BAs to understand the business requirements and identify data sources.
➢ Worked on the BOEING project, loading final schema tables for Anderson and Epic Clarity data.
➢ Involved in meetings with the IBM team for the upgrade of IBM DataStage.
➢ Worked on loading the data into MCDW POTCL tables.
➢ Used DataStage stages, namely Sequential File, Transformer, Aggregator, Sort, Data Set, Join,
Lookup, Funnel, Peek, and Row Generator, in accomplishing the ETL coding.
➢ Used the Aggregator stage to count the number of records rejected during execution of an ETL job.
➢ Performed data profiling, standardization, matching, and cleansing activities through the data
quality process.
➢ Used QualityStage for standardizing names and addresses and identifying duplicate, matched, and
unmatched records in each source system (an illustrative standardization sketch follows this list).
➢ Used the Investigation stage to identify handled and unhandled data.
➢ Involved in creating design and mapping documents for DataStage jobs.
➢ Responsible for daily verification that all scripts, downloads, and file copies were executed as
planned, troubleshooting any steps that failed, and providing both immediate and long-term
problem resolution.
➢ Created and documented ETL test plans, test cases, test scripts, expected results, assumptions,
and validations.
➢ Created Job Sequencer to execute a set of jobs.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Used TOAD as a querying tool to perform basic database testing to check for any data
inconsistencies.
➢ Used shell scripting for scheduling and the FTP process.
➢ Used UNIX commands to analyze the data in flat files and to handle null values and column
positions in a file.
➢ Extensively tested the ETL jobs scheduled in the UC4 tool for scheduling, timing, and data
integrity.
➢ Used Daptiv as the project management tool.
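
An illustrative standardization pass in the spirit of the QualityStage work above: normalize case, strip punctuation, and expand common address abbreviations before duplicate matching. The rules here are simplified assumptions, not the project's actual QualityStage rule sets.

import re

ABBREVIATIONS = {"ST": "STREET", "AVE": "AVENUE", "DR": "DRIVE"}  # sample rules

def standardize_address(raw: str) -> str:
    """Uppercase, strip punctuation, and expand known abbreviations."""
    tokens = re.sub(r"[^\w\s]", "", raw.upper()).split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def is_duplicate(a: str, b: str) -> bool:
    """Naive match: identical after standardization."""
    return standardize_address(a) == standardize_address(b)

print(standardize_address("123 Main St."))              # -> 123 MAIN STREET
print(is_duplicate("123 Main St.", "123 MAIN STREET"))  # -> True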

Axiall Corp: Atlanta (GA)


May/2014-June/2015
DataStage / QualityStage Developer:

Axiall Corp. is a chemistry company that works on applied chemistry to solve common problems,
improve everyday life, and drive human progress. The project objective was to collect, organize, and
store data from different data sources to provide a single source of integrated SAP system data.

Hardware/Software:
IBM InfoSphere Information Server 11.3 (DataStage, QualityStage, FastTrack, Information Analyzer,
Business Glossary), SAP ECC 6.0, SQL Server 2008, Oracle 11g, Autosys, Erwin, UNIX, HP Quality Center.

Responsibilities:
➢ Involved in status meetings and interacted with the business analyst to get the business rules.
➢ Involved in creating specifications for ETL processes, finalized requirements, and prepared the
specification document.
➢ Involved in meetings with the SAP functional team to discuss customer master and vendor master
data.
➢ Worked continuously with data stewards to gather requirement specifications.
➢ Worked on the BDR (Business Data Repository) to identify fields in SAP.
➢ Created FastTrack mappings from legacy systems to target SAP.
➢ Profiled legacy systems using column analysis, drill-through reports, and custom filters in
Information Analyzer.
➢ Performed data profiling, standardization, matching, and cleansing activities through the data
quality process.
➢ Used DataStage stages, namely Hashed File, Sequential File, Transformer, Aggregator, Sort,
Data Set, Join, Lookup, Change Capture, Funnel, Peek, and Row Generator, in accomplishing the
DataStage coding.
➢ Applied default, field, and cross-reference transformation rules in DataStage.
➢ Implemented the RTL process from the staging area to alignment to generate gap reporting.
➢ Used different stages in DataStage designs to load the data into different SAP tables (LFA1, LFBK,
LFM1, ADDR2, ADDR3, ADDR6, and TIBAN).
➢ Involved in creating functional and scope documentation for data cleansing, conversion, and
integration processes.
➢ Managed the metadata associated with the ETL processes used to populate the data.
➢ Used Information Analyzer for generating column analysis and primary key analysis reports.
➢ Used QualityStage for standardizing names and addresses and identifying duplicate, matched, and
unmatched records in each source system.
➢ Used the Investigation stage to identify handled and unhandled data.
➢ Created SAP IDoc jobs to load data into SAP IDocs.
➢ Used the DataStage Director for scheduling and monitoring the jobs.
➢ Involved in exports and imports of ETL jobs.
➢ Wrote complex SQL queries using joins, subqueries, and correlated subqueries for extracting the
data (a small runnable example follows this list).
➢ Involved in job scheduling using Autosys.
➢ Created reusable components using shared containers for local or shared use.
➢ Created Job Sequencers to execute sets of jobs.
➢ Created and documented ETL test plans, test cases, test scripts, expected results, assumptions,
and validations.
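
A small self-contained example of the correlated-subquery style used for extraction, run against an in-memory SQLite table; the vendor data is made up.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE vendor_invoice (vendor TEXT, invoice_no INTEGER, amount REAL);
INSERT INTO vendor_invoice VALUES
  ('ACME', 1, 100.0), ('ACME', 2, 250.0), ('GLOBEX', 3, 75.0);
""")
# Correlated subquery: each vendor's invoices at or above that vendor's average.
rows = conn.execute("""
    SELECT vendor, invoice_no, amount
    FROM vendor_invoice v
    WHERE amount >= (SELECT AVG(amount)
                     FROM vendor_invoice
                     WHERE vendor = v.vendor)
""").fetchall()
print(rows)  # [('ACME', 2, 250.0), ('GLOBEX', 3, 75.0)]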

Mass Mutual: Enfield (CT)


Nov/2013- Apr/2014
DataStage Developer:
MassMutual Financial Group offers whole life insurance, annuities, retirement plans, disability income
insurance, and long-term care insurance for individuals. The project objective was to collect, organize,
and store data from different data sources to create a fixed-width file and to provide a single source of
integrated claims records for the purposes of reporting, analysis, and decision support to improve
client services.

Hardware/Software:
IBM InfoSphere Information Server 8.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), Oracle 11g, Autosys, TOAD 9.6, Erwin, UNIX, HP Quality Center.

Responsibilities:
➢ Involved in technical leadership in the analysis, decision-making, design, and support phases of
implementation of computer applications and network hardware and infrastructure, operating
systems, databases and enterprise-wide business applications.
➢ Involved in development-phase meetings for business analysis and requirements gathering, and
managed the offshore team to achieve expected results.
➢ Extensively used DataStage Manager, Designer, Administrator and Director for creating and
implementing jobs.
➢ Extensively worked on error handling, cleansing of data, creating lookup files and performing
lookups for faster access of data.
➢ Used DataStage Manager to import the Metadata from sources and targets.
➢ Involved in creating technical documentation for source to target mapping procedures to
facilitate better understanding of the process and incorporate changes as and when necessary.
➢ Used DataStage stages, namely Hashed File, Sequential File, Transformer, Aggregator, Sort,
Data Set, Join, Lookup, Change Capture, Funnel, Peek, and Row Generator, in accomplishing the
DataStage coding.
➢ Used the DataStage Director for scheduling and monitoring the jobs.
➢ Used DataStage debugger to troubleshoot the designed jobs.
➢ Used Shared Containers for code reuse and implementing complex business logic.
➢ Tuned DataStage jobs to enhance their performance.

Emblem Health: New York City (NY)


Aug/2012-Oct/2013
DataStage Developer / QualityStage Developer:
Emblem Health is a health maintenance organization and health insurance company based in New York
City. It was formed in 2006 by the merger of Group Health Incorporated (GHI) and HIP Health Plan of
New York to become one non-profit company. The project was to create DataStage jobs to extract,
transform, and load data into data marts and to create a fixed-width file according to specifications
provided by the SDOH (NY State Department of Health) from various mainframe files.

Hardware/Software:
IBM InfoSphere Information Server 8.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), Oracle 11g, flat files, Autosys, TOAD 9.6, Erwin, UNIX, HP Quality Center.

Responsibilities:
➢ Involved in understanding business processes and coordinated with business analysts to get
specific user requirements.
➢ Involved in ensuring the appropriateness and quality of data for data integration initiatives.
➢ Worked with Facets, an integrated eligibility health care management system designed to handle
the complex requirements of managed care programs.
➢ Involved in system transaction training on Claims, Utilization Management, Benefit Plans, Billing,
Commissions, Capitation, Customer Service, and Security.
➢ Applied detailed knowledge of QualityStage stages to develop an effective application.
➢ Used appropriate tools to fulfill data quality and business requirements.
➢ Provided metadata to create functional and technical specifications for data integration and
cleansing applications.
➢ Monitored resolution of data quality issues.
➢ Involved in understanding logical and physical models for the subject area of Customer
Information Management.
➢ Involved in Documentation Planning and Implementation.
➢ Experienced with data modeling tools such as MS Visio and/or Erwin.

Navistar: Chicago (IL)


Jan/2012-Jul/2012
DataStage Developer / QualityStage Developer:
The project objective was to collect, organize, and store data from different data sources to provide a
single source of integrated claims records for the purposes of reporting, analysis, and decision support
to improve client services.
Hardware/Software:
IBM InfoSphere Information Server 8.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), Oracle 10g, SQL Server 2008, DB2 UDB, Teradata, flat files, sequential files, Control-M.

Responsibilities:

➢ Interacted with the end-user community to understand the business requirements and identify
data sources.
➢ Developed and executed data quality processes using IBM QualityStage tools.
➢ Supported and generated Enterprise Data Governance quality assurance metrics.
➢ Created critical-field reporting tools for Enterprise Data Management customer segments.
➢ Worked with DataStage Manager to import metadata from the repository, create new job categories,
and create new data elements.
➢ Designed and developed ETL processes using DataStage Designer to load data from Oracle, MS
SQL, and flat files (fixed width) into the staging database, and from staging into the target data
warehouse database.
➢ Used DataStage stages, namely Hashed File, Sequential File, Transformer, Aggregator, Sort,
Data Set, Join, Lookup, Change Capture, Funnel, Peek, and Row Generator, in accomplishing the
ETL coding.
➢ Developed job sequencers with proper job dependencies, job control stages, and triggers.
➢ Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the
source information before it was delivered for further processing.
➢ Extensively used DataStage Director for monitoring job logs to resolve issues.
➢ Involved in performance tuning and optimization of DataStage mappings using features like
pipeline and partition parallelism and data/index caching to manage very large volumes of data.

Lincoln Financial Group: Greensboro (NC)


Aug/2011- Dec/2011
DataStage Developer / Data Analyst:
The aim of the project was to implement an actuarial data warehouse environment to centralize and
standardize all actuarial-related historical and ongoing in-force and termination activity for experience
studies. The major work involved extracting the data into the staging area and then loading the data
into the data warehouse.
Hardware/Software:
IBM InfoSphere Information Server 8.1 (DataStage, QualityStage, Information Analyzer, Metadata
Workbench), IBM AIX, SQL Server 2008, Oracle 11g, UNIX shell scripts (ksh), PL/SQL, UNIX, Autosys,
Erwin/MS Visio.

Responsibilities:
➢ Involved in understanding business processes and coordinated with business analysts to get
specific user requirements.
➢ Involved in understanding logical and physical models for the subject area of Customer
Information Management.
➢ Involved in Documentation Planning and Implementation.
➢ Experienced with data modeling tools such as MS Visio and/or Erwin.
➢ Experienced with relational data models such as conceptual, logical, and physical data models,
dimensional data models, data dictionaries, and metadata.
➢ Responsible for developing, validating, and communicating data modeling solutions, including
both relational and dimensional models.
➢ Designed the mappings between sources (external files and databases) and Operational staging
targets.
➢ Involved in Project management from analysis and design to implementation and construction.
➢ Extensively used DataStage Tools like Infosphere DataStage Designer, Infosphere DataStage
Director.
➢ Received the source data in the form of Oracle tables, sequential files, flat files, Excel sheets,
mainframes, and SQL Server.
➢ Involved in development of DataStage Jobs with required Transformations like Aggregator, Filter,
Funnel, Join, Lookup, Merge, Remove Duplicates, Sort, Transformer etc.
➢ Involved in job scheduling using Autosys.

Target Corporation: Minneapolis (MN)


Jan/2010-Jul/2011
ETL/ DataStage Developer:
Target Corporation is an American retailing company that was founded in Minneapolis, Minnesota in
1902 as the Dayton Dry Goods Company. Target is the second-largest discount retailer in the United
States. The project was to create DataStage jobs to extract, transform, and load data into data marts
from various sources such as RDBMS and flat files.

Hardware/Software:
DataStage 7.5 (Parallel Extender), Oracle 8i, DB2, Sybase, SQL Server, Control-M, Erwin 4.0, MS Visio,
UNIX, Windows XP.

Responsibilities:
➢ Involved in technical leadership in the analysis, decision-making, design, and support phases of
implementation of computer applications and network hardware and infrastructure, operating
systems, databases and enterprise-wide business applications.
➢ Involved in development phase meetings for Business Analysis and Requirements Gathering.
➢ Extracted, rectified, cleansed, and loaded data from flat files into different databases using shell
scripts and various stages.
➢ Migrated development mappings, as well as hotfixes to them, into the production environment.
➢ Performed an impact analysis on ETL jobs before the jobs were migrated to DataStage 7.5.3.
➢ Managed the metadata associated with the ETL processes used to populate the data warehouse.
➢ Designed and developed jobs using Parallel Extender to split bulk data into subsets and
dynamically distribute them to all available nodes to achieve the best job performance (a conceptual
sketch follows this list).
➢ Involved in raising incidents after batch monitoring was done by an offshore team.
➢ Involved in the migration of DataStage jobs from the development to the production environment.
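
A conceptual Python analogy (not DataStage itself) for the Parallel Extender pattern described above: hash-partition records into subsets, then process the partitions on all available workers so equal keys land on the same node. The keys and worker logic are illustrative.

from multiprocessing import Pool
from zlib import crc32

NODES = 4  # analogous to the number of processing nodes

def partition(records):
    """Hash-partition records by key so equal keys land in the same subset."""
    parts = [[] for _ in range(NODES)]
    for rec in records:
        parts[crc32(rec["key"].encode()) % NODES].append(rec)
    return parts

def process(part):
    # Stand-in transformation: sum amounts within the partition.
    return sum(r["amount"] for r in part)

if __name__ == "__main__":
    data = [{"key": f"cust{i % 10}", "amount": i} for i in range(100)]
    with Pool(NODES) as pool:
        print(pool.map(process, partition(data)))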
