Vinod K Datastage
309-363-0147
[email protected]
Professional Summary:
➢ 15+ years of experience in system analysis, design, development, and implementation of
Relational Database and Data Warehousing systems using IBM Cloud Pak for Data (CP4D)
DataStage and IBM DataStage 11.7, 11.5, 11.3, 9.1, 8.5, 8.1/7.x/6.x/5.x (Information Server,
WebSphere, Ascential DataStage).
➢ Experienced in migrating ETL jobs from version 11.5 to version 11.7 using a CI/CD process.
➢ Experienced in designing, developing, documenting, and testing of ETL jobs and mappings in
Server and Parallel jobs using DataStage to populate tables in Data Warehouse and Data marts.
➢ Extensive knowledge in Data Modeling using IBM Infosphere Data Architect.
➢ Proficient in developing strategies for Extraction, Transformation and Loading (ETL)
mechanisms.
➢ Expert in designing parallel jobs using various stages like Join, Merge, Lookup, Remove
Duplicates, Filter, XML, Hierarchical Data, Dataset, Lookup File Set, Complex Flat File, Modify,
and Aggregator.
➢ Designed the web UI application layout and forms using HTML, CSS, and JavaScript.
➢ Hands on experience in creating indexed views, complex stored procedures, effective functions,
and appropriate triggers to assist efficient data manipulation and data consistency.
➢ Extensive experience in programming Stored Procedures, Triggers, Views, SQL queries
specializing in DB2 for LUW.
➢ Expert in designing Server jobs using various types of stages like Sequential file, ODBC, Hashed
file, Aggregator, Transformer, Sort, Link Partitioner and Link Collector.
➢ Excellent Analytical skills to understand the Business Process and Functionality. Developing
Functional Specifications for business process refinement and automation, Data Modeling, system
architecture, and conducting feasibility studies.
➢ Experienced with IBM InfoSphere Information Governance Catalog.
➢ Experienced in integration of various data sources (DB2 UDB, SQL Server, Sybase, Oracle,
Snowflake, Teradata, XML, and MS Access) into the data staging area.
➢ In-depth knowledge of relational and dimensional data modeling for creating logical and
physical database designs and ER diagrams using multiple data modeling tools like IDA 9.1,
Erwin 7.1, and Visio.
➢ Proficient in performance tuning for IDA: implementing proper model management, creating
multiple sub-packages in each DB2 LUW logical model, minimizing dependencies, and avoiding
cross-referencing between data models.
➢ Integrated back-end data with the UI using AJAX, RESTful APIs, and data-binding
techniques.
➢ Experienced in configuring and managing Linked Servers and data transfer between SQL Server
and heterogeneous relational databases such as DB2.
➢ Responsible for optimizing all indexes, SQL queries, stored procedures to improve the quality of
software.
➢ Expert in working with DataStage Manager, Designer, Administrator, and Director.
➢ Experienced with Microsoft DTS packages.
➢ Expertise in using UI frameworks like React, Angular, or Vue.js for building
interactive user interfaces.
➢ Experienced in handling Facets in health care systems.
➢ Involved in system transaction training on Claims, Utilization Management, Benefit Plans, Billing,
Commissions, Capitation, Customer Service, and Security.
➢ Experienced in analyzing problems in depth and providing recommendations on how to gain
better performance in a specific environment.
➢ Experienced in data modeling as well as reverse engineering using tools such as Erwin, Oracle
Designer, and MS Visio.
➢ Expert in unit testing, system integration testing, implementation, and maintenance of database
jobs.
➢ Expert in unit testing, system integration testing, implementation, maintenance and performance
tuning.
➢ Worked on scheduling tools such as Autosys for automating job runs.
➢ Proficiency in UI technologies such as HTML, CSS, JavaScript, React.js, Angular,
Vue.js, or other front-end frameworks.
Education Qualifications:
Master’s in Engineering Technology (Connecticut, USA, 2009)
Skillset:
IBM Cloud Pak for Data (CP4D) DataStage, IBM InfoSphere Information Server 11.7/11.5/9.1/8.5
(DataStage, Information Analyzer, Business Glossary), IBM InfoSphere Information Server 8.1, Ascential
DataStage Enterprise Edition, DataStage 7.5.2/7.0/6.0 (Administrator, Designer, Manager, and Director),
Informatica PowerCenter 9.0, InfoSphere Data Architect 9.1, Teradata SQL Assistant, Oracle Designer,
Erwin, ER/Studio, TOAD, C, C++, Visual Basic, SQL, PL/SQL, Oracle 11i/10g/9i/8i/7.x, SQL Server
2000/2005/2008, MS Access, Snowflake, Teradata, DB2, Unix, Linux, IBM AIX 5.2/5.1/7.1/9.1, Sun Solaris
V8.0, HP-UX V11.0, Windows XP/NT/2000/98, DataStage Version Control.
Project Summary:
Hardware/Software:
IBM Cloud Pak for Data (CP4D) DataStage, IBM InfoSphere Information Server 11.5 (DataStage,
QualityStage, Information Analyzer, Thin Client, Business Glossary), AQT, XML, JSON data, DB2, Linux,
WinSCP, Stonebranch, Putty.
Responsibilities:
Hardware/Software:
IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Thin Client,
Business Glossary), Erwin 9.7, CI/CD process, Oracle, Snowflake, Azure SQL, WinSCP, Stonebranch,
Putty.
Responsibilities:
Hardware/Software:
IBM InfoSphere Information Server 11.7 (DataStage, QualityStage, Information Analyzer, Thin Client,
Business Glossary), CI/CD process, Snowflake, Oracle, Azure SQL, WinSCP, Putty.
Responsibilities:
➢ Involved in gathering technical requirements through Agile Scrum calls and from architects.
➢ Involved in deployment of ETL jobs from version 11.5 to version 11.7 using the CI/CD process.
➢ Created manifest files for each business app code across multiple projects, covering the ETL
jobs, Unix files, and property files.
➢ Provisioned the manifest files to create a baseline ID for each interface.
➢ Deployed each interface with its respective baseline ID to IIS 11.7 and updated the status in Jira.
➢ Verified and tested the ETL jobs after the post-deployment process.
➢ Involved in debugging ETL job failures and identified root causes so the jobs could execute
successfully.
➢ Created a status document for each app code deployment and uploaded it to the Teams server.
➢ Worked with offshore developers on ETL development to implement business rules.
➢ Involved in ongoing development of technical best practices for data movement and data quality.
➢ Involved in development of DataStage jobs with required transformations like Aggregator,
Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Sort, Transformer, etc.
➢ Worked within the Enterprise Data Warehouse Application Development team, developing
and maintaining Data Models in the Enterprise Data Warehouse.
➢ Designed the mappings between sources (external files and databases) and Operational staging
targets.
➢ Involved in Project management from analysis and design to implementation and construction.
➢ Extensively used DataStage tools like InfoSphere DataStage Designer and InfoSphere DataStage
Director.
➢ Used Python scripting to migrate ETL DataStage jobs from version 11.5 to version 11.7.
➢ Replaced the Oracle Connector with the Microsoft Azure SQL connector via a Python script run
against each app code.
➢ Received the source data in the form of Oracle tables, Sequential files, flat files, and SQL Server,
and loaded it into Snowflake.
➢ Involved in migrating ETL jobs from the Development to the Production environment.
➢ Created Job Sequencers to execute sets of jobs.
➢ Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the
source information before it was delivered for further processing.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Involved in performance tuning and debugging of the jobs.
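The Python-driven export step of the 11.5 to 11.7 migration described above can be sketched as follows. All host names, project paths, credentials, and file names are placeholders, and the istool flags should be verified against the installed Information Server release; the command is assembled but deliberately not executed here.

```python
# Sketch only: build (but do not run) an istool export command that
# packages a DataStage project into an .isx archive for redeployment.
# Every value passed in below is a hypothetical placeholder.

def build_export_cmd(domain, user, password, project_path, archive):
    """Assemble an istool command list suitable for subprocess.run()."""
    return [
        "istool", "export",
        "-domain", domain,
        "-username", user,
        "-password", password,
        "-archive", archive,
        "-datastage", f"{project_path}/*/*.*",
    ]

cmd = build_export_cmd("services-host:9443", "dsadm", "changeme",
                       "engine-host/FIN_APP", "/tmp/FIN_APP_115.isx")
# In a real migration this list would be handed to subprocess.run(cmd)
# once per app code, followed by a matching istool import on the 11.7 domain.
print(" ".join(cmd))
```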
Hardware/Software:
IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Thin Client,
Business Glossary), Netezza, Datasets, ActiveBatch, Aginity Workbench.
Responsibilities:
➢ Involved in gathering business requirements from business users and architects.
➢ Worked with offshore developers on ETL development to implement business rules.
➢ Involved in ongoing development of technical best practices for data movement and data quality.
➢ Involved in development of DataStage jobs with required transformations like Aggregator,
Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Sort, Transformer, etc.
➢ Worked within the Enterprise Data Warehouse Application Development team, developing
and maintaining Data Models in the Enterprise Data Warehouse.
➢ Profiled legacy systems using column analysis, drill-through reports, and custom filters in
Information Analyzer.
➢ Designed, coded, tested, and documented complete customized application solutions for Data
Quality integration.
➢ Performed data profiling, standardization, matching, and cleansing activities through the data
quality process using QualityStage.
➢ Created projects, added data sources, and wrote, configured, and executed rules/rule sets within
Information Analyzer.
➢ Developed data profiling solutions, ran analysis jobs and viewed results, and created and
managed data quality controls using Information Analyzer.
➢ Performed column analysis, rule analysis, primary key analysis, natural key analysis, foreign-key
analysis, and cross-domain analysis.
➢ Worked with team members, BI developers, and the business to further enhance data models
and architecture to support business intelligence and corporate reporting needs.
➢ Developed and maintained audit and validation processes to detect data integrity problems
and worked with developers internally and externally to solve data integrity issues.
➢ Participated in the Agile process and met goals as accepted.
➢ Participated in ETL Applications Tools/Servers system and database upgrades.
➢ Extensively used DataStage tools like InfoSphere DataStage Designer and InfoSphere DataStage
Director.
➢ Involved in migrating ETL jobs from the Development to the Production environment.
➢ Created Job Sequencers to execute sets of jobs.
➢ Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the
source information before it was delivered for further processing.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Involved in performance tuning and debugging of the jobs.
➢ Scheduled and monitored ETL jobs on a daily basis.
The project involved implementing a process to consume files from different health care systems, load
them into a staging database after passing data quality rules, and capture the rejections in an exception
console for business verification.
Hardware/Software:
IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Thin Client,
Business Glossary), IBM Master Data Management, IMAM, IGC, DB2, Oracle 11g, Dollar Universe ($U),
UNIX.
Responsibilities:
➢ Involved in gathering business requirements from business users and architects.
➢ Worked with offshore developers on ETL development to implement business rules.
➢ Developed and managed data quality rules using Information Analyzer.
➢ Imported metadata through InfoSphere Metadata Asset Manager (IMAM), published it to the
InfoSphere Governance Catalog (IGC), and performed data profiling through Information
Analyzer (IA).
➢ Involved in development of DataStage jobs with required transformations like Aggregator,
Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Sort, Transformer, etc.
➢ Involved in scheduling ETL jobs using UNIX shell scripting and Scheduler.
➢ Developed Linux, Java, and Oracle solutions on the Master Data Management (MDM) product
supplied by IBM.
➢ Supported the MDM production implementation to assure system health.
➢ Resolved technical incidents and issues with the MDM product in production and test.
➢ Designed and coded technical solutions for business enhancements to our implementation of the
MDM database with health care eligibility systems.
➢ Identified and consulted on product improvements and integration of our MDM implementation.
➢ Developed data profiling solutions, ran analysis jobs and viewed results, and created and
managed data quality controls using Information Analyzer.
➢ Extensively used DataStage tools like InfoSphere DataStage Designer and InfoSphere DataStage
Director.
➢ Involved in migrating ETL jobs from the Development to the Production environment.
➢ Created Job Sequencers to execute sets of jobs.
➢ Created error and exception handling procedures to identify, record, report, and fix errors.
➢ Validated the current production data, the existing data, and the programming logic involved.
➢ Involved in performance tuning and debugging of the jobs.
➢ Scheduled and monitored ETL jobs on a daily basis.
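The error and exception handling procedures mentioned above follow a common pattern: route rows that fail validation into a reject set carrying a reason code, and pass the rest through. A minimal sketch, with purely illustrative field names:

```python
def split_rejects(rows, required=("member_id", "dob")):
    """Route rows missing any required field to a reject list (with an
    error note for reporting); return the clean rows separately.
    The member_id/dob field names are illustrative only."""
    clean, rejects = [], []
    for row in rows:
        missing = [f for f in required if not row.get(f)]
        if missing:
            rejects.append({**row, "_error": "missing " + ",".join(missing)})
        else:
            clean.append(row)
    return clean, rejects

clean, rejects = split_rejects([
    {"member_id": "A1", "dob": "2001-01-01"},
    {"member_id": "", "dob": "1999-05-05"},
])
```

In the actual jobs this split was done with stage-level reject links; the rejected set here would feed the exception console for business verification.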
Hardware/Software:
IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), InfoSphere Data Architect (IDA) 9.1, DB2, Oracle 11g, UC4, Autosys, Linux, AIX, UNIX.
Responsibilities:
➢ Interacted with the end-user community to understand the business requirements and to
identify data sources.
➢ Designed the Atomic Warehouse data models using IDA.
➢ Created physical and logical models using IDA.
➢ Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to
provide information about the data model and business requirements.
➢ Involved in creating entity-relational and dimensional-relational data models using IDA.
➢ Designed the mappings between sources (external files and databases) and operational staging
targets.
➢ Designed, coded, tested, and documented complete customized application solutions for data
quality integration.
➢ Configured IDA to connect to DB2 Server databases and generated entity relationship diagrams
for the Business Data Models and the Atomic Warehouse Model.
➢ Generated and analyzed the existing views in the BDM_C schema.
➢ Fixed errors in views in the BDM_C schema through impact analysis on each attribute and
entity in the Business Data Model and submitted DDL.
➢ Analyzed performance tuning of stored procedures and discussed migrating to the I-Test
environment for further tuning before moving to UAT.
➢ Received the source data in the form of Oracle tables, DB2, Sequential files, flat files, Excel
sheets, and SQL Server.
➢ Responsible for developing, validating, and communicating data modeling solutions, including
both relational and dimensional models.
➢ Designed the mappings between sources (external files and databases) and operational staging
targets.
Hardware/Software:
IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), IDA 9.1 (InfoSphere Data Architect), IBM InfoSphere Data Models, Oracle 11g, Netezza 7.2,
DBeaver, Talend, Aginity Workbench, UC4, Linux, AIX, UNIX.
Responsibilities:
➢ Interacted with the end-user community to understand the business requirements and to
identify data sources.
➢ Modeled the Star schema data marts by identifying the fact and dimension tables using Erwin
data modeling tool.
➢ Analyzed IBM data models that provide data warehouse design models to accelerate the
development of business applications.
➢ Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to
provide information about the data model and business requirements.
➢ Involved in creating entity-relational and dimensional-relational data models using IDA Studio.
➢ Worked on different source analytical systems: Scorecard Reports, Flowsheet, SSI, Access
Logging, and Professional Billing data.
➢ Designed the mappings between sources (external files and databases) and operational staging
targets.
➢ Demonstrated knowledge of Information Analyzer administrative tasks such as managing logs,
schedules, active sessions, and security roles.
➢ Gathered and articulated data issues with clients in areas such as data privacy, sensitivity, and
security.
➢ Provided expertise in data profiling concepts, issues, and activities.
➢ Performed data profiling tasks, analyzing and annotating data issues.
➢ Involved in project management from analysis and design through implementation and
construction.
➢ Extensively used DataStage tools like InfoSphere DataStage Designer and InfoSphere DataStage
Director.
➢ Received the source data in the form of Oracle tables, Sequential files, flat files, Excel sheets, and
SQL Server.
➢ Involved in development of DataStage jobs for integrated eligibility systems with required
transformations like Aggregator, Filter, Funnel, Join, Lookup, Merge, Remove Duplicates, Sort,
Transformer, etc.
➢ Used different UNIX commands to start and stop jobs, access log files, and list jobs.
➢ Used shell scripting for scheduling and the FTP process.
➢ Used shell scripting to read the data directly from the input directory.
➢ Involved in performance tuning and debugging of the jobs.
➢ Involved in business sales meetings with the IBM, Hortonworks, and Talend teams.
➢ Extensively worked on a POC for the Talend ETL tool.
➢ Scheduled and monitored ETL jobs on a daily basis.
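The shell pattern of reading data directly from an input directory can be sketched in Python as follows; the landing/archive directory names and the *.dat file pattern are illustrative. Moving each file to an archive folder after it is read keeps a scheduled run from processing the same file twice.

```python
from pathlib import Path
import shutil, tempfile

def collect_input_files(input_dir, archive_dir, pattern="*.dat"):
    """Read every matching file from the landing directory, then move
    it to an archive folder so the next scheduled run skips it."""
    input_dir, archive_dir = Path(input_dir), Path(archive_dir)
    archive_dir.mkdir(parents=True, exist_ok=True)
    rows = []
    for f in sorted(input_dir.glob(pattern)):
        rows.extend(f.read_text().splitlines())
        shutil.move(str(f), archive_dir / f.name)
    return rows

# Self-contained demo against a throwaway landing directory.
demo = Path(tempfile.mkdtemp())
(demo / "in").mkdir()
(demo / "in" / "claims.dat").write_text("row1\nrow2\n")
rows = collect_input_files(demo / "in", demo / "archive")
```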
Hardware/Software:
IBM InfoSphere Information Server 8.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), Oracle 11g, TOAD, UC4, Sequential files, AIX, UNIX, Linux, Autosys.
Responsibilities:
➢ Actively participated in the SCRUM meetings for daily update and Iteration Planning meetings.
➢ Interacted with BA to understand the business requirements and in identifying data sources.
➢ Worked on BOEING project for loading into final schemas tables for Anderson and Epic Clarity
Data.
➢ Involved in meetings with the IBM team for the upgrade of IBM DataStage.
➢ Worked on loading the data into MCDW POTCL tables.
➢ Used Data Stage stages namely Sequential file, Transformer, Aggregate, Sort, Datasets, Join,
Lookup, Funnel, Peek, Row Generator stages in accomplishing the ETL Coding.
➢ Used the Aggregator stage to count the number of records rejected during execution of the ETL job.
➢ Performed data profiling, standardization, matching, and cleansing activities through the data
quality process.
➢ Used Quality Stage for standardizing the names and Address, Identified Duplicates, Matched and
Unmatched records in each source system.
➢ Used Investigation stage for identifying handled and unhandled data.
➢ Involved in creating design and mapping documents for DataStage jobs.
➢ Responsible for daily verification that all scripts, downloads, and file copies were executed as
planned, troubleshooting any steps that failed, and providing both immediate and long-term
problem resolution.
➢ Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions
and Validations.
➢ Created Job Sequencer to execute a set of jobs.
➢ Created error and exception handling procedures to identify record, report and fix errors.
➢ Validated the current production data, the existing data and programming logics involved.
➢ Used TOAD as a querying tool to perform basic database testing and check for any data
inconsistencies.
➢ Used shell scripting for Scheduling and FTP process.
➢ Used UNIX commands to analyze the data in flat files, to handle the null values and Positions of
the columns in a file.
➢ Extensively tested the ETL jobs that were scheduled in UC4 tool for scheduling, timing and data
integrity.
➢ Used Daptiv as a project management tool.
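The flat-file analysis described above (checking null values and column positions) amounts to slicing fixed-width records into named columns and mapping empty values to None. A minimal Python sketch; the MEMBER_ID/DOB/PLAN layout is purely illustrative:

```python
def parse_fixed_width(line, layout, null_token=""):
    """Split one fixed-width record into named columns; empty values
    become None, mirroring the null handling done with UNIX tools."""
    out = {}
    for name, start, end in layout:
        value = line[start:end].strip()
        out[name] = value if value != null_token else None
    return out

# Hypothetical layout: (column name, start position, end position).
layout = [("member_id", 0, 8), ("dob", 8, 18), ("plan", 18, 22)]
rec = parse_fixed_width("A12345672001-01-01HMO1", layout)
rec2 = parse_fixed_width("B7654321" + " " * 10 + "PPO2", layout)
```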
Axiall Corp. is a chemistry company that applies chemistry to solve common problems, improve everyday
life, and drive human progress. The project objective was to collect, organize, and store data from
different data sources to provide a single integrated SAP system.
Hardware/Software:
IBM InfoSphere Information Server 11.3 (DataStage, QualityStage, FastTrack, Information Analyzer,
Business Glossary), SAP ECC 6.0, SQL Server 2008, Oracle 11g, Autosys, Erwin, UNIX, HP Quality Center.
Responsibilities:
➢ Involved in status meetings, and interacted with the Business Analyst to get the business rules.
➢ Involved in creating specifications for ETL processes, finalized requirements and prepared
specification document.
➢ Involved in meetings with SAP functional team, to discuss Customer master and Vendor master
data.
➢ Worked continuously with data stewards to gather requirement specifications.
➢ Worked on the BDR (Business Data Repository) to identify fields in SAP.
➢ Created FastTrack mappings from legacy systems to the target SAP system.
➢ Profiled legacy systems using column analysis, drill-through reports, and custom filters in
Information Analyzer.
➢ Performed data profiling, standardization, matching, and cleansing activities through the data
quality process.
➢ Used DataStage stages namely Hash file, Sequential file, Transformer, Aggregate, Sort, Datasets,
Join, Lookup, Change Capture, Funnel, Peek, Row Generator stages in accomplishing the DataStage
coding.
➢ Applied default, field, and cross-reference transformation rules in DataStage.
➢ Implemented the RTL process from the staging area to alignment to generate gap reporting.
➢ Used different stages in Datastage design to load the data in different SAP tables (LFA1, LFBK,
LFM1, ADDR2, ADDR3, ADDR6, and TIBAN Tables).
➢ Involved in creating functional and scope documentation for data cleansing, conversion, and
integration processes.
➢ Managed the Metadata associated with the ETL processes used to populate the data.
➢ Used Information Analyzer for generating Column Analysis and Primary Key analysis report.
➢ Used Quality Stage for standardizing the names and Address, Identified Duplicates, Matched and
Unmatched records in each source system.
➢ Used Investigation stage for identifying handled and unhandled data.
➢ Created SAP Idoc jobs to load data into SAP Idoc.
➢ Used the DataStage Director for scheduling and monitoring the jobs.
➢ Involved in Exports and Imports of ETL jobs.
➢ Wrote complex SQL queries using joins, sub queries and correlated sub queries for extracting the
data.
➢ Involved in job scheduling using Autosys.
➢ Created re-usable components using shared containers for local use or shared use.
➢ Created Job Sequencer to execute a set of jobs.
➢ Created and documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions,
and Validations.
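The complex extract queries mentioned above combined joins with correlated subqueries, e.g. picking the most recent row per key. The sketch below demonstrates the pattern against an in-memory SQLite table; the vendor_addr table and its columns are invented for illustration (the real jobs ran against Oracle and SQL Server):

```python
import sqlite3

# Correlated subquery: for each vendor, keep only the address row
# whose updated_on matches that vendor's latest update.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE vendor_addr (vendor_id TEXT, city TEXT, updated_on TEXT);
INSERT INTO vendor_addr VALUES
  ('V1', 'Atlanta', '2014-01-10'),
  ('V1', 'Houston', '2015-06-02'),
  ('V2', 'Peru',    '2015-03-15');
""")
rows = conn.execute("""
SELECT a.vendor_id, a.city
FROM vendor_addr a
WHERE a.updated_on = (SELECT MAX(b.updated_on)
                      FROM vendor_addr b
                      WHERE b.vendor_id = a.vendor_id)
ORDER BY a.vendor_id
""").fetchall()
```

The inner query re-runs per outer row (correlated on `a.vendor_id`), which is exactly what makes it a correlated subquery rather than a plain join.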
Hardware/Software:
IBM InfoSphere Information Server 8.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), Oracle 11g, Autosys, TOAD 9.6, Erwin, UNIX, HP Quality Center.
Responsibilities:
➢ Involved in technical leadership in the analysis, decision-making, design, and support phases of
implementation of computer applications and network hardware and infrastructure, operating
systems, databases and enterprise-wide business applications.
➢ Involved in development phase meetings for Business Analysis, Requirements Gathering and
managed offshore team to get Expected results.
➢ Extensively used DataStage Manager, Designer, Administrator and Director for creating and
implementing jobs.
➢ Extensively worked on error handling, cleansing of data, creating lookup files and performing
lookups for faster access of data.
➢ Used DataStage Manager to import the Metadata from sources and targets.
➢ Involved in creating technical documentation for source to target mapping procedures to
facilitate better understanding of the process and incorporate changes as and when necessary.
➢ Used DataStage stages namely Hash file, Sequential file, Transformer, Aggregate, Sort, Datasets,
Join, Lookup, Change Capture, Funnel, Peek, Row Generator stages in accomplishing the DataStage
coding.
➢ Used the DataStage Director for scheduling and monitoring the jobs.
➢ Used DataStage debugger to troubleshoot the designed jobs.
➢ Used Shared Containers for code reuse and implementing complex business logic.
➢ Tuned DataStage jobs to enhance their performance.
Hardware/Software:
IBM InfoSphere Information Server 8.5 (DataStage, QualityStage, Information Analyzer, Business
Glossary), Oracle 11g, Flat files, Autosys, TOAD 9.6, Erwin, UNIX, HP Quality Center.
Responsibilities:
➢ Involved in understanding of business processes and coordinated with business analysts to get
specific user requirements.
➢ Involved in ensuring the appropriateness and quality of data for data integration initiatives.
➢ Worked with Facets, an integrated eligibility health care management system designed to handle
the complex requirements of managed care programs.
➢ Involved in system transaction training on Claims, Utilization Management, Benefit Plans, Billing,
Commissions, Capitation, Customer Service, and Security.
➢ Involved in detailed knowledge of Quality Stage stages to develop an effective application.
➢ Used appropriate tools to fulfill data quality and business requirements.
➢ Provided metadata to create functional and technical specifications for data integration/cleansing
applications.
➢ Monitored resolution of data quality issues.
➢ Involved in understanding logical and physical models for the subject area of Customer
Information Management.
➢ Involved in Documentation Planning and Implementation.
➢ Experienced with data modeling tools such as MS Visio and/or Erwin.
Responsibilities:
➢ Interacted with the end-user community to understand the business requirements and to
identify data sources.
➢ Developed and executed IBM Quality Stage data quality tools.
➢ Supported and generated Enterprise Data Governance quality assurance metrics.
➢ Created critical field reporting tools for Enterprise Data Management customer segments.
➢ Worked with Datastage Manager for importing metadata from repository, new job Categories and
creating new data elements.
➢ Designed and developed ETL processes using DataStage designer to load data from Oracle, MS
SQL, Flat Files (Fixed Width) to staging database and from staging to the target Data Warehouse
database.
➢ Used DataStage stages namely Hash file, Sequential file, Transformer, Aggregate, Sort, Datasets,
Join, Lookup, Change Capture, Funnel, Peek, Row Generator stages in accomplishing the ETL
Coding.
➢ Developed job sequencer with proper job dependencies, job control stages, triggers.
➢ Used Quality Stage to ensure consistency, removing data anomalies and spelling errors of the
source information before being delivered for further processing.
➢ Extensively used DataStage Director for monitoring job logs to resolve issues.
➢ Involved in performance tuning and optimization of Data Stage mappings using features like
Pipeline and Partition Parallelism and data/index cache to manage very large volume of data.
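The job sequencer with proper dependencies described above is, at its core, a topological ordering problem: a job may run only after every job it depends on has finished. A minimal sketch using Python's standard-library graphlib (job names are illustrative):

```python
from graphlib import TopologicalSorter

# Map each job to the set of jobs it depends on. The warehouse load
# must wait for both staging jobs, which in turn wait for the extract.
deps = {
    "load_warehouse": {"stage_orders", "stage_customers"},
    "stage_orders": {"extract_src"},
    "stage_customers": {"extract_src"},
    "extract_src": set(),
}

# static_order() yields a valid run order: every dependency first.
order = list(TopologicalSorter(deps).static_order())
```

In a DataStage Job Sequencer the same structure is expressed with job activity stages and triggers; this sketch only illustrates the ordering logic.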
Responsibilities:
➢ Involved in understanding business processes and coordinated with business analysts to get
specific user requirements.
➢ Involved in understanding logical and physical models for the subject area of Customer
Information Management.
➢ Involved in Documentation Planning and Implementation.
➢ Experienced with data modeling tools such as MS Visio and/or Erwin.
➢ Experienced in relational data models such as conceptual, logical and physical data models,
dimensional data models, data dictionary, and metadata.
➢ Responsible for developing, validating, and communicating data modeling solutions, including
both relational and dimensional models.
➢ Designed the mappings between sources (external files and databases) and Operational staging
targets.
➢ Involved in Project management from analysis and design to implementation and construction.
➢ Extensively used DataStage Tools like Infosphere DataStage Designer, Infosphere DataStage
Director.
➢ Received the source data in the form of Oracle tables, Sequential files, flat files and Excel sheets,
Mainframes, SQL Server.
➢ Involved in development of DataStage Jobs with required Transformations like Aggregator, Filter,
Funnel, Join, Lookup, Merge, Remove Duplicates, Sort, Transformer etc.
➢ Involved in job scheduling using Autosys.
Hardware/Software
DataStage 7.5 (Parallel Extender), Oracle 8i, DB2, Sybase, SQL Server, Control-M, Erwin 4.0, MS Visio,
UNIX, Windows XP.
Responsibilities
➢ Involved in technical leadership in the analysis, decision-making, design, and support phases of
implementation of computer applications and network hardware and infrastructure, operating
systems, databases and enterprise-wide business applications.
➢ Involved in development phase meetings for Business Analysis and Requirements Gathering.
➢ Extracted, rectified, cleansed, and loaded data from flat files into different databases using shell
scripts and various stages.
➢ Migrated development mappings, as well as hotfixes to them, into the production environment.
➢ Performed an impact analysis on ETL jobs before they were migrated to DataStage 7.5.3.
➢ Managed the Metadata associated with the ETL processes used to populate the data warehouse.
➢ Designed and developed jobs using Parallel Extender for splitting bulk data into subsets and
to dynamically distribute to all available nodes to achieve best Job performance.
➢ Involved in raising incidents after batch monitoring performed by an offshore team.
➢ Involved in the migration of DataStage jobs from development to production environment.