Nikhil
E-mail: [email protected]
Hyderabad
Professional Summary:
Hands-on experience developing mappings in Pentaho Data Integration using SQL Server.
Hands-on experience developing mappings and workflows in Informatica PowerCenter using Oracle 10g.
Experience creating and altering database tables per client requirements.
Hands-on experience deploying Pentaho KTR files.
Experience in version control using Git.
Experience raising Change Requests for the deployment process.
Experience in code review and managing pull requests in Bitbucket and GitHub.
Good client-relations skills and the drive to complete tasks effectively and efficiently where
customer service and technical skills are demanded.
Good team player with a positive attitude; self-motivated, a quick learner, and willing to adapt
to new challenges and technologies.
ETL Tools: Informatica PowerCenter 9.x/10.x, Pentaho Data Integration 9.1
Education:
CERTIFICATIONS:
PROFESSIONAL EXPERIENCE:
Associate – Projects
Description:
Mortgage project: SQL Server data is extracted on a daily basis and sent to the downstream
systems. Some jobs use Pentaho Data Integration and some use Informatica PowerCenter ETL. Loads
are triggered through Autosys and sent through the Mortgage-Data-Feed engine; the data is saved
in an S3 bucket and moved via File Mover to the respective downstream systems.
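As a rough illustration of the kind of daily extract such a job runs, a minimal T-SQL sketch (the table and column names are hypothetical, not from the project):

    -- Hypothetical daily extract: rows modified during the previous calendar day
    -- (dbo.MortgageLoans and its columns are assumed names for illustration only)
    SELECT LoanId, BorrowerName, Balance, LastModified
    FROM dbo.MortgageLoans
    WHERE LastModified >= CAST(DATEADD(DAY, -1, GETDATE()) AS DATE)
      AND LastModified <  CAST(GETDATE() AS DATE);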
Responsibilities:
Environment:
Pentaho Data Integration 9.1, Informatica PowerCenter 10.2, SQL Server Management Studio,
Autosys scheduling tool, Swagger, File Mover
PROFESSIONAL EXPERIENCE:
Software Engineer
Project: ASDA
Domain: Retail
Description:
ASDA, a British supermarket retail giant, became a subsidiary of Walmart after a takeover in July
1999 and currently ranks third by market share in the UK. ASDA operates online grocery delivery
and pick-up services at more than 300 stores, delivering over a billion items across thousands of
truck trips each year. A majority of the annual cost of fulfilment is attributed to the in-store
order-picking process, as most e-commerce grocery orders are picked and shipped from ASDA stores.
Improving in-store order-fulfilment velocity ensures that more items are processed per picker per
hour, thereby improving the capacity of online orders fulfilled in a day (directly impacting
revenue), increasing on-time deliveries, and driving down operational costs.
Responsibilities:
Extensively used ETL to load data from Oracle and flat files into the data warehouse.
Performed unit testing and integration testing of Informatica mappings.
Implemented various performance-tuning techniques.
Environment:
PROFESSIONAL EXPERIENCE:
Domain: Media
Description:
Rogers Inc. sells Video on Demand (VOD) services to its clients. The VOD service statistics are
collected and stored in various source systems. The purpose of the project is to integrate all
statistics into one holistic view so that the business can use the outputs for reporting and data
analysis. The project also includes a reconciliation process with various validation rules within
the ETL process so that corrupt data can be pushed out to unreconciled tables; this data is
subsequently reconciled. Solaris is delivered to consumers via the Maestro/COMCAST platform. VST
requires data for settlement purposes. Transactional TVOD data (Events, Credits or Reversals)
covers the library of videos for which subscribers pay per rental. This record is created using
Maestro, as the rental is eligible for rating and billing. Exadata receives asset information
from Hadoop; it is agreed between the Exadata and Hadoop teams that the latter will send a
separate row for each of the asset types (Movie, Poster, Preview, Title). Exadata is responsible
for consolidating all column values for each Package Asset ID. The asset data goes through the
reconciliation process before the legacy tables are updated.
Responsibilities:
Responsible for gathering the suite of business requirements and preparing source-to-target
mapping specifications and transformation rules.
Created the Source-to-Target Mapping Specification document.
Involved in system study, analysing requirements in meetings with the client, and designing
the system.
Developed mappings, reusable objects, transformations, and mapplets using Mapping Designer,
Transformation Developer, and Mapplet Designer.
Extracted data from different sources such as Oracle and flat files.
Designed and developed complex aggregate, join, and lookup transformation rules (business
rules) to generate consolidated (fact/summary) data identified by dimensions, using the
Informatica ETL tool.
Used the Update Strategy transformation to update the target dimension tables (see the sketch
after this list).
Created connected and unconnected Lookup transformations to look up data from the source and
target tables.
Involved in performance tuning for sources, targets, mappings, sessions, and the server.
Developed a batch file to automate executing the different workflows and sessions associated
with the mappings on the development server.
Used the PeopleSoft Application Engine to load the data marts.
Created test cases and completed unit, integration, and system tests for the data warehouse.
Joined tables originating from Oracle.
Wrote test cases and test conditions for various derivatives and subject areas.
Actively participated in team meetings and discussions to propose solutions to problems.
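As a minimal sketch of the dimension-update logic an Update Strategy flow implements, a
conceptual SQL MERGE equivalent (the table and column names are assumptions for illustration,
not from the project):

    -- Conceptual SQL equivalent of an Update Strategy flow into a dimension table
    -- (DIM_CUSTOMER, STG_CUSTOMER, and their columns are assumed names)
    MERGE INTO DIM_CUSTOMER tgt
    USING STG_CUSTOMER src
       ON (tgt.CUSTOMER_KEY = src.CUSTOMER_KEY)
    WHEN MATCHED THEN
      UPDATE SET tgt.CUSTOMER_NAME = src.CUSTOMER_NAME,
                 tgt.UPDATED_DT    = SYSDATE
    WHEN NOT MATCHED THEN
      INSERT (CUSTOMER_KEY, CUSTOMER_NAME, UPDATED_DT)
      VALUES (src.CUSTOMER_KEY, src.CUSTOMER_NAME, SYSDATE);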
Environment:
Informatica PowerCenter 9.6, SQL Developer, UNIX, JIRA, Shell Scripting
PROFESSIONAL EXPERIENCE:
Description:
This application mainly targets the ICF 7.4 to TDP integration, i.e., checking whether the new
source systems implementing ICF 7.4 serve all the business logic properly, and verifying that
the data populated in the EDW target has the right data types coming from the new ICF source
systems (views). It also covers downstream reports such as Tax, Liquidity, and Cash On Hand; the
Sigma feed will no longer be needed.
Responsibilities:
Gathered the suite of business requirements and prepared source-to-target mapping
specifications and transformation rules.
Created the Source-to-Target Mapping Specification document.
Involved in system study, analysing requirements in meetings with the client, and designing
the system.
Developed mappings, reusable objects, transformations, and mapplets using Mapping Designer,
Transformation Developer, and Mapplet Designer.
Extracted data from different sources such as Oracle and flat files.
Designed and developed complex aggregate, join, and lookup transformation rules (business
rules) to generate consolidated (fact/summary) data identified by dimensions, using the
Informatica ETL tool.
Used the Update Strategy transformation to update the target dimension tables.
Created connected and unconnected Lookup transformations to look up data from the source and
target tables.
Involved in performance tuning for sources, targets, mappings, sessions, and the server.
Used PL/SQL and UNIX shell scripts for scheduling the sessions in Informatica.
Wrote SQL and PL/SQL for implementing business rules and transformations.
Developed a batch file to automate executing the different workflows and sessions associated
with the mappings on the development server.
Used the PeopleSoft Application Engine to load the data marts.
Created test cases and completed unit, integration, and system tests for the data warehouse.
Joined tables originating from Oracle.
Wrote test cases and test conditions for various derivatives and subject areas.
Actively participated in team meetings and discussions to propose solutions to problems.
Prepared the QA Test Plan, test cases, and QA sign-off documents.
Prepared test cases and the Test Plan in HP ALM.
Validated the DB data in ICF 7.4 as part of the QA team.
Analysed the business requirements according to the DMS logic.
Analysed user tasks and developed a model of the tasks and the flow of work between them.
Performed end-to-end testing from source table columns to target table columns, covering
business-requirement logic, hard-coded values, and straight copies.
Validated the workflow dependencies.
Validated the trigger-file functionality.
Validated the Command task and Event Wait task.
Verified that, in order to pull cash pool accounts, the logic used is
BANKACCOUNTS.BANKACCTTYPEID = 'INT' (see the sketch below).
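As an illustration of that validation, a minimal SQL sketch using the predicate above (the
BANKACCOUNTS table and BANKACCTTYPEID filter come from the requirement; the selected account-ID
column is a hypothetical name):

    -- Pull cash pool accounts; only the BANKACCTTYPEID = 'INT' filter is given
    -- in the requirement, BANKACCOUNTID is an assumed column for illustration
    SELECT BANKACCOUNTID, BANKACCTTYPEID
    FROM BANKACCOUNTS
    WHERE BANKACCTTYPEID = 'INT';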