Gangadhara Rao Chiluka (11y 1m)
Career Objective:
Career Overview:
Scholastics:
MCA (Master of Computer Applications) from KGRL College, Bhimavaram, Andhra University
B.Sc (Bachelor of Computer Science) from Acharya Nagarjuna University.
Skill Set:
Organizational Experience:
Worked as Senior Project Engineer at WIPRO India Pvt from Feb 2021 to Dec 2023.
Worked as Software Engineer at IBM India Pvt from Nov 30, 2011 to Dec 10, 2019.
Projects Handled:
Responsibilities:
• Identified, analyzed, and developed interfaces and integration flows using Mule ESB and Anypoint Studio.
• Involved in exposing and consuming SOAP and RESTful (JSON) web services.
• Deployed the application to the on-premises environment using GitHub.
• Deployed the application on a UNIX machine and used FTP to review the logs.
• Involved in MuleSoft API development using RAML.
• Involved in fixing defects raised by QA, UAT, and Production.
• Wrote MUnit test cases to validate Mule flows.
• Hands-on experience with Mule Expression Language (MEL) to access payload data, properties, and variables of the Mule message flow.
• Strong application integration experience using Mule ESB with connectors, transformations, routing, ActiveMQ, and batch processing.
• Developed applications that connect to the client database, retrieve the records, and process them to the SAP system.
• Worked extensively with Mule connectors.
• Experience with transformations using the DataWeave Language (DWL); see the sketch after this list.
• Involved in fixing defects raised by the SIT and UAT teams.
• Involved in developing interfaces integrating with third-party web services.
• Prepared test cases and tested the application using MUnit and SoapUI.
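A minimal DataWeave sketch of the kind of transformation described above (DataWeave 2.0 syntax assumed; the payload shape and field names are illustrative, not taken from the actual project):

%dw 2.0
output application/json
---
// Map an incoming customer record (illustrative structure) to the target format
{
    customerId: payload.customer.id,
    fullName:   payload.customer.firstName ++ " " ++ payload.customer.lastName,
    orders:     payload.orders map (order) -> {
        orderId: order.id,
        amount:  order.amount as Number
    }
}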
Project # 2 (Feb 2021 to Mar 2022)
Project Name : Standard Insurance Company (SIC)
Client : Standard Insurance, UK
Team Size : 5
Environment : Mule ESB, Web Services (REST), SOAP, Anypoint Studio, GitHub, MUnit, Jira, SourceTree, Anypoint Platform, Jenkins, Oracle SQL/PL-SQL
Description:
The Standard Insurance Company Policies API enables MuleSoft users to easily facilitate
various scenarios on the back end. The connector provides full Orders API capabilities,
including create, update, show details for, authorize and capture policy payments for, save,
and void policy orders. For more information on the insurance Orders API, including sample
payloads and parameters, see the Policies Orders API documentation. With the Policy Orders
API you can create and capture payments instantly (Capture intent) or, if used in conjunction
with the Payments Connector, create and authorize the payment when the buyer is present and
then capture it later (e.g. after confirming product availability in stock).
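As an illustration only (the intent value, field names and payload shape below are hypothetical, not taken from the actual Policies Orders API), a request body for creating an order that captures the payment instantly could be built in DataWeave roughly as follows:

%dw 2.0
output application/json
---
// Hypothetical order-creation body with an immediate capture intent
{
    intent: "CAPTURE",
    policyReference: payload.policy.reference,
    amount: {
        currencyCode: "GBP",
        value: payload.policy.premium as String
    }
}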
Responsibilities:
• Identified, analyzed, and developed interfaces and integration flows using Mule ESB and Anypoint Studio.
• Involved in exposing and consuming SOAP and RESTful (JSON) web services.
• Deployed the application to the on-premises environment using GitHub.
• Deployed the application on a UNIX machine and used FTP to review the logs.
• Involved in MuleSoft API development using RAML.
• Involved in fixing defects raised by QA, UAT, and Production.
• Wrote MUnit test cases to validate Mule flows.
• Hands-on experience with Mule Expression Language (MEL) to access payload data, properties, and variables of the Mule message flow (see the sketch after this list).
• Hands-on experience with various Mule connectors such as HTTP, HTTPS, File, SAP, FTP, SFTP, VM, DB, and JMS.
• Wrote stored procedures according to business requirements for batch processing.
• Strong application integration experience using Mule ESB with connectors, transformations, routing, ActiveMQ, and batch processing.
• Developed applications that connect to the client database, retrieve the records, and process them to the SAP system.
• Worked extensively with Mule connectors.
• Experience with transformations using the DataWeave Language (DWL).
• Involved in fixing defects raised by the SIT and UAT teams.
• Involved in developing interfaces integrating with third-party web services.
• Prepared test cases and tested the application using MUnit and SoapUI.
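In Mule 4 the MEL usage mentioned above maps onto DataWeave bindings; a minimal sketch of reading the payload, inbound attributes and flow variables (the header and variable names are illustrative assumptions, not from the project):

%dw 2.0
output application/json
---
// payload, attributes and vars are the standard Mule 4 bindings
{
    recordCount: sizeOf(payload.records default []),
    requestId:   attributes.headers['x-request-id'],
    batchName:   vars.batchName default "unknown"
}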
Description:
Prudential Insurance (PI) is a single consolidated system that has a complete dataset to
fulfil most of the reporting requirements in Prudential. The Prudential Insurance system loads
data from all required upstream systems and generates the reports in a single system.
Prudential Insurance is also able to perform inter-system reconciliations and data analysis.
Prudential Insurance plays an important role in improving Prudential's data consistency and
integrity through daily reconciliations of customer account balances, contract balances and
ledger balances.
Prudential Insurance has different upstream systems such as Finacle, PSGL, Murex, Imex,
NCIO, ELMS etc., and downstream systems such as FRDM, CPMS, RMG etc.
Responsibilities:
Description:
The Shared Services Application Centre renders services for two key areas, i.e. Data
Warehouse & Integration and Finance. As a member of the Finance team, worked on
different projects such as Chrysalis re-pricing and BancTec replacement. These projects run
mainly on mainframes, with responsibility for manual testing on mainframes (submission of
jobs and checking their statuses).
Description:
The Historical Product Delivery Database (HPDD) system was introduced to store product-
related data such as phones' IMEI/MEID numbers, simlock codes, SW versions, shipment
information and so on. The following are the vital objectives of the HPDD:
• Validation and storage of unit & delivery history information (IMEI/MEID, simlock
codes, software revisions etc.)
The HPDD team performs all work regarding system development and enhancement of
the Customer Services applications in HPDD, as well as hosting these within the HPDD
database. The number of solutions/applications has grown rapidly, entailing increased
I/O and CPU load on HPDD beyond previous estimates. Continuing to run everything within
the HPDD database will result in a system design that will soon outgrow itself. In addition,
there is an increasing number of suppliers and forecasted increases in production volumes,
as well as more complex information flows due to process changes such as VMI and the
required ability to split order information into pallets and master packs.
Description:
Standard Life is currently using an obsolete workflow system (AWD V2x from DST), which it
needs to replace by the end of 2014. The aim of the project is to transition to a solution based
on IBM's Business Process Management Suite (BPMS), including Business Process Manager
(BPM), Business Activity Monitor (BAM) and Operational Decision Manager (ODM).
• Preparing test scenarios & test scripts for the given requirements and uploading them
into QC
• Executing the test cases from Quality Center
• Preparing and executing VB Scripts from Quality Center
• Involved in UI testing and performance testing
• Raising defects (if any) in Quality Center
• Assigning work to and reviewing the scripts/test scripts of my team members
• Interacting with the development team and the onsite team
• Involved in team monitoring, status tracking, team meetings and reporting
Description: Product Cluster is a cluster that deals with the corporate product database, which
has all the information about services and contracts provided by KPN. Product Cluster is an
application cluster that covers the following four applications.
PROMIS: the central corporate database and the primary component of the Product
Cluster.
Technical Responsibilities:
The FRS is the definitive source of internal management data to support the Management
Committee and Board of Directors. The client also chose to implement Oracle BI OLAP to
report multidimensional facts from the data staged in PeopleSoft EPM as the source data.
Currently Citi has decided to upgrade from OLAP 10g to 11g. Citi has six different cubes,
such as the Financial cube, Management cube and OU cube. Citi has two different jobs:
1. History build and 2. Incremental build. The History build maintains data from January
2006 to September 2012.
Technical Responsibilities: