
Resume

Name : Chiluka Gangadhara Rao


Tel : +91-8500459496
Mail Id : [email protected]

Career Objective:

To obtain a challenging position with a reputed organization where I can apply my technical skills in application integration using the Mule ESB platform, together with my database skills, for the growth of the organization, while enhancing my knowledge of new and emerging trends in the IT sector. Worked extensively with web services (SOAP and REST), APIs, and XML, and well versed in ESB and the Oracle database.

Career Overview:

• 11 years of experience in IT as an Oracle database and MuleSoft developer.
• Around 3 years of experience in Mule ESB.
• Around 2 years of experience in the Snowflake cloud data warehouse.
• Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Time Travel, and Data Sharing.
• Experience with Anypoint Platform features such as Design Center, Exchange, API Manager, and Runtime Manager.
• Good experience using Anypoint Studio to develop APIs.
• Experience using Mule connectors such as Database, HTTP, VM, File, FTP, SFTP, Amazon SQS, and Email, and deploying to CloudHub.
• Good knowledge of API life-cycle phases: design, implementation, and management.
• Good knowledge of exception-handling strategies.
• High-level understanding of the Snowflake architecture.
• Working experience in Oracle performance tuning.
• 6 years of experience in Oracle SQL and PL/SQL.
• Involved in RESTful API development using RAML.
• Used the DataWeave language for various transformations.
• Handled exceptions for the various flows.
• Tested flows using the MUnit framework.
• Applied security policies to APIs in the Anypoint Platform.
• Worked extensively in Oracle SQL and PL/SQL; developed packages, stored procedures, functions, triggers, and cursors.
• Performance-tuned SQL queries, PL/SQL procedures, and packages.
• Created proxy applications for API projects.
• Encrypted passwords using secure properties placeholders.
• Sent error notifications using SMTP servers.
• Loaded bulk data from source to target using batch jobs.
• Involved in code migration across environments.
• Used build tools such as Maven and CI tools such as Jenkins.
• Fixed defects raised in the SIT and UAT environments.
• Deployed applications on premises using Jenkins jobs.
• Coordinated onshore/offshore for better delivery and quality control, and handed deliverables over from offshore to the client.
• Worked in Agile, attending scrum calls and developing APIs for user stories across the sprints.
• Followed best practices while developing APIs.
• Familiar with the different data-loading options in Snowflake, as well as unloading data and snapshots.
• Data Sharing in Snowflake.
• Understanding of internal and external stages.
• Understanding of file formats in Snowflake.
• Experience in SnowSQL and CRUD operations (see the SnowSQL sketch after this list).
• Understanding of entities, relations, and the different objects in a Snowflake database.
• Familiar with the different table types in Snowflake and their usage.
• Strong understanding of data warehouse concepts.
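For illustration, a minimal SnowSQL sketch of the Time Travel and Data Sharing features listed above; the database, table, share, and account names are all hypothetical, not taken from any client engagement.

```sql
-- Hypothetical names throughout; shown only to illustrate the features above.
USE DATABASE demo_db;

-- Time Travel: query a table as it looked 30 minutes (1800 seconds) ago.
SELECT COUNT(*) FROM orders AT (OFFSET => -1800);

-- Time Travel: recover an accidentally dropped table.
DROP TABLE orders;
UNDROP TABLE orders;

-- Data Sharing: expose a table to a consumer account.
CREATE SHARE orders_share;
GRANT USAGE ON DATABASE demo_db TO SHARE orders_share;
GRANT USAGE ON SCHEMA demo_db.public TO SHARE orders_share;
GRANT SELECT ON TABLE demo_db.public.orders TO SHARE orders_share;
ALTER SHARE orders_share ADD ACCOUNTS = xy12345;  -- made-up account locator
```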

Scholastics:

MCA (Master of Computer Applications), KGRL College, Bhimavaram, Andhra University.
B.Sc. (Bachelor of Computer Science), Acharya Nagarjuna University.

Skill Set :

ESB Technologies : Mule ESB, Anypoint Studio, RAML
Web Services : Apache Tomcat, SOAP, REST, JSON
Messaging System : ActiveMQ
Software Tools : PL/SQL Developer, TOAD, HP QC, Remedy, SQL Developer, ServiceNow, MUnit, JIRA, Splunk, Jenkins
Operating Systems : UNIX, Windows
Programming Languages : SQL, PL/SQL
Database : Oracle 12c, 11g, 10g, 9i and Snowflake
Scripting Languages : UNIX Shell Scripting
Source Control Systems : GitHub
Code Versioning : SVN, SourceTree (Git)

Organizational Experience:

Worked as a Senior Project Engineer at Wipro India Pvt. Ltd. from Feb 2021 to Dec 2023.
Worked as a Software Engineer at IBM India Pvt. Ltd. from Nov 30, 2011 to Dec 10, 2019.
Projects Handled:

Project # 1 (Apr 2022 to Dec 2023)


Project Name : CDCE
Client : CDCE, Hong Kong
Team Size : 5
Environment : Mule ESB, Web Services (REST), SOAP, Anypoint Studio, GitHub, MUnit, Jira, SourceTree, Anypoint Platform, Jenkins, Oracle SQL, PL/SQL
Description:
The CDCE API enables MuleSoft users to easily retrieve information such as LEGAL, LKA, and VALUATION reports through a Python API call. It helps to get information about the houses, apartments, and villas available in each area. The Python API provides valuable customer information to the end user, and the valuation report provides all eligible information to other vendors.

Responsibilities:

• Identified, analyzed, and developed interfaces and integration flows using Mule ESB and Anypoint Studio.
• Involved in exposing and consuming SOAP and RESTful (JSON) web services.
• Deployed the application on premises, using GitHub for source control.
• Deployed the application on a UNIX machine and used FTP to inspect the logs.
• Involved in MuleSoft API development using RAML.
• Involved in fixing defects raised in QA, UAT, and production.
• Wrote MUnit test cases to validate Mule flows.
• Hands-on experience with Mule Expression Language (MEL) to access payload data, properties, and variables of the Mule message flow.
• Strong application integration experience using Mule ESB with connectors, transformations, routing, ActiveMQ, and batch processing.
• Developed applications that connect to the client database, retrieve records, and pass them on to the SAP system.
• Worked extensively with Mule connectors.
• Experience in transformations using the DataWeave language (DWL).
• Involved in fixing defects raised by the SIT and UAT teams.
• Involved in developing interfaces that integrate with third-party web services.
• Prepared test cases and tested the application using MUnit and SoapUI.
Project # 2 (Feb 2021 to Mar 2022)
Project Name : Standard Insurance Company (SIC)
Client : Standard Insurance, UK
Team Size : 5
Environment : Mule ESB, Web Services (REST), SOAP, Anypoint Studio, GitHub, MUnit, Jira, SourceTree, Anypoint Platform, Jenkins, Oracle SQL, PL/SQL
Description:
The Standard Insurance Company Policies API enables MuleSoft users to easily handle various scenarios on the back end. The connector provides full Orders API capabilities, including creating, updating, showing details for, authorizing, and capturing Policies payments, and saving and voiding Policies orders. For more information on the insurance Orders API, including sample payloads and parameters, see the Policies Orders API documentation. With the Policy Orders API you can create and capture payments instantly (Capture intent) or, if used in conjunction with the Payments Connector, create and authorize the payment when the buyer is present and then capture it later (e.g., after confirming product availability in stock).

Responsibilities:

• Identified, analyzed, and developed interfaces and integration flows using Mule ESB and Anypoint Studio.
• Involved in exposing and consuming SOAP and RESTful (JSON) web services.
• Deployed the application on premises, using GitHub for source control.
• Deployed the application on a UNIX machine and used FTP to inspect the logs.
• Involved in MuleSoft API development using RAML.
• Involved in fixing defects raised in QA, UAT, and production.
• Wrote MUnit test cases to validate Mule flows.
• Hands-on experience with Mule Expression Language (MEL) to access payload data, properties, and variables of the Mule message flow.
• Hands-on experience with various Mule connectors such as HTTP, HTTPS, File, SAP, FTP, SFTP, VM, DB, and JMS.
• Wrote stored procedures for the batch process according to business requirements (see the PL/SQL sketch after this list).
• Strong application integration experience using Mule ESB with connectors, transformations, routing, ActiveMQ, and batch processing.
• Developed applications that connect to the client database, retrieve records, and pass them on to the SAP system.
• Worked extensively with Mule connectors.
• Experience in transformations using the DataWeave language (DWL).
• Involved in fixing defects raised by the SIT and UAT teams.
• Involved in developing interfaces that integrate with third-party web services.
• Prepared test cases and tested the application using MUnit and SoapUI.
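As a flavor of the batch-oriented stored procedures mentioned above, here is a minimal Oracle PL/SQL sketch; the table and column names (stg_policies, policies) are hypothetical, not from the actual project.

```sql
-- Hypothetical tables: stg_policies (staging) and policies (target).
CREATE OR REPLACE PROCEDURE load_policies_batch AS
  CURSOR c_stage IS
    SELECT policy_id, holder_name, premium FROM stg_policies;
  TYPE t_stage IS TABLE OF c_stage%ROWTYPE;
  l_rows t_stage;
BEGIN
  OPEN c_stage;
  LOOP
    -- Fetch and insert in batches of 1000 to limit memory use.
    FETCH c_stage BULK COLLECT INTO l_rows LIMIT 1000;
    EXIT WHEN l_rows.COUNT = 0;
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO policies (policy_id, holder_name, premium)
      VALUES (l_rows(i).policy_id, l_rows(i).holder_name, l_rows(i).premium);
    COMMIT;  -- commit per batch
  END LOOP;
  CLOSE c_stage;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;
END load_policies_batch;
/
```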

Project # 3 (Aug 2019 to Dec 2019)


Project Name : Prudential
Client : Prudential Insurance, US
Team Size : 7
Environment : Linux, Oracle 12c, PL/SQL, TWS, Shell Scripting, Snowflake, Matillion, AWS EC2 and S3 buckets

Description:

Prudential Insurance (PI) is a single consolidated system with a complete dataset that fulfils most of the reporting requirements in Prudential. The system loads data from all the required upstream systems and generates the reports in a single place. It is also able to perform inter-system reconciliations and data analysis, and it plays an important role in improving Prudential's data consistency and integrity through daily reconciliations of customer account balances, contract balances, and ledger balances.
Prudential Insurance has different upstream systems such as Finacle, PSGL, Murex, Imex, NCIO, and ELMS, as well as downstream systems such as FRDM, CPMS, and RMG.

Responsibilities:

• Troubleshot issues reported by end users.
• Optimized and performance-tuned existing processes.
• Designed database object structures and created the different database objects as per the requirements.
• Prepared unit test plans and provided testing support through UAT.
• Communicated with business/data analysts to understand data and report designs.
• Developed complex queries and stored procedures for the data migration from the RDBMS to the Snowflake cloud data warehouse (see the COPY INTO sketch after this list).
• Created source-to-target mappings based on the business requirements.
• Involved in creating functions, users, roles, and object privileges.
• Started working with the Time Travel and failover features of Snowflake.
• Responsible for integration testing and go-live support.
• Guided the team in resolving technical issues and interacted with the business/data analysts and the onshore lead.
• Gathered requirements from functional owners and end users.
• Provided timely status updates to the project manager on the progress of the tasks handled by the team.
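A minimal sketch of the RDBMS-to-Snowflake loading pattern described above, assuming the data has been exported to S3 as CSV files; the stage, file-format, bucket, and table names are hypothetical.

```sql
-- Hypothetical bucket, credentials, and table names.
CREATE OR REPLACE FILE FORMAT csv_fmt
  TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1;

CREATE OR REPLACE STAGE ext_stage
  URL = 's3://example-bucket/exports/'
  CREDENTIALS = (AWS_KEY_ID = '***' AWS_SECRET_KEY = '***')
  FILE_FORMAT = csv_fmt;

-- Bulk-load the exported files into the target table.
COPY INTO customer_balances
  FROM @ext_stage
  PATTERN = '.*balances.*[.]csv'
  ON_ERROR = 'CONTINUE';
```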

Project # 4 (Dec 2017 to Jul 2019)


Project Name : Shared Services Application Centre
Client : Standard Life Assurance, Scotland, UK
Team Size : 18
Environment : Oracle 11g (SQL, PL/SQL), UNIX, Windows, and Snowflake

Description:
The Shared Services Application Centre renders services in two key areas: Data Warehouse and Integration, and Finance. As a member of the Finance team, I worked on different projects such as the Chrysalis re-pricing and the BancTec replacement. These projects run mainly on mainframes, and I was responsible for manual testing on the mainframes (submitting jobs and checking their statuses).

Brief about FaST:


This programme is intended to replace the legacy system, Walker (a mainframe system), with Oracle E-Business Suite. To achieve this objective, the programme was designed as two releases, rel-1 and rel-2, and rel-1 has successfully met its target.

• Worked on various demands.
• Wrote stored packages according to business requirements.
• Wrote reconciliation queries to verify results (see the sketch after this list).
• Understood the requirements and prepared the high-level scenarios.
• Participated in test preparation activities and attended daily huddles.
• Communicated with various teams for project work.
• Extensively developed database PL/SQL packages, procedures, functions, and triggers to implement the business rules.
• Started working with the Time Travel and failover features of Snowflake.
• Responsible for integration testing and go-live support.
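The reconciliation queries mentioned above typically compare a source set against a target set in both directions; a minimal Oracle SQL sketch, with hypothetical table names, might look like this.

```sql
-- Rows present in the source but missing from the target (hypothetical tables).
SELECT account_id, balance FROM src_ledger_balances
MINUS
SELECT account_id, balance FROM tgt_ledger_balances;

-- And the reverse direction, to catch rows that exist only in the target.
SELECT account_id, balance FROM tgt_ledger_balances
MINUS
SELECT account_id, balance FROM src_ledger_balances;
```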

Project # 5 (Oct 2016 to Nov 2017)


Project Name : HPDD (Historical Product Delivery Database)
Client : Sony Communications
Team Size : 2
Environment : Oracle 11g (SQL, PL/SQL), UNIX, Windows

Description:
The Historical Product Delivery Database (HPDD) system was introduced to store product-related data such as phones' IMEI/MEID numbers, simlock codes, SW versions, shipment information, and so on. The following are the vital objectives of the HPDD:
• Validation and storage of unit and delivery history information (IMEI/MEID, simlock codes, software revisions, etc.)
• Automated provision of unit-level information and delivery statistics to customers, swap/repair parties, retailers, and several SEMC business applications
• A web interface for manual data access
The system helps SEMC add value to the supply chain and thereby indirectly support the customers, at delivery and during after-market activities such as swap/repair, if maintained and used correctly.
The HPDD team performs all work regarding system development and enhancement of the Customer Services applications in HPDD, as well as hosting these within the HPDD database. The number of solutions and applications has grown rapidly, entailing an increased I/O and CPU load on HPDD beyond previous estimates, and continuing to run everything within the HPDD database would result in a system design that would soon outgrow itself. In addition, there is an increasing number of suppliers and forecast increases in production volumes, as well as more complex information flows due to process changes such as VMI and the required ability to split order information into pallets and master packs.

• Worked on various demands.
• Wrote stored packages according to business requirements.
• Wrote reconciliation queries to verify results.
• Took additional responsibility for helping to prepare the test data and the test document for the test cases (see the sketch after this list).
• Involved in migration.
• Identified and resolved open issues.
• Involved in development and in creating test data through UAT.
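Test-data preparation of the kind mentioned above is often scripted; a minimal PL/SQL sketch with a hypothetical unit_history table and synthetic IMEI-like values, purely for illustration.

```sql
-- Hypothetical table: unit_history(imei, sw_version, ship_date).
BEGIN
  FOR i IN 1 .. 100 LOOP
    INSERT INTO unit_history (imei, sw_version, ship_date)
    VALUES (LPAD(TO_CHAR(i), 15, '9'),      -- synthetic 15-digit IMEI-like key
            'R1A' || TO_CHAR(MOD(i, 5)),    -- rotate through a few SW versions
            SYSDATE - MOD(i, 30));          -- spread ship dates over a month
  END LOOP;
  COMMIT;
END;
/
```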

Project # 6 (Nov 2015 to Sep 2016)


Project Name : BPMS Workflow
Client : Standard Life
Team Size : 6
Environment : Oracle 11g (SQL, PL/SQL), UNIX, Windows

Description:
Standard Life is currently using an obsolete workflow system (AWD V2x from DST) which it needs to replace by the end of 2014. The aim of the project is to transition to a solution based on IBM's Business Process Management Suite (BPMS), including Business Process Manager (BPM), Business Activity Monitor (BAM), and Operational Decision Manager (ODM).

• Understanding the functional requirement documents and flow diagrams provided by the client.
• Preparing test scenarios and test scripts for the given requirements and uploading them into QC.
• Executing the test cases from Quality Center.
• Preparing and executing VB scripts from Quality Center.
• Involved in UI testing and performance testing.
• Raising defects (if any) in Quality Center.
• Assigning work and reviewing the scripts/test scripts of my team members.
• Interacting with the development team and the onsite team.
• Involved in team monitoring, status tracking, team meetings, and reporting.

Project # 7 (Feb 2013 to Oct 2015)

Project Name : Product Cluster
Client : KPN, NL
Role : Developer
Team Size : 4
Environment : Oracle 11g SQL, HTML, UNIX, Windows

Description: Product Cluster deals with the corporate product database, which holds all the information about the services and contracts provided by KPN. It is an application cluster that covers the following four applications.
PROMIS: the central corporate database and the primary part of the Product Cluster.
INFOBUS: primarily used for news information for external sales.
PINT: produces the daily, weekly, and monthly interface files.
TAROS: used by product managers to view the products and services offered by KPN.
Product Cluster gives relevant information about the various products in use. The data provided by PROMIS is required by many other sections and is used across KPN in areas such as sales, marketing, and billing.
PROMIS is a product database that stores the product and price information of all commercial products and services. It acts as an interface that connects one kind of product data to all the other systems of KPN. When a new product needs to be inserted into the system, or an existing product modified, the product manager provides this information to the poortwachter (gatekeeper) in the form of an AIF document, which contains all the necessary documentation related to the products. All data is entered by the business. PROMIS provides this information to PINT, an enterprise application that takes information from PROMIS and provides it to other applications such as INFOBUS and TAROS.
INFOBUS stands for Informatie Betrouwbaar Up-to-date en Snel (reliable, up-to-date, and fast information). Its purpose is to make related product information available on the basis of the products present in the PROMIS database: providing all KPN employees who are in contact with customers with product information and additional information about the products, sales promotions, and background. INFOBUS is an open application; it requires no authorization to access and works only within the KPN network. The business expects INFOBUS to be fast, current, and complete, with outages not acceptable. PINT is the Public INTerface and is intended to unlock data from PROMIS for target systems.

Technical Responsibilities:

• Accepted Vito tickets and worked on them.
• Involved in development, testing, and bug fixing.
• Wrote stored procedures according to business requirements.
• Wrote reconciliation queries to verify results.
• Involved in migration.
• Raised CMP and VT tickets for the migration.
• Proactively asked questions and offered input.
• Involved in the STAT tool implementation testing.

Project # 8 (Dec 2011 to Jan 2013)

Project Name : CITI FRS
Client : CITIGROUP
Role : Developer
Team Size : 7
Environment : Oracle 11g SQL, Oracle BI OLAP, AWM, UNIX, Windows
Description:

The Citigroup Financial Reporting System (FRS) implementation enables a seamless process from general-ledger transactions through to the consolidated Citigroup results. The goal of the Citigroup Reporting System initiative is to replace all internal financial reporting systems with new systems built around PeopleSoft Enterprise Performance Management (EPM).

The FRS is the definitive source of internal management data supporting the Management Committee and the Board of Directors. The client also chose to implement Oracle BI OLAP to report multidimensional facts from the data staged in PeopleSoft EPM as the source. Citi has now decided to upgrade from OLAP 10g to 11g. Citi has six different cubes, such as the Financial cube, the Management cube, and the OU cube, and runs two different jobs: the history build and the incremental build. The history build maintains data from January 2006 to September 2012.

Technical Responsibilities:

• Scheduled the jobs for the incremental and history builds.
• Wrote verification queries for all six cubes; these queries check whether each cube was built successfully (see the sketch after this list).
• Involved in development, testing, and bug fixing.
• Wrote stored procedures according to business requirements.
• Wrote reconciliation queries to verify results.
• Involved in migration.
• Raised CMP and VT tickets for the migration.
• Identified and resolved open issues.
• Proactively asked questions and offered input.
• Involved in the STAT tool implementation testing.
• Attended every project team status meeting, recognizing the importance of this project to the organization.
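A hedged sketch of the kind of cube-verification query mentioned above: Oracle OLAP 11g exposes cube data relationally through the CUBE_TABLE function, so a simple count and measure cross-check against the source fact table can confirm a build; the cube name (FRS.FINANCIAL_CUBE), fact table, and columns here are hypothetical.

```sql
-- Hypothetical cube FRS.FINANCIAL_CUBE; a non-zero count indicates
-- the cube was built and is queryable.
SELECT COUNT(*) AS cube_rows
FROM   TABLE(CUBE_TABLE('FRS.FINANCIAL_CUBE'));

-- Cross-check a measure total against the hypothetical relational source
-- over the history-build window (Jan 2006 to Sep 2012).
SELECT SUM(amount) AS source_total
FROM   fin_facts
WHERE  period_key BETWEEN 200601 AND 201209;
```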
