
Sri Rama Murthy Boggaram

[email protected] 224-522-3744

Profile
IT professional with around 15 years of experience in the architecture, analysis, design, development, testing, and
administration of applications across ETL, OneStream, and the Hyperion suite of technologies. Proven
success implementing all project lifecycle stages, from requirements gathering through
implementation, enhancing financial planning processes, and improving system efficiency.

EPM/CPM Expertise

Data Integration:

 Extensively experienced in extracting data from Oracle and Netezza databases and loading it into text
files and Hyperion tables by creating ODI packages/processes.
 Built Boomi jobs to pull Workday data reports and insert the data into SQL tables.
 Experience loading data into ASO applications using the Essbase Integration Services console.
 Created the Transformation Rules for Consolidation and planning cube that will help map data from
source systems to the OneStream XF application.

Data Management:

 Updated metadata such as cost centers, accounts, and projects using Oracle Data Relationship
Management (DRM) and ODI interfaces.
 Broad understanding of database calculations, including member formulas on dimensions,
calculation scripts, and ESSCMD.
 Created the connector type data source for Consolidation, Planning cube that will define the
layout, structure, and source of the data to be loaded into OneStream from EBS system.

Development work:

 Experienced in the design and maintenance of BSO applications and design considerations such as
storage options, member properties, dense/sparse dimensions, user management, and security.
 Experienced in dealing with Data Forms/Web Forms, Business Rules, Task lists and Smart lists.
 Good knowledge of writing business rules for planning applications and CALC scripts for Essbase
applications as per the requirements of end-user
 Extensive experience creating financial reports using Hyperion Financial Reporting Studio and
Web Analysis according to client requirements.
 Designed, developed and supported PBCS and EPBCS applications.
 Created cube views and form templates and assigned them to workflow steps for data analysis by
users as part of month-end activities in the OneStream XF application.
 Created data management steps to copy, clear, manage data between accounting, planning cubes and
responsible for loading supplemental data into OneStream Planning application.

Automation:

 Expertise in automation of data loads using MaxL batch scripts.


 Good knowledge of VB scripting to automate manual Excel file loads and report creation.

Monitoring and Security:

 Monitored all scheduled jobs in Autosys and Boomi, placing jobs on hold when required.
 Experience creating and maintaining application security, including groups and filters, in Hyperion
Shared Services.

Finance knowledge:

 Extensive experience performing monthly closes, budgets, reporting, and project forecasts.
Prepared monthly corporate invoices and streamlined data from several Excel files using lookups, pivot
charts, and tables.
 Expertise in financial concepts such as cash flow, general ledger, balance sheet, income statement, trial
balance, financial consolidation, report catalogs, and budget reports.

ETL Expertise

Data Integration:

 Extensively experienced in extracting data from Oracle and DB2 databases, loading it into .DAT and
XML files, and processing those records.
 Executed mainframe jobs to extract data and XCOM it to a UNIX server for processing by the ETL jobs.

Data Management:

 Configured fields in DML files based on new business requirements.

Development work:

 Created and Configured ODI Repositories, Interfaces, Packages, Procedures and Scenarios.
 Well versed with various Ab Initio components such as Round Robin, Join, Rollup, Partition by key,
gather, merge, interleave, Dedup sorted, Scan, Validate, FTP
 Expertise in Developing Transformations between Source and Target using Ab Initio.
 Worked with Oracle Data Integrator (ODI) as an additional ETL tool and created several packages,
interfaces, scenarios, variables, and load plans to load data from source to target.
 Expertise includes parallelism techniques and implemented Ab Initio Graphs using Data parallelism.
 Expertise in Ab Initio GDE components for creating, executing, testing, updating, and maintaining
graphs.

Automation:

 Created shell-scripting utilities to save time on daily/weekly jobs such as file and EME comparisons.

Monitoring and Security:

 Worked with Monitoring/scheduling tools like Control-M, Tivoli, Autosys


 Changed connection types to secure protocols when pushing or pulling files containing
sensitive/secured information to or from source and destination systems.

ETL Testing:

 Extensively experienced in creating test plans, test scenarios and test cases as part of
system/Integrations testing for the ETL processes in Ab Initio.
 Using Mainframe jobs, created test files for ETL process/code testing.
 Involved in preparing the Requirements Traceability Matrix (RTM).
 Reported defects to the development team through HP Quality Center.
 Tested ETL process of data extraction using ETL tool, executed graphs/packages and created data files.
 Thorough experience in unit testing, system integration testing, UAT, implementation, maintenance
and writing SQL queries and performance tuning.

General Engineering Skills:

 Assists in requirements gathering and creates system and user documentation


 Provides alternatives and recommendations based on best practices and application functionality
 Participates in code reviews and may perform code reviews for others
 Excellent analytical, research, and conceptual skills, along with demonstrated experience in
documentation and presentation.
 Strong knowledge of Data Warehousing concepts and Dimensional modeling like Star Schema and
Snowflake Schema.
 Demonstrated ability to adapt to changing tools and technologies per business needs is a key
strength.
 Possess excellent interpersonal, communication and analytical skills with demonstrated abilities in
customer relationship management.
 Well versed with multi-vendor, multi-sourcing models and onsite/offshore/nearshore models. Was
instrumental in guiding, educating, and motivating teams across geographies.

Technical Skills

EPM/CPM Tools: Essbase, PBCS, EPBCS, Shared Services, Planning, Hyperion Enterprise 6.5.0, DRM,
HFM, EIS and OneStream

ETL Tool: Ab Initio, ODI

Databases: Oracle 9i/10g/11g/12c

Languages: C++, SQL/PL SQL, Unix Shell Scripting

Reporting Tools: Tableau, Financial Reporting, Hyperion Smart View, Excel Add-in, Web Analysis Studio,
Narrative reporting

Testing Tools: Quality Center, Team Foundation Server 2013 (TFS), LoadRunner

Scheduling Tools: Tivoli, Autosys

Professional Experience

Project 1

IBAPS (Integrated Budget and Acquisition Planning System)


Role: Sr. OneStream/Hyperion Consultant/Lead Apr 2020 – Present
Client: FDA (Food and Drug Administration), Elkridge, Maryland

 Responsible for building Planning and Accounting hierarchies for the Accounts, Products, Departments,
Projects, and Entity dimensions using the metadata build template in OneStream.
 Created the connector type data source for Consolidation, Planning cube that will define the layout,
structure, and source of the data to be loaded into OneStream from Oracle system.
 Implemented a lease and lease-options calculator that dynamically derives the total budget for new
and existing store lease package reports.
 Architected the redesign of HFM in the OneStream application and deployed it successfully.
 Created the Cubes for Consolidation, Planning, and budgeting activities.
 Created the Transformation Rules for Consolidation and planning cube that will help map data from
source systems to the OneStream XF application.
 Responsible for loading lease options supplemental data into OneStream Planning application.
 Created workflow profiles for stores so that ZRAD (Zone, Region, Area, Division) users can input
budget and forecast numbers into the forms.
 Created data management steps to copy, clear, manage data between accounting and planning cubes.
 Troubleshoot user issues as part of the day-to-day Planning and reporting activities.
 Created data management steps to copy data between actual scenario to budget & forecast scenarios
in planning cube.
 Managed relationships with business users/stakeholders and served as an advocate for OneStream
support and other downstream teams
 Performed migration of both components and applications between Non-Prod & Prod environments.
 Performed historical data migrations from Hyperion to OneStream XF applications.
 Performed weekly backups and monthly data consolidations, and automated monthly/quarterly/annual
report generation.
 Built Job aids and maintain robust process documentation to train and mentor functional, technical
team members in gaining expertise on OneStream tools/technologies and methodologies.
 Created a simplified security setup for ZRAD (Zone, Region, Area, Division) users to enforce segregation
of duties during the month-end process.

Environment: OneStream, Hyperion Suite 11.2, Smart View, Jira.

Project 2

COGS (Cost of Goods Sold)


Role: Sr. Application Developer Jul 2018 – Mar 2020
Client: Starbucks Coffee Company, Seattle, Washington

Responsibilities:

 Actively participated in sprint planning meetings to gather the requirements and simplify them for
better product delivery
 Worked directly with the SCTC and Prod Cost teams to achieve successful outcomes in cross-functional
project deliverables and activities
 Using ODI, extracted data from various databases such as Oracle and GSIT and performed exploratory
data analysis to cleanse, massage, and aggregate the data for loading into the Hyperion Planning application.
 Created calculation scripts or business rules to calculate the actual and forecast data models.
 Extensively created ODI data integrations to transform and load expected receipts, PO actuals, Raw
materials Cost publish, finished goods order management.
 Prepared calculation scripts to calculate the periodic moving average cost (PMAC) for the USD and EUR
currencies.
 Established security models granting Prod Cost and SCTC users access to business rules and forms for
both the Raw Material and Finished Goods applications.
 Created and maintained application security, including groups, filters, and user setup for SCTC and Prod Cost
users.
 Migrated all artifacts from DEV to CERT during release process via LCM.
 Performed smoke testing after migration to production was complete.
 Present and explain highlights/results to Business on SME tech review and Sprint Review meetings
 Utilized JIRA to manage Epics, Features, PBIs, tasks and Microsoft Teams to capture all project
documentation.
 Participated in weekly meetings with product manager to groom backlog items
 Experienced in dealing with Data Forms/Web Forms, Business Rules, Task lists and Smart lists.
 Extensively worked with creating standard and Attribute dimensions in Planning applications.
 Created ODI packages to load Metadata to Hyperion planning applications via outline Load utility.
 Expertise in creating Load rules, Calculation scripts and export scripts and prepare MAXL scripts to load
them via .bat files or ODI scenarios.
 Performed post metadata- and data-load activities to ensure the application was ready for use by
Finance.

Environment: Hyperion Suite 11.1.2.4 (Workspace, Planning, Shared Services), ODI, DRM, Smart View,
Oracle 10g.

Project 3
GFH (Global Finance Hyperion)
Role: Sr. Hyperion Administrator/Developer Dec 2017 – Jun 2018
Client: Starbucks Coffee Company, Seattle, Washington

Responsibilities:

 Created and maintained Essbase application objects including database outline, Report Scripts, Calc
scripts, and data load scripts (rule files) and security objects.
 Developed & Scheduled batch scripts using Control-M to automate dimension build, data load and
backup of Essbase cubes.
 Troubleshot data issues from an end-to-end perspective (ETL – ODI).
 Troubleshot user issues as part of day-to-day reporting activities.
 Involved in redesigning the existing cube to improve the data load performance.
 Performed weekly backups and monthly data consolidations; automated monthly/quarterly/annual
report generation and developed Excel templates for the user interface to the Essbase database.
 Migrated Metadata, Security reports using Life cycle management (LCM) from Development to QA & to
Production.
 Performed optimization using the hourglass design technique during data loads and while calculating
the outline.
 Involved in Optimization & Performance tuning of Outline/Data load/Calculations.
 Set up users and groups according to business department and assigned security to users and groups.
 Communicated and interacted on a regular basis with the project manager and development team
during different stages of the project.
 Created financial reports such as Revenue report by product, budget vs. actual variance reports using
financial reporting studio.
 Maintained Business rules for allocation and budgeting in Hyperion Planning.
 Handled all sorts of user/system issues and troubleshot them, including resolving data discrepancies
between the ERP, planning, and reporting systems.
 Implemented multiple reporting functionalities on the ASO cube for the Capital Investment Tool, including
the Capital Re-investment Model, Asset Analysis Tool, and Renovation Portfolio Budget Tracking Report.
 Implemented a location-and-distance calculator to dynamically locate the five nearest stores for existing
store package reports.

Environment: Hyperion Suite 11.1.2.3 (Workspace, Planning, Shared Services, Web Analysis Studio,
FR), PBCS, EPBCS, ODI 11.1.1, DRM 11.1.2.2, Smart View, HPQC, Excel Add-in, Oracle 10g, Boomi,
Autosys, Jira.

Project 4

CIT (Capital Investment Tool)


Role: Hyperion Consultant Oct 2016 – Nov 2017
Client: Starbucks Coffee Company, Seattle, Washington

Starbucks is the world's leading purveyor of specialty coffee, with $8+ billion in revenue, 11,000+ stores,
and 90,000+ employees worldwide.
As part of the CIT (Capital Investment Tool) project, the Store Development and Store Development Finance
organizations manage complex data models containing critical Starbucks capital project information in
spreadsheets and Access databases. This information is key to establishing plans and monitoring the
performance of store capital projects, and the current process and tools are error-prone, time-consuming,
and not adequately scalable.
The CIT project delivers a scalable Hyperion cube that gives users access to data in one sustainable
application, solving for the current process and future reporting needs: productivity improvement, end-user
reporting, and improved data security.

Responsibilities:
 Gathered, analyzed, and documented business requirements to support the design and development of
BI Solutions for a Hyperion cube
 Translated business requirements into technology solutions using agile methodology
 Created and managed hierarchies in DRM per business requirements using action scripts
 Developed Versions, Hierarchies, Properties, Property Queries, Imports, Blends, Validations, Exports
and export flags
 Communicated with source teams to obtain the correct data sets for Hyperion data loads, and created
SQL tables, stored procedures, and views to handle them
 ODI development to extract, load data and metadata for Essbase and HFM from various Oracle data
sources.
 Worked with the ASO application by exporting data to text files through export scripts and loading it to
SQL tables; also created a stored procedure to pivot row data into columns and automated the process
with an ODI package.
 Have exposure to pulling Planning Smart List data to text files and loading it into ASO applications.
 Created a delete-Entity/Account ODI package to update metadata monthly in the ASO and BSO
applications.
 Worked with complex scenarios such as pushing data from BSO to ASO applications
 As part of the ASO application, created complex SQL views to load attribute dimensions
 Experienced in creating Planning web forms, which help not only Planning users but are also useful for
unit-testing purposes
 Actively involved in all meetings such as sprint planning, sprint review, sprint retrospective, risk
mitigation and provided valuable suggestion for Hyperion implementations.
 Created a SQL stored procedure to compare differences between DRM versions, which is helpful for
DRM migrations between environments and for testing.
 Clearly defined and communicated targeted functional solution via functional design specifications
 Verified technical designs satisfy functional requirements
 Highly proficient in OLAP cube development using Hyperion Essbase utilizing the Hyperion EPM Suite.
 Extensive analytic experience with financial budgeting and operational planning for large corporations.
 Expansive project management background and experience.
 Significant experience in translating customer needs into essential processes and systems.
 Strong expertise in Extraction, Transformation and Loading (ETL) of data using ODI, Data Stage and
SQL.
 Successful at multi-tasking and prioritization in the fast-paced environment of Enterprise Systems
Development.
 Present and explain highlights/results to Business on SME tech review and Sprint Review meetings
 Participated in weekly meetings with product manager to groom backlog items
 Utilized MS equivalent of TFS to manage Epics, Features, PBIs and tasks
 Utilized SharePoint as a hub for team storage of documents and communications
 Created PowerPoint presentations to share information with management team
 Developed a weekly status report that contained an executive summary.
 Proactively identified technical, procedural, logistical, and communication issues and drove creative
solutions to fix them

Environment: Hyperion Suite 11.1.2.3 (Workspace, Planning, Shared Services, Web Analysis Studio,
EIS, FR), ODI 11.1.1, DRM 11.1.2.2, Excel Add-in, Smart View, Oracle 10g.

Project 5

CAPA (Catalina Planning and Analysis)


Role: Sr. Hyperion Developer May 2016 – Sep 2016
Client: Catalina Marketing, St. Petersburg, Florida

Catalina is powered by the world’s largest shopper-history database, which helps leading CPG brands
influence behavior by personalizing the customer’s path to purchase. Catalina Marketing mainly deals with
partner and media print-level transformations as well as revenue, capital, salary, and finance data, pulling
data from SQL tables and applying transformations according to business needs.
Using ETL, we transform data related to Catalina contracts, bills, Workday adjustments, Lawson
adjustments, contract amounts, store totals, weekly counts, monthly counts, closed bill/offer counts, etc.
Input data comes from SQL and Netezza tables as well as the Workday system.

Responsibilities:

 Extracted data from Oracle and Netezza databases and loaded it into text files and Hyperion tables by
creating ODI packages/processes
 Creates ETL processes, including requirements, design, development, testing, implementation and
documentation
 Updates metadata such as cost centers, accounts, and projects, using Oracle Data Relationship
Manager (DRM) and ODI interfaces
 Experienced in dealing with Data Forms/Web Forms, Business Rules, Task lists and Smart lists.
 Develops, implements and maintains new Essbase and Planning applications under the direction of a
Systems Analyst
 Creates and maintains reports in Financial Reporting, BI Publisher, and other reporting tools
 Automated jobs using batch files and scheduled them via Autosys and Task Scheduler
 Creates/maintains application security, including groups, filters, and user setup
 Understands the business processes, how the application supports the business processes and applies
this knowledge to best solve problems
 Troubleshoots and supports existing applications, meeting all SLAs
 Provides timely response and resolution of emergency production questions, issues and defects
 Provides assistance to the business for annual budget and forecast processes
 Assists with system maintenance, including enhancements, patches, and optimizations
 Resolve all JIRA Tickets opened to Hyperion Team and track all the requests
 Present all Hyperion tickets and implementations in weekly status report meetings

Environment: Hyperion Suite 11.1.2.3 (Workspace, Planning, Shared Services, Web Analysis Studio,
EIS), ODI 11.1.1, DRM 11.1.2.2, Excel Add-in, Oracle 10g, Boomi, Autosys, Jira.

Project 6

PASS (Planning and Statistical System)


Role: Sr. Hyperion Developer/Administrator/Lead
Client: Allstate, Pune, India Jan 2012 – Sep 2013
Allstate, Northbrook, Illinois Oct 2013 – Apr 2016

The data flow starts with a monthly Actuals feed from SAP FDW into the Detail application
databases and the Main1 database of the CONSMAIN application. Once Actuals are loaded and calculated,
they are fed into each of the regional applications via an Essbase data partition definition. The
regional applications are where users go to enter their plans and forecasts via the web. Actuals are
loaded into the Main1 database of each of these applications for their appropriate intersection of data.

Once the users develop their plans and forecasts in these applications the data goes back to the
CONSMAIN application’s Main2 database from each of the regional applications’ Main2 database. This data
movement is also handled via Essbase data partitions.

There are around 250 users who use this application to review the Actuals and enter their plan and
forecast updates online through data/web forms and Excel. That data then gets
consolidated in the CONSMAIN application.

Responsibilities:
 Developed rule files to load metadata and data into Essbase once the data model was built and consistent;
also automated data loads into Essbase cubes, reducing the time required for loading
 Created partitions to copy portions of source data to target applications
 To accommodate the business needs, developed many calculation scripts with optimization
techniques.
 Responsible for creating Web Forms, Business rules, Smart list and task lists in Hyperion planning
and for automating jobs and processes using MAXL and task schedulers
 Responsible for migrating all the planning objects to QA and Production using Life cycle management
(LCM)
 Created reports in HFR and scheduled them on a weekly or monthly basis
 Directed the day-to-day maintenance of Hyperion Essbase and Planning applications as well as all
Hyperion activities related to monthly management reporting, forecasting, and the annual budget
process.
 Resolved day to day ITSM production tickets related to Hyperion applications and supported Hyperion
planning, Essbase, Smart View and Excel Add-in queries from end users
 Acted as a liaison between the IT and the finance team which involved developing business ideas and
modeling them into working ESSBASE cubes for design and development
 Developed and modified existing financial and marketing reports which include corporate
consolidation reports, and financial and planning reports for management.
 Made hierarchy changes (MMM/Planning Web) to add members to the Management and Planning outlines via
change requests.
 Effectively managed the offshore team and ensured timely deliverables.

Environment: Hyperion Essbase 11.2.x.x, Hyperion Planning 11.2.x.x, Shared Services, Hyperion
Smart View, Excel Add-in, Oracle 10g, Unix, Ab Initio.

Education

Bachelor’s in Computer Science


2005 – 2008

Master’s in Computer Applications


2010 – 2012
