K Rufus Samuel Thangaraj – SAP Datasphere Consultant
PROFILE
A lifelong learner and passionate Data Analytics professional and HANA XSA trainer, always striving to stay updated with state-of-the-art developments and trends in this area and related fields such as SAP Datasphere, HANA XSA/Native HANA, ABAP CDS, SAP BO and other SAP tools.
Overall 11+ years of IT experience, comprising 9+ years in Datasphere/HANA Analytics and around 3 years in SAP ABAP. Have worked with two renowned IT companies, one Big Four firm and one product organization to date. The client list includes some of the biggest brands, spanning domains such as Retail, Manufacturing, Media and Pharma. Have both onsite and offshore experience working with different teams across the globe, covering NA, EMEA and APAC.
WORK EXPERIENCE
SAP Datasphere
• Expertise in setting up HDI containers from scratch in HANA Web IDE and making the system available for developers and clients to work in.
• Strong knowledge in SQL programming.
• Strong knowledge in creating HANA calculation views (CVs) in the XSA tool and making the best use of the new XSA features.
• Have used new XSA features such as Minus, Intersect, Table Function nodes and Non-Equi Joins during development.
• Created flowgraphs to move data from HANA CVs to HANA tables for daily snapshot scenarios.
• Developed multiple flowgraphs to handle delta loads from Oracle into SAP HANA tables.
• Strong knowledge in using input parameters and variables based on business requirements, mainly using input parameters for effective data filtering.
• Expertise in improving the performance of calculation views using the performance analysis tool provided by HANA.
• Created multiple security roles and SQL analytic privileges to handle data security at the view level.
• Implemented more than 200 KPIs using HANA calculation views and exposed them to the SAC layer.
• Created table functions for scenarios that cannot be implemented in graphical calculation views (see the SQLScript sketch after this list).
• Used window functions extensively to implement complex business solutions.
• Extensively trained HANA developers and clients to help them adopt the tool at a faster pace; created training documents and materials to help them understand the tool better.
• Expertise in setting up containers for cross-container access.
• Strong hands-on experience in creating user-provided services for consuming non-HDI containers within an HDI container.
• Expert in configuring CI/CD pipelines in GitLab for the HANA XSA environment.
• Created pipelines to automate building and deploying HANA XSA code into the HANA DB and updating the corresponding branches.
• Extensively created DWF task chains to schedule multiple procedures and flowgraphs for automated job handling.
• Strong knowledge in working with temporary container concepts without frequently updating the master branch.
• Created synonyms extensively on top of database tables and consumed them in HANA calculation views.
• Expertise in working with files such as package.json and undeploy.json to deploy/undeploy objects from the HANA DB whenever required, and in handling errors effectively.
• Extensively created .hdbcds tables to meet business requirements.
• Debugged complex HANA procedures and resolved data issues effectively.
• Expertise in creating synonyms on top of HANA CVs, fully utilizing the cross-container feature of HANA.
• Have used GitHub to perform operations such as fetch, pull, merge, commit and stage.
• Used Azure as the cloud interface to create remote branches and generate pull requests to push changes to master.
• Provided KT to new team members on HANA XSA working models and created knowledge documents on HANA XSA and GitHub.
• Developed calculation views of both Cube and Dimension types.
• Strong knowledge of using Git history to compare changes between two object versions.
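For illustration, a minimal SQLScript sketch of the table-function pattern referenced above, combining an input parameter for data filtering with a window function; the function, table and column names (TF_LATEST_ORDER_PER_CUSTOMER, SALES_ORDERS, etc.) are hypothetical examples, not actual client objects.

    -- Illustrative .hdbfunction-style definition; all object names are examples.
    -- Returns the latest order per customer on or before the given date, so the
    -- result can be consumed as a data source inside a calculation view.
    FUNCTION "TF_LATEST_ORDER_PER_CUSTOMER" (IN p_as_of_date DATE)
        RETURNS TABLE (CUSTOMER_ID NVARCHAR(10), ORDER_ID NVARCHAR(10),
                       ORDER_DATE DATE, NET_AMOUNT DECIMAL(15, 2))
        LANGUAGE SQLSCRIPT SQL SECURITY INVOKER AS
    BEGIN
        RETURN
            SELECT CUSTOMER_ID, ORDER_ID, ORDER_DATE, NET_AMOUNT
            FROM (
                SELECT CUSTOMER_ID, ORDER_ID, ORDER_DATE, NET_AMOUNT,
                       ROW_NUMBER() OVER (PARTITION BY CUSTOMER_ID
                                          ORDER BY ORDER_DATE DESC) AS RN
                FROM "SALES_ORDERS"                 -- hypothetical source table
                WHERE ORDER_DATE <= :p_as_of_date   -- input parameter drives the filter
            ) AS T
            WHERE RN = 1;
    END;

Such a function can be called directly, for example SELECT * FROM "TF_LATEST_ORDER_PER_CUSTOMER"('2024-06-30'), or wrapped in a calculation view whose input parameter is mapped to p_as_of_date.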
SAP Fiori/UI5
• Developed OData services for the calculation views and exposed them to front-end applications such as Fiori/UI5.
• Implemented multiple list reports with different levels of selections and input parameters to give fine-grained control over the report from the Fiori end.
• Strong knowledge in working with annotations, manifest.json, controllers and fragment files based on Fiori requirements.
• Extensively created donut, column and line charts on overview pages and implemented navigation to the corresponding list reports.
Developed a HANA compliance tool that can be used by all EY HANA clients:
• As part of the Data & Analytics COE team, developed a compliance tool for EY.
• The tool analyzes whether HANA views follow SAP HANA best practices.
• Created a table function to find the total number of joins, data sources, table functions, calculated columns, formulas, filters, unions, rank nodes and invalid column labels used in any HANA view.
• Created a table function to scan a HANA view completely, with checks such as: is it a scripted, analytic or attribute view; is a full outer join used; is cardinality set; is a star join recommended; is any classical analytic privilege used; are filters applied on the data sources; are all columns projected; and what is the risk category.
• Created a table function to find all HANA views that do not have a dedicated value-help view for their input parameters or variables.
• Developed a table function that decides whether an object can be transported to QA and raises possible exceptions if the object has already been moved to other environments.
• Created a table function that acts as a where-used list for HANA models, helping developers check whether an object is currently being worked on by someone else (a sketch follows this list).
• Created a table function to identify which objects have and have not been released to other environments.
• Compared models between the DEV and PRD systems to understand how many are in or out of sync with the PRD environment.
• Created a table function to identify the total number of associations used, the number of source DDLs and tables used, dependent objects, and the lists of header and field-level annotations.
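As a rough illustration of the where-used idea mentioned above, a minimal SQLScript sketch built on HANA's SYS.OBJECT_DEPENDENCIES monitoring view; the function name and returned column list are examples, and the actual compliance tool gathers additional metadata beyond this.

    -- Illustrative where-used lookup; function name and output columns are examples.
    -- SYS.OBJECT_DEPENDENCIES lists the runtime objects that depend on a base object.
    FUNCTION "TF_WHERE_USED" (IN p_object_name NVARCHAR(256))
        RETURNS TABLE (DEPENDENT_SCHEMA NVARCHAR(256), DEPENDENT_OBJECT NVARCHAR(256),
                       DEPENDENT_TYPE NVARCHAR(256))
        LANGUAGE SQLSCRIPT SQL SECURITY INVOKER AS
    BEGIN
        RETURN
            SELECT DEPENDENT_SCHEMA_NAME AS DEPENDENT_SCHEMA,
                   DEPENDENT_OBJECT_NAME AS DEPENDENT_OBJECT,
                   DEPENDENT_OBJECT_TYPE AS DEPENDENT_TYPE
            FROM "SYS"."OBJECT_DEPENDENCIES"
            WHERE BASE_OBJECT_NAME = :p_object_name;
    END;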
Snowflake
• Developed views in Snowflake consuming data from different data sources.
• Created tables and consumed them in views to meet business requirements.
• Created reusable views using Snowflake INFORMATION_SCHEMA tables that help developers track development progress and better understand the views available (a sketch follows this list).
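For example, a minimal sketch of such a metadata view over Snowflake's standard INFORMATION_SCHEMA.VIEWS; the target schema and view name (DEV_TRACKING.VW_VIEW_INVENTORY) are hypothetical.

    -- Illustrative development-tracking view; schema and view names are examples.
    -- Lists user-defined views with owner and last-altered timestamp so the team
    -- can see what exists and what changed recently.
    CREATE OR REPLACE VIEW DEV_TRACKING.VW_VIEW_INVENTORY AS
    SELECT TABLE_SCHEMA,
           TABLE_NAME   AS VIEW_NAME,
           TABLE_OWNER,
           CREATED,
           LAST_ALTERED
    FROM   INFORMATION_SCHEMA.VIEWS
    WHERE  TABLE_SCHEMA <> 'INFORMATION_SCHEMA';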
HANA SDA
• Created HANA SDA connections to non-SAP systems such as Oracle and MS SQL Server databases.
• Developed multiple virtual tables using MS SQL Server and Oracle database tables as sources (see the sketch after this list).
• Created procedures to move data from the virtual tables to SAP HANA tables for staging scenarios.
• Developed calculation views on top of these tables to fulfil the business reporting requirements.
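A minimal sketch of this virtual-table plus staging pattern, assuming an SDA remote source named MSSQL_SRC already exists and a staging table CUSTOMER_STG has been created; the remote database, schema and table names are hypothetical examples.

    -- Illustrative SDA objects; remote source, schema and table names are examples.
    -- The virtual table points at a remote MS SQL Server table through the existing
    -- remote source; the procedure refreshes a local staging table from it.
    CREATE VIRTUAL TABLE "STAGING"."VT_CUSTOMER"
        AT "MSSQL_SRC"."SALESDB"."dbo"."CUSTOMER";

    CREATE PROCEDURE "STAGING"."LOAD_CUSTOMER_STAGING"
        LANGUAGE SQLSCRIPT SQL SECURITY INVOKER AS
    BEGIN
        -- Full refresh of the staging table from the virtual (remote) table.
        DELETE FROM "STAGING"."CUSTOMER_STG";
        INSERT INTO "STAGING"."CUSTOMER_STG"
            SELECT * FROM "STAGING"."VT_CUSTOMER";
    END;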
Reporting Experience
• Developed stories in TIBCO Spotfire and have strong knowledge in creating Spotfire dashboards using HANA views as the data source.
• Implemented row-level security in Spotfire dashboards to handle data segregation.
• Implemented data segregation in Spotfire using both the join and relationship methods and compared the performance of these methods.
• Extensively worked with the Analysis for Office tool, consuming HANA CVs and creating analytical models for clients.
• Strong knowledge of Analysis for Office; provided a KT session to Walt Disney stakeholders.
• Developed dashboards in Power BI for OTC and Finance modules.
• Knowledge in creating dashboards and stories in Tableau.
ABAP Experience
• Strong knowledge in creating Open ODS views, Composite Providers and BEx queries.
• Created composite providers to consume multiple HANA CVs and ADSOs based on business requirements.
• Created BW queries on top of composite providers and exposed them to BOBJ.
LSMW
• Created LSMW tools using all kinds of methods: recording, IDoc and direct input.
• Created multiple LSMW tools in CRM Sales and Marketing for the team with minimal KT.
• Knowledge in debugging LSMW tools, incorporating additional logic based on user requirements, and extracting data from both SAP ECC and CRM systems.
• In CRM, worked on the One Order model, IBase, IObjects, Business Partners, Service Contracts, Service Orders, Leads, Opportunities, Service Confirmations and Business Partner updates.
• Created multiple extraction tools in both ECC and CRM systems using SQVI.
• Developed multiple reusable tools in CRM systems for user benefit and received appreciation from senior client stakeholders.
• Created an extraction tool for Service Contract, Service Order, Service Confirmation, IBase, IObjects, BP Account and Contact Person extracts.
Onsite Exposure
• Traveled to countries such as Germany, Poland and Switzerland, interacting directly with business stakeholders and understanding the customers' global reporting requirements.
• Skilled in understanding business language, converting it into technical terms and implementing it.
• Have presented the business use case of Enterprise HANA and explained how it solves complex business scenarios.
EMPLOYMENT HISTORY
Current Employer: ITC Infotech, Bangalore
Role: SAP Datasphere Consultant (Contract)
Duration: Mar 2024 – Present
1. Tropicana Brands
Previous Employer: Vaspp Technologies, Bangalore
Role: Senior HANA & Datasphere Consultant (Contract)
Duration: June 2023 – Mar 2024
1. Cisco US (Software)
LANGUAGES
• English
• Tamil
INTERESTS
• Playing Keyboard/Guitar
• Badminton
Declaration
I, K Rufus Samuel Thangaraj, hereby declare that the information contained here is true and correct to the best of my
knowledge and belief.
__________________________ Date:
K Rufus Samuel Thangaraj    Chennai, Tamil Nadu