
K Rufus Samuel Thangaraj (Immediate Joiner)

SAP Datasphere/DWC/Native HANA


Phone: +91 99629 51959
Email: [email protected]
LinkedIn: https://www.linkedin.com/in/rufus-samuel-7728882a/

PROFILE
A lifelong learner and passionate professional in Data Analytics, and a HANA XSA trainer. Always striving to stay
updated with state-of-the-art developments and trends in this area and related fields such as SAP Datasphere,
HANA XSA/Native HANA, ABAP CDS, SAP BO and other SAP tools.
Overall 11+ years of IT experience, comprising 9+ years in Datasphere/HANA Analytics and around 3 years in
SAP ABAP. Worked with two renowned IT companies, one Big-4 firm and one product organization to date. The
client list includes some of the biggest brands, spanning domains such as Retail, Manufacturing, Media and
Pharma. Has both onsite and offshore experience, working with teams across the globe in NA, EMEA and APAC.

SKILLS

Advanced: SAP Datasphere/DWH, SAP HANA XSA & XSC, HANA SQL Script, HANA XSJS Services, SAP HANA Security,
SDI Flowgraphs, Analysis for Office, TIBCO Spotfire, S4 Cloud CDS Views, SAP CRM Sales and Service,
SAP MM/SD Functional

Intermediate: SAP ABAP CDS, SAP PaPM, HANA SDA/SDQ, Power BI, SAP BO/Webi, LSMW, Snowflake,
SAP BW4 HANA Modeling, SAP Fiori Elements, SAP SLT

Novice: SAP DI, HANA Embedded Analytics, SAP Analytics Cloud, Python

WORK EXPERIENCE

SAP Datasphere

• Strong working knowledge of Data Modeling & Data Transformation in Datasphere.


• Extensively created multiple remote tables with semantic usage types (Relational Dataset, Fact, Dimension,
Text, Hierarchy).
• Created remote tables to consume data from S4 using CDS views and S4 tables.
• Implemented dynamic filters in remote tables to optimize data retrieval from S4 CDS views and S4 tables,
ensuring efficient and targeted data access.
• Interacted with stakeholders and analyzed business requirements to decide which data should be stored in Datasphere.
• Created complex data models using Graphical Views (Relational Dataset/Fact/Dimension) based on the
functional design document, leveraging S4 as a data source.
• Developed SQL views on standard tables to monitor scheduled remote tables and remote queries, and created
task chains that email the architect about any data-loading errors.
• Created functional & technical design documents for Datasphere requirements.
• Created SQL views to handle complex business logic; strong hands-on SQL skills.
• Developed multiple Analytical Models to make the reporting available for SAC & Power BI.
• Developed Task chains to handle the Dataflow updates in a sequential order and for better monitoring.
• Created multiple data flows to move data from MS SQL tables to persistent tables in Datasphere,
incorporating diverse data-cleansing logic.
• Developed procedures in the Database Explorer of the Datasphere space to convert ABAP program logic into
persistent tables, facilitating streamlined reporting processes.
• Adept at creating and managing Task chains for efficient data loading and monitoring using DWF Task chain
Monitor.
• Involved in meticulous unit testing of dataflows and conducting thorough data reconciliations with source
systems to guarantee data accuracy.
• Knowledge in setting up connections between Non-SAP sources & Datasphere.
• Consumed TPM data using Azure as a source and used replication flows to move the data into Datasphere.
• Developed replication flows for delta-enabled CDS views to move data into Datasphere.
• Experience in creating Models and Dimensions in SAC.
• Proven track record in devising effective approaches for data analysis and reporting to drive informed
decision-making.
• Transporting Datasphere artifacts from one landscape to another using Import/Export folder creation.
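For illustration, the monitoring views described above follow a pattern like the sketch below. All schema,
table and column names are placeholders, not the actual Datasphere monitoring objects:

```sql
-- Illustrative sketch only: a view flagging failed remote-table loads.
-- "MONITORING"."REMOTE_TABLE_STATUS" and its columns are assumed names.
CREATE VIEW "V_REMOTE_TABLE_ERRORS" AS
SELECT
    rt.TABLE_NAME,
    rt.LAST_REFRESH_TIME,
    rt.STATUS
FROM "MONITORING"."REMOTE_TABLE_STATUS" AS rt
WHERE rt.STATUS = 'ERROR'                                  -- only failed loads
  AND rt.LAST_REFRESH_TIME >= ADD_DAYS(CURRENT_DATE, -1);  -- last 24 hours
```

A task chain can then poll such a view and trigger the error notification described above.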

HANA XSA Experience

• Expertise in setting up containers from scratch in HANA Web IDE and making the system available for
developers/clients to work in.
• Strong knowledge in SQL programming.
• Strong knowledge in creating HANA CVs in the XSA tool and making the best use of the new XSA features.
• Have used new XSA features like Minus/Intersect/Table Function Node/Non-Equi Join during the
developments.
• Created Flowgraphs to move the data from the HANA CV to HANA tables for daily snapshot scenarios.
• Developed multiple Flowgraphs to handle the delta load from Oracle to SAP HANA tables.
• Strong knowledge in using input parameters and variables based on business requirements, mainly using
input parameters for effective data filtering.
• Expertise in improving the performance of calculation views based on performance analysis tool provided
by HANA.
• Created Multiple Security Roles & SQL Analytical Privileges to handle data security from the view level.
• Implemented more than 200 KPIs using HANA calculation views and exposed them to the SAC layer.
• Created table functions for scenarios that cannot be implemented in calculation views.
• Used window functions extensively to implement complex business solutions.
• Extensively trained HANA developers and the client, helping them adopt the tool at a faster pace. Created
training documents and materials to help them understand the tool better.
• Expertise in setting up containers for cross-container access.
• Strong hands-on experience in creating user-provided services for consuming non-HDI containers within HDI containers.
• Expert on configuring CI/CD Pipelines in Gitlab for HANA XSA environment.
• Created pipelines to automate building and deploying HANA XSA code into the HANA DB and updating the
corresponding branches.
• Extensively created DWF Task chains to schedule multiple procedures and flowgraphs for automatic job
handling.
• Strong knowledge of working with temporary containers without frequently updating the master branch.
• Extensively created synonyms on top of database tables and consumed them in HANA calculation views.
• Expertise in working with files like package.json/undeploy.json to deploy or undeploy objects from the
HANA DB whenever required, handling errors effectively.
• Extensively created multiple .HDBCDS tables for the business requirement.
• Debugging complex HANA Procedures and resolving data issues effectively.
• Expertise in creating synonyms on top of HANA CV’s by fully utilizing the Cross-Container feature of HANA.
• Used GitHub to perform operations like fetch/pull/merge/commit/stage.
• Used Azure as the cloud interface to create remote branches and generate pull requests to push changes to master.
• Provided KT to new team members on HANA XSA working models and created knowledge documents on HANA XSA
and GitHub.
• Developed calculation views of both Cube and Dimension types.
• Strong knowledge of Git history for comparing changes between two object versions.
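The table functions with window functions mentioned above typically follow this SQLScript pattern. This is a
minimal sketch; the function name, table name and columns are illustrative assumptions:

```sql
-- Illustrative sketch of a SQLScript table function using a window function
-- to keep only the latest snapshot row per document. All names are placeholders.
CREATE FUNCTION "TF_LATEST_SNAPSHOT" ()
RETURNS TABLE (DOC_ID NVARCHAR(10), SNAPSHOT_DATE DATE, AMOUNT DECIMAL(15,2))
LANGUAGE SQLSCRIPT SQL SECURITY INVOKER AS
BEGIN
    RETURN
    SELECT DOC_ID, SNAPSHOT_DATE, AMOUNT
    FROM (
        SELECT DOC_ID, SNAPSHOT_DATE, AMOUNT,
               ROW_NUMBER() OVER (PARTITION BY DOC_ID
                                  ORDER BY SNAPSHOT_DATE DESC) AS RN
        FROM "SALES_SNAPSHOTS"
    ) AS t
    WHERE t.RN = 1;   -- keep only the latest snapshot per document
END;
```

Such a function can then be consumed as a node inside an XSA calculation view.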

HANA XSC Experience


• Extensively developed multiple calculation views for different SAP modules like BRIM, SD, CRM, MM, FICO, EWM.
• Created table functions whenever needed; strong knowledge of SQL functions like window, date and string
functions, and regular expressions.
• Developed stored procedures to handle the delta mechanism into HANA tables.
• Developed dynamic stored procedures to handle FULL/DELETE loads of persistent tables.
• Created XSJS jobs and XSJS scripts to schedule the procedures automatically.
• Worked on HANA performance tuning of calculation views, reducing memory consumption and increasing the
execution speed of the CVs.
• Extensively worked on all types of joins: inner, outer, left, dynamic and referential.
• Created calculated columns whenever required, using both the column and SQL engines.
• Created Python scripts to call HANA procedures.
• Extensively used PlanViz and Explain Plan to analyze performance bottlenecks in HANA CVs.
• Showcased to the client how to implement row-level security without using analytical privileges.
• Created dynamic analytic privileges to segregate data based on the user login.
• Created procedures to handle the dynamic analytic privileges for row-level security.
• Developed HDBDD tables for business needs and used procedures to import data to those tables.
• Strong knowledge in using HANA system views to automate daily activities and track which development
objects are released and which are pending release.
• Expertise in debugging HANA models and procedures and providing permanent fixes to the business.
• Good knowledge in the SLT area.
• Proficient in using HANA Cockpit, monitoring it on a daily basis and fixing OOM dumps.
• Extensively worked on transporting HANA models to QA using HALM and the Import/Export method.
• Created delivery units and mapped them to HANA packages/schemas.
• Knowledge of HANA sentiment analysis & text search features.
• Good knowledge in using ranks, unions, joins, aggregations & projections.
• Extensively used variables and input parameters in HANA views to filter data effectively and increase
view performance.
• Strong knowledge in SQL and in converting complex logic to table functions/procedures.
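The dynamic analytic privileges above rely on a filter-generating procedure. A minimal sketch of that
pattern follows; the procedure name and the user-to-region mapping table are assumptions, not client objects:

```sql
-- Illustrative sketch of the procedure pattern behind a dynamic analytic
-- privilege: it returns a filter string derived from the logged-in user.
-- "SECURITY"."USER_REGION_MAP" is an assumed mapping table.
CREATE PROCEDURE "SP_REGION_FILTER" (OUT OUT_FILTER NVARCHAR(5000))
LANGUAGE SQLSCRIPT SQL SECURITY DEFINER READS SQL DATA AS
BEGIN
    DECLARE v_regions NVARCHAR(4000);
    -- Collect the regions assigned to the current session user
    SELECT STRING_AGG('''' || REGION || '''', ',')
      INTO v_regions
      FROM "SECURITY"."USER_REGION_MAP"
     WHERE USER_NAME = SESSION_USER;
    -- Users with no assignment see nothing rather than everything
    OUT_FILTER := 'REGION IN (' || COALESCE(v_regions, '''NONE''') || ')';
END;
```

HANA applies the returned condition as a row-level filter whenever a user queries a view secured by the privilege.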

SAP Fiori/UI5
• Developed OData services for the calculation views and exposed them to front-end applications like
Fiori/UI5.
• Implemented multiple list reports with different levels of selections and input parameters, giving
fine-grained control over the report from the Fiori end.
• Strong knowledge in working with annotations, manifest.json, controllers and fragment files based on Fiori
requirements.
• Extensively created donut, column and line charts on the Overview page and implemented navigation to the
corresponding list reports.
Developed a HANA compliance tool usable by all EY HANA clients:

• As part of the Data & Analytics COE team, developed a compliance tool for EY.
• The tool analyzes whether HANA views follow SAP HANA best practices.
• Created a table function to find the total number of joins, data sources, table functions, calculated
columns, formulas, filters, unions, RANK nodes and invalid column labels used in any HANA view.
• Created a table function to scan a HANA view completely, implementing checks such as: is it a
scripted/analytical/attribute view, is a full outer join used, is cardinality set, is a star join
recommended, is any classical analytic privilege used, are filters applied on the data sources, are all
columns projected, and the resulting risk category.
• Created a table function to find all HANA views that do not have a dedicated value-help view for input
parameters or variables.
• Developed a table function that decides whether an object can be transported to QA and raises possible
exceptions if the object has already been moved to other environments.
• Created a table function that acts as a where-used list for HANA models, helping developers see whether an
object is currently being worked on by someone.
• Created a table function to identify which objects are released and not released to other environments.
• Compared models between the DEV & PRD systems to determine how many are in sync or out of sync with the
PRD environment.
• Created a table function to identify the total number of associations used, number of source DDLs and
tables used, dependent objects, list of header annotations and list of field-level annotations.
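The scanning technique above can be sketched as a query over the HANA repository, counting constructs in a
calculation view's stored XML. _SYS_REPO.ACTIVE_OBJECT is a real repository view; the specific patterns
counted here are simplified illustrations of the checks, not the tool's actual rules:

```sql
-- Illustrative sketch: count joins/unions in each calculation view's XML.
-- Note: TO_NVARCHAR may truncate very large view definitions; a production
-- scan would need chunked LOB handling.
SELECT
    PACKAGE_ID,
    OBJECT_NAME,
    OCCURRENCES_REGEXPR('<join '  IN TO_NVARCHAR(CDATA)) AS JOIN_COUNT,
    OCCURRENCES_REGEXPR('<union'  IN TO_NVARCHAR(CDATA)) AS UNION_COUNT,
    CASE WHEN TO_NVARCHAR(CDATA) LIKE '%fullOuter%'
         THEN 'X' ELSE '' END                            AS HAS_FULL_OUTER
FROM "_SYS_REPO"."ACTIVE_OBJECT"
WHERE OBJECT_SUFFIX = 'calculationview';
```

A risk category can then be derived from these counts against the best-practice thresholds.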
Snowflake

• Developed Views in Snowflake by consuming the data from different data sources.
• Created tables and consumed these tables in the Views to meet the business requirement.
• Created reusable views using the Snowflake Information Schema tables, helping developers track development
progress and gain a better understanding of the views available.
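A tracking view of that kind can be sketched as below; INFORMATION_SCHEMA.VIEWS and its columns are standard
Snowflake objects, while the target schema and view name are placeholders:

```sql
-- Illustrative sketch: inventory of views per schema with change timestamps,
-- built on Snowflake's INFORMATION_SCHEMA. DEV_TRACKING is an assumed schema.
CREATE OR REPLACE VIEW DEV_TRACKING.VIEW_INVENTORY AS
SELECT
    TABLE_SCHEMA,
    TABLE_NAME,
    CREATED,        -- when the view was created
    LAST_ALTERED    -- when it was last changed
FROM INFORMATION_SCHEMA.VIEWS
WHERE TABLE_SCHEMA <> 'INFORMATION_SCHEMA';
```

Developers can sort this by LAST_ALTERED to see where development is currently active.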
HANA SDA
• Created HANA SDA connection with Non-SAP systems like Oracle and MS-SQL databases.
• Developed multiple virtual tables by using MS-SQL and Oracle database tables as source.
• Created procedures to move the data from Virtual tables to SAP HANA tables for staging scenarios.
• Developed calculation views on top of these tables to fulfill the business's reporting requirements.
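The SDA setup described above follows this general DDL pattern; host, credentials, the ODBC driver name and
all object names are placeholders, and the configuration string is indicative only:

```sql
-- Illustrative sketch: remote source to MS SQL Server, a virtual table over
-- it, and a staging copy into a persistent HANA table. All names/values are
-- placeholders.
CREATE REMOTE SOURCE "MSSQL_SRC" ADAPTER "mssql"
    CONFIGURATION 'Driver=libmsodbcsql.so;ServerNode=mssqlhost:1433'
    WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=sda_user;password=********';

-- Virtual table pointing at the remote MS SQL table
CREATE VIRTUAL TABLE "STAGING"."VT_ORDERS"
    AT "MSSQL_SRC"."SALESDB"."dbo"."ORDERS";

-- Staging move from the virtual table into a persistent HANA table
INSERT INTO "STAGING"."ORDERS_COPY"
SELECT * FROM "STAGING"."VT_ORDERS";
```

Calculation views are then built on the persistent staging table rather than hitting the remote system on every query.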

Reporting Experience

• Developed stories in TIBCO Spotfire; strong knowledge in creating dashboards in TIBCO Spotfire using HANA
views as the data source.
• Implemented row-level security in Spotfire dashboards to handle data segregation.
• Implemented data segregation in Spotfire using the Join and Relationship methods and compared the
performance of the two approaches.
• Extensively worked with the Analysis for Office tool, consuming HANA CVs and creating analytics models
for clients.
• Strong knowledge of Analysis for Office; provided a KT session to Walt Disney stakeholders.
• Developed dashboards in Power BI for OTC and Finance modules.
• Knowledge in creating dashboards and stories in Tableau.

ABAP Experience

• Developed ABAP CDS views using associations and joins.


• Created multiple CDS views in S4 HANA cloud.
• Have worked on different types of CDS views like Basic, Consumption and Composite Layers.
• Created CDS views of Analytical Query and exposed it to Analysis for office.
• Used different levels of header and field annotations to enhance the power of ABAP CDS.
• Created CDS views of OData type and exposed them to Fiori.
• Knowledge in creating AMDP classes and methods.
• Strong knowledge in debugging Standard and Custom programs.
• Experience in preparing technical design document based on the functional specifications.
• Attended walkthroughs with the functional team as well as the onsite team during project phases; in-depth
knowledge of Data Dictionary concepts.
• Experience in working with TMG and using the required events where applicable.
• Created a custom BAdI for a sales order and implemented the same.
• Created a new custom ALV report for the sales team using OO ABAP concepts.
• Extensively worked on CRs in reports (both classical and interactive).
• Knowledge in fixing issues in TR movement and handling transport issues.

BW4 HANA Experience

• Strong knowledge in creating Open ODS views, Composite Providers and Bex Queries.
• Created Composite provider to consume multiple HANA CV’s and ADSO based on the business requirement.
• Created BW queries on top of Composite providers and exposed to BOBJ.
LSMW

• Created LSMW tools with all kinds of methods: Recording, IDoc and Direct Input.
• Created multiple LSMW tools in CRM Sales and Marketing for the team with minimal KT.
• Knowledge in debugging LSMW tools, incorporating additional logic based on user requirements, and
extracting data from SAP ECC as well as CRM systems.
• In CRM, worked on the One Order Model, IBase, IObjects, Business Partner, Service Contract, Service Order,
Leads, Opportunity, Service Confirmations and Business Partner updates.
• Created multiple extraction tools in both ECC and CRM systems using SQVI.
• Developed multiple reusable tools in CRM systems for user benefit and received appreciation from
high-level clients.
• Created an extraction tool for Service Contract, Service Order, Service Confirmation, IBase, IObjects, BP
Account and Contact Person extracts.
Onsite Exposure
• Traveled to Germany, Poland and Switzerland, interacting directly with business stakeholders to understand
the customers' global reporting requirements.
• Skilled in understanding the business language, converting it into technical terms and implementing it.
• Presented the business use case of Enterprise HANA and explained how it solves complex business scenarios.

EMPLOYMENT HISTORY
Current Employer: ITC Infotech, Bangalore
Role: SAP Datasphere Consultant (Contract)
Duration: Mar 2024 – Till Date

1. Tropicana Brands
Previous Employer: Vaspp Technologies, Bangalore
Role: Senior HANA & Datasphere Consultant (Contract)
Duration: June 2023 – Mar 2024

1. Karl Storz US (Medical Equipment)


2. Rich Products

Previous Employer: Zensar Technologies, Chennai


Role: Senior Technical Analyst
Duration: Jan 2023 – June 2023

1. Cisco US (Software)

Previous Employer: Ernst and Young, Chennai


Role: Senior HANA Lead
Duration: Nov 2019 – Nov 2022
Clients Worked:

2. Carrier Global US (Appliances)
3. Merck and Organon US (Pharma)
4. Walt Disney International US (Media)
5. PepsiCo US (FMCG)
6. Booking.com UK (Travel)
7. Karl Storz US (Medical Equipment)

Previous Employer: KLA - US, Chennai


Role: Lead HANA Modeler
Duration: Feb 2019 – Nov 2019

Previous Employer: Mettler Toledo, Mumbai (US/EMEA)


Role: SAP HANA Technical Analyst
Duration: June 2017 – Jan 2019

Previous Employer: Accenture, Chennai


Role: Application Developer
Duration: April 2015 – May 2017

Previous Employer: Wipro Technologies, Chennai


Project: British American Tobacco (BAT) APAC
Role: SAP ABAP Developer
Duration: Nov 2012 – April 2015

CERTIFICATIONS

• EY Analytics Data Architecture – Bronze Badge [EY]


• EY Analytics Data Visualization – Bronze Badge [EY]
• SAP HANA Query Optimization [Open SAP]
• The Complete SQL Bootcamp 2020 [Udemy]
• Oracle SQL – SQL Developer from Scratch [Udemy]
• SAP HANA SQL Script Part 1 [Udemy]
• SAP HANA SQL Script Part 2 [Udemy]
EDUCATION

• Bachelor of Information Technology, Panimalar Engineering College (Anna University affiliated).
Graduated May 2012, 72%

• H.S.C. / Std 12th, Daniel Thomas Matric Hr Sec School (State Board). Graduated June 2007, 78%

• S.S.C. / Std 10th, Daniel Thomas Matric Hr Sec School (State Board). Graduated March 2005, 70%

LANGUAGES

• English
• Tamil

INTERESTS

• Playing Keyboard/Guitar
• Badminton

PERSONAL DETAILS

Father's Name: T Kanagaraj
Birthday: 07-March-1991
Gender: Male
Marital Status: Married
Nationality: Indian
Email: [email protected]
Present Address: K Block No 303, S&P Living Space, Kil Ayanambakkam, Chennai - 600095

Declaration
I, K Rufus Samuel Thangaraj, hereby declare that the information contained here is true and correct to the best of my
knowledge and belief.
__________________________ Date:
K Rufus Samuel Thangaraj Chennai, Tamil Nadu
