
SAP DATASPHERE

Beginner Tutorial

Date of Creation: 2024-05-06 © Jürgen Noe Consulting GmbH



Table of Contents

1 Foreword
2 Prerequisites
3 Business Case
4 Step-by-step Solution
4.1 Create a Space in DataSphere
4.2 Create Scoped Roles
4.3 Assign Users
5 Prepare model and data
5.1 Data Builder
5.2 Create ERM
5.3 Load data into tables
6 Build the analytics
6.1 Create graphical view
6.2 Create Fact
6.3 Create Analytic Model
7 Consuming the Analytic Model in SAC
7.1 Create a story in SAC
7.2 Add Diagram using DataSphere Analytic Model

1 Foreword
This document is an update of the original SAP Datasphere Content Tutorial and will be updated as new features become available. You can find the original SAP tutorial here:

datasphere-content/Sample_Bikes_Sales_content/SAP_Datasphere_Content_Tutorial.pdf at main · SAP-samples/datasphere-content (github.com)

2 Prerequisites
The best way to access the CSV files used for this demo is to have your own GitHub account. If you are not registered yet, please register. See the links below for the GitHub and Visual Studio Code installations and install both on your local PC:

GitHub: Installing GitHub Desktop - GitHub Docs

Visual Studio Code: Visual Studio Code – Microsoft-Apps

It is important that you set a Git directory as the starting point for all Git repositories, preferably in one of your user subfolders. Once GitHub and VS Code are installed, please open VS Code. Now press <Ctrl>-<Shift>-Ö to open a new terminal window. Alternatively, you can open it as a VS Code command by entering <Ctrl>-<Shift>-P and selecting “Create Terminal” from the command palette.

Clone the repository SAP-samples/datasphere-content (github.com) into your Visual Studio Code workspace. The downloads contain sample data as CSV files, but can also include model/metadata information; see the README files for details. Make sure that you are in your Git repository folder in the terminal window, then enter:

git clone https://github.com/SAP-samples/datasphere-content

Check whether SAP DataSphere is connected to SAP Analytics Cloud in the settings. If you are not sure, ask your system administrator.

Note your local folder for the Sample Bike Sales content, datasphere-content/Sample_Bikes_Sales_content/CSV. You will need this folder later to upload the CSV files to DSP.
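
As a quick sanity check before uploading, you can list the downloaded CSV files and their row counts with a small Python script. This is a sketch only; the local path below is an assumption and depends on where you cloned the repository:

from pathlib import Path
import pandas as pd

# Adjust this to your own Git repository folder (assumption).
csv_dir = Path.home() / "git" / "datasphere-content" / "Sample_Bikes_Sales_content" / "CSV"

# Print every sample CSV file with its number of data rows.
for csv_file in sorted(csv_dir.glob("*.csv")):
    rows = len(pd.read_csv(csv_file))
    print(f"{csv_file.name}: {rows} rows")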

In Datasphere you need sufficient privileges to create new models; please check that you have the DW Modeler role assigned.

3 Business Case
The well-known company “Best Run Bikes” wants a sales-by-region report. Due to an increase in the number of sales, the company wants to understand how the different regions, represented by sales organization, are performing in each period of the current fiscal year 2019. The gross amount is to be taken as the measure of performance. A simple diagram is sufficient for this early evaluation. The solution will be developed with SAP Datasphere (DSP) and SAP Analytics Cloud (SAC).
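
Conceptually, the requested report boils down to one aggregation: sum the gross amount by sales organization and fiscal period. As an illustration only, the following pandas sketch computes this directly from the sample order header CSV; the file name and the column names SALESORGANIZATION, FISCALYEARPERIOD and GROSSAMOUNT are assumptions about the sample data and may differ in your download:

import pandas as pd

# Sales order headers from the sample content (file and column names are assumptions).
orders = pd.read_csv("SalesOrders.csv")

# Gross amount by sales organization and fiscal year/period --
# the figure the final SAC chart is supposed to show.
report = (
    orders.groupby(["SALESORGANIZATION", "FISCALYEARPERIOD"], as_index=False)["GROSSAMOUNT"]
    .sum()
    .sort_values(["SALESORGANIZATION", "FISCALYEARPERIOD"])
)
print(report)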

4 Step-by-step Solution
4.1 Create a Space in DataSphere
Please log on to your SAP DataSphere now. First of all, you need an appropriate space in DSP into which you can upload the CSV data.

Choose Space Management from the menu and then Create:

Enter name and ID in the next pop-up:

We will use SPTUTORIAL for this tutorial.

In the next screen you can enter further details for your space, but we just keep the standard proposal.

Now deploy the space.

4.2 Create Scoped Roles


In the next step you have to assign users to this space so that they can work with it and create tables or views. For this purpose you need to create scoped user roles.

Choose Security -> Roles from the menu. You see all available roles; scroll down to the bottom of the screen:

Press “Create a new scoped role” to add it to your DataSphere environment.

Enter name and ID as shown below and press create:



Now you have to assign a role template. Choose DW Modeler from the list and press OK:

The role is created and displays all the privileges assigned to it. Now we need to assign the space(s) to this role; in this context a space is a scope. So choose “Scopes” from the menu.

A warning pop-up appears saying that the role has not yet been saved. Please save the role and continue:

Choose “Scopes” again. In the next pop-up window choose the “Bike Tutorial” space as scope and press “Save”:

The display of the role has changed:



4.3 Assign Users


Now we can assign users to this scoped role. Press “Users” and assign all the users to whom you want to grant this role.

You can also switch back to the Tutorial space: choose Space Management, select the Tutorial space and press “Edit”:

Here you can assign users to the space.


Click “Add” and in the next pop-up assign all the users you want to the space. Press “Next”. Now assign our tutorial role in the next pop-up: select “Tutorial Role” and press “Create”:

As a result you should see a list of all users assigned to this space. “Deploy” the space again.
5 Prepare model and data
The space is created and users with scoped roles have been assigned. Now we can create our data model and import data from the CSV files into our Tutorial space. Go to “Data Builder” in the menu.

5.1 Data Builder

You should be able to see the Bike Tutorial space. If it is not available, wait a few seconds, as the deployment may not be completed yet. If the Bike Tutorial space is available, just click on it to get into the Data Builder GUI.

Here you see a list of all objects you can create and use to store data and execute tasks.

We will start with the Entity-Relationship Model (ERM). That is the preferred starting point for a new project. You can define the ERM from a JSON file. In the ERM you define all tables/views and their dependencies. The content tutorial provides an ERM JSON file.

5.2 Create ERM


Press the button “New Entity Relationship Model”. You get to the screen for creating an entity-relationship model:

You can create an entity-relationship model directly from a JSON file just by uploading it:

Choose “Import objects from CSN/JSON File”. In the next pop-up choose the JSON file from your sales content folder and press “Next”:

Select all objects to import and press “Import CSN File”:

You will get a message that the import was successful. The ERM should look like this:

You will find yellow numbers above the tables, telling you that something is not OK. If you press a number, you will get more detailed information:

This warning actually prevents you from deploying the ERM and is therefore rather an error. It tells you that in some columns the text/association is missing although text/association handling is set up for the model. Choose the table “SalesOrders” and expand it to full screen; you should see the details:

You find 3 associations defined:


Press “Enter Full Screen” for a full-screen display of the table with all columns and their settings:

As you can see, no column has a text/association defined for any of the three defined associations. Please enter the following associations:

Save the model. Saving every few minutes is recommended to prevent a timeout. Ignore the warnings about missing associations and save anyway by pressing “Save”:

As a result, the warning on this table has disappeared in the ERM:

Maintain the associations accordingly for all other tables:



Don’t forget to save!

Now you are ready to deploy your ERM. During deployment all tables will be generated.

5.3 Load data into tables


Go back to the Data Builder and choose the “Bike Tutorial” space. Now you can find the tables created from your ERM, and the ERM itself, in the list of objects:

Click on the link in the business name for ProductCategories. You will get to the detailed table information in a new screen:

With the up-arrow button you can upload CSV data to your table. Press the up-arrow button. In the following pop-up choose “ProductCategories.csv” from your Git sales content folder:

You will see a data preview. If it is OK, you can press “Import”. Next you will see a progress window for your upload and get a notification once the upload has finished.

With View you can view the data:



The view will be prepared and you can see the result:

Please upload all the other CSV files of the ERM model in the same way. Now your data is set up and you are ready to continue.

6 Build the analytics


6.1 Create graphical view
Let’s start with the basics and create a graphical view. Go to Data Builder as shown
beforehand and create a new Graphical View.

Now drag and drop the table SalesOrders from the left onto the canvas. The result should look like this:

When you drag a table from the left onto the canvas, an object “View_1” is created, which is the output of your graphical view. Now drag the table SalesOrderItems and drop it on SalesOrders. It is important that it is really dropped onto the table SalesOrders. The result should look like this; Datasphere creates a join automatically:

If you move “SalesOrderItems” over “SalesOrders”, you see the available options for how to combine them:

The default is a join if you just drop it. If you want a union, you choose it here. The third option is to replace the old table with the new one.

You can inspect the join in the properties window:



For the join, the following join types are available:


Inner is chosen by default.

If you don’t want the action, just press “ESC” and no additional join will be created.
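
For orientation, the join that the graphical view creates corresponds roughly to the following pandas sketch. This is an illustration only; the file names, the join key SALESORDERID and the suffix handling are assumptions about the sample data:

import pandas as pd

# Sales order headers and items from the sample CSVs (names are assumptions).
orders = pd.read_csv("SalesOrders.csv")
items = pd.read_csv("SalesOrderItems.csv")

# Default inner join on the common key; the other join types correspond
# to how="left", how="right" or how="outer".
joined = orders.merge(items, on="SALESORDERID", how="inner", suffixes=("", "_ITEM"))
print(joined.head())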

You can define the output columns in the Projection properties:

If you select a column, you can see on the canvas which base table it comes from:

As you can see, the column “SalesOrderID” is taken from SalesOrders. Duplicate columns are disabled; you can recognize them because they are greyed out. You can rename and restore columns if you want to see, for example, the gross amount on both header and item level. Let’s rename the gross amount from the sales order items and restore it:

To make it visible, you have to restore the renamed column:

Now you have added gross amount on item level to your output:

You can preview the data at each stage of the graphical view:

Just press the data viewer button.



6.2 Create Fact


For later use in SAC we have to make sure we have the right output model, so switch the output to Fact first:

If you choose “Fact” as Semantic Usage, you have the option to create an Analytic Model, which is the preferred object to expose from Datasphere to SAC. For use in an Analytic Model we have to make sure that measures and dimensions are set up correctly. We will highlight all potential measures in the attributes list and move them to the measures. First, order all potential measures at the bottom of the list:

Now highlight them all:

Finally, move them via drag and drop to the measures. The measures should now look like this:

Save and deploy the changes.

6.3 Create Analytic Model


Now we can create the Analytic Model:

A new analytic model will be created:


The properties are set to:

And dimensions:

Save and deploy the model:


Everything is set up and we can consume the analytic model in SAC.

7 Consuming the Analytic Model in SAC


Switch from DataSphere to SAC in the DataSphere menu:

7.1 Create a story in SAC


We want to create a new story. Choose “Stories” from the menu in SAC:

We create a new flexible story:



We use optimized mode for it:

A new empty story canvas is created:



7.2 Add Diagram using DataSphere Analytic Model


We want to use a diagram, so drop it onto the canvas. As no data source is defined yet, we are asked which data source should be used. Since we connected DataSphere to SAC, DataSphere datasets and models are also offered to choose from.

We switch to Datasphere:

We see all spaces to which we have been granted access rights and which offer at least one analytic model. We choose our SPTUTORIAL space.

Here we find our “AMSalesOrderDetails” analytic model. We choose it and add it as the data source for our diagram. For our requirement we need to define at least one measure and add one dimension.

Let’s add the measure gross amount and the dimensions SalesOrganization and Fiscal Year/Period:



We can proudly present the solution to the customer.

I hope you enjoyed this tutorial with DataSphere and SAC. More to come soon.
