Accessing SAP Datasphere Via The Command Line
Many of the features available to SAP Datasphere users can also be accessed via the command line.
Note
The SAP Datasphere command line interface module has been renamed from dwc to datasphere. The
command dwc was decommissioned at the end of 2023; please use the new datasphere command
instead. For more information, see https://www.npmjs.com/package/@sap/datasphere-cli.
Command Description
• datasphere configuration: Users with a DW Administrator role (or equivalent privileges) can list, upload, and delete TLS server certificates (see Manage TLS Server Certificates via the Command Line [page 133]).
• datasphere dbusers: Users with the DW Space Administrator role (or equivalent privileges) can reset database user passwords (see Reset Database User Passwords via the Command Line [page 35]).
• datasphere global-roles: Users with the DW Administrator role (or equivalent privileges) can manage global roles (see Manage Global Roles via the Command Line [page 14]).
• datasphere marketplace: Users with the DW Modeler role (or equivalent privileges) can manage data providers (see Manage Data Marketplace Data Providers via the Command Line [page 60]) and data products (see Manage Data Marketplace Data Products via the Command Line [page 79]).
• datasphere objects: Users with the DW Modeler role (or equivalent privileges) can list, read, and delete modeling objects (see Manage Modeling Objects via the Command Line [page 47]).
• datasphere scoped-roles: Users with the DW Administrator role (or equivalent privileges) can manage scoped roles (see Manage Scoped Roles via the Command Line [page 17]).
• datasphere spaces: Users with the DW Administrator role (or equivalent privileges) can create spaces and allocate storage and memory to them, while users with the DW Space Administrator role can manage and staff spaces (see Manage Spaces via the Command Line [page 27]).
• datasphere connections: Users with a DW Integrator role (or equivalent privileges) can list, validate, and delete connections and read connection details. Additionally, they can create and edit SAP SuccessFactors connections (see Manage Connections via the Command Line [page 134]).
• datasphere workload: Users with the DW Administrator role (or equivalent privileges) can set space priorities and statement limits for spaces (see Manage Space Priorities and Statement Limits via the Command Line [page 44]).
• datasphere tasks: Users with the DW Integrator role (or equivalent privileges) can orchestrate tasks and task chains (see Manage Tasks and Task Chains via the Command Line [page 131]).
• datasphere users: Users with the DW Administrator role (or equivalent privileges) can manage SAP Datasphere users (see Manage Users via the Command Line [page 24]).
See the blog @sap/datasphere-cli: Command-Line Interface for SAP Datasphere: Overview (updated
September 2022) for a summary of blogs about working with the command line interface.
The command line interface, datasphere, must be installed and configured before you can use it to access
your SAP Datasphere tenant.
• To use datasphere, you must install it (see Install or Update the SAP Datasphere Command Line Interface
[page 5]).
• We recommend that you log in via an OAuth client (see Log into the Command Line Interface via an OAuth
Client [page 6]).
• To simplify issuing commands, you can set the host value to identify the SAP Datasphere tenant you are
currently working with (see Set a Host Value to Identify Your SAP Datasphere Tenant [page 6]).
• The commands you are allowed to issue depend on the SAP Datasphere roles that you have (see
Command Line Roles and Privileges [page 13]).
• In addition to the command-specific options listed, there are general options that can be used with any
command (see Miscellaneous Options and Commands [page 10]).
Authenticated requests are associated either with the authenticated username and tenant ID or with the
OAuth client ID. Unauthenticated requests are associated with the originating IP address, not with the user.
Requests are limited to approximately 300 per user per minute. If you exceed the limit, you will receive
the HTTP 429 Too Many Requests response status code and can review the response headers returned with
the request for further information.
The SAP Datasphere command line interface (datasphere) is a Node.js package that you download using
the Node Package Manager (npm).
Context
Prerequisites
You have installed the following on your system:
npm is distributed with Node.js. Therefore, when you download Node.js, npm is automatically installed. To
download the Node.js installer, see nodejs.org .
Note
You can test if Node.js and npm are installed on your system by executing the following commands:
• node -v
• npm -v
If Node.js and npm are already installed, then their current versions will appear. If you receive an error, you
have not installed them yet.
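As a quick sketch of that check, the following script reports whether both tools are on the PATH (it uses only standard shell built-ins, and prints the installed versions when found):

```shell
# Check whether Node.js and npm are available before installing the CLI
for tool in node npm; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool found: $("$tool" -v)"
  else
    echo "$tool is NOT installed"
  fi
done
```

If either tool is reported as not installed, download the Node.js installer (which includes npm) before proceeding.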
Procedure
Note
To update datasphere to the latest version at any time, you just need to run npm install -g
@sap/datasphere-cli again.
datasphere --version
3. Log into the Command Line Interface via an OAuth Client [page 6].
For information about creating an OAuth client, see Create OAuth2.0 Clients to Authenticate Against SAP
Datasphere.
Note
See the following blogs for more information about working with the command line interface and OAuth:
You can log in by passing the OAuth client information as options on the command line.
datasphere login
--client-id "<id>"
--client-secrets "<secrets>"
Note
You will be directed to log in with your SAP Datasphere username and password in a browser window once
at the beginning of your OAuth session to determine your space permissions.
You must specify the host you are logging into, either with the --host option or by first setting the host by
calling datasphere config host set <host>, so that the CLI can store the secrets for the defined tenant
URL. When running a command, the CLI uses the secrets matching the tenant URL currently maintained via
the --host option or set via datasphere config host set <host>.
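A login sequence can be sketched as follows. The host URL, client ID, and secret below are placeholders, not values from this documentation; substitute the details of your own tenant and OAuth client:

```shell
# Placeholder values: replace with your tenant URL and OAuth client details.
datasphere config host set "https://mytenant.eu10.hcs.cloud.sap"

datasphere login \
  --client-id "sb-example-client" \
  --client-secrets "example-secret"
```

After the host is set, subsequent commands can omit the --host option, since the CLI resolves the stored secrets from the maintained tenant URL.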
Parameter Description
Note
In certain environments, including macOS, you must encode the client ID as a URI
when passing it as a parameter.
--token-url "<url>" [optional] Enter the Token URL provided by your administrator.
You can log in more securely by passing the OAuth client information in an options file instead of on the
command line.
datasphere login
--options-file <file>.json
Note
You will be directed to log in with your SAP Datasphere username and password in a browser window once
at the beginning of your OAuth session to determine your space permissions.
Parameter Description
--options-file <file>.json: Enter a path to a .json file containing the basic OAuth options:
{
"client-id": "<client-id>",
"client-secrets": "<client-secrets>",
"authorization-url": "<authorization-url>",
"token-url": "<token-url>"
}
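As a sketch, the options file from the template above can be created and checked for well-formed JSON before use. All values below are placeholders, not working credentials:

```shell
# Create a minimal OAuth options file; every value is a placeholder --
# substitute the details of your own OAuth client.
cat > options.json <<'EOF'
{
  "client-id": "sb-example-client",
  "client-secrets": "example-secret",
  "authorization-url": "https://example.authentication.eu10.hana.ondemand.com/oauth/authorize",
  "token-url": "https://example.authentication.eu10.hana.ondemand.com/oauth/token"
}
EOF

# Confirm the file is well-formed JSON before passing it to datasphere login
python3 -m json.tool options.json > /dev/null && echo "options.json is valid"
```

The file can then be passed with datasphere login --options-file options.json.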
You can avoid running a login command (and entering your SAP Datasphere username and password) at the
beginning of each OAuth session by extracting the personal access and refresh tokens and passing them either
as options or in a secrets file.
To extract your tokens, log into datasphere as usual and then enter the following command and press
Return :
Example output:
Then copy the values for access_token and refresh_token. You can pass these values either as options on
the command line or in an options file.
Note
Your access and refresh tokens are valid for 720 hours (30 days).
Having extracted your tokens, you no longer need to log in at the beginning of your session and can pass your
tokens in a secrets file with any command.
For example, the following command, to read a space, can be run without having logged in beforehand:
{
"client_id": "<client-id>",
"client_secrets": "<client-secrets>",
"authorization_url": "<authorization-url>",
"token_url": "<token-url>",
"access_token": "<access-token>",
"refresh_token": "<refresh-token>"
}
Note
Secrets files use versions of the options with underscores instead of hyphens.
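A sketch of such a secrets file, with a check that its keys use the underscore style, follows. All values are placeholders, not working credentials or tokens:

```shell
# Create a placeholder secrets file; substitute your own client details
# and the access/refresh tokens you extracted.
cat > secrets.json <<'EOF'
{
  "client_id": "sb-example-client",
  "client_secrets": "example-secret",
  "authorization_url": "https://example.authentication.eu10.hana.ondemand.com/oauth/authorize",
  "token_url": "https://example.authentication.eu10.hana.ondemand.com/oauth/token",
  "access_token": "example-access-token",
  "refresh_token": "example-refresh-token"
}
EOF

# Secrets files use underscores, not hyphens, in their key names
python3 - <<'EOF'
import json
keys = json.load(open("secrets.json")).keys()
assert all("-" not in k for k in keys), "keys must use underscores"
print("secrets.json OK")
EOF
```

Passing this file with --secrets-file secrets.json then lets you run a command without logging in first.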
To log out from an account, use the datasphere logout command. This command optionally lets you
specify the ID of the login/secrets to remove using the --login-id <id> option.
datasphere logout
--login-id <id>
By default, when you omit the --login-id <id> option, the login/secrets with ID 0 are removed.
Parameter Description
--login-id <id> Optional: specifies the login ID (choices: "0", default: "0").
Only command-specific options are displayed in the SAP Datasphere command line interface. The remaining
general options are listed here.
General Options
Option Description
--host "<url>": Enter the URL of your SAP Datasphere tenant. You can copy the URL of any page in your tenant. Alternatively, set a host value (see Set a Host Value to Identify Your SAP Datasphere Tenant [page 6]).
--options-file <file>.json: [optional] Enter a path to a .json file containing all or some of your datasphere options, listed using the full option names.
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
Note
In certain environments, including macOS, you must encode the client ID as a URI
when passing it as a parameter.
--options-file <file>.json: [optional] Enter a path to a .json file containing the basic OAuth options:
{
"client-id": "<client-id>",
"client-secrets": "<client-secrets>",
"authorization-url": "<authorization-url>",
"token-url": "<token-url>"
}
--secrets-file <file>.json: [optional] Enter a path to a .json file containing the extended OAuth options, including the extracted tokens:
{
"client_id": "<client-id>",
"client_secrets": "<client-secrets>",
"authorization_url": "<authorization-url>",
"token_url": "<token-url>",
"access_token": "<access-token>",
"refresh_token": "<refresh-token>"
}
Note
Secrets files use versions of the options with underscores instead of hyphens.
Passing this file with the tokens allows you to run any command without first having to log in.
--access-token "<token>": [optional] Enter the access token for interactive OAuth session authentication.
--refresh-token "<token>": [optional] Enter the refresh token for interactive OAuth session authentication.
--code "<code>": [optional] Enter the code for OAuth token retrieval.
--token-url "<url>": [optional] Enter the Token URL provided by your administrator.
--expires-in "<expires>": [optional] Enter the date when the interactive OAuth session authentication expires.
See also Log into the Command Line Interface via an OAuth Client [page 6].
--passcode <code> [optional] If you are not logged into an OAuth client, you must provide a passcode that you
have obtained from your SAP Datasphere tenant.
You can include a passcode with your command using the --passcode parameter. If you
do not include the --passcode parameter, datasphere will prompt you to obtain a
passcode:
1. Enter y and datasphere will open the passcode page for your tenant.
2. If you are not already logged in, you must enter your username and password.
3. When you arrive at the passcode page, copy the temporary authentication code and
paste it into the command line.
Note
If you are not logged into an OAuth client, you must enter a new passcode for each
command that you issue with datasphere.
Configuration Options
Command Description
datasphere config cache init --host "<url>": Download the file of available datasphere commands from the SAP Datasphere instance.
datasphere config cache clean: Delete the local file of available datasphere commands.
datasphere <command> --help --host "<url>": Display help for the specified datasphere command.
datasphere config secrets show: Display the parameters for the OAuth client you are logged into.
datasphere config passcode-url --host "<url>": Display the passcode URL for the SAP Datasphere server.
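A typical sequence with these configuration commands can be sketched as follows; the host URL is a placeholder, and the choice of spaces as the command to inspect is only an example:

```shell
# Placeholder host URL; replace with your own tenant URL.
HOST="https://mytenant.eu10.hcs.cloud.sap"

# Cache the list of available commands from the tenant...
datasphere config cache init --host "$HOST"

# ...display help for a specific command...
datasphere spaces --help --host "$HOST"

# ...and delete the local command cache when done.
datasphere config cache clean
```

These commands require a reachable SAP Datasphere tenant; they are shown here only to illustrate the order in which they are typically used.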
The command line interface requires the same privileges to perform actions as are needed to use the standard
graphical interface.
DW Modeler
datasphere objects:
• Data Builder (CRUD----)
• Data Warehouse Business Entity (CRUD----)
• Data Warehouse Fact Model (CRUD----)
• Data Warehouse Consumption Model (CRUD----)
• Data Warehouse Authorization Scenario (CRUD----)
Users with an administrator role can use the datasphere command line interface to manage users and roles.
See Manage Global Roles via the Command Line [page 14].
Users with a DW Administrator role (or with equivalent privileges) can list and read global roles and add users to
and remove users from them via the command line.
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
To browse the available commands, enter the following and press Return :
datasphere global-roles
Note
You cannot create or delete global roles via the command line.
For general information about working with global roles in SAP Datasphere, see Assign Users to a Role.
You can list the global roles in your tenant, and optionally write them to a file.
Parameter Description
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
To read the list of users added to a global role and optionally write it to a file, enter the following command and
press Return :
Parameter Description
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
For example, to write the list of users added to the global role Custom_Administrator to the file
admins.json , enter the following command and press Return :
Parameter Description
For example, to add the users AADAMS and SSMITH to the global role Custom_Administrator, enter the
following command and press Return :
For example, to remove the users AADAMS and SSMITH from the global role Custom_Administrator, enter
the following command and press Return :
Users with a DW Administrator role (or with equivalent privileges) can create, read, update, and delete scoped
roles via the command line.
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Administrator or equivalent privileges (see Command Line Roles and Privileges [page
13]).
To browse the available commands, enter the following and press Return :
datasphere scoped-roles
You can list the scoped roles in your tenant, and optionally write them to a file.
Parameter Description
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
You can read the CSN/JSON definition of a scoped role and optionally write it to a file.
To read the list of spaces that are assigned to a scoped role, and optionally write it to a file, enter the following
command and press Return :
To read the list of users assigned to a scoped role and optionally write it to a file, enter the following command
and press Return :
Parameter Description
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
You can create scoped roles by providing a definition in a JSON file or an input string.
Parameter Description
--file-path <file>.json: [optional] Enter a path to a file with a .json extension containing your scoped role definition:
{
"name": "<Name>",
"description": "<Description>",
"inheritance": "<Role_Template_Name>"
}
--input '<stringified-json>': [optional] Provide your input in stringified JSON format instead of via the --file-path option.
For example, to create the scoped role Sales_Modeler based on the role template Custom_Modeler, enter
the following command and press Return :
{
"name": "Sales_Modeler",
"description": "Modeler for Sales spaces",
"inheritance": "Custom_Modeler"
}
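As a sketch, the definition file for this example can be written and checked for the three expected fields before it is passed to the CLI. The file name role.json and the check script are illustrative, not part of the product:

```shell
# Recreate the Sales_Modeler scoped-role definition from the example above
cat > role.json <<'EOF'
{
  "name": "Sales_Modeler",
  "description": "Modeler for Sales spaces",
  "inheritance": "Custom_Modeler"
}
EOF

# Verify that the name, description, and inheritance fields are all present
python3 - <<'EOF'
import json
role = json.load(open("role.json"))
missing = {"name", "description", "inheritance"} - role.keys()
print("role.json OK" if not missing else f"missing keys: {missing}")
EOF
```

The file would then be supplied via the --file-path option documented above.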
You can update the description or template role of a scoped role by providing a new definition in a JSON file or
an input string.
Parameter Description
--file-path <file>.json: [optional] Enter a path to a file with a .json extension containing your scoped role definition:
{
"description": "<Description>",
"inheritance": "<Role_Template_Name>"
}
--input '<stringified-json>': [optional] Provide your input in stringified JSON format instead of via the --file-path option.
You can add spaces to a scoped role as a comma-separated list of space IDs.
Note
You must add one or more spaces to a scoped role before you can assign users to it.
Parameter Description
For example, to add the two spaces SALES_EU and SALES_US to the scoped role Sales_Modeler, enter the
following command and press Return :
You can add users to a scoped role (and specify the spaces they will be given access to) by providing a
definition in a JSON file or an input string.
Note
You must add your spaces first, before you can add users to them (see Add Spaces to Scoped Roles [page
20]).
Parameter Description
--file-path <file>.json: Enter a path to a file with a .json extension containing the list of users to add and the scopes to add them to:
[
{
"id": "<userID>",
"scopes": ["<spaceID>","<spaceID>"]
},
{
"id": "<userID>",
"scopes": ["<spaceID>","<spaceID>"]
}
]
--input '<stringified-json>': [optional] Provide your input in stringified JSON format instead of via the --file-path option.
For example, to add the users BBAXTER and JJONES to the spaces SALES_EU and SALES_US via the scoped
role Sales_Modeler, enter the following command and press Return :
[
{
"id": "BBAXTER",
"scopes": ["SALES_EU", "SALES_US"]
},
{
"id": "JJONES",
"scopes": ["SALES_EU", "SALES_US"]
}
]
Users with a DW Space Administrator role (or equivalent permissions) can add users to their space using
the datasphere spaces users add command (see Add Users to a Space [page 32] ).
Parameter Description
--file-path <file>.json: [optional] Enter a path to a file with a .json extension containing the list of users to remove and the spaces to remove them from:
[
{
"id": "<userID>",
"scopes": ["<spaceID>","<spaceID>"]
},
{
"id": "<userID>",
"scopes": ["<spaceID>","<spaceID>"]
}
]
--input '<stringified-json>': [optional] Provide your input in stringified JSON format instead of via the --file-path option.
For example, to remove BBAXTER and JJONES from SALES_EU via the scoped role Sales_Modeler, enter the
following command and press Return :
[
{
"id": "BBAXTER",
"scopes": ["SALES_EU"]
},
{
"id": "JJONES",
"scopes": ["SALES_EU"]
}
]
Users with a DW Space Administrator role (or equivalent permissions) can remove users from their space
using the datasphere spaces users remove command (see Remove Users from a Space [page 34]).
Parameter Description
For example, to remove the space SALES_EU from the scoped role Sales_Modeler, enter the following
command and press Return :
Parameter Description
--force: [optional] Suppress the Confirm Deletion step and delete the scoped role without confirmation.
Users with a DW Administrator role (or with equivalent privileges) can list, create, update, and delete users via
the command line.
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Administrator or equivalent privileges (see Command Line Roles and Privileges [page
13]).
To browse the available commands, enter the following and press Return :
datasphere users
For general information about working with users in SAP Datasphere, see Managing SAP Datasphere Users.
List Users
You can list the users in your tenant, and optionally write them to a file.
--accept <format>: [optional] Specify the format to return the user definition in. You can choose between:
• application/vnd.sap.datasphere.space.users.list+json - [default] Standard list including id, first name, last name, display name, and email.
• application/vnd.sap.datasphere.space.users.details+json - In addition, includes manager and roles.
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
Create Users
You can create users by providing definitions in a JSON file or an input string.
Parameter Description
--file-path <file>.json: [optional] Enter a path to a file with a .json extension containing your user definitions:
[
{
"id": "<userID>",
"firstName": "<firstName>",
"lastName": "<lastName>",
"email": "<email>"
},
{
"id": "<userID>",
"firstName": "<firstName>",
"lastName": "<lastName>",
"email": "<email>"
}
]
--input '<stringified-json>': [optional] Provide your input in stringified JSON format instead of via the --file-path option.
For example, to create users from a file, enter the following command and press Return :
[
{
"id": "BBaxter",
"firstName": "Bob",
Update Users
You can update the email or manager of a user by providing a new definition in a JSON file or an input string.
Parameter Description
--file-path <file>.json: [optional] Enter a path to a file with a .json extension containing your user definitions:
[
{
"id": "<userID>",
"email": "<email>"
},
{
"id": "<userID>",
"manager": "<manager userID>"
},
{
"id": "<userID>",
"manager": "<manager userID>",
"email": "<email>"
}
]
--input '<stringified-json>': [optional] Provide your input in stringified JSON format instead of via the --file-path option.
Delete Users
--force: [optional] Suppress the Confirm Deletion step and delete the user without confirmation.
Users with an administrator role can use the datasphere command line interface to create, read, update, and
delete spaces. Users with a space administrator role can update some space properties, add (or remove) users,
database users and HDI containers, and delete spaces.
Note
You cannot create or manage a file space via the command line.
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
To browse the available commands, enter the following and press Return :
datasphere spaces
For general information about working with spaces, see Creating Spaces and Allocating Storage and Managing
Your Space.
To list the spaces available to you on the tenant, enter the following command and press Return :
Parameter Description
--host "<url>": Enter the URL of your SAP Datasphere tenant. You can copy the URL of any page in your tenant. Alternatively, set a host value (see Set a Host Value to Identify Your SAP Datasphere Tenant [page 6]).
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
Read a Space
To read a space definition to the console or to a file, enter the following command and press Return :
Parameter Description
--definitions [<obj1>,<obj2>]: [Optional] Read the object definitions contained in the space. You can use the --definitions parameter by itself to read all the objects, or specify a comma-separated list of object technical names.
Object definitions are read using the standard CSN syntax (see Core Data Services Schema
Notation (CSN) ). The following objects can be read (exported):
• Local Tables - The definition of a local table contains the structure of the table only, and
does not have dependencies on any other objects.
• Remote Tables - The definition of a remote table contains information about its connection.
Note
Remote tables exported from one space can be imported into another only if they
were originally imported from a connection created in v2021.19 or later.
• Views - The definition of a view contains the definitions of all its sources and any used
data access controls. When you export a view, these objects are exported too.
• Data Access Controls - The definition of a data access control contains the definition of
its permissions entity. When you export a data access control, the permissions entity is
exported too.
Note
You can also export content from and import content to your space via:
• The objects commands, which support a wider selection of object types (see
Manage Modeling Objects via the Command Line [page 47]).
• The (Transport) app (see Transporting Content Between Tenants).
• Export to CSN/JSON File buttons in selected Data Builder editors (see Importing
and Exporting Objects in CSN/JSON Files).
--no-space-definition: [Optional] Suppress the display of the spaceDefinition property. When used with the --definitions option, this allows you to read object definitions without seeing the other properties of the space.
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
To create or update a space, you must first prepare a space definition file (see The Space Definition File Format
[page 36]).
You need only complete the parameters that you want to set. All other space properties are either set to
default values or keep their current values. If your file contains valid CSN object definitions, then these
entities will be created or updated in the space.
When your file is ready, enter the following command and press Return :
Parameter Description
--file-path <file>.json: Enter a path to a file with a .json extension containing your space definition.
--force-definition-deployment: [Optional] Deploy changes to objects even if they will generate validation messages warning of impacts to objects that depend on them. Using this option is equivalent to clicking the Deploy Anyway button in the Validation Messages dialog (see Modifying Objects That Have Dependent Objects).
--enforce-database-user-deletion: [Optional] Allow the deletion of database users and their associated Open SQL schemas when the dbusers section is present in the space definition file. If any existing database user is not included in the dbusers section and this option is omitted, then they are not deleted, to protect against unintended data loss.
--input '<stringified-json>': [Optional] Provide your space definition in stringified JSON format instead of via the --file-path option. For example:
--input '{"MY_SPACE":{"spaceDefinition":{"version":"1.0.4"}}}'
Note
If any parameters are set incorrectly, the creation or update is canceled and an error message is written to
the console.
Delete a Space
--force: Delete the space without confirmation. If you do not include this option, you will be prompted to confirm the deletion.
If prompted, confirm that you want to delete the space. The space is deleted and a confirmation message is
written to the console.
Caution
When you delete a space, the space (along with its content and data) is permanently deleted from the
database and cannot be recovered.
Users with a DW Space Administrator role (or equivalent permissions) can manage user access to their space
via the command line.
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Space Administrator or equivalent privileges (see Command Line Roles and Privileges
[page 13]).
To browse the available commands, enter the following and press Return :
For general information about working with space users, see Control User Access to Your Space.
You can read a list of users in your space and optionally write it to a file.
Parameter Description
--accept <format>: [optional] Specify the format to return the list in. You can choose between:
• application/vnd.sap.datasphere.space.users.list+json - [default] User IDs only.
• application/vnd.sap.datasphere.space.users.details+json - User IDs and scoped roles.
--output <file>.json: [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output will be printed to the command line.
To add a user to your space, you must add them to a scoped role that includes your space as a scope and gives
the appropriate privileges.
Parameter Description
--file-path <file>.json: Enter a path to a file with a .json extension containing the list of users to add and the scoped roles you want to add them to:
[
{
"id": "<UserID>",
"roles": "<RoleID[,RoleID...]>"
},
{
"id": "<UserID>",
"roles": "<RoleID[,RoleID...]>"
}
]
Each user must exist on your tenant and each scoped role that you assign them to must
include your space as a scope.
For example, to add the users BBAXTER and JJONES to the space SALES_US via the scoped role
Sales_Modeler, enter the following command and press Return :
Where the file add.json contains the following (where <package> and <id> for the tenant are t and W
respectively):
[
{
"id": "BBAXTER",
"roles": ["Sales_Admin","Sales_Modeler"]
},
{
"id": "JJONES",
"roles": ["Sales_Modeler"]
}
]
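As a sketch, the add.json file from this example can be recreated and sanity-checked before it is passed to datasphere spaces users add. The check script is illustrative, not part of the product:

```shell
# Recreate the add.json user-assignment file from the example above
cat > add.json <<'EOF'
[
  { "id": "BBAXTER", "roles": ["Sales_Admin", "Sales_Modeler"] },
  { "id": "JJONES",  "roles": ["Sales_Modeler"] }
]
EOF

# Each entry must have a user "id" and a non-empty "roles" list
python3 - <<'EOF'
import json
users = json.load(open("add.json"))
assert all(u["id"] and u["roles"] for u in users), "malformed entry"
print(f"add.json OK: {len(users)} user(s)")
EOF
```

Remember that each user must already exist on the tenant, and each scoped role listed must include the target space as a scope.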
You can update the roles that are assigned to users for your space. For example, you may want to remove a
user's modeling privileges and leave her with viewing privileges only.
Parameter Description
--file-path <file>.json: Enter a path to a file with a .json extension containing the list of users to update and the new scoped roles you want them to belong to:
[
{
"id": "<UserID>",
"roles": ["<RoleID>","<RoleID>"]
},
{
"id": "<UserID>",
"roles": ["<RoleID>","<RoleID>"]
}
]
For example, to give BBAXTER the roles of both Sales_Modeler and Sales_Integrator, and to remove
JJONES from the role Sales_Modeler and to add her to Sales_Space_Admin, enter the following command
and press Return :
[
{
"id": "BBAXTER",
"roles": ["Sales_Modeler","Sales_Integrator"]
},
{
"id": "JJONES",
"roles": ["Sales_Space_Admin"]
}
]
Parameter Description
--file-path <file>.json: Enter a path to a file with a .json extension containing the list of users to remove and the scoped roles to remove them from:
[
{
"id": "<UserID>",
"role": "<RoleID>"
},
{
"id": "<UserID>",
"role": "<RoleID>"
}
]
For example, to remove BBAXTER from both the Sales_Modeler and Sales_Integrator roles in the
SALES_US space, enter the following command and press Return :
[
{
"id": "BBAXTER",
"role": "Sales_Modeler"
},
{
"id": "BBAXTER",
"role": "Sales_Integrator"
}
]
Users with the DW Space Administrator role (or equivalent privileges) can reset database user passwords from
the command line.
To reset a database user password, enter the following command and press Return :
Note
For security reasons, we recommend that you write the new password to an output file rather than print it to the command line.
--output [optional] Enter a path to a .json file to write the output to.
<file>.json
If you do not include this option, the output will be printed to the command line.
For example, to reset the password of the JEFF user in the SALES space, and pretty-print the output to the file
jeff.json, enter the following:
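The command itself is not reproduced here; the commented invocation below is an assumption about the flag names (verify with datasphere dbusers password reset --help). The shape of the output file is also an assumption, simulated here so the extraction step can be shown without a live tenant:

```shell
# Hypothetical invocation (flag names are assumptions):
# datasphere dbusers password reset --space SALES --output jeff.json

# Simulated output file (the actual shape may differ on your tenant):
cat > jeff.json <<'EOF'
{
  "username": "SALES#JEFF",
  "password": "<generated-password>"
}
EOF

# Read a single field rather than echoing the whole file to the terminal.
python3 -c "import json; print(json.load(open('jeff.json'))['username'])"
```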
Note
You can create database users with a space definition file (see The Space Definition File Format [page
36]).
Space properties are set and retrieved in the space definition file format and stored as a .json file. A space
definition file must not exceed 25MB.
Space Properties
Users with the DW Administrator role can create spaces and set any space properties (see Create a Space)
using the following syntax:
{
"<SPACE_ID>": {
"spaceDefinition": {
"version": "1.0.4",
"label": "<Space_Name>",
"assignedStorage": <bytes>,
"assignedRam": <bytes>,
"longDescription": "<Space_Description>",
"priority": <value>,
"injection": {
"dppRead": {
"retentionPeriod": <days>,
"isAuditPolicyActive": true|false
Note
Users with the DW Space Administrator role cannot create spaces, but they can set space properties except
SPACE_ID, assignedStorage, assignedRam, and priority.
Caution
The following workload management parameters are deprecated and may be removed in future releases:
• priority
• workloadClass.totalStatementMemoryLimit.value
• workloadClass.totalStatementMemoryLimit.unit
• workloadClass.totalStatementThreadLimit.value
• workloadClass.totalStatementThreadLimit.unit
<SPACE_ID> Space ID [required] Enter the technical name of the space. Can contain a maxi-
mum of 20 uppercase letters or numbers and must not contain spaces
or special characters other than _ (underscore). Unless advised to do
so, must not contain prefix _SYS and should not contain prefixes: DWC_,
SAP_ (See Rules for Technical Names).
version - [required] Enter the version of the space definition file format. This must
always be set to 1.0.4.
label Space Name Enter the business name of the space. Can contain a maximum of 30
characters, and can contain spaces and special characters.
assignedStorage Disk (GB) Enter the amount of storage allocated to the space in bytes. You can en-
ter any value between 100000000 bytes (100MB) and the total storage
size available in the tenant.
Note
To set no size limit for the space (and disable the Enable Space Quota
option), enter 0 for both this parameter and assignedRam.
assignedRam Memory (GB) Enter the amount of ram allocated to the space in bytes. You can enter
any value between 100000000 bytes (100MB) and the total storage
size available in the tenant.
longDescription Description Enter a description for the space. Can contain a maximum of 4 000
characters.
priority Space Priority [deprecated] Enter the prioritization of this space when querying the
database. You can enter a value from 1 (lowest priority) to 8 (highest
priority).
Default value: 5
dppRead.isAuditPolicyActive        Enable Audit Log for Read Operations
dppRead.retentionPeriod            Keep Logs for <n> Days
dppChange.isAuditPolicyActive      Enable Audit Log for Change Operations
dppChange.retentionPeriod          Keep Logs for <n> Days
                                   Enter the audit logging policy for read and change operations and the
                                   number of days that the logs are retained. You can retain logs for any
                                   period between 7 and 10000 days.
                                   Default values: false, 30, false, 30
allowConsumption    Expose for Consumption    Choose the default setting for the Expose for Consumption property for
                    by Default                views created in this space.
enableDataLake      Use This Space to Access  Enable access to the SAP HANA Cloud data lake. Enabling this option is
                    the Data Lake             only possible if no other space already has access to the data lake.
workloadClass.totalStatementMemoryLimit.value    Total Statement Memory Limit GB/%
    [deprecated] Enter the maximum number (or percentage) of GBs of memory that statements running
    concurrently in the space can consume. You can enter any value or percentage between 0 (no limit)
    and the total amount of memory available in your tenant.
workloadClass.totalStatementThreadLimit.value    Total Statement Thread Limit Threads/%
    [deprecated] Enter the maximum number (or percentage) of threads that statements running
    concurrently in the space can consume. You can enter any value or percentage between 0 (no limit)
    and the total number of threads available in your tenant.
For example, the following file will create a new space, with all default properties:
{
"NEWSPACE": {
"spaceDefinition": {
"version": "1.0.4"
}
}
}
Note
If a property is not set it will receive the default value (on creation) or will keep its current value (on update).
This second file will update NEWSPACE by modifying the Space Name and increasing the Disk (GB) and
In-Memory (GB) allocations:
{
"NEWSPACE": {
"spaceDefinition": {
"version": "1.0.4",
"label": "My New Space",
"assignedStorage": 6000000000,
"assignedRam": 5000000000
}
}
}
This third file will update the Space Priority, and will leave the other parameters as previously set:
{
"NEWSPACE": {
"spaceDefinition": {
"version": "1.0.4",
"priority": 4
}
}
}
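Definition files such as the three above are passed to the CLI via the --file-path parameter. A minimal sketch; the spaces create and spaces update subcommand names are assumptions (verify with datasphere spaces --help):

```shell
# Recreate the minimal space definition from the first example above.
cat > newspace.json <<'EOF'
{
  "NEWSPACE": {
    "spaceDefinition": {
      "version": "1.0.4"
    }
  }
}
EOF
python3 -m json.tool newspace.json > /dev/null && echo "newspace.json is valid JSON"

# Hypothetical invocations (subcommand and flag names are assumptions):
# datasphere spaces create --file-path newspace.json
# datasphere spaces update --file-path newspace.json
```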
The following properties are not supported when creating, reading, or updating spaces using datasphere:
• Connections
• Time Data
• Space Status and other run-time properties
Members
See Manage Space Users via the Command Line [page 31].
Database Users
Users with the DW Administrator, DW Space Administrator, or DW Integrator role can add database users to a
space (see Integrating Data via Database Users/Open SQL Schemas) using the following syntax:
{
...
"dbusers":{
"<Space_ID>#<DB_UserName>":{
"ingestion":{
"auditing":{
"dppRead":{
"retentionPeriod":<days>
"isAuditPolicyActive":false
},
"dppChange":{
"retentionPeriod":<days>
"isAuditPolicyActive":false
}
}
},
"consumption":{
"consumptionWithGrant":false,
"spaceSchemaAccess":false,
"scriptServerAccess":false,
"localSchemaAccess":false,
"hdiGrantorForCupsAccess":false
}
}
}
}
<SPACE_ID> Space ID [required] Must be the same as the <Space_ID> used at the root of the
space definition file.
<DB_UserName> Database User Name [required] Enter the name of the database user. Can contain a maximum
Suffix of 20 uppercase letters or numbers and must not contain spaces or
special characters other than _ (underscore).
ingestion.auditing.dppRead.isAuditPolicyActive       Enable Audit Log for Read Operations
ingestion.auditing.dppRead.retentionPeriod           Keep Logs for <n> Days
ingestion.auditing.dppChange.isAuditPolicyActive     Enable Audit Log for Change Operations
ingestion.auditing.dppChange.retentionPeriod         Keep Logs for <n> Days
    Enter the audit logging policy for read and change operations and the number of days that the
    logs are retained. You can retain logs for any period between 7 and 10000 days.
    Default values: false, 30, false, 30
consumption.consumptionWithGrant       With Grant Option
    Allow the database user to grant read access to the space schema to other users.
    Default value: false
consumption.spaceSchemaAccess          Enable Read Access (SQL)
    Grant the database user read access to the space schema.
    Default value: false
consumption.scriptServerAccess         Enable Automated Predictive Library (APL) and Predictive Analysis Library (PAL)
    Grant the database user access to the SAP HANA Cloud machine learning libraries.
    Default value: false
consumption.localSchemaAccess          Enable Write Access (SQL, DDL, & DML)
    Grant the database user write access to the Open SQL schema.
    Default value: false
consumption.hdiGrantorForCupsAccess    Enable HDI Consumption
    Grant the database user read access to HDI containers associated with the space.
    Default value: false
For example, the following file will add a database user to NEWSPACE:
{
"NEWSPACE": {
"spaceDefinition": {
"version": "1.0.4",
"dbusers": {
"NEWSPACE#JJONES": {
"ingestion": {
"auditing": {
"dppRead": {
"retentionPeriod": 21,
When updating database users, you must always list all database users that you want to have assigned to
the space. To delete a database user, remove them from the dbusers section and include the --enforce-
database-user-deletion option. If any existing database user is not included in the dbusers section and this
option is omitted, then they are not deleted, in order to protect against unintended data loss.
Note
You can use the datasphere dbusers password reset command to obtain a new password for a
database user (see Reset Database User Passwords via the Command Line [page 35]).
HDI Containers
Users with the DW Administrator, DW Space Administrator, or DW Integrator role can associate HDI containers
to a space (see Exchanging Data with SAP SQL Data Warehousing HDI Containers) using the following syntax:
{
...
"hdicontainers":{
"<Container_Name>":{}
},
<Container_Name> HDI Container Name [required] Enter the name of an HDI container that is associated with
your SAP Datasphere instance and which is not assigned to any other
space.
For example, the following file will associate two HDI containers to NEWSPACE:
{
"NEWSPACE": {
"spaceDefinition": {
"version": "1.0.4",
"hdicontainers": {
"MyHDIContainer": {},
"MyOtherContainer": {},
}
}
When updating HDI containers, you must always list all the HDI containers that you want to have assigned to
the space. To delete an HDI container, remove it from the hdicontainers section.
Users with the DW Administrator or DW Space Administrator role can add tables and views, and data access
controls to a space using the standard CSN syntax (see Core Data Services Schema Notation (CSN) ). Users
with the DW Modeler role can add tables and views.
Note
You can also use the objects commands, which support a wider selection of object types, to read and
write objects to your space (see Manage Modeling Objects via the Command Line [page 47]).
For example, the following file will create a table with two columns in NEWSPACE:
{
"NEWSPACE": {
"spaceDefinition": {
"version": "1.0.4"
},
"definitions": {
"Products": {
"kind": "entity",
"elements": {
"Product ID": {
"type": "cds.Integer64",
"key": true,
"notNull": true
},
"Product Name": {
"type": "cds.String",
"length": 5000
}
}
}
}
}
}
Note
To obtain more complex examples, read existing objects from a space into a file using the --definitions
option (see Read a Space [page 28]).
You can use the SAP Datasphere datasphere command line interface to set space priorities and statement
limits for spaces.
Working with Space Priorities and Statement Limits on the Command Line
To work with space priorities and statement limits on the command line, you must:
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Administrator (or equivalent privileges) (see Command Line Roles and Privileges [page
13]).
To browse the available commands, enter the following and press Return :
datasphere workload
For general information about space priorities and statement limits, see Set Priorities and Statement Limits for
Spaces.
To list space priorities and statement limits for spaces available on the tenant, enter the following command
and press Return :
Parameter Description
--host "<url>" Enter the URL of your SAP Datasphere tenant. You can copy the URL of any page in your
tenant. Alternatively, set a host value (see Set a Host Value to Identify Your SAP Datasphere
Tenant [page 6]).
--output [optional] Enter a path to a .json file to write the output to.
<file>.json
If you do not include this option, the output will be printed to the command line.
To update space priorities and statement limits for spaces available on the tenant, enter the following
command and press Return :
Parameter Description
--file-path Enter a path to a file with a .json extension containing the list of space properties to
<file>.json update.
{
"assignment": "SPACE",
"workloadClasses": [
{
"spaceId": "<SpaceID>",
"priority": <PriorityNumber>,
"workloadType": "<WorkloadType>",
"totalStatementThreadLimit": <Value>,
"totalStatementThreadLimitUnit": "Counter|
Percent",
"totalStatementMemoryLimit": <Value>,
"totalStatementMemoryLimitUnit": "Gigabyte|
Percent"
}
]
}
In this example, you set the following priorities and statement limits for two spaces:
• For the space SALES_EU, change the space priority from 1 to 2 and the workload type from custom to
default.
• For the space SALES_US, change the space priority from 5 to 8 and the workload type from default to
custom, setting the totalStatementThreadLimit parameter to 50% and the totalStatementMemoryLimit
parameter to 80%.
{
"assignment": "SPACE",
"workloadClasses": [
{
"spaceId": "SALES_EU",
"priority": 2,
"workloadType": "default",
},
{
"spaceId": "SALES_US",
"priority": 8,
"workloadType": "custom",
"totalStatementThreadLimit": 50,
"totalStatementThreadLimitUnit": "Percent",
The space priorities and statement limits for spaces are set and retrieved in the space workload management
definition file format and stored as a .json file.
You can set any space priority and statement limits properties using the following syntax:
{
"assignment": "SPACE",
"workloadClasses": [
{
"spaceId": "<SpaceID>",
"priority": <PriorityNumber>,
"workloadType": "<WorkloadType>",
"totalStatementThreadLimit": <Value>,
"totalStatementThreadLimitUnit": "Counter|Percent",
"totalStatementMemoryLimit": <Value>,
"totalStatementMemoryLimitUnit": "Gigabyte|Percent"
},
{
"spaceId": "<SpaceID>",
"priority": <PriorityNumber>,
"workloadType": "<WorkloadType>",
"totalStatementThreadLimit": <Value>,
"totalStatementThreadLimitUnit": "Counter|Percent",
"totalStatementMemoryLimit": <Value>,
"totalStatementMemoryLimitUnit": "Gigabyte|Percent"
}
]
}
<SPACE_ID> Space ID [required] Enter the technical name of the space. Can contain a maxi-
mum of 20 uppercase letters or numbers and must not contain spaces
or special characters other than _ (underscore). Unless advised to do
so, must not contain prefix _SYS and should not contain prefixes: DWC_,
SAP_ (See Rules for Technical Names).
priority Space Priority Enter the prioritization of this space when querying the database. You
can enter a value from 1 (lowest priority) to 8 (highest priority).
Default value: 5
totalStatementMemoryLimit        Total Statement Memory Limit GB/%
totalStatementMemoryLimitUnit
    If you've chosen custom for the workloadType, you can set these parameters. Enter the maximum
    number (or percentage) of GBs of memory that statements running concurrently in the space can
    consume. You can enter any value or percentage between 0 (no limit) and the total amount of
    memory available in your tenant.
    Default values: 0, Gigabyte
totalStatementThreadLimit        Total Statement Thread Limit Threads/%
totalStatementThreadLimitUnit
    If you've chosen custom for the workloadType, you can set these parameters. Enter the maximum
    number (or percentage) of threads that statements running concurrently in the space can consume.
    You can enter any value or percentage between 0 (no limit) and the total number of threads
    available in your tenant.
    Default values: 70, Percent
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
To browse the available commands, enter the following and press Return :
Note
Both types of views are accessible via the views command.
types       These objects cannot be created manually in SAP Datasphere, but may be imported as
contexts    necessary along with remote tables or content packages.
You cannot manage connections, folders, or packages, and you cannot assign modeling objects to folders
or packages with the create and update commands.
You can list all the objects of a particular type in a space and optionally write the output to a file.
Parameter Description
--technical-names [optional] Specify the objects to include in your list by technical name, separated by
<name>[,...] commas.
--output [optional] Enter a path to a .json file to write the output to.
<file>.json
If you do not include this option, the output will be printed to the command line.
--select [optional] Specify the properties to include in your list, separated by commas, from among
"<property>[,...]" the following:
• technicalName
• businessName
• type
• semanticUsage
• status
• createdOn
• changedOn
• deployedOn
• createdBy
• changedBy
• defaultFileName - The .json filename is calculated by combining the
technicalName and the type and has a maximum length of 250 characters.
Default: technicalName
--filter "<property [optional] Specify a filter condition using the standard OData filter syntax. You can use the
operator following comparison operators:
value>[ and| • eq - Equal to
or ...]"
• ne - Not equal to
• gt - Greater than
• lt - Less than
• ge - Greater than or equal to
• le - Less than or equal to
You can combine filter conditions using the and and or keywords and control priority by
grouping conditions with parentheses.
For example, to list only objects that were deployed after 31 December 2022, and have a
semantic usage of Fact or Dimension, enter
--top <n> [optional] List only the first <n> objects, up to a maximum of 200.
Default: 25
For example, to list all the views in space MySpace showing their technical and business names as well as their
semantic usage and status and write them to a file, enter:
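The command itself is not reproduced here. A plausible form is sketched below; the objects views list subcommand name is an assumption (verify with datasphere objects --help), while the --select, --output, and --filter parameters are documented in the tables in this section:

```shell
# Compose the --select list once so it can be reused across calls.
SELECT="technicalName,businessName,semanticUsage,status"
echo "$SELECT"

# Hypothetical invocation (subcommand name is an assumption):
# datasphere objects views list --space MySpace \
#   --select "$SELECT" --output views.json
```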
Property Description
technicalName    Enter an object name. You must use single or double quotes if the name contains spaces.
businessName
    For example, to list only the objects with technical names Table_1 and Table_2, enter:
    --filter "technicalName eq Table_1 or technicalName eq Table_2"
• LocalTable
• RemoteTable
• View
• IntelligentLookup
• ReplicationFlow
• TransformationFlow
• DataFlow
• TaskChain
• ERModel
• AnalyticModel
• DataAccessControl
• BusinessEntity
• FactModel
• ConsumptionModel
• Folder
• Fact
• RelationalDataset
• Dimension
• Hierarchy
• HierarchyWithDirectory
• Text
• AnalyticalDataset
For example, to list only objects with a semantic usage of Dimension, enter:
• NotDeployed
• Deployed
• ChangesToDeploy
• DesignTimeError
• PendingDeployment
For example, to list only objects deployed after 31 December 2022, enter:
changedOn
Read Objects
You can read the CSN/JSON definition of an object and optionally write it to a file.
Parameter Description
--output [optional] Enter a path to a .json file to write the output to.
<file>.json
Enter the option without any value (--output) to write the output to a file with the
defaultFileName for the object.
If you do not include this option, the output will be printed to the command line.
--accept <format> [optional] Specify the format to return the object definition in. You can choose between:
• application/vnd.sap.datasphere.object.content+json - [default] Most complete definition of the
object, which may include undeployed design-time changes and unresolved associations (following
import of the object).
• application/vnd.sap.datasphere.object.content.design-time+json - Current design-time version of
the object, which may include undeployed changes, but excludes any unresolved associations.
• application/vnd.sap.datasphere.object.content.run-time+json - Current run-time version of the
object, which excludes any undeployed design-time changes.
Create Objects
You can create an object by providing a definition in a JSON file or an input string.
Note
For information about obtaining the JSON syntax for your object and other considerations when using the
create and update commands, see Creating and Updating Modeling Objects [page 56].
Parameter Description
--file-path <name> [optional] Enter a path to a file with a .json extension containing your object definition.
--input [optional] Provide your object definition in stringified json format instead of via the --
'<stringified- file-path option.
json>'
Note
We recommend that you:
• Strip unnecessary spaces, tabs, newline, and return characters (\t, \n, and \r).
• Use double-quotes consistently to surround names and values within the JSON
code.
• Surround the whole string in single-quotes.
• Be aware of the escape characters available in your shell environment and when it
is necessary to use them.
--save-anyway [optional] Save the object, even if there are validation warnings.
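The recommendations above for preparing a stringified JSON --input value can be automated; python's json module emits compact, double-quoted JSON with no stray whitespace. The object definition used here is a hypothetical fragment, and the commented invocation assumes the subcommand name:

```shell
# Produce a compact, double-quoted JSON string suitable for --input.
INPUT=$(python3 - <<'EOF'
import json
# Hypothetical object definition fragment, for illustration only.
definition = {"definitions": {"MyTable": {"kind": "entity"}}}
print(json.dumps(definition, separators=(",", ":")))
EOF
)
echo "$INPUT"

# Hypothetical invocation (subcommand name is an assumption) -- note the
# double quotes preserving the stringified JSON as a single argument:
# datasphere objects local-tables create --space MySpace --input "$INPUT"
```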
Update Objects
You can update an object by providing a new definition in a JSON file or an input string.
Note
For information about obtaining the JSON syntax for your object and other considerations when using the
create and update commands, see Creating and Updating Modeling Objects [page 56].
Parameter Description
--file-path <name> [optional] Enter a path to a file with a .json extension containing your object definition.
--input [optional] Provide your object definition in stringified json format instead of via the --
'<stringified- file-path option.
json>'
Note
We recommend that you:
• Strip unnecessary spaces, tabs, newline, and return characters (\t, \n, and \r).
• Use double-quotes consistently to surround names and values within the JSON
code.
• Surround the whole string in single-quotes.
• Be aware of the escape characters available in your shell environment and when it
is necessary to use them.
--save-anyway [optional] Save the object, even if there are validation warnings.
Delete Objects
Parameter Description
--delete-anyway [optional] Delete the object, even if other objects depend on it.
--force [optional] Suppress the Confirm Deletion dialog and delete the object without confirmation.
For example, to delete the local table MyTable in space MySpace, and suppress the Confirm Deletion dialog,
enter:
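The command itself is not reproduced here. A sketch of a guarded delete follows; the subcommand and flag names are assumptions (verify with datasphere objects --help). Because --force suppresses the Confirm Deletion dialog, it is worth echoing the target first:

```shell
# Name the target explicitly before issuing an unconfirmed delete.
SPACE="MySpace"
TABLE="MyTable"
echo "About to delete local table $TABLE in space $SPACE"

# Hypothetical invocation (subcommand and flag names are assumptions):
# datasphere objects local-tables delete --space "$SPACE" \
#   --technical-name "$TABLE" --force
```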
When creating or updating modeling objects, you must respect dependencies between objects and understand
the syntax of the object description files.
When creating (see Create Objects [page 53]) or updating (Update Objects [page 54]) objects, any sources or
other objects that your object depends on must already be present in the space, or your action will fail.
For example, if View_A has two tables (Table_B and Table_C) and another view (View_D) as its sources, then
these three objects must all be present in the space before you can create View_A.
Note
When you export a view (or other object), via CSN export (see Importing and Exporting Objects in CSN/
JSON Files), the exported CSN file contains the view and all its dependencies. CSN files generated in
this way cannot be used directly with the command line. You must provide one CSN file for each object
(excluding any dependencies) and create the sources of the view before creating the view itself.
Local Tables A local table does not have dependencies on any other objects.
Flows A data flow, replication flow, or transformation flow depends on all its sources and its target tables.
Views A view depends on all its sources and any used data access controls.
Intelligent Lookups An intelligent lookup depends on its input and lookup entities.
Analytic Models An analytic model depends on its fact and dimension sources.
Data Access Controls The definition of a data access control contains the definition of its permissions entity. When you
export a data access control, the permissions entity is exported too.
See Securing Data with
Data Access Controls.
Business Entities / A business entity depends on its source data entity and any authorization scenarios.
Business Entity Ver-
sions
Fact Models A fact model depends on all its source fact models and business entities.
Consumption Models A consumption model depends on all its source fact models and business entities.
Note
You cannot manage connections, folders, or packages, and you cannot assign modeling objects to folders
or packages with the create and update commands.
You can obtain detailed information about object syntax by reading an appropriate object from your space. The
principal sections are:
Section Description
editorSettings Contains information about the editor associated with the object including, in the case of graphical
objects, a serialization of the diagram.
sharing [tables and views] Contains a list of the spaces to which the object is shared (see Modifying
Sharing [page 59]).
i18n Contains translations of metadata grouped by language (see Modifying Metadata Translations
[page 59]).
Modifying Sharing
Local tables, remote tables, and views can be shared from their space to other spaces, in which they can be
used as sources for flows and views (see Sharing Entities and Task Chains to Other Spaces).
The sharing section lists the spaces to which the entity is shared. You can modify this section to add or
remove spaces to which the table or view is shared.
In this example, the object, a local table, is shared to two spaces, SALES_EU and SALES_US:
"sharing": {
"LocalTable":[
"targetSpaces": [
"SALES_EU",
"SALES_US"
]
]
}
Business names in your exposed objects can be translated to support display languages in SAP Analytics
Cloud (see Translating Metadata for SAP Analytics Cloud).
The i18n section contains translations of the model or entity business name, and the business names of its
measures and attributes, grouped by language. You can modify this section to edit translations, or add or
remove languages and their translations.
In this example, the local table's business name, and the business name of its two attributes are provided in
English and French:
"i18n": {
"en":[
"[email protected]": "Departments",
"LocalTable#[email protected]": "Department ID",
"LocalTable#[email protected]": "Department Name"
],
"fr":[
"[email protected]": "Départements",
"LocalTable#[email protected]": "Numéro de département",
"LocalTable#[email protected]": "Nom du département"
]
}
Certain objects can be generated in SAP Datasphere. These objects can be modified in only limited ways:
Users with a modeler role can use the datasphere command line interface to manage the Data Marketplace.
Note
The SAP Datasphere command line interface module has been renamed from dwc to datasphere. The
command dwc has been decommissioned at the end of 2023: please use the new datasphere command
instead. For more information, see https://www.npmjs.com/package/@sap/datasphere-cli .
• Manage Data Marketplace Data Providers via the Command Line [page 60]
• Manage Data Marketplace Data Products via the Command Line [page 79]
• Manage Data Marketplace Licenses via the Command Line [page 106]
• Manage Data Marketplace Releases via the Command Line [page 115]
• Manage Data Marketplace Contexts via the Command Line [page 120]
You can use the SAP Datasphere command line interface, datasphere, to manage Data Marketplace data
providers.
Note
The SAP Datasphere command line interface module has been renamed from dwc to datasphere. The
command dwc has been decommissioned at the end of 2023: please use the new datasphere command
instead. For more information, see https://www.npmjs.com/package/@sap/datasphere-cli .
You can easily create new data providers or update their properties in batch mode, for example to update
contact email addresses or marketplace visibility. You can also list all data providers that a
user has access to.
The following sections, each representing one command, are available in this topic:
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Modeler or equivalent privileges (see Command Line Roles and Privileges [page 13]).
To browse the available commands, enter the following and press Return :
To get access to a data provider, the user needs to be a member of the data provider profile. If a
contentAggregatorID is given as a parameter, only data provider profiles that are managed by that content
aggregator are returned.
Returns a simplified list of all data providers that the current user has access to.
If you want a detailed representation of a certain data provider, use the read command.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Choices:
• application/vnd.sap.marketplace.providers.list+json
(default)
• application/vnd.sap.marketplace.providers.details+json
Reads the metadata of a certain data provider after specifying its UUID or its content aggregator UUID.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Creates a new data provider or a new managed data provider (managed by a content aggregator) based on
a configuration following the data provider definition file format and stored as a .json file. See The Data
Provider Definition File Format [page 66] for more information.
Specify the full path to the input .json file, for example C:\temp\mydataproviderdefinition.json.
To create a managed data provider specify the contentAggregatorID in the request body.
Note
Managed providers can only be created if you are a member of the assigned content aggregator profile.
Note
If you want to update specific properties only, use the update command.
Updates only those properties of the specified data provider that are defined in the provided data provider
definition file.
Parameter Description
application/vnd.sap.marketplace.providers.generate-
keys+json (default)
Returns a list of all the existing activation keys for a specified data provider.
Parameter Description
application/vnd.sap.marketplace.providers.provider-
keys+json (default)
Parameter Description
application/vnd.sap.marketplace.providers.provider-
keys.mass-delete+json (default)
The properties of a data provider definition are set and retrieved in the data provider definition file format and
stored as a .json file.
A data provider definition file must not exceed 25MB, and can contain the following information:
Users with the DW Modeler role can create data providers and set any data provider properties (see Manage
Data Marketplace Data Providers via the Command Line [page 60]) using the following syntax:
Sample Code
{
"name": "<string>",
"contentAggregatorProviderID": "<string>",
"logo": "<string>",
"description": "<string>",
"homepageUrl": "<string>",
"linkedinUrl": "<string>",
"regionalCoverages": [
"<string>",
"<string>", ...
],
"dataCategories": [
"<string>",
"<string>", ...
],
"industries": [
"<string>",
"<string>", ...
],
"sapApplications": [
"<string>",
<regionalCoverages> Regional Coverage String values for each region the data
is applicable to. Multiple values can be
specified.
[ C001 - Benchmarking
Data, C011 - Company
Data, C021 - Countries,
Regions & Cities Data,
C031 - Culture & Sports
Data, C041 - Environmental
& Weather Data, C051
- Finance & Economy
Data, C061 - Geospatial
Data, C071 - Health
Data, C081 - Hospitality,
Travel & Tourism Data,
C091 - Industry-Specific
Data, C101 - Innovation
& Trend Data, C111 -
Legal & Justice Data,
C121 - Market & Consumer
Data, C131 - Media &
Entertainment Data, C141
- Mobility Data, C151 -
Natural Resources & Energy
Data, C161 - Political
Data, C171 - Product &
Services Data, C181 -
Public Sector & Society
Data, C191 - Science
& Technology Data, C201
- Social Media, News &
Communication Data, C206 -
Sustainability Data, C211
- Transport & Logistics
Data, C221 - Web, IoT &
Device Data, C231 - Other
Data Categories ]
Tech, Discrete
Industries:Industrial
Machinery and Components,
Service Industries,
Service
Industries:Airlines,
Service
Industries:Engineering,
Construction, and
Operations, Service
Industries:Media, Service
Industries:Professional
Services, Service
Industries:Railways,
Service Industries:Sports
& Entertainment, Service
Industries:Telecommunications, Service
Industries:Travel and
Transportation, Public
Services, Public
Services:Defense and
Security, Public
Services:Future Cities,
Public
Services:Healthcare,
Public Services:Higher
Education and Research,
Public Services:Public
Sector ]
Allowed values for <sapApplications>:
[ HR & People Engagement, HR & People Engagement:Employee in HR and People Engagements, HR & People Engagement:Employee Experience Management, HR & People Engagement:Core HR and Payroll, HR & People Engagement:Talent Management, HR & People Engagement:HR Analytics and Workforce Planning, CRM and Customer Experience, CRM and Customer Experience:Customer Data, CRM and Customer Experience:Marketing, CRM and Customer Experience:Commerce, CRM and Customer Experience:Sales, CRM and Customer Experience:Service, ERP & Finance, ERP & Finance:SAP S/4HANA, ERP & Finance:ERP for Small and Midsize Enterprises, ERP & Finance:Financial Planning and Analysis, ERP & Finance:Accounting and Financial Close, ERP & Finance:Treasury Management, ERP & Finance:Accounts Receivable, Billing & Revenue Management, ERP & Finance:Cybersecurity, Governance, Risk & Compliance, Network & Spend Management, Network & Spend Management:Supplier Management, Network & Spend Management:Strategic Sourcing, Network & Spend Management:Procurement, Network & Spend Management:Services Procurement and Contingent Workforce, Network & Spend Management:Selling and Fulfillment, Network & Spend Management:Travel and Expense, Business Technology Platform, Business Technology Platform:Database and Data Management, Business Technology Platform:Application Development and Integration, Business Technology Platform:Analytics, Business Technology Platform:Intelligent Technologies, Digital Supply Chain, Digital Supply Chain:Supply Chain Planning, Digital Supply Chain:Supply Chain Logistics, Digital Supply Chain:Manufacturing, Digital Supply Chain:R&D / Engineering, Digital Supply Chain:Asset Management, Experience Management, Experience Management:Brand Experience, Experience Management:Customer Experience, Experience Management:Product Experience, Experience Management:Employee Experience ]
<shipments> Data Shipment The shipment types which the data provider supports:
• <Direct>
• <External>
• <OpenSql>
For example, the following file will create a new data provider definition:
Sample Code
{
"name": "Example Provider",
"contentAggregatorProviderID": "string",
"logo": "image string",
"description": "Lorem Ipsum description of my Provider",
"homepageUrl": "www.mydataprovidercompany.com",
"linkedinUrl": "www.linkedin.com/mydataprovidercompany",
"regionalCoverages": [
"Germany",
"France"
],
"dataCategories": [
"C001",
"C031"
],
"industries": [
"Financial Services",
"Energy and Natural Resources"
],
"sapApplications": [
"HR & People Engagement",
"ERP & Finance"
],
"contactEmail": "[email protected]",
"sapEmail": "[email protected]",
"country_code": "DE",
"zipCode": "12345",
"city": "Walldorf",
Use the .json file format to store the data provider definition. The file is needed for the upload that is part of the create command, for example.
You can find more information on data provider profiles in the Data Provider's Guide under Maintaining your
Data Provider Profile.
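Before uploading, it can help to sanity-check the definition file locally. The following sketch is purely illustrative: the field names come from the template above, the check is not the CLI's own validation, and which fields are actually mandatory is decided by the service on upload.

```python
import json

# Fields shown in the data provider definition template above.
# Illustrative pre-flight check only; the CLI validates the real file on upload.
EXPECTED_FIELDS = {"name", "description", "homepageUrl", "regionalCoverages",
                   "dataCategories", "industries", "sapApplications"}

def check_provider_definition(path):
    """Parse a data provider definition .json file and report missing fields."""
    with open(path, encoding="utf-8") as f:
        definition = json.load(f)      # raises ValueError if the file is not valid JSON
    missing = EXPECTED_FIELDS - definition.keys()
    return sorted(missing)

# Example: a minimal definition that is missing several fields.
sample = {"name": "Example Provider", "description": "..."}
with open("provider.json", "w", encoding="utf-8") as f:
    json.dump(sample, f)
print(check_provider_definition("provider.json"))
```

A check like this catches malformed JSON and obvious omissions before the file is passed to the create command.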
You can use the SAP Datasphere command line interface, datasphere, to manage and orchestrate Data
Marketplace data products.
Note
The SAP Datasphere command line interface module has been renamed from dwc to datasphere. The
command dwc has been decommissioned at the end of 2023: please use the new datasphere command
instead. For more information, see https://www.npmjs.com/package/@sap/datasphere-cli .
You can create new data products in mass operations, or update properties, such as pricing information,
publishing status, or context assignments. It is also possible to list all data products that belong to a certain
data provider and delete data products.
Note
To create new releases for data products, you must use the Data Sharing Cockpit (see The Data Sharing
Cockpit). This operation is not available via the command line.
To manage Data Marketplace data products via the command line, you must:
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Modeler or equivalent privileges (see Command Line Roles and Privileges [page 13]).
To browse the available commands, enter the following and press Return:
In environments where you are managing data products for multiple providers, you can use the following
command:
Returns a simple list of all existing data products assigned to your user. All data products of all data providers and content aggregators that you are a member of are listed.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Choices:
• application/vnd.sap.marketplace.providers.list+json (default)
• application/vnd.sap.marketplace.providers.details+json
Lists the properties of a single data product. You need to specify the data product's UUID.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Creates a new data product based on a configuration following the data product definition file format and
stored as a .json file. See The Data Product Definition File Format [page 88] for more information.
Specify the full path to the input .json file, for example C:\temp\mydataproductdefinition.json.
The new data product is created in status Draft. You can change the status with the command change-lifecycle-status.
Overwrites all properties of the specified data product with the provided data in the data product definition file.
Parameter Description
Specify the full path to the input .json file, for example C:\temp\mydataproductdefinition.json.
Note
If you want to update specific properties only, use the update command.
Updates only selected properties of the specified data product which are defined in the provided data product
definition file.
Parameter Description
Specify the full path to the input .json file, for example C:\temp\mydataproductdefinition.json.
With this command you can change the lifecycle status of a data product. A newly created data product is automatically set to status Draft. You can change the status to:
• Listed
• Delisted
• Deactivated
Parameter Description
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Choices:
• application/vnd.sap.marketplace.providers.list+json (default)
• application/vnd.sap.marketplace.providers.details+json
Lists the properties of a single data product. You need to specify the technical ID and data provider UUID or
content aggregator UUID.
--output <output> Specifies the file in which the output of the command is
stored (optional).
Creates a new data product for a specified data provider based on a configuration following the data product
definition file format and stored as a .json file. See The Data Product Definition File Format [page 88] for
more information.
Specify the full path to the input .json file, for example C:\temp\mydataproductdefinition.json.
The new data product is created in status Draft. You can change the status with the command change-lifecycle-status.
Parameter Description
Parameter Description
Overwrites all properties of the specified data product with the provided data in the data product definition file.
Note
If you want to update specific properties only, use the update command.
Parameter Description
Updates only selected properties of the specified data product which are defined in the provided data product
definition file.
Parameter Description
With this command you can change the lifecycle status of a data product. A newly created data product is automatically set to status Draft. You can change the status to:
• Listed
• Delisted
• Deactivated
Parameter Description
Properties of a data product are set and retrieved in the data product definition file format and stored as a .json file.
A data product definition file must not exceed 25MB, and can contain the following information:
Users with the DW Modeler role can create data products and set any data product properties (see Manage
Data Marketplace Data Products via the Command Line [page 79]) using the following syntax:
Sample Code
{
"dataProviderProductID": "<string>",
"contentAggregatorProductID": "<string>",
"name": "<string>",
"description": "<string>",
"space": "<string>",
"pricingModel": "OneTime|Monthly",
"pricingDescription": "<string>",
"price": "<number>",
"pricePerMonth": "<number>",
"priceCurrencyCode": "<string>",
"licenseKeyUrl": "<URL>",
"regionalCoverages": [
"<string>",
"<string>", ...
],
"dataCategories": [
"<string>",
"<string>", ...
],
"industries": [
"<string>",
<dataProviderProductID> not set manually in Data Sharing Cockpit The unique ID of the given product in the system of the data provider.
<contentAggregatorProductID> not set manually in Data Sharing Cockpit The unique ID of the given product in the system of the content aggregator (if the corresponding data provider is managed by a content aggregator).
<pricingModel> Pricing Model Select the pricing model for your data product: One Time or Monthly. This setting is only relevant for contract type LicenseKey.
<licenseKeyUrl> URL for License Key Purchase Enter the URL to the shop of the data provider where consumers can purchase licenses for the data product.
<regionalCoverages> Regional Coverage String values for each region the data is applicable to. Multiple values can be specified.
Ethiopia, Falkland Islands (Malvinas), Faroe Islands, Fiji, Finland, France, French Guiana, French Polynesia, French Southern Territories, Gabon, Gambia, Georgia, Germany, Ghana, Gibraltar, Greece, Greenland, Grenada, Guadeloupe, Guam, Guatemala, Guernsey, Guinea, Guinea-Bissau, Guyana, Haiti, Heard and Mc Donald Islands, Holy See (Vatican City State), Honduras, Hong Kong, Hungary, Iceland, India, Indonesia, Iran, Islamic Republic of, Iraq, Ireland, Isle of Man, Israel, Italy, Jamaica, Japan, Jersey, Jordan, Kazakstan, Kenya, Kiribati, Korea, Democratic People's Republic of, Korea, Republic of, Kosovo (temporary code), Kuwait, Kyrgyzstan, Lao, People's Democratic Republic, Latvia, Lebanon, Lesotho, Liberia, Libyan Arab Jamahiriya, Liechtenstein, Lithuania, Luxembourg, Macao, Macedonia, Madagascar, Malawi, Malaysia, Maldives, Mali, Malta, Marshall Islands, Martinique, Mauritania, Mauritius, Mayotte, Mexico, Micronesia, Federated States of, Moldova, Republic of, Monaco, Mongolia, Montenegro, Montserrat, Morocco, Mozambique, Myanmar, Namibia, Nauru, Nepal, Netherlands, Netherlands Antilles, New Caledonia, New Zealand, Nicaragua, Niger, Nigeria, Niue, Norfolk Island, Northern Mariana Islands, Norway, Oman, Pakistan, Palau, Palestinian Territory, Occupied, Panama, Papua New Guinea, Paraguay, Peru, Philippines, Pitcairn, Poland, Portugal, Puerto Rico, Qatar, Republic of Serbia, Reunion, Romania, Russia Federation, Rwanda, Saint Barthélemy, Saint Helena, Saint Kitts & Nevis, Saint Lucia, Saint Martin, Saint Pierre and Miquelon, Saint Vincent and the Grenadines, Samoa, San Marino, Sao Tome and Principe, Saudi Arabia, Senegal, Serbia and Montenegro, Seychelles, Sierra Leone, Singapore, Sint Maarten, Slovakia, Slovenia, Solomon Islands, Somalia, South Africa, South Georgia & The South Sandwich Islands, South Sudan, Spain, Sri Lanka, Sudan, Suriname, Svalbard and Jan Mayen, Swaziland, Sweden, Switzerland, Syrian Arab
Allowed values for <dataCategories>:
[ C001 - Benchmarking Data, C011 - Company Data, C021 - Countries, Regions & Cities Data, C031 - Culture & Sports Data, C041 - Environmental & Weather Data, C051 - Finance & Economy Data, C061 - Geospatial Data, C071 - Health Data, C081 - Hospitality, Travel & Tourism Data, C091 - Industry-Specific Data, C101 - Innovation & Trend Data, C111 - Legal & Justice Data, C121 - Market & Consumer Data, C131 - Media & Entertainment Data, C141 - Mobility Data, C151 - Natural Resources & Energy Data, C161 - Political Data, C171 - Product & Services Data, C181 - Public Sector & Society Data, C191 - Science & Technology Data, C201 - Social Media, News & Communication Data, C206 - Sustainability Data, C211 - Transport & Logistics Data, C221 - Web, IoT & Device Data, C231 - Other Data Categories ]
Allowed values for <industries>:
Industries:Industrial Machinery and Components, Service Industries, Service Industries:Airlines, Service Industries:Engineering, Construction, and Operations, Service Industries:Media, Service Industries:Professional Services, Service Industries:Railways, Service Industries:Sports & Entertainment, Service Industries:Telecommunications, Service Industries:Travel and Transportation, Public Services, Public Services:Defense and Security, Public Services:Future Cities, Public Services:Healthcare, Public Services:Higher Education and Research, Public Services:Public Sector ]
Allowed values for <sapApplications>:
[ HR & People Engagement, HR & People Engagement:Employee in HR and People Engagements, HR & People Engagement:Employee Experience Management, HR & People Engagement:Core HR and Payroll, HR & People Engagement:Talent Management, HR & People Engagement:HR Analytics and Workforce Planning, CRM and Customer Experience, CRM and Customer Experience:Customer Data, CRM and Customer Experience:Marketing, CRM and Customer Experience:Commerce, CRM and Customer Experience:Sales, CRM and Customer Experience:Service, ERP & Finance, ERP & Finance:SAP S/4HANA, ERP & Finance:ERP for Small and Midsize Enterprises, ERP & Finance:Financial Planning and Analysis, ERP & Finance:Accounting and Financial Close, ERP & Finance:Treasury Management, ERP & Finance:Accounts Receivable, Billing & Revenue Management, ERP & Finance:Cybersecurity, Governance, Risk & Compliance, Network & Spend Management, Network & Spend Management:Supplier Management, Network & Spend Management:Strategic Sourcing, Network & Spend Management:Procurement, Network & Spend Management:Services Procurement and Contingent Workforce, Network & Spend Management:Selling and Fulfillment, Network & Spend Management:Travel and Expense, Business Technology Platform, Business Technology Platform:Database and Data Management, Business Technology Platform:Application Development and Integration, Business Technology Platform:Analytics, Business Technology Platform:Intelligent Technologies, Digital Supply Chain, Digital Supply Chain:Supply Chain Planning, Digital Supply Chain:Supply Chain Logistics, Digital Supply Chain:Manufacturing, Digital Supply Chain:R&D / Engineering, Digital Supply Chain:Asset Management, Experience Management, Experience Management:Brand Experience, Experience Management:Customer Experience, Experience Management:Product Experience, Experience Management:Employee Experience ]
<shipments> Data Shipment Specify the shipment types you are offering. These shipment types are used if a customer filters offerings for a distinct shipment type. It is particularly important if the data provider has no products listed yet, as only the shipment types given here are used for the search.
• Direct
The data is copied directly into the space that the consumers select.
• External
The data is delivered by sharing files outside SAP Datasphere.
• OpenSql
The consumers need to provide an Open SQL Schema to you, using the Data Inbox. Once activated, the data is accessible through the Data Builder and the provided schema appears as a source.
String Array
Note
If you create a data product in full delivery using the CLI, you cannot create a release with the CLI. Make sure to manually create a release in the Publishing Management section of the Data Sharing Cockpit. A release is mandatory, for instance, if you want to list your data product.
<deliveryPatternDescription> Delivery Pattern Description Free text that provides additional information on when exactly a customer can expect an update of the product data.
For example, the following code snippet defines a new data product:
Sample Code
{
"dataProviderProductID": "My_Data_Product",
"contentAggregatorProductID": "CW_4711",
"name": "Sales Sample Data for SAP",
"description": "Lorem Ipsum description of my Provider",
"space": "SAMPLE_SPACE",
"pricingModel": "OneTime",
"pricingDescription": "string",
"price": 599.99,
"pricePerMonth": 9.99,
"priceCurrencyCode": "string",
"licenseKeyUrl": "http://mysapdatalicenseshop.com",
"regionalCoverages": [
"Germany",
"France"
],
"dataCategories": [
"C001",
"C031"
],
"industries": [
"Financial Services",
"Energy and Natural Resources"
],
"sapApplications": [
"HR & People Engagement",
"ERP & Finance"
],
"shipments": [
"OpenSql"
],
"productArtifacts": [
{
"name": "My_View",
"dataFilter": "country = 'france' and year = '2019'",
"columns": [
{
"name": "string"
}
]
}
],
"dataDocumentation": [
{
"name": "My_DataDocumentation_File.pdf",
"description": "My data documentation file",
"blobData": "string",
Use the .json file format to store the data product definition. The file is needed for the upload that is part of the create command, for example.
You can find more information on creating data products in the Data Provider's Guide under Creating a Data
Product.
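As a minimal sketch, a definition file for the create command can also be produced programmatically. The field names below come from the template above; the values are placeholders, and which fields the service treats as mandatory is not specified here:

```python
import json

# Minimal data product definition using fields from the template above.
# Values are placeholders; the CLI validates the real file on upload.
definition = {
    "dataProviderProductID": "My_Data_Product",
    "name": "Sales Sample Data for SAP",
    "description": "Sample sales data",
    "space": "SAMPLE_SPACE",
    "pricingModel": "OneTime",   # "OneTime" or "Monthly"
    "shipments": ["OpenSql"],    # "Direct", "External", or "OpenSql"
}

# Write the definition; the full path to this file is then passed to the
# create command.
with open("mydataproductdefinition.json", "w", encoding="utf-8") as f:
    json.dump(definition, f, indent=2)
```

Generating the file this way is convenient for the mass operations mentioned above, since one script can emit many definition files.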
Manage your licenses and generate activation keys for consumers to ensure compliant access to your data
products. For more information about Data Marketplace licenses, see Managing Licenses.
To manage licenses via the command line, you must:
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Modeler or equivalent privileges (see Command Line Roles and Privileges [page 13]).
The following sections, each representing one command, are available in this topic:
Returns a simplified list of all the licenses available for your data provider profile.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
The new license is created in status Draft. You can change the status with the command change-lifecycle-status.
--filePath <license file> Enter a path to a file with a .json extension containing your
license definition.
Lists the properties of a license. You must specify the license UUID.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Parameter Description
--filePath <license file> Enter a path to a file with a .json extension containing your
license definition.
Parameter Description
--filePath <license file> Enter a path to a file with a .json extension containing your
license definition.
If you want to update specific properties only, use the update command.
Parameter Description
Changes the lifecycle status of a license from Draft to Active, or vice versa. A newly created license is
automatically set to status Draft.
Select the Active (or Draft) status, then specify the license UUID.
Parameter Description
Returns a list of all the existing activation keys for a specified license.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
application/vnd.sap.marketplace.licenses.keys+json (default)
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
application/vnd.sap.marketplace.licenses.keys.mass-delete+json (default)
--output <output> Specifies the file in which the output of the command is
stored (optional).
--filePath <license file> Enter a path to a file with a .json extension containing your
license definition.
Parameter Description
application/vnd.sap.marketplace.licenses.products+json
(default)
Parameter Description
application/vnd.sap.marketplace.licenses.products.mass-delete+json (default)
When a consumer installs the product for the first time, the marketplace service verifies the user account's
eligibility based on the registered email address. It then generates a key and activates the data product in the
background. This allows the consumer to immediately install a data product that requires a license key, without
needing to enter the license key.
--filePath <user-file> Enter a path to a file with a .json extension containing the
required information for the users you want to assign (email
address, keyUUID, tenant URL).
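The eligibility check described above can be pictured as a simple match between the domain of the consumer's registered email address and the domains list of the license definition. This is a conceptual sketch of the idea only; the actual server-side check performed by the marketplace service is not documented here:

```python
def is_eligible(email, license_domains):
    """Conceptual sketch: an email address is considered eligible if its
    domain appears in the license's domains list. The real marketplace
    service performs its own check when the consumer first installs the
    product."""
    domain = email.rsplit("@", 1)[-1].lower()
    return domain in {d.lower() for d in license_domains}

print(is_eligible("user@example.com", ["example.com", "anotherexample.com"]))  # True
print(is_eligible("user@other.org", ["example.com"]))                          # False
```

The comparison is case-insensitive here because DNS domains are case-insensitive; whether the service normalizes case the same way is an assumption.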
The properties of a license definition are set and retrieved in the license definition file format and stored as a .json file.
A license definition file must not exceed 25MB, and can contain the following information:
License Properties
Users with the DW Modeler role can create licenses and set any license properties (see Manage Data
Marketplace Licenses via the Command Line [page 106]) using the following syntax:
Sample Code
{
"providerUUID": "<string>",
"reference": "<string>",
"company": "<string>",
"domains": ["<string>", "<string>", ...],
"validUntil": "<string>",
"scope": ["<string>", "<string>", ...]
}
<validUntil> Valid Until Enter the last day of the validity of the license (string in date-time format).
<scope> License Scope Select the data products that you want to include in the license scope (array of strings, each string representing a product UUID to be included in the license scope).
Sample Code
{
"providerUUID": "66c9b1e6-1d84-4f1d-8f98-1c1e26f54b8d",
"reference": "Example Reference",
"company": "Example Company",
"domains": ["example.com", "anotherexample.com"],
"validUntil": "2023-02-16T16:20:20.698Z",
"scope": ["dcccaddb-e343-4830-b0e3-8a0233309c24", "77d0a15d-a68b-4e6d-b7ea-262b16b03cce"]
}
Use the .json file format to store the license definition. The file is needed for the upload that is part of the create command, for example.
You can find more information on licenses in the Data Provider's Guide under Managing Licenses.
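Before uploading, a license definition can be checked locally against the constraints described above: the 25 MB file size limit, UUID-valued providerUUID and scope entries, and a date-time validUntil string. The checks below are an illustrative sketch, not the CLI's own validation:

```python
import json
import os
import uuid
from datetime import datetime

def check_license_definition(path):
    """Illustrative pre-flight checks for a license definition file,
    based on the field descriptions above (not the CLI's validation)."""
    assert os.path.getsize(path) <= 25 * 1024 * 1024, "file exceeds 25MB"
    with open(path, encoding="utf-8") as f:
        lic = json.load(f)
    uuid.UUID(lic["providerUUID"])              # must parse as a UUID
    for product_uuid in lic["scope"]:           # each scope entry is a product UUID
        uuid.UUID(product_uuid)
    # validUntil is a string in date-time format (ISO 8601 with a Z suffix)
    datetime.fromisoformat(lic["validUntil"].replace("Z", "+00:00"))
    return True

sample = {
    "providerUUID": "66c9b1e6-1d84-4f1d-8f98-1c1e26f54b8d",
    "reference": "Example Reference",
    "company": "Example Company",
    "domains": ["example.com"],
    "validUntil": "2023-02-16T16:20:20.698Z",
    "scope": ["dcccaddb-e343-4830-b0e3-8a0233309c24"],
}
with open("license.json", "w", encoding="utf-8") as f:
    json.dump(sample, f)
print(check_license_definition("license.json"))
```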
You can use the SAP Datasphere command line interface, datasphere, to publish and manage Data
Marketplace data product releases.
The publishing management within Data Marketplace allows controlled data updates and shipment of these
updates to your consumers. For each data product update, you create a new release. For more information
about Data Marketplace releases, see Publishing New Releases.
To manage releases via the command line, you must:
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Modeler or equivalent privileges (see Command Line Roles and Privileges [page 13]).
The following sections, each representing one command, are available in this topic:
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Parameter Description
--filePath <release file> Enter a path to a file with a .json extension containing your
release definition.
Lists the properties of a release. You must specify the release UUID.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
You must specify the data product UUID and the release UUID.
Parameter Description
--filePath <release file> Enter a path to a file with a .json extension containing your
release definition.
Parameter Description
--filePath <release file> Enter a path to a file with a .json extension containing your
release definition.
Note
If you want to update specific properties only, use the update command.
Parameter Description
Parameter Description
Parameter Description
Parameter Description
Parameter Description
The properties of a release definition are set and retrieved in the release definition file format and stored as a .json file.
A release definition file must not exceed 25MB, and can contain the following information:
Release Properties
Users with the DW Modeler role can create releases and set any release properties (see Manage Data
Marketplace Releases via the Command Line [page 115]) using the following syntax:
Sample Code
{
"dataContained": "<string>",
"comment": "<string>",
"from": "<string>",
"to": "<string>",
"isLocked": "<boolean>",
"isPublished": "<boolean>"
}
<to> Date Range Enter the end date of the release (string in date-time format).
Sample Code
{
"dataContained": "Data Contained Description",
"comment": "Release Comment",
"from": "2023-02-16T16:20:20.698Z",
"to": "2023-02-16T16:20:20.698Z",
"isLocked": false,
"isPublished": false
}
Use the .json file format to store the release definition. The file is needed for the upload that is part of the create command, for example.
You can find more information on releases in the Data Provider's Guide under Publishing New Releases.
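As a sketch, the from/to date range and the two boolean flags can be generated in the formats shown in the sample above. The field names come from the template; treating the timestamps as millisecond-precision ISO 8601 strings with a Z suffix mirrors the sample value and is an assumption beyond that:

```python
import json
from datetime import datetime, timedelta, timezone

# Build a release definition using the fields from the template above.
# The date-time strings follow the sample's format (milliseconds + "Z").
start = datetime(2023, 2, 16, 16, 20, 20, tzinfo=timezone.utc)
release = {
    "dataContained": "Data Contained Description",
    "comment": "Release Comment",
    "from": start.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z",
    "to": (start + timedelta(days=30)).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z",
    "isLocked": False,       # JSON booleans, not the strings "true"/"false"
    "isPublished": False,
}
with open("release.json", "w", encoding="utf-8") as f:
    json.dump(release, f, indent=2)
print(release["from"])  # 2023-02-16T16:20:20.000Z
```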
You can use the SAP Datasphere command line interface, datasphere, to create and manage Data
Marketplace contexts.
Use contexts to realize private or internal data marketplaces by restricting the visibility of your data provider
profile and your data products to selected users only. For more information about Data Marketplace contexts,
see Using Contexts to Realize Public, Private, and Internal Data Marketplaces.
To manage contexts via the command line, you must:
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Modeler or equivalent privileges (see Command Line Roles and Privileges [page 13]).
The following sections, each representing one command, are available in this topic:
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Choices:
• application/vnd.sap.marketplace.providers.list+json (default)
• application/vnd.sap.marketplace.providers.details+json
Specify the full path to the input .json file, for example C:\temp\mycontextdefinition.json.
The new context is created in status Draft. You can activate the context with the command datasphere marketplace context-by-provider change-lifecycle-status.
--filePath <context file> Enter a path to a file with a .json extension containing your
context definition.
Lists the properties of a context for a specific data provider. You must specify the context UUID and the data
provider UUID.
Parameter Description
--output <output> Specifies the file in which the output of the command is
stored (optional).
Parameter Description
--filePath <context file> Enter a path to a file with a .json extension containing your
context definition.
Overwrites all properties of the specified context with the provided data in the context definition file.
Parameter Description
Changes the lifecycle status of a context to Active. A newly created context is automatically set to status Draft:
before consumers can use their activation keys to join a context in the Data Marketplace, you have to activate
the context.
Parameter Description
With this command you join an existing context. After joining you can see all data products listed in this
context.
With this command you leave a context. If you leave a context, you won't be able to see the data products listed
in this context anymore.
Parameter Description
Returns a list of all the existing activation keys for a specified context.
Parameter Description
Parameter Description
Parameter Description
application/vnd.sap.marketplace.contexts.mass-delete+json (default)
The properties of a context definition are set and retrieved in the context definition file format and stored as a .json file.
A context definition file must not exceed 25MB, and can contain the following information:
Context Properties
Users with the DW Modeler role can create contexts and set any context properties (see Manage Data
Marketplace Contexts via the Command Line [page 120]) using the following syntax:
Sample Code
{
"contextOwnerUUID": "<string>",
"contextUUID": "<string>",
"contextType": "<string>",
"contextName": "<string>",
"description": "<string>",
"logo": {
"name": "<string>",
"blobData": "<string>",
"mimeType": "<string>"
},
"status": "<string>",
"domains": ["<string>", "<string>", ...]
}
Sample Code
{
"contextOwnerUUID": "66c9b1e6-1d84-4f1d-8f98-1c1e26f54b8d",
"contextUUID": "b7e4e6a3-4e1c-4e8f-a7a8-6e1e26f54b8d",
"contextType": "PrivateDataExchange",
"contextName": "My Context",
"description": "My Context Description",
"logo": {
"name": "My_file.jpg",
"blobData": "YXNkYXNkYXNkc2RhZA==",
"mimeType": "image/jpeg"
},
"status": "Draft",
"domains": ["example.com", "anotherexample.com"]
}
Use the .json file format to store the context definition. The file is needed for the upload that is part of the create command, for example.
You can find more information on contexts in the Data Provider's Guide under Using Contexts to Realize Public,
Private, and Internal Data Marketplaces.
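The blobData value in the logo object appears to be Base64-encoded file content (the sample value above decodes cleanly as Base64). Under that assumption, a logo entry for a context definition could be prepared like this:

```python
import base64

def make_logo_entry(file_name, image_bytes, mime_type):
    """Build the logo object for a context definition file.
    Assumption: blobData is the Base64 encoding of the raw image bytes,
    as suggested by the sample value in the documentation."""
    return {
        "name": file_name,
        "blobData": base64.b64encode(image_bytes).decode("ascii"),
        "mimeType": mime_type,
    }

# Encoding the bytes b"asdasdasdsdad" reproduces the sample's blobData value.
logo = make_logo_entry("My_file.jpg", b"asdasdasdsdad", "image/jpeg")
print(logo["blobData"])  # YXNkYXNkYXNkc2RhZA==
```

In practice you would read the real image file in binary mode and pass its bytes to the helper.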
1.6 Manage Tasks and Task Chains via the Command Line
Users with an integrator role can use the datasphere command line interface to control and manage tasks
and task chains.
A collection of SAP Datasphere CLI commands that can be used to control and manage tasks and task chains.
The basic CLI command to manage tasks and task chains is datasphere tasks. There are three subcategories of task and task chain commands you can use: Consent, Application, and Logs.
Consent Commands
• datasphere tasks consent get - retrieves information on whether consent has been given or has expired.
• datasphere tasks consent give - provides consent to perform tasks and task chains.
• datasphere tasks consent revoke - revokes consent to perform tasks and task chains.
Command Description
Application Commands
• datasphere tasks chains run - runs a specified task chain in a given space.
Command Description
• datasphere tasks logs get - returns log information about a specific task chain run.
• datasphere tasks logs list - returns log information about all runs of a specific task chain.
Command Description
Users with an administrator role can use the datasphere command line interface to manage TLS server
certificates. Users with an integrator role can list, read, validate, and delete connections and can, additionally,
create and edit SAP SuccessFactors connections.
Creating and editing connections via the command line is supported for SAP SuccessFactors connections only.
Users with a DW Administrator role (or equivalent privileges) can list, upload, and delete TLS server certificates
via the command line.
• Working with TLS Server Certificates on the Command Line [page 133]
• List TLS Certificates [page 133]
• Upload TLS Certificates [page 133]
• Delete TLS Certificates [page 134]
To work with TLS server certificates on the command line, you must:
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Administrator or equivalent privileges (see Command Line Roles and Privileges [page
13]).
To browse the available commands, enter the following and press Return:
For information about managing certificates in SAP Datasphere, see Manage Certificates for Connections.
You can list all TLS server certificates uploaded to SAP Datasphere.
Parameter Description
--description <description> Enter a description to provide intelligible information on the certificate, for example to point out to which connection type the certificate applies.
--file-path <path> [optional] Enter a path to a certificate file with a supported file extension - .pem (privacy-enhanced mail), .crt, or .cer.
--input <input> [optional] Provide your certificate definition in stringified json format instead of via the --file-path option.
For example, to upload a TLS Server certificate for SAP SuccessFactors connections, enter:
Parameter Description
--fingerprint <fingerprint> Enter the fingerprint of the certificate that you want to delete. You can retrieve the fingerprint via the command datasphere configuration certificates list.
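A certificate fingerprint is conventionally a digest of the certificate's DER bytes. The sketch below computes one from a PEM file using only the standard library; which digest algorithm SAP Datasphere displays is an assumption here, so compare the format against the output of datasphere configuration certificates list:

```python
import base64
import hashlib

def pem_fingerprint(pem_text, algorithm="sha256"):
    """Compute a colon-separated hex fingerprint of the first certificate
    in a PEM string. The digest algorithm SAP Datasphere uses is an
    assumption here (fingerprints are conventionally SHA-1 or SHA-256
    of the DER-encoded certificate)."""
    body = pem_text.split("-----BEGIN CERTIFICATE-----")[1]
    body = body.split("-----END CERTIFICATE-----")[0]
    der = base64.b64decode("".join(body.split()))   # PEM body -> DER bytes
    digest = hashlib.new(algorithm, der).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

# Tiny fake payload just to show the output format (not a real certificate).
fake_pem = ("-----BEGIN CERTIFICATE-----\n"
            + base64.b64encode(b"not a real certificate").decode()
            + "\n-----END CERTIFICATE-----\n")
print(pem_fingerprint(fake_pem))
```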
Users with a DW Integrator role (or equivalent privileges) can list, validate, and delete connections and read connection details via the command line. Additionally, they can create and edit SAP SuccessFactors connections.
• Install the datasphere command line interface (see Install or Update the SAP Datasphere Command Line
Interface [page 5]).
• Log in to your SAP Datasphere tenant (see Log into the Command Line Interface via an OAuth Client [page
6]).
• Have a role of DW Integrator or equivalent privileges (see Command Line Roles and Privileges [page 13]).
To browse the available commands, enter the following and press Return:
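A minimal sketch, assuming the connection commands live under the `spaces` command group and that the CLI follows the usual `--help` convention:

```shell
datasphere spaces connections --help
```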
For information about working with connections in SAP Datasphere, see Integrating Data via Connections.
List Connections
You can list all the connections in a space and optionally write the output to a file.
Parameter Description
--accept <accept>  [optional] Specify the format to return the connections definition in. You can choose between:
• "application/vnd.sap.datasphere.space.connections.list+json" [default]
• "application/vnd.sap.datasphere.space.connections.details+json"
--details  [optional] List the connections with all their details except for credentials.
--features  [optional] List the connections with their technical names and their information about enabled and disabled features.
--top <n>  [optional] List only the first <n> connections (according to their creation date), up to a maximum of 200. Default: 10
--output <file>.json  [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output is printed to the command line.
For example, to list all the connections in space MySpace showing their technical and business names as well
as their type id, creation date, creator, and replication status and write them to a file, enter:
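A hedged sketch of such an invocation (the command group and the `--space` flag are assumptions based on the parameter table above; verify with `--help`):

```shell
datasphere spaces connections list \
  --space MySpace \
  --output connections.json
```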
Read Connection Details
You can read the JSON definition of a connection (without its credentials) in a space and optionally write the
output to a file.
Parameter Description
--output <file>.json  [optional] Enter a path to a .json file to write the output to. If you do not include this option, the output is printed to the command line.
For example, to read the definition of the connection MyConnection in space MySpace and write it to the
default file name, enter:
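A hedged sketch (the flag identifying the connection, shown here as `--technical-name`, is an assumption; verify the exact flags with `--help`):

```shell
datasphere spaces connections read \
  --space MySpace \
  --technical-name MyConnection
```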
Create Connections
You can create an SAP SuccessFactors connection in a space by providing a definition in a JSON file or an input
string.
Parameter Description
--file-path <path>  [optional] Enter a path to a file with a .json extension containing your connection definition.
--input <input>  [optional] Provide your connection definition in stringified JSON format instead of via the --file-path option.
For example, to create the SAP SuccessFactors connection MyConnection in space MySpace, enter:
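The steps can be sketched as follows. The definition file uses the Basic-authentication format documented below; the `create` subcommand and flag names are assumptions to confirm with `--help`, and all angle-bracket placeholders must be replaced with real values:

```shell
# Write a connection definition file (Basic authentication format).
cat > successfactors-basic.json <<'EOF'
{
  "name": "MyConnection",
  "businessName": "My SuccessFactors Connection",
  "authType": "Basic",
  "url": "https://<SAP SuccessFactors API Server>/odatav4/<service group>",
  "version": "V4",
  "username": "<username>",
  "password": "<password>"
}
EOF

# Create the connection in space MySpace (runs only if the CLI is installed).
if command -v datasphere >/dev/null 2>&1; then
  datasphere spaces connections create --space MySpace --file-path successfactors-basic.json
else
  echo "datasphere CLI not found; install with: npm install -g @sap/datasphere-cli"
fi
```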
Validate Connections
You can validate a connection in a space.
Edit Connections
You can edit an SAP SuccessFactors connection in a space by providing a new definition in a JSON file or an
input string.
Parameter Description
--file-path <path>  [optional] Enter a path to a file with a .json extension containing your connection definition.
--input <input>  [optional] Provide your connection definition in stringified JSON format instead of via the --file-path option.
For example, to edit the SAP SuccessFactors connection MyConnection in space MySpace, enter:
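A sketch of an edit round trip. The `update` subcommand name is an assumption (check `--help` for the actual name), and angle-bracket placeholders must be replaced with real values:

```shell
# Revised definition with a changed business name (Basic authentication format).
cat > successfactors-basic-edited.json <<'EOF'
{
  "name": "MyConnection",
  "businessName": "My Renamed SuccessFactors Connection",
  "authType": "Basic",
  "url": "https://<SAP SuccessFactors API Server>/odatav4/<service group>",
  "version": "V4",
  "username": "<username>",
  "password": "<password>"
}
EOF

# Apply the new definition (runs only if the CLI is installed).
if command -v datasphere >/dev/null 2>&1; then
  datasphere spaces connections update --space MySpace --file-path successfactors-basic-edited.json
else
  echo "datasphere CLI not found; install with: npm install -g @sap/datasphere-cli"
fi
```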
Delete Connections
Parameter Description
--force  [optional] Suppress the Confirm Deletion dialog and delete the connection without confirmation.
For example, to delete the connection MyConnection in space MySpace and suppress the Confirm Deletion
dialog, enter:
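A hedged sketch (the command group and flag names are assumptions; verify with `--help`):

```shell
datasphere spaces connections delete \
  --space MySpace \
  --technical-name MyConnection \
  --force
```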
SAP SuccessFactors connection properties are set and retrieved using the connection definition file format and
stored in a .json file. A connection definition file must not exceed 25 MB.
Users with the DW Integrator role can create SAP SuccessFactors connections and set any connections
properties (see SAP SuccessFactors Connections) using the following syntax:
{
  "name": "<technical name>",
  "businessName": "<business name>", //optional
  "description": "<description>", //optional
  "authType": "Basic",
  "url": "https://<SAP SuccessFactors API Server>/odatav4/<supported SAP SuccessFactors service group>",
  "version": "V4",
  "username": "<username>",
  "password": "<password>"
}
{
  "name": "<technical name>",
  "businessName": "<business name>", //optional
  "description": "<description>", //optional
  "authType": "OAuth2",
  "url": "https://<SAP SuccessFactors API Server>/odata/v2/",
  "version": "V2",
  "oauth2GrantType": "saml_bearer",
  "oauth2TokenEndpoint": "https://oauth2TokenEndpoint.com/oauth/token",
  "oauth2CompanyId": "<SAP SuccessFactors company ID>",
  "clientId": "<client id>",
  "clientSecret": "<SAML assertion>"
}
name (Technical Name)
[required] Enter the technical name of the connection. The technical name can only contain alphanumeric characters and underscores (_). Underscore (_) must not be used at the start or end of the name. The maximum length is 40 characters. The name must be unique within the space.
Note
Once the object is saved, the technical name can no longer be modified.
businessName (Business Name)
[optional] Enter a descriptive name to help users identify the object. This name can be changed at any time.
description (Description)
[optional] Provide more information to help users understand the object.
authType (Authentication Type)
[required] Enter the authentication type to use to connect to the OData endpoint.
Note
HTTP basic authentication in SAP SuccessFactors will soon be retired. For more information, see Deprecation of HTTP Basic Authentication in the SAP SuccessFactors What's New Viewer.
url (URL)
[required] Enter the OData service provider URL of the SAP SuccessFactors service that you want to access.
version (Version)
[required] Enter the OData version used to implement the SAP SuccessFactors OData service (V2 or V4).
oauth2GrantType (OAuth Grant Type)
[required] Enter SAML Bearer as the grant type used to retrieve an access token.
oauth2TokenEndpoint (OAuth Token Endpoint)
[required] Enter the API endpoint to use to request an access token: <SAP SuccessFactors API Server>/oauth/token.
oauth2CompanyId (OAuth Company ID)
[required] Enter the SAP SuccessFactors company ID (identifying the SAP SuccessFactors system on the SAP SuccessFactors API server) to use to request an access token.
clientId (Client ID)
[required] Enter the API key received when registering SAP Datasphere as an OAuth2 client application in SAP SuccessFactors.
clientSecret (SAML Assertion)
[required] Enter a valid SAML assertion that has been generated for authentication.
Note
If the SAML assertion expires, the connection becomes invalid until you update the connection with a new valid SAML assertion.
Note
• If a property is not set, it receives the default value (on creation) or keeps its current value (on edit).
• To obtain examples for the file format, read existing connections from a space into a file using the --get option (see Read Connection Details [page 136]).