Setting Up The System in BODS
Below are the basic steps you need to follow when setting up a Data Services system:
1) Create a database in SQL Server (or any supported back-end database). This will act as the local
repository for the Designer.
2) Create a repository (Local, Central or Profiler) using the Repository Manager.
3) Assign the repository to the Data Services Designer (Management Console).
4) Define a Job Server (Server Manager).
5) Start with the Designer (Data Services Designer).
I) The first step is to create a database in SQL Server. It will hold the information for the intermediate
layers, such as temp tables (simple temp tables for one-to-one mappings, or temp tables that hold the
transformed data).
You can create a separate user ID and password for this database, which you can then use when you log
in to the Data Services Designer.
The following screenshots illustrate the steps to create your database:
Now select the Databases folder on the left-hand side and right-click -> New Database.
Uncheck Enforce password policy (the remaining two options will also be unchecked).
Select the default database as “how2bods” using the drop-down button available.
Next, if you look at the left-hand side of the same window you will find-
Now select Server Roles and check sysadmin on the right-hand side.
Check the database you created in the top-right portion of the window, and check public and
db_owner in the bottom part of the window.
Now click OK, and the user name and password will be created for the database you created.
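If you prefer scripting over the SQL Server Management Studio screens, the T-SQL below is a rough, minimal equivalent of the steps above. The database name (how2bods) and password (us1_123) come from this guide; the login name us1 is an assumption, since the guide only shows the password, so substitute the name you actually use.
-- Minimal T-SQL sketch of the GUI steps above (assumed login name: us1).
-- 1. Create the database that will hold the local repository.
CREATE DATABASE how2bods;
GO
-- 2. Create the SQL login with "Enforce password policy" unchecked.
CREATE LOGIN us1
    WITH PASSWORD = 'us1_123',
         DEFAULT_DATABASE = how2bods,
         CHECK_POLICY = OFF,
         CHECK_EXPIRATION = OFF;
GO
-- 3. Server role: sysadmin, as checked under Server Roles.
EXEC sp_addsrvrolemember 'us1', 'sysadmin';
GO
-- 4. Map the login to the database with public and db_owner.
USE how2bods;
CREATE USER us1 FOR LOGIN us1;
EXEC sp_addrolemember 'db_owner', 'us1';
GO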
II) Create a repository on the Data Services side. There are three types of repositories-
Local Repository
Central Repository
Profiler Repository
A Local repository is used for a single-user instance. This repository is created locally on your system,
and the user who logs in to that system can access the local repository (it is local to the system).
The local repository holds all the jobs, workflows, data flows, datastores and functions of the local user.
A Central repository is used for a multi-user instance. This repository is created centrally and it
can be used or accessed by the multiple users and user groups that are created for the project. Using the
central repository you can share your jobs and Data Services components across multiple users.
It acts as a central storage place so that all users can upload and download (share) Data Services
artifacts.
A Profiler repository is used for data profiling activities such as data cleansing and consolidation.
Data quality aspects are carried out using the profiler repository. It holds the reports that are
generated when you run data quality checks to verify the health of the data.
Navigation:-
Start – Program Files – SAP BusinessObjects XI 3.2 – SAP BusinessObjects Data Services – Data
Services Repository Manager
Using the drop-down menu, select the repository type as “Local”.
Password: us1_123
Now click on the Create button and your local repository will be created.
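Optionally, you can sanity-check the result in SQL Server, since the Repository Manager populates the database with repository tables. The table name AL_VERSION below is what a typical Data Services local repository uses for version information; treat it as an assumption and, if it is missing, simply confirm that repository tables now exist.
-- Optional check after the Repository Manager finishes.
-- Assumption: AL_VERSION is the usual repository version table.
USE how2bods;
SELECT name FROM sys.tables ORDER BY name;   -- repository tables should be listed
SELECT * FROM AL_VERSION;                    -- repository version information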
III) Assign the repository to Data Services using the Management Console.
Now log in to the Data Services Management Console using the following navigation on your desktop-
All Programs – SAP BusinessObjects XI 3.2 – SAP BusinessObjects Data Services – Data Services
Management Console
This is an administration tool that is used to create, add and maintain repositories, create user
groups and users, and also to schedule and execute jobs.
Since it is a web-based tool, the administrator or user can access it from any remote location and
perform these tasks.
Log in with “admin” as both the user ID and password; this account is created when you install BODS.
In the screen that appears, click on Administrator, then click on Repositories, and click on Add on
the right side to add the local repository that you created-
Enter the credentials below-
Password: us1_123
Now click on the Test button to check that the credentials you entered are correct, then click Apply.
IV) Define a Job Server using the Server Manager.
Defining a job server is a one-time task, and you can define multiple job servers per repository as
required. The job server is used to execute the jobs that you build using Data Services.
You will find the list of available job servers (and their port numbers). If you want to add a new job
server, click on Add.
Job server port: 3507 (this port number must be unique per job server, so please cross-check the
port numbers already in use and choose a new one).
Click on Add on the right side, and the Repository Information pane will become active.
Password: us1_123
Click Apply, click OK, click OK again, and once you come back to the main job server page, click the
Restart button at the end.
This will restart the services so that the newly created job server is set as the default server. This
is how you create the job server.
Please note: If the ‘Add’ button is disabled in the Server Manager, try opening it as an administrator.
Usually the job server will not be accessible to all users; it will be pre-configured and given to you.
V) Start with the Data Services Designer.
Password: us1_123
Please note: Your repository should appear in the list of available repositories in the Designer. Click on
your repository and enter the repository password. Click on ‘Reset users’ if prompted.
If your repository does not appear, log in to the Central Management Console (CMC), give appropriate
rights to your user, and also add your repository in the CMC.
Local Object Library (contains tabs such as Projects, Jobs, Work Flows, Data Flows, Transforms,
Datastores, Formats and Functions).
First we need to establish a connection between our Designer and the database (source, target or
staging); this is done using datastores.
Datastores is a tab under the Local Object Library, and it is used to establish a connection between
Data Services and any other (source or target) system.
Go to the Datastores tab in the Local Object Library, right-click and click on New-
Enter the credentials:
Password: us1_123
Your datastore will now be created under the Datastores tab in the Local Object Library. This
establishes a connection between the local database and Data Services.
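A quick way to confirm that the connection details you enter in the datastore are valid is to connect to SQL Server as that same user and run a couple of basic queries. As before, the login name us1 is an assumption based on the password shown in this guide.
-- Run this while connected as the same login used in the datastore
-- (assumed here to be us1 / us1_123).
SELECT SUSER_SNAME() AS login_name,
       DB_NAME()     AS current_database;       -- should return your login and how2bods
SELECT COUNT(*) AS table_count FROM how2bods.sys.tables;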
Extracting data from a flat file and loading it into the database (SQL)
In this scenario we are creating a job which will load the data from a flat file into the SQL database.
Now your data source (the flat file) is ready, and you need to bring that sheet into Data Services. Since
it is a flat file, you can import it directly into Data Services using the File Format tab under the Local
Object Library instead of building a datastore for the connection.
Check ‘First row values as column names’ -> click on the Import schema button.
Now you can see the structure created at the top of the same screen. Make sure that all the data
types are varchar(255) and click OK.
Check that the file now appears under Excel Workbooks in the Local Object Library.
Create a project and start building a job so that you can extract the data from the spreadsheet
into Data Services and store it in the SQL Server database.
Give the job a name and observe that the canvas window appears (or becomes active). In the canvas, click
on the Work Flow icon in the right-side pane, then click on the canvas region to place a work flow and name
the work flow WF_how2bods.
Double-click on the work flow and, in the same way, place a data flow inside it. If you now look under the
project area you will find the job displayed as a tree structure.
Double-click on the data flow. This is the place where your actual processing starts.
Drag the flat file you created from Excel Workbooks onto the designer window (canvas).
Click on the Query transform in the right-side pane and click next to the dragged spreadsheet to place it.
Connect the spreadsheet to the Query transform.
Double-click on the Query and drag all the fields from Schema In to Schema Out.
Click on Back at the top of the window.
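Since every field is simply dragged from Schema In to Schema Out, this Query transform is a straight one-to-one mapping. Conceptually it amounts to the SQL below; the column and source names are purely hypothetical and depend on your spreadsheet.
-- Conceptual equivalent of the one-to-one Query transform above.
-- Column and source names are hypothetical examples only.
SELECT EMP_ID,
       EMP_NAME,
       EMP_DEPT
FROM   EXCEL_SOURCE;   -- the imported workbook acts as the source here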
Now create a template table and point it to the datastore you created to connect to SQL Server, as
below.
Click on the Template Table icon in the right-side pane and click on the canvas to place it.
Make sure that you select the datastore you created using the down arrow. Click on OK. Now your
template table will be created. Connect the Query transform to the template table and click on Save.
Once you click Save, a magnifying glass symbol will be shown at the bottom right of the template
table, indicating that the table has been saved with its structure (schema).
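Because the file format was imported with every column as varchar(255), the template table that Data Services generates in SQL Server is essentially an all-varchar staging table. The sketch below is only an illustration; Data Services creates the real table automatically from the Schema Out, and the table and column names here are hypothetical.
-- Illustration only: roughly what the generated template table amounts to.
-- Table and column names are hypothetical.
CREATE TABLE TT_HOW2BODS (
    EMP_ID    VARCHAR(255),
    EMP_NAME  VARCHAR(255),
    EMP_DEPT  VARCHAR(255)
);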
Now your job is ready and you can run it. Select the job you created, “Job_how2bods”, right-click
and click on Execute.
Please note: Make sure that the source file is not open while the job is executing, otherwise the job will fail.
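After the job finishes successfully, you can verify the load directly in SQL Server. The table name below is the hypothetical one from the earlier sketch; substitute the name you gave your template table.
-- Quick check that the flat-file data landed in SQL Server.
-- Replace TT_HOW2BODS with the name of your template table.
USE how2bods;
SELECT COUNT(*) AS loaded_rows FROM TT_HOW2BODS;
SELECT TOP 10 * FROM TT_HOW2BODS;   -- spot-check a few rows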