
SAP BW 7.3 Learning Material - Basics

- Kiran Kumar Chikati
Emp Id: 285671
To maintain their competitive edge, companies must leverage the information available to them in a manner that is both efficient and effective.

This vast amount of information, even the company's internal data, is often distributed across many different systems and is not particularly well formatted.

Companies need a system that can extract and consolidate data from various subsystems such as business applications and other sources (such as databases and the Internet).

These systems must not only enable detailed, flexible, high-quality data analysis to be carried out but also allow the data to be formatted for multimedia applications.

A data warehouse meets precisely these requirements.


"The data warehouse is the foundation of all DSS processing. A data warehouse is a subject-oriented, integrated, non-volatile, and time-variant collection of data in support of management's decisions. The data warehouse contains granular corporate data."
- William H. Inmon

"A copy of transaction data specifically structured for query and analysis."
- Ralph Kimball
Inmon's top-down approach:

Inmon defines the data warehouse as a centralized repository for the entire enterprise. The data warehouse stores atomic data at the lowest level of detail. Dimensional data marts are created only after the complete data warehouse has been created. Thus, the data warehouse sits at the center of the Corporate Information Factory (CIF), which provides a logical framework for delivering business intelligence.
Inmon's top-down approach:
W. H. Inmon defines the data warehouse in the following terms:
Subject-oriented: The data in the data warehouse is organized so that all the data elements relating to the same real-world event or object are linked together.
Time-variant: The changes to the data in the database are tracked and recorded so that reports can be produced showing changes over time.
Non-volatile: Data in the data warehouse is never overwritten or deleted -- once committed, the data is static, read-only, and retained for future reporting.
Integrated: The database contains data from most or all of an organization's operational applications, and this data is made consistent.
Kimball's bottom-up approach:
Keeping in mind the most important business aspects or departments, data marts are created first. These provide a thin view into the organizational data, and as and when required they can be combined into a larger data warehouse. Kimball defines the data warehouse as "a copy of transaction data specifically structured for query and analysis."
Kimball's data warehousing architecture is also known as the Data Warehouse Bus (BUS). Dimensional modelling focuses on ease of end-user accessibility and provides a high level of performance to the data warehouse.
OLTP vs OLAP
SAP Business Intelligence:

SAP BI allows you to analyze data from operational SAP applications or any other business application. You can also extract and analyze data from external sources such as databases, online services, and the Internet. The system (data storage, loading and reporting), which is pre-configured by Business Content for core areas and processes, allows you to examine the relationships in every area of your company.
Standardized structuring and display of all business information - high demands are put on the data collection process from the underlying DataSources. The data is defined uniquely across the entire organization to avoid errors arising through varied definitions in different sources.
Simple access to business information via a single point of entry - information must be combined homogeneously and consistently at a central point from which it can be called up. For this reason, modern Data Warehouses usually require a separate database. This database enables a standalone application environment to provide the required services.
Highly developed reporting for analysis with self-service for all areas - in terms of presentation, efficient analysis and meaningful multimedia visualization techniques are essential. The system must be able to cope with the information needs of varied user groups.
Quick and cost-efficient implementation - when implementing the Data Warehouse, an influential cost factor is its integration into an OLTP system and the straightforward loading of heterogeneous data. Alongside robust metadata management, delivered business-based Business Intelligence content also has an important role here.
High-performance environment - data analyses cannot be carried out in a Data Warehouse without integrating heterogeneous sources. This is usually done with time-consuming read processes. Scheduling tools allow the data to be loaded in separate batch jobs at performance-friendly times.
Relieving OLTP systems - in the past, OLTP systems were strongly overloaded by having to store data and analyze it at the same time. A separate Data Warehouse server now allows you to carry out data analysis elsewhere.
SAP BW/BI Version Road Map
BW 7.3
BW 7.2
BW 7.01
BW 7.0
BW 3.5
BW 3.1C
BW 3.0B
BW 3.0A
BW 2.1C
BW 2.0B
BW 2.0A
BW 1.0B
BW 1.0A
Terms in BW
Source System:
A system that makes data available to SAP BW for extraction.
DataSource:
Data that logically belongs together is stored in the SAP (R/3) source system in the form of DataSources.

InfoSource:
Data that logically belongs together is stored in the SAP (BW) system in the form of InfoSources.
An InfoSource contains a number of InfoObjects which structure the information needed to create InfoCubes/DSO objects in the SAP (BW) system.
Terms in BW
Persistent Staging Area (PSA):
Stores data in its original format as it is imported from the source system. The PSA allows a quality check before the data is loaded into its destination.

Transformations:
The transformation process allows you to consolidate, cleanse, and integrate data. When you load data from one BW object into a further BW object, the data is passed through a transformation. A transformation converts the fields of the source into the format of the target (see the sketch below).
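To make this concrete, here is a minimal Python sketch (conceptual only, not SAP ABAP) of what a transformation rule does. The source field names (MATNR, MEINS, NETWR) are typical SAP examples, but the mapping itself is hypothetical.

```python
# Conceptual sketch only -- not SAP code. It illustrates what a BW
# transformation does: map source fields into the target format while
# consolidating and cleansing. The mapping below is a hypothetical example.

def transform(source_record: dict) -> dict:
    """Convert one source record into the structure of the target."""
    return {
        "MATERIAL": source_record["MATNR"].zfill(18),  # pad material number
        "UNIT": source_record["MEINS"].upper(),        # harmonize unit codes
        "AMOUNT": float(source_record["NETWR"]),       # string -> numeric
    }

source = {"MATNR": "4711", "MEINS": "kg", "NETWR": "99.90"}
print(transform(source))
# {'MATERIAL': '000000000000004711', 'UNIT': 'KG', 'AMOUNT': 99.9}
```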
Data Transfer Process (DTP):
A data transfer process (DTP) is used to transfer data within BW from one persistent object to another object.
InfoPackage:
An InfoPackage is used to load data from the source system into the persistent staging area of SAP BW.
InfoProviders
Info Object:
InfoObjects are the basic information providers of BW. They structure the information needed to create InfoCube/DSO/SPO objects.
InfoObjects can be classified into the following types:
Characteristics (for example, customers)
Key figures (for example, revenue)
Units (for example, currency, amount unit)
Time characteristics (for example, fiscal year)
Technical characteristics (for example, request number)

Characteristics:
Characteristics are business reference objects used to analyze key figures.
Examples of characteristic InfoObjects:
Cost center (0COSTCENTER)
Material (0MATERIAL)
InfoProviders
Key figures:
Key figures provide the values to be evaluated. They are the numeric information that is reported in a query.
Examples of key figure InfoObjects:
Quantity (0QUANTITY)
Amount (0AMOUNT)

Units:
Units are paired with key figure values. They assign a unit of measurement to a key figure value, for instance "10 kg", where 10 is the key figure and kg is the unit.
Other examples of unit characteristics:
Currency unit (0CURRENCY) (holds the currency of the transaction, e.g. $, EUR, USD...)
Value unit (0UNIT), or unit of measure (holds the unit of measure, e.g. gallon, inch, cm, PC)

Time characteristics:
Time characteristics give a time reference to data.
Examples of time characteristics:
Calendar day (0CALDAY), Calendar year (0CALYEAR), Fiscal year (0FISCYEAR)
InfoProviders
Technical Characteristics:
Technical characteristics are SAP standard objects having their own administrative purposes.
Examples of technical characteristics:
InfoObject 0REQUID - while loading data to various data targets, SAP allocates unique request numbers, which are stored in this InfoObject.
InfoObject 0CHNGID - when an aggregate change run is performed, a unique number is allocated and stored in this InfoObject.

Info Object Catalog:
An InfoObject catalog is an application-specific grouping of InfoObjects. There are two types of InfoObject catalogs:
1. Characteristic InfoObject catalog
2. Key figure InfoObject catalog
Info Area:
Logical collections of data that are based on data models and business rules derived from the enterprise model. SAP (BW) systems store data in InfoAreas, which can contain DSOs/InfoCubes/SPOs/HybridProviders.
InfoProviders
Data Store Object (DSO):
An object that stores consolidated and cleansed transaction data or master data on a document level.

A DSO object describes a consolidated dataset from one or several InfoSources.
This dataset can be evaluated using a BEx query.

The data in DSO objects is stored in transparent, flat database tables, unlike the multidimensional data storage in cubes.

The cumulative update of key figures is supported for DSO objects, just as it is with InfoCubes, but with DSO objects it is also possible to overwrite data fields.
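The difference is easy to show in a minimal Python sketch (hypothetical data, not SAP code): a cumulative update always adds key figure values for the same key, while an overwrite replaces them.

```python
# Conceptual sketch: cumulative update (InfoCube-style) vs. overwrite
# (additionally possible for DSOs), for the same key arriving twice.

def update_additive(table, key, value):
    table[key] = table.get(key, 0) + value  # key figures are summed

def update_overwrite(table, key, value):
    table[key] = value                      # the last value wins

cube, dso = {}, {}
for amount in (100, 250):                   # same document key, two loads
    update_additive(cube, "DOC1", amount)
    update_overwrite(dso, "DOC1", amount)

print(cube)  # {'DOC1': 350} -> cumulative
print(dso)   # {'DOC1': 250} -> overwritten
```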
InfoProviders
Types of Data Store Object (DSO):

There are 3 types of DataStore objects:
Standard DSO
Write-Optimized DSO
Direct Update DSO

InfoCube:
An InfoCube is a set of relational tables that are created according to the star schema: a large fact table in the center, with several dimension tables surrounding it.

Types of InfoCube:
There are 2 types of InfoCubes:
Basic InfoCube
RemoteCube or Virtual InfoCube
InfoProviders
InfoCube:
The central objects upon which reports and analyses in BW are based are called InfoCubes.
An InfoCube is a multidimensional data structure.
An InfoCube is a set of relational tables that contains InfoObjects.
An InfoCube consists of a fact table and a set of n dimension tables that define the axes of its multiple dimensions.
Star Schema

In the star schema model, the fact table is surrounded by dimension tables. The fact table is usually very large, containing millions to billions of records, whereas the dimension tables are comparatively small, containing a few thousand to a few million records. In practice, the fact table holds transactional data and the dimension tables hold master data.
The dimension tables are specific to one fact table; they are not shared across other fact tables. When another fact table needs the same product dimension data, a further dimension table specific to that fact table has to be created.
This situation creates data management problems such as master data redundancy, because the very same product is duplicated in several dimension tables instead of being shared from one single master data table. This problem is solved in the extended star schema.
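To make the structure concrete, here is a small Python sketch (hypothetical data, not SAP code) of a classic star schema: the dimension table carries the master data itself, which is exactly what causes the redundancy described above.

```python
# Conceptual sketch of a classic star schema (hypothetical data).
# The dimension table holds the descriptive master data directly, so the
# same product attributes would be duplicated in every star that needs them.

dim_product = {
    1: {"MATERIAL": "M-01", "PRODUCT_GROUP": "Pumps"},
    2: {"MATERIAL": "M-02", "PRODUCT_GROUP": "Valves"},
}

fact_sales = [
    {"DIM_PRODUCT": 1, "QUANTITY": 10, "AMOUNT": 500.0},
    {"DIM_PRODUCT": 2, "QUANTITY": 3,  "AMOUNT": 120.0},
]

# A star-schema query: resolve each fact row against its dimension
for row in fact_sales:
    product = dim_product[row["DIM_PRODUCT"]]
    print(product["MATERIAL"], product["PRODUCT_GROUP"], row["AMOUNT"])
```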
Extended Star Schema
In the extended star schema, the BW star schema model, the dimension table does not contain master data. Instead, master data is stored externally in the master data tables (texts, attributes, hierarchies).
The characteristic in the dimension table points to the relevant master data through a SID table. The SID table points to the characteristic's attributes, texts and hierarchies.
This multistep navigation adds extra overhead when executing a query. However, the benefit of this model is that all fact tables (InfoCubes) share the common master data tables.
Moreover, the SID table concept allows users to implement multi-language and multi-hierarchy OLAP environments, and it also supports slowly changing dimensions.
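Continuing the sketch above, the extended star schema inserts SID tables between the dimension and shared master data tables. The table and field names below are hypothetical; the point is the extra navigation step and the single shared copy of master data.

```python
# Conceptual sketch of the BW extended star schema (hypothetical data).
# The dimension table stores only SIDs; master data lives once, in shared
# attribute and text tables that any number of InfoCubes can reuse.

sid_material = {101: "M-01", 102: "M-02"}    # SID -> characteristic value

master_attributes = {                         # shared attribute table
    "M-01": {"PRODUCT_GROUP": "Pumps"},
    "M-02": {"PRODUCT_GROUP": "Valves"},
}
master_texts = {"M-01": "Standard pump",      # shared text table (held per
                "M-02": "Ball valve"}         # language in the real model)

dim_product = {1: {"SID_MATERIAL": 101}, 2: {"SID_MATERIAL": 102}}
fact_sales = [{"DIM_PRODUCT": 1, "AMOUNT": 500.0}]

# Query navigation: fact -> dimension -> SID table -> master data
for row in fact_sales:
    sid = dim_product[row["DIM_PRODUCT"]]["SID_MATERIAL"]
    material = sid_material[sid]
    print(material, master_texts[material],
          master_attributes[material]["PRODUCT_GROUP"], row["AMOUNT"])
```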
R/3 Data Source:

A DataSource is a combination of fields that provide data to the BW system. It contains a logical group of fields.
Data is loaded to the data target using an InfoPackage.
There are basically two types of DataSources:
DataSource for transaction data
DataSource for master data
There are 3 types of master data DataSources:
DataSource for attributes
DataSource for texts
DataSource for hierarchies
DataSources are used to get data into the BW system.
Extractors:
An extractor is a program in SAP ERP which can be activated to prepare and capture data through an extract structure for transfer to BW. The extractor can belong to a standard DataSource or a custom-built DataSource. It can describe a full-load or a delta-load process. The data transfer section of the program is called remotely from SAP BW. When the extractor is a delta type, it also runs to populate the delta queue with new and changed records.
An extractor consists of:
Extract structure
Extraction method
Extractor program
Sometimes we need to fetch transaction data from the R/3 system; this is the extraction of data from R/3. Here, the extract structure is a structure containing all the fields that are available for extraction. The extraction method signifies the type of extraction, whether full or delta. The extractor program holds the required logic for the extraction. Extractors can be categorized into:
Application-specific:
I. BW Content Extractors
II. Customer-Generated Extractors
Cross-application:
I. Generic Extractors
BW Content Extractors:
Extractors are used to extract data from the source system to the BW system. SAP provides standard extractors for this purpose, but there are many situations where the standard extractors do not meet our requirements. In such cases we need to enhance the extractors in order to achieve the desired results.
Application-Specific Extractors:
Application-specific BW content extractors:

LO Extraction:
Logistics refers to the process of getting a product or service to its desired location upon request, which involves transportation, purchasing, warehousing etc.
The main areas in logistics are:
Sales and Distribution (SD): applications 11, 13, 08 (in the LBWE T-code)
Materials Management (MM): applications 03, 02
Logistics Execution (LE): application 12
Quality Management (QM): application 05
Plant Maintenance (PM): applications 04, 17
Customer Service (CS): application 18
Project System (PS): application 20
SAP Retail: applications 40, 43, 44, 45
Generic Extractor:

There are many cases in a BW installation where custom fields are needed from the source system (R/3) and these fields are not present in the standard extractors provided by SAP. In such cases, to make the fields available in BW, a new DataSource needs to be created to extract the data. This custom-made DataSource is called a generic extractor. Generic extractors are of 3 types:
Created based on a view/table
Created based on an InfoSet query
Created based on a function module

Creating a Generic Extractor:
Transaction code RSO2 is used to create a generic extractor.
Data Load Types:
Extraction can be done using either a full update or a delta update.
Full Load:
Loads all the data from the required DataSource in the source system into BW for the first time.
Delta Load:
Tells the source system to keep track of changed/new records so that only these changed/new records are sent to BW as the delta.
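A minimal Python sketch of the difference, assuming a hypothetical source table with a last-changed timestamp (one common way a generic delta is tracked; LO and other content extractors use their own delta queue mechanisms):

```python
# Conceptual sketch: full load vs. delta load (hypothetical source table).
# A delta-capable extraction keeps a pointer (here: the last load time) and
# sends only records created or changed since that pointer.

from datetime import datetime

source_table = [
    {"DOC": "1", "CHANGED_AT": datetime(2024, 1, 1)},
    {"DOC": "2", "CHANGED_AT": datetime(2024, 2, 1)},
]

def full_load():
    return list(source_table)               # everything, every time

def delta_load(last_pointer):
    return [r for r in source_table if r["CHANGED_AT"] > last_pointer]

print(len(full_load()))                      # 2 -> all records
print(delta_load(datetime(2024, 1, 15)))     # only DOC '2' is new/changed
```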
What is Delta:
Delta is a feature of the extractor which refers to the changes (new/modified entries) that occurred in the source system.
0RECORDMODE:
The record mode describes the type of change that a data record contains.
If a DataSource is delta-capable, the field ROCANCEL, which is part of the DataSource, holds the changes on the R/3 side.
This field of the DataSource is assigned to the InfoObject 0RECORDMODE in the BI system.
ROCANCEL serves the same purpose on the R/3 side as its counterpart 0RECORDMODE does on the BI side.
Delta:
The following 6 record mode types are available, which identify how a delta record is updated to the data target:
After image (' '): The record provides an after image, i.e. how the record looks after the change. The status of the record after it has been changed, or after data has been added, is transferred.
Before image (X): The record provides a before image, i.e. how the record looked before the change. The status of the record before the change is transferred.
Additive image (A): The record provides an additive image. Only the differences for all the numeric values are transferred.
Delete (D): This record mode type signifies that the record must be deleted.
New image (N): The record provides a new image, i.e. when a new record is created, a new image is set for it.
Reverse image (R): The record provides a reverse image. The content of this record is equivalent to a before image. The only difference occurs when updating a DataStore object: an existing record with the same key is deleted.
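As a conceptual illustration (plain Python, not SAP's activation logic), the sketch below shows how these record modes could drive the update of a DSO active table. The numbers follow the ABR example on the next pages, where a gross weight of 4,093 is changed to 5,360.

```python
# Conceptual sketch (not SAP code) of applying delta records with their
# record modes to a DSO 'active table'. Values follow the ABR example below.

def apply(active, key, mode, value, additive=True):
    """Apply one delta record. Modes: 'N' new image, '' after image,
    'X' before image, 'A' additive image, 'D' delete ('R' reverse behaves
    like 'X'; for a DataStore object it deletes the matching record)."""
    if mode == "D":
        active.pop(key, None)                     # record is deleted
    elif additive:
        active[key] = active.get(key, 0) + value  # cumulative update
    else:
        active[key] = value                       # overwrite with the image
    return active

active = {}
apply(active, "DOC1", "N", 4093)    # initial posting (new image)
apply(active, "DOC1", "X", -4093)   # before image, sign reversed
apply(active, "DOC1", "", 5360)     # after image with the changed value
print(active)                       # {'DOC1': 5360} -> 4093 - 4093 + 5360
```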
How do we identify the delta?
The delta process (how the data is transferred) for the DataSource is determined in the table ROOSOURCE (in the source system).
Properties of the delta process are determined in the table RODELTAM (in BI or in the source system).
Different Types of Delta Processes:

Data is brought (extracted) to BI using one of the following delta processes:
ABR: After, Before and Reverse image
AIE/AIM: After image
ADD: Additive image
ABR: After, Before and Reverse Image

What is it?
Once a new entry is posted or an existing posting is changed on the R/3 side, an after image shows the status after the change, a before image shows the status before the change with a negative sign, and the reverse image carries a negative sign while marking the record for deletion.
Which update types (for key figures) does it support?
Addition
Overwrite
Does it support loading to both InfoCube and DSO?
Yes
Example of ABR Delta Process:
Let us consider a scenario to demonstrate the ABR delta process.
Since ABR can be used for both InfoCubes and DSOs, let us consider a scenario where the loading happens directly to a DSO, with the advantage that we can track the record changes in the DSO's change log table.
In our case, the DSO is set to additive mode so that the DataSource sends both the before and the after image. In case it is set to overwrite, it sends only the after image.
Let us check the new entry in the DSO.
As shown below (Figure 3), the record mode has the value N (new entry), indicating that this record is a new one.
Here, the CRM Gross Weight (CRM_GWEIGH) field has the initial value 4,093.
Next, the value of CRM Gross Weight (CRM_GWEIGH) is changed to 5,360 in the source system.
To reflect this change, the DataSource sends two entries to BI: one is the before image with a negative sign to nullify the initial value (Figure 4), and the other is the after image entry (the modified value), as shown below (Figure 5).
Upon activation, the after image goes to the active table (Figure 6).
AIE/AIM: After Image Delta Process
What is it?
Once a new entry is posted or an existing posting is changed on the R/3 side, an after image shows the status after the change.
Which update types (for key figures) does it support?
Overwrite only
Does it support loading to both InfoCube and DSO?
No, only DSO.
Example of AIE/AIM Delta Process:

Let us consider the example given above. The initial load to the target DSO is as shown in Figure 7.
This time, after the value of CRM_GWEIGH is changed to 5,360 in the source system, the DataSource sends only one entry to BI, i.e. the after image, which holds the change as shown below (Figure 8).
Upon activation, the after image goes to the active table (Figure 9).
ADD: Additive Delta Process

What is it?
Once a new entry is posted or an existing posting is changed on the R/3 side, an additive image shows the difference between the numeric values.
Which update types (for key figures) does it support?
Addition only
Does it support loading to both InfoCube and DSO?
Yes (in case the DSO has update type Addition).
Example of ADD Delta Process:
The initial load to the target DSO is as shown in Figure 10.
When the value of CRM_GWEIGH is changed to 5,360 in the source system, the DataSource sends one entry to BI with the CRM_GWEIGH value 1,267, i.e. the difference between the new value and the original value, as shown below (Figure 11).
Upon activation, the final entry in the active table is the result of adding the original value and the new value (4,093 + 1,267 = 5,360 kg) (Figure 12).
More about InfoProviders..
Multiprovider:
A MultiProvider is a type of InfoProvider that builds up a union of data from multiple InfoProviders.
Physically it does not hold any data; it fetches data at query runtime from the underlying InfoProviders. A MultiProvider works based on a union condition, and we can use the following InfoProviders:
1. DataStore Objects
2. InfoCubes
3. Logical Partition Providers
4. HybridProviders
5. Master data InfoObjects
6. InfoSets
7. Aggregation levels
MultiProviders allow you to report on multiple InfoProviders like DataStore objects, InfoCubes, master data objects, InfoSets etc.
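As a rough illustration in plain Python (hypothetical data): a union simply stacks the rows of the underlying providers at query runtime, without persisting anything.

```python
# Conceptual sketch: a MultiProvider forms a UNION of its InfoProviders at
# query runtime -- rows are stacked, not matched. Hypothetical data.

cube_actuals = [{"REGION": "EU", "AMOUNT": 500.0}]
dso_plan     = [{"REGION": "EU", "AMOUNT": 450.0}]

def multiprovider_union(*providers):
    rows = []
    for provider in providers:
        rows.extend(provider)     # nothing is stored, only combined
    return rows

print(multiprovider_union(cube_actuals, dso_plan))
# [{'REGION': 'EU', 'AMOUNT': 500.0}, {'REGION': 'EU', 'AMOUNT': 450.0}]
```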
More about InfoProviders..
Infoset:
An InfoSet builds up a data join of InfoProviders.
Physically it does not hold any data; it fetches data at query runtime from the underlying InfoProviders.
An InfoSet works based on a join condition, and we can join the following InfoProviders:
1. DataStore Objects
2. InfoCubes
3. Master data
InfoSets allow you to report on several InfoProviders by using combinations of master-data-bearing characteristics, InfoCubes and DataStore objects.
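In contrast to the MultiProvider union above, an InfoSet matches rows on a common key. A minimal Python sketch with hypothetical data:

```python
# Conceptual sketch: an InfoSet forms a JOIN -- rows from the providers are
# matched on a common key (here: customer). Hypothetical data.

dso_orders = [{"CUSTOMER": "C1", "AMOUNT": 500.0},
              {"CUSTOMER": "C2", "AMOUNT": 120.0}]
master_customer = {"C1": {"COUNTRY": "DE"}, "C2": {"COUNTRY": "US"}}

def infoset_join(orders, customers):
    return [{**row, **customers[row["CUSTOMER"]]}
            for row in orders if row["CUSTOMER"] in customers]

print(infoset_join(dso_orders, master_customer))
# [{'CUSTOMER': 'C1', 'AMOUNT': 500.0, 'COUNTRY': 'DE'},
#  {'CUSTOMER': 'C2', 'AMOUNT': 120.0, 'COUNTRY': 'US'}]
```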
More about InfoProviders..
Hybrid InfoProviders:
A HybridProvider helps display the most current data in real-time reporting scenarios. At query runtime, it combines the mass historic data in the InfoCube with the latest data in the DSO.
Semantically Partitioned Objects (SPO):
A single point of entry for the creation and easy maintenance of a semantically partitioned data model.
Transient Providers:
InfoProviders that allow analysis and reporting with BW tools (including BW OLAP functionality) on top of an application (e.g. a Business Suite system). They are not modelled in the Data Warehousing Workbench of BW and the data is not replicated.
Near-Line Storage Solution:
SAP NetWeaver BW offers a near-line storage (NLS) interface that enables customers to separate frequently accessed data from older, less frequently accessed data.
Reporting in BW
Business Explorer (BEx)
The Business Explorer is the reporting tool for the Business Information Warehouse and consists of:
BEx Analyzer
BEx Browser
Query:
A collection of a selection of characteristics and key figures (InfoObjects) for the analysis of the data of an InfoProvider. A query always refers to exactly one InfoProvider, whereas you can define as many queries as you like for each InfoProvider.
BW Content is a preconfigured set of role-related and task-related information models that are based on consistent metadata in SAP Business Intelligence.
[Diagram: BW Content object stack - Role, Web Reports, Workbook, Query, MultiProvider, DSO, Transformations/DTP, InfoSource, InfoObjects (e.g. Customer, Order quantity), Extractor]
BW Layered Scalable Architecture (LSA)
Data Acquisition Layer
The Acquisition Layer is the inbox of the Data Warehouse:
Fast inbound and outbound flow of data to targets
Accepts data from extraction with as little overhead as possible - no early checks, merges or transformations (1:1)
Adds additional information to each record, like origin (source system), load time, and origin organizational data (like company code). This makes standardized administration and addressing of all records possible.
Provides abstraction of the Data Warehouse from the sources of data
Provides a short-term history of extracted data for immediate/short-term data inspection
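A sketch of the enrichment step described above (conceptual Python; the technical field names are hypothetical): each inbound record is taken over 1:1, with only origin information added.

```python
# Conceptual sketch of the Acquisition Layer: records are stored 1:1, with
# only technical origin fields added. Field names are hypothetical.

from datetime import datetime, timezone

def acquire(record, source_system, company_code):
    enriched = dict(record)                    # content untouched (1:1)
    enriched["SOURCE_SYSTEM"] = source_system  # origin of the record
    enriched["LOAD_TIME"] = datetime.now(timezone.utc).isoformat()
    enriched["COMPANY_CODE"] = company_code    # origin organizational data
    return enriched

print(acquire({"DOC": "4711", "AMOUNT": 99.9}, "R3PRD", "1000"))
```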
Harmonization and Quality Layer
The data is passed from the Acquisition Layer to the Harmonization and Quality Layer, which is responsible for transforming the extracted data in accordance with common semantics and values. The final result is stored in Propagation Layer DataStore objects.
What happens in this layer depends largely on the quality and integration level of the extracted data. There is often no explicit data storage in this layer. This is especially true of data flows that receive data from SAP DataSources, as the data derived from SAP sources is frequently already in good shape.
Please note: no business-scenario-driven transformations are allowed here. This would reduce or prevent the reusability of the data in the Propagation Layer.
Data Propagation Layer
The Data Propagation Layer offers persistent data (stored in DataStore objects) which is compliant with the defined company quality, integration and unification standards. The Data Propagation Layer meets the "extract once, deploy many" and "single version of truth" requirements (reusability).
Business Transformation Layer
The data-mart-related layers get their data from the Data Propagation Layer. All business-related requirements are modelled in these layers.
In the Business Transformation Layer, data transformations take place which serve to fulfil the data mart scope. Dedicated DataStore objects in the Business Transformation Layer might be necessary if we have to join or merge data from various Propagation Layer DataStore objects.
Please note: only apply business transformation rules on reusable Propagation Layer DataStore objects.
Reporting Layer
As the name implies, the Reporting Layer contains the reporting-related InfoProviders (architected data marts). The Reporting Layer objects can be implemented as InfoCubes, with or without BW Accelerator, or sometimes as DataStore objects.
Virtualization Layer
To ensure greater flexibility, the queries should always be defined on a MultiProvider.
Queries, please.
