Working With PowerCenter 8 Designer
PowerCenter 8.1.0 Designer
Content
Designer
Designer is used to create mappings that contain transformation instructions for the
Integration Service. The Designer has the following tools that we use to analyze sources,
design target schemas, and build source-to-target mappings.
Source Analyzer
It imports or creates source definitions.
Target Designer
It imports or creates target definitions.
Transformation Developer
Develop transformations to use in mappings. We can also develop user-defined functions to use in expressions.
Mapplet Designer
It creates sets of transformations to use in mappings.
Mapping Designer
It creates mappings that the Integration Service uses to extract, transform, and load data.
Navigator
Use the Navigator to connect to repositories and open folders. We can also copy objects and create shortcuts within the Navigator.
Workspace
Open different tools in this window to create and edit repository objects, such as sources, targets, mapplets, transformations, and mappings.
Output
View details about tasks you perform, such as saving your work or validating a mapping.
Designer Windows
Status bar
Displays the status of the operation you perform.
Overview
An optional window to simplify viewing a workspace that contains a large mapping or multiple objects. It outlines the visible area in the workspace and highlights selected objects in color.
Instance data
View transformation data while you run the Debugger to debug a mapping.
Target data
View target data while you run the Debugger to debug a mapping.
Sources and Targets
Informatica PowerCenter 8 can access the following data sources and load the data into the following targets.

Sources

Relational
• Oracle
• Sybase ASE
• Informix
• IBM DB2
• Microsoft SQL Server
• Teradata

Application
• Hyperion Essbase
• IBM MQSeries
• IBM DB2 OLAP Server
• JMS
• Microsoft Message Queue
• PeopleSoft
• SAP NetWeaver
• SAS
• Siebel
• TIBCO
• WebMethods

Mainframe
• Adabas
• Datacom
• IBM DB2 OS/390
• IBM DB2 OS/400
• IDMS
• IDMS-X
• IMS
• VSAM

File
• Flat file
• COBOL file
• XML file
• Web log

Other
• Microsoft Excel
• Microsoft Access
• External web services

Targets

Relational
• Oracle
• Sybase ASE
• Informix
• IBM DB2
• Microsoft SQL Server
• Teradata

Application
• Hyperion Essbase
• IBM MQSeries
• IBM DB2 OLAP Server
• JMS
• Microsoft Message Queue
• mySAP
• PeopleSoft EPM
• SAP BW
• SAS
• Siebel
• TIBCO
• WebMethods

Mainframe
• IBM DB2 OS/390
• IBM DB2 OS/400
• VSAM

File
• Flat file
• XML file

Other
• Microsoft Access
• External web services
We configure logic in a transformation that the Integration Service uses to transform data.
Transformations in a mapping represent the operations the Integration Service performs on the
data.
Data passes into and out of transformations through ports that we link in a mapping or mapplet.
An active transformation can change the number of rows that pass through it.
A passive transformation does not change the number of rows that pass through it.
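To make the distinction concrete, here is a minimal Python sketch (not Informatica code; the row data is invented): a Filter-style operation is active because it can drop rows, while an Expression-style operation is passive because it emits exactly one output row per input row.

```python
rows = [{"price": 10}, {"price": 75}, {"price": 120}]

# Active: a Filter-style operation may change the number of rows.
def filter_rows(rows, predicate):
    return [r for r in rows if predicate(r)]

# Passive: an Expression-style operation returns one row per input row.
def add_discount(rows):
    return [{**r, "discounted": r["price"] * 0.9} for r in rows]

filtered = filter_rows(rows, lambda r: r["price"] > 50)  # 2 of 3 rows survive
enriched = add_discount(rows)                            # still 3 rows
```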
Designer Transformations
Tasks to incorporate a transformation into a mapping:
• Create the transformation.
• Configure the transformation.
• Link the transformation to other transformations and target definitions.

• Aggregator - to do things like "group by".
• Expression - to use various expressions.
• Filter - to filter data with a single condition.
• Joiner - to make joins between separate databases, files, or ODBC sources.
• Lookup - to create a local copy of the data.
• Normalizer - to transform denormalized data into normalized data.
• Rank - to select only top (or bottom) ranked data.
• Sequence Generator - to generate unique IDs for target tables.
• Source Qualifier - to filter sources (SQL, select distinct, join, etc.).
• Stored Procedure - to run stored procedures in the database and capture their returned values.
• Update Strategy - to flag records in the target for insert, delete, or update (defined inside a mapping).
• Router - same as Filter but with multiple conditions.
• Java Transformation - provides a simple native programming interface to define transformation functionality with the Java programming language.

Transformations can be created in the Mapping Designer, the Transformation Developer, or the Mapplet Designer. A reusable transformation is a transformation that can be used in multiple mappings.
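As an illustration of what the Normalizer does, here is a hypothetical Python sketch that turns a denormalized record with repeated quarterly columns into one row per quarter (the column names are invented for the example):

```python
# Denormalized input: one record with a repeating group of quarterly sales.
record = {"store": "S1", "q1": 100, "q2": 150, "q3": 90, "q4": 200}

# Normalize: emit one output row per occurrence of the repeating column.
def normalize(record, repeating=("q1", "q2", "q3", "q4")):
    return [
        {"store": record["store"], "quarter": i + 1, "sales": record[col]}
        for i, col in enumerate(repeating)
    ]

normalized = normalize(record)  # four rows, one per quarter
```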
Common transformations include: Filter, Router, Sequence Generator, Stored Procedure, Update Strategy, Expression, Aggregator, Lookup, Sorter, Rank, Joiner, and Normalizer.
Aggregate Functions
The aggregate functions can be used within an Aggregator transformation.
We can nest one aggregate function within another aggregate function.
AVG
COUNT
FIRST
LAST
MEDIAN
MAX
MIN
STDDEV
PERCENTILE
SUM
VARIANCE
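Nesting can be pictured with a small Python sketch (plain Python, not Informatica expression syntax): an inner SUM per group feeds an outer MAX across groups.

```python
# Rows grouped by region; we want MAX(SUM(sales)), one aggregate
# nested inside another.
rows = [
    ("east", 100), ("east", 50),
    ("west", 120), ("west", 10),
]

def group_sums(rows):
    totals = {}
    for key, value in rows:
        totals[key] = totals.get(key, 0) + value
    return totals

sums = group_sums(rows)           # inner aggregate: SUM per group
max_of_sums = max(sums.values())  # outer aggregate: MAX over the sums
```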
Conditional Clauses
We use conditional clauses in the aggregate expression to reduce the number of rows used
in the aggregation. The conditional clause can be any clause that evaluates to TRUE or
FALSE.
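The effect of a conditional clause can be sketched in Python (illustrative only; in Informatica the clause is written inside the aggregate expression itself): only rows for which the clause evaluates to TRUE contribute to the aggregate.

```python
# Conditional aggregation: sum only the rows where the clause is TRUE,
# analogous to an aggregate expression with a filter clause.
sales = [50, 150, 200, 80]

def conditional_sum(values, clause):
    return sum(v for v in values if clause(v))

total = conditional_sum(sales, lambda v: v > 100)  # only 150 and 200 count
```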
When we configure the Integration Service, we can choose how we want the Integration
Service to handle null values in aggregate functions. We can choose to treat null values in
aggregate functions as NULL or zero. By default, the Integration Service treats null values
as NULL in aggregate functions.
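The two null-handling choices give different results, as this Python sketch shows (using None for NULL; invented data):

```python
# Treating nulls as NULL vs. zero changes aggregate results.
values = [10, None, 20, None]

def avg_nulls_ignored(values):
    # NULL behaviour: null rows are skipped entirely.
    present = [v for v in values if v is not None]
    return sum(present) / len(present)

def avg_nulls_as_zero(values):
    # Zero behaviour: null rows count as 0 and raise the row count.
    replaced = [0 if v is None else v for v in values]
    return sum(replaced) / len(replaced)
```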
We can use the Expression transformation to calculate values in a single row before we write to the target.
We can use the Expression transformation to test conditional statements.
To perform calculations involving multiple rows, such as sums or averages, we can use the Aggregator transformation.
We can use the Expression transformation to perform any non-aggregate calculations.
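A row-wise, non-aggregate calculation of this kind can be sketched in Python (field names are invented): each output value is derived from a single input row, never from other rows.

```python
# Expression-style calculation: derive a new value per row.
orders = [
    {"quantity": 2, "unit_price": 9.5},
    {"quantity": 1, "unit_price": 20.0},
]

def add_total(row):
    # Each output depends only on its own input row.
    return {**row, "total": row["quantity"] * row["unit_price"]}

with_totals = [add_total(r) for r in orders]
```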
Creating an Expression Transformation
Click the Value section of the condition, and then click the Open button.
The Expression Editor appears.
Enter the filter condition we want to apply.
Use values from one of the input ports in the transformation as part of this condition
However, we can also use values from output ports in other transformations.
We may have to fix syntax errors before continuing.
Click OK.
Select the Tracing Level, and click OK to return to the Mapping Designer.
Choose Repository > Save.
In the Mapping Designer, click Transformation > Create. Select the Joiner transformation.
Enter a name, and click OK.
The naming convention for Joiner transformations is JNR_TransformationName.
Drag all the input/output ports from the first source into the Joiner transformation.
The Designer creates input/output ports for the source fields in the Joiner transformation as
detail fields by default. We can edit this property later.
Select and drag all the input/output ports from the second source into the Joiner
transformation.
The Designer configures the second set of source fields as master fields by default.
Edit Transformation
Double-click the title bar of the Joiner transformation to open the Edit Transformations dialog box.
Select the Ports tab.
Add default values for specific ports as necessary.
Setting the Condition
Select the Condition tab and set the condition.
Click the Add button to add a condition.
Click the Properties tab and configure properties for the transformation.
Click OK.
Joiner Transformation
Join is a relational operator that combines data from multiple tables into a single result set.
We define the join type on the Properties tab in the transformation.
The Joiner transformation supports the following types of joins.
• Normal
• Master Outer
• Detail Outer
• Full Outer
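A sketch in plain Python (not Informatica) of the row sets each join type produces, following the PowerCenter convention that a master outer join keeps all rows from the detail source while a detail outer join keeps all rows from the master source:

```python
master = {1: "hat", 2: "coat"}   # item_id -> item_name (master source)
detail = {2: 35.0, 3: 12.0}      # item_id -> price (detail source)

def join(kind):
    keys = {
        "normal": master.keys() & detail.keys(),        # matches only
        "master outer": set(detail.keys()),             # all detail rows
        "detail outer": set(master.keys()),             # all master rows
        "full outer": master.keys() | detail.keys(),    # all rows
    }[kind]
    return {k: (master.get(k), detail.get(k)) for k in keys}
```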
Use a Lookup transformation in a mapping to look up data in a flat file or a relational table,
view, or synonym.
We can import a lookup definition from any flat file or relational database to which both the
PowerCenter Client and Integration Service can connect.
The Integration Service queries the lookup source based on the lookup ports in the
transformation.
It compares Lookup transformation port values to lookup source column values based on
the lookup condition.
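The condition-based probe can be pictured as a keyed lookup in Python (the column names are illustrative, not from any real mapping):

```python
# The lookup source cached by its condition column; each input row's
# port value (here, an item id) is compared against the cached values.
lookup_source = {101: "hat", 102: "coat"}   # ITEM_ID -> ITEM_NAME

def lookup(item_id, default=None):
    # A connected Lookup returns the default value when nothing matches.
    return lookup_source.get(item_id, default)
```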
Connected Lookup
• Receives input values directly from the pipeline.
• Cache includes all lookup columns used in the mapping.
• If there is no match for the lookup condition, it returns the default value for all output ports.
• Passes multiple output values to another transformation.
• Supports user-defined default values.

Unconnected Lookup
• Receives input values from another transformation calling a :LKP expression.
• You can use a static cache.
• Cache includes all lookup/output ports in the lookup condition.
• If there is no match for the lookup condition, returns NULL.
• Passes one output value to another transformation.
• Does not support user-defined default values.
We can use a Lookup transformation to perform a calculation or to update slowly changing dimension tables. A Lookup transformation can be connected or unconnected, and cached or uncached.
Lookup Components
We have to define the following components when we configure a Lookup transformation in a mapping:
• Lookup source
• Ports
• Properties
• Metadata extensions
Lookup Transformation
Click OK.
Example lookup conditions:
ITEM_ID = IN_ITEM_ID
PRICE <= IN_PRICE
Designating a Return Value
Calling the Lookup Through an Expression:
:LKP.lookup_transformation_name(argument, argument, ...)
To edit the lookup, double-click the Lookup transformation; the Edit Transformations dialog box opens.
The Integration Service builds a cache in memory when it processes the first row of data in
a cached Lookup transformation.
It allocates memory for the cache based on the amount we configure in the transformation
or session properties.
The Integration Service stores condition values in the index cache and output values in the
data cache.
The Integration Service queries the cache for each row that enters the transformation.
The Integration Service also creates cache files by default in the $PMCacheDir.
Persistent cache
Recache from database
Static cache
Dynamic cache
Shared cache
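The split between index cache and data cache can be sketched in Python (illustrative structures, not the actual cache file format): condition values index into stored output values, and each incoming row probes the cache.

```python
# Index cache holds condition values; data cache holds output values.
# Both are built from the lookup source on the first row.
source_rows = [(101, "hat", 9.0), (102, "coat", 35.0)]

index_cache = {}   # condition column value -> position in data cache
data_cache = []    # output column values
for key, name, price in source_rows:
    index_cache[key] = len(data_cache)
    data_cache.append((name, price))

def cached_lookup(key):
    # Probe the index cache, then fetch outputs from the data cache.
    pos = index_cache.get(key)
    return data_cache[pos] if pos is not None else None
```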
Sequence Generator Transformation
We can use the Sequence Generator transformation to:
• Create keys
• Replace missing values
• Cycle through a sequential range of numbers
OUTPUT port: CURRVAL
Current Value = 1, Increment By = 1
When we run the workflow, the Integration Service generates the following constant values for CURRVAL: 1, 1, 1, 1, 1.
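The NEXTVAL side of the transformation can be sketched in Python (the class name is ours, not Informatica's); the constant CURRVAL values above are what the service produces when only the CURRVAL port is connected.

```python
# A counter that mimics NEXTVAL: starts at Current Value and
# advances by Increment By for each row.
class SequenceGenerator:
    def __init__(self, current_value=1, increment_by=1):
        self.value = current_value
        self.increment = increment_by

    def nextval(self):
        v = self.value
        self.value += self.increment
        return v

seq = SequenceGenerator(current_value=1, increment_by=1)
ids = [seq.nextval() for _ in range(5)]  # unique IDs for five rows
```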
SQL Query
• We can give a query in the Source Qualifier transformation.
• From the Properties tab, select SQL Query. The SQL Editor displays. Click Generate SQL.
Joining Source Data
We can use one Source Qualifier transformation to join
data from multiple relational tables. These tables must
be accessible from the same instance or database
server.
Use the Joiner transformation for heterogeneous
sources and to join flat files.
Sorted Ports
In the Mapping Designer, open a Source Qualifier
transformation, and click the Properties tab.
Click in Number of Sorted Ports and enter the number of
ports we want to sort.
The Integration Service adds the configured number of
columns to an ORDER BY clause, starting from the top
of the Source Qualifier transformation.
The source database sort order must correspond to the session sort order.
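The clause generation described above can be sketched in Python (port names are invented): the first N ports, taken top-down, become the ORDER BY columns.

```python
# Ports listed top-to-bottom in the Source Qualifier transformation.
ports = ["ITEM_ID", "ITEM_NAME", "PRICE"]

def generate_order_by(ports, number_of_sorted_ports):
    # Take the configured number of ports from the top of the list.
    if number_of_sorted_ports == 0:
        return ""
    return "ORDER BY " + ", ".join(ports[:number_of_sorted_ports])

clause = generate_order_by(ports, 2)
```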
Stored procedures run in either connected or unconnected mode. The mode we use depends on what the stored procedure does and how we plan to use it in a session. We can configure connected and unconnected Stored Procedure transformations in a mapping.
• Connected: The flow of data through a mapping in connected mode also passes
through the Stored Procedure transformation. All data entering the transformation
through the input ports affects the stored procedure. We should use a connected Stored
Procedure transformation when we need data from an input port sent as an input
parameter to the stored procedure, or the results of a stored procedure sent as an
output parameter to another transformation.
Table tab
Edit properties such as table name, business name, and flat file properties.
Columns tab
Edit column information such as column names, datatypes, precision, and formats.
Properties tab
We can edit the default numeric and datetime format properties in the Source Analyzer and the Target Designer.
Click Launch Editor to create an expression that contains the arguments we defined.
Click OK
The Designer assigns the data type of the data the expression returns. The data types have
the precision and scale of transformation data types.
Click OK
The expression displays in the User-Defined Function Browser dialog box.
A mapplet is a reusable object that we create in the Mapplet Designer. It contains a set of
transformations and we reuse that transformation logic in multiple mappings.
When we use a mapplet in a mapping, we use an instance of the mapplet. Like a reusable
transformation, any change made to the mapplet is inherited by all instances of the mapplet.
Usage of Mapplets
Limitations of Mapplets
We cannot connect a single port in the Input transformation to multiple transformations in
the mapplet.
A mapplet must contain at least one Input transformation or source definition with at least one port connected to a transformation in the mapplet; the same applies to the Output transformation.
When a mapplet contains a source qualifier that has an override for the default SQL query,
we must connect all of the source qualifier output ports to the next transformation within the
mapplet.
We cannot include the following objects: Normalizer transformations, COBOL sources, XML Source Qualifier transformations, XML sources and targets, pre- and post-session stored procedures, and other mapplets.
Data profiling is a technique used to analyze source data. PowerCenter Data Profiling can help us evaluate source data and detect patterns and exceptions. We can profile source data to suggest candidate keys, detect data patterns, and evaluate join criteria.
Use Data Profiling to analyze source data in the following situations.
During mapping development .
During production to maintain data quality.
To profile source data, we create a data profile. We can create a data profile based on a source or mapplet in the repository. Data profiles contain functions that perform calculations on the source data.
The repository stores the data profile as an object. We can apply profile functions to a column within a source, to a single source, or to multiple sources.
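Two of the profiling calculations mentioned above can be sketched in Python (the sample rows and the deliberately bad zip value are invented for illustration): a column whose values are all distinct is a candidate key, and a simple pattern check flags exceptions.

```python
rows = [
    {"id": "1", "zip": "60601"},
    {"id": "2", "zip": "6O601"},   # letter O instead of zero: an exception
    {"id": "3", "zip": "60614"},
]

def candidate_keys(rows):
    # A column with no duplicate values is a candidate key.
    columns = rows[0].keys()
    return [c for c in columns
            if len({r[c] for r in rows}) == len(rows)]

def non_numeric_values(rows, column):
    # Flag values that break the all-digits pattern.
    return [r[column] for r in rows if not r[column].isdigit()]

keys = candidate_keys(rows)
bad_zips = non_numeric_values(rows, "zip")
```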
We can create the following types of data profiles.
Auto profile
Contains a predefined set of functions for profiling source data. Use an auto profile during
mapping development.
Custom profile
Use a custom profile during mapping development to validate documented business rules about the source data. We can also use a custom profile to monitor data quality or validate the results of BI reports.
Optionally, click Save As Default to create new default functions based on the functions
selected here.
Optionally, click Profile Settings to enter settings for domain inference and structure inference
tuning.
Optionally, modify the default profile settings and click OK.
Click Configure Session to configure the session properties after you create the data profile.
Click Next if you selected Configure Session, or click Finish if you disabled Configure Session.
The Designer generates a data profile and profile mapping based on the profile functions.
Configure the Profile Run options and click Next.
Configure the Session Setup options.
Click Finish.
Source View
Debugger Overview
We can debug a valid mapping to gain troubleshooting information about data and error
conditions.
We use the Debugger in the following situations:
• Before we run a session
After we save a mapping, we can run some initial tests with a debug session
before we create and configure a session in the Workflow Manager.
• After we run a session
If a session fails or if we receive unexpected results in the target, we can run the Debugger against the session. We might also run the Debugger against a session if we want to debug the mapping using the configured session properties.
Create breakpoints. Create breakpoints in a mapping where we want the Integration
Service to evaluate data and error conditions.
Configure the Debugger. Use the Debugger Wizard to configure the Debugger for the mapping. Select the session type the Integration Service uses when it runs the Debugger.
Run the Debugger. Run the Debugger from within the Mapping Designer. When we run the Debugger, the Designer connects to the Integration Service. The Integration Service initializes the Debugger and runs the debugging session and workflow.
Monitor the Debugger. While we run the Debugger, we can monitor the target data,
transformation and mapplet output data, the debug log, and the session log.
Modify data and breakpoints. When the Debugger pauses, we can modify data and see the effect on transformations, mapplets, and targets as the data moves through the pipeline. We can also modify breakpoint information.
Create Breakpoints
Go to Mapping > Debugger > Edit Breakpoints.
Choose the instance name and breakpoint type.
Then click Add to add the breakpoints.
Give the condition for the data breakpoint type.
Give the number of errors after which we want the Debugger to stop.
Run the Debugger
Go to Mapping > Debugger > Start Debugger.
Click Next, then choose "create debug session"; otherwise choose "existing session".
Click Next.
Debug Indicators