Informatica Scenarios
The following list of Informatica scenarios helps you learn Informatica transformations and prepare
for interviews.
Retrieve the previous row value when processing the current row.
Skip the last N rows from the source and load the remaining rows.
Load the last N rows from the source into the target.
Load odd number rows into one target and even numbered rows into another target.
Generate the source file name and load into the target.
Generate the target file name dynamically in the mapping, without specifying it explicitly.
Last row should become first row and first row should become last row in the target.
How to get the previous row value while processing the current row in Informatica?
One of my blog readers asked this question. The source data is shown below:
Table Name: Customers
cust_id, year, city
-----------------------
10, 2004, NY
The question: for each customer, while processing the current row, you have to get the city value of
the previous row. If there is no previous row, the previous city value should be NULL. The
output data is shown below:
------------------------------
Solution:
Connect the source qualifier transformation to the sorter transformation and sort the data
on cust_id, year ports in ascending order.
Connect the sorter transformation to the expression transformation. In the expression
transformation, create the below additional ports and assign the corresponding
expressions:
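A typical arrangement (the port names v_prev_city, v_curr_city and o_prev_city are illustrative) is:

v_prev_city (variable port) = v_curr_city
v_curr_city (variable port) = city
o_prev_city (output port) = v_prev_city

Because the Integration Service evaluates variable ports from top to bottom, v_prev_city is assigned
while v_curr_city still holds the previous row's city, and only then is v_curr_city refreshed with the
current row's city. Connect cust_id, year, city and o_prev_city to the target; for the first row,
v_prev_city simply has its initial default value, which serves as the "no previous row" case.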
The source for the next scenario has a single Name column:
Name
----
A
B
After excluding the last 5 records, I want to load the remaining rows (A, B) into the target. How do I
implement a mapping logic for this in Informatica?
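A sketch of one way to do this (port names such as o_row_num and o_total_count are illustrative):
assign a running row number in an expression transformation (or with a sequence generator), compute
the total number of rows with an aggregator transformation, join the aggregate count back to the
detail rows with a joiner transformation on a dummy port, and finally keep only the leading rows with
a filter condition like:

o_row_num <= o_total_count - 5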
I have the employees table as a source. The data in the employees table is shown below:
dept_id, emp_id, salary
---------------------
20, 302, 50000
I want to sort the data on the department id and employee id, and then find the cumulative sum of
salaries of employees in each department. The output is shown below:
---------------------------------
Solution: Follow the below steps to implement the mapping logic in Informatica.
Connect the source qualifier transformation to a sorter transformation. Sort the rows on the
dept_id and emp_id ports in ascending order.
Connect the sorter transformation to the expression transformation. In the expression
transformation, create the following additional ports and assign the corresponding
expressions:
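A sketch of the ports (names are illustrative) that produce a running salary total which resets for
each department:

v_cum_salary (variable port) = IIF(dept_id = v_prev_dept_id, v_cum_salary + salary, salary)
v_prev_dept_id (variable port) = dept_id
o_cum_salary (output port) = v_cum_salary

Because the rows arrive sorted on dept_id, the running total restarts whenever the department id
changes.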
Connect the expression transformation ports to the target. Save the mapping.
The sales information of a product for each month is available in a separate row. I want to convert
the rows for all the months in a specific year to a single row. The output is shown below:
Target Data:
-------------------------------------------
Solution:
Follow the below steps to implement the mapping logic for the above scenario in informatica:
Create a new mapping.
Drag the source into the mapping.
Create an expression transformation.
Drag the ports of source qualifier into the expression transformation.
Create the below additional ports in the expression transformation and assign the
corresponding expressions:
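A sketch of the idea, assuming the source has year, month and sales ports (the output port names are
illustrative): create one output port per month in the expression transformation, for example

o_jan_sales (output port) = IIF(month = 'JAN', sales, 0)
o_feb_sales (output port) = IIF(month = 'FEB', sales, 0)
(and similarly for the remaining months)

Then connect the expression transformation to an aggregator transformation, group by year, and take
SUM (or MAX) of each month port, so the twelve monthly rows of a year collapse into a single row that
is loaded into the target.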
The next scenario uses the following products source:
Products
--------
Windows
Linux
Unix
Ubuntu
Fedora
Centos
Debian
I want to load only the last record (the footer) into the target table. The target should contain only the
product "Debian". Follow the below steps for implementing the mapping logic in Informatica:
Create a new mapping and drag the source into the mapping. By default, it creates the
source qualifier transformation.
Now create an expression transformation and drag the ports from source qualifier into the
expression transformation. In the expression transformation, create the below additional
ports and assign the corresponding expressions:
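The usual trick is a running row counter; a sketch with illustrative port names:

v_count (variable port) = v_count + 1
o_count (output port) = v_count

With these ports, the expression transformation emits each product together with its row number: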
Products, o_count
-----------------
Windows, 1
Linux, 2
Unix, 3
Ubuntu, 4
Fedora, 5
Centos, 6
Debian, 7
Now connect the expression transformation to a sorter transformation and sort the rows on
the o_count port in descending order. The output of sorter transformation is shown below:
Products
--------
Debian
Centos
Fedora
Ubuntu
Unix
Linux
Windows
Create another expression transformation and connect the Products port of the sorter to the
expression transformation. Create the following ports in the expression transformation:
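The same row-counter ports are used again (illustrative names, matching the filter in the next step):

v_count (variable port) = v_count + 1
o_count (output port) = v_count

Since the rows are now sorted in descending order, the footer record Debian receives o_count = 1.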
Connect the expression to a filter transformation and specify the filter condition as o_count
= 1.
Connect the filter to the target and save the mapping.
I have a source file which contains N number of records. I want to load the source records into two
targets such that the first row goes into target 1, the second row goes into target 2, the third row goes
into target 1 again, and so on (odd-numbered rows into target 1, even-numbered rows into target 2).
Let's see how to create the mapping logic for this in Informatica. Consider the following source flat
file as an example:
Products
---------
Informatica
Datastage
Pentaho
MSBI
Oracle
Mysql
The data in the targets should be:
Target1
-------
Informatica
Pentaho
Oracle
Target2
-------
Datastage
MSBI
Mysql
Solution:
The mapping flow and the transformations used are mentioned below:
SRC->SQ->EXP->RTR->TGTS
First create a new mapping and drag the source into the mapping.
Create an expression transformation. Drag the ports of source qualifier into the expression
transformation. Create the following additional ports and assign the corresponding
expressions:
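As in the earlier scenario, these are row-counter ports (illustrative names):

v_count (variable port) = v_count + 1
o_count (output port) = v_count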
Create a router transformation and drag the ports (products, o_count) from the expression
transformation into the router transformation. Create an output group in the router
transformation and specify the following filter condition:
MOD(o_count,2) = 1
Now connect the output group of the router transformation to the target1 and default group
to target2. Save the mapping.
In the above solution, I have used an expression transformation for generating the numbers. You can
also use a sequence generator transformation to produce the sequence values.
We will create a simple pass through mapping to load the data and "file name" from a flat file into
the target. Assume that we have a source file "customers" and want to load this data into the target
"customers_tgt". The structures of the source and target are shown below:
Source: Customers
Customer_Id
Location
Target: Customers_TBL
Customer_Id
Location
FileName
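In outline, the mapping needs the source definition to expose the file name; a short sketch of the
steps (the option and port names below are the standard PowerCenter ones):

Edit the flat file source definition in the Source Analyzer and, on the Properties tab, enable the
"Add Currently Processed Flat File Name Port" option. This adds a port named
CurrentlyProcessedFileName to the source definition.
Create the pass-through mapping, drag in the source and the target, and connect Customer_Id,
Location and CurrentlyProcessedFileName of the source qualifier to Customer_Id, Location and
FileName of the target.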
The loading of the filename works for both the Direct and Indirect source file types. After running the
workflow, the data and the filename will be loaded into the target. The important point to note is
that the complete path of the file will be loaded into the target, meaning both the directory path and
the filename are loaded (for example: /informatica/9.1/SrcFiles/Customers.dat).
If you don’t want the directory path and just want the filename to be loaded in to the target, then
follow the below steps:
Create an expression transformation and drag the ports of source qualifier transformation
into it.
Edit the expression transformation, go to the ports tab, create an output port and assign the
below expression to it.
REVERSE(
    SUBSTR(
        REVERSE(CurrentlyProcessedFileName),
        1,
        INSTR(REVERSE(CurrentlyProcessedFileName), '/') - 1
    )
)
Reversing the path puts the file name before the first '/', so the SUBSTR extracts it and the outer
REVERSE restores it to its original order.
Now connect the appropriate ports of expression transformation to the target definition.
I have the products table as the source and the data of the products table is shown below.
Product Quantity
-----------------
Samsung NULL
Iphone 3
LG 0
Nokia 4
Now I want to duplicate or repeat each product in the source table as many times as the value in the
quantity column. The output is:
product Quantity
----------------
Iphone 3
Iphone 3
Iphone 3
Nokia 4
Nokia 4
Nokia 4
Nokia 4
The Samsung and LG products should not be loaded, as their quantity is NULL and 0 respectively.
Now create an Informatica mapping and workflow to load the data into the target table.
Solution:
Follow the below steps:
Create a Java transformation in active mode and drag the ports of the source qualifier into it.
Edit the Java transformation, go to the Java Code tab and enter the following code in the On Input
Row section:

if (!isNull("quantity"))
{
    double cnt = quantity;  // number of copies of this row to emit
    for (int i = 1; i <= cnt; i++)
    {
        product = product;
        quantity = quantity;
        generateRow();
    }
}
Now compile the Java code using the Compile option on the Java Code tab.
Connect the ports of the java transformation to the target.
Save the mapping, create a workflow and run the workflow.
Go to the Target Designer (Warehouse Designer) and edit the flat file target definition. On the
Columns tab, click the button that adds the FileName column; this creates the special FileName port
in the target definition.
Now we will see some Informatica mapping examples for creating the target file name dynamically
and loading the data.
1. Generate a new file for every session run.
Whenever the session runs you need to create a new file dynamically and load the source data into
that file. To do this just follow the below steps:
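STEP1: Connect the source qualifier to an expression transformation. In the expression
transformation create an output port File_Name (referenced in the next step) and assign it an
expression along these lines (the exact timestamp format is only a suggestion):

'EMP_'||TO_CHAR(SESSSTARTTIME, 'YYYYMMDDHH24MISS')||'.dat'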
STEP2: Now connect the expression transformation to the target and connect the File_Name port of
the expression transformation to the FileName port of the target file definition.
Here I have used sessstarttime, as it is constant throughout the session run. If you have used
sysdate, a new file will be created whenever a new transaction occurs in the session run.
2. Create a new file for every session run. The file name should contain a number suffix (EMP_n.dat).
In the above mapping scenario, the target flat file name contains a timestamp suffix
('timestamp.dat'). Here we have to create the suffix as a number, so the file names should look like
EMP_1.dat, EMP_2.dat and so on. Follow the below steps:
STEP1: Go to the mapping parameters and variables -> Create a new variable, $$COUNT_VAR, and set
its data type to Integer.
STEP2: Connect the source qualifier to the expression transformation. In the expression
transformation create the following new ports and assign the expressions.
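A sketch of the ports (o_file_name and $$COUNT_VAR come from the other steps; the remaining
names are illustrative):

v_count (variable port) = v_count + 1
v_file_count (variable port) = IIF(v_count = 1, SETVARIABLE($$COUNT_VAR, $$COUNT_VAR + 1), v_file_count)
o_file_name (output port) = 'EMP_'||TO_CHAR(v_file_count)||'.dat'

SETVARIABLE increments the mapping variable only once per session run (on the first row), so each
run produces the next number in the file name.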
STEP3: Now connect the expression transformation to the target and connect the o_file_name port
of expression transformation to the FileName port of the target.
3. Create a new file once a day.
You can create a new file only once in a day and run the session multiple times in the day to
load the data. You can either overwrite the file or append the new data.
This is similar to the first problem. Just change the expression in expression transformation to
'EMP_'||to_char(sessstarttime, 'YYYYMMDD')||'.dat'. To avoid overwriting the file, use Append If
Exists option in the session properties.
4. Create a flat file based on the values in a port.
You can create a new file for each distinct value in a port. As an example, consider the employees
table as the source. I want to create a file for each department id and load the appropriate data into
the files.
STEP1: Sort the data on department_id. You can either use the source qualifier or sorter
transformation to sort the data.
STEP2: Connect to the expression transformation. In the expression transformation create the
below ports and assign expressions.
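A sketch of the ports (names are illustrative; dept_id is assumed to be the sorted department
column):

v_new_dept_flag (variable port) = IIF(dept_id = v_prev_dept_id, 0, 1)
v_prev_dept_id (variable port) = dept_id
o_new_dept_flag (output port) = v_new_dept_flag
o_file_name (output port) = 'DEPT_'||TO_CHAR(dept_id)||'.dat'

The flag marks the first row of every department, and o_file_name is later connected to the FileName
port of the flat file target.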
STEP4: Now connect the expression transformation to the transaction control transformation and
specify the transaction control condition; a condition that fits the port sketch above is shown below.
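Assuming the illustrative o_new_dept_flag port from the sketch above, the transaction control
condition would be:

IIF(o_new_dept_flag = 1, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION)

This commits the open transaction and starts a new target file whenever the department changes.
Finally, connect the ports (including o_file_name to the FileName port of the target) and save the
mapping.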
Solution:
Follow the below steps for creating the mapping logic:
Now create a sorter transformation and drag the ports of expression transformation into it.
In the sorter transformation specify the sort key as o_count and sort order as DESCENDING.
Drag the target definition into the mapping and connect the ports of sorter transformation
to the target.
Q2) Load the header record of the flat file into the first target, the footer record into the second
target and the remaining records into the third target.
I have already posted the solution to this problem using an aggregator and a joiner. Now we will see
how to implement this by reversing the contents of the file.
Solution: