If You Store Values in Headers or Properties in Your Iflow
If you store values in headers or properties in your iflow, those values are gone when the iflow
finishes. If you need to keep values longer than that, you can store them in a variable using the Write
Variables step. In other words, variables are a way to persist data.
A global variable can be accessed by any iflow, but a local variable can only be accessed by the iflow that
wrote it.
You can see all the variables you've stored in the Operations view => Manage Stores => Variables.
Variables are similar to Data Stores, really. But variables store scalar values (i.e. one number, one
timestamp etc.) whereas Data Stores contain complete payloads (e.g. an XML or JSON document).
Please note that while there's a Write Variables step, there's no Read Variables step. To fetch the value
of a variable into a property or header, you use a Content Modifier with the type set to either Local
Variable or Global Variable.
A typical use case: storing the Last Run Date, which will be used by the same process in a different
(later) execution.
Q: What is the difference between a local and a global variable?
A: A local variable can be accessed by the same iflow only; a global variable can be accessed by different iflows.
Q: How do you read a local variable or a global variable?
A: Use a Content Modifier to read it into either a header or a property.
Q: At the iflow's first run the variable is not created yet, but some initial/default value is needed for
processing. How do you handle this chicken-and-egg situation?
A: Configure a default value in the Content Modifier that reads the variable; the default is used when the variable does not exist yet.
Q: Is it possible for a local and a global variable to have the same name?
A: Yes, since the scope is different between local and global.
Q: How do you do delta synchronization via timestamp?
A: Use a variable to remember the last processed timestamp, so that the next scheduled run resumes
from that timestamp onward.
Q: What needs to be considered when designing delta synchronization via timestamp? (A minimal Groovy sketch follows the list below.)
(1) The timestamp should be unique (e.g. only a date without a time might not work).
(2) The right date field should be used for the delta synchronization.
(3) Only update the last processed timestamp in the last step, if all processing succeeded.
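A minimal Groovy sketch of this pattern, assuming the variable was already read into a property by a Content Modifier; the names lastRunTimestamp, thisRunTimestamp, odataFilter, and LastModifiedDateTime are illustrative, not fixed:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Filled by a Content Modifier from the variable; fall back to a default
    // on the very first run, when the variable does not exist yet
    def lastRun = message.getProperty("lastRunTimestamp") ?: "1970-01-01T00:00:00"
    // Build a delta filter for the receiver query (field name is illustrative)
    message.setProperty("odataFilter", "LastModifiedDateTime gt datetime'" + lastRun + "'")
    // Remember this run's start time; write it to the variable with a final
    // Write Variables step only after all processing has succeeded
    message.setProperty("thisRunTimestamp", new Date().format("yyyy-MM-dd'T'HH:mm:ss"))
    return message
}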
A: Build a manualRun/adhocRun flag into the same iflow to set a manual timestamp and override the value in the variable.
A: Use a global variable if other iflows need to access the same variable. A global variable can behave like a local one, but not the other
way round.
A: Build a generic iflow and use Postman to read and write the global variable.
Q: What ways can be used to delete a variable?
A: Manual deletion via the 'Manage Variables' page.
Q: What other potential uses does a variable have?
A: Accessing the same value in different branches of a Multicast (because a property will not work there).
The Send palette function supports these adapters: AS2, FTP, JMS, Mail, SOAP RM, SFTP, and XI.
Request Reply supports: OData, HTTP, SOAP, JDBC, and ProcessDirect adapters.
The OData adapter allows users to communicate with OData APIs using the OData protocol.
Content Enricher limitation: it supports only the XML data format and only the SuccessFactors, SOAP 1.x, and
OData adapter types.
Every run of a deployed iflow gets a new Message ID and Correlation ID.
Since a single call cannot pull the entire data set from SuccessFactors, we go for a Looping Process Call.
Looping Process Call – it repeatedly executes the steps defined in the local integration process until the
condition is met or the maximum allowed iterations are reached.
What is a technical error? Connectivity issues, memory issues, timeout issues.
What is a functional error? Incorrect mapping, data transformation issues, business rule violations.
The main difference between a variable and a data store is that a variable stores a single value while a data store
stores an entire payload.
Send is used when no reply is expected, while the Request Reply pattern is used when a response is
expected.
- The Aggregator accepts only XML data to merge, but with Gather we can merge XML, text, tar, zip,
etc.
- The Aggregator stores the payload in a data store until the whole set of data has been received.
The Splitter sets CamelSplitComplete = true on the last split message; that is how Gather knows it has received the last message.
If a Looping Process Call fails at the 5th iteration, it passes the 4th iteration's payload to the next palette step. What if I
need payloads 1 to 4 as well?
Apart from ProcessDirect, are there other options to call one iflow from another? Yes: HTTP or SOAP.
Fixed Value Mapping: Fixed Values are very useful when you have lots of conditions on one source
field during mapping; to avoid IfThenElse logic or a UDF, we go for Fixed Value mapping.
Example: your source field contains 01, 02, 03, ... 12 and you want the result as JAN, FEB,
MAR, ... DEC.
Advantage: using a fixed value table you can see the result immediately in the mapping.
Disadvantage: if you have used the same fixed value mapping in several places in your mapping, then
any change in the fixed values has to be made in all those places. In short, fixed
value mapping is not good from a maintenance point of view.
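For comparison, the month example as a UDF-style Groovy lookup instead of a fixed value table; a sketch only, the function name monthName is hypothetical:

def String monthName(String code) {
    // Fixed-value table as a simple map; unmapped codes fall back to the input
    def names = ["01": "JAN", "02": "FEB", "03": "MAR", "04": "APR",
                 "05": "MAY", "06": "JUN", "07": "JUL", "08": "AUG",
                 "09": "SEP", "10": "OCT", "11": "NOV", "12": "DEC"]
    return names[code] ?: code
}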
Value Mapping: Value mapping works in the same way as fixed value mapping, except that you only
get the result at run time, and you define the value mapping in the Integration Directory (in SAP PI/PO).
Advantage: if you have used the same value mapping in several places in your mapping, then in case of
a change in values you don't have to touch your mapping; just change the value mapping in the
Integration Directory, that's it.
Disadvantage: you can't see the result immediately in the mapping; results can be seen only at run time.
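In SAP Integration Suite the equivalent artifact is a deployed Value Mapping, which can also be read from a Groovy script via the ValueMappingApi. A minimal sketch; the agency and identifier names are illustrative and must match the deployed artifact:

import com.sap.gateway.ip.core.customdev.util.Message
import com.sap.it.api.ITApiFactory
import com.sap.it.api.mapping.ValueMappingApi

def Message processData(Message message) {
    def api = ITApiFactory.getApi(ValueMappingApi.class, null)
    // Look up the target value for source value "01"
    def monthName = api.getMappedValue("SourceAgency", "MonthNumber", "01", "TargetAgency", "MonthName")
    message.setProperty("monthName", monthName ?: "UNMAPPED")
    return message
}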
-----------------------------------------------------------------------------------------------------------------------------------
ProcessDirect – the message body is shared from one iflow to another by default. To share headers between
the iflows, go to Runtime Configuration -> Allowed Header(s) and enter * or the specific header names you
want to transfer.
Parallel: if any branch fails, all branches are still executed, and the status of the iflow in Message
Processing will be Failed.
Sequential: the branches are executed in sequential order; if any branch fails,
the remaining branches are not executed.
In both parallel and sequential multicast, if any branch fails, Join and Gather are not executed.
Parallel with Join + Gather: if one branch fails, the remaining branches still execute, but the
message will not reach Join and Gather, and the message processing status will be Failed in the
Monitor section.
Sequential with Join + Gather: the branches execute in sequential order; if one branch fails,
the remaining branches do not execute, the message will not reach Join and Gather,
and the message processing status will be Failed in the Monitor section.
-----------------------------------------------------------------------------------------------------------------------------------
How do you send a property from a parent iflow to a child iflow in the same tenant when they are connected
by the JMS adapter? To transfer exchange properties from the parent to the child iflow in the same
tenant via JMS, check the "Transfer Exchange Properties" checkbox in the JMS sender adapter
configuration.
Whenever you apply a Filter, you need to add a Content Modifier afterwards, because the filter deletes the root element.
Suppose you get an incoming payload and want to send it to several receivers: then Multicast
is used. If you want to send it to different receivers depending on a
condition, then Router is used.
Splitter + Router – when you want to send only the relevant data to each receiver instead of the entire
payload.
Multicast + Filter – when you want to send only the relevant data to each receiver instead of the entire
payload.
Send: can be used to configure a service call to a receiver system where no reply is expected.
Filter value types: Nodelist, Node, Boolean, String, Integer.
Think of using a lookup table in Excel: you read one column from one Excel file, look up the
same data in another Excel file, and retrieve some related data. That is what value mapping does:
from the incoming message you pick up one value, look it up in a table-like structure, pick up the
related information, and push that data on; the value from the incoming message is translated to the
value the receiving side expects.
Process Call – you need a Process Call step in your main integration flow; it lets you select and
execute the desired local integration process from the available local integration processes.
---------------------------------------------------------------------------------------------------------------------------------------
Add a Looping Process Call step to repeatedly execute the steps defined in a local integration process until
the condition is met or the maximum number of allowed iterations is reached, whichever comes first.
Step 1. First, we create a CPI endpoint so that we can make calls to the service.
Step 2. We specify that the Looping Process Call will work according to the expression in the
"Condition Expression" field. By stating that the hasMoreRecords property contains 'true'
(e.g. ${property.<receiver>.<channel>.hasMoreRecords} contains 'true'), we indicate that the
loop will continue to run as long as there are more records to fetch.
Step 3. We use the "select" clause to choose which fields we want to retrieve from the Orders entity.
Step 4. After passing through the filter, the data will no longer include "Orders" but will start with
"Order". This is because we need the "Orders/Order" content, since the data is sent in
fragments. After the fragmented data has been sent, we will merge it in the "Message
Body" of the Content Modifier.
Figure: Filter
Step 5. We add the "Orders" tag, which we removed with the filter, back in this Content Modifier. Once the
loop is completely finished, we add the merged data as a property.
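As an alternative to merging in the Content Modifier's message body, a Groovy script can accumulate the fragments in a property. A sketch; mergedOrders is an assumed property name:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Append this iteration's fragment to what has been collected so far
    def collected = (message.getProperty("mergedOrders") ?: "") + message.getBody(String)
    message.setProperty("mergedOrders", collected)
    return message
}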
Step 6. To indicate that the final data will come in XML format, we add the "Content-Type" header.
Figure: Content Type
Step 7. We determine who the email will come from and who it will go to.
Figure: Mail Adapter Processing Information
Step 8. Save and deploy. Since we created a CPI endpoint, we then call it using the GET method
in Postman after the deployment.
Step 9. We make the call to the CPI service using the CPI username and password.
Figure: Postman
Step 10. It entered the loop a total of 6 times; on the 6th request, since there was no data left,
it combined the data sent in fragments, exited the loop, and continued to 'End'.
Figure: Monitoring
When we look at our first loop, it shows that the first request fetched the first 200 records from
the entire data set, and the expression "$skiptoken=10447" tells us that the next loop will start with
OrderID 10448.
As data is added in each loop, it shows that there were 400 records after the 2nd request; when it
enters the 3rd loop, it won't fetch the same initial 400 records again. Similarly, it shows that in the next
loop the data will start with OrderID 10648.
The important point to note is that it continues to loop as long as the condition we set is met, i.e. it
enters the loop as long as the condition evaluates to true.
When we check the final step, we see that this condition returns false, indicating that all the data
has been fetched.
Since the loop process ended because of the condition, we can see that the last record has
OrderID 11047.
Finally, I wanted to add a mail adapter; it sends all this information via email.
${CamelFileName} - File Name
${CamelHttpUri} - HTTP URI (URL with query string)
${CamelDestinationOverrideUrl} - SOAP Adapter URL
${exception.message} - Error Message
${exception.stacktrace} - Error Stacktrace
${camelId} - Iflow Name
${SAP_ApplicationID} - ID created in the Message Processing Log for searching in monitoring
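Inside an exception subprocess the same error details are available to a Groovy script via the CamelExceptionCaught property; a minimal sketch, with errorMessage as an assumed property name:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // The runtime stores the caught exception in this exchange property
    def ex = message.getProperty("CamelExceptionCaught")
    if (ex != null) {
        message.setProperty("errorMessage", ex.getMessage())
    }
    return message
}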
Interview questions:
Can you please help me understand in which situations we go for an Error End versus an Escalation End? The only
difference I can see is the message status, which is Failed / Escalated. Apart from this, what is the
purpose of each, and have you come across real situations needing them separately? Please share
your thoughts.
In SAP CPI, an *Error End* event is used when the integration flow encounters a critical failure that
cannot be recovered, such as a mapping error, connectivity issue, or missing mandatory data, and needs
to explicitly terminate while logging the error for monitoring and troubleshooting. For example, if an API
call fails due to invalid credentials, the flow can be terminated using an Error End to ensure the issue is
logged as a failure in monitoring. An *Escalation End* event, on the other hand, is used to signal a non-
critical error or exceptional scenario that may require alternative handling or routing within a larger
process context, such as notifying a stakeholder or triggering a compensatory process. For instance, if a
stock check API returns "out of stock," an Escalation End could notify the sales team to follow up, while
allowing other parts of the process to continue. Use *Error End* for unrecoverable issues and
*Escalation End* for managed deviations that do not halt the overall process.
In SAP CPI, *trace* mode captures detailed, end-to-end information about the integration flow,
including intermediate payloads, headers, and properties, to help analyze the flow step by step; it
impacts performance significantly and is typically used only for troubleshooting. In contrast, *debug* is more
focused on testing specific components such as Groovy scripts, enabling developers to identify logic errors
or configuration issues within a particular step, with a lower performance impact; the output is written
from scripts into the Message Processing Logs. Trace is for operational analysis, while debug is for development
and testing.
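A common way to debug from a Groovy script is to attach the current payload to the Message Processing Log; a minimal sketch (messageLogFactory is provided to script steps by the runtime, the attachment name is illustrative):

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    def body = message.getBody(String)
    // Attach the payload to the MPL so it can be inspected in the monitor
    def log = messageLogFactory.getMessageLog(message)
    if (log != null) {
        log.addAttachmentAsString("DebugPayload", body, "text/plain")
    }
    return message
}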
Multicast with a single receiver (-> Join + Gather).
When splitting a message into multiple branches, changes made to headers or properties within one
branch are not reflected in the other branches.
Mail Adapter:
You can download these server certificates: the server certificate chain, root certificate, intermediate
certificate, and peer certificate.
In the past, I wrote some articles on defining dynamic parameters such as the filename and directory in the
receiver file adapters in SAP PI/PO, using techniques like ASMA and Variable Substitution.
The SAP Integration Suite CI (BTP-IS/CI/CPI) technique, however, is more straightforward.
We define the filename with a unique timestamp and copy the file name prefix from the incoming file.
Imagine a scenario where you have files with different file name prefixes in a certain directory on the
SFTP server. I want to build an iFlow that can fetch and route these files to the target based on their file
name prefix. For example, files starting with "Order" should be moved to the "Orders" target folder on
the SFTP server, invoices to the "Invoices" folder, and all other files to the "Other" folder.
In this scenario, we will make use of the following features of SAP Integration Suite interface
development techniques:
Step 3 – Make Use of an Exchange Property or Header Parameter to Set the Directory
Let's make use of Content Modifiers to determine the directory at runtime. We will have an exchange
property named "directory" to set the value of the directory.
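The directory property can also be set in a Groovy script instead of a Content Modifier. A sketch of the prefix-based routing described above; the folder names come from the scenario, the fallback to "Other" mirrors it:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Route by file name prefix: Order* -> Orders, Invoice* -> Invoices, else Other
    def name = (message.getHeaders().get("CamelFileName") ?: "") as String
    def dir = name.startsWith("Order") ? "Orders" : (name.startsWith("Invoice") ? "Invoices" : "Other")
    message.setProperty("directory", dir)
    return message
}

The receiver SFTP adapter's Directory field then references ${property.directory}.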
${file:onlyname.noext}
Camel Simple Expression to get the prefix, i.e. the file name of the incoming file without the extension.
${date:now:yyyy-MM-dd}
Camel Simple Expression to add the date as the 2nd part of the file name, in the format yyyy-MM-dd.
${date:now:HH-mm-ss}
Camel Simple Expression to add the time as the 3rd part of the file name, in the format HH-mm-ss (mm
and ss must be lower case in SimpleDateFormat patterns; MM would insert the month).
Other Methods of Setting a Dynamic File Name in SAP Integration Suite CI (BTP-IS/CI/CPI)
In the example, we made use of a custom Exchange/Header Parameter, a standard header parameter
and a Camel Simple Expression to dynamically define the directory and filename at the receiver adapter.
The same file name logic in a Groovy script; a minimal sketch, where the fallback prefix "file" is an assumption:

import com.sap.gateway.ip.core.customdev.util.Message
import java.text.SimpleDateFormat

def Message processData(Message message) {
    // Incoming file name without its extension, plus a date-time suffix
    def prefix = ((message.getHeaders().get("CamelFileName") ?: "file") as String).replaceFirst('\\.[^.]+$', '')
    message.setHeader("CamelFileName", prefix + "_" + new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss").format(new Date()))
    return message
}