If You Store Values in Headers or Properties in Your Iflow

The document details the use of variables in integration flows (iflow) within SAP CPI, explaining the distinction between local and global variables, their storage, and retrieval methods. It also covers various integration patterns, error handling, and the functionalities of different adapters and components in the integration process. Additionally, it addresses common questions and scenarios related to variable management, data synchronization, and the execution of integration flows.

Uploaded by

sapcpibuddy
Copyright © All Rights Reserved

---------------------------------------------------------------------------------------------------------------------------------

If you store values in headers or properties in your iflow, those values will be gone when the iflow
finishes. If you need to store values longer than that, you can store them in a variable using the Write
Variables step. In other words, they're a way to persist data.

A global variable can be accessed by any iflow, but a local variable can only be accessed by the iflow that
wrote it.

You can see all the variables you've stored under Operations view => Manage Stores => Variables.

Variables are similar to Data Stores, really. But variables store scalar values (i.e. one number, one
timestamp etc.) whereas Data Stores contain complete payloads (e.g. an XML or JSON document).

Please note that while there's a Write Variables step, there's no Read Variables step. To fetch the value
of a variable into a property or header, you use a Content Modifier with the type set to either Local
Variable or Global Variable.
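As a sketch, a Content Modifier entry that reads a variable into an exchange property might be configured like this (the property name lastRunDate, the variable name lastRun, and the default value are illustrative, not from the original notes):

```
Content Modifier -> Exchange Property tab
  Action:       Create
  Name:         lastRunDate
  Source Type:  Local Variable        (or Global Variable)
  Source Value: lastRun               (the variable's name)
  Data Type:    java.lang.String
  Default:      1970-01-01T00:00:00   (used when the variable does not exist yet)
```

The Default column is what makes the very first run work before the variable has ever been written.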

A typical use case: storing the Last Run Date, which will be used by the same process in a later
execution.

Q: What is the difference between a local variable and a global variable?

A: A local variable can be accessed by the same iFlow only. A global variable can be accessed by
different iFlows.

Q: How to read a local or global variable?

A: Use a Content Modifier to read it into either a header or a property.

Q: How to write a variable?

A: In the iFlow, use the 'Write Variables' step; the value can be taken from a header, property, XPath, or expression.
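Correspondingly, a Write Variables entry could be sketched like this (the variable name lastRun is illustrative):

```
Write Variables step
  Name:         lastRun
  Source Type:  Expression            (or Header / Property / XPath)
  Source Value: ${date:now:yyyy-MM-dd'T'HH:mm:ss}
  Global Scope: unchecked             (check it to write a global variable)
```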

Q: At the iFlow's first run the variable is not created yet, but some initial/default value is needed for
processing – how to handle this chicken-and-egg situation?

A: Use a Content Modifier to read the variable and set a default value.

Q: Besides creating, how to update a variable?

A: Just use 'Write Variables'; if the variable already exists it will be updated/replaced.

Q: Is it possible for a local and a global variable to have the same name?

A: Yes, since the scope is different between local and global.

Q: How to do delta synchronization via timestamp?

A: Use a variable to remember the last processed timestamp, so that the next scheduled run will resume
from that timestamp onward.

Q: What needs to be considered when designing delta synchronization via timestamp?

A: (1) Data should be sorted by timestamp.

(2) The timestamp should be unique (e.g., a date without a time component might not work).

(3) The right date field should be used for delta synchronization.

(4) Only update the last processed timestamp in the last step, if all processing succeeded.

(5) Choose an appropriate timer/scheduler interval.
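Putting the points above together, a minimal delta-load iFlow could be sketched as follows (step order only; the variable name lastRun and the filter field lastModifiedDateTime are illustrative):

```
1. Timer                 – schedules the run
2. Content Modifier      – reads variable lastRun into a property,
                           with a default value for the very first run
3. Request Reply (OData) – queries with a filter such as
                           lastModifiedDateTime gt ${property.lastRun}
4. Mapping / delivery    – processes and sends the delta to the receiver
5. Write Variables       – runs as the last step, storing the new
                           timestamp only if everything succeeded
```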

Q: What if I need to revert to an earlier timestamp?

A: Build into the same iFlow a manualRun/adhocRun flag to set a manual timestamp, overriding the value in the variable.

Q: Should I use global variable or local variable?

A: Use global if other iFlows need to access the same variable. A global variable can behave like a local
one, but not the other way round.

Q: Can I use the SAP CPI Manage Variables page to write a variable?

A: No. As of now there is no way to create/update a variable manually – only via an iFlow.

Q: What other ways are there to manage read/write of a global variable?

A: Build a generic iFlow and use Postman to read/write the global variable.

Q: What ways can be used to delete a variable?

A: Manual deletion via the 'Manage Variables' page.

Q: What other potential uses does a variable have?

A: Accessing the same value in different branches of a Multicast (because a property will not work across branches).

The Send palette function supports: AS2, FTP, JMS, Mail, SOAP RM, SFTP, and XI adapters.

Request Reply supports: OData, HTTP, SOAP, and JDBC adapters, and ProcessDirect.

The OData adapter allows users to communicate with OData APIs using the OData protocol.

Content Enricher limitation: supports only the XML data format and only the SuccessFactors, SOAP 1.x,
and OData adapter types.

Every message will have a Message ID and a Correlation ID. Whenever you deploy the iFlow, the
Message ID and Correlation ID change for every run.

 XSD – defines the field structure and validates the XML values at runtime

 XML – carries the field values

 WSDL – holds the SOAP structure

Splitter – Grouping and Streaming are the options to focus on for interviews

In a single call we cannot pull the entire data set from SuccessFactors, so we go for a Looping Process Call.

Looping Process Call – it repeatedly executes the steps defined in the local integration process until the
condition is met or the maximum allowed iterations are reached.
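The loop's stop condition is a Camel Simple expression in the Condition Expression field; for SuccessFactors/OData paging it typically checks the adapter's hasMoreRecords marker, for example (the "SF" and "OData" parts of the property name stand for your receiver and channel names and are illustrative here):

```
${property.SF.OData.hasMoreRecords} contains 'true'
```

The loop runs again as long as this evaluates to true, and exits when the adapter reports no further pages.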

 Outbound integration – data going out of S/4HANA

 Inbound integration – data coming into S/4HANA

Write variable – local and global

Content modifier to read the local variable in same iflow

Content modifier to read the global variable in different iflow

What is a technical error – connectivity issues, memory issues, timeout issues

What is a functional error – incorrect mapping, data transformation issues, business rule violations

The main difference between a variable and a data store is that a variable stores a single value, while a
data store stores an entire payload.

Send is used when no reply is expected, while the Request Reply pattern is used when a response is
expected.

Request Reply / Content Enricher are synchronous calls.

- Aggregator accepts only XML data to merge, but with Gather we can merge XML, text, tar, zip,
etc.
- Aggregator stores the payload in a data store until the whole set of data is received.

CamelSplitComplete = true on the last split message -> how will the Gather know that this is the last message?
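For reference, Camel's Splitter marks the last split message by setting CamelSplitComplete to true (along with CamelSplitIndex and CamelSplitSize), so a step after the split can detect the final fragment. Assuming the marker is exposed as an exchange property, a Router condition could look like:

```
${property.CamelSplitComplete} = 'true'
```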

Will the Router fail if two conditions are satisfied?

If a Looping Process Call fails at the 5th iteration, it will pass the 4th payload to the next palette step –
what if I need payloads 1 to 4 as well?

Apart from ProcessDirect, are there other options to call one iFlow from another – HTTP or SOAP

What is the difference between Fixed Values and Value Mapping?

Fixed Value Mapping: Fixed Values are very useful when you have many conditions coming in on one source
field during mapping; to avoid IfThenElse or a UDF we go for Fixed Value mapping.

Example: if your source field contains 01 or 02 or 03 or.....12 and you want the result as JAN or FEB or
MAR or ......... DEC.

Advantage: Using fixed value table you can see the result immediately in the mapping.

Disadvantage: If you have used the same fixed value mapping in several places in your mapping, then in
case of any change in the fixed values you have to make the change in several places. So, in short, fixed
value mapping is not good from the maintenance point of view.

Value Mapping: Value mapping works in the same way as fixed value mapping; the only difference is that you
will get the result at runtime only, and you have to define the value mapping in the Integration Directory.

Advantage: If you have used the same value mapping in several places in your mapping, then in case of
any change in values you don't have to change your mapping – just make the change in the value
mapping in the Integration Directory, that's it.

Disadvantage: You can't see the result immediately in the mapping. Results can be seen only at runtime.

-----------------------------------------------------------------------------------------------------------------------------------

 JMS Receiver Adapter – posts/publishes the message to the queue (send)

 JMS Sender Adapter – consumes the message from the queue (read)

Data store has two scopes – Integration Flow (local) and Global

Write step : to store the payload into a data store

Get step : to get the payload from data store

ProcessDirect – the body is shared between iFlows by default; to share headers from one iFlow to
another, use Runtime Configuration -> Allowed Header(s), where you can put * or specify the header
names you want to transfer.

Multicast (Multiple Receivers)

Parallel: all branches are executed; if any branch fails, the status of the iFlow in Message Processing
will be Failed.

Sequential: the branches are executed in sequential order; if any branch fails, the remaining branches
will not be executed.

Multicast with Single Receiver (-> Join + Gather)

In both Parallel and Sequential, if any branch fails, Join and Gather are not executed.

Parallel with Join + Gather: if one of the branches fails, the remaining branches are still executed, but
the message will not reach Join and Gather, and the Message Processing status will be Failed in the
Monitor section.

Sequential with Join + Gather: the branches are executed in sequential order; if one branch fails, the
remaining branches are not executed, the message will not reach Join and Gather, and the Message
Processing status will be Failed in the Monitor section.

-----------------------------------------------------------------------------------------------------------------------------------

How to send a property from a parent iFlow to a child iFlow in the same tenant when they are connected
by the JMS adapter? To transfer exchange properties from the parent to the child iFlow in the same
tenant via the JMS adapter, check the checkbox "Transfer Exchange Properties" in the JMS sender
adapter configuration.

Whenever you apply a Filter you need to add a Content Modifier, as the Filter will delete the root element.

Suppose you are getting an incoming payload and want to send it to different receivers – then Multicast
is used. If you want to send it to different receivers depending on a condition, then Router is used.

Splitter + Router – if you want to send only the relevant data to each receiver instead of sending the
entire payload.

Multicast + Filter – if you want to send only the relevant data to each receiver instead of sending the
entire payload.

Write variables – mostly used in delta loads


Request-Reply (synchronous call): calls an external receiver system synchronously and retrieves a
response.

Send: used to configure a service call to a receiver system where no reply is expected.

Filter output types – nodelist, node, Boolean, string, integer


Think of a lookup table in Excel: you read one column from one Excel file, look up the same value in
another Excel file, and retrieve some related data.

Value mapping does the same: from the incoming message you pick one value, look it up in a table-like
structure, pick the related information, and push that to the receiver side. The incoming message
carries one value, but that value is translated for the receiving side – that is what value mapping does.

Process Call – you need a Process Call step in your main integration flow, which allows you to select and
execute the desired local integration process from the available local integration processes.

Local Integration process:

---------------------------------------------------------------------------------------------------------------------------------------

Add a Looping Process Call step to repeatedly execute the steps defined in a local integration process
until the condition is met or the maximum allowed iterations are reached, whichever comes first.
1. First, we will create a CPI link to be able to make calls to the service.

Figure 2. Sender HTTPS Adapter

Adapter Type: HTTPS


Address: Specific

Step 2. We specify that the looping process call will work according to the condition expression specified
in the "condition expression" field. By stating ".hasMoreRecords contains 'true', we indicate that the
loop will continue to run as long as there are multiple records. You can take a look at (hasMoreRecords).

When this condition returns false, the loop will end.


Figure 3. Loop Process Call

Step 3. OData information.

Step 4. We use the "select" clause to choose which fields we want to retrieve from the Orders entity.

Our method is GET.


We need to mark "Process in Pages". If we don't mark it, the system will send all the data at once after
entering the loop once.

Figure 5.Odata Adapter Processing Information

Step 5.After passing through the filter, the data will no longer include "Orders" but will start with
"Order." This is because we need the information of "Orders/Order" due to sending the data in
fragments. After completing the process of sending fragmented data, we will merge it in the "Message
Body" of the Content Modifier.

Figure 6.Filter

Step 6. ${property.payloadStack}${in.body}: we use this to keep appending each incoming data
fragment to the message body.
Figure 7. Content Modifier-Exchange Properties-Append Body

Step 7.We add the "Orders" tag, which we ignored with the filter, to this content modifier. Once the
loop is completely finished, we add the merged data as a property.

Figure 8.Content Modifier-Message Body

Step 8.To indicate that the last data will come in XML format, we add the "Content-Type" header.
Figure 9.Content Type

Step 9. We fill in the necessary information for the email adapter.

Figure 10.Mail Adapter Connection Information

Step 10.We determine who the email will come from and who it will go to.
Figure 11.Mail Adapter Processing Information

Step 11. Save and Deploy. Then once we have created a CPI link, we need to call it using the GET method
in Postman after the deployment.

Step 12. We are making a call to the CPI service using the CPI username and password.

Figure 12.Postman

Step 13. It entered the loop a total of 6 times, but on the 6th request, since there was no data left inside,
it combined the data sent in fragments, exited the loop, and continued to 'End'.
Figure 12.Monitoring

When we look at our first loop, it indicates that in the first request, it fetched the first 200 records from
the entire data set and provided the information that the next loop would start with OrderID 10448
using the expression "$skiptoken=10447".
In each loop, as it adds data, it indicates that there were 400 records in the 2nd request, and when it
enters the 3rd loop, it won't fetch the same initial 400 records again. Similarly, it shows that in the next
loop, the data will start with OrderID 10648.

The important point to note is that it continues to loop as long as the condition we set is met, meaning it
enters the loop as long as it evaluates to true.

When we check the final step, we understand that this condition returns false, indicating that it has
fetched all the data inside.
Due to the condition, since the Loop process has ended, we receive information that the last data has
the OrderID number 11047.

Finally, I wanted to add an email adapter. It notifies all the information via email.
 ${CamelFileName} -File Name
 ${CamelHttpUri} -HTTP URI (URL with QueryString)
 ${CamelDestinationOverrideUrl} -SOAP Adapter URL
 ${exception.message}- Error Message
 ${exception.stacktrace} -Error Stacktrace
 ${camelId} -Iflow Name
 ${SAP_ApplicationID} -ID created in Message Processing Log for searching in monitoring

Interview questions:

 Optimization techniques while developing iFlows

 What kind of issues will we get in production
 Why only OData for SF integration
 Name 5 SF OData APIs
 How do you analyze the requirements, and how do you develop an iFlow in detail

These are the questions asked to me in a TCS walk-in.

${file:Onlyname.noext}_${date:now:yyyy-MM-dd HH:mm:ss SSS}

Can you please help me understand in which situations we go for Error End vs. Escalation End? The only
difference that I can see is the message status, which is Failed / Escalated. Apart from this, what's the
purpose of both, and have you come across real situations needing to use them separately? Please share
your thoughts.

In SAP CPI, an *Error End* event is used when the integration flow encounters a critical failure that
cannot be recovered, such as a mapping error, connectivity issue, or missing mandatory data, and needs
to explicitly terminate while logging the error for monitoring and troubleshooting. For example, if an API
call fails due to invalid credentials, the flow can be terminated using an Error End to ensure the issue is
logged as a failure in monitoring. An *Escalation End* event, on the other hand, is used to signal a non-
critical error or exceptional scenario that may require alternative handling or routing within a larger
process context, such as notifying a stakeholder or triggering a compensatory process. For instance, if a
stock check API returns "out of stock," an Escalation End could notify the sales team to follow up, while
allowing other parts of the process to continue. Use *Error End* for unrecoverable issues and
*Escalation End* for managed deviations that do not halt the overall process.

In SAP CPI, *trace* mode captures detailed, end-to-end information about the integration flow,
including intermediate payloads, headers, and properties, to help analyze the flow step-by-step, but it
impacts performance significantly and is typically used for troubleshooting. In contrast, *debug* is more
focused on testing specific components like Groovy scripts, enabling developers to identify logic errors
or configuration issues within a particular step, with a lower performance impact and logs written in
scripts or the Message Processing Logs. Trace is for operational analysis, while debug is for development
and testing.


When splitting a message into multiple branches, changes made to headers or properties within one
branch are not reflected in the other branches.

Mail Adapter:
You can download the server certificates – the server certificate chain: root certificate, intermediate
certificate, and peer certificate.

Add those certificates under Monitoring -> Manage Security -> Keystore -> Add Certificate.

Click Confirm, and repeat the same for the remaining two certificates.
SFTP;

Set Dynamic Adapter Parameters (File Name, Directory) – SAP BTP IS-CI (CPI)
In SAP Cloud Integration Suite (BTP-IS/CI/CPI), configuring dynamic file names is not only possible but
can be done using a variety of techniques.

In the past, I wrote some articles on defining dynamic parameters such as filename, directory, etc in the
receiver file adapters in SAP PI/PO. There are several techniques like ASMA and Variable Substitution.
However, SAP Integration Suite CI (BTP-IS/CI/CPI) technique is more straightforward.

Common Use Cases of Dynamic File Name and Directory

 Adding a Timestamp to File Names


 Include a custom timestamp in the file name
(e.g., filename_yyyyMMddHHmmss.xml ).
 Creating Unique File Names with Message IDs
 How to append a unique message ID to avoid file overwriting
(e.g., filename_<messageId>.xml ).
 Adding Custom Parameters (e.g., Sender or Receiver Information)
 Dynamically including sender/receiver names in the file name
(e.g., file_<senderID>_to_<receiverID>.xml ).
 Adding Incoming Message Data Segments
 Dynamically including data elements from the incoming message like OrderID,
InvoiceID, etc in the file name (e.g., <OrderID>_yyyyMMddHHmmss.xml ).
 Determination of Target Location Based on Content
 At runtime determine the target directory the file should be saved based on
incoming message content, incoming filename pattern, etc. (e.g. Move files
starting with “Order_” or “Order” directory)
Scenario – Content-Based File Passthrough Interface
I will use the following scenario to demonstrate how the target directory can be determined during
runtime and dynamically assigned to the receiver adapter.

We define the filename with a unique time stamp and copy the file name prefix from the incoming file.

Imagine a scenario where you have files with different file name prefixes in a certain directory in the
SFTP server. I want to build an iFlow that can fetch and route these files to the target based on their file
name prefix.

For example, files starting with “Order” should be moved to “Orders” target folder on SFTP server.
Invoices to “Invoices” folder and all other files to “Other” folder.
In this scenario, we will make use of the following features of SAP Integration Suite interface
development techniques,

 Standard Header/Exchange Property Parameters


 Custom Header/Exchange Property Parameters Using Content Modifier
 Camel Simple Expressions
Step 1 – Configure the SFTP Sender Adapter
I am fetching all the files in the directory “In”. Here the Location ID is the location I have registered in
Cloud Connector. If you are interested in learning more you can check my complete Cloud Integration
with SAP Integration Suite online course.
Step 2 – Configure Content-Based Router
The filename of the incoming file will be available in the header parameter, “CamelFileNameOnly“. We
will route the files based on the prefix of the filename. Using a regex expression, we can find if the
filename matches the pattern we are looking for.
${header.CamelFileNameOnly} regex 'Order.*'

Regular expression to check if the file name starts with “Order”

${header.CamelFileNameOnly} regex 'Invoice.*'

Regular expression to check if the file name starts with “Invoice”

Step 3 – Make Use of Exchange Parameter or Header Parameter to Set the Directory
Let’s make use of content modifiers to determine the directory at runtime. We will have an exchange
property parameter named “directory” to set the value of the directory.

Step 4 – Configure the Receiver Adapter Using Dynamic Parameters


Make use of the Exchange Parameter to define the target directory in the receiver adapter
configuration.
We will make use of a couple of Camel Expressions to define the filename dynamically.

${file:onlyname.noext}

Camel Simple Expression to get the prefix or the filename of the incoming file without the extension.

${date:now:yyyy-MM-dd}

Camel Simple Expression to add the date as the 2nd part of the file name in the format yyyy-MM-dd.

${date:now:HH-mm-ss}

Camel Simple Expression to add the time as the 3rd part of the file name in the format HH-mm-ss. (Note
the lowercase mm and ss: in date patterns, MM means months and SS fractional seconds.)
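Combined, the File Name field of the receiver adapter can be set to a single expression built from these parts, for example:

```
${file:onlyname.noext}_${date:now:yyyy-MM-dd}_${date:now:HH-mm-ss}.xml
```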

Other Methods of Setting a Dynamic File Name in SAP Integration Suite CI (BTP-IS/CI/CPI)
In the example, we made use of a custom Exchange/Header Parameter, a standard header parameter
and a Camel Simple Expression to dynamically define the directory and filename at the receiver adapter.

However, other methods can set adapter parameters dynamically at runtime.


Using a Groovy Script or a UDF
Groovy scripting allows for complex logic when setting dynamic file names. This method is helpful when
you need to combine multiple variables or perform complex transformation logic to define the adapter
parameters.

import com.sap.gateway.ip.core.customdev.util.Message
import java.text.SimpleDateFormat

def Message processData(Message message) {
    // Get current timestamp
    def sdf = new SimpleDateFormat("yyyyMMddHHmmss")
    def timeStamp = sdf.format(new Date())

    // Get message ID
    def messageId = message.getHeader("CamelMessageId", String.class)

    // Set file name dynamically
    def fileName = "file_" + messageId + "_" + timeStamp + ".xml"

    // Set the file name as a header so the receiver file adapter picks it up
    message.setHeader("CamelFileName", fileName)
    return message
}

Here we define a file name in pattern: file_<messageID>_<time stamp>.xml

Using Content from the Incoming Message


You can set a dynamic file name by extracting content from the incoming message payload or headers,
such as customer ID, order number, or invoice number, and appending it to the file name.
