IDOC Now Printout

In a Content Modifier, the Body section can also reference properties and headers.

 IDoc – a combination of a Message type and an IDoc type/Basic type


 RFC
 Proxy – the structure is created in the ECC/S4 HANA system using proxy tools; the WSDL is
shared with the CPI team, and data is sent to CPI based on that WSDL
 SFTP File

The IDoc adapter is SOAP-based.

 RFC adapter – if we are not able to connect, we contact the Basis team
 XI adapter

If there are no messages in SM58, check SMQ1 as well.

A CPI tenant has two types of nodes.

In SAP Cloud Platform Integration (CPI), the main difference between a tenant management node and a
runtime node is that a developer can modify content on the tenant management node, but can only check
the status of messages on the runtime node.

When we log in to BTP we have multiple services. We focus on Integration Suite, which in turn consists
of 4 services:

 SAP CPI
 SAP API Management
 Open Connectors
 Integration Advisor

Open Connectors – if there is no adapter provided by SAP and you want to connect CPI to a third-party
system, use Open Connectors.
Integration Advisor – when EDI is involved, use Integration Advisor.

Steps for Free account:

 Register and Login


 Create Sub account
 Create Integration Suite Instance
 User Access Management
 Adding capabilities

Under Global Account we can have any number of sub accounts


The second URL is the tenant URL: it is where the iFlow is created; the first is where it is deployed.

 Basic Edition
 Standard Edition
 Premium Edition

Q: In real time, can we poll more than one file using Poll Enrich in a single run?
A: No – Poll Enrich will pull only one file per run, in real time too.

We can also use the runtime property SAP_PollEnrichMessageFound to check a file-exists condition in a
looping process call step.
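As a sketch, the looping process call's loop condition could test that property with a Camel Simple expression (the property name is taken from the note above; the exact condition syntax may vary by CPI version):

```
${property.SAP_PollEnrichMessageFound} = 'true'
```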
Q: Can you make a video on picking files from 10 different directories of a single SFTP server using the
SFTP adapter?

A: You have to use multiple Poll Enrich steps to pull data from multiple folders.

Q: How do we maintain the multiple directories – can we achieve this via value mapping?

Q: What is the difference between a local variable and a global variable?


A: A local variable can be accessed only within the same iFlow. A global variable can be accessed from
different iFlows.

Q: How do we read a local or global variable?


A: Use a Content Modifier to read it into either a header or a property.

Q: How do we write a variable?


A: In the iFlow, use the 'Write Variables' step; the value can come from a header, property, XPath, or
expression.

Q: At the first iFlow run the variable is not created yet, but we need an initial/default value for
processing. How do we handle this chicken-and-egg situation?
A: Use a Content Modifier to read the variable and set a default value.

Q: Besides creating, how do we update a variable?


A: Just use 'Write Variables'; if the variable already exists, it is updated/replaced.

Q: Can a local and a global variable have the same name?


A: Yes, since their scopes are different.

Q: How do we do delta synchronization via timestamp?


A: Use a variable to remember the last processed timestamp, so that the next scheduled run resumes from
that timestamp onward.

Q: What should we consider when designing delta synchronization via timestamp?


A: (1) Data should be sorted by timestamp.
(2) The timestamp should be unique (e.g., a date without a time might not work).
(3) The right date field should be used for delta synchronization.
(4) Only update the last processed timestamp in the last step, if all processing succeeded.
(5) The timer/scheduler interval.
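A minimal sketch of this pattern in Python, just for illustration (in CPI the watermark would live in a variable and the comparison in a filter or script):

```python
def delta_sync(records, last_ts):
    """Process records newer than last_ts and return (processed, new watermark).

    records: (iso_timestamp, payload) pairs. Following the rules above:
    sort by timestamp, and only advance the watermark once every record
    in the batch has been processed successfully.
    """
    processed = []
    new_ts = last_ts
    for ts, payload in sorted(records):      # rule (1): sorted by timestamp
        if ts > last_ts:                     # strictly newer than the watermark
            processed.append(payload)
            new_ts = max(new_ts, ts)
    return processed, new_ts
```

ISO-8601 timestamp strings compare correctly as plain strings, which is why no date parsing is needed here.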

Q: What if I need to revert to an earlier timestamp?


A: Build into the same iFlow a manual-run/ad-hoc-run flag that sets a manual timestamp, overriding the
value in the variable.

Q: Should I use a global or a local variable?


A: Use global if another iFlow needs to access the same variable. A global variable can behave like a
local one, but not the other way round.

Q: Can I use the SAP CPI 'Manage Variables' page to write a variable?


A: No. As of now there is no way to create/update a variable manually – only via an iFlow.

Q: What other way is there to manage reads/writes of a global variable?


A: Build a generic iFlow and use Postman to read/write the global variable.

Q: How can a variable be deleted?


A: Manually, via the 'Manage Variables' page.

Q: What other potential use does a variable have?


A: Accessing the same value in different branches of a Multicast (a property will not work there).

Tip: after a 'Write Variables' step, use a Content Modifier to read the variable back.

Data Store :

Q: How do we write to a data store?


A: Use the DS Write step.

Q: In DS Write, is it mandatory to specify an Entry ID?


A: No.

Q: What happens if we write to the DS with the same entry ID twice?


A: By default it fails with an error. If 'Overwrite Existing Message' is selected, the entry is
replaced/updated.

Q: Is only the message body written to the DS?


A: The body is always written. If 'Include Message Headers' is selected, the headers are written to the
DS as well.

Q: What payload formats can be written to the DS?


A: There is no restriction – XML, JSON, or plain text are all fine.

Q: What are the ways to read DS entries in an iFlow?


A: Use DS Get, DS Select, or the Data Store sender adapter.

Q: For DS Get, what happens if no entry ID is specified?


A: It fails – the entry ID is mandatory for DS Get.

Q: What are the main differences between DS Get and DS Select?


A: DS Get fetches a single entry; DS Select fetches multiple.
DS Get requires an entry ID; DS Select has no option to enter one.

DS Get supports different data formats; DS Select only supports XML.


Q: After a Get or Select from the DS, how can the entry be deleted?
A: Use 'Delete On Completion' or a DS Delete by entry ID.

Q: When writing a list of records to the DS, if some records fail, can the DS operation be partially
successful and partially failed?
A: Yes. In new iFlows, 'Transaction Handling' defaults to none.

Q: How do we select multiple entries from the DS, process each entry independently one by one, keep the
successful ones, skip the failed ones, and write to different data stores on success or failure?
A: Use a combination of DS Select, Splitter, Router, an exception subprocess, the 'Transaction Handling'
setting, and 1 source DS + 2 target DSs. This is shown in the course lesson.

Q: What data formats does the DS sender adapter support?


A: XML, non-XML – any format is fine.

Q: What is special about the DS sender adapter, compared to DS Get & Select?


A: The DS sender adapter has an automatic retry feature.

Q: Why is the DS sender retry considered a 'smart' retry?


A: It has an 'Exponential Backoff' retry option: each retry doubles the wait time.
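As an illustration of exponential backoff (a generic sketch, not CPI's internal implementation), the wait times form a doubling sequence:

```python
def backoff_schedule(base_seconds, max_retries):
    """Wait time before each retry attempt: doubles on every retry."""
    return [base_seconds * (2 ** i) for i in range(max_retries)]

# e.g. a 1-second base and 4 retries waits 1, 2, 4, then 8 seconds
```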

Content Modifier use cases:

Read values into header or property from global variable

Read values into header or property from local variable

Setting the headers below, which are useful for monitoring:

 SAP_ApplicationID
 SAP_CorrelateMPLs
 SAP_MessageType
 SAP_Sender
 SAP_Receiver
 SAP_MessageProcessingLogID

For the JSON-to-XML converter, the JSON cannot start with an array, so a quick and easy fix is to use a
Content Modifier to wrap the JSON in an object.

For appending XML bodies – for example in a looping process call – keep appending to
${property.appended_body} so that all the XML <row> elements are combined together.
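A sketch of both Content Modifier expressions (the property name appended_body comes from the note above; ${in.body} is the standard Camel Simple reference to the current payload):

```
Wrap a JSON array (Content Modifier > Message Body):
  { "root": ${in.body} }

Append the current body to a property in each loop iteration
(Content Modifier > Exchange Property "appended_body", Type: Expression):
  ${property.appended_body}${in.body}
```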

Save the payload for later use – use a property to temporarily store the payload.

Extracting a value from the body and setting a header or property from XML, with Source Type = XPath.

 SAP to 3rd party, or CPI outbound – outbound IDoc


 3rd party/CPI to SAP – inbound IDoc

Outbound Configuration

Step 1 : Define logical system (LS)

 Go to T-Code : BD54
 Assign this logical system for client.

Go to T-Code : SALE => Logical Systems => Assign Logical System to Client

Step 2 : Define Http connection to External Server (Type G)

Step 3 : Define Port

In PI, we create tRFC port. In CPI we create XML HTTP Port

 Go to T-Code : WE21

 Click node : XML HTTP

 Click F7 / Create

Step 4 : Add TRUST of CPI into SAP ERP

Step 5 : Test connection from SAP ERP to SAP CPI ( Check connection HTTP to External Server)

Go to T-Code : SM59

Choose connection type G : HTTP connection to external server

Click Test Connection

Inbound Configuration:

Step 1 : Activate service

 Go to T-Code : SICF

 Search service with path : /sap/bc/srt/idoc


 F8 to execute.

Step 2 : Register service

Go to T-Code : SRTIDOC

Choose Register Service. Press F8 to run with default

If the service is registered successfully, a confirmation message is shown.


Step 3 : Run Test Service to get Endpoint.

Go to T-Code : SICF

Search service /sap/bc/srt/idoc

Right click and choose Test Service

Note the URL that comes in your browser. This is the URL where your IDoc adapter in HCI needs to point.
The URL will be of format :

http://host:port/sap/bc/srt/idoc?sap-client=<clientnumber>

Step 4 : Create Logical System

 Go to T-Code : SALE

 Create Logical System

 Assign this logical system to client.

Step 5 : Create partner profile and add Inbound parameter

Go to T-Code : WE20

Choose partner type = LS from left side tree view

Choose create and input information in step 4 – local system as Partner No.

Step 6 : Test service from POSTMAN

Case 1 : No body or wrong data XML.

Go to T-Code : SOAMANAGER
SAP:

 IDOC
 RFC
 Proxy – we use either the SOAP adapter or the XI adapter

In CPI:

Consumer Proxy: also called a receiver proxy (an ABAP proxy on the receiver side).

Provider Proxy: also called a sender proxy (an ABAP proxy on the sender side).

If the on-premise SAP application (IDoc/RFC/Proxy) is on the sender side, we do not need to use Cloud
Connector.

SOAP adapter:

We generate a WSDL and give it to the ABAP team.

The endpoint is given to the Basis team so that they can create a logical port in SOAMANAGER.

XI adapter:
RFC destination of type G.

SXMB_ADM (transaction code): the IS_URL destination pointing to CPI must be maintained here.

IDoc – to transfer data asynchronously

SAP applications: ECC, S/4, CRM

PI – ALE support:

 SM59 -> RFC destination
 Port
 Logical system
 Partner profile

CPI:

 STRUST (T-Code) – import the SSL certificate


 RFC -> HTTP destination (HTTP destinations are of two types, G and H; we use a type G
connection)
 Port
 Logical system
 Partner profile

We need to import certificates at both ends, i.e., on the ECC and CPI sides.

 We download the CPI certificates from the keystore (Manage Security) and provide them to the
Basis team, who upload them in STRUST.
 The SAP certificate is downloaded from STRUST and uploaded to CPI under Manage Security.
 Cloud Connector:
 If the SAP on-premise application (RFC/IDoc/Proxy) is on the sender side, we don't need Cloud
Connector.

 CPI:
 We create two iFlows.
 Common iFlow – on the SAP side we use the same RFC port and RFC destination for every IDoc.
As soon as the message reaches CPI, a Value Mapping determines the receiver: if the IDoc is
MATMAS it goes to one receiver, if it is ORDERS it goes to another. In the Value Mapping we
put conditions and identify the receiver based on source agency, target agency, IDoc type, and
Message type – this is how we route messages.
 IDoc Sender -> CPI -> Value Mapping -> Process Direct
 Process Direct Sender -> CPI -> SFTP/REST (actual receiver)

 On the SAP side this avoids multiple RFC destinations.
 On the IDoc sender side, the control record has a document number field – the IDoc number. It
is created in the SAP application and we get it from there.
 On the IDoc receiver side, the IDoc number is generated in the SAP application and comes back
in the IDoc response header property (SAP generates it) –
 i.e., the Application Message ID.

 Process Direct – used to call one iFlow from another. It is an internal adapter.
 When to use which adapter:

 1) You can use relevant adapter based on the Target Systems call( they have IDoc or they have
proxy or they have web service exposed from SOAMANAGER).
 2) If standard IDocs are available in SAP S4 or ECC either inbound or outbound for the particular
interface you will be designing your IFlow having IDoc adapter.
 3) If there are no standard IDocs available then they might take Custom Proxy approach then
you should go with XI adapter.
 4) If there are any services exposed as web services from SOAMANAGER[ ex Successfactors
Employee and ORG Data replication ] then you will go with SOAP Adapter.
The BD54 transaction code and SALE are one and the same – both navigate to the same screen.

ABAP language is used

Ways to connect SAP s4 Hana to CPI

 IDOC
 RFC
 Proxy
 SFTP File

Intermediate Document

Idoc Number:

This, however, means that for each IDoc type that flows from S4H to CPI, a separate type G connection
will have to be created [/cxf/matmas or /cxf/debmas, ...] in SM59.

One could possibly mitigate this by having a generic routing iFlow [/cxf/idoc and thereby having only 1
SM59 connection for IDOCs to this CPI iFlow] and then making process direct calls to individual iFlows
processing different IDOC types, depending on field values in the control segment.

It would be ideal though, if SAP could build something on S4H, which would allow specifying a generic
path prefix [/cxf/idoc/<idoctype>] in SM59 connection [so that there is one connection, like in tRFC port
for IDOCs between ECC and PO ] and dynamically route the calls [?] to the iFlow endpoints [ /matmas or
/debmas... maintained in Connection of respective iFlows] from S4H.
Control record – IDoc type, message type, IDoc number, sender port, receiver port, receiver partner,
sender partner, ...

Data Records

-
Direction 1 = outbound

Direction 2 = inbound

In Part1 I described the necessary settings to RZ11, STRUST and SM59 type G RFC Connection, IDOC Port
in WE21 and Partner Profile in WE20

1.
System Connection between S4H and CPI
2. Background Configurations required
3. Configure Idoc Sender Adapter
4. Search messages in message monitoring based on Idoc Number
To be able to send messages from SAP S4HANA to SAP CPI, import the CPI certificates into S4HANA STRUST.
You can download the CPI certificates either from the keystore or from any browser.

1. Login to S4hana system --> Tcode STRUST (Import all 3 CPI certificates) --> Add to Certificate List

2. Create Logical System for S4H and CPI system in Tcode BD54
3. Create RFC Destination of Type G -->

Host = CPI Tenant,


Port = 443, Path Prefix = from Sender Adapter (CPI - IFlow); Logon & Security

4. RFC Connection test returns HTTP 500 response


5. Create Port WE21 --> Port Type --> XML HTTP

6. Partner Profile WE20 -->

1. Import Outbound Message type with port created in earlier step


7. In the Custom IFlow, give the same path prefix as given in RFC Destination

IDoc Adapter

Configuration -->

8. In order to be able to search for my iFlow in monitoring based on the IDoc number, I have
configured a Content Modifier with header values.

I am using the headers SapIDocDbId, SapIDocTransferId, and SapIDocType in my mail body, and
created SAP_ApplicationID to pass the IDoc number from the payload, which helps me monitor
messages based on the IDoc number. The value for SAP_ApplicationID is the XPath of my IDoc
DOCNUM.
9. Configure Mail adapter and deploy IFlow

10. Trigger IDoc in S4Hana, WE02 -->

1. Status = 03 --> Idoc Triggered to CPI


2. Status = 02 --> Check Iflow status in CPI

11. CPI --> Message Monitoring

12. Mail Received Successfully


as shown below.

End to End IDOC CPI IFLOW


The main methods of generating outbound IDocs in SAP are:

1. Using the ALE Change Pointer mechanism


2. Via Output Determination
3. Using a custom ABAP program
Typical data: purchase orders, invoices, shipment notifications, products, vendors.
Using the IDoc number we can check the status.

The ABAP team sits with the functional team to prepare the IDocs; we then use these IDoc structures.

How does Gather know whether all records have been received?

By default, before a Gather we use a Splitter, and the Splitter sets a Camel property,
CamelSplitComplete, to true on the last split message. When Gather sees that value, it knows there are
no more records.
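As a sketch, a router condition after the Splitter could test this exchange property with a Camel Simple expression (property name as described above; the exact condition syntax may vary by CPI version):

```
${property.CamelSplitComplete} = 'true'
```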
S/4HANA is available both on-premise and in the cloud.

How many tenants we can create depends on the BTP subscription.

In real time, Dev and QA are in one tenant.

Production is another tenant.

In real time we are not the owners who create tenants – the BTP team provides the tenant to us, and we
do not even have access to BTP itself.

Q: What is the difference between a data store and a variable?

In a data store we can store an entire message/payload, whereas a variable stores a single value such as
a timestamp or date. A data store retains entries for the number of days we specify. The data store
consumes tenant storage, which CPI charges for if you store data long-term; for a variable you don't
need to pay – it is just a small place to keep a value for when it is needed. The Aggregator is a
different concept: it holds data only at runtime, whereas a data store holds it for as long as you
configure.

How will aggregator know that last message is reached?


Scope:

A property lives within a single iFlow run. A variable declared at iFlow level can be used only in that
iFlow, but a variable declared at global level can be used wherever you want.

The data store's maximum retention period is 180 days; by default it is 30 days.
Suppose I have two integration flows (IF1, IF2), where IF2 is connected to IF1 via Process Direct.

Properties set in IF1 cannot be accessed in IF2, because the scope of a property is the iFlow (IF1);
headers created in IF1, however, can be accessed in IF2 because headers have a wider scope.

If you want to use such a property across iFlows, copy it into a header in the same iFlow.

Fixed vs value mapping:


Data store use case:

We have payroll data with both active and inactive employees; we store it in the data store using a
Write step.

In the next run we fetch that data from the data store using Get or Select; at iFlow level we compare
the active and terminated employees' data (using a script, or whatever you prefer), delete the
terminated employees' data, and store the active employees' data back in the data store under the entry
name "payroll records".
You can get more details on an exception using ${exception.message} or ${exception.stacktrace}.

Say that within one integration flow, the main process calls a subprocess using a local process call.

If an exception is raised in the subprocess, does the processing of the main integration process end?

IDoc scenarios:

Common iFlow: IDoc sender -> CPI -> Value Mapping -> Process Direct

Process Direct sender -> CPI -> actual receiver

In the Value Mapping we define conditions: if it is MATMAS it should go to this receiver; if it is
ORDERS it should go to that receiver.

In CPI each interface has a different endpoint.


On the SAP side, instead of multiple RFC destinations we can have one – this reduces the burden on the
SAP side.

Extract an iDoc structure from SAP ERP to utilize in SAP CPI

As we are currently replacing our SAP Business Connector with SAP CPI, we faced the question of how to
get the IDoc structure out of SAP ERP to import into SAP CPI's message mapping and avoid creating it
manually. After our research we found a way that really helped us, so I want to share the details with
you in this blog article.
In the end, it is a very easy procedure. For this example we use the SYSTAT01 IDoc, which we need for
this article. Nevertheless, it works with other IDocs and also IDoc enhancements.

1. After you have logged in into your SAP CRM/ERP, you access transaction WE60 and enter the
iDoc name you want to download the XSD for.
2. Before submitting or something else you open in the menu bar the Documentation menu and
select XML Schema.

3. You confirm that you want to export it as Unicode.

4. Now you see the XSD content. To download that, right click the screen and select View Source.
An editor window opens and when you save that file as *.xsd, you have the XSD-structure of the
iDoc.

5. Upload the downloaded XSD-file to your SAP CPI message mapping component as source
message.
And it's done. Now you can develop your CPI message mapping like any other message mapping and complete
your development.

Just one question: Is there a special reason why you didn't use the report SRT_IDOC_WSDL_NS to
download the wsdl of the IDOC structure?

IDoc segments with maxOccurs up to 99999 (length 5) are supported in CPI:

minOccurs="0" maxOccurs="99999">

So please replace anything longer than 5 digits with "unbounded". For example, if the XSD contains

minOccurs="0" maxOccurs="999999">

replace it with

minOccurs="0" maxOccurs="unbounded">

I need to export the XSD file for an IDOC structure that has a custom extension. Since the WE60
transaction has a single select option for the Basic type or the Extension, how do you get an XSD for the
complete IDOC structure?

You just need to set the extension in WE60. It gets you the full IDoc XSD, including the parent nodes.

Import Certificate into STRUST


You need to import all certificates to both the SSL Client (Anonymous) as well as the SSL Client
(Standard) in the transaction STRUST in your ABAP system.

1. Log on to your ABAP system


2. Go to transaction STRUST
3. Switch to change mode (Ctrl+F1)
4. Double-click on "SSL Client (Anonymous)" in the navigation tree on the left side
5. Click menu "Certificate" > "Import"
1. Use the file path selector to select your DigiCert Root CA certificate downloaded before
2. Click the button with the green check-mark
6. Click menu "Edit" > "Add certificate"
7. Double-click on "SSL Client (Standard)" in the navigation tree on the left side
8. Make sure the certificate information are still visible in the "Certificate" area in the lower right
panel
9. Click menu "Edit" > "Add certificate"
10. Repeat steps 4 to 9 for all remaining certificates
11. Click the "Save" button

-------------------------------------------------------------------------------------------------------------------------------

 createContext
 removeContexts
 collapseContexts
 splitByValue
 useOneAsMany
 mapWithDefault
 formatByExample
 sort
 sortByKey
 createIf
 getProperty
 getHeader

What is pagination in SAP CPI/CI?

Pagination in SAP CPI/CI: Pagination in SAP CPI/CI refers to the process of dividing a large dataset into
smaller, more manageable chunks, called pages, to improve performance and reduce the amount of
data transferred between systems.

Why is pagination needed in SAP CPI/CI?

Pagination is necessary in SAP CPI/CI because:

 Large datasets can cause performance issues and slow down the integration process.
 Transferring large amounts of data can lead to increased latency and network congestion.
 Some APIs and systems have limitations on the amount of data that can be retrieved in a single
request.

How does pagination work in SAP CPI/CI?

In SAP CPI/CI, pagination works by:

Dividing the dataset: the dataset is divided into smaller pages, each containing a fixed number of
records.

Retrieving a page: the integration flow retrieves one page of data at a time, using a pagination token
or cursor to keep track of the current position in the dataset.

Processing the page: the integration flow processes the retrieved page of data and then requests the
next page, using the pagination token or cursor to retrieve the next set of records.

Types of pagination in SAP CPI/CI:

There are two main types of pagination in SAP CPI/CI:

Client-side pagination: The client (i.e., the integration flow) handles pagination by specifying the page
size and page number in the request.

Server-side pagination: The server (i.e., the API or system being integrated) handles pagination by
returning a fixed number of records per page and providing a pagination token or cursor to retrieve the
next page.

Common pagination parameters in SAP CPI/CI:


Some common pagination parameters used in SAP CPI/CI include:

$top: Specifies the number of records to return per page.

$skip: Specifies the number of records to skip before returning the next page.

$paging: Specifies the pagination method (e.g., cursor or snapshot).

$snapshotSize: Specifies the size of the snapshot (i.e., the number of records to return per page).
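A minimal client-side pagination sketch in Python for illustration (`fetch_page` is a stand-in for whatever OData/REST call the iFlow makes; its `top`/`skip` arguments correspond to the `$top`/`$skip` parameters above):

```python
def paginate(fetch_page, page_size):
    """Collect all records by requesting fixed-size pages.

    fetch_page(top, skip) returns at most `top` records starting at offset
    `skip` -- the client-side equivalent of OData's $top and $skip.
    A page shorter than page_size signals that the dataset is exhausted.
    """
    skip, records = 0, []
    while True:
        page = fetch_page(top=page_size, skip=skip)
        records.extend(page)
        if len(page) < page_size:   # a short page means no more data
            break
        skip += page_size
    return records
```

In a real iFlow the same loop would typically be a looping process call whose condition checks whether the last page was full.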

Benefits of pagination in SAP CPI/CI:

Pagination in SAP CPI/CI provides several benefits:

 Improved performance: by reducing the amount of data transferred and processed, pagination can
improve the performance of the integration flow.
 Reduced latency: pagination can reduce the latency associated with transferring large amounts of
data.
 Increased reliability: pagination can make the integration flow more reliable by reducing the risk
of errors caused by large datasets.

Content Transport Using CTS+

SAP Cloud Integration provides an option to transport integration content directly to CTS+ system. You
can then transport this content from the CTS+ system to your target SAP Cloud Integration tenant.
Here's how you can transport content to CTS+.

1. Select the integration package that you want to transport.

2. Choose Transport.

If you don't see the Transport button, contact your tenant administrator to enable transport
option in the tenant settings.

In the Transport Comments prompt, you can see the type of transport under the Mode field configured
by the tenant administrator. Provide comments under the Comments section and choose Transport.

You see a prompt with the Transport ID. The integration package is transported to the CTS+ system.

Import the content in the target system

1. Transport using Transport mode CTS+Direct


CTS+Direct would push MTAR content to an open Transport request in the configured CTS+ system
maintained as a Destination.

 In the Test/Source tenant, select the integration package that you want to transport.
 Click on Transport.

 In the Transport Comments prompt, provide comments and click on Transport.

 Transport id will be displayed as shown above.


 Navigate to target system and select the transport ID request and click on Import.

 Confirm Import with Date: Immediate and Import Options as: Leave Transport Requests in
Queue for Later Import.
 The Integration package is imported into the Production tenant Account.

Note: In case of errors, you can check the logs by selecting the Import request and clicking on Logs
button.

--------------------------------------------------------------------------------------------------------------------------

Content Transport using Manual Export and Import

Context
One of the options to transport content from one tenant to another is to use
the Export and Import options for your integration package. The application imports the integration
package to your local file system in the form of a .zip file. You can import the same file in the target
tenant using the Import option.

Procedure

1. Select the integration package that you want to export.

2. Choose Export.

A .zip file is downloaded to the default browser download location on your local file system.

3. Log in to the SAP Cloud Integration web application to which you want to import the content.
Choose .
4. Choose Import.

A new window opens in your file system explorer, allowing you to access your local file system.

5. Navigate to the folder path where the .zip file was downloaded in step 3.

6. Select the file and choose Open.

You see a prompt indicating the successful import of the .zip file. You can see the imported
integration package in the Design tab of your target tenant.

How to set up an SAP BTP Integration Suite Trial Account ready to use & develop iFlows?

SAP BTP Cockpit > Services > Service Marketplace > Integration
Testing Message Mapping
The mapping editor provides two ways of testing message mapping:

1. Simulate – for testing the entire mapping XML.

2. Display Queue – for testing a specific node of the XML.

1. Simulate: The Simulate option enables you to test the entire mapping structure. The system shows
whether the mapping contains any errors, giving you a chance to fix them before deploying the
integration flow. Once you complete the mapping, choose Simulate to run a simulation of the mapping.

Display Queue: The Display Queue option enables you to test the mapping of a specific node. In
the Mapping Expression area, provide a Test Input File and choose Display Queue to display the
simulated output for the provided test input file.

Even if the integration flow isn't in edit mode, you can run the Simulate and Display Queue tests. You
can therefore perform the tests for configure-only content as well.

You can refer to input XML file uploaded for simulation, for display queue also, and vice versa.

When you revise a mapping schema, be it the source or target message, existing mapping definitions
continue to remain as is. That is, if the revised schema doesn't include a previously used node, the
graphical editor doesn't automatically delete such unused nodes.

What is “Context” in SAP Integration Suite (CPI) and PI/PO Mapping?


Before we look at the differences between Collapse Context and Remove Context, let’s look at what a
‘Context’ in the Graphical Mapping feature of SAP Integration Platforms means.
XML messages in SAP PI/PO/CPI are handled as Queues. In a queue, Context is the position or the level
of an XML node (element) relative to a parent node.

If elements belong to the same parent node, the elements are in the same Context. When the parent
node changes, a Context Change is inserted into the queue. Context Changes are shown in dark grey.

Therefore, when we talk about the Context of a message it is always based on a certain parent node. An
XML element can have different Context queues depending on the parent node it is related to.
Let’s use the Message Type below to understand Context Changes and Node Functions.

XML can have multiple exam grades of multiple students.

 Element ‘Grade’ context changes based on immediate parent node ‘Grades’.


Grade values of each exam are assigned to separate Contexts.

 Element ‘Grade’ Context Changes based on higher level parent node ‘StudentDetail’.

Now all grades of each student are in one Context.

Collapse Context vs Remove Context.


Although the two node functions behave differently, both Collapse Context and Remove Context delete
the Context Changes from the input queue.
1. How Context Values Are Copied in Collapse and Remove Context.
The major difference between these two functions is how they copy input Context values to the output.
While Collapse Context copies only the first value of each Context from the input, Remove
Context copies all the values from the input to the output.
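Modeling a queue as a list of contexts (each context a list of values), the difference can be sketched like this (an illustration, not actual PI/CPI code):

```python
def remove_context(queue):
    """Remove Context: drop context changes and keep every value."""
    return [value for context in queue for value in context]

def collapse_context(queue):
    """Collapse Context: drop context changes, keep only each context's first value."""
    return [context[0] for context in queue if context]
```

For the grades example above, a queue like [["A", "B"], ["C"]] becomes ["A", "B", "C"] after Remove Context but ["A", "C"] after Collapse Context.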
2. How SUPPRESS Values Are Interpreted by Collapse Context and Remove Context When “Keep
SUPPRESS Values” Is Disabled.
Let’s look at how each function acts when “keep SUPPRESS values” property of IF-THEN function is
disabled.

Another difference is how these node functions handle SUPPRESS values in queues. While Remove
Context ignores the SUPPRESS values and does not copy them to output, Collapse Context copies them
as null [] values to output.
Collapse Context copies SUPPRESS Values as [].

Remove Context ignores SUPPRESS Values.


3. How SUPPRESS Values Are Interpreted by Collapse Context and Remove Context When “Keep
SUPPRESS Values” Is Activated.

In this situation both functions handle SUPPRESS values the same way. SUPPRESS values are considered
similar to any other value in the same context.
Both functions remove the Context Changes. The collapse Context function copies the first value from
each Context, and Remove Context copies all values of each context to output.
You can only remove these SUPPRESSED values using a UDF.

Although in this example I use SAP PI/PO, the functions behave the same way in SAP Integration Suite.
The concept remains the same in SAP CPI.

If you have any questions about Collapse Context and Remove Context node functions, please, leave a
comment below. I will be happy to help.



---------------------------------------------------------------------------------------------------------------------------------

Set Dynamic Adapter Parameters (File Name, Directory) – SAP BTP IS-CI (CPI)

In SAP Cloud Integration Suite (BTP-IS/CI/CPI), configuring dynamic file names is not only possible but
can be done using a variety of techniques.

In the past, I wrote some articles on defining dynamic parameters such as the filename and directory in
the receiver file adapters in SAP PI/PO, using techniques like ASMA and Variable Substitution.
However, the technique in SAP Integration Suite CI (BTP-IS/CI/CPI) is more straightforward.

Common Use Cases of Dynamic File Name and Directory

 Adding a Timestamp to File Names


 Include a custom timestamp in the file name
(e.g., filename_yyyyMMddHHmmss.xml).
 Creating Unique File Names with Message IDs
 How to append a unique message ID to avoid file overwriting
(e.g., filename_<messageId>.xml).
 Adding Custom Parameters (e.g., Sender or Receiver Information)
 Dynamically including sender/receiver names in the file name
(e.g., file_<senderID>_to_<receiverID>.xml).
 Adding Incoming Message Data Segments
 Dynamically including data elements from the incoming message like OrderID,
InvoiceID, etc in the file name (e.g., <OrderID>_yyyyMMddHHmmss.xml).
 Determination of Target Location Based on Content
 At runtime, determine the target directory where the file should be saved, based on
incoming message content, incoming filename pattern, etc. (e.g., move files
starting with “Order_” to the “Orders” directory)
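As a quick illustration of the first two use cases, the resulting file names can be composed like this in standalone Groovy (the message ID value is a placeholder, not a real CPI message ID):

```groovy
import java.text.SimpleDateFormat

def ts = new SimpleDateFormat('yyyyMMddHHmmss').format(new Date())
def messageId = 'AGI7Xs4vl3zVENgi'                    // placeholder message ID

def timestampName = 'filename_' + ts + '.xml'         // filename_yyyyMMddHHmmss.xml
def messageIdName = 'filename_' + messageId + '.xml'  // filename_<messageId>.xml

assert timestampName ==~ /filename_\d{14}\.xml/
assert messageIdName == 'filename_AGI7Xs4vl3zVENgi.xml'
```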
Scenario – Content-Based File Passthrough Interface
I will use the following scenario to demonstrate how the target directory can be determined during
runtime and dynamically assigned to the receiver adapter.
We define the filename with a unique timestamp and copy the filename prefix from the incoming file.

Imagine a scenario where you have files with different file name prefixes in a certain directory in the
SFTP server. I want to build an iFlow that can fetch and route these files to the target based on their file
name prefix.

For example, files starting with “Order” should be moved to “Orders” target folder on SFTP server.
Invoices to “Invoices” folder and all other files to “Other” folder.

In this scenario, we will make use of the following features of SAP Integration Suite interface
development techniques,

 Standard Header/Exchange Property Parameters


 Custom Header/Exchange Property Parameters Using Content Modifier
 Camel Simple Expressions
Step 1 – Configure the SFTP Sender Adapter
I am fetching all the files in the directory “In”. Here the Location ID is the location I have registered in
Cloud Connector. If you are interested in learning more you can check my complete Cloud Integration
with SAP Integration Suite online course.
Step 2 – Configure Content-Based Router
The filename of the incoming file will be available in the header parameter, “CamelFileNameOnly“. We
will route the files based on the prefix of the filename. Using a regex expression, we can find if the
filename matches the pattern we are looking for.

Regular expression to check if the file name starts with “Order”

${header.CamelFileNameOnly} regex 'Order.*'

Regular expression to check if the file name starts with “Invoice”

${header.CamelFileNameOnly} regex 'Invoice.*'
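The routing logic above can be reproduced in a few lines of standalone Groovy, which is handy for testing the regex patterns outside CPI (a simulation of the router, not CPI code):

```groovy
// Mimics the content-based router: pick the target folder from the file name prefix.
def targetFolder = { String fileName ->
    if (fileName ==~ /Order.*/)   return 'Orders'
    if (fileName ==~ /Invoice.*/) return 'Invoices'
    return 'Other'
}

assert targetFolder('Order_1001.xml')   == 'Orders'
assert targetFolder('Invoice_2002.xml') == 'Invoices'
assert targetFolder('Report_Q1.csv')    == 'Other'
```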

Step 3 – Make Use of Exchange Parameter or Header Parameter to Set the Directory
Let’s make use of content modifiers to determine the directory at runtime. We will have an exchange
property parameter named “directory” to set the value of the directory.
Step 4 – Configure the Receiver Adapter Using Dynamic Parameters

Make use of the Exchange Parameter to define the target directory in the receiver adapter
configuration.
We will make use of a couple of Camel Expressions to define the filename dynamically.

Camel Simple Expression to get the prefix, i.e., the filename of the incoming file without the extension.

${file:onlyname.noext}

Camel Simple Expression to add the date as the 2nd part of the file name in the format yyyy-MM-dd.

${date:now:yyyy-MM-dd}

Camel Simple Expression to add the time as the 3rd part of the file name in the format HH-mm-ss.

${date:now:HH-mm-ss}
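Putting the three expressions together, the File Name field of the receiver adapter could hold something like the following (the underscore separators and the combined pattern are my own illustration):

```
${file:onlyname.noext}_${date:now:yyyy-MM-dd}_${date:now:HH-mm-ss}.xml
```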

Other Methods of Setting a Dynamic File Name in SAP Integration Suite CI (BTP-IS/CI/CPI)

In the example, we made use of a custom Exchange/Header Parameter, a standard header parameter
and a Camel Simple Expression to dynamically define the directory and filename at the receiver adapter.

However, other methods can set adapter parameters dynamically at runtime.

Using a Groovy Script or a UDF


Groovy scripting allows for complex logic when setting dynamic file names. This method is helpful when
you need to combine multiple variables or perform complex transformation logic to define the adapter
parameters.

import com.sap.gateway.ip.core.customdev.util.Message
import java.text.SimpleDateFormat

def Message processData(Message message) {
    // Get current timestamp
    def sdf = new SimpleDateFormat("yyyyMMddHHmmss")
    def timeStamp = sdf.format(new Date())
    // Get message ID
    def messageId = message.getHeader("CamelMessageId", String.class)
    // Set file name dynamically
    def fileName = "file_" + messageId + "_" + timeStamp + ".xml"
    // Set the file name as a header
    message.setHeader("CamelFileName", fileName)
    return message
}

Here we define a file name in the pattern: file_<messageID>_<time stamp>.xml


Using Content from the Incoming Message
You can set a dynamic file name by extracting content from the incoming message payload or headers,
such as customer ID, order number, or invoice number, and appending it to the file name.

Use an XPath expression or another method to extract the values you need from the incoming
message payload and assign them to a header or exchange property.

To summarize, we can make use of standard and custom Header/Exchange Property parameters to
set different receiver adapter parameters. In addition to parameters, you can also use Camel Simple
Expressions to assign custom values at runtime.

Introduction to Node Function UseOneAsMany.


UseOneAsMany is a node function included in SAP PI/PO and SAP Integration Suite (CPI) Graphical
Mapping. UseOneAsMany allows us to repeat a value from the source message to multiple segments of
the target message. Understanding how this node function operates allows us to avoid creating complex
User-Defined Functions (UDFs). For example, if you need to repeat a value of a header segment into
multiple repeating target elements, the node function UseOneAsMany will be your best option.

The UseOneAsMany function works the same way in both the new SAP Integration Suite (CPI) and the
older PI/PO.

The key to learning UseOneAsMany node function is to understand the Input variables and the rules of
using them.

SAP Versions used in the illustration:

 SAP PO 7.5
 SAP Integration Suite on BTP (CPI)
Input and Output of UseOneAsMany Node Function.
UseOneAsMany has 3 inputs or arguments. The output of the UseOneAsMany function is derived from
these three inputs.

 Input 1: Value from source message which should be repeated at target.


 Input 2: Number of times the first argument ‘Input 1’ should be repeated at the target.
 Input 3: Context Change of the target or output.
Rules of Using the Inputs of UseOneAsMany.
The three Input variables mentioned above should be used in a certain way for the node function to
work properly. Here are the rules you need to follow when assigning these three variables from Source
message.

 Rule 1: Input 1 should not contain repeating values in the same context.
 Rule 2: The total number of context changes in Input 1 and Input 2 should be equal.
 Rule 3: The number of values in Input 2 should be equal to Input 3.
Rule 1: Input 1 cannot have repeating values.
Each context of Input 1 should have only 1 value.

Input 1 is the value from the source message that we are trying to repeat in the target, therefore, each
context of Input 1 should have only 1 value.

Here are some examples of how the first argument of the UseOneAsMany function should be
assigned.

Input 1 correctly assigned with no repeating values in a context.


Input 1 with a single value correctly assigned with no repeating values in a context.
In both cases above each context has only 1 value. Values are not repeated.

Incorrect assignment of argument. Multiple values in the same context.

In this context, values are repeated. There are two values in the same context. This is an invalid value
assignment for argument 1 of this node function.
Rule 2: Input 1 and Input 2 Should Have the Same Number of Context Changes.
The number of context changes in Input 1 and Input 2 should be equal.

Here are some examples of correct and incorrect assignments of Input 1 and Input 2 of the node
function.

The number of context changes in Input 1 and Input 2 are equal. Total of 2 context changes.
Context changes of Input 1 and Input 2 are equal. A total of 3 context changes

Incorrect assignment of Input 1 and Input 2 of UseOneAsMany. Context changes are not equal.

Rule 3: Total Number of Values in Input 2 and Input 3 should be Equal.


The total number of values in Input 2 and Input 3 queues should be equal.
For example, both queues of arguments have 4 values each.

Correct use of arguments 2 and 3 in node function UseOneAsMany.

If you follow these rules when assigning the inputs, you will not come across any Queue Exceptions.
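To make the three rules concrete, here is a standalone Groovy simulation of the function, with queues modeled as lists of contexts (an illustration only, not the CPI or PI/PO API):

```groovy
// in1: value to repeat, in2: repetition driver, in3: target context pattern.
def useOneAsMany = { in1, in2, in3 ->
    assert in1.every { ctx -> ctx.size() == 1 }            // Rule 1: no repeating values
    assert in1.size() == in2.size()                        // Rule 2: equal context changes
    assert in2.flatten().size() == in3.flatten().size()    // Rule 3: equal value counts
    def values = []
    in1.eachWithIndex { ctx, i -> in2[i].each { values << ctx[0] } }
    def iter = values.iterator()
    in3.collect { ctx -> ctx.collect { v -> iter.next() } } // reuse in3's context changes
}

// PO1 is repeated for its 3 line items, PO2 for its single line item.
assert useOneAsMany([['PO1'], ['PO2']],
                    [['I1', 'I2', 'I3'], ['I4']],
                    [['I1'], ['I2'], ['I3'], ['I4']]) == [['PO1'], ['PO1'], ['PO1'], ['PO2']]
```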

Illustration UseOneAsMany Node Function with Examples.


Let’s look at the use cases of UseOneAsMany node function with a couple of examples.

Example 1: Repeat the Purchase Order Number Using UseOneAsMany.


Let’s assume we have an interface to transfer Purchase Order information. The source message and
desired target message are as follows.

Purchase Order (PONumber) should be repeated at target for each line item.
Source and Target Messages:

The purchase order number should be repeated in each line item.

UseOneAsMany Mapping:
Input 1 is PONumber in Order context. Argument no 2 is assigned from LineItem and argument no 3 is
assigned from ItemNo segment.
Context of the Input 1 <PONumber> is <Orders>.

UseOneAsMany Arguments in Detail:

Input 1 does not have any repeating values. Only one purchase order number for each context. The first
purchase order number PO1 is repeated 3 times and the Second purchase order PO2 is repeated once.

Input 3 of the function defines the context changes of the target element. Notice that the total number
of values in Input 2 is equal to the total number of values in Input 3.
Example 2: Repeat Product ID at Target Message using UseOneAsMany.

Here we have a source message with product header details. A product can have multiple names. A
target message should be created with each product name and corresponding product ID (ProductID).

Source and Target Messages:

ProductID should be repeated at the target message.

UseOneAsMany Graphical Mapping for ProductID:


Input 1 is ProductID. Input 2 and 3 are both ProductName but the third argument uses the SplitByValue
function to derive the correct context.
UseOneAsMany Arguments in Detail:
Product-1 should be repeated twice, while Product-2 has only one product name. Product-3 has three
different names, therefore it should be assigned to the target 3 times.

UseOneasMany node function arguments.

For Input 3 we use SplitByValue node function to define the desired context changes.

Notice that we have adhered to the rules 1, 2 and 3 of UseOneAsMany function. No repeated values in
argument 1 which is the Product ID queue.
Argument 2 defines the number of times Product ID should be repeated. Inputs 2 and 3 contain the
same number of context changes in the queue.

Inputs 2 and 3 have the same number of values.

Example 3: an Alternative Way of Mapping Product ID.


Let’s take the same source and target messages as in example 2, but use the UseOneAsMany node
function differently.

Alternative Assignment of Node Function Arguments:


We can assign the arguments as below.

SplitByValue after UseOneasMany function


I would like to highlight the argument (Input) 3 in this example. Rule 3 is satisfied here as the number of
values in Input 2 and Input 3 match. We use the SplitByValue after UseOneAsMany to satisfy the context
change at the target.
[SAP-CPI] HOW TO SEND IDOC FROM SAP ERP TO SAP CPI

Step 1 : Define logical system (LS)

Go to T-Code : BD54

Assign this logical system for client.

Go to T-Code : SALE => Logical Systems => Assign Logical System to Client
Step 2 : Define Http connection to External Server (Type G)

This step requires that SAP ERP can connect to CPI, so you may have to contact your administrator or
Basis team to open this connection through the firewall or proxy.

Also, check for network issues if the connection fails

Go to T-Code : SM59

Click on node : HTTP Connections to External Server

Click : Create button


 (1) : Name of connection
 (2) : Host. This host is taken from the deployed iFlow.
 (3) : Port. Normally 443.
 (4) : Path Prefix. Based on the configuration of the integration flow.

Click on tab : Logon and Security

Step 3 : Define Port

In PI, we create tRFC port. In CPI we create XML HTTP Port

Go to T-Code : WE21

Click node : XML HTTP


Click F7 / Create

(1) : Choose connection which defined in step (2)

Step 4 : Add TRUST of CPI into SAP ERP

Go to SAP BTP => Account => SAP Integration Suite application => Monitor => Key Store

Create new key pair


Click this entries and download certificate.

Go to SAP ERP, T-Code : STRUST

Add certificate into as below


Step 5 : Test connection from SAP ERP to SAP CPI ( Check connection HTTP to External Server)

Go to T-Code : SM59

Choose connection type G : HTTP connection to external server

Click Test Connection

ICM_HTTP_PROXY_CONN_REFUSED : This message means SAP ERP cannot connect to SAP CPI. Kindly
contact your administrator to resolve this issue, e.g., open the port from SAP ERP to SAP CPI through the Firewall…

Step 6 : Define partner profile (WE20)

 Go T-Code : WE20
 Create new Partner Type : LS

 Partner No : Logical System


 Partn.Type : LS
 Ty. : User

In outbound parameter, click create

Add Message Type =WP_PLU

Receiver Port : Choose port in WE21


After all these steps, the screen looks as follows :

Test scenarios

Case 1 : Connection refused when sending data from SAP to CPI

T-Code : WE19

Edit Port of CPI

Send Outbound Processing


Go to T-Code : WE02 and check: the IDoc was sent to the external port but has Status = 02

Case 2 : Wrong SSL


Case 3 : It’s OK.

T-Code : SM59

Click : Connection Test

Input account to connect CPI


Go to WE19

Send IDOC with PORT CPI

Check T-Code : WE02


Check on SFTP on-premise
[SAP CPI] – HOW TO TRIGGER IDOC FROM CPI TO SAP ERP USING BASIC AUTHENTICATION
How to trigger an IDoc from CPI to SAP ERP using basic authentication. This scenario is useful when we
want to create XML data in a third-party system and send it to SAP ERP to create an IDoc and post a
document. In other words, it is called inbound IDoc XML. Kindly take a look at this diagram to get a
clearer picture.
In this article, we will create the scenario as follows :

 The third party will send XML data with the structure ORDERS05 – Sales Order to SFTP. In fact, for
simplicity we will use POSTMAN to send the XML to CPI
 In the integration flow, for simplicity we just use one HTTPS sender adapter and one IDOC receiver
adapter
 Finally, after the data is sent successfully to SAP ERP, we will check that the SALES ORDER was created.

Ok, here we go

Configure SAP ERP Settings

Step 1 : Active service

Go to T-Code : SICF

Search service with path : /sap/bc/srt/idoc

F8 to execute.
Step 2 : Register service

Go to T-Code : SRTIDOC
Choose Register Service. Press F8 to run with default

If the service is already registered, we will receive this message

otherwise, the registration will be successful.

Step 3 : Run Test Service to get Endpoint.

Go to T-Code : SICF

Search service /sap/bc/srt/idoc

Right click and choose Test Service


Note the URL that appears in your browser. This is the URL your IDoc adapter in HCI needs to point to.
The URL will be of the format :

http://host:port/sap/bc/srt/idoc?sap-client=<clientnumber>

Step 4 : Create Logical System

Go to T-Code : SALE

Create Logical System

Assign this logical system to client.


Step 5 : Create partner profile and add Inbound parameter

Go to T-Code : WE20

Choose partner type = LS from left side tree view

Choose create and enter the information; use the logical system from step 4 as the Partner No.
Step 6 : Test service from POSTMAN

Case 1 : No body or wrong data XML.

Go to T-Code : SOAMANAGER
Case 2 : Wrong user/pass or User not in ROLEs

Case 3 : It’s OK. IDOC will create in SAP ERP. Check T-Code : WE02
SAP Cloud Connector Configuration

Login SAP Cloud Connector

Choose Cloud to On-Premise at left side.

Create

Back-end Type : ABAP System


Next, refer to the Test Service step above to get the internal host and port

Next, choose virtual host


Add resources for this host. Take them from the URL shown when running Test Service in the step
above. In this case it is : /sap/bc/srt/idoc

Finally, check on SAP BTP


CPI Configuration

Step 1 : Create a Credential Name with the user/pass used to create IDocs on SAP ERP. This user has to
exist in SAP and have a role that can create IDocs and post documents

Go to SAP BTP

Go to Integration Suite Application

Go to Monitor

Go to Security Material under group Manage Security


Fill in user/pass. This name will be used in the Integration Flow configuration step.

Step 2 : Design integration flow

In this integration flow, use components :


Configuration for sender HTTPS

Configuration for sender SFTP ( Solution 2)

Configuration for receiver IDOC


 (1) : Address : http://<Virtual host on SCC> + <Resource on SCC>?sap-client=xxx
 (2) : On-premise
 (3) : Basic
 (4) : Credential name which configured in step 1 of CPI Configuration
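For example, with a hypothetical virtual host s4virtual:44300 registered in the Cloud Connector and the resource path from SICF, the Address field would look like:

```
http://s4virtual:44300/sap/bc/srt/idoc?sap-client=100
```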

TEST from CPI by POSTMAN

Case 1 : No body

Case 2 : It’s OK, but invalid Partner Profile

Payload
POSTMAN :

SAP/ERP T-Code : WE02


Case 3 : SAP Cloud Connector cannot connect to backend

With this case, check SAP Cloud Connector to check destination


Case 4 : Connection time out

How to track SAP CPI generated IDOC in SAP ERP

Add Content Modifier component.

Add header as below:

 Name : SapMessageId
 Source Value : SAP_MessageProcessingLogID
 Source Type : Header

Send data to SAP/ERP.

Check the control record => details. We will see a text that looks like
Check on LOG of CPI

[SAP CPI] – SCENARIO FOR RFC RECEIVER ADAPTER WITH CLOUD CONNECTOR
I want to discuss a scenario: how to send an XML message from a third-party system to the SAP backend with
the RFC receiver adapter, SAP Cloud Connector, and SAP CPI. This scenario is used for integration with an
SAP backend system over an RFC connection. For clarity, kindly take a look at this diagram
A. SAP Cloud Connection Configuration

Step 1 : Add mapping virtual to internal system

 Click Cloud to On-Premise


 Click Add
Step 2 : Add resource for this mapping. Choose function name on SAP backend

 Choose Virtual Host


 Click Add Resource (below)
 Enter the function name (function module or web service defined on SAP)
 If there are many function names, we can add them multiple times

If we check the option Send Confirm Transaction in the integration flow, we have to add the following
2 function names :

– BAPI_TRANSACTION_COMMIT

– BAPI_TRANSACTION_ROLLBACK
Everything will look as follows when done

B. SAP CPI Configuration

Step 1 : Add new destination

 Go to SAP BTP
 Click Destinations on left side menu. Under Connectivity
 Click button New Destination
 (1) : Name. This Name will use as connection name in RFC receiver adapter of integration flow
 (2) : Type : RFC
 (3) : OnPremise
 (4)(5) : User / Pass of SAP ERP. This user must have the appropriate role

Step 2 : Add property of destination

Now, we have to add some properties to the destination created in step 1. Add the following properties

 jco.client.ashost : Virtual host on SCC
 jco.client.client : Client of SAP ERP, e.g., 190
 jco.client.lang : Language, e.g., EN
 jco.client.sysnr : System number of SAP ERP, e.g., 00
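A filled-in destination could look like the following (all values are illustrative):

```
Name              : S4_RFC_DEST   (used as the connection name in the RFC receiver adapter)
Type              : RFC
ProxyType         : OnPremise
jco.client.ashost : s4virtual     (virtual host defined in SCC)
jco.client.client : 190
jco.client.lang   : EN
jco.client.sysnr  : 00
```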

Step 3 : CPI integration flow design

C. Issues

Issue 1 : If the request body has an invalid XML structure for the function module, we will receive this
message. When this issue occurs, we have to check the XML structure of the function module and fix it.

D. Test case

For test purpose, we use standard function module : SXIDEMO_AIRL_FLIGHT_CHECKAVAIL

Structure XML of this function module as

<?xml version="1.0" encoding="UTF-8"?>

<ns0:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL

xmlns:ns0="urn:sap-com:document:sap:rfc:functions">

<FLIGHT_KEY>
<AIRLINEID/>

<CONNECTID/>

<FLIGHTDATE/>

</FLIGHT_KEY>

</ns0:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL>

Case 1 : No resource of this function module in SAP Cloud Connector

<?xml version="1.0" encoding="UTF-8" standalone="no"?>

<rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Exception

xmlns:rfc="urn:sap-com:document:sap:rfc:functions">

<Name>Partner signaled an error for conversation ID [69507688] : Access denied for


SXIDEMO_AIRL_FLIGHT_CHECKAVAIL on sap-ecd-app:sapgw00. Expose the function module in your
Cloud Connector in case it was a valid request.</Name>

<Text>Partner signaled an error for conversation ID [69507688] : Access denied for


SXIDEMO_AIRL_FLIGHT_CHECKAVAIL on sap-ecd-app:sapgw00. Expose the function module in your
Cloud Connector in case it was a valid request.</Text>

<Message>

<ID/>

<Number/>

</Message>

<Attributes></Attributes>

</rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Exception>
Case 2 : Invalid date

com.sap.it.rt.adapter.http.api.exception.HttpResponseException: An internal server error occured:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>

<rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Exception

xmlns:rfc="urn:sap-com:document:sap:rfc:functions">

<Name>FLIGHT_NOT_FOUND</Name>

<Text>FLIGHT_NOT_FOUND</Text>

<Message>

<ID>BC_IBF</ID>

<Number>055</Number>

</Message>

<Attributes>

<V1> 0000 20220202</V1>

</Attributes>

</rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Exception>.

Case 3 : Invalid structure xml in Body request

com.sap.it.rt.adapter.http.api.exception.HttpResponseException: An internal server error occured:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>

<rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAILs.Exception xmlns:rfc="urn:sap-com:document:sap:rfc:functions">

<Exceptions>
<E1>Error Getting Function</E1>
</Exceptions>

</rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAILs.Exception>.

The MPL ID for the failed message is : AGI7Xs4vl3zVENgi_6XYmcdoSr6E

Case 4 : OK

<?xml version="1.0" encoding="UTF-8"?>

<ns0:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL

xmlns:ns0="urn:sap-com:document:sap:rfc:functions">

<FLIGHT_KEY>

<AIRLINEID>LH</AIRLINEID>

<CONNECTID>9981</CONNECTID>

<FLIGHTDATE>20021221</FLIGHTDATE>

</FLIGHT_KEY>

</ns0:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL>

<?xml version="1.0" encoding="UTF-8" standalone="no"?>

<rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Response xmlns:rfc="urn:sap-
com:document:sap:rfc:functions">

<FLIGHT_AVAILABILITY>

<ECONOMAX>320</ECONOMAX>
<ECONOFREE>308</ECONOFREE>

<BUSINMAX>20</BUSINMAX>

<BUSINFREE>19</BUSINFREE>

<FIRSTMAX>0</FIRSTMAX>

<FIRSTFREE>0</FIRSTFREE>

</FLIGHT_AVAILABILITY>

</rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Response>
[SAP CPI] – INTEGRATION WITH KAFKA IN CPI

We will explore one more adapter in CPI – the Kafka adapter. In particular, we will explore how to send a file
to Kafka and receive a file from Kafka. Yes, it is similar to SFTP.
Step 1 Build Kafka service

First, we need a server running the Kafka service. Fortunately, there is a cloud service with a 30-day free
trial. Kindly go here to create a Kafka service on the cloud.

After the Kafka service is created, we will have Kafka as below

Download 3 files

 Access key -> Service.key


 Access Certificate -> Service.cert
 CA Certificate -> ca.pem

Step 2 Create key store (file P12)

Using the OpenSSL tool, run the following command line to create the key store (P12 file):

openssl pkcs12 -export -name server-cert -in service.cert -inkey service.key -out dev-cpi-server-key-store.p12

Enter a password. This password will be used in a later step.

After it is done, we will have the file in the same folder as service.cert, ca.pem and service.key

Step 3 Create Key Pair in SAP CPI

In this step we will take the key store created in Step 2 and add it to SAP CPI.

Monitor – Keystore
Add – KeyPair

 (1) : File P12 which created in step 2


 (2) : Password in step 2.

Step 4 Download the Kafka Server Certificate and Import It into CPI

Next, we will connect to the Kafka host with the key pair from step 3 to download the Kafka certificate,
and then import this certificate into CPI

Go to Kafka cloud service and copy host

Go to CPI – Monitor – Connectivity Tests


Use the key pair from step 3 with the Kafka host and port to test the connection to the Kafka service and
download the server certificate

Import certificate into key store SAP CPI

Monitor – Key Store – Add Certificate

We have to import 2 certificates of server into SAP CPI


Step 5 Test connection Kafka in SAP CPI

Monitor – Connectivity Tests – Kafka


As we can see, the connection is OK. At this point, the connection from SAP CPI to Kafka is finished.

Kafka Receiver: Send file from S4 to Kafka (Outbound)

Data sent from S4/HANA and saved into Kafka in JSON format
Unit Test

Send Data from S4. Check in WE02


Copy the IDoc number xxxxx35 and search for it in SAP CPI

Check in topic Kafka

Data sent from S4/HANA and saved into Kafka in binary format
NOTE

With this format, we will use the Base64 Encode component to convert the XML payload from S4/HANA to
Base64 binary

This scenario, you can try by yourself.

Kafka Sender : Collect data from Kafka to S4 IDOC (Inbound IDOC)

Data in Kafka is XML converted to binary

With this format, the 3rd-party system will create the payload in XML format, then convert the XML
payload to Base64 and send it to Kafka

CPI will collect the data from Kafka with the Kafka sender adapter and send it to S4/HANA via the IDOC
receiver adapter.

Payload XML
Convert to base64 and save to Kafka
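The Base64 round trip can be sketched in standalone Groovy (the payload here is a short placeholder, not a full ORDERS05 document):

```groovy
def xmlPayload = '<ORDERS05><IDOC BEGIN="1"/></ORDERS05>'

// What the third-party system stores in Kafka ...
def encoded = xmlPayload.bytes.encodeBase64().toString()

// ... and what CPI restores before handing the payload to the IDOC receiver.
def decoded = new String(encoded.decodeBase64())

assert decoded == xmlPayload
```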

Integration Flow with sender is Kafka and Receiver is IDOC


Check log in SAP CPI with Payload
Check on S4/HANA
Data in Kafka is JSON

All steps are the same as above. You can try it by yourself.

NOTE

After data is collected from Kafka and sent to S4/HANA, the message in Kafka is not deleted; instead, its
offset is marked as consumed, so it will not be sent to S4/HANA again.
[SAP CPI] – MONITORING MESSAGE IN CPI AND S/4
Hi guys, in this article I want to share a tip about monitoring messages in CPI and S/4. As you know,
monitoring and searching messages is important in integration system maintenance, to fix issues when they
happen. For example, when a non-SAP system sends an XML message to S/4 through CPI using the IDOC
receiver adapter, how do we get the CPI message ID in S/4, and how do we get the S/4 IDoc number in SAP
CPI? Another example: in an IDoc outbound scenario, where S/4 sends an XML message to a non-SAP system
through CPI using the IDOC sender adapter, how do we get the S/4 IDoc number and some data from the
IDoc data records in SAP CPI? I will answer all these questions in this article.

I. Understanding the headers in CPI used to log the message ID.

In CPI, we have some headers which are used for searching messages in CPI.

SAP_ApplicationID – This is where you can put the content of your payload. So you can put in your
invoice number. This value will display in field Application Message ID in MPL search

SAP_MessageType – This is where you can put extra data like message type, basic type, extension…This
value will display in field Application Message Type in MPL search

SAP_Receiver – This is where you can put name of receiver system. This value will display in field
Receiver in MPL search

SAP_Sender – This is where you can put name of sender system. This value will display in field Sender in
MPL search

SAP_MessageProcessingLogCustomStatus – This is where you can put extra data for the Custom Status field.
This should be configured in the Exchange Properties tab

SAP_MessageProcessingLogID – This is message processing log ID in MPL

In addition, you can use a Groovy script to add a custom header to the MPL search, as below

import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {

    def messageLog = messageLogFactory.getMessageLog(message);

    if (messageLog != null) {

        // Read IDoc number from header
        def IDOCNUM = message.getHeaders().get("IDOCNUM");

        // Set IDoc number as custom header
        if (IDOCNUM != null) {
            messageLog.addCustomHeaderProperty("IDOCNUM", IDOCNUM);
        }
    }
    return message;
}

II. Logging in Inbound Scenarios with header in MPL

Scenario: POS/non-SAP systems will send messages to S/4 through CPI. In the CPI MPL we have to log some data

 SAP_ApplicationID – POS number


 SAP_MessageType – Message Type
 SAP_Sender – POS
 SAP_Receiver – Logical System of S/4. This value should be parameter.
 SAP_MessageProcessingLogID – Mapping into field ARCKEY in IDOC control record EDI_DC
 Custom Header Property – IDOC number reply from S/4

In this scenario, we also use a Groovy script to pass header values from outside into the message mapping,
to resolve dynamic values for the control record (EDIDC_40) fields across environments.

 TABNAM
 MANDT
 DIRECT
 IDOCTYP
 MESTYP
 SNDPOR
 SNDPRT
 SNDPRN
 RCVPOR
 RCVPRT
 RCVPRN
 ARCKEY : urn:sap.com:msgid=<messageID>
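The ARCKEY entry in the list above is just a fixed prefix concatenated with the MPL ID, for example (the ID value is a placeholder):

```groovy
def mplId = 'AGI7Xs4vl3zVENgi'                 // SAP_MessageProcessingLogID (placeholder)
def arckey = 'urn:sap.com:msgid=' + mplId
assert arckey == 'urn:sap.com:msgid=AGI7Xs4vl3zVENgi'
```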

To do this, we create groovy script

/* Refer the link below to learn more about the use cases of script.

https://help.sap.com/viewer/368c481cd6954bdfa5d0435479fd4eaf/Cloud/en-US/
148851bf8192412cba1f9d2c17f4bd25.html

If you want to know more about the SCRIPT APIs, refer the link below
https://help.sap.com/doc/a56f52e1a58e4e2bac7f7adbf45b2e26/Cloud/en-US/index.html */

import com.sap.gateway.ip.core.customdev.util.Message;

import java.util.HashMap;

import com.sap.it.api.mapping.*;

import com.sap.it.api.mapping.MappingContext;

def String getheader(String header_name, MappingContext context) {

def headervalue= context.getHeader(header_name);

return headervalue;

}
This Groovy script receives a header name from the content modifier of the iFlow and returns the header
value. This way, we can configure the EDIDC_40 control record values of the IDoc dynamically.

This is integration flow for inbound IDOC from POS/NON-SAP to S/4

This is configuration for content modifier

With the IDOC receiver adapter, we should use Request Reply to get the response from S4 back to CPI.
This is payload response

So, after this step we will use a content modifier and a Groovy script to set the custom header

Add a header named IDOCNUM with the value //ns2:DbId/text() and data type java.lang.String
Pay attention to the namespace prefix ns2; we have to set the namespace prefix mapping
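If you prefer a script over the XPath expression, the same DbId value can be extracted with XmlSlurper in standalone Groovy (the response XML and its namespace URI here are simplified stand-ins for the real IDOC adapter response):

```groovy
import groovy.xml.XmlSlurper

def response = '''<ns2:Response xmlns:ns2="urn:example:idoc">
  <ns2:DbId>0000000123456789</ns2:DbId>
</ns2:Response>'''

// Namespace-aware parse; match on the local element name DbId.
def root = new XmlSlurper().parseText(response)
def idocNum = root.'**'.find { node -> node.name() == 'DbId' }?.text()

assert idocNum == '0000000123456789'
```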

After all, we will have result

For search in Monitor Message Processing


III. Logging in Outbound Scenarios with header in MPL

Everything is the same as in the inbound scenario. For the outbound scenario, we send message type DEBMAS
from S4 to POS/non-SAP. We set the headers in a content modifier

And add one field in the Exchange Property tab for the Custom Status:
SAP_MessageProcessingLogCustomStatus

This is result
SUMMARY

In this article, I talked about monitoring messages in CPI with headers. I also talked about how to set
dynamic values for the IDoc control record from outside header values in the message mapping by using a
Groovy script. Thank you for reading, and if you have any advice, kindly leave a comment. Thanks.

[SAP S/4 HANA CLOUD] – HOW TO SEND DATA FROM S/4 HANA CLOUD INTO SAP CPI
Hi guys, as you know we will upgrade to S/4HANA Cloud in the near future and, as integration consultants,
we need to explore how to integrate third-party systems with S/4HANA Cloud through SAP CPI.

In this article, I want to share the first scenario: how to send data from S/4HANA Cloud to an external
system through CPI by using DRF – the Data Replication Framework.

First, let's look at the flow we will build in this scenario.


To follow this scenario, we need some roles in S/4HANA Cloud:

 Administrator : SAP_BR_ADMINISTRATOR

 Administrator – Data Replication : BR_ADMINISTRATOR_DATA_REPL

 Configuration Expert – Business Network Integration : BR_CONF_EXPERT_BUS_NET_INT

IDENTIFY DRF OBJECT ON API BUSINESS HUB

Go to the SAP API Business Hub.

Search for Business Partner SOAP in SAP S/4HANA Cloud.


Since we want to send business partner data to an external system, we use communication
scenario SAP_COM_0008.
CONFIGURATION COMMUNICATION SYSTEM ON S/4 CLOUD

Log in to S/4HANA Cloud.

Search for the Communication Systems app.

Add a new communication system.


Because this scenario sends data from S/4HANA Cloud to CPI (outbound), we only configure the
Outbound Communication tab.

Input anything as the hostname and save. We will come back to this screen after configuring the artifact in SAP
CPI.

CREATE PACKAGE AND ARTIFACTS ON SAP CPI

Go to SAP CPI

Create package

Create Artifact

Add a SOAP sender adapter, because we use communication scenario SAP_COM_0008.

Save & deploy this iflow; after deployment we get the endpoint URL.


CREATE SERVICE KEY ON SAP INTEGRATION SUITE

In this step, we need to create a service key in CPI. This step is similar to configuring a user for outbound
calls from on-premise / RISE SAP in SM59.
Copy the Client ID and Client Secret for the next step.

UPDATE OUTBOUND USER AND SOAP END POINT ON COMMUNICATION SYSTEM APP S/4 CLOUD

Go to S/4 Cloud

Go to Communication System App

Edit
Save

CONFIGURATION COMMUNICATION ARRANGEMENT APP IN S4 CLOUD

Go to the Communication Arrangements app

Create a new arrangement.
Select the scenario from the first step: SAP_COM_0008.

We need to send business partners, so we select and configure Business Partner.
We have to define a replication model and an output mode for every object we want to send from
S/4HANA Cloud to the external system, in the Additional Properties section.

With Output Mode, we have 2 options

 P : Pooled Output
 D : Direct Output.

Save and Check connection to SAP CPI


TEST SCENARIO: SEND DATA TO SAP CPI BY USING THE REPLICATION APP

In this section, we test sending data to SAP CPI. To do this, we have to assign the
role BR_ADMINISTRATOR_DATA_REPL to the user.

Go to the Maintain Business Roles app.

Create a business role from a template, e.g. Z_BR_ADMINISTRATOR_DATA_REPL.


Assign the user to this new business role.

This time, search for apps with the keyword REPLICATION.

And go to the Replicate by Replication Model app.


MONITOR THE STATUS OF MESSAGES REPLICATED TO THE EXTERNAL SYSTEM

Go to Monitor Replication App


Check MPL on SAP CPI
[SAP S/4 HANA CLOUD] – HOW TO SEND DATA FROM S/4HANA CLOUD TO AN EXTERNAL SYSTEM BY OUTPUT TYPE
Hi guys, in the previous tutorial I shared step by step how to send data from S/4HANA Cloud to an external
system by using DRF – Data Replication Framework. Next, in this tutorial, I will share step by step how to send
data to an external system by using an output type. To keep it simple, I will use the PURCHASE ORDER business
object for the demo.

EXPLORING API RELEVANT TO PURCHASE ORDER BUSINESS OBJECT

Go to the SAP API Hub.


Select the communication scenario corresponding to your integration. Here, I select SAP_COM_0224.
CONFIGURATION OF THE INTEGRATION ON S4HC

STEP 01 – COMMUNICATION USER

This is the user used for communication between the systems. For every external system, we should create a
separate communication user.

The INBOUND user is created on S4HC.

The OUTBOUND user is created on the external system and used in S4HC for the outbound service.

STEP 02 – COMMUNICATION SYSTEM

In an integration landscape, we will have many external systems, e.g. POS, CPI, WMS, etc. We have to create
each external system on S4HC.

We use the COMMUNICATION SYSTEMS app to do this.


(1) – Host name of the external system. If we use SAP CPI as the external system, this is the CPI host name
we get when we deploy the iflow.

(1) – The inbound user we created in step 01.

(2) – Since we use SAP CPI to receive data from S4HC, this user is the Client ID and Client Secret from the
service key of the iflow.
STEP 03 – COMMUNICATION ARRANGEMENT

After selecting the communication scenario on the API hub, we use the COMMUNICATION ARRANGEMENTS app
to create the scenario.

NOTE

If you cannot find the scenario here, check whether the scope item has been activated by SAP.
(1) – Select the SYSTEM integrated with S4HC in this scenario.

When you select the SYSTEM, the INBOUND and OUTBOUND users are filled in automatically.

(1) – Input the path. We get this path from SAP CPI after deploying the iflow.

Because we selected the SOAP method, we need the HOST & SOAP RESOURCE of the external system. Here we
use SAP CPI, so the iflow looks like this.
STEP 04 – CONFIGURATION OUTPUT TYPE

Next, we configure the output type by using the OUTPUT PARAMETER DETERMINATION app.

In this configuration, we just focus on OUTPUT TYPE, RECEIVER, and CHANNEL.
(1) – Rule for PURCHASE ORDER

(2) – Step OUTPUT TYPE

(3) – We send only purchase orders with document type NB


STEP 05 – CONFIGURATION MESSAGE DASHBOARD

In an integration scenario, we need to monitor all messages sent from S4HC, so we configure the Message
Dashboard to do this.

First we need to add a namespace for the user by using the ASSIGN RECIPIENTS TO USERS app.
UNIT TEST AND MONITORING

We send a purchase order to the external system by using the MANAGE PURCHASE ORDERS app.
We monitor the message by using the MESSAGE DASHBOARD app.
Monitoring on the receiver system – SAP CPI
Payload

INTEGRATION APPROACHES AND ADVISING IN SAP RETAIL (PART 1)


May 26, 2021 – Cuong Dang (Cody)
Hi Fellows,

Today I will share with you some thoughts about integration approaches and performance. This is not a new topic
for you guys when designing an integration landscape with third-party systems.

We all know that integration design depends on the systems' technology, enhancement capabilities,
project timeline, cost, and the adaptability of the systems in the landscape.

In this topic, I will not go into the dependencies mentioned above. I will share my opinions about
how to design an integration landscape that is flexible, scalable, offline-capable, and independent.

Within the limits of my knowledge, please note that this article is "my opinions"; please share your opinion
in the comments below the topic.

[1] – Message Exchange Models

Talking about integration, we think about documents/objects transferred from system A to system B
and vice versa. Data is exchanged from system to system across different data stores
(databases) and different system technologies.

We can see data exchange in daily life. A customer places an order on an e-commerce website, and the data
is sent to a store/warehouse. The storekeeper picks and packages the goods and puts them in a
temporary location; the delivery man receives delivery orders and delivers them to the customer. The
customer receives the goods and confirms the orders/payments. Basically, the objects exchanged between seller and
customer are the goods.

All sales activities are captured in systems: the e-commerce system can be an in-house application, the order
management system (OMS) can be in-house or from a software vendor, and the delivery management
system (DMS) can be in-house or from a vendor.

To represent the business-process objects exchanged from system to system, we can simply call them
"messages".

[1.1] – Synchronous Messages


Look at a very simple conversation between two ladies. The left lady says "Hi" and waits for the right
lady to respond to her message. The right lady responds, "Hi, how are you?"

We can think of the example above as similar to synchronous messages. The left lady starts by
sending a message to the right lady and always waits for the right lady's response. We can consider the whole
conversation a "session".

The conversation between the ladies happens in the same time block, so we can call it "real-time".

Synchronous system-to-system message exchange is similar to the conversation between the two ladies:
system A sends messages to system B and always waits for a response from system B to finish the
session.

Synchronous message exchange in integration:

 Web service
 Direct database access
[1.2] – Asynchronous Messages

The mailman goes from house to house and puts newspapers in the mailboxes as scheduled; he does not care whether the
house owner has received them or not, he moves on to the next house and puts them in that house's mailbox.

The messages here are the newspapers; each time he puts newspapers in a mailbox, he completes
the "session". Each hour, he can deliver many newspapers to many houses.

In some scenarios with valuable goods, the mailman needs a sign-off from the receiver.

The mailman sends the mail and the house receives it in a different time block, so we can call it "off-line".
Very simple concepts, right? Data exchange between systems follows the same concept for asynchronous
messages: system A can send out messages to many receiving systems without expecting any
response.

In the integration design, files can be exchanged from system A to system B as often as needed, and
it does not matter when or how system B receives the files.

Asynchronous message exchange in integration:

 File transfer
 Email

[2] – What’s Integration Approaches ?

There isn't a single approach for integration design; it depends on:

 How many messages will be exchanged between the systems?


 What are the limitations of the systems involved in the integration landscape?
 How good is the connectivity quality of your network?
 Does system A wait for system B's messages before processing the next actions immediately?
 Do you need real-time data?

[3] – Advising for SAP Retail Integration

[3.1] – Master Data

SAP ERP sends master data to third-party systems.

Normally, SAP ERP manages all master data:

 Plant Master
 Vendor Master
 Customer Master
 Material Master
 Promotion Definition
 Selling Price
 Assortment.
 …

Almost all master data objects have a defined structure for exchange with non-SAP systems, the "SAP IDOC"; the
objects' values are populated into the IDoc structure.

New/changed data is recorded in the SAP ERP system in the "Change Pointer" data store.

Why shouldn't we use web services to exchange master data (synchronous integration)?

 Many legacy/third-party systems receive master data from SAP ERP with different data
requirements, so developing or consuming web service APIs becomes decentralized.
 Master data in retail is huge and can change regularly, so the performance of SAP ERP becomes dependent on
the receiving systems.
 Information leakage: SAP ERP publishes all objects' data as a web service provider; otherwise we have to develop
an individual API for each receiving system.
 When master data changes in SAP ERP, third-party systems won't know exactly which object was changed. In
some scenarios, the 3rd system has to fetch all data on a schedule and filter out the changed objects, which causes
unwanted data transfer between SAP and the non-SAP system.
 Bottlenecks during message exchange.

Why should we use files to exchange master data (asynchronous integration)?

 SAP has built-in IDocs for master data, so we can re-use the data structures.
 Each partner system can have different data requirements; we can re-use the original messages with
filters/mappings to adapt to specific requirements.
 Huge volumes of master data can be sent to the non-SAP system without any performance dependencies.
 SAP tracks changed objects and sends only from the "Change Pointer" containers, which reduces
redundant data.
 Flexibility to control field mappings with 3rd-party systems.
 Reduced development tasks.
 Message splitting helps keep data consistent.

End of Part 1

In the next part, I will talk about transactional data.


[3] – Advising for SAP Retail Integration

[3.2] – Transaction Data

Integration objects

There are many documents that will be exchanged from system to system, between SAP and non-SAP
systems.

We can divide them into three groups:

Front-end Operation Transactions

 Customer places a sales order on the front-end system (POS, mobile app, e-commerce website, or other
touchpoints…)
 Customer makes a payment.
 Customer confirms received goods.
 Customer queries loyalty information.
 Customer queries promotion programs.
 Customer picks up goods at the store.
 Delivery processing.
 Service order documents.
 Deposit documents.
 Credit note documents.
 Installment documents.
 Reservation documents.
 E-invoice documents.
 Stock availability check.
 …

Backend Operation Transactions


 Goods movement documents between store and store.
 Goods movement documents between store and distribution center (DC).
 Goods movement documents between distribution center (DC) and distribution center (DC).
 Contract management documents.
 Purchase order (PO) documents.
 Goods receipt documents.
 Delivery order documents.
 Financial documents.
 Production order documents for the manufacturing process.
 Work-in-process documents for the manufacturing process.
 E-invoice documents.
 …

Analysis Transactions

 Retail sales transactions.


 Wholesale transactions.
 Profitability analysis.
 Retail KPI data (sales per employee, conversion rate, gross and net profit, average transaction value,
online sales relative to brick-and-mortar locations, year-over-year growth, stock turn, gross margin
return on investment, sell-through, shrinkage, customer retention…)
 …

When you look at the lists of documents above in the front-end, back-end, and analysis groups, handled by
SAP or third-party systems, we realize that not all documents need to be transferred
immediately like "Delivery Orders (DO)".

So which documents need to be transferred immediately (synchronous messages)?

Cash and Carry: Customers choose items and pay at the store; receipts are confirmed after
customers pay at the cashier. For grocery or FMCG, many customers come to the supermarket
and pay at the same time, especially on weekends. If we designed this with synchronous
messages, then any network connection issue or HQ/SAP server performance issue
would hugely impact the business and sales opportunities. The sales data does not need to be transferred to the
backend system immediately; the highest priority is to let the POS capture sales as fast as possible.

Click and Collect – Home Delivery: Opposite to Cash and Carry, customers select items on online
touchpoints (mobile app, e-commerce website) and do not know the actual physical inventory of
the items. The front-end system needs to check inventory immediately against the back-end system. Then, after
the order is confirmed in the system, the delivery process is performed on schedule. We don't need to
create delivery orders immediately, but we do need to reserve goods immediately to reduce out-of-stock situations.

In the sales scenarios above, "Cash and Carry" and "Home Delivery", note that for the transactions the customer
interacts with directly, we can use synchronous message transfer.
Reservation: Reserve goods to make sure stock is available to deliver to the customer. The reservation
must be executed immediately when the customer confirms the order.

Stock Availability Check: Consumers want to know stock availability on online touchpoints. A
synchronous message needs to be used to get the current stock indicators from the system.

Payment and Financial Transactions: Accounting wants to control revenue as soon as possible.

Which documents do not need to be transferred immediately (asynchronous messages)?

Front-end Operation Transactions

 Sales per receipt in the Cash and Carry model. Many receipts are created in the system at the same
time; the cashier needs to scan items as fast as possible, and the POS needs to run queries and return results as
quickly as possible…

Back-end Operation Transactions

 Transactions related to inventory movements in retail are also huge and do not impact the business
directly. The store/DC operators can capture the movement transactions, and the systems will exchange them
on schedule.

Analysis Transactions

 Analysis transactions are sent to the analysis system when the source system finishes a business
process. To make sure there is enough information and data to calculate, the analysis system needs
to collect as much as possible, so the data transfer from ERP to the analysis system can be very large.
Synchronous messaging is not necessary for this kind of transaction.

SFTP questions :

 Assume we are getting order details from the source system, but the business needs the order number in
the file name on the target system. Can we implement this requirement in SAP CPI, and if yes, how?
 How do we populate file names dynamically using the FTP/SFTP adapter?
 Assume we are getting order details from the source system, but the business needs a timestamp
(yyyy/mm/dd) in the file name on the target system. Can we implement this requirement in SAP CPI,
and if yes, how?
 How do we pick files from multiple folders in CPI through the SFTP adapter?
 Zipping multiple files in SAP CPI and sending them to the target as a zip folder.
 Using the SFTP adapter to poll multiple files from different folders.
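For the first three questions above, one common approach is a Groovy script that sets the CamelFileName header before the SFTP receiver. A minimal sketch; the property name orderNo is an assumption and would be extracted earlier, e.g. by a content modifier with XPath:

```groovy
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // 'orderNo' is an assumed exchange property holding the business order number
    def orderNo = message.getProperty('orderNo') ?: 'UNKNOWN'
    def timestamp = new Date().format('yyyyMMdd_HHmmss')

    // The SFTP receiver uses CamelFileName as the target file name
    // when the adapter's File Name field is left empty
    message.setHeader('CamelFileName', "Order_${orderNo}_${timestamp}.xml")
    return message
}
```

Alternatively, the adapter's File Name field itself accepts dynamic expressions such as headers (${header.CamelFileName}) or date expressions like ${date:now:yyyyMMdd}.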

-----------------------------------------------------------------------------------------------------------------------------------
 How to transport an iflow from QA to Prod using CTS+ in SAP CPI, and how to explain it in an interview.
 Say the client provided some IDocs and said they failed to reach the receiver system. How do you
reprocess them from CPI, given that we don't have the IDoc numbers stored in CPI?
 End-to-end flow from design to production deployment: what work we do in each environment, in detail,
and how to explain it in an interview.
 Exception handling using data store and JMS in detail – business use case.
 A brief introduction to CPI and one complex scenario.
 Value mapping, data store, multicast – a business use case for each.
