IDOC Now Printout
RFC adapter – if we are not able to connect, we will contact the Basis team
XI adapter
In SAP Cloud Platform Integration (CPI), the main difference between a tenant management node and a
runtime node is that a developer designs and modifies integration content on the tenant management
node, while the runtime node executes the deployed iFlows and only lets you check the status of messages:
SAP CPI (Cloud Integration)
SAP API Management
Open Connectors
Integration Advisor
Open Connectors – if there is no adapter provided by SAP and you want to connect CPI with a third-party
system, then we go for Open Connectors
Integration Advisor: when you have an EDI requirement, then we go for Integration Advisor
The first one (the tenant management node) is where the iFlow is created; the second one (the runtime node) is where it is deployed and runs
Basic Edition
Standard Edition
Premium Edition
Q: In real time, can we poll more than one file using Poll Enrich in a single run? A: No, Poll Enrich
will pull only one file per run in real time too.
We can also use the runtime property SAP_PollEnrichMessageFound to check the file-exists condition in a
Looping Process Call step.
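As a sketch, the Looping Process Call condition can keep iterating while Poll Enrich keeps finding files; the property name comes from the note above, while the iteration cap is just an illustrative value:

```
Condition Expression:      ${property.SAP_PollEnrichMessageFound} = 'true'
Max. Number of Iterations: 99
```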
Q: Can you please make a video on picking files from 10 different directories of a single SFTP server
using the SFTP adapter?
A: You have to use multiple Poll Enrich steps to pull data from multiple folders.
Q: How do we maintain multiple directories? How can we achieve this (via value mapping?)
Q: At the iFlow's first run, the variable is not created yet, but we need some initial/default value for
processing. How do we handle this chicken-and-egg situation?
A: Use a Content Modifier to read the variable and set a default value.
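A minimal sketch of such a Content Modifier row; the variable name lastRunDate and the default value are made-up examples (the default applies only when the variable does not exist yet):

```
Name: lastRunDate   Source Type: Local Variable   Source Value: lastRunDate
Default: 1970-01-01T00:00:00
```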
Data Store :
Q: When writing a list of records to a data store, if some record processing fails, can the data store
operation be partially successful and partially failed?
A: Yes. In new iFlows, 'Transaction Handling' is 'None' by default.
Q: How do we select multiple entries from a data store, process each entry independently one by one,
keep the successful ones, skip the failed ones, and then write to different data stores on success or failure?
A: Use a combination of data store Select, Splitter, Router, an exception subprocess, the 'Transaction
Handling' setting, and 1 source data store + 2 target data stores to achieve this. Will show in a course lesson.
SAP_ApplicationID
SAP_CorrelateMPLs
SAP_MessageType
SAP_Sender
SAP_Receiver
SAP_MessageProcessingLogID
The JSON-to-XML converter cannot handle JSON that starts with an array, so a quick and easy fix is to use
a Content Modifier to wrap the JSON in an object.
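For example, a Content Modifier body like the following (the root wrapper name is arbitrary) turns a top-level JSON array into an object the converter accepts:

```
{"root": ${in.body}}
```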
For appending XML bodies, e.g. in a Looping Process Call, keep appending to
${property.appended_body}
Save the payload for later use – use a property to temporarily store the payload
Extract a value from the body and set a header or property from the XML with source type XPath
Outbound Configuration
Go to T-Code : BD54
Assign this logical system for client.
Go to T-Code : SALE => Logical Systems => Assign Logical System to Client
Go to T-Code : WE21
Click F7 / Create
Step 5 : Test connection from SAP ERP to SAP CPI ( Check connection HTTP to External Server)
Go to T-Code : SM59
Inbound Configuration:
Go to T-Code : SICF
Go to T-Code : SRTIDOC
Go to T-Code : SICF
Note the URL that comes in your browser. This is the URL where your IDoc adapter in HCI needs to point.
The URL will be of the format:
http://host:port/sap/bc/srt/idoc?sap-client=<clientnumber>
Go to T-Code : SALE
Go to T-Code : WE20
Choose create and input information in step 4 – local system as Partner No.
Go to T-Code : SOAMANAGER
SAP :
IDOC
RFC
Proxy – we use either the SOAP adapter or the XI adapter
In CPI
Consumer Proxy: also called receiver proxy (ABAP proxy on the receiver side)
Provider Proxy: also called sender proxy (ABAP proxy on the sender side)
If the on-premise SAP application (IDoc/RFC/Proxy) is on the sender side, then we don't need to use
Cloud Connector
SOAP adapter :
The endpoint will be given to the Basis team so that they can create a logical port in SOAMANAGER
XI adapter :
RFC destination of type G
PI – ALE Support
SM59 -> RFC destination
Port
Logical system
Partner profile
CPI :
We need to import certificates at both ends, i.e., on the ECC side and the CPI side
We need to download the CPI certificates from the Manage Keystore and provide them to the Basis team,
who will upload them in STRUST
The SAP certificate we have to download from STRUST and upload to CPI -> Manage
Security
Cloud Connector:
If the SAP on-premise application is on the sender side (RFC/IDoc/Proxy), then we don't need Cloud
Connector
CPI :
We will create two iFlows
Common iFlow – for every IDoc we use the same RFC port and the same RFC destination on the SAP
side. Once the message arrives in CPI, value mapping decides the route: if the IDoc is MATMAS it
should go to this receiver, if it is ORDERS it should go to that receiver. We put these conditions in the
Value Mapping, identify the receiver based on source agency, target agency, IDoc type and message
type, and route the messages accordingly.
IDoc Sender -> CPI (Value Mapping) -> ProcessDirect
ProcessDirect Sender -> CPI -> SFTP/REST (actual receiver)
On the SAP side, this avoids multiple RFC destinations
On the IDoc sender side, the control record has a document number field (the IDoc number); it is
created in the SAP application and we get it from there
On the IDoc receiver side, the IDoc number is generated in the SAP application and comes back in the
IDoc response header property (SAP generates it),
i.e., the Application Message ID
Process Direct – used to call one iFlow from another; it is an internal adapter
When to use which adapter:
1) Use the relevant adapter based on what the target system offers (they have IDoc, or proxy, or a
web service exposed from SOAMANAGER).
2) If standard IDocs are available in SAP S/4 or ECC (either inbound or outbound) for the particular
interface, you will design your iFlow with the IDoc adapter.
3) If no standard IDocs are available, they might take a custom proxy approach; then
you should go with the XI adapter.
4) If there are any services exposed as web services from SOAMANAGER [e.g., SuccessFactors
Employee and Org Data replication], then you will go with the SOAP adapter.
The BD54 transaction code and SALE are one and the same; both navigate to the same screen
IDOC
RFC
Proxy
SFTP File
Intermediate Document
Idoc Number:
This, however, means that for each IDoc type that flows from S4H to CPI, a separate type G connection will
have to be created [/cxf/matmas or /cxf/debmas....] in SM59
One could possibly mitigate this by having a generic routing iFlow [/cxf/idoc and thereby having only 1
SM59 connection for IDOCs to this CPI iFlow] and then making process direct calls to individual iFlows
processing different IDOC types, depending on field values in the control segment.
It would be ideal though, if SAP could build something on S4H, which would allow specifying a generic
path prefix [/cxf/idoc/<idoctype>] in SM59 connection [so that there is one connection, like in tRFC port
for IDOCs between ECC and PO ] and dynamically route the calls [?] to the iFlow endpoints [ /matmas or
/debmas... maintained in Connection of respective iFlows] from S4H.
Control records – IDoc type, message type, IDoc number, sender port, receiver port, receiver
partner, sender partner ..
Data Records
Part 1 – Outbound
Part 2 – Inbound
In Part1 I described the necessary settings to RZ11, STRUST and SM59 type G RFC Connection, IDOC Port
in WE21 and Partner Profile in WE20
1. System Connection between S4H and CPI
2. Background Configurations required
3. Configure Idoc Sender Adapter
4. Search messages in message monitoring based on Idoc Number
To be able to send messages from SAP S/4HANA to SAP CPI, import the CPI certificates into S/4HANA STRUST.
You can download the CPI certificates either from the keystore or from any browser.
1. Login to S4hana system --> Tcode STRUST (Import all 3 CPI certificates) --> Add to Certificate List
2. Create Logical System for S4H and CPI system in Tcode BD54
3. Create RFC Destination of Type G -->
IDoc Adapter
Configuration -->
8. In order to be able to search for my iFlow in monitoring based on IDoc number, I have configured a
Content Modifier with header values.
I am using headers SapIDocDbId, SapIDocTransferId and SapIDocType in my mail body, and created
SAP_ApplicationID to pass the IDoc number from the payload, which helps me monitor messages
based on IDoc number. The value for SAP_ApplicationID is the XPath of my IDoc DOCNUM.
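As an illustration, the XPath for SAP_ApplicationID could point at the DOCNUM field of the IDoc control record; the exact path depends on the IDoc type and namespace:

```
//EDI_DC40/DOCNUM
```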
9. Configure Mail adapter and deploy IFlow
The ABAP team will sit with the functional team to prepare the IDocs; we will use these IDoc structures
The answer is based on a Camel condition: before the Gather we use a Splitter, and the Splitter sets the
exchange property CamelSplitComplete to true on the final record. When we get that true value, that was
the final record, so the Gather also understands that there are no more records.
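A sketch of the corresponding check, e.g. as a Router condition after the Splitter (CamelSplitComplete is the standard Camel exchange property):

```
${property.CamelSplitComplete} = 'true'
```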
S/4HANA is available both on premise and in the cloud
In real time, we are not the owners of tenant creation; only the BTP team provides the tenant to us, and
we may not even have access to BTP
In a data store we can store the entire message/payload, whereas a variable stores a single value like a
timestamp or date. A data store keeps entries for the number of days we specify.
Data stores are chargeable in CPI, so we have to pay if we store data for a long time. For a variable you
don't need to pay; it's just a small place to keep something for when it's required. The Aggregator concept
is different: it holds data only at runtime, but a data store holds it for as long as you configure.
A property exists within the flow only. A variable declared at iFlow level can be used in that iFlow only,
but if you declare it at global level you can use it wherever you want.
The maximum data store retention period is 180 days; by default it is 30 days.
Suppose I have two integration flows (IF1, IF2), where IF2 is connected to IF1 via ProcessDirect.
Properties set in IF1 cannot be accessed in IF2, because the scope of a property is its iFlow (IF1),
whereas headers created in IF1 can be accessed in IF2 because a header has global scope.
If you want to use that property, you need to copy it into a header in the same iFlow.
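A sketch of that workaround: a Content Modifier header row in IF1, set just before the ProcessDirect call (the name myValue is illustrative):

```
Header Name: myValue   Type: Expression   Value: ${property.myValue}
```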
We have payroll data with both active and inactive employees; we can store that data in the data store
using a Write step.
In the next run we can read that data from the data store using a Get or Select step. At iFlow level we
compare the active and terminated employees' data (using a script or whatever you use), delete the
terminated employees' data, and again store the active employees' data in the data store under the
entity name "payroll records".
You can get more details on exception using ${exception.message} or ${exception.stacktrace}.
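For instance, a Content Modifier body inside an exception subprocess could build an error log or mail text from these expressions (the layout is illustrative):

```
Error while processing message: ${exception.message}
Stack trace:
${exception.stacktrace}
```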
Say that within one integration flow, the main process calls a subprocess using a Local Process Call.
If an exception is raised in the subprocess, does the processing of the main integration process end?
IDoc scenarios:
In value mapping we define conditions: if it's MATMAS it should go to this receiver, if it's ORDERS it
should go to that receiver
As we are currently replacing our SAP Business Connector with SAP CPI, we faced the question of how to get
the IDoc structure out of SAP ERP to import into SAP CPI's message mapping and avoid creating it
manually. After our research we found a way which really helped us, so I want to share the details
with you in this blog article.
In the end, it is a very easy procedure. For this example we use the SYSTAT01 IDoc, which we need
for this article. Nevertheless, it works with other IDocs and also with IDoc enhancements.
1. After you have logged on to your SAP CRM/ERP, access transaction WE60 and enter the
IDoc name you want to download the XSD for.
2. Before submitting anything else, open the Documentation menu in the menu bar and
select XML Schema.
4. Now you see the XSD content. To download that, right click the screen and select View Source.
An editor window opens and when you save that file as *.xsd, you have the XSD-structure of the
iDoc.
5. Upload the downloaded XSD-file to your SAP CPI message mapping component as source
message.
And it's done. Now you can develop your CPI message mapping like any other message mapping
and complete your development.
Just one question: is there a special reason why you didn't use the report SRT_IDOC_WSDL_NS to
download the WSDL of the IDoc structure?
IDoc segments with maxOccurs up to 99999 (length 5) are supported in CPI, i.e.,
minOccurs="0" maxOccurs="99999" is fine.
If you find minOccurs="0" maxOccurs="999999",
replace it with minOccurs="0" maxOccurs="unbounded".
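Editing many exported XSDs by hand is tedious, so here is a minimal standalone Groovy sketch of that replacement; the fixMaxOccurs helper is not an SAP utility, just an illustration:

```groovy
// Replace any maxOccurs of 6 or more digits (beyond CPI's 5-digit limit)
// with "unbounded"; values up to 99999 are left untouched.
def fixMaxOccurs(String xsd) {
    xsd.replaceAll(/maxOccurs="\d{6,}"/, 'maxOccurs="unbounded"')
}

def line = '<xsd:element name="E1MARAM" minOccurs="0" maxOccurs="999999">'
println fixMaxOccurs(line)
// -> <xsd:element name="E1MARAM" minOccurs="0" maxOccurs="unbounded">
```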
I need to export the XSD file for an IDoc structure that has a custom extension. Since the WE60
transaction has a single select option for either the basic type or the extension, how do you get an XSD
for the complete IDoc structure?
You just need to set the extension in WE60. It gets you the full IDoc XSD, including the parent nodes.
-------------------------------------------------------------------------------------------------------------------------------
Create Context
Remove Context
Collapse Context
SplitByValue
UseOneAsMany
MapWithDefault
FormatByExample
Sort
SortByKey
CreateIf
getProperty
getHeader
Pagination in SAP CPI/CI: Pagination in SAP CPI/CI refers to the process of dividing a large dataset into
smaller, more manageable chunks, called pages, to improve performance and reduce the amount of
data transferred between systems.
Pagination is necessary in SAP CPI/CI because:
Large datasets can cause performance issues and slow down the integration process.
Transferring large amounts of data can lead to increased latency and network congestion.
Some APIs and systems have limitations on the amount of data that can be retrieved in a single request.
In SAP CPI/CI, pagination works by:
Dividing the dataset: The dataset is divided into smaller pages, each containing a fixed number of records.
Retrieving a page: The integration flow retrieves one page of data at a time, using a pagination token or
cursor to keep track of the current position in the dataset.
Processing the page: The integration flow processes the retrieved page of data and then requests the next
page, using the pagination token or cursor to retrieve the next set of records.
Client-side pagination: The client (i.e., the integration flow) handles pagination by specifying the page
size and page number in the request.
Server-side pagination: The server (i.e., the API or system being integrated) handles pagination by
returning a fixed number of records per page and providing a pagination token or cursor to retrieve the
next page.
$skip: Specifies the number of records to skip before returning the next page.
$snapshotSize: Specifies the size of the snapshot (i.e., the number of records to return per page).
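Putting this together, client-side paging over a hypothetical OData endpoint could issue requests like these ($top is the standard OData companion of $skip; host, entity and page size are made up):

```
Page 1: GET https://<host>/odata/v2/EmployeeData?$top=200&$skip=0
Page 2: GET https://<host>/odata/v2/EmployeeData?$top=200&$skip=200
Page 3: GET https://<host>/odata/v2/EmployeeData?$top=200&$skip=400
```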
Pagination in SAP CPI/CI provides several benefits:
Improved performance: By reducing the amount of data transferred and processed, pagination can improve
the performance of the integration flow.
Reduced latency: Pagination can reduce the latency associated with transferring large amounts of data.
Increased reliability: Pagination can make the integration flow more reliable by reducing the risk of
errors caused by large datasets.
SAP Cloud Integration provides an option to transport integration content directly to CTS+ system. You
can then transport this content from the CTS+ system to your target SAP Cloud Integration tenant.
Here's how you can transport content to CTS+.
2. Choose Transport.
If you don't see the Transport button, contact your tenant administrator to enable transport
option in the tenant settings.
In the Transport Comments prompt, you can see the type of transport under the Mode field configured
by the tenant administrator. Provide comments under the Comments section and choose Transport.
You see a prompt with the Transport ID. The integration package is transported to the CTS+ system.
In the Test/Source tenant, select the integration package that you want to transport.
Click on Transport.
Confirm Import with Date: Immediate and Import Options as: Leave Transport Requests in
Queue for Later Import.
The Integration package is imported into the Production tenant Account.
Note: In case of errors, you can check the logs by selecting the Import request and clicking on Logs
button.
--------------------------------------------------------------------------------------------------------------------------
Context
One of the options to transport content from one tenant to another is to use
the Export and Import options for your integration package. The application imports the integration
package to your local file system in the form of a .zip file. You can import the same file in the target
tenant using the Import option.
Procedure
2. Choose Export.
A .zip file is downloaded to the default browser download location on your local file system.
3. Log in to the SAP Cloud Integration web application to which you want to import the content.
4. Choose Import.
A new window opens in your file system explorer, allowing you to access your local file system.
5. Navigate to the folder path where the .zip file was downloaded in step 2.
You see a prompt indicating the successful import of the .zip file. You can see the imported
integration package in the Design tab of your target tenant.
How to set up an SAP BTP Integration Suite Trial Account, ready to use and develop iFlows?
SAP BTP Cockpit > Services > Service Marketplace > Integration
Testing Message Mapping
The mapping editor provides two ways of testing message mapping:
1. Simulate: The Simulate option enables you to test the entire mapping structure. The system
shows if the mapping contains any errors, giving you a chance to fix these errors before deploying the
integration flow. Once you complete the mapping, you can choose Simulate to run a simulation of the
mapping.
Display Queue: The display queue option enables you to test the mapping of a specific node. In
the Mapping expression area, provide a Test Input File and choose (Display Queue) to display the
simulated output for the provided test input file.
Even if the integration flow isn't in edit mode, you can run the Simulate and Display Queue tests. You can
hence perform the tests for configure-only content as well.
You can refer to the input XML file uploaded for simulation for Display Queue as well, and vice versa.
When you revise a mapping schema, be it the source or target message, existing mapping definitions
continue to remain as is. That is, if the revised schema doesn't include a previously used node, the
graphical editor doesn't automatically delete such unused nodes.
If elements belong to the same parent node, the elements are in the same Context. When the parent
node changes, a Context Change is inserted into the queue. Context Changes are shown in dark grey
color.
Therefore, when we talk about the Context of a message it is always based on a certain parent node. An
XML element can have different Context queues depending on the parent node it is related to.
Let’s use the Message Type below to understand Context Changes and Node Functions.
Element ‘Grade’ Context Changes based on higher level parent node ‘StudentDetail’.
Another difference is how these node functions handle SUPPRESS values in queues. While Remove
Context ignores the SUPPRESS values and does not copy them to output, Collapse Context copies them
as null [] values to output.
Collapse Context copies SUPPRESS Values as [].
In this situation both functions handle SUPPRESS values the same way. SUPPRESS values are considered
similar to any other value in the same context.
Both functions remove the Context Changes. The collapse Context function copies the first value from
each Context, and Remove Context copies all values of each context to output.
You can only remove these SUPPRESSED values using a UDF.
Although in this example I use SAP PI/PO, the functions behave the same way in SAP Integration Suite.
The concept remains the same in SAP CPI.
If you have any questions about Collapse Context and Remove Context node functions, please, leave a
comment below. I will be happy to help.
Set Dynamic Adapter Parameters (File Name, Directory) – SAP BTP IS-CI (CPI)
In SAP Cloud Integration Suite (BTP-IS/CI/CPI), configuring dynamic file names is not only possible but
can be done using a variety of techniques.
In the past, I wrote some articles on defining dynamic parameters such as filename, directory, etc in the
receiver file adapters in SAP PI/PO. There are several techniques like ASMA and Variable Substitution.
However, SAP Integration Suite CI (BTP-IS/CI/CPI) technique is more straightforward.
We define the filename with a unique time stamp and copy the file name prefix from the incoming file.
Imagine a scenario where you have files with different file name prefixes in a certain directory in the
SFTP server. I want to build an iFlow that can fetch and route these files to the target based on their file
name prefix.
For example, files starting with “Order” should be moved to “Orders” target folder on SFTP server.
Invoices to “Invoices” folder and all other files to “Other” folder.
In this scenario, we will make use of the following features of SAP Integration Suite interface
development techniques,
Step 3 – Make Use of Exchange Parameter or Header Parameter to Set the Directory
Let’s make use of content modifiers to determine the directory at runtime. We will have an exchange
property parameter named “directory” to set the value of the directory.
Step 4 – Configure the Receiver Adapter Using Dynamic Parameters
Make use of the Exchange Parameter to define the target directory in the receiver adapter
configuration.
We will make use of a couple of Camel Expressions to define the filename dynamically.
Camel Simple Expression to get the prefix or the filename of the incoming file without the extension.
${file:onlyname.noext}
Camel Simple Expression to add the date as the 2nd part of the file name in the format yyyy-MM-dd.
${date:now:yyyy-MM-dd}
Camel Simple Expression to add the time as the 3rd part of the file name in the format HH-mm-ss
(in date patterns, mm is minutes and ss is seconds; MM would be the month).
${date:now:HH-mm-ss}
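Putting the three expressions together, the File Name field of the receiver adapter could be set as follows (the underscores and the .xml extension are assumptions; note mm/ss, the date-pattern codes for minutes and seconds):

```
${file:onlyname.noext}_${date:now:yyyy-MM-dd}_${date:now:HH-mm-ss}.xml
```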
Other Methods of Setting a Dynamic File Name in SAP Integration Suite CI (BTP-IS/CI/CPI)
In the example, we made use of a custom Exchange/Header Parameter, a standard header parameter
and a Camel Simple Expression to dynamically define the directory and filename at the receiver adapter.
Use an XPath expression or other methods to extract the values you need from the incoming
message payload and assign them to header or exchange property parameters.
To summarize, we can make use of standard and custom Header/Exchange Property parameters to
determine different receiver adapter parameters. Not only use parameters, but you can also use Simple
Expressions to assign custom values during runtime.
This function UseOneAsMany works the same way in both the new SAP Integration Suite (CPI) and older
PI/PO.
The key to learning UseOneAsMany node function is to understand the Input variables and the rules of
using them.
SAP PO 7.5
SAP Integration Suite on BTP (CPI)
Input and Output of UseOneAsMany Node Function.
UseOneAsMany has 3 inputs or arguments. The output of the UseOneAsMany function is derived from
these three inputs.
Rule 1: Input 1 should not contain repeating values in the same context.
Rule 2: The total number of context changes in Input 1 and Input 2 should be equal.
Rule 3: The number of values in Input 2 should be equal to Input 3.
Rule 1: Input 1 cannot have repeating values.
Each context of Input 1 should have only 1 value.
Input 1 is the value from the source message that we are trying to repeat in the target, therefore, each
context of Input 1 should have only 1 value.
Here are some examples of how the first argument of the UseOneAsMany function should be
assigned.
In this context, values are repeated. There are two values in the same context. This is an invalid value
assignment for argument 1 of this node function.
Rule 2: Input 1 and Input 2 Should Have the Same Number of Context Changes.
The number of context changes in Input 1 and Input 2 should be equal.
Here are some examples of correct and incorrect assignments of Input 1 and Input 2 of the node
function.
The number of context changes in Input 1 and Input 2 are equal. Total of 2 context changes.
Context changes of Input 1 and Input 2 are equal. A total of 3 context changes
Incorrect assignment of Input 1 and Input 2 of UseOneAsMany. Context changes are not equal.
If you follow these rules when assigning the inputs, you will not come across any Queue Exceptions.
Purchase Order (PONumber) should be repeated at target for each line item.
Source and Target Messages:
UseOneAsMany Mapping:
Input 1 is PONumber in the Orders context. Argument 2 is assigned from LineItem, and argument 3 is
assigned from the ItemNo segment.
Context of the Input 1 <PONumber> is <Orders>.
Input 1 does not have any repeating values. Only one purchase order number for each context. The first
purchase order number PO1 is repeated 3 times and the Second purchase order PO2 is repeated once.
Input 3 of the function defines the context changes of the target element. Notice that the total number
of values in Input 2 is equal to the total number of values in Input 3.
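The queues of this example can be sketched as follows, with '|' marking a context change; the item values I1..I4 are illustrative:

```
Input 1: [PO1 | PO2]            one value per context (Rule 1)
Input 2: [I1, I2, I3 | I4]      same number of context changes as Input 1 (Rule 2)
Input 3: [I1, I2, I3 | I4]      same number of values as Input 2 (Rule 3)
Output:  [PO1, PO1, PO1 | PO2]  PO1 repeated 3 times, PO2 once
```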
Example 2: Repeat Product ID at Target Message using UseOneAsMany.
Here we have a source message with product header details. A product can have multiple names. A
target message should be created with each product name and corresponding product ID (ProductID).
For Input 3 we use SplitByValue node function to define the desired context changes.
Notice that we have adhered to the rules 1, 2 and 3 of UseOneAsMany function. No repeated values in
argument 1 which is the Product ID queue.
Argument 2 defines the number of times Product ID should be repeated. Inputs 2 and 3 contain the
same number of context changes in the queue.
Go to T-Code : BD54
Go to T-Code : SALE => Logical Systems => Assign Logical System to Client
Step 2 : Define Http connection to External Server (Type G)
This step requires that a connection from SAP ERP to CPI is possible, so you have to contact your
administrator or Basis team to configure this connection through the firewall or proxy.
Go to T-Code : SM59
Go to T-Code : WE21
Go to SAP BTP => Account => SAP Integration Suite application => Monitor => Key Store
Go to T-Code : SM59
ICM_HTTP_PROXY_CONN_REFUSED : this message means SAP ERP cannot connect to SAP CPI. Kindly contact
your administrator to resolve the issue, e.g., open the port from SAP ERP to SAP CPI through the firewall.
Go T-Code : WE20
Create new Partner Type : LS
Test scenarios
T-Code : WE19
T-Code : SM59
The third party will send XML data with the ORDERS05 structure (sales order) to SFTP. In fact, for
simplicity we will use Postman to send the XML to CPI.
In the integration flow, for simplicity we just use one HTTPS sender adapter and one IDoc receiver
adapter.
Finally, after the data is sent successfully to SAP ERP, we will check that the sales order was created.
Ok, here we go
Go to T-Code : SICF
F8 to execute.
Step 2 : Register service
Go to T-Code : SRTIDOC
Choose Register Service. Press F8 to run with default
Go to T-Code : SICF
http://host:port/sap/bc/srt/idoc?sap-client=<clientnumber>
Go to T-Code : SALE
Go to T-Code : WE20
Choose create and input information in step 4 – local system as Partner No.
Step 6 : Test service from POSTMAN
Go to T-Code : SOAMANAGER
Case 2 : Wrong user/password, or the user does not have the required roles
Case 3 : It's OK. The IDoc will be created in SAP ERP. Check T-Code : WE02
SAP Cloud Connector Configuration
Create
Step 1 : Create a Credential Name with the user/password used to create IDocs on SAP ERP. This user has
to be created in SAP and have roles that allow creating IDocs and posting documents.
Go to SAP BTP
Go to Monitor
Case 1 : No body
Payload
POSTMAN :
Name Value
Name SapMessageId
Check control record => details. We will see one text look like
Check on LOG of CPI
[SAP CPI] – SCENARIO FOR RFC RECEIVER ADAPTER WITH CLOUD CONNECTOR
I want to discuss a scenario: how to send an XML message from a third-party system to the SAP backend
with the RFC receiver adapter, SAP Cloud Connector and SAP CPI. This scenario is used for integration with
an SAP backend system over an RFC connection. For clarity, kindly take a look at this diagram.
A. SAP Cloud Connection Configuration
If we check the option Send Confirm Transaction in the integration flow, we have to add the two
following function names :
– BAPI_TRANSACTION_COMMIT
– BAPI_TRANSACTION_ROLLBACK
Everything will look as after done
Go to SAP BTP
Click Destinations on left side menu. Under Connectivity
Click button New Destination
(1) : Name. This name will be used as the connection name in the RFC receiver adapter of the integration flow
(2) : Type : RFC
(3) : Proxy Type : OnPremise
(4)(5) : User / password of SAP ERP. This user must have the appropriate roles
Now, we have to add some properties to the destination created in step 1. Add the following property:
jco.client.lang – language, e.g., EN
C. Issues
Issue 1 : If the request body has an invalid XML structure for the function module, we'll receive this
message. When this issue occurs, check the XML structure of the function module and fix it.
D. Test case
<ns0:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL
xmlns:ns0="urn:sap-com:document:sap:rfc:functions">
<FLIGHT_KEY>
<AIRLINEID/>
<CONNECTID/>
<FLIGHTDATE/>
</FLIGHT_KEY>
</ns0:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL>
<rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Exception
xmlns:rfc="urn:sap-com:document:sap:rfc:functions">
<Message>
<ID/>
<Number/>
</Message>
<Attributes></Attributes>
</rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Exception>
Case 2 : Invalid date
<rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Exception
xmlns:rfc="urn:sap-com:document:sap:rfc:functions">
<Name>FLIGHT_NOT_FOUND</Name>
<Text>FLIGHT_NOT_FOUND</Text>
<Message>
<ID>BC_IBF</ID>
<Number>055</Number>
</Message>
<Attributes>
</Attributes>
</rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Exception>
Case 4 : OK
<ns0:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL
xmlns:ns0="urn:sap-com:document:sap:rfc:functions">
<FLIGHT_KEY>
<AIRLINEID>LH</AIRLINEID>
<CONNECTID>9981</CONNECTID>
<FLIGHTDATE>20021221</FLIGHTDATE>
</FLIGHT_KEY>
</ns0:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL>
<rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Response xmlns:rfc="urn:sap-com:document:sap:rfc:functions">
<FLIGHT_AVAILABILITY>
<ECONOMAX>320</ECONOMAX>
<ECONOFREE>308</ECONOFREE>
<BUSINMAX>20</BUSINMAX>
<BUSINFREE>19</BUSINFREE>
<FIRSTMAX>0</FIRSTMAX>
<FIRSTFREE>0</FIRSTFREE>
</FLIGHT_AVAILABILITY>
</rfc:SXIDEMO_AIRL_FLIGHT_CHECKAVAIL.Response>
[SAP CPI] – INTEGRATION WITH KAFKA IN CPI
We will explore one more adapter in CPI – the Kafka adapter. In particular, we will explore how to send a
file to Kafka and receive a file from Kafka. Yes, it's similar to SFTP.
Step 1 : Build the Kafka service
First, we need to build a server with the Kafka service installed. Fortunately, there is a free 30-day trial
service on the cloud. Kindly go here to create the Kafka service on the cloud.
Download 3 files
Using the OpenSSL tool, run this command line to create a key store (P12 file):
openssl pkcs12 -export -name server-cert -in service.cert -inkey service.key -out dev-cpi-server-key-store.p12
Enter a password. This password will be used in a later step.
When done, the file will be in the same folder as service.cert, ca.pem and service.key.
In this step we will use key store which created in Step 2 and add it into SAP CPI.
Monitor – Keystore
Add – KeyPair
(1) : File P12 which created in step 2
(2) : Password in step 2.
Step 4 : Download the server certificate of Kafka and import it into CPI
Next, we connect to the Kafka host with the key pair from step 3 to download the Kafka certificate, and
then import this certificate into CPI.
Data sent from S/4HANA and saved into Kafka in JSON format
Unit Test
Data sent from S/4HANA and saved into Kafka in binary format
NOTE
With this format, we use the Base64 Encode component to convert the XML payload from S/4HANA to
Base64 binary.
With this format, the third-party system creates the payload in XML format, then converts the XML
payload to Base64 and sends it to Kafka.
CPI collects the data from Kafka with the Kafka sender adapter and sends it to S/4HANA via the IDoc
receiver adapter.
Payload XML
Convert to base64 and save to Kafka
NOTE
After data is collected from Kafka and sent to S/4HANA, the message in Kafka is not deleted; it is marked
as consumed (the consumer offset is advanced), so it will not be sent to S/4HANA again.
[SAP CPI] – MONITORING MESSAGES IN CPI AND S/4
Hi guys, in this article I want to share a tip about monitoring messages in CPI and S/4. As you know, monitoring and searching messages is important in maintaining an integration system and fixing issues when they happen. For example, when a non-SAP system sends an XML message to S/4 through CPI using the IDoc receiver adapter, how do we get the CPI message ID in S/4, and how do we get the S/4 IDoc number in SAP CPI? Another example: in an outbound IDoc scenario, S/4 sends an XML message to a non-SAP system through CPI using the IDoc sender adapter; how do we get the S/4 IDoc number, or some data from the IDoc data records, in SAP CPI? I will answer these questions in this article.
In CPI, we have several headers that are used to search for messages:
SAP_ApplicationID – here you can put content from your payload, such as an invoice number. This value is displayed in the Application Message ID field in the MPL search.
SAP_MessageType – here you can put extra data such as the message type, basic type, or extension. This value is displayed in the Application Message Type field in the MPL search.
SAP_Receiver – here you put the name of the receiver system. This value is displayed in the Receiver field in the MPL search.
SAP_Sender – here you put the name of the sender system. This value is displayed in the Sender field in the MPL search.
SAP_MessageProcessingLogCustomStatus – here you can put extra data for the Custom Status field. This must be configured in the Exchange Property tab.
In addition, you can use a Groovy script to add a custom header to the MPL search, as below:
import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    def messageLog = messageLogFactory.getMessageLog(message);
    def IDOCNUM = message.getHeaders().get("IDOCNUM");
    if (messageLog != null && IDOCNUM != null) {
        messageLog.addCustomHeaderProperty("IDOCNUM", IDOCNUM.toString());
    }
    return message;
}
Scenario: a POS/non-SAP system sends a message to S/4 through CPI, and in the CPI MPL we have to log some data.
In this scenario, we also use a Groovy script to pass header values from outside into the message mapping, so that the fields of the control record (EDIDC_40) can be resolved dynamically across environments:
TABNAM
MANDT
DIRECT
IDOCTYP
MESTYP
SNDPOR
SNDPRT
SNDPRN
RCVPOR
RCVPRT
RCVPRN
ARCKEY : urn:sap.com:msgid=<messageID>
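To make the idea concrete, here is a hedged sketch in Python of assembling those control-record values from environment-specific headers. All field values below are made-up examples; in the real iFlow this resolution happens inside the message mapping via a Groovy script.

```python
# Hedged sketch: building the EDIDC_40 control-record values from headers
# that differ per environment. All values used below are illustrative only.
def build_control_record(headers: dict, message_id: str) -> dict:
    return {
        "TABNAM": "EDI_DC40",
        "MANDT": headers["MANDT"],
        "DIRECT": headers["DIRECT"],
        "IDOCTYP": headers["IDOCTYP"],
        "MESTYP": headers["MESTYP"],
        "SNDPOR": headers["SNDPOR"],
        "SNDPRT": headers["SNDPRT"],
        "SNDPRN": headers["SNDPRN"],
        "RCVPOR": headers["RCVPOR"],
        "RCVPRT": headers["RCVPRT"],
        "RCVPRN": headers["RCVPRN"],
        # ARCKEY links the IDoc back to the CPI message, as described above
        "ARCKEY": f"urn:sap.com:msgid={message_id}",
    }

record = build_control_record(
    {"MANDT": "100", "DIRECT": "2", "IDOCTYP": "DEBMAS07", "MESTYP": "DEBMAS",
     "SNDPOR": "POSPORT", "SNDPRT": "LS", "SNDPRN": "POSSYS",
     "RCVPOR": "SAPS4H", "RCVPRT": "LS", "RCVPRN": "S4CLNT100"},
    "demo-cpi-message-id",
)
print(record["ARCKEY"])
```

The benefit is that only the header values change per environment (dev, QA, prod), while the mapping logic stays identical.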
/* Refer the link below to learn more about the use cases of script.
https://help.sap.com/viewer/368c481cd6954bdfa5d0435479fd4eaf/Cloud/en-US/
148851bf8192412cba1f9d2c17f4bd25.html
If you want to know more about the SCRIPT APIs, refer the link below
https://help.sap.com/doc/a56f52e1a58e4e2bac7f7adbf45b2e26/Cloud/en-US/index.html */
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
import com.sap.it.api.mapping.*;
import com.sap.it.api.mapping.MappingContext;

def String getHeaderValue(String headername, MappingContext context) {
    def headervalue = context.getHeader(headername);
    return headervalue;
}
This Groovy script receives the header name from a Content Modifier in the iFlow and returns the header value. This way, we can set the values of the IDoc's EDIDC_40 control record dynamically.
With the IDoc receiver adapter, we should use a Request Reply step to get the response from S4 back into CPI.
This is the response payload.
So, after this step we use a Content Modifier and a Groovy script to set the custom header.
Add a header named IDOCNUM with the XPath value //ns2:DbId/text() and data type java.lang.String.
Pay attention to the namespace prefix ns2: we have to define the namespace prefix mapping.
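As an illustration of what that XPath resolves against, here is a minimal Python sketch. The element names and the ns2 namespace URI below are assumptions standing in for the actual response payload; use the namespace shown in your own response.

```python
# Sketch: extracting the IDoc number (DbId) from the IDoc adapter's response,
# the same value that //ns2:DbId/text() selects in the Content Modifier.
# Element names and the namespace URI below are illustrative assumptions.
import xml.etree.ElementTree as ET

response = (
    '<ns2:Response xmlns:ns2="urn:sap-com:document:sap:idoc:messages">'
    "<ns2:DbId>0000000000123456</ns2:DbId>"
    "</ns2:Response>"
)

# The prefix-to-URI mapping plays the role of the namespace prefix
# configuration in the iFlow.
ns = {"ns2": "urn:sap-com:document:sap:idoc:messages"}
idoc_number = ET.fromstring(response).findtext("ns2:DbId", namespaces=ns)
print(idoc_number)
```

If the prefix mapping is missing or points at the wrong URI, the XPath simply matches nothing, which is the most common reason the custom header comes out empty.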
Everything else is the same as in the inbound scenario. For the outbound scenario, sending message type DEBMAS from S4 to the POS/non-SAP system, we set the headers in a Content Modifier,
and add one field in the Exchange Property tab for the Custom Status:
SAP_MessageProcessingLogCustomStatus
This is the result:
SUMMARY
In this article, I talked about monitoring messages in CPI with headers. I also showed how to set dynamic values for the IDoc control record by passing header values from outside into the message mapping with a Groovy script. Thank you for reading, and please leave any advice in the comments. Thanks.
[SAP S/4 HANA CLOUD] – HOW TO SEND DATA FROM S/4 HANA CLOUD INTO SAP CPI
Hi guys, as you know we will upgrade to S/4HANA Cloud in the near future, and as integration consultants we need to explore how to integrate third-party systems with S/4HANA Cloud through SAP CPI.
In this article, I want to share the first scenario: how to send data from S/4HANA Cloud to an external system through CPI by using DRF, the Data Replication Framework.
Administrator: SAP_BR_ADMINISTRATOR
Enter anything in the hostname field and save. We will come back to this screen after configuring the artifact on SAP CPI.
Go to SAP CPI
Create package
Create Artifact
In this step, we need to create a service key in CPI. This is similar to configuring an outbound user from an on-premise / RISE SAP system in SM59.
Copy the Client ID and Client Secret for the next step.
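As a sketch of how the Client ID / Client Secret pair is used: when the caller (here S/4HANA Cloud) authenticates against the CPI endpoint, one common option is to send the pair as HTTP Basic credentials. The minimal Python illustration below uses made-up credential values.

```python
# Sketch: turning the service key's Client ID / Client Secret into an HTTP
# Basic Authorization header, one common way a caller authenticates against
# the CPI endpoint. The credential values below are made up.
import base64

def basic_auth_header(client_id: str, client_secret: str) -> dict:
    raw = f"{client_id}:{client_secret}".encode("utf-8")
    token = base64.b64encode(raw).decode("ascii")
    return {"Authorization": f"Basic {token}"}

header = basic_auth_header("sb-demo-client", "demo-secret")
print(header["Authorization"][:6])  # "Basic "
```

Depending on how the iFlow endpoint is secured, the service key may instead be used to fetch an OAuth token; either way, the Client ID / Client Secret pair is the credential material.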
UPDATE THE OUTBOUND USER AND SOAP ENDPOINT IN THE COMMUNICATION SYSTEM APP ON S/4 CLOUD
Go to S/4 Cloud
Edit
Save
Create new
Select the scenario as in the first step: SAP_COM_0008.
We need to send Business Partner data, so we select and configure the Business Partner object.
We have to define a Replication Model and an Output Mode for every object that we want to send from S/4 Cloud to the external system, in the Additional Properties section:
P: Pooled Output
D: Direct Output
In this section, we test the scenario of sending data to SAP CPI. To do this, we have to assign the role BR_ADMINISTRATOR_DATA_REPL to the user.
This is the user used for communication between the systems. For every external system, we should create a separate communication user.
The OUTBOUND user is created on the external system and used in S4HC for the outbound service.
In an integration landscape there will be many external systems, e.g. POS, CPI, WMS, etc. For each external system, we have to create one on S4HC.
(2) – Here we use SAP CPI to receive data from S4HC, so this user is the Client ID / Client Secret from the service key of the iFlow.
STEP 03 – COMMUNICATION ARRANGEMENT
After selecting the communication scenario on the API hub, we use the COMMUNICATION ARRANGEMENTS app to create the arrangement.
NOTE
If you cannot find the scenario here, check whether the scope item has been activated by SAP.
(1) – Select the SYSTEM that is integrated with S4HC in this scenario.
When you select the SYSTEM, the INBOUND and OUTBOUND users are filled in automatically.
(1) – Input the path. We will get this path from SAP CPI after deploying the iFlow.
Because we selected the SOAP method, we need the HOST and SOAP RESOURCE of the external system. Here we use SAP CPI, so the iFlow endpoint will look like this:
STEP 04 – CONFIGURE THE OUTPUT TYPE
Next, we configure the output type by using the OUTPUT PARAMETER DETERMINATION app.
For this configuration, we just focus on OUTPUT TYPE, RECEIVER, and CHANNEL.
(1) – Rule for PURCHASE ORDER
In an integration scenario, we need to monitor all messages sent from S4HC, and we configure this by using the message dashboard.
First, we need to add the namespace for the user by using the ASSIGN RECIPIENTS TO USERS app.
UNIT TEST AND MONITORING
We will send a purchase order to the external system by using the MANAGE PURCHASE ORDERS app.
We will monitor the message by using the MESSAGE DASHBOARD app.
Monitoring on receiver system – SAP CPI
Payload
Today I will share with you some thoughts on integration approaches and performance. That is not a new topic for you guys when designing an integration landscape with third-party systems.
We all know that integration design depends on the systems' technology, enhancement capabilities, project timeline, cost, and the adoption of systems in the landscape.
In this topic, I will not discuss the dependencies mentioned above. I will share my opinions about how to design an integration landscape that is flexible, scalable, able to work offline, and independent.
Within the limits of my knowledge, please note that this article is "my opinions"; please share your own opinion in the comments below the topic.
Talking about integration, we think about documents/objects transferred from System A to System B and vice versa. That's right: data is exchanged from system to system across different data stores (databases) and different system technologies.
We can see data exchange in daily life. A customer places an order on an e-commerce website, and the data is sent to a store/warehouse. The storekeeper picks and packages the goods and puts them in a temporary location; the delivery man receives the delivery order and delivers it to the customer. The customer receives the goods and confirms the order and payment. Basically, the objects exchanged between seller and customer are the goods.
All sales activities are captured in systems: the e-commerce system can be an in-house application, the Order Management System (OMS) can be in-house or from a software vendor, and the Delivery Management System (DMS) can be in-house or from a vendor.
To represent the business process objects exchanged from system to system, we can simply call them "Messages".
The example above is similar to synchronous messages. The lady on the left sends a message to the lady on the right and always waits for her response. We can think of the whole conversation as a "Session".
The conversation between the two ladies happens in the same time block, so we can call it "Real-time".
Synchronous system-to-system message exchange is similar to a conversation between two ladies: System A sends a message to System B and always waits for a response from System B to finish the session.
Web-service
Database Direct
[1.2] – Asynchronous Messages
A mailman goes to houses and delivers newspapers to mailboxes on schedule. He does not care whether the house owner has received them or not; he moves on to the next house and puts the newspapers in its mailbox.
The messages here are the newspapers; each time he puts newspapers into a mailbox, he has completed the "session". Each hour, he can deliver many newspapers to many houses.
In some scenarios involving valuable goods, the mailman needs a sign-off from the receiver.
The mailman sends the mail and the house owner receives it in a different time block, so we can call it "Offline".
Very simple concepts, right? Data exchange between systems follows the same concept for asynchronous messages: System A can send out messages to many receiving systems without expecting any response.
In the integration design, files can be exchanged from System A to System B as often as needed, and it does not matter when or how System B receives the files.
File transfer
Send Email
Plant Master
Vendor Master
Customer Master
Material Master
Promotion Definition
Selling Price
Assortment.
…
Almost all master data objects have a defined structure for exchange with non-SAP systems, the "SAP IDoc"; the objects' values are populated into the IDoc structure.
New and changed data is recorded in the SAP ERP system in the "Change Pointer" data store.
Why shouldn't we use web services to exchange master data (synchronous integration)?
Many legacy / third-party systems receive master data from SAP ERP with different data requirements; developing or consuming web service APIs would be decentralized.
Master data in retail is huge and can change regularly; the performance of SAP ERP would depend on the receiving systems.
Information leakage: SAP ERP would expose all object data as a web service provider; otherwise we would have to develop an individual API for each receiving system.
When master data changes in SAP ERP, the third-party systems won't know exactly which object changed. In some scenarios, the third-party system has to fetch all the data on a schedule and filter out the changed objects, which causes unwanted data transfer between SAP and the non-SAP system.
Bottlenecks during message exchange.
SAP has built-in IDocs for master data, so we can re-use the data structures.
Each partner system can have different data requirements; we can re-use the original messages with filtering/mapping to adapt to specific requirements.
Huge volumes of master data can be sent to the non-SAP system without any performance dependencies.
SAP detects changed objects and only sends them from the "Change Pointer" container, which reduces redundant data.
Flexibility to control field mapping with the third-party system.
Reduced development tasks.
Message splitting helps keep data consistent.
End of Part 1
Integration objects
There are many documents that will be exchanged from system to system, between SAP and non-SAP systems.
Customers place sales orders on the front-end system (POS, mobile app, e-commerce website, or other touchpoints…).
Customers make payments.
Customers confirm received goods.
Customers query loyalty information.
Customers query promotion programs.
Customers pick up goods at the store.
Delivery Processing.
Services Orders Documents.
Deposit Documents.
Credit Notes Documents.
Installment Documents.
Reservation Documents
E-Invoice Documents
Stock Availability Check.
…
Analysis Transactions
When you look at the list of documents above, grouped into front-end, back-end, and analysis, and at which documents are handled by SAP or by third-party systems, you will realize that not all documents need to be transferred immediately, for example "Delivery Orders (DO)".
Cash and Carry: Customers choose items and pay at the store; receipts are confirmed after customers pay at the cashier. For grocery or FMCG, many customers come to the supermarket and pay at the same time, especially on weekends. If we designed this with synchronous messages, then any network connection issue or HQ/SAP server performance issue would hugely impact the business and sales opportunities. The sales data does not need to be transferred to the backend system immediately; the highest priority is to let the POS capture sales as fast as possible.
Click and Collect – Home Delivery: In contrast to Cash and Carry, customers select items on online touchpoints (mobile app, e-commerce website) and do not know the actual physical inventory of the items. The front-end system needs to check inventory immediately against the back-end system. Then, after the order is confirmed in the system, the delivery process is performed on schedule. We do not need to create delivery orders immediately, but we do need to reserve the goods immediately to reduce out-of-stock sales.
In the scenarios above, "Cash and Carry" and "Home Delivery", we note that for the transactions the customer interacts with directly, we can use synchronous message transfer.
Reservation: Reserve goods to make sure that stock is available to deliver to the customer. The reservation needs to execute immediately when the customer confirms the order.
Stock Availability Check: Consumers want to see stock availability on online touchpoints. A synchronous message needs to be used to get the current stock indicators from the system.
Payment and Financial Transactions: Accounting wants to capture revenue as soon as possible.
Sales per receipt in the Cash and Carry model: many receipts are created in the system at the same time and the cashier needs to scan items as fast as possible, so the POS needs to perform queries and return results as quickly as possible.
Transactions related to inventory movements in retail are also high-volume and do not impact the business directly. Store/DC operators can capture the movement transactions and the systems will exchange them on a schedule.
Analysis Transactions
Analysis transactions are sent to the analysis system when the source system finishes a business process. To make sure there is enough information and data for the calculations, the analysis system needs to collect as much as possible, so the data transfer from ERP to the analysis system can be very large. Synchronous transfer is not necessary for this kind of transaction.
SFTP questions:
Assume we are getting order details from the source system, but the business needs the order number in the file name on the target system. Can we implement this requirement in SAP CPI, and if yes, how?
How do we populate file names dynamically using the FTP/SFTP adapter?
Assume we are getting order details from the source system, but the business needs a timestamp (yyyy/mm/dd) in the file name on the target system. Can we implement this requirement in SAP CPI, and if yes, how?
How do we pick files from multiple folders in CPI through the SFTP adapter?
How do we zip multiple files in SAP CPI and send them to the target as a zip archive?
How do we use the SFTP adapter to poll multiple files from different folders?
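For the dynamic file-name questions above: in CPI this is typically done by setting the CamelFileName header before the SFTP receiver, e.g. in a Content Modifier with an expression such as Order_${date:now:yyyyMMdd}.xml, or in a Groovy script. A minimal Python sketch of the naming logic, with a made-up order number and pattern:

```python
# Sketch of building a target file name from an order number and a timestamp.
# Note: slashes (yyyy/mm/dd) would be interpreted as sub-directories on most
# SFTP servers, so a separator-free date pattern is used here instead.
from datetime import datetime

def build_file_name(order_no: str, now: datetime) -> str:
    return f"Order_{order_no}_{now.strftime('%Y%m%d')}.xml"

# A fixed timestamp keeps the example reproducible.
name = build_file_name("4500001234", datetime(2024, 1, 31))
print(name)  # Order_4500001234_20240131.xml
```

In the iFlow, the order number would come from the payload (e.g. via an XPath in a Content Modifier) and the result would be written into the CamelFileName header.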
-----------------------------------------------------------------------------------------------------------------------------------
How do we transport an iFlow from QA to Prod using CTS+ in SAP CPI, and how do we explain it in an interview?
Say the client provided some IDocs and said they failed to reach the receiver system. How do you reprocess them from CPI, given that we don't have the IDoc numbers stored in CPI?
End-to-end flow from design to production deployment: what work do we do in each environment, in detail, and how do we explain it in an interview?
Exception handling using a data store and JMS in detail – business use case.
A brief introduction to CPI and one complex scenario.
Value Mapping, Data Store, Multicast – a business use case for each.