Looping Process Call


Looping Process Call: Adds a looping process call step to repeatedly execute the steps defined in a local
integration process until the condition is met or the maximum number of allowed iterations is reached,
whichever comes first.
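
A minimal sketch of the step's configuration, assuming the local integration process sets a property named hasMoreRecords on each iteration (both the property name and the iteration limit are assumptions):

    Local Integration Process:  <the subprocess to repeat>
    Condition Expression:       ${property.hasMoreRecords} = 'true'
    Max. Number of Iterations:  20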

Idempotent Process Call: Adds a process call step that checks for duplicate message IDs and conditionally
executes the steps defined in a local integration process.

Poll Enrich: Polls an external system (e.g., an SFTP server) and enriches the incoming message with the polled content.

Request Reply: A synchronous call to an external system that waits for the response.

Content Enricher: Combines the incoming message with additional data retrieved from an external system.

Encryptor: Encrypts the content of the incoming message body.

Decryptor: Decrypts the encrypted data contained in the incoming message body.

Signer: Digitally signs the message content.

Verifier: Verifies the signature of the message content.

Splitter: Breaks down a composite message into a series of individual messages.

General Splitter: Breaks down a message into individual messages, keeping the encapsulating (root)
element.

Iterating Splitter: Breaks down a message into individual messages without the encapsulating element.

Router – default route

 We can specify any number of branches, each branch having a condition
 If the incoming message satisfies a condition, it is routed to the corresponding branch

 A Splitter should always be followed by a Gather


 Multicast: should be followed by the combination of Join + Gather

Aggregator: Used when the source delivers more than one incoming message and they need to be combined into a single message.

Simulate vs simulation

Simulate – available only within Message Mapping

Simulation – can be performed between palette functions (steps) of an integration flow


Test the Message Mapping

 Display queue
 Simulate

Trace is enabled for only 10 minutes; once enabled, the trace data is visible in the message logs for up to 1 hour.

Manage Security

 Keystore: add certificates, key pairs, and SSH keys


 PGP Keys: we can add and download public keys / secret keys
 Security Material: here we create user credentials, OAuth credentials, etc.

COMPLETED: Message has been delivered to receiver successfully.

PROCESSING: Message is currently being processed.

RETRY: Status retry is set if an error occurred during message processing, and a retry was automatically
started.

ESCALATED: During message processing an error occurred, and no retry has been triggered. For
synchronous messages, an error message is sent to the sender.

FAILED: Message processing failed, message hasn’t been delivered to receiver, and no retries are
possible. In other words: FAILED is a final status, and message processing ultimately has failed.

CANCELLED: Manual cancellation of entries in the JMS queue - MPL is set to status cancelled.

DISCARDED: For scheduler triggered integration flows, the MPL is shown on the worker node where the
message processing started first. For all subsequent message processing starts, the message status is set
to DISCARDED.

For example, assume that an integration flow is initiated by a Timer event (scheduler) and a worker
node goes into an out of memory state. The system starts the worker node again and synchronizes the
integration flow as soon as the node is active. The message is restarted in that case, and a new message
ID generated. The message with the original ID goes into status DISCARDED.
ABANDONED: Message processing was interrupted or log wasn't updated for an uncommonly long time.
The status can change in case processing is resumed.

"Failed" indicates a complete inability to process the message, while "Escalated" indicates that the
message encountered an issue that needs further investigation but has not outright failed. Both statuses
are important for monitoring and maintaining the health of integrations in SAP CPI.

Failed

This status indicates that the message processing encountered a critical error that prevented it from
being processed successfully. Common reasons for a failure might include configuration errors, missing
mappings, or connectivity issues.

There are two types of nodes:

In SAP Cloud Platform Integration (CPI), the main difference between the tenant management node and the
runtime node is that a developer designs and modifies content on the tenant management node, while on the
runtime node messages are processed and the developer can only check their status.

Variable :

Q: What is the difference between a local variable and a global variable?


A: A local variable can be accessed only within the same iFlow. A global variable can be accessed from different iFlows.

Q: How do you read a local or global variable?


A: Use a Content Modifier to read the variable into either a header or a property.

Q: How do you write a variable?


A: In the iFlow, use the 'Write Variables' step; the value can be taken from a header, property, XPath, or expression.
Q: Can a local and a global variable have the same name?
A: Yes, since the scopes of local and global variables are different.

Q: How do you do delta synchronization via timestamp?


A: Use a variable to remember the last processed timestamp, so that the next scheduled run resumes from
that timestamp onward.
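
A minimal Groovy sketch of this pattern (the property names lastRunTimestamp and currentRunTimestamp, the date format, and the filter field are assumptions; a preceding Content Modifier is expected to fill lastRunTimestamp from the variable, and a later 'Write Variables' step persists currentRunTimestamp):

    import com.sap.gateway.ip.core.customdev.util.Message

    def Message processData(Message message) {
        // 'lastRunTimestamp' is assumed to be read from the variable by a
        // preceding Content Modifier (empty on the very first run).
        def last = message.getProperties().get("lastRunTimestamp") ?: "1970-01-01T00:00:00"

        // Hypothetical OData-style delta filter built from the last run.
        message.setHeader("deltaFilter", "lastModifiedDateTime gt datetime'" + last + "'")

        // Remember when this run started; write it to the variable only after
        // all processing has succeeded (see consideration 4 below).
        message.setProperty("currentRunTimestamp", new Date().format("yyyy-MM-dd'T'HH:mm:ss"))
        return message
    }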

Q: What needs to be considered when designing delta synchronization via timestamp?


A: (1) Data should be sorted by timestamp.
(2) The timestamp should be unique enough (e.g., only a date without a time might not work).
(3) The right date field should be used for the delta synchronization.
(4) Only update the last processed timestamp in the last step, if all processing succeeded.
(5) Consider the timer/scheduler interval.

Q: What if I need to revert to an earlier timestamp?


A: Build a manualRun/adhocRun flag into the same iFlow to set a manual timestamp that overrides the value in the variable.

Q: Should I use a global variable or a local variable?


A: Use a global variable if other iFlows need to access the same variable. A global variable can behave like a local one,
but not the other way round.

Q: How can a variable be deleted?


A: Delete it manually via the 'Manage Variables' page.

Q: What is another potential use of a variable?


A: Accessing the same value in different branches of a Multicast (where a property will not work).

Q: At the iFlow's first run the variable does not exist yet, but an initial/default value is needed for processing.
How do you handle this chicken-and-egg situation?
A: Use a Content Modifier to read the variable and set a default value.
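
A minimal sketch of that Content Modifier entry (the name and default value are assumptions; newer Content Modifier versions provide a Default column for this, otherwise supply the default in a follow-up step):

    Exchange Property:  Name: lastRunTimestamp   Source Type: Local Variable
                        Source Value: lastRunTimestamp   Default: 1970-01-01T00:00:00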
-------------------------------------------------------------------------------------------------------------------------------

Data Store:

Q: How do you write to a data store?


A: Use the Data Store Write step.

Q: What is the difference between Visibility 'Global' and 'Integration Flow'?


A: 'Global' means any iFlow can access the data store; 'Integration Flow' means only the iFlow that wrote the entry can read it back.

Q: In the Data Store Write step, is it mandatory to specify an Entry ID?


A: No.
Q: What happens if you write to the data store with the same Entry ID twice?
A: By default it will fail with an error. If 'Overwrite Existing Message' is selected, the entry will be replaced/updated.

Q: Is it only the message body that is written to the data store?


A: The body is always written. If 'Include Message Headers' is selected, the headers are written to the data store as well.

Q: What kind of payload format can be written to the data store?


A: There is no restriction; XML, JSON, or plain text are all fine.

Q: For Data Store Get, what happens if no entry ID is specified?


A: It will fail; the entry ID is mandatory for Get.

Q: What are the main differences between Data Store Get and Data Store Select?


A: Get fetches a single entry; Select fetches multiple entries.
A: Get requires an entry ID; Select has no option to enter an entry ID.
A: Get supports different data formats; Select only supports XML.

Q: After a Get or Select from the data store, what are the ways to delete the entry?
A: Use 'Delete On Completion' or a Data Store Delete by entry ID.

Q: When writing a list of records to the data store, if some records fail during processing, can the data store
operation end up partially successful and partially failed?
A: Yes. In new iFlows, 'Transaction Handling' is now set to none by default.

Q: How do you select multiple entries from the data store, process each entry independently one by one, keep the
successful ones, skip the failed ones, and write to different data stores depending on success or failure?
A: Use a combination of Data Store Select, a Splitter, a Router, an exception subprocess, the 'Transaction Handling'
setting, and 1 source data store + 2 target data stores to achieve this. This is shown in the course lesson.

Q: What data formats are supported by the Data Store sender adapter?


A: XML, non-XML, or any other format is fine.

Q: What is special about the Data Store sender adapter compared to Data Store Get & Select?


A: The Data Store sender adapter has an automatic retry feature.

Q: Why is the Data Store sender retry considered a 'smart' retry?


A: It has an 'Exponential Backoff' retry option: each retry doubles the wait time.

Looping Process Call: When a single call cannot pull the entire data set from SuccessFactors, we go for a
Looping Process Call. It iterates (re-calls) the same local integration process multiple times, based on the
maximum iteration count and the condition configured in the step; you select the local integration process
you want to call.
If you are querying SAP SuccessFactors EC using the CompoundEmployee API, how would you query all
the records if the page size is set to 200 and there are a thousand records in EC?

Looping Process Call: Calls the Local Integration Process as long as the specified condition is true.

For example, it can be used to iterate through a list of failed messages present in a datastore for retry.
Similarly, it can also be used for sending a small subset of data at a time, which in turn improves
performance by reducing the memory and processing time for each iteration.

A Looping Process Call involves invoking a subprocess multiple times within a single flow based on a
collection of items or until a certain condition is met.

For each call you need to specify how many records to pull (the page size).
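
For the CompoundEmployee question above (1000 records, page size 200, i.e. 5 iterations), a hedged Groovy sketch that can run at the end of the local integration process to drive the loop condition; the element name CompoundEmployee and the property name hasMoreRecords are assumptions:

    import com.sap.gateway.ip.core.customdev.util.Message
    import groovy.util.XmlSlurper

    def Message processData(Message message) {
        def pageSize = 200
        def root = new XmlSlurper().parseText(message.getBody(String))
        // Count the records returned in this page.
        def count = root.depthFirst().findAll { it.name() == 'CompoundEmployee' }.size()
        // Loop again only if this page was full, i.e. more records may remain.
        message.setProperty("hasMoreRecords", count >= pageSize ? "true" : "false")
        return message
    }

The Looping Process Call condition would then be ${property.hasMoreRecords} = 'true' with a maximum iteration count that safely covers the expected volume.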

OData sender side:

OData is adapter-specific: to call an OData service we need to use the OData adapter.

OData receiver side:

Mandatory fields

 Address
 Operation Details
 Resource Path
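
A hedged example of typical OData query options in the receiver adapter (the entity set Products and its fields are assumptions):

    Resource Path:  Products
    Query Options:  $filter=Price gt 100&$select=ID,Name,Price&$top=200&$skip=200&$orderby=ID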
To call SuccessFactors we need to use the SuccessFactors adapter, which is also adapter-specific.
In SAP S/4HANA, the ABAP language is used.

Ways to connect SAP S/4HANA to CPI

 IDoc
 RFC
 Proxy
 SFTP file

IDoc (Intermediate Document)

IDoc number: each IDoc gets its own IDoc number.

Control record – IDoc type, message type, IDoc number, sender port, receiver port, receiver partner,
sender partner, etc.

Data records – carry the actual payload segments.

Direction 1 – Outbound

Direction 2 – Inbound

High Level Steps:


cxf/debmas....] in SM59

We can also use the runtime property SAP_PollEnrichMessageFound to check the file-exists condition in a looping
process call step.
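
For example, the Looping Process Call condition expression using the property set by Poll Enrich:

    ${property.SAP_PollEnrichMessageFound} = 'true'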

Can you please differentiate between Poll Enrich and Content Enricher?

Poll Enrich is used to pull a file from an SFTP server using the SFTP adapter. Content Enricher is used to enrich
the XML data with content from an OData service using the OData/SuccessFactors adapter.

In real time, can we poll more than one file using Poll Enrich in a single run? No – Poll Enrich will pull
only one file in real time too.

Poll Enrich:

 It polls only one file per execution.


 It does not have its own scheduler.
 We need to select the sender, and the arrow should point from the sender to the Poll Enrich
palette function.

SFTP – test connectivity

 How do you check whether a file exists on the SFTP server or not?


 Known hosts file?
 Dynamic file names – e.g., a timestamp with only the date

On-Premise vs. Internet: for On-Premise (via the Cloud Connector) a Location ID is used.

General Splitter vs. Iterating Splitter

 General Splitter: breaks down a message into multiple messages, keeping the encapsulating element
 Iterating Splitter: breaks down a message into multiple messages without the encapsulating element

Sequential Multicast vs Parallel multicast

Sequential Multicast: In sequential multicast, messages are sent to each receiver one after the other.
The next receiver only receives the message after the previous one has been processed.

If one of the branches fails during processing, the sequential multicast will stop further processing of the
remaining branches.
Parallel Multicast:

In parallel multicast, messages are sent to all receivers simultaneously. Each receiver processes the
message independently and concurrently.

If one branch fails in a parallel multicast, the other branches continue processing. Each branch can
handle its own errors independently,

Summary of Failure Scenarios

 Sequential Multicast: If a branch fails, processing stops for all remaining branches.
 Parallel Multicast: If a branch fails, the other branches continue to process the message
independently.

Gather vs Join:

If you want to combine messages that are transmitted to more than one route by a Multicast step, you
need to place Join step before the Gather step.

If you want to combine messages that are split using a Splitter step, you use only the Gather step.

The Join step is used in combination with the Gather step

Request Reply – HTTP, OData, JDBC

iFlow:

Sender -> HTTPS adapter -> Start Message -> Content Modifier -> General Splitter -> Content Modifier

Both Content Modifiers – ${in.body}


 End
 Content Modifier
 General Splitter 1

These three steps are repeated 10 times; you can observe Segment 3, Segment 4, Segment 5, Segment 6,
Segment 7, Segment 8, Segment 9, Segment 10.

Basically, whatever you place in the iFlow after the Splitter is executed multiple times, depending on
how many individual messages the incoming message is split into.
Once you send a message to the queue, you will have either a data issue or a connectivity issue. After
successful deployment of the iFlow, the message is stored under Monitor -> Manage Stores -> Message
Queues.

 If it is a data issue, it will fail no matter how many times you retry.
 If it is a connectivity issue, then a retry can succeed.
 Messages stay in the queue for 30 days if they are not consumed.
 After the message has successfully reached the receiver, it is automatically deleted from the
queue.

If the message does not reach the receiver, the status of the iFlow will be Retry instead of Failed
until the retry time is exhausted / the maximum retry interval is reached.

Once the message reaches the receiver, the status of the iFlow changes from Retry to Completed.

Retries happen based on the queue configuration of the sender-side JMS adapter.

Dead-letter queue: after all retries are exhausted, the message goes to the dead-letter queue.

In the same iFlow we can use JMS as both receiver and sender:


two integration process flows, one with a JMS receiver and the other with a JMS sender.

Message Mapping tips: to start working on a mapping, right-click on Mapping and you will see a + icon to create
the mapping.

You need to add the source XSD and the target XSD.


If you want to delete the mapping, click on the fx icon and right-click to see the delete option.

You can also edit the mapping by clicking the Edit button at the top, and open it whenever you want to
work on the mapping.

Testing:

 To test the mapping, click the Simulate button, upload the source XML, and click Test; it will show
the target values.
 You can also test an individual node using Display Queue.

1. createIf

 Description: Creates a target element when the Boolean input condition is true.


 Use Case: When building a payload structure, use createIf to ensure that essential elements
(e.g., an address or a customer ID) are present, even if the source data is missing them.

2. exists

 Description: Checks for the presence of an element or property in the payload.


 Use Case: In conditional routing, use exists to determine if a certain field (e.g., customerID) is
present to decide the next steps in the integration flow.

3. removeContext

 Description: Removes the context changes from the input queue so that all values end up in a single context.


 Use Case: When values belonging to different parent nodes need to be handled together (for example,
sorted or counted as one list), removeContext flattens them into one context.

4. collapseContext

 Description: Collapses contexts by taking the first value of each context and placing them in a single context.
 Use Case: When you have deeply nested structures and want to simplify the data, use
collapseContext to flatten it for easier processing in downstream steps.
5. useOneAsMany

 Description: Replicates a single source value so that it can be mapped to multiple target occurrences.


 Use Case: When the source provides a value only once (like a header-level invoice number) but the target
requires it on every line item, use useOneAsMany to repeat it.

6. mapWithDefault

 Description: Maps a value to a new key, providing a default if the original value is absent.
 Use Case: In cases where certain values might not be present in incoming messages,
mapWithDefault helps avoid null references by providing a default (e.g., mapping shippingCost
to 0 if not found).

7. formatByExample

 Description: Formats data based on an example structure.


 Use Case: When integrating data from different sources with varying formats, use
formatByExample to standardize the output (e.g., ensuring all date formats are consistent).

8. getHeader

 Description: Retrieves a specific header value from the incoming message.


 Use Case: When processing messages that include metadata (like authentication tokens),
getHeader can extract this information for validation or logging.

9. getProperty

 Description: Fetches a property value from the integration flow context.


 Use Case: When you need to access configuration values (like environment-specific settings),
getProperty can retrieve these for use throughout the flow.
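
The same header and property access is available in a Groovy script step; a minimal sketch (the header and property names are placeholders):

    import com.sap.gateway.ip.core.customdev.util.Message

    def Message processData(Message message) {
        // Read a header and an exchange property by name.
        def token = message.getHeaders().get("Authorization")
        def env   = message.getProperties().get("targetEnvironment")
        // Keep a derived value for later steps of the flow.
        message.setProperty("authTokenPresent", (token != null).toString())
        return message
    }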

10. sortByKey

 Description: Sorts elements based on a specified key.


 Use Case: In scenarios where the order of elements matters (like in billing or reporting), use
sortByKey to ensure they are organized correctly before processing.

11. sort

 Description: Sorts elements in the payload based on a specified criterion.


 Use Case: When dealing with collections that need to be in a specific order (e.g., sorting
transactions by date), use sort to ensure the desired sequence.

12. splitByValue

 Description: Divides a collection into smaller parts based on a specified value.


 Use Case: When processing batch data that needs to be segmented (like splitting orders by
regions), splitByValue can be used to create distinct groups for further handling.

13. replaceValue

 Description: Replaces specified values in the payload with new ones.


 Use Case: For data cleansing, if certain values (like obsolete status codes) need to be
standardized or updated to reflect current definitions, use replaceValue.

Process Call vs. Looping Process Call

Process Call: a synchronous call (it waits until the output is returned to the Process Call from the local
integration process).

- Generally the Process Call and the Local Integration Process are interlinked (in programming terms,
the Process Call is the caller and the Local Integration Process is the called method). Through the
Process Call we call the Local Integration Process: once you add the Process Call to the iFlow, in its
configuration you select the local integration process that you need to call.

- How will Gather know whether the last split message has been received?
- What are the different ways in which an integration flow can be migrated from one tenant to
another?
- Where are the certificates installed in Cloud Integration?
- Why was Value Mapping provided by SAP when Fixed Value Mapping does the same job?
- Have you worked on the SFTP adapter? If yes, what are the prerequisites to use the SFTP adapter?

- How do you check the SFTP connectivity from CPI?


- What is the use of the known_hosts file, and how do you get its details?
- What is the business case for using the FTP adapter?
- How do you populate file names dynamically using the FTP adapter?
- Assume we are getting order details from a source system, but the business wants the order number in the
file name on the target system; can we implement this requirement in SAP CPI, and if yes, how?
- What issues have you faced with the SFTP and FTP adapters? Take one or more issues and
explain the root cause and resolution steps.
- Can you tell me some of the OData commands you used for a business case?
- How do you filter content while querying data from OData entities?
- Can you name any 5 Apache Camel expressions you used in your current project?
- Assume the target wants an encrypted file; how are you going to implement this requirement? What
key, and whose key, will you use to encrypt?
- Where do you store these keys and certificates in SAP CPI?
- Have you used the Encryption and Decryption palette functions?
- Difference between Content Modifier and Content Enricher
- What is the difference between parallel and sequential multicast?
- If you use sequential multicast and one of the receivers fails, what happens next?
Will the data go to the next receiver or not?
- If you use parallel multicast and one of the receivers fails, what happens next? Will
the data go to the next receiver or not?

- How are you going to handle exceptions in SAP CPI? Name the Apache Camel property that
holds the exception.
- Assume that in the current project we need to pass source data to multiple receivers, and only once the message
is successfully processed to receiver 1 should we process receiver 2. Can we implement
this type of requirement, and if yes, how?
- What is the difference between an Error End, an End Message, and an Escalation End?
- How do you call a local integration process/subprocess from the main process?
- Did you get a chance to work on a local integration process? If yes, what was the business case?
- In the Content Enricher, if you use the aggregation method 'Enrich', what are the mandatory parameters
you observed?
- Have you correlated the original message and the lookup message?
- What is the difference between merge (combine) and enrich in the Content Enricher?
- Have you worked on the mail adapter? If yes, what are the prerequisites to use the mail adapter?
- How do you populate the subject line dynamically? In the body I want to populate the iFlow name, message ID,
date and time, etc., whenever any integration process fails; how do you implement this requirement? (See the
sketch after this list.)
- How do you schedule an interface in SAP CPI to run automatically?
- If you have deleted an integration package from Design by mistake, can we recover/revert that
integration package?
- Did you get a chance to work with a REST API?
- How do you call a REST API using OAuth client credentials?
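
A hedged sketch for the dynamic mail subject question above, assuming a Content Modifier (or an exception subprocess script) has already filled the properties iflowName and errorText:

    Mail adapter Subject:  CPI alert: ${property.iflowName} failed at ${date:now:yyyy-MM-dd HH:mm}
    Mail adapter Body:     Error details: ${property.errorText}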

IDOC AUTHENTICATION SENDER SIDE – USER ROLE OR CLIENT CERTIFICATE


IDOC AUTHENTICATION RECEIVER SIDE – BASIC OR CLIENT CERTIFICATE
SFTP: will pick only one file from the directory

In the SFTP configuration, File Name: * (if you give a star, it will pick all the files in the folder).
Cloud Connector: used to send any data to an on-premise application.

In which cases do we use it (the Cloud Connector)?

We use it mainly on the receiver side; it is also used on the sender side when the action (polling) originates from CPI.

JDBC – we pull data from a database server (sender and receiver side)

SFTP/FTP – we pull files from an SFTP server (sender and receiver side)

OData – we pull data (sender and receiver side)

RFC – both sides

Whenever CPI initiates the action to send or pull data from an on-premise system, we use the Cloud Connector.

Never use Cloud connector :

 HTTPS – the action comes from outside; data is pushed to CPI (e.g., through Postman)
 IDoc
 SOAP
 Proxy

If CPI initiates a pull request, we use SFTP, JDBC, OData, etc. If CPI is the target of a push request, we use HTTP,
SOAP, IDoc, etc.: we generate a URL and share it with the source server team, and they consume our CPI URL. Is my
understanding right? – Yes.

The setup is done by the Basis team.

JMS is only for internal communication; we cannot use JMS with an external server.

Ways to connect from S4 Hana to SAP CPI :

 Idoc
 RFC
 Proxy
 SFTP File

Landscape – 3 systems

 Development environment – system name ED1


 QA environment – system name EQ1
 Production environment – system name EP1

- Outbound configuration

IDoc type

Control record: IDoc number, IDoc type, sender port, sender port type, receiver port, receiver port
type, sender partner, receiver partner
 The IDoc control record parameters that need to be populated are:
o IDOCTYP - IDoc Type
o MESTYP - Message Type
o SNDPOR - Sender Port
o SNDPRT - Sender Partner Type
o SNDPRN - Sender Partner Name
o RCVPOR - Receiver Port
o RCVPRT - Receiver Partner Type
o RCVPRN - Receiver Partner Name

Data records:

We have developed an interface where we get material data from S/4HANA, which needs to be sent to a
third-party system via SFTP; a message mapping converts the XML to CSV.

Steps for the outbound configuration of IDocs in S/4HANA:

Once we have deployed the iFlow we get the endpoint address (used in the SM59 RFC destination).

Outbound IDoc configuration from S/4HANA to CPI:


To be able to send messages from SAP S/4HANA to SAP CPI, import the CPI certificates into the S/4HANA STRUST. You
can download the CPI certificates either from the keystore or from any browser.

Working with PGP keys in SAP CPI (public + private keys) – Kleopatra

Configuration of PGP Keys

 Public Key: You need to import the recipient's public key to encrypt messages. This key is usually
provided by the recipient.
 Private Key: For decrypting messages, you need to import your own private key. This is the key
that corresponds to the public key used by the sender.

Monitoring ->Manage Security->PGP Keys-> ADD(Public Keys + Secret Keys)


Whichever key you have generated or uploaded in Monitor -> Manage Security -> PGP Keys, that public key's
user ID has to be provided in 'Encryption Key User ID'.

Signing: if you select the signing option, it will ask for a private key. We need to generate a private key, and the
corresponding public key has to be given to the recipient.
We use SFTP as both sender and receiver.

How do you check whether a file exists on the SFTP server in SAP CPI?

During the connectivity test to the SFTP server, you select the authentication type (User Credentials / Public Key),
and once you provide the details you can tick the 'Check Directory Access' checkbox and provide the directory path
to see all the files in that directory.
 Go to Connectivity Test in the SAP CPI monitor.

We need to provide the connectivity details to connect to the SFTP server (Monitor -> Connectivity Test -> SSH
tab); once the connection is successful we can see the directory details along with the files.

High-level steps:

 If the authentication type is User Credentials, we need to create them in Monitor -> Manage Security


Material: click the Create button, from the drop-down select User Credentials, and then provide
the details.
 If the server is not reachable, ask the SFTP team and the firewall team to open the firewall and enable the port.

Public key authentication:

 If the authentication type is Public Key, go to Monitor -> Manage Keystore and click the drop-down
next to Create to create an SSH key.
 Once you create the SSH key, you need to download the public key and share it with the SFTP server team.
 When you send this public key to the SFTP team, they will assign a user to it.
 The SFTP team adds the user to this public key; this is not our responsibility, it is shown here just for
understanding.
 Once the SFTP connectivity test is successful, you can use the details in the iFlow's SFTP adapter
configuration.

SFTP: using the SFTP adapter we connect to the SFTP server.

You need the connectivity details to connect to the SFTP server.

Authentication:

 User Credentials
 Public Key
If the authentication type is User Credentials, we need to create them in Manage Security and then reference them in the adapter.

If the server is reachable from CPI, fine; if it is not reachable, ask the SFTP team and the firewall team to open the
firewall and enable the port.
To avoid the host key issue:

copy the host key and upload it here (the known_hosts file in Security Material).

Generate an SSH key & SFTP connectivity using the public key
Go to Manage Keystore and click the drop-down next to Create.
Once you create the SSH key, download the public key and share it with the SFTP server team.

When you send this public key to the SFTP team, they will assign a user to it.

The SFTP team adds the user to this public key; this is not our responsibility, it is shown here just for
understanding.
Now we need to check the connectivity in CPI.

If we get an error:


change the proxy type to On-Premise and provide the Location ID.
Then change the authentication to Public Key and enter the user name and the private key we generated in
CPI.
Once the SFTP connectivity test is successful, you can use it in the iFlow's SFTP adapter configuration.
Always remember: Address = host + port; the default SFTP port is 22.

The proxy type decides whether the connection is on-premise or cloud (internet), based on the option you select.

How do you update an existing SSH key that is about to expire?

There is no option to extend it; you need to generate a new key and share it with the SFTP team.

When we connect to SFTP on the receiver side we create the SSH key pair; when we create a sender-side SFTP
channel, do we need to create the SSH key again for the sender side, and does the same need to be added at the
Cloud Connector level?

Adapter-level configuration: once the connectivity test is successful, we come back to the SFTP adapter.
In real time we delete the file after processing.
General Splitter vs. Iterating Splitter

Splitter: to split a composite message into individual messages we go for a Splitter.
 Iterating Splitter: breaks down the message into individual messages without the encapsulating
element (without the header element)
 General Splitter: breaks down the message into individual messages keeping the encapsulating
element (including the header element)
 Request Reply (HTTP/SOAP/OData) – it sends a request and gets the response, so it is called two
times: once for the request and once for the response.

ProcessDirect – no load balancer involved. Any number of producers but only one consumer (N:1): multiple
producers can connect to a single consumer, but the reverse is not possible.

In message monitoring, if you want to search for related iFlow information across flows, we need to use the
correlation ID.

 The ProcessDirect adapter is used for communication between integration artifacts/flows within the same SAP
Cloud Platform Integration tenant.
 Producer flow – sends the message via a ProcessDirect receiver channel.
 Consumer flow – receives the message via a ProcessDirect sender channel.
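
A minimal sketch of matching ProcessDirect addresses (the path itself is an assumption; it just has to be identical on both sides):

    Producer iFlow  -> ProcessDirect receiver channel -> Address: /demo/orders/inbound
    Consumer iFlow  -> ProcessDirect sender channel   -> Address: /demo/orders/inbound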
Splitter & Gather :

Splitter is used in combination with gather..

Multicast is used in combination with join+gather

 If you want to combine messages that are split using a Splitter step, you use only
the Gather step.

 If you want to combine messages that are transmitted to more than one route by
a Multicast step, you need to place Join step before the Gather step.

 You do not introduce Gather or Join step after EDI Splitter

General Splitter: with the encapsulating (root) element

Iterating Splitter: without the encapsulating (root) element

JMS is for only internal communication we cannot use JMS for external server

The OData adapter allows you to communicate with an OData API using OData protocol. You use
messages in ATOM or JSON format for communication.

How do you decide on the sender adapter, given that we have different web services like an OData service, a
SOAP-based service, or a REST service?

 Web application – HTTP adapter


 Web service – SOAP adapter
 OData API – OData adapter

 Source – HTTPS
 Receiver – HTTP
 CXF – SOAP
 HTTP – HTTP

You can get more details on an exception using

 ${exception.message}
 ${exception.stacktrace}
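
In a script step inside the exception subprocess, the caught exception is also available as the exchange property CamelExceptionCaught; a minimal Groovy sketch (the attachment name and the errorText property are assumptions):

    import com.sap.gateway.ip.core.customdev.util.Message

    def Message processData(Message message) {
        def ex = message.getProperties().get("CamelExceptionCaught")
        if (ex != null) {
            // Attach the error text to the message processing log.
            def log = messageLogFactory.getMessageLog(message)
            log?.addAttachmentAsString("ErrorDetails", ex.getMessage() ?: ex.toString(), "text/plain")
            message.setProperty("errorText", ex.getMessage())
        }
        return message
    }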

If you use an End event to end the exception subprocess, the message status is set to Completed; you have the
option to define a custom status that allows you to distinguish this error situation from successfully processed
messages.

If you use an Error End event, the message status is set to Failed; catching the error in an exception subprocess
still lets you add additional information to the message processing log.

If you use an Escalation End event, the message status is set to Escalated.

In the Value Mapping we define conditions: if it is a MATMAS it should go to this receiver, and if it is an order it
should go to that receiver.

IDoc sender -> CPI (value mapping) -> ProcessDirect receiver

ProcessDirect sender -> CPI -> actual receiver

Suppose I have two integration flows (IF1, IF2), where IF2 is connected to IF1 via ProcessDirect.

Properties set in IF1 cannot be accessed in IF2, because the scope of a property is the iFlow (IF1),
whereas headers created in IF1 can be accessed in IF2 because headers have a global scope (they are carried
along with the message).

A property lives only within the flow. A variable declared at iFlow scope can be used only in that iFlow, but a
variable declared at global scope can be used wherever you want.

Difference between a data store and a variable?

In a data store we can store an entire message/payload, whereas a variable stores a single value like a
timestamp or a date.

If we are using HTTPS on the sender side, we need to set the URL path in CPI and provide it to the source team so
that they can hit the endpoint. Once you deploy, you can see the endpoint in Monitoring, which then needs to be
given to the source team.
CXF – SOAP

HTTP – HTTP

---------------------------------------------------------------------------------------------------------------------------

For SFTP we can use a Timer to pull the file from the directory instead of a sender channel.

We use a Timer to run the iFlow on a schedule. For JDBC we use a Content Modifier to write the select query
and a Request Reply call to the receiver, where the receiver is the JDBC server, using the JDBC adapter.

The Cloud Connector is an on-premise application used to connect to on-premise applications.

An iFlow can have any number of local integration processes and integration processes.

A Process Call in the main flow is used to call a local integration process / subflow.

ProcessDirect is used to call between two independent iFlows.

A Looping Process Call in Cloud Integration invokes a Local Integration Process iteratively until the
condition is met.

The key to the solution is the ProcessDirect adapter, which can be used to establish communication
between integration processes within the same tenant.

How will Gather know that the last split message has reached?

Data store:

Visibility: Integration Flow or Global.

Once you successfully read the data from the data store, the entry is deleted if you tick the 'Delete On
Completion' checkbox.

Each data store can hold any number of entries/messages.

Say your XML has 30 product IDs: 30 entries will be created in the data store, since each product ID coming
from the source is different.

If you send the same message to the data store using the Write operation, it will fail unless you tick
'Overwrite Existing Message'.

Say you are getting multiple products from the source: you can use a Splitter before the data store so that one
large message is split into individual messages.
Select multiple records from the data store using the Select operation; use the product ID, invoice ID, or
employee ID as the entry ID.
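
A minimal configuration sketch for the split-then-write pattern above (element and data store names are assumptions; the Entry ID field accepts a dynamic expression):

    Content Modifier:  Exchange Property productId, Source Type XPath, Source Value /Product/ID
    Data Store Write:  Data Store Name: Products, Entry ID: ${property.productId}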

Difference between Fixed Values and Value Mapping

Fixed Value Mapping: a fixed value table is very useful when one source field can carry many possible values
during mapping; to avoid IfThenElse chains or a UDF we go for fixed value mapping.

Example: if your source field contains 01, 02, 03, ... 12 and you want the result as JAN, FEB,
MAR, ... DEC.

Advantage: using a fixed value table you can see the result immediately in the mapping.

Disadvantage: if you have used the same fixed value mapping in several places in your mapping, then in
case of any change in the fixed values you have to make the changes in several places. So, in short, fixed
value mapping is not good from the maintenance point of view.

Value Mapping: value mapping works the same way as fixed value mapping, only you get the result at
runtime, and you have to define the value mapping separately (in PI this was the Integration Directory; in
CPI it is a Value Mapping artifact in the integration package).

Advantage: if you have used the same value mapping in several places in your mapping, then in case of
any change in values you don't have to change your mapping; just make the change in the value mapping
artifact, that's it.

Disadvantage: you can't see the result immediately in the mapping; results can be seen only at
runtime.

Router vs. Multicast

 The Router routes the message to only one branch, based on which condition is met
 Multicast sends the message to all branches

A Router should have a minimum of two branches, one of which is the default route.

SFTP will pull only one file per run. The SFTP sender has a scheduler.

Failures are either data issues or connectivity issues.


Once you send a message to the queue, it is stored under Monitor -> Manage Stores -> Message Queues.

If it is a data issue, it will fail no matter how many times you retry. If it is a connectivity issue, then a retry
can succeed. A message stays in the queue for 30 days. While retries are running, the message processing status
is Retry; it has not failed, it went to Retry status. You don't see a Failed status, you see a Retry status.

----------------------------------------------------------------------------------------------------------------------------

SFTP:

Mandatory configurations:

 Address: host + port


 Proxy type – Internet or On-Premise
 Authentication – public key

Connectivity test -> SSH


 How do you check the SFTP connectivity from CPI? – Connectivity Test -> SSH
 What is the use of the known_hosts file, and how do you get its details? – It is used to establish trust between the
server and CPI; copy the host key from the connectivity test and upload it under Security
Material -> known_hosts.

 How do you populate file names dynamically using the SFTP adapter? – With a timestamp, message ID, etc.
(a Camel expression for the current date), or the 'Append Timestamp' checkbox; see the example after this list.
 Have you worked on the SFTP adapter? If yes, what are the prerequisites to use the SFTP adapter?
 How do you check whether a file exists on the SFTP server? – Tick the 'Check Directory Access'
checkbox in Connectivity Test -> SSH.
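
A hedged example for the dynamic file name bullet above, using a Camel Simple date expression in the SFTP receiver's File Name field (the prefix 'Orders_' is an assumption):

    Orders_${date:now:yyyyMMdd}.csv            (date only)
    Orders_${date:now:yyyyMMdd-HHmmss}.csv     (date and time)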

Interview question: assume we are getting order details from the source system, but the business wants the order
number in the file name on the target system. Can we implement this requirement in SAP CPI, and if yes, how do we
implement it?

 We need to store the value from the source XML (via XPath) in a property named orderNo
 In the SFTP adapter's connection settings, use ${property.orderNo} in the file name
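
A minimal Groovy sketch for this requirement (the source structure <Order><OrderNo> is an assumption; a Content Modifier with an XPath property would work just as well):

    import com.sap.gateway.ip.core.customdev.util.Message
    import groovy.util.XmlSlurper

    def Message processData(Message message) {
        // Extract the order number from the incoming XML.
        def order = new XmlSlurper().parseText(message.getBody(String))
        message.setProperty("orderNo", order.OrderNo.text())
        return message
    }

In the SFTP receiver adapter, the File Name can then be set to ${property.orderNo}.xml.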
SFTP: Monitor -> Manage Security -> Connectivity Test -> SSH

 Mandatory configurations:
 Address: host + port
 Proxy type – Internet or On-Premise (Location ID)
 Authentication – public key

Steps to follow:

 Monitor -> Manage Security -> Manage Keystore -> click Add, from the drop-down select SSH Key:
first generate the SSH key.
 Once you generate the SSH key you can download the public key; that public key needs to be given to the
SFTP team so that they can assign a user to it.
 Now go to Connectivity Test -> SSH tab and provide the details.

The Basis team creates the SSH keys and will let you know the location.

Mail Server connectivity test


Fixed values – we can define a maximum of 10 fields.

Value Mapping interview question: say a source field in the XML has 100 possible values; is it necessary to create
100 entries manually in the value mapping? – No, I will put all 100 entries in a CSV and upload it into the value
mapping.

Value mapping also offers different 'no match' failure options: 1. Use Key, 2. Use Default Value, 3. Throw Exception.
Value Mapping – Import: importing a value mapping comes into the picture when you need to move objects
from one environment (say Development) to another environment (say Quality). This reduces the effort
and is the ideal way.

45. What is Exception Handling in SAP CPI?


A: Exception Handling in CPI involves capturing and managing errors during integration flow execution. It
typically involves using exception subprocesses, alerting, and retries to handle and mitigate issues.

42. Explain the concept of Retry Mechanism in SAP CPI.


A: The CPI retry Mechanism automatically retries failed message processing steps, either due to
temporary network issues or other system errors, ensuring the reliability of message delivery.

41. How do you configure Security Certificates in SAP CPI?


A: Security certificates in CPI are configured in the Keystore. These certificates ensure secure
communication channels, like HTTPS and FTPS, and enable digital message signing.
37. Explain the Exception Subprocess in SAP CPI.
A: An Exception Subprocess is a mechanism to handle errors within an iFlow. It defines specific actions
to be taken when an error occurs during message processing, such as sending alerts or retrying
operations.
38. What is the Timer Process in SAP CPI?
A: The Timer Process is a start event in an iFlow that triggers the integration flow based on a scheduled
time. It is useful for periodic tasks such as syncing data between systems.
39. What is the Local Integration Process in SAP CPI?
A: A Local Integration Process is a subprocess within an iFlow that can encapsulate a set of processing
steps. It helps in reusing logic within the same iFlow, promoting modularity and code reuse.

40. What are the differences between a Polling and an Event-Driven Process in SAP CPI?
A:
 Polling Process: Regularly checks a source (e.g., a directory or a database) for new data.
 Event-Driven Process: Triggered by an event, such as the arrival of a message, without the need for
regular polling.

26. How do you enable Trace for an iFlow in SAP CPI?


A: Tracing in SAP CPI can be enabled within the iFlow by setting the Log Level to Trace in the integration
flow properties. This allows detailed logging of message payloads, headers, and properties during
runtime.

25. What is the role of the ProcessDirect Adapter in SAP CPI?


A: The ProcessDirect Adapter is used for synchronous and asynchronous message exchange between
different flows within the same CPI tenant.

Explain the role of the Router in SAP CPI.


A: The Router is used for message routing based on conditions. It helps direct the message flow to
different branches of an iFlow depending on predefined conditions.

17. What is the use of Splitters in SAP CPI?


A: Splitters are used to split a single message into multiple smaller messages for parallel processing or
further transformation.

19. What is Message Persistence in SAP CPI?


A: Message Persistence refers to storing message data to ensure its availability in case of system failure
or for reprocessing purposes.

20. What is Data Transformation in SAP CPI?


A: Data Transformation involves converting data from one format or structure to another as it passes
between different systems during integration.
