MALL Interfaces AID
Infosys
QUALITY SYSTEM DOCUMENTATION
References
COPYRIGHT NOTICE
2011 - 2012 Infosys Limited, Bangalore, India. All Rights Reserved. Infosys believes the information in this
document is accurate as of its publication date; such information is subject to change without notice. Infosys
acknowledges the proprietary rights of other companies to the trademarks, product names and such other
intellectual property rights mentioned in this document. Except as expressly permitted, neither this
documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form
or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior
permission of Infosys Limited and/or any named intellectual property rights holders under this document.
Infosys Limited
Electronic City,
Hosur Road,
Bangalore 560 100
India.
Telephone: 91 80 2852 0261
Fax: 91 80 2852 0362
Website: http://www.infosys.com
TABLE OF CONTENTS
1. INTRODUCTION TO THE APPLICATION INFORMATION DOCUMENT
1.1 AUDIENCE
1.2 DOCUMENT STRUCTURE AND RECOMMENDED USE
2. EXECUTIVE SUMMARY
2.1 DESCRIPTION OF THE APPLICATION
2.2 RISK MANAGEMENT
2.3 CURRENT & FUTURE WORK PLAN
2.4 OFFSHORE TRANSITION PLAN
3. INTRODUCTION
3.1 APAC MALL INTERFACES
3.1.1 Overview
3.1.2 Data Flow
3.1.3 Mall Interfaces Data Provider
3.1.4 List of Tables Used in the Process
3.1.5 Program
3.1.6 Folder: Where the FTP Files Are Stored after BW Delivers the File to XI
3.1.7 Mall Interfaces
3.1.8 Process Chains
4. ISSUES & SUGGESTIONS
4.1 ISSUES AND PAIN AREAS
4.2 SUGGESTIONS
1. Introduction to the Application Information Document
This section describes the intended audience for this document, the organization of the sections and the conventions used for the various flow diagrams.
1.1 Audience
Fossil SMEs and Infosys BW Support Team
2. Executive Summary
3. Introduction
3.1 APAC Mall Interfaces
Retail Inventory
Domain: Retail
Purpose: APAC Mall Interfaces
Business activities: Providing files to malls
3.1.1 Overview:
Business Case: In APAC (countries such as Malaysia, Singapore, China and India), some malls charge store rent as a fixed rent plus a commission calculated as a percentage of the store's sales. This term is agreed in the rental contract between the tenant and the mall and is therefore legally binding. We consequently need to share our stores' sales information with the mall electronically, at an agreed frequency and in an agreed format, so that the mall can calculate the monthly store rent.
Earlier Process: Before the APAC retail go-live we used option 2: a mall-provided application was installed on our POS system (LSR system), and it pushed the sales information directly from our POS system to the mall.
Current Process: After the APAC go-live in 2013 we moved to SAP, and our POS systems (Wincor) came under PCI compliance, so third-party applications are no longer allowed on them. We now use the third option: we send our sales information to the malls' designated databases according to their technical specifications and requested formats.
3.1.2 Data Flow
1. In the store, the POS system (Wincor) records the sales transactions.
2. Data from the POS system is sent to the POS database.
3. From the POS database, the data is moved to the SAP POSDM system, which holds the audited sales information.
4. From POSDM, the data is sent every 30 minutes to the BRP system (Retail BW), our BW environment. Here it goes into the Sales Audit cube (ZSAUDIT).
5. From the BW system we send the data to the malls. Every mall has its own specification and provides us its particular format, so a number of transformations are applied to deliver the data in that format. Delivery happens in one of two ways:
a. We have an FTP server where we place all the files; jobs on the mall's side pick up the files and provide the data to the mall.
b. The other method is the handshake method: we ping the mall's server and it provides the EPOCH time. On that basis we send the data directly to the mall and receive feedback on whether the file was delivered successfully or failed. If the file fails because of an error, the mall also reports the specific issue.
Sales data flows from POS to POSDM in real time, and once the audit is passed it is picked up by Retail BW every 30 minutes.
On BI, these sales are stored in the ZSAUDIT sales cube (which holds global data) at a very detailed level; it contains the transaction-level data that the malls mostly need. This cube therefore serves data to all the mall interfaces, either directly (a query on ZSAUDIT or direct use of the ZSAUDIT cube) or indirectly through the next-level cube ZMALI_C01, which is designed purely for the mall interfaces.
ZSAUDIT is the sales cube holding global sales data; its granularity is very high and it contains transaction-level data. Sales data is sent to the malls daily.
Issues:
If the selection criterion is Calendar Day, the load takes the previous day's data and updates it. However, some data may not reach BW from POSDM because of auditing issues, and we send whatever data has arrived. When the corrected data comes in later from the audit team, it still carries the original transaction date, so it would never be picked up by a load based on Calendar Day. The mall's data and the BW data would then never match and reconciliation would never happen. Therefore, in the new cube ZMALI_C01 we added a new field, Loaded On (ZLOADEDON), which is populated on a daily basis. Every day, when the process chain is triggered, the first step takes all the stores from ZSAUDIT with ZLOADEDON as the selection condition, so it fetches both old and new transactions and thereby also covers late postings.
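A minimal sketch of this selection logic, as it might appear in a load routine; the routine shell and the local range structure below are simplified placeholders rather than the generated BW code:

* Simplified illustration: select on the Loaded On date (ZLOADEDON)
* rather than the transaction calendar day, so that late-posted
* corrections are still picked up by the daily load.
TYPES: BEGIN OF ty_range,
         fieldnm TYPE c LENGTH 30,
         sign    TYPE c LENGTH 1,
         option  TYPE c LENGTH 2,
         low     TYPE d,
       END OF ty_range.

DATA: ls_range TYPE ty_range,
      lt_range TYPE STANDARD TABLE OF ty_range.

ls_range-fieldnm = 'ZLOADEDON'.   " filter on the load date, not 0CALDAY
ls_range-sign    = 'I'.
ls_range-option  = 'EQ'.
ls_range-low     = sy-datum.      " anything loaded today, including
                                  " late-posted corrections
APPEND ls_range TO lt_range.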
MBS uses a kind of handshake method: we send the file for one hour of sales based on the EPOCH time. When we ping their server, it tells us the EPOCH time for which we should send the data. We take that as the selection parameter to cut the data out of the cube and send the information on that basis.
If the mall asks for past sales, they need to reset their timer for the EPOCH time; then we can send that data.
If for some reason (incorrect information, a file issue, or a busy data server) the data does not get updated, and we then send the data for the next hour, it will not be accepted, because the mall is still expecting the data for the EPOCH time it has set. We first have to send the data for that particular hour's sales; only then will it move further.
Use: For every record that we send, we update this table; we look up the table and send the data based on it.
This particular table is used by four stores: two Raffles City stores and two Suzhou stores.
3.1.5 Program:
These programs are set up as an interface, which we can access via transaction code ZMALI. The programs are present in the routines of the APDs of the interface and are therefore automated. The program runs in every loop, i.e. for every transaction.
Once a store is closed we no longer send a file for it, so we need to delete its records; hence the store information is deleted from the master table.
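A minimal sketch of such a clean-up, assuming the master table is ZMALI_APD_MASTER (referenced later in this document); the field name RT_LOCATION is an assumption and should be checked against the actual table definition:

REPORT zmali_delete_closed_store.

* Sketch only: remove a closed store from the mall interface master
* table so that no further files are generated for it.
PARAMETERS: p_store(4) TYPE c.    " closed store / retail location

* RT_LOCATION is an assumed field name, not confirmed by this document.
DELETE FROM zmali_apd_master WHERE rt_location = p_store.

IF sy-subrc = 0.
  WRITE: / 'Store', p_store, 'removed from ZMALI_APD_MASTER.'.
ELSE.
  WRITE: / 'No entry found for store', p_store.
ENDIF.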
3.1.6 Folder: where the FTP files are stored after BW delivers the file to XI.
Dallas FTP Server: \\dalcftpe02\ftpdata-2\it
We have country-wise folders, which are further categorized by mall and then into dev, qa and prd. The live FTP files are available in the prd folder.
The infrastructure team picks up these files and delivers them to the malls. They have created Windows jobs that pick the files from here and deliver them to the malls' servers.
Right now we cover Singapore, China, Malaysia and India. We also have a requirement from Japan, but we are not building any interfaces for Japan at the moment because of local-language requirements; the local team has created interfaces for them.
Except for Utama, all the other malls work in the same way. It is not clear how Utama works.
For Gunney mall we deliver the file in .csv format.
3.1.7 Mall Interfaces:
Details of Interfaces:
Flow from BW system to Mall Interfaces:
1. ZMALI_APD03_ION_ORCHARD_D_3266: MallInt/APD03: Ion Orchard/3266 (Daily)
1. Source:
Query: ZMALI_APD03_ZSAUDIT_POS_Q04
The query is built on the cube ZSAUDIT.
The variants are created in RSRT and are in the 3.x format. If we created the variant in the BEx Analyzer, it would be updated in the backend table in a format that is not readable, so we are using the old method here. All variants created in the 3.x version are stored in the table.
The date field in the variant is filled by a customer exit variable. If the input is blank, it runs for the current day; if we pass a value, it runs for that particular day. When run automatically, it always runs for the current day.
Customer exit code:
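A minimal sketch of the usual customer exit pattern in include ZXRSRU01 (called from EXIT_SAPLRRS0_001); the variable name ZMALI_CALDAY is a placeholder for the actual exit variable used by this query:

* Sketch only: default the date variable to the current day when no
* value is supplied, so that an automatic run always covers today.
DATA: l_s_range TYPE rsr_s_rangesid.

CASE i_vnam.
  WHEN 'ZMALI_CALDAY'.            " placeholder variable name
    IF i_step = 1.                " default before user/variant input
      CLEAR l_s_range.
      l_s_range-sign = 'I'.
      l_s_range-opt  = 'EQ'.
      l_s_range-low  = sy-datum.  " current day
      APPEND l_s_range TO e_t_range.
    ENDIF.
ENDCASE.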
2. Here we populate the extra fields that we need in the target: Type and Machine ID. These are fixed values that we pass to the fields.
3. The proxy provided by the XI team is then called to submit the output records. The output and record structure are defined in the proxy, which is used by the PI team to send the data to the mall.
The document type is the daily file, so it is set to 'D'. Machine ID is a mandatory field and defaults to 9999999.
This runs as a single loop with one record per iteration. Some malls want record-by-record submission, so we call the proxy inside the loop: the loop runs for each transaction, submits the data, and runs again after confirmation.
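A minimal sketch of this record-by-record proxy call; the proxy class ZCO_MALL_SALES_OUT, its method EXECUTE_ASYNCHRONOUS and the output structure are placeholders, not the actual generated proxy objects of this interface:

* Sketch only: submit the sales output record by record through the
* XI/PI outbound proxy, with Type and Machine ID filled as fixed values.
DATA: lo_proxy  TYPE REF TO zco_mall_sales_out,            " placeholder class
      ls_output TYPE zmall_sales_record,                   " placeholder structure
      lt_sales  TYPE STANDARD TABLE OF zmall_sales_record.

CREATE OBJECT lo_proxy.

LOOP AT lt_sales INTO ls_output.
  ls_output-doc_type   = 'D'.         " daily file
  ls_output-machine_id = '9999999'.   " mandatory default Machine ID
  TRY.
      lo_proxy->execute_asynchronous( output = ls_output ).
    CATCH cx_ai_system_fault.
      " log the error for this transaction and continue / retry
  ENDTRY.
ENDLOOP.

COMMIT WORK.   " triggers the actual send of the queued proxy messages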
Error Handling
If the tenant's POS system is not operational due to a POS technical problem and is not able to FTP the data from the HQ, the tenant representative has to inform CMT Management immediately.
2. ZMALI_APD04_ION_ORCHARD_M_3266: MallInt/APD04: Ion
Orchard/3266 (Monthly)
ZMALI_APD10_SGIMM_WSI_M_3273: MallInt/APD10: SG IMM
WSI/3273 (Monthly)
These are the same as above, except that they push monthly data, so the transaction type is 'T'. Everything else is the same.
3. ZMALI_APD01_STDSOMERSET_D_3267 : MallInt/APD01:
Somerset FOS/3267 (Day/Hour/In/out)
ZMALI_APD02_STDSOMERSET_D_3268: MallInt/APD02: Somerset WSI/3268
(Day/Hour/In/out)
ZMALI_APD11_SGJEM_WSO_D_3272: MallInt/APD11: SG JEM
WSO/3272(Day/Hour/In/out)
ZMALI_APD14_SUNTEC_FOS_D_3271: MallInt/APD14: SUNTEC FOS/3271
(Day/Hour/In/out)
The above four are technically the same interface.
Interface:
Snapshot of APD: ZMALI_APD01_STDSOMERSET_D_3267
Interface:
Snapshot of ZMALI_APD05_JOHOR_D_3471: MallInt/APD05: JOHOR/3471 (Daily)
1. Get the data from the query for the required mall with the help of the variant.
2. Look up the table ZMALI_APD_MASTER. Every time we load the data we increment the date in this table, and based on that we pick the load. In the first transformation we prepare the header line and the record-structure data for record types 101 (header record for each transaction) and 111 (information on the items sold, including void items).
3. In the second transformation we prepare the record-structure data for record types 121 (footer record of a transaction that stores all tax, cess, service charge and discount details; cess and service charge are used for F&B only) and 131 (footer record of a transaction that stores the payment medium, currency code and amount tendered or changed).
4. Prepare the last line of the file (footer line). A sketch of this record layout follows the list.
5. Push the data to XI via the proxy.
6. Place the flat file on the application server.
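A minimal sketch of assembling such a record-structured file; the field order, separators and example values are placeholders, not the actual mall layout:

REPORT zmali_file_layout_sketch.

* Sketch only: build a file out of typed records.
* 101 = transaction header, 111 = item line (including void items),
* 121 = tax/cess/service charge/discount footer, 131 = payment footer.
TYPES: BEGIN OF ty_item,
         article TYPE c LENGTH 18,
         qty     TYPE c LENGTH 5,
         amount  TYPE c LENGTH 12,
       END OF ty_item.

DATA: lt_items TYPE STANDARD TABLE OF ty_item,
      ls_item  TYPE ty_item,
      lt_file  TYPE STANDARD TABLE OF string,
      lv_line  TYPE string.

* 101 - header record of the transaction (store / POS / transaction / date)
CONCATENATE '101' '3471' '001' '0000123456' '20140101'
       INTO lv_line SEPARATED BY ','.
APPEND lv_line TO lt_file.

* 111 - one record per item sold
ls_item-article = 'EXAMPLE-SKU'. ls_item-qty = '1'. ls_item-amount = '125.00'.
APPEND ls_item TO lt_items.
LOOP AT lt_items INTO ls_item.
  CONCATENATE '111' ls_item-article ls_item-qty ls_item-amount
         INTO lv_line SEPARATED BY ','.
  APPEND lv_line TO lt_file.
ENDLOOP.

* 121 - tax, cess, service charge and discount footer
CONCATENATE '121' '8.75' '0.00' INTO lv_line SEPARATED BY ','.
APPEND lv_line TO lt_file.

* 131 - payment medium, currency code and amount tendered
CONCATENATE '131' 'CASH' 'MYR' '125.00' INTO lv_line SEPARATED BY ','.
APPEND lv_line TO lt_file.

* display the assembled file (in the interface it is written to the
* application server / handed to the proxy instead)
LOOP AT lt_file INTO lv_line.
  WRITE: / lv_line.
ENDLOOP.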
1. Get the data from the query for the required mall with the help of the variant.
2. Put the data into the target format and assign Type 'D' (as it is a daily file).
1. This interface has two parts. The first part produces nothing if the store has no sales, but we still need to generate a file stating that there are no sales, so that is handled by the second part (a sketch of this fallback follows the list).
2. In the first part, we fetch the data from the query and transform it to the target format.
3. In the second part, if there are no sales, we prepare the file with a value of 0.
4. When we push the data to XI, XI checks whether there are sales or not. If there are sales, it deletes the part that says there are no sales; otherwise it sends the data with one record stating that there are no sales.
5. Place the flat file on the application server.
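A minimal sketch of the zero-sales fallback; the output structure and its fields are placeholders:

* Sketch only: if the query returned nothing for the store, append a
* single zero-value record so that the mall still receives a file.
TYPES: BEGIN OF ty_out,
         store   TYPE c LENGTH 4,
         calday  TYPE d,
         doctype TYPE c LENGTH 1,
         amount  TYPE p DECIMALS 2,
       END OF ty_out.

DATA: lt_out TYPE STANDARD TABLE OF ty_out,
      ls_out TYPE ty_out.

IF lt_out IS INITIAL.               " no sales came back from the query
  ls_out-store   = '3267'.          " example store number
  ls_out-calday  = sy-datum.
  ls_out-doctype = 'D'.
  ls_out-amount  = 0.               " explicit zero-sales record
  APPEND ls_out TO lt_out.
ENDIF.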
1. This interface uses the cube, and in the filter we take 40 days of data (current day minus 40 days) for the specific retail location.
2. Here we transform the data as the target needs it, indicating whether the transaction is a sale or a return; a return here means a refund. There are different refund cases, of three types (a sketch of the classification follows the list):
a. Exchange for a watch of the same value (it may be a different watch or a different colour). The transaction value is 0, and it is still a sale.
b. Total refund: the customer asks for a full refund, so it is a return.
c. Exchange for a higher-value watch: the customer comes back and buys a higher-value watch in exchange for the previous one, so this transaction falls under the tendered value. A customer may also insist on a lower-value watch; we do not normally entertain such cases, but if the customer insists we give them the lower-value watch without returning the money.
3. If there are no sales, the file is generated with a value of 0.
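A minimal sketch of how such a transaction could be classified from its net amount; the type codes 'S' and 'R' and the threshold logic are assumptions, not the mall's actual specification:

* Sketch only: classify the transaction for the mall feed.
*   net amount > 0 -> sale (includes exchange for a higher-value watch,
*                     where the tendered difference is reported)
*   net amount = 0 -> exchange of equal value, still reported as a sale
*   net amount < 0 -> total refund, reported as a return
DATA: lv_sales  TYPE p DECIMALS 2 VALUE 0,
      lv_refund TYPE p DECIMALS 2 VALUE '125.00',
      lv_net    TYPE p DECIMALS 2,
      lv_type   TYPE c LENGTH 1.

lv_net = lv_sales - lv_refund.      " example: a full refund of 125.00

IF lv_net < 0.
  lv_type = 'R'.                    " return
ELSE.
  lv_type = 'S'.                    " sale (including zero-value exchange)
ENDIF.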
Here we use the handshake method. We push the information directly to their server: we ping their server, get the EPOCH time, and on the basis of that EPOCH time the next record is sent.
Once we submit the record to the server, we ping them and ask for confirmation; they then give us the next EPOCH time, which we maintain in a table, and on the basis of that EPOCH time the interface fetches the next set of data.
Here we send data on an hourly basis throughout the day.
1. Here we get the data from the cube with a filter of 20 days from the current day and on the retail location.
2. In the first transformation we determine the date for which we need to fetch the data. The EPOCH time is updated in the table ZMALI_BIXI_MBS. We also have a validation that it will not run for the current day.
3. In the second transformation, the EPOCH time is read from the table and the data for that period is fetched. It fetches the data up to the EPOCH time given in the filter and deletes the records that are not needed.
4. After submitting the data to the proxy, we ping their server again and confirm that the data was submitted; we then get the next EPOCH time, which is updated in our table. If the submission was not successful, the next EPOCH time is not updated, so the data is sent again for that time (a sketch of this update logic follows the list).
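A minimal sketch of that table update; only the table name ZMALI_BIXI_MBS is taken from this document, while the field names EPOCH_TIME and RT_LOCATION and the success flag are assumptions:

* Sketch only: advance the EPOCH time in ZMALI_BIXI_MBS only after the
* mall confirms the submission, so a failed hour is sent again.
DATA: lv_location   TYPE c LENGTH 4 VALUE '3273',   " example retail location
      lv_next_epoch TYPE i,                         " EPOCH returned by the mall
      lv_success    TYPE c LENGTH 1.                " 'X' = confirmed

* ... proxy call for the hourly slice happens here; lv_success and
* lv_next_epoch are filled from the mall's response ...

IF lv_success = 'X'.
  UPDATE zmali_bixi_mbs SET epoch_time  = lv_next_epoch
                      WHERE rt_location = lv_location.
  COMMIT WORK.                      " next run selects the next hourly slice
ELSE.
  " leave the table unchanged: the same hour is selected and sent again
ENDIF.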
Here we have 4 stores; this is also a kind of handshake method. We do not generate any FTP file; we put the data directly onto their server, but we do not use the EPOCH time here.
Stores:
3748: SHANGHAI RAFFLES CITY ACC
3790: BEIJING RAFFLES CITY ACC
3793: SUZHOU VILLAGE OUTLET
3794: SUZHOU VILLAGE WSIO
1. Here we get the data from the cube with a filter on calendar day, location and transaction number.
2. In the transformation we create a unique code by concatenating location, calendar day, POS number and transaction number, so that we do not deliver the same feed again once it has been posted successfully to their server (a sketch of this check follows the list).
3. In the first transformation we create the unique ID and determine the payment details as per the mall requirement, the item number, and other derivations such as sales header, sales item, return sales and document currency.
4. In the second transformation we assign the values of the key figures retail sales, tax amount and discount value by calculation for the sales item number.
5. In the third transformation we assign the values of the key figures discount value, sales quantity, retail sales, tax amount and normal sales value on the basis of scope and transaction number.
6. While submitting to the proxy, we are:
a. determining the master data from the master data table (ZMALI_RAFFLE_T01);
b. determining the sales total value, which includes sale/discount/tax, and arranging it into the format that we submit to the proxy;
c. determining the item-level information, which includes sales/discount/promo/tax, and arranging it into the format that we submit to the proxy;
d. determining the tender and delivery information and arranging it;
e. avoiding duplicate feeds on the basis of the unique code and the feed status = 'X';
f. updating the status table before submitting to the proxy;
g. updating the status table with the proxy feedback;
h. maintaining 90 days of data in the log table.
7. We hand over the record to XI; XI submits the file to the mall, and the mall gives feedback at the same time on success or failure, including information such as the issue if it failed. In other words, it is an input and output proxy.
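A minimal sketch of the unique code and the duplicate check; ZMALI_RAFFLE_T01 is named in this document, but the status table ZMALI_STATUS_LOG and all field names below are placeholders:

* Sketch only: build the unique feed key and skip records that were
* already posted successfully (feed status = 'X').
DATA: lv_unique  TYPE c LENGTH 40,
      lv_status  TYPE c LENGTH 1,
      lv_store   TYPE c LENGTH 4  VALUE '3748',
      lv_calday  TYPE d,
      lv_posno   TYPE c LENGTH 3  VALUE '001',
      lv_transno TYPE c LENGTH 10 VALUE '0000012345'.

lv_calday = sy-datum.

* unique code = location + calendar day + POS number + transaction number
CONCATENATE lv_store lv_calday lv_posno lv_transno INTO lv_unique.

* skip the record if this feed was already delivered successfully
SELECT SINGLE feed_status FROM zmali_status_log    " placeholder table/field
       INTO lv_status
       WHERE unique_id = lv_unique.
IF sy-subrc = 0 AND lv_status = 'X'.
  " duplicate feed: do not submit again
ELSE.
  " submit via the proxy, then update the status table with the feedback
ENDIF.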
3.1.8 Process chains
Snapshot of Z_APAC_ZMALI_C01_PC01:
Snapshot of Z_APAC_MALI_META:
4. Issues & Suggestions
4.1 Issues and Pain Areas
The following are the existing issues and suggestions that can be considered for this application.
4.2 Suggestions