
SC8131/SC8132 Integration Guide

Rev: v2.11

2020/12/24

Revision History

Version  Date        Editor      Comment
0.1      2016/01/29  Eric Fang   First draft
0.2      2016/02/24  Evan Chen   Modify metadata format
0.3      2016/03/25  Evan Chen   1. Add Stereo Tracker configuration by ONVIF protocol
                                 2. Add configuration of RTSP metadata stream
0.4      2016/04/22  Eric Fang   1. Modify height unit to mm
                                 2. Add setting of WebSocket port number
1.0      2016/05/16  Eric Fang   1. Modify 3.1 WebSocket protocol description
                                 2. Refine example code in chapter 4
1.1      2016/06/14  Eric Fang   1. Insert new chapter 2, Read Before Use
                                 2. For SC8131-VVTK-0101i
1.2      2016/07/06  Eric Fang   1. Add default account and password information to chapter 2
1.3      2016/11/22  Hsuany      1. Add new items for rules
1.4      2017/01/16  Hsuany      1. Modify zone naming in metadata and eventdata
1.5      2017/03/31  Evan        1. Add chapter 6, RESTful APIs
                                 2. Modify WebSocket package: remove cookie
2.0      2017/06/09  Evan        Remove Stereo Tracker configuration by ONVIF protocol
2.1      2018/06/15  Evan/Small  1. Add Get Counting Result by CGI, Report Push, RS485
                                 2. Modify RESTful APIs
2.2      2018/09/26  Small       1. Add Device Log
                                 2. Add Queue management
2.3      2019/04/15  Small       1. Update Get Accumulative Counting Result
2.4      2019/04/22  Terry       1. Update Classification and Area information
2.5      2019/12/23  Ting        1. Add authorization WebSocket introduction
2.6      2020/09/10  Ethan       WebSocket count event: add accumulated in/out data
2.7      2020/10/06  Ethan       Add rule test RESTful API
2.8      2020/10/08  Ethan       Add queue event metadata and RESTful API
2.9      2020/10/22  Terry       Stop "unAuth websocket" in default mode
2.10     2020/11/11  Ethan       1. Add passerby event RESTful API
                                 2. Add counting event RESTful API description
2.11     2020/12/24  Albus       Add manually modify stitching transform
Table of Contents

Revision History
1. Introduction
2. Read Before Use
3. Get Counting Result
   3.1. Get Counting Result by CGI
   3.2. Get Counting Result by Report Push
   3.3. Format of Counting Result
        3.3.1. XML
        3.3.2. CSV
        3.3.3. JSON
   3.4. Get Data by RESTful API
        3.4.1. Query Counting/Flowpath Database
        3.4.2. Query Zone Database
        3.4.3. Get Accumulative Counting Result
        3.4.4. Get Device Log
        3.4.5. Get Queue Data
        3.4.6. Get Current Queue Event Information
        3.4.7. Get Passerby Data
   3.5. Get Counting Result by RS485 (only SC8132)
        3.5.1. IBIS settings
        3.5.2. Support of IBIS command
        3.5.3. Support the first counting rule
        3.5.4. 4-wire communication
4. Get Metadata
   4.1. Authentication WebSocket Connection
        4.1.1. WebSocket Connection
        4.1.2. WebSocket Server Configuration API
        4.1.3. WebSocket Protocol
        4.1.4. WebSocket URL with Data Filter
        4.1.5. WebSocket Connection
        4.1.6. WebSocket Server Configuration API
        4.1.7. WebSocket Protocol
        4.1.8. WebSocket URL with Data Filter
   4.2. Get Metadata by RTSP Metadata Stream
        4.2.1. Setup RTSP Metadata Stream
   4.3. Metadata Format
5. RESTful APIs
   5.1. API List
   5.2. App Feature
   5.3. Package Version
   5.4. Configuration
        5.4.1. Analytics Engine
        5.4.2. Rule Engine
        5.4.3. Alarm
        5.4.4. Report push
   5.5. Data
        5.5.1. Start/Stop Map
   5.6. Camera information
        5.6.1. Project
        5.6.2. Time
        5.6.3. Status
   5.7. Stitching
        5.7.1. Get Stitching information
   5.8. Tracking
        5.8.1. Tracking ON
        5.8.2. Tracking OFF
   5.9. Integration Test
1. Introduction
This document describes how to integrate the real-time metadata and counting results of the 3D
analytics system "Stereo Tracker".

2. Read Before Use


In some situations, a client connecting to "Stereo Tracker" to configure it or to get metadata may
fail or time out. "Stereo Tracker" needs three basic network ports, for the HTTP, WebSocket and RTSP
protocols. So when the camera is installed behind a NAT network environment or a firewall, the NAT
server port forwarding must take the camera's protocol port numbers into account, and so must the
firewall. Please check the camera's protocol port configuration, the NAT device setting guide, and
the firewall setting guide.

3. Get Counting Result


3.1. Get Counting Result by CGI
You can click the CGI button to produce a line of CGI command. Copy and save the command
for later use or for implementation elsewhere. This command specifies and retrieves video
counting results.

The CGI format is described as follows:

http://{IP}/Stereo-Counting/cgi-bin/report_pull.cgi ?
format={xml,json,csv} &
starttime={starttime timestamp} &
endtime={endtime timestamp} &
aggregation={aggregation level in seconds} &
lite={0,1}&
localtime={0,1}&
countingeventdb={0,1}

Key              Description
starttime        * Querying start time [ timestamp in seconds or an ISO8601 formatted date time
                   string, e.g. 2016-03-20T12:00:00 ]
endtime          * Querying end time [ timestamp in seconds or an ISO8601 formatted date time
                   string, e.g. 2016-03-21T08:00:00 ]
aggregation      * Report aggregation level for each record, in units of seconds
format           [Option] Report format, including XML (default), JSON, CSV
lite             [Option] Set flag to 1 to ignore zero in/out records. [default off: 0]
localtime        [Option] Set flag to 1 to take the input starttime, endtime, and the StartTime,
                   EndTime in the report as camera local time.
                   [default off: 0 -> input starttime, endtime and all times in the report are
                   UTC timestamps]
countingeventdb  [Option] Set flag to 1 to use the event triggered time as the aggregation level.
                   [default off: 0]

3.2. Get Counting Result by Report Push


Configure the report push protocols so that you can receive periodic counting reports. The reports
include camera information and aggregated counts by the configured intervals for each counting
rule. Click the Add button to begin.

Status              The status of the last scheduled task:
                    success / failed / [empty]: not yet executed
Name                User defined target name
Protocol            We support three protocols: HTTP, FTP and EMAIL
                    HTTP:  http://IPAddress:PORT + Server URI
                    FTP:   ftp://IPAddress:PORT -> Destination Address
                    Email: ServerIPAddress:PORT
                    SD card: NA
Delivery Schedule   The duration between consecutive pushed aggregated reports; it is also the
                    total duration covered by one report. This camera supports 1 min, 5 mins,
                    15 mins, 30 mins, 1 hr, 12 hrs, 1 day. All schedules start from 00:00.
Aggregation level   The aggregation period for each record in a report. Events in the same
                    aggregation level are accumulated into one record. This camera supports the
                    same options as the Delivery schedule. Note that the aggregation level must
                    be shorter than the delivery schedule.
Lite                In lite mode, zero records are ignored to reduce the size of each report.
                    If Lite mode is No, the report contains zero in/out records even if no count
                    event occurs in that aggregation period.
Format              This camera supports three report formats: XML, CSV and JSON. The detailed
                    content of each format is introduced later.
[Delete]            By clicking the Delete button, the camera removes all data of that target,
                    including the target information, report parameter settings and stored reports.

General
  Local time        Show the StartTime, EndTime in camera local time with ISO8601 format
Email
  Sender email      Valid email address of the sender
  Recipient email   Valid email addresses of recipients (separated with semicolon ;)
  Server address    SMTP server IP address
  Username          Username, if the SMTP server requires authorization
  Password          Corresponding password of Username
  Port              SMTP server port number
  SSL mode          Send the email in SSL mode
FTP
  Server address    FTP server IP address
  Port              FTP server port number
  Username          Username, if the FTP server requires authorization
  Password          Corresponding password of Username
  FTP folder name   Destination folder path
  Filename format*  The report filename can be customized through some variables. The supported
                    variables are listed later.
HTTP
  Server address    HTTP server IP address
  Port              HTTP server port number
  Server uri        HTTP server route URI
  Username          Username, if the HTTP server requires authorization
  Password          Corresponding password of Username
SD card
  Filename format*  The report filename can be customized through some variables. The supported
                    variables are listed later.
  Cyclic Storage    If cyclic storage is enabled, SD memory management is enabled: when memory
                    usage reaches 90% of the total memory size, old contents are deleted to free
                    space for new data. If cyclic storage is not enabled, reports are not recorded
                    when usage is higher than 90% of the total memory size.

* Supported variables for customizing the report filename:

%T  Report timestamp in UTC time
%F  Report format: xml, json or csv
%N  User defined server name
%M  MAC address in serial
%G  Group ID
%D  Device ID
%S  Schedule duration in seconds
%A  Aggregation level in seconds
%L  "LITE" if in lite mode, "" otherwise
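
For example, a hypothetical filename format of %N_%T_%A.%F could produce a name such as Store1_1609459200_3600.csv for a target named Store1 that pushes hourly-aggregated CSV reports (the name and values here are illustrative only).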

Use the Test button to push a test packet. When the test is successfully performed, click the Save
button.
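
As a sketch, the HTTP push target can be any endpoint that accepts the report body. The minimal Node.js receiver below uses a hypothetical port and simply logs each pushed report; the port and route must match the target configuration on the camera.

// Minimal sketch: an HTTP endpoint that receives pushed reports.
// Port and route are placeholders.
const http = require("http");

http.createServer((req, res) => {
  let body = "";
  req.on("data", (chunk) => { body += chunk; });
  req.on("end", () => {
    console.log("report received on", req.url, "bytes:", body.length);
    res.writeHead(200);
    res.end();
  });
}).listen(8080);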

3.3. Format of Counting Result


The stereo camera supports several report formats: JSON, XML, and CSV. The following are some
examples of reports in the different formats.

3.3.1. XML
Here is an XML example showing two rules, each with its own statistics. Note that the camera sends
zero counts if there is no count for that interval.

<Message>
<Source>
<UtcTime>2016-08-01T08:03:56Z</UtcTime>
<GroupID>0</GroupID>
<DeviceID>0</DeviceID>
<ModelName>SC8131</ModelName>
<MacAddress>00:02:D1:39:2D:25</MacAddress>
<IPAddress>172.16.7.138</IPAddress>
<TimeZone>+8</TimeZone>
<DST>0</DST>
</Source>
<Data RuleType="Counting">
<CountingInfo RuleName="Counting1">
<In>0</In>
<Out>0</Out>
<StartTime>2016-07-26T00:00:00+0800</StartTime>
<EndTime>2016-07-26T12:00:00+0800</EndTime>
</CountingInfo>
<CountingInfo RuleName="Counting1">
<In>0</In>
<Out>0</Out>
<StartTime>2016-07-26T12:00:00+0800</StartTime>
<EndTime>2016-07-27T00:00:00+0800</EndTime>
</CountingInfo>
</Data>
<Data RuleType="ZoneDetection">
<ZoneInfo RuleName="Zone1">
<InwardCount>39</InwardCount>
<SumOutwardDuration>299</SumOutwardDuration>
<TotalCount>39</TotalCount>
<AvgDuration>7.67</AvgDuration>
<AvgCount>0.00</AvgCount>
<StartTime>2016-07-26T00:00:00+0800</StartTime>
<EndTime>2016-07-26T12:00:00+0800</EndTime>
</ZoneInfo>
<ZoneInfo RuleName="Zone1">
<InwardCount>37</InwardCount>

<SumOutwardDuration>407</SumOutwardDuration>
<TotalCount>37</TotalCount>
<AvgDuration>11.00</AvgDuration>
<AvgCount>0.01</AvgCount>
<StartTime>2016-07-26T12:00:00+0800</StartTime>
<EndTime>2016-07-27T00:00:00+0800</EndTime>
</ZoneInfo>
</Data>
</Message>

The description of XML (XSD) is as below:


<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified"
attributeFormDefault="unqualified">
<xs:element name="Message">
<xs:complexType>
<xs:sequence>
<xs:element name="Source">
<xs:complexType>
<xs:sequence>
<xs:element name="UtcTime" type="xs:string"></xs:element>
<xs:element name="GroupID" type="xs:string"></xs:element>
<xs:element name="DeviceID" type="xs:string"></xs:element>
<xs:element name="ModelName" type="xs:string"></xs:element>
<xs:element name="MacAddress" type="xs:string"></xs:element>
<xs:element name="IPAddress" type="xs:string"></xs:element>
<xs:element name="TimeZone" type="xs:string"></xs:element>
<xs:element name="DST" type="xs:string"></xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element name="Data" maxOccurs="unbounded">
<xs:complexType>
<xs:sequence>
<xs:element name="CountingInfo" maxOccurs="unbounded">
<xs:complexType>
<xs:sequence>

<xs:element name="In" type="xs:string"></xs:element>
<xs:element name="Out" type="xs:string"></xs:element>
<xs:element name="StartTime"
type="xs:string"></xs:element>
<xs:element name="EndTime"
type="xs:string"></xs:element>
</xs:sequence>
<xs:attribute name="RuleName" type="xs:string"/>
</xs:complexType>
</xs:element>
</xs:sequence>
<xs:attribute name="RuleType" type="xs:string"/>
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:schema>

3.3.2. CSV
The CSV example below shows the same data in CSV format. Note that the camera sends zero counts
even if there is no count for that interval when lite mode is unchecked.

ReportTime,GroupID,DeviceID,ModelName,MacAddress,IPAddress,TimeZone,DST
2016-08-01T08:39:23Z,0,0,SC8131,00:02:D1:39:2D:25,172.16.7.138,+8,0
RuleType,RuleName,In,Out,StartTime,EndTime
Counting,Counting1,0,0,2016-07-26T00:00:00+0800,2016-07-26T12:00:00+0800
Counting,Counting1,0,0,2016-07-26T12:00:00+0800,2016-07-27T00:00:00+0800
RuleType,RuleName,InwardCount,SumOutwardDuration,TotalCount,AvgDuration,AvgCount,StartTime,EndTime
ZoneDetection,Zone1,39,299,39,7.67,0.00,2016-07-26T00:00:00+0800,2016-07-26T12:00:00+0800
ZoneDetection,Zone1,37,407,37,11.00,0.01,2016-07-26T12:00:00+0800,2016-07-27T00:00:00+0800

3.3.3. JSON
The following JSON example shows the same data in JSON format. Zero counting records are still
sent when lite mode is unchecked.

{
"Source" : {
"ReportTime" : "2016-08-01T08:41:25Z",
"GroupID" : "0",
"DeviceID" : "0",
"ModelName" : "SC8131",
"MacAddress" : "00:02:D1:39:2D:25",
"IPAddress" : "172.16.7.138",
"TimeZone" : "+8",
"DST" : "0"
},
"Data" : [{
"RuleType" : "Counting",
"CountingInfo" : [{
"RuleName" : "Counting1",
"In" : 0,
"Out" : 0,
"StartTime" : "2016-07-26T00:00:00+0800",
"EndTime" : "2016-07-26T12:00:00+0800"
}, {
"RuleName" : "Counting1",
"In" : 0,
"Out" : 0,
"StartTime" : "2016-07-26T12:00:00+0800",
"EndTime" : "2016-07-27T00:00:00+0800"
}
]
}, {
"RuleType" : "ZoneDetection",
"ZoneInfo" : [{
"RuleName" : "Zone1",
"InwardCount" : 39,
"SumOutwardDuration" : 299,
"TotalCount" : 39,
"AvgDuration" : 7.67,
"AvgCount" : 0.00,
"StartTime" : "2016-07-26T00:00:00+0800",

"EndTime" : "2016-07-26T12:00:00+0800"
}, {
"RuleName" : "Zone1",
"InwardCount" : 37,
"SumOutwardDuration" : 407,
"TotalCount" : 37,
"AvgDuration" : 11.00,
"AvgCount" : 0.01,
"StartTime" : "2016-07-26T12:00:00+0800",
"EndTime" : "2016-07-27T00:00:00+0800"
}
]
}
]
}

For Counting and Flow Path rules, there are two statistics in the report: In and Out.
Report tag name   Description
In                The number of objects crossing the rule line or detected area toward the
                  direction "In".
Out               The number of objects crossing the rule line or detected area toward the
                  direction "Out".

For the Zone rule type, the statistics in the report are described in the following table.
Report tag name       Description
InwardCount           Number of objects which go inward in the aggregation time
SumOutwardDuration    Sum of dwelling durations of objects which go outward in the aggregation time
TotalCount            Total count of dwelling objects in the aggregation time
AvgDuration           Average duration of dwelling objects in the aggregation time
AvgCount              Average count of dwelling objects in the aggregation time

The following is an example of TotalCount, InwardCount, AvgDuration, and SumOutwardDuration (the
original document illustrates it with a timing diagram of objects dwelling 30 and 60 seconds in the
zone across three one-minute aggregation intervals):

                       1st interval   2nd interval   3rd interval
TotalCount             2              2              3
InwardCount            2              2              1
AvgDuration            30             40             50
SumOutwardDuration     60             0              150

3.4. Get Data by RESTful API


Please refer to Chapter 5 for more details about the RESTful APIs.

3.4.1. Query Counting/Flowpath Database


URI
/VCA/Data/DB/Counting?StartTime=[UTC timestamp]&EndTime=[UTC timestamp]
Description
Specify time interval in UTC timestamp to get counting result from counting/flowpath DB

Default value
N/A
GET
Input data UTC timestamp
Return data Counting/Flowpath result in JSON format
Request example
GET /VCA/Data/DB/Counting?StartTime=1524700800&EndTime=1524729600 HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/plain\r\n
\r\n
{
"CountingInfo": [
{
"RuleName": "Rule@Counting",
"UTC": 1524720420,
"In": 0,
"Out": 0
},
{
"RuleName": "Rule@FlowPathCounting",
"UTC": 1524720420,
"In": 0,
"Out": 0
},
{
"RuleName": "Rule@Counting",
"UTC": 1524720480,
"In": 0,
"Out": 0
},
{
"RuleName": "Rule@FlowPathCounting",
"UTC": 1524720480,
"In": 0,
"Out": 0
},
{
"RuleName": "Rule@Counting",
"UTC": 1524720540,
"In": 0,
"Out": 0
}
]
}
POST
N/A
Parameters Description
CountingInfo Counting rule result
RuleName Rule name
UTC The UTC timestamp of this counting result
In The number of objects detected as “In” by counting rule
Out The number of objects detected as “Out” by counting rule
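
As a sketch (Node.js 18+, placeholder IP and credentials), a client can query the counting database for a time range and total the In/Out values across the returned records:

// Minimal sketch: query one day of counting records and sum In/Out.
// The IP address and credentials are placeholders.
const auth = "Basic " + Buffer.from("root:password").toString("base64");
const url = "http://172.16.56.13/VCA/Data/DB/Counting" +
            "?StartTime=1524700800&EndTime=1524729600";

fetch(url, { headers: { Authorization: auth } })
  .then((res) => res.json())
  .then((body) => {
    let totalIn = 0, totalOut = 0;
    for (const rec of body.CountingInfo) {   // one record per rule per aggregation interval
      totalIn += rec.In;
      totalOut += rec.Out;
    }
    console.log("In:", totalIn, "Out:", totalOut);
  });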

3.4.2. Query Zone Database


URI
/VCA/Data/DB/Zone?StartTime=[UTC timestamp]&EndTime=[UTC timestamp]
Description
Specify time interval in UTC timestamp to get counting result from Zone DB
Default value
N/A
GET
Input data UTC timestamp
Return data Zone detection result in JSON format
Request example
GET /VCA/Data/DB/Zone?StartTime=1524672000&EndTime=1524733200 HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/plain\r\n
\r\n
{
"ZoneInfo": [
{
"RuleName": "Rule-2",
"UTC": 1524730560,
"AvgCount": 1.45,
"AvgDuration": 6.21429,
"InwardCount": 14,
"TotalCount": 14,
"SumOutwardDuration": 17,
"VisitCount": 4
}
]
}
POST
N/A
Parameters Description
ZoneInfo Zone detection result
RuleName Rule name
UTC The UTC timestamp of this zone result
AvgCount Average counts of dwelling objects in aggregation time
AvgDuration Average duration of dwelling objects in aggregation time
InwardCount Numbers of objects which go inward in aggregation time
TotalCount Total counts of dwelling objects in aggregation time
SumOutwardDuration    Sum of dwelling durations of objects which go outward in the aggregation time
VisitCount            Total count of objects passing through the zone

3.4.3. Get Accumulative Counting Result


URI
url:/VCA/Data/AccCounting/[RULENAME]
Description
Get the accumulative counting result of the specified rule; replace [RULENAME] with the name of the
rule.
Default value
N/A
GET
Input data Rule name
Return
Accumulative result in JSON format
data
Request example
GET /VCA/Data/AccCounting/Rule@Counting HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example

HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Data": {
"RuleName": "ZoneRule ",
"fData": 0,
"iData": [
37,
25,
0,
0,
0,
0,
0,
0,
0,
0
],
"strData": ""
},
"Status": 200
}
POST
N/A

Parameters Description
Data Rule data
RuleName Rule name
fData Not used
iData           Integer array; the meaning of the values depends on the rule type.
                Counting type:
                "iData": [Out, In, reserved, reserved, reserved, reserved, reserved, reserved, reserved, reserved]
                Zone type:
                "iData": [Inside, MaxWaitTime, MinWaitTime, AverageWaitTime, reserved, reserved, reserved, reserved, reserved, reserved]

strData         Not used


Status          100: Continue
                This interim response is used to inform the client that the initial part of the
                request has been received and has not yet been rejected by the server. The client
                SHOULD continue by sending the remainder of the request or, if the request has
                already been completed, ignore this response.
                200: OK
                The request has succeeded.
                400: Bad Request
                The request could not be understood by the server due to malformed syntax.
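
As a sketch (Node.js 18+, placeholder IP, credentials and rule name), the accumulated In/Out of a counting rule can be read from the iData array, where index 0 is Out and index 1 is In:

// Minimal sketch: read the accumulated Out/In of a counting rule.
// Camera IP, credentials and rule name are placeholders.
const auth = "Basic " + Buffer.from("root:password").toString("base64");

fetch("http://172.16.56.13/VCA/Data/AccCounting/Rule@Counting",
      { headers: { Authorization: auth } })
  .then((res) => res.json())
  .then((body) => {
    const [out, inCount] = body.Data.iData;  // Counting type: [Out, In, reserved, ...]
    console.log("Accumulated Out:", out, "In:", inCount);
  });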

3.4.4. Get Device Log


URI
url:/VCA/Data/DB/Log?StartTime=[UTC timestamp]&EndTime=[UTC timestamp]
Description
Specify time interval in UTC timestamp to get Key Log from log DB
Default value
N/A
GET
Input data UTC timestamp
Return data Log in JSON format
Request example
GET /VCA/Data/DB/Log?StartTime=1524672000&EndTime=1524733200 HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/plain\r\n
\r\n
{
"LogInfo": [
{
"MessageID": 7,
"Message": "User Added, password was changed",
"RecordTime": 1536300675
},
{
"MessageID": 11,
"Message": "The vca app stopped",
"RecordTime": 1536300996
},
{
"MessageID": 10,
"Message": "The vca app started",
"RecordTime": 1536301159
}
]
}
POST
N/A
Parameters Description
LogInfo Key Log information
MessageID       Message ID:
                0 = Unknown message
                1 = The position of analytic rules was changed
                2 = The installation height was changed
                3 = The object filter was changed
                4 = The report push configuration was changed
                5 = The network setting was changed
                6 = The video recording setting was changed
                7 = User added, password was changed
                9 = The slave is offline after stitching
                10 = The vca app started
                11 = The vca app stopped
                12 = The vca app failed to close last time
Message Message of the Log
RecordTime The UTC timestamp of this log

3.4.5. Get Queue Data
URI
url:/VCA/Data/DB/Queue?StartTime=[UTC timestamp]&EndTime=[UTC timestamp]
Description
Specify time interval in UTC timestamp to get counting result from Queue DB
Default value
N/A
GET
Input data UTC timestamp
Return data Queue detection result in JSON format
Request example
GET /VCA/Data/DB/Queue?StartTime=1524672000&EndTime=1524733200 HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/plain\r\n
\r\n
[
{
"RuleName": "QueueRule",
"Objects": [
{
"Id": 965,
"Service": [
0,
0
],
"Wait": [
1537257025,
1537257034
]
}
]
},
{
"RuleName": " QueueRule ",
"Queue": [
1537257025,
1537257034
]
}
]
POST
N/A
Parameters Description
RuleName Rule Name
Objects Object information
ID Object ID
Service [Start Time Stamp, Wait Time Stamp]
Wait [Start Time Stamp, Wait Time Stamp]
Queue [Start Time Stamp, Wait Time Stamp]

3.4.6. Get Current Queue Event Information


URI
url: /VCA/Data/Queue?RuleName=[RULENAME]
Description
Get current queue event information of the specific rule.
Default value
N/A
GET
Input data Rule name
Return data Queue current event information in JSON format
Request example
GET /VCA/Data/Queue?RuleName=QueueRule HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example

HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{"AvgDuration":0,"MaxDuration":0,"MinDuration":0,"QueueLength":0,"QueueState":"Close","Rul
eName":"Rule-1","ServiceDuration":0,"Status":200}
POST
N/A

Parameters Description
AvgDuration Average waiting duration of objects in queue
MaxDuration Maximum waiting duration of objects in queue
MinDuration Minimum waiting duration of objects in queue
QueueLength The number of objects waiting in queue
QueueState The current queue status, “Open” or “Close”
RuleName The rule name of queue
ServiceDuration Current service duration of object in service zone
Status          200: OK
                The request has succeeded.
                400: Bad Request
                "Error":{"Message":"RuleName must be specified: ?RuleName=XXX"}

3.4.7. Get Passerby Data


URI
url:/VCA/Data/DB/Passerby?StartTime=[UTC timestamp]&EndTime=[UTC timestamp]
Description
Specify time interval in UTC timestamp to get passerby result from Passerby DB
Default value
N/A
GET
Input data UTC timestamp
Return data Passerby detection result in JSON format
Request example
GET /VCA/Data/DB/Passerby?StartTime=1524672000&EndTime=1524733200 HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n

User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/plain\r\n
\r\n
{
" PasserbyInfo": [
{
"RuleName": "Rule-1",
"UTC": 1605077580,
"Count": 5
}
]
}
POST
N/A
Parameters Description
PasserbyInfo Passerby detection result
RuleName Rule name
UTC The UTC timestamp of this passerby result
Count The count number of passerby object

3.5. Get Counting Result by RS485 (only SC8132)


3.5.1. IBIS settings
The SC8132 provides the IBIS protocol (Integrated Board Information System, protocol specification
according to the VDV 300 standard) for communicating with an OBC (on-board computer). You can get
the passenger counting result once the camera is connected to the OBC via RS485.

Set the Camera-ID to identify an individual counter when communicating with the OBC.

The Camera-ID parameter range is 1 ... 63.
According to the VDV 300 recommendations, the default IBIS baud rate is 1200 bps. Baud rates of
2400 bps, 4800 bps, 9600 bps, 19200 bps and 38400 bps are also available options for IBIS.

3.5.2. Support of IBIS command
You can send IBIS commands to the SC8132 via RS485. The supported commands and their corresponding
replies are described in the following tables. We follow the rules of the IBIS protocol, so there
is no reply if a command is not supported by the SC8132.
Command      Description
bF"CamID"    This type of command represents "Movement started".
             CamID is the value of the camera ID set by the user.
bS"CamID"    This type of command represents "Query of IBIS status".
             CamID is the value of the camera ID set by the user.
bE"CamID"    This type of command represents "Query of counting result".
             CamID is the value of the camera ID set by the user.

Reply        Description
bF           This type of reply represents "Acknowledgement".
bS3          This type of reply represents "Status of IBIS".
bB1B2A1A2    This type of reply represents "Counting result of passengers".
             B1, B2: Number of boarding passengers, high-order digit first, 0...255.
             A1, A2: Number of alighting passengers, high-order digit first, 0...255.
             The representation of the characters is shown in the following table.

             Decimal   Hexadecimal   Character
             0         0             0
             1         1             1
             2         2             2
             3         3             3
             4         4             4
             5         5             5
             6         6             6
             7         7             7
             8         8             8
             9         9             9
             10        A             :
             11        B             ;
             12        C             <
             13        D             =
             14        E             >
             15        F             ?
Example: b051?
Number of boarding passengers = 5
Number of alighting passengers = 31
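
As a sketch, the four count characters of the bB1B2A1A2 reply can be decoded by mapping each character back to its 4-bit value ('0'-'9' and ':'-'?' map to 0-15) and combining the high- and low-order digits:

// Minimal sketch: decode an IBIS counting reply such as "b051?".
// Each count character encodes a 4-bit value ('0'..'9' -> 0..9, ':'..'?' -> 10..15).
function decodeIbisCount(reply) {
  const digit = (ch) => ch.charCodeAt(0) - "0".charCodeAt(0);
  const [b1, b2, a1, a2] = reply.slice(1, 5).split("").map(digit);
  return {
    boarding: b1 * 16 + b2,   // high-order digit first
    alighting: a1 * 16 + a2,
  };
}

console.log(decodeIbisCount("b051?"));  // { boarding: 5, alighting: 31 }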

3.5.3. Support the first counting rule


You can set at most 5 rules; however, only the result of the first (counting or flow path) rule can
be retrieved via IBIS.

3.5.4. 4-wire communication


The SC8132 uses 4-wire communication. Follow the pin mapping below when connecting the camera to
the OBC via RS485.
AP: receiver data (RX+)      AN: receiver ground (RX-)
BP: transmitter data (TX+)   BN: transmitter ground (TX-)

4. Get Metadata
4.1. Authentication WebSocket Connection
4.1.1. WebSocket Connection
A WebSocket client can be implemented in different programming languages such as JavaScript, PHP,
NodeJS, C, C++, Java, etc. This document outlines the WebSocket APIs available on the camera.
*Note 1: From FW 0105o, the camera stops the "unAuth websocket" by factory default or after a
camera restore (the ip/VCA/Config/AE/WebSocket/Enable API then returns false).
*Note 2: From FW 0105o, the Auth-WebSocket is supported on the IE and Chrome browsers, while FW
older than 0105o only supports the unAuth websocket.

4.1.2. WebSocket Server Configuration API


http://{IP}/VCA/Config/AE/WebSocket
Parameters list:
{
"AuthWSPort": 80,
"AuthWSSPort": 443,
"ProtocolName": "tracker-protocol"
}

Parameter       Value range         Description
AuthWSPort      [Integer]           WebSocket port. The same as the HTTP port.
AuthWSSPort     [Integer]           WebSocket security port. The same as the HTTPS port.
ProtocolName    tracker-protocol    WebSocket protocol

Example:
Getting configuration by “curl”:
~# curl -i --user root:password "http://172.20.6.2/VCA/Config/AE/WebSocket"

HTTP/1.1 200 OK
Content-type: application/json

{"AuthWSPort":80,"AuthWSSPort":443,"ProtocolName":"tracker-protocol" }

4.1.3. WebSocket Protocol

The WebSocket connection runs over either Hypertext Transfer Protocol (HTTP) or Hypertext Transfer
Protocol Secure (HTTPS):

Hypertext Transfer Protocol (HTTP) -> ws

Hypertext Transfer Protocol Secure (HTTPS) -> wss

Example:
HTTP:
Package website URL: http://172.20.6.2/VCA/www/index.html
WebSocket URL: ws://172.20.6.2/ws/vca?data=event,meta

HTTPS:
Package website URL: https://172.20.6.2/VCA/www/index.html
WebSocket URL: wss://172.20.6.2/ws/vca?data=event,meta

4.1.4. WebSocket URL with Data Filter

The data filter indicates which data types to receive, such as Metadata or Eventdata.

ws://{account}:{password}@{Camera_IP}/ws/vca?data={Filter_1},{Filter_2}

The details are in the following table:

Filter      Description
meta        Metadata provides scene information such as motion, object, face, etc.
            Refer to the metadata document for details.
event       Eventdata provides event information from the configured rules such as line counting,
            flow path counting, zone detection, etc.
            Refer to the eventdata document for details.
stitch      Stitch provides camera stitching information.
status      Status provides camera status information such as zoom position, light state, etc.
param       Param provides camera information such as FOV, image center, and camera height.
heartbeat   WebSocket heartbeat.

Example in JavaScript:
var ws = new WebSocket("ws://root:password@172.20.6.2/ws/vca?data=meta,event", "tracker-protocol");

Client request:
GET /ws/vca?data=meta,event HTTP/1.1
Sec-WebSocket-Version: 13
Sec-WebSocket-Key: +DZ7PjY6sHWLs7pKTYESTQ==
Connection: Upgrade
Upgrade: WebSocket

Sec-WebSocket-Extensions: permessage-deflate; client_max_window_bits
Sec-WebSocket-Protocol: tracker-protocol
Host: 172.20.6.2:80
Authorization: Basic cm9vdDp2aXZvMjMyNA==

Server response:
HTTP/1.1 101 Switching Protocols
Upgrade: WebSocket
Connection: Upgrade
Sec-WebSocket-Accept: 0ApxjPpakfRTEw4MMEGgA8e7Nw4=
Sec-WebSocket-Protocol: tracker-protocol

Once the WebSocket connection is ready, the camera sends metadata and eventdata immediately,
updated every frame, to the socket connection.
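
As a sketch (placeholder IP address and credentials), a browser client can open the authenticated WebSocket and dispatch on the Tag field of each received JSON message (see 4.3 Metadata Format):

// Minimal sketch: receive metadata and event data over the authenticated WebSocket.
// IP address and credentials are placeholders.
var ws = new WebSocket("ws://root:password@172.20.6.2/ws/vca?data=meta,event",
                       "tracker-protocol");

ws.onmessage = function (msg) {
  var data = JSON.parse(msg.data);
  if (data.Tag === "MetaData") {
    // Object tracking information
    console.log("objects:", data.Frame ? data.Frame.Objects.length : 0);
  } else if (data.Tag === "Event") {
    // Real-time counting / zone / queue events
    console.log("event:", JSON.stringify(data.Data));
  }
};

ws.onerror = function (err) { console.log("websocket error", err); };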
4.1.5. WebSocket Connection

A WebSocket client can be implemented in different programming languages such as JavaScript, PHP,
NodeJS, C, C++, Java, etc. This document outlines the WebSocket APIs available on the camera.

4.1.6. WebSocket Server Configuration API


http://{IP}/VCA/Config/AE/WebSocket

Parameters list:
{
"AuthWSPort": 80,
"AuthWSSPort": 443,
"ProtocolName": "tracker-protocol"
}

Parameter       Value range         Description
AuthWSPort      [Integer]           WebSocket port. The same as the HTTP port.
AuthWSSPort     [Integer]           WebSocket security port. The same as the HTTPS port.
ProtocolName    tracker-protocol    WebSocket protocol

Example:
Getting configuration by “curl”:
~# curl -i --user root:password "http://172.20.6.2/VCA/Config/AE/WebSocket"

HTTP/1.1 200 OK
Content-type: application/json

{"AuthWSPort":80,"AuthWSSPort":443,"ProtocolName":"tracker-protocol" }

4.1.7. WebSocket Protocol

The WebSocket connection runs over either Hypertext Transfer Protocol (HTTP) or Hypertext Transfer
Protocol Secure (HTTPS):

Hypertext Transfer Protocol (HTTP) -> ws

Hypertext Transfer Protocol Secure (HTTPS) -> wss

Example:
HTTP:
Package website URL: http://172.20.6.2/VCA/www/index.html
WebSocket URL: ws://172.20.6.2/ws/vca?data=event,meta

HTTPS:
Package website URL: https://172.20.6.2/VCA/www/index.html
WebSocket URL: wss://172.20.6.2/ws/vca?data=event,meta

4.1.8. WebSocket URL with Data Filter

The data filter indicates which data types to receive, such as Metadata or Eventdata.

ws://{account}:{password}@{Camera_IP}/ws/vca?data={Filter_1},{Filter_2}

The details are in the following table:

Filter      Description
meta        Metadata provides scene information such as motion, object, face, etc.
            Refer to the metadata document for details.
event       Eventdata provides event information from the configured rules such as line counting,
            flow path counting, zone detection, etc.
            Refer to the eventdata document for details.
stitch      Stitch provides camera stitching information.
status      Status provides camera status information such as zoom position, light state, etc.
param       Param provides camera information such as FOV, image center, and camera height.
heartbeat   WebSocket heartbeat.

Example in JavaScript:
var ws = new WebSocket("ws://root:password@172.20.6.2/ws/vca?data=meta,event", "tracker-protocol");

Client request:
GET /ws/vca?data=meta,event HTTP/1.1
Sec-WebSocket-Version: 13
Sec-WebSocket-Key: +DZ7PjY6sHWLs7pKTYESTQ==
Connection: Upgrade
Upgrade: WebSocket
Sec-WebSocket-Extensions: permessage-deflate; client_max_window_bits
Sec-WebSocket-Protocol: tracker-protocol
Host: 172.20.6.2:80
Authorization: Basic cm9vdDp2aXZvMjMyNA==

Server response:
HTTP/1.1 101 Switching Protocols
Upgrade: WebSocket
Connection: Upgrade

Sec-WebSocket-Accept: 0ApxjPpakfRTEw4MMEGgA8e7Nw4=
Sec-WebSocket-Protocol: tracker-protocol

Once the WebSocket connection is ready, the camera sends metadata and eventdata immediately,
updated every frame, to the socket connection.

4.2. Get Metadata by RTSP Metadata Stream


Real-time metadata and event messages can be accessed from the RTSP metadata stream. The steps for
connecting to the metadata track are as follows.

4.2.1. Setup RTSP Metadata Stream


Follow standard RTSP to set up the streaming connection. Regarding the metadata track, the DESCRIBE
reply contains a description of the metadata track; the following steps set up the bitstream and
the metadata track.
The Session Description Protocol (SDP) is used to describe streams in RTSP. We use the SDP name to
define the different streams:
Stream 1: live.sdp
Stream 2: live2.sdp
Stream 3: live3.sdp

Here is an RTSP example; client C requests a presentation from media server M (172.16.2.136):
1.

C-> M DESCRIBE example


DESCRIBE rtsp://172.16.2.136:554/live2.sdp RTSP/1.0
CSeq: 1
Accept: application/sdp
User-Agent: RTPExPlayer
Bandwidth: 512000
Accept-Language: en-GB
M->C Response example
RTSP/1.0 200 OK
CSeq: 1
Date: Fri, 25 Mar 2016 15:54:1 GMT
Content-Base: rtsp://172.16.2.136/live2.sdp/
Content-Type: application/sdp
Content-Length: 462
v=0
o=RTSP 1458921241 700 IN IP4 0.0.0.0
s=RTSP server
c=IN IP4 239.128.1.100/15
t=0 0
a=charset:Shift_JIS
a=range:npt=0-
a=control:*
a=etag:1234567890
m=video 5564 RTP/AVP 98
b=AS:0
a=rtpmap:98 H264/90000
a=control:trackID=2
a=x-onvif-track:trackID=2
a=fmtp:98 packetization-mode=1; profile-level-id=4d4033; sprop-parameter-sets=J01AM4uVAKAMcg==,KP4DmIA=
m=application 6564 RTP/AVP 108
a=control:trackID=11
a=rtpmap:108 vnd.vivotek.metj/90000
The server replies with the media information, including the video and metadata track numbers. In
this example, trackID=2 is the H.264 bitstream and trackID=11 is the metadata track.

2.

C-> M SETUP video example


SETUP rtsp://172.16.2.136:554/live2.sdp/trackID=2 RTSP/1.0
CSeq: 2
Transport: RTP/AVP/TCP;unicast;interleaved=0-1
Accept-Language: en-GB
M->C Response example
RTSP/1.0 200 OK
CSeq: 2
Date: Fri, 25 Mar 2016 15:54:1 GMT
Session: 00578707;timeout=70
Transport: RTP/AVP/TCP;interleaved=0-1;unicast;mode=play

3.

C-> M SETUP metadata example


SETUP rtsp://172.16.2.136:554/live2.sdp/trackID=11 RTSP/1.0
CSeq: 3
Session: 00578707
Transport: RTP/AVP/TCP;unicast;interleaved=2-3
Accept-Language: en-GB
M->C Response example
RTSP/1.0 200 OK
CSeq: 3
Date: Fri, 25 Mar 2016 15:54:1 GMT
Session: 00578707;timeout=70
Transport: RTP/AVP/TCP;interleaved=2-3;unicast;mode=play

4.

C-> M PLAY example


PLAY rtsp://172.16.2.136:554/live2.sdp/ RTSP/1.0
CSeq: 4
Session: 00578707
M->C Response example
RTSP/1.0 200 OK
CSeq: 4
Date: Fri, 25 Mar 2016 15:54:1 GMT
Session: 00578707;timeout=70
RTP-Info:
url=rtsp://172.16.2.136:554/live2.sdp/trackID=2;seq=0;rtptime=0;ssrc=578707,url=rtsp://172.16.2.136:554/
live2.sdp/trackID=11;seq=0;rtptime=0;ssrc=578707
Range: npt=0-
RTCP-Interval: 250

Here is the network packet snapshot (shown as a figure in the original document): the metadata
packets are carried with RTP payload type 108 ("Metadata packet: Type-108").
4.3. Metadata Format
The metadata contains two types of information: MetaData and Event.
MetaData is presented in JSON format; it shows object tracking information.
{
"Tag": "MetaData",
"Ver": "1.0.0",
"Stitch": {
"Objects": [{
"Centroid": {
"x": 4468,
"y": 604
},
"GId": 278,
"Height": 1585,
"Id": 278,
"Origin": {
"x": 6800,
"y": 1763
},
"OriginUtcTime": "2018-06-01T10:08:59.654Z",
"RuleAttribute": [{
"ObjDuration": 6,
"RuleName": "Rule-1"
}
]

}
],

"UtcTime": "2018-06-01T10:09:05.253Z"
},
"Project": {
"Cx": 683.0857,
"Cy": 488.8214,
"f": 279.9816,
"a": 8.33756,
"ZoomPos": 0,

"Offsetx": 500.0,
"Offsety": 400.0,
"Roll": 0,
"Tilt": 0,
"W": 320,
"H": 208,
"CamH": 2600
},
"Frame": {
"UtcTime": "2018-04-27T05:20:20.063Z",
"Objects": [
{
"Id": 17,
"GId": 17,
"Height": 1776,
"Origin": {
"x": 176,
"y": 91
},
"Centroid": {
"x": 176,
"y": 154
},
"Classification": [{
"Likelihood": 99,
"MeanLikelihood": 75,
"Type": 1
}
],
"CurrentArea": 30,
"MaxArea": 43,
"OriginUtcTime": "2018-04-27T05:18:34.073Z",
"RuleAttribute": [{
" ObjDuration ": 7,
"RuleName": "Zone1"
},
{

41
" ObjDuration ": 7,
"RuleName": "Zone2"
}
]

},
{
"Id": 18,
"GId": 17,
"Height": 1967,
"Origin": {
"x": -261,
"y": 269
},
"Centroid": {
"x": -381,
"y": -211
},
"OriginUtcTime": "2018-04-27T05:19:04.087Z",
"RuleAttribute": [{
"ObjDuration": 4,
"RuleName": "Zone1"
}
]

}
]

}
}

Tag               Type of metadata.
                  MetaData: object tracking information
                  Event: event information
Ver               Version of the metadata format
Stitch            (Option) Present in stitching mode
  UtcTime         The UTC time of this frame (YYYY-MM-DDTHH:mm:ss.sssZ)
  Objects         The information of the tracked objects
    Id            Object ID
    GId           Group ID
    Height        Object height, unit: mm
    Origin        The original coordinate of the detected object during the tracking (foot position)
    OriginUtcTime UTC time of the original coordinate of the detected object during the tracking
      x           x-coordinate
      y           y-coordinate
    Centroid      The current coordinate of the detected object (foot position)
      x           x-coordinate
      y           y-coordinate
    RuleAttribute Zone information of the object
      ObjDuration Duration of the dwelling object in the zone
      RuleName    Name of the zone in which the object is detected
    Classification  Object class information after enabling Human
      Likelihood    Value in [1,99]; the maximum of the object ClassifyValue
      MeanLikelihood  The average of the object ClassifyValue
      Type          Object type: 0 is unknown, 1 is Human and 5 is Other
    CurrentArea   Current object area after enabling Human
    MaxArea       The maximum object area in its history after enabling Human
Project           Coordinate information; the user can calculate the object tracking window from
                  these parameters.
  Cx              Q matrix parameter. The x-coordinate of the focal center.
  Cy              Q matrix parameter. The y-coordinate of the focal center.
  f               Q matrix parameter. The focal length.
  a               Q matrix parameter. The baseline length.
  Offsetx         The x-coordinate of the zoom-in offset.
  Offsety         The y-coordinate of the zoom-in offset.
  W               The width of the depth map
  H               The height of the depth map
  CamH            The installed height of the camera, unit: mm
  Roll            Unsupported
  Tilt            Unsupported
  ZoomPos         Unsupported
Frame             Descriptions of the objects.
  UtcTime         The UTC time of this frame (YYYY-MM-DDTHH:mm:ss.sssZ)
  Objects         The information of the tracked objects
    Id            Object ID
    GId           Group ID
    Height        Object height, unit: mm
    Origin        The original coordinate of the detected object during the tracking (foot position)
    OriginUtcTime UTC time of the original coordinate of the detected object during the tracking
      x           x-coordinate
      y           y-coordinate
    Centroid      The current coordinate of the detected object (foot position)
      x           x-coordinate
      y           y-coordinate
    RuleAttribute Zone information of the object
      ObjDuration Duration of the dwelling object in the zone
      RuleName    Name of the zone in which the object is detected

The object has a tracking window, an original point (Origin) and a current point (Centroid); the
original document shows them in a figure.

You can use the "Project" parameters to derive the 8 coordinate points of the tracking window. The
order of the 8 points (0~7) is: points 0~3 form the bottom face and points 4~7 form the top face of
the 3D box, as illustrated in the figure of the original document.
Here is a sample JavaScript code to demo how to calculate all the coordinate points:

var BOX3D_HALF_WIDTH = 180;
var m_rectDisparityROI = new jsPoint(0, 0);

// Q Matrix after stereo calibration
var Cx = 683.0857;
var Cy = 488.8214;
var f = 279.9816;
var a = 8.33756;
var b = 0;
var m_matQ = [[1, 0, 0, -Cx], [0, 1, 0, -Cy], [0, 0, 0, f], [0, 0, a, b]];

var offset_3D = [new js3DPoint(-1,-1,0), new js3DPoint(1,-1,0), new js3DPoint(1,1,0), new js3DPoint(-1,1,0),   // bottom
                 new js3DPoint(-1,-1,1), new js3DPoint(1,-1,1), new js3DPoint(1,1,1), new js3DPoint(-1,1,1)];  // top

function jsPoint(x, y)
{
    this.x = x;
    this.y = y;
}

function js3DPoint(x, y, z)
{
    this.x = x;
    this.y = y;
    this.z = z;
}

// Project a 3D world point onto the rectified image plane using the Q matrix.
function ProjectToRectifiedPoint(x, y, z)
{
    var dst = new jsPoint(x * m_matQ[2][3] / z - m_matQ[0][3],
                          y * m_matQ[2][3] / z - m_matQ[1][3]);

    dst.x -= m_rectDisparityROI.x;
    dst.y -= m_rectDisparityROI.y;
    return dst;
}

// Compute the image coordinates of the tracking box vertices.
// eMode == "3D" returns all 8 vertices (bottom 0-3, top 4-7); otherwise only the top 4.
function GetBoundingBox(CamHeight, Height, CenterOfGravity, eMode)
{
    var vertex_num = 8;
    var pointImageCoordinate = new Array();

    for (var iIndex = (eMode == "3D" ? 0 : 4); iIndex < vertex_num; iIndex++)
    {
        var vertex = new jsPoint(CenterOfGravity.x + offset_3D[iIndex].x * BOX3D_HALF_WIDTH,
                                 CenterOfGravity.y + offset_3D[iIndex].y * BOX3D_HALF_WIDTH);
        var pt = ProjectToRectifiedPoint(vertex.x, vertex.y, CamHeight - offset_3D[iIndex].z * Height);
        pointImageCoordinate.push(pt);
    }

    return pointImageCoordinate;
}

function main()
{
    var CamHeight = 2400;
    var Height = 1776;
    var Centroid = new jsPoint(176, 154);
    var Origin = new jsPoint(176, 91);
    var Box3D = GetBoundingBox(CamHeight, Height, Centroid, '3D');
    var OriginPoint = ProjectToRectifiedPoint(Origin.x, Origin.y, CamHeight);
    console.log("3DBox : ");
    console.log(Box3D);
}

console.log('starting DEMO GetBoundingBox3D...');
main();

The coordinates are derived from the depth map (W, H). If you need to draw the tracking window on
an image of a different resolution, you have to transform from (W, H) to (W', H').
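
As a sketch, this transformation is a per-axis scaling from depth-map coordinates to the target image resolution. The example reuses the jsPoint helper from the sample above; the target image size is an arbitrary illustration.

// Minimal sketch: scale a point from depth-map coordinates (W, H)
// to an image of a different resolution (W', H').
function ScaleToImage(pt, W, H, W2, H2)
{
    return new jsPoint(pt.x * W2 / W, pt.y * H2 / H);
}

// Example: map a projected vertex from a 320x208 depth map onto a 1280x832 image.
var ptOnImage = ScaleToImage(new jsPoint(176, 154), 320, 208, 1280, 832);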

The other type of metadata is Event, which is also presented in JSON format; it shows real-time
counting events.
{
"Tag": "Event",
"Ver": "1.0.0",
"Data": [
{
"RuleType": "Counting",
"CountingInfo": [
{
"RuleName": "Counting1",
"In": 0,
"Out": 1,
"Time": "2015-12-15T06:32:20.876Z"
}
]
},
{
"RuleType": "Counting",

"CountingInfo": [
{
"RuleName": "Counting2",
"AccIn": 1,
"AccOut": 4,
"In": 2,
"Out": 3,
"Time": "2015-12-15T06:32:20.876Z"
}
]
}, {
"RuleType" : "ZoneDetection",
"ZoneInfo" : [
{
"AvgDuration" : 9,
"Inside" : 1,
"MaxDuration" : 9,
"MinDuration" : 9,
"RuleName" : "Zone1",
"Time" : "2015-12-15T06:32:20.876Z "
}
]
}, {
"RuleType" : "QueueAnalysis",
"QueueInfo" : [
{
"AvgDuration" : 0,
"MaxDuration" : 0,
"MinDuration" : 0,
"QueueLength" : 0,
"QueueState" : "Open",
"RuleName" : "Rule-4",
"ServiceDuration" : 0,
"Time" : "2020-10-08T05:37:31.863Z"
}
]
}
]

}

Tag               Type of metadata.
                  MetaData: object tracking information
                  Event: event information
Ver               Version of the Event format
Data              Descriptions of the event
RuleType          The rule type: Counting, ZoneDetection, or QueueAnalysis
CountingInfo      The information of a counting event
  RuleName        The rule name
  In              The number of objects crossing the counting line toward the "In" direction
  Out             The number of objects crossing the counting line toward the "Out" direction
  AccIn           The accumulated number of objects crossing the counting line toward the "In" direction
  AccOut          The accumulated number of objects crossing the counting line toward the "Out" direction
  Time            The event time (YYYY-MM-DDTHH:mm:ssZ)
ZoneInfo          The information of a zone event
  AvgDuration     Average waiting duration of objects in the zone
  Inside          Current number of objects in the zone
  MaxDuration     Maximum waiting duration of objects in the zone
  MinDuration     Minimum waiting duration of objects in the zone
  RuleName        The rule name of the zone
  Time            The event time (YYYY-MM-DDTHH:mm:ssZ)
QueueInfo         The information of a queue event
  AvgDuration     Average waiting duration of objects in the queue
  MaxDuration     Maximum waiting duration of objects in the queue
  MinDuration     Minimum waiting duration of objects in the queue
  QueueLength     The number of objects waiting in the queue
  QueueState      The current queue status, "Open" or "Close"
  RuleName        The rule name of the queue
  ServiceDuration Current service duration of the object in the service zone
  Time            The event time (YYYY-MM-DDTHH:mm:ssZ)

5. RESTful APIs
SC8131/SC8132 supports RESTful APIs to get and configure the stereo camera's parameters. The APIs
consist of a URL and standard HTTP methods.
The URL has the following form:
http://IP/VCA/REQUEST
The supported "REQUEST" values are listed in the following sections.

Supported HTTP methods:


GET: Retrieve a representation of the addressed member of the collection or the entire collection,
expressed in JSON.
POST: Set the entire collection, data format is JSON.
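
As a sketch (Node.js 18+, placeholder IP address and credentials), a GET request retrieves a collection as JSON, and a POST request (for requests that accept it) sends the JSON collection in the body with the same Basic authentication:

// Minimal sketch of calling a RESTful API with Basic authentication.
// The IP address and credentials are placeholders.
const base = "http://172.16.56.13";
const auth = "Basic " + Buffer.from("root:password").toString("base64");

// GET: retrieve the package version (see 5.3).
fetch(base + "/VCA/Version", { headers: { Authorization: auth } })
  .then((res) => res.json())
  .then((body) => console.log(body.Data.VCA_Version, "status", body.Status));

// POST (for requests that accept it) sends the JSON collection as the body:
// fetch(base + "/VCA/...", {
//   method: "POST",
//   headers: { Authorization: auth, "Content-Type": "application/json" },
//   body: JSON.stringify(payload)
// });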

5.1. API List


URI
/VCA/APIList
Description
All supported request list
Default value
N/A
GET
Input data None
Return data All supported request list in JSON format
Request example
GET /VCA/APIList HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Data": {
"APIList": [
"/VCA/APIList",
"/VCA/Alarm",
"/VCA/Alarm/Reset",
"/VCA/AutoCalibration/Apply",
"/VCA/AutoCalibration/Configure",
"/VCA/AutoCalibration/Tuning",
"/VCA/Camera/Fov",
"/VCA/Camera/Intrinsics",
"/VCA/Camera/LensInfo",
"/VCA/Camera/Matrix",
"/VCA/Camera/OCenter",
"/VCA/Camera/Profile",
"/VCA/Camera/Profile/CameraHeight",
"/VCA/Camera/Project",
"/VCA/Camera/Status",
"/VCA/Camera/Time",
"/VCA/Capability",
"/VCA/Config/AE",
"/VCA/Config/Alarm",
"/VCA/Config/DI",
"/VCA/Config/RE",
"/VCA/Config/Reload",
"/VCA/Config/ReportPush",
"/VCA/Config/Validation",
"/VCA/DB",
"/VCA/DI/Off",
"/VCA/DI/On",
"/VCA/Data/AccCounting",
"/VCA/Data/DB/Counting",
"/VCA/Data/DB/Heatmap",
"/VCA/Data/DB/Queue",
"/VCA/Data/DB/Zone",
"/VCA/Data/FlowPath/ScaledVector",
"/VCA/Data/StartStopMap",
"/VCA/Debug/Data",
"/VCA/Debug/EmptyObjectMetadata/Off",
"/VCA/Debug/EmptyObjectMetadata/On",
"/VCA/FWConfig/MotionDetection",

"/VCA/Files/Intrinsic/Validity",
"/VCA/Rule",
"/VCA/Rule/FirstCountingRule",
"/VCA/Rule/FirstCountingRule/Reset",
"/VCA/Rule/Reset",
"/VCA/Scene/Depth",
"/VCA/Scene/Disparity",
"/VCA/Scene/Raw",
"/VCA/Stitching/CameraList",
"/VCA/Stitching/CameraList/Reorder",
"/VCA/Stitching/Configure",
"/VCA/Stitching/ManualTraining",
"/VCA/Stitching/Matching",
"/VCA/Stitching/Pause",
"/VCA/Stitching/State",
"/VCA/Stitching/StitchInfo",
"/VCA/Stitching/StitchInfo/TranslationToMaster",
"/VCA/Stitching/StitchedMap",
"/VCA/Stitching/Training",
"/VCA/Stitching/Training/Status",
"/VCA/Stitching/Training/Stop",
"/VCA/Tracking/Off",
"/VCA/Tracking/On",
"/VCA/Version",
"/VCA/WebSocket/Status/HeartBeat/IntervalSec"
]
},
"Status": 200
}\n
POST
N/A

5.2. App Feature


URI
/VCA/Capability
Description
All supported features
Default value
N/A
GET
Input data None
Return data All supported features list in JSON format
Request example
GET /VCA/Capability HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Data": {
"Feature": [
"Heatmap", "CrowdDetect", "QueueDetect", "FaceDetect", "Stitching",
"Doorbell", "CarCountingOnly", "AgeGender", "Walmart", "BankTailgate",
"RecordMetadata", "SkinCountMax", "MetaValidation", "SmartMotion", "WithLicense",
"3DWorldRule", "UIRedLine", "SkinSwarco",
"3D-counting", "2D-counting"
],
"ModelName": "SC8131"
},
"Status": 200
}\n
POST
N/A

5.3. Package Version


URI
/VCA/Version
Description
Package version
Default value
N/A
GET
Input data None
Return data Package version in JSON format
Request example
GET /VCA/Version HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Data": {
"Git_Version": "94846e4 (ivsd-ci@)",
"IsEnableStitching": false,
"VCA_Version": "6.0.20"
},
"Status": 200
}\n
POST
N/A

5.4. Configuration
5.4.1. Analytics Engine
URI
/VCA/Config/AE
Description
Analytics engine settings
Default value
N/A
GET
Input data None
Return data Analytics engine settings in JSON format
Request example
GET /VCA/Config/AE HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"AggressiveHeightFilter": true,
"AutoGSensorMode": false,
"AutoHeight": false,
"AutoRollAngle": 0,
"AutoTiltAngle": 0,
"CamHeight": 2400,
"Confidence": 3,
"HasGSensor": false,
"MaxDisparity": 32,
"MaxObjectDistance": 5000,
"MaxObjectHeight": 1900,
"MaxShoppingUnitDistance": 0,
"MinDisparity": 0,
"MinObjectHeight": 800,
"Motion": {
"MainConfig": {
"EnableObjectFilter": false,
"Sensitivity": 70
},
"Schedule": {
"AutoMode": true,
"Begin": "18:00",
"Enable": false,
"End": "06:00"
},
"ScheduleConfig": {
"EnableObjectFilter": false,
"Sensitivity": 70
}
},
"RollAngle": 0,
"Sensitivity": 10,
"TamperingSensitivity": 5,
"TiltAngle": 0,
"UpdateSecond": 600,
"WebSocket": {
"Enable": true,
"Port": 888,
"ProtocolName": "tracker-protocol",
"WSSPort": 889
},
"ZoomInFactor": 1
}\n
POST
N/A
Parameters Description
AggressiveHeightFilter Unsupported
AutoGSensorMode Unsupported
AutoHeight Auto-detect the camera installation height, in Boolean format.
true: enable auto height detection
false: disable auto height detection
AutoRollAngle Unsupported
AutoTiltAngle Unsupported
CamHeight Set the camera installation height manually. The value is in units of mm.
Valid range: 1600~5500
Confidence Adjusts the confidence level used to filter out controversial depth points, i.e. points for which depth information could not be computed. In situations such as an overexposed view or a smooth background with no objects or patterns for computing depth, the confidence level may need to be adjusted to keep tracking performance. With a lower level, the depth image is smoother and more noise and unknown points are filtered out.
Valid range: 0~30
HasGSensor Unsupported
MaxDisparity Maximum value of disparity.
Valid range: 12~48
MaxObjectDistance Unsupported
MaxObjectHeight Maximum height of objects that will be considered for tracking. The value is in units of mm.
Valid range: 500~2500
MaxShoppingUnitDistance Maximum distance of group counting, in units of mm; 0 disables group counting (see the rule engine parameter of the same name).
MinDisparity Minimum value of disparity.
Valid range: 0~48
MinObjectHeight Minimum height of objects that will be considered for tracking. The value is in units of mm.
Valid range: 500~2500
Motion Unsupported
MainConfig Unsupported
EnableObjectFilter Unsupported
Sensitivity Unsupported
Schedule Unsupported
AutoMode Unsupported
Begin Unsupported
Enable Unsupported
End Unsupported
ScheduleConfig Unsupported
EnableObjectFilter Unsupported
Sensitivity Unsupported
RollAngle Unsupported
Sensitivity Sensitivity of human detection; the higher the value, the more likely an object will be determined as human.
Valid range: 1~10
TamperingSensitivity Unsupported
TiltAngle Camera installed tilt angle
UpdateSecond Not used
WebSocket WebSocket parameters
Enable WebSocket transmission.
true: enable
false: disable
Port WebSocket port number.
Default: 888
ProtocolName WebSocket protocol name
WSSPort Secure WebSocket port number.
Default: 889
ZoomInFactor Digital zoom-in of the field of view. The appropriate installation height is 2.4~3.6 m; if the camera is installed higher than 3.6 m, adjust this parameter to get more accurate tracking and counting, at the cost of a reduced field of view.
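
Client example (Python)
A short sketch (assuming the "requests" library and the example host/credentials above) that reads the analytics engine settings and prints the values an integrator typically verifies after installation.

import requests

ae = requests.get("http://172.16.56.13/VCA/Config/AE",
                  auth=("root", "root"), timeout=5).json()

print("Installed height (mm):", ae["CamHeight"], "(auto height:", ae["AutoHeight"], ")")
print("Object height filter (mm):", ae["MinObjectHeight"], "to", ae["MaxObjectHeight"])
ws = ae["WebSocket"]
print("Metadata WebSocket:", "enabled" if ws["Enable"] else "disabled",
      "on port", ws["Port"], "(wss port:", ws["WSSPort"], ")")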

5.4.2. Rule Engine


URI
/VCA/Config/RE
Description
Rule engine settings
Default value
N/A
GET
Input data None
Return data Rule settings in JSON format
Request example
GET /VCA/Config/RE HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Counting": {
"Counting1": {
"CountMode": "AfterExit",
"Direction": "Any",
"Line": [
[
{
"x": 3941,
"y": 2809
},
{
"x": 5099,
"y": 5543
},
{
"x": 4825,
"y": 8716
}
]
],
"Line3D": [
[
{
"x": -86,
"y": -800
},
{
"x": -447,
"y": -101
},
{
"x": -86,
"y": 597
}
]
],
"MaxHeightFilter": 2100,

"MaxShoppingUnitDistance": 0,
"MinHeightFilter": 1200,
"ReportTimeInterval": 1,
"ResetTimeInterval": 1,
"DBReportTimeInterval": 60,
"EnableClassify": 0,
"Target": 0,
"IsStitchType": false
}
},
"ExclusiveArea": {
"Field": [[{
"x": 1169,
"y": 1459
},
{
"x": 1006,
"y": 5277
},
{
"x": 3940,
"y": 4409
},
{
"x": 3489,
"y": 429
}],
[{
"x": 5796,
"y": 6328
},
{
"x": 5871,
"y": 9075
},
{
"x": 9068,

"y": 8267
},
{
"x": 8341,
"y": 5035
}]]
},
"DoorArea": {
"Field": [[{
"x": 0,
"y": 9000
},
{
"x": 0,
"y": 10000
},
{
"x": 10000,
"y": 10000
},
{
"x": 10000,
"y": 9000
},
{
"x": 6500,
"y": 3000
},
{
"x": 3500,
"y": 3000
}]]
},
"FlowPathCounting": {
"FlowPathCounting1": {
"CountMode": "AfterExit",
"Direction": "Any",

"Line": [
[
{
"x": 3333,
"y": 6391
},
{
"x": 3333,
"y": 3333
}
],
[
{
"x": 6666,
"y": 6391
},
{
"x": 6666,
"y": 3333
}
],
[
{
"x": 6111,
"y": 6391
},
{
"x": 6111,
"y": 3333
}
],
[
{
"x": 5555,
"y": 6391
},
{

"x": 5555,
"y": 3333
}
],
[
{
"x": 5000,
"y": 6391
},
{
"x": 5000,
"y": 3333
}
],
[
{
"x": 4444,
"y": 6391
},
{
"x": 4444,
"y": 3333
}
],
[
{
"x": 3888,
"y": 6391
},
{
"x": 3888,
"y": 3333
}
]
],
"Line3D": [
[

{
"x": -1216,
"y": 482
},
{
"x": -1216,
"y": -800
}
],
[
{
"x": 1043,
"y": 482
},
{
"x": 1043,
"y": -800
}
],
[
{
"x": 666,
"y": 482
},
{
"x": 666,
"y": -800
}
],
[
{
"x": 290,
"y": 482
},
{
"x": 290,
"y": -800

}
],
[
{
"x": -86,
"y": 482
},
{
"x": -86,
"y": -800
}
],
[
{
"x": -463,
"y": 482
},
{
"x": -463,
"y": -800
}
],
[
{
"x": -840,
"y": 482
},
{
"x": -840,
"y": -800
}
]
],
"MaxHeightFilter": 2100,
"MaxShoppingUnitDistance": 0,
"MinHeightFilter": 1200,
"ReportTimeInterval": 1,

"ResetTimeInterval": 1,
"Sensitivity": 50,
"DBReportTimeInterval": 60,
"EnableClassify": 0,
"Target": 0,
"EnableDI": 0,
"DIDelaySec": 0,
"IsStitchType": false
}
},
"ZoneDetection": {
"Zone1": {
"EnterDelay": 5,
"Field": [[{
"x": 2086,
"y": 2558
},
{
"x": 2248,
"y": 8024
},
{
"x": 8164,
"y": 8443
},
{
"x": 8139,
"y": 2139
}]],
"LeaveDelay": 5,
"MaxHeightFilter": 2100,
"MaxShoppingUnitDistance": 0,
"MinHeightFilter": 1200,
"ReportTimeInterval": 1,
"ResetTimeInterval": 1,
"DBReportTimeInterval": 60,
"EnableClassify": 0,

"Target": 0,
"IsStitchType": false
}
},
" QueueAnalysis" : {
"Queue1" : {
"Line" : [[{
"x" : 7000,
"y" : 7353
}, {
"x" : 5000,
"y" : 7353
}
], [{
"x" : 7000,
"y" : 4411
}, {
"x" : 5000,
"y" : 4411
}
], [{
"x" : 0,
"y" : 0
}, {
"x" : 0,
"y" : 0
}
], [{
"x" : 0,
"y" : 0
}, {
"x" : 0,
"y" : 0
}
], [{
"x" : 0,
"y" : 0

}, {
"x" : 0,
"y" : 0
}
]
],
"Field" : [[{
"x" : 4000,
"y" : 7353
}, {
"x" : 4900,
"y" : 7353
}, {
"x" : 4900,
"y" : 4411
}, {
"x" : 4000,
"y" : 4411
}
]],
"Distance" : 1300,
"OpenDelay" : 3,
"CloseDelay" : 3,
"EnterDelay" : 3,
"LeaveDelay" : 3,
"MaxHeightFilter" : 2100,
"MaxShoppingUnitDistance" : 0,
"MinHeightFilter" : 900,
"EnableClassify": 0,
"DBReportTimeInterval" : 0,
"ReportTimeInterval" : 0,
"ResetTimeInterval" : 0,
"Target": 0,
"IsStitchType": false
}
}
}\n

POST
N/A
Parameters Description
Counting Line counting type
FlowPathCounting Flow path counting type
CountMode Rules for validating the behavior of objects crossing the rule line. Three counting modes are available, in string format:
- AfterExit
- FirstPass
- EveryPass
Direction Configures the valid direction of objects to be counted. Three options are available, in string format:
- In
- Out
- Any
Line Array of lines used for detecting the passing of an object. The x, y values are defined as a ratio of the image size; the range is 0~10000.
x x-coordinate
y y-coordinate
Line3D World coordinates
x x-coordinate (mm)
y y-coordinate (mm)
MaxHeightFilter Maximum height filter value in units of mm.
Valid range: 1100~2500
MaxShoppingUnitDistance Maximum distance of group counting. Currently SC8131 supports only 0 and 900. 0 means each object is counted individually, i.e. group counting is disabled. 900 means that if the objects are close enough for the rule algorithm, a group of objects is counted as one event.
MinHeightFilter Minimum height filter value in units of mm.
Valid range: 800~2200
ReportTimeInterval Time interval for reporting count information, in units of seconds.
ResetTimeInterval Periodic count reset time, in units of seconds.
DBReportTimeInterval Period for writing counting data to the database, in units of seconds.
EnableClassify Enable classifier.
0: Disable
1: Enable
Target Classifier target.
0: Unknown
1: People
IsStitchType Whether the rule is in stitching mode
ExclusiveArea Excludes certain areas of the field of view from tracking detection.
Field A polygon set of the detection area. The allowed number of polygon points is 3~20. The x, y values are defined as a ratio of the image size; the range is 0~10000.
x x-coordinate
y y-coordinate
DoorArea Polygon area marking the door location in the field of view.
Field A polygon set of the detection area. The allowed number of polygon points is 3~20. The x, y values are defined as a ratio of the image size; the range is 0~10000.
x x-coordinate
y y-coordinate
ZoneDetection Zone counting type data
EnterDelay Detection starts after the EnterDelay time once an object enters the zone area.
Field A polygon set of the detection area. The allowed number of polygon points is 3~20. The x, y values are defined as a ratio of the image size; the range is 0~10000.
x x-coordinate
y y-coordinate
Field3D World coordinates
x x-coordinate (mm)
y y-coordinate (mm)
LeaveDelay Detection stops after the LeaveDelay time once an object leaves the zone area.
QueueAnalysis Queue analysis type data
Line Array of lines used for detecting the passing of an object. The x, y values are defined as a ratio of the image size; the range is 0~10000.
x x-coordinate
y y-coordinate
Field A polygon set of the detection area. The allowed number of polygon points is 3~20. The x, y values are defined as a ratio of the image size; the range is 0~10000.
x x-coordinate
y y-coordinate
Distance Distance between people to be counted as one queue
OpenDelay Start analysis after this many seconds
CloseDelay Stop analysis after this many seconds
EnterDelay Add 1 person after the person has been inside the queue for this many seconds
LeaveDelay Subtract 1 person after the person has left the queue for this many seconds
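
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that reads the rule engine settings and converts the counting line points from the 0~10000 ratio coordinates documented above into pixel positions; the 1280x960 frame size is only an illustrative assumption.

import requests

re_cfg = requests.get("http://172.16.56.13/VCA/Config/RE",
                      auth=("root", "root"), timeout=5).json()

FRAME_W, FRAME_H = 1280, 960  # assumed display resolution for the overlay

for rule_name, rule in re_cfg.get("Counting", {}).items():
    for segment in rule["Line"]:
        # each point is a ratio of the image size in the range 0~10000
        pixels = [(p["x"] * FRAME_W // 10000, p["y"] * FRAME_H // 10000)
                  for p in segment]
        print(rule_name, "direction", rule["Direction"], "line (px):", pixels)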

5.4.3. Alarm
URI
/VCA/Config/Alarm
Description
Analytics event rules settings
Default value
N/A
GET
Input data None
Return data Analytics event rules settings in JSON format
Request example
GET /VCA/Config/Alarm HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n

User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Counting": {
"0": {
"AlarmRuleName": "Counting1",
"Condition": "GTE",
"Source": "In",
"ThreshHold": 10,
"TopicName": "tns1:RuleEngine/TrackerAlarm/test2",
"RuleName": "test1"
}
},
"ZoneDetection": {
"0": {
"AlarmRuleName": "Zone1",
"Condition": "LTE",
"Source": "Inside",
"ThreshHold": 2,
"TopicName": "tns1:RuleEngine/TrackerAlarm/test",
"RuleName": "test3"
}
}
}\n
POST
N/A
Parameters Description
Counting Counting event rule
ZoneDetection Zone detection event rule
AlarmRuleName The name of the binding analytics rule target
Condition The condition for the specified source to trigger this event.
GTE: greater than or equal to
LTE: less than or equal to
Source The trigger source of the analytics rule.
For Counting line and Flow path rules, three sources are available:
- In: the accumulated number of people crossing in the "In" direction.
- Out: the accumulated number of people crossing in the "Out" direction.
- Remaining: the same as "In - Out", the difference between the accumulated numbers of the two directions.
For ZoneDetection, four sources are available:
- Inside: the current total number of objects in the zone area.
- Maximum waiting duration: the maximum waiting duration among objects currently in the zone area.
- Minimum waiting duration: the minimum waiting duration among objects currently in the zone area.
- Average waiting duration: the average waiting duration of all objects currently in the zone area.
ThreshHold The source threshold to trigger the event
TopicName Internal use
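
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that lists the configured alarm rules and prints each trigger condition in a readable form.

import requests

alarms = requests.get("http://172.16.56.13/VCA/Config/Alarm",
                      auth=("root", "root"), timeout=5).json()

for rule_type, rules in alarms.items():            # "Counting", "ZoneDetection", ...
    for index, rule in rules.items():
        operator = ">=" if rule["Condition"] == "GTE" else "<="
        print(f'{rule_type} alarm on {rule["AlarmRuleName"]}: '
              f'{rule["Source"]} {operator} {rule["ThreshHold"]}')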

5.4.4. Report push


URI
/VCA/Config/ReportPush
Description
Settings of report push
Default value
N/A
GET
Input data None
Return data Settings of report push in JSON format
Request example
GET /VCA/Config/ReportPush HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"1": {
"name": "Reportpush1",
"aggregation": 900,
"format": "Json",
"schedule": 3600,
"lite": 0,
"localtime": 1,
"servertypeselector": "email",
"recipient": "[email protected]",
"sender": "[email protected]",
"sslmode": 0,
"url": "mail.vivotek.com",
"usr": "ben.wu",
"pwd": "1234",
"port": 888,
"sdcyclic": 0,
"fileformat": "",
"uri": "",
"status": "success"
},
"2": {
"name": "FTP123",
"aggregation": 900,
"format": "XML",
"schedule": 1800,
"lite": 1,
"localtime": 0,
"servertypeselector": "ftp",

"recipient": "",
"sender": "",
"sslmode": 0,
"url": "192.168.1.155",
"usr": "ben.wu",
"pwd": "1234",
"port": 21,
"sdcyclic": 0,
"fileformat": "ftpreport_%T.%F",
"uri": "ftp path",
"status": "fail"
}
}\n
POST
N/A
Parameters Description
name User defined target name
aggregation The aggregation period for each data entry in reports. Events within the same aggregation period are accumulated into one data entry. The camera supports the same options as the delivery schedule; note that the aggregation level must be shorter than the delivery schedule. The unit is seconds.
format The camera supports three report formats: XML, CSV and JSON. The detailed content of each format is introduced later.
schedule The duration between pushed aggregated reports; it is also the total duration covered by one report. The camera supports 1 min, 5 mins, 15 mins, 30 mins, 1 hr, 12 hrs and 1 day (the JSON value is in seconds). All schedules start from 00:00.
lite In lite mode, zero data is omitted to reduce the size of each report. If lite mode is disabled, the report contains zero in/out records even if no count event occurs in that aggregation period.
0: disable
1: enable
localtime Show StartTime and EndTime in camera local time, in ISO 8601 format.
0: disable
1: enable
servertypeselector Supported server types:
"http"
"https"
"ftp"
"email"
"sdcard"
recipient Valid email addresses of recipients (separated with semicolons ";")
sender Valid email address of the sender
sslmode HTTP/email secure mode.
0: disable secure mode
1: enable secure mode
url SMTP/FTP/HTTP/HTTPS server IP address
usr Username if the server requires authorization
pwd Corresponding password of the username
port Server port number
sdcyclic If cyclic storage is enabled, SD memory management is enabled: once memory usage reaches 90% of the total memory size, old contents are deleted to free space for new data. If cyclic storage is not enabled, reports are no longer recorded once usage is higher than 90% of the total memory size.
fileformat Customized report filename.
Default: report_%T.%F
%T: Report timestamp in UTC time
%F: Report format in xml, json or csv
%N: User defined server name
%M: MAC address in serial
%G: Group ID
%D: Device ID
%S: Schedule duration in seconds
%A: Aggregation level in seconds
%L: "LITE" if in lite mode, "" otherwise
uri HTTP server route URI or FTP path
status Send result:
"success"
"fail"
"unset"
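
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that lists the configured report push targets and their last delivery status, a convenient health check after setting up report pushing.

import requests

targets = requests.get("http://172.16.56.13/VCA/Config/ReportPush",
                       auth=("root", "root"), timeout=5).json()

for index, target in targets.items():
    print(f'{target["name"]} ({target["servertypeselector"]}, {target["format"]}): '
          f'push every {target["schedule"]} s, aggregation {target["aggregation"]} s, '
          f'last result: {target["status"]}')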

5.5. Data
5.5.1. Start/Stop Map
URI
/VCA/Data/StartStopMap
Description
Get coordinates of all start-stop pairs
Default value
N/A
GET
Input data None
Return data Start/Stop map in JSON format
Request example
GET /VCA/Data/StartStopMap HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Data": [
{
"ex": 1023,
"ey": -558,
"sx": 1560,
"sy": 350
},
{
"ex": 1297,
"ey": 695,
"sx": 540,
"sy": -550
}
],
"Status": 200
}\n
POST
N/A
Parameters Description
Data Start-Stop map, object array.
sx The x-coordinate of the Start point.
sy The y-coordinate of the Start point.
ex The x-coordinate of the Stop point.
ey The y-coordinate of the Stop point.
Status 100: Continue
This interim response is used to inform the client that the initial part of the request has been received and has not yet been rejected by the server. The client SHOULD continue by sending the remainder of the request or, if the request has already been completed, ignore this response.
200: OK
The request has succeeded.
400: Bad Request
The request could not be understood by the server due to malformed syntax.
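
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that fetches the start-stop map and prints the displacement of each pair; the distance is expressed in the same unit as the map coordinates.

import math
import requests

resp = requests.get("http://172.16.56.13/VCA/Data/StartStopMap",
                    auth=("root", "root"), timeout=5).json()

for pair in resp["Data"]:
    displacement = math.hypot(pair["ex"] - pair["sx"], pair["ey"] - pair["sy"])
    print(f'start ({pair["sx"]}, {pair["sy"]}) -> stop ({pair["ex"]}, {pair["ey"]}): '
          f'displacement {displacement:.0f}')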

5.6. Camera information


5.6.1. Project
URI
/VCA/Camera/Project
Description
Parameters of projection
Default value
N/A
GET
Input data None
Return data Parameters of projection in JSON format
Request example
GET /VCA/Camera/Project HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Baseline": 0.016719916835427284,
"CamHeight": 2400,
"Cx": 141.23857116699219,
"Cy": 116.08460235595703,
"FocalLength": 116.56810760498047,
"ORGHeight": 960,
"ORGWidth": 1280,
"OffsetX": 0,
"OffsetY": 0,
"ROIHeight": 541,
"ROIWidth": 859,
"ResolutionH": 208,
"ResolutionW": 320,
"TiltAngle": 0,
"ZoomInFactor": 1,
"ZoomInOffsetX": -1,
"ZoomInOffsetY": -1
}\n
POST
N/A
Parameters Description
Baseline 1/(camera baseline)
CamHeight The installed height of the camera. unit: mm
Cx The x-coordinate of the focal center.
Cy The y-coordinate of the focal center.
FocalLength The focal length.
ORGHeight The height of the original image resolution.
ORGWidth The width of the original image resolution.
OffsetX 0
OffsetY 0
ROIHeight The height of the original single-view resolution.
ROIWidth The width of the original single-view resolution.
ResolutionH The height of the depth image resolution.
ResolutionW The width of the depth image resolution.
TiltAngle Camera installed tilt angle
ZoomInFactor Digital zoom-in of the field of view. The appropriate installation height is 2.4~3.6 m; if the camera is installed higher than 3.6 m, adjust this parameter to get more accurate tracking and counting, at the cost of a reduced field of view.
ZoomInOffsetX The x-coordinate of the zoom-in offset.
ZoomInOffsetY The y-coordinate of the zoom-in offset.
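
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that reads the projection parameters and prints the values most relevant when interpreting depth and metadata coordinates.

import requests

proj = requests.get("http://172.16.56.13/VCA/Camera/Project",
                    auth=("root", "root"), timeout=5).json()

print("Camera height (mm):", proj["CamHeight"])
print("Focal length:", proj["FocalLength"], "center:", (proj["Cx"], proj["Cy"]))
print("Depth map resolution:", proj["ResolutionW"], "x", proj["ResolutionH"])
print("Zoom-in factor:", proj["ZoomInFactor"])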

5.6.2. Time
URI
/VCA/Camera/Time
Description
Current Unix timestamp with format [Second].[Millisecond]
Default value
N/A
GET
Input data None
Return data Current Unix timestamp
Request example
GET /VCA/Camera/Time HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/html\r\n
\r\n
1525052709.715\n
POST
N/A

Parameters Description
N/A
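
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that reads the camera timestamp and compares it with the host clock, which helps detect clock drift before correlating counting data with other systems.

import time
import requests

response = requests.get("http://172.16.56.13/VCA/Camera/Time",
                        auth=("root", "root"), timeout=5)
camera_ts = float(response.text.strip())   # "[Second].[Millisecond]"
drift = camera_ts - time.time()
print(f"Camera clock differs from this host by {drift:+.3f} s")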

5.6.3. Status
URI
/VCA/Camera/Status
Description
Camera current status
Default value
N/A
GET
Input data None
Return data Camera current status in JSON format
Request example
GET /VCA/Camera/Status HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"DIState": "Off",
"TamperingState": "Normal",
"UtcTime": "2017-07-25T10:06:25.727Z"
}\n
POST
N/A
Parameters Description
DIState Digital input status
TamperingState Tampering status
UtcTime Current time
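
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that polls the camera status and reports tampering, a simple building block for a monitoring script.

import requests

status = requests.get("http://172.16.56.13/VCA/Camera/Status",
                      auth=("root", "root"), timeout=5).json()

if status["TamperingState"] != "Normal":
    print("Tampering detected at", status["UtcTime"])
else:
    print("Camera OK at", status["UtcTime"], "- DI state:", status["DIState"])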

5.7. Stitching
5.7.1. Get Stitching information
URI
/VCA/Stitching/StitchInfo
Description
Get stitching information for each camera pair, including X/Y translation and rotation
Default value
N/A
GET
Input data None
Return data Stitching information and status in JSON format
Request example
GET /VCA/Stitching/StitchInfo HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200,
"Transformation": [
{
"Camera1": 0,
"Camera1Roi": {
"Height": 480,
"Width": 480
},
"Camera2": 1,
"Camera2Roi": {
"Height": 480,
"Width": 480
},
"Confidence": 10,
"IntTranslationX": 53,
"IntTranslationY": -6991,
"PairStatus": "done",
"Rotation": 1.517895,
"RotationDegree": 273.221008,
"Scale": 1,
"TranslationX": 0.006261,
"TranslationY": -0.827888
}
]
}
POST
{
"Transformation": [
{
"Camera1": 0,
"Camera2": 1,
Input data "IntTranslationX": -1988,
"IntTranslationY": 5155,
"RotationDegree": 177.25
}
]
}
{
Return data "Status": 200
}
Parameters Description
Note: the POST command only works when stitching is done.
Camera1 Camera index of the current transform master; please refer to the GET command results.
Camera2 Camera index of the current transform slave.
IntTranslationX The distance of the slave camera along the X axis of the master camera coordinate system, in millimeters.
IntTranslationY The distance of the slave camera along the Y axis of the master camera coordinate system, in millimeters.
RotationDegree The rotation, in degrees, applied to the slave camera coordinate system to match the master camera coordinate system.
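
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that manually modifies the stitching transform by posting the same payload as the POST input example above; replace the translation and rotation values with those measured for your own camera pair.

import requests

payload = {
    "Transformation": [
        {
            "Camera1": 0,              # master camera index
            "Camera2": 1,              # slave camera index
            "IntTranslationX": -1988,  # mm, along the master X axis
            "IntTranslationY": 5155,   # mm, along the master Y axis
            "RotationDegree": 177.25,
        }
    ]
}

response = requests.post("http://172.16.56.13/VCA/Stitching/StitchInfo",
                         json=payload, auth=("root", "root"), timeout=5)
print(response.json())   # {"Status": 200} on success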

5.8. Tracking
5.8.1. Tracking ON
URI
/VCA/Tracking/On
Description
Enable object tracking
Default value
N/A
GET
Input data None
Return data Status
Request example
GET /VCA/Tracking/On HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200
}\n
POST
N/A
Parameters Description
Status 100: Continue
This interim response is used to inform the client that the initial part of the request has been received and has not yet been rejected by the server. The client SHOULD continue by sending the remainder of the request or, if the request has already been completed, ignore this response.

200: OK
The request has succeeded.

400: Bad Request
The request could not be understood by the server due to malformed syntax.

5.8.2. Tracking OFF
URI
/VCA/Tracking/Off
Description
Disable object tracking
Default value
N/A
GET
Input data None
Return data Status
Request example
GET /VCA/Tracking/Off HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200
}\n
POST
N/A
Parameters Description
Status 100: Continue
This interim response is used to inform the client that the initial part of the
request has been received and has not yet been rejected by the server. The
client SHOULD continue by sending the remainder of the request or, if the
request has already been completed, ignore this response.

200: OK
The request has succeeded.

400: Bad Request
The request could not be understood by the server due to malformed syntax.
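
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that wraps the two tracking endpoints in a single helper so object tracking can be toggled programmatically.

import requests

def set_tracking(enabled):
    # GET /VCA/Tracking/On or /VCA/Tracking/Off and report whether it succeeded
    path = "/VCA/Tracking/On" if enabled else "/VCA/Tracking/Off"
    response = requests.get("http://172.16.56.13" + path,
                            auth=("root", "root"), timeout=5)
    return response.json().get("Status") == 200

print("Tracking enabled:", set_tracking(True))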

5.9. Integration Test


5.9.1. Rule Trigger Test
URI
/VCA/Rule/Test
Description
Forces all user-configured rules to be triggered and their corresponding events to be sent
Default value
N/A
GET
Input data None
Return data Status
Request example
GET /VCA/Rule/Test HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200
}\n
POST
N/A
Parameters Description
Status 200: OK
The request has succeeded.

400: Bad Request


The request could not be understood by the server due to malformed syntax.

5.9.2. Counting Rule Trigger Test
URI
/VCA/Rule/Test?Counting&In=[In count]&Out=[Out count]
Description
Forces all user-configured counting rules to be triggered with the specified counts, and the corresponding event data to be sent
Default value
N/A
GET
Input data In and Out counts, passed as query parameters
Return data Status
Request example
GET /VCA/Rule/Test?Counting&In=10&Out=5 HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200
}\n
POST
N/A
Parameters Description
Status 200: OK
The request has succeeded.

400: Bad Request


The query format is not correct.
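
Client example (Python)
A sketch (assuming the "requests" library and the example host/credentials above) that injects a synthetic count of 3 "In" and 1 "Out" into every configured counting rule, so the receiving side (WebSocket client, report push server, etc.) can be verified end to end without anyone walking under the camera.

import requests

response = requests.get("http://172.16.56.13/VCA/Rule/Test?Counting&In=3&Out=1",
                        auth=("root", "root"), timeout=5)
print(response.json())   # {"Status": 200} on success, 400 if the query format is wrong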
