SC8131 SC8132 Integration Guide 2.11
Rev: v2.11
2020/12/24
Revision History
Version  Date        Editor       Comment
0.1      2016/01/29  Eric Fang    First Draft
0.2      2016/02/24  Evan Chen    Modify Metadata format
0.3      2016/03/25  Evan Chen    1. Add Stereo Tracker configuration by ONVIF protocol
                                  2. Add configuration of RTSP metadata stream
0.4      2016/04/22  Eric Fang    1. Modify height unit to use mm
                                  2. Add setting of WebSocket port number
1.0      2016/05/16  Eric Fang    1. Modify 3.1 WebSocket protocol description
                                  2. Refine example code in chapter 4
1.1      2016/06/14  Eric Fang    1. Insert new chapter 2 "Read Before Use"
                                  2. For SC8131-VVTK-0101i
1.2      2016/07/06  Eric Fang    1. Add default account and password information to chapter 2
1.3      2016/11/22  Hsuany       1. Add new items for rules
1.4      2017/01/16  Hsuany       1. Modify zone naming in metadata and event data
1.5      2017/03/31  Evan         1. Add chapter 6 RESTful APIs
                                  2. Modify WebSocket package: remove cookie
2.0      2017/06/09  Evan         Remove Stereo Tracker configuration by ONVIF protocol
2.1      2018/06/15  Evan/Small   1. Add Get Counting Result by CGI, Report Push, RS485
                                  2. Modify RESTful APIs
2.2      2018/09/26  Small        1. Add Device Log
                                  2. Add Queue management
2.3      2019/04/15  Small        1. Update Get Accumulative Counting Result
2.4      2019/04/22  Terry        1. Update Classification and Area information
2.5      2019/12/23  Ting         1. Add authorization WebSocket introduction
2.6      2020/09/10  Ethan        Add accumulated in/out data to WebSocket count event
2.7      2020/10/06  Ethan        Add rule test RESTful API
2.8      2020/10/08  Ethan        Add queue event metadata and RESTful API
2.9      2020/10/22  Terry        Stop "unAuth websocket" on default mode
2.10     2020/11/11  Ethan        1. Add passerby event RESTful API
                                  2. Add counting event RESTful API description
2.11     2020/12/24  Albus        Add manually modify stitching transform
1. Introduction
This document describes how to integrate the real-time metadata and counting results of the 3D
analytics system "Stereo Tracker".
The CGI request format is as follows:
http://{IP}/Stereo-Counting/cgi-bin/report_pull.cgi ?
format={xml,json,csv} &
starttime={starttime timestamp} &
endtime={endtime timestamp} &
aggregation={aggregation level in seconds} &
lite={0,1}&
localtime={0,1}&
countingeventdb={0,1}
Key              Description
starttime        * Querying start time [timestamp in seconds or an ISO 8601 formatted
                 date-time string, e.g. 2016-03-20T12:00:00]
endtime          * Querying end time [timestamp in seconds or an ISO 8601 formatted
                 date-time string, e.g. 2016-03-21T08:00:00]
aggregation      * Report aggregation level for each record, in seconds
format           [Option] Report format: XML (default), JSON, CSV
lite             [Option] Set to 1 to ignore in/out zero records. [default off: 0]
localtime        [Option] Set to 1 to treat the input starttime and endtime, and the
                 StartTime, EndTime in the report, as camera local time.
                 [default off: 0 -> input starttime, endtime and all times in the report
                 are UTC timestamps]
countingeventdb  [Option] Set to 1 to use the event triggered time as the aggregation level.
                 [default off: 0]
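For example, the following minimal JavaScript sketch pulls a JSON report aggregated per hour over the querying interval (the IP address and credentials are placeholders):

// Hypothetical example: pull a JSON report, one record per hour (aggregation=3600).
// IP address and credentials are placeholders; btoa() is the browser Base64 helper
// (in Node.js use Buffer.from("root:password").toString("base64") instead).
var url = "http://172.16.7.138/Stereo-Counting/cgi-bin/report_pull.cgi" +
          "?format=json" +
          "&starttime=2016-03-20T12:00:00" +
          "&endtime=2016-03-21T08:00:00" +
          "&aggregation=3600";
fetch(url, { headers: { "Authorization": "Basic " + btoa("root:password") } })
    .then(function (response) { return response.json(); })
    .then(function (report) { console.log(report); });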
status               The status of the last scheduled task.
                     success / failed / [empty]: not yet executed
Name                 User defined target name
Protocol             Three protocols are supported: HTTP, FTP and Email.
                     Destination address:
                       HTTP:    http://IPAddress:PORT/URI
                       FTP:     ftp://IPAddress:PORT
                       Email:   ServerIPAddress:PORT
                       SD card: N/A
Delivery Schedule    The interval between pushed aggregated reports; it is also the total
                     duration covered by one report. The camera supports 1 min, 5 mins,
                     15 mins, 30 mins, 1 hr, 12 hrs, 1 day. All schedules start from 00:00.
Aggregation level    The aggregation period for each data point in reports. Events in the
                     same aggregation level are accumulated into one data point. The camera
                     supports the same options as the delivery schedule. Note that the
                     aggregation level must be shorter than the delivery schedule.
Lite                 Lite mode ignores zero data to reduce the size of each report. If lite
                     mode is No, the report contains zero in/out records even if no count
                     event occurs in that aggregation period.
Format               The camera supports three report formats: XML, CSV and JSON. The
                     detailed content of each format is introduced later.
[Delete]             Clicking the Delete button removes all data of that target, including
                     the target information, report parameter settings and stored reports.
General
Local time           Show the StartTime, EndTime in camera local time with ISO 8601 format
Email
Sender email         Valid email address of the sender
Recipient email      Valid email addresses of recipients (separated by semicolons ;)
Server address       SMTP server IP address
Username             Username if the SMTP server requires authorization
Password             Corresponding password of the username
Port                 SMTP server port number
SSL mode             Send the email in SSL mode
FTP
Server address       FTP server IP address
Port                 FTP server port number
Username             Username if the FTP server requires authorization
Password             Corresponding password of the username
FTP folder name      Destination folder path
Filename format*     The report filename can be customized through variables. The supported
                     variables are listed later.
HTTP
Server address       HTTP server IP address
Port                 HTTP server port number
Server uri           HTTP server route URI
Username             Username if the HTTP server requires authorization
Password             Corresponding password of the username
SD card
Filename format*     The report filename can be customized through variables. The supported
                     variables are listed later.
Cyclic Storage       If cyclic storage is enabled, SD memory management is enabled: when
                     memory usage reaches 90% of the total memory size, old contents are
                     deleted to free space for new data. If cyclic storage is not enabled,
                     reports are not recorded once usage exceeds 90% of the total memory size.
%F Report format in xml, json or csv
%N User defined server name
%M MAC address in serial
%G Group ID
%D Device ID
%S Schedule duration in second
%A Aggregation level in second
%L "LITE" if in lite mode, "" otherwise
Use the Test button to push a test packet. When the test is successfully performed, click the Save
button.
3.3.1. XML
Here is an XML example showing two rules, each with its own statistics. Note that the camera sends
zero counts if there is no count for that interval.
<Message>
<Source>
<UtcTime>2016-08-01T08:03:56Z</UtcTime>
<GroupID>0</GroupID>
<DeviceID>0</DeviceID>
<ModelName>SC8131</ModelName>
<MacAddress>00:02:D1:39:2D:25</MacAddress>
<IPAddress>172.16.7.138</IPAddress>
<TimeZone>+8</TimeZone>
<DST>0</DST>
</Source>
<Data RuleType="Counting">
<CountingInfo RuleName="Counting1">
<In>0</In>
<Out>0</Out>
<StartTime>2016-07-26T00:00:00+0800</StartTime>
<EndTime>2016-07-26T12:00:00+0800</EndTime>
</CountingInfo>
<CountingInfo RuleName="Counting1">
<In>0</In>
<Out>0</Out>
<StartTime>2016-07-26T12:00:00+0800</StartTime>
<EndTime>2016-07-27T00:00:00+0800</EndTime>
</CountingInfo>
</Data>
<Data RuleType="ZoneDetection">
<ZoneInfo RuleName="Zone1">
<InwardCount>39</InwardCount>
<SumOutwardDuration>299</SumOutwardDuration>
<TotalCount>39</TotalCount>
<AvgDuration>7.67</AvgDuration>
<AvgCount>0.00</AvgCount>
<StartTime>2016-07-26T00:00:00+0800</StartTime>
<EndTime>2016-07-26T12:00:00+0800</EndTime>
</ZoneInfo>
<ZoneInfo RuleName="Zone1">
<InwardCount>37</InwardCount>
<SumOutwardDuration>407</SumOutwardDuration>
<TotalCount>37</TotalCount>
<AvgDuration>11.00</AvgDuration>
<AvgCount>0.01</AvgCount>
<StartTime>2016-07-26T12:00:00+0800</StartTime>
<EndTime>2016-07-27T00:00:00+0800</EndTime>
</ZoneInfo>
</Data>
</Message>
<xs:element name="In" type="xs:string"></xs:element>
<xs:element name="Out" type="xs:string"></xs:element>
<xs:element name="StartTime"
type="xs:string"></xs:element>
<xs:element name="EndTime"
type="xs:string"></xs:element>
</xs:sequence>
<xs:attribute name="RuleName" type="xs:string"/>
</xs:complexType>
</xs:element>
</xs:sequence>
<xs:attribute name="RuleType" type="xs:string"/>
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:schema>
3.3.2. CSV
The following CSV example shows the same data in CSV format. Note that the camera sends zero counts
even if there is no count for that interval when lite mode is unchecked.
ReportTime,GroupID,DeviceID,ModelName,MacAddress,IPAddress,TimeZone,DST
2016-08-01T08:39:23Z,0,0,SC8131,00:02:D1:39:2D:25,172.16.7.138,+8,0
RuleType,RuleName,In,Out,StartTime,EndTime
Counting,Counting1,0,0,2016-07-26T00:00:00+0800,2016-07-26T12:00:00+0800
Counting,Counting1,0,0,2016-07-26T12:00:00+0800,2016-07-27T00:00:00+0800
RuleType,RuleName,InwardCount,SumOutwardDuration,TotalCount,AvgDuration,AvgCount,StartTime,EndTime
ZoneDetection,Zone1,39,299,39,7.67,0.00,2016-07-26T00:00:00+0800,2016-07-26T12:00:00+0800
ZoneDetection,Zone1,37,407,37,11.00,0.01,2016-07-26T12:00:00+0800,2016-07-27T00:00:00+0800
3.3.3. JSON
The following JSON example shows the same data in JSON format. Zero counting data are still
sent when lite mode is unchecked.
{
"Source" : {
"ReportTime" : "2016-08-01T08:41:25Z",
"GroupID" : "0",
"DeviceID" : "0",
"ModelName" : "SC8131",
"MacAddress" : "00:02:D1:39:2D:25",
"IPAddress" : "172.16.7.138",
"TimeZone" : "+8",
"DST" : "0"
},
"Data" : [{
"RuleType" : "Counting",
"CountingInfo" : [{
"RuleName" : "Counting1",
"In" : 0,
"Out" : 0,
"StartTime" : "2016-07-26T00:00:00+0800",
"EndTime" : "2016-07-26T12:00:00+0800"
}, {
"RuleName" : "Counting1",
"In" : 0,
"Out" : 0,
"StartTime" : "2016-07-26T12:00:00+0800",
"EndTime" : "2016-07-27T00:00:00+0800"
}
]
}, {
"RuleType" : "ZoneDetection",
"ZoneInfo" : [{
"RuleName" : "Zone1",
"InwardCount" : 39,
"SumOutwardDuration" : 299,
"TotalCount" : 39,
"AvgDuration" : 7.67,
"AvgCount" : 0.00,
"StartTime" : "2016-07-26T00:00:00+0800",
"EndTime" : "2016-07-26T12:00:00+0800"
}, {
"RuleName" : "Zone1",
"InwardCount" : 37,
"SumOutwardDuration" : 407,
"TotalCount" : 37,
"AvgDuration" : 11.00,
"AvgCount" : 0.01,
"StartTime" : "2016-07-26T12:00:00+0800",
"EndTime" : "2016-07-27T00:00:00+0800"
}
]
}
]
}
For Counting and Flow Path rules, there are two statistics in the report: In and Out.
Report tag name       Description
In                    The number of objects crossing the rule line or detected area toward the
                      direction "In".
Out                   The number of objects crossing the rule line or detected area toward the
                      direction "Out".
For the Zone rule type, the statistics in the report are described in the following table.
Report tag name       Description
InwardCount           Number of objects that go inward during the aggregation time
SumOutwardDuration    Sum of the dwell durations of objects that go outward during the
                      aggregation time
TotalCount            Total count of dwelling objects during the aggregation time
AvgDuration           Average duration of dwelling objects during the aggregation time
AvgCount              Average count of dwelling objects during the aggregation time
The following is an example of TotalCount, InwardCount, AvgDuration, and SumOutwardDuration
over three consecutive aggregation periods.
[Figure: timeline of objects dwelling in a zone across three aggregation periods]
                       Period 1   Period 2   Period 3
TotalCount                 2          2          3
InwardCount                2          2          1
AvgDuration               30         40         50
SumOutwardDuration        60          0        150
Default value
N/A
GET
Input data UTC timestamp
Return data Counting/Flowpath result in JSON format
Request example
GET /VCA/Data/DB/Counting?StartTime=1524700800&EndTime=1524729600 HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/plain\r\n
\r\n
{
"CountingInfo": [
{
"RuleName": "Rule@Counting",
"UTC": 1524720420,
"In": 0,
"Out": 0
},
{
"RuleName": "Rule@FlowPathCounting",
"UTC": 1524720420,
"In": 0,
"Out": 0
},
{
"RuleName": "Rule@Counting",
"UTC": 1524720480,
"In": 0,
"Out": 0
},
{
"RuleName": "Rule@FlowPathCounting",
"UTC": 1524720480,
"In": 0,
"Out": 0
},
{
"RuleName": "Rule@Counting",
"UTC": 1524720540,
"In": 0,
"Out": 0
}
]
}
POST
N/A
Parameters Description
CountingInfo Counting rule result
RuleName Rule name
UTC The UTC timestamp of this counting result
In The number of objects detected as “In” by counting rule
Out The number of objects detected as “Out” by counting rule
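As an illustration (placeholder IP address and credentials), the per-rule In/Out totals over the queried interval can be computed from the JSON response like this:

// Hypothetical example: query counting records and total In/Out per rule.
var url = "http://172.16.56.13/VCA/Data/DB/Counting?StartTime=1524700800&EndTime=1524729600";
fetch(url, { headers: { "Authorization": "Basic " + btoa("root:root1234") } })
    .then(function (response) { return response.json(); })
    .then(function (result) {
        var totals = {};
        result.CountingInfo.forEach(function (record) {
            var t = totals[record.RuleName] || { In: 0, Out: 0 };
            t.In += record.In;
            t.Out += record.Out;
            totals[record.RuleName] = t;
        });
        console.log(totals);   // e.g. { "Rule@Counting": { In: 0, Out: 0 }, ... }
    });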
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Data": {
"RuleName": "ZoneRule ",
"fData": 0,
"iData": [
37,
25,
0,
0,
0,
0,
0,
0,
0,
0
],
"strData": ""
},
"Status": 200
}
POST
N/A
Parameters Description
Data Rule data
RuleName Rule name
fData Not used
iData        Integer array; the meaning of each element depends on the rule type.
             Counting type:
               "iData": [Out, In, reserved, reserved, reserved, reserved, reserved, reserved, reserved, reserved]
             Zone type:
               "iData": [Inside, MaxWaitTime, MinWaitTime, AverageWaitTime, reserved, reserved, reserved, reserved, reserved, reserved]
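For illustration, a minimal sketch (a hypothetical helper, not part of the camera API) that maps an iData array onto named fields according to the element order above:

// Minimal sketch: interpret the iData array from the live rule query,
// given the rule category ("Counting" or "Zone") documented above.
function parseRuleData(ruleType, iData) {
    if (ruleType === "Counting") {
        return { Out: iData[0], In: iData[1] };
    }
    if (ruleType === "Zone") {
        return {
            Inside: iData[0],
            MaxWaitTime: iData[1],
            MinWaitTime: iData[2],
            AverageWaitTime: iData[3]
        };
    }
    return null;
}
console.log(parseRuleData("Zone", [37, 25, 0, 0, 0, 0, 0, 0, 0, 0]));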
3.4.5. Get Queue Data
URI
url:/VCA/Data/DB/Queue?StartTime=[UTC timestamp]&EndTime=[UTC timestamp]
Description
Specify a time interval in UTC timestamps to get the queue detection result from the Queue DB
Default value
N/A
GET
Input data UTC timestamp
Return data Queue detection result in JSON format
Request example
GET /VCA/Data/DB/Queue?StartTime=1524672000&EndTime=1524733200 HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290MTIzNA==\r\n
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/plain\r\n
\r\n
[
{
"RuleName": "QueueRule",
"Objects": [
{
"Id": 965,
"Service": [
0,
0
],
"Wait": [
1537257025,
1537257034
]
}
]
},
{
"RuleName": " QueueRule ",
"Queue": [
1537257025,
1537257034
]
}
]
POST
N/A
Parameters Description
RuleName Rule Name
Objects Object information
ID Object ID
Service [Start Time Stamp, Wait Time Stamp]
Wait [Start Time Stamp, Wait Time Stamp]
Queue [Start Time Stamp, Wait Time Stamp]
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{"AvgDuration":0,"MaxDuration":0,"MinDuration":0,"QueueLength":0,"QueueState":"Close","Rul
eName":"Rule-1","ServiceDuration":0,"Status":200}
POST
N/A
Parameters Description
AvgDuration Average waiting duration of objects in queue
MaxDuration Maximum waiting duration of objects in queue
MinDuration Minimum waiting duration of objects in queue
QueueLength The number of objects waiting in queue
QueueState The current queue status, “Open” or “Close”
RuleName The rule name of queue
ServiceDuration Current service duration of object in service zone
Status           200: OK
                 The request has succeeded.
                 400: Bad Request
                 "Error":{"Message":"RuleName must be specified: ?RuleName=XXX"}
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64)\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/plain\r\n
\r\n
{
" PasserbyInfo": [
{
"RuleName": "Rule-1",
"UTC": 1605077580,
"Count": 5
}
]
}
POST
N/A
Parameters Description
PasserbyInfo Passerby detection result
RuleName Rule name
UTC The UTC timestamp of this passerby result
Count The count number of passerby object
3.5.2. Support of IBIS command
Users can send IBIS commands to the SC8132 via RS485. The supported commands and their
corresponding replies are described in the tables below. We follow the IBIS protocol rules, so there
is no reply if a command is not supported by the SC8132.
Command      Description
bF"CamID"    This type of command represents "Movement started".
             CamID is the value of the camera ID set by the user.
bS"CamID"    This type of command represents "Query of IBIS status".
             CamID is the value of the camera ID set by the user.
bE"CamID"    This type of command represents "Query of counting result".
             CamID is the value of the camera ID set by the user.
Reply        Description
bF           This type of reply represents "Acknowledgement".
bS3          This type of reply represents "Status of IBIS".
bB1B2A1A2    This type of reply represents "Counting result of passengers".
             B1, B2: Number of boarding passengers, high-order digit first, 0…255.
             A1, A2: Number of alighting passengers, high-order digit first, 0…255.
             The representation of characters is shown in the following table.

Decimal   Hexadecimal   Character
0         0             0
1         1             1
2         2             2
3         3             3
4         4             4
5         5             5
6         6             6
7         7             7
8         8             8
9         9             9
10        A             :
11        B             ;
12        C             <
13        D             =
14        E             >
15        F             ?
Example: b051?
Number of boarding passengers = 5
Number of alighting passengers = 31
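As a sanity check of the character table above, a minimal JavaScript sketch that decodes a counting reply bB1B2A1A2 (each character encodes a value 0-15 as its offset from the character '0', high-order digit first):

// Minimal sketch: decode an IBIS counting reply "bB1B2A1A2".
// Each character encodes a value 0..15 as its offset from '0' (see the table above).
function decodeIbisCountingReply(reply) {
    function digit(ch) { return ch.charCodeAt(0) - "0".charCodeAt(0); }
    return {
        boarding:  digit(reply[1]) * 16 + digit(reply[2]),   // B1, B2
        alighting: digit(reply[3]) * 16 + digit(reply[4])    // A1, A2
    };
}
console.log(decodeIbisCountingReply("b051?"));   // { boarding: 5, alighting: 31 }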
4. Get Metadata
4.1. Authentication WebSocket Connection
4.1.1. WebSocket Connection
A WebSocket client can be implemented in many programming languages, such as JavaScript, PHP,
Node.js, C, C++, and Java. This document outlines the WebSocket APIs available on the camera.
*Note 1: From FW 0105o, the camera stops the "unAuth websocket" by factory default and after a
camera restore (the IP/VCA/Config/AE/WebSocket/Enable API then returns false).
*Note 2: From FW 0105o, the Auth-WebSocket is supported in the IE and Chrome browsers; firmware
older than 0105o supports only the unAuth websocket.
Parameter Value range Description
AuthWSPort [Integer] WebSocket port.
Example:
Getting configuration by “curl”:
~# curl -i --user root:password “http://172.20.6.2/VCA/Config/AE/WebSocket”
HTTP/1.1 200 OK
Content-type: application/json
{"AuthWSPort":80,"AuthWSSPort":443,"ProtocolName":"tracker-protocol" }
The WebSocket connection uses HTTP or HTTPS, matching the mode of the package website.
Example:
HTTP:
Package Website URL : http://172.20.6.2/VCA/www/index.html
WebSocket URL : ws://172.20.6.2/ws/vca?data=event,meta
HTTPS:
Package Website URL : https://172.20.6.2/VCA/www/index.html
WebSocket URL : wss://172.20.6.2/ws/vca?data=event,meta
status     Status provides camera status information such as zoom position, light state, etc.
param      Param provides camera information such as FOV, image center, and camera height.
Example in JavaScript
WebSocket = new WebSocket("ws://root:[email protected]/ws/vca?data=meta,event",
"tracker-protocol");
Client request:
GET /ws/vca?data=meta,event HTTP/1.1
Sec-WebSocket-Version: 13
Sec-WebSocket-Key: +DZ7PjY6sHWLs7pKTYESTQ==
Connection: Upgrade
Upgrade: WebSocket
Sec-WebSocket-Extensions: permessage-deflate; client_max_window_bits
Sec-WebSocket-Protocol: tracker-protocol
Host: 172.20.6.2:80
Authorization: Basic cm9vdDp2aXZvMjMyNA==
Server response:
HTTP/1.1 101 Switching Protocols
Upgrade: WebSocket
Connection: Upgrade
Sec-WebSocket-Accept: 0ApxjPpakfRTEw4MMEGgA8e7Nw4=
Sec-WebSocket-Protocol: tracker-protocol
Once the WebSocket connection is ready, the camera immediately sends metadata and event data,
updated every frame, over the socket connection.
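Putting this together, a minimal JavaScript sketch that opens the WebSocket and dispatches on the package type (the IP address and credentials are placeholders; it assumes each package arrives as a JSON text frame):

// Minimal sketch: connect and handle MetaData / Event packages.
// IP address and credentials are placeholders.
var ws = new WebSocket("ws://root:[email protected]/ws/vca?data=meta,event",
                       "tracker-protocol");
ws.onmessage = function (message) {
    var packet = JSON.parse(message.data);
    if (packet.Tag === "MetaData") {
        // Object tracking information, updated every frame.
        var objects = packet.Frame ? packet.Frame.Objects : [];
        console.log("tracked objects:", objects);
    } else if (packet.Tag === "Event") {
        // Real-time counting / zone / queue events.
        console.log("event data:", packet.Data);
    }
};
ws.onerror = function (err) { console.log("WebSocket error", err); };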
4.1.5. WebSocket Connection
Parameters list:
{
"AuthWSPort": 80,
"AuthWSSPort": 443,
"ProtocolName": "tracker-protocol",
}
AuthWSSPort [Integer] WebSocket security port.
Example:
Getting configuration by “curl”:
~# curl -i --user root:password “http://172.20.6.2/VCA/Config/AE/WebSocket”
HTTP/1.1 200 OK
Content-type: application/json
{"AuthWSPort":80,"AuthWSSPort":443,"ProtocolName":"tracker-protocol" }
The WebSocket connection uses HTTP or HTTPS, matching the mode of the package website.
Example:
HTTP:
Package Website URL : http://172.20.6.2/VCA/www/index.html
WebSocket URL : ws://172.20.6.2/ws/vca?data=event,meta
HTTPS:
Package Website URL : https://172.20.6.2/VCA/www/index.html
WebSocket URL : wss://172.20.6.2/ws/vca?data=event,meta
The details are given in the following table.
Filter Description
meta Metadata provides scene information such as motion, object, face…etc.
Refer to the metadata document for details
status Status provides camera status information such as zoom position, Light
State….etc.
param Param provides camera information such as FOV, image center, and camera
height.
Example in JavaScript
WebSocket = new WebSocket("ws://root:[email protected]/ws/vca?data=meta,event",
"tracker-protocol");
Client request:
GET /ws/vca?data=meta,event HTTP/1.1
Sec-WebSocket-Version: 13
Sec-WebSocket-Key: +DZ7PjY6sHWLs7pKTYESTQ==
Connection: Upgrade
Upgrade: WebSocket
Sec-WebSocket-Extensions: permessage-deflate; client_max_window_bits
Sec-WebSocket-Protocol: tracker-protocol
Host: 172.20.6.2:80
Authorization: Basic cm9vdDp2aXZvMjMyNA==
Server response:
HTTP/1.1 101 Switching Protocols
Upgrade: WebSocket
Connection: Upgrade
Sec-WebSocket-Accept: 0ApxjPpakfRTEw4MMEGgA8e7Nw4=
Sec-WebSocket-Protocol: tracker-protocol
Once the WebSocket connection is ready, the camera immediately sends metadata and event data,
updated every frame, over the socket connection.
Here is an RTSP example in which client C requests a presentation from media server M (172.16.2.136).
[Figures 1-4: packet captures of the four-step RTSP exchange between client C and server M]
Metadata packet: Type-108
4.3. Metadata Format
The metadata contains two types of information: MetaData and Event.
MetaData is presented in JSON format; it shows object tracking information.
{
"Tag": "MetaData",
"Ver": "1.0.0"
"Stitch": {
"Objects": [{
"Centroid": {
"x": 4468,
"y": 604
},
"GId": 278,
"Height": 1585,
"Id": 278,
"Origin": {
"x": 6800,
"y": 1763
},
"OriginUtcTime": "2018-06-01T10:08:59.654Z",
"RuleAttribute": [{
"ObjDuration": 6,
"RuleName": "Rule-1"
}
]
}
],
"UtcTime": "2018-06-01T10:09:05.253Z"
},
"Project": {
"Cx": 683.0857,
"Cy": 488.8214,
"f": 279.9816,
"a": 8.33756,
"ZoomPos": 0,
"Offsetx": 500.0,
"Offsety": 400.0,
" Roll": 0,
" Tilt": 0,
"W": 320,
"H": 208,
"CamH": 2600
},
"Frame": {
"UtcTime": "2018-04-27T05:20:20.063Z",
"Objects": [
{
"Id": 17,
"GId": 17,
"Height": 1776,
"Origin": {
"x": 176,
"y": 91
},
"Centroid": {
"x": 176,
"y": 154
},
"Classification": [{
"Likelihood": 99,
"MeanLikelihood": 75,
"Type": 1
}
],
"CurrentArea": 30,
"MaxArea": 43,
"OriginUtcTime": "2018-04-27T05:18:34.073Z",
"RuleAttribute": [{
" ObjDuration ": 7,
"RuleName": "Zone1"
},
{
" ObjDuration ": 7,
"RuleName": "Zone2"
}
]
},
{
"Id": 18,
"GId": 17,
"Height": 1967,
"Origin": {
"x": -261,
"y": 269
},
"Centroid": {
"x": -381,
"y": -211
},
"OriginUtcTime": "2018-04-27T05:19:04.087Z",
"RuleAttribute": [{
"ObjDuration": 4,
"RuleName": "Zone1"
}
]
}
]
}
}
Tag               Type of metadata.
                  MetaData: object tracking information
                  Event: event information
Ver               Version of the MetaData format
Stitch            (Option) Present in stitching mode
UtcTime           The UTC time of this frame. (YYYY-MM-DDTHH:mm:ss.sssZ)
Objects           The information of the tracked objects
Id                Object ID
GId               Group ID
Height            Object height. Unit: mm
Origin            The original coordinate of the detected object during the tracking (foot position)
OriginUtcTime     UTC time of the original coordinate of the detected object during the tracking
x                 x-coordinate
y                 y-coordinate
Centroid          The current coordinate of the detected object (foot position)
x                 x-coordinate
y                 y-coordinate
Classification    Object class information after enabling Human
Likelihood        Value in [1, 99]; the maximum of the object's ClassifyValue
MeanLikelihood    The average of the object's ClassifyValue
Type              Object type: 0 is Unknown, 1 is Human, and 5 is Other
CurrentArea       Current object area after enabling Human
MaxArea           The maximum object area in its history after enabling Human
RuleAttribute     Zone information of objects
ObjDuration       Dwell duration of the object in the zone
RuleName          Name of the zone in which the object is detected
Project           Coordinate information; the user can calculate the object tracking window from these parameters.
Cx                Q matrix parameter. The x-coordinate of the focal center.
Cy                Q matrix parameter. The y-coordinate of the focal center.
f                 Q matrix parameter. The focal length.
a                 Q matrix parameter. The baseline length.
Offsetx           The x-coordinate of the zoom-in offset.
Offsety           The y-coordinate of the zoom-in offset.
W                 The width of the depth map
H                 The height of the depth map
CamH              The installed height of the camera. Unit: mm
Roll              Unsupported
Tilt              Unsupported
ZoomPos           Unsupported
Frame             Descriptions of the objects.
UtcTime           The UTC time of this frame. (YYYY-MM-DDTHH:mm:ss.sssZ)
Objects           The information of the tracked objects
Id                Object ID
GId               Group ID
Height            Object height. Unit: mm
Origin            The original coordinate of the detected object during the tracking (foot position)
OriginUtcTime     UTC time of the original coordinate of the detected object during the tracking
x                 x-coordinate
y                 y-coordinate
Centroid          The current coordinate of the detected object (foot position)
x                 x-coordinate
y                 y-coordinate
RuleAttribute     Zone information of objects
ObjDuration       Dwell duration of the object in the zone
RuleName          Name of the zone in which the object is detected
The object is shown below; it has a tracking window, an original point (Origin) and a current point (Centroid).
[Figure: a tracked object with its 3D tracking window, Origin point, and Centroid point]
You can use the "Project" parameters to derive the 8 coordinate points of the tracking window; the
points are ordered 0~7 as shown below.
[Figure: corner ordering of the 3D tracking box, with points 4, 5, 6, 7 on the top face and points 0, 1, 2, 3 on the bottom face]
Here is sample JavaScript code demonstrating how to calculate all the coordinate points:
var BOX3D_HALF_WIDTH = 180;
var m_rectDisparityROI = new jsPoint(0,0);
// Q Matrix after stereo calibration
var Cx = 683.0857;
var Cy = 488.8214;
var f = 279.9816;
var a = 8.33756;
var b = 0;
var m_matQ = [[1, 0, 0, -Cx],[0, 1, 0, -Cy],[0, 0, 0, f],[0, 0, a, b]];
function jsPoint(x, y)
{
this.x = x;
this.y = y;
}
function js3DPoint(x, y, z)
{
this.x = x;
this.y = y;
this.z = z;
}
function ProjectToRectifiedPoint(x, y, z)
{
// Project a 3D point (x, y in mm, z = distance from the camera in mm)
// onto the rectified image plane using the Q matrix.
var dst = new jsPoint(x * m_matQ[2][3] / z - m_matQ[0][3],
y * m_matQ[2][3] / z - m_matQ[1][3]);
dst.x -= m_rectDisparityROI.x;
dst.y -= m_rectDisparityROI.y;
return dst;
}
function GetBoundingBox(CamHeight, Height, CenterOfGravity, eMode)
{
// Computes the 8 projected corner points (0..7) of the object's 3D tracking box
// (the function body is abbreviated in this excerpt).
var vertex_num = 8;
var pointImageCoordinate = new Array();
return pointImageCoordinate;
}
function main()
{
var CamHeight = 2400;
var Height = 1776;
var Centroid= new jsPoint(176,154);
var Origin = new jsPoint(176,91);
var Box3D = GetBoundingBox(CamHeight, Height, Centroid,'3D');
var OriginPoint = ProjectToRectifiedPoint(Origin.x, Origin.y, CamHeight);
console.log("3DBox : ");
console.log(Box3D);
}
console.log('starting DEMO GetBoundingBox3D...')
main();
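The body of GetBoundingBox is abbreviated above. Purely as an illustration of one possible approach (not necessarily the camera's exact algorithm), the 8 corners could be built from a box of half-width BOX3D_HALF_WIDTH mm around the centroid, with the bottom face at floor level (z = CamHeight) and the top face at the object height (z = CamHeight - Height), each corner projected with ProjectToRectifiedPoint:

// Rough sketch only (not necessarily the camera's exact algorithm): build the
// 8 corner points of a box of half-width BOX3D_HALF_WIDTH mm around the centroid,
// bottom face at floor level (z = CamHeight), top face at the object height
// (z = CamHeight - Height), then project each corner. Corners 0..3 are the
// bottom face and 4..7 the top face, following the ordering figure above.
function GetBoundingBoxSketch(CamHeight, Height, Centroid)
{
    var half = BOX3D_HALF_WIDTH;
    var offsets = [[-half, -half], [half, -half], [half, half], [-half, half]];
    var corners = [];
    [CamHeight, CamHeight - Height].forEach(function (z) {
        offsets.forEach(function (o) {
            corners.push(ProjectToRectifiedPoint(Centroid.x + o[0], Centroid.y + o[1], z));
        });
    });
    return corners;
}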
The coordinates are derived from the depth map (W, H); if you need to draw the tracking window on
an image of a different resolution, you have to transform from (W, H) to (W', H').
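For example, a minimal sketch (a hypothetical helper) that scales a projected point from the depth-map resolution to a target image resolution:

// Minimal sketch: scale a point from depth-map resolution (W, H) to a
// display resolution (dstW, dstH) with a simple per-axis ratio.
function scalePoint(point, W, H, dstW, dstH)
{
    return new jsPoint(point.x * dstW / W, point.y * dstH / H);
}
// e.g. map a point from the 320x208 depth map onto a 1280x832 image
var scaled = scalePoint(new jsPoint(176, 154), 320, 208, 1280, 832);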
The other type of metadata is Event, which is also presented in JSON format; it shows the real-time
counting events.
{
"Tag": "Event",
"Ver": "1.0.0",
"Data": [
{
"RuleType": "Counting",
"CountingInfo": [
{
"RuleName": "Counting1",
"In": 0,
"Out": 1,
"Time": "2015-12-15T06:32:20.876Z"
}
]
},
{
"RuleType": "Counting",
"CountingInfo": [
{
"RuleName": "Counting2",
"AccIn": 1,
"AccOut": 4,
"In": 2,
"Out": 3,
"Time": "2015-12-15T06:32:20.876Z"
}
]
}, {
"RuleType" : "ZoneDetection",
"ZoneInfo" : [
{
"AvgDuration" : 9,
"Inside" : 1,
"MaxDuration" : 9,
"MinDuration" : 9,
"RuleName" : "Zone1",
"Time" : "2015-12-15T06:32:20.876Z "
}
],
"RuleType" : "QueueAnalysis",
"QueueInfo" : [
{
"AvgDuration" : 0,
"MaxDuration" : 0,
"MinDuration" : 0,
"QueueLength" : 0,
"QueueState" : "Open",
"RuleName" : "Rule-4",
"ServiceDuration" : 0,
"Time" : "2020-10-08T05:37:31.863Z"
}
]
}
]
}
Tag               Type of metadata.
                  MetaData: object tracking information
                  Event: event information
Ver               Version of the Event format
Data              Descriptions of the event
RuleType          The rule type: Counting, ZoneDetection, or QueueAnalysis
CountingInfo      The information of a counting event
RuleName          The rule name
In                The number of objects crossing the counting line toward the "In" direction
Out               The number of objects crossing the counting line toward the "Out" direction
AccIn             The accumulated number of objects crossing the counting line toward the "In" direction
AccOut            The accumulated number of objects crossing the counting line toward the "Out" direction
Time              The event time. (YYYY-MM-DDTHH:mm:ss.sssZ)
ZoneInfo          The information of a zone event
AvgDuration       Average waiting duration of objects in the zone
Inside            Current number of objects in the zone
MaxDuration       Maximum waiting duration of objects in the zone
MinDuration       Minimum waiting duration of objects in the zone
RuleName          The rule name of the zone
Time              The event time (YYYY-MM-DDTHH:mm:ss.sssZ)
QueueInfo         The information of a queue event
AvgDuration       Average waiting duration of objects in the queue
MaxDuration       Maximum waiting duration of objects in the queue
MinDuration       Minimum waiting duration of objects in the queue
QueueLength       The number of objects waiting in the queue
QueueState        The current queue status, "Open" or "Close"
RuleName          The rule name of the queue
ServiceDuration   Current service duration of the object in the service zone
Time              The event time (YYYY-MM-DDTHH:mm:ss.sssZ)
5. RESTful APIs
The SC8131/SC8132 supports RESTful APIs to get and configure the stereo camera's parameters. The
APIs consist of a URL and standard HTTP methods.
The URL has the following form:
http://IP/VCA/REQUEST
The supported "REQUEST" values are listed in the next chapter.
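All requests use HTTP Basic authentication. As a generic illustration (placeholder IP address and credentials; note that a few endpoints return plain text rather than JSON), a GET request can be issued from JavaScript like this:

// Hypothetical helper: issue an authenticated GET against the VCA REST API
// and parse the body as JSON. IP address and credentials are placeholders.
function vcaGet(request) {
    return fetch("http://172.16.56.13/VCA/" + request, {
        headers: { "Authorization": "Basic " + btoa("root:password") }
    }).then(function (response) { return response.json(); });
}
vcaGet("Version").then(function (info) { console.log(info); });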
"/VCA/Files/Intrinsic/Validity",
"/VCA/Rule",
"/VCA/Rule/FirstCountingRule",
"/VCA/Rule/FirstCountingRule/Reset",
"/VCA/Rule/Reset",
"/VCA/Scene/Depth",
"/VCA/Scene/Disparity",
"/VCA/Scene/Raw",
"/VCA/Stitching/CameraList",
"/VCA/Stitching/CameraList/Reorder",
"/VCA/Stitching/Configure",
"/VCA/Stitching/ManualTraining",
"/VCA/Stitching/Matching",
"/VCA/Stitching/Pause",
"/VCA/Stitching/State",
"/VCA/Stitching/StitchInfo",
"/VCA/Stitching/StitchInfo/TranslationToMaster",
"/VCA/Stitching/StitchedMap",
"/VCA/Stitching/Training",
"/VCA/Stitching/Training/Status",
"/VCA/Stitching/Training/Stop",
"/VCA/Tracking/Off",
"/VCA/Tracking/On",
"/VCA/Version",
"/VCA/WebSocket/Status/HeartBeat/IntervalSec"
]
},
"Status": 200
}\n
POST
N/A
5.4. Configuration
5.4.1. Analytics Engine
URI
/VCA/Config/AE
Description
Analytics engine settings
Default value
N/A
GET
Input data None
Return data Analytics engine settings in JSON format
Request example
GET /VCA/Config/AE HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"AggressiveHeightFilter": true,
"AutoGSensorMode": false,
"AutoHeight": false,
"AutoRollAngle": 0,
"AutoTiltAngle": 0,
"CamHeight": 2400,
"Confidence": 3,
"HasGSensor": false,
"MaxDisparity": 32,
"MaxObjectDistance": 5000,
"MaxObjectHeight": 1900,
"MaxShoppingUnitDistance": 0,
"MinDisparity": 0,
"MinObjectHeight": 800,
"Motion": {
"MainConfig": {
"EnableObjectFilter": false,
"Sensitivity": 70
},
"Schedule": {
"AutoMode": true,
"Begin": "18:00",
"Enable": false,
"End": "06:00"
},
"ScheduleConfig": {
"EnableObjectFilter": false,
"Sensitivity": 70
}
},
"RollAngle": 0,
"Sensitivity": 10,
"TamperingSensitivity": 5,
"TiltAngle": 0,
"UpdateSecond": 600,
"WebSocket": {
"Enable": true,
"Port": 888,
"ProtocolName": "tracker-protocol",
"WSSPort": 889
},
"ZoomInFactor": 1
}\n
POST
N/A
Parameters               Description
AggressiveHeightFilter   Unsupported
AutoGSensorMode          Unsupported
AutoHeight               Auto-detect the camera installed height, in Boolean format.
                         true: enable auto-detect height
                         false: disable auto-detect height
AutoRollAngle            Unsupported
AutoTiltAngle            Unsupported
CamHeight                Set the camera installed height manually. The value is in mm.
                         Valid range: 1600~5500
Confidence               The user can adjust the confidence level to filter out controversial
                         depth points, i.e. points for which depth information could not be
                         computed. Under some conditions, such as an overexposed view or a
                         smooth background with no objects or patterns for computing depth,
                         the confidence level may need to be adjusted to keep tracking
                         performance. With a lower level the image is much smoother; more
                         noise points or unknown points are filtered out.
                         Valid range: 0~30
HasGSensor               Unsupported
MaxDisparity             Maximum value of disparity.
                         Valid range: 12~48
MaxObjectDistance        Unsupported
MaxObjectHeight          Maximum height of objects which will be considered for tracking.
                         The value is in mm. Valid range: 500~2500
MaxShoppingUnitDistance  Maximum height of objects which will be considered for tracking.
                         The value is in mm. Valid range: 500~2500
MinDisparity             Minimum value of disparity.
                         Valid range: 0~48
MinObjectHeight          Minimum height of objects which will be considered for tracking.
                         The value is in mm. Valid range: 500~2500
Motion                   Unsupported
MainConfig               Unsupported
EnableObjectFilter       Unsupported
Sensitivity              Unsupported
Schedule                 Unsupported
AutoMode                 Unsupported
Begin                    Unsupported
Enable                   Unsupported
End                      Unsupported
ScheduleConfig           Unsupported
EnableObjectFilter       Unsupported
Sensitivity              Unsupported
RollAngle                Unsupported
Sensitivity              Sensitivity of human detection; the higher the value, the more likely
                         an object will be determined to be human.
                         Valid range: 1~10
TamperingSensitivity     Unsupported
TiltAngle                Camera installed tilt angle
UpdateSecond             Not used
WebSocket                WebSocket parameters
Enable                   WebSocket transmission
                         "true": enable
                         "false": disable
Port                     WebSocket port number
                         Default: 888
ProtocolName             WebSocket protocol name
WSSPort                  Secure WebSocket port number
                         Default: 889
ZoomInFactor             Digitally zoom in the field of view. The appropriate installation
                         height is 2.4~3.6 meters; if the camera needs to be installed higher
                         than 3.6 meters, adjust this parameter to get more accurate tracking
                         and counting, at the cost of losing FOV.
"MaxShoppingUnitDistance": 0,
"MinHeightFilter": 1200,
"ReportTimeInterval": 1,
"ResetTimeInterval": 1,
"DBReportTimeInterval": 60,
"EnableClassify": 0,
"Target": 0,
"IsStitchType": false
}
},
"ExclusiveArea": {
"Field": [[{
"x": 1169,
"y": 1459
},
{
"x": 1006,
"y": 5277
},
{
"x": 3940,
"y": 4409
},
{
"x": 3489,
"y": 429
}],
[{
"x": 5796,
"y": 6328
},
{
"x": 5871,
"y": 9075
},
{
"x": 9068,
"y": 8267
},
{
"x": 8341,
"y": 5035
}]]
},
"DoorArea": {
"Field": [[{
"x": 0,
"y": 9000
},
{
"x": 0,
"y": 10000
},
{
"x": 10000,
"y": 10000
},
{
"x": 10000,
"y": 9000
},
{
"x": 6500,
"y": 3000
},
{
"x": 3500,
"y": 3000
}]]
},
"FlowPathCounting": {
"FlowPathCounting1": {
"CountMode": "AfterExit",
"Direction": "Any",
"Line": [
[
{
"x": 3333,
"y": 6391
},
{
"x": 3333,
"y": 3333
}
],
[
{
"x": 6666,
"y": 6391
},
{
"x": 6666,
"y": 3333
}
],
[
{
"x": 6111,
"y": 6391
},
{
"x": 6111,
"y": 3333
}
],
[
{
"x": 5555,
"y": 6391
},
{
"x": 5555,
"y": 3333
}
],
[
{
"x": 5000,
"y": 6391
},
{
"x": 5000,
"y": 3333
}
],
[
{
"x": 4444,
"y": 6391
},
{
"x": 4444,
"y": 3333
}
],
[
{
"x": 3888,
"y": 6391
},
{
"x": 3888,
"y": 3333
}
]
],
"Line3D": [
[
{
"x": -1216,
"y": 482
},
{
"x": -1216,
"y": -800
}
],
[
{
"x": 1043,
"y": 482
},
{
"x": 1043,
"y": -800
}
],
[
{
"x": 666,
"y": 482
},
{
"x": 666,
"y": -800
}
],
[
{
"x": 290,
"y": 482
},
{
"x": 290,
"y": -800
}
],
[
{
"x": -86,
"y": 482
},
{
"x": -86,
"y": -800
}
],
[
{
"x": -463,
"y": 482
},
{
"x": -463,
"y": -800
}
],
[
{
"x": -840,
"y": 482
},
{
"x": -840,
"y": -800
}
]
],
"MaxHeightFilter": 2100,
"MaxShoppingUnitDistance": 0,
"MinHeightFilter": 1200,
"ReportTimeInterval": 1,
"ResetTimeInterval": 1,
"Sensitivity": 50,
"DBReportTimeInterval": 60,
"EnableClassify": 0,
"Target": 0,
"EnableDI": 0,
"DIDelaySec": 0,
"IsStitchType": false
}
},
"ZoneDetection": {
"Zone1": {
"EnterDelay": 5,
"Field": [[{
"x": 2086,
"y": 2558
},
{
"x": 2248,
"y": 8024
},
{
"x": 8164,
"y": 8443
},
{
"x": 8139,
"y": 2139
}]],
"LeaveDelay": 5,
"MaxHeightFilter": 2100,
"MaxShoppingUnitDistance": 0,
"MinHeightFilter": 1200,
"ReportTimeInterval": 1,
"ResetTimeInterval": 1,
"DBReportTimeInterval": 60,
"EnableClassify": 0,
"Target": 0,
"IsStitchType": false
}
},
" QueueAnalysis" : {
"Queue1" : {
"Line" : [[{
"x" : 7000,
"y" : 7353
}, {
"x" : 5000,
"y" : 7353
}
], [{
"x" : 7000,
"y" : 4411
}, {
"x" : 5000,
"y" : 4411
}
], [{
"x" : 0,
"y" : 0
}, {
"x" : 0,
"y" : 0
}
], [{
"x" : 0,
"y" : 0
}, {
"x" : 0,
"y" : 0
}
], [{
"x" : 0,
"y" : 0
}, {
"x" : 0,
"y" : 0
}
]
],
"Field" : [[{
"x" : 4000,
"y" : 7353
}, {
"x" : 4900,
"y" : 7353
}, {
"x" : 4900,
"y" : 4411
}, {
"x" : 4000,
"y" : 4411
}
]],
"Distance" : 1300,
"OpenDelay" : 3,
"CloseDelay" : 3,
"EnterDelay" : 3,
"LeaveDelay" : 3,
"MaxHeightFilter" : 2100,
"MaxShoppingUnitDistance" : 0,
"MinHeightFilter" : 900,
"EnableClassify": 0,
"DBReportTimeInterval" : 0,
"ReportTimeInterval" : 0,
"ResetTimeInterval" : 0,
"Target": 0,
"IsStitchType": false
}
}
}\n
POST
N/A
Parameters          Description
Counting            Line counting type
FlowPathCounting    Flow path counting type
CountMode           Rule for validating the behavior of objects crossing the rule line.
                    Three options for the counting mode, in string format:
                    - AfterExit
                    - FirstPass
                    - EveryPass
Direction           - In
                    - Out
                    - Any
y                   y-coordinate (mm)
LeaveDelay          Detection stops after the LeaveDelay time when an object leaves the zone area.
QueueDetection      Exclude certain areas in your field of view from tracking detection.
Line                Array of lines used for detecting the passing of an object. The x, y values
                    are defined as a ratio of the image size; the range is 0~10000.
x                   x-coordinate
y                   y-coordinate
Field               A polygon set of the detection area. The number of polygon points is in the
                    range <3~20>. The x, y values are defined as a ratio of the image size;
                    the range is 0~10000.
x                   x-coordinate
y                   y-coordinate
Distance            Distance between people to be counted as one queue
OpenDelay           Start analysis after this many seconds
CloseDelay          Stop analysis after this many seconds
EnterDelay          Add 1 person after it has entered the queue for this many seconds
LeaveDelay          Subtract 1 person after it has left the queue for this many seconds
5.4.3. Alarm
URI
/VCA/Config/Alarm
Description
Analytics event rules settings
Default value
N/A
GET
Input data None
Return data Analytics event rules settings in JSON format
Request example
GET /VCA/Config/Alarm HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Counting": {
"0": {
"AlarmRuleName": "Counting1",
"Condition": "GTE",
"Source": "In",
"ThreshHold": 10,
"TopicName": "tns1:RuleEngine/TrackerAlarm/test2",
"RuleName": "test1"
}
},
"ZoneDetection": {
"0": {
"AlarmRuleName": "Zone1",
"Condition": "LTE",
"Source": "Inside",
"ThreshHold": 2,
"TopicName": "tns1:RuleEngine/TrackerAlarm/test",
"RuleName": "test3"
}
}
}\n
POST
N/A
Parameters Description
Counting Counting event rule
ZoneDetection Zone detection event rule
AlarmRuleName The name of the binding analytics rule target
Condition        The condition for the specified source to trigger this event.
                 GTE: greater than or equal to
                 LTE: less than or equal to
The trigger source of the analytics rule.
For Counting line and Flow path, there are three available sources:
direction.
"recipient": "",
"sender": "",
"sslmode": 0,
"url": "192.168.1.155",
"usr": "ben.wu",
"pwd": "1234",
"port": 21,
"sdcyclic": 0,
"fileformat": "ftpreport_%T.%F",
"uri": "ftp path",
"status": "fail"
}
}\n
POST
N/A
Parameters           Description
name                 User defined target name
aggregation          The aggregation period for each data point in reports. Events in the same
                     aggregation level are accumulated into one data point. The camera supports
                     the same options as the delivery schedule. Note that the aggregation level
                     must be shorter than the delivery schedule. The unit is seconds.
format               The camera supports three report formats: XML, CSV and JSON. The detailed
                     content of each format is introduced later.
schedule             The interval between pushed aggregated reports; it is also the total
                     duration covered by one report. The camera supports 1 min, 5 mins, 15 mins,
                     30 mins, 1 hr, 12 hrs, 1 day. All schedules start from 00:00.
lite                 Lite mode ignores zero data to reduce the size of each report. If lite mode
                     is off, the report contains zero in/out records even if no count event
                     occurs in that aggregation period.
                     0: disable
                     1: enable
localtime            Show the StartTime, EndTime in camera local time with ISO 8601 format.
                     0: disable
                     1: enable
servertypeselector   Supported server types:
                     "http"
                     "https"
                     "ftp"
                     "email"
                     "sdcard"
recipient            Valid email addresses of recipients (separated by semicolons ;)
sender               Valid email address of the sender
sslmode              HTTP/email secure mode.
                     0: disable secure mode
                     1: enable secure mode
url                  SMTP/FTP/HTTP/HTTPS server IP address
usr                  Username if the server requires authorization
pwd                  Corresponding password of the username
port                 Server port number
sdcyclic             If cyclic storage is enabled, SD memory management is enabled: when memory
                     usage reaches 90% of the total memory size, old contents are deleted to
                     free space for new data. If cyclic storage is not enabled, reports are not
                     recorded once usage exceeds 90% of the total memory size.
fileformat           Customize the report filename.
                     Default: report_%T.%F
5.5. Data
5.5.1. Start/Stop Map
URI
/VCA/Data/StartStopMap
Description
Get coordinates of all start-stop pairs
Default value
N/A
GET
Input data None
Return data Start/Stop map in JSON format
Request example
GET /VCA/Data/StartStopMap HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Data": [
{
"ex": 1023,
"ey": -558,
"sx": 1560,
"sy": 350
},
{
"ex": 1297,
"ey": 695,
"sx": 540,
"sy": -550
}
],
"Status": 200
}\n
POST
N/A
Parameters Description
Data              Start-Stop map, object array.
sx                The x-coordinate of the Start point.
sy                The y-coordinate of the Start point.
ex                The x-coordinate of the End (Stop) point.
ey                The y-coordinate of the End (Stop) point.
status            100: Continue
                  This interim response is used to inform the client that the initial part of the
                  request has been received and has not yet been rejected by the server. The
                  client SHOULD continue by sending the remainder of the request or, if the
                  request has already been completed, ignore this response.
                  200: OK
                  The request has succeeded.
5.6.2. Time
URI
/VCA/Camera/Time
Description
Current Unix timestamp with format [Second].[Millisecond]
Default value
N/A
GET
Input data None
Return data Current Unix timestamp
Request example
GET /VCA/Camera/Time HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: text/html\r\n
\r\n
1525052709.715\n
POST
N/A
Parameters Description
N/A
5.6.3. Status
URI
/VCA/Camera/Status
Description
Camera current status
Default value
N/A
GET
Input data None
Return data Camera current status in JSON format
Request example
GET /VCA/Camera/Status HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"DIState": "Off",
"TamperingState": "Normal",
"UtcTime": "2017-07-25T10:06:25.727Z"
}\n
POST
N/A
Parameters Description
DIState Digital input status
TamperingState Tampering status
UtcTime Current time
5.7. Stitching
5.7.1. Get Stitching information
URI
/VCA/Stitching/StitchInfo
Description
Get stitching information per camera pair, including X/Y translation and rotation
Default value
N/A
GET
Input data None
Return data Status
Request example
GET /VCA/Stitching/StitchInfo HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200,
"Transformation": [
{
"Camera1": 0,
"Camera1Roi": {
"Height": 480,
"Width": 480
},
"Camera2": 1,
"Camera2Roi": {
"Height": 480,
"Width": 480
},
"Confidence": 10,
"IntTranslationX": 53,
"IntTranslationY": -6991,
"PairStatus": "done",
"Rotation": 1.517895,
"RotationDegree": 273.221008,
"Scale": 1,
"TranslationX": 0.006261,
"TranslationY": -0.827888
}
]
}
POST
Input data
{
    "Transformation": [
        {
            "Camera1": 0,
            "Camera2": 1,
            "IntTranslationX": -1988,
            "IntTranslationY": 5155,
            "RotationDegree": 177.25
        }
    ]
}
Return data
{
    "Status": 200
}
Parameters        Description
                  Only works when stitching is done.
Camera1           Camera index of the current transform master; please refer to the GET
                  command results.
Camera2           Camera index of the current transform slave.
IntTranslationX   The distance of the slave camera along the X coordinate of the master
                  camera, in millimeters.
IntTranslationY   The distance of the slave camera along the Y coordinate of the master
                  camera, in millimeters.
RotationDegree    The rotation in degrees to match the coordinates of the slave camera to
                  the coordinates of the master camera.
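For example, the stitching transform could be modified manually from JavaScript as follows (the values, IP address and credentials are placeholders; verify the exact behavior against your firmware):

// Hypothetical example: manually set the stitching transform between
// camera 0 (master) and camera 1 (slave). Values and credentials are placeholders.
fetch("http://172.16.56.13/VCA/Stitching/StitchInfo", {
    method: "POST",
    headers: {
        "Authorization": "Basic " + btoa("root:root"),
        "Content-Type": "application/json"
    },
    body: JSON.stringify({
        Transformation: [
            { Camera1: 0, Camera2: 1,
              IntTranslationX: -1988, IntTranslationY: 5155, RotationDegree: 177.25 }
        ]
    })
}).then(function (response) { return response.json(); })
  .then(function (result) { console.log(result.Status); });   // expect 200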
5.8. Tracking
5.8.1. Tracking ON
URI
/VCA/Tracking/On
Description
Enable object tracking
Default value
N/A
GET
Input data None
Return data Status
Request example
GET /VCA/Tracking/On HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200
}\n
POST
N/A
Parameters Description
Status            100: Continue
                  This interim response is used to inform the client that the initial part of the
                  request has been received and has not yet been rejected by the server. The
                  client SHOULD continue by sending the remainder of the request or, if the
                  request has already been completed, ignore this response.
                  200: OK
                  The request has succeeded.
5.8.2. Tracking OFF
URI
/VCA/Tracking/Off
Description
Disable object tracking
Default value
N/A
GET
Input data None
Return data Status
Request example
GET /VCA/Tracking/Off HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200
}\n
POST
N/A
Parameters Description
Status            100: Continue
                  This interim response is used to inform the client that the initial part of the
                  request has been received and has not yet been rejected by the server. The
                  client SHOULD continue by sending the remainder of the request or, if the
                  request has already been completed, ignore this response.
                  200: OK
                  The request has succeeded.
                  400: Bad Request
                  The request could not be understood by the server due to malformed syntax.
5.9.2. Counting Rule Trigger Test
URI
/VCA/Rule/Test?Counting&In=[In count]&Out=[Out count]
Description
Forces all user-configured counting rules to be triggered, sending event data with the specified counts
Default value
N/A
GET
Input data Number
Return data Status
Request example
GET /VCA/Rule/Test HTTP/1.1\r\n
Host: 172.16.56.13\r\n
Authorization: Basic cm9vdDpyb290\r\n
User-Agent: curl/7.53.1\r\n
Accept: */*\r\n
Response example
HTTP/1.1 200 OK\r\n
Content-type: application/json\r\n
\r\n
{
"Status": 200
}\n
POST
N/A
Parameters Description
Status 200: OK
The request has succeeded.
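For example (placeholder counts, IP address and credentials), a test trigger with In=2 and Out=1 could be issued like this:

// Hypothetical example: force the counting rules to emit a test event with
// In = 2 and Out = 1. IP address and credentials are placeholders.
fetch("http://172.16.56.13/VCA/Rule/Test?Counting&In=2&Out=1", {
    headers: { "Authorization": "Basic " + btoa("root:root") }
}).then(function (response) { return response.json(); })
  .then(function (result) { console.log(result.Status); });   // 200 on success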