
Development of Social Distance Admonishing

System using YOLOv3 Object Detection Algorithm

by

Louie Francis C. Eusebio

A Practicum Report Submitted to the School of Graduate Studies


in Partial Fulfilment of the Requirements for the Degree

Master of Science in Electronics Engineering


Major in Control Systems

Engr. Glenn V. Magwili


Thesis Adviser

Mapúa University

July 2021
Chapter 1
INTRODUCTION

In the current times, most establishments implement social distancing and crowd control practices as the threat of COVID-19 is still present. As per the World Health Organization (WHO), as of June 24, 2021 the Philippines had 1,367,894 total confirmed cases, with new cases still numbering in the thousands since the month of March. The action suggested by the WHO, the Inter-Agency Task Force (IATF), and the Department of Health (DOH) is to implement regulations limiting the number of people gathering in an area and maintaining a 1-meter distance between people, along with the proper wearing of face masks and face shields. However, most establishments enforce these measures manually and assign personnel to this work. Additionally, the measures may not always be properly implemented because the flow of people coming in and out of the establishment is not in sync with the personnel's monitoring schedule. These concerns may be addressed by a camera system that continuously counts the number of people and monitors the distance between people in the vicinity. An admonishing system will also be put in place for immediate mitigation of breaches of the social distancing and people-limit rules.

Studies on the improvement of social distancing and crowd control practices have been emerging recently. One such study, by Imran Ahmed et al. in November 2020, focused on the development of a deep learning framework for social distance tracking that used an overhead perspective and You Only Look Once (YOLOv3) object recognition to determine the distance between people identified in captured videos. Another recent study, by P. Gupta et al. in 2020, focused on using the YOLOv3 algorithm and the Single Shot multi-box Detector (SSD) algorithm to count the number of people present in captured videos while also considering detection efficiency. These two studies covered detection only and lacked an admonishing system, using a camera and an alarm, to mitigate the violations being detected. A study by Y. Cai et al., published March 29, 2016, focused on using a single pan-tilt-zoom (PTZ) camera for dynamic people tracking and face capture. The zoom function of the PTZ camera is used to detect the person and his/her face, while the pan and tilt functions are used for real-time detection and following of the person. The system first detects the person while the camera is zoomed out. At a set time, the camera zooms in to detect a person in the testing area while associating the person with face images and trajectories. The detection by the PTZ camera is dynamic in nature as it continues to move and follow the person being detected. Receiving and processing images in this study must be done in real time.

The use of a camera sensor and an algorithm such as YOLOv3 provides an effective means to detect social distancing activity, but these systems lack an admonishing component for immediate resolution of violations. At present, establishments determine social distancing compliance and admonish violators by having assigned personnel do it manually; however, their monitoring of the crowd limit and the distance between people may not be accurate, especially when the movement of people varies. Implementing a social distancing monitoring and admonishing system to automatically track social distancing activity and halt further violations can therefore be considered. At present there is no admonishing system in use for the mitigation of social distancing violations.


The main objective of the study is to develop a camera-based admonishing system for

social distancing protocol violations. The specific objectives of the study are as follows: (1)

To design and construct the prototype setup for the Admonishing System for Social Distance

and Crowd Control. (2) To utilize the YOLOv3 algorithm for counting people and measuring the distance between people in the vicinity using a camera sensor. (3) To develop the alarm and admonishing system, with camera capture focused on the violators and display of the violators on the assigned monitor. (4) To determine the accuracy and the cycle time of the admonishing system.

The significance of the study is to provide establishments (e.g., stores, offices) with an automated means of monitoring the social distancing activity of people within their premises, improving on the real-time monitoring and tracking done by establishment personnel. With its automated capability, it reduces the manual workload of the personnel and further improves the implementation of crowd control and social distancing practices in establishments.

The main scope of the study is the development and evaluation of the Admonishing System for Social Distance and Crowd Control. Python will be the main programming language used for the study. For the development of the social distance tracking, crowd control tracking, and admonishing programs, the system will be installed in a multi-purpose hall, an area that represents the common lobbies of most establishments. The testing site will have a floor area of at least 20 sq. m, and the maximum number of people for the testing will be limited to eight (8).


Chapter 2

REVIEW OF RELATED LITERATURE

Importance of Social Distancing

As per various health institutions, such as the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) of the United States of America, social distancing (or physical distancing) means keeping a safe distance of at least around 1 meter between yourself and other people (others recommend a greater distance of at least 6 feet). This practice is considered important in preventing infection by viruses, including COVID-19, since the main means of transmission is considered to be from person to person. It is also widely regarded as a way to limit the spread of disease and to avoid being close to contaminated surfaces and infected people in public places, as well as to avoid inhaling or being exposed to droplets and particles coming from other people when they talk, cough or sneeze. Practicing social distancing also goes together with the wearing of a mask for further protection from viruses, including COVID-19.


Figure 2.1 WHO Infographic on COVID-19 Facts and Guidelines

Importance of Crowd Limit

Dense crowds are considered a source of COVID-19 transmission, as the distance between people is minimal. Guidelines and rulings on allowable crowd limits have been implemented around the world; full crowd capacity is allowed for essential workplaces such as hospitals, frontline service institutions, and food preparation establishments, while only a fraction of the allowable crowd capacity is permitted for other institutions.


Object Detection

Object detection is a computer vision technique used on images and videos to identify object instances, often leveraging machine learning or deep learning to yield better output. Object detection aims to replicate human object recognition and is mostly used in video surveillance and image retrieval systems. The figure below shows an application of object detection in identifying and locating a "vehicle" object in a captured image.

Figure 2.2 Sample application of Object Detection in Identifying and Locating a "Vehicle" Object

Object Detection thru You Only Look Once (YOLO)

"You Only Look Once, Version 3" (YOLOv3) is a real-time object detection algorithm used to identify specific objects in images and videos by comparing them against its dataset repository. The algorithm is based on a Convolutional Neural Network (CNN), a classifier-based system that processes input images and videos as structured arrays of data and identifies patterns between them. YOLOv3 is usually implemented using the Keras and/or OpenCV deep learning libraries, and its object classification outputs are commonly used by Artificial Intelligence (AI) programs for identifying objects. YOLO's main advantage is that it is much faster at identifying objects, making it suitable for real-time use.

The YOLOv3 model looks at the whole image to perform predictions of the objects being detected. YOLOv3 makes use of similarity "scores" against predefined classes and the dataset: each detected object is assigned the class with which it has the highest similarity score. The algorithm first divides the input image into grid cells, then predicts bounding boxes around detected objects that obtain high similarity scores. Each object carries these scores of its agreement with the predefined classes, and the bounding boxes are generated by clustering box dimensions from the predefined classes.

Figure 2.3 YOLOv3 Architecture


Figure 2.4 Sample of YOLOv3 Vision

For this study, YOLOv3 will mainly be used for detecting and counting the people appearing in the video feed from the main camera. The dataset repository used for this algorithm will come from the publicly available dataset on the official website of the YOLOv3 developer (url: https://pjreddie.com/darknet/yolo/).
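As an illustration of how these pre-trained files could be used, the sketch below loads the YOLOv3 configuration, weights, and COCO class list from the website above through OpenCV's DNN module and counts the "person" detections in one frame. The file names, the 416 x 416 input size, and the confidence and NMS thresholds are assumptions for illustration, not fixed requirements of the study.

import cv2
import numpy as np

# Pre-trained files (yolov3.cfg, yolov3.weights, coco.names) downloaded from
# the official YOLOv3 page; names and thresholds here are illustrative only.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
out_names = net.getUnconnectedOutLayersNames()

def count_people(frame, conf_thresh=0.5, nms_thresh=0.4):
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    boxes, confidences = [], []
    for output in net.forward(out_names):
        for det in output:
            scores = det[5:]
            class_id = int(np.argmax(scores))
            conf = float(scores[class_id])
            if class_id == 0 and conf >= conf_thresh:   # class 0 in coco.names is "person"
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                confidences.append(conf)
    if not boxes:
        return 0, []
    keep = np.array(cv2.dnn.NMSBoxes(boxes, confidences, conf_thresh, nms_thresh)).flatten()
    return len(keep), [boxes[i] for i in keep]

frame = cv2.imread("sample_frame.jpg")          # one frame from the camera feed
count, people_boxes = count_people(frame)
print("People detected:", count)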

The Raspberry Pi 4 Model B

Figure 2.5 Raspberry Pi 4 Model B


The Raspberry Pi is a cost-efficient minicomputer capable of most of the fundamental functions that a basic computer can perform. It also has ports that allow communication with other devices such as monitors, printers, and routers. One of the latest products, the Raspberry Pi 4 Model B, will be used for the project. Some of its specifications include 8 GB of SDRAM, a quad-core 64-bit SoC CPU at 1.5 GHz, 4 USB ports (2 USB 3.0, 2 USB 2.0), micro-HDMI ports, camera and display ports, and an SD card reader.

The Raspberry Pi allows users to create embedded systems with it as the main processor. The programming language used with it is Python, which the Raspberry Pi Foundation describes as a beginner-friendly language for learning variables, data structures and basic control flow. The language is capable of the basic programming constructs of statements, operators, and comparisons through variables.

Arduino

Arduino is an open-source electronics platform consisting of a programmable microcontroller board and an Integrated Development Environment (IDE). Both the software and the hardware are published as open source and are extensible by users. The device is commonly used in systems requiring sensors and the control of various output peripherals. The Arduino IDE is based on the C++ programming language and is available on various operating systems such as Windows and Linux.

For this study, the Arduino will be used with the Raspberry Pi to control the Pan-Tilt-Zoom (PTZ) cameras for the admonishing feature of the whole system. The Arduino will be connected to the Raspberry Pi and will communicate through the Universal Asynchronous Receiver/Transmitter (UART) serial interface whenever the system performs the admonishing.

Figure 2.6 Arduino

Universal Asynchronous Receiver/Transmitter (UART)

UART communication lets two UART-enabled devices communicate directly over two lines: one line carries data transmitted from the first device to the second, and the other carries data from the second device back to the first. In this scheme data are transmitted asynchronously, so no external clock signal is required. The transmitting UART device adds start and stop bits to each data packet it sends; when the receiving UART device detects the start bit, it reads the incoming bits at the agreed baud rate until the stop bit. For this study, UART will be the primary means of communication between the Arduino and the Raspberry Pi.

Figure 2.7 UART
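A minimal sketch of the Raspberry Pi side of such a UART link, written with the pyserial library, is given below. The device path /dev/ttyACM0, the 9600 baud rate, and the one-byte command convention are illustrative assumptions; the Arduino firmware that interprets the bytes is not shown.

import time
import serial

# The Arduino usually enumerates as /dev/ttyACM0 or /dev/ttyUSB0 on a Raspberry Pi.
ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
time.sleep(2)  # allow the Arduino to reset after the serial port is opened

def send_command(code: bytes) -> str:
    """Send one command byte; the UART hardware frames it with start and stop bits."""
    ser.write(code)
    reply = ser.readline()            # optional acknowledgement line from the Arduino
    return reply.decode(errors="ignore").strip()

print(send_command(b"A"))             # hypothetical code, e.g. "aim PTZ camera 1 at cluster 1a"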

Camera Sensor

Commercially available modern camera sensors can typically capture moving images at a resolution of at least 720p at an average of 60 frames per second, with a focal length of around 72° and a viewing angle of around 90°-100°. These specifications are sufficient for projects involving object detection, including the detection and counting of the human objects being captured. Modern camera sensors made specifically for the Raspberry Pi have specifications suitable for object detection, and most of them are powered directly from the Raspberry Pi itself.


One popular camera sensor is the Pi NoIR Camera V2. It is equipped with the Sony IMX219 8-megapixel sensor and can capture high-definition (HD) video at up to 1080p and still pictures of 3280 x 2464 pixels. The device is compatible with all models of the Raspberry Pi (including the 4 Model B) and attaches through the dedicated camera sensor interface of the Raspberry Pi.

Figure 2.8 Pi NoIR Camera V2 (attached to the Raspberry Pi)

Pan-Tilt-Zoom (PTZ) Camera

Pan-Tilt-Zoom cameras, commonly abbreviated as PTZ cameras, are devices that act as regular cameras but can be panned, tilted and zoomed while operating, and are mostly used for visual surveillance. The pan, tilt and zoom motions are usually programmed and controlled by a separate microcontroller or computer. For this study, PTZ cameras will mainly be used by the admonishing system to capture the social distancing violators detected by the main camera. The video captured by the PTZ camera will be fed to the display monitor so that people in the vicinity can identify the violators.
People detection and counting using YOLOv3 and SSD models

In this study, the researchers utilized the You Only Look Once (YOLOv3) and Single Shot multi-box Detector (SSD) algorithms to count the number of people at a junction. The algorithms were used independently, one for object detection and the other for object counting. The researchers found that SSD had higher accuracy than YOLOv3 in their testing.

A deep learning-based social distance monitoring framework for COVID-19

In this study, the researchers developed a framework that uses a deep learning platform and YOLOv3 object recognition to detect humans with a camera set in an overhead perspective for social distance tracking. A pre-trained algorithm combined with an overhead human dataset was used for detection, making use of the detected bounding box information. For each detected human, the pairwise distances between centroids are determined using the Euclidean distance. The researchers estimated an approximate physical-distance-to-pixel ratio and a violation threshold, which is then used to determine whether the minimum social distance has been breached.

The researchers' testing included different video sequences to determine the efficiency of the model. They found that their framework successfully identified humans who breached social distance protocols. The detection model achieved an accuracy of 92% without transfer learning and 98% with transfer learning, and the tracking accuracy of the model is 95%.


Real-time people counting system using a single video camera

In this study, the researchers developed a video-based approach to people monitoring and counting: a real-time people counting system using a single low-end video camera. The project also sought to overcome the problems of scene background estimation and of merge-split scenarios, where several persons moving closely together may be counted as a single person. The researchers used background subtraction with an adaptive background and automatic thresholding to reduce noise and handle static object changes, and an adaptive Kalman filter to continuously track moving objects so that the future position of a moving object can be approximated, especially under heavy occlusion; shadows are also removed from the captured video images. Results from their testing showed that the method is capable of tracking and counting people at a high frame rate under background changes and heavy crowd movement.

Real-time distance measurement using a modified camera

In this study, the researchers developed a real-time method of measuring distance using a modified camera whose image sensor is inclined by a certain angle. The camera is set so that it is focused on a projection plane. The object distance is determined by acquiring the image distance, taken as the distance between the projection plane and the camera lens, and applying it to the lens formula. The researchers concluded that their method is effective and can perform a measurement in an average time of 3.21 ms.
An Image Processing based Object Counting Approach for Machine Vision Application

In this study, the researchers utilized a camera for machine vision applications such as object-independent product counting. The camera used has a resolution of 1280x720 px, a 59 fps frame rate, a global shutter, a CCD sensor and a GigE interface. The approach is based on Otsu thresholding and the Hough transformation so that the system can automatically count the product type and color; the products in the study are eggs and soda bottles. The camera captures images of the products on a conveyor, and the following operations are then applied to the images: Gaussian filter, S channel, Otsu threshold, Sobel edge detection and Hough transformation.
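As a rough OpenCV illustration of that pipeline (and not the cited authors' implementation), the sketch below applies a Gaussian filter and Otsu thresholding to a frame and counts the resulting blobs; the file name and the area filter are placeholders, and the Sobel/Hough shape-recognition stage is only noted in a comment.

import cv2

img = cv2.imread("conveyor_frame.jpg")               # placeholder frame of products
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)           # Gaussian filter step
# Otsu automatically selects the global threshold separating products from background.
_, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
# Count the segmented blobs; the cited study further applies Sobel edge detection
# and a Hough transformation to recognise the product shape (eggs, bottles).
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
products = [c for c in contours if cv2.contourArea(c) > 500]   # area filter is a placeholder
print("Products counted:", len(products))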

Persistent people tracking and face capture using a PTZ camera

A study by Y. Cai and G. Medioni, published March 29, 2016, focused on using a single pan-tilt-zoom (PTZ) camera for persistent people tracking and face capture. The system first detects the person while the camera is zoomed out. At a set time, the camera zooms in to detect a person in the testing area while associating the person with face images and trajectories. The detection by the PTZ camera is dynamic in nature as it continues to move and follow the person being detected. Receiving and processing images in this study must be done in real time.


Chapter 3
METHODOLOGY

CONCEPTUAL FRAMEWORK
The study will be conducted in a controlled environment (the multi-purpose hall of a subdivision) with a setup representing the common lobbies of most establishments. Figure 3.1 shows the conceptual framework of the study:

Figure 3.1 Conceptual Framework of the Study


The main input of the study is the main camera sensor, which will acquire live video of the testing area and send it to the main computer (Raspberry Pi). People detection, counting, and measurement of the distance between people will be performed continuously on this computer. When a social distancing protocol violation is detected for 30 straight seconds (i.e., the number of people in the vicinity exceeds the crowd limit, or the distance between detected people is less than one (1) meter), the alarm system and the admonishing system will be activated. The alarm system warns people in the vicinity of the violation and pinpoints the specific area (or cluster) where the violation occurred. The admonishing system is primarily activated when the distance detected between two people is less than 1 meter for thirty (30) straight seconds. Depending on the area or cluster where the violation occurred, the assigned PTZ camera will focus on the violators so they can be shown on the monitor display. The PTZ cameras will be controlled (panned, tilted and zoomed) by the main controller as needed to obtain an optimized view of the detected violators. The accuracy and cycle time of the admonishing system will be determined in this research.

The block diagram of the methodology below shows the general procedural steps of the study:

Figure 3.2 Block diagram of the Methodology

The main objective of the study is to develop a camera-based admonishing system for

social distancing protocol violations. The specific objectives as well as the procedural steps

and descriptions to meet the objectives are discussed below:

The specific objectives of the study are as follows:


(1) To design and construct the prototype setup for the Admonishing System for Social

Distance and Crowd Control.

(2) To utilize the YOLOv3 algorithm for counting people and measuring the distance between people in the vicinity using a camera sensor.

(3) To develop the alarm and admonishing system, with camera capture focused on the violators and display of the violators on the assigned monitor.

(4) To determine the accuracy and the cycle time of the admonishing system.

The following procedural steps will be done to address the specific objectives.

Procedural steps A and B will address specific objective 1. Procedural step C will address

specific objective 2. Procedural step D will address specific objective 3. And procedural step

E will address specific objective 4.

Figure 3.3 shows the block diagram of the prototype:

Figure 3.3 Block Diagram of the Prototype


The main input peripheral of the system is the camera sensor. Every video it captures is fed to the Raspberry Pi and processed by the people counting and social distance tracking programs. For a crowd limit violation, the main output devices are the speakers, which play the alarm sound for the crowd limit violation. For a social distancing violation, the speakers play the alarm sound for the social distancing violation and the admonishing process is performed. For the admonishing, the pan-tilt-zoom (PTZ) camera, through the Arduino connected to the Raspberry Pi via UART communication, will focus on the violators, and the captured video will be displayed on the display monitor.

Figure 3.4. illustrates the outline of the setup and clustering of the testing area:

Figure 3.4 Testing Area Setup and Clustering

The testing area will be divided into six (6) clusters, which will be considered by the alarm system and the admonishing system. Cameras will be placed in the testing area in a frontal-view setup: one main camera sensor that will be used primarily for monitoring social distancing activity, and PTZ cameras that will be used primarily for the admonishing system, each assigned to three adjacent clusters (one PTZ camera will be assigned to clusters 1a, 1b and 1c, while the other PTZ camera covers clusters 2a, 2b and 2c). A speaker will also be placed in the testing area for the alarm system, and a monitor display will be used to present the social distancing violators.

A. Preparation of Materials

Raspberry Pi 4

The Raspberry Pi 4 will be the main computer of the project. The programs for people detection and counting, social distance tracking, and the alarm and admonishing system will run on it, and it will coordinate the output peripherals described below.

Figure 3.5 Raspberry Pi 4 Model B

Table 3.1. Specifications of the Raspberry Pi 4 Model B


Specification Value
CPU Broadcom BCM2711, Quad core Cortex-
A72 (ARM v8) 64-bit SoC @ 1.5GHz
RAM 8 GB LPDDR4-3200 SDRAM
IEEE Wireless Standard 2.4 GHz and 5.0 GHz IEEE 802.11ac
wireless
Micro-SD Slot Yes, for loading operating system and data
storage
Number of available USB ports 2 USB 3.0 ports; 2 USB 2.0 ports.
Input Voltage 5V DC (minimum 3A)
High Efficiency Video Coding (HEVC) H.265 (4kp60 decode), H264 (1080p60
decode, 1080p30 encode)
Camera Serial Interface (CSI) Port 2-lane MIPI CSI camera port
Display Serial Interface (DSI) Port 2-lane MIPI DSI display port
HDMI Ports 2 × micro-HDMI ports (up to 4kp60
supported)
Audio/Video Port 4-pole stereo audio and composite video
port
Operating temperature 0 – 50 degrees C ambient

Camera Sensor

The main camera sensor will be used for people counting and for measuring the distance between detected people. The video captured by the camera will be fed to the Raspberry Pi computer through the CSI port. An extension cable will also be used to give the camera more reach.

Table 3.2 Specifications of the Camera Sensor


Specification Value
Resolution At least 720p (preferably 1080p)
Sensor Type charged-coupled device (CCD)
Aperture 1.8
Focal Length At least 72°
Viewing Angle At least 90°-100°
Input Voltage 3.3-12V
Still Image Resolution At least 8 MP or 3280 x 2464
Arduino
Arduino will be used with the Raspberry Pi to control the Pan-Tilt-Zoom (PTZ) Cameras for

the admonishing feature of the whole system. The Arduino and Raspberry Pi will be connected

by UART Communication.

Table 3.3 Specifications of the Arduino


Specification Value
Microcontroller ATmega328
Operating Voltage 5V
Input Voltage Recommended: 7-9V, limits: 6-20V
Digital I/O pins 14 (of which 6 provide PWM output)
Analog input pins 6
SRAM 2 KB
EEPROM 1 KB
Clock Speed 16 MHz

Sample Schematic Diagram of Raspberry Pi and Arduino Connection.

The Raspberry Pi and the Arduino will be connected to each other through UART communication, as shown in the schematic below.

Figure 3.6 Schematic Diagram of Raspberry Pi and Arduino Connection.


PTZ Cameras

PTZ cameras will mainly be used by the admonishing system to capture the social distancing violators detected by the main camera. The video captured by a PTZ camera will be fed to the display monitor so that people in the vicinity can identify the violators. The PTZ cameras will be connected to the output ports of the Arduino.

Table 3.3 Specifications of the PTZ Camera


Specification Value
Resolution At least 720p (preferably 1080p)
Pan/Tilt (Left-Right) At least 270°
Pan/Tilt (Vertical) At least 90°
Viewing Angle At least 90°-100
Operating Voltage 5-12V
Digital Zoom Up to 4.0x

Estimated Cost of Project

Table 3.4 shows the estimated cost of the components involved. Prices are subject to change depending on availability and shipping costs.

Table 3.4 Estimated Cost of the Project


No. Material Estimated Cost
1 Raspberry Pi 4 Model B Php 5000.00
2 Camera Sensor Php 1600.00
3 Extension Cable for Camera Sensor (1 m.) Php 120
4 PTZ Camera (2 units) Php 3000.00
5 Speakers (General) Php 500
6 Monitor (21 inches) Php 3000
7 Arduino (UNO) Php 500
8 Power Supply for PTZ (5-12 V) – 2 units Php 500
9 Camera Mount (3 pcs.) Php 1200
Estimated Total Cost Php 15420
B. Setup of cameras in the testing area and calibration of program in the Raspberry Pi

Testing Area

The testing and development of the prototype will be done in the multi-purpose hall of a subdivision. The crowd limit is set to 8 people and the minimum allowable distance between people is 1 meter.

Figure 3.7 Testing Area

1. Positioning of Cameras in the testing area.

The cameras will be positioned as shown in Figure 3.4. Several calibration attempts may be required to attain the most optimized views for the cameras. The main camera will be placed to have a proper frontal view of the vicinity. The PTZ cameras will be placed and calibrated such that they can be panned and tilted to view their assigned clusters.
Figure 3.8 Sample Frontal View of Main Camera

The camera view will be adjusted until it practically covers the whole testing area. Once the optimized positioning of the cameras has been determined, proper mounting and wiring will be done. The image below shows the approximate positions of the cameras (main camera sensor and PTZ cameras), monitor and speakers.

Figure 3.9 Approximated Layout of Devices in Testing Area


2. Estimation of Actual Distance Measurement to Pixel Ratio.

Once the main camera has been properly set, the ratio of actual distance to image pixels of the captured image will be estimated as a reference for the social distance monitoring program. This will be done for both the vertical and the horizontal dimensions of the captured image. Let x be the equivalent horizontal distance per horizontal pixel (mm/pixel) and y be the equivalent vertical distance per vertical pixel (mm/pixel). The total numbers of horizontal and vertical pixels will be based on the specifications of the camera to be used. The formulas for x and y are as follows:

x = (Approximated horizontal distance of the testing area) / (Total number of horizontal pixels)   (mm/pixel)

y = (Approximated vertical distance of the testing area) / (Total number of vertical pixels)   (mm/pixel)

Sample Computation for image pixel ratio:

Considering: total number of horizontal pixels = 3280 & Horizontal distance of vicinity = 3

meters or 3000 mm
Figure 3.10 Sample Frontal View of Main Camera with Horizontal measurement

x = 3000 mm / 3280 pixels = 0.915 mm/pixel
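The same ratios feed directly into the distance check of the social distance tracking program. The short sketch below, which reuses the 0.915 mm/pixel value above for both axes purely for illustration, converts the pixel distance between two detected centroids into millimetres and compares it against the 1-meter threshold; the actual x and y ratios will come from the calibration of the testing area.

import math

X_MM_PER_PIXEL = 0.915      # from the sample computation above (illustrative)
Y_MM_PER_PIXEL = 0.915      # the vertical ratio will be calibrated separately
MIN_DISTANCE_MM = 1000      # 1-meter social distancing threshold

def centroid(box):
    """Centroid of a bounding box given as (x, y, width, height) in pixels."""
    x, y, w, h = box
    return (x + w / 2, y + h / 2)

def physical_distance_mm(c1, c2):
    dx = (c1[0] - c2[0]) * X_MM_PER_PIXEL
    dy = (c1[1] - c2[1]) * Y_MM_PER_PIXEL
    return math.hypot(dx, dy)

d = physical_distance_mm(centroid((100, 200, 60, 160)), centroid((900, 220, 60, 150)))
print(f"{d:.0f} mm apart -> violation: {d < MIN_DISTANCE_MM}")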

3. Calibration of People Detection and Counting program in the Raspberry Pi

The program for people detection and counting, social distancing and the admonishing system will utilize the YOLOv3 algorithm. The program will be calibrated locally so that it detects all possible people objects as correctly as possible and is optimized for the camera view.

YOLOv3

"You Only Look Once, Version 3" (YOLOv3), described in Chapter 2, is a real-time object detection algorithm used to identify specific objects in images and videos by comparing them against its dataset repository. It is based on a Convolutional Neural Network (CNN) and is usually implemented using the Keras and/or OpenCV deep learning libraries, with the main advantage of being fast enough for real-time use.

For this study, YOLOv3 will mainly be used for detecting and counting the people appearing in the video feed from the main camera. The object detection in the study will utilize pre-trained models, and the main Python program will be connected to these pre-trained models. The dataset repository for the algorithm will come from the publicly available dataset on the official website of the YOLOv3 developer (url: https://pjreddie.com/darknet/yolo/).

Development of Program

The program will be developed in the PyCharm IDE using the Python programming language. The YOLOv3 algorithm and its accompanying dataset repository will be used for detecting people objects. Counting the detected people and calculating the distances between them will be programmed accordingly. The alarm and admonishing features will also be developed within the same program, taking the connected output peripherals into consideration.
Pycharm

PyCharm is an integrated development environment (IDE) developed by the Czech company JetBrains, mainly for programming in the Python language. Its main features include a smart code editor with inspections, on-the-fly code fixes, code debugging, and code refactoring. PyCharm is a cross-platform IDE available on Windows, macOS and Linux. For the project, the free Community Edition of PyCharm will be used, as its included features are sufficient for developing the program.

Table 3.5 System Requirements for PyCharm


Specification Value
RAM 4 GB free RAM/ recommended: 8 GB
system RAM
CPU Modern CPU/ recommended: Multi-core
CPU
Disk Space 2.5 GB with 1 GB for caches/ recommended:
5 GB from SSD Drive
Operating System Latest 64-bit versions of Windows, macOS,
or Linux

Figure 3.11 Sample User Interface of PyCharm


Pseudocode for Crowd Control

Program Start
While Camera Sensor is ON
    Detect people objects (using YOLOv3 algorithm)
    Count total number of detected people
    If counted people > assigned crowd limit
        Increment violation timer by 1 sec.
        If violation timer reaches 30 secs
            Activate Alarm System for Crowd Limit Violation
    Else
        Reset violation timer
End

Pseudocode for Social Distance Tracking

Program Start
While Camera Sensor is ON
    Detect people objects (using YOLOv3 algorithm)
    Identify centroid of each detected person
    Measure distance between centroids of detected people
    If any measured distance < 1 m for 30 straight secs
        Activate Alarm System for Social Distancing Violation
        Activate Admonishing System
    Else
        Continue monitoring
End
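A hedged Python sketch of the 30-second persistence logic shared by the two routines above is shown below. The callables passed into monitor() stand for the YOLOv3-based counting and distance routines and the alarm/admonishing actions described in this chapter; they are placeholders, not functions from an existing library.

import time

CROWD_LIMIT = 8          # maximum allowable people in the testing area
MIN_DISTANCE_MM = 1000   # 1-meter social distancing threshold
HOLD_SECONDS = 30        # a violation must persist this long before acting

def monitor(people_count, min_distance_mm, on_crowd_alarm, on_admonish, period=1.0):
    """Check the scene roughly once per second and trigger the alarm or
    admonishing actions only after a violation persists for 30 straight seconds."""
    crowd_since = None       # when the current crowd-limit violation started
    distance_since = None    # when the current distancing violation started
    while True:
        now = time.monotonic()
        if people_count() > CROWD_LIMIT:
            crowd_since = crowd_since or now
            if now - crowd_since >= HOLD_SECONDS:
                on_crowd_alarm()
        else:
            crowd_since = None
        if min_distance_mm() < MIN_DISTANCE_MM:
            distance_since = distance_since or now
            if now - distance_since >= HOLD_SECONDS:
                on_admonish()
        else:
            distance_since = None
        time.sleep(period)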
C. Testing and Simulation of the program for counting people and distance between

people in the vicinity.

The program for counting people and measuring the distance between people will be developed using the YOLOv3 algorithm with its associated dataset from the official YOLOv3 website (by the developer). The program will initially be tested in a controlled setup in order to improve its accuracy in detecting people, counting people and measuring the distance between people. After the program is finalized, the different subsystems of the project will be tested under their assigned test cases. The general procedure for people counting and for measuring the distance between people is presented in the flowcharts in Figure 3.12 and Figure 3.13. The testing will be done for each of the test cases enumerated below.

Figure 3.12 People Counting Flowchart


For people counting and crowd limit control, the prototype, through the camera sensor, will identify people (objects) in the vicinity in real time. Every second, the detected people will be counted and the system will check whether the count is still within the allowable crowd limit. If the count of detected people is greater than the designated crowd limit of eight (8) people, there is a possible crowd limit violation and the system will start an internal timer to track how long the violation continues. If the violation has been tracked for 30 straight seconds, the crowd alarm will be initiated.

Figure 3.13 Social Distancing Flowchart


For social distancing tracking, the prototype, through the camera sensor, will identify people (objects) in the vicinity in real time and additionally identify the centroid of each identified person. Every second, the system will measure the distance between the centroids of the identified objects and check whether the distance between each pair of people is greater than the minimum allowable distance of one (1) meter. If a distance of less than 1 meter is detected, there is a possible social distancing violation and the system will start an internal timer to track how long the violation continues. If the violation has been tracked for 30 straight seconds, the social distance violation alarm and the admonishing system will be initiated.

Testing for People Detection and Counting

This testing evaluates the prototype's ability to perform people detection and counting. It will be carried out for each of the test cases enumerated below, and the tables below will be used to record the operation of the people detection and counting program. For the real-time people counting capability of the system, the count will be produced automatically by the program and also taken manually by the researcher. For this study, the threshold in terms of people count is eight (8) for all test cases, and each test case will have 10 trials. The following test cases will be considered:

1. People positioned at least 1 meter away from each other and not exceeding the crowd limit.

2. People positioned at least 1 meter away from each other and exceeding the crowd limit.

3. Two (2) people violating the social distancing protocol by standing very close to each other.

4. Several people moving at a quick pace.


Table 3.6 People Detection and Counting in Test Case #1
Trial System Count Manual Count Is System Count =
Manual Count (Yes/No)
1
2

10

Table 3.7 People Detection and Counting in Test Case #2


Trial System Count Manual Count Is System Count =
Manual Count (Yes/No)
1
2

10

Table 3.8 People Detection and Counting in Test Case #3


Trial System Count Manual Count Is System Count =
Manual Count (Yes/No)
1
2

10

Table 3.9 People Detection and Counting in Test Case #4


Trial System Count Manual Count Is System Count =
Manual Count (Yes/No)
1
2

10

The accuracy for each test case will be determined by taking the number of trials in which the system count equals the manual count, dividing it by the total number of trials, and multiplying by 100%.
Testing for Social Distance Tracking
This testing evaluates the prototype's ability to measure the distance between people. It will be carried out for test cases with specific distances between two people. For each trial of each test case, the positions of the two people will be varied throughout the test area while maintaining the assigned distance between them. Tables 3.10-3.12 will be used to record the operation of the social distance tracking program. Each test case will have 10 trials, and the distance between the people will be measured both manually and automatically by the system.

These test cases will be considered for this testing:

1. Distance between people is very close (< 1 m).

2. Distance between people is 1 m.

3. Distance between people is 2 m.

Table 3.10 Social Distance Tracking in Test Case #1


Trial System Manual %Accuracy Is %Accuracy
Measurement Measurement ≥ 95 %
1
2

10

Table 3.11 Social Distance Tracking in Test Case #2


Trial System Manual %Accuracy Is %Accuracy
Measurement Measurement ≥ 95 %
1
2

10
Table 3.12 Social Distance Tracking in Test Case #3
Trial System Manual %Accuracy Is %Accuracy
Measurement Measurement ≥ 95 %
1
2

10

For each test case, the percentage accuracy will be calculated for each trial:

Percentage Accuracy, %Accuracy = [(actual − |actual − measured|) / actual] × 100%

The minimum accepted percentage accuracy is 95%. The basis is the published work of R. Keniya and N. Mehendale, "Real-time social distancing detector using SocialdistancingNet-19 deep learning network" (August 2020), which achieved an accuracy of 92.8%. A trial that attains a percentage accuracy of at least 95% is considered successful. The accuracy of the social distance tracking will then be computed as the total number of successful trials divided by the total number of trials, multiplied by 100%.
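For reference, the two accuracy figures described above reduce to the small helper functions sketched below; the (actual, measured) pairs passed in would come from Tables 3.10-3.12 once the trials are performed.

def percentage_accuracy(actual_mm, measured_mm):
    """%Accuracy = (actual - |actual - measured|) / actual x 100%."""
    return (actual_mm - abs(actual_mm - measured_mm)) / actual_mm * 100.0

def tracking_accuracy(trials, threshold=95.0):
    """Share of trials whose %Accuracy meets the 95% acceptance threshold."""
    successes = sum(1 for actual, measured in trials
                    if percentage_accuracy(actual, measured) >= threshold)
    return successes / len(trials) * 100.0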

D. Testing alarm system and admonishing system

Alarm System

The alarm system will be activated when either of these violations is detected by the preceding system: [1] crowd limit and [2] social distancing.

a. Alarm System for Crowd Limit violation

Once the prototype has detected a crowd limit violation occurring for 30 straight seconds, the alarm system will play an alarm tone and a voice message indicating that the crowd limit has been violated. The alarm system will remain active until the main camera sensor has detected no crowd limit violation for 15 seconds. The alarm system for crowd limit violation will be tested in a controlled environment where the crowd limit may be varied between trials. The general procedure for this is presented in the flowchart in Figure 3.14.

Figure 3.14 Alarm System for Crowd Limit Violation Flowchart

Testing of Alarm System for Crowd Limit violation

The alarm system for crowd limit violation will be tested in each of the test cases enumerated below. Each test case will be assigned a different crowd limit to determine whether the system is flexible with various crowd limit settings, and each test case will have ten trials. For each trial, the testing area will be populated in increments of 1, starting from 1 person. The people in the area will purposely stay in the area for at least 30 seconds. There should be no alarm if the number of people is equal to or less than the assigned crowd limit, and the alarm should sound if the number of people is greater than the assigned crowd limit; this tests whether the alarm system for the crowd limit responds accordingly. The alarm system for crowd limit violation will only be activated if the detected number of people in the vicinity is greater than the designated people limit for 30 straight seconds. Once activated, the alarm tone will play and an automated voice message will indicate that a crowd limit violation has occurred. Tables 3.13-3.15 will be used to record the reliability of the alarm system for crowd limit violation.

For trials in which the alarm system is activated, the following procedure will also be applied: once the alarm is activated, the number of people will be purposely reduced so that the total number of people is equal to or less than the assigned crowd limit. The alarm system should turn off after 15 continuous seconds of no crowd violation.

These test cases will be considered for this testing:

1. Crowd limit of six (6) people

2. Crowd limit of seven (7) people

3. Crowd limit of eight (8) people

Table 3.13 Alarm System for Crowd Limit violation for Test Case #1
Trial Crowd Populated Did Alarm In case of Is Response
Limit People in System activation, did Correct?
Setting the Testing Activate? alarm stop after (YES/NO)
Area (YES/NO) continuous 15
seconds of no
crowd violation
(YES/NO/NA)
1 6 1
2 6 2
3 6 3
4 6 4
5 6 5
6 6 6
7 6 7
8 6 8
9 6 9
10 6 10

Table 3.14 Alarm System for Crowd Limit violation for Test Case #2
Trial Crowd Populated Did Alarm In case of Is Response
Limit People in System activation, did Correct?
Setting the Testing Activate? alarm stop after (YES/NO)
Area (YES/NO) continuous 15
seconds of no
crowd violation
(YES/NO/NA)
1 7 1
2 7 2
3 7 3
4 7 4
5 7 5
6 7 6
7 7 7
8 7 8
9 7 9
10 7 10

Table 3.15 Alarm System for Crowd Limit violation for Test Case #3
Trial Crowd Populated Did Alarm In case of Is Overall
Limit People in System activation, did Response
Setting the Testing Activate? alarm stop after Correct?
Area (YES/NO) continuous 15 (YES/NO)
seconds of no
crowd violation
(YES/NO/NA)
1 8 1
2 8 2
3 8 3
4 8 4
5 8 5
6 8 6
7 8 7
8 8 8
9 8 9
10 8 10

The reliability (in percentage) will be determined for each test case. It is computed as the total count of correct overall responses of the system divided by the number of trials, multiplied by 100.

b. Alarm System for Social Distance violation

Once the prototype has detected a social distancing violation occurring for 30 straight seconds, the alarm system will play an alarm tone and a voice message indicating that the social distancing protocol has been violated and the cluster where the violation occurred. The alarm system will remain active until the main camera sensor has detected that the violators are separated by at least 1 meter. The general procedure for this is presented in the flowchart in Figure 3.15.
Figure 3.15 Alarm System for Social Distancing Violation Flowchart
For the testing of the alarm system for social distancing violation, the testing area will be populated with two (2) people who are less than 1 meter from each other. Three trials will be dedicated to each cluster, for a total of 18 trials. The alarm system for social distancing violation will only be activated if people objects are detected less than 1 meter away from each other for 30 straight seconds. For each trial, the two people in the area will purposely stay in the assigned cluster and close to each other for at least 30 seconds; the alarm for social distancing violation should then start, with a voiceover indicating the cluster where the violation occurred. This tests whether the alarm system for social distancing responds accordingly. Once the alarm is activated, the people will purposely move away from each other, and the alarm system should turn off once the distance between the detected violators is at least 1 m. Table 3.16 will be used to record the reliability of the alarm system for social distancing violation.
Table 3.16 Alarm System for Social Distancing violation

Trial Assigned Cluster Did Alarm System In case of Is Overall


Activate? activation, did Response
(YES/NO) alarm stop after Correct?
violators have (YES/NO)
separated?
(YES/NO/NA)
1 1a
2 1a
3 1a
4 1b
5 1b
6 1b
7 1c
8 1c
9 1c
10 2a
11 2a
12 2a
13 2b
14 2b
15 2b
16 2c
17 2c
18 2c

For each trial, the alarm system will be tested to see whether it responds correctly. The alarm system for social distancing violation will only be activated if a distance of less than 1 meter between people is measured for 30 straight seconds. Once activated, the alarm tone will play and an automated voice message will indicate the cluster where the social distancing violation occurred. The reliability (in percentage) will be determined as the total count of correct responses of the system divided by the number of trials, multiplied by 100.

E. Testing to determine the Accuracy and Cycle Time of the Admonishing System
Admonishing System

The admonishing system will utilize the secondary PTZ cameras to obtain an optimized view of the violators. After the assigned PTZ camera has acquired an optimized view of the violators specifically detected by the main camera sensor, the following actions occur at the same time: (1) an alarm tone is activated and (2) the PTZ camera focuses on the violators while a monitor display shows the optimized image of the violators. The actions of the admonishing system stop once the main camera sensor detects that the violation has been mitigated. The recent studies discussed earlier target general use, but the application of this study is for establishments with a heavy volume of people during operating hours (e.g., government establishment lobbies, grocery marts).

The admonishing system will be activated once the prototype has detected a social distancing violation occurring for 30 straight seconds. Prior to its activation, the cluster where the violation occurred will be identified in order to determine which assigned PTZ camera to use. The general procedure for this is presented in the flowchart in Figure 3.16.


Figure 3.16 Admonishing System Flowchart
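One way the cluster-to-camera assignment described above could be expressed in the main Python program is sketched below; the serial port and the single-byte command codes are assumptions carried over from the UART example in Chapter 2, and the Arduino firmware that moves the PTZ cameras to the corresponding presets is not shown.

import serial

# Clusters 1a-1c are covered by PTZ camera 1 and clusters 2a-2c by PTZ camera 2;
# the command bytes are hypothetical preset codes interpreted by the Arduino.
CLUSTER_COMMANDS = {
    "1a": b"A", "1b": b"B", "1c": b"C",   # presets of PTZ camera 1
    "2a": b"D", "2b": b"E", "2c": b"F",   # presets of PTZ camera 2
}

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)

def admonish(cluster: str):
    """Ask the Arduino to aim the assigned PTZ camera at the violating cluster."""
    arduino.write(CLUSTER_COMMANDS[cluster])

admonish("1b")   # e.g. a persistent violation was detected in cluster 1b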

Cycle Time

The cycle time of the admonishing system starts at the moment the prototype has detected a social distancing violation occurring for 30 straight seconds and ends when the PTZ camera has acquired an optimized view of the violators and the monitor display shows the optimized image of the violators.

Table 3.17 will be used for the testing to determine the accuracy and cycle time of the admonishing system. For each trial, the admonishing system will be tested to see whether it responds correctly. The assigned PTZ camera should be able to focus on the detected cluster and the detected violators so that the violators are displayed on the monitor. The zoom function of the PTZ camera may be used to zoom in on the detected violators for a better video presentation.
Table 3.17 Admonishing System Response

Trial Did assigned Did assigned Did the Is Overall Total cycle
PTZ focus on PTZ focus on monitor Response time (in
correct cluster? detected display Correct? (YES, secs.)
(YES/NO) violators? violators? if all previous 3
(YES/NO) (YES/NO) items are YES.
Otherwise, NO)
1
2

20

For each trial, the admonishing system will be tested on the following functions: (1) the assigned PTZ camera focuses on the correct cluster, (2) the assigned PTZ camera focuses on the detected violators, and (3) the monitor properly displays the detected violators. A "YES" is recorded if the function works properly, and the overall response for a trial is tagged "YES" only if all three functions work properly. The accuracy of the admonishing system will be determined from the successful overall responses: it is equal to the number of correct overall responses divided by the total number of trials performed.

Accuracy = (Total number of Correct Overall Responses) / (Total number of Trials)

To determine the cycle time of the admonishing system, the average of the total cycle times over all trials will be computed.

Average Cycle Time = Σ(total cycle time) / (Total number of Trials)
REFERENCES

[1] World Health Organization, WHO Coronavirus (COVID-19) Dashboard - Philippines, n.d. Accessed on: June 1, 2021. [Online]. Available: https://covid19.who.int/region/wpro/country/ph

[2] I. Ahmed, M. Ahmad, J. J. P. C. Rodrigues, G. Jeon, S. Din, A deep learning-based social distance monitoring framework for COVID-19, National Center for Biotechnology Information, Nov. 1, 2020. Accessed on: June 1, 2021. [Online]. Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7603992/

[3] P. Gupta, V. Sharma, S. Varma, People detection and counting using YOLOv3 and SSD models, ScienceDirect, Jan. 20, 2021. Accessed on: June 1, 2021. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S2214785320392312

[4] Y. Cai & G. Medioni, Persistent people tracking and face capture using a PTZ camera, ResearchGate, Nov. 2020. Accessed on: June 1, 2021. [Online]. Available: https://www.researchgate.net/publication/346028195_Persistent_people_tracking_and_face_capture_using_a_PTZ_camera

[5] S. Gupta et al., SD-Measure: A Social Distancing Detector, ResearchGate, Nov. 2020. Accessed on: June 1, 2021. [Online]. Available: https://www.researchgate.net/publication/345316546_SD-Measure_A_Social_Distancing_Detector

[6] T. Banas, How to Calculate Relative Accuracy, Sciencing, Oct. 31, 2020. Accessed on: June 1, 2021. [Online]. Available: https://sciencing.com/calculate-relative-accuracy-6069718.html

[7] R. Keniya, N. Mehendale, Real-Time Social Distancing Detector Using Socialdistancingnet-19 Deep Learning Network, SSRN, Aug. 11, 2020. Accessed on: June 1, 2021. [Online]. Available: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3669311

[8] E. Y. Li, Dive Really Deep into YOLO v3: A Beginner's Guide, Towards Data Science, Dec. 31, 2019. Accessed on: June 1, 2021. [Online]. Available: https://towardsdatascience.com/dive-really-deep-into-yolo-v3-a-beginners-guide-9e3d2666280e

[9] YOLO: Real-Time Object Detection, n.d. Accessed on: June 1, 2021. [Online]. Available: https://pjreddie.com/darknet/yolo/

[10] Raspberry Pi Foundation, Raspberry Pi 4 Model B, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.raspberrypi.org/products/raspberry-pi-4-model-b/

[11] Raspberry Pi Foundation, Camera Module v2, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.raspberrypi.org/products/camera-module-v2/

[12] Raspberry Pi Foundation, Pi NoIR Camera v2, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.raspberrypi.org/products/pi-noir-camera-v2/

[13] L. Duran-Polanco, M. Siller, Crowd management COVID-19, ScienceDirect, April 12, 2021. Accessed on: June 1, 2021. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S1367578821000249

[14] Ains, 15 Best Raspberry Pi Cameras and Lenses, Seeed Studio, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.seeedstudio.com/blog/2020/06/17/15-best-raspberry-pi-cameras-and-lenses-m/

[15] MathWorks, Object Detection, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.mathworks.com/discovery/object-detection.html

[16] World Health Organization, COVID-19: physical distancing, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.who.int/westernpacific/emergencies/covid-19/information/physical-distancing

[17] World Health Organization, COVID-19 transmission and protective measures, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.who.int/westernpacific/emergencies/covid-19/information/transmission-protective-measures

[18] V. Meel, YOLOv3: Real-Time Object Detection Algorithm (What's New?), viso.ai, Feb. 25, 2021. Accessed on: June 1, 2021. [Online]. Available: https://viso.ai/deep-learning/yolov3-overview/

[19] Sparkfun, What is an Arduino?, n.d. Accessed on: June 1, 2021. [Online]. Available: https://learn.sparkfun.com/tutorials/what-is-an-arduino/all

[20] Arduino, What is Arduino?, Feb. 05, 2018. Accessed on: June 1, 2021. [Online]. Available: https://www.arduino.cc/en/Guide/Introduction

[21] N. Bansal, Object Detection using YoloV3 and OpenCV, Towards Data Science, Mar. 8, 2020. Accessed on: June 1, 2021. [Online]. Available: https://towardsdatascience.com/object-detection-using-yolov3-and-opencv-19ee0792a420

[22] Q.-C. Mao, H.-M. Sun, Y.-B. Liu and R.-S. Jia, "Mini-YOLOv3: Real-Time Object Detector for Embedded Applications," in IEEE Access, vol. 7, pp. 133529-133538, 2019, doi: 10.1109/ACCESS.2019.2941547.

[23] S. Campbell, Basics of UART Communication, Circuit Basics, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.circuitbasics.com/basics-uart-communication/

[24] Centers for Disease Control and Prevention, Guidance for Unvaccinated People, June 11, 2021. Accessed on: June 23, 2021. [Online]. Available: https://www.cdc.gov/coronavirus/2019-ncov/prevent-getting-sick/social-distancing.html

[25] JetBrains, PyCharm Features, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.jetbrains.com/pycharm/features/

[26] JetBrains, Get started, June 11, 2021. Accessed on: June 23, 2021. [Online]. Available: https://www.jetbrains.com/help/pycharm/quick-start-guide.html#create

[27] S. N. Sinha, Pan-Tilt-Zoom (PTZ) Camera, Springer Link, Feb. 5, 2016. Accessed on: June 1, 2021. [Online]. Available: https://link.springer.com/referenceworkentry/10.1007%2F978-0-387-31439-6_496

[28] T. Bella, Places without social distancing have 35 times more potential coronavirus spread, study finds, The Washington Post, May 15, 2020. Accessed on: June 1, 2021. [Online]. Available: https://www.washingtonpost.com/nation/2020/05/15/social-distancing-study-coronavirus-spread/

[29] AdrieSenstosa, Raspberry Pi - Arduino Serial Communication, n.d. Accessed on: June 1, 2021. [Online]. Available: https://www.instructables.com/Raspberry-Pi-Arduino-Serial-Communication/
