Home Surveillance & Automation Using Device Positioning
Rohith M
Sunny G
Tenson Tomy
Vishal TP
Objective
Implement a reliable, smart, and efficient home automation system based on a device-positioning mechanism.
Hardware Tools
4. Relay Module
5. Arduino Pro Mini
6. MPU 6050 (6-DOF sensor)
7. HMC5883L 3-Axis Compass
8. RFID card reader
Software Tools
1. Python 3.6.5
2. Arduino IDE
3. Processing IDE
Methodology
The system consists of two major units:
i. Wearable Unit
ii. Central Controlling Unit
i. Wearable Unit:
• It is a device worn by the user.
• The Wearable Unit is a smart device that automatically detects the device (home appliance: light, fan, TV, etc.) the user wants to control and places full control of that device at the user's fingertips.
ii. Central Controlling Unit:
The Central Controlling Unit consists of a microcontroller and relays, which are used to control the different devices in the house based on signals from the wearable unit.
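As an illustration of this signal flow, the sketch below shows the kind of relay-switching logic the central unit performs. It is written in Python purely for readability (the actual unit is programmed through the Arduino IDE), and the device names, relay channels, serial port, and message format are assumptions, not the project's firmware.

```python
# Illustrative sketch only: device names, relay channels, the serial port
# and the "<DEVICE>:<ON|OFF>" message format are assumptions.
import serial  # pyserial, standing in for the wireless link

RELAY_CHANNEL = {"LED_BULB": 1, "FAN": 2, "TV": 3, "AC": 4}  # hypothetical mapping

def switch_relay(channel: int, state: bool) -> None:
    """Placeholder for driving one relay channel."""
    print(f"Relay {channel} -> {'ON' if state else 'OFF'}")

def handle_command(message: str) -> None:
    """Parse a '<DEVICE>:<ON|OFF>' command from the wearable unit."""
    device, _, action = message.strip().partition(":")
    if device in RELAY_CHANNEL:
        switch_relay(RELAY_CHANNEL[device], action == "ON")

if __name__ == "__main__":
    link = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # assumed port
    while True:
        line = link.readline().decode(errors="ignore")
        if line:
            handle_command(line)
```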
Determining Controlling Room
• In order to identify the device the user wants to control, we first need to find the room in which the user is located.
• The user's location is determined using RFID tags.
• The wearable unit contains an RFID card reader, and RFID tags with unique IDs are placed at the entrance to each room.
• Once the user enters a room, the doorway tag is detected automatically, and based on its unique ID we can determine which room the user has entered.
• The same tag is detected again when the user leaves, so we can also identify that the user has left that room (a minimal tracking sketch follows this list).
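A minimal sketch of this room-tracking logic is shown below; the tag UIDs and room names are placeholders, and the enter/leave toggle simply mirrors the behaviour described above.

```python
# Minimal room-tracking sketch. Tag UIDs and room names are placeholders;
# the real wearable reads the tags with its on-board RFID card reader.
TAG_TO_ROOM = {
    "04A1B2C3": "Living Room",  # hypothetical tag placed at the living-room door
    "04D4E5F6": "Bedroom",
}

current_room = None

def on_tag_read(uid: str) -> None:
    """Update the user's current room whenever a doorway tag is detected."""
    global current_room
    room = TAG_TO_ROOM.get(uid)
    if room is None:
        return
    # Seeing the same doorway tag again means the user is leaving that room.
    current_room = None if current_room == room else room
    print("Current room:", current_room)

on_tag_read("04A1B2C3")  # user enters the living room
on_tag_read("04A1B2C3")  # user leaves the living room
```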
Detecting the device the user wants to control
• The major function of the wearable unit is to determine the device the user is looking at.
• For this, we first create a database for each room containing the location (mainly the orientation) of each device in that room; a minimal lookup sketch follows this list.
• This involves two phases:
• Training Phase
• Testing Phase
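A minimal sketch of such a per-room orientation database and the lookup performed at run time is shown below; the heading ranges (degrees from the HMC5883L compass) and device names are illustrative values, not the project's measured data.

```python
# Per-room orientation database: heading ranges and devices are illustrative.
ROOM_DATABASE = {
    "Living Room": [
        ((350, 20), "TV"),        # facing roughly north
        ((80, 110), "Table Fan"),
        ((170, 200), "LED Bulb"),
    ],
}

def heading_in_range(heading: float, bounds: tuple) -> bool:
    """True if the compass heading falls inside the stored range."""
    lo, hi = bounds
    if lo <= hi:
        return lo <= heading <= hi
    return heading >= lo or heading <= hi  # range wraps around 0 degrees

def device_in_view(room: str, heading: float):
    """Return the device the user is facing in this room, if any."""
    for bounds, device in ROOM_DATABASE.get(room, []):
        if heading_in_range(heading, bounds):
            return device
    return None

print(device_in_view("Living Room", 10.0))  # -> "TV"
```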
Creating Database
• The database consists of appliances such as a TV, a table fan, an LED bulb, and a tube light.
• LabelImg saves a .xml file containing the label data for each image.
• These .xml files will be used to generate TFRecords, which are one of the
inputs to the TensorFlow trainer.
• Once you have labelled and saved each image, there will be one .xml file
for each image in the \test and \train directories.
LabelImg
• LabelImg is a graphical image annotation tool.
Training Phase
• Training was done using Google Colab.
• First, the image .xml data is used to create .csv files containing all the data for the train and test images, as sketched below.
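A minimal sketch of this .xml-to-.csv step is given below. The directory layout and column names follow the widely used TensorFlow object-detection tutorial convention and are assumptions about this project's exact scripts.

```python
# Collect every LabelImg (Pascal VOC) annotation in a directory into one CSV.
import csv
import glob
import xml.etree.ElementTree as ET

def xml_to_csv(image_dir: str, out_csv: str) -> None:
    rows = []
    for xml_file in glob.glob(f"{image_dir}/*.xml"):
        root = ET.parse(xml_file).getroot()
        filename = root.find("filename").text
        width = int(root.find("size/width").text)
        height = int(root.find("size/height").text)
        for obj in root.findall("object"):
            box = obj.find("bndbox")
            rows.append([
                filename, width, height, obj.find("name").text,
                int(box.find("xmin").text), int(box.find("ymin").text),
                int(box.find("xmax").text), int(box.find("ymax").text),
            ])
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["filename", "width", "height", "class",
                         "xmin", "ymin", "xmax", "ymax"])
        writer.writerows(rows)

xml_to_csv("images/train", "train_labels.csv")  # assumed paths
xml_to_csv("images/test", "test_labels.csv")
```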
SSD - Implementation
For each default box, we predict both the shape offsets and the confidences for all object categories (c1, c2, …, cp). At training time, we first match these default boxes to the ground-truth boxes. For example, two default boxes are matched with the cat and one with the dog; these are treated as positives and the rest as negatives. The model loss is a weighted sum of the localization loss and the confidence loss.
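For reference, the overall objective in the SSD paper (Liu et al., 2016) is the weighted sum below, where N is the number of matched default boxes (the loss is set to 0 when N = 0):

```latex
L(x, c, l, g) = \frac{1}{N}\left( L_{\mathrm{conf}}(x, c) + \alpha\, L_{\mathrm{loc}}(x, l, g) \right)
```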
Matching strategy
• In Fig. 1, the dog is matched to a default box in the 4×4 feature map, but not to any default boxes in the 8×8 feature map. This is because those boxes have different scales and do not match the dog box, and are therefore considered negatives during training.
• After the matching step, most of the default boxes are negatives, especially when the number of possible default boxes is large. This introduces a significant imbalance between the positive and negative training examples.
• Instead of using all the negative examples, we sort them by the highest confidence loss for each default box and pick the top ones, so that the ratio between negatives and positives is at most 3:1 (a combined sketch of matching and hard negative mining follows this list).
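The sketch below combines the two steps described in this list: matching default boxes to ground-truth boxes by IoU (with the 0.5 threshold used in the SSD paper) and hard negative mining at a 3:1 negatives-to-positives ratio. Box formats and function names are illustrative, not the project's training code.

```python
# Sketch of SSD matching and hard negative mining (illustrative only).
# Boxes are (xmin, ymin, xmax, ymax).

def iou(box_a, box_b):
    """Jaccard overlap between two boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / float(area_a + area_b - inter)

def match_default_boxes(default_boxes, ground_truth, threshold=0.5):
    """Return the matched ground-truth index per default box (-1 = negative)."""
    matches = []
    for d in default_boxes:
        overlaps = [iou(d, g) for g in ground_truth]
        best = max(range(len(ground_truth)), key=lambda i: overlaps[i])
        matches.append(best if overlaps[best] >= threshold else -1)
    return matches

def hard_negative_mining(conf_losses, matches, neg_pos_ratio=3):
    """Keep all positives and only the highest-loss negatives (at most 3:1)."""
    positives = [i for i, m in enumerate(matches) if m >= 0]
    negatives = sorted((i for i, m in enumerate(matches) if m < 0),
                       key=lambda i: conf_losses[i], reverse=True)
    return positives, negatives[: neg_pos_ratio * max(len(positives), 1)]
```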
Novelty
• A highly efficient, real-time system that makes smart decisions to control the devices.
• Provides control of all the devices at the user's fingertips. The remote controller can switch a particular device ON/OFF, adjust fan speed, adjust television volume, etc.
[Pie chart: percentage of work completed across Face Recognition, Device Recognition, Wireless Transmission, and Controlling the Devices]
Time Chart
Percentage of Work | Date of Completion | Description
20% | 15 Nov 2018 | Face Recognition
10% | 20 Nov 2018 | Object detection using image processing (interfacing)
10% | 11 Feb 2019 | Object detection using device positioning
10% | 11 Feb 2019 | Wireless Transmission
Controlling the devices: The remote and the central unit are connected, and the devices (LED bulb, fan, TV, AC) can be controlled using the remote.