Vision-Based Approach For Human Motion Detection and Smart Appliance Control
Siddharth Swami1, Rajesh Singh2, Anita Gehlot2, Mohammed Ismail Iqbal3, Sameer Dev Sharma4, Dharmendra Kumar4, Sanjeev Kumar Shah5
1School of Environment and Natural Resources, Doon University, Dehradun, Uttarakhand, India
2Division of Research and Innovation, Uttaranchal Institute of Technology, Uttaranchal University, Dehradun, Uttarakhand, India
3College of Engineering and Technology, University of Technology and Sciences, Nizwa, Oman
4Uttaranchal School of Computing Sciences, Uttaranchal University, Dehradun, Uttarakhand, India
5Uttaranchal Institute of Technology, Uttaranchal University, Dehradun, Uttarakhand, India
Corresponding Author:
Siddharth Swami
School of Environment and Natural Resources, Doon University
Dehradun, Uttarakhand, India
Email: [email protected]
1. INTRODUCTION
The incorporation of intelligent technologies into our daily lives has grown more common in a time
of rapid technological breakthroughs and a growing emphasis on automation and convenience [1]. Appliance
control systems can adjust to real-time occupancy status through feedback, whereas traditional motion
sensors, if not placed properly, are prone to false alarms caused by pets or inanimate objects [2]. This study
introduces SmartPose, a controller for smart home appliances that relies on visual posture recognition. The
OpenPose library is used to extract key points from human joints, which are then classified by a
back-propagation neural network [3].
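To make the pipeline concrete, the following minimal sketch shows how OpenPose-style joint keypoints could be fed to a small feedforward network for posture classification. It is illustrative only: the layer sizes, posture labels, and random weights are placeholders, not the trained model of the cited work [1], [3].

import numpy as np

# Illustrative only: a tiny feedforward network mapping OpenPose-style
# body keypoints to a posture label. Sizes, labels, and weights are
# hypothetical placeholders, not the cited back-propagation network.
POSTURES = ["standing", "sitting", "raising_hand"]

def classify_posture(keypoints, w1, b1, w2, b2):
    """keypoints: array of shape (18, 2), 18 OpenPose joints as (x, y)."""
    x = keypoints.reshape(-1)          # flatten to a 36-d input vector
    h = np.tanh(x @ w1 + b1)           # hidden layer
    logits = h @ w2 + b2               # one score per posture class
    return POSTURES[int(np.argmax(logits))]

# Random weights stand in for a trained back-propagation network.
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(36, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 3)), np.zeros(3)
print(classify_posture(rng.random((18, 2)), w1, b1, w2, b2))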
In the study, a novel deep learning-based method for creating Internet of Things (IoT)-based
intelligent home security and appliance management systems in smart cities is presented. It seeks to deliver
the best results even with incomplete information [4]. The work describes a framework that detects gestures
precisely and can be used to control devices through hand movements; the usefulness of this technology in
daily life is widely recognized [5]. A new method has also been developed to locate an immobile subject
inside a house while suppressing both stationary and moving clutter. It combines millimeter-wave
frequency-modulated continuous wave (FMCW) radar with an exponential moving average (EMA) algorithm
that has high-pass filter properties [6], [7]. This is of great benefit to human life, as it helps in the protection
of people and their assets [7].
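The EMA-based clutter suppression can be pictured as subtracting a slowly adapting background from each radar frame, so that only the fast-changing components caused by motion remain. The following minimal numpy sketch illustrates the idea; the smoothing factor and frame format are assumptions for demonstration, not values from [6], [7].

import numpy as np

def ema_highpass(frames, alpha=0.05):
    """Suppress static clutter: subtract an exponential moving average
    (the slowly varying background) from each radar frame, leaving the
    fast-changing components caused by motion. alpha is an assumed value."""
    background = frames[0].astype(float)
    filtered = []
    for frame in frames:
        background = alpha * frame + (1 - alpha) * background  # EMA update
        filtered.append(frame - background)                    # high-pass residual
    return np.array(filtered)

# Example: 100 frames of a 64-bin range profile with static clutter;
# a target appearing at bin 20 survives the filtering.
frames = np.tile(np.hanning(64) * 10, (100, 1))
frames[50:, 20] += 5.0
print(np.abs(ema_highpass(frames)[-1]).argmax())  # -> 20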
This study highlights the significant strides made toward smart home automation in improving
security, comfort, convenience, and safety. Technological advancements and the IoT have enhanced remote
monitoring and securing of homes, as well as better regulation of appliances [8]. Numerous home automation
systems have been developed to detect and report any changes occurring within a house [9]. In related
research, a visual servo controller steers a differential-drive mobile robot toward a stationary target, using
either a triangle-trigonometry kinematic model or a weighted graph. The study also details a robot vision
system that processes raw images captured by a camera sensor using optical flow; the processed frames are
then used to classify individuals and their behaviors.
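As an illustration of the optical-flow step, the sketch below estimates dense motion between consecutive camera frames with OpenCV's Farneback method and flags frames worth passing on to a person/behavior classifier. The camera index and motion threshold are assumptions for demonstration, not values from the cited system.

import cv2
import numpy as np

# Minimal sketch of an optical-flow front end: estimate dense motion
# between consecutive frames and flag frames with significant movement.
cap = cv2.VideoCapture(0)          # assumed camera index
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)   # per-pixel motion strength
    if magnitude.mean() > 0.5:                 # assumed threshold
        print("motion detected; pass frame to the classifier")
    prev_gray = gray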
2. LITERATURE REVIEW
A simple, inexpensive, and user-friendly method that makes use of a standard camera is well suited
to vision-based detection [10]. Although this method was meticulously developed with the goal of
accommodating the elderly and people with disabilities, it remains insensitive to variations in hand shape
[11]. By detecting human presence, the proposed intelligent electrical appliance management system seeks to
operate household equipment efficiently. Using this technology, appliances can be turned off when no one is
around, saving otherwise wasted energy [12]. A novel approach has also been proposed for tracking human
motion without requiring additional devices.
Wi-Fi RSSI and CSI data collected from widely available IoT devices are utilized in this method.
First, a 4D feature vector is extracted from the temporal data spread and used to train a two-stage ensemble
machine learning model [13]. To solve the problem of hand gesture-based home appliance control, a basic
convolutional neural network (CNN) method has been presented [14]. This approach uses hand recognition
with DetNet to calculate 3D hand skeleton positions and, impressively, attains 99.4% accuracy on the testing
dataset. The findings demonstrate how this technology can be used to monitor different areas of a home
separately [15]. A smart home can transmit data to a home appliance control module via a server; for
instance, data can be transferred transparently to the control module to enable remote control and monitoring
of living areas over the same wireless home network [16].
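The general shape of the Wi-Fi RSSI pipeline can be sketched as follows: a 4D statistical feature vector is computed over a window of RSSI samples and fed to a two-stage classifier. The specific features, models, and synthetic data below are assumptions for illustration, not the design of [13].

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rssi_features(window):
    """Assumed 4-D feature vector summarizing one window of RSSI samples;
    the cited work's exact features may differ."""
    return [np.mean(window), np.std(window), np.min(window), np.max(window)]

# Stage 1 decides presence vs. absence; stage 2 labels the motion type.
stage1 = RandomForestClassifier(n_estimators=50)
stage2 = RandomForestClassifier(n_estimators=50)

rng = np.random.default_rng(1)
# Synthetic windows: low RSSI variance when empty, higher when occupied.
X = np.array([rssi_features(rng.normal(-60, s, 100)) for s in (1, 5) * 50])
y_presence = np.array([0, 1] * 50)
stage1.fit(X, y_presence)

occupied = y_presence == 1
y_motion = rng.integers(0, 2, occupied.sum())  # placeholder motion labels
stage2.fit(X[occupied], y_motion)

w = rssi_features(rng.normal(-60, 5, 100))
if stage1.predict([w])[0] == 1:
    print("occupied, motion class:", stage2.predict([w])[0])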
In this work, a millimeter-wave radar sensor is used to identify human movements such as walking,
running, crawling, and standing [17]. The method relies on two stages of signal processing: the statistical
nature of the radar signal is first used to detect movement shifts, and the identified changes are passed on for
classification [18]. A deep learning-based categorization system then decides what the person is doing. Both
processing steps rely on the temporal changes in distance provided by radar spectrograms [19].
Important devices used in this research are a passive infrared (PIR) motion detection sensor, an
Arduino Mega2560 microcontroller board, an IoT prototype NodeMCU V3 ESP8266 Wi-Fi development
board, and a SIM800L GSM module with GPRS capabilities [20]. A method is also provided to characterize
the movement of a human being within an enclosed space. The device provides occupancy tracking across a
building while protecting people's privacy: localized processing is applied to ascertain a person's direction
of travel from the readings of an infrared array sensor. The system processes data from a cheap infrared
array sensor using a state-of-the-art real-time pattern recognition algorithm [21].
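A minimal sketch of the direction-inference idea follows, assuming an 8x8 thermopile-style array (an assumption; the exact sensor is not specified above): the centroid of the warm region is tracked between frames, and its shift across the doorway gives the direction of travel, with all processing kept local to the node.

import numpy as np

def blob_centroid(frame, threshold=26.0):
    """Centroid (row, col) of pixels warmer than an assumed threshold
    (deg C), or None if nothing warm is in view."""
    warm = frame > threshold
    if not warm.any():
        return None
    rows, cols = np.nonzero(warm)
    return rows.mean(), cols.mean()

def direction(prev_frame, curr_frame):
    """Infer movement across the doorway from the shift of the warm-blob
    centroid between two 8x8 thermal frames. Processing stays local to
    the sensor node, so no images of occupants ever leave it."""
    a, b = blob_centroid(prev_frame), blob_centroid(curr_frame)
    if a is None or b is None:
        return "no person"
    shift = b[1] - a[1]
    return "entering" if shift > 0 else "leaving" if shift < 0 else "stationary"

# Two synthetic frames: a warm blob moves one column to the right.
f1, f2 = np.full((8, 8), 22.0), np.full((8, 8), 22.0)
f1[3:5, 2:4] = 30.0
f2[3:5, 3:5] = 30.0
print(direction(f1, f2))  # -> entering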
− The ESP32 drives the automation of smart devices through relay modules. The relays, which act as the
hands of this project, control when home appliances are switched on or off. Once the ESP32 detects human
presence and makes a decision about it, the relay modules carry out the switching, optimizing energy use
and making the user's life more convenient.
The block diagram represents the key parts of the project and its related components and subsystems. In
summary, it highlights how these components work together to form an intelligent and responsive home
automation system. The diagram can be expanded to incorporate further detail or subsystems, depending on
the complexity of the task at hand.
4. HARDWARE DEVELOPMENT
The hardware of the human motion smart home automation system is shown in Figure 2.
− The ESP32 board is the brain of the complete system; the external power supply is attached to its 5 V and
ground pins.
− The PIR motion sensor is attached to the ESP32 and, on any motion detection, sends the information to
the ESP32 board.
− The ground pin of the motion sensor is connected to the common ground and its voltage pin to the
external power supply. The data pin of the motion sensor is attached to pin 12 of the ESP32 board.
− A buzzer is linked to the ESP32 board for alerting purposes. Its power pin is linked to the ESP32 board's
pin 7, and its ground pin is linked to the common ground.
− The project uses an ESP01 module, which sends all information and alerts to the cloud. The ESP01's
voltage pin and ground pin are therefore linked to the ESP32 board's 3.3 V and common ground,
respectively.
− The ESP01 module's transmitter (TX) pin is linked to pin 19 of the ESP32 board, and its receiver (RX)
pin to pin 21 (this pin mapping is sketched in code after the list).
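A minimal firmware sketch of this pin mapping is given below. It assumes MicroPython on the ESP32, which the paper does not specify, along with an assumed UART baud rate; the GPIO numbers follow the list above.

from machine import Pin, UART
import time

# Minimal MicroPython sketch of the wiring described above; firmware
# choice and baud rate are assumptions, pin numbers follow the list.
pir = Pin(12, Pin.IN)                  # PIR data pin on GPIO 12
buzzer = Pin(7, Pin.OUT)               # buzzer driven from pin 7
esp01 = UART(1, baudrate=115200, tx=21, rx=19)  # ESP01 TX->GPIO19, RX->GPIO21

while True:
    if pir.value():                    # motion detected by the PIR sensor
        buzzer.on()                    # audible alert
        esp01.write(b"MOTION\n")       # forward the event to the cloud link
    else:
        buzzer.off()
    time.sleep_ms(100)                 # pace the polling loop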
The motion sensor, which acts as the system's eyes, feeds data to the project. Any movement within
the motion sensor's defined field of view triggers it. After motion is detected, the project records the event
and its timestamp, signaling a potential human presence, and the system moves on to the human presence
analysis stage. Here, the system examines the information recorded by the camera module in more
depth [23].
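This hand-off can be sketched as follows; the log format and function names are assumptions for illustration.

import time

# Sketch of the event-logging step: when the PIR stage reports motion,
# record the event with a timestamp and hand it to the camera-based
# verification stage. The log format is an assumption.
event_log = []

def on_motion_detected():
    event = {"event": "motion", "time": time.time()}
    event_log.append(event)
    print("possible human presence at", time.ctime(event["time"]))
    return event  # passed on to the human-presence analysis stage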
To verify the presence of a person, powerful computer vision algorithms are used to examine the
image. If human presence is positively identified, the system logs the event and timestamp and turns on the
relevant relay module in charge of the connected smart appliances. Appliance control is managed by a timer,
enabling automated deactivation after a given period of time or when no further motion is detected. Updated
system state flags show the current occupancy [24]. The system keeps track of the occupancy status while
maintaining constant motion and human presence monitoring. Figure 4 shows real-time face detection
compared with motion-sensor detection for the house automation. If no motion is detected for a
predetermined amount of time and human presence is no longer verified, the system reacts: to save energy, it
turns off any relay modules that were previously turned on [25].
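The paper does not name its verification algorithm, so the sketch below stands in OpenCV's stock HOG person detector for it, together with a state flag and a shutoff timer for the relay logic. The hold time is an assumed value.

import cv2
import time

# Hedged sketch of the verification stage: OpenCV's default HOG person
# detector stands in for the unnamed algorithm; relay control is reduced
# to a state flag plus a shutoff timer.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

HOLD_SECONDS = 120          # assumed timer before auto-deactivation
relay_on = False
last_seen = 0.0

def verify_and_control(frame):
    global relay_on, last_seen
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    if len(boxes) > 0:                       # human presence confirmed
        last_seen = time.time()
        if not relay_on:
            relay_on = True                  # energize the appliance relay
            print("presence verified: relay ON")
    elif relay_on and time.time() - last_seen > HOLD_SECONDS:
        relay_on = False                     # timer expired with no presence
        print("no presence for %ds: relay OFF" % HOLD_SECONDS)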
To reflect the absence of occupancy, the system state flags are updated once more. An alternate
user interface, such as a mobile app or web interface, can be implemented for improved user interaction
and convenience; this interface enables system monitoring and remote control [26]. Features such as
feedback, notifications about detected human presence, and appliance management are all part of the
project's effort to improve the user experience. The project also includes error-handling protocols to deal
with problems such as malfunctioning hardware, connectivity issues, and unexpected system
behavior [27].
To improve the system's dependability, protocols are put in place that include alerts for system
failures and recovery mechanisms. When necessary, a termination procedure is invoked to guarantee an
orderly system shutdown [28]. This procedure ends the system's activities and safely deactivates every
part, leaving the system secure. The system then repeats the monitoring and control phases indefinitely.
This continuous procedure guarantees that the system keeps functioning over time, producing a
resource-efficient and responsive home automation system [29].
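The overall loop can be summarized in a short sketch; the helper callables are hypothetical placeholders for the stages described above, and the graceful-termination path mirrors the shutdown procedure.

import time

# Sketch of the top-level loop: monitor, control, and repeat, with a
# graceful-termination path that releases every actuator. The callables
# are placeholders for the stages sketched earlier.
def run(read_motion, verify_presence, set_relay, should_stop):
    try:
        while not should_stop():
            if read_motion():
                if verify_presence():
                    set_relay(True)         # occupied: keep appliances on
            else:
                set_relay(False)            # idle: save energy
            time.sleep(0.1)                 # pace the monitoring cycle
    finally:
        set_relay(False)                    # termination: deactivate every part
        print("system shut down safely")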
6. CONCLUSION
The study demonstrated the integration of computer vision technology with motion sensors and
smart appliances to improve contemporary living spaces. By enhancing the accuracy of human presence
recognition using a camera module and advanced image analysis techniques, the project enables a
context-aware home automation system that intelligently responds to occupants and manages smart
appliances through relay modules. The system is user-friendly, energy-efficient, and versatile, reducing
energy waste by using real-time occupancy status to turn electric appliances on or off. This aligns with the
sustainability and ease-of-living goals of modern habitation design. The research emphasizes the use of
vision-based techniques in home automation, particularly alongside the Internet of Things and networked
smart devices. The project represents a significant step towards smart home automation, enhancing
residential living and contributing to discussions about IoT, computer vision, and environmentally friendly
technology. The goal is to create a world where homes intuitively recognize and accommodate their owners'
preferences, creating a more secure, comfortable, and energy-efficient living space.
REFERENCES
[1] K. Zhang and Y. Zhang, “SmartPose: an intelligent household appliances controller based on visual recognition of human
postures,” in 2020 International Conference on Artificial Intelligence and Computer Engineering (ICAICE), Oct. 2020, pp. 218–
223, doi: 10.1109/ICAICE51518.2020.00048.
[2] S. Rastogi and J. Singh, “Human fall detection and activity monitoring: a comparative analysis of vision-based methods for
classification and detection techniques,” Soft Computing, vol. 26, no. 8, pp. 3679–3701, 2022, doi: 10.1007/s00500-021-
06717-x.
[3] S. Khan, S. Nazir, and H. Ullah Khan, “Smart object detection and home appliances control system in smart cities,” Computers,
Materials & Continua, vol. 67, no. 1, pp. 895–915, 2021, doi: 10.32604/cmc.2021.013878.
[4] Y. Muranaka, M. Al-Sada, and T. Nakajima, “A home appliance control system with hand gesture based on pose estimation,” in
2020 IEEE 9th Global Conference on Consumer Electronics (GCCE), Oct. 2020, pp. 752–755, doi:
10.1109/GCCE50665.2020.9291877.
[5] P. W. Tien, S. Wei, J. K. Calautit, J. Darkwa, and C. Wood, “A vision-based deep learning approach for the detection and
prediction of occupancy heat emissions for demand-driven control solutions,” Energy and Buildings, vol. 226, 2020, doi:
10.1016/j.enbuild.2020.110386.
[6] L. M. Dang, K. Min, H. Wang, M. Jalil Piran, C. Hee Lee, and H. Moon, “Sensor-based and vision-based human activity
recognition: a comprehensive survey,” Pattern Recognition, vol. 108, 2020, doi: 10.1016/j.patcog.2020.107561.
[7] P. Nallabolu, L. Zhang, H. Hong, and C. Li, “Human presence sensing and gesture recognition for smart home applications with
moving and stationary clutter suppression using a 60-GHz digital beamforming FMCW radar,” IEEE Access, vol. 9, pp. 72857–
72866, 2021, doi: 10.1109/ACCESS.2021.3080655.
[8] A. Sobhani, F. Khorshidi, and M. Fakhredanesh, “DeePLS: personalize lighting in smart home by human detection, recognition,
and tracking,” SN Computer Science, vol. 4, no. 6, 2023, doi: 10.1007/s42979-023-02240-y.
[9] J. R. B. Bodollo, J. Daniel V. Cortez, E. R. P. Maraya, E. V. Navarro, R. Q. L. Saquing, and R. E. Tolentino, “Selection of
appliance using skeletal tracking and 3D face tracking for gesture control home automation,” in 2019 1st International
Conference on Advanced Technologies in Intelligent Control, Environment, Computing & Communication Engineering
(ICATIECE), Mar. 2019, pp. 1–7, doi: 10.1109/ICATIECE45860.2019.9063625.
[10] O. Taiwo, A. E. Ezugwu, O. N. Oyelade, and M. S. Almutairi, “Enhanced intelligent smart home control and security system
based on deep learning model,” Wireless Communications and Mobile Computing, vol. 2022, pp. 1–22, Jan. 2022, doi:
10.1155/2022/9307961.
[11] M. R. Islam, M. R. Haque, M. H. Imtiaz, X. Shen, and E. Sazonov, “Vision-based recognition of human motion intent during
staircase approaching,” Sensors, vol. 23, no. 11, 2023, doi: 10.3390/s23115355.
[12] E. Dönmez, A. F. Kocamaz, and M. Dirik, “A vision-based real-time mobile robot controller design based on Gaussian function
for indoor environment,” Arabian Journal for Science and Engineering, vol. 43, no. 12, pp. 7127–7142, Dec. 2018, doi:
10.1007/s13369-017-2917-0.
[13] S. Hoshino and K. Niimura, “Robot vision system for human detection and action recognition,” Journal of Advanced
Computational Intelligence and Intelligent Informatics, vol. 24, no. 3, pp. 346–356, May 2020, doi: 10.20965/jaciii.2020.p0346.
[14] J. Wang, L. Jiang, H. Yu, Z. Feng, R. Castaño-Rosa, and S. Jie Cao, “Computer vision to advance the sensing and control of built
environment towards occupant-centric sustainable development: a critical review,” Renewable and Sustainable Energy Reviews,
vol. 192, 2024, doi: 10.1016/j.rser.2023.114165.
[15] R. Golash and Y. K. Jain, “Vision-based user-friendly and contactless security for home appliance via hand gestures,”
Computationally Intelligent Systems and their Applications, pp. 25–37, 2021, doi: 10.1007/978-981-16-0407-2_3.
[16] F. K. Chuah and S. S. Teoh, “Thermal sensor based human presence detection for smart home application,” in 2020 10th IEEE
International Conference on Control System, Computing and Engineering (ICCSCE), Aug. 2020, pp. 37–41, doi:
10.1109/ICCSCE50387.2020.9204940.
[17] J. Qi, L. Ma, Z. Cui, and Y. Yu, “Computer vision-based hand gesture recognition for human-robot interaction: a review,”
Complex and Intelligent Systems, vol. 10, no. 1, pp. 1581–1606, 2024, doi: 10.1007/s40747-023-01173-6.
[18] A. Natarajan, V. Krishnasamy, and M. Singh, “A machine learning approach to passive human motion detection using WiFi
measurements from commodity IoT devices,” IEEE Transactions on Instrumentation and Measurement, vol. 72, pp. 1–10, 2023,
doi: 10.1109/TIM.2023.3272374.
[19] T. Alanazi, K. Babutain, and G. Muhammad, “A robust and automated vision-based human fall detection system using 3D
multi-stream CNNs with an image fusion technique,” Applied Sciences (Switzerland), vol. 13, no. 12, 2023, doi:
10.3390/app13126916.
[20] T.-H. Tsai, Y.-J. Luo, and W.-C. Wan, “A skeleton-based dynamic hand gesture recognition for home appliance control system,”
in 2022 IEEE International Symposium on Circuits and Systems (ISCAS), May 2022, pp. 3265–3268, doi:
10.1109/ISCAS48785.2022.9937780.
[21] R. Zheng, “Indoor smart design algorithm based on smart home sensor,” Journal of Sensors, vol. 2022, pp. 1–10, Apr. 2022, doi:
10.1155/2022/2251046.
[22] B. I. Alabdullah et al., “Smart home automation-based hand gesture recognition using feature fusion and recurrent neural
network,” Sensors, vol. 23, no. 17, 2023, doi: 10.3390/s23177523.
[23] S. Kang, M. Jang, and S. Lee, “Identification of human motion using radar sensor in an indoor environment,” Sensors, vol. 21, no.
7, Mar. 2021, doi: 10.3390/s21072305.
[24] Munawir, A. Ihsan, and E. Mutia, “Wi-Fi and GSM based motion detection in smart home security system,” IOP Conference
Series: Materials Science and Engineering, vol. 536, no. 1, Jun. 2019, doi: 10.1088/1757-899X/536/1/012143.
[25] D. Nagpal and S. Gupta, “Evolution from handcrafted to learned representation methods for vision-based activity
recognition,” International Conference on Soft Computing for Security Applications, pp. 765–775, 2023, doi: 10.1007/978-
981-99-3608-3_53.
[26] U. Masud, N. Abdualaziz Almolhis, A. Alhazmi, J. Ramakrishnan, F. Ul Islam, and A. Razzaq Farooqi, “Smart wheelchair
controlled through a vision-based autonomous system,” IEEE Access, vol. 12, pp. 65099–65116, 2024, doi:
10.1109/ACCESS.2024.3395656.
[27] D. Wu et al., “Computer vision-based intelligent elevator information system for efficient demand-based operation and
optimization,” Journal of Building Engineering, vol. 81, 2024, doi: 10.1016/j.jobe.2023.108126.
[28] F. X. Gaya-Morey, C. Manresa-Yee, and J. M. Buades-Rubio, “Deep learning for computer vision based activity recognition and
fall detection of the elderly: a systematic review,” Applied Intelligence, 2024, doi: 10.1007/s10489-024-05645-1.
[29] C. Perra, A. Kumar, M. Losito, P. Pirino, M. Moradpour, and G. Gatto, “Monitoring indoor people presence in buildings using
low-cost infrared sensor array in doorways,” Sensors, vol. 21, no. 12, 2021, doi: 10.3390/s21124062.