
International Journal of Engineering Applied Sciences and Technology, 2022
Vol. 7, Issue 6, ISSN No. 2455-2143, Pages 201-209
Published Online October 2022 in IJEAST (http://www.ijeast.com)

AI AND ML BASED GESTURE CONTROLLED
VIRTUAL MOUSE ACTIONS SYSTEM - A
NOVEL APPROACH
Pruthvi Kumar P
Dept. of Computer Science and Engineering
Jawaharlal Nehru New College of Engineering
Shivamogga, Karnataka, India – 577204

Dr. Nirmala Shivanand
Professor, Dept. of Computer Science and Engineering
Jawaharlal Nehru New College of Engineering
Shivamogga, Karnataka, India - 577204

Abstract— In the current system, a wired/wireless mouse device is used. This needs a battery, a USB cable, and a dongle to connect the mouse to a PC. In this work, an AI and ML based gesture-controlled virtual mouse actions system is proposed and developed. This is a novel approach in which human-PC interaction is achieved using hand gestures. To detect hand gestures and to control the position of the virtual mouse with the fingertip rather than a physical mouse device, a MediaPipe model is used. The proposed system is implemented in Python, and it will reduce the likelihood of COVID-19 spreading by eliminating the dependency on shared physical devices to control the PC directly.

Keywords— AI, ML, Python, Virtual Mouse, Computer Vision, MediaPipe

I. INTRODUCTION
Numerous mouse options are available in the market today, such as the wireless mouse, wired mouse, optical mouse, laser mouse, gaming mouse, stylus mouse, presentation mouse, and vertical mouse. Regardless of how much physical mouse accuracy improves, the mouse will always have some limitations, since it is a hardware input device integrated with the computer. Like any other physical object, the mouse has a limited lifetime within which it is functional; after its lifetime, the physical mouse must be replaced. As technology evolves, everything becomes virtualized.

The AI [12] and ML based gesture-controlled virtual mouse actions system utilizes human hand gestures, using hand and fingertip detection, to perform mouse operations on the PC window screen with computer vision techniques. The main goal is to perform mouse operations in real time. These functions include cursor movement, left/right click, scroll up/down, drag/drop objects, and volume/brightness actions, performed using the system's built-in camera rather than traditional mouse devices; the fingertip of a hand gesture is monitored using the webcam.

A gesture-recognition virtual mouse is a detection technique that employs mathematical interpretation to detect, track, and recognize human hand gestures as given instructions. These may be in any form, such as a hand picture or pixel image. Hand recognition using a webcam poses only minor computational difficulties, whereas a wireless or Bluetooth mouse needs a dongle connected to the PC and a battery to power the mouse. In the proposed system, the webcam captures frames and then processes the captured video frames, recognizing various hand gestures before performing the specific mouse function.

The proposed approach works on an inexpensive central processing unit (CPU) without a graphics processing unit (GPU). The system webcam captures hand-tip and fingertip images at a rate of 30 fps (frames per second), and it provides long-distance simultaneous fingertip tracking to control the mouse cursor with very high precision. The model can perform exceptionally well in real-world and real-time applications using just a CPU and no GPU.

II. LITERATURE REVIEW
Many prior experiments on the virtual mouse used hand gesture detection by wearing a glove or colored tips on the fingers for gesture recognition. Further, these methods were not very accurate in mouse operations.

Paper [1]: Based on the hand's shape coordinates, the K-cosine algorithm is utilized to locate the fingertip position. The system monitors fingers in real time at 30 frames per second on a computer with a single CPU and
Kinect V2. This fingertip-gesture-based interface enables people to interact with computers by hand. The hand area of interest and the palm centre are initially retrieved by utilising in-depth skeleton-joint information images from a Microsoft Kinect Sensor. The shapes of the hand and fingers are then retrieved and characterized using a border-tracing technique. Finally, the fingertip position is mapped to RGB images to operate the mouse cursor on a virtual screen. However, the detection depth of the K-cosine algorithm is restricted and unsuitable for outdoor applications.

Paper [2]: The developed hand gesture tracking generates images from camera frames by detecting the hand's shape and forming a convex hull around it. This research employs two approaches for monitoring fingers: one uses coloured caps and the other employs hand motion recognition. The method has been subjected to extensive testing in real-world circumstances. To execute mouse activities, hand features are extracted using the area ratio of shapes as well as the area of features per square centimeter.

Paper [3]: Designed hand gestures are utilized to create an optical mouse and keyboard. The computer's camera interprets the images of various movements made by a person's hand. The mouse moves according to the movement of the gestures, even performing right and left clicks using the defect calculations of the convex hull algorithm, which generates the mouse and keyboard functions from the defects. It can detect and identify hand gestures in order to control mouse and keyboard operations and generate a real-world user interface. This project has the potential to be extremely useful in medical science, where touchless computing is necessary. However, if some external noise or fault is found in the working region of the camera, the convex hull method may create problems.

Paper [4]: Based on an object-tracking system, mouse cursor actions are controlled using hand gestures captured in frames from a webcam with an HSV color detection technique. This system enables the user to control the computer cursor with a hand bearing colour caps or tapes that the PC webcam tracks, as well as perform mouse operations such as left click, right click, and double click using various hand gestures. The author implemented this project using the Python programming language. The drawback is that, due to noise in the captured live webcam frames, it cannot work in low-light conditions.

Paper [5]: Designed a virtual mouse system based on HCI that employs computer vision and hand motions. The system uses a webcam or built-in camera to take frames and analyses them using color segmentation and detection techniques in order to make the frames trackable. After that, it recognizes various user gestures and executes the mouse operations. A user can primarily do left clicks, right clicks, and double clicks, as well as scroll up and down operations, with the hand in various gestures. The limitation is that, because noise is present in the gathered live webcam frames, it cannot function in low light.

Paper [6]: The OpenCV, wx, and numpy libraries were used to create the virtual mouse model. The model launches a webcam to collect frames of the user's fingertip colour caps. The mouse movement is dependent on the highlighted colour that the user specifies for mouse movement. This project may be beneficial for presentations as well as eliminating the weight of additional hardware devices. However, it does not operate in low-light conditions, since noise appears in the collected frames.

Paper [7]: The developed work focuses on improving human-computer interaction systems [11] using hand gestures in 3-D space by employing two USB cameras, orthogonal to each other, to acquire top and side views of different hand movements in position. The hand's pointing motion is estimated and mapped to the coordinate system of the screen. It also employed additional hand motions to complete the actions of virtual mouse movements and hand pointing at the screen, as well as other operations like folder/object selection. However, this solution uses two cameras to detect the mouse, which is very expensive; click operations do not work properly; and the system camera window has lagging issues.

Paper [8]: Designed a system that can recognise particular human gestures used to communicate information for device control, which is one of the main objectives of gesture recognition. By implementing real-time gesture recognition, a user can control the volume of a desktop by making a particular hand gesture in front of a video camera connected to a computer. A hand gesture volume control system is created with the aid of the OpenCV module. In this project there is no need for a keyboard or mouse; the system may be controlled via hand gestures. Using simple hand gestures and a desktop-based volume control system, a user may adjust the volume in real time; mouse cursor motion control is also suggested. However, it only has the ability to increase and decrease the volume and does not perform a direct volume mute operation.

Paper [9]: Hand gesture recognition is an intelligent, intuitive, and practical method of interaction between humans and robots (HRI). In this study, the authors gathered data for gesture recognition using a novel data glove named YoBu. The extreme learning machine (ELM) and SVM models gathered information on static gestures, created a gesture dataset, and examined which variables are crucial for classification. This gesture identification is based on 54-dimension data-glove features, with the ELM kernel performing better. Future robotic teleoperation based on gesture recognition may benefit from this study. Nonetheless, it has a drawback, since the user must wear gloves.

Paper [10]: Designed a real-time tracking based virtual mouse application and virtual painting using finger colour caps, based on gesture patterns and executed using a camera with the aid of artificial digital vision that integrates image


processing and gesture recognition. Because the whole system is wireless and based on hand motion tracking and gesture recognition of coloured objects, it performs air gestures, which are captured and drawn on the laptop screen and projected onto a wall for the benefit of the class attendees. This application is particularly for professors or lecturers who are colour blind, have poor vision, or have vertebral deformities, damaged legs, or spinal abnormalities. However, the user's need for coloured caps to operate the application is a constraint.

From the detailed literature survey, it is evident that hand and fingertip control in these works is not very accurate and does not provide hassle-free control throughout all mouse activities.

III. PROPOSED SYSTEM
The main aim of the proposed work is to perform mouse operations virtually using the human hand and fingertips. The system collects real-time datasets by capturing streaming frames of data from a live webcam with the help of the OpenCV Python library. The MediaPipe models are used to track, detect, and recognize the hand gestures in each individual frame that is captured.

The proposed system is divided into 4 modules: the hand tracking module, mouse module, volume module, and brightness module. The hand tracking module provides all of the essential libraries, and the hand detection class contains data about hand tracking, hand detection, hand location, hand landmarks, which fingertip is up, the distance between two fingertips, and the capture frame time period. To perform mouse operations, the Mouse class module provides four mouse action methods: cursor movement, left/right click, scroll up/down, and drag/drop objects. Following that, the Volume and Brightness class module methods contain system data to perform volume increase/decrease/mute and brightness increase/decrease control.

Fig. 1. Flowchart of Working Real-Time Virtual Mouse Model
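The capture-and-recognize loop of Figure 1 can be sketched as follows. This is a minimal illustration assuming the legacy MediaPipe "solutions" Hands API and the confidence thresholds quoted later in Section B; it is not the authors' exact code, and the helper name `to_pixel` is mine.

```python
def to_pixel(norm_x, norm_y, frame_w, frame_h):
    """Convert a normalized MediaPipe landmark coordinate (0..1) to pixels."""
    return int(norm_x * frame_w), int(norm_y * frame_h)

def run():
    # Imports are local so to_pixel stays importable without OpenCV/MediaPipe.
    import cv2
    import mediapipe as mp

    cap = cv2.VideoCapture(0)  # system webcam, typically ~30 fps
    hands = mp.solutions.hands.Hands(static_image_mode=False,
                                     max_num_hands=1,
                                     min_detection_confidence=0.85,
                                     min_tracking_confidence=0.15)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV delivers BGR frames; MediaPipe expects RGB
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            h, w, _ = frame.shape
            tip = results.multi_hand_landmarks[0].landmark[8]  # index fingertip
            x, y = to_pixel(tip.x, tip.y, w, h)
            # ...map (x, y) into the on-screen rectangle and move the cursor...
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    cap.release()

# run()  # uncomment to start the camera loop
```

Calling `run()` starts the loop; each frame yields the index-fingertip pixel position that the mouse module would consume.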


A. Capturing the System Webcam Video Frames and Processing the Hand Image
The proposed gesture-controlled virtual mouse system is built on the MediaPipe model and OpenCV libraries, which capture real-time data frames from the webcam and use them as input for further computation. Here, the system's primary camera captures user hand frames, as shown in Figure 1:

cap = cv2.VideoCapture(0)

Each frame is collected from the webcam until the application is terminated. The video frames captured by the webcam are saved in BGR color format; the BGR format must be converted to RGB so that the frames can be processed. Then, OpenCV reads frame by frame, and the MediaPipe model recognizes hand landmarks and positions to detect the hand tips and fingertip actions in the video frames.

B. Initializing the MediaPipe Model and Drawing a Rectangular Region for Mouse Movement through the Window
Initializing the MediaPipe model is responsible for monitoring hand gestures and other events. Next, the minimum confidence values for hand detection and hand tracking are set by

self.hands = self.mpHands.Hands(static_image_mode=False,
                                max_num_hands=1,
                                model_complexity=1,
                                min_detection_confidence=0.85,
                                min_tracking_confidence=0.15)

i.e., these thresholds are fixed to attain good accuracy for hand gesture detection. Then, axis lines are drawn across the fingertips and hand tips using MediaPipe. This keeps track of finger movement and establishes connections with neighbouring fingers.
The preconfigured MediaPipe model is used to capture the 21 hand landmarks and monitor them, as shown in Figure 2. If no landmarks are identified, the default value is zero, which indicates that no action is required. Otherwise, it returns the coordinate values of the X, Y, and Z axes, indicating that action is needed.
Following the capture of all hand landmarks from the webcam, a red rectangle box appears in the cv2 window. This converts fingertip coordinates from the webcam screen to the full PC window screen for controlling mouse operation. When the hands are detected, along with which finger is up for performing the specific mouse action, a rectangular region box is created in the cv2 window screen, from where the mouse cursor moves across the full PC window screen and performs the mouse operation, as shown in Figure 1.

Fig. 2. 21 Hand Landmarks

C. List of Virtual Mouse Operations Developed in the Proposed System Model
The proposed system, developed with OpenCV and the MediaPipe model, detects the hand tip and fingertips, draws a red rectangular box inside the cv2 window, and calculates the coordinates of the fingertips of the hand from the captured cv2 window. The virtual mouse then performs actions on the PC window screen as the user moves his/her fingertip. The following functions of the virtual mouse are implemented.

i. Movement of Mouse Cursor around the PC Window
The hand gesture is the index finger up, with fingerUp ID = 1; i.e., the virtual mouse is in moving mode. When the user moves the index finger around the rectangular region of the cv2 window screen, the mouse cursor correspondingly moves in the same direction on the PC window screen. The speed at which the mouse cursor moves is proportional to the speed of the hand pointing gesture movement, as shown in Figure 3.

Fig. 3. Virtual Mouse Movement Action

ii. Left Click
The hand gesture is the index and middle fingers up, with fingerUp ID = 1 and fingerUp ID = 2. The mouse performs the left-click action on the PC window screen


when the distance between the index and middle fingertips is less than 40 cm, or when they are brought closer together. The speed of the mouse left-click action is proportional to the distance the victory gesture moves from the start point, as shown in Figure 4.

Fig. 4. Virtual Mouse Left Click Action

iii. Right Click
The hand gesture is the index and middle fingers up, with fingerUp ID = 1 and fingerUp ID = 2. The mouse performs the right-click action on the PC window screen when the index and middle fingertips are wider apart than one another, or when the gap between the adjacent fingers ranges between 95 cm and 110 cm. The speed of the mouse right-click action is proportional to the distance the victory gesture moves from the start point, as shown in Figure 5.

Fig. 5. Virtual Mouse Right Click Action

iv. Scroll Up
The hand gesture is the index and pinky fingers up, with fingerUp ID = 1 and fingerUp ID = 4. The mouse performs the scroll-up action on the PC window screen when the distance between the index and pinky fingertips is less than 90 cm, or when they are brought closer together. The speed of the mouse scroll-up action is proportional to the distance the horns gesture moves from the start point, and scrolling is controlled vertically/horizontally by horns gesture actions, as shown in Figure 6.

Fig. 6. Virtual Mouse Scroll Up Action

v. Scroll Down
The hand gesture is the index and pinky fingers up, with fingerUp ID = 1 and fingerUp ID = 4. The mouse performs the scroll-down action on the PC window screen when the distance between the index and pinky fingertips is greater than 90 cm, or when they are wider apart than each other. The speed of the mouse scroll-down action is proportional to the distance the horns gesture moves from the start point, and scrolling is controlled vertically by horns gesture actions, as shown in Figure 7.

Fig. 7. Virtual Mouse Scroll Down Action
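The fingerUp/distance rules of subsections i-v can be factored into a pure dispatch function. The sketch below is illustrative only: the finger-set encoding, action names, `fingers_up` helper, and pixel thresholds are my assumptions (the paper states its thresholds in cm), and the chosen action would then be issued through a GUI-automation library such as pyautogui, which the paper does not name.

```python
# Fingertip indices in MediaPipe's 21-point hand model
TIPS = {"thumb": 4, "index": 8, "middle": 12, "ring": 16, "pinky": 20}

def dist(p, q):
    """Euclidean distance between two (x, y) points."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def fingers_up(landmarks):
    """landmarks: list of 21 (x, y) points. A finger counts as 'up' when its
    tip is above its PIP joint (smaller y in image coordinates)."""
    up = set()
    for name, tip in TIPS.items():
        if name == "thumb":
            continue  # the thumb needs an x-axis test; omitted in this sketch
        if landmarks[tip][1] < landmarks[tip - 2][1]:
            up.add(name)
    return up

def classify(raised, tips, click_thresh=40, scroll_thresh=90):
    """Map the set of raised fingers plus fingertip distances to an action.
    Thresholds are illustrative pixel values, not the paper's."""
    if raised == {"index"}:
        return "move"
    if raised == {"index", "middle"}:
        d = dist(tips["index"], tips["middle"])
        return "left_click" if d < click_thresh else "right_click"
    if raised == {"index", "pinky"}:
        d = dist(tips["index"], tips["pinky"])
        return "scroll_up" if d < scroll_thresh else "scroll_down"
    if raised == {"thumb", "index"}:
        return "drag_or_volume"
    return "none"
```

A caller would compute `raised = fingers_up(points)` per frame and then, for example, call `pyautogui.click()` when `classify` returns `"left_click"`.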


vi. Drag Object
The hand gesture is the thumb and index fingers up, with fingerUp ID = 0 and fingerUp ID = 1. The mouse performs the drag-object action on the PC window screen when the distance between the thumb and index fingertips is less than 50 cm, or when they are brought closer together. It can move objects (files/folders) from one directory to another. The speed of the mouse drag action is proportional to the distance of the pinch gesture movement, and drag actions are controlled by pinch gesture movement, as shown in Figure 8.

Fig. 8. Virtual Mouse Drag Object Action

vii. Drop Object
The hand gesture is the thumb and index fingers up, with fingerUp ID = 0 and fingerUp ID = 1. The mouse performs the drop-object action on the PC window screen when the index and middle fingertips are wider apart than one another, or when the gap between the adjacent fingers ranges between 60 cm and 65 cm. It can drop objects (files/folders) from one directory into another. Drop actions are controlled by the pinch gesture, as shown in Figure 9.

Fig. 9. Virtual Mouse Drop Object Action

viii. Volume Controller
The hand gesture is the thumb and index fingers up, with fingerUp ID = 0 and fingerUp ID = 1. The mouse controls the volume on the PC window screen when the thumb and index fingertips are brought closer together to reduce the volume to 0%, or moved wider apart to increase the volume to 100%, depending on the length between the adjacent fingers, whose range should be between 50 cm and 150 cm. The rate of volume increase/decrease is proportional to the distance of the pinch gesture movement from the start point, as shown in Figure 10.

Fig. 10. Virtual Mouse Volume Control Action

ix. Volume Mute
The hand gesture is the index and middle fingers up, with fingerUp ID = 1 and fingerUp ID = 2. The mouse performs the volume-mute action on the PC window screen when the distance between the index and middle fingertips is less than 40 cm, or when they are brought closer together. The speed of the volume-mute action is proportional to the distance the victory gesture moves from the start point, as shown in Figure 11.

Fig. 11. Virtual Mouse Volume Mute Action
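The 50-150 pinch range described for the volume (and brightness) controller amounts to a linear interpolation onto 0-100%. A minimal sketch, where the function name and the use of numpy's `interp` are mine, not the paper's:

```python
import numpy as np

def pinch_to_percent(distance, d_min=50.0, d_max=150.0):
    """Map the thumb-index distance onto a 0-100% level.
    np.interp clamps to the end values outside [d_min, d_max]."""
    return float(np.interp(distance, [d_min, d_max], [0.0, 100.0]))
```

The resulting percentage would then be applied through a platform volume or brightness API (for example pycaw or screen_brightness_control on Windows; these are plausible choices, not libraries the paper specifies).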


x. Brightness Controller
The hand gesture is the thumb and index fingers up, with fingerUp ID = 0 and fingerUp ID = 1. The mouse controls the brightness on the PC window screen when the thumb and index fingertips are brought closer together to reduce the brightness to 0%, or moved wider apart to increase the brightness to 100%, depending on the length between the adjacent fingers, whose range should be between 50 cm and 150 cm. The rate of brightness increase/decrease is proportional to the distance of the pinch gesture movement from the start point, as shown in Figure 12.

Fig. 12. Virtual Mouse Brightness Controller Action

xi. No Actions
The hand gesture is all fingertips down; the PC does not perform any mouse action in the window screen, as shown in Figure 13.

Fig. 13. Virtual Mouse No Actions

IV. EXPERIMENTAL RESULTS
The proposed system is built on the principle of simulating mouse actions on a PC screen window by using modern AI and ML technologies, such as computer vision, to improve human-PC interaction. During the analysis and prediction of the proposed model's dataset, real-time data was captured, such as hand tracking, fingertip detection, and hand gesture recognition from the webcam with OpenCV. The prediction model's hand gesture recognition results were obtained under various lighting conditions and at varying distances between hand and webcam.

Testing of this approach was carried out in various lighting conditions and at varying distances from the webcam. The proposed approach was tested 16 times each by four different types of users: professionals, ordinary people, students, and experts. The users performed mouse operations eight times in standard, low, high, and dim lighting conditions, and eight times at close, long, and normal hand distances from the webcam. The results of the mouse actions are recorded in Table 1.

Table 1: User-Tested Experimental Results

Mouse Operations         Expt. 1   Expt. 2   Expt. 3   Expt. 4   Accuracy
Mouse Cursor Movement      100       100       100       100     100/100
Left Click                 100       100       100        99     100/100
Right Click                 99       100        99        98      99/100
Scroll Up                   99       100        99        99      99/100
Scroll Down                100       100       100        99     100/100
Drag Object                 98        99        98        97      98/100
Drop Object                 98        99        98        97      98/100
Volume Control             100       100       100       100     100/100
Brightness Control         100       100       100       100     100/100
No Actions                 100       100       100       100     100/100
Total Result               994       998       994       989    994/1000
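The column totals in Table 1 can be checked directly: 3975 correct trials out of 4000 gives 99.375%, consistent with the reported ~99% accuracy. A short verification sketch:

```python
# Per-operation scores from Table 1 (Expt. 1-4)
results = {
    "Mouse Cursor Movement": [100, 100, 100, 100],
    "Left Click":            [100, 100, 100,  99],
    "Right Click":           [ 99, 100,  99,  98],
    "Scroll Up":             [ 99, 100,  99,  99],
    "Scroll Down":           [100, 100, 100,  99],
    "Drag Object":           [ 98,  99,  98,  97],
    "Drop Object":           [ 98,  99,  98,  97],
    "Volume Control":        [100, 100, 100, 100],
    "Brightness Control":    [100, 100, 100, 100],
    "No Actions":            [100, 100, 100, 100],
}
totals = [sum(col) for col in zip(*results.values())]     # per-experiment totals
overall = 100.0 * sum(totals) / (4 * 100 * len(results))  # percent correct
print(totals, overall)  # [994, 998, 994, 989] 99.375
```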


Table 1 shows the manually tested results of the virtual mouse actions with professionals, ordinary people, students, and experts. The results indicate no statistically significant difference between real and virtual mouse operations. Users can perform dynamic virtual mouse operations with 99% accuracy. The analysis shows that the mouse operations perform admirably in a variety of illumination situations, noisy backgrounds, and distance-tracking conditions. The prediction model obtained astounding accuracy after examination of the outcomes of the suggested model. Figure 14 shows a bar graph that demonstrates the percentage level of mouse operation accuracy attained after each user's virtual mouse activity.

Fig. 14. Level of Accuracy of Virtual Mouse Actions

In existing works, analysis of results was not done. In our work, the analysis of results is conducted by giving our software application to different types of users (professionals, ordinary people, students, and experts). Based on this, accuracy is computed. Table 1 shows the accuracy of various mouse actions tested by the different types of users.

V. CONCLUSION
The proposed system uses a PC webcam to capture 30 fps real-time frames with OpenCV. The MediaPipe model recognizes those frames to process hand gestures and performs mouse operations like left/right click, scroll up/down, drag/drop objects, and volume/brightness control without using a physical mouse device. It is evident from the test results that the proposed system model performs well even in dim lighting conditions. The mouse operations work very well in real-world and real-time applications using just a CPU and no GPU. The proposed model's accuracy is 99%.

VI. REFERENCES
[1] Tran, D.S., Ho, N.H., Yang, H.J., Kim, S.H. and Lee, G.S. (2021). Real-time Virtual Mouse System using RGB-D Images and Fingertip Detection. In Proceedings of International Conference on Multimedia Tools and Applications, pp. 10473-10490.
[2] Reddy, Vantukala VishnuTeja, Thumma Dhyanchand, Galla Vamsi Krishna, and Satish Maheshwaram. (2020). Virtual Mouse Control Using Colored Finger Tips and Hand Gesture Recognition. In Proceedings of International Conference in Hyderabad Section, IEEE, pp. 1-5.
[3] Chowdhury, S.R., Pathak, S., Praveena, M.A. (2020). Gesture Recognition Based Virtual Mouse and Keyboard. In Proceedings of 4th International Conference on Trends in Electronics and Informatics, IEEE, pp. 585-589.
[4] Shetty, Monali, Christina A. Daniel, Manthan K. Bhatkar, and Ofrin P. Lopes. (2020). Virtual Mouse Using Object Tracking. In Proceedings of 5th International Conference on Communication and Electronics Systems, IEEE, pp. 548-553.
[5] Shibly, Kabid Hassan, Samrat Kumar Dey, Md Aminul Islam, Shahriar Iftekhar Showrav. (2019). Design and Development of Hand Gesture Based Virtual Mouse. In Proceedings of 1st International Conference on Advances in Science, Engineering and Robotics Technology, IEEE, pp. 1-5.
[6] Varun, K.S., Puneeth, I. and Jacob, T.P. (2019). Virtual Mouse Implementation Using OpenCV. In Proceedings of 3rd International Conference on Trends in Electronics and Informatics, IEEE, pp. 435-438.
[7] Shajideen, S.M.S., Preetha, V.H. (2018). Hand Gestures - Virtual Mouse for Human Computer Interaction. In Proceedings of International Conference on Smart Systems and Inventive Technology, IEEE, pp. 543-546.
[8] SK. Abdul Soniya, R.V. Harshitha, Y. Veera Reddy, Dr. Ratna Raju Mukiri. (2018). A Novel Approach: Hand Gesture Volume Control System. International Journal of Research, Vol. 7, Issue 7, pp. 200-206.
[9] Lu, D., Yu, Y. and Liu, H. (2016). Gesture Recognition Using Data Glove: An Extreme Learning Machine Method. In Proceedings of IEEE International Conference on Robotics and Biomimetics, IEEE, pp. 1349-1354.
[10] Bhattacharjee, A., Jana, I., Das, A., Kundu, D., Ghosh, S. and Gupta, S.D. (2015). A Novel Probabilistic Approach of Colored Object Detection and Design of a Gesture Based Real-Time Mouse Tracking along with Virtual Teaching Intended for Color-Blind People. In Proceedings of 2nd International


Conference on Signal Processing and Integrated Networks, IEEE, pp. 512-519.
[11] Anusha S, N Vignesh Karthik and Sampada K S.
(2018). Comparative Study on Voice Based Chat
Bots. International Journal of Computer Sciences and
Engineering, Vol. 6, No. 12, pp. 172-175.
[12] Sampad Mondal, Subham Paul, Somjeet Ganguly and
Ratul Dey. (2019). Virtual Personal Assistant with
Face Recognition Login System Using Machine
Learning and Artificial Intelligence. International
Journal of Engineering Science and Computing, Vol.
9, No. 4, pp. 21603-21605.
