Paper 5 Virtual ML
International Journal of Engineering Applied Sciences and Technology, 2022
Vol. 7, Issue 6, ISSN No. 2455-2143, Pages 201-209
Published Online October 2022 in IJEAST (http://www.ijeast.com)

Abstract— In the current system, a wired or wireless mouse device is used. This needs a battery, a USB cable, and a dongle to connect the mouse to a PC. In this work, an AI- and ML-based gesture-controlled virtual mouse actions system is proposed and developed. This is a novel approach in which human-PC interaction is achieved using hand gestures. To detect hand gestures and control the position of the virtual mouse with the fingertip rather than a physical mouse device, a MediaPipe model is used. The proposed system is implemented in Python, and it reduces the likelihood of COVID-19 spreading by eliminating human intervention and the dependency on devices to control the PC directly.

Keywords— AIML, Python, Virtual Mouse, Computer Vision, MediaPipe

I. INTRODUCTION
Nowadays, numerous mouse options are available in the market, such as wireless, wired, optical, laser, gaming, stylus, presentation, and vertical mice. Regardless of type, the accuracy of a physical mouse can improve only so much: the mouse will always have some limitations, since it is a hardware input device integrated with the computer. Like any other physical object, the mouse has a limited lifetime within which it is functional, after which it must be replaced. As technology evolves, everything becomes virtualized.

The AI- [12] and ML-based gesture-controlled virtual mouse actions system uses detection of the human hand and fingertips to perform mouse operations on the PC window screen using computer vision techniques. The main goal is to perform mouse operations in real time. These functions include cursor movement, left/right click, scroll up/down, drag/drop of objects, and volume/brightness actions, performed using the system's built-in camera rather than a traditional mouse device; the fingertip of a hand gesture is monitored using the webcam.

A gesture-recognition virtual mouse is a detection technique that employs mathematical interpretation to detect, track, and recognize human hand gestures as given instructions. These may be in any form, such as a hand picture or a pixel image, and hand recognition with a webcam involves only minor computing difficulties. A wireless or Bluetooth mouse, by contrast, needs a dongle connected to the PC and a battery to power it. In the proposed system, the webcam captures frames, processes the captured video frames, and recognizes the hand gesture before performing the specific mouse function.

The proposed approach works on an inexpensive central processing unit (CPU) without a graphics processing unit (GPU). The system webcam captures hand and fingertip images at a rate of 30 fps (frames per second), and it provides long-distance simultaneous fingertip tracking to control the mouse cursor with very high precision. The model performs exceptionally well in real-world, real-time applications using just a CPU and no GPU.

II. LITERATURE REVIEW
Many prior experiments on the virtual mouse used hand gesture detection with a glove worn on the hand or colored fingertip caps for gesture recognition. These methods were not very accurate in mouse operations.

Paper [1] Based on the hand's shape coordinates, the K-cosine algorithm is utilized to locate the fingertip position. The system monitors fingers in real time at 30 frames per second on a computer with a single CPU and
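The capture-recognize-act loop described above, in which each recognized hand gesture triggers one specific mouse function, can be sketched as a small dispatch table. This is a minimal sketch: the finger encodings and action names below are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of the gesture-to-mouse-action dispatch described above.
# A gesture is encoded as which of the five fingers are raised:
# (thumb, index, middle, ring, pinky). The mapping is illustrative;
# the paper defines its own gesture set (e.g. "victory" for click,
# "horns" for scroll).
GESTURE_ACTIONS = {
    (0, 1, 0, 0, 0): "move_cursor",   # index finger only
    (0, 1, 1, 0, 0): "left_click",    # index + middle ("victory")
    (0, 1, 0, 0, 1): "scroll",        # index + pinky ("horns")
    (1, 1, 1, 1, 1): "drag_drop",     # open palm
}

def classify(fingers_up):
    """Map a fingers-up pattern to a mouse action name (or a no-op)."""
    return GESTURE_ACTIONS.get(tuple(fingers_up), "no_action")
```

Each captured frame yields one fingers-up pattern, so the per-frame work reduces to classifying the pattern and invoking the matching mouse action.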
Kinect V2. This fingertip-gesture-based interface enables people to interact with computers by hand. The hand area of interest and the palm centre are initially retrieved using in-depth skeleton-joint information images from a Microsoft Kinect sensor. The shapes of the hand and fingers are then retrieved and characterized using a border-tracing technique. Finally, the fingertip position is mapped to RGB images to operate the mouse cursor on a virtual screen. However, the detection depth of the K-cosine algorithm is restricted and unsuitable for outdoor applications.

Paper [2] Developed hand gesture tracking that generates images from camera frames by detecting the hand's shape and forming a convex hull around it. This research employs two approaches for monitoring fingers: one uses coloured caps and the other employs hand motion recognition. The method has been subjected to extensive testing in real-world circumstances. To execute mouse activities, hand features are extracted using the area ratio of shapes as well as the area of features per square centimetre.

Paper [3] Designed hand gestures utilized to create an optical mouse and keyboard. The computer's camera interprets images of various movements made by a person's hand. The mouse moves according to the movement of the gestures, even performing right and left clicks using the defect calculations of the convex hull algorithm, which generates the mouse and keyboard functions from the defects. It can detect and identify hand gestures in order to control mouse and keyboard operations and generate a real-world user interface. This project has the potential to be extremely useful in medical science, where computing is necessary but has limitations. However, if some external noise or fault is found in the working region of the camera, the convex hull method may create problems.

Paper [4] Based on an object tracking system to control mouse cursor actions using hand gestures captured in frames from a webcam with an HSV colour detection technique. This system enables the user to control the computer cursor with a hand bearing colour caps or tapes that the PC webcam tracks, as well as perform mouse operations such as left click, right click, and double click using various hand gestures. The author implemented this project using the Python programming language. The drawback is noise in the captured live webcam frames, and it cannot work in low-light conditions.

Paper [5] Designed a virtual mouse system based on HCI that employs computer vision and hand motions. The system uses a webcam or built-in camera to take frames and analyses them using colour segmentation and detection techniques in order to make the frames trackable. After that, it recognizes various user gestures and executes the mouse operations. A user can primarily do left clicks, right clicks, and double clicks, as well as scroll up and down, with various hand gestures. The limitation is that noise is present in the gathered live webcam frames, and it cannot function in low light.

Paper [6] Based on the OpenCV, wx, and numpy libraries, which were used to create the virtual mouse model. The model launches a webcam to collect frames of the user's fingertip colour caps. The mouse movement depends on the highlighted colour that the user specifies for mouse movement. This project may be beneficial for presentations and eliminates the weight of additional hardware devices. However, it does not operate in low-light conditions, since noise appears in the collected frames.

Paper [7] Developed work focusing on improving human-computer interaction systems [11] using hand gestures in 3-D space by employing two USB cameras, orthogonal to each other, to acquire top and side views of different hand movements and positions. The hand's pointing motion is estimated and mapped to the coordinate system of the screen. It also employed additional hand motions to complete virtual mouse movements and hand pointing at the screen, as well as other operations like folder/object selection. However, this solution uses two cameras to detect the mouse, which is very expensive; click operations do not work properly; and the system camera window has lagging issues.

Paper [8] Designed a system that can recognise particular human gestures used to communicate information for device control, which is one of the main objectives of gesture recognition. By implementing real-time gesture recognition, a user can control the volume of a desktop by making a particular hand gesture in front of a video camera connected to a computer. A hand gesture volume control system is created with the aid of the OpenCV module. In this project, no keyboard or mouse is needed; the system is controlled via hand gestures. Using simple hand gestures and a desktop-based volume control system, a user may adjust the volume in real time; the authors also suggest mouse cursor motion control. However, it can only increase and decrease the volume; it does not perform a direct volume-mute operation.

Paper [9] Based on hand gesture recognition as an intelligent, intuitive, and practical method of human-robot interaction (HRI). In this study, the authors gathered data for gesture recognition using a novel data glove named YoBu. Extreme learning machine (ELM) and SVM models gathered information on static gestures, created a gesture dataset, and examined which variables are crucial for classification. The gesture identification is based on the 54-dimension data glove with an ELM kernel, which performs better. Future robotic teleoperation based on gesture recognition may benefit from this study. Nonetheless, it has a drawback, since the user must wear a glove.

Paper [10] Designed a real-time tracking based virtual mouse application and virtual painting using finger colour caps, based on gesture patterns and executed using a camera with the aid of artificial digital vision that integrates image
processing and gesture recognition. Because the whole system is wireless and based on hand motion tracking and gesture recognition of coloured objects, it performs air gestures, which are captured, rendered on the laptop screen, and projected onto a wall for the benefit of class attendees. This application is particularly for professors or lecturers who are colour blind, have poor vision, or have vertebral deformities, damaged legs, or spinal abnormalities. However, the user's need for coloured caps to operate the application is a constraint.

From the detailed literature survey, it is evident that hand- and fingertip-based controllers are not very accurate and do not provide hassle-free control throughout all mouse activities.

III. PROPOSED SYSTEM
The main aim of the proposed work is to perform mouse operations virtually using the human hand and fingertips. The system collects real-time data by capturing streaming frames from a live webcam with the help of the OpenCV Python library. MediaPipe models are used to track, detect, and recognize the hand gestures in each captured frame.

The proposed system is divided into four modules: the hand tracking module, mouse module, volume module, and brightness module. The hand tracking module provides all of the essential libraries, and its hand detection class holds data about hand tracking, hand detection, hand location, hand landmarks, which fingertips are up, the distance between two fingertips, and the frame capture time period. To perform mouse operations, the Mouse class module provides four mouse action methods: cursor movement, left/right click, scroll up/down, and drag/drop of objects. The Volume and Brightness class modules contain system methods to increase/decrease/mute the volume and to increase/decrease the brightness.
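The hand detection class described above (hand landmarks, which fingertips are up, and the distance between two fingertips) can be sketched with plain geometry. This assumes MediaPipe's 21-point landmark indexing (tips at 4, 8, 12, 16, 20); the helper names and the thumb heuristic are illustrative, not the paper's code.

```python
# Sketch of the hand-detection helpers described above.
# A landmark is an (x, y) pixel coordinate; indices follow
# MediaPipe's 21-point hand model (tips: 4 thumb, 8 index,
# 12 middle, 16 ring, 20 pinky).
import math

TIP_IDS = [4, 8, 12, 16, 20]

def fingers_up(landmarks):
    """Return five 0/1 flags. A finger counts as 'up' when its tip
    lies above the joint two indices lower (image y grows downward,
    so 'above' means a smaller y). The thumb extends sideways, so
    its x coordinate is compared instead."""
    flags = [1 if landmarks[4][0] > landmarks[3][0] else 0]
    for tip in TIP_IDS[1:]:
        flags.append(1 if landmarks[tip][1] < landmarks[tip - 2][1] else 0)
    return flags

def fingertip_distance(landmarks, a=8, b=12):
    """Euclidean distance between two fingertip landmarks."""
    (xa, ya), (xb, yb) = landmarks[a], landmarks[b]
    return math.hypot(xb - xa, yb - ya)
```

The fingers-up pattern selects the gesture, and the fingertip distance drives the click, scroll, and volume/brightness thresholds described later.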
# MediaPipe Hands configured for a single hand; the confidence
# thresholds are fixed for good gesture accuracy.
self.hands = self.mpHands.Hands(static_image_mode=False,
                                max_num_hands=1,
                                model_complexity=1,
                                min_detection_confidence=0.85,
                                min_tracking_confidence=0.15)
i.e., these confidence thresholds are fixed to attain good accuracy for hand gesture detection. Then, an axis line is drawn across the fingertips and hand tips using MediaPipe. This keeps track of finger movement and establishes connections with neighbouring fingers.

The preconfigured MediaPipe model is used to capture the 21 hand landmarks and monitor them, as shown in Figure 2. If no landmarks are identified, the default value is zero, indicating that no action is required. Otherwise, it returns the coordinate values of the X, Y, and Z axes, indicating that action is needed. Following the capture of all hand landmarks from the webcam, a red rectangle box appears in the cv2 window. This converts fingertip coordinates from the …

Fig. 3. Virtual Mouse Movement Action

ii. Left Click
The hand gesture has the index and middle fingers up, with fingerUp ID = 1 and fingerUp ID = 2. A mouse performs the left-click action on the PC window screen
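The conversion of fingertip coordinates from the tracked rectangle of the camera frame to screen cursor positions is commonly done by linear interpolation with clamping. Below is a minimal sketch under assumed frame and screen sizes; a real setup would query them from OpenCV and the OS, and the function name is illustrative.

```python
# Sketch of mapping a fingertip (x, y) inside the tracked
# rectangle of the camera frame to a screen coordinate.
# The margin keeps the whole screen reachable without moving
# the finger all the way to the frame's edge. Sizes are
# illustrative assumptions.
def to_screen(x, y, frame_w=640, frame_h=480,
              screen_w=1920, screen_h=1080, margin=100):
    """Linearly map the rectangle [margin, frame_w - margin] x
    [margin, frame_h - margin] onto [0, screen_w] x [0, screen_h],
    clamping positions that fall outside the rectangle."""
    def interp(v, lo, hi, out_hi):
        v = min(max(v, lo), hi)                 # clamp to the rectangle
        return (v - lo) * out_hi / (hi - lo)    # scale to screen range
    return (interp(x, margin, frame_w - margin, screen_w),
            interp(y, margin, frame_h - margin, screen_h))
```

The resulting pair would then be passed to a GUI-automation call that moves the cursor.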
when the distance between the index and middle fingertips is less than 40 pixels, or when they are brought closer together. The speed of the mouse's left-click action is proportional to the distance of the victory gesture from the start point, as shown in Figure 4.

… with fingerUp ID = 1 and fingerUp ID = 4. A mouse performs the scroll-up action on the PC window screen when the distance between the index and pinky fingertips is less than 90 pixels, or when they are brought closer together. The speed of the mouse scroll-up action is proportional to the distance of the horns gesture from the start point, and it is controlled vertically/horizontally by horns gesture actions, as shown in Figure 6.
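The distance thresholds above (a left click when the index and middle fingertips come within 40 units, a scroll-up when the index and pinky fingertips come within 90) reduce to a small dispatcher. The action names below are placeholders for calls into a GUI-automation library such as pyautogui; this is a sketch, not the paper's code.

```python
# Sketch of the distance-threshold triggers described above.
LEFT_CLICK_THRESHOLD = 40   # index-middle fingertip distance
SCROLL_UP_THRESHOLD = 90    # index-pinky fingertip distance

def trigger(index_middle_dist, index_pinky_dist):
    """Decide which mouse action, if any, the two fingertip
    distances trigger; the click check runs first."""
    if index_middle_dist < LEFT_CLICK_THRESHOLD:
        return "left_click"
    if index_pinky_dist < SCROLL_UP_THRESHOLD:
        return "scroll_up"
    return "no_action"
```

In practice the fingers-up pattern would first select the gesture (victory vs. horns) before the corresponding distance test is applied; folding both tests into one function is a simplification.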
Table 1 shows the manually tested results of virtual mouse actions with professional people, ordinary people, students, and expert users. The results indicate no statistically significant difference between real and virtual mouse operations. Users can perform dynamic virtual mouse operations with 99% accuracy. The analysis shows that the mouse operations perform admirably in a variety of illumination situations, noisy backgrounds, and tracking distances. The prediction model obtained astounding accuracy after examination of the outcomes of the proposed model. As shown in Figure 14, a bar chart demonstrates the percentage accuracy attained for each user's virtual mouse activity.

Fig. 14. Level of Accuracy of virtual mouse actions

In existing works, analysis of results was not done. In our work, the analysis of results is conducted by giving our software application to different types of users (professional people, ordinary people, students, and expert users). Based on this, accuracy is computed. Table 1 shows the accuracy of various mouse actions tested by the different types of users.

IV. CONCLUSION
The proposed system uses a PC webcam to capture 30 fps real-time frames with OpenCV. The MediaPipe model recognises those frames to process hand gestures and performs mouse operations like left/right click, scroll up/down, drag/drop of objects, and volume/brightness control without using a physical mouse device. It is evident from the test results that the proposed system model performs well even in dim lighting conditions. The mouse operations work very well in real-world, real-time applications using just a CPU and no GPU. The proposed model's accuracy is 99%.

V. REFERENCES
[1] Tran, D.S., Ho, N.H., Yang, H.J., Kim, S.H. and Lee, G.S. (2021). Real-time Virtual Mouse System using RGB-D Images and Fingertip Detection. In Proceedings of the International Conference on Multimedia Tools and Applications, pp. 10473-10490.
[2] Reddy, Vantukala VishnuTeja, Thumma Dhyanchand, Galla Vamsi Krishna, and Satish Maheshwaram. (2020). Virtual Mouse Control Using Colored Finger Tips and Hand Gesture Recognition. In Proceedings of the International Conference in Hyderabad Section, IEEE, pp. 1-5.
[3] Chowdhury, S.R., Pathak, S., Praveena, M.A. (2020). Gesture Recognition Based Virtual Mouse and Keyboard. In Proceedings of the 4th International Conference on Trends in Electronics and Informatics, IEEE, pp. 585-589.
[4] Shetty, Monali, Christina A. Daniel, Manthan K. Bhatkar, and Ofrin P. Lopes. (2020). Virtual Mouse Using Object Tracking. In Proceedings of the 5th International Conference on Communication and Electronics Systems, IEEE, pp. 548-553.
[5] Shibly, Kabid Hassan, Samrat Kumar Dey, Md Aminul Islam, Shahriar Iftekhar Showrav. (2019). Design and Development of Hand Gesture Based Virtual Mouse. In Proceedings of the 1st International Conference on Advances in Science, Engineering and Robotics Technology, IEEE, pp. 1-5.
[6] Varun, K.S., Puneeth, I. and Jacob, T.P. (2019). Virtual Mouse Implementation Using OpenCV. In Proceedings of the 3rd International Conference on Trends in Electronics and Informatics, IEEE, pp. 435-438.
[7] Shajideen, S.M.S., Preetha, V.H. (2018). Hand Gestures - Virtual Mouse for Human Computer Interaction. In Proceedings of the International Conference on Smart Systems and Inventive Technology, IEEE, pp. 543-546.
[8] SK. Abdul Soniya, R.V. Harshitha, Y. Veera Reddy, Dr. Ratna Raju Mukiri. (2018). A Novel Approach: Hand Gesture Volume Control System. International Journal of Research, vol. 7, issue 7, pp. 200-206.
[9] Lu, D., Yu, Y. and Liu, H. (2016). Gesture Recognition Using Data Glove: An Extreme Learning Machine Method. In Proceedings of the IEEE International Conference on Robotics and Biomimetics, IEEE, pp. 1349-1354.
[10] Bhattacharjee, A., Jana, I., Das, A., Kundu, D., Ghosh, S. and Gupta, S.D. (2015). A Novel Probabilistic Approach of Colored Object Detection and Design of a Gesture Based Real-Time Mouse Tracking Along with Virtual Teaching Intended for Color-Blind People. In Proceedings of the 2nd International …