
ECE-2006

DIGITAL SIGNAL PROCESSING


L33+L34

LAB TASK-5

Name: Anvita Tadepalli


Reg.no: 20BEC0377
• Title:
Hand Gesture Recognition System using Python
• Abstract:
I have built a hand tracker that identifies the number of raised fingers and displays the
count on the screen. Building on this, the code implements a virtual mouse with basic
controls such as scrolling and volume control: with one swipe of the hand you could browse
through Instagram, and with a pinch gesture you could lower the audio while watching a
video on YouTube. The system uses MediaPipe, a customizable machine learning solutions
framework developed by Google. It is open source, cross-platform, and very lightweight,
and it ships with pre-trained ML solutions such as face detection, pose estimation, hand
recognition, and object detection. Here, MediaPipe is used to recognize the hand and its
key points; it returns a total of 21 key points for each detected hand. I have also used
PyAutoGUI, a Python package that works across Windows, macOS, and Linux and can simulate
mouse cursor moves and clicks as well as keyboard button presses. NumPy is a Python library
for working with arrays; it also provides functions for linear algebra, Fourier transforms,
and matrices, and is used here to map hand distances to volume levels.
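As an illustration of how the 21 MediaPipe key points can be turned into a finger count, the sketch below assumes each landmark is a normalized (x, y) pair in MediaPipe's standard indexing (fingertips at 4, 8, 12, 16, 20) with image coordinates, where y grows downward. The function name, the right-hand thumb test, and the synthetic landmark data are assumptions for illustration, not part of the lab's handtrackingmodule:

```python
# Minimal sketch: count raised fingers from 21 (x, y) hand landmarks.
# Assumes MediaPipe's landmark order and a right hand facing the camera;
# in image coordinates a raised fingertip has a SMALLER y than its joint.

TIP_IDS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky tips

def count_fingers(lm):
    """lm: list of 21 (x, y) tuples in normalized image coordinates."""
    fingers = []
    # Thumb: compare x of the tip (4) against the joint below it (3).
    fingers.append(1 if lm[4][0] < lm[3][0] else 0)
    # Other fingers: tip above the PIP joint two indices below it.
    for tip in TIP_IDS[1:]:
        fingers.append(1 if lm[tip][1] < lm[tip - 2][1] else 0)
    return fingers

# Synthetic example: only the index finger raised.
lm = [(0.5, 0.5)] * 21
lm[6] = (0.5, 0.40)   # index PIP joint
lm[8] = (0.5, 0.20)   # index tip above its PIP -> "up"
print(count_fingers(lm))  # [0, 1, 0, 0, 0]
```

A real detector would read these coordinates from MediaPipe's `multi_hand_landmarks` output instead of synthetic tuples.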

• Code:
(i)
import cv2
import mediapipe as mp
from handtrackingmodule import HandDetector
import pyautogui as py

detector = HandDetector()
capture = cv2.VideoCapture(0)

while True:
    success, img = capture.read()
    lmlist, img = detector.lmlist(img)        # 21 hand landmarks
    if lmlist:
        fingers, img = detector.fingersup(img, lmlist)
        # print(fingers)
        if fingers == [0, 0, 0, 0, 0]:        # closed fist: scroll down
            py.scroll(-50)
        elif fingers == [1, 1, 1, 1, 1]:      # open hand: scroll up
            py.scroll(50)

    cv2.imshow("Video Feed", img)
    key = cv2.waitKey(1)
    if key == 27:                             # Esc to quit
        break

capture.release()
cv2.destroyAllWindows()
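The fist-open/fist-closed branch in the loop above can be factored into a small pure function, which makes the mapping testable without a camera. The helper name is illustrative (not part of handtrackingmodule); the gesture patterns and scroll amounts mirror the code above:

```python
# Sketch: map a fingers-up list (as returned by the detector) to a
# pyautogui scroll amount. Positive scrolls up, negative scrolls down.

def gesture_to_scroll(fingers):
    if fingers == [0, 0, 0, 0, 0]:   # closed fist
        return -50                   # scroll down
    if fingers == [1, 1, 1, 1, 1]:   # open hand
        return 50                    # scroll up
    return 0                         # any other gesture: do nothing

print(gesture_to_scroll([0, 0, 0, 0, 0]))  # -50
```

In the main loop this would replace the if/elif chain with `amount = gesture_to_scroll(fingers)` followed by `py.scroll(amount)` when the amount is nonzero.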

(ii)
import cv2
import mediapipe as mp
from handtrackingmodule import HandDetector
import numpy as np

from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

# Get a handle to the Windows master-volume endpoint via pycaw
devices = AudioUtilities.GetSpeakers()
interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))
volrange = volume.GetVolumeRange()
minvol = volrange[0]
maxvol = volrange[1]

detector = HandDetector()
capture = cv2.VideoCapture(0)

while True:
    success, img = capture.read()
    lmlist, img = detector.lmlist(img)

    if lmlist:
        fingers, img = detector.fingersup(img, lmlist, draw=False)
        if fingers == [1, 1, 0, 0, 0]:        # thumb and index raised
            length, img = detector.finddistance(4, 8, img, lmlist)
            # Map the thumb-index distance (10-200 px) to the volume range
            vol = np.interp(length, (10, 200), (minvol, maxvol))
            volper = np.interp(length, (10, 200), (0, 100))
            cv2.putText(img, str(int(volper)) + "%", (100, 100),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 225), 3)
            volume.SetMasterVolumeLevel(vol, None)

    cv2.imshow("Video Feed", img)
    key = cv2.waitKey(1)
    if key == 27:                             # Esc to quit
        break

capture.release()
cv2.destroyAllWindows()
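The volume mapping in code (ii) is a clamped linear interpolation: a thumb-index distance of about 10 px maps to minimum volume and 200 px to maximum, which is exactly what np.interp computes for a scalar. A stdlib-only version of the same mapping is sketched below; the pixel range 10-200 comes from the code above, while the function names are illustrative:

```python
import math

def interp_clamped(x, x_lo, x_hi, y_lo, y_hi):
    """Linearly map x from [x_lo, x_hi] to [y_lo, y_hi], clamped at
    both ends -- the same behaviour as np.interp for a scalar."""
    if x <= x_lo:
        return y_lo
    if x >= x_hi:
        return y_hi
    return y_lo + (x - x_lo) * (y_hi - y_lo) / (x_hi - x_lo)

def thumb_index_to_volume_percent(thumb, index):
    """thumb, index: (x, y) pixel positions of landmarks 4 and 8."""
    length = math.hypot(index[0] - thumb[0], index[1] - thumb[1])
    return interp_clamped(length, 10, 200, 0, 100)

# 105 px apart sits exactly halfway through the 10-200 px range.
print(thumb_index_to_volume_percent((100, 100), (100, 205)))  # 50.0
```

The same `interp_clamped` call with `(minvol, maxvol)` as the output range would reproduce the decibel value passed to `SetMasterVolumeLevel`.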

• Screenshots:

(i) Hand Tracking And Scroll feature


The system recognises the number of raised fingers and displays the count on screen.
Moreover, the screen scrolls down when the fist is closed and scrolls up when the hand is
opened, matching the gesture mapping in code (i).
(ii) Volume Control
You can see the computer volume being controlled by hand gestures: moving the thumb and
index finger farther apart increases the volume, and moving them closer together decreases
it.
• Link for Demo Video:
https://www.youtube.com/watch?v=SlqHa2R9RYg

• Conclusion:
Hence, using MediaPipe and PyAutoGUI we were able to create a machine learning hand
gesture detection model that counts the number of raised fingers, scrolls the screen
up and down, and adjusts the computer volume using the thumb and index finger. With
further improvements and modifications, this could become a virtual mouse that works
entirely on gestures and eliminates the physical strain of using an actual mouse.
