ABSTRACT
The gesture control car project develops a system for intuitive control of a car using hand gestures. It integrates gesture recognition, wireless communication, and motor control so that users can direct the car's movements effortlessly. Using an Arduino Nano, an MPU6050 motion sensor, NRF24L01+ radio modules, and an L298N motor driver, the system recognizes predefined hand gestures and translates them into the corresponding control signals for the car.
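As a concrete illustration of this hardware chain, a minimal transmitter-side sketch is shown below: it reads the MPU6050 on the hand unit and sends a one-byte command over the NRF24L01+. This is only a sketch of the approach, not the project's actual firmware; the pin assignments (CE on D9, CSN on D10), the pipe address "00001", the tilt thresholds, and the command encoding ('F', 'B', 'L', 'R', 'S') are assumptions made for illustration, and it relies on the widely used RF24 and I2Cdevlib MPU6050 Arduino libraries.

#include <SPI.h>
#include <Wire.h>
#include <nRF24L01.h>
#include <RF24.h>
#include <I2Cdev.h>
#include <MPU6050.h>

// Assumed wiring: NRF24L01+ CE -> D9, CSN -> D10 (SPI on D11-D13),
// MPU6050 on the Nano's I2C pins (A4/A5).
RF24 radio(9, 10);
MPU6050 mpu;
const byte address[6] = "00001";   // assumed pipe address, must match the receiver

void setup() {
  Wire.begin();
  mpu.initialize();                // wake the MPU6050 with default full-scale ranges
  radio.begin();
  radio.openWritingPipe(address);
  radio.setPALevel(RF24_PA_LOW);
  radio.stopListening();           // this node only transmits
}

void loop() {
  int16_t ax, ay, az;
  mpu.getAcceleration(&ax, &ay, &az);   // raw accelerometer readings

  // Map hand tilt to a single command byte: 'F', 'B', 'L', 'R' or 'S' (stop).
  // The +/-8000 thresholds (about 0.5 g at the default +/-2 g range) are
  // illustrative values that would be tuned experimentally.
  char cmd = 'S';
  if (ay > 8000)       cmd = 'F';
  else if (ay < -8000) cmd = 'B';
  else if (ax > 8000)  cmd = 'R';
  else if (ax < -8000) cmd = 'L';

  radio.write(&cmd, sizeof(cmd));       // send the command to the car
  delay(50);
}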
The implementation captures and processes hand-gesture data using computer vision techniques, and a deep learning model is trained to classify the gestures and map them to specific navigation commands, enabling seamless control of the car's movements. Wireless communication is established between the controller and the car, so users can interact with the car from a distance.
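On the car side, a complementary receiver sketch would listen on the same pipe and translate each received command into L298N pin states. This too is a hedged sketch under assumed wiring (ENA/ENB on PWM pins D3/D5, IN1-IN4 on D2/D4/D7/D8) and the same hypothetical one-byte command set; the project's real pin mapping and motor speeds may differ.

#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

// Assumed wiring on the car: NRF24L01+ CE -> D9, CSN -> D10;
// L298N ENA -> D3, IN1 -> D2, IN2 -> D4 (left motor),
//       ENB -> D5, IN3 -> D7, IN4 -> D8 (right motor).
RF24 radio(9, 10);
const byte address[6] = "00001";   // must match the transmitter's pipe address

const int ENA = 3, IN1 = 2, IN2 = 4;   // left motor
const int ENB = 5, IN3 = 7, IN4 = 8;   // right motor

// Illustrative helper: dir is +1 forward, -1 reverse, 0 stop.
void setMotors(int leftDir, int rightDir, int speed) {
  digitalWrite(IN1, leftDir  > 0 ? HIGH : LOW);
  digitalWrite(IN2, leftDir  < 0 ? HIGH : LOW);
  digitalWrite(IN3, rightDir > 0 ? HIGH : LOW);
  digitalWrite(IN4, rightDir < 0 ? HIGH : LOW);
  analogWrite(ENA, leftDir  == 0 ? 0 : speed);
  analogWrite(ENB, rightDir == 0 ? 0 : speed);
}

void setup() {
  pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
  pinMode(ENA, OUTPUT); pinMode(ENB, OUTPUT);
  radio.begin();
  radio.openReadingPipe(1, address);
  radio.setPALevel(RF24_PA_LOW);
  radio.startListening();           // this node only receives
}

void loop() {
  if (radio.available()) {
    char cmd = 'S';
    radio.read(&cmd, sizeof(cmd));
    switch (cmd) {                              // act on the received gesture command
      case 'F': setMotors(+1, +1, 200); break;  // forward
      case 'B': setMotors(-1, -1, 200); break;  // backward
      case 'L': setMotors(-1, +1, 180); break;  // turn left
      case 'R': setMotors(+1, -1, 180); break;  // turn right
      default:  setMotors(0, 0, 0);     break;  // stop
    }
  }
}

Keeping the radio payload to a single byte keeps latency low and leaves the protocol easy to extend, for example with a speed value derived from the tilt angle.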
In summary, the gesture control car project demonstrates the successful integration of gesture recognition, wireless communication, and motor control for intuitive control of a car using hand gestures. The implementation and experimental results highlight the system's accuracy and responsiveness, offering a promising approach to enhancing user experience and interaction with devices through gesture-based control systems.
LIST OF CONTENTS

CHAPTER      CONTENTS                      PAGE NUMBER
             Acknowledgement
             Abstract
CHAPTER 1    INTRODUCTION                  2
             1.1 Background                3
             1.2 Overview                  3
             1.3 Objectives                4
             1.4 Methodology               5
CHAPTER 2    LITERATURE SURVEY             6
CHAPTER 3    HARDWARE REQUIREMENTS
CHAPTER 5    IMPLEMENTATION
             6.1 Conclusion                27
             6.2 Future Prospects          27
             6.3 References                29