
JORHAT ENGINEERING COLLEGE

(Assam Science and Technology University)

Mini Project Report On


“Voice Controlled Bot using
8051 Microcontroller”
Submitted to
Department of Electrical Engineering
Jorhat Engineering College

Submitted by
BIPLOP HAGJER (210710003016)
KAUSTAV DEVGAN (210710003031)
MITURAJ BAISHYA (210710003038)
NITUMANI BHARALI (210710003040)
DECLARATION

We hereby declare that this project titled “Voice Controlled Bot using 8051 Microcontroller” is a bonafide record of the project work which we have submitted to the Department of Electrical Engineering, Jorhat Engineering College in partial fulfilment of the credit requirements for the degree of B.Tech.

We further declare that, to our knowledge, the structure and content of this report are original and are the result of work carried out by us independently. This report has not been submitted for any purpose before.

Biplop Hagjer Kaustav Devgan

Mituraj Baishya Nitumani Bharali


CONTENTS
Page No.

• Acknowledgement………………………... 5

• Abstract ....................................................... 6

• Introduction ................................................ 7-8

• System components.……………………… 9-12

• Project setup and working………………... 13-17

• Project code……………………………….. 18-19

• Operation………………………………….. 20-21

• Advantages and disadvantages.................... 22-26


• Future scope………….………………......... 27-29

• Conclusion………………………………… 30

• References…………………………………. 31
ACKNOWLEDGEMENT

We would like to extend our sincere thanks to Dr. Mrinal Buragohain Sir (Head of the Department, Electrical Engineering) and Assam Science and Technology University for giving us such a great opportunity to do this project, through which we have been able to learn and explore a new dimension of electrical engineering.

We would also like to thank our parents and friends who have helped by
giving valuable inputs at different phases of completion of this project.
ABSTRACT

This report presents a voice-controlled bot developed on the 8051 microcontroller platform, offering intuitive human-robot interaction. The system integrates a voice recognition module with motor drivers and sensors to enable real-time control based on spoken commands. The 8051 microcontroller processes voice inputs, translating them into motor control signals for the robot's movement. Emphasizing simplicity and reliability, the design incorporates feedback mechanisms for user acknowledgment and ensures robust performance in diverse environments. Challenges such as noise interference and command ambiguity are addressed through systematic refinement. The prototype demonstrates satisfactory accuracy, responsiveness, and user experience, showcasing its potential for applications in home automation, surveillance, and assistive technology. Overall, this project contributes to advancing intelligent systems, offering a practical solution for seamless human-machine interaction in robotics.
INTRODUCTION

The integration of voice recognition technology with microcontroller-based systems has revolutionized human-machine interaction, enabling intuitive control mechanisms in various applications. In line with this advancement, this project focuses on the development of a voice-controlled bot utilizing the versatile 8051 microcontroller platform.

The objective of this project is to create a user-friendly interface that allows individuals to command the movement and functionality of a robot through spoken instructions. Unlike traditional input methods such as buttons or joysticks, voice commands offer a more natural and convenient interaction paradigm, particularly in scenarios where hands-free operation is desired.

By leveraging the capabilities of the 8051 microcontroller, coupled with a dedicated voice recognition module, this project seeks to overcome the complexities associated with voice-based control systems. The integration of motor drivers and sensors further enhances the functionality of the bot, enabling it to navigate its environment intelligently based on the commands received.

This introduction sets the stage for exploring the design, implementation, and evaluation of the voice-controlled bot, emphasizing its potential impact on various domains including home automation, surveillance, and assistive technology. Through this project, we aim to contribute to the advancement of intelligent systems, facilitating seamless interaction between humans and robots in diverse real-world scenarios.
BLOCK DIAGRAM

Fig. 1: Schematic diagram of Voice Controlled Bot using 8051 Microcontroller
SYSTEM COMPONENTS

• 8051 Microcontroller: The core component responsible for processing voice commands and coordinating the robot's actions.

• Voice Recognition Module: Captures and processes spoken commands from the user, converting them into digital signals that the microcontroller can interpret.

• Motor Drivers: Control the movement of the robot's motors based on the commands received from the microcontroller. They regulate the speed and direction of the motors to execute desired maneuvers.

• Sensors: Provide input about the robot's environment, enabling it to perceive obstacles, detect changes in terrain, or navigate through predefined paths. Common sensors include proximity sensors, infrared sensors, or ultrasonic sensors.

• Power Supply Unit: Provides the necessary electrical power to all components of the system, ensuring smooth operation. It typically includes voltage regulators and power management circuits to regulate and distribute power effectively.

• Actuators: Convert electrical signals from the microcontroller into physical actions. In the context of a robot, actuators may include motors for locomotion, servos for precise movements, or grippers for manipulation tasks.

• Bluetooth Module: Enables wireless communication between the robot and external devices such as smartphones or tablets. It allows users to control the robot remotely and provides additional functionality through Bluetooth-enabled applications.

• Chassis and Mechanical Components: The physical structure of the robot, including the body, wheels, motors, and any other mechanical parts necessary for its operation. The chassis provides support and protection for internal components while determining the overall form and functionality of the robot.

• User Interface: Interfaces such as buttons, LEDs, or displays that allow users to interact with the robot. They provide feedback about the robot's status, acknowledge user commands, or display relevant information.

• Programming Environment: Software tools and development environments used to write, compile, and upload code to the microcontroller. This includes integrated development environments (IDEs), compilers, and debugging tools specific to the chosen microcontroller architecture.
PROJECT SETUP AND WORKING

1.Hardware Setup:

- Connect the 8051 microcontroller to a development board, ensuring proper power supply and connections.
- Interface the voice recognition module with the microcontroller, establishing communication channels for receiving voice commands.
- Connect motor drivers to the microcontroller to control the movement of the robot's motors.
- Integrate sensors to provide environmental feedback, enabling the robot to navigate and respond to its surroundings.
- Install the Bluetooth module for wireless communication with external devices.

2.Software Setup:

- Set up the programming environment with the necessary tools and libraries for developing code for the 8051 microcontroller.
- Write firmware code in a suitable programming language (e.g., C or assembly language) to handle voice recognition, motor control, sensor data processing, and Bluetooth communication.
- Implement algorithms for voice command interpretation, motor control logic, obstacle detection, and Bluetooth data exchange.
- Compile the firmware code and generate a binary file compatible with the 8051 microcontroller architecture.
- Upload the compiled code to the microcontroller using a suitable programming interface or tool.

3.Working of the Project:

- Upon power-up, the microcontroller initializes all components and enters a waiting state to receive voice commands.
- The user speaks commands into the microphone connected to the voice recognition module.
- The voice recognition module processes the audio input, extracts relevant commands, and sends them to the microcontroller.
- The microcontroller interprets the received commands and generates corresponding control signals for the motor drivers.
- Based on the interpreted commands, the motors drive the robot to perform desired movements such as forward, backward, turning, or stopping.
- Simultaneously, sensors collect data about the robot's surroundings, detecting obstacles or changes in terrain.
- The microcontroller analyzes sensor data, adjusts motor control signals accordingly, and provides feedback to the user through the user interface.
- Additionally, the Bluetooth module enables wireless communication with external devices, allowing users to control the robot remotely or receive real-time telemetry data.

4.Operation and Interaction:

- Users interact with the voice-controlled bot by issuing voice commands, which are processed in real-time by the system.
- The robot responds to commands by executing corresponding actions, such as moving in specified directions or performing specific tasks.

- Users can monitor the robot's status and receive feedback through
the user interface, enabling seamless interaction and control.

In summary, the voice-controlled bot project utilizes a combination of hardware components and software algorithms to enable intuitive human-robot interaction. By leveraging the capabilities of the 8051 microcontroller and integrating advanced functionalities such as voice recognition and Bluetooth communication, the project demonstrates a versatile and efficient approach to robotic automation.
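One way to realize the command-reception step described above is to use the 8051's serial interrupt instead of polling, so that the main loop remains free for motor and sensor handling. The following is a minimal sketch, not the firmware used in this project, assuming a 9600-baud serial link from the voice recognition/Bluetooth module and an 11.0592 MHz crystal; the received byte is simply stored for the main loop to act on.

#include <reg51.h>

volatile unsigned char last_cmd = 'S';   /* Last command byte received ('S' = stop) */

void serial_isr(void) interrupt 4        /* 8051 serial interrupt vector */
{
    if (RI)                              /* A byte has been received */
    {
        last_cmd = SBUF;                 /* Store it for the main loop */
        RI = 0;                          /* Clear the receive flag */
    }
    if (TI)                              /* Transmit flag (not used here) */
        TI = 0;
}

void main(void)
{
    TMOD = 0x20;                         /* Timer 1, mode 2 (auto-reload) for baud generation */
    TH1  = 0xFD;                         /* 9600 baud with an 11.0592 MHz crystal */
    SCON = 0x50;                         /* Serial mode 1, receiver enabled */
    TR1  = 1;                            /* Start Timer 1 */
    ES   = 1;                            /* Enable the serial interrupt */
    EA   = 1;                            /* Enable interrupts globally */

    while (1)
    {
        /* Main loop: read last_cmd, drive the motors, and poll the sensors here */
    }
}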

For our voice-controlled bot project, we can use DTMF (Dual Tone Multi-
Frequency) signals to represent different commands, which can then be
converted into binary codes for interpretation by the 8051
microcontroller. Here's a basic mapping of DTMF signals to binary
codes:

DTMF Signal    Binary Code
1              0001
2              0010
3              0011
4              0100
5              0101
6              0110
7              0111
8              1000
9              1001
0              0000
*              1010
#              1011

We can assign specific actions or commands to each DTMF signal, and the microcontroller can interpret these signals by listening for the specific combination of frequencies and duration corresponding to each button press.

For example, if we want the robot to move forward when it receives the DTMF signal for "1", we can assign the binary code "0001" to that action. Similarly, if we want the robot to stop when it receives the DTMF signal for "*", we can assign the binary code "1010" to the stop action.

The microcontroller firmware will need to include logic to decode incoming DTMF signals and map them to corresponding binary codes. Once decoded, the microcontroller can then execute the appropriate actions based on the binary code received.

Remember to integrate the necessary DTMF decoder circuitry with the microcontroller setup to ensure accurate decoding of DTMF signals.
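As an illustration of this decoding logic, the following is a minimal sketch, assuming a DTMF decoder IC (for example, one of the MT8870 family) with its 4-bit data outputs wired to P1.0-P1.3 and a data-valid strobe on P1.4, and using the binary codes from the table above; the pin assignments are illustrative, and the actual output codes should be checked against the chosen decoder's datasheet.

#include <reg51.h>

sbit DTMF_VALID = P1^4;   /* Assumed data-valid strobe from the DTMF decoder */
sbit MOTOR1 = P2^0;       /* Assumed motor control pins */
sbit MOTOR2 = P2^1;

void main(void)
{
    unsigned char dtmf_code;

    while (1)
    {
        while (DTMF_VALID == 0);        /* Wait until a valid key press is registered */
        dtmf_code = P1 & 0x0F;          /* Read the 4-bit binary code from P1.0-P1.3 */

        if (dtmf_code == 0x01)          /* "1" -> move forward */
        {
            MOTOR1 = 1;
            MOTOR2 = 1;
        }
        else if (dtmf_code == 0x0A)     /* "*" -> stop (code 1010 in the table above) */
        {
            MOTOR1 = 0;
            MOTOR2 = 0;
        }
        /* Further codes from the table can be mapped to other actions here */

        while (DTMF_VALID == 1);        /* Wait for the key to be released */
    }
}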
PROJECT C LANGUAGE CODE

#include <reg51.h>

/* Motor driver inputs: two direction pins per motor are assumed here,
   driving an H-bridge such as an L293D, so that forward, backward and
   stop can all be produced; the pin mapping should match the actual wiring. */
sbit M1_IN1 = P2^0;   /* Motor 1, driver input 1 */
sbit M1_IN2 = P2^1;   /* Motor 1, driver input 2 */
sbit M2_IN1 = P2^2;   /* Motor 2, driver input 1 */
sbit M2_IN2 = P2^3;   /* Motor 2, driver input 2 */

void delay(unsigned int time)   /* Crude software time delay */
{
    unsigned int i, j;
    for (i = 0; i < time; i++)
        for (j = 0; j < 1275; j++);
}

void main(void)
{
    unsigned char command;

    TMOD = 0x20;   /* Timer 1, mode 2 (8-bit auto-reload) for baud generation */
    TH1  = 0xFD;   /* 9600 baud with an 11.0592 MHz crystal */
    SCON = 0x50;   /* Serial mode 1: 8-bit data, receiver enabled */
    TR1  = 1;      /* Start Timer 1 */

    while (1)
    {
        while (RI == 0);   /* Wait till reception is complete */
        command = SBUF;    /* Copy received data to command */
        RI = 0;            /* Clear RI for the next reception */

        if (command == 'F')            /* 'F' received: move forward */
        {
            M1_IN1 = 1; M1_IN2 = 0;
            M2_IN1 = 1; M2_IN2 = 0;
        }
        else if (command == 'B')       /* 'B' received: move backward */
        {
            M1_IN1 = 0; M1_IN2 = 1;
            M2_IN1 = 0; M2_IN2 = 1;
        }
        else if (command == 'L')       /* 'L' received: turn left (motor 1 driven, motor 2 stopped) */
        {
            M1_IN1 = 1; M1_IN2 = 0;
            M2_IN1 = 0; M2_IN2 = 0;
        }
        else if (command == 'R')       /* 'R' received: turn right (motor 2 driven, motor 1 stopped) */
        {
            M1_IN1 = 0; M1_IN2 = 0;
            M2_IN1 = 1; M2_IN2 = 0;
        }
        else if (command == 'S')       /* 'S' received: stop */
        {
            M1_IN1 = 0; M1_IN2 = 0;
            M2_IN1 = 0; M2_IN2 = 0;
        }
        delay(100);                    /* Short pause between commands */
    }
}
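With an 11.0592 MHz crystal and SMOD = 0, the Timer 1 reload value follows from baud = 28800 / (256 - TH1), so TH1 = 0xFD (253) gives 28800 / 3 = 9600 baud. In a typical test setup, the single-character commands 'F', 'B', 'L', 'R' and 'S' can be sent to the 8051's UART at this rate, for example from a smartphone voice or Bluetooth terminal app paired with an HC-05 style Bluetooth module wired to the RXD pin; the exact module and app are implementation choices rather than fixed parts of this design.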

OPERATION
The operation of the voice-controlled bot project involves several key
steps:

1.Initialization:
- Upon powering up the system, the microcontroller initializes all
components, including the voice recognition module, motor drivers,
sensors, Bluetooth module, and user interface.

2.Waiting for Commands:
- The system enters a waiting state, ready to receive voice commands from the user. The user speaks commands into the microphone connected to the voice recognition module.

3.Voice Command Processing:
- The voice recognition module captures and processes the spoken commands, converting them into digital signals.
- These signals are then sent to the microcontroller for interpretation.

4.Command Interpretation:
- The microcontroller interprets the received commands, mapping them
to specific actions or maneuvers for the robot.
- For example, commands such as "forward," "backward," "left," "right,"
or "stop" may be interpreted to control the robot's movement.

5.Motor Control:
- Based on the interpreted commands, the microcontroller generates
corresponding control signals for the motor drivers.
- These signals regulate the speed and direction of the motors,
enabling the robot to execute the desired movements.

6.Sensor Data Processing:
- Simultaneously, sensors mounted on the robot collect data about its environment.
- This data includes information about obstacles, changes in terrain, or other relevant factors affecting the robot's navigation (a simple obstacle-handling sketch is shown after this list).

7.Feedback and Interaction:
- The microcontroller analyzes sensor data and adjusts motor control signals accordingly to navigate the robot safely.
- Feedback mechanisms such as LEDs or displays provide users with acknowledgment of command execution and information about the robot's status.
- Additionally, the Bluetooth module enables wireless communication with external devices, allowing users to control the robot remotely or receive real-time telemetry data.

8.Execution of Commands:
- The robot executes the interpreted commands, moving in specified
directions, adjusting speed, or performing specific tasks as instructed by
the user.

9.Continuous Operation:
- The system continues to wait for new commands, enabling ongoing
interaction between the user and the robot.
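As a concrete illustration of the sensor data processing and feedback steps above, the following is a minimal sketch, assuming a single digital obstacle sensor (for example, an IR proximity module with an active-low output) on P1.7, a status LED on P3.7, and the motor driver pins used in the project code; the sensor type, pins, and polarity are assumptions for illustration only.

#include <reg51.h>

sbit OBSTACLE   = P1^7;   /* Assumed active-low obstacle sensor output */
sbit STATUS_LED = P3^7;   /* Assumed status LED for user feedback */
sbit M1_IN1 = P2^0;       /* Motor driver inputs, as in the project code */
sbit M1_IN2 = P2^1;
sbit M2_IN1 = P2^2;
sbit M2_IN2 = P2^3;

/* Called regularly from the main loop: stop the motors and light the
   LED whenever an obstacle is detected in front of the robot. */
void check_obstacle(void)
{
    if (OBSTACLE == 0)            /* Active low: 0 means obstacle present */
    {
        M1_IN1 = 0; M1_IN2 = 0;   /* Force both motors to stop */
        M2_IN1 = 0; M2_IN2 = 0;
        STATUS_LED = 1;           /* Acknowledge the event to the user */
    }
    else
    {
        STATUS_LED = 0;           /* Path is clear */
    }
}

void main(void)
{
    while (1)
    {
        check_obstacle();         /* In the full firmware this would run alongside command handling */
    }
}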

Overall, the operation of the voice-controlled bot project involves seamless communication between the user and the robot, facilitated by voice recognition technology, motor control mechanisms, sensors, and feedback mechanisms. By integrating these components, the project provides an intuitive and interactive interface for controlling the robot's movements and functionality.
ADVANTAGES OF VOICE CONTROLLED
BOT PROJECT

The voice-controlled bot project offers several advantages:

1.Intuitive Interaction: Voice commands provide a natural and intuitive way for users to interact with the robot, eliminating the need for complex manual controls. This makes the system user-friendly and accessible to individuals with varying levels of technical expertise.

2.Hands-Free Operation: Unlike traditional control methods that require physical inputs such as buttons or joysticks, voice control enables hands-free operation. This is particularly beneficial in scenarios where users need to keep their hands occupied or maintain a safe distance from the robot.

3.Efficiency and Speed: Voice commands can be processed quickly, allowing for rapid execution of actions by the robot. This enhances the overall efficiency and responsiveness of the system, enabling users to accomplish tasks more effectively.

4.Accessibility: Voice control technology can be beneficial for users with disabilities or mobility impairments, providing them with an accessible means of controlling and interacting with the robot. This promotes inclusivity and ensures that the technology is accessible to a broader range of users.

5.Versatility: Voice control can be adapted to various applications and environments, making it suitable for a wide range of use cases. Whether it's home automation, surveillance, or assistive technology, the voice-controlled bot project can be tailored to meet different needs and requirements.

6.Integration with Other Technologies: The incorporation of Bluetooth technology in the project enhances its versatility by enabling wireless connectivity and expanding the scope of user interaction. This allows users to control the robot remotely and opens up possibilities for integrating the project with other smart devices and systems.

Overall, the voice-controlled bot project offers a convenient, efficient, and versatile solution for human-robot interaction, paving the way for advancements in automation and intelligent systems.

DISADVANTAGES OF VOICE
CONTROLLED BOT PROJECT
While voice-controlled bot projects offer numerous advantages, they
also come with some potential disadvantages:

1.Speech Recognition Accuracy: One of the primary challenges of voice-controlled systems is ensuring accurate speech recognition. Environmental noise, accents, speech impediments, and variations in pronunciation can affect the system's ability to accurately interpret commands, leading to errors or misunderstandings.

2.Limited Vocabulary: Voice recognition systems may have limitations in terms of the vocabulary they can understand and process. Complex or uncommon commands may be challenging for the system to recognize accurately, restricting the range of actions the user can command the robot to perform.
3.Privacy Concerns: Voice-controlled systems typically involve
recording and processing voice data, raising concerns about privacy
and data security. Users may be apprehensive about the storage and
use of their voice data, particularly if it is transmitted over the
internet or stored in cloud servers.

4.Dependency on Voice Input: Voice-controlled systems rely entirely on voice input for interaction, which may not always be practical or feasible. In noisy environments or situations where users cannot speak aloud, such as in a library or during a meeting, voice control may not be suitable.

5.Complexity of Implementation: Designing and implementing a reliable voice-controlled system requires expertise in speech recognition technology, signal processing, and microcontroller programming. Developing robust algorithms and optimizing system performance can be challenging and time-consuming.

6.Interference and False Positives: Environmental factors such as background noise or interference from other audio sources can lead to false positives, where the system mistakenly interprets non-command sounds as valid commands. This can result in unintended actions by the robot and undermine user confidence in the system.
7.Integration Issues: Integrating voice control with other system
components, such as motors, sensors, and communication modules,
can introduce additional complexity and potential points of failure.
Ensuring seamless integration and compatibility across all
components of the system requires careful design and testing.

Despite these disadvantages, advancements in speech recognition technology and system design continue to address many of these challenges, making voice-controlled systems increasingly practical and reliable for a wide range of applications.
FUTURE SCOPE

The voice-controlled bot project has significant future scope for enhancements and applications:

1.Advanced Speech Recognition: Future advancements in speech recognition technology can improve the accuracy and robustness of voice-controlled systems. Deep learning algorithms, natural language processing techniques, and contextual understanding can enable systems to recognize a wider range of commands and adapt to diverse user inputs.

2.Natural Language Processing: Integrating natural language processing (NLP) capabilities into voice-controlled systems can enable them to understand and respond to more complex and conversational commands. This would enhance the user experience and make human-robot interaction more intuitive and seamless.
3.Multi-Modal Interaction: Combining voice control with other
input modalities such as gestures, touch, or facial expressions can
create more versatile and inclusive interaction paradigms. Multi-
modal interfaces can accommodate diverse user preferences and
accessibility needs, enhancing the overall usability of the system.

4.Smart Home Integration: Voice-controlled bots can be integrated with smart home systems to provide enhanced automation and convenience. They can control IoT devices, adjust home settings, and perform routine tasks based on user commands, making smart homes more responsive and adaptive to user needs.

5.Assistive Technology: Voice-controlled bots have significant potential in assistive technology applications for individuals with disabilities or mobility impairments. Enhanced voice recognition, coupled with intelligent robotic assistance, can empower users to perform everyday tasks more independently and efficiently.

6.Collaborative Robotics: Voice-controlled bots can be deployed in collaborative robotics settings, where they work alongside humans in shared workspaces. Future advancements in safety features, human-robot interaction protocols, and task allocation algorithms can enable safer and more efficient collaboration between humans and robots.
7.Personalized User Experience: By leveraging data analytics and
machine learning techniques, voice-controlled bots can learn and
adapt to users' preferences and behavior over time. They can
personalize interactions, anticipate user needs, and provide tailored
recommendations or assistance, enhancing user satisfaction and
engagement.

8.Augmented Reality Integration: Integration with augmented reality (AR) technology can overlay virtual information or interfaces onto the physical environment, enhancing the user interface and providing contextual information in real-time. Voice commands can complement AR interactions, enabling hands-free control and navigation in AR environments.

Overall, the future scope of voice-controlled bot projects lies in advancing technology to improve accuracy, versatility, and user experience, while expanding applications across various domains such as smart homes, assistive technology, collaborative robotics, and augmented reality. Continued research and innovation in these areas will unlock new possibilities for voice-controlled systems to transform human-robot interaction and automation.
CONCLUSION

In conclusion, the voice-controlled bot project harnesses the synergy between the 8051 microcontroller and modern voice recognition technology to offer a streamlined interface for human-robot interaction. Its intuitive nature, hands-free operation, and potential applications in various domains make it a compelling innovation. Despite its advantages, challenges such as speech recognition accuracy and integration complexity persist, requiring ongoing research and refinement. However, the project's future is bright, with opportunities for advancements in speech recognition, multi-modal interaction, and personalized user experiences. By addressing these challenges and capitalizing on emerging technologies, the voice-controlled bot project paves the way for intelligent automation and collaborative robotics, promising a more connected and efficient future.
REFERENCES

The 8051 Microcontroller: Hardware, Software, and Interfacing: James W. Stewart, Joseph J. Mistovich

The 8051 Microcontroller and Embedded Systems: Using Assembly and C: Muhammad Ali Mazidi, Janice Gillispie Mazidi, Rolin D. McKinlay

Mrumal K. Pathak et al., “Robot Control Design Using Android Smartphone”

R. Pahuja et al., “Android Mobile Phone Controlled Bluetooth Robot”

Atmel: www.atmel.com

Design and Construction: https://livemytraining.com/product/voice-controlled-robot-by-cell-phone-with-android-app/
