MINI PROJECT REPORT


AI PERSONAL TRAINER

A PROJECT REPORT

Submitted by

SHERLIN LIDIYA A 727721EUEC141

SHRIKAVITHA S 727721EUEC144

SWETHA PJ 727721EUEC163

In partial fulfillment for the award of the degree

of

BACHELOR OF ENGINEERING

IN

ELECTRONICS AND COMMUNICATION ENGINEERING

SRI KRISHNA COLLEGE OF ENGINEERING AND TECHNOLOGY


An Autonomous Institution | Approved by AICTE | Affiliated to Anna University | Accredited by NAAC with A++ Grade
Kuniamuthur, Coimbatore – 641008

NOVEMBER 2024
SUSTAINABLE DEVELOPMENT GOALS

The Sustainable Development Goals (SDGs) are a collection of 17 global goals that serve as a
blueprint for achieving a better and more sustainable future for all. Set by the United Nations
General Assembly in 2015 and intended to be achieved by 2030, the SDGs were adopted by 195
nations as a shared plan to change the world for the better. This project is based on one of
these 17 goals.

Questions and Sample Answers

Which SDGs does the project directly address?
SDG 3 – Good Health and Well-being.

What strategies or actions are being implemented to achieve these goals?
Developing personalized fitness plans based on user data (age, weight, fitness level).

How is progress measured and reported in relation to the SDGs?
Progress is measured by tracking users' health metrics (BMI, heart rate, calories burned) over time.

How were these goals identified as relevant to the project's objectives?
Increasing public awareness of fitness and physical well-being fits the SDG target of reducing premature mortality and promoting healthier lifestyles.

Are there any partnerships or collaborations in place to enhance this impact?
Collaboration with fitness professionals to design accurate exercise routines.

II
BONAFIDE CERTIFICATE

Certified that this project report “AI PERSONAL TRAINER” is the bonafide work

of “SHERLIN LIDIYA A(727721EUEC141), SHRIKAVITHA S(727721EUEC144),

SWETHA PJ(727721EUEC163)" who carried out the project work under my

supervision.

SIGNATURE
Dr. S. SASIPRIYA, Ph.D.,
HEAD OF THE DEPARTMENT
Department of Electronics and Communication Engineering
Sri Krishna College of Engineering and Technology
Kuniyamuthur, Coimbatore.

SIGNATURE
Dr. G. RANJITHAM, Ph.D.,
SUPERVISOR
Department of Electronics and Communication Engineering
Sri Krishna College of Engineering and Technology
Kuniyamuthur, Coimbatore.

Submitted for the Project viva-voce examination held on

INTERNAL EXAMINER EXTERNAL EXAMINER

III
TABLE OF CONTENTS

CHAPTER NO. TITLE PAGE NO.

ABSTRACT viii
LIST OF FIGURES vii

1. INTRODUCTION 2

1.1 Problem Statement 4


1.2 Overview 6
1.3 Objective 6

2. COMPONENTS AND SPECIFICATION 7

2.1 Python 3.x 7

2.1.1 Overview 7

2.1.2 Features 7

2.1.3 Installation and Environment Setup 8

2.1.4 Libraries and Frameworks 8

2.1.5 Use Cases in the Project 9

2.1.6 Example Applications 9

2.2 Flask 9

2.2.1 Overview 9

2.2.2 Features 10

2.2.3 Installation and Configuration 10

2.2.4 Application Structure 11


IV
2.2.5 Use Cases in the Project 11

2.2.6 Best Practices 12

2.3 NumPy 12

2.3.1 Overview 12

2.3.2 Features 12

2.3.3 Installation and Configuration 13

2.3.4 Key Functions and Methods 13

2.3.5 Use Cases in the Project 13

2.3.6 Performance Considerations 14

2.4 pandas 14

2.4.1 Overview 14

2.4.2 Features 15

2.4.3 Installation and Configuration 15

2.4.4 Key Functions and Methods 15

2.4.5 Use Cases in the Project 16

2.4.6 Data Handling Best Practices 16

2.5 Web Browser 16

2.5.1 Overview 16

2.5.2 Features 17

2.5.3 Supported Technologies 17

2.5.4 Use Cases in the Project 18

2.5.5 Browser Compatibility Considerations 18

2.6 PC 19

2.6.1 Description 19

2.6.2 Specification 19

2.6.3 Working 19

V
3. WORKING 20

3.1 User Input Handling 20

3.2 Block Diagram 22

3.3 Source Code 32

4. CONCLUSION 34

REFERENCES 35

VI
LIST OF FIGURES

FIGURE NO TITLE PAGE NO

3.2 BLOCK DIAGRAM 22

3.4.1 PULL UP 32

3.4.2 SIT UP 33

3.4.3 PUSH UP 33

3.4.4 WALK 34

VII
ABSTRACT

Virtual assistants have become indispensable in modern life, playing a


crucial role in how we manage our daily activities. According to a 2019 Clutch
survey report, approximately 27% of people rely on AI virtual assistants to
accomplish their daily tasks. As technology continues to advance, there is a
growing interest in exploring the potential of artificial intelligence (AI) in various
domains. In light of this, our team embarked on research to explore the rapidly
evolving field of artificial intelligence, seeking opportunities to leverage AI
technology to address real-world challenges. One area of focus was the realm of
fitness and wellness, where there exists a significant demand for personalized
guidance and support. Our investigation led us to conceptualize and develop Fit
Exercise, an AI-based fitness trainer designed to revolutionize the way individuals
engage with their workout routines. Fit Exercise utilizes cutting-edge AI-driven
human pose estimation technology to analyze and interpret the user's workout
stance in real time. The core functionality of Fit Exercise revolves around its
ability to accurately detect and track the user's body movements during exercise
sessions. By analyzing the geometry of the user's posture and movement patterns,
the software offers actionable insights on how to improve exercise form and
technique. Overall, Fit Exercise represents a significant advancement in the realm
of AI-driven fitness technology. By combining state-of-the-art AI algorithms with
human pose estimation technology, the software offers a comprehensive solution
for personalized fitness coaching and guidance. With its ability to analyze, track,
and provide feedback on exercise performance in real time, Fit Exercise has the
potential to revolutionize the way individuals approach their fitness journey.

1
CHAPTER 1

INTRODUCTION

In the digital age, artificial intelligence (AI) has emerged as a disruptive


force, reshaping industries and revolutionizing daily life in unprecedented ways.
The integration of AI into various sectors has opened up new possibilities for
innovation, particularly in fitness and wellness, where the demand for smarter,
more efficient solutions has grown substantially. As individuals increasingly seek
out tools that not only assist but elevate their workout experience, AI-driven
technologies are playing a key role in meeting these expectations by offering
tailored guidance, real-time feedback, and comprehensive data analysis.

A 2019 Clutch survey revealed that 27% of consumers regularly rely on AI


virtual assistants to manage everyday tasks, highlighting the widespread adoption
of AI across diverse areas of life. As this reliance grows, the fitness industry is
experiencing a shift from conventional training methods to more sophisticated,
technology-enhanced approaches. Traditional fitness regimens, while effective
for some, often lack personalization, on-the-spot corrections, and adaptability to
an individual’s unique needs. This is where AI stands to make a significant
impact, by delivering real-time, customized support that can enhance both the
safety and effectiveness of physical training.

Fit Exercise harnesses the power of advanced AI algorithms and human


pose estimation technology to create a highly interactive, data-driven fitness
experience. Utilizing real-time motion capture and analysis, the AI system
precisely tracks a user's movements during exercise, evaluating key factors such
as posture, alignment, and range of motion. Through this analysis, Fit Exercise
provides immediate feedback on form correction, helping users improve their
technique, maximize performance, and minimize the risk of injury.

This real-time analysis is underpinned by sophisticated machine learning


models, which continually learn and adapt based on the user’s performance
history and personal fitness goals. By examining patterns in a user’s movement
and progress, Fit Exercise delivers progressively tailored exercise
recommendations, ensuring that workouts remain challenging yet achievable.
Moreover, the system can track long-term data, offering users insights into their
fitness journey, highlighting areas of improvement, and adjusting routines to align
with evolving goals.

Additionally, Fit Exercise provides an unparalleled level of accessibility.


Unlike traditional personal trainers or fitness programs, which may require
scheduled appointments, in-person sessions, or expensive equipment, this AI-
based solution is available anytime and anywhere, making it ideal for busy
individuals. Whether working out at home, in a gym, or even while traveling,
users can access Fit Exercise via their smartphone or compatible devices,
allowing them to maintain consistency in their fitness journey without the need
for constant supervision.

Further, the integration of AI into Fit Exercise enhances the scalability of


personalized training. It ensures that every user, regardless of fitness level, can
receive individualized attention comparable to that of a personal trainer, without
the constraints of one-on-one sessions. Beginners can benefit from structured
guidance and foundational instruction, while experienced athletes receive
advanced, data-driven advice to refine their technique and push their boundaries.

3
Fit Exercise stands at the forefront of a new era in fitness, where artificial
intelligence transforms traditional workout routines into personalized, high-
performance training programs. By leveraging the latest advancements in AI and
human pose estimation technology, our solution empowers users to take control
of their fitness journey with precision, flexibility, and confidence. With Fit
Exercise, the future of fitness is not only smarter but more inclusive, accessible,
and effective for all.

1.1 Problem Statement:

In the modern era, maintaining a healthy lifestyle is a significant challenge


for many individuals, particularly when it comes to adhering to consistent and
effective workout routines. One of the most critical aspects of fitness training
is ensuring proper form and technique during exercises, as incorrect
movements can lead to reduced efficiency, slow progress, and an increased
risk of injury. Traditionally, personal trainers provide the necessary guidance
to correct these issues; however, access to such expertise is often limited due
to high costs, time constraints, and location barriers. Furthermore, as fitness
trends evolve and home workouts become more popular, there is a growing
demand for innovative, accessible solutions that can offer personalized fitness
guidance without the need for in-person supervision.

Despite advancements in artificial intelligence (AI) and machine learning


(ML), the integration of these technologies in the fitness industry remains
limited, especially in providing real-time, data-driven feedback on exercise
form and progress tracking. Most existing fitness applications rely on pre-
recorded videos or generic workout plans, which lack the capability to
provide personalized, immediate feedback based on the user’s individual
movements and performance. This creates a gap between the user’s needs for
precise, real-time correction and the current offerings in the fitness market.

The Fit Exercise project aims to address these challenges by leveraging


state-of-the-art AI technologies, including real-time human pose estimation
and machine learning models, to create a virtual fitness trainer that provides
personalized, real-time feedback on exercise performance. The system
analyzes the user’s workout movements using a live camera feed, assesses the
accuracy of their form, and provides instant, actionable recommendations to
improve technique. By incorporating AI-driven pose estimation technologies,
such as MediaPipe's BlazePose, and a machine learning algorithm trained on
labeled exercise data, Fit Exercise ensures that users receive tailored
coaching, helping them achieve optimal form and avoid injury.

1.2 Overview:

The Fit Exercise project represents a cutting-edge advancement in fitness


training, leveraging artificial intelligence (AI) and machine learning (ML) to
deliver a personalized and interactive workout experience. Central to this
initiative is an AI-based fitness trainer that utilizes sophisticated human pose
estimation technology for real-time analysis of users' movements. By
employing state-of-the-art pose detection algorithms, the system accurately
tracks critical body key points, enabling it to assess exercise performance and
provide immediate, constructive feedback to correct deviations from ideal
form. Users can select exercises tailored to their preferences, receiving
individualized coaching that enhances motivation and focus on specific
fitness goals. The platform includes features for goal setting and progress
monitoring, allowing users to visualize their development through detailed
performance metrics. With its intuitive web-based interface, Fit Exercise aims
to revolutionize how individuals approach their fitness routines, promoting
healthier lifestyles through accessible, effective, and data-driven fitness
solutions.

1.3 Objective:

The primary objective of the Fit Exercise project is to create an


intelligent fitness trainer that leverages advanced human pose estimation
algorithms. This system will provide precise analysis of users' exercise form
during workouts by integrating sophisticated machine learning models that
detect and track key points on the user's body, such as joints and limbs.

By processing real-time video feeds from the user's device, the trainer will
assess whether movements align with recommended techniques. If discrepancies
or improper postures are detected, the AI will generate immediate, actionable
feedback to help the user correct their form. This feature aims to enhance workout
effectiveness while minimizing the risk of injury due to poor exercise execution.
Ultimately, this objective empowers users to engage in safer and more productive
fitness practices, leading to improved overall performance and results.

The second objective focuses on enhancing user experience by


incorporating features that support goal setting and progress tracking. The Fit
Exercise platform will allow users to define specific fitness goals—such as
weight loss, muscle gain, or endurance improvement—tailored to their individual
needs.

6
CHAPTER 2
COMPONENTS AND SPECIFICATION

2.1 Python 3.x


2.1.1 Overview
Python 3.x is a powerful, high-level programming language that is widely
recognized for its ease of use and readability. It supports multiple programming
paradigms, including procedural, object-oriented, and functional programming.
Python’s extensive standard library and vibrant community make it a top choice
for various applications, including web development, data analysis, artificial
intelligence, and machine learning.

It is widely used for web development, automation, and data science due
to its simple syntax and extensive support for libraries and frameworks. Python
3.x introduced significant improvements over previous versions, including better
Unicode support, enhanced library functions, and syntax changes to simplify
coding practices.

2.1.2 Features
• Dynamic Typing: Enables developers to write code quickly without
specifying variable types, enhancing flexibility.

• Rich Ecosystem: Supports numerous libraries and frameworks that extend


functionality across various domains.

• Cross-Platform Compatibility: Code can run on different operating


systems without modification, promoting portability.

• Community Support: A vast user community provides resources,


documentation, and forums for assistance.

7
Key Features:

• Interpreted Language: Executes code line-by-line, making debugging


easier and allowing for immediate testing of changes.

• Extensive Standard Library: Includes modules for file handling, web


services, data serialization, and more.

2.1.3 Installation and Environment Setup

1. Download Python: Visit the official Python website and download the
latest version for your operating system.

2. Installation Instructions: Follow the installation wizard, ensuring the


option to add Python to the system PATH is selected.

3. Verifying Installation: Open a terminal or command prompt and execute


the command python --version to confirm the installation.

4. Setting Up a Virtual Environment: Use the command python -m venv


env to create an isolated environment. Activate it to manage dependencies
separately from the global Python installation.

2.1.4 Libraries and Frameworks

Python’s ecosystem includes various libraries essential for the AI Personal


Trainer project:

• Flask: For developing the web application’s back end.

• NumPy: For numerical computations and data manipulation.

• pandas: For data analysis and management.

8
Dependency Management

Utilize a requirements.txt file to specify all necessary libraries, allowing for easy
installation via a package manager. The command pip install -r requirements.txt
installs all dependencies listed.
2.1.5 Use Cases in the Project
Python will serve as the core programming language, enabling:
• Machine Learning Model Development: Implementing algorithms that
analyze user data to provide tailored fitness advice.
• Web Application Development: Using Flask to create a user-friendly
interface.
• Data Processing: Employing NumPy and pandas for efficient data
manipulation and analysis.
2.1.6 Example Applications
Python has been utilized in numerous applications across various domains,
including:
• Healthcare: Data analysis and predictive modeling for patient care
optimization.
• Finance: Automated trading systems and risk assessment models.
• Education: Development of interactive learning tools and assessment
platforms.

2.2 Flask

2.2.1 Overview

Flask is a micro web framework for Python designed for building web
applications quickly and with minimal boilerplate code. It is particularly well-
suited for small to medium-sized applications and APIs due to its simplicity and
flexibility.

9
Flask is used to build the user-facing side of the AI personal trainer system. Its
lightweight architecture and simplicity make it ideal for this project, where it
handles HTTP requests and routes, allowing users to interact with the system
through a web browser.

2.2.2 Features

• Routing: Maps URLs to Python functions, enabling clean and intuitive


URL management.

• Jinja2 Templating Engine: Allows dynamic HTML generation,


improving user interface capabilities.

• Session Management: Manages user sessions, preserving state between


different interactions.

• Blueprints: Supports organizing the application into modules, enhancing


maintainability.

Additional Features:

• RESTful Request Dispatching: Simplifies building RESTful APIs,


facilitating easy integration with other services.

• Debugging Tools: Built-in tools provide real-time feedback during


development, aiding in error detection.

2.2.3 Installation and Configuration

1. Install Flask: Activate the virtual environment and install Flask using the command pip install Flask.

2. Creating a Basic Flask Application: Initialize a new Python file, import Flask, create an application instance, and define a route for the home page (a minimal sketch follows this list).

3. Creating Templates: Establish a templates directory to store HTML files and render them using Flask’s rendering methods.

4. Run the Application: Use the command flask run to start the development server, allowing the application to be accessed through a web browser.
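
The sketch below shows the kind of minimal application these steps produce. It assumes Flask is installed and that a templates/index.html file exists; the /plan route and the exercise_type form field are illustrative placeholders rather than the project’s actual code.

# Minimal Flask sketch (illustrative; route names and form field are assumptions).
from flask import Flask, render_template, request

app = Flask(__name__)

@app.route("/")
def home():
    # Render a template stored in the templates/ directory.
    return render_template("index.html")

@app.route("/plan", methods=["POST"])
def plan():
    # Read a hypothetical form field carrying the user's chosen exercise.
    exercise = request.form.get("exercise_type", "push-up")
    return {"selected_exercise": exercise}

if __name__ == "__main__":
    # Equivalent to running `flask run` during development.
    app.run(debug=True)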

2.2.4 Application Structure

A typical Flask application is structured to enhance organization and


maintainability:

• app.py: Main application file where the Flask instance is created and
routes are defined.

• templates/: Directory for HTML templates used in rendering user


interfaces.

• static/: Folder for static files such as CSS, JavaScript, and images.

• blueprints/: For larger applications, organizing related routes and


functions into modular components.

2.2.5 Use Cases in the Project

Flask will facilitate:

• User Request Handling: Processing user input, such as workout


preferences and dietary requirements.

• Dynamic Content Rendering: Generating user-specific HTML content


based on input data.

• Session Management: Tracking user interactions and storing personalized


data for future reference.

11
2.2.6 Best Practices

• Use Blueprints: Organize routes into blueprints to keep the application modular and maintainable.

• Environment Variables: Store sensitive data, such as API keys, in environment variables rather than hard-coding them into the application (see the sketch after this list).

• Testing: Implement unit tests to ensure the functionality of individual components before integration.
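
As a brief illustration of the environment-variable practice above, the following sketch reads a secret from the environment instead of hard-coding it; SECRET_KEY here is an illustrative variable name, not part of the project’s actual configuration.

# Illustrative sketch: read configuration from the environment (SECRET_KEY is assumed).
import os
from flask import Flask

app = Flask(__name__)
# Fall back to a development-only default if the variable is not set.
app.config["SECRET_KEY"] = os.environ.get("SECRET_KEY", "dev-only-key")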

2.3 NumPy

2.3.1 Overview

NumPy is an essential library for numerical computing in Python, providing


support for large, multi-dimensional arrays and matrices. It also offers a variety
of mathematical functions to perform operations on these data structures
efficiently.

NumPy is essential for numerical data manipulation in the project, especially for
handling the multidimensional data structures required for machine learning
algorithms. It also provides functions for scientific computations.

In the project, NumPy is used to manage and process fitness data such as workout
patterns, physical measurements, and exercise statistics. These data structures are
then passed to machine learning models for pattern recognition and prediction of
personalized exercise routines.

2.3.2 Features

• N-dimensional Arrays: Facilitates efficient storage and manipulation of


large datasets.

• Mathematical Functions: A comprehensive collection of functions for
statistical, algebraic, and mathematical operations.

• Linear Algebra: Built-in functions for performing linear algebra


operations, making it ideal for scientific computing.

Additional Features:

• Random Number Generation: Methods for generating random numbers,


which are useful in simulations and statistical sampling.

• Broadcasting: Allows operations on arrays of different shapes, enabling


more flexible mathematical computations.

2.3.3 Installation and Configuration

1. Install NumPy: In the activated virtual environment, execute the


command pip install numpy to add NumPy to your project.

2. Basic Usage: Create a new Python file and import NumPy. Demonstrate
array creation, mathematical operations, and statistical functions.

2.3.4 Key Functions and Methods

• Array Creation: Use np.array() to create arrays from lists or tuples.

• Mathematical Operations: Employ functions like np.mean(), np.sum(), and np.dot() for computations.

• Array Manipulation: Utilize functions such as np.reshape() and np.concatenate() for modifying array shapes and merging arrays (a short sketch follows this list).
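
The following short sketch demonstrates these operations, together with broadcasting, on made-up workout numbers; the values and variable names are purely illustrative.

# Illustrative NumPy sketch with invented calorie values.
import numpy as np

calories = np.array([220, 310, 180, 260, 295, 205])  # array creation from a list

mean_calories = np.mean(calories)   # average value
total_calories = np.sum(calories)   # sum of all values

weekly = np.reshape(calories, (3, 2))                        # reshape into a 3x2 matrix
combined = np.concatenate([calories, np.array([240, 270])])  # merge two arrays
adjusted = calories * 1.1                                    # broadcasting: scale every element

print(mean_calories, total_calories, weekly.shape, combined.size, adjusted[0])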

2.3.5 Use Cases in the Project

In the AI Personal Trainer project, NumPy is vital for:

• Data Processing: Handling large datasets of user fitness and health metrics
efficiently.

• Calculating Metrics: Performing computations to derive insights, such as
average workout duration and calorie burn.

• Model Development: Providing the necessary array structures for machine


learning algorithms.

2.3.6 Performance Considerations

• Optimized for Performance: NumPy's array operations are significantly


faster than traditional Python lists due to its implementation in C.

• Memory Efficiency: NumPy uses a contiguous block of memory for its


arrays, reducing overhead and improving speed.

2.4 pandas

2.4.1 Overview

pandas is a powerful data analysis and manipulation library for Python. It


provides data structures and functions designed to work with structured data
seamlessly. It is particularly beneficial for tasks involving data preparation,
cleaning, and analysis.

pandas is employed to read, clean, and prepare fitness-related data. Whether the
data comes from user inputs or pre-existing datasets, pandas allows for efficient
handling and preprocessing of data for the machine learning models used in this
project.

For the AI personal trainer project, pandas loads datasets, applies filters to remove
outliers or erroneous data, and structures the data into formats required by the
machine learning algorithms. It also facilitates merging various data sources like
user inputs and exercise datasets to create comprehensive data for analysis.

14
2.4.2 Features

• DataFrame and Series: Two primary data structures that allow for easy
handling of heterogeneous data.

• Data Import/Export: Supports reading and writing data in various


formats, including CSV, Excel, and SQL databases.

• Data Cleaning and Preparation: Powerful tools for managing missing


data, filtering datasets, and transforming data into usable formats.

Additional Features:

• GroupBy Functionality: Allows for aggregating and transforming data


based on specific criteria.

• Time Series Support: Special functionalities for handling date and time
data efficiently, making it ideal for fitness tracking.

2.4.3 Installation and Configuration

1. Install pandas: Ensure the virtual environment is active and utilize a


package manager to install pandas with the command pip install pandas.

2. Basic Usage: Create a new file to demonstrate the creation of a DataFrame


from structured data, showcasing different types of columns and data types.

2.4.4 Key Functions and Methods

• DataFrame Creation: Use methods such as pd.DataFrame() to create DataFrames from dictionaries or arrays.

• Data Manipulation: Remove specified labels from DataFrames or fill in missing values using df.drop() and df.fillna().

• Aggregation and Grouping: Group data based on specific criteria and perform aggregate functions, such as df.groupby() (a short sketch follows this list).
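
The short sketch below demonstrates these methods on a small, invented fitness table; the column names and values are illustrative only.

# Illustrative pandas sketch with an invented workout log.
import pandas as pd

df = pd.DataFrame({
    "user": ["a", "a", "b", "b"],
    "exercise": ["push-up", "squat", "push-up", "sit-up"],
    "reps": [20, None, 15, 30],
})

df["reps"] = df["reps"].fillna(0)      # fill a missing value
df = df.drop(columns=["exercise"])     # drop a column that is not needed here

totals = df.groupby("user")["reps"].sum()  # total repetitions per user
print(totals)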

15
2.4.5 Use Cases in the Project

In the AI Personal Trainer project, pandas is essential for:

• Loading and Preprocessing User Data: Ensuring it is clean and


structured for analysis, making data ready for machine learning models.

• Performing Data Analysis: Generating personalized recommendations


based on user input and historical data through analysis of trends and
patterns.

• Tracking User Progress: Organizing data effectively for storage and


retrieval, enhancing user engagement by allowing users to visualize their
fitness journey.

2.4.6 Data Handling Best Practices

• Use Efficient Data Structures: Leverage DataFrames for heterogeneous


data to take advantage of pandas’ optimized performance.

• Implement Data Validation: Regularly check data integrity and


consistency to ensure accurate analytics.

• Profile Data Performance: Utilize pandas’ built-in functions to analyze


data processing speed and identify bottlenecks.

2.5 Web Browser

2.5.1 Overview

A modern web browser is crucial for accessing the AI Personal Trainer


application. Browsers interpret and render web content, enabling users to interact
with the application smoothly. They serve as the user interface for the application,
facilitating real-time communication between the user and the server.

The web browser is the platform through which users interact with the AI personal
trainer system. It displays the user interface created using front-end technologies
and communicates with the Flask backend to handle user inputs and return the
relevant outputs.

2.5.2 Features

• HTML5 Support: Facilitates rich multimedia content and supports


modern web applications through enhanced markup capabilities.

• CSS3 Support: Provides advanced styling capabilities for responsive web


design, allowing applications to adapt to various screen sizes.

• JavaScript Execution: Supports client-side scripting, enhancing


interactivity and user experience by allowing dynamic content updates
without refreshing the page.

Additional Features:

• Security Features: Browsers implement protocols like HTTPS to ensure


secure data transmission, protecting user privacy and data integrity.

• Developer Tools: Built-in tools for debugging, performance monitoring,


and testing web applications enhance the development process, allowing
developers to identify issues and optimize performance.

2.5.3 Supported Technologies

Modern browsers support various web technologies, enabling the AI Personal


Trainer application to function effectively:

• HTML/CSS: For structuring and styling web content, creating visually


appealing interfaces.

• JavaScript: For interactive features and asynchronous data requests,


enhancing user experience through real-time updates.

• Web APIs: For accessing browser features such as local storage,
geolocation, and notifications, improving application functionality.

2.5.4 Use Cases in the Project

In the AI Personal Trainer project, web browsers play a critical role in:

• Providing a User-Friendly Interface: Allowing users to navigate the


application easily and access various features, ensuring a seamless
experience.

• Enabling Real-Time Data Updates: Facilitating dynamic content


rendering based on user input and preferences, such as real-time workout
statistics and feedback.

• Ensuring Secure Interactions: Implementing secure connections to


enhance user trust and protect sensitive information, crucial for
applications handling personal data.

2.5.5 Browser Compatibility Considerations

• Cross-Browser Compatibility: Ensuring that the application performs


consistently across different browsers (Chrome, Firefox, Safari, Edge) is
vital for user experience.

• Responsive Design: Implementing responsive design practices ensures


that the application is accessible on various devices, including desktops,
tablets, and smartphones.

• Feature Detection: Use feature detection libraries to determine whether


specific features are supported in the user’s browser, providing fallbacks
when necessary.

18
2.6 PC:

2.6.1 Description

This section outlines the necessary hardware components required to run the
application and handle related tasks such as storing code, models, and data.

2.6.2 Specification

• PC Configuration:

o Pentium IV 1.7 GHz processor

o 128 MB RAM

o 40 GB HDD

o 15” Color Monitor

• Storage Space: Adequate space for storing the application code, trained
model, and any associated data files.

• Peripheral Devices: Keyboard and Mouse (input), Monitor (output)

2.6.3 Working

The system with the given specifications will support basic computational
tasks, application development, and data storage needs. The monitor provides
visual feedback, while the keyboard and mouse serve as input devices for user
interaction. The PC's storage is sufficient for handling lightweight applications
and datasets.

19
CHAPTER 3
WORKING

3.1 User Input Handling:


The project begins with handling user inputs, which are captured
through command-line arguments using the argparse library. The user specifies
the type of exercise they want to perform (e.g., push-up, pull-up) and the video
source (either a pre-recorded video or the webcam). This allows flexibility in
testing different exercises and using various input methods.
MediaPipe Integration for Pose Detection:
The project utilizes MediaPipe, a framework developed by Google for
building multimodal applied machine learning pipelines. Specifically, it
leverages the Pose module, which provides real-time body pose estimation.
MediaPipe detects key landmarks on the human body, such as joints and limbs,
which are essential for tracking exercise movements and calculating angles
between body parts.
Video Capture and Processing with OpenCV:
Using OpenCV, the project captures video input either from a file or a
webcam. It sets the video frame dimensions and processes each frame to
prepare it for pose detection. The frames are resized for consistency, and color
conversion is performed to switch from BGR to RGB format, which is required
by MediaPipe for pose processing.
Exercise Tracking Logic:
The core functionality of the project revolves around tracking various
exercises based on the detected body landmarks. This involves several specific
methods for each type of exercise:

• Push-Up Tracking: The project calculates the angle of the arms to
determine the push-up movement. When the average angle of the arms
indicates a downward position, the counter increases.
• Pull-Up Monitoring: It tracks the position of the nose relative to the
average position of the elbows to count pull-ups. The counter
increments when the nose rises above a certain threshold.
• Squat Assessment: By analyzing the angles of the legs, the project can
count squats based on the angle of the legs reaching below a specified
limit.
• Walking Detection: The system counts steps based on the position of the
knees. When one knee crosses the position of the other, the counter
increments.
• Sit-Up Evaluation: The angle of the abdomen is monitored to assess sit-
ups. The counter increases when the angle falls below a specific
threshold.
User Feedback and Display:
To enhance user experience, the project provides real-time feedback on
the screen. The current exercise type, count of repetitions, and status (e.g.,
active/inactive) are displayed using OpenCV’s putText function. This helps
users visually track their performance during the exercise session.
Supporting Classes and Functions:
Several supporting classes and functions help organize the code:
• BodyPartAngle Class: This class encapsulates methods to calculate
angles between various body parts, which is crucial for tracking exercise
form.
• Utility Functions: Functions like calculate_angle, detection_body_part,
and score_table streamline the detection of body parts, calculation of
angles, and display of user feedback.

21
Real-Time Performance Monitoring:
The system continuously monitors the user's performance in real time.
As the user exercises, the program evaluates their movements, counts
repetitions, and provides instant feedback on their form and progress. This
immediate response is critical for ensuring users maintain proper technique
and encourages them to achieve their fitness goals.

3.2 Block Diagram

3.3 Source Code

main.py:

import cv2
import argparse
from utils import *
import mediapipe as mp
from body_part_angle import BodyPartAngle
from types_of_exercise import TypeOfExercise

# Parse command-line arguments: the exercise type and an optional video source.
ap = argparse.ArgumentParser()
ap.add_argument("-t",
                "--exercise_type",
                type=str,
                help='Type of activity to do',
                required=True)
ap.add_argument("-vs",
                "--video_source",
                type=str,
                help='Video file to use (webcam is used if omitted)',
                required=False)
args = vars(ap.parse_args())

mp_drawing = mp.solutions.drawing_utils
mp_pose = mp.solutions.pose

# Open either the given video file or the default webcam.
if args["video_source"] is not None:
    cap = cv2.VideoCapture("Exercise Videos/" + args["video_source"])
else:
    cap = cv2.VideoCapture(0)  # change the index here to use another webcam

cap.set(3, 800)  # width
cap.set(4, 480)  # height

# set up MediaPipe pose estimation
with mp_pose.Pose(min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    counter = 0  # repetition count of the exercise
    status = True  # state of the movement
    while cap.isOpened():
        ret, frame = cap.read()
        if not ret:  # stop when the video ends or no frame is returned
            break
        frame = cv2.resize(frame, (800, 480), interpolation=cv2.INTER_AREA)
        # recolor frame to RGB
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        frame.flags.writeable = False
        # make detection
        results = pose.process(frame)
        # recolor back to BGR
        frame.flags.writeable = True
        frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)
        try:
            landmarks = results.pose_landmarks.landmark
            counter, status = TypeOfExercise(landmarks).calculate_exercise(
                args["exercise_type"], counter, status)
        except Exception:
            # no landmarks detected in this frame; keep the previous count
            pass

        frame = score_table(args["exercise_type"], frame, counter, status)

        # render detections (for landmarks)
        mp_drawing.draw_landmarks(
            frame,
            results.pose_landmarks,
            mp_pose.POSE_CONNECTIONS,
            mp_drawing.DrawingSpec(color=(255, 255, 255),
                                   thickness=2,
                                   circle_radius=2),
            mp_drawing.DrawingSpec(color=(174, 139, 45),
                                   thickness=2,
                                   circle_radius=2),
        )
        cv2.imshow('Video', frame)
        if cv2.waitKey(10) & 0xFF == ord('q'):
            break

cap.release()
cv2.destroyAllWindows()
types_of_exercise.py:

import numpy as np
from body_part_angle import BodyPartAngle
from utils import *


class TypeOfExercise(BodyPartAngle):
    def __init__(self, landmarks):
        super().__init__(landmarks)

    def push_up(self, counter, status):
        # Count a repetition when both arms bend below the threshold,
        # then wait for the arms to extend again.
        left_arm_angle = self.angle_of_the_left_arm()
        right_arm_angle = self.angle_of_the_right_arm()
        avg_arm_angle = (left_arm_angle + right_arm_angle) // 2

        if status:
            if avg_arm_angle < 70:
                counter += 1
                status = False
        else:
            if avg_arm_angle > 160:
                status = True
        return [counter, status]

    def pull_up(self, counter, status):
        # Compare the vertical position of the nose with the average
        # position of the elbows.
        nose = detection_body_part(self.landmarks, "NOSE")
        left_elbow = detection_body_part(self.landmarks, "LEFT_ELBOW")
        right_elbow = detection_body_part(self.landmarks, "RIGHT_ELBOW")
        avg_elbow_y = (left_elbow[1] + right_elbow[1]) / 2

        if status:
            if nose[1] > avg_elbow_y:
                counter += 1
                status = False
        else:
            if nose[1] < avg_elbow_y:
                status = True
        return [counter, status]

    def squat(self, counter, status):
        left_leg_angle = self.angle_of_the_left_leg()
        right_leg_angle = self.angle_of_the_right_leg()
        avg_leg_angle = (left_leg_angle + right_leg_angle) // 2

        if status:
            if avg_leg_angle < 70:
                counter += 1
                status = False
        else:
            if avg_leg_angle > 160:
                status = True
        return [counter, status]

    def walk(self, counter, status):
        # Count a step each time one knee crosses the other.
        right_knee = detection_body_part(self.landmarks, "RIGHT_KNEE")
        left_knee = detection_body_part(self.landmarks, "LEFT_KNEE")

        if status:
            if left_knee[0] > right_knee[0]:
                counter += 1
                status = False
        else:
            if left_knee[0] < right_knee[0]:
                counter += 1
                status = True
        return [counter, status]

    def sit_up(self, counter, status):
        angle = self.angle_of_the_abdomen()
        if status:
            if angle < 55:
                counter += 1
                status = False
        else:
            if angle > 105:
                status = True
        return [counter, status]

    def calculate_exercise(self, exercise_type, counter, status):
        # Dispatch to the tracker that matches the selected exercise type.
        if exercise_type == "push-up":
            counter, status = self.push_up(counter, status)
        elif exercise_type == "pull-up":
            counter, status = self.pull_up(counter, status)
        elif exercise_type == "squat":
            counter, status = self.squat(counter, status)
        elif exercise_type == "walk":
            counter, status = self.walk(counter, status)
        elif exercise_type == "sit-up":
            counter, status = self.sit_up(counter, status)
        return [counter, status]

utils.py:

import mediapipe as mp
import pandas as pd
import numpy as np
import cv2

mp_pose = mp.solutions.pose


def calculate_angle(a, b, c):
    # Angle (in degrees) at point b formed by the segments b-a and b-c.
    a = np.array(a)
    b = np.array(b)
    c = np.array(c)
    radians = np.arctan2(c[1] - b[1], c[0] - b[0]) - \
        np.arctan2(a[1] - b[1], a[0] - b[0])
    angle = np.abs(radians * 180.0 / np.pi)
    if angle > 180.0:
        angle = 360 - angle
    return angle


def detection_body_part(landmarks, body_part_name):
    # Return the x, y coordinates and visibility score of a named landmark.
    return [
        landmarks[mp_pose.PoseLandmark[body_part_name].value].x,
        landmarks[mp_pose.PoseLandmark[body_part_name].value].y,
        landmarks[mp_pose.PoseLandmark[body_part_name].value].visibility
    ]


def detection_body_parts(landmarks):
    body_parts = pd.DataFrame(columns=["body_part", "x", "y"])
    for i, lndmrk in enumerate(mp_pose.PoseLandmark):
        lndmrk = str(lndmrk).split(".")[1]
        cord = detection_body_part(landmarks, lndmrk)
        body_parts.loc[i] = lndmrk, cord[0], cord[1]
    return body_parts


def score_table(exercise, frame, counter, status):
    # Overlay the activity name, repetition count, and status on the frame.
    cv2.putText(frame, "Activity : " + exercise.replace("-", " "),
                (10, 65), cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2,
                cv2.LINE_AA)
    cv2.putText(frame, "Counter : " + str(counter), (10, 100),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2,
                cv2.LINE_AA)
    cv2.putText(frame, "Status : " + str(status), (10, 135),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2,
                cv2.LINE_AA)
    return frame
body_part_angle.py:

import mediapipe as mp
import pandas as pd
import numpy as np
import cv2
from utils import *


class BodyPartAngle:
    def __init__(self, landmarks):
        self.landmarks = landmarks

    def angle_of_the_left_arm(self):
        l_shoulder = detection_body_part(self.landmarks, "LEFT_SHOULDER")
        l_elbow = detection_body_part(self.landmarks, "LEFT_ELBOW")
        l_wrist = detection_body_part(self.landmarks, "LEFT_WRIST")
        return calculate_angle(l_shoulder, l_elbow, l_wrist)

    def angle_of_the_right_arm(self):
        r_shoulder = detection_body_part(self.landmarks, "RIGHT_SHOULDER")
        r_elbow = detection_body_part(self.landmarks, "RIGHT_ELBOW")
        r_wrist = detection_body_part(self.landmarks, "RIGHT_WRIST")
        return calculate_angle(r_shoulder, r_elbow, r_wrist)

    def angle_of_the_left_leg(self):
        l_hip = detection_body_part(self.landmarks, "LEFT_HIP")
        l_knee = detection_body_part(self.landmarks, "LEFT_KNEE")
        l_ankle = detection_body_part(self.landmarks, "LEFT_ANKLE")
        return calculate_angle(l_hip, l_knee, l_ankle)

    def angle_of_the_right_leg(self):
        r_hip = detection_body_part(self.landmarks, "RIGHT_HIP")
        r_knee = detection_body_part(self.landmarks, "RIGHT_KNEE")
        r_ankle = detection_body_part(self.landmarks, "RIGHT_ANKLE")
        return calculate_angle(r_hip, r_knee, r_ankle)

    def angle_of_the_neck(self):
        r_shoulder = detection_body_part(self.landmarks, "RIGHT_SHOULDER")
        l_shoulder = detection_body_part(self.landmarks, "LEFT_SHOULDER")
        r_mouth = detection_body_part(self.landmarks, "MOUTH_RIGHT")
        l_mouth = detection_body_part(self.landmarks, "MOUTH_LEFT")
        r_hip = detection_body_part(self.landmarks, "RIGHT_HIP")
        l_hip = detection_body_part(self.landmarks, "LEFT_HIP")

        shoulder_avg = [(r_shoulder[0] + l_shoulder[0]) / 2,
                        (r_shoulder[1] + l_shoulder[1]) / 2]
        mouth_avg = [(r_mouth[0] + l_mouth[0]) / 2,
                     (r_mouth[1] + l_mouth[1]) / 2]
        hip_avg = [(r_hip[0] + l_hip[0]) / 2, (r_hip[1] + l_hip[1]) / 2]

        return abs(180 - calculate_angle(mouth_avg, shoulder_avg, hip_avg))

    def angle_of_the_abdomen(self):
        # calculate the average shoulder position
        r_shoulder = detection_body_part(self.landmarks, "RIGHT_SHOULDER")
        l_shoulder = detection_body_part(self.landmarks, "LEFT_SHOULDER")
        shoulder_avg = [(r_shoulder[0] + l_shoulder[0]) / 2,
                        (r_shoulder[1] + l_shoulder[1]) / 2]
        # calculate the average hip position
        r_hip = detection_body_part(self.landmarks, "RIGHT_HIP")
        l_hip = detection_body_part(self.landmarks, "LEFT_HIP")
        hip_avg = [(r_hip[0] + l_hip[0]) / 2, (r_hip[1] + l_hip[1]) / 2]
        # calculate the average knee position
        r_knee = detection_body_part(self.landmarks, "RIGHT_KNEE")
        l_knee = detection_body_part(self.landmarks, "LEFT_KNEE")
        knee_avg = [(r_knee[0] + l_knee[0]) / 2, (r_knee[1] + l_knee[1]) / 2]

        return calculate_angle(shoulder_avg, hip_avg, knee_avg)

3.4 Output

PULL UP:

Fig : 3.4.1

32
SIT UP:

Fig : 3.4.2

PUSH UP:

Fig : 3.4.3

33
WALK:

Fig : 3.4.4

34
CHAPTER 4
CONCLUSION

The Fit Exercise project is an innovative initiative that transforms the


fitness industry by leveraging advanced artificial intelligence (AI) technologies
to provide personalized guidance and support for users' fitness journeys.
Utilizing sophisticated pose estimation algorithms, the software accurately
tracks and analyzes user movements during workouts, identifying various
exercises in real-time and delivering instant feedback and tailored
recommendations. Through an intuitive user interface, Fit Exercise facilitates
goal setting and tracking while prioritizing data privacy and security. By
monitoring six different exercises and employing machine learning algorithms
to assess posture correctness from labeled input videos, the application helps
maintain proper form, educates users on exercise routines, and prevents injuries.
Additionally, the performance of the random forest classifier is evaluated using
confusion matrices to determine precision and recall for each exercise, further
enhancing the accuracy of the assessments.
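
As an illustration of the evaluation described above, the following sketch shows how a random forest classifier could be scored with a confusion matrix and per-class precision and recall using scikit-learn; the generated data is a placeholder standing in for the project's labeled pose features, not the actual dataset.

# Illustrative evaluation sketch; the generated data is a placeholder for pose features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix, classification_report
from sklearn.model_selection import train_test_split

# Placeholder features (X) and exercise-form labels (y).
X, y = make_classification(n_samples=300, n_features=10, n_classes=3,
                           n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
y_pred = clf.predict(X_test)

print(confusion_matrix(y_test, y_pred))        # confusion matrix
print(classification_report(y_test, y_pred))   # per-class precision and recall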

35
REFERENCES

[1] S. Jin, L. Xu, J. Xu, C. Wang, W. Liu, C. Qian, W. Ouyang and P. Luo, “Whole-
Body Human Pose Estimation in the Wild”, In book: Computer Vision – ECCV
2020 (pp.196-214).

[2] G. Taware, R. Agarwal, P. Dhende, P. Jondhalekar and Prof. S. Hule, “AI


Based Workout Assistant and Fitness Guide”, International Journal of
Engineering Research & Technology (IJERT) Vol. 10 Issue 11, November-2021.

[3] F. Zhang, V. Bazarevsky, A. Vakunov, A. Tkachenka, G. Sung, C.L. Chang
and M. Grundmann, “MediaPipe Hands: On-device Real-time Hand Tracking”,
ArXiv, 2020.

[4] Y. Kartynnik, A. Ablavatski, I. Grishchenko, and M. Grundmann, “Real-time
Facial Surface Geometry from Monocular Video on Mobile GPUs”, IEEE CVPR
Workshop on Computer Vision for Augmented and Virtual Reality, July 2019.

[5] S. Kreiss, L. Bertoni and A. Alahi, “PifPaf: Composite Fields for Human Pose
Estimation”, IEEE/CVF Conference on Computer Vision and Pattern
Recognition (CVPR), June 2019.

[6] M. Eichner, M. Marin-Jimenez, A. Zisserman, and V. Ferrari, “2d articulated


human pose estimation and retrieval in (almost) unconstrained still images”,
International Journal of Computer Vision Vol. 99, September 2012.

[7] V. Bazarevsky, I. Grishchenko, K. Raveendran, T. Zhu, F. Zhang and M.


Grundmann, “BlazePose: On-device Real-time Body Pose tracking”, ArXiv, June
2020.

36
