MINI PROJECT REPORT
Submitted by
SHRIKAVITHA S 727721EUEC144
SWETHA PJ 727721EUEC163
of
BACHELOR OF ENGINEERING
IN
The Sustainable Development Goals (SDGs) are a collection of 17 global goals that serve as a blueprint for achieving a better and more sustainable future for all. Set by the United Nations General Assembly in 2015 and intended to be achieved by 2030, the SDGs were adopted by 195 nations as a shared plan to change the world for the better. This project is based on one of these 17 goals.
Which SDG does the project directly address?
SDG 3 – Good Health and Well-being.

What strategies or actions are being implemented to achieve these goals?
Developing personalized fitness plans based on user data (age, weight, fitness level).

How were these goals identified as relevant to the project’s objectives?
Increasing public awareness of fitness and physical well-being fits the SDG target of reducing premature mortality and promoting healthier lifestyles.

Are there any partnerships or collaborations in place to enhance this impact?
Collaboration with fitness professionals to design accurate exercise routines.
BONAFIDE CERTIFICATE
Certified that this project report “AI PERSONAL TRAINER” is the bonafide work of SHRIKAVITHA S (727721EUEC144) and SWETHA PJ (727721EUEC163), who carried out the project work under my supervision.
SIGNATURE SIGNATURE
TABLE OF CONTENTS
ABSTRACT viii
LIST OF FIGURES vii
1. INTRODUCTION 2
2. COMPONENTS AND SPECIFICATION 7
2.1 Python 7
2.1.1 Overview 7
2.1.2 Features 7
2.2 Flask 9
2.2.1 Overview 9
2.2.2 Features 10
2.3 NumPy 12
2.3.1 Overview 12
2.3.2 Features 12
2.4 pandas 14
2.4.1 Overview 14
2.4.2 Features 15
2.5 Web Browser 16
2.5.1 Overview 16
2.5.2 Features 17
2.6 PC 19
2.6.1 Description 19
2.6.2 Specification 19
2.6.3 Working 19
3. WORKING 20
4. CONCLUSION 34
REFERENCES 35
LIST OF FIGURES
3.4.1 PULL UP 32
3.4.2 SIT UP 33
3.4.3 PUSH UP 33
3.4.4 WALK 34
ABSTRACT
CHAPTER 1
INTRODUCTION
provides immediate feedback on form correction, helping users improve their
technique, maximize performance, and minimize the risk of injury.
Fit Exercise stands at the forefront of a new era in fitness, where artificial
intelligence transforms traditional workout routines into personalized, high-
performance training programs. By leveraging the latest advancements in AI and
human pose estimation technology, our solution empowers users to take control
of their fitness journey with precision, flexibility, and confidence. With Fit
Exercise, the future of fitness is not only smarter but more inclusive, accessible,
and effective for all.
movements and performance. This creates a gap between the user’s needs for
precise, real-time correction and the current offerings in the fitness market.
1.2 Overview:
to revolutionize how individuals approach their fitness routines, promoting
healthier lifestyles through accessible, effective, and data-driven fitness
solutions.
1.3 Objective:
By processing real-time video feeds from the user's device, the trainer will
assess whether movements align with recommended techniques. If discrepancies
or improper postures are detected, the AI will generate immediate, actionable
feedback to help the user correct their form. This feature aims to enhance workout
effectiveness while minimizing the risk of injury due to poor exercise execution.
Ultimately, this objective empowers users to engage in safer and more productive
fitness practices, leading to improved overall performance and results.
CHAPTER 2
COMPONENTS AND SPECIFICATION
2.1 Python
2.1.1 Overview
Python is widely used for web development, automation, and data science due
to its simple syntax and extensive support for libraries and frameworks. Python
3.x introduced significant improvements over previous versions, including better
Unicode support, enhanced library functions, and syntax changes to simplify
coding practices.
2.1.2 Features
• Dynamic Typing: Enables developers to write code quickly without
specifying variable types, enhancing flexibility.
Key Features:
1. Download Python: Visit the official Python website and download the
latest version for your operating system.
Dependency Management
Utilize a requirements.txt file to specify all necessary libraries, allowing for easy
installation via a package manager. The command pip install -r requirements.txt
installs all dependencies listed.
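A sketch of such a requirements.txt for this project, assuming only the libraries discussed in this chapter (exact versions are illustrative and would need to be pinned during development):

flask
numpy
pandas
mediapipe
opencv-python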
2.1.5 Use Cases in the Project
Python will serve as the core programming language, enabling:
• Machine Learning Model Development: Implementing algorithms that
analyze user data to provide tailored fitness advice.
• Web Application Development: Using Flask to create a user-friendly
interface.
• Data Processing: Employing NumPy and pandas for efficient data
manipulation and analysis.
2.1.6 Example Applications
Python has been utilized in numerous applications across various domains,
including:
• Healthcare: Data analysis and predictive modeling for patient care
optimization.
• Finance: Automated trading systems and risk assessment models.
• Education: Development of interactive learning tools and assessment
platforms.
2.2 Flask
2.2.1 Overview
Flask is a micro web framework for Python designed for building web
applications quickly and with minimal boilerplate code. It is particularly well-
suited for small to medium-sized applications and APIs due to its simplicity and
flexibility.
Flask is used to build the user-facing side of the AI personal trainer system. Its
lightweight architecture and simplicity make it ideal for this project, where it
handles HTTP requests and routes, allowing users to interact with the system
through a web browser.
2.2.2 Features
Additional Features:
1. Install Flask: Activate the virtual environment and use a package manager
to install Flask using the command pip install Flask.
3. Creating Templates: Establish a templates directory to store HTML files
and render them using Flask’s rendering methods.
4. Run the Application: Use the command flask run to start the development
server, allowing the application to be accessed through a web browser.
• app.py: Main application file where the Flask instance is created and
routes are defined.
• static/: Folder for static files such as CSS, JavaScript, and images.
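To illustrate the structure described above, a minimal app.py might look like the sketch below; the route and the template name (index.html) are placeholders for illustration rather than the project's actual code.

from flask import Flask, render_template

app = Flask(__name__)

@app.route("/")
def index():
    # render templates/index.html from the templates directory
    return render_template("index.html")

if __name__ == "__main__":
    # equivalent to starting the development server with `flask run`
    app.run(debug=True)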
2.2.6 Best Practices
2.3 NumPy
2.3.1 Overview
NumPy is essential for numerical data manipulation in the project, especially for
handling the multidimensional data structures required for machine learning
algorithms. It also provides functions for scientific computations.
In the project, NumPy is used to manage and process fitness data such as workout
patterns, physical measurements, and exercise statistics. These data structures are
then passed to machine learning models for pattern recognition and prediction of
personalized exercise routines.
2.3.2 Features
• Mathematical Functions: A comprehensive collection of functions for
statistical, algebraic, and mathematical operations.
Additional Features:
2. Basic Usage: Create a new Python file and import NumPy. Demonstrate
array creation, mathematical operations, and statistical functions.
• Data Processing: Handling large datasets of user fitness and health metrics
efficiently.
• Calculating Metrics: Performing computations to derive insights, such as
average workout duration and calorie burn.
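As a small sketch of how such metrics could be derived with NumPy (the sample values below are made up for illustration):

import numpy as np

# hypothetical per-session workout durations (minutes) and calories burned
durations = np.array([30, 45, 25, 60, 40])
calories = np.array([220, 350, 180, 480, 300])

avg_duration = np.mean(durations)           # average workout duration
total_calories = np.sum(calories)           # total calories burned
calories_per_minute = calories / durations  # element-wise rate per session

print(avg_duration, total_calories, calories_per_minute.round(2))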
2.4 pandas
2.4.1 Overview
pandas is employed to read, clean, and prepare fitness-related data. Whether the
data comes from user inputs or pre-existing datasets, pandas allows for efficient
handling and preprocessing of data for the machine learning models used in this
project.
For the AI personal trainer project, pandas loads datasets, applies filters to remove
outliers or erroneous data, and structures the data into formats required by the
machine learning algorithms. It also facilitates merging various data sources like
user inputs and exercise datasets to create comprehensive data for analysis.
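A minimal sketch of this workflow is shown below; the file names and column names are assumptions for illustration, not taken from the project.

import pandas as pd

# load a hypothetical user workout log
workouts = pd.read_csv("workout_log.csv")    # columns: user_id, exercise, reps, duration_min

# drop rows with missing values and filter out obviously erroneous entries
workouts = workouts.dropna()
workouts = workouts[(workouts["reps"] > 0) & (workouts["duration_min"] < 300)]

# merge with a hypothetical user-profile table on user_id
profiles = pd.read_csv("user_profiles.csv")  # columns: user_id, age, weight_kg
data = workouts.merge(profiles, on="user_id", how="inner")

print(data.head())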
2.4.2 Features
• DataFrame and Series: Two primary data structures that allow for easy
handling of heterogeneous data.
Additional Features:
• Time Series Support: Special functionalities for handling date and time
data efficiently, making it ideal for fitness tracking.
2.4.5 Use Cases in the Project
2.5 Web Browser
2.5.1 Overview
The web browser is the platform through which users interact with the AI personal
trainer system. It displays the user interface created using front-end technologies
and communicates with the Flask backend to handle user inputs and return the
relevant outputs.
2.5.2 Features
Additional Features:
• Web APIs: For accessing browser features such as local storage,
geolocation, and notifications, improving application functionality.
In the AI Personal Trainer project, web browsers play a critical role in:
2.6 PC:
2.6.1 Description
This section outlines the necessary hardware components required to run the
application and handle related tasks such as storing code, models, and data.
2.6.2 Specification
• PC Configuration:
o 128 MB RAM
o 40 GB HDD
• Storage Space: Adequate space for storing the application code, trained
model, and any associated data files.
2.6.3 Working
The system with the given specifications will support basic computational
tasks, application development, and data storage needs. The monitor provides
visual feedback, while the keyboard and mouse serve as input devices for user
interaction. The PC's storage is sufficient for handling lightweight applications
and datasets.
CHAPTER 3
WORKING
• Push-Up Tracking: The project calculates the angle of the arms to
determine the push-up movement. When the average angle of the arms
indicates a downward position, the counter increases.
• Pull-Up Monitoring: It tracks the position of the nose relative to the
average position of the elbows to count pull-ups. The counter
increments when the nose rises above a certain threshold.
• Squat Assessment: By analyzing the knee angles of both legs, the project counts a squat whenever the average leg angle drops below a specified limit.
• Walking Detection: The system counts steps based on the position of the
knees. When one knee crosses the position of the other, the counter
increments.
• Sit-Up Evaluation: The angle of the abdomen is monitored to assess sit-
ups. The counter increases when the angle falls below a specific
threshold.
User Feedback and Display:
To enhance user experience, the project provides real-time feedback on
the screen. The current exercise type, count of repetitions, and status (e.g.,
active/inactive) are displayed using OpenCV’s putText function. This helps
users visually track their performance during the exercise session.
Supporting Classes and Functions:
Several supporting classes and functions help organize the code
• BodyPartAngle Class: This class encapsulates methods to calculate
angles between various body parts, which is crucial for tracking exercise
form.
• Utility Functions: Functions like calculate_angle, detection_body_part,
and score_table streamline the detection of body parts, calculation of
angles, and display of user feedback.
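One possible sketch of such a score_table helper, with assumed text positions, font, and colours (only the overlay behaviour itself is described in this report):

import cv2

def score_table(exercise, frame, counter, status):
    # overlay the current exercise name, repetition count, and movement status
    cv2.putText(frame, "Activity: " + exercise, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2, cv2.LINE_AA)
    cv2.putText(frame, "Counter: " + str(counter), (10, 60),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2, cv2.LINE_AA)
    cv2.putText(frame, "Status: " + str(status), (10, 90),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2, cv2.LINE_AA)
    return frame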
Real-Time Performance Monitoring:
The system continuously monitors the user's performance in real time.
As the user exercises, the program evaluates their movements, counts
repetitions, and provides instant feedback on their form and progress. This
immediate response is critical for ensuring users maintain proper technique
and encourages them to achieve their fitness goals.
Main.py:
import cv2
import argparse
from utils import *
import mediapipe as mp
from body_part_angle import BodyPartAngle
from types_of_exercise import TypeOfExercise

# parse command-line arguments
ap = argparse.ArgumentParser()
ap.add_argument("-t",
                "--exercise_type",
                type=str,
                help='Type of activity to do',
                required=True)
ap.add_argument("-vs",
                "--video_source",
                type=str,
                help='Video file to analyse (defaults to the webcam)',
                required=False)
args = vars(ap.parse_args())

mp_drawing = mp.solutions.drawing_utils
mp_pose = mp.solutions.pose

if args["video_source"] is not None:
    cap = cv2.VideoCapture("Exercise Videos/" + args["video_source"])
else:
    cap = cv2.VideoCapture(0)  # change the index here to use a different webcam

cap.set(3, 800)  # width
cap.set(4, 480)  # height

# setup mediapipe
with mp_pose.Pose(min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    counter = 0  # movement of exercise
    status = True  # state of move
    while cap.isOpened():
        ret, frame = cap.read()
        frame = cv2.resize(frame, (800, 480), interpolation=cv2.INTER_AREA)
        # recolor frame to RGB
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        frame.flags.writeable = False
        # make detection
        results = pose.process(frame)
        # recolor back to BGR
        frame.flags.writeable = True
        frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)
        try:
            landmarks = results.pose_landmarks.landmark
            counter, status = TypeOfExercise(landmarks).calculate_exercise(
                args["exercise_type"], counter, status)
        except:
            pass
        # draw the detected pose landmarks and their connections on the frame
        mp_drawing.draw_landmarks(
            frame,
            results.pose_landmarks,
            mp_pose.POSE_CONNECTIONS,
            mp_drawing.DrawingSpec(color=(255, 255, 255),
                                   thickness=2,
                                   circle_radius=2),
            mp_drawing.DrawingSpec(color=(174, 139, 45),
                                   thickness=2,
                                   circle_radius=2),
        )
        cv2.imshow('Video', frame)
        if cv2.waitKey(10) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()
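Based on the arguments defined above, the script can be started from the command line, for example (the video file name below is only a placeholder):

# count push-ups from the webcam
python Main.py -t push-up

# count squats from a clip placed in the "Exercise Videos" folder
python Main.py -t squat -vs squat_demo.mp4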
types_of_exercise.py:
import numpy as np
from body_part_angle import BodyPartAngle
from utils import *

class TypeOfExercise(BodyPartAngle):
    def __init__(self, landmarks):
        super().__init__(landmarks)

    def push_up(self, counter, status):
        left_arm_angle = self.angle_of_the_left_arm()
        right_arm_angle = self.angle_of_the_right_arm()
        avg_arm_angle = (left_arm_angle + right_arm_angle) // 2
        if status:
            if avg_arm_angle < 70:
                counter += 1
                status = False
        else:
            if avg_arm_angle > 160:
                status = True
        return [counter, status]

    def pull_up(self, counter, status):
        nose = detection_body_part(self.landmarks, "NOSE")
        left_elbow = detection_body_part(self.landmarks, "LEFT_ELBOW")
        right_elbow = detection_body_part(self.landmarks, "RIGHT_ELBOW")
        avg_elbow_y = (left_elbow[1] + right_elbow[1]) / 2
        if status:
            if nose[1] > avg_elbow_y:
                counter += 1
                status = False
        else:
            if nose[1] < avg_elbow_y:
                status = True
        return [counter, status]

    def squat(self, counter, status):
        left_leg_angle = self.angle_of_the_left_leg()
        right_leg_angle = self.angle_of_the_right_leg()
        avg_leg_angle = (left_leg_angle + right_leg_angle) // 2
        if status:
            if avg_leg_angle < 70:
                counter += 1
                status = False
        else:
            if avg_leg_angle > 160:
                status = True
        return [counter, status]

    def walk(self, counter, status):
        right_knee = detection_body_part(self.landmarks, "RIGHT_KNEE")
        left_knee = detection_body_part(self.landmarks, "LEFT_KNEE")
        if status:
            if left_knee[0] > right_knee[0]:
                counter += 1
                status = False
        else:
            if left_knee[0] < right_knee[0]:
                counter += 1
                status = True
        return [counter, status]

    def sit_up(self, counter, status):
        angle = self.angle_of_the_abdomen()
        if status:
            if angle < 55:
                counter += 1
                status = False
        else:
            if angle > 105:
                status = True
        return [counter, status]

    def calculate_exercise(self, exercise_type, counter, status):
        if exercise_type == "push-up":
            counter, status = self.push_up(counter, status)
        elif exercise_type == "pull-up":
            counter, status = self.pull_up(counter, status)
        elif exercise_type == "squat":
            counter, status = self.squat(counter, status)
        elif exercise_type == "walk":
            counter, status = self.walk(counter, status)
        elif exercise_type == "sit-up":
            counter, status = self.sit_up(counter, status)
        return [counter, status]
utils.py:
import mediapipe as mp
import pandas as pd
import numpy as np
import cv2

mp_pose = mp.solutions.pose

def calculate_angle(a, b, c):
    # angle (in degrees) at point b formed by the segments b-a and b-c
    a = np.array(a)
    b = np.array(b)
    c = np.array(c)
    radians = np.arctan2(c[1] - b[1], c[0] - b[0]) - \
              np.arctan2(a[1] - b[1], a[0] - b[0])
    angle = np.abs(radians * 180.0 / np.pi)
    if angle > 180.0:
        angle = 360 - angle
    return angle

def detection_body_part(landmarks, body_part_name):
    # return [x, y, visibility] for the named pose landmark
    return [
        landmarks[mp_pose.PoseLandmark[body_part_name].value].x,
        landmarks[mp_pose.PoseLandmark[body_part_name].value].y,
        landmarks[mp_pose.PoseLandmark[body_part_name].value].visibility
    ]

def detection_body_parts(landmarks):
    # collect the x/y coordinates of every landmark into a DataFrame
    body_parts = pd.DataFrame(columns=["body_part", "x", "y"])
    for i, lndmrk in enumerate(mp_pose.PoseLandmark):
        lndmrk = str(lndmrk).split(".")[1]
        cord = detection_body_part(landmarks, lndmrk)
        body_parts.loc[i] = lndmrk, cord[0], cord[1]
    return body_parts
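As a quick check of calculate_angle, the coordinates below are made-up normalized landmark positions forming a right angle at the elbow:

shoulder = [0.50, 0.30]
elbow = [0.50, 0.50]
wrist = [0.70, 0.50]
print(calculate_angle(shoulder, elbow, wrist))  # approximately 90.0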
body_part_angle.py:
import pandas as pd
import numpy as np
import cv2
from utils import *
class BodyPartAngle:
    def __init__(self, landmarks):
        self.landmarks = landmarks

    def angle_of_the_left_arm(self):
        l_shoulder = detection_body_part(self.landmarks, "LEFT_SHOULDER")
        l_elbow = detection_body_part(self.landmarks, "LEFT_ELBOW")
        l_wrist = detection_body_part(self.landmarks, "LEFT_WRIST")
        return calculate_angle(l_shoulder, l_elbow, l_wrist)

    def angle_of_the_right_arm(self):
        r_shoulder = detection_body_part(self.landmarks, "RIGHT_SHOULDER")
        r_elbow = detection_body_part(self.landmarks, "RIGHT_ELBOW")
        r_wrist = detection_body_part(self.landmarks, "RIGHT_WRIST")
        return calculate_angle(r_shoulder, r_elbow, r_wrist)

    def angle_of_the_left_leg(self):
        l_hip = detection_body_part(self.landmarks, "LEFT_HIP")
        l_knee = detection_body_part(self.landmarks, "LEFT_KNEE")
        l_ankle = detection_body_part(self.landmarks, "LEFT_ANKLE")
        return calculate_angle(l_hip, l_knee, l_ankle)

    def angle_of_the_right_leg(self):
        r_hip = detection_body_part(self.landmarks, "RIGHT_HIP")
        r_knee = detection_body_part(self.landmarks, "RIGHT_KNEE")
        r_ankle = detection_body_part(self.landmarks, "RIGHT_ANKLE")
        return calculate_angle(r_hip, r_knee, r_ankle)

    def angle_of_the_abdomen(self):
        # calculate the average shoulder position
        r_shoulder = detection_body_part(self.landmarks, "RIGHT_SHOULDER")
        l_shoulder = detection_body_part(self.landmarks, "LEFT_SHOULDER")
        shoulder_avg = [(r_shoulder[0] + l_shoulder[0]) / 2,
                        (r_shoulder[1] + l_shoulder[1]) / 2]
        # calculate the average hip position
        r_hip = detection_body_part(self.landmarks, "RIGHT_HIP")
        l_hip = detection_body_part(self.landmarks, "LEFT_HIP")
        hip_avg = [(r_hip[0] + l_hip[0]) / 2, (r_hip[1] + l_hip[1]) / 2]
        # calculate the average knee position
        r_knee = detection_body_part(self.landmarks, "RIGHT_KNEE")
        l_knee = detection_body_part(self.landmarks, "LEFT_KNEE")
        knee_avg = [(r_knee[0] + l_knee[0]) / 2, (r_knee[1] + l_knee[1]) / 2]
        return calculate_angle(shoulder_avg, hip_avg, knee_avg)
3.4 Output
Fig. 3.4.1: Pull up
Fig. 3.4.2: Sit up
Fig. 3.4.3: Push up
Fig. 3.4.4: Walk
CHAPTER 4
CONCLUSION
REFERENCES
[1] S. Jin, L. Xu, J. Xu, C. Wang, W. Liu, C. Qian, W. Ouyang and P. Luo, “Whole-Body Human Pose Estimation in the Wild”, in Computer Vision – ECCV 2020, pp. 196–214.
[5] S. Kreiss, L. Bertoni and A. Alahi, “PifPaf: Composite Fields for Human Pose Estimation”, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), June 2019.