
NATIONAL ECONOMIC UNIVERSITY


***

PYTHON FOR DATA SCIENCE’S REPORT

TOPIC: USING MANIM TO VISUALIZE LINEAR


REGRESSION

Class: DSEB 64A


Supervisor: Nguyen Tuan Long
Student’s group: GROUP 6
Members: 1.Nguyen Phuong Hoai Ngoc (24%)
2.Vo Huyen Khanh May (24%)
3.Vo Thi Minh Phuong (21%)
4.Tran Phuong Linh (19%)
5.Duong Nhat Thanh (12%)

HANOI, 2023


ACKNOWLEDGEMENT
------*------

We would like to extend our heartfelt gratitude to all who have supported and
guided us throughout the preparation of this report for the Python for Data
Science course.

First and foremost, we owe our deepest thanks to our esteemed lecturer, Nguyen
Tuan Long, whose insights, patience, and dedication have been invaluable. Their
profound knowledge and persistent encouragement have been the guiding light
throughout our research and analysis. Without their unwavering support and
guidance, this report would not have been possible.

We are also immensely grateful to our fellow classmates and group members. The
collaborative spirit, constant exchange of ideas, and shared dedication among all
five of us have made this journey both rewarding and enlightening. Each member
brought a unique perspective and expertise that enriched the report and made it a
true team effort.

Lastly, we appreciate our families and friends for their understanding, patience,
and encouragement during the late nights and intense discussions.

In unity, there's strength. This report is a testament to the power of collaboration,


mentorship, and shared passion. Thank you to everyone who has been a part of
this journey.

Sincerely,

Nguyen Phuong Hoai Ngoc


Vo Huyen Khanh May
Vo Thi Minh Phuong
Tran Phuong Linh
Duong Nhat Thanh


TABLE OF CONTENTS
------*------
DESCRIPTION OF PROJECT IMPLEMENTATION PROCESS USING MANIM
I. Design and development steps for the animations
1.1. Understanding the Problem
1.2. Developing Animation for Each Step
II. Tools and techniques used
III. Algorithms and logic behind the animations
3.1. Introduction to problem
3.2. Constructing and optimizing the loss function
3.2.1. Model
3.2.2. Loss function
3.2.3. Gradient Descent
3.4. Discussion
3.4.1. Problems that can be solved by linear regression
3.4.2. Limitations of linear regression
IV. Challenges and difficulties faced
PRESENTATION OF COMPLETED ANIMATIONS
I. Linear Regression model
1.1. Introduction
1.2. Brief description of Linear Regression model animation
II. Loss Function
2.1. Introduction to Loss Function
2.2. Gradient Descent
III. Training/testing process
3.1. Training/Testing process
3.2. Explanation of class MetricScene
IV. Solving an application problem
4.1. Explanation of class TableExamples
4.2. Explanation of class FitScene2
V. Explanation of the last scene
EVALUATION OF USING MANIM
CONCLUSION
REFERENCES


INTRODUCTION
------*------
In the realm of mathematical animation, few tools possess the capability to
seamlessly blend elegance with mathematical precision as effectively as Manim
— the Mathematical Animation Engine. Developed primarily by Grant Sanderson
of 3Blue1Brown, Manim has emerged as a powerful open-source library that
transforms the way we visualize and communicate mathematical concepts. This
report explores the intersection of Manim's prowess and the fundamental
principles of linear regression, delving into how this animation engine breathes
life into abstract mathematical constructs.

Linear regression, a cornerstone of statistical modeling, forms the bedrock of


predictive analytics and data science. This technique, rooted in the principles of
mathematics and statistics, endeavors to establish a linear relationship between a
dependent variable and one or more independent variables. In the pursuit of
illuminating this statistical cornerstone, we embark on a journey through the
nuances of linear regression, seeking to demystify its algorithms and logic.
Through the lens of Manim, linear regression transcends the confines of static
representations, allowing learners and educators alike to witness the dance of data
points, the convergence of slopes, and the vivid story painted by regression lines.

This report serves a dual purpose — to unveil the capabilities of Manim in the
context of linear regression and to elucidate the fundamental principles that
underlie this statistical technique. By marrying the elegance of visual storytelling
with the rigor of mathematical analysis, we aim to not only showcase the
versatility of Manim but also provide a comprehensive understanding of linear
regression, making it accessible to a broader audience.


DESCRIPTION OF PROJECT IMPLEMENTATION


PROCESS USING MANIM
I. Design and development steps for the animations

The Linear Regression problem is a fundamental concept in statistics and
machine learning, serving to model the linear relationship between input and
output variables. In practical terms, it finds applications in predicting house
prices, sales figures, and numerous other quantities. Realising the importance of
the linear regression model in practice, our team decided to explain Linear
Regression using the Manim library, coupled with a detailed explanation of a
practical example based on our data.

1.1. Understanding the Problem:


Before delving into animation development, it is crucial to thoroughly
understand the problem at hand. This involves:

Data Understanding: The dataset we use has a special property: the data points
must be scattered roughly along a straight line, so that the linear regression
model works effectively and produces accurate results. We use housing price data
and describe it in a graph. The points do in fact cluster around a straight line,
making the data suitable for linear regression.

Problem Definition: Starting from the question "How can we estimate how much
an x-square-meter house costs?", we present the formula of the loss function,
based on the linear regression model.

1.2. Developing Animation for Each Step:


After gaining a clear understanding, the Linear Regression process can be
outlined through key steps:

Model Building: the linear function is 𝑦 = 𝑚𝑥 + 𝑏, where 𝑦 is the output
variable, also called the dependent variable and expressed as f(x), a function of
the input variable 𝑥, which serves as the independent variable. The coefficients
𝑚 and 𝑏 are often written as β1 and β0 respectively. The 𝑚 (β1) coefficient
controls the slope of the line, and 𝑏 (β0) controls the intercept of the line; in
machine learning, the intercept is also known as the bias.

Loss Function: The next step is visualizing the linear regression fit. So how do
we fit the line to these points? The differences between the points and the line,
drawn as little red segments, are called residuals: the differences between the
data points and the predictions the line would produce. Take each of these
residuals and square them; these are the squared errors, and the larger a residual
is, the more amplified the area of its square becomes. Totaling the areas of all of
these squares for a given line gives the Sum of Squared Errors (SSE); taking the
average gives the Mean Squared Error (MSE), and dividing that by two gives the
Loss Function.
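The residual → SSE → MSE → loss pipeline above can be sketched in plain Python. The data points below are (area, price) pairs from our housing dataset; the line coefficients m and b are purely illustrative, not the fitted values:

```python
# Residuals -> SSE -> MSE -> loss, for a candidate line y = m*x + b.
points = [(30, 448), (32, 509), (34, 535), (37, 551)]  # (area, price)
m, b = 10.0, 150.0  # illustrative slope and intercept

residuals = [(m * x + b) - y for x, y in points]  # prediction minus actual
sse = sum(r ** 2 for r in residuals)              # Sum of Squared Errors
mse = sse / len(points)                           # Mean Squared Error
loss = mse / 2                                    # the loss function
print(loss)
```

A better candidate line would shrink every residual and therefore the loss.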

Gradient Descent: After that, we need to find the 𝑚 and 𝑏 coefficients that
minimize that Loss Function. The coefficients can be solved for with a variety of
techniques; one of them is gradient descent, an algorithm for finding the
minimum value of the function 𝐽(θ) based on its derivative.

Train/Test Split: Dividing data into training and testing sets to evaluate model
performance. The training set is used to fit the regression line, and the test set
is then used to validate it. This is done to make sure that the regression
performs well on data it has not seen before. Metrics used to evaluate the linear
regression range from R-squared and the standard error of the estimate to
prediction intervals and statistical significance.
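A minimal sketch of this split using scikit-learn's train_test_split; the (area, price) arrays below are illustrative stand-ins for our dataset:

```python
# Split illustrative (area, price) data into training and testing sets.
import numpy as np
from sklearn.model_selection import train_test_split

X = np.array([[30], [32], [34], [37], [40], [42], [45], [48], [50], [52]])
y = np.array([448, 509, 535, 551, 574, 625, 655, 701, 720, 735])

# Hold out part of the data so the fitted line can later be validated
# on points it has not seen before.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=7)
print(len(X_train), len(X_test))
```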

Performance Metrics: R-squared, Standard Error of the Estimate (SEE), Mean
Absolute Error (MAE), and others are used to assess the model's predictive
capabilities.
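These metrics can be computed directly, for example with scikit-learn; the prediction values below are hypothetical model outputs, used only to show the calls:

```python
# Evaluate illustrative predictions with R-squared, SEE, and MAE.
import numpy as np
from sklearn.metrics import r2_score, mean_absolute_error

y_true = np.array([448.0, 509.0, 535.0, 551.0])
y_pred = np.array([450.0, 500.0, 540.0, 560.0])  # hypothetical model output

r2 = r2_score(y_true, y_pred)
mae = mean_absolute_error(y_true, y_pred)
# Standard error of the estimate for simple regression: sqrt(SSE / (n - 2))
see = np.sqrt(np.sum((y_true - y_pred) ** 2) / (len(y_true) - 2))
print(round(r2, 3), mae, round(see, 2))
```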

By developing step-by-step animations of the Linear Regression process, we


can provide a more engaging and intuitive learning experience. This not only
makes the learning process more enjoyable but also aids in the practical
application of knowledge to real-world problems. This comprehensive
understanding, coupled with visual representation, ensures that users gain a deep
insight into the intricacies of Linear Regression and its practical implications.


II. Tools and techniques used

About tools, we wrote our code in Python classes that inherit from Manim's
Scene class to structure each animation scene. Google Colab provides a
convenient browser-based environment for development without local software
installs. To combine the finished scenes into a video, we exported the animations
and edited them together using video software, since rendering all scenes into
one video with Manim is complex and time-consuming. Moreover, we utilized many
Python libraries that were new to us; for instance, we used pandas to read the
data file and turn it into the graphs needed for visualizing the data. This step
opened the door to a simple way of creating graphs from data, which helps a lot
in a statistical project that involves coding. Besides, the sklearn library
provides essential ready-made functions, so that we did not need to build from
scratch everything we required, such as the differentiation involved in training.

In this project, we employ machine learning techniques, specifically linear


regression, to analyze and model a dataset. Using the power of Python and its
robust machine learning libraries, we implement a linear regression model to
understand the underlying relationships within the data. Once the model is trained,
we leverage Manim, a powerful mathematical animation library, to visually
represent the linear regression process. Manim allows us to create dynamic and
engaging animations that vividly illustrate the key concepts of linear regression,
such as the fitting of a line to the data points. Through the seamless integration of
machine learning, Python programming, and mathematical animation with
Manim, we bring to life the intricacies of linear regression, making it accessible
and visually compelling for learners and enthusiasts alike.


III. Algorithms and logic behind the animations

In this section, we delve into the underlying algorithms and logic governing
the creation of animations, particularly focusing on linear regression as a
foundational technique. The linear regression algorithm solves problems with
real-valued outputs, for example: predicting house prices, predicting stock prices,
predicting age, etc. It is a supervised algorithm where the relationship between
the input and output is described by a linear function. This algorithm is also
referred to as linear fitting or linear least squares.

3.1. Introduction to problem


Consider a scenario within the real estate domain, where an individual is
employed in a real estate company and possesses data pertaining to the area and
prices of houses. The objective is to estimate the price of a new house, recognizing
that the actual price is contingent upon various factors such as area, number of
rooms, proximity to commercial centers, among others. However, for the purpose
of simplification, let us assume that the house price is solely dependent on the area
of the house. A dataset consisting of information on the area and selling prices of
50 houses is available:

Table 1: Housing price data by area

Area (m²)    Price (million VND)
30           448
32           509
34           535
37           551
…            …

In the hypothetical situation where one is tasked with estimating the price of a
50-square-meter house, a logical approach involves plotting a line that best aligns
with the given data points. Subsequently, the price of the house at the 50-square-
meter point can be calculated.


Figure 3.1: Estimating the price of a house with 50 square meters

From a programming perspective, two essential tasks must be undertaken:


1. Training: The identification of the line (model) that best fits the data points.
While a human observer can manually draw such a line based on Figure 1, a
computational system lacks this capability and necessitates the use of the Gradient
Descent algorithm, as elucidated below.

2. Inference: The prediction of the price for a 50-square-meter house based


on the line established in the preceding training step.

3.2. Constructing and optimizing the loss function


3.2.1. Model
The equation representing a straight line takes the form:

𝑦 = 𝑤1 ∗ 𝑥 + 𝑤0
Therefore, the task of finding the straight line is equivalent to determining the
values of 𝑤0 and 𝑤1 . To facilitate formulaic representation, we designate the
data in the data table as (𝑥1 , 𝑦1 ) = (30,448), (𝑥2 , 𝑦2 ) = (32,509),...

This signifies that a house with area 𝑥𝑖 corresponds to an actual price 𝑦𝑖 . The
predicted value by the current model is denoted as:

ŷ𝑖 = 𝑤1 ∗ 𝑥𝑖 + 𝑤0 .
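As a sketch, this prediction is a one-line Python function; the example call uses the initial guess 𝑤0 = 0, 𝑤1 = 1 discussed in the next section:

```python
def predict(x, w0, w1):
    """Predicted price (million VND) for a house of area x (m^2)."""
    return w1 * x + w0

# With the initial guess w0 = 0, w1 = 1 (the line y = x), a 42 m^2 house
# is predicted at only 42 million VND, far from the actual 625 million.
print(predict(42, w0=0, w1=1))
```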


3.2.2. Loss function

The process of finding 𝑤0 and 𝑤1 may be straightforward if performed


manually; however, computers lack this intuitive capability. Hence, initial values,
such as 𝑤0 = 0, 𝑤1 = 1, are randomly chosen and subsequently adjusted. It is
evident that the line 𝑦 = 𝑥 does not closely align with the data points or the
desired line. For instance, at 𝑥 = 42 (house size of 42 square meters), the actual
price is 625 million, whereas the model predicts only 42 million.

Figure 3.2: The difference at the point x = 42 between the model line y = x and
the actual values in Table 1

A function is required to assess whether the line with parameters
(𝑤0, 𝑤1) = (0, 1) is good or not. For each data point (𝑥𝑖, 𝑦𝑖), the difference
between the actual price and the predicted price is measured by
(1/2) ∗ (ŷ𝑖 − 𝑦𝑖)². The cumulative difference over the entire dataset is the
average of these per-point differences:

𝐽 = (1/2) ∗ (1/𝑁) ∗ ∑ᵢ₌₁ᴺ (ŷᵢ − 𝑦ᵢ)²

Here, 𝑁 denotes the number of data points. Several observations can be made:
• 𝐽 is non-negative.
• The smaller 𝐽 is, the closer the line is to the data points. If 𝐽 = 0, the line
perfectly intersects all data points.

The term 𝐽 is referred to as the loss function, serving as an evaluative metric


to determine the adequacy of the current parameters with the data.

Hence, the task of finding the optimal fitting line (model) for the dataset
becomes the task of finding 𝑤0, 𝑤1 such that the loss function 𝐽 is minimized.
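As a sketch, the loss function J can be written directly from this formula:

```python
# J(w0, w1) = (1 / (2N)) * sum((w1*x_i + w0 - y_i)^2) over the dataset.
def loss(w0, w1, xs, ys):
    n = len(xs)
    return sum((w1 * x + w0 - y) ** 2 for x, y in zip(xs, ys)) / (2 * n)

# With the first rows of Table 1 and the initial line y = x, J is huge,
# confirming that (w0, w1) = (0, 1) fits the data poorly.
xs, ys = [30, 32, 34, 37], [448, 509, 535, 551]
print(loss(0, 1, xs, ys))
```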


3.2.3. Gradient Descent

Gradient descent is an algorithm utilized to find the minimum value of a


function 𝑓(𝑥) based on its derivative. In the context of linear regression, the
objective is to find values for w0 , w1 that minimize the function J(𝑤0 , 𝑤1 ) by
gradient descent algorithm.
General Algorithm in Linear Regression:
- Step 1: Initialize 𝑤0, 𝑤1 with random values. For example, 𝑤0 = 0 and
𝑤1 = 1, i.e., the line equation 𝑦 = 𝑤1𝑥 + 𝑤0 becomes 𝑦 = 𝑥. This gives a line
that is not well fitted to the historical data in Figure 3.2.
- Step 2: Compute the partial derivatives of 𝐽(𝑤0, 𝑤1) with respect to 𝑤0 and
𝑤1, and plug the current values of 𝑥, 𝑦, 𝑤1 and 𝑤0 into them to obtain the
derivative values D.
Dw0 = dJ/dw0 = (1/𝑁) ∗ ∑ᵢ₌₁ᴺ (𝑤0 + 𝑤1 ∗ 𝑥ᵢ − 𝑦ᵢ)

Dw1 = dJ/dw1 = (1/𝑁) ∗ ∑ᵢ₌₁ᴺ 𝑥ᵢ ∗ (𝑤0 + 𝑤1 ∗ 𝑥ᵢ − 𝑦ᵢ)
Gradients give the direction of movement of 𝑤1 and 𝑤0 with respect to 𝐽.

In this step, we compute the partial derivative of 𝐽 with respect to 𝑤0 (Dw0)
and with respect to 𝑤1 (Dw1) using the above equations, evaluating the summand
for each data point (𝑥ᵢ, 𝑦ᵢ) and summing. In other words, we compute the
gradient of 𝐽(𝑤0, 𝑤1) for the current dataset.

- Step 3: Update 𝑤0, 𝑤1 using the formulas

𝑤0 = 𝑤0 − 𝐿 ∗ Dw0
𝑤1 = 𝑤1 − 𝐿 ∗ Dw1

(where the learning rate 𝐿 is a small positive constant, e.g., 𝐿 = 0.001)


- Step 4: Calculate 𝐽(𝑤0, 𝑤1). If 𝐽(𝑤0, 𝑤1) is sufficiently small, stop.
Otherwise, return to Step 2.
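Steps 1 to 4 can be sketched as a short gradient descent loop in plain Python. The data, learning rate, and fixed iteration count below are illustrative; the points lie roughly on y = 2x + 1:

```python
# Gradient descent for y = w1*x + w0 on illustrative data.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.1, 5.0, 7.2, 8.9, 11.1]

w0, w1 = 0.0, 1.0   # Step 1: initial guess
L = 0.01            # learning rate
N = len(xs)

for _ in range(5000):  # Step 4: repeat until J is small enough
    # Step 2: gradients of J = (1/2N) * sum((w0 + w1*x - y)^2)
    Dw0 = sum(w0 + w1 * x - y for x, y in zip(xs, ys)) / N
    Dw1 = sum(x * (w0 + w1 * x - y) for x, y in zip(xs, ys)) / N
    # Step 3: update the coefficients
    w0 -= L * Dw0
    w1 -= L * Dw1

print(round(w1, 2), round(w0, 2))  # close to slope 2 and intercept 1
```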

Figure 3.3: Gradient Descent algorithm steps

In essence, iterate Step 2 a sufficiently large number of times (e.g., 100 or
1000 iterations, depending on the problem and the learning rate) until
𝐽(𝑤0, 𝑤1) reaches a small enough value. The values of 𝑤0, 𝑤1 that we are left
with are the optimum values.
Choosing the learning rate is extremely important. There are three scenarios to
consider:
• If 𝐿 is small: the function decreases very little on each iteration, so it
takes many repetitions of Step 2 for the function to reach its smallest value.
• If 𝐿 is reasonable: after a reasonable number of Step 2 iterations, the
function will reach a small enough value.
• If 𝐿 is too large: the updates will overshoot and never reach the minimum
value of the function.

Figure 3.4: Learning rate in Gradient Descent


The best way to check whether the learning rate is appropriate is to examine the
value of the function after each execution of Step 2 by plotting a graph.
Comments:
• The algorithm works very well in cases where it is not possible to find the
minimum value using linear algebra.
• The most important part of the algorithm is calculating the derivative of the
function with respect to each variable and then repeating Step 2.
3.4. Discussion
3.4.1. Problems that can be solved by linear regression
The function 𝒚 ≈ 𝒇(𝒙) = 𝒙ᵀ𝒘 is a linear function in both 𝒘 and 𝒙. In
practice, linear regression can be applied to models that are linear only in 𝒘.
For example,

𝑦 ≈ 𝑤1𝑥1 + 𝑤2𝑥2 + 𝑤3𝑥1² + 𝑤4 sin(𝑥2) + 𝑤5𝑥1𝑥2 + 𝑤0

is a linear function in 𝒘 and therefore can also be solved using linear
regression. For each feature vector 𝒙 = [𝑥1, 𝑥2]ᵀ, we compute a new feature
vector

x̃ = [𝑥1, 𝑥2, 𝑥1², sin(𝑥2), 𝑥1𝑥2]ᵀ

and then apply linear regression with this new data. However, finding functions
like sin(𝑥2) or 𝑥1𝑥2 is relatively nontrivial. Polynomial regression, with new
feature vectors of the form [1, 𝑥1, 𝑥1², …]ᵀ, is used more frequently.
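A sketch of this trick with scikit-learn: generate data that is nonlinear in x but linear in w, build the new feature vector x̃, and fit an ordinary linear regression. The weights and data here are chosen purely for illustration:

```python
# Linear regression on transformed features: the model is nonlinear in x
# but linear in w, so ordinary least squares recovers the weights.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x1 = rng.uniform(-2, 2, 200)
x2 = rng.uniform(-2, 2, 200)
y = 3*x1 + 2*x2 - 1*x1**2 + 0.5*np.sin(x2) + 0.25*x1*x2 + 4  # known weights

# New feature vector x_tilde = [x1, x2, x1^2, sin(x2), x1*x2]
X_tilde = np.column_stack([x1, x2, x1**2, np.sin(x2), x1*x2])
model = LinearRegression().fit(X_tilde, y)
print(np.round(model.coef_, 2), round(model.intercept_, 2))
```

Because the data is noiseless, the fit recovers the generating weights (3, 2, -1, 0.5, 0.25) and the intercept 4.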

3.4.2. Limitations of linear regression

While linear regression proves effective in many cases, it has inherent


limitations. One notable limitation is its reliance on a linear relationship between
input features and output. In situations where the underlying relationship is non-
linear, linear regression may yield suboptimal results. Additionally, it is sensitive
to outliers, and the assumption of homoscedasticity (constant variance of errors)
must be satisfied for robust performance. In the example of the relationship
between height and weight below, if there is just one pair of noisy data
(150 cm, 90 kg), the result will deviate significantly. Careful consideration of
these limitations is crucial when applying linear regression in real-world
scenarios.

Figure 3.5: Noisy data (150 cm, 90 kg) in the dataset
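The sensitivity to outliers can be demonstrated numerically. This sketch uses NumPy's polyfit on illustrative height/weight values, with the noisy pair (150 cm, 90 kg) from the figure:

```python
# One noisy (height, weight) pair pulls the fitted line away from the data.
import numpy as np

heights = np.array([150, 155, 160, 165, 170, 175, 180])
weights = np.array([45, 50, 55, 60, 65, 70, 75])  # clean: w = h - 105

slope_clean, icpt_clean = np.polyfit(heights, weights, 1)

# Replace (150, 45) with the noisy pair (150, 90):
weights_noisy = weights.copy()
weights_noisy[0] = 90
slope_noisy, icpt_noisy = np.polyfit(heights, weights_noisy, 1)

print(round(slope_clean, 2), round(slope_noisy, 2))  # slope drops sharply
```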

IV. Challenges and difficulties faced

Engaging with Manim, a powerful library tailored for dynamic mathematical


visualizations, has presented a spectrum of challenges in our journey of learning
and application. These challenges manifest across collaborative coding, local
execution intricacies, documentation limitations, and the academic demands
associated with mastering this innovative tool.

In the domain of collaborative coding, the unique requirements and


dependencies of Manim introduce complexities to the development process
when utilizing platforms like GitHub and shared codespaces. Challenges arise in
versioning, dependencies, and specific configurations, necessitating adeptness in
navigating through intricate setup procedures to ensure seamless collaboration.

The challenge extends to local execution, compelling us to resort to platforms


like Google Colab for task execution. However, this alternative is not without its
challenges. The installation of Manim libraries during each Colab run proves
time-consuming, impeding the efficiency of the development process.
Streamlining and enhancing local execution methods emerge as imperative needs.

Furthermore, the scarcity of comprehensive documentation for Manim,


attributable to its relative novelty in comparison to more established
programming languages, introduces an additional layer of complexity. Learning
new functions and features within the library demands heightened effort. While
Manim inherits from Python, its specialized nature mandates a deeper
understanding, contributing to a discernible learning curve.

The academic pressure associated with mastering Manim is a significant


challenge. Expressing ideas through animation requires not only technical
proficiency but also a creative implementation that elevates the learning journey.
This pressure intensifies as users find themselves concurrently undertaking
various other courses, augmenting the overall workload.

In addition to the technical challenges, the creative expression inherent in


animating ideas proves formidable for those new to Manim. Beyond mastering
the library's functions, users must navigate the creative process of visualizing
mathematical concepts, a dual challenge that amplifies the complexity of initial
interactions with Manim.

In conclusion, the challenges encountered in collaborative coding, local


execution, documentation limitations, and the academic pressures linked to
mastering Manim collectively contribute to a nuanced learning and application
process. As users endeavor to unlock the full potential of this specialized library,
the equilibrium between technical proficiency and creative expression assumes
paramount significance. Addressing these challenges not only facilitates a more
seamless adoption of Manim but also unlocks its capabilities for dynamic and
engaging mathematical visualizations.


PRESENTATION OF COMPLETED ANIMATIONS


I. Linear Regression model
1.1. Introduction

Figure 1&2: Open Scene and Table of content Scene

Figure 3&4: Leading scene

The animation opens with the title "Visualizing Linear Regression", which is our
project topic. It then introduces the table of contents of our video.

In the Leading scene, our mission is to make sense of a situation where Linear
Regression can be applied. Moreover, as our real-life example is house pricing,
we drew a house in the scene, its parts composed of many mobjects in Manim,
together with a table containing the dataset of house prices by area.


1.2. Brief description of Linear Regression model animation

Figure 5: Linear Regression model title scene

The next scene introduces the Linear Regression model, which is the main point
of the project. The title has the same format as in the previous scene, and the
gradient color effect makes it more eye-catching. The following parts are among
the most important scenes, as they build a comprehensive understanding of the
Linear Regression model.

Figure 6&7: The Linear Regression equation

In mathematics, the formula 𝑦 = 𝑎𝑥 + 𝑏 is familiar to everyone as the linear
function. In Linear Regression models, however, 𝑦 = 𝑚𝑥 + 𝑏 is most often used
as the typical function for the training algorithm. Besides, the linear equation
can depend on one or more variables; thus we have the multiple-variable linear
equation in Figure 7. The animation also emphasizes that 𝑦̂ is the predicted
output and 𝑥 is the independent variable.
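The multiple-variable form can be sketched with NumPy as a dot product; the coefficients and observation below are illustrative:

```python
# y_hat = m1*x1 + m2*x2 + m3*x3 + b, written as a dot product.
import numpy as np

m = np.array([2.0, 0.5, -1.0])   # one slope per input variable
b = 4.0                          # intercept
x = np.array([3.0, 10.0, 1.0])   # one observation with three features

y_hat = float(np.dot(m, x) + b)
print(y_hat)
```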


Figure 8: Correlation scene

In this animation, we use a straight line on the xy-axes to illustrate the
correlation captured by the linear function.

II. Loss Function

2.1. Introduction to Loss Function

Before presenting the animation of the Loss Function class, we would like to
define a function create_model() -> tuple to create the graph:

    def create_model() -> tuple:
        data = list(pd.read_csv("https://bit.ly/2KF29Bd").itertuples())
        m = ValueTracker(1.93939)
        b = ValueTracker(4.73333)

This loads the data from a CSV file using pandas. Two ValueTracker objects, m
and b, are created and initialized with the values 1.93939 and 4.73333,
respectively; they track the slope and y-intercept. Next, the axes are created
as a Manim Axes object with specified x and y ranges and additional
configuration options.
    ax = Axes(
        x_range=[0, 10],
        y_range=[0, 25, 5],  # y ticks at 0, 5, 10, 15, ...
        axis_config={"include_tip": False},
    )

The function then creates a set of axes using Manim, defined with specific x and
y ranges and additional configuration options. Data points are plotted on these
axes based on the loaded dataset.
    points = [Dot(point=ax.c2p(p.x, p.y), radius=.15, color=BLUE)
              for p in data]
    line = Line(start=ax.c2p(0, b.get_value()),
                end=ax.c2p(10, m.get_value() * 10 + b.get_value())
                ).set_color(YELLOW)

Additionally, a linear regression line is established with a specified starting
point; it follows the changes in the slope and y-intercept values during
animations.


    line.add_updater(
        lambda l: l.become(
            Line(start=ax.c2p(0, b.get_value()),
                 end=ax.c2p(10, m.get_value() * 10 + b.get_value())
                 ).set_color(YELLOW)
        )
    )
    return data, m, b, ax, points, line

Furthermore, an updater is added to the line object, ensuring that it


dynamically adjusts its position based on changes in the slope 𝑚 and y-intercept
𝑏 values during animations. The function ultimately returns a tuple containing the
dataset, slope and y-intercept trackers, axes, data points, and the linear regression
line. This comprehensive setup allows for the visualization and manipulation of a
simple linear regression model in a Manim animation context.

Following the function create_model(), we draw it on screen within a Scene
class:

    data, m, b, ax, points, line = create_model()
    graph = VGroup(ax, *points)
The create_model function is first utilized to assemble a comprehensive set
of components for visualizing a linear regression model. These components
include the dataset, tracked slope 𝑚 and y-intercept 𝑏, axes, data points, and the
regression line. Subsequently, a visual grouping, named graph, is created to
encapsulate the axes and data points.

Three versions of the linear regression equation, together with notes for the function, are then created in LaTeX format using Manim's MathTex class.

eq5 = MathTex("f(x) = ", r"m ", r"x + ", "b").move_to((RIGHT + DOWN))
eq5[1].set_color(RED)
eq5[3].set_color(RED)
eq6 = MathTex(r"f(x) = ", r"\beta_1", r"x + ", r"\beta_0").move_to((RIGHT + DOWN))
eq6[1].set_color(RED)
eq6[3].set_color(RED)
eq7 = MathTex("f(x) = ", f'{m.get_value()}', r"x + ", f'{b.get_value()}').move_to((RIGHT + DOWN))
eq7[1].set_color(RED)
eq7[3].set_color(RED)
note1 = Text("m: slope of the line", font_size=20).next_to(eq5, DOWN)
note2 = Text("b: intercept of the line", font_size=20).next_to(note1, DOWN)


To animate the movement of the line on the graph, we define a function blink(item, value, increment) that animates the visual representation of a changing value: it scales the text item (item) and increments the corresponding tracker (value). The blink function is then invoked for the slope and y-intercept terms, creating a dynamic representation of how these coefficients change visually. Finally, the equation fades out via a FadeOut animation, producing a cohesive animation that illustrates the key concepts of linear regression.

blink(eq5[1], m, .50)
blink(eq5[3], b, 2.0)

The important part of calculating the loss function is finding the residuals from the data points to the line 𝑦 = 𝑚𝑥 + 𝑏. Consequently, when creating the graph to illustrate the loss function, we need a function create_residual_model(scene, data, m, b, ax, points, line) to animate the residuals on the graph:

def create_residual_model(scene, data, m, b, ax, points, line) -> tuple:
    residuals = []
    for d in data:
        # Draw a residual from each dot down to the regression line
        residual = Line(start=ax.c2p(d.x, d.y),
                        end=ax.c2p(d.x, m.get_value() * d.x + b.get_value())).set_color(RED)
        scene.play(Create(residual), run_time=.3)

        # Update the residuals whenever m and b change
        residual.add_updater(
            lambda r, d=d: r.become(
                Line(start=ax.c2p(d.x, d.y),
                     end=ax.c2p(d.x, m.get_value() * d.x + b.get_value())).set_color(RED)))
        residuals.append(residual)
The add_updater method is employed to dynamically update the position of
each residual as the values of the slope 𝑚 and y-intercept 𝑏 change. This ensures
that the residuals move accordingly during any subsequent animations that modify
the linear regression model.
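Numerically, each residual is just the vertical gap y - (m*x + b), and the loss is built from their squares. A plain-Python sketch of this calculation (the sample points below are illustrative, not the report's dataset):

```python
def residuals(data, m, b):
    # Vertical distance from each point (x, y) to the line y = m*x + b
    return [y - (m * x + b) for x, y in data]

# Illustrative points and coefficients (not the report's CSV data)
data = [(1, 7), (2, 9), (3, 12)]
m, b = 1.93939, 4.73333

res = residuals(data, m, b)
sse = sum(r ** 2 for r in res)  # sum of squared residuals
```

Each red Line the updater redraws corresponds to one entry of res; its squared length is one term of the sum.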

def flex_residuals():

The flex_residuals function introduces a visual effect by flexing, or oscillating, the slope 𝑚 around its current value. This is achieved by incrementing the slope with varying magnitudes and directions, creating an animation that dynamically adjusts the residuals in order to calculate the total size of the squares and replace them with a single larger square.


Figure 9&10: Loss Function label and Residual Scene

Next, to create the given scene, we add a class ThirdScene(Scene). The create_model() function establishes the foundational elements of the linear regression graph, including the axes, data points, and the regression line.

Figure 11-13: SSE and MSE and Loss Function


The cumulative sum of squared errors (SSE) is visually depicted by updating
the size of a red square to represent the total sum of squared residuals. This creates
a graphical representation of the SSE. The script transitions smoothly from SSE
to MSE, with associated labels and formulas. Subsequently, the concept of a
general loss function 𝐽(𝜃) is introduced, offering a broader perspective on error
metrics.
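The relationship between the two metrics is direct: the MSE is simply the SSE divided by the number of data points n. A stdlib-only illustration with made-up residuals:

```python
errors = [0.5, -1.0, 2.0]          # made-up residuals, for illustration only
sse = sum(e ** 2 for e in errors)  # Sum of Squared Errors
mse = sse / len(errors)            # Mean Squared Error
```

The general loss function 𝐽(𝜃) shown in the scene abstracts over such error measures.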


2.2. Gradient Descent

Figure 14&15: Gradient Descent Scene

This code defines a 3D animation to visualize the process of gradient descent for linear regression. The algorithm starts with initial estimates for m and b and
calculates the partial derivatives of the loss function with respect to each
parameter. These derivatives indicate the direction of steepest ascent, and the
parameters are updated in the opposite direction, scaled by a learning rate. This
process is repeated iteratively, gradually converging towards the values of m and
b that yield the lowest loss. The key intuition is to follow the negative gradient of
the loss function to find the local minimum, allowing the linear regression model
to better fit the training data.
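Stripped of the Manim visuals, the same update rule can be sketched in a few lines of plain Python. The toy data and learning rate below are illustrative choices, not values from the report:

```python
def gradient_descent(data, lr=0.05, iterations=5000):
    # Fit y = m*x + b by repeatedly stepping against the gradient of the MSE
    m, b = 0.0, 0.0
    n = len(data)
    for _ in range(iterations):
        # Partial derivatives of the MSE loss with respect to m and b
        dm = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in data)
        db = (-2 / n) * sum(y - (m * x + b) for x, y in data)
        # Move opposite the gradient, scaled by the learning rate
        m -= lr * dm
        b -= lr * db
    return m, b

# Points lying exactly on y = 2x + 1, so the expected minimum is known
m, b = gradient_descent([(1, 3), (2, 5), (3, 7)])
```

With a small enough learning rate the iterates converge to the least-squares solution, which is what the 3D scene depicts as a ball rolling down the loss surface.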

III. Training/testing process

3.1. Training/Testing process

The function create_train_test_model encapsulates the process of creating a linear regression model and visualizing its behavior on training and testing datasets. It begins by importing a CSV file containing the relevant data using the pandas library. The data is then split into training and testing sets with a ratio of 2:1 using the train_test_split function from the scikit-learn library. Subsequently, a linear regression model is fitted to the training data, and the 𝑅2 score is calculated on the test set.
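Under the hood, the score returned by scikit-learn's model.score for a regressor is the coefficient of determination, R² = 1 - SS_res / SS_tot. A small hand computation with toy numbers shows the arithmetic:

```python
def r2_score(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))
    ss_tot = sum((yt - mean_y) ** 2 for yt in y_true)
    return 1 - ss_res / ss_tot

# Toy values only: perfect predictions score exactly 1.0
perfect = r2_score([1, 2, 3, 4], [1, 2, 3, 4])
imperfect = r2_score([1, 2, 3, 4], [2, 2, 3, 3])
```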

def create_train_test_model() -> tuple:
    # Import data from a CSV file
    df = pd.read_csv('https://bit.ly/3TUCgh2', delimiter=",")
    # Initialize variables for independent (X) and dependent (Y) variables
    X = df.values[:, :-1]
    Y = df.values[:, -1]
    # Split the dataset into training and testing sets (2/3 train, 1/3 test)
    X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=1 / 3, random_state=7)
    # Fit a Linear Regression model on the training set
    model = LinearRegression()
    fit = model.fit(X_train, Y_train)
    # Calculate the R^2 score on the test set
    result = model.score(X_test, Y_test)

The coefficients of the trained model, representing the slope (m) and
intercept (b), are stored using Manim's ValueTracker. An axes system is set
up to create a visual representation of the data points and the linear regression
model. The training and testing data points are plotted on the graph as blue dots.

    # Store the coefficients of the model into m and b using ValueTrackers
    m = ValueTracker(fit.coef_.flatten()[0])
    b = ValueTracker(fit.intercept_.flatten()[0])
    # Set up the coordinate axes for visualization
    ax = Axes(
        x_range=[0, 100, 20],
        y_range=[-40, 200, 40],
        axis_config={"include_tip": False},
    )
    # Plot the training and testing data points
    train_points = [Dot(point=ax.c2p(p.x, p.y), radius=.15, color=BLUE) for p in
                    pd.DataFrame(data={'x': X_train.flatten(), 'y': Y_train.flatten()}).itertuples()]
    test_points = [Dot(point=ax.c2p(p.x, p.y), radius=.15, color=BLUE) for p in
                   pd.DataFrame(data={'x': X_test.flatten(), 'y': Y_test.flatten()}).itertuples()]

A linear regression line is drawn to represent the relationship between the independent and dependent variables. The line's position is dynamically updated to reflect changes in the slope (m) and intercept (b). The resulting visual representation allows for a clear understanding of how well the linear regression model fits the given dataset. The function returns the trackers for m and b, the axes, and the visual elements representing the training and testing data points and the linear regression line.


Figure 15&16: Introduction and Dataset

This `TrainTestScene` class gives a visual representation of the process of dividing data into training and testing sets, a fundamental step in machine learning model evaluation. The scene showcases the initial linear regression model, displayed on a coordinate system, along with the data points.

Figure 17: Train/Test Split

To elucidate the partitioning ratios, the scene introduces braces and corresponding labels, providing a clear visual representation of the proportions: two-thirds for training and one-third for testing.

Subsequently, the scene dynamically transforms the displayed points to convey the transition from the original dataset to the split sets. The linear regression model remains a consistent visual element throughout.


Figure 18&19: Throughout scene


3.2. Explain class MetricScene

Figure 20: Performance Metrics


The script defines a class MetricScene that inherits from Manim's Scene
class. The purpose of this script is to create an animation that explains various
performance metrics used in linear regression analysis.
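For reference, the kinds of metrics such a scene typically presents can be computed directly; the numbers below are hypothetical and only demonstrate the formulas, not the report's data:

```python
import math

y_true = [3.0, 5.0, 7.0]   # hypothetical observed values
y_pred = [2.5, 5.5, 8.0]   # hypothetical model predictions

errors = [yt - yp for yt, yp in zip(y_true, y_pred)]
mae = sum(abs(e) for e in errors) / len(errors)   # Mean Absolute Error
mse = sum(e ** 2 for e in errors) / len(errors)   # Mean Squared Error
rmse = math.sqrt(mse)                             # Root Mean Squared Error
```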

IV. Solving an application problem


4.1. Explain class TableExamples

Figure 21: Introduction to problem


The initial animation introduces the topic with a title, "Linear Regression," and
a subtitle highlighting the context, "Hanoi Housing Price Problems." The use of
gradient colors enhances visual appeal.

Following the title animation, the problem statement is presented through a series of textual animations. It describes the scenario of predicting the optimal price of a 70 m² house based on historical house selling prices. This sets the stage for the subsequent animations.

Figure 22&23: Question scene

The first table animation displays historical data on house areas and
corresponding prices. The color-coded labels emphasize the relevance of the
information. The table is introduced with a smooth entry animation, followed by
the gradual presentation of column labels. A second table animation introduces
the scenario of predicting the price for a 70 m² house. The row corresponding to
the prediction task is highlighted in blue, distinguishing it from the historical data.
This animation effectively conveys the transition from historical data to the
predictive task.

Arrows and braces are employed to illustrate the flow of information. The
arrows indicate the transition from historical data to prediction, while braces
provide textual annotations such as "Historical data" and "Prediction," aiding in
viewer comprehension. The animations are orchestrated to create a coherent flow,
ensuring that each element is introduced at an appropriate moment.


4.2. Explain class FitScene2

This function `create_problem_model` sets up and returns various elements necessary for visualizing a linear regression problem, including the training data, model coefficients, axes, data points, and a regression line.

def create_problem_model() -> tuple:
    # Read data from the provided CSV URL into a DataFrame
    df = pd.read_csv("https://bit.ly/45VEJfL", delimiter=",")
    # Extract features (X) and target variable (Y) from the DataFrame
    X = df.values[:, :-1]  # area
    Y = df.values[:, -1]   # price
    # Split the data into training and testing sets
    X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=1 / 3, random_state=49)
    # Create a DataFrame for the training data
    train_data = pd.DataFrame(data=X_train, columns=df.columns[:-1])
    train_data['y'] = Y_train
    train_data = list(train_data.itertuples())  # Convert DataFrame rows to namedtuples
    # Fit a linear regression model to the training data
    model = LinearRegression().fit(X_train, Y_train)
    # Use a ValueTracker to track the R^2 score of the model on the test set
    result = ValueTracker(model.score(X_test, Y_test))
    # Use ValueTrackers to track the slope (m) and intercept (b) of the model
    m = ValueTracker(model.coef_.flatten()[0])
    b = ValueTracker(model.intercept_.flatten()[0])
    # Set up a coordinate system (axes) for the visualization
    ax = Axes(
        x_range=[0, 120, 10],
        y_range=[0, 210, 50],
        axis_config={"include_tip": True},
    ).add_coordinates()
    # Add axis labels to the coordinate system
    labels = ax.get_axis_labels(
        MathTex(r"\text{Size of House (m}^2\text{)}").scale(0.7),
        MathTex(r"\text{Price of House (million VND)}").scale(0.7)
    )
    # Create dots representing training and testing data points on the coordinate system
    train_points = [Dot(point=ax.c2p(p.x, p.y), radius=.15, color=BLUE) for p in
                    pd.DataFrame(data={'x': X_train.flatten(), 'y': Y_train.flatten()}).itertuples()]
    test_points = [Dot(point=ax.c2p(p.x, p.y), radius=.15, color=BLUE) for p in
                   pd.DataFrame(data={'x': X_test.flatten(), 'y': Y_test.flatten()}).itertuples()]
    # Create a line representing the linear regression model
    line = Line(start=ax.c2p(0, b.get_value()),
                end=ax.c2p(100, m.get_value() * 100 + b.get_value())).set_color(YELLOW)
    line.add_updater(
        lambda l: l.become(
            Line(start=ax.c2p(0, b.get_value()),
                 end=ax.c2p(100, m.get_value() * 100 + b.get_value())).set_color(YELLOW)
        )
    )
    # Return the generated elements as a tuple
    return train_data, m, b, ax, train_points, test_points, line, labels, result

The `sse_fit` function is designed to visually represent the Sum of Squared Errors (SSE) in our linear regression problem. It takes essential parameters such as the scene, training data, slope (`m`) and intercept (`b`) ValueTrackers, the coordinate system (`ax`), data points (`points`), and the linear regression line (`line`). The function iterates through each data point, creating and animating residual lines that depict the vertical distances between the data points and the regression line. These residuals are gradually revealed on the scene, and updaters are incorporated to dynamically adjust the residual lines during the animation. Additionally, the function defines `get_sse`, a calculation function that computes the SSE by summing the squares of the lengths of the residual lines. The list of residual lines and the SSE calculation function are returned as a tuple, contributing to the overall animation's illustration of the SSE concept in linear regression.

def sse_fit(scene, data, m, b, ax, points, line) -> tuple:
    # List to store residuals (lines representing the vertical distance
    # between data points and the regression line)
    residuals: list[Line] = []
    # Iterate through each data point (the data passed in is the training set)
    for d in data:
        # Create a residual line for the current data point and animate its appearance
        residual = Line(start=ax.c2p(d.x, d.y),
                        end=ax.c2p(d.x, m.get_value() * d.x + b.get_value())).set_color(RED)
        scene.play(Create(residual), run_time=.2)
        # Add an updater to dynamically adjust the residual line during animation
        residual.add_updater(lambda r, d=d: r.become(
            Line(start=ax.c2p(d.x, d.y),
                 end=ax.c2p(d.x, m.get_value() * d.x + b.get_value())).set_color(RED)))
        residuals.append(residual)  # Add the residual line to the list

    # Calculate the Sum of Squared Errors (SSE) for the current values
    # of slope (m) and intercept (b), wrapped in a ValueTracker
    def get_sse(m, b):
        return ValueTracker(sum(r.get_length() ** 2 for r in residuals))

    # Return the list of residuals and the SSE calculation function
    return residuals, get_sse


Figure 23: Bad fit model

This scene visualizes the process of linear regression fitting. It begins by creating a linear regression problem using the `create_problem_model` function, which generates essential elements such as the coordinate system, data points, and the initial linear regression line. The scene then displays both the training and testing data points, emphasizing the separation of the test set. It then introduces an initial linear regression model characterized by a fixed slope and intercept, representing a "bad fit." The associated Sum of Squared Errors (SSE) is visualized on the scene.

Figure 24: Better-fitting model

The animation then progresses to demonstrate the optimization process by transitioning to a better-fitting model with values obtained from `create_problem_model`. This optimization is accomplished by manipulating the slope, intercept, and SSE values using ValueTrackers and linear interpolation. The final optimized model is visually displayed, along with its corresponding SSE.
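The linear interpolation driving that transition is the standard lerp formula; a minimal sketch (the endpoint slopes here are invented for illustration):

```python
def lerp(start, end, t):
    # Linear interpolation: returns start at t=0 and end at t=1
    return start + (end - start) * t

bad_m, good_m = 0.5, 1.93939   # hypothetical initial and fitted slopes
# Eleven animation frames morphing the bad slope into the fitted one
frames = [lerp(bad_m, good_m, t / 10) for t in range(11)]
```

In Manim, animating a ValueTracker toward a target value performs a similar interpolation over the animation's run time (with an easing rate function rather than a strictly linear one).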


4.3. Explain class FinalExample2

Figure 25: Predicted Value

This scene begins by presenting both training and testing data points, along with the initial linear regression model. The quality of the model is assessed through the 𝑅2 score, a metric indicating its predictive performance.

Figure 26: Conclusion scene

The left panel consolidates various elements for better visibility, and the
conclusion text summarizes the optimal house price according to the linear
regression model.


V. Explain the last scene

Figure 27: Group 6 label

First, it displays the group name ("GROUP 6") and the label "DIRECTED
BY" at the top of the screen. These texts are created with specific font styles and
sizes, and after a brief pause (self.wait(2)), they fade out of view.

Figure 28&29: Credit and Thank-you scene

Next, it introduces the individual names of the group members (Nguyễn Phương Hoài Ngọc, Võ Huyền Khánh Mây, Võ Thị Minh Phương, Dương Nhật Thanh, Trần Phương Linh). The names are created, arranged in a vertical stack, and initially placed at the bottom of the screen. Then, an animation shifts the group of names upward, creating a scrolling effect. Finally, the message "Thanks for watching" is displayed, and it moves upward as well.


EVALUATION OF USING MANIM


The utilization of Manim, a mathematical animation engine, has gained
prominence in the academic and educational communities for its prowess in
visualizing complex mathematical concepts. However, during the use of Manim,
we have identified several advantages and disadvantages of the library, as well as
the differences compared to other similar tools.
I. Advantages of Manim.
One of Manim's standout features is its exceptional customizability. Users
can create intricate mathematical objects and dynamic graphics with a high degree
of flexibility. This makes Manim a preferred choice for educators and content
creators looking to represent sophisticated mathematical ideas visually.
Built on Python, Manim offers a seamless integration experience for users
familiar with the language. This integration simplifies the coding process and
enhances the readability of scripts, making it an attractive option for those
comfortable with Python programming.
Manim places a strong emphasis on generating high-quality output videos.
The rendering engine ensures smooth transitions and produces videos with
excellent resolution, contributing to a visually appealing and professional final
product.
The presence of a dynamic and creative community around Manim is a
significant advantage. This community-driven development model encourages
knowledge sharing, provides valuable resources, and facilitates collaborative
problem-solving.

II. Disadvantages of Manim.


One of the notable challenges associated with Manim is its steep learning
curve, especially for beginners. The complex syntax and, at times, insufficient
documentation can pose obstacles to new users, potentially slowing down the
onboarding process.
Manim demands relatively high computational resources. Users with lower-
end machines may find it challenging to work with Manim efficiently, leading to
potential hardware constraints.


Exporting videos with Manim can be a time-consuming process, particularly for graphics with high complexity. This may impact productivity and result in extended waiting times for users seeking prompt video output.
III. Comparison with Other Tools.
Manim stands out in the realm of graphics libraries due to its specialization
in dynamic and intricate mathematical visualizations. Unlike general-purpose
graphics libraries like Matplotlib and Plotly, which cater to a broad spectrum of
visualization needs, Manim is specifically designed to meet the demands of
mathematical representation in a dynamic and visually engaging manner.

Matplotlib is a widely used library that offers a comprehensive set of tools for creating static, animated, and interactive visualizations in Python. It excels in providing a diverse range of plots and charts for various applications. Plotly, on the other hand, is known for its interactive and web-based plotting capabilities, making it suitable for creating dashboards and interactive data visualizations.

However, when it comes to conveying complex mathematical concepts, equations, and transformations, Manim takes the spotlight. It offers a dedicated framework for creating animations that dynamically illustrate mathematical phenomena. This focus allows for precise and detailed visualizations that may be challenging to achieve with the broader, more generalized capabilities of Matplotlib and Plotly.

Manim's strength lies in its ability to seamlessly integrate mathematical expressions and animations, making it an ideal choice for educational content, presentations, and research where a deep understanding of mathematical relationships is crucial. By providing a specialized environment for mathematical visualizations, Manim enhances the clarity and educational impact of such visual representations.

In summary, while Matplotlib and Plotly are versatile tools for a wide range
of visualizations, Manim's specialization in mathematical graphics sets it apart,
making it a preferred choice for those seeking to convey complex mathematical
concepts through dynamic and visually compelling animations.


CONCLUSION
------*------
In summarizing the implementation process, our visualization proved to be a
harmonious blend of aesthetic appeal and educational depth. The animation not
only presented a visually engaging representation of linear regression but also
served as an effective pedagogical tool, seamlessly integrating quantitative
assessments like the R² score for a comprehensive understanding of the model's
accuracy.

Looking ahead, our venture opens the door to exciting possibilities for future
improvements and developments. Suggestions include refining visual aesthetics,
experimenting with color schemes, and introducing interactive elements to elevate
the user experience. Feature expansion is also on the horizon, with considerations
for incorporating regularization techniques and exploring alternative algorithms,
ensuring a more comprehensive exploration of linear regression concepts.

In conclusion, our endeavor to visualize linear regression with Manim stands as a commendable achievement. Beyond its immediate impact, this project serves as a foundational platform for future endeavors in machine learning visualization. Envisioning continual refinement and expansion, we eagerly anticipate collaborative efforts within the community, propelling the ongoing evolution of Manim as a powerful instrument for scientific storytelling in the realm of linear regression.


REFERENCES
------*------
1. Khandelwal, R. (2019). Linear Regression using Gradient Descent. Towards Data Science. Retrieved from https://towardsdatascience.com/linear-regression-using-gradient-descent-97a6c8700931

2. Nguyen Thanh Tuan. (2019). Bài 1: Linear Regression và Gradient descent. Deep Learning cơ bản. Retrieved from https://nttuan8.com/bai-1-linear-regression-va-gradient-descent/

3. Manim Community. (n.d.). Retrieved from https://www.manim.community/

4. Brian Amedee. (2021). Manim Tutorial | Updater Animations | Tutorial 1, Manim Explained. YouTube. Retrieved from https://youtu.be/MOv6yN7b2aI?si=AIkvVsxORb9wKaWU
