
Mathematics Internal Assessment

Modeling Chord Progressions through Markov Chains

Name: Abdullah Jamshed


Personal Code: lcg462
School Name: Lahore Grammar School Johar Town International
Subject: Mathematics AA HL

Date of Submission: April 2025



Contents

1 Introduction
2 Methodology
  2.1 Defining the Model
  2.2 Solving for the steady-state Vector
  2.3 Entropy to calculate predictability in Markov Chains
3 Exploration
  3.1 The Markov Chain applied to Jazz
    3.1.1 Steady-state equations
    3.1.2 Entropy And Probability
  3.2 The Markov Chain applied to Pop
    3.2.1 Steady-state analysis
    3.2.2 Entropy And Probability
4 Reflection
  4.1 Methodological Considerations
5 Conclusion
6 References

1. Introduction
Two years ago, I delved into the realm of music. I had always listened to music, but had
recently learnt how to play the guitar. Merging music and mathematics had always been at
the back of my mind, and through this IA I have brought that idea to life. My exploration
focuses on the mathematical modeling of chord progressions in two genres, namely Jazz and
Pop, using a first-order Markov Chain, analyzing how predictable each genre is and thereby
comparing the two genres on the basis of predictability. A Markov Chain is a stochastic
process that describes a sequence of possible events, where the probability of each event
depends solely on the state attained in the previous event. In musical terms, this means
modeling how one chord leads to another based only on the chord that preceded it: a perfect
application for analyzing patterns in songwriting and harmonic structure. Music is a language
that everyone understands, but beneath its melody there exists a universe of mathematical
patterns. One of these is the chord progression, a series of chords that makes up the
foundation of a song. But how predictable are these progressions? Can we model them
mathematically in order to learn about their structure? To model chord progressions and
their predictability correctly, I needed a model that predicts the next chord from the current
chord. This was essential, because an isolated chord does not carry much meaning; a chord
is only valuable within a progression. It was then that I found Markov Chains. This
investigation explores how Markov Chains, a probabilistic model, can analyze chord
progressions and quantify their predictability using entropy, by studying two chord
progressions: one from the track Giant Steps by John Coltrane, a highly acclaimed jazz
musician, and one from Baby by Justin Bieber, an extremely popular pop hit. Through Markov
Chains, we will determine the transition probabilities of each track, i.e. the probability
of one chord following another. Using the transition probabilities, we can compute
steady-state vectors, which give us the long-term distribution of the chords. Finally,
through the steady-state vectors, we can quantify predictability using entropy.

2. Methodology

2.1 Defining the Model

To develop a Markov Chain model for chord progressions, we first identify the key
components. The first is the set of states, i.e. the set of possible chords, e.g.

S = {C, G, Am, F}

And secondly, the transition probabilities for a given chord progression. For example:

C → G → Am → F → C

For this chord progression, we construct a transition matrix P where:

Pij =Probability of moving from chord i to chord j.

For the progression written above, the observed transitions are

C → G, G → Am, Am → F, F → C

This gives us the transition matrix P (rows and columns ordered C, G, Am, F):

P =
[ 0  1  0  0 ]
[ 0  0  1  0 ]
[ 0  0  0  1 ]
[ 1  0  0  0 ]
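The counting procedure described above can be sketched in code. This is only an illustrative sketch: the helper name `transition_matrix` and the nested-dictionary representation are my own choices, not part of the formal model.

```python
from collections import defaultdict

def transition_matrix(progression):
    # Count how often each chord follows each other chord,
    # then normalize each row so its probabilities sum to 1.
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(progression, progression[1:]):
        counts[a][b] += 1
    P = {}
    for chord, nxt in counts.items():
        total = sum(nxt.values())
        P[chord] = {b: c / total for b, c in nxt.items()}
    return P

P = transition_matrix(["C", "G", "Am", "F", "C"])
# Every transition in this loop is deterministic, e.g. P["C"]["G"] == 1.0
```

For the four-chord loop, each row is a single certain transition, which reproduces the 0/1 matrix above.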

2.2 Solving for the steady-state Vector


Once we have the transition matrix, we can then solve for the steady-state vector. The
steady-state vector of a Markov chain represents the long-term probability distribution of
the system’s states (e.g., musical chords) after many transitions. It answers the question:
"What is the likelihood of being in each state after the system has evolved indefinitely?"
To find the steady-state vector v for the example transition matrix P, we require that
v = [vC, vG, vAm, vF] satisfies the system of equations vP = v and vC + vG + vAm + vF = 1.
For our example transition matrix P, the steady-state vector can be found by solving:

[vC, vG, vAm, vF] ·
[ 0  1  0  0 ]
[ 0  0  1  0 ]
[ 0  0  0  1 ]
[ 1  0  0  0 ]
= [vC, vG, vAm, vF]

Multiply to get 4 equations:

1. vC · 0 + vG · 0 + vAm · 0 + vF · 1 = vC
⇒ vF = vC
2. vC · 1 + vG · 0 + vAm · 0 + vF · 0 = vG
⇒ vC = vG
3. vC · 0 + vG · 1 + vAm · 0 + vF · 0 = vAm
⇒ vG = vAm
4. vC · 0 + vG · 0 + vAm · 1 + vF · 0 = vF
⇒ vAm = vF

From the equations:

vF = vC , vC = vG , vG = vAm , vAm = vF

All probabilities are equal:


vC = vG = vAm = vF

Then apply the normalization constraint: the sum of all probabilities equals one:

vC + vG + vAm + vF = 1

Substitute vC = vG = vAm = vF :

4vC = 1 ⇒ vC = 1/4

We then get the steady-state vector:

v = [1/4, 1/4, 1/4, 1/4]

The steady-state vector indicates that, over time, each chord appears equally often (25%
probability each). This uniformity arises because the progression cycles deterministically
through all chords without bias. The steady-state vector thus reflects the balance between
short-term predictability (fixed transitions) and long-term randomness (equal distribution).
In essence, it quantifies the "equilibrium" of the system.
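The hand calculation above can be cross-checked numerically. The sketch below assumes NumPy is available; the helper name `steady_state` is my own. It solves vP = v, i.e. (Pᵀ − I)v = 0, replacing one redundant balance equation with the normalization constraint.

```python
import numpy as np

def steady_state(P):
    # Solve v P = v together with sum(v) = 1 by overwriting one
    # redundant balance equation with the normalization constraint.
    n = P.shape[0]
    A = P.T - np.eye(n)   # rows of (P^T - I) v = 0
    A[-1, :] = 1.0        # replace last equation with sum(v) = 1
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

P = np.array([[0, 1, 0, 0],   # C  -> G
              [0, 0, 1, 0],   # G  -> Am
              [0, 0, 0, 1],   # Am -> F
              [1, 0, 0, 0]],  # F  -> C
             dtype=float)
v = steady_state(P)  # ≈ [0.25, 0.25, 0.25, 0.25]
```

Solving the linear system directly (rather than iterating vPⁿ) also handles cyclic progressions like this one, whose powers never settle down even though a stationary distribution exists.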

2.3 Entropy to calculate predictability in Markov Chains


Entropy, in the context of Markov Chains and information theory, is a quantitative measure
of uncertainty, disorder, or unpredictability in a system. For a probability distribution
(such as the steady-state vector of a Markov chain), entropy measures how much "surprise"
or "information" is inherent in the system's possible outcomes. For example, maximal entropy
means that all outcomes are equally likely, i.e. the system is perfectly unpredictable. On
the other hand, minimal entropy occurs when one outcome has a 100% probability, i.e. one
chord endlessly repeating. Entropy is an effective tool in the analysis of Markov Chains in
music because it measures the unpredictability of a chord progression over time. The entropy
of the steady-state vector can be calculated using the formula

H(v) = − Σ_i v_i log2(v_i),

which lets us determine whether a progression will be predictable or mostly random. For
example, a simple four-chord loop achieves maximum entropy (2 bits), meaning all chords
appear equally often in the long run, despite its deterministic short-term transitions. In
contrast, a two-chord alternation would yield lower entropy, reflecting its repetitive
nature. For our example steady-state vector v, the entropy is

H = −4 (0.25 log2(0.25)) = 2 bits.

(The maximum entropy for 4 states is log2(4) = 2 bits.)


The predictability is one minus the ratio of the calculated entropy to the maximum possible
entropy:

Predictability = 1 − H/Hmax

For our example steady-state vector v:

1 − 2/2 = 0

This means that our example progression is completely unpredictable in the long term, in
the sense that, after many transitions, the system is equally likely to be on any of the
four chords.
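The entropy and predictability formulas above translate directly into code. This is a small sketch; the function names `entropy` and `predictability` are my own choices.

```python
import math

def entropy(v):
    # Shannon entropy in bits; terms with p = 0 contribute nothing.
    return -sum(p * math.log2(p) for p in v if p > 0)

def predictability(v):
    # 1 - H / H_max, where H_max = log2(number of states).
    return 1 - entropy(v) / math.log2(len(v))

v = [0.25, 0.25, 0.25, 0.25]
print(entropy(v))         # prints 2.0 (bits)
print(predictability(v))  # prints 0.0
```

The `p > 0` guard matters because log2(0) is undefined, yet a state of probability zero contributes no uncertainty.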

Figure 1: Giant Steps Chord Progression

3. Exploration

3.1 The Markov Chain applied to Jazz


I first implemented the Markov Chain model on the first selected genre, jazz. Specifically,
I extracted the chord progression from the song Giant Steps by John Coltrane. I then
created a transition matrix from the given sequence of chords. First, I identified all the
unique chords in the sequence (B, D, G, A#, D#, A, F#, F, C#); then I counted each
transition in the sequence and divided each count by the total number of transitions
originating from that chord. Through this process I obtained the transition matrix:

With the states ordered B, D, G, A#, D#, A, F#, F, C# (rows = current chord, columns =
next chord):

        B     D     G     A#    D#    A     F#    F     C#
B    [  0    0.5    0     0     0     0     0    0.5    0   ]
D    [  0     0    0.67   0     0    0.33   0     0     0   ]
G    [  0     0     0    0.67   0     0     0     0    0.33 ]
A#   [  0     0     0     0     1     0     0     0     0   ]
D#   [  0     0     0     0     0    0.5   0.25   0    0.25 ]
A    [  0     1     0     0     0     0     0     0     0   ]
F#   [ 0.5    0     0     0     0     0     0    0.5    0   ]
F    [  0     0     0     1     0     0     0     0     0   ]
C#   [  0     0     0     0     0     0     1     0     0   ]

Using our transition matrix, we want to find the steady-state probability vector

v = [vB, vD, vG, vA#, vD#, vA, vF#, vF, vC#]

3.1.1 Steady-state equations

We want the steady-state probability vector v to satisfy

vP = v

where P is the transition matrix, together with Σ_i v_i = 1.
From the transition matrix, the steady-state equations are:

1. For B: vB = 0.5 vF#
2. For D: vD = 0.5 vB + vA
3. For G: vG = (2/3) vD
4. For A#: vA# = (2/3) vG + vF
5. For D#: vD# = vA#
6. For A: vA = 0.5 vD#
7. For F#: vF# = 0.25 vD# + vC#
8. For F: vF = 0.5 vB + 0.5 vF#
9. For C#: vC# = 0.25 vD# + (1/3) vG

Solving the System of Equations:

1. From vD# = vA#, substitute vD# with vA# in all equations.

2. From vA = 0.5 vD# = 0.5 vA#.

3. From vG = (2/3) vD.

4. From vA# = (2/3) vG + vF.

5. From vF = 0.5 vB + 0.5 vF#.

6. From vB = 0.5 vF#, substitute into vF:

   vF = 0.5(0.5 vF#) + 0.5 vF# = 0.75 vF#

7. From vC# = 0.25 vA# + (1/3) vG.

8. From vF# = 0.25 vA# + vC#, substitute vC#:

   vF# = 0.25 vA# + 0.25 vA# + (1/3) vG = 0.5 vA# + (1/3) vG

Now express everything in terms of vA# and vG:

From vG = (2/3) vD and vD = 0.5 vB + vA = 0.5(0.5 vF#) + 0.5 vA# = 0.25 vF# + 0.5 vA#:

vG = (2/3)(0.25 vF# + 0.5 vA#) = (1/6) vF# + (1/3) vA#


Substitute vG into vF#:

vF# = 0.5 vA# + (1/3)((1/6) vF# + (1/3) vA#)

vF# = 0.5 vA# + (1/18) vF# + (1/9) vA#

vF# − (1/18) vF# = (0.5 + 1/9) vA#

(17/18) vF# = (11/18) vA#

vF# = (11/17) vA#

Now substitute back:

vB = 0.5 vF# = (11/34) vA#

vF = 0.75 vF# = (33/68) vA#

vA = 0.5 vA#

vD = 0.5 vB + vA = (11/68) vA# + (34/68) vA# = (45/68) vA#

vG = (2/3) vD = (30/68) vA# = (15/34) vA#

vC# = 0.25 vA# + (1/3) vG = (17/68) vA# + (20/68) vA# = (37/68) vA#

9. Normalize v:

vA# (1 + 11/34 + 45/68 + 15/34 + 1 + 1/2 + 11/17 + 33/68 + 37/68) = 1

vA# (381/68) = 1 ⟹ vA# = 68/381

Compute all probabilities:

vB = (11/34) · (68/381) = 22/381
vD = (45/68) · (68/381) = 45/381
vG = (15/34) · (68/381) = 30/381
vD# = 68/381
vA = 34/381
vF# = 44/381
vF = 33/381
vC# = 37/381

Final Steady-State Vector

v = [22/381, 45/381, 30/381, 68/381, 68/381, 34/381, 44/381, 33/381, 37/381]

Approximate decimal form:

v ≈ [0.0577, 0.1181, 0.0787, 0.1785, 0.1785, 0.0892, 0.1155, 0.0866, 0.0971]

3.1.2 Entropy And Probability

The given steady-state vector

v ≈ [0.0577, 0.1181, 0.0787, 0.1785, 0.1785, 0.0892, 0.1155, 0.0866, 0.0971]

describes the long-term probability distribution of states in the Markov chain. To
quantify its predictability, we first calculate the entropy using:

H(v) = − Σ_{i=1}^{9} v_i log2(v_i) ≈ 3.082

This high entropy value indicates substantial unpredictability, reaching 97.2% of the
maximum possible entropy (log2(9) ≈ 3.170) for a 9-state system.
The corresponding predictability score:

Predictability = 1 − H(v)/Hmax ≈ 2.8%

demonstrates that the system exhibits nearly minimal long-term predictability. This
near-uniform distribution still contains some notable structure: A# and D# (each with
probability 17.85%) are about three times more likely than B (5.77%), while the remaining
chords show intermediate probabilities. The combination of high entropy and low
predictability suggests that while certain transitions may be favored in the short term
(particularly those into A# and D#), the long-term behavior maintains significant
variability.
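As a quick numerical cross-check of these figures, the steady-state fractions found above can be fed back through the entropy formula (an illustrative sketch, not part of the derivation):

```python
import math

# Steady-state probabilities from the derivation, as fractions of 381,
# in the order B, D, G, A#, D#, A, F#, F, C#.
weights = [22, 45, 30, 68, 68, 34, 44, 33, 37]
v = [w / 381 for w in weights]

H = -sum(p * math.log2(p) for p in v)   # entropy, roughly 3.08 bits
H_max = math.log2(len(v))               # log2(9), roughly 3.17 bits
pred = 1 - H / H_max                    # roughly 0.028, i.e. about 2.8 %
```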

3.2 The Markov Chain applied to Pop


Using the same method we used to analyze Jazz, we are now going to analyze Pop Music
through Justin Bieber’s popular song Baby. The verse progression repeats:

D → Bm → G → A
Through this verse progression, the constructed transition matrix is:

      [ P(D → D)    P(D → Bm)    P(D → G)    P(D → A)  ]   [ 0 1 0 0 ]
P =   [ P(Bm → D)   P(Bm → Bm)   P(Bm → G)   P(Bm → A) ] = [ 0 0 1 0 ]
      [ P(G → D)    P(G → Bm)    P(G → G)    P(G → A)  ]   [ 0 0 0 1 ]
      [ P(A → D)    P(A → Bm)    P(A → G)    P(A → A)  ]   [ 1 0 0 0 ]

3.2.1 Steady-state analysis

Using the transitional matrix, we attempt to solve for the steady-state matrix using the
same method we applied before. Let v = [vD , vBm , vG , vA ]. We solve:

vD = vA
vBm = vD
vG = vBm
vA = vG
1 = vD + vBm + vG + vA

Substituting recursively:

vD = vBm = vG = vA

4vD = 1 ⟹ vD = 1/4

Thus:

v = [1/4, 1/4, 1/4, 1/4]

3.2.2 Entropy And Probability

Through the steady-state vector that was found, we can now calculate the entropy and
predictability:

H(v) = − Σ_{i=1}^{4} v_i log2(v_i) = −4 ((1/4) log2(1/4)) = 2 bits

For 4 states, the maximum entropy obtainable is:

Hmax = log2(4) = 2 bits

Through this, we obtain the predictability:

Predictability = 1 − H/Hmax = 1 − 2/2 = 0%

Here, we obtain a completely unpredictable result. Intuitively, one would assume that, due
to the repetition of a few chords, pop music is easily predictable. Yet, due to the nature
of Markov Chains, the steady-state vector reveals that in the long run there is an equal
probability of the system playing any of the four chords. For that reason, the
predictability obtained is 0%.
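Putting the two genres side by side makes the comparison concrete. This sketch simply reuses the numbers computed in this exploration; the helper name `predictability` is my own.

```python
import math

def predictability(v):
    # 1 - H / H_max for a probability distribution v.
    H = -sum(p * math.log2(p) for p in v if p > 0)
    return 1 - H / math.log2(len(v))

pop = [0.25] * 4                                          # Baby (verse loop)
jazz = [w / 381 for w in [22, 45, 30, 68, 68, 34, 44, 33, 37]]  # Giant Steps

print(predictability(pop))   # prints 0.0
print(predictability(jazz))  # about 0.028, i.e. 2.8 %
```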

4. Reflection
My probability analysis was rudimentary and one-dimensional, which would account for
findings that felt counterintuitive. I was surprised by my results: I did not expect Jazz
to come out as more predictable than Pop. Intuitively, Pop is musically more predictable
and easier to understand than Jazz, yet my findings revealed otherwise:

Predictability_pop < Predictability_jazz    (1)

This unexpected outcome prompted careful re-examination of several factors:

4.1 Methodological Considerations


• Markov Order: First-order chains may inadequately capture jazz's longer harmonic patterns

• Chord Representation: Jazz’s extended chords (e.g., Cmaj7, G9) were simplified
in analysis

• Improvisation: Solo sections were excluded from the chord progressions analyzed

What I could have done to improve my research:

• Higher-order Markov analysis of jazz standards

• Comparative study of different jazz subgenres

• Incorporation of rhythmic factors into predictability metrics

5. Conclusion
My aim was to model two different genres of music through Markov Chains; I achieved this
and successfully compared the two genres by how predictable and/or random they are.
Ultimately, this project not only allowed me to apply mathematical theory in a real-world
context, but also deepened my appreciation for the hidden structure within music and for
how different styles can be interpreted through the lens of probability and transition
modeling.

6. References
Kiefer, P., & Riehl, M. (n.d.). Markov Chains of Chord Progressions. Ball State
Undergraduate Mathematics Exchange.
https://digitalresearch.bsu.edu/mathexchange/wp-content/uploads/2021/02/Markov-Chains-of-Chord-Progressions_kiefer_peter.pdf

Bits and binary digits. (n.d.).
https://cs.stanford.edu/people/eroberts/courses/soco/projects/1999-00/information-theory/entropy_of_english_9.html