Moran
1 Introduction
Consider a fixed population of N individuals of types A and B, where the relative fitness of
individuals of type B is given by a real number s. The classical Moran process [13] models a
discrete-time process for the fixed-size population where at each step, one individual is chosen for
reproduction with probability proportional to its fitness (s for type B, 1 for type A), and then
replaces an individual chosen uniformly for death. In particular, for the numbers NA and NB of
individuals of each type, at each step of the process, the probability that NB increases by 1 and
$N_A$ decreases by 1 is
$$p_+ = \left(\frac{sN_B}{N_A + sN_B}\right)\left(\frac{N_A}{N_A + N_B}\right),$$
and the probability that NB decreases by 1 and NA increases by 1 is
$$p_- = \left(\frac{N_A}{N_A + sN_B}\right)\left(\frac{N_B}{N_A + N_B}\right).$$
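The two transition probabilities above translate directly into a simulation loop. The following sketch is an illustration only (the function names and parameter choices are ours, not part of the model's definition): it draws the reproducer by fitness and the replaced individual uniformly, exactly as described.

```python
import random

def moran_step(nA, nB, s):
    """One step of the classical Moran process: a reproducer is chosen with
    probability proportional to fitness (1 for type A, s for type B), and
    its offspring replaces a uniformly chosen individual."""
    w = nA + s * nB
    rep_B = random.random() < s * nB / w        # reproducer is type B?
    die_B = random.random() < nB / (nA + nB)    # replaced individual is type B?
    if rep_B and not die_B:
        return nA - 1, nB + 1
    if (not rep_B) and die_B:
        return nA + 1, nB - 1
    return nA, nB                               # no change

def fixation_frequency(N, s, runs=2000, seed=0):
    """Fraction of runs, started from a single B mutant, ending in all-B."""
    random.seed(seed)
    fixed = 0
    for _ in range(runs):
        nA, nB = N - 1, 1
        while 0 < nB < N:
            nA, nB = moran_step(nA, nB, s)
        fixed += (nB == N)
    return fixed / runs
```

For $s > 1$ the empirical frequency should be close to the exact value $(1 - s^{-1})/(1 - s^{-N})$ quoted later in the introduction.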
This process was generalized to graphs by Lieberman, Hauert and Nowak in [10], see also [14]. In
this setting, a fixed graph—whose vertices represent the fixed population—has vertex colors that
evolve over time representing the two types of individuals. We will focus on the case where one
vertex begins colored as Type B—the mutant type. In the Birth-Death process, a random vertex
v is chosen for reproduction, with vertices chosen with probabilities proportional to the fitness of
their color type, and then a random neighbor u of v is chosen for death, with the result that u gets
recolored with the color of v. In the Death-Birth process, a vertex v is chosen uniformly randomly
for death, and then a vertex u is chosen randomly from among its neighbors, with probabilities
proportional to the fitness of their respective types, with the result that v is recolored with the
color of u.
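The two updating rules just described can be made concrete in a minimal sketch (our own naming; the graph is given as an adjacency list and `mutants` is the current set of type-B vertices):

```python
import random

def bd_step(adj, mutants, s):
    """One Birth-Death step: v reproduces (chosen proportionally to fitness),
    a uniform neighbour u dies and takes v's colour."""
    weights = [s if v in mutants else 1.0 for v in range(len(adj))]
    v = random.choices(range(len(adj)), weights=weights)[0]
    u = random.choice(adj[v])
    return (mutants | {u}) if v in mutants else (mutants - {u})

def db_step(adj, mutants, s):
    """One Death-Birth step: v dies (chosen uniformly), then a neighbour u is
    chosen proportionally to fitness and recolours v."""
    v = random.randrange(len(adj))
    nbr_w = [s if u in mutants else 1.0 for u in adj[v]]
    u = random.choices(adj[v], weights=nbr_w)[0]
    return (mutants | {v}) if u in mutants else (mutants - {v})
```

Both absorbing states are easy to check: once the mutant set is empty or covers all vertices, no step can change it.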
∗ Research supported in part by NSF grant DMS1952285. Email: [email protected]
† Research supported in part by NSF grant DMS1700365. Email: [email protected]
The isothermal theorem of [10] implies that if G is a regular, connected, undirected graph, the
fixation probability for type B—that is, the probability that all vertices will eventually be of Type
B—depends only on the number n of vertices in G and the relative fitness s, and not, for example
on the particular regular graph or the particular choice of starting vertex (more generally, the same
holds for doubly stochastic—so called isothermal —weighted digraphs). But it is also observed in
[10] that beyond this special setting, the graph structure can have a dramatic effect on the fixation
probability. In the classical Moran model for a population of size N (equivalent to the graph process
when the graph is a complete graph on N vertices, with loops), the fixation probability for Type
B is
$$\frac{1 - s^{-1}}{1 - s^{-N}} \sim 1 - \frac{1}{s}$$
(where aN ∼ bN indicates that aN /bN → 1 as N → ∞). But the fixation probability for an
N -vertex star graph is, asymptotically in N , for s > 1,
$$1 - \frac{1}{s^2}$$
(see also [3, 2]).
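For concreteness, the two limits can be compared numerically (a small illustration; `moran_fixation` is our name for the exact classical formula):

```python
def moran_fixation(N, s):
    """Exact fixation probability of a single fitness-s mutant in the
    classical Moran process on N individuals."""
    return (1 - 1 / s) / (1 - s ** (-N))

s = 1.5
complete_limit = 1 - 1 / s        # classical / complete-graph limit
star_limit = 1 - 1 / s ** 2       # star-graph limit, for s > 1
print(moran_fixation(50, s), complete_limit, star_limit)
```

For every $s > 1$ the star limit exceeds the complete-graph limit, which is the amplification phenomenon observed in [10].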
For the special case when s = 1 the fixation probability can be characterized in terms of coalescence
times for random walks [4], but for s > 1 it is unknown whether a polynomial time algorithm exists
to determine the fixation probability given a particular input graph [5]; in place of this, heuristic
approximations (e.g., see [6]) or numerical experiments (e.g., [12]) are used to estimate fixation
probabilities.
In place of analyzing fixed graphs, one can analyze the fixation probability for a random graph
from some distribution. For the Erdős-Rényi random graph Gn,p , each possible edge among a set
of n vertices is included independently, each with probability p. For the case where 0 < p < 1 is
a constant independent of n, Adlam and Nowak leveraged the near-regularity of such graphs (in
particular, that they are “nearly-isothermal”) to show that the fixation probability on such graphs
is approximated by that of the classical Moran model. When p is not a constant but “small”, in
the sense that p = p(n) → 0 as n → ∞, Gn,p can exhibit significant diversity in vertex degrees,
and numerical experiments conducted by Mohamadichamgavi and Miekisz [7] showed a strong
dependence of the fixation probability on the degree of the initial mutant vertex.
In this paper we give a rigorous analysis of fixation probabilities for random graphs with degree
heterogeneity. In particular, our first result concerns $G_{n,p}$ when $p = p(n) = \frac{\log n + \omega(n)}{n}$, where $\omega(n)$
denotes any function satisfying ω(n) → ∞. When ω(n) is a slow-growing function, this places our
analysis right at the threshold for connectivity of Gn,p ; indeed, in this regime, Gn,p is connected
and most vertices have degree close to log n, but still there are ∼ log n vertices whose degree is just
1. With probability tending to 1 as n → ∞ (“with high probability”), the automorphism group of
the graph is trivial—that is, any two vertices can be distinguished by their relations in the network
structure (even if they have the same degree). Nevertheless, we prove that the degree of the initial
mutant vertex (or possibly one of its neighbors) is enough to asymptotically determine the fixation
probability on $G_{n,p}$. The following are defined with respect to $G = G_{n,p}$, where $d(v)$ denotes the degree of vertex $v$:
$$\varepsilon = \frac{1}{\log\log\log n},$$
$$S_0 = \left\{v : d(v) \le \frac{np}{10}\right\}. \tag{1}$$
$$S_1 = \left\{v : d(v) \notin I_\varepsilon = [(1 \pm \varepsilon)np]\right\}. \tag{2}$$
Theorem 1. Given a graph $G$ and a vertex $v_0$, we let $\varphi = \varphi^{BD}_{G,v_0,s}$ denote the fixation probability of the Birth-Death process on $G$ when the process is initialized with a mutant at $v_0$ of relative fitness $s > 1$. If $G$ is a random graph sampled according to the distribution $G_{n,p}$, then w.h.p., $G$ has the property that
On the other hand, if $s \le 1$ then $G$ has the property that $\varphi_{G,v_0} = o(1)$ regardless of $v_0$.
Theorem 2. Given a graph $G$ and a vertex $v_0$, we let $\varphi^{DB}_{G,v_0,s}$ denote the fixation probability of the Death-Birth process on $G$ when the process is initialized with a mutant at $v_0$ of relative fitness $s > 1$. If $G$ is a random graph sampled according to the distribution $G_{n,p}$, then w.h.p., $G$ has the
property that
Remark 1. In our proofs we establish recurrence relations that enable us to asymptotically deter-
mine the fixation probability in the above cases where it is not explicitly given. Unfortunately, these
recurrences are non-linear and difficult to solve explicitly, although in principle we could obtain
numerical results from them.
2 Notation
For a set $S \subseteq [n]$, we let $\bar S = [n]\setminus S$ and $e(S) = |\{vw \in E(G) : v, w \in S\}|$. If $S, T \subseteq [n]$ with $S \cap T = \emptyset$, then $e(S : T) = |\{vw : v \in S, w \in T\}|$. We let $N(S) = \{w \notin S : \exists v \in S \text{ s.t. } vw \in E(G)\}$, shortening $N(\{v\})$ to $N(v)$. We let $d_S(v) = |N(v) \cap S|$, $d(v) = d_{[n]}(v)$ and $\Delta = \max\{d(v) : v \in [n]\}$.
We let
$$n_1 = n - \frac{n}{(np)^{1/2}} \quad\text{and}\quad \omega_0 = \frac{\varepsilon^2 np}{100 \log np}.$$
We write A ∼ε B if A ∈ (1 ± O(ε))B as n → ∞ and A ≲ε B if A ≤ (1 − O(ε))B.
Chernoff Bounds. We use the following inequalities for the binomial random variable $B(N, p)$:
$$P(B(N,p) \le (1-\theta)Np) \le e^{-\theta^2 Np/2}, \qquad 0 \le \theta \le 1. \tag{3}$$
$$P(B(N,p) \ge (1+\theta)Np) \le e^{-\theta^2 Np/3}, \qquad 0 \le \theta \le 1. \tag{4}$$
$$P(B(N,p) \ge \lambda Np) \le \left(\frac{e}{\lambda}\right)^{\lambda Np}, \qquad \lambda > 0. \tag{5}$$
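These bounds are easy to sanity-check against the exact binomial tail (an illustrative check only; the parameter values below are arbitrary):

```python
from math import comb, exp

def lower_tail(N, p, m):
    """Exact P(B(N, p) <= m) for a binomial random variable."""
    return sum(comb(N, i) * p ** i * (1 - p) ** (N - i) for i in range(m + 1))

N, p, theta = 200, 0.3, 0.5
exact = lower_tail(N, p, int((1 - theta) * N * p))
chernoff = exp(-theta ** 2 * N * p / 2)   # right-hand side of (3)
print(exact, chernoff)
```

As expected, the exact lower tail sits well below the Chernoff bound.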
Assume that np = O(log n). We will deal with the simpler case of np/ log n → ∞ in Section 4.4.
The following hold w.h.p.
Lemma 3.
(a) $\Delta \le 5np$.
(b) $|S_1| \le n^{1 - \varepsilon^2/4}$.
(i) If $S$ induces a connected subgraph and $\omega_0/2 \le |S| \le n/(np)^{9/8}$ then $e(S : \bar S) \ge |S|np/2$.
(j) If $S \subseteq [n]$ and $S$ induces a connected subgraph then $S \cup N(S)$ contains at most $\frac{7}{\varepsilon^2}\max\left\{1, \frac{|S|}{\omega_0}\right\}$ members of $S_1 \cup N(S_1)$.
(k) Suppose that $|S| \le n/(np)^{9/8}$ and $S$ induces a connected subgraph. Let $B_k(S)$ be the set of vertices $v \in \bar S$ with $d_S(v) = k$. Then for $2 \le k \le (np)^{1/3}$, $|B_k(S)| \le \alpha_k |S| np$ where $\alpha_k = \varepsilon/k^2$.
(l) If $n/(np)^2 \le |S| \le n_1$, then there are at most $\theta|S|$ vertices $v \in S$ such that $d_{\bar S}(v) \notin (1 \pm \varepsilon)(n - |S|)p$, where $\theta = \frac{1}{\varepsilon^2 (np)^{1/2}}$.
(m) There do not exist disjoint sets $S, T \subseteq [n]$ with $n/(np)^{9/8} \le |S| \le n/(np)^{1/3}$ and $|T| = \theta(n - |S|)$ such that $e(S, T) \ge \alpha|S|\,|T|p$, where $\alpha = (np)^{1/4}$.
4 Birth-Death
$$p_+ = p^{BD}_+(X) = P(|X| \to |X|+1) = \frac{s}{w(X)}\sum_{v \in X} \frac{d_{\bar X}(v)}{d(v)}. \tag{6}$$
$$p_- = p^{BD}_-(X) = P(|X| \to |X|-1) = \frac{1}{w(X)}\sum_{v \in N(X)} \frac{d_X(v)}{d(v)}. \tag{7}$$
An iteration will be a step of the process in which X changes. We prove some lemmas that will be
useful in later sections. In the following Z is a model for |X|.
Proof. Let $\sigma = \gamma/2 + 1/4$ and let $E_t$ be the event that $Z$ makes at least $(t - t_0)\sigma - a/2$ positive moves at times $t_0 + 1, \ldots, t$. If $E_\tau$ occurs for $t_0 \le \tau \le t$ then $Z_t \ge a + 2(t - t_0)\sigma - (t - t_0) = a + (\gamma - 1/2)(t - t_0) > \max\{a, \rho t\}$ for $\rho$ sufficiently small. (This is true by assumption for $t = t_0$, and the LHS increases by $\gamma - 1/2 > \rho$ as $t$ increases by one.) Let $t_1 = t_0 + (b - a)/(\gamma - 1/2)$. If $E = \bigcap_{\tau=t_0}^{t_1} E_\tau$ occurs then $Z_{t_1} \ge b$. The Chernoff bounds imply that
$$P(\neg E) \le \sum_{\tau = t_0 + a/2\sigma}^{t_1} \exp\left\{-\frac{1}{2}\left(\frac{\gamma}{2} - \frac{1}{4}\right)^2 \gamma(\tau - t_0)\right\} \le e^{-\Omega(a)}.$$
Lemma 5. While $|X| \le n/(np)^{9/8}$, the probability that $(X \cup N(X)) \cap S_1$ increases in an iteration is $O(\varepsilon^{-2}/\omega_0)$.
Proof. Let $S_1^+ = S_1 \cup N(S_1)$. We consider the addition of a member of $S_1$ to $X \cup N(X)$. This would mean
the choice of v ∈ X and then the choice of a neighbor of v in S1+ . Since v is at distance at most 2
from S1 , Lemma 3(e) implies that d(v) ≥ ω0 . Let C be the component of the graph GX induced
by X that contains v. Assume first that |C| ≤ ω0 /2. Then v has at least ω0 /2 neighbors in X̄ and
so we see from Lemma 3(j) that the conditional probability of adding to S1 ∩ X is O(ε−2 /ω0 ).
Now assume that $\omega_0/2 < |C| \le n/(np)^{9/8}$. Let $C_0$ denote $\{v \in C : d_{\bar X}(v) > 0\}$. We estimate the probability of adding a vertex in $S_1$ to $X \cup N(X)$, conditional on choosing $v \in C_0$. Now Lemma 3(f) implies that $e(C) \le 10|C|$ and Lemma 3(i) implies that $e(C : \bar C) \ge |C|np/2$. So, very crudely,
$|C_0| \ge |C|/10$ by Lemma 3(a). We write
where
C1 = {v ∈ C0 \ S0 : dX (v) ≥ d(v)/2} and C2 = C0 \ (S0 ∪ C1 ).
The first sum in (8) is at most $|C_0 \cap S_0|$ and it follows from Lemma 3(d) that $|C_0 \cap S_0| \le 2|C|/\omega_0$ (the 2 from the fact that our lower bound on $|C|$ is $\omega_0/2$). It follows from Lemma 3(f) and $d_X(v_0) \ge$
log n/40 that the second sum in (8) is at most 800|C|/np. As for C2 , let A1 , A2 , . . . , Aℓ be the
components of the graph induced by C2 . It follows from Lemma 3(j) that
$$\sum_{v \in C_2} \frac{d_{\bar X \cap S_1^+}(v)}{d_{\bar X}(v)} = \sum_{i=1}^{\ell} \sum_{v \in A_i} \frac{d_{\bar X \cap S_1^+}(v)}{d_{\bar X}(v)} \le \frac{20}{np} \sum_{i=1}^{\ell} \sum_{v \in A_i} d_{S_1^+}(v) \le \frac{20}{np} \sum_{i=1}^{\ell} \frac{7}{\varepsilon^2}\max\left\{1, \frac{|A_i|}{\omega_0}\right\} = O\left(\frac{|C_2|}{\varepsilon^2 \omega_0 np}\right),$$
Lemma 6. W.h.p., there are no vertices in $S_0$ added to $N(X)$ and no vertices in $N(S_0)$ added to $X$ in the first $\omega_0^{3/4}$ iterations.
Proof. Suppose that a member of $S_0$ is added to $N(X)$ because we choose $v \in X$ and then add $u \in N(v)$ to $X$, where $N(u) \cap S_0 \ne \emptyset$. It follows from Lemma 3(d,e) that $d(v) \ge \omega_0$ and the choice of $u$ is unique. So the conditional probability of this happening over the first $\omega_0^{3/4}$ iterations is $O(\omega_0^{3/4}/\omega_0) = o(1)$.
If a vertex in $N(S_0)$ is added to $X$ then either (i) $v_0$ has a neighbor $w \in S_0$ which has two neighbors in $\hat X$, or (ii) $\hat X$ has two neighbors in $S_0$, or (iii) $v_0$ chooses $w$ as its neighbor during the process. Lemma 3(c) rules out (i) and Lemma 3(d) rules out (ii). Lemma 3(e) implies that $v_0$ has degree at least $\omega_0$ and so the probability of (iii) is $O(\omega_0^{3/4}/\omega_0) = o(1)$, given Lemma 3(d).
Remark 2. We will see in the next section that w.h.p. the size of $|X|$ follows a random walk with a positive drift in the increasing direction. It follows from this that to deal with cases where $0 < |X| \le \omega$ for some $\omega \le \omega_0^{1/2}$, we can assume that there will have been at most $O(\omega \log \omega)$ iterations to this point. More precisely, we can use the Chernoff bounds as we did in Lemma 4 to argue that if $|X|$ is not absorbed at 0 then $|X|$ will reach $\omega$ in $O(\omega \log \omega)$ iterations. Thus, given $|X| = \omega$ at some point in the process we have $|\hat X| = O(\omega \log \omega)$.
4.2 p+ versus p−
In this section we bound the ratio p+ /p− for various values of |X|. We will see later when we
analyse the cases in Section 4.3 that if |X| ≤ 20/ε3 then we only need to consider cases where
|X ∩ S1 |, |N (X) ∩ S0 | ≤ 1. This will in turn follow from the results of Section 4.1.
Because $|X^{(i)}|$ is small and $X^{(i)}$ induces a connected subgraph, Lemma 3(g) implies that $X^{(i)} \cup N_{\bar X}(X^{(i)})$ induces a tree or a unicyclic subgraph. Let $\delta_T^{(i)}$ be the indicator for $X^{(i)}$ inducing a tree and let $\delta_T = \sum_i \delta_T^{(i)}$.
Note that the number of edges inside $X$ is precisely $|X| - \delta_T \ge 0$, and the number of edges inside $X$ that are not incident to $X_1$ is precisely $|X| - \delta_T - d_X(X_1) \ge 0$.
We can simplify this as follows. If $|X| > |X_1|$, then $|X| - \delta_T - d_X(X_1) = o(np) = o(np(|X| - |X_1|))$.
On the other hand, if $|X| = |X_1| = 1$, then $\delta_T = 1$ and $d_X(X_1) = 0$. Thus in fact we have
$$p_+ \le \frac{s}{w(X)}\left((1 + 3\varepsilon)(|X| - |X_1|) + \frac{d_{\bar X}(X_1)}{d(X_1)}\right). \tag{9}$$
$$p_+ \ge \frac{s}{w(X)}\left((1 - 3\varepsilon)(|X| - |X_1|) + \frac{d_{\bar X}(X_1)}{d(X_1)}\right). \tag{10}$$
Case BD1a: $X_1 = Y_0 = \emptyset$.
In this case equations (9), (10) and $\varepsilon np \gg 1$ imply that
$$\frac{(1 - 3\varepsilon)|X|}{w(X)} \le p_- \le \frac{(1 + 3\varepsilon)|X|}{w(X)}.$$
So we have
$$\frac{p_+}{p_-} \sim_\varepsilon s. \tag{13}$$
Case BD1b: $X_1 = \{x_1\}$, $Y_0 = \emptyset$ and ($|X| > 1$ or $d(x_1) = \Omega(np)$).
If $|X| > 1$ then equations (11), (12) imply that
We then have, with the aid of (9) and (10) and the fact that $d(x_1) = \Omega(np)$ implies $d_{\bar X}(x_1) \sim_\varepsilon d(x_1)$, that
$$\frac{p_+}{p_-} \sim_\varepsilon \frac{s|X|}{|X| - 1 + d(x_1)/np}. \tag{15}$$
If $|X| = 1$ then $p_+ = \frac{s}{w(X)}$ and $p_- \sim_\varepsilon \frac{d(x_1)}{w(X)np}$, and so (15) holds in this case too.
We have $d_X(y_0) = 1$. To see this, observe that $\hat X$ defined in Remark 2 will be a connected set of size $o(\omega_0)$ and so Lemma 3(c) implies that $d_X(y_0) = 1$. But Lemma 3(d) implies that $d(y_0) \ge \omega_0$ and the term $1/d(y_0)$ is absorbed into error terms. Note that $x_1 = v_0 \notin S_0$ and so $d(x_1)/np \ge 1/10$. So (15) holds.
Case BD2: $|X| \in I_1 = [20/\varepsilon^3, n_1 = n - n/(np)^{1/2}]$ and $|X| \ge \varepsilon t$ where $t$ denotes the iteration number, and
$$|(X \cup N(X)) \cap S_1| \le \begin{cases} 1 & |X| \le \omega_0^{1/2}, \\ O(\varepsilon^{-2}|X|/\omega_0) & \omega_0^{1/2} \le |X| \le n/(np)^{9/8}. \end{cases} \tag{21}$$
It follows from Lemma 3(h),(j) that
$$e(X : \bar X) \ge \min\left\{e(X \setminus S_1 : \bar X),\, e(X : \bar X \setminus S_1)\right\} \ge e(X \setminus S_1 : \bar X \setminus S_1)$$
$$\ge (1 - 2\varepsilon)|X \setminus S_1|\,|\bar X \setminus S_1|p$$
$$\ge (1 - 2\varepsilon)\left(|X| - O\left(\frac{t}{\varepsilon^2 \omega_0}\right)\right)\left(|\bar X| - |S_1|\right)p \tag{22}$$
$$\ge (1 - 3\varepsilon)|X|\,|\bar X|p. \tag{23}$$
and similarly
$$e(X : \bar X) \le (1 + 3\varepsilon)|X|\,|\bar X|p.$$
Note that to go from (22) to (23) we use |X̄| ≫ |S1 | from Lemma 3(b) and the assumption that
|X| ≥ εt.
Now,
$$p_+ \ge \frac{s}{w(X)} \cdot \frac{e(X \setminus S_1 : \bar X)}{(1 + \varepsilon)np} \ge \frac{s(1 - 5\varepsilon)|X|(n - |X|)}{n\,w(X)}. \tag{24}$$
$$p_+ \le \frac{s}{w(X)}\left(\frac{e(X \setminus S_1 : \bar X)}{(1 - \varepsilon)np} + |X \cap S_1|\right) \le \frac{s(1 + 5\varepsilon)|X|(n - |X|)}{n\,w(X)}. \tag{25}$$
When $|X| \le n/(np)^{9/8}$ we use (21). For larger $X$ we have from Lemma 3(b) that $|X|(n - |X|) \ge \varepsilon^{-2} n|S_1|$.
So,
$$\frac{p_+}{p_-} \ge \frac{1}{5np}. \tag{31}$$
In this section we use the results of Section 4.2 to determine the asymptotic fixation probability for various starting situations.
First recall the following basic result on random walks, i.e. Gambler's Ruin: we consider a random walk $Z_0, Z_1, \ldots$ on $A = \{0, 1, \ldots, m\}$. Suppose that $Z_0 = z_0 > 0$ and that if $Z_t = x > 0$ then $P(Z_{t+1} = x - 1) = \beta$ and $P(Z_{t+1} = x + 1) = \alpha = 1 - \beta$. We assume that $0$ and $z_1 > z_0$ are absorbing states and that $\alpha > \beta$. Let $\varphi$ denote the probability that the walk is ultimately absorbed at 0. Then, see Feller [8] XIV.2,
$$\varphi = \frac{(\beta/\alpha)^{z_0} - (\beta/\alpha)^{z_1}}{1 - (\beta/\alpha)^{z_1}}. \tag{32}$$
Feller also proves that if $D$ denotes the expected duration of the game, started from $a$, then
$$D = \frac{m}{\alpha - \beta} \cdot \frac{1 - (\beta/\alpha)^a}{1 - (\beta/\alpha)^m} - \frac{a}{\alpha - \beta} = O(m). \tag{33}$$
Proof. Suppose that we consider a process $Z_1, Z_2, \ldots$, such that $Z_t$ is the size of $X$ after $t$ iterations, unless $X$ becomes zero. In the latter case we use 0 as a reflecting barrier for $Z_t$. Now consider the first $\tau = 40s/(\varepsilon^3(2\sigma - 1))$ steps of the $Z$ process, where $\sigma = \frac{s}{2(s+1)} + \frac{1}{4} \in \left(\frac{1}{2}, \frac{s}{s+1}\right)$. Given that the walk followed by $X$ increases at each step with probability $\sim_\varepsilon \frac{s}{s+1}$, the Chernoff bounds imply that w.h.p. $Z$ makes at least $2\tau\sigma/3$ positive steps and this means that $Z$ will at some stage reach $20/\varepsilon^3$. Going back to $X$, we see that w.h.p. either $|X|$ reaches 0 or $|X|$ reaches $20/\varepsilon^3$.
Proof. We first show that if $|X|$ reaches $\omega$ then w.h.p. $|X|$ reaches $n_1 = n - n/(np)^{1/2}$. This follows from Lemma 4 with $a = 10/\varepsilon^3$ and $b = n_1$, applied to the walk $Z_t = |X|$ at iteration $t$. In particular, as long as $|X| > \max\{a, \varepsilon t\}$, the hypotheses of the lemma are satisfied with a positive bias $\sim_\varepsilon s$, by (30).
Now assume that $|X| = n_1$. The analysis of Case BD3 shows that there is a probability of at least $\eta = (1/5np)^{n_2}$ that $X$ reaches $[n]$ after a further $n_2 = n - n_1 = n/(np)^{1/2}$ steps. Now consider
the following experiment: when $|X| = n_1$, the walk moves right with probability at least 1/2 and left with probability at most 1/2. If it moves right then there is a probability of at least $\eta$ that $|X|$ reaches $n$ before it returns to $n_1$. If it moves left then (32) implies that for any constant $\zeta < s/(s+1)$ the probability of $|X|$ reaching $20/\varepsilon^3$ before returning to $n_1$ is at most $(1 - \zeta)^{n_1}$. (This event can be analyzed with Case BD2 exclusively.) Let $m = \eta^{-1}\log n$. Then we have $m(1 - \zeta)^{n_1} \to 0$ and $m\eta \to \infty$. So w.h.p. $X$ will reach $[n]$ after at most $m$ returns and never visit $20/\varepsilon^3$. Indeed, the probability it does not reach $[n]$ is at most $o(1) + (1 - \eta)^{m/2} = o(1)$. (The first $o(1)$ bounds the probability that $|X|$ moves right from $n_1$ fewer than $m/2$ times.)
Lemma 9. Suppose that $d(v_0) = o(np)$ and let $\omega = np/d(v_0)$. Then w.h.p. $v_0 \in X$ for the first $\omega^{1/2}$ iterations.
Proof. If $X = \{v_0\}$ then it follows from Case BD1c that $p_-/p_+ = O(1/\omega)$ and the probability $X$ becomes empty is $O(1/\omega)$. If $v_0 \in X$ and $|X| > 1$ then the probability that $v_0$ is removed from $X$ in the next iteration is also $O(1/\omega)$. This is because all of $v_0$'s neighbors have degree $\Omega(np)$ outside $X$ (Lemma 3(d)) and all vertices of $X$ other than $v_0$ have many more neighbors in $\bar X$ than in $X$. (Note from Lemma 5 that no vertex in $X$ other than $v_0$ can be in $S_0$.)
So, the probability that $v_0$ gets removed this early is $O(\omega^{1/2}/\omega) = o(1)$.
Case BDF2: $d(v_0) = o(np)$: We are initially in Case BD1c and Lemma 9 implies that we stay in this case for the first $\omega^{1/2}$ iterations, where $\omega = np/d(v_0)$. Whenever $|X| = 1$, we see from (16) that the probability $X$ becomes empty in the next iteration is $O(1/\omega)$. Furthermore, (32) with $a = 2$ then implies that $|X|$ reaches $\omega^{1/3}$ with probability $\sim_\varepsilon (s-1)/s$ before returning to 1. Combining the above facts, we see that $|X|$ reaches $\omega^{1/3}$ within $\omega^{1/2}$ iterations and we can then apply Lemma 8 to complete the proof of Theorem 1(a).
conditioned on $X$ losing a vertex in the next iteration, the vertex it loses is $v_0$ with probability asymptotically equal to
$$\frac{\dfrac{\alpha np}{n}\cdot\dfrac{1}{np}}{\dfrac{(|X|-1)np}{n}\cdot\dfrac{1}{np} + \dfrac{\alpha np}{n}\cdot\dfrac{1}{np}} = \frac{\alpha}{|X| - 1 + \alpha}. \tag{34}$$
Also, if $v_0$ leaves $X$ then Lemma 5 implies that it only returns to $X$ with probability $O(\varepsilon^{-2}\omega_0^{1/2}/\omega_0) = o(1)$ in the next $\omega_0^{1/2}$ iterations.
We can thus asymptotically approximately model $|X|$ in the first $\omega_0^{1/2}$ iterations as a random walk $W_0 = (Z_0 = 1, Z_1, \ldots)$ on $\{0, 1, \ldots, n\}$ where at the $t$th step, if $Z_{t-1} = j > 0$, then either (i) $P(Z_t = Z_{t-1} + 1) = \alpha_j = s/(s + 1 + (\alpha - 1)/j)$ or (ii) $P(Z_t = Z_{t-1} + 1) = \beta = s/(s+1)$. The walk starts with probabilities as in (i) and at any stage may switch irrevocably to (ii) with probability $\sim_\varepsilon \eta_j = \alpha/(j - 1 + \alpha)$, by (34). The fixation probability is then asymptotically equal to the probability this walk reaches $m = \omega_0^{1/3}$ before it reaches 0. Let
$$q_j = \frac{s^{-j} - s^{-\omega_0^{1/3}}}{1 - s^{-\omega_0^{1/3}}}$$
denote the probability of reaching 0 before $m$ in the random walk $W_1$ where there is always a rightward bias of $s$.
Let $p_j = p_j(BDF3)$ denote the probability that the walk reaches 0 before $m = \omega_0^{1/3}$. Then $p_0 = 1$ and $p_m = 0$ and
$$p_j = \alpha_j p_{j+1} + (1 - \alpha_j)(1 - \eta_j)p_{j-1} + (1 - \alpha_j)\eta_j q_j \tag{35}$$
for $0 < j < m$, from which we can compute $p_1$, asymptotically.
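As noted in Remark 1, recurrences of this kind can be solved numerically even though they lack a clean closed form. The sketch below is our own implementation of a standard tridiagonal forward elimination / back substitution for (35); the values of `m`, `s` and `alpha` are illustrative inputs only.

```python
def solve_p(m, s, alpha):
    """Solve the boundary-value recurrence (35) with p_0 = 1, p_m = 0."""
    q = [(s ** -j - s ** -m) / (1 - s ** -m) for j in range(m + 1)]
    a = [0.0] + [s / (s + 1 + (alpha - 1) / j) for j in range(1, m + 1)]
    eta = [0.0] + [alpha / (j - 1 + alpha) for j in range(1, m + 1)]
    # Forward elimination: express p_j = c_j + d_j * p_{j+1}.
    c, d = [1.0], [0.0]                      # encodes p_0 = 1
    for j in range(1, m):
        low = (1 - a[j]) * (1 - eta[j])      # coefficient of p_{j-1}
        rhs = (1 - a[j]) * eta[j] * q[j]     # switching term
        denom = 1 - low * d[j - 1]
        c.append((low * c[j - 1] + rhs) / denom)
        d.append(a[j] / denom)
    p = [0.0] * (m + 1)
    p[0] = 1.0                               # p_m stays 0
    for j in range(m - 1, 0, -1):            # back substitution
        p[j] = c[j] + d[j] * p[j + 1]
    return p
```

Since (35) expresses each $p_j$ as a convex combination of quantities in $[0,1]$, the computed solution stays in $[0,1]$ and the elimination is numerically stable.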
If $|X|$ reaches $\omega_0^{1/2}$ then Lemma 8 implies that it will reach $n$ w.h.p. This establishes part (b) of Theorem 1.
4.3.2 $s < 1$
Arguing as above, we see that except when $|X| \le 20/\varepsilon^3$, w.h.p. the size of $X$ follows a random walk where the probability of moving left from a positive position is asymptotically at least $\frac{|X|-1}{(s+1)|X|} > \frac{1}{2}$ for $|X| > \frac{2}{1-s}$. We argue as we did at the end of Case BDF2, with right moves and left moves reversed, that w.h.p. $X$ becomes empty.
4.3.3 $s = 1$
It follows from Maciejewski [11] that the fixation probability of vertex $v$ is precisely $\pi(v) = d(v)^{-1}/\sum_{w \in [n]} d(w)^{-1}$. In a random graph with $np = O(\log n)$ this gives $\max_v \pi(v) = O(\log n/n)$, and when $np \gg \log n$ this gives $\max_v \pi(v) = O(1/n)$.
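Maciejewski's characterization can be verified exactly on a very small graph by brute force over all $2^n$ mutant sets. The sketch below is our own construction (only feasible for tiny graphs) of the absorbing Markov chain for neutral Birth-Death updating, solved with exact rational arithmetic:

```python
from fractions import Fraction
from itertools import combinations

def bd_neutral_fixation(adj, v0):
    """Exact fixation probability of a neutral (s = 1) mutant at v0 under
    Birth-Death updating, by solving the absorbing chain over mutant sets."""
    n = len(adj)
    states = [frozenset(c) for k in range(n + 1)
              for c in combinations(range(n), k)]
    idx = {S: i for i, S in enumerate(states)}
    N = len(states)
    # Set up (I - P) phi = b, with phi(empty) = 0 and phi(full) = 1.
    A = [[Fraction(0)] * N for _ in range(N)]
    b = [Fraction(0)] * N
    for i, S in enumerate(states):
        A[i][i] = Fraction(1)
        if len(S) == n:
            b[i] = Fraction(1)
        if len(S) in (0, n):
            continue                      # absorbing states
        for v in range(n):                # reproducer: uniform when s = 1
            for u in adj[v]:              # uniform neighbour is replaced
                pr = Fraction(1, n * len(adj[v]))
                T = S | {u} if v in S else S - {u}
                A[i][idx[T]] -= pr
    # Gauss-Jordan elimination over the rationals.
    for col in range(N):
        piv = next(r for r in range(col, N) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        inv = Fraction(1) / A[col][col]
        A[col] = [x * inv for x in A[col]]
        b[col] *= inv
        for r in range(N):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return b[idx[frozenset({v0})]]
```

On the path $0\!-\!1\!-\!2$ this returns $2/5$ for a leaf and $1/5$ for the centre, matching $d(v)^{-1}/\sum_w d(w)^{-1}$.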
If np/ log n → ∞ then all vertices have degree ∼ np, see Theorem 3.4 of Frieze and Karoński [9]. So
S1 = ∅ and all but (f), (g), (k), (l) of Lemma 3 hold trivially. But (f) is only used to bound e(X : X̄),
where there is the possibility of low degree vertices. This is unnecessary when np/ log n → ∞ since
then w.h.p. e(S : S̄) ∼ |S|(n − |S|)np for all S. Property (g) is only used in (9), (10) to bound
e(X). But because |X| is small this will be small compared to |X|np and only contributes to the
error term. Properties (k), (l) are not used in analysing Birth-Death. In conclusion we see that
only Case BDF1 is relevant and Theorem 1 holds in this case.
5 Death-Birth
The analysis here is similar to the Birth-Death process and so we will be less detailed in our description. We first replace (6), (7) by
$$p_+ = p^{DB}_+(X) = P(|X| \to |X|+1) = \frac{1}{n}\sum_{v \in N(X)} \frac{sd_X(v)}{sd_X(v) + d_{\bar X}(v)}. \tag{37}$$
$$p_- = p^{DB}_-(X) = P(|X| \to |X|-1) = \frac{1}{n}\sum_{v \in X} \frac{d_{\bar X}(v)}{sd_X(v) + d_{\bar X}(v)} \le \frac{|X|}{n}. \tag{38}$$
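The two sums in (37) and (38) can be evaluated mechanically for any state; the small helper below is our own illustration, not part of the paper's argument:

```python
def db_rates(adj, mutants, s):
    """The probabilities p_+ and p_- of (37)-(38) that |X| grows or shrinks
    in one Death-Birth step, evaluated directly from the definitions."""
    n = len(adj)
    def dX(v):    return sum(u in mutants for u in adj[v])
    def dXbar(v): return sum(u not in mutants for u in adj[v])
    # p_+: sum over N(X), i.e. non-mutants with at least one mutant neighbour.
    p_plus = sum(s * dX(v) / (s * dX(v) + dXbar(v))
                 for v in range(n) if v not in mutants and dX(v) > 0) / n
    # p_-: sum over X itself.
    p_minus = sum(dXbar(v) / (s * dX(v) + dXbar(v)) for v in mutants) / n
    return p_plus, p_minus
```

Note that the inequality $p_- \le |X|/n$ in (38) holds term by term, since each summand is at most 1.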
We use the notation of Section 4. We will once again assume first that np = O(log n) and remove
the restriction later in Section 5.4.
Lemma 10. While $|X| \le n/(np)^{9/8}$, the probability that $X \cap (S_1 \setminus S_0)$ increases in an iteration is $O(\varepsilon^{-2}/\omega_0)$.
Proof. We consider the addition of a member of $S_1$ to $X$. This would mean the choice of $v \in N(X) \cap (S_1 \setminus S_0)$ and then the choice of a neighbor $w$ of $v$ in $X$. Let $C$ be the component of the graph $G_X$ induced by $X$ that contains $w$. Assume first that $|X| \le np/20$. Lemma 3(e) implies that $d(w) \ge \omega_0$ and so $d_{\bar X}(v) \ge \omega_0/2$, and since $d(v) \ge np/10$ we can bound the probability of adding a member of $S_1 \setminus S_0$ by $O(\varepsilon^{-2}/\omega_0)$, where the $\varepsilon^{-2}$ term comes from Lemma 3(j), applied with $S = C$.
We next consider the first $\omega_0^{3/4}$ iterations.
Lemma 11. W.h.p. $N(X) \cap S_0$ does not increase during the first $\omega_0^{3/4}$ iterations.
Proof. Consider the addition of a member of $S_0$ to $N(X)$. Suppose that a member of $S_0$ is added to $N(X)$ because we choose $v \in N(X)$ where $N(v) \cap S_0 \ne \emptyset$ and we then choose $w \in N(v) \cap X$. Lemma 3(d) implies that $d(v), d(w) \ge np/10$ and so we can bound this possibility in the first $\omega_0^{3/4}$ iterations by $O(\omega_0^{3/4}/np) = o(1)$.
Lemma 12. W.h.p., if $d(v_0) \le \omega_0$ then $d_X(X_1) \le 1$ during the first $\omega_0^{3/4}$ iterations. Furthermore, if such a neighbor leaves $X$ then $d_X(X_1) = 0$ for the remaining iterations up to $\omega_0^{3/4}$.
Proof. After the first iteration either $X = \emptyset$ or $X = \{v_0, v_1\}$ where $v_1 \in N(v_0)$. As long as $|X| > 1$, the chance of adding another neighbor of $v_0$ to $X$ is $O\left(\frac{d(v_0) - 1}{(|X| - 1)np + d(v_0) - 1}\right) = O\left(\frac{\omega_0}{np}\right)$. So, the probability that $d_X(v_0)$ reaches 2 is $O(\omega_0^{7/4}/np) = o(1)$. The same calculation suffices for the second claim.
5.2 Bounds on p+ , p−
It follows from Lemma 11 that we only need to consider the case where (i) $|X| > \omega_0^{1/2}$ or (ii) $|X| \le \omega_0^{1/2}$ and $X \cap S_1 \subseteq \{v_0\}$ and $|N(X) \cap S_0| \le 1$.
Here we have used the fact that $d(v) \ge np/10$ and $np \gg |X|$ to remove $d_X(v)$ from the first two denominators. This is also used to remove the second summation in (39). So, separating $w \in X_1$ from the rest of $X$ we see that when $|X| > 1$ (using Lemma 3(j)),
$$p_+ \sim_\varepsilon \frac{s}{n}\left(|X| - |X_1| + \sum_{w \in X_1}\sum_{v \in N(w)\setminus X} \frac{1}{d_{\bar X}(v)} + \frac{|Y_0|}{sd_X(Y_0) + d_{\bar X}(Y_0)}\right). \tag{41}$$
Case DB1b: $|X| > 1$ and $X_1 = Y_0 = \emptyset$.
It follows from (41) that w.h.p.
$$p_+ \sim_\varepsilon \frac{s|X|}{n} \quad\text{and}\quad p_- \sim_\varepsilon \frac{|X|}{n}. \tag{45}$$
Case DB1c: $|X| > 1$ and $X_1 = \{x_1\}$ and $d(x_1) = \alpha np \gg \varepsilon^{-2}$ and $Y_0 = \emptyset$.
$$p_+ \sim_\varepsilon \frac{s(|X| - 1 + \alpha)}{n} \quad\text{and}\quad p_- \sim_\varepsilon \frac{|X|}{n}. \tag{46}$$
Here we have used Lemma 3(j) to replace the sum in (41) by α. (When we apply the lemma, the
set S will be the connected component of X that contains x1 .)
Case DB1d: $|X| > 1$ and $X_1 = \{x_1\}$ and $d(x_1) = O(\varepsilon^{-2})$ and $Y_0 = \emptyset$.
$$p_+ \sim_\varepsilon \frac{s(|X| - 1)}{n} \quad\text{and}\quad p_- \sim_\varepsilon \frac{1}{n}\left(|X| - 1 + \frac{d(x_1) - \delta_1}{s\delta_1 + d(x_1) - \delta_1}\right) \sim \frac{|X|}{n}, \tag{47}$$
where $\delta_1 = d_X(x_1)$. Note that Lemma 12 implies that w.h.p. $x_1$ has at most one neighbor in $X$. We have used Lemma 3(d) to remove the sum in (41).
Then, if $\hat N(X) = N(X) \setminus (B(X) \cup S_1)$, then from Lemma 3(h),(j),(k) and Lemma 4 (used to replace $|X|$ by $|\hat X|$ in one place),
$$\begin{aligned} |\hat N(X)| &\ge e(X \setminus S_1, \bar X \setminus S_1) - \sum_{k \ge 2} |B_k(X)| \\ &\ge (1 - 2\varepsilon)|X \setminus S_1|(n - |X| - |S_1|)p - \sum_{k=2}^{(np)^{1/3}} \frac{\varepsilon|X|np}{k^2} - \sum_{k=(np)^{1/3}}^{5np} \frac{10|X|}{k - 10} \\ &\ge (1 - 3\varepsilon)|X|np - |\hat X \cap S_1|np - \frac{\varepsilon\pi^2|X|np}{6} - (10 + o(1))|X|\log(5np) \\ &\ge (1 - 4\varepsilon)|X|np. \end{aligned}$$
So,
$$p_+ \ge \frac{1}{n}\sum_{v \in \hat N(X)} \frac{s}{(1 + \varepsilon)np} \gtrsim_\varepsilon \frac{s|X|}{n} \quad\text{and}\quad p_- \le \frac{|X|}{n}.$$
$$\le \frac{e(D(X) \setminus S_1, \bar X \setminus S_1) + 5np|S_1|}{(n + (s-1)|X| - ((2s+1)n - (s+1)|X|)\varepsilon)p} \le \frac{(1 + 3\varepsilon)|X|(n - |X|)p}{(n + (s-1)|X| - ((2s+1)n - (s+1)|X|)\varepsilon)p}. \tag{50}$$
$$\sum_{v \in X \cap S_1} \frac{d_{\bar X}(v)}{sd_X(v) + d_{\bar X}(v)} \le |X \cap S_1| \le |S_1| \le \frac{|X|}{np}. \tag{51}$$
$$\sum_{v \in X \setminus (D(X) \cup S_1)} \frac{d_{\bar X}(v)}{sd_X(v) + d_{\bar X}(v)} \le \theta|X|, \quad\text{where } \theta = \frac{1}{\varepsilon^2 (np)^{1/2}}. \tag{52}$$
Summing the above inequalities and simplifying, we see that if $|X| = \xi n$ then
$$p_- \lesssim_\varepsilon \frac{\xi(1 - \xi)}{1 + (s-1)\xi}. \tag{53}$$
We now look for a lower bound on $p_+$:
$$p_+ \ge \frac{1}{n}\sum_{v \in N(X)\cap(D(\bar X)\setminus S_1)} \frac{sd_X(v)}{(s-1)d_X(v) + (1+\varepsilon)np} \ge \frac{1}{n}\sum_{v \in N(X)\cap(D(\bar X)\setminus S_1)} \frac{sd_X(v)}{(s-1)(1+\varepsilon)|X|p + (1+\varepsilon)np}$$
With $\alpha = 1/(np)^{1/4}$ and $\theta = 1/(\varepsilon^2(np)^{1/2})$, it follows that in both cases $e(\bar X \setminus D(\bar X), X) \le \varepsilon|X|(n - |X|)p$. So,
$$p_+ \ge \frac{s(1 - 3\varepsilon)\xi(1 - \xi)}{(s-1)(1+\varepsilon)\xi + 1 + \varepsilon} \tag{54}$$
$$\gtrsim_\varepsilon \frac{s\xi(1 - \xi)}{(s-1)\xi + 1}. \tag{55}$$
In which case
$$\frac{p_+}{p_-} \gtrsim_\varepsilon \frac{s\xi(1 - \xi)}{(s-1)\xi + 1} \cdot \frac{1 + (s-1)\xi}{\xi(1 - \xi)} = s.$$
Proof. We first show that if |X| reaches ω then w.h.p. |X| reaches n1 = n − n/(np)1/2 . Let a = ω/2
and m = n1 − a. There is a positive bias of ∼ε s in Cases DB2, DB3 as long as |X| > a. It follows
from (32) that the probability |X| ever reaches a before reaching m is o(1).
Now consider the case of $|X| \ge n_1$. Comparing (7) and (37) we see that $p^{DB}_+(X) \ge p^{BD}_-(X)$. Comparing (6) and (38) we see that $p^{DB}_-(X) \le p^{BD}_+(X)$. By comparing this with Case BD3 of Section 4.2, we see that this implies that $p^{DB}_+(X)/p^{DB}_-(X) \ge 1/(5snp)$. Now consider the experiment described in Lemma 8. Beginning with $|X| = n_1$, we still have a probability of at most $(1 - \zeta)^{n_1}$ of $|X|$ reaching 0 before returning to $n_1$. Now there is a probability of at least $\eta = (5snp)^{-n/(np)^{1/2}}$ of $|X|$ reaching $n$ before returning to $n_1$.
Case DBF2: $v_0 \in S_1$, $Y_0 = \emptyset$ and $d(v_0) = \alpha np$ where $\alpha np \gg \varepsilon^{-2}$.
In this case we remain in Case DB1a or DB1c while $|X| \le \omega_0^{1/2}$, as long as $v_0$ is not removed from $X$. If $d(v_0) = \alpha np$ then the probability that this happens, conditional on a change in $X$, is $\sim_\varepsilon s/((s+1)|X| - 1 + \alpha)$. There are $|X|$ chances of about $s/n$ of choosing $v \in X$. Then for each $w \in X \setminus \{v_0\}$ there is a chance of about $1/n$ that $v$ is a neighbor of $w$ and that $v$ chooses $w$ as $u$. This leads to (35) with $\eta_j = s/((s+1)j - 1 + \alpha)$ and gives $p_j(DBF2)$.
We see from the above cases that when $|X|$ is small the chance that $X$ reaches $\omega_0^{1/2}$ yields (a), (b) of Theorem 2, because if $|X|$ reaches $\omega_0^{1/2}$ then there is a positive rightward bias and $X$ will w.h.p. eventually become $[n]$.
The case $s \le 1$. The above analysis holds for $s > 1$. For $s \le 1$ we go back to the case where $|X| \le \omega_0^{1/2}$. If $s < 1$ then we see from (45)–(48) that there are constants $C_1 > 0$, $0 < C_2 < 1$ such that if $|X| \ge C_1$ then $p_+/p_- \le C_2$. It follows that w.h.p. $|X|$ will return to $C_1$ before it reaches $\omega_0^{1/2}$, and then there is a probability bounded away from 0 that $|X|$ will go directly to 0.
The case $s = 1$. It follows from Maciejewski [11] that the fixation probability of vertex $v$ is precisely $\pi(v) = d(v)/\sum_{w \in [n]} d(w)$. In a random graph with $np = \Omega(\log n)$ this gives $\max_v \pi(v) = O(1/n)$.
When $np \gg \log n$ then $S_1 = \emptyset$ and all but (f), (g), (k), (l) of Lemma 3 hold trivially. Now (f) and (k) are used in bounding $e(X : \bar X)$ and are not therefore needed. (g) is not used in Death-Birth. The proof of (l) does not need $np = O(\log n)$. There is always a bias close to $s$ and the fixation probability is asymptotic to $\frac{s-1}{s}$.
References
[2] M. Broom and J. Rychtář, An analysis of the fixation probability of a mutant on special classes of non-directed graphs, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 464.2098 (2008), 2609–2627.
[3] F.A.C.C. Chalub, Asymptotic expression for the fixation probability of a mutant in star graphs, arXiv preprint arXiv:1404.3944 (2014).
[4] B. Allen, G. Lippner, Y.T. Chen et al., Evolutionary dynamics on any population structure, Nature 544 (2017), 227–230. https://doi.org/10.1038/nature21723
[5] R. Ibsen-Jensen, K. Chatterjee and M.A. Nowak, Computational complexity of ecological and evolutionary spatial dynamics, Proceedings of the National Academy of Sciences 112.51 (2015), 15636–15641.
[6] Y.P. Kuo, C. Nombela-Arrieta and O. Carja, A theory of evolutionary dynamics on any complex population structure reveals stem cell niche architecture as a spatial suppressor of selection, Nature Communications 15, 4666 (2024). https://doi.org/10.1038/s41467-024-48617-2
[7] J. Mohamadichamgavi and J. Miekisz, Effect of the degree of an initial mutant in Moran processes in structured populations, Physical Review E 109.4 (2024), 044406.
[8] W. Feller, An Introduction to Probability Theory and its Applications, 3rd Edition, Wiley, New York, 1968.
[9] A.M. Frieze and M. Karoński, Introduction to Random Graphs, Cambridge University Press, 2015.
[10] E. Lieberman, C. Hauert and M.A. Nowak, Evolutionary dynamics on graphs, Nature 433 (2005), 312–316. https://doi.org/10.1038/nature03204
[12] J. Mohamadichamgavi and J. Miekisz, Effect of the degree of an initial mutant in Moran processes in structured populations, Physical Review E 109 (2024).
A Proof of Lemma 3
(a) The degree $d(v)$ of vertex $v \in [n]$ is distributed as $\mathrm{Bin}(n-1, p)$. The Chernoff bound (5) implies that
$$P(\Delta > 5np) \le nP(\mathrm{Bin}(n, p) \ge 5np) \le n\left(\frac{e}{5}\right)^{5np} = o(1).$$
(b) We first observe that the Chernoff bounds (3), (4) imply that
$$P(B(n, p) \notin I_\varepsilon) = \sum_{i \notin I_\varepsilon} \binom{n}{i} p^i (1-p)^{n-i} \le e^{-\varepsilon^2 np/(3+o(1))}. \tag{56}$$
(c)
$$P(\exists v \in S_1 \cap C : \neg(c)) \le \sum_{k=3}^{\omega_0} \binom{n}{k} k!\, k\, p^k\, P(B(n-3, p) \notin I_\varepsilon - 2) \le 2(np)^{\omega_0} e^{-\varepsilon^2 np/(3+o(1))} = o(1).$$
Explanation: we sum over possible choices for a $k$-cycle $C$ of $K_n$. There are fewer than $\binom{n}{k}k!$ $k$-cycles in $K_n$. There are $k$ choices for a vertex of $C \cap S_1$. Given a cycle $C$ and $v \in C$ we multiply by the probability that the edges of $C$ exist in $G_{n,p}$ and that $(d_{\bar C}(v) + 2) \notin I_\varepsilon$.
(d)
$$P(\neg(d)) \le \binom{n}{2}\sum_{k=1}^{\omega_0 - 1}\binom{n}{k} k!\, p^{k+1}\left(\sum_{i=0}^{np/10}\binom{n}{i} p^i (1-p)^{n-i}\right)^2 \le n(np)^{\omega_0 + 1}\left(\sum_{i=0}^{np/10}\left(\frac{nep}{i(1-p)}\right)^i e^{-np}\right)^2 \le n(np)^{\omega_0 + 1}\left(2(10e)^{np/10} e^{-np}\right)^2 \le n^{1 + 1/10 + 2/3 - 2 + o(1)} = o(1).$$
Explanation: we sum over pairs of vertices $x, y$ and paths $P$ of length $k < \omega_0$ joining $x, y$. Then we multiply by the probability that these paths exist and then by the probability that $x, y$ have few neighbors outside $P$.
(e)
$$P(\exists x, y : \neg(e)) \le n^2 \sum_{\ell=1}^{\omega_0} n^\ell p^\ell e^{-\varepsilon^2 np/4}\sum_{k=0}^{\omega_0}\binom{n-2}{k} p^k (1-p)^{n-k-2} \le 2n(np)^{2\omega_0 + 1} e^{-(\varepsilon^2 np/4 + np)} = o(1).$$
(f)
$$P(\neg(f)) \le \sum_{s=20}^{2n/(np)^{9/8}} \binom{n}{s}\binom{\binom{s}{2}}{10s} p^{10s} \le \sum_{s=20}^{2n/(np)^{9/8}} \left(e\left(\frac{s}{n}\right)^9\left(\frac{enp}{20}\right)^{10}\right)^s = o(1).$$
Explanation: we choose a set of size $s$ and bound the probability it has $10s$ edges by the expected number of sets of $10s$ edges that it contains. The final claim uses the fact that $np = O(\log n)$.
(g)
$$P(\exists S : \neg(g)) \le \sum_{s=4}^{2\omega_0} \binom{n}{s}\binom{\binom{s}{2}}{s+1} p^{s+1} \le \sum_{s=4}^{2\omega_0} \omega_0\, ep\left(\frac{neps}{2}\right)^s = o(1).$$
We use a similar analysis as for property (f). The final claim also uses the fact that $np = O(\log n)$, in which case $(nep\omega_0)^{2\omega_0} \le n^{o(1)}$.
(h) At least one of $S, T$ has size at most $n/2$; assume it is $S$. Suppose first that $S$ induces a connected subgraph and that $|S| \le n/(np)^{9/8}$. We first note that
$$n - |T| \le \frac{n}{(np)^{9/8}} + n^{1 - \varepsilon^2/4} \le \varepsilon^2|T|.$$
Then we have
So now assume that $n/(np)^{9/8} \le |S| \le n/2$. Let $\hat I^{(d)} = [n/(np)^{9/8}, n/2]$. Fix $S_1$ and all edges incident with $S_1$. Then, where $m$ stands for $|S_1|$ and $n_\varepsilon = n^{1-\varepsilon^2/4}$,
$$P\left(\exists |S| \in \hat I^{(d)},\ e(S : T) \le (1 - \varepsilon)|S|\,|T|p\right) \le \sum_{s \in \hat I^{(d)}} \binom{n - m}{s} s^{s-2} p^{s-1} e^{-\varepsilon^2 s(n - s - m)p/3} \tag{57}$$
$$\le \sum_{s \in \hat I^{(d)}} \binom{n - m}{s} s^{s-2} p^{s-1} e^{-\varepsilon^2 snp/7} \le \sum_{s \in \hat I^{(d)}} \frac{1}{s^2 p}\, e^{-\varepsilon^2 nps/8} = o(1).$$
Explanation of (57): given $s$ there are $\binom{n}{s}$ choices for $S$ and $s^{s-2}$ choices for a spanning tree of $S$. The factor $p^{s-1}$ accounts for the probability that the spanning tree exists in $G_{n,p}$ and then the final factor $e^{-\varepsilon^2 s(n-s-m)p/3}$ comes from the Chernoff bounds, since $e(S : \bar S)$ is distributed as $\mathrm{Bin}(s(n-s), p)$. These are computed conditional on each $v \in S$ having a lower bound on its degree and applying the FKG inequality.
When it comes to estimating $P\left(\exists |S| \in \hat I^{(d)},\ e(S : T) \ge (1 + \varepsilon)|S|\,|T|p\right)$ we apply a similar argument, but this time when we apply the FKG inequality we use the fact that each vertex has an upper bound on its degree.
We now deal with the connectivity assumption. Suppose now that $S$ has a component $C$ of size less than $20/\varepsilon^3$. Then, using Lemma 3(g), we see that w.h.p. $|N(C)| \ge d(C) - 2|C| \ge (1 - 2\varepsilon)|C|np$ since $S \cap S_1 = \emptyset$. Clearly $|N(C)| \le (1 + 2\varepsilon)|C|np$, since $C \cap S_1 = \emptyset$. So, $S$ will inherit the required property from its components.
(i)
$$P(\exists S : e(S : \bar S) \le |S|np/2) \le \sum_{s=\omega_0/2}^{n/(np)^{9/8}} \binom{n}{s} s^{s-2} p^{s-1} e^{-s(n-s)p/3} \le \sum_{s=\omega_0/2}^{n/(np)^{9/8}} \frac{1}{s^2 p}\left(e^{1 - np/4}\, np\right)^s = o(1).$$
(j) Suppose first that $|S| \le \omega_0$. Let $n_0 = 5\omega_0 np$ be an upper bound on $|S \cup N(S)|$ and let $s_0 = 7/\varepsilon^2$. Then, using (a),
$$P(\exists S : \neg(j)) \le o(1) + \sum_{s=1}^{n_0} \binom{n}{s} s^{s-2} p^{s-1}\binom{s}{s_0}\binom{5s_0 np}{s_0} e^{-s_0 \varepsilon^2 np/(3+o(1))} \le o(1) + \sum_{s=1}^{n_0} \frac{n}{s^2 p}(enp)^s\left(\frac{5snp\, e^{2 - \varepsilon^2 np/(3+o(1))}}{s_0}\right)^{s_0} = o(1).$$
When $|S| > \omega_0$ we replace $s_0$ by $s_1 = \lceil 7s/(\varepsilon^2 \omega_0)\rceil$ to obtain
$$P(\exists S : \neg(j)) \le \sum_{s=\omega_0}^{n_0} \binom{n}{s} s^{s-2} p^{s-1}\binom{s}{s_1}\binom{5s_1 np}{s_1} e^{-s_1 \varepsilon^2 np/(3+o(1))} \le \sum_{s=\omega_0}^{n_0} \frac{n}{s^2 p}(enp)^s\left(\frac{5snp\, e^{2 - \varepsilon^2 np/(3+o(1))}}{s_1}\right)^{s_1} = o(1).$$
(k) We use $\binom{s}{k} p^k$ to bound the probability that $v \in B_k(S)$.
$$P(\exists S : \neg(k)) \le \sum_{k=2}^{(np)^{1/3}} \sum_{s=k}^{n/(np)^{9/8}} \binom{n}{s} s^{s-2} p^{s-1}\binom{n-s}{\alpha_k snp}\left(\binom{s}{k} p^k\right)^{\alpha_k snp} \le \sum_{k=2}^{(np)^{1/3}} \sum_{s=k}^{n/(np)^{9/8}} \frac{1}{s^2 p}\left((enp)\cdot\left(\frac{e^{k+1}(sp)^{k-1}}{k^k \alpha_k}\right)^{\varepsilon np/k^2}\right)^s = o(1).$$
(l) We can assume that $S$ induces a connected subgraph and then sum the contributions from each component. We first consider the case where $|S| \le n/2$.
$$P(\exists S : \neg(l)) \le \sum_{s=n/(np)^2}^{n/2} \binom{n}{s} s^{s-2} p^{s-1}\binom{s}{\theta s}\left(2e^{-\varepsilon^2(n-s)p/3}\right)^{\theta s} \le \sum_{s=n/(np)^2}^{n/2} \frac{1}{p}\left(nep\left(\frac{2e^{1 - \varepsilon^2(n-s)p/3}}{\theta}\right)^{\theta}\right)^s = o(1).$$
When $n/2 < |S| \le n_1$ we drop the connectivity constraint and replace $\binom{n}{s}$ by $4^s$. The summand is then equal to $\left(4e\left(2e^{-\varepsilon^2(n-s)p/3}/\theta\right)^{\theta}\right)^s$.