
DMA

– Notes for Week 3 –

1 Sequences
We will think of a sequence as an infinite, linearly ordered collection of
numbers. An example would be
1, 4, 9, 16, 25, 36, 49, 64, 81, 100, . . . (1)
or
0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, . . . (2)
We will often use the notation (a_n) to describe the elements of a sequence
and refer to a_n as the n-th element of the sequence. The number n is an
index, and we often start a sequence at index 0 or 1.
We can succinctly specify a sequence via a function f that is defined on,
say, the positive integers {1, 2, 3, 4, . . . }. In that case, we can write an explicit
expression for a sequence by setting

a_n = f(n).

For example, the sequence in (1) is defined by the function f(x) = x^2, and
we would write the sequence as

b_n = n^2 for n ≥ 1.
In a similar way, we can explicitly specify sequence (2) as follows:

c_n = n − 2⌊n/2⌋ for n ≥ 0.
We can also define sequences recursively. A recursive definition works
by explicitly specifying some of the first elements in a sequence and then
defining the rest by referring back to previous elements one or more indices
away. One example would be to define

c_0 = 0
c_1 = 1
c_n = c_{n−2} for n ≥ 2

The above recovers the sequence (2) since we can use the rule c_n = c_{n−2} to
convince ourselves that c_2 must be equal to c_0 = 0 and that c_3 must be equal
to c_1 = 1, etc.
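Both ways of specifying sequence (2) can be compared in a few lines of Python; a small sketch (the function names are ours):

```python
import math

def c_explicit(n):
    # Explicit form of sequence (2): n - 2*floor(n/2) is 0 for even n, 1 for odd n
    return n - 2 * math.floor(n / 2)

def c_recursive(n):
    # Recursive form: c_0 = 0, c_1 = 1, c_n = c_{n-2} for n >= 2
    if n == 0:
        return 0
    if n == 1:
        return 1
    return c_recursive(n - 2)

# Both definitions produce 0, 1, 0, 1, ...
print([c_explicit(n) for n in range(8)])   # [0, 1, 0, 1, 0, 1, 0, 1]
print([c_recursive(n) for n in range(8)])  # [0, 1, 0, 1, 0, 1, 0, 1]
```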

Example 1. We define a sequence f_n by setting f_0 = 1 and

f_n = n·f_{n−1} for n ≥ 1.

The first elements of the sequence are

1, 1, 2, 6, 24, 120, 720, 5040, . . .

This sequence is referred to as the factorial sequence, and we usually write
f_n as n!. Note that

n! = 1 · 2 · 3 · · · n.
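The recursive definition of the factorial sequence translates directly into code; a minimal sketch:

```python
def factorial(n):
    # f_0 = 1, f_n = n * f_{n-1} for n >= 1
    if n == 0:
        return 1
    return n * factorial(n - 1)

print([factorial(n) for n in range(8)])
# [1, 1, 2, 6, 24, 120, 720, 5040]
```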

A special class of recursively defined sequences consists of series (sums
of sequences). A series is defined from some sequence (a_n) by summing its
elements. If n_0 is the first index of a sequence (a_n), we define the series (s_n)
recursively as

s_{n_0} = a_{n_0}
s_n = s_{n−1} + a_n for n > n_0

We will often write

s_n = a_{n_0} + a_{n_0+1} + · · · + a_n

or

s_n = Σ_{k=n_0}^{n} a_k

to define a series.
A summation formula is an explicit expression for a series.
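In code, the recursive definition of s_n is just a running total; a sketch (the function name is ours):

```python
def partial_sums(terms):
    # Given a_{n0}, a_{n0+1}, ..., return the series s_n = s_{n-1} + a_n
    sums = []
    total = 0
    for term in terms:
        total += term
        sums.append(total)
    return sums

print(partial_sums([1, 2, 3, 4, 5]))  # [1, 3, 6, 10, 15]
```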

Example 2. Let (a_n) be the constant sequence 1, i.e. a_n = 1 for n ≥ 1. The
series of (a_n) is then given by the sequence

1, 2, 3, 4, 5, 6, 7, 8, 9, . . .

and we have obtained our first summation formula:

Σ_{k=1}^{n} 1 = n (3)

The derivation and proof of summation formulas are closely connected to
the concept of mathematical induction, which we will return to later in
the course. For now, we state some commonly used summation formulas
without proof:

Theorem 3.

Σ_{k=1}^{n} k = (n^2 + n)/2

Σ_{k=1}^{n} k^2 = (2n^3 + 3n^2 + n)/6

Σ_{k=1}^{n} c^k = (c^{n+1} − c)/(c − 1)

The last formula is valid for all c ≠ 1.

When c = 1 in the last summation formula, one can use formula (3)
instead.
As we will discover in the exercises, one can often go a long way by
combining the four summation formulas above with the general sum rules:

Theorem 4.

Σ_{k=1}^{n} (a_k + b_k) = Σ_{k=1}^{n} a_k + Σ_{k=1}^{n} b_k

Σ_{k=1}^{n} c·a_k = c · Σ_{k=1}^{n} a_k

Σ_{k=1}^{n} a_k = Σ_{k=1}^{m−1} a_k + Σ_{k=m}^{n} a_k
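The closed forms in Theorem 3 and the splitting rule in Theorem 4 are easy to check numerically; a small sketch (the helper name `check` is ours):

```python
def check(n, c=3):
    ks = range(1, n + 1)
    # Theorem 3: closed forms for the sums of k, k^2 and c^k
    assert sum(ks) == (n**2 + n) // 2
    assert sum(k**2 for k in ks) == (2*n**3 + 3*n**2 + n) // 6
    assert sum(c**k for k in ks) == (c**(n + 1) - c) // (c - 1)  # requires c != 1
    # Theorem 4: a sum over 1..n splits at any index m
    m = n // 2 + 1
    assert sum(ks) == sum(range(1, m)) + sum(range(m, n + 1))

for n in range(1, 50):
    check(n)
print("all formulas verified for n = 1, ..., 49")
```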

2 Functions and graphs

In this section, we recall the concept of functions and introduce some
relevant functions for the course.

• A power function is of the form

f(x) = x^a

where the exponent a is a constant. The expression is well-defined for
all x when a ∈ {1, 2, 3, . . . }, but it is sometimes necessary to restrict to
x > 0 for other values of a (recall the notation √x = x^{1/2}).

• An exponential function is of the form

f(x) = b^x

where the base b > 0 is a constant. The expression is well-defined for
all x ∈ R.

• A logarithmic function is of the form

f(x) = log_b(x)

where the base b > 1 is a constant. The expression is well-defined for all
x > 0. Logarithmic functions are the inverse functions of exponential
functions with the same base. Thus,

b^x = y ⇐⇒ log_b(y) = x

Theorem 5 (Properties of logarithms). For all a, b, c > 0 and all r ∈ R
we have

log_c(ab) = log_c(a) + log_c(b) (4)

log_b(a^r) = r · log_b(a) (5)

log_b(a) = log_c(a) / log_c(b) (6)

where, in each equation above, the logarithm bases are not 1.
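These identities can be sanity-checked with floating-point logarithms; a quick sketch (tolerances because of rounding):

```python
import math

a, b, c, r = 5.0, 2.0, 10.0, 3.0

# (4) log_c(ab) = log_c(a) + log_c(b)
assert math.isclose(math.log(a * b, c), math.log(a, c) + math.log(b, c))
# (5) log_b(a^r) = r * log_b(a)
assert math.isclose(math.log(a**r, b), r * math.log(a, b))
# (6) change of base: log_b(a) = log_c(a) / log_c(b)
assert math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))
print("logarithm identities hold (numerically)")
```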

• The absolute value is given by

|x| = x if x ≥ 0, and |x| = −x if x < 0.

• Floor is the function ⌊x⌋, which rounds a real number x to the largest
integer less than or equal to x. Thus,

⌊1/2⌋ = 0, ⌊−1/2⌋ = −1, ⌊π⌋ = 3, ⌊7⌋ = 7.

We can define the ceiling function in a similar way; ⌈x⌉ is the smallest
integer larger than or equal to x.
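Python's math module provides both functions; note in particular the behaviour on negative inputs:

```python
import math

print(math.floor(0.5))      # 0
print(math.floor(-0.5))     # -1  (floor rounds toward -infinity, not toward 0)
print(math.floor(math.pi))  # 3
print(math.floor(7))        # 7
print(math.ceil(0.5))       # 1
print(math.ceil(-0.5))      # 0
```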

Figure 1: Graphs of ⌊x⌋ and |x|

In computer science, it is often convenient to combine other functions
with the floor and ceiling functions. In order to perform calculations in such
situations, it is advantageous to notice that

⌊x⌋ = n ⇐⇒ n ≤ x < n + 1

and

⌈x⌉ = n ⇐⇒ n − 1 < x ≤ n.

Example 6. We have that

⌈log_2(x)⌉ = n

when

n − 1 < log_2(x) ≤ n,

which gives

2^{n−1} < 2^{log_2(x)} ≤ 2^n

or just

2^{n−1} < x ≤ 2^n.

Thus, we see that ⌈log_2(x)⌉ = n exactly when 2^{n−1} < x ≤ 2^n.
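We can confirm this characterization over a range of integers; a sketch (as an aside, for exact integer work Python's `int.bit_length` avoids floating-point logarithms entirely):

```python
import math

for x in range(1, 1025):
    n = math.ceil(math.log2(x))
    # Example 6: ceil(log_2(x)) = n exactly when 2^(n-1) < x <= 2^n
    assert 2**(n - 1) < x <= 2**n
    # Equivalent exact computation: the smallest n with 2^n >= x
    assert n == (x - 1).bit_length()
print("characterization holds for x = 1, ..., 1024")
```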

Recall that functions that map numbers to numbers can be shown in a
graph, which is often the easiest way to understand the most important
properties of a function. See Figures 1 and 2 for some examples of graphs.

Figure 2: The graph of ⌊log_2(x)⌋.

3 Asymptotic growth of functions

3.1 Collection of definitions
In this section we collect the definitions regarding asymptotic growth of
functions. Further explanations are available in the [CLRS] book and during
lectures.

Definition 7. We say that a function f : R+ → R is asymptotically positive
if there exists x0 ∈ R+ such that 0 < f(x) for all x ≥ x0.

We will also apply the above definition to functions that are defined on
some subset of the positive reals. A common choice of such a subset will be
the positive integers Z+ or the natural numbers N.

Definition 8 (Asymptotic notation). Let f and g be asymptotically positive
functions.

• We say that f(x) is O(g(x)) if there exist a constant c > 0 and x0
such that

f(x) ≤ c·g(x)

for all x ≥ x0. Think of this as “g grows at least as fast as f
asymptotically”.

• We say that f(x) is Θ(g(x)) if f(x) is O(g(x)) and g(x) is O(f(x)). Think
of this as “g and f grow at the same rate asymptotically”.

• We say that f(x) is o(g(x)) if for any constant c > 0 we can find x0
such that

f(x) < c·g(x)

for all x ≥ x0. Think of this as “g grows (strictly) faster than f
asymptotically”.

One can use the definition of big-O to show that big-Θ can equivalently
be defined in the following manner.

Definition 9 (Second definition of big-Θ). Let f and g be asymptotically
positive functions. We say that f(x) is Θ(g(x)) if there exist constants
c1, c2 > 0 and x0 such that

c1·g(x) ≤ f(x) ≤ c2·g(x)

for all x ≥ x0.
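For instance, f(x) = 3x^2 + 5x is Θ(x^2): since 5x ≤ x^2 once x ≥ 5, the constants c1 = 3, c2 = 4 and x0 = 5 work. A quick numeric check of these particular constants (a sketch, not a proof):

```python
def f(x):
    return 3 * x**2 + 5 * x

def g(x):
    return x**2

# Definition 9 with c1 = 3, c2 = 4, x0 = 5: c1*g(x) <= f(x) <= c2*g(x)
for x in range(5, 10_000):
    assert 3 * g(x) <= f(x) <= 4 * g(x)
print("3x^2 <= 3x^2 + 5x <= 4x^2 for all tested x >= 5")
```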

It can be useful to informally(!) think of the above defined asymptotic
notions as being analogous to comparison of numbers. Specifically,

f(x) is O(g(x)) is like “f ≤ g”
f(x) is o(g(x)) is like “f < g”
f(x) is Θ(g(x)) is like “f = g”

One must be careful and only use the above analogy to build intuition as
some properties that hold for comparison of numbers do not carry over for
functions. For instance, if a and b are numbers, then we have that a ≤ b or
b ≤ a. For functions, however, we can have a situation where f is not O(g)
and g is not O(f ). Can you think of such an example?
Both little-o and big-O give upper bounds. Intuitively, little-o gives a
strict upper bound while big-O gives an upper bound that is potentially not
strict. More formally, we have the following.

Theorem 10. Let f, g : R+ → R be asymptotically positive functions such
that f(x) is o(g(x)). Then we have that

1. f(x) is O(g(x)) and

2. g(x) is not O(f(x)).

3.2 Collection of rules
Theorem 11. Let f, g, h, p : R+ → R be asymptotically positive functions.

(R1) “Overall constant factors can be ignored”
If c > 0 is a constant, then c·f(x) is Θ(f(x)).

(R2) “For polynomials only the highest-order term matters”
If p(x) is a polynomial of degree d, then p(x) is Θ(x^d).

(R3) “The fastest growing term determines the growth rate”
If f(x) is o(g(x)), then c1·g(x) + c2·f(x) is Θ(g(x)), where c1 > 0 and
c2 ∈ R are constants.

(R4) “Logarithms grow faster than constants”
If c > 0 is a constant, then c is o(log_a(x)) for all a > 1.

(R5) “Powers (and polynomials) grow faster than logarithms”
log_a(x) is o(x^b) for all a > 1 and b > 0.

(R6) “Exponentials grow faster than powers (and polynomials)”
x^a is o(b^x) for all a and all b > 1.

(R7) “Larger powers grow faster”
x^a is o(x^b) if a < b.

(R8) “Exponentials with a bigger base grow faster”
a^x is o(b^x) if 0 < a < b.

Informally, we can summarize rules (R4)–(R6) from above as

Constants < Logarithms < Polynomials < Exponentials

Example 12. Let us use the above rules to show that 2^x + x grows asymp-
totically faster than 3x^2 + 5x (i.e. 3x^2 + 5x is o(2^x + x)).

(1) First, use (R2) to conclude that 3x^2 + 5x and x^2 grow at the same
rate asymptotically (i.e. 3x^2 + 5x is Θ(x^2)).

(2) Use (R6) to observe that 2^x grows faster than x^2 (i.e. x^2 is o(2^x)).

(3) Combine¹ the statements from (1) and (2) to conclude that 2^x
grows faster than 3x^2 + 5x (i.e. 3x^2 + 5x is o(2^x)).

(4) Use (R6) again to observe that 2^x grows faster than x (i.e. x is o(2^x)).
By (R3), we can now say that 2^x + x grows at the same rate as 2^x
(i.e. 2^x + x is Θ(2^x)).

(5) Finally, combine the statements from (3) and (4) to get that 2^x + x
grows faster than 3x^2 + 5x asymptotically (i.e. 3x^2 + 5x is o(2^x + x)).

¹Note that even though we don't have a formal rule saying this, we are using the fact
that if f1 grows faster than f2, which grows at the same rate as f3, then f1 grows faster
than f3. The formal statement is that if f2 is o(f1) and f2 is Θ(f3), then f3 is o(f1).
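Numerically, the conclusion of Example 12 shows up as the ratio (3x^2 + 5x)/(2^x + x) shrinking toward 0 as x grows; a quick sketch:

```python
def ratio(x):
    # For o-notation, (3x^2 + 5x) / (2^x + x) should tend to 0 as x grows
    return (3 * x**2 + 5 * x) / (2**x + x)

for x in [5, 10, 20, 40]:
    print(x, ratio(x))
# The ratio drops rapidly toward 0, as the o(...) statement predicts.
```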
