
COMP 250 Fall 2012

lecture 15 - big O

Oct. 16, 2012 (updated Oct. 17)

We have seen several algorithms in the course, and loosely characterized the time it takes to run them in terms of the size n of the input. Let's now tighten up our analysis. We will study the behavior of algorithms by comparing the number of operations required, which is typically a complicated function of the input size n, against some simpler function of n. This simpler function describes an upper bound, in a certain technical sense to be specified below. The technical definition is reminiscent of the technical definition of a limit that some of you may have seen in Calculus (and that all of you have seen in Real Analysis if you are in Math). However, there are important differences. A limit describes an asymptotic convergence of a function (or asymptotic divergence, in the case that the limit is infinity). Big O describes an asymptotic upper bound.

Big O
Let t(n) be a well-defined sequence of integers. This sequence t(n) represents the time or number of steps it takes an algorithm to run as a function of some variable n, which itself represents the size of the input. We will consider n >= 0 and t(n) to both be positive integers. Let g(n) be another well-defined sequence of integers. We commonly consider g(n) to be one of the following:

    1, log n, n, n log n, n^2, n^3, 2^n, ...

We would like to say that t(n) is bounded above by a simple function g(n) if, for n sufficiently large, we have t(n) <= g(n). We would say that t(n) is asymptotically bounded above by g(n). Formally, let's say that t(n) is asymptotically bounded above by g(n) if there exists a positive number n0 such that, for all n >= n0, t(n) <= g(n).

For example, consider the function t(n) = 5 + 7n. For n sufficiently large, t(n) <= 8n. Thus, t(n) is asymptotically bounded above by g(n) = 8n in the sense that I just defined. Notice that the constant 8 is arbitrary here. Any constant greater than 7 would do. For example, t(n) is also asymptotically bounded above by g(n) = 7.00001n.

It is much more common and useful to talk about asymptotic upper bounds on t(n) in terms of a simpler function g(n), namely one where we don't have constants in the g(n). To do this, one needs a slightly more complicated definition. This is the standard definition of an asymptotic upper bound:

Definition (big O): The sequence t(n) is O(g(n)) if there exist two positive numbers n0 and c such that, for all n >= n0, t(n) <= c g(n). We say t(n) is "big O of g(n)".

I emphasize that the condition n >= n0 allows us to ignore how t(n) compares with g(n) when n is small. In this sense, it describes an asymptotic upper bound.

Example 1

The function t(n) = 5 + 7n is O(n). To prove this, we write:

    t(n) = 5 + 7n
        <= 5n + 7n,   for n >= 1
         = 12n
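As a quick numerical sanity check on Example 1 (a sketch of my own, not part of the notes; the helper name `is_bounded` is made up), the constants c = 12, n0 = 1 can be tested over a finite range. Such a test can refute a claimed pair of constants, but of course cannot by itself prove the bound for all n.

```python
def is_bounded(t, g, c, n0, n_max=10_000):
    """Check that t(n) <= c*g(n) for all n0 <= n <= n_max.

    This only tests finitely many n, so it can refute a claimed
    (c, n0) pair but cannot prove the asymptotic bound."""
    return all(t(n) <= c * g(n) for n in range(n0, n_max + 1))

t = lambda n: 5 + 7 * n   # the function from Example 1
g = lambda n: n

print(is_bounded(t, g, c=12, n0=1))  # True: the proof's constants work
print(is_bounded(t, g, c=7, n0=1))   # False: 5 + 7n <= 7n never holds
```

The c = 7 check fails for every n, which matches the earlier remark that any constant strictly greater than 7 would do, but 7 itself would not.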

and so n0 = 1 and c = 12 satisfies the definition. An alternative proof is:

    t(n) = 5 + 7n
        <= n + 7n,   for n >= 5
         = 8n

and so n0 = 5 and c = 8 also satisfy the definition.

A few points to note:

- If you can show t(n) is O(g(n)) using constants c, n0, then you can always increase c and/or n0 and be sure that these constants will satisfy the definition also. So, don't think of the c and n0 as being uniquely defined.

- There are inequalities in the definition, e.g. n >= n0 and t(n) <= c g(n). Does it matter if the inequalities are strict or not? Not really. You can easily verify that, for any t(n) and g(n), the statement "t(n) is O(g(n))" is true (or false) whether we use a strict inequality or just less than or equal to.

- We are looking for tight upper bounds for our g(n). But the definition of big O doesn't require this. For example, in the above example, t(n) is also O(n^2) since t(n) <= 12n <= 12n^2 for n >= 1. By a similar argument, t(n) is also O(n log n) or O(n^3), etc.

- Many students write proofs that they think are correct, but in fact the proofs are incomplete (or wrong). For example, consider the following "proof" for the above example:

      5 + 7n  < c n
      5n + 7n < c n,   n >= 1
      12n     < c n

  Thus, c > 12, n0 = 1.

  There are several problems with this, as a formal proof. The first statement seems to assume what the writer is trying to prove; some indication should be given whether this first line is an assumption, or whether it is what the writer is going to show. And is the statement true for some c, or for all c, or for some n, or all n? It's just not clear. There is no clear connection between the statements. What implies what? Are the statements equivalent, etc.? Such proofs may get grades of 0. This is not the big O you want.

- What would it mean to say t(n) is O(1), i.e. g(n) = 1? Applying the definition, it would mean that there exists a c and n0 such that t(n) <= c for all n >= n0. That is, t(n) is bounded by some constant.[1]

- Finally, some students try to do big O proofs by using ideas from Calculus and taking limits. There are situations in which this can be done. But generally you should avoid limits in proving statements about big O, unless you really know what you are doing (and have taken courses like MATH 242 Real Analysis, and so you know what a limit is, in a formal sense).
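As an illustration of the O(1) point (my own example, not from the notes): a sequence need not be constant to be O(1); it only has to be bounded by a constant.

```python
def t(n):
    # Oscillates between two values, so it is not constant,
    # but t(n) <= 10 for every n, hence t(n) is O(1)
    # with, say, c = 10 and n0 = 1.
    return 10 if n % 2 == 0 else 3

print(all(t(n) <= 10 for n in range(1, 10_001)))  # True
```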
[1] Why would this be used? Sometimes when we analyze an algorithm's performance, we consider different parts of the algorithm separately. Some of those parts (such as assigning values to variables outside of any loops or any recursive calls) might be done once. Such parts take O(1) time.

Example 2 (see a different example in the slides)

The function t(n) = 17 - 46n + 8n^2 is O(n^2). To prove this, we need to show there exist positive c and n0 such that, for all n >= n0, 17 - 46n + 8n^2 <= c n^2.

    t(n) = 17 - 46n + 8n^2
        <= 17 + 8n^2,      for n > 0
        <= 17n^2 + 8n^2,   if n >= 1
         = 25n^2

and so n0 = 1 and c = 25 does the job. We can perform an alternative manipulation:

    t(n) = 17 - 46n + 8n^2
        <= 17 + 8n^2,      for n > 0
        <= n^2 + 8n^2,     if n >= 5
         = 9n^2

and so c = 9 and n0 = 5 does the job.

Example 3

Show t(n) = (500 + 20 log n)/n is O(1). First note

    t(n) = (500 + 20 log n)/n <= (500 + 20n)/n

since log n < n for all n >= 1 (see below). We now want to show there exist positive c and n0 such that, for all n >= n0,

    (500 + 20n)/n < c.

Take c = 520. Then we want to show there exists an n0 such that 500 + 20n < 520n for all n >= n0. But n0 = 2 clearly does the job.

[BEGIN ASIDE: Can you formally prove the "obvious" claim that log n < n for n >= 1? One easy way to do this is to note the even more obvious claim n < 2^n for all n >= 1 and take the log of both sides. To formally prove that n < 2^n for n >= 1, use induction. The base case is trivial. For the induction step, we have

    k + 1 < 2^k + 1      by the induction hypothesis
          < 2^k + 2^k
          = 2^(k+1)

and we are done. END ASIDE]
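The constants found in Examples 2 and 3, and the aside's claim n < 2^n, can all be checked numerically over a finite range (my own sketch, not part of the notes; the helper name `check` is made up, and base-2 logarithms are assumed, matching the 2^n argument in the aside):

```python
import math

def check(t, g, c, n0, n_max=10_000):
    """Finite check that t(n) <= c*g(n) for all n0 <= n <= n_max."""
    return all(t(n) <= c * g(n) for n in range(n0, n_max + 1))

# Example 2: t(n) = 17 - 46n + 8n^2 is O(n^2).
t2 = lambda n: 17 - 46 * n + 8 * n * n
print(check(t2, lambda n: n * n, c=25, n0=1))  # True: first proof
print(check(t2, lambda n: n * n, c=9, n0=5))   # True: second proof

# Example 3: t(n) = (500 + 20 log n)/n is O(1), i.e. bounded by c = 520.
t3 = lambda n: (500 + 20 * math.log2(n)) / n
print(check(t3, lambda n: 1, c=520, n0=2))     # True

# The aside's claim: n < 2^n for n >= 1.
print(all(n < 2 ** n for n in range(1, 101)))  # True
```

Again, a finite check like this can only refute a claimed (c, n0) pair; the proofs above are what establish the bounds for all n.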
