
1.1 ALGORITHMS AND THEIR COMPLEXITY

1. What is an algorithm?
An algorithm is the design for a computer program: a precise, step-by-step procedure that the program implements.
2. Problem size (n)
The problem size n is a measure of the quantity of input data.
3. Time complexity and asymptotic time complexity
1. The time needed by an algorithm, expressed as a function of the size of the problem, is called the time complexity of the algorithm.
2. The limiting behavior of the time complexity as the size increases is called the asymptotic time complexity.
4. Space complexity and asymptotic space complexity
1. The space needed by an algorithm, expressed as a function of the size of the problem, is called the space complexity of the algorithm.
2. The limiting behavior of the space complexity as the size increases is called the asymptotic space complexity.
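To make these definitions concrete, here is a minimal sketch in Python (the notes themselves contain no code; the function name and the operation-counting scheme are illustrative choices, not anything defined above). It counts the basic operations of a simple summation algorithm as a function of the problem size n.

# Illustration: time complexity as a function of the problem size n,
# measured by counting the additions performed by a summation algorithm.
def sum_of_n(values):
    """Sum n numbers, returning the result and the number of additions used."""
    total = 0
    additions = 0
    for v in values:          # executed once per input element
        total += v
        additions += 1
    return total, additions

if __name__ == "__main__":
    for n in (10, 100, 1000):
        _, steps = sum_of_n(list(range(n)))
        # The step count grows in direct proportion to n, so the time
        # complexity here is f(n) = n, which is asymptotically O(n).
        print(f"n = {n:5d}  additions = {steps}")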

1.2 RANDOM ACCESS MACHINES


A random-access machine (RAM) models a one-accumulator computer in which instructions are not permitted to modify themselves. A RAM consists of a read-only input tape, a write-only output tape, a program, and a memory (Fig. 1.3). The input tape is a sequence of squares, each of which holds an integer (possibly negative). Whenever a symbol is read from the input tape, the tape head moves one square to the right. The output tape is a write-only tape ruled into squares which are initially all blank. When a write instruction is executed, an integer is printed in the square of the output tape that is currently under the output tape head, and the tape head is moved one square to the right. Once an output symbol has been written, it cannot be changed. The memory consists of a sequence of registers r0, r1, r2, ..., each of which is capable of holding an integer of arbitrary size. We place no upper bound on the number of registers that can be used. This abstraction is valid in cases where (1) the size of the problem is small enough to fit in the main memory of a computer, and (2) the integers used in the computation are small enough to fit in one computer word. The program for a RAM is not stored in the memory, so the program does not modify itself. The instructions are arithmetic instructions, input-output instructions, indirect addressing (for indexing arrays, for example), and branching instructions.
An operand can be one of the following (the instruction set is quite similar to assembly language):
1. =i, indicating the integer i itself (e.g. LOAD =3, ADD =3, MULT =3).
2. A nonnegative integer i, indicating the contents of register i (e.g. LOAD i, ADD i, MULT i).
3. *i, indicating indirect addressing: the operand is the contents of register j, where j is the integer found in register i (e.g. LOAD *1, LOAD *2). If j < 0, the machine halts.
To specify the meaning of an instruction, we define c(i), the contents of register i, and v(a), the value of operand a, as follows:
c(i) = contents of register i (initially c(i) = 0 for all i),
v(=i) = i,
v(i) = c(i),
v(*i) = c(c(i)), so, for example, v(*1) = c(c(1)).
Division by zero halts the machine. After each of the first eight instructions (the arithmetic, LOAD/STORE, and input-output instructions) is executed, the location counter is incremented by one, so instructions are executed in sequential order unless a JUMP, JGTZ, JZERO, or HALT instruction is executed.
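As an illustration of the operand rules above, the following sketch models the register file as a Python dictionary and evaluates v(a) for the three operand forms. The string encoding of operands ("=i", "i", "*i") is an assumption made for this example, not part of the model itself.

# Sketch of operand evaluation for the RAM model: v(=i)=i, v(i)=c(i), v(*i)=c(c(i)).
def c(memory, i):
    """Contents of register i; registers start out holding 0."""
    return memory.get(i, 0)

def v(memory, operand):
    """Value of an operand given as a string: '=i', 'i', or '*i'."""
    if operand.startswith("="):          # literal operand =i
        return int(operand[1:])
    if operand.startswith("*"):          # indirect operand *i
        j = c(memory, int(operand[1:]))
        if j < 0:                        # a negative indirect address halts the RAM
            raise RuntimeError("machine halts: indirect address is negative")
        return c(memory, j)
    return c(memory, int(operand))       # direct operand i

if __name__ == "__main__":
    mem = {1: 2, 2: 7}
    print(v(mem, "=3"))   # 3
    print(v(mem, "1"))    # c(1) = 2
    print(v(mem, "*1"))   # c(c(1)) = c(2) = 7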
1.3 COMPUTATIONAL COMPLEXITY OF RAM PROGRAMS
1. The two most important measures of an algorithm are its time complexity and its space complexity.
2. The worst-case time complexity (or just time complexity) of a RAM program is the function f(n) which is the maximum, over all inputs of size n, of the sum of the "time" taken by each instruction executed. The expected time complexity is the average, over all inputs of size n, of the same sum.
3. The worst-case and expected space complexity are defined analogously, substituting "space" for "time".

For the logarithmic cost criterion defined below, the cost of an integer i is its length in bits:
l(i) = floor(log2 i) + 1,  if i != 0
l(i) = 1,                  if i = 0

- Uniform cost criterion: each RAM instruction requires one unit of time and each register requires one unit of space.
- A second, sometimes more realistic, definition takes into account the limited size of a real memory word and is called the logarithmic cost criterion: costs depend on the lengths l(.) of the integers involved.
For example, a program that computes n^n by executing n-1 MULT instructions has, under the uniform cost criterion, time complexity O(n) and space complexity O(1); under the logarithmic cost criterion its space complexity is O(n log n), since n^n occupies on the order of n log n bits.
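The following sketch tallies those costs concretely. It assumes the figures above refer to the program computing n^n with n-1 MULT instructions, and it uses a simplified per-MULT logarithmic charge of l(accumulator) + l(n); both are illustrative assumptions rather than the textbook's exact accounting.

# Sketch: the bit-length function l(i) and uniform vs. logarithmic cost
# tallies for a program that computes n^n with n-1 MULT instructions.
from math import floor, log2

def l(i):
    """Logarithmic cost of integer i: floor(log2 |i|) + 1, or 1 for i = 0."""
    return 1 if i == 0 else floor(log2(abs(i))) + 1

def nn_costs(n):
    """Return (uniform time, simplified logarithmic time, bits needed for n^n)."""
    uniform, logarithmic = 0, 0
    acc = n                              # accumulator starts at n
    for _ in range(n - 1):               # n-1 MULT instructions
        uniform += 1                     # uniform cost: one unit per instruction
        logarithmic += l(acc) + l(n)     # simplified logarithmic charge per MULT
        acc *= n
    return uniform, logarithmic, l(acc)

if __name__ == "__main__":
    for n in (4, 8, 16):
        u, lg, space_bits = nn_costs(n)
        # Uniform time grows like n; the bits needed to hold n^n (logarithmic
        # space) grow like n log n, matching the O(n) and O(n log n) figures above.
        print(f"n={n:2d}  uniform time={u:3d}  log-cost time={lg:5d}  bits in n^n={space_bits}")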

1.4 A STORED PROGRAM MODEL


• A random access stored program machine (RASP) is similar to a RAM, with the exception that its program is in memory and can modify itself.
• Indirect addressing is not permitted.
• Each RASP instruction occupies two consecutive memory registers: the first register holds an encoding of the operation code; the second register holds the address.
Theorem 1.1. If costs of instructions are either uniform or logarithmic, for every RAM
program of time complexity T(n) there is a constant k such that there is an equivalent RASP
program of time complexity kT(n).

Proof (sketch): To simulate a RAM program P by a RASP program, the memory registers of the RASP are used as follows: register 1 temporarily stores the contents of the accumulator while an indirect operand of the RAM is being simulated. Each RAM instruction is replaced by a fixed, constant-length sequence of RASP instructions, so the simulation runs in time at most kT(n) for some constant k.
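For illustration only, here is a sketch of the constant-factor expansion idea behind this proof: a RAM instruction that uses indirect addressing (say SUB *i) is simulated by a fixed sequence of RASP instructions that saves the accumulator in register 1, fetches the effective address, and patches the address field of a later instruction (the RASP may modify its own program). The mnemonics and the patch-target convention below are assumptions, not the textbook's exact code.

# Sketch: expanding one RAM instruction 'SUB *i' into a fixed RASP sequence.
def expand_sub_indirect(i, patch_register):
    """RASP sequence simulating RAM 'SUB *i'; patch_register is the memory
    register assumed to hold the address field of the final SUB instruction."""
    return [
        ("STORE", 1),              # save the accumulator in register 1
        ("LOAD", i),               # accumulator := c(i), the effective address
        ("STORE", patch_register), # patch that address into the SUB's address field
        ("LOAD", 1),               # restore the saved accumulator
        ("SUB", None),             # operand was filled in at run time by the STORE above
    ]

if __name__ == "__main__":
    # Constant number of RASP instructions per RAM instruction => time kT(n).
    for instr in expand_sub_indirect(i=5, patch_register=107):
        print(instr)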
Theorem 1.2. If costs of instructions are either uniform or logarithmic, for every RASP program of time complexity T(n) there is a constant k such that there is an equivalent RAM program of time complexity at most kT(n).
Proof: The RAM program we shall construct to simulate the RASP will use indirect addressing to decode and simulate RASP instructions stored in the memory of the RAM. Certain registers of the RAM have special purposes:
register 1 - used for indirect addressing,
register 2 - the RASP's location counter,
register 3 - storage for the RASP's accumulator.
Register i of the RASP is stored in register i + 3 of the RAM for i >= 1.
The RAM begins with the finite-length RASP program loaded in its memory starting at register 4. The RAM program consists of a simulation loop which begins by reading an instruction of the RASP (with a LOAD *2 RAM instruction), decoding it, and branching to one of 18 sets of instructions. The decoding and branching operations are straightforward.
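A minimal sketch of this simulation loop, written in Python rather than as RAM code: the register roles and the RASP-to-RAM register mapping follow the proof above, while the opcode encoding and the two instructions handled (LOAD =i and HALT) are assumptions made for illustration; a full simulator would branch to one of 18 instruction handlers.

# Sketch: a RAM-style simulation loop for a RASP program held in memory.
RASP_BASE = 4           # RASP register i lives in RAM register i + 3 (i >= 1)
IND, LC, ACC = 1, 2, 3  # RAM register roles: indirect addressing, location counter, accumulator

def simulate_rasp(ram):
    """Run the RASP program stored (two registers per instruction) from register 4 on."""
    ram[LC] = RASP_BASE                      # location counter starts at the first instruction
    while True:
        opcode = ram.get(ram[LC], 0)         # first register: encoded operation code
        operand = ram.get(ram[LC] + 1, 0)    # second register: the address/operand
        if opcode == 0:                      # assumed encoding for HALT
            return ram
        elif opcode == 1:                    # assumed encoding for LOAD =i
            ram[ACC] = operand
        else:
            raise NotImplementedError("remaining instruction handlers omitted in this sketch")
        ram[LC] += 2                         # each RASP instruction occupies two registers

if __name__ == "__main__":
    # RASP program "LOAD =42 ; HALT", loaded starting at register 4.
    memory = {4: 1, 5: 42, 6: 0, 7: 0}
    final = simulate_rasp(memory)
    print("simulated RASP accumulator:", final[ACC])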
