
School of CIT

Introduction to Informatics
for Students of Other Subjects

Social Computing Research Group
Prof. Dr. Georg Groh

1
Part III: Algorithms and
Data Structures

based on (1), (3) and (4)


(citations omitted for legibility) 2
Notion of Algorithm

● Definition Problem: given: a set of information; sought: new
(unknown) information to be determined

● Algorithm: exact, stepwise procedure for solving a problem.

○ implements a function f: E → A
(E: input, A: output (data representations))
(→ „determined“: same input produces same output)

○ is represented by a finite list of unambiguously, elementarily
executable (“effective”) instructions; these produce a finite sequence
(→ algorithm “terminates”) of well-defined states (↔ steps)

● not every function is computable (computable ↔ an algorithm exists for it)

● Program: formulation of an algorithm in a programming language (e.g.


with Java methods)

[8] 3
Example: Euclidean Algorithm

Problem: given: a, b ∈ ℕ, a, b ≠ 0. sought: greatest common
divisor gcd(a, b)

Algorithm:
(1) if a = b, terminate. We then have gcd(a,b)=a.
else goto step (2)
(2) if a > b replace a by a – b and goto step (1),
else goto step (3)
(3) we have a < b. Replace b by b – a and goto step (1)

4
Example: Euclidean Algorithm

Problem: given: a, b ∈ ℕ, a, b ≠ 0. sought: greatest common
divisor gcd(a, b)

Algorithm:
(1) if a = b, terminate. We then have gcd(a,b)=a.
    else goto step (2)
(2) if a > b replace a by a – b and goto step (1),
    else goto step (3)
(3) we have a < b. Replace b by b – a and goto step (1)

formulation in Java:
public static int gcd(int a, int b){
    while(a != b){
        if(a < b){
            b = b - a;
        } else {
            a = a - b;
        }
    }
    return a;
}
5
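The subtraction loop can take many iterations when a and b differ greatly (e.g. gcd(1000000, 1)). A common variant, not on the slide, replaces repeated subtraction by the remainder operation; a minimal sketch (class name GcdDemo is illustrative):

```java
public class GcdDemo {
    // subtraction-based Euclidean algorithm, as formulated on the slide
    public static int gcdSub(int a, int b) {
        while (a != b) {
            if (a < b) b = b - a;
            else a = a - b;
        }
        return a;
    }

    // remainder-based variant: one mod step replaces a run of subtractions
    public static int gcdMod(int a, int b) {
        while (b != 0) {
            int r = a % b;
            a = b;
            b = r;
        }
        return a;
    }

    public static void main(String[] args) {
        System.out.println(gcdSub(36, 24)); // 12
        System.out.println(gcdMod(36, 24)); // 12
        System.out.println(gcdMod(17, 5));  // 1
    }
}
```

Both functions implement the same function f: E → A; they differ only in the number of steps needed.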
Exercise

We have the following recipe for getting rich:

(1) add 7 and 15 and determine the square root of the result
(2) make a fortune at the stock market with options on pork bellies
(3) buy a Mercedes S500

Using our definition of Algorithm argue why or why not this recipe is actually an algorithm!

6
Exercise

E.g. :
• steps 2 and 3 are not unambiguously, elementarily executable (not effective)
• we do not solve a problem in the defined sense (of course one can broaden the
definition for problem and also include cooking recipes as algorithms)
• …

7
Efficiency

● further desirable property of an algorithm: Efficiency

● Efficiency:
↔ number of required steps (Time Complexity)
↔ number of required memory units (cells) (Space Complexity)

● Often: trade-off between time complexity (today often more important) and
space complexity

8
Worst, Average, Best Case

● starting point: size n of a problem

○ example: problem: given: set M of numbers; sought: sorted sequence
→ input size: n = |M|

● run-time complexity: let P_n be the set of all instances of a problem P with
size n. Let A be an algorithm solving P that requires T_{A,n}(I_i) many steps
on an instance I_i ∈ P_n:

○ Worst Case runtime of A for problem size n:
T^worst_{A,n} = max_i T_{A,n}(I_i)

○ Average Case runtime of A for problem size n:
T^average_{A,n} = (1 / |P_n|) Σ_i T_{A,n}(I_i)

○ Best Case runtime of A for problem size n:
T^best_{A,n} = min_i T_{A,n}(I_i)

9
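To make the definitions concrete, a sketch not from the slides: for linear search in an array, the best case is 1 comparison (key at the front) and the worst case is n comparisons (key at the end or absent). The method name steps is hypothetical:

```java
public class SearchSteps {
    // counts the comparisons linear search needs on an instance of size n = a.length
    public static int steps(int[] a, int key) {
        int count = 0;
        for (int x : a) {
            count++;
            if (x == key) break;
        }
        return count;
    }

    public static void main(String[] args) {
        int[] a = {4, 8, 15, 16, 23};
        System.out.println(steps(a, 4));  // 1  (best case: T^best = 1)
        System.out.println(steps(a, 23)); // 5  (worst case: T^worst = n)
    }
}
```

The average case lies between the two and depends on how the instances of size n are distributed.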
Asymptotic Complexity: Landau Symbols

● Asymptotic (i.e. effectively resulting for large n) time and space complexity
(not dependent on programming language or machine model).

● → “effectively”: describe asymptotic complexity with Landau symbols (→ e.g.
neglect constant factors):
for a function g: ℕ → ℝ⁺

O(g) = { f | ∃c > 0: ∃n₀ > 0: ∀n ≥ n₀: f(n) ≤ c · g(n) }

is the set of all functions f: ℕ → ℝ⁺ that asymptotically (∀n ≥ n₀)

effectively (∃c > 0) grow at most as fast as g

10
Asymptotic Complexity: Landau Symbols

● besides O(g) („…at most as fast as g“) we also have the classes Ω(g) („… at
least as fast…“), Θ(g) („… as fast as…“), o(g) („… truly slower…“),
ω(g) („… truly faster…“).

● examples:
○ h(n) = 15n³ + 200n² + 267854 ∈ O(g(n) = n³), or shortly ∈ O(n³)
○ h(n) = 11n² + 854 ∈ O(n²)
○ h(n) = 11n² + 854 ∈ O(n³)
○ h(n) = 133 log n + 54 ∈ O(log n)

● notational variations:
h: n → 12n³ ∈ O(g: n → n³) ↔ h(n) = 12n³ ∈ O(g(n) = n³) ↔
12n³ ∈ O(n³) ↔ 12n³ = O(n³) ↔ h ∈ O(g) ↔ h = O(g)

11
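The witnesses c and n₀ from the definition can be checked numerically. A sketch (an assumed worked example, not from the slides) for h(n) = 11n² + 854 ∈ O(n²) with c = 12 and n₀ = 30, since 11n² + 854 ≤ 12n² holds exactly when 854 ≤ n²:

```java
public class LandauDemo {
    public static long h(long n) { return 11 * n * n + 854; }
    public static long g(long n) { return n * n; }

    public static void main(String[] args) {
        long c = 12, n0 = 30; // witnesses: 854 <= n^2 holds from n = 30 on
        boolean boundHolds = true;
        for (long n = n0; n <= 10_000; n++) {
            if (h(n) > c * g(n)) boundHolds = false;
        }
        System.out.println(boundHolds);        // true: h(n) <= c*g(n) for n >= n0
        System.out.println(h(29) > c * g(29)); // true: n0 = 29 would be too small
    }
}
```

Any larger c or n₀ would do as well; the definition only asks for the existence of one such pair.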
Typical / Important Complexity Classes

● O(n^k) polynomial (especially O(n) linear, O(n²) quadratic, O(n³) cubic)

● O(n log n)

● O(2ⁿ) exponential

● O(1) constant

● O((log n)^k) poly-logarithmic

12
Typical / Important Complexity Classes

● P: class of all problems (1) for which an algorithm exists that has a polynomial
[asymptotic effective worst-case] time complexity („solvable problems“)

● NP: class of all problems (1) where a solution of a problem instance may be
verified as a solution by an algorithm („Verifier“) in polynomial time.

(Associated known solving algorithms for NP problems usually have an exponential […]
time complexity (problems are “unsolvable” according to current knowledge)).
We trivially have P ⊆ NP

● a problem B is NP-complete, if B ∈ NP and B ∈ NP-hard. (1)

B ∈ NP-hard: B is as hard or harder than any problem in
NP: every instance of any problem in NP can be translated
into an instance of B with at most polynomial “translation
costs” → an algorithm for B may be used to solve any
problem in NP.

(1)
more precisely decision problems, i.e. an algorithm solving the problem implements a function
f: E → A with A = {0,1}. For our purposes, general function problems may be mapped on
equivalently complex decision problems.
13
Examples for Problems

● SAT: given: propositional logic formula with n variables
(e.g. (a ∨ b) ∧ (¬c ∨ (d ∧ e)) (with n = 5)); sought: is there an
assignment of the variables ∈ {true, false}ⁿ that makes the formula
evaluate to true? (NP-complete, no P solution known)

● TSP (Traveling Salesperson): given: weighted
graph G(V, E, w); sought: a sequence of visits
of the nodes (cities) such that the overall
travelled distance is minimized (no efficient
solution algorithm for the general case is known;
decision variant (“is there a solution with overall
travelled distance < L?”) is NP-complete)
[figure: small weighted example graph with nodes A–E]

● given: set of numbers; sought: sorted sequence
(increasing). Without assumptions on the maximum size of
the numbers and only using comparisons: worst-case time
complexity: Θ(n log n)
14
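The gap between verifying and finding a SAT solution can be illustrated with a brute-force sketch (not from the slides; eval hard-codes the slide's example formula, read here as (a ∨ b) ∧ (¬c ∨ (d ∧ e))). Checking one assignment is cheap, while the naive solver may try all 2ⁿ assignments:

```java
public class SatDemo {
    // verifier: evaluates (a ∨ b) ∧ (¬c ∨ (d ∧ e)) for one assignment,
    // in time linear in the formula size — polynomial
    public static boolean eval(boolean[] v) {
        return (v[0] || v[1]) && (!v[2] || (v[3] && v[4]));
    }

    // naive solver: enumerates all 2^n assignments — exponential
    public static boolean satisfiable(int n) {
        for (int bits = 0; bits < (1 << n); bits++) {
            boolean[] v = new boolean[n];
            for (int i = 0; i < n; i++) v[i] = ((bits >> i) & 1) == 1;
            if (eval(v)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        boolean[] witness = {true, false, false, false, false}; // a = true, rest false
        System.out.println(eval(witness));   // true: verification is cheap
        System.out.println(satisfiable(5));  // true
    }
}
```

This mirrors the P vs. NP distinction above: the verifier runs in polynomial time, but no polynomial solving algorithm for SAT is known.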
Exercise

For the solution of a problem with characteristic input size n, you have two
algorithms to choose from:
algorithm 1 has a worst case time complexity of O(n¹⁷),
algorithm 2 has a worst case time complexity of O(2ⁿ).
Which algorithm do you choose?

15
Exercise

2ⁿ > n¹⁷ for all but small n, so you would choose algorithm 1. However, even this
algorithm’s time complexity is already beyond practical applicability for all but
very moderately sized n.

16
Data Structures, Abstract Data Types

● Data Structure: formally defined organization form of data in view of
efficient processing.
○ e.g.: graph, weighted directed graph, binary tree, tree, array (field),
linked list, hash-map, …

● Abstract Data Type: set of data of a certain type + related operations on these
data
○ operations are defined according to their semantics (implementation is
not specified)
→ encapsulated by defining an interface (with syntax and semantics)
○ often implemented with the help of data structures
○ e.g. classes in Java (but also primitive types) implement abstract data
types

17
Examples for Abstract Data Types

● ℕ: (representations of) natural numbers; operations: +, −, div, mod, …

● Stack: sequence of objects of arbitrary type;


operations: push(e), pop().

● Java class Bicycle: as an abstract data type: set of all possible Bicycle
objects (all possible combinations of values for cadence, speed, gear) and
the operations speedUp(), changeGear() etc.

18
Recommendations for Studying

● minimal approach:
understand the contents of the slides and look up unknown terms in
Wikipedia

● standard approach:
== minimal approach

● interested students
standard approach + read
http://en.wikipedia.org/wiki/P_versus_NP_problem
and [2], chapter 12

19
Part III.1: Data Structures for
Sequences

based on elements of [3],[4] 20


Sequences

simple general abstract data type: Sequence

special forms: Array (Field), Linked List, Stack, Queue

Sequence: linearly ordered structure of elements eᵢ

S = ⟨e₀, e₁, …, e_{n−1}⟩

o linearly ordered in contrast to

-- branching (tree, graph, …) or
-- without order (hash table), …

21
Operations on Sequences

Operation                                  delivers   State after Operation

⟨e₀, e₁, …, eᵢ, …, e_{n−1}⟩[i]             eᵢ         no change
                                           (reference to eᵢ)

⟨e₀, e₁, …, eᵢ, …, e_{n−1}⟩.set(i, e)      --         ⟨e₀, e₁, …, e, …, e_{n−1}⟩

⟨e₀, e₁, …, e_{n−1}⟩.pushBack(e)           --         ⟨e₀, e₁, …, e_{n−1}, e⟩

⟨e₀, e₁, …, e_{n−1}⟩.popBack()             --         ⟨e₀, e₁, …, e_{n−2}⟩

⟨e₀, e₁, …, e_{n−1}⟩.size()                n          no change

…                                          …          …

22
Static Arrays
o in Java e.g.:
int[] someIntArray = new int[120];
Bicycle[] someBicycleArray = new Bicycle[20];

o direct access to elements via index

o advantages: direct access possible, homogeneous in memory
o disadvantages: insert, delete difficult
expanding: not possible (→ dynamic array)
o typical application: linear algebra

Operation                                Java                                            Compl.

⟨e₀, …, eᵢ, …, e_{n−1}⟩[i]               someArray[i]                                    O(1)

⟨e₀, …, eᵢ, …, e_{n−1}⟩.set(i, e)        someArray[i] = e;                               O(1) (1)
                                         if(lastFilledIndex < i) lastFilledIndex = i;

⟨e₀, …, e_{n−1}⟩.pushBack(e)             someArray[lastFilledIndex++] = e;               O(1) (2)

⟨e₀, …, e_{n−1}⟩.popBack()               lastFilledIndex--;                              O(1)

⟨e₀, …, e_{n−1}⟩.size()                  someArray.length;                               O(1)

(1),(2): assuming maximal size is not exceeded 23


Dynamic Arrays
o in Java e.g.:
Vector<ElementType> someVector = new Vector<ElementType>();

o if array too small: generate new array of double size and
copy contents of old array into new array („reallocate“)
o if array too large (lastFilledIndex < n/4): generate new
array of half size and copy contents of old array into new
array

internal implementation in Vector ≈

void pushBack(ElementType e){
    if(lastFilledIndex == n)
        reallocate(2*n);
    internalArray[lastFilledIndex] = e;
    lastFilledIndex++;
}

void popBack(){
    lastFilledIndex--;
    if(lastFilledIndex < n/4)
        reallocate(n/2);
}

void reallocate(int newSize){
    n = newSize;
    ElementType[] newInternalArray = new ElementType[newSize];
    for (i=0; i<lastFilledIndex; i++){
        newInternalArray[i] = internalArray[i];
    }
    internalArray = newInternalArray;
}
24
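A self-contained, runnable version of the doubling/halving scheme — a sketch for int elements (Java generics do not allow `new ElementType[n]` literally); names follow the slide where possible:

```java
public class DynArray {
    private int[] internalArray = new int[4];
    private int lastFilledIndex = 0; // number of stored elements

    public void pushBack(int e) {
        if (lastFilledIndex == internalArray.length)
            reallocate(2 * internalArray.length);      // grow: double the size
        internalArray[lastFilledIndex++] = e;
    }

    public int popBack() {
        int e = internalArray[--lastFilledIndex];
        if (internalArray.length > 4 && lastFilledIndex < internalArray.length / 4)
            reallocate(internalArray.length / 2);      // shrink: halve the size
        return e;
    }

    public int size() { return lastFilledIndex; }
    public int get(int i) { return internalArray[i]; }

    private void reallocate(int newSize) {
        int[] newInternalArray = new int[newSize];
        for (int i = 0; i < lastFilledIndex; i++)
            newInternalArray[i] = internalArray[i];
        internalArray = newInternalArray;
    }

    public static void main(String[] args) {
        DynArray a = new DynArray();
        for (int i = 0; i < 100; i++) a.pushBack(i);   // triggers several reallocates
        System.out.println(a.size());   // 100
        System.out.println(a.get(99));  // 99
    }
}
```

Each reallocate copies the whole array (Θ(n)), but it happens rarely enough that pushBack and popBack have amortised cost O(1), as discussed on the next slide.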
Dynamic Arrays

o complexity of reallocate: Θ(n) (we have to copy the whole array (→ complexity of
pushBack and popBack also Θ(n)???))

o observation: reallocate is required only infrequently → any sequence of n
pushBack and popBack operations requires O(n) time → each of these has
”amortised costs” of O(1)

(internal implementation in Vector as on the previous slide)
25
LIVE DEMO:
• Collections: Vector
• Iterators

26
Doubly Linked Lists

alternative for an array: doubly linked list:

⟨⊥, e₀, e₁, …, e_{n−1}⟩

special dummy-element h without content (⊥)
at the beginning (easier management)
27
Doubly Linked Lists
o indirect access via predecessor / successor
o advantages: insert, delete, expand: simple (no reallocate required)
o disadvantages: no direct access via index,
elements are “strewn” across memory.
o typical applications: implementation of stacks, queues

class Item<ElementType>{
    ElementType e;
    Item<ElementType> next;
    Item<ElementType> prev;
}

class List<ElementType>{
    Item<ElementType> h; // initialised with ⊥ and references to itself
    … further variables and methods …
}
in Java Class Library e.g. LinkedList

Invariant: next.prev == prev.next == this


28
Doubly Linked Lists - splice

Important method: splice(Item<ElementType> a,
                         Item<ElementType> b,
                         Item<ElementType> t):

⟨…, a′, a, …, b, b′, …, t, t′, …⟩

    splice(a, b, t)

⟨…, a′, b′, …, t, a, …, b, t′, …⟩

conditions:
o ⟨a, …, b⟩ must be a sub-sequence (a = b allowed).
o b not before a
o t not in ⟨a, …, b⟩
for clarity, we may not distinguish here between elements and references to elements 29
Doubly Linked Lists - splice
static splice(Item<ElementType> a,
              Item<ElementType> b, Item<ElementType> t){

    // cut out ⟨a, …, b⟩ : time complexity: O(1)
    Item<ElementType> ap = a.prev;
    Item<ElementType> bn = b.next;
    ap.next = bn;
    bn.prev = ap;

    // insert ⟨a, …, b⟩ after t :
    Item<ElementType> tn = t.next;
    b.next = tn;
    a.prev = t;
    t.next = a;
    tn.prev = b;
}

30
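A runnable sketch of splice on a circular list with dummy head, condensed to int elements; the run helper and the concrete list ⟨1, …, 5⟩ are illustration, not from the slides:

```java
import java.util.ArrayList;
import java.util.List;

public class SpliceDemo {
    static class Item {
        int e;
        Item next, prev;
        Item(int e) { this.e = e; }
    }

    // the slide's splice: cut out <a, ..., b>, re-insert it after t; O(1)
    static void splice(Item a, Item b, Item t) {
        Item ap = a.prev, bn = b.next;   // cut out <a, ..., b>
        ap.next = bn;
        bn.prev = ap;
        Item tn = t.next;                // insert <a, ..., b> after t
        b.next = tn;
        a.prev = t;
        t.next = a;
        tn.prev = b;
    }

    // builds h <-> 1 <-> 2 <-> 3 <-> 4 <-> 5 (circular, dummy head h),
    // moves <2, 3> after 5 and returns the resulting element sequence
    public static List<Integer> run() {
        Item h = new Item(0);
        h.next = h.prev = h;
        Item[] it = new Item[6];
        Item last = h;
        for (int i = 1; i <= 5; i++) {
            it[i] = new Item(i);
            it[i].prev = last;
            last.next = it[i];
            last = it[i];
        }
        last.next = h;
        h.prev = last;

        splice(it[2], it[3], it[5]);

        List<Integer> out = new ArrayList<>();
        for (Item cur = h.next; cur != h; cur = cur.next) out.add(cur.e);
        return out;
    }

    public static void main(String[] args) {
        System.out.println(run()); // [1, 4, 5, 2, 3]
    }
}
```

Only six pointer assignments are needed regardless of how long ⟨a, …, b⟩ is, which is why splice runs in O(1).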
Doubly Linked Lists - splice

[slides 31–54: the same splice code, stepped through with diagrams showing how
the local references ap, bn, tn and the next/prev pointers of a′, a, b, b′, t, t′
are updated while ⟨a, …, b⟩ is first cut out and then re-inserted after t;
figures omitted]
Exercise

complete the method at the two missing places!

// cut out ⟨a, …, b⟩ :

// insert ⟨a, …, b⟩ after t :

55
Exercise

// cut out ⟨a, …, b⟩ :

// insert ⟨a, …, b⟩ after t :

56
Doubly Linked Lists – Further Operations
Item<ElementType> head() { all: time complexity O(1)
return h;
}

boolean isEmpty(){
return (h.next == head());
}

Item<ElementType> first(){
    return h.next; // possibly ⊥ (if the list is empty)
}

Item<ElementType> last(){
    return h.prev; // possibly ⊥ (if the list is empty)
}

57
Doubly Linked Lists – Further Operations
static void moveAfter(Item<ElementType> b,       all: time complexity O(1)
                      Item<ElementType> a){
    splice(b, b, a); // move b after a
}

void moveToFront(Item<ElementType> b){
    moveAfter(b, head());
}

void moveToBack(Item<ElementType> b){
    moveAfter(b, last());
}

58
Doubly Linked Lists – Further Operations
deleting and inserting of elements:              all: time complexity O(1)
using a separate list freeList
→ better run time
(memory allocation is expensive)

static void remove(Item<ElementType> b){
    moveAfter(b, freeList.head());
}

void popFront(){
    remove(first());
}

void popBack(){
    remove(last());
}

59
Doubly Linked Lists – Insert and Delete
static Item<ElementType> insertAfter(ElementType x, all: time complexity O(1)
Item<ElementType> a){
checkFreeList(); // if free list has no elements, possibly allocate memory
Item<ElementType> b = freeList.first();
moveAfter(b, a);
b.e = x;
return b;
}

static Item<ElementType> insertBefore(ElementType x,


Item<ElementType> b){
return insertAfter(x, b.prev);
}

void pushFront(ElementType x){
    insertAfter(x, head());
}

void pushBack(ElementType x){
    insertAfter(x, last());
}

60
Doubly Linked Lists – Find

Find elements via iterating linearly — time compl.: O(n)

Trick: use the dummy-element as a sentinel

Item<ElementType> findNext(ElementType x,
Item<ElementType> from){
h.e = x;
while (from.e != x)
from = from.next;
h.e = ⊥;
return from;
}

61
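The same sentinel trick, sketched on a plain array instead of a list (a hypothetical findNext variant; the spare slot plays the role of the dummy element h, so the loop needs no separate bounds check):

```java
public class SentinelFind {
    // linear search with a sentinel: a must have one spare slot, a[n];
    // writing the key there guarantees the loop terminates (mirrors h.e = x)
    public static int findNext(int[] a, int n, int key) {
        a[n] = key;       // sentinel
        int i = 0;
        while (a[i] != key) i++;
        return i;         // i == n means "not found"
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 9, 7, 0};   // 4 elements + 1 spare slot
        System.out.println(findNext(a, 4, 9)); // 2
        System.out.println(findNext(a, 4, 8)); // 4 (not found)
    }
}
```

The sentinel saves one comparison per iteration; the asymptotic cost stays O(n).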
Lists / Arrays – Run-Time Comparison

Operation          DoublyLinkedList   SinglyLinkedList   Dynamic Array

[.]                O(n)               O(n)               O(1)
set(.,.)           O(n)               O(n)               O(1)
size()             O(1)*              O(1)*              O(1)
first()            O(1)               O(1)               O(1)
last()             O(1)               O(1)               O(1)
insertAfter(.,.)   O(1)               O(1)               O(n)
insertBefore(.,.)  O(1)               O(n)               O(n)
remove(.)          O(1)               O(1)**             O(n)
pushBack(.)        O(1)               O(1)               O(1)***
pushFront(.)       O(1)               O(1)               O(n)
popBack(.)         O(1)               O(n)               O(1)***
popFront(.)        O(1)               O(1)               O(n)
concat(.,.)        O(1)               O(1)               O(n)
splice(.,.,.)      O(1)               O(1)               O(n)
findNext(.,.)      O(n)               O(n)               O(n)

*   not if splice using many involved lists is provided; then O(n)
    (see Mehlhorn/Sanders 3.1.1 at the end)
**  only removeAfter
*** only amortised
62
Stack & Queue

Stack methods
○ pushBack (or push): insert new element at the back (top)
○ last (or top): return the element at the back (top)
○ popBack (or pop): delete the element at the back (top)

Queue methods
○ pushBack (or enqueue): insert new element at the back
○ first: return the element at the front
○ popFront (or dequeue): delete the element at the front

both can be implemented with linked lists.

For Stack: use singly linked list “reversed”, i.e. use pushFront and popFront. → as fast as a
doubly linked list (and saves memory) 63
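A sketch of the “reversed” singly linked stack: push/pop/top map to pushFront/popFront/first, all O(1). Class and method names are illustrative, not from the slides:

```java
public class LinkedStack {
    private static class Node {
        int e;
        Node next;
        Node(int e, Node next) { this.e = e; this.next = next; }
    }

    private Node head; // front of the singly linked list == top of the stack

    public void push(int e) { head = new Node(e, head); }            // pushFront, O(1)
    public int top() { return head.e; }                              // first, O(1)
    public int pop() { int e = head.e; head = head.next; return e; } // popFront, O(1)
    public boolean isEmpty() { return head == null; }

    public static void main(String[] args) {
        LinkedStack s = new LinkedStack();
        s.push(1); s.push(2); s.push(3);
        System.out.println(s.pop()); // 3
        System.out.println(s.pop()); // 2
        System.out.println(s.top()); // 1
    }
}
```

Since all operations act on the front of the list, no prev pointers (and no dummy element) are needed, which is why the singly linked variant saves memory.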
Application for Queue: Breadth First Traversal of a Graph

given: (directed) graph G(V, E) (all edge weights == 1)

task: visit all nodes (possible purposes:
• to search for something
• to execute some operation on each node
• or to solve the SSSP (Single Source Shortest Path) problem etc.)

[figure: example graph with nodes Q, F, Z, A, Y, P, D, X, S, L, M]

V: set of nodes; here V = {A, D, F, L, M, P, Q, S, X, Y, Z}

E ⊆ V x V: set of edges (here as drawn in the figure)
64
Application for Queue: Breadth First Traversal of a Graph
● simplification of notation: integer encoding of nodes (↔ of course more precisely we
would distinguish between a node and its index (or key or node-id))

[figure: the same example graph, nodes renumbered 0–10]

V = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10}
E = { (0, 1), (1, 2), (2, 5), … , (8, 10)}

● assumption: we have a class Graph that provides suitable methods for storing and
accessing graph elements.

● assumption: we have a class Queue<Integer> that implements a queue of Integers
(↔ nodes)

the following code has been implemented using Integer here for better readability 65
Representing a Graph
[figure: the integer-encoded example graph, once directed, once undirected]

adjacency matrix (directed):
0 1 0 1 0 0 0 0 0 0 0
0 0 1 0 0 0 0 0 0 0 0
0 0 0 0 1 1 0 0 0 0 0
0 0 0 0 0 0 1 1 0 0 0
0 0 0 0 0 0 0 0 1 0 0
0 1 0 0 0 0 0 0 1 0 0
0 0 0 0 0 1 0 1 0 0 0
0 0 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 1 1
0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 0 0 0

adjacency matrix (undirected, symmetric):
0 1 0 1 0 0 0 0 0 0 0
1 0 1 0 1 1 0 0 0 0 0
0 1 0 0 1 1 0 0 0 0 0
1 0 0 0 0 0 1 1 0 0 0
0 1 1 0 0 0 0 0 1 0 0
0 1 1 0 0 0 1 0 1 0 0
0 0 0 1 0 1 0 1 0 0 0
0 0 0 1 0 0 1 0 1 0 1
0 0 0 0 1 1 0 1 0 1 1
0 0 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 1 1 0 0

adjacency list (directed) (*):
0: 1, 3    1: 2    2: 4, 5    3: 6, 7    4: 8    5: 1, 8
6: 5, 7    7: 8    8: 9, 10   9: –       10: 7

adjacency list (undirected) (*):
0: 1, 3           1: 0, 2, 4, 5    2: 1, 4, 5       3: 0, 6, 7
4: 1, 2, 8        5: 1, 2, 6, 8    6: 3, 5, 7       7: 3, 6, 8, 10
8: 4, 5, 7, 9, 10                  9: 8             10: 7, 8

(*) suitably implemented using sequences 66
Application for Queue: Breadth First Traversal of a Graph
public void bfs(Graph g, int sourceNode) {
boolean[] marked = new boolean[g.sizeOfV()];
//marked[v]: v has been visited
int[] distTo = new int[g.sizeOfV()];
//dist[v]: (current) distance to v
int[] parentOf = new int[g.sizeOfV()];
//parentOf[v]: (current) parent node on shortest path to v
Queue<Integer> queue = new Queue<Integer>();
for (int v=0; v<g.sizeOfV(); v++)
distTo[v] = INFINITY;
distTo[sourceNode] = 0;
marked[sourceNode] = true;
queue.enqueue(sourceNode);

while (!queue.isEmpty()) {
int v = queue.dequeue();
int[] nodesAdjacentToV = g.nodesAdjacentTo(v);
for (int w=0; w<nodesAdjacentToV.length; w++) {
if (!marked[nodesAdjacentToV[w]]) {
parentOf[nodesAdjacentToV[w]] = v;
distTo[nodesAdjacentToV[w]] = distTo[v] + 1;
marked[nodesAdjacentToV[w]] = true;
queue.enqueue(nodesAdjacentToV[w]);
}
}
}
} 67
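A runnable condensation of bfs — a sketch in which the assumed Graph class is replaced by a plain adjacency-list array and the assumed Queue<Integer> by java.util.ArrayDeque; the adjacency lists follow the integer-encoded example graph as reconstructed above:

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Queue;

public class BfsDemo {
    static final int INFINITY = Integer.MAX_VALUE;

    // breadth-first traversal; adj[v] lists the successors of node v
    public static int[] bfs(int[][] adj, int sourceNode) {
        int n = adj.length;
        boolean[] marked = new boolean[n];  // marked[v]: v has been visited
        int[] distTo = new int[n];          // distTo[v]: distance from source
        Arrays.fill(distTo, INFINITY);
        Queue<Integer> queue = new ArrayDeque<>();
        distTo[sourceNode] = 0;
        marked[sourceNode] = true;
        queue.add(sourceNode);
        while (!queue.isEmpty()) {
            int v = queue.remove();
            for (int w : adj[v]) {
                if (!marked[w]) {
                    distTo[w] = distTo[v] + 1;
                    marked[w] = true;
                    queue.add(w);
                }
            }
        }
        return distTo;
    }

    public static void main(String[] args) {
        // directed example graph, integer-encoded nodes 0..10
        int[][] adj = {
            {1, 3}, {2}, {4, 5}, {6, 7}, {8}, {1, 8},
            {5, 7}, {8}, {9, 10}, {}, {7}
        };
        System.out.println(Arrays.toString(bfs(adj, 0)));
        // [0, 1, 2, 1, 3, 3, 2, 2, 3, 4, 4]
    }
}
```

Because all edge weights are 1, the FIFO order of the queue guarantees that each node is first reached along a shortest path, so distTo solves the SSSP problem for this graph.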
Application for Queue: Breadth First Traversal of a Graph
while (!queue.isEmpty()) {
int v = queue.dequeue();
int[] nodesAdjacentToV = g.nodesAdjacentTo(v);
marked
for (int w=0; w<nodesAdjacentToV.length; w++) {
if (!marked[nodesAdjacentToV[w]]) {
parentOf[nodesAdjacentToV[w]] = v;
distTo[nodesAdjacentToV[w]] = distTo[v] + 1; distTo parentOf
marked[nodesAdjacentToV[w]] = true;
queue.enqueue(nodesAdjacentToV[w]); 0 0
}
}
1 ∞
} 2 ∞
4 3 ∞
2
9
1 4 ∞
8
5 5 ∞
10 6 ∞
3
S: 0 7 ∞
8 ∞
7 Queue: 9 ∞
6
0 10 ∞
“insertion “working
end” end” 68
Application for Queue: Breadth First Traversal of a Graph
while (!queue.isEmpty()) {
int v = queue.dequeue();
int[] nodesAdjacentToV = g.nodesAdjacentTo(v);
marked
for (int w=0; w<nodesAdjacentToV.length; w++) {
if (!marked[nodesAdjacentToV[w]]) {
parentOf[nodesAdjacentToV[w]] = v;
distTo[nodesAdjacentToV[w]] = distTo[v] + 1; distTo parentOf
marked[nodesAdjacentToV[w]] = true;
queue.enqueue(nodesAdjacentToV[w]); 0 0
}
}
1 ∞
} 2 ∞
4 3 ∞
2
9
1 4 ∞
8
5 5 ∞
10 6 ∞
3
S: 0 7 ∞
8 ∞
7 Queue: 9 ∞
6
0 10 ∞
69
Application for Queue: Breadth First Traversal of a Graph
while (!queue.isEmpty()) {
int v = queue.dequeue();
int[] nodesAdjacentToV = g.nodesAdjacentTo(v);
marked
for (int w=0; w<nodesAdjacentToV.length; w++) {
if (!marked[nodesAdjacentToV[w]]) {
parentOf[nodesAdjacentToV[w]] = v;
distTo[nodesAdjacentToV[w]] = distTo[v] + 1; distTo parentOf
marked[nodesAdjacentToV[w]] = true;
queue.enqueue(nodesAdjacentToV[w]); 0 0
}
}
1 1 0
} 2 ∞
4 3 1 0
2
9
1 4 ∞
8
5 5 ∞
10 6 ∞
3
S: 0 7 ∞
8 ∞
7 Queue: 9 ∞
6
here we assume without 1 3 10 ∞
loss of generality that
we inserted 3 first 70
Application for Queue: Breadth First Traversal of a Graph

while (!queue.isEmpty()) {
    int v = queue.dequeue();
    int[] nodesAdjacentToV = g.nodesAdjacentTo(v);
    for (int w=0; w<nodesAdjacentToV.length; w++) {
        if (!marked[nodesAdjacentToV[w]]) {
            parentOf[nodesAdjacentToV[w]] = v;
            distTo[nodesAdjacentToV[w]] = distTo[v] + 1;
            marked[nodesAdjacentToV[w]] = true;
            queue.enqueue(nodesAdjacentToV[w]);
        }
    }
}

[Figure: example graph with nodes 0-10 and start node S = 0; the slides step
through the while loop, showing the queue contents and the arrays marked,
distTo and parentOf after each iteration. Where a node has several unmarked
neighbours, we assume without loss of generality a fixed insertion order.]

final state after the traversal:

node:       0  1  2  3  4  5  6  7  8  9  10
distTo:     0  1  2  1  3  3  2  2  3  4  4
parentOf:   -  0  1  0  2  2  3  3  7  8  8

90
Recommendations for Studying

● minimal approach:
understand the contents of the slides and work with a selection of the
assignments on the homework sheets

● standard approach:
minimal approach + work on all assignments from the homework sheets

● interested students
standard approach + study the slide-decks “Introduction”, “Efficiency”, and
“Data Structures for Sequences” in [3] OR read the corresponding parts in
[4]

91
Part III.2: Recursion

92
Recursion

Important application for Stacks:
resolving / implementing recursion

93
Recursion
● possible definitions of Recursion:
○ define a function using the function itself
○ formulate the solution for a problem with recourse to the solution itself:
provide a direct solution for base cases and provide rules how a general
instance of the problem may be divided / simplified in the direction of the
base cases.
○ a method that calls itself

Examples:

• Factorial:   n! = 1              if n = 0
               n! = (n − 1)! · n   if n > 0

• Fibonacci:   f(n) = 1                       if n = 1
               f(n) = 1                       if n = 2
               f(n) = f(n − 1) + f(n − 2)     if n > 2
94
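The Fibonacci definition translates into Java just like the factorial on the following slide; the method name fib is chosen here, not taken from the slides. (Note that this direct translation recomputes subproblems and therefore has exponential run time.)

```java
// Direct translation of the recursive Fibonacci definition:
// f(1) = f(2) = 1, f(n) = f(n-1) + f(n-2) for n > 2.
public class Fib {
    static long fib(int n) {
        if (n == 1 || n == 2) {            // base cases
            return 1;
        } else {                           // recursive case
            return fib(n - 1) + fib(n - 2);
        }
    }

    public static void main(String[] args) {
        System.out.println(fib(10)); // prints 55
    }
}
```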
Recursion
● possible definitions of Recursion:
○ define a function using the function itself
○ formulate the solution for a problem with recourse to the solution itself:
provide a direct solution for base cases and provide rules how a general
instance of the problem may be divided / simplified in the direction of the
base cases.
○ a method that calls itself

Factorial:   n! = 1 if n = 0,   n! = (n − 1)! · n if n > 0

long factorial(int n) {
    long temp;
    if (n == 0) {
        return 1;
    } else {
        temp = factorial(n-1);
        return n * temp;
    }
}
95
Call Stack

For each method call (recursive or non-recursive), local variables + parameters + return
address (“who called the method from where?”) are stored on the Call-Stack.

public class SomeClass {
    int a;
    int b;

    int someMethodOne(int paramOne1, int paramOne2){
        int localOne = 5;
        int result = someMethodTwo(17) + localOne;
        return result;
    }

    int someMethodTwo(int paramTwo){
        int localTwo = 8;
        return paramTwo * localTwo;
    }
}

…
SomeClass someObject = new SomeClass();
int bbb = someObject.someMethodOne(12, 34);
…

[Figure: schematic memory layout while someMethodTwo is executing;
of course this is a very coarse schematic chart only!]

memory area for objects („Heap“):
  2345   someObject          <2346>
  2346   someObject.a
  2347   someObject.b

„Call Stack“ (insertion end at the top; <5467> and <5899> are the code
addresses of the two calling statements):
  6000   (return address)    <5467>   \
  6001   (calling object)    <2346>    | frame of someMethodTwo
  6002   paramTwo            17        |
  6003   localTwo            8        /
  6004   (return address)    <5899>   \
  6005   (calling object)    <…>       | frame of someMethodOne
  6006   paramOne1           12        |
  6007   paramOne2           34        |
  6008   localOne            5        /

98
Recursive Method Calls – Call Stack

long factorial(int n) {
    long temp;
    if (n == 0) {
        return 1;
    } else {
        temp = factorial(n-1);
        return n * temp;
    }
}

…
int ccc = factorial(3);
…

Each call pushes a new stack frame holding n and temp (each frame also
contains the return address and the calling object; we leave those out
here for legibility). For factorial(3) the Call-Stack evolves as follows:

growing phase (one new frame per recursive call):
  (n=3, temp=0) → (n=2, temp=0) → (n=1, temp=0) → (n=0, temp=0)

shrinking phase (the base case n == 0 returns 1; each remaining frame is
popped after computing n * temp):
  n=1 returns 1·1 = 1 → n=2 returns 2·1 = 2 → n=3 returns 3·2 = 6

finally ccc == 6.

116
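The push/pop sequence of the frames can also be observed programmatically. The trace list below is an illustrative addition of ours, not part of the original method:

```java
import java.util.ArrayList;
import java.util.List;

// factorial(3) with a trace of calls and returns, mirroring how frames
// are pushed onto and popped off the call stack.
public class FactorialTrace {
    static List<String> trace = new ArrayList<>();

    static long factorial(int n) {
        trace.add("call " + n);            // a new stack frame is pushed
        long result = (n == 0) ? 1 : n * factorial(n - 1);
        trace.add("return " + result);     // the frame is popped again
        return result;
    }

    public static void main(String[] args) {
        System.out.println(factorial(3));  // prints 6
        System.out.println(trace);
        // [call 3, call 2, call 1, call 0, return 1, return 1, return 2, return 6]
    }
}
```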
Exercise A
Suitably complement the following two versions of a java method to
compute the factorial of a positive integer!

public long factorialNonRecursive(long argument){


long result = 1;
for(int i=0; i < argument; i++){
result = result * (argument - i);
}
return result;
}

public long factorialRecursive(long argument){


long result;
if(argument == 0)
result = 1;
else
result = argument * factorialRecursive(argument - 1);
return result;
}
117
Exercise A

public long factorialNonRecursive(long argument){


long result = 1;
for(int i=0; i < argument; i++){
result = result * (argument - i);
}
return result;
}

public long factorialRecursive(long argument){


long result;
if(argument == 0)
result = 1;
else
result = argument * factorialRecursive(argument - 1);
return result;
}
118
Exercise B

Suitably complement the following implementation!

double exponentialFunction(int n){


double result;
if(n==0){
result = 1.0;
} else {
result = 2.71828 * exponentialFunction(n - 1);
}
return result;
}

119
Exercise B

double exponentialFunction(int n){


double result;
if(n==0){
result = 1.0;
} else {
result = 2.71828 * exponentialFunction(n - 1);
}
return result;
}

120
Recommendations for Studying

● minimal approach:
understand the contents of the slides and work with a selection of the
assignments on the homework sheets

● standard approach:
minimal approach + work on all assignments from the homework sheets

● interested students
standard approach + read
https://en.wikipedia.org/wiki/Recursion
https://en.wikipedia.org/wiki/Factorial
https://en.wikipedia.org/wiki/Tower_of_Hanoi

121
Part III.3: Hashing

122
based on elements from [1],[3],[4]
Java Class String

● Class String manages constant strings in Java


(StringBuffer: class for variable strings)

String someString1 = "didumdidei";


String someString2 = new String("tirilitirilo");
boolean isEqual = someString1.equals(someString2);
int length2 = someString2.length();
char[] charArray2 = someString2.toCharArray();
StringBuffer someStringBuffer = new StringBuffer("arghh");
someStringBuffer.append("hhh");
123
Java Class String

● Class String manages constant strings in Java


(StringBuffer: class for variable strings)

● equals() (tests for (structural) equality):


instead of character-wise comparison (expensive: O(n))
implemented via a String-Pool and == operator (tests for
identity (“the same“)).

String someString1 = "didumdidei";


String someString2 = new String("tirilitirilo");
boolean isEqual = someString1.equals(someString2);
int length2 = someString2.length();
char[] charArray2 = someString2.toCharArray();
StringBuffer someStringBuffer = new StringBuffer("arghh");
someStringBuffer.append("hhh");
124
Java Class String

● equals() (tests for (structural) equality):
  instead of character-wise comparison (expensive: O(n)),
  implemented via a String-Pool and the == operator (tests for
  identity (“the same”)).

● String-Pool: every string that exists in the system is
  contained once; new strings are inserted if not yet in
  there.

● a string string1 is created → insert into String-Pool? →
  look up whether it is already contained → how is “looking up”
  realized?

[Figure: String-Pool containing e.g. „ein maennlein steht im walde“, „horst“,
„toleranz“, „tirilitirilo“, „didumdidei“, „puh“, „hannah“, „wunderbar“, „ԄԇԉԎԄԋ“]

125
Java Class String

how is “looking up” realized?

○ idea 1: store strings in a sequence (dynamic
  array or linked list) → sequential search for an
  element
  → O(n): expensive!

○ idea 2: sorted array (e.g. with lexicographic
  order on strings or via an integer encoding) +
  binary search (example: telephone book)
  → O(log n): OK.

○ idea 3: use Hashing (dictionary)
  → O(1)

[Figure: String-Pool containing e.g. „ein maennlein steht im walde“, „horst“,
„toleranz“, „tirilitirilo“, „didumdidei“, „puh“, „hannah“, „wunderbar“, „ԄԇԉԎԄԋ“]

126
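The pooling behaviour can be observed directly in Java: string literals are interned, a String created with new lives outside the pool until intern() is called. A small demo (class name chosen here):

```java
// Observing the String-Pool: literals are interned, so == (identity)
// already agrees with equals() for them; a String created with new
// is a separate object on the heap.
public class PoolDemo {
    public static void main(String[] args) {
        String a = "didumdidei";              // goes into the pool
        String b = "didumdidei";              // reuses the pooled object
        String c = new String("didumdidei");  // a fresh object on the heap

        System.out.println(a == b);           // true  (same pooled object)
        System.out.println(a == c);           // false (different objects)
        System.out.println(a.equals(c));      // true  (same characters)
        System.out.println(a == c.intern());  // true  (intern() returns the pooled object)
    }
}
```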
Hashing
● assumption: each element e of a data structure has a key key(e) (analogous to
keys in data-bases)
Examples:
Strings → use character sequence itself or an integer encoding as key
Bicycles → use frame-number as key
Java Bicycle objects → use address in memory or function thereof

● K: set of possible keys of an element type

● hash function h: K → [0, m − 1]

● Dictionary (HashMap, Associative Array, Hash-Table) S: stores elements e ∈ E


(or references to elements) under their key key(e) in an array.

● Operations:
○ S.insert(Element e): insert e in S .
○ S.remove(Key k): delete e with key(e)=k
○ S.find(Key k): returns e with key(e)=k (if contained; else return ⊥ )

127
Hashing

Keys (e.g. Integers): 22, 2, 1, 19, 84, 7, 11

[Figure: hash table array with indices 0 … m−1; h maps the keys to the
indices 30, 121, 663, 842, 870, 871 and 899; each occupied entry holds an
Item referencing key and element, e.g. (key=22, e22) at index 30.]

class Item<ElementType,Key>{
    ElementType e;
    Key k;
}

sometimes simplifying for the sake of legibility:
identify Items (e,k) with elements e or keys k

simplified representation:

index:  …  30  …  121  …  663  …  842  …  870  871  …  899  …
key:       22      1       2       19      7    84      11

129
Hashing

● requirements for a hash function:


○ space saving (e.g. ideally surjective)
○ good spreading / scattering / dispersion over the array
○ efficiently computable
○ …

● ideal case: h computable in O(1) and each element e is stored alone under
its index h(key(e)) → find, insert, remove can be realized in O(1)

void insert(ElementType e){
    hashTable[h(key(e))] = e;  // (1)
}

Item<ElementType,Key> find(Key k){
    return hashTable[h(k)];
}

void remove(Key k){
    hashTable[h(k)] = null;
}

(1) more precisely we should write: hashTable[h(key(e))] = new Item(key(e),e); of course, key(e) should then be
stored instead of being computed twice. 130
Hashing

● unfortunately in practice: a lot of empty table / array entries and


collisions (keys are mapped to the same index (h not injective))

● probability of collisions: example: assume a randomized hash function; n
keys are to be mapped to m positions / indices:

P(no collision at the i-th key) = (m − (i − 1)) / m

P(no collision for n keys) = ∏_{i=1}^{n} (m − (i − 1)) / m = ∏_{i=0}^{n−1} (1 − i/m)

E.g. for n = 23 and m = 365 we have P(no collision) < 0.5
131
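The product is easy to evaluate numerically; the class and method names below are our own choice:

```java
// Evaluates P(no collision) = prod_{i=0}^{n-1} (1 - i/m) for a
// randomized hash function mapping n keys to m slots.
public class CollisionProbability {
    static double pNoCollision(int n, int m) {
        double p = 1.0;
        for (int i = 0; i < n; i++) {
            p *= 1.0 - (double) i / m;
        }
        return p;
    }

    public static void main(String[] args) {
        // the "birthday" numbers from the slide: 23 keys, 365 slots
        System.out.println(pNoCollision(23, 365)); // about 0.4927, i.e. < 0.5
    }
}
```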
Constructing Hash Functions
Example for a family of good hash functions h_x,
parameterized by a vector x of g integers: x = (x_1, x_2, …, x_g)

● choose m as prime (reason: [3],[4])

● interpret a key of f bits as a tuple of g binary numbers (k_1, k_2, …, k_g)
  e.g. f=32, g=4:  11011010  01001011  01101101  00110111
                    k_1=218   k_2=75    k_3=109   k_4=55

● h_x is then defined using the scalar product and the modulo operation:
  h_x(k) = (x · k) mod m

● → provably few collisions (see [3],[4])

132
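A sketch of one member of this family for the slide's example (f = 32, g = 4). The table size m = 97 and the parameter vector x are arbitrary example choices, not values from the slides:

```java
// Sketch of the hash family h_x(k) = (x . k) mod m for 32-bit keys
// split into g = 4 bytes; m is chosen prime, x is an arbitrary example.
public class HashFamily {
    static int m = 97;                  // table size, chosen prime
    static int[] x = {3, 17, 42, 80};   // parameter vector (example values)

    static int hash(int key) {
        long sum = 0;
        for (int i = 0; i < 4; i++) {
            int ki = (key >>> (8 * (3 - i))) & 0xFF; // i-th byte of the key
            sum += (long) x[i] * ki;                 // scalar product x . k
        }
        return (int) (sum % m);
    }

    public static void main(String[] args) {
        // the key from the slide: 11011010 01001011 01101101 00110111
        int key = 0b11011010_01001011_01101101_00110111;
        System.out.println(hash(key)); // prints 43
    }
}
```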
Collisions: Solution 1: Chaining

Idea: instead of having element references or (key, element-reference) pairs as array
elements, we now have sequences (e.g. linked lists) of element references or (key,
element-reference) pairs as array elements.

[Figure: the hash table for the keys 22, 2, 1, 19, 84, 7, 11; keys that hash
to the same index now share a linked list at that index, e.g.
(key=1, e1) → (key=2, e2).]

133
Collisions: Solution 1: Chaining

List<Item>[] hashTable = new List<Item>[m];

void insert(ElementType e){
    hashTable[h(key(e))].insert(e);  // (1)
}

void remove(Key k){
    hashTable[h(k)].remove(k);
}

Item find(Key k){
    return hashTable[h(k)].find(k);
}

e.g. roughly as in java.util.Hashtable

(1) more precisely we should of course write:
hashTable[h(key(e))].insert(new Item(key(e),e));

134
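Using java.util's lists, chaining can be sketched as a small class for int keys. All names are our own; the simple modulo hash function and the duplicate check in insert are simplifications of the slides' Item-based version (note that generic array creation requires an unchecked cast in Java):

```java
import java.util.LinkedList;

// Sketch of a hash table with chaining: each slot holds a linked list
// of the keys that hash to the same index.
public class ChainingTable {
    int m;
    LinkedList<Integer>[] slots;

    @SuppressWarnings("unchecked")
    ChainingTable(int m) {
        this.m = m;
        slots = new LinkedList[m];                  // generic array: unchecked in Java
        for (int i = 0; i < m; i++) slots[i] = new LinkedList<>();
    }

    int h(int k) { return ((k % m) + m) % m; }      // simple example hash function

    void insert(int k)  { if (!find(k)) slots[h(k)].add(k); }
    void remove(int k)  { slots[h(k)].remove(Integer.valueOf(k)); }
    boolean find(int k) { return slots[h(k)].contains(k); }

    public static void main(String[] args) {
        ChainingTable t = new ChainingTable(13);
        t.insert(22); t.insert(35);                 // 22 and 35 collide (both map to 9)
        System.out.println(t.find(22) + " " + t.find(35)); // prints true true
        t.remove(22);
        System.out.println(t.find(22));             // prints false
    }
}
```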
Collisions: Solution 2: (Linear) Probing

idea: store element e with i == h(key(e)) at the next free index i, i+1, i+2, …

[Figure: the keys 22, 2, 1, 19, 84, 7, 11 in the array; colliding keys occupy
the next free index after their hash position, e.g. at the neighbouring
indices 842, 843, 844.]

135
Collisions: Solution 2: (Linear) Probing

idea: store element e with i == h(key(e)) at the next free index i, i+1, i+2, …

insert(ElementType e){
    i = h(key(e));
    while((hashTable[i] != null) && (hashTable[i].element != e))
        i = (i+1) % m;
    hashTable[i] = new Item(key(e),e);
}

Item find(Key k){
    i = h(k);
    while((hashTable[i] != null) && (hashTable[i].key != k))
        i = (i+1) % m;
    return hashTable[i];
}

136
Collisions: Solution 2: (Linear) Probing

idea: store element e with i == h(key(e)) at the next free index i, i+1, i+2, …

attention!: remove requires modifications:

for every element e with ideal index i == h(key(e)) that was stored under index j,
we have to ensure that the indices i, i+1, …, j are occupied (e.g. using dummy
symbols or via moving elements).

137
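The "dummy symbol" variant can be sketched as follows for int keys. All names (ProbingTable, DELETED) are our own, the sentinel value and the modulo hash are example choices, and the sketch assumes the table never fills up completely:

```java
// Linear probing with a dummy marker for deleted slots: null means
// "never occupied" (a probe may stop there), DELETED marks a removed
// entry that probes must walk past.
public class ProbingTable {
    static final Integer DELETED = Integer.MIN_VALUE; // dummy symbol (sentinel)
    int m = 13;
    Integer[] table = new Integer[m];

    int h(int k) { return ((k % m) + m) % m; }

    void insert(int k) {
        int i = h(k);
        // stop at a free slot, at the key itself, or reuse a DELETED slot
        while (table[i] != null && !table[i].equals(k) && !table[i].equals(DELETED))
            i = (i + 1) % m;
        table[i] = k;
    }

    boolean find(int k) {
        int i = h(k);
        while (table[i] != null) {         // DELETED slots are skipped, not stops
            if (table[i].equals(k)) return true;
            i = (i + 1) % m;
        }
        return false;
    }

    void remove(int k) {
        int i = h(k);
        while (table[i] != null) {
            if (table[i].equals(k)) { table[i] = DELETED; return; }
            i = (i + 1) % m;
        }
    }

    public static void main(String[] args) {
        ProbingTable t = new ProbingTable();
        t.insert(22); t.insert(35);        // both hash to 9; 35 lands at 10
        t.remove(22);                      // slot 9 becomes DELETED, not null
        System.out.println(t.find(35));    // prints true: the probe walks past DELETED
    }
}
```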
Changing Hash Table Size

● for hashing to be as collision free as possible: n < m,


but m should not be too large (waste of memory).

● à if m too small or too large: re-allocation:

○ choose new hash table size m‘ , where m‘ should


be prime (reason: see [4]) (amortizedly
efficiently possible)

○ choose a new hash function h′: K → [0, m′ − 1]

○ copy the elements to the new hash table
138
Perfect Hashing

● Goal: linked list sequences (in case of chaining) or


series of occupied replacement positions (in case of lin. probing)
should be as short as possible.

● otherwise: find may degenerate into linear search over all elements
→ worst case: O(n)

139
Perfect Hashing

● Goal: linked list sequences (in case of chaining) or


series of occupied replacement positions (in case of lin. probing)
should be as short as possible.

● otherwise: find may degenerate into linear search over all elements
→ worst case: O(n)

● → goal: perfect hashing without collisions (→ find: always O(1) )

● idea (at least with static hash table size m and static number of
elements n): two-step hashing:

○ h_1 with a few collisions but with good capacity utilization of the
m indices: maps to buckets (= smaller hash tables) of constant
average size

○ h_2 without collisions for each bucket to resolve collisions of first


step
140
Exercise A
In a fictitious hash table without collisions, insert, remove and find could be
implemented like this

In a hash table with chaining, the operations need to be adapted. Suitably complete
the missing instructions!

141
Exercise A
In a fictitious hash table without collisions, insert, remove and find could be
implemented like this

In a hash table with chaining, the operations need to be adapted. Suitably complete
the missing instructions!

142
Exercise B
We want to insert into a HashMap with 12 slots the following elements:
e1, e2, e3, e4, e5, e6
with the keys
key(e1) = 82, key(e2) = 5, key(e3) = 103, key(e4) = 93, key(e5) = 11, key(e6) = 138
via a hash function h with the values:
h(82) = 2, h(5) = 2, h(103) = 2, h(93) = 5, h(11) = 2, h(138) = 10

1. Insert the elements with Linear Probing and provide the final result by inserting the keys
into the following schema:

What has to be regarded concerning the later deletion of the element e2?

2. Insert the elements with Chaining! Use the following schema in a suitable way!

143
143
Exercise B

when deleting e2 with key(e2) = 5: in order to be able to find back the elements with keys 103 and 11 (which
are mapped to the same index), you either have to write a special symbol instead of key 5 or move the
elements with keys 103 and 11 in a suitable way.

144
Part III.4: Searching

145
Sorted Sequences – Binary Search

● we have seen: if the hash function and the proportion of m and n are
chosen well: → find(Key k) may be realized with hashing in O(1) (as well
as insert, remove)

● but: hashing destroys the order of elements / items


→ the operation locate

locate(Key k) : return element e (or key(e)) with minimal key


where key(e) ≥ k

cannot be efficiently implemented with hashing. (→ i.e. range queries are


also not possible with hashing)

146
Sorted Sequences – Binary Search

● idea: use a sorted array and binary search to implement locate → locate (as
well as find) may be realized in O(log n) (insert and remove will, of course,
still always require Θ(n) (because of necessary shifting of elements))

● binary search on ascendingly sorted array: recursive principle:


○ start search in the middle
○ if sought key is larger: apply binary search to right sub-list
○ if sought key is smaller: apply binary search to left sub-list

147
Binary Search – Java Code of locate
public static int locate(int key, int[] a) { //a must be sorted ascendingly
int lowerIndex = 0;
int higherIndex = a.length - 1;
if(key > a[higherIndex])
return -1; //key is larger than the largest key in a --> treat this case extra
int middleIndex = 0;
while(lowerIndex <= higherIndex) { //while we are not finished with searching
middleIndex = lowerIndex + (higherIndex - lowerIndex) / 2; //choose new mid
if(key < a[middleIndex])
higherIndex = middleIndex - 1; //continue search in lower half
else if(key > a[middleIndex])
lowerIndex = middleIndex + 1; //continue search in upper half
else
return middleIndex; //we found it!
}
//reaching this part of the code means that a does not contain key
//--> return index of smallest key k' with k' > k :
if(key <= a[middleIndex])
return middleIndex;
else
return middleIndex + 1;
}

148
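The method can be exercised on the sorted example array used in the walkthrough on the following slides; the demo class is our own wrapper around the locate code above:

```java
// Exercising locate on the sorted example array from the slides.
public class LocateDemo {
    public static int locate(int key, int[] a) { // a must be sorted ascendingly
        int lowerIndex = 0;
        int higherIndex = a.length - 1;
        if (key > a[higherIndex])
            return -1; // key is larger than the largest key in a
        int middleIndex = 0;
        while (lowerIndex <= higherIndex) {
            middleIndex = lowerIndex + (higherIndex - lowerIndex) / 2;
            if (key < a[middleIndex])
                higherIndex = middleIndex - 1; // continue in lower half
            else if (key > a[middleIndex])
                lowerIndex = middleIndex + 1;  // continue in upper half
            else
                return middleIndex;            // found
        }
        // a does not contain key -> return index of smallest key k' >= key
        if (key <= a[middleIndex])
            return middleIndex;
        else
            return middleIndex + 1;
    }

    public static void main(String[] args) {
        int[] a = {2, 3, 4, 28, 32, 40, 80, 120, 128, 140,
                   141, 144, 147, 152, 159, 163, 173, 189, 193, 198};
        System.out.println(locate(147, a)); // prints 12 (147 is contained at index 12)
        System.out.println(locate(149, a)); // prints 13 (152 is the smallest key >= 149)
    }
}
```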
Binary Search

 0  1  2  3   4   5   6   7    8    9    10   11   12   13   14   15   16   17   18   19
 2  3  4  28  32  40  80  120  128  140  141  144  147  152  159  163  173  189  193  198

locate key 147 (index positions lowerIndex, middleIndex, higherIndex per step):

step 1: lowerIndex=0,  middleIndex=9,  higherIndex=19  (a[9]=140 < 147 → search right half)
step 2: lowerIndex=10, middleIndex=14, higherIndex=19  (a[14]=159 > 147 → search left half)
step 3: lowerIndex=10, middleIndex=11, higherIndex=13  (a[11]=144 < 147 → search right half)
step 4: lowerIndex=12, middleIndex=12, higherIndex=13  (a[12]=147 → found, return 12)

locate key 149:

step 1: lowerIndex=0,  middleIndex=9,  higherIndex=19  (a[9]=140 < 149)
step 2: lowerIndex=10, middleIndex=14, higherIndex=19  (a[14]=159 > 149)
step 3: lowerIndex=10, middleIndex=11, higherIndex=13  (a[11]=144 < 149)
step 4: lowerIndex=12, middleIndex=12, higherIndex=13  (a[12]=147 < 149)
step 5: lowerIndex=13, middleIndex=13, higherIndex=13  (a[13]=152 > 149 → higherIndex=12,
        loop ends; 149 ≤ a[13]=152 → return 13)

150
Mini Exercise
binary search: mark the index positions lowerIndex, middleIndex, and higherIndex after
each essential step when executing locate(147)!

151
Binary Search

● locate: run time analysis:


crucial stements:

while(lowerIndex <= higherIndex) { //while we are not finished with searching
middleIndex = lowerIndex + (higherIndex - lowerIndex) / 2; //choose new mid

how often can you halve a list of length n? Answer: about log₂(n) times
→ run time is O(log n)

● still a severe disadvantage of binary search with a sorted array: insert
and remove require O(n) time (as mentioned before)

● alternatively, using a linked list instead of an array: insert and remove need
O(1) time, but the required index operator [.] needs O(n) → the O(log n)
search time is broken ☹

● → idea: sorted linked list with additional search data-structure


“implementing” binary search 152
Search Structures

sorted linked list with additional search / navigation data-structure :

Navigation Structure

[4]

153
Trees

● undirected, connected, acyclic graph: “tree“


(equivalent: a unique path exists between any two nodes)

154
Trees

● undirected, connected, acyclic graph: “tree“


(equivalent: a unique path exists between any two nodes)

● “directed tree“: directed, connected graph


○ that is a tree if directions of edges are ignored and
○ there is a unique root node such that each node
can be reached via a unique path from the root node

155
Trees

● undirected, connected, acyclic graph: “tree“


(equivalent: a unique path exists between any two nodes)

● “directed tree“: directed, connected graph


○ that is a tree if directions of edges are ignored and
○ there is a unique root node such that each node
can be reached via a unique path from the root node

● directed trees: notions:


○ root: node with in-degree 0
○ leaf: node with out-degree 0
○ inner node: in-degree = 1 & out-degree ≥ 1
156
Exercise

(a) Is the following graph a tree?

(b) Mark all leaves in the following tree!

157
Exercise

(a) Is the following graph a tree?

No, it contains cycles!

(b) Mark all leaves in the following tree!

158
Binary Search Trees
● as a Search Structure:
Binary Search Tree: directed tree with the following properties:
○ each node has at most 2 children (out-degree ≤ 2)
○ for every node v with key k:
all keys in the left sub-tree under v are smaller than k,
all keys in the right sub-tree under v are larger than k

[4] 159
Binary Search Trees
● as a Search Structure:
Binary Search Tree: directed tree with the following properties:
○ each node has at most 2 children (out-degree ≤ 2)
○ for every node v with key k:
all keys in the left sub-tree under v are smaller than k,
all keys in the right sub-tree under v are larger than k

● -- nodes of the search tree are marked with keys from the linked list
-- each key from the list occurs exactly once in the tree
-- additional edges from the leaves
of the tree to the
items (key(e), e)
of the list

[4] 160
Binary Search Trees: locate
● locate(Key k) operation:
○ start in root node
○ for every node reached with key k‘: if k ≤ k‘, goto left child, else goto right child.

Example: locate(9)

[4] 161
Binary Search Trees: locate
● locate(Key k) operation:
○ start in root node
○ for every node reached with key k‘: if k ≤ k‘, goto left child, else goto right child.

● if tree is balanced (i.e. has height O(log n)) → locate: O(log n)

Example: locate(9)

[4] 162
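The descent rule above can be sketched in plain Java. Note this is a simplified sketch of a plain binary search tree with an equality test in the inner nodes, whereas the slides' variant decides only at the leaf-attached list; class and field names are our own, not from the slides:

```java
// Minimal BST node and descent following the rule above:
// go left when k <= node key, else go right.
class Node {
    int key;
    Node left, right;
    Node(int key) { this.key = key; }
}

class SearchTree {
    Node root;

    // returns true if k occurs as a key in the tree
    boolean locate(int k) {
        Node v = root;
        while (v != null) {
            if (k == v.key) return true;
            v = (k <= v.key) ? v.left : v.right; // descend per the rule above
        }
        return false;
    }
}
```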
Binary Search Trees: insert

insert(ElementType e):
○ first like locate(key(e)) until element e‘ is reached in linked list
○ if key(e’) > key(e): (i.e. element is not yet in linked list and tree)
‒ insert e before e‘ in linked list
‒ insert a new leaf into search tree with key key(e)

163
Binary Search Trees: insert insert(ElementType e):
○ first like locate(key(e)) until element e‘ is reached in linked list
○ if key(e’) > key(e): (i.e. element is not yet in linked list and tree)
‒ insert e before e‘ in linked list
‒ insert a new leaf into search tree with key key(e)

[4] 164
Binary Search Trees: insert insert(ElementType e):
○ first like locate(key(e)) until element e‘ is reached in linked list
○ if key(e’) > key(e): (i.e. element is not yet in linked list and tree)
‒ insert e before e‘ in linked list
‒ insert a new leaf into search tree with key key(e)

[4] 165
Binary Search Trees: remove
remove(Key k):
○ first like locate(k) until element e is reached in linked list
○ if key(e) = k: (i.e. element is contained at all in linked list and tree)
‒ delete e from linked list
‒ delete tree-father v of e from the tree
‒ set in tree node w with key(w)=k (unless it has been deleted in
the previous step) the new value key(w) = key(v)

166
Binary Search Trees: remove
remove(Key k):
○ first like locate(k) until element e is reached in linked list
○ if key(e) = k: (i.e. element is contained at all in linked list and tree)
‒ delete e from linked list
‒ delete tree-father v of e from the tree
‒ set in tree node w with key(w)=k (unless it has been deleted in
the previous step) the new value key(w) = key(v)

[4] 167
Binary Search Trees: remove
remove(Key k):
○ first like locate(k) until element e is reached in linked list
○ if key(e) = k: (i.e. element is contained at all in linked list and tree)
‒ delete e from linked list
‒ delete tree-father v of e from the tree
‒ set in tree node w with key(w)=k (unless it has been deleted in
the previous step) the new value key(w) = key(v)

[4] 168
Binary Search Trees: remove
remove(Key k):
○ first like locate(k) until element e is reached in linked list
○ if key(e) = k: (i.e. element is contained at all in linked list and tree)
‒ delete e from linked list
‒ delete tree-father v of e from the tree
‒ set in tree node w with key(w)=k (unless it has been deleted in
the previous step) the new value key(w) = key(v)

[4] 169
Exercise

Binary Search Trees: Draw the tree after remove(28)!

170
Exercise

Binary Search Trees: Draw the tree after remove(28)!

171
Binary Search Trees: Problem

● if tree is balanced (i.e. has height O(log n)) → locate, remove, insert:
O(log n) time

● if tree is not balanced → in the extreme case it degenerates into a linear list
(e.g. via inserting in sorted order) → operations cost Θ(n)

[4] 172
AVL Trees

● solution: keep balance of tree (after each operation possibly rebalancing


necessary)

● many different approaches: AVL trees, (a,b)-trees, red-black-trees etc.

● AVL trees: in every node store the height difference Δh = h_right − h_left
between the right and left sub-tree under that node.
Goal for all nodes: Δh ∈ {−1, 0, 1}.

● insert and remove: book-keeping: correct Δℎ-values up to root

● re-balancing methods if |Δh| ≥ 2:
○ Rotations to the right or left,
○ Double-Rotations to the right or left.

174
AVL Trees: Rotations

● example for the case that requires a rotation to the left:

(figure: left: A with Δh = +1 above B with Δh = 0;
e.g. via inserting into the green sub-tree we might get:
right: A with Δh = +2 above B with Δh = +1)
175
AVL Trees: Rotations

● Rotation to the left:

(figure: before: A with Δh = +2 and right child B with Δh = +1;
after the rotation to the left: B with Δh = 0 is the new root, A with Δh = 0 its left child)

176
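The left rotation shown above boils down to a small pointer rearrangement. A minimal sketch (node class and names are our own; book-keeping of the Δh values is omitted):

```java
// Sketch of a left rotation around node a: a's right child b becomes
// the new subtree root, b's former left subtree is re-hung under a.
class AvlNode {
    int key;
    AvlNode left, right;
    AvlNode(int key) { this.key = key; }
}

class Rotations {
    // returns the new root of the rotated subtree
    static AvlNode rotateLeft(AvlNode a) {
        AvlNode b = a.right; // b moves up
        a.right = b.left;    // b's left subtree becomes a's right subtree
        b.left = a;          // a becomes b's left child
        return b;
    }
}
```

A rotation to the right is the mirror image; a double rotation is one rotation on the child followed by one on the node itself.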
AVL Trees: Double Rotations
E.g. via inserting into the green sub-tree, we might arrive in the following
situation:

(figure: A with Δh = +2, its child B with Δh = −1, and B's child X with Δh = +1;
healing: a double rotation to the left makes X the new root with Δh = 0,
with A and B as its children)
177
Exercise

In an AVL tree a sub-optimal state has been reached by inserting into the
marked sub-tree. Correct that via a double rotation to the left! Also provide the
new Δh values for N1, N2 and N3!

178
Exercise

(figure: healing: a double rotation to the left, with N1 and N2 as children of
the new sub-tree root)

179
Part III.5: Sorting

180
[3], [4]
SelectionSort

● Strategy: choose smallest element in remaining input sequence and move


it to the end of the output sequence

(see also https://www.youtube.com/user/AlgoRythmics/videos ☺)

181
SelectionSort

● Strategy: choose smallest element in remaining input sequence and move


it to the end of the output sequence

public void selectionSort(int[] a){
    int n = a.length;
    for(int i=0; i<n; i++){
        //inner loop: move min({a[i],a[i+1],...,a[n-1]}) to position i:
        for(int j=i; j<n; j++){
            if(a[i] > a[j]) //if current element a[j] is smaller than
                            //current min-candidate a[i] swap them:
                swap(a[i],a[j]);
        }
    }
}

(figure: array positions 0 1 2 3 … i i+1 … n-1; positions before i hold the
sorted output, j runs over the remaining positions)

● run time: ∑_{i=0}^{n-1} Θ(n−i) = Θ(n²)
(with a better strategy for finding / managing the minimum, O(n log n) is also possible)
182
SelectionSort

● Strategy: choose smallest element in remaining input sequence and move
it to the end of the output sequence

Remark:
swap(a[i],a[j]) is pseudocode for easier understanding of the algorithm.

A correct Java call of a swapping method would rather have to look like this:

swap(a, i, j)

where the method would have to be declared e.g. as

private void swap(int[] a, int i, int j){
    int backup = a[i];
    a[i] = a[j];
    a[j] = backup;
}

● run time: ∑_{i=0}^{n-1} Θ(n−i) = Θ(n²)
(with a better strategy for finding / managing the minimum, O(n log n) is also possible)
183
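Putting the listing and the swap remark together gives one compilable class (a sketch; the class name is our own):

```java
// Compilable SelectionSort: the slide's pseudocode call swap(a[i],a[j])
// is replaced by the real Java swap method from the remark above.
public class SelectionSortDemo {
    public static void selectionSort(int[] a) {
        int n = a.length;
        for (int i = 0; i < n; i++) {
            // inner loop: move min({a[i],...,a[n-1]}) to position i
            for (int j = i; j < n; j++) {
                if (a[i] > a[j])
                    swap(a, i, j);
            }
        }
    }

    private static void swap(int[] a, int i, int j) {
        int backup = a[i];
        a[i] = a[j];
        a[j] = backup;
    }
}
```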
SelectionSort
public void selectionSort(int[] a){
int n = a.length;
for(int i=0; i<n; i++){
//inner loop: move min({a[i],a[i+1],...,a[n-1]}) to position i:
for(int j=i; j<n; j++){
if(a[i] > a[j]) //if current element a[j] is smaller than
//current min-candidate a[i] swap them:
swap(a[i],a[j]);
}
}
}

Example

[3]
184
InsertionSort

● Strategy: insert next element from remaining input sequence into the right
position in the output sequence (by “bubbling” down )

public void insertionSort(int[] a){


int n = a.length;
for(int i=1; i<n; i++){
//inner loop: move a[i] to the right position:
for(int j=i-1; j>=0; j--){
if(a[j] > a[j+1]) //if current element a[j+1]
//(in the beginning this is a[i])
//is smaller as left neighbor a[j], swap them:
swap(a[j],a[j+1]);
        }
    }
}

(figure: array positions 0 … i-1 hold the sorted output; a[i] bubbles down
via j to its correct position)

● run time: ∑_{i=1}^{n-1} Θ(i) = Θ(n²)
(with a better insertion strategy, O(n log n) is also possible)
185
InsertionSort
public void insertionSort(int[] a){
int n = a.length;
for(int i=1; i<n; i++){
//inner loop:move a[i] to the right position:
for(int j=i-1; j>=0; j--){
if(a[j] > a[j+1]) //if current element a[j+1]
//(in the beginning this is a[i])
//is smaller as left neighbor a[j], swap them:
swap(a[j],a[j+1]);
}
}
}

Example

[3]
186
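As with SelectionSort, the pseudocode swap can be replaced by a real method to obtain one compilable class (a sketch; the class name is our own):

```java
// Compilable InsertionSort: the slide's pseudocode call swap(a[j],a[j+1])
// is replaced by a real Java swap method.
public class InsertionSortDemo {
    public static void insertionSort(int[] a) {
        int n = a.length;
        for (int i = 1; i < n; i++) {
            // inner loop: move a[i] down to its position by adjacent swaps
            for (int j = i - 1; j >= 0; j--) {
                if (a[j] > a[j + 1])
                    swap(a, j, j + 1);
            }
        }
    }

    private static void swap(int[] a, int i, int j) {
        int backup = a[i];
        a[i] = a[j];
        a[j] = backup;
    }
}
```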
Exercise

We are given a run of insertionSort on the array

Complete the missing rows!

187
Exercise

188
MergeSort

● InsertionSort, SelectionSort, (Bubblesort…): easy to implement but quadratic


run time.
Can we do better? Yes! In O(n log n): Mergesort, Quicksort etc.

● Mergesort: recursive approach: partition sequence into two halves, sort each
half and subsequently merge the halves:

public void mergeSort(int[] a, int l, int r){ //sort from position l to r
    if (l == r) return;     // only one element, nothing to sort
    int m = (r + l)/2;      // choose mid
    mergeSort(a, l, m);     // sort left part
    mergeSort(a, m + 1, r); // sort right part
    //then merge both sorted parts:
    // ... (see the full listing on the next slides)
}

189
MergeSort

Example

[3]

190
MergeSort
public void mergeSort(int[] a, int l, int r){ //sort from position l to r
if (l == r) return; // we have only one element, nothing to sort;
int m = (r + l)/2; // choose mid
mergeSort(a, l, m); // sort left part
mergeSort(a, m + 1, r); // sort right part
//merge both sorted parts:
int j = l; //running variable for iterating left part
int k = m + 1; //running variable for iterating right part
int[] b = new int[r-l+1]; //intermediate storage for merging result
for(int i=0; i<r-l+1; i++){ //perform merge:
if(j > m){ //left part is used up --> use elements of right part
b[i] = a[k];
k++;
} else if(k>r){ //right part is used up --> use elements of left part
b[i] = a[j];
j++;
} else if(a[j]<=a[k]){ //element from the left part is smaller --> use it
b[i] = a[j];
j++;
} else { //element from the right part is smaller --> use it
b[i] = a[k];
k++;
}
}
for(int i=0; i<r-l+1; i++) //copy b back into respective parts of a:
a[l+i] = b[i];
}
191
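To sanity-check the listing above, here it is again as a self-contained class that can be run directly (the class name is our own; the logic is the slide's, only the merge bookkeeping is written more compactly):

```java
// Self-contained copy of the slide's mergeSort for a quick test run.
public class MergeSortDemo {
    public static void mergeSort(int[] a, int l, int r) {
        if (l == r) return;          // only one element, nothing to sort
        int m = (r + l) / 2;         // choose mid
        mergeSort(a, l, m);          // sort left part
        mergeSort(a, m + 1, r);      // sort right part
        int j = l;                   // iterates the left part
        int k = m + 1;               // iterates the right part
        int[] b = new int[r - l + 1];
        for (int i = 0; i < r - l + 1; i++) {
            if (j > m)             b[i] = a[k++]; // left part used up
            else if (k > r)        b[i] = a[j++]; // right part used up
            else if (a[j] <= a[k]) b[i] = a[j++]; // left element smaller
            else                   b[i] = a[k++]; // right element smaller
        }
        for (int i = 0; i < r - l + 1; i++) // copy b back into a
            a[l + i] = b[i];
    }
}
```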
Merge Step
for(int i=0; i<r-l+1; i++){ //perform merge:
if(j > m){ //left part is used up --> use elements of right part
b[i] = a[k];
k++;
} else if(k>r){ //right part is used up --> use elements of left part
b[i] = a[j];
j++;
} else if(a[j]<=a[k]){ //element from the left part is smaller --> use it
b[i] = a[j];
j++;
} else { //element from the right part is smaller --> use it
b[i] = a[k];
k++;
}

Example for merge

[3]
192
Exercise A

Using the provided schema, represent the MergeSort run on the input

193
Exercise A

20 71

194
Exercise B
Using MergeSort, we want to sort the array [6,5,4,3,2] in ascending order. Document the course of execution
with the given schema! Each grey box represents one call of the method mergeSort(int[] a, int l, int r).
• the first line in the box represents the array indices. Mark the lower (left) index l and the upper (right)
index r of the respective call!
• enter into the second line the values in the array at the beginning of the method call
• enter into the third line the values in the array at the end of the method call

call 1

call 2 call 7

call 3 call 6 call 8 call 9

call 4 call 5

195
Exercise B

call 1

call 2 call 7

call 3 call 6 call 8 call 9

call 4 call 5

196
Exercise C

Using the provided schema, represent the merge step of the


two sub-sequences (1,5,7,8,19) and (2,6,20,23)

197
Exercise C

198
Comparison Based Sorting in General

● Stated run times (InsertionSort, SelectionSort: Θ(n²), MergeSort:
O(n log n)) were worst case run times.

● Question: can we do better than O(n log n) worst case?
Answer: not without further assumptions on the set of elements to be
sorted. On the basis of comparisons only we cannot do better than
O(n log n) worst case.

● Example for such an assumption: if keys of elements are integers from [0, K-1]
→ use BucketSort:
-- Array b of K elements; each array element is some arbitrary sort of
sequence (e.g. linked list).
-- for each e from input sequence do:
b[key(e)].pushBack(e)
-- concatenate all sequences b[i]

run-time: Θ(n + K) (of course only good if K ∈ O(n log n) (K is “smaller”
than n log n)) 199
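The BucketSort steps above can be sketched directly in Java, using ArrayLists as the bucket sequences. A sketch under the assumption that the elements are their own keys (class and method names are our own):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of BucketSort for integer keys from [0, K-1]: one bucket per
// possible key, then concatenate the buckets in key order.
public class BucketSort {
    public static int[] bucketSort(int[] input, int K) {
        List<List<Integer>> b = new ArrayList<>();
        for (int i = 0; i < K; i++)
            b.add(new ArrayList<>());      // K empty buckets
        for (int e : input)
            b.get(e).add(e);               // b[key(e)].pushBack(e)
        int[] out = new int[input.length];
        int pos = 0;
        for (List<Integer> bucket : b)     // concatenate all buckets b[i]
            for (int e : bucket)
                out[pos++] = e;
        return out;
    }
}
```

Filling the buckets costs Θ(n), concatenating costs Θ(K + n), giving the stated Θ(n + K) overall.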
An Example Exam

200
select s.Name
from Students s
where s.MatrNr not in ( select ex.MatrNr
from examine ex );
208
Bibliography

(1) H. Seidl: Folien zu “Einführung in die Informatik 1” IN0001, WS 2011 / 2012, TUM

(2) StackOverflow: discussion forum about programming,
http://stackoverflow.com (URL, Oct 2014)

(3) Hanjo Täubig: Vorlesungsfolien Grundlagen Algorithmen und Datenstrukturen,


SS2013, TUM

(4) K. Mehlhorn, P. Sanders: Algorithms and Data Structures: the Basic Toolbox,
Springer 2008

226
Recommendations for Studying

● minimal approach:
understand the contents of the slides and practice with a selection of the
exercises on the homework sheets!

● standard approach:
minimal approach + read the respective pages from Mehlhorn Sanders [4].
Work on all the assignments from the homework sheets!

● interested students
== standard approach

227
