Chapter 3 Brute Force

Chapter 3 discusses brute-force algorithms, highlighting their straightforward approach and applicability in problems like sorting and searching. It details specific algorithms such as bubble sort, selection sort, and brute-force string matching, along with their strengths and weaknesses. The chapter also covers the efficiency of these algorithms and introduces problems like the traveling salesman and knapsack problems, illustrating the exhaustive search method.


Chapter 3

Brute Force

Copyright © 2007 Pearson Addison-Wesley. All rights reserved.


Brute Force

A straightforward approach, usually directly based on the problem’s statement and
definitions of the concepts involved

Examples:
1. Computing a^n (a > 0, n a nonnegative integer)

2. Computing n!

3. Multiplying two matrices

4. Searching for a key of a given value in a list
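The brute-force definitions of the first two examples translate directly into code. A minimal Python sketch (the function names are chosen here for illustration):

def power(a, n):
    """Compute a**n by its definition: multiply a by itself n times."""
    result = 1
    for _ in range(n):
        result *= a
    return result

def factorial(n):
    """Compute n! by its definition: 1 * 2 * ... * n."""
    result = 1
    for i in range(1, n + 1):
        result *= i
    return result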

Brute-Force Strengths and Weaknesses
Strengths:
• wide applicability
• yields reasonable algorithms for some important problems
  (e.g., matrix multiplication, sorting, searching, string matching)
• simplicity: when only a few instances of a problem need to be solved within an
  acceptable amount of time, the expense of designing a more efficient algorithm
  may be unjustifiable
Weaknesses:
• rarely yields efficient algorithms
• some brute-force algorithms are unacceptably slow
• not as constructive as some other design techniques

Brute Force algorithms

• Bubble sort
• Selection sort
• Sequential or linear search
• Brute-force string matching
• Closest-pair problem

Selection Sort algorithm

Time efficiency: Θ(n^2)
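A minimal Python sketch of selection sort (in-place, nondecreasing order; the function name is illustrative), consistent with the Θ(n^2) comparisons and n − 1 swaps discussed on the next slides:

def selection_sort(a):
    """Sort list a in place in nondecreasing order (brute force)."""
    n = len(a)
    for i in range(n - 1):                      # passes i = 0 .. n-2
        min_idx = i                             # index of the smallest element in a[i..n-1]
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]     # one swap per pass: n − 1 swaps in total
    return a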


Analysis of Selection Sort

Thus, selection sort is a Θ(n^2) algorithm on all inputs.

However, the number of key swaps is only Θ(n) or, more precisely, n − 1 (one for
each repetition of the i loop).

• This property distinguishes selection sort positively from many other sorting
  algorithms.
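A worked count of the basic operation (key comparisons) behind the Θ(n^2) claim:

\[
C(n) = \sum_{i=0}^{n-2} \sum_{j=i+1}^{n-1} 1
     = \sum_{i=0}^{n-2} (n-1-i)
     = \frac{n(n-1)}{2} \in \Theta(n^2).
\]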

Brute-Force Sorting Algorithm
Selection Sort: Scan the array to find its smallest element and swap it with the
first element. Then, starting with the second element, scan the elements to the
right of it to find the smallest among them and swap it with the second element.
Generally, on pass i (0 ≤ i ≤ n−2), find the smallest element in A[i..n−1] and
swap it with A[i]:

    A[0] ≤ . . . ≤ A[i-1] | A[i], . . . , A[min], . . . , A[n-1]
        in their final positions

Example: 7 3 2 5

Bubble Sort
• Another brute-force application to the sorting problem is to compare adjacent
  elements of the list and exchange them if they are out of order.
• By doing it repeatedly, we end up “bubbling up” the largest element to the last
  position on the list.
• The next pass bubbles up the second largest element, and so on, until after
  n − 1 passes the list is sorted.

Bubble Sort algorithm
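A minimal Python sketch of the basic version (in-place, nondecreasing order; the function name is illustrative):

def bubble_sort(a):
    """Repeatedly swap adjacent out-of-order elements; after pass i, a[n-1-i:] is final."""
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a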

Analysis of Bubble sort
• No. of comparisons
• No. of swaps
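Worked counts for the basic version (the number of comparisons is the same on every input; the number of swaps depends on the input, with the worst case for a decreasing array):

\[
C(n) = \sum_{i=0}^{n-2} \sum_{j=0}^{n-2-i} 1 = \frac{n(n-1)}{2} \in \Theta(n^2),
\qquad
S_{\mathrm{worst}}(n) = C(n) = \frac{n(n-1)}{2} \in \Theta(n^2).
\]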

Bubble Sort vs. Selection Sort

• Bubble sort is considered to be the simplest and least efficient algorithm,
  whereas selection sort is efficient compared with bubble sort.
• Bubble sort also consumes additional space for storing a temporary variable and
  needs many more swaps.

Improve efficiency of Bubble sort

• We can improve the crude version of bubble sort by exploiting the following
  observation: if a pass through the list makes no exchanges, the list has been
  sorted and we can stop the algorithm (see the sketch after this list).
• Though the new version runs faster on some inputs, it is still Θ(n^2) in the
  worst and average cases.
• In fact, even among elementary sorting methods, bubble sort is an inferior
  choice, and if it were not for its catchy name, you would probably have never
  heard of it.
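A minimal Python sketch of this improved version, using an exchange flag (same in-place, nondecreasing-order setup as before; the function name is illustrative):

def bubble_sort_improved(a):
    """Bubble sort that stops as soon as a whole pass makes no exchanges."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:          # no exchanges on this pass: the list is already sorted
            break
    return a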
Sequential Search
• A brute-force algorithm for the general searching problem:
• The algorithm simply compares successive elements of a given list with a given
  search key until either a match is encountered (successful search) or the list is
  exhausted without finding a match (unsuccessful search).
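A minimal Python sketch (returns the index of the first match, or -1 for an unsuccessful search; the function name is illustrative):

def sequential_search(a, key):
    """Brute force: compare the key with successive elements of the list."""
    for i, element in enumerate(a):
        if element == key:
            return i             # successful search
    return -1                    # unsuccessful search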

Analysis of sequential search
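A short sketch of the standard key-comparison counts, assuming a successful search ends at each of the n positions with equal likelihood and p is the probability that the search is successful:

\[
C_{\mathrm{worst}}(n) = n, \qquad
C_{\mathrm{avg}}(n) = \frac{p\,(n+1)}{2} + n\,(1-p) \in \Theta(n).
\]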

Enhanced Sequential search
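One standard enhancement (described in the textbook) appends the search key to the end of the list as a sentinel, so the loop is guaranteed to stop and need not test for the end of the list on every iteration. A minimal Python sketch of that idea, assuming it is acceptable to copy the list:

def sequential_search_sentinel(a, key):
    """Sequential search with the key appended as a sentinel."""
    a = list(a) + [key]                    # copy the list and append the sentinel
    i = 0
    while a[i] != key:                     # always terminates: the sentinel matches
        i += 1
    return i if i < len(a) - 1 else -1     # -1 means only the sentinel matched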

Brute-Force String Matching
pattern: a string of m characters to search for
text: a (longer) string of n characters to search in
problem: find a substring in the text that matches the pattern

Brute-force algorithm
Step 1 Align pattern at beginning of text
Step 2 Moving from left to right, compare each character of
pattern to the corresponding character in text until
all characters are found to match (successful search); or
a mismatch is detected
Step 3 While pattern is not found and the text is not yet
exhausted, realign pattern one position to the right and
repeat Step 2
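A minimal Python sketch of these three steps (returns the index of the first matching alignment, or -1 if there is none; the function name is illustrative):

def brute_force_string_match(text, pattern):
    """Try every alignment of pattern against text, left to right."""
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):             # possible starting positions in text
        j = 0
        while j < m and pattern[j] == text[i + j]:
            j += 1
        if j == m:                         # all m characters matched
            return i
    return -1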
Examples of Brute-Force String Matching

1. Pattern: 001011
Text: 10010101101001100101111010

2. Pattern: happy
Text: It is never too late to have a happy
childhood.
Pseudocode and Efficiency

Time efficiency: Θ(mn) comparisons (in the worst case)

Why? In the worst case, the algorithm may make up to m comparisons at each of the
n − m + 1 possible alignments of the pattern, for m(n − m + 1) ∈ Θ(mn) comparisons
in total.
Brute-Force Polynomial Evaluation
Problem: Find the value of the polynomial
    p(x) = a_n x^n + a_{n-1} x^{n-1} + … + a_1 x + a_0
at a point x = x_0

Brute-force algorithm
    p ← 0.0
    for i ← n downto 0 do
        power_of_x ← 1
        for j ← 1 to i do          // compute x^i
            power_of_x ← power_of_x * x
        p ← p + a[i] * power_of_x
    return p

Efficiency: Σ_{0 ≤ i ≤ n} i = n(n+1)/2 ∈ Θ(n^2) multiplications
Polynomial Evaluation: Improvement
We can do better by evaluating from right to left:

Better brute-force algorithm
    p ← a[0]
    power ← 1
    for i ← 1 to n do
        power ← power * x
        p ← p + a[i] * power
    return p
Efficiency: Θ(n) multiplications

Horner’s Rule is another linear time method.
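A runnable Python sketch of both linear-time methods, assuming the coefficient list a holds a[0], a[1], …, a[n]:

def poly_eval_right_to_left(a, x):
    """Evaluate a[0] + a[1]*x + ... + a[n]*x**n with one multiplication per new power."""
    p = a[0]
    power = 1
    for i in range(1, len(a)):
        power *= x                 # power == x**i
        p += a[i] * power
    return p

def poly_eval_horner(a, x):
    """Horner's rule: p = (...(a[n]*x + a[n-1])*x + ...)*x + a[0]."""
    p = 0
    for coeff in reversed(a):      # process a[n], a[n-1], ..., a[0]
        p = p * x + coeff
    return p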

Closest-Pair Problem
• The closest-pair problem calls for finding the two closest points in a set of n
  points.
• Points in question can represent such physical objects as airplanes or post
  offices as well as database records, statistical samples, DNA sequences, and so on.
• An air-traffic controller might be interested in two closest planes as the most
  probable collision candidates.
• A regional postal service manager might need a solution to the closest-pair
  problem to find candidate post-office locations to be closed.
• One of the important applications of the closest-pair problem is cluster analysis
  in statistics.
• A bottom-up algorithm begins with each element as a separate cluster and merges
  them into successively larger clusters by combining the closest pair of clusters.

Closest-Pair Problem
• standard Euclidean distance
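For points p_i = (x_i, y_i) and p_j = (x_j, y_j):

\[
d(p_i, p_j) = \sqrt{(x_i - x_j)^2 + (y_i - y_j)^2}
\]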

Analysis
• basic operation of the algorithm will be squaring a number
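A minimal Python sketch of the brute-force algorithm; it compares squared distances, so no square roots are needed inside the loops and the total number of squarings is 2 · n(n−1)/2 ∈ Θ(n^2):

import math

def brute_force_closest_pair(points):
    """Return the smallest distance between any two (x, y) points in the list."""
    n = len(points)
    best_sq = float("inf")
    for i in range(n - 1):
        for j in range(i + 1, n):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            d_sq = dx * dx + dy * dy       # basic operation: two squarings per pair
            if d_sq < best_sq:
                best_sq = d_sq
    return math.sqrt(best_sq)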

Exhaustive Search
A brute force solution to a problem involving search for an
element with a special property, usually among
combinatorial objects such as permutations, combinations, or
subsets of a set.

Method:
• generate a list of all potential solutions to the problem in a systematic manner
  (see algorithms in Sec. 5.4)
• evaluate potential solutions one by one, disqualifying infeasible ones and, for an
  optimization problem, keeping track of the best one found so far
• when search ends, announce the solution(s) found


Example 1: Traveling Salesman Problem
• Given n cities with known distances between each pair, find the shortest tour
  that passes through all the cities exactly once before returning to the starting
  city
• Alternatively: find the shortest Hamiltonian circuit in a weighted connected
  graph
• Example: four cities a, b, c, d with pairwise distances
  ab = 2, ac = 8, ad = 5, bc = 3, bd = 4, cd = 7

How do we represent a solution (Hamiltonian circuit)?


TSP by Exhaustive Search
Tour Cost
a→b→c→d→a 2+3+7+5 = 17
a→b→d→c→a 2+4+7+8 = 21
a→c→b→d→a 8+3+4+5 = 20
a→c→d→b→a 8+7+4+2 = 21
a→d→b→c→a 5+4+3+8 = 20
a→d→c→b→a 5+7+3+2 = 17

Efficiency: Θ((n-1)!)

Chapter 5 discusses how to generate permutations fast.
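A minimal Python sketch of the exhaustive search, fixing city a as the start and trying all (n−1)! permutations of the remaining cities; the dist dictionary is a hypothetical encoding of the example's weights:

from itertools import permutations

dist = {("a", "b"): 2, ("a", "c"): 8, ("a", "d"): 5,
        ("b", "c"): 3, ("b", "d"): 4, ("c", "d"): 7}

def d(u, v):
    """Symmetric edge-weight lookup."""
    return dist[(u, v)] if (u, v) in dist else dist[(v, u)]

def tsp_exhaustive(cities, start="a"):
    """Return (best_cost, best_tour) over all permutations of the non-start cities."""
    others = [c for c in cities if c != start]
    best_cost, best_tour = float("inf"), None
    for perm in permutations(others):
        tour = [start, *perm, start]
        cost = sum(d(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

print(tsp_exhaustive(["a", "b", "c", "d"]))   # one optimal tour of cost 17: a→b→c→d→a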

Example 2: Knapsack Problem
Given n items:
• weights: w_1, w_2, …, w_n
• values: v_1, v_2, …, v_n
• a knapsack of capacity W
Find the most valuable subset of the items that fit into the knapsack

Example: Knapsack capacity W=16


item weight value
1 2 $20
2 5 $30
3 10 $50
4 5 $10
Knapsack Problem by Exhaustive Search
Subset Total weight Total value
{1} 2 $20
{2} 5 $30
{3} 10 $50
{4} 5 $10
{1,2} 7 $50
{1,3} 12 $70
{1,4} 7 $30
{2,3} 15 $80
{2,4} 10 $40
{3,4} 15 $60
{1,2,3} 17 not feasible
{1,2,4} 12 $60
{1,3,4} 17 not feasible
{2,3,4} 20 not feasible
{1,2,3,4} 22 not feasible

Efficiency: Θ(2^n)
Each subset can be represented by a binary string (bit vector, Ch 5).
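A minimal Python sketch that enumerates all 2^n subsets as bit vectors (as noted above), using the example instance's weights and values:

weights = [2, 5, 10, 5]                          # example instance
values = [20, 30, 50, 10]
capacity = 16

def knapsack_exhaustive(weights, values, capacity):
    """Try every subset; keep the most valuable feasible one."""
    n = len(weights)
    best_value, best_subset = 0, ()
    for mask in range(1 << n):                   # each mask is a bit vector of length n
        chosen = [i for i in range(n) if mask & (1 << i)]
        total_weight = sum(weights[i] for i in chosen)
        if total_weight <= capacity:             # disqualify infeasible subsets
            total_value = sum(values[i] for i in chosen)
            if total_value > best_value:
                best_value = total_value
                best_subset = tuple(i + 1 for i in chosen)   # report 1-based item numbers
    return best_value, best_subset

print(knapsack_exhaustive(weights, values, capacity))   # expected: (80, (2, 3)), i.e. items {2, 3}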
Example 3: The Assignment Problem
There are n people who need to be assigned to n jobs, one
person per job. The cost of assigning person i to job j is C[i,j].
Find an assignment that minimizes the total cost.

           Job 0   Job 1   Job 2   Job 3
Person 0     9       2       7       8
Person 1     6       4       3       7
Person 2     5       8       1       8
Person 3     7       6       9       4

Algorithmic Plan: Generate all legitimate assignments, compute their costs, and
select the cheapest one.

How many assignments are there? n!

Pose the problem as one about a cost matrix: a cycle cover in a graph.
Assignment Problem by Exhaustive Search
C = [ 9  2  7  8
      6  4  3  7
      5  8  1  8
      7  6  9  4 ]

Assignment (col.#s) Total Cost


1, 2, 3, 4 9+4+1+4=18
1, 2, 4, 3 9+4+8+9=30
1, 3, 2, 4 9+3+8+4=24
1, 3, 4, 2 9+3+8+6=26
1, 4, 2, 3 9+7+8+9=33
1, 4, 3, 2 9+7+1+6=23
etc.
(For this particular instance, the optimal assignment can be found by exploiting
the specific features of the numbers given. It is 2, 1, 3, 4, with total cost
2 + 6 + 1 + 4 = 13.)
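A minimal Python sketch that tries all n! permutations of column indices (0-based) for the cost matrix C above:

from itertools import permutations

C = [[9, 2, 7, 8],
     [6, 4, 3, 7],
     [5, 8, 1, 8],
     [7, 6, 9, 4]]

def assignment_exhaustive(C):
    """Return (min_cost, best) where best[i] is the job assigned to person i."""
    n = len(C)
    best_cost, best = float("inf"), None
    for perm in permutations(range(n)):          # one column per row, all distinct
        cost = sum(C[i][perm[i]] for i in range(n))
        if cost < best_cost:
            best_cost, best = cost, perm
    return best_cost, best

print(assignment_exhaustive(C))   # (13, (1, 0, 2, 3)), i.e. columns 2, 1, 3, 4 in 1-based numbering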

Final Comments on Exhaustive Search
• Exhaustive-search algorithms run in a realistic amount of time only on very
  small instances.

• In some cases, there are much better alternatives!
  - Euler circuits
  - shortest paths
  - minimum spanning tree
  - assignment problem (the Hungarian method runs in O(n^3) time)

• In many cases, exhaustive search or its variation is the only known way to get
  an exact solution.

