
CSE 542 Experimental Study 2

Ruosi Li

1 Evaluate the performance of the shortest augmenting path algorithm in the worst case

1. Use gprof to determine the number of calls to the findPath method and the total time spent in the computation of the shortest augmenting path.

                          k = 4    k = 8    k = 12   k = 16   k = 24   k = 32
Num of Calls (findPath)   129      1025     3457     8193     27649    65537
Total Time                0.00s    0.03s    0.14s    0.47s    2.70s    9.75s

2. Does the number of calls to findPath match what you expect? Explain.


The number of calls to findPath should be 2k^3 + 1, and the results match what I expected. This can be illustrated by the structure of the worst-case graph (figure omitted): there are 2k groups of paths (each red line in the figure represents one group), and each group contributes k^2 augmenting paths of capacity 1. So the total number of augmenting paths is 2k × k^2 = 2k^3, and the extra 1 comes from the last call to findPath, which returns false.
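As a quick check against the table above, 2k^3 + 1 gives 2·4^3 + 1 = 129, 2·8^3 + 1 = 1025, 2·12^3 + 1 = 3457, 2·16^3 + 1 = 8193, 2·24^3 + 1 = 27649, and 2·32^3 + 1 = 65537, which are exactly the measured call counts.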

3. Does the overall running time grow at the rate you would expect based on the assertion in the notes that the running time grows in proportion to n^5? Explain.
In the bad-case (worst-case) graph, we have:

c1 = 2;                   // start of short chain from source
c2 = c1 + 4*(k-1)+1;      // start of long chain from source
bl = c2 + 4*(k-1)+3;      // start of first group of vertices in bipartite graph
br = bl + k;              // start of second group of vertices in bipartite graph
c3 = br + k;              // start of long chain to sink
c4 = c3 + 4*(k-1)+3;      // start of short chain to sink
n  = c4 + 4*(k-1)+1;
m  = 16*(k-1) + k*k + 8*k + 4;
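Expanding these expressions gives c2 = 4k − 1, bl = 8k − 2, br = 9k − 2, c3 = 10k − 2, c4 = 14k − 3, and therefore n = 18k − 6 and m = k^2 + 24k − 12.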

So we have n ≈ 18k, and the running time should be proportional to k^5. Plotting the measured times against k (plot omitted) shows growth roughly proportional to k^5.
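A minimal sketch (not from the original assignment) of how this claim can be checked numerically: divide each measured running time from the table above by k^5; if the running time really grows like k^5, the normalized values should stay roughly constant as k increases.

#include <cstdio>

int main() {
    int    k[] = { 8, 12, 16, 24, 32 };              // k values from the table (k = 4 omitted: its time rounds to 0)
    double t[] = { 0.03, 0.14, 0.47, 2.70, 9.75 };   // measured total times in seconds
    for (int i = 0; i < 5; i++) {
        double k5 = 1.0;
        for (int p = 0; p < 5; p++) k5 *= k[i];      // compute k^5
        std::printf("k = %2d   time/k^5 = %.3g\n", k[i], t[i] / k5);
    }
    return 0;
}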

2 Evaluate the performance of the shortest augmenting path algorithm on random graphs

1. For each series make a table showing the average number of augmenting paths that were used per max flow computation and the average time spent. For each case, show the upper bound on the number of augmenting paths from the worst-case analysis.
For the worst case, we can bound the number of augmenting paths by (n − 1)m, because the level increases by at least 1 after every m augmentations. More tightly, the number of augmenting paths is at most |f|, which is bounded by mss × ec1, the sum of the capacities of all source edges (the number of source edges times their mean capacity).

First Series:

                             n = 100    n = 200    n = 400      n = 800      n = 2000
Average Num of Paths         35.4       79.4       174.8        350.5        929.2
Average Time                 0.003s     0.011s     0.060s       0.217s       1.521s
Worst Case ((n − 1)m ≈ nm)   100,000    400,000    1,600,000    6,400,000    40,000,000
Worst Case (mss × ec1)       10,000     20,000     40,000       80,000       200,000

Second Series:

                             m = 5,000    m = 20,000    m = 40,000    m = 100,000
Average Num of Paths         287.9        767.3         1420.4        3143.9
Average Time                 0.129s       1.193s        4.832s        26.196s
Worst Case ((n − 1)m ≈ nm)   5,000,000    20,000,000    40,000,000    100,000,000
Worst Case (mss × ec1)       50,000       200,000       400,000       1,000,000
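A minimal sketch (not from the original assignment) showing how the worst-case rows for the first series can be tabulated. The parameters are assumptions inferred from the table entries rather than taken from the assignment: m = 10n edges, mss = n source edges, and a mean edge capacity ec1 = 100.

#include <cstdio>

int main() {
    int    n[] = { 100, 200, 400, 800, 2000 };    // graph sizes in the first series
    double ec1 = 100.0;                           // assumed mean edge capacity
    for (int i = 0; i < 5; i++) {
        long m   = 10L * n[i];                    // assumed edge count, m = 10n
        long mss = n[i];                          // assumed number of source edges
        std::printf("n = %4d   (n-1)m = %10ld   mss*ec1 = %9.0f\n",
                    n[i], (long)(n[i] - 1) * m, mss * ec1);
    }
    return 0;
}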

2. How do the worst-case values for the number of augmenting paths compare to the experimental values?
Discuss the differences you observe.
The experimental values are much smaller than the worst-case values for the number of augmenting paths. In the worst case, every augmenting path has capacity 1, so each augmentation adds only 1 unit to the flow. In the random case, the mean edge capacity is 100, so we can usually find an augmenting path with capacity much larger than 1, and far fewer augmenting paths are needed.
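For example, for n = 100 the flow value is bounded by mss × ec1 ≈ 10,000; if a typical augmenting path carries a bottleneck capacity on the order of the mean capacity (100), only on the order of 100 augmentations are needed, which is consistent with the measured average of 35.4.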
3. Consider the rate at which the number of augmenting paths grows as you increase n and/or m in each of the two cases. How does this compare with what you would expect from the worst-case analysis?

We can see from the measurements (plots omitted) that the number of augmenting paths grows roughly in proportion to n and m, which is much slower than the growth predicted by the worst-case analysis. For the first series, when we increase n we also increase m and mss (the number of source/sink edges), so both worst-case bounds grow with n as well.

3 Write a program that adds a specified number of random edges to an input flograph and computes a maximum flow
1. Apply randExtend with r=10 and then use gprof to determine the average number of calls to findPath
and the average running time for the max flow computation. Compare them to the result from the first
part of the assignment.

                       k = 0    k = 1    k = 2    k = 5    k = 10   k = 20
Average Num of Paths   8193     5349.2   3235.6   1889.7   701.7    595.4
Average Time           0.481s   0.301s   0.237s   0.108s   0.037s   0.037s

As we increase the number of added edges, the number of augmenting paths decreases. The capacity of each added edge is assigned at random from the range [1, C], where C is the maximum capacity in the input graph (in this case k^3 = 4096), so it can be large. If an added edge connects a vertex in the chains from the source to a vertex in the chains to the sink, skipping the bipartite graph in the middle, it creates an augmenting path with very large capacity. This significantly reduces the number of augmenting paths needed to find the max flow.
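For k = 16 the max flow of the unmodified graph is 2k^3 = 8192, found through 8192 unit-capacity augmentations. A single added edge with capacity close to C = 4096 that shortcuts from a source-side chain to a sink-side chain could carry a large fraction of that flow in one augmentation, which helps explain the sharp drop in the average number of paths already at k = 1.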

2. Source Code

// usage:
//     randExtend r k
//     r is the number of random modifications to this graph
//     k is the number of vertex pairs we select at each modification
#include "stdinc.h"
#include "flograph.h"
#include "shortPath.h"

int main(int argc, char* argv[]) {
    int r, k;
    flograph G;
    cin >> G;
    if (argc != 3 ||
        sscanf(argv[1],"%d",&r) != 1 ||
        sscanf(argv[2],"%d",&k) != 1)
        fatal("usage: randExtend r k");

    flow C = 0;    // the maximum capacity among all the edges in the input graph
    for (edge e = 1; e <= G.m(); e++) {
        vertex v = G.tail(e);
        flow cap = G.cap(v,e);
        if (cap > C) C = cap;
    }
    // cout << "The max capacity is " << C << endl;

    for (int i = 1; i <= r; i++) {
        flograph *h = new flograph(G.n(), G.m()+k, G.src(), G.snk());
        *h = G;
        // cout << "r = " << i << " before change " << *h << endl;
        // add k extra random edges to *h
        int j = 0;
        while (j != k) {
            vertex u, v;
            u = randint(1,h->n());
            v = randint(1,h->n());
            // make sure the two ends of the edge are two different vertices
            // and that neither of them is the source or the sink
            if (u != v && u != h->src() && u != h->snk() &&
                v != h->src() && v != h->snk()) {
                edge edgeAdd = h->join(u,v);
                h->changeCap(edgeAdd, randint(1,C));
                j++;
            }
        }
        int floVal = 0;
        // cout << "r = " << i << " after change " << *h << endl;
        shortPath(*h,floVal);
        // cout << "total flow of " << floVal << endl;
        // need to clear h before deleting it; otherwise, when we create a new
        // graph h and add edges, the flow on the newly added edges will not be 0
        h->clear();
        delete h;    // discard the previous graph
    }
}
