Dalgorithm
GROUP MEMBERS                    ID
1. KALKIDAN ZINABU               RCS/2011/12
2. ANTENEH BEZA                  RCS/1991/12
3. SAMUEL KASAHUN                RCS/1929/12
4. MINTESNOTE GEZAHEGN           RCS/1977/12
5. DAGEM ENEYEW                  RCS/1976/12
SECTION 2
Probabilistic algorithms
A randomized algorithm is an algorithm that employs a degree of randomness as part of its
logic or procedure. The algorithm typically uses uniformly random bits as an auxiliary input to
guide its behavior, in the hope of achieving good performance in the "average case" over all
possible choices of randomness determined by the random bits; thus, either the running time,
the output, or both are random variables.
Probabilistic algorithms: ‘Monte Carlo’ methods
Algorithms which always return a result, but the result may not always be correct. We attempt
to minimize the probability of an incorrect result, and, because of the random element, multiple
independent runs of the algorithm reduce the probability of an incorrect result.
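A standard Monte Carlo example is the Miller-Rabin primality test: it always answers, a "composite" answer is always correct, and a "probably prime" answer is wrong with probability at most 4^-rounds, so extra rounds shrink the error. The sketch below is a minimal illustrative implementation, not part of the original text.

```python
import random

def is_probably_prime(n, rounds=20):
    """Miller-Rabin: 'composite' answers are always right;
    'probably prime' is wrong with probability <= 4**-rounds."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):       # handle small cases exactly
        if n % p == 0:
            return n == p
    d, r = n - 1, 0                       # write n - 1 as d * 2**r, d odd
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):               # each round cuts the error by 4x
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                  # witness found: definitely composite
    return True                           # probably prime
```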
Probabilistic algorithms: ‘Las Vegas’ methods
Algorithms that never return an incorrect result, but may produce no result at all on some
runs. Again, we wish to minimize the probability of obtaining no result, and, because of the
random element, multiple runs reduce that probability. Las Vegas algorithms may yield
tractable computations for tasks on which deterministic algorithms are intractable even on
average. However, we cannot guarantee a result, and there is no upper bound on the time until
a result appears, although the expected time may in fact be small.
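As an illustration (not from the original text), here is a minimal Python sketch of the classic Las Vegas approach to the n-queens problem: each run either returns a valid placement or reports failure, and repeating the run makes failure arbitrarily unlikely.

```python
import random

def queens_las_vegas(n=8):
    """One Las Vegas run: place queens row by row, choosing uniformly
    among still-safe columns; return a placement or None on failure."""
    cols, diag1, diag2 = set(), set(), set()
    placement = []
    for row in range(n):
        safe = [c for c in range(n)
                if c not in cols and row - c not in diag1 and row + c not in diag2]
        if not safe:
            return None                   # stuck: this run produced no result
        c = random.choice(safe)
        placement.append(c)
        cols.add(c); diag1.add(row - c); diag2.add(row + c)
    return placement                      # always a correct placement

def queens(n=8):
    """Retry until a run succeeds; the expected number of runs is small."""
    while True:
        result = queens_las_vegas(n)
        if result is not None:
            return result
```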
Probabilistic algorithms: ‘Sherwood’ methods
Algorithms which always return a result, and always the correct result, but where a random
element increases efficiency by avoiding or reducing the probability of worst-case behavior.
This is useful for algorithms with poor worst-case but good average-case behavior, and in
particular where embedding an algorithm in an application could otherwise expose it to inputs
that repeatedly trigger the worst case.
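A textbook Sherwood example is quicksort with a randomly chosen pivot: the output is always the correctly sorted list, and randomizing the pivot makes worst-case behavior unlikely on every input, rather than certain on particular "bad" inputs. A minimal sketch (illustrative only):

```python
import random

def sherwood_quicksort(a):
    """Quicksort with a uniformly random pivot: always returns the
    correctly sorted list; randomization makes the O(n^2) worst case
    vanishingly unlikely for any fixed input."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return sherwood_quicksort(less) + equal + sherwood_quicksort(greater)
```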
Parallel Algorithm
Parallel Algorithm - Analysis
Analysis of an algorithm helps us determine whether the algorithm is useful or not. Generally,
an algorithm is analyzed based on its execution time (time complexity) and the amount of
space it requires (space complexity).
Since large memory devices are available at reasonable cost, storage space is rarely the
limiting factor, and space complexity is therefore given less importance.
Parallel algorithms are designed to improve the computation speed of a computer. For
analyzing a parallel algorithm, we normally consider the following parameters: the time
complexity (execution time), the total number of processors used, and the total cost (the
product of the parallel running time and the number of processors).
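As a worked illustration of these parameters (the timing numbers below are hypothetical), the derived measures commonly used in analysis can be computed as follows:

```python
def parallel_metrics(t_serial, t_parallel, processors):
    """Derived measures for a parallel run; inputs would come
    from actual measurements."""
    speedup    = t_serial / t_parallel    # S = T1 / Tp
    cost       = processors * t_parallel  # C = p * Tp
    efficiency = speedup / processors     # E = S / p
    return speedup, cost, efficiency

# hypothetical measurements: 64 s serially, 10 s on 8 processors
s, c, e = parallel_metrics(64.0, 10.0, 8)
print(f"speedup={s:.1f}x  cost={c:.0f} processor-seconds  efficiency={e:.0%}")
```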
Data Parallel
In the data-parallel model, tasks are assigned to processes and each task performs similar
operations on different data. Data parallelism arises from a single operation being applied to
many data items at once.
The data-parallel model can be applied on both shared-address-space and message-passing
paradigms. In this model, interaction overheads can be reduced by selecting a
locality-preserving decomposition, by using optimized collective interaction routines, or by
overlapping computation with interaction.
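As a minimal sketch of the data-parallel model (illustrative only; the operation and data sizes are made up), the Python program below applies the same operation to different chunks of the data across several worker processes:

```python
from multiprocessing import Pool

def square(x):
    # the same operation, applied independently to each data item
    return x * x

if __name__ == "__main__":
    data = list(range(1_000_000))
    # 4 workers each receive chunks of the data and apply square() to them
    with Pool(processes=4) as pool:
        results = pool.map(square, data, chunksize=10_000)
    print(results[:5])                    # [0, 1, 4, 9, 16]
```

The chunked distribution is one simple locality-preserving decomposition: each worker touches a contiguous slice of the data, and the per-item interaction overhead is amortized over the chunk.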