Parallel Computing
ANS: Parallel computing is computing in which a job is broken into
discrete parts that can be executed concurrently. Each part is further
broken down into a series of instructions, and instructions from each part
execute simultaneously on different CPUs. Distributing the parts of a task
among multiple processors reduces the time needed to run the program.
Parallel systems make simultaneous use of multiple computing resources,
which can include a single computer with multiple processors, a number of
computers connected by a network to form a parallel processing cluster, or
a combination of both. Parallel systems are more difficult to program than
single-processor computers because parallel architectures vary widely and
the work of the multiple CPUs must be coordinated and synchronized. A
further difficult problem in parallel processing is portability.
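As an illustrative sketch (not part of the original answer), the idea of breaking a job into discrete parts that run on different CPUs can be shown with Python's standard multiprocessing module; the square function and the input range here are hypothetical examples of independent work units:

```python
from multiprocessing import Pool

def square(x):
    """One 'discrete part' of the job: a small unit of independent work."""
    return x * x

if __name__ == "__main__":
    data = list(range(8))          # the full job, broken into 8 parts
    with Pool(processes=4) as pool:
        # Each part may execute on a different CPU core simultaneously;
        # pool.map returns the results in the original order.
        results = pool.map(square, data)
    print(results)                 # [0, 1, 4, 9, 16, 25, 36, 49]
```

Because the parts share no data, no extra coordination is needed; real parallel programs usually also require the synchronization mentioned above.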
An instruction stream is the sequence of instructions that a processor
reads from memory. A data stream is the sequence of data items on which
those instructions operate.
Flynn’s taxonomy is a classification scheme for computer architectures
proposed by Michael Flynn in 1966. The taxonomy is based on the number
of instruction streams and data streams that an architecture can process
simultaneously, yielding four classes: SISD (single instruction, single
data), SIMD (single instruction, multiple data), MISD (multiple
instruction, single data), and MIMD (multiple instruction, multiple data).
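A minimal Python sketch (an assumed illustration, not from the text) of the contrast between two of Flynn's classes: in the SIMD style one instruction is applied across a whole data stream, while in the MIMD style independent instruction streams operate on their own data:

```python
# SIMD style: a single operation (multiply by 2) applied to every
# element of one data stream, as a vector unit would do in lockstep.
data = [1, 2, 3, 4]
simd_result = [x * 2 for x in data]

# MIMD style: distinct instruction streams, each with its own data.
def stream_a(x):      # hypothetical task for processor A
    return x + 10

def stream_b(x):      # hypothetical task for processor B
    return x * x

mimd_result = (stream_a(3), stream_b(5))

print(simd_result)   # [2, 4, 6, 8]
print(mimd_result)   # (13, 25)
```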
Q2. Differentiate between the perfect shuffle permutation and the butterfly permutation. Also,
discuss the role of permutation networks in parallel computing.
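As a sketch toward Q2 (assuming k-bit processor indices, which the question does not specify), the perfect shuffle maps index b(k-1)…b1 b0 to its one-bit left rotation, while the butterfly permutation exchanges the most and least significant bits of the index:

```python
def perfect_shuffle(i, k):
    """Left-rotate the k-bit binary representation of index i by one bit."""
    msb = (i >> (k - 1)) & 1
    return ((i << 1) & ((1 << k) - 1)) | msb

def butterfly(i, k):
    """Swap the most significant and least significant bits of index i."""
    msb = (i >> (k - 1)) & 1
    lsb = i & 1
    if msb == lsb:
        return i                      # equal bits: swapping changes nothing
    return i ^ ((1 << (k - 1)) | 1)   # flip both end bits

# With 8 processors (k = 3 bits):
# shuffle:   001 -> 010, 100 -> 001
# butterfly: 001 -> 100, 011 -> 110
print([perfect_shuffle(i, 3) for i in range(8)])  # [0, 2, 4, 6, 1, 3, 5, 7]
print([butterfly(i, 3) for i in range(8)])        # [0, 4, 2, 6, 1, 5, 3, 7]
```

Both mappings are the connection patterns of well-known interconnection networks; a permutation network realizes such mappings in hardware so that data can be routed between processors in parallel.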