PARALLELISM
PRESENTED BY:
GROUP # 01
2019-EE-604
2019-EE-605
2019-EE-624
DEPARTMENT OF ELECTRICAL
ENGINEERING
Computer Architecture
PARALLEL COMPUTING
What is Parallelism?
Parallelism is an evolution of serial processing in which a problem is broken down into manageable
tasks that can be executed concurrently. Each task is further broken down into a series of
instructions that run simultaneously on different CPUs. A parallel operating system works by
dividing the CPU and its components into smaller units, each running at full speed and power. In a
conventional operating system, once an I/O device is identified, the CPU must transfer its data
into memory before any operations, such as processing and transmission, can be performed on it.
With a parallel operating system, however, more data can be transferred and processed at the same
time, resulting in faster data transfer.
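As a minimal sketch of this idea (assuming a C compiler with OpenMP support, e.g. gcc -fopenmp;
the array data and size N are illustrative only), the loop below is divided into chunks that the
available CPU cores execute concurrently:

#include <stdio.h>

#define N 1000000

int main(void) {
    static double data[N];
    double sum = 0.0;

    for (int i = 0; i < N; i++)
        data[i] = i * 0.5;                 /* fill with sample values */

    /* The iterations are split among the cores and run concurrently;
       the partial sums from each core are combined at the end. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += data[i];

    printf("sum = %f\n", sum);
    return 0;
}

Without the OpenMP flag the pragma is simply ignored and the same loop runs serially, which
illustrates how the parallel version only changes how the work is scheduled, not the result.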
TYPES OF PARALLEL COMPUTING
• Bit-level parallelism
• Instruction-level parallelism
• Task-level parallelism
1. Bit-level parallelism
Bit-level parallelism is a form of parallel computing based on increasing the processor's word
size, which reduces the number of instructions the processor must execute to carry out an
operation on data larger than one word. It matters most when the processor is working with large
amounts of data.
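As a rough sketch (assuming a GCC- or Clang-style compiler that provides the
__builtin_popcountll builtin; the array bitmap and its size are made up for illustration), the
loop below counts set bits 64 at a time instead of one bit at a time:

#include <stdio.h>
#include <stdint.h>

#define WORDS 1024

int main(void) {
    uint64_t bitmap[WORDS];
    unsigned long total = 0;

    for (int i = 0; i < WORDS; i++)
        bitmap[i] = 0xAAAAAAAAAAAAAAAAULL;       /* sample bit pattern */

    /* Each 64-bit word is handled by a single machine operation,
       so one instruction does the work of 64 single-bit checks. */
    for (int i = 0; i < WORDS; i++)
        total += __builtin_popcountll(bitmap[i]);

    printf("set bits: %lu\n", total);
    return 0;
}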
2. Instruction-level parallelism
In this type of parallel computing, the processor determines how to order and run instructions in
parallel and decides how many instructions to process and carry out at the same time. With static
instruction-level parallelism the scheduling is done by the compiler instead: it groups
independent instructions so they can run simultaneously without altering the results.
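A small illustrative sketch in C (the variable names are arbitrary): the first group of
statements is independent, so a superscalar processor or an optimizing compiler is free to
overlap them, while the second group forms a dependency chain that must run in order.

#include <stdio.h>

int main(void) {
    int a = 2, b = 3, c = 4, d = 5;

    /* Independent instructions: no result feeds the next one,
       so a superscalar processor can issue them in the same cycle. */
    int e = a + b;
    int f = c * d;
    int g = a - d;

    /* Dependent chain: each line needs the previous result,
       so these cannot overlap and must run one after another. */
    int h = e + f;
    int i = h * g;
    int j = i - e;

    printf("%d %d %d %d %d %d\n", e, f, g, h, i, j);
    return 0;
}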
3. Task parallelism
This form of parallel computing breaks a task down into smaller tasks, or subtasks, which are
then allocated to multiple processors that execute those components at the same time using the
same information source.
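A minimal sketch with POSIX threads (compile with -pthread); the helper partial_sum, the struct
task type, and the shared array data are made up for illustration. Two subtasks sum different
halves of the same shared array and run at the same time on separate cores:

#include <stdio.h>
#include <pthread.h>

#define N 1000000
static double data[N];                      /* shared information source */

struct task { int start, end; double sum; };

static void *partial_sum(void *arg) {
    struct task *t = arg;
    t->sum = 0.0;
    for (int i = t->start; i < t->end; i++)
        t->sum += data[i];
    return NULL;
}

int main(void) {
    for (int i = 0; i < N; i++)
        data[i] = 1.0;

    /* Two subtasks over the same data, each given to its own thread. */
    struct task t1 = { 0, N / 2, 0.0 };
    struct task t2 = { N / 2, N, 0.0 };
    pthread_t th1, th2;

    pthread_create(&th1, NULL, partial_sum, &t1);
    pthread_create(&th2, NULL, partial_sum, &t2);
    pthread_join(th1, NULL);
    pthread_join(th2, NULL);

    printf("total = %f\n", t1.sum + t2.sum);
    return 0;
}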
PARALLEL COMPUTING EXAMPLES
Parallel computing is now part of our daily life. The concept has been around for decades, and it
has become more and more common and relevant in today's increasingly digital world.
1. SMARTPHONES
2. LAPTOPS
3. MICROPROCESSORS
4. DATA MINING
Functions of Parallel Operating System
• The parallel OS can handle a heavy load of tasks in the operating system.
• It is complex and requires high maintenance.