
TOPIC: PARALLELISM

PRESENTED BY:
GROUP # 01
2019-EE-604
2019-EE-605
2019-EE-624

DEPARTMENT OF ELECTRICAL
ENGINEERING
Computer Architecture
PARALLEL COMPUTING
What does the word "parallelism" mean?

Parallelism comes from the word "parallel," which means "to run side by side with."

What is parallelism in computer architecture?

Parallel computing is a type of computing architecture in which several processors simultaneously execute multiple smaller calculations broken down from an overall larger, complex problem.
What is a parallel operating system?

Parallel operating systems are designed to speed up the execution of programs by dividing them into multiple segments. A parallel operating system manages multiple processors simultaneously; the underlying hardware can be a single computer with multiple processors, several computers connected by a network to form a parallel processing cluster, or a combination of both.
How do Parallel Operating Systems Work?

Parallel processing is an evolution of serial processing: a job is broken down into manageable tasks that can be executed concurrently, and each task is further broken down into a series of instructions that run simultaneously on different CPUs. The parallel operating system divides the CPU and its components into smaller units, each running at full speed and power. In a conventional operating system, once an I/O device is identified, the CPU transfers its data into memory before any operation, such as processing or transmitting, is performed on it. With a parallel operating system, however, more data can be transferred and processed simultaneously, resulting in quicker data transfer. The sketch below illustrates this divide-and-conquer style of execution.
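The following is a minimal sketch, not a definitive mechanism, assuming a hypothetical workload of summing squares over a large range; the names (partial_sum, chunks, workers) are illustrative. It uses Python's standard multiprocessing module to break one large calculation into smaller parts that separate processors execute simultaneously.

from multiprocessing import Pool

def partial_sum(bounds):
    """One of the smaller calculations broken out of the larger problem."""
    start, end = bounds
    return sum(i * i for i in range(start, end))

if __name__ == "__main__":
    n = 10_000_000
    workers = 4                       # assumed number of available processors
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]

    with Pool(processes=workers) as pool:
        # Each chunk is executed simultaneously on a separate processor,
        # and the partial results are combined into the final answer.
        total = sum(pool.map(partial_sum, chunks))
    print(total)

The combining step (the outer sum) is the only serial part; everything inside pool.map runs in parallel across the worker processes.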
TYPES OF PARALLEL COMPUTING

Parallel computing is typically classified into three distinct types:

• bit-level
• instruction-level
• task-level

1. Bit-level parallelism

Bit-level parallelism is a form of parallel computing based on increasing the processor's word size, which reduces the number of instructions the processor must execute to carry out an operation on data wider than a single word. It matters most when the processor is working with large amounts of data, as the sketch below shows.
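As a minimal sketch, the function below emulates a narrow 8-bit word size in Python (the limb-by-limb loop is purely illustrative, not how any real CPU is programmed): adding two 32-bit numbers on an 8-bit processor takes four add-with-carry steps, while a 32-bit processor performs the same addition in a single instruction. That saving is exactly what bit-level parallelism provides.

def add_32bit_on_8bit(a: int, b: int) -> int:
    """Add two 32-bit values one 8-bit limb at a time, as a narrow CPU must."""
    result, carry = 0, 0
    for byte in range(4):  # four add-with-carry steps instead of one 32-bit add
        la = (a >> (8 * byte)) & 0xFF  # extract one 8-bit limb of each operand
        lb = (b >> (8 * byte)) & 0xFF
        s = la + lb + carry
        carry = s >> 8                 # carry propagates into the next limb
        result |= (s & 0xFF) << (8 * byte)
    return result

# Both computations agree; the wider machine just needs fewer instructions.
print(hex(add_32bit_on_8bit(0x12345678, 0x0FEDCBA9)))  # 0x22222221
print(hex((0x12345678 + 0x0FEDCBA9) & 0xFFFFFFFF))     # 0x22222221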
2. Instruction-level parallelism

In this type of parallel computing, the processor determines how to order and run instructions in a parallel sequence, and decides how many instructions to process and carry out at the same time. When instruction-level parallelism is exploited statically, the compiler, rather than the processor, must group the instructions and schedule them to run simultaneously without altering the results, as illustrated below.
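In the illustration below, Python statements stand in for machine instructions, so this is only a sketch of the idea (a Python interpreter does not actually parallelize these lines), and the variable names are hypothetical. It contrasts a group of data-independent statements, which a superscalar processor or a compiler-scheduled static approach could issue in the same cycle, with a dependency chain that must run in order.

a, b, c, d = 1, 2, 3, 4  # hypothetical input values

# Independent "instructions": none reads a result produced by another,
# so all three could be issued and executed simultaneously.
x = a + b
y = c * d
z = a - d

# Dependent "instructions": each one needs the previous result,
# so they cannot be overlapped without altering the final value of r.
p = a + b
q = p * c
r = q - d
print(x, y, z, r)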

3. Task parallelism

In this form of parallel computing, a job is broken down into smaller tasks, or subtasks, which are then allocated to multiple processors that execute those components at the same time, using the same information source; see the sketch after this paragraph.
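The following is a minimal sketch, assuming a hypothetical shared dataset and two illustrative subtasks (task_sum and task_max); it uses Python's standard concurrent.futures module so that both subtasks execute at the same time on the same information source.

from concurrent.futures import ProcessPoolExecutor

def task_sum(values):
    """Subtask 1: total of the shared dataset."""
    return sum(values)

def task_max(values):
    """Subtask 2: maximum of the shared dataset."""
    return max(values)

if __name__ == "__main__":
    data = list(range(1_000_000))  # the same information source for both subtasks
    with ProcessPoolExecutor(max_workers=2) as pool:
        f_sum = pool.submit(task_sum, data)  # both subtasks are dispatched to
        f_max = pool.submit(task_max, data)  # separate processors concurrently
        print(f_sum.result(), f_max.result())

Unlike the earlier data-splitting sketch, here the parallelism comes from running different tasks, not from dividing one task's data.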
PARALLEL COMPUTING EXAMPLES

Parallel computing is part of our daily life. The concept has been around for decades, and it has become more and more common and applicable in today's increasingly digital world.

1. SMARTPHONES

2. LAPTOPS

3. MICROPROCESSORS

4. DATA MINING
Functions of Parallel Operating System

The following are the important functions of a parallel operating system:

• Provides a multiprocessing environment

• Enforces security among processes

• Handles the load of tasks in the operating system

• Shares resources among processes

• Prevents processes and threads from interfering with each other

• Utilizes all resources efficiently


Advantages of Parallel Operating System

• Saves time by allowing applications to execute simultaneously

• Solves large, complex problems

• Allows multiple resources to be used simultaneously

• Offers larger memory for the allocation of resources and tasks

• Faster than a conventional (serial) operating system


Disadvantages of Parallel Operating System

• The architecture of a parallel operating system is complex

• High cost, since more resources are used for synchronization, data transfer, threads, and communication; in the case of clusters, better cooling techniques are also required

• High power consumption

• High maintenance
