Pdclab 8
EXPERIMENT NO. 8
LAB ASSESSMENT:
• Ability to Conduct Experiment
• Data Presentation
• Experiment Results
• Conclusion
Objective:
• Implement and analyze various OpenMP programs (reduction, critical & section clauses)
Introduction:
OpenMP (Open Multi-Processing) is an API (Application Programming Interface) for parallel
programming in C, C++, and Fortran. It allows developers to write parallel code by adding simple
compiler directives to loops or sections of code that can be executed concurrently. Using OpenMP
constructs such as reduction and critical sections can significantly enhance performance in
algorithms that run on multiple cores or processors, especially when working with large datasets
or computationally intensive tasks. In OpenMP, the section clause is used to split code into
independent sections that can be executed concurrently by different threads. It lets you specify
explicitly which portions of the code should run in parallel, which is especially useful when
different tasks can run in parallel but do not need to be inside a loop. Each section can be
executed by a different thread.
OpenMP for Reduction Clause:
Explanation:
• #pragma omp parallel for: This directive parallelizes the for loop, allowing iterations to
be divided across threads.
• reduction(+:sum): This tells OpenMP to handle the sum variable in a thread-safe
manner. Each thread maintains a local copy of sum and combines it into the global sum
after the loop, as in the sketch below.
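A minimal sketch of the kind of loop these directives describe is shown below. The array of
1,000,000 elements initialized to 1 is an assumption chosen so that the printed sum matches the
output that follows.

#include <stdio.h>
#include <omp.h>

#define N 1000000

int main(void) {
    static int arr[N];      /* static keeps the large array off the stack */
    long sum = 0;

    /* Assumed data: every element is 1, so the expected total is N */
    for (int i = 0; i < N; i++)
        arr[i] = 1;

    /* Each thread accumulates into a private copy of sum; the copies
       are combined with + when the parallel loop finishes */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += arr[i];

    printf("The sum of the array is: %ld\n", sum);
    return 0;
}

Compile with the compiler's OpenMP flag (for example, gcc -fopenmp).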
Output:
The sum of the array is: 1000000
In this case, each thread works on a part of the array and reduces its local result into the
final sum, improving performance for large datasets.
OpenMP for Critical Clause:
In OpenMP, a critical section is used to protect a block of code from being executed by more
than one thread at a time. It is often used when threads need to access shared data and it is
important to avoid race conditions. This is especially useful when several threads update the
same shared variable and you need to ensure that only one thread executes the critical section
at a time, as in the sketch below.
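A minimal sketch of such an update is shown below. The 500,000 increments to a shared counter
named criticalResult are an assumption chosen so that the result matches the output that follows.

#include <stdio.h>
#include <omp.h>

int main(void) {
    long criticalResult = 0;

    /* Iterations are shared among threads, but each increment of the
       shared counter is serialized by the critical section */
    #pragma omp parallel for
    for (int i = 0; i < 500000; i++) {
        #pragma omp critical
        criticalResult++;
    }

    printf("The critical result is: %ld\n", criticalResult);
    return 0;
}

Note that serializing every increment removes most of the parallel speedup; the point of the
sketch is the correctness of the shared update, not performance.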
Output:
The critical result is: 500000
In this example, the critical clause protects the section of code that updates a shared variable
(criticalResult). Using #pragma omp critical ensures that only one thread at a time modifies
criticalResult, preventing race conditions.
Syntax:
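In outline, the construct wraps each independent block in its own section directive inside a
single parallel sections region:

#pragma omp parallel sections
{
    #pragma omp section
    {
        /* first independent task */
    }

    #pragma omp section
    {
        /* second independent task */
    }
}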
Example of Using section Clause:
Let’s demonstrate a simple example where we use the section clause to perform multiple independent
tasks in parallel. Here the work is split into calculating the sum of an array, printing a message,
and performing a small mathematical computation, as sketched below.
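A minimal sketch of such a program is given below. The array of 1,000,000 elements initialized
to 1 and the factorial of n = 10 are assumptions chosen so that the results match the output
example further down.

#include <stdio.h>
#include <omp.h>

#define N 1000000

int main(void) {
    static int arr[N];
    for (int i = 0; i < N; i++)
        arr[i] = 1;                       /* assumed data: all ones */

    #pragma omp parallel sections
    {
        /* Section 1: sum the array */
        #pragma omp section
        {
            long sum = 0;
            for (int i = 0; i < N; i++)
                sum += arr[i];
            printf("The sum of the array is: %ld\n", sum);
        }

        /* Section 2: print a message */
        #pragma omp section
        {
            printf("This is section 2: Printing a message.\n");
        }

        /* Section 3: compute 10! */
        #pragma omp section
        {
            long fact = 1;
            for (int i = 2; i <= 10; i++)
                fact *= i;
            printf("The factorial of 10 is: %ld\n", fact);
        }
    }
    return 0;
}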
Explanation:
• #pragma omp parallel sections: This directive tells OpenMP to parallelize the following
sections of code. Each section directive defines a block of code that can be executed
independently.
• #pragma omp section: Marks a block of code as a section that will be executed by one of
the threads.
• Threads: OpenMP creates a team of threads, each executing a section of the code
concurrently.
Key Points:
• Independent Tasks: Each section can contain independent tasks that don’t require
communication with other sections.
• No Ordering Guarantee: The order in which sections are executed is not guaranteed.
Each section runs on a different thread, but the exact execution order can vary.
• Performance: OpenMP handles the assignment of sections to threads automatically. The
number of threads used is determined by the system’s available resources or can be
specified by the user with the omp_set_num_threads() function.
Output Example:
The sum of the array is: 1000000
This is section 2: Printing a message.
The factorial of 10 is: 3628800
Considerations:
• Number of Threads: The number of threads available will depend on the system’s
resources and OpenMP configuration. If you have more sections than threads, some
threads will handle multiple sections.
• Shared vs Private Variables: As with other OpenMP constructs, variables used inside
sections need to be managed carefully. By default, variables declared outside the
sections are shared between threads, but you can use OpenMP’s private or firstprivate
clauses to control variable visibility, as in the sketch below.
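A minimal sketch illustrating both considerations is shown below; the thread count of 4 and the
variable base are assumptions made for illustration.

#include <stdio.h>
#include <omp.h>

int main(void) {
    int base = 100;                 /* declared outside: shared by default */

    /* Request four threads; the runtime may still provide fewer */
    omp_set_num_threads(4);

    /* firstprivate gives each thread its own copy of base,
       initialized from the value it held before the region */
    #pragma omp parallel sections firstprivate(base)
    {
        #pragma omp section
        {
            printf("Section 1 sees base = %d on thread %d\n",
                   base + 1, omp_get_thread_num());
        }

        #pragma omp section
        {
            printf("Section 2 sees base = %d on thread %d\n",
                   base + 2, omp_get_thread_num());
        }
    }
    return 0;
}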
Reduction:
o Ensures that each thread computes its own local value and the results are combined
safely.
o Examples of operations: addition, multiplication, logical operations.
o Syntax: reduction(+:sum), where + is the operation (other operations include *, &, etc.); a product reduction is sketched below.
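For instance, a product reduction follows the same pattern with * as the operator; a small
sketch, computing 10! in parallel:

#include <stdio.h>
#include <omp.h>

int main(void) {
    long prod = 1;

    /* Each thread's private copy starts at 1 (the identity for *)
       and the partial products are multiplied together at the end */
    #pragma omp parallel for reduction(*:prod)
    for (int i = 1; i <= 10; i++)
        prod *= i;

    printf("10! = %ld\n", prod);
    return 0;
}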
Critical Clause:
o Protects a section of code from concurrent access by multiple threads.
o Ensures that shared data is modified by only one thread at a time, preventing race
conditions.
o Syntax: #pragma omp critical before the code that needs to be synchronized.
Lab Tasks:
Code and Output:
Task2:
Conclusion:
Using OpenMP’s reduction and critical clauses allows us to manage parallelism efficiently in
programs that operate on shared data, for example when summing large arrays. By leveraging
reduction for independent operations and critical for shared data updates, OpenMP helps optimize
both the performance and the correctness of parallel algorithms. The OpenMP sections construct
is a useful tool for executing independent tasks concurrently. It simplifies the process of
parallelizing multiple blocks of code without the need to manage thread creation or
synchronization manually. By using sections, you can improve the performance of applications
whose tasks do not depend on each other and can be divided into parallelizable blocks.