
Faculty of Engineering & Technology
Subject Name: High Performance Computing
Subject Code: 303105356
B-Tech-CSE (AI) 3rd Year 6th Semester

Practical-04
Aim: Demonstrate MPI function through a simple program.

What is MPI?
MPI (Message Passing Interface) is a standardized and portable message-passing system designed
to enable parallel computing across multiple processors or nodes in a distributed computing
environment. It is widely used in high-performance computing (HPC) to allow programs to run
efficiently on supercomputers and clusters.

MPI enables communication between processes in a parallel system, whether they are running on
the same machine or across a network of computers.

What are the basic MPI functions?


MPI_Init – Initializes the MPI environment.

MPI_Comm_size – Retrieves the total number of processes participating in the MPI program.

MPI_Comm_rank – Determines the rank (ID) of the current process.

MPI_Send – Sends messages between processes.

MPI_Recv – Receives messages from other processes (a short send/receive sketch follows this list).

MPI_Finalize – Terminates the MPI environment.
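
The example program in the next section only queries the process count and rank; it does not exchange messages. A minimal point-to-point sketch of MPI_Send and MPI_Recv is shown below (illustrative only: process 0 sends one integer to process 1, and at least two processes are assumed):

#include <stdio.h>
#include <mpi.h>

int main(int argc, char** argv) {
    int rank, value;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        value = 42;                                        /* data to send */
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("Process 1 received %d from process 0\n", value);
    }

    MPI_Finalize();
    return 0;
}

In MPI_Send the fourth argument is the destination rank and the fifth is a message tag; MPI_Recv mirrors these with the source rank and tag, and MPI_STATUS_IGNORE is passed because the receive status is not inspected here.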

Simple MPI C program


Code:
#include <stdio.h>
#include <mpi.h>

int main(int argc, char** argv) {
    int process_Rank, size_Of_Cluster;

    MPI_Init(&argc, &argv);                            /* initialize the MPI environment */
    MPI_Comm_size(MPI_COMM_WORLD, &size_Of_Cluster);   /* total number of processes */
    MPI_Comm_rank(MPI_COMM_WORLD, &process_Rank);      /* rank (ID) of this process */

    printf("Hello World from process %d of %d\n", process_Rank, size_Of_Cluster);

    MPI_Finalize();                                    /* terminate the MPI environment */
    return 0;
}
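
The program is typically compiled with the MPI compiler wrapper and then launched with mpirun, as shown under Output below (the file name hello_mpi.c is only a placeholder):

mpicc hello_mpi.c -o hello_mpi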

Output:

mpirun -np 8 --hostfile my_hostfile G
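
The output screenshot is not reproduced here. Based on the printf in the program, a run with 8 processes would print one line per process, in no particular order, for example:

Hello World from process 0 of 8
Hello World from process 1 of 8
...
Hello World from process 7 of 8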

Conclusion:
MPI (Message Passing Interface) provides an efficient way to implement parallel programming in
Linux-based systems. By using MPI functions, multiple processes can communicate and work
together to solve complex computational problems.
