Parallel Programming - Introduction to MPI with C/C++

Published: 2 Mar 2022 by HPC Team Freiburg

The University of Stuttgart is pleased to announce its web seminar for bwHPC users.

Title: Introduction to MPI with C/C++

Date: 15.03.2022

Topic: Parallel Programming: Introduction to MPI with C/C++ on bwUniCluster 2.0 (Web Seminar)

Lecturer: Bärbel Große-Wöhrmann (HLRS)

Web: https://training.bwhpc.de/goto.php?target=crs_682&client_id=BWHPC

Abstract

The Message Passing Interface (MPI) is a widely used parallel programming paradigm for distributed-memory systems. In this course we introduce basic MPI routines and run simple examples on bwUniCluster 2.0. The target audience is newcomers to MPI; no prior knowledge of MPI is required.
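
As a taste of what these basic routines look like, here is a minimal, illustrative C sketch (not taken from the course material): it initialises MPI, queries the process rank and the communicator size, and prints one line per process. The build and launch commands in the comment assume a generic MPI toolchain (mpicc, mpirun); the exact compiler wrappers, modules and batch options on bwUniCluster 2.0 may differ.

  /*
   * Minimal MPI "hello world" sketch (illustrative only).
   * A generic build/run might look like:
   *   mpicc hello_mpi.c -o hello_mpi
   *   mpirun -np 4 ./hello_mpi
   * (toolchain and launcher names are assumptions; check the cluster documentation)
   */
  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
      MPI_Init(&argc, &argv);               /* start the MPI runtime */

      int rank, size;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* id of this process */
      MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of processes */

      printf("Hello from rank %d of %d\n", rank, size);

      MPI_Finalize();                       /* shut down the MPI runtime */
      return 0;
  }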

Agenda

  • Two-sided point-to-point communication
  • Collective communications (a short sketch illustrating these two items follows below)
  • Exercises and live demo on bwUniCluster 2.0
  • Groups and communicators
  • Cartesian topologies
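
To make the first two agenda items concrete, here is a short illustrative sketch (again, not course material): rank 0 sends an integer to rank 1 with two-sided point-to-point calls (MPI_Send/MPI_Recv), and afterwards all ranks take part in a collective reduction (MPI_Reduce) that sums their rank numbers on rank 0.

  /* Illustrative sketch of two-sided point-to-point and collective
   * communication; run with at least two processes. */
  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
      MPI_Init(&argc, &argv);

      int rank, size;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);

      /* Two-sided point-to-point: rank 0 sends one integer to rank 1. */
      if (rank == 0 && size > 1) {
          int msg = 42;
          MPI_Send(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
      } else if (rank == 1) {
          int msg;
          MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
          printf("rank 1 received %d from rank 0\n", msg);
      }

      /* Collective: sum all rank numbers onto rank 0. */
      int mine = rank, total = 0;
      MPI_Reduce(&mine, &total, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
      if (rank == 0)
          printf("sum of ranks 0..%d is %d\n", size - 1, total);

      MPI_Finalize();
      return 0;
  }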

After this course, participants will…

  • have a detailed understanding of basic MPI routines,
  • know how to compile and run MPI parallel C/C++ programs on bwUniCluster 2.0.

Prerequisites:

  • the bwUniCluster entitlement https://wiki.bwhpc.de/e/BwUniCluster_2.0_User_Access
  • some experience working on the cluster, e.g. creating simple batch scripts with an editor such as nano, vi, or emacs.

Date & Location

Online via Webex, HLRS, University of Stuttgart, Germany

  • Online Web-Seminar: Tuesday 15.03.2022, 09:00 – 12:00 CET

The Webex invitation will be sent directly by e-mail to registered participants.

Registration and further information

Further upcoming courses in 2022 that may be of interest to you:

Can't fit our course dates into your busy schedule? Please let us know. We will offer additional course dates if there are enough prospective attendees.

Tags: bwHPC, HPC, Course, Training, Parallel Programming, MPI
