Matlab on HPC

Published: 16 Feb 2022 by HPC Team Freiburg

The University of Stuttgart is pleased to announce its web seminar for bwHPC users.

Title: Matlab on HPC

Date: 24.03.2022

Topic: Focus on Parallel Programming with Matlab and using Matlab on bwUniCluster 2.0 (Web Seminar)

Lecturer: Darko Milakovic (HLRS)

Web: https://training.bwhpc.de/goto.php?target=crs_647&client_id=BWHPC

Abstract

This MATLAB on HPC training is a course on software development with the MATLAB programming language. The focus of the training is on the essential parallel programming principles, concepts, idioms, and best practices that enable programmers to create professional, high-quality code. The course will give insight into the different aspects of parallelization, vectorization and optimization with MATLAB and will teach guidelines for developing mature, robust, maintainable, and efficient code. In addition, the course will give an introduction to the use of MATLAB on bwUniCluster 2.0, including the batch system, workspaces, module files, visualization and interactive jobs.
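
To give a flavor of the contrast between loop-based, vectorized and parallelized code that the course addresses, here is a minimal sketch of the three styles. The example itself is illustrative and not taken from the course material; parfor requires the Parallel Computing Toolbox.

    n = 1e6;                      % problem size
    x = linspace(0, 2*pi, n);     % sample points

    % Serial loop: one iteration at a time
    y1 = zeros(1, n);
    for i = 1:n
        y1(i) = sin(x(i))^2;
    end

    % Vectorized: a single array operation replaces the explicit loop
    y2 = sin(x).^2;

    % Parallelized: the iterations are independent, so parfor can
    % distribute them across the workers of a parallel pool
    y3 = zeros(1, n);
    parfor i = 1:n
        y3(i) = sin(x(i))^2;
    end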

Learning Sequence

  • Parallel Computing with MATLAB
  • Using MATLAB on bwUniCluster 2.0 (see the sketch after this list)
  • Digression: Use MATLAB in Jupyter Notebooks on bwUniCluster 2.0 (Live Demo)
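
As a taste of the cluster-related items above, the sketch below shows one way to run MATLAB work non-interactively. It assumes the Parallel Computing Toolbox and a configured cluster profile; the module name and the function myAnalysis are illustrative assumptions, and the course will cover the actual bwUniCluster 2.0 workflow in detail.

    % On the cluster, MATLAB is provided via the environment module
    % system and usually driven through the Slurm batch system, e.g.
    % (shell commands shown as comments; the module name is an assumption):
    %   module load math/matlab
    %   matlab -batch "myScript"

    % From within MATLAB, the Parallel Computing Toolbox offers a
    % comparable workflow via the batch command:
    c = parcluster;                                 % default cluster profile
    job = batch(c, @myAnalysis, 1, {}, 'Pool', 3);  % 1 task plus a pool of 3 workers
    wait(job);                                      % block until the job finishes
    result = fetchOutputs(job);                     % cell array with the output
    delete(job);                                    % clean up the job data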

After this course, participants will…

  • understand the basic concept of parallel computing
  • know about the limitations of parallel computing
  • have gained knowledge about the types of parallel programming
  • be able to properly write parallel code
  • know about parallelization, vectorization and optimization in MATLAB
  • have a detailed understanding of the most important commands on bwUniCluster 2.0
  • have gained knowledge of how to use MATLAB in Jupyter Notebooks

Prerequisites:

  • A general knowledge of programming as well as a background in MATLAB programming is useful for understanding this course. MATLAB should already be installed on the participants' computers.
  • A basic background in Linux as well as its most important concepts and tools should be known, e.g.,
    • Shell and shell commands (→ safe use of the command line),
    • secure shell (SSH),
    • the handling of files and scripts,
    • the structure of the system,
    • user and rights management, and
    • creating simple batch scripts with an editor such as nano, vi or emacs. If you still notice deficits in this respect, we refer you to https://www.tuxcademy.org/product/lxes/.
  • Experience with connecting to bwUniCluster 2.0 as well as with its file systems, data transfer, the batch system and the environment module system, e.g., from participation in the previous course “Introduction to HPC-Cluster - bwUniCluster 2.0” - https://training.bwhpc.de/goto.php?target=crs_664&client_id=BWHPC

Date & Location

Online via Webex, hosted by HLRS, University of Stuttgart, Germany

  • Online Kick Off: 24.03.2022, 09:00 – 10:30 CET
  • Self-Study Phase: 24.03.2022, 10:30 – 15:00 CET
  • Online Closing: 24.03.2022, 15:00 – 16:30 CET

The Webex invitation will be sent directly by e-mail to the registered participants.

Registration and further information

Further upcoming courses in 2022 that may be of interest to you:

You cannot fit our course dates into your busy schedule? Please let us know. We will offer further course dates if there are more prospective attendees.
