Matlab on HPC

Published: 16 Feb 2022 by HPC Team Freiburg

The University of Stuttgart is pleased to announce its web seminar for bwHPC users.

Title: Matlab on HPC

Date: 24.03.2022

Topic: Focus on Parallel Programming with Matlab and using Matlab on bwUniCluster 2.0 (Web Seminar)

Lecturer: Darko Milakovic (HLRS)

Web: https://training.bwhpc.de/goto.php?target=crs_647&client_id=BWHPC

Abstract

This MATLAB on HPC training is a course on software development with the MATLAB programming language. The focus of the training is on the essential parallel programming principles, concepts, idioms, and best practices that enable programmers to create professional, high-quality code. The course gives insight into the different aspects of parallelization, vectorization and optimization with MATLAB and teaches guidelines for developing mature, robust, maintainable, and efficient code. In addition, it introduces the use of MATLAB on bwUniCluster 2.0, including the batch system, workspaces, module files, visualization and interactive jobs.
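
As a brief illustration of the parallel programming topics, the following is a minimal MATLAB sketch contrasting an explicit parfor loop with its vectorized equivalent. It assumes the Parallel Computing Toolbox is available; the variable names are illustrative only.

    % Minimal sketch: parallel loop vs. vectorization (illustrative only).
    % parfor requires the Parallel Computing Toolbox.
    n = 1e6;
    x = rand(n, 1);
    y = zeros(n, 1);       % preallocate instead of growing the array
    parfor i = 1:n
        y(i) = sqrt(x(i)); % iterations are independent, so they may run on parallel workers
    end
    y2 = sqrt(x);          % vectorized equivalent, often faster than an explicit loop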

Learning Sequence

  • Parallel Computing with MATLAB
  • Using MATLAB on bwUniCluster 2.0 (see the batch-script sketch after this list)
  • Digression: Use MATLAB in Jupyter Notebooks on bwUniCluster 2.0 (Live Demo)
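
As an illustration of the cluster part of the sequence, here is a hedged sketch of a SLURM job script that runs a MATLAB script non-interactively. The resource values, the module name math/matlab and the script name mysolver.m are assumptions, not values from this announcement; the course and the bwUniCluster 2.0 documentation give the recommended settings.

    #!/bin/bash
    #SBATCH --ntasks=1         # one MATLAB process
    #SBATCH --cpus-per-task=8  # cores available to parfor workers
    #SBATCH --time=00:30:00    # walltime limit
    #SBATCH --mem=16gb         # memory for the job

    # Module name is an assumption; check 'module avail' on the cluster.
    module load math/matlab

    # -batch runs mysolver.m (hypothetical) without the desktop GUI.
    matlab -batch "mysolver"

Such a script would be submitted with sbatch and monitored with squeue.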

After this course, participants will…

  • understand the basic concept of parallel computing
  • know about the limitations of parallel computing
  • have gained knowledge about the types of parallel programming
  • be able to write parallel code properly
  • know about parallelization, vectorization and optimization
  • have a detailed understanding of the most important commands on bwUniCluster 2.0
  • have gained knowledge of how to use MATLAB in Jupyter Notebooks

Prerequisites:

  • A general knowledge of programming as well as a background in MATLAB programming is useful for understanding this course. MATLAB should already be installed on the participants' computers.
  • A background in Linux as well as its most important concepts and tools should be known, e.g.
    • shell and shell commands (→ safe use of the command line),
    • secure shell,
    • the handling of files and scripts,
    • the structure of the system,
    • user and rights management, and
    • creating simple batch scripts with an editor such as nano, vi or emacs. If you notice gaps in this area, we refer you to https://www.tuxcademy.org/product/lxes/.
  • Experience with connecting to bwUniCluster 2.0 as well as with its file systems, data transfer, and the use of the batch system and environment module system (a brief command-line sketch follows this list), e.g., from participation in the previous course “Introduction to HPC-Cluster - bwUniCluster 2.0” - https://training.bwhpc.de/goto.php?target=crs_664&client_id=BWHPC
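
For orientation, the following is a minimal command-line sketch of the routine tasks these prerequisites refer to. The login host, user name, file names and workspace parameters are illustrative assumptions, not values from this announcement; the authoritative details are in the bwHPC documentation and the linked introductory course.

    # Log in via secure shell (host and user name are placeholders)
    ssh username@uc2.scc.kit.edu

    # Transfer an input file to the cluster
    scp input.mat username@uc2.scc.kit.edu:~/

    # Allocate a workspace for 30 days and look for a MATLAB module
    ws_allocate mywork 30
    module avail math/matlab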

Date & Location

Online via Webex, HLRS, University of Stuttgart, Germany

  • Online Kick Off: 24.03.2022, 09:00 – 10:30 CET
  • Self-Study Phase: 24.03.2022, 10:30 – 15:00 CET
  • Online Closing: 24.03.2022, 15:00 – 16:30 CET

The Webex invitation will be sent directly by e-mail to the registered participants.

Registration and further information

Further upcoming courses in 2022 that may be of interest to you:

You cannot fit our course dates into your busy schedule? Please let us know. We will offer further course dates if there are enough prospective attendees.

