Matlab on HPC

Published: 16 Feb 2022 by HPC Team Freiburg

The University of Stuttgart is pleased to announce its web seminar for bwHPC users.

Title: Matlab on HPC

Date: 24.03.2022

Topic: Focus on Parallel Programming with Matlab and using Matlab on bwUniCluster 2.0 (Web Seminar)

Lecturer: Darko Milakovic (HLRS)

Web: https://training.bwhpc.de/goto.php?target=crs_647&client_id=BWHPC

Abstract

This MATLAB on HPC training is a course on software development with the MATLAB programming language. The focus of the training is on the essential parallel programming principles, concepts, idioms, and best practices that enable programmers to create professional, high-quality code. The course will give insight into the different aspects of parallelization, vectorization and optimization with MATLAB and will teach guidelines for developing mature, robust, maintainable, and efficient code. In addition, the course will give an introduction to the use of MATLAB on bwUniCluster 2.0, including the batch system, workspaces, module files, visualization and interactive jobs.
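As a small taste of the parallel-programming part, the sketch below shows how an embarrassingly parallel loop can be distributed over workers with parfor. It is an illustrative example only (a Monte Carlo estimate of pi), not course material, and assumes the Parallel Computing Toolbox is available.

    % Minimal sketch (not course material): Monte Carlo estimate of pi,
    % once as a serial loop and once distributed over the workers of a
    % parallel pool with parfor. Assumes the Parallel Computing Toolbox.
    N = 1e6;                                   % number of random samples

    % Serial version
    hits = 0;
    for k = 1:N
        p = rand(1, 2);
        hits = hits + (p(1)^2 + p(2)^2 <= 1);  % point inside unit circle?
    end
    piSerial = 4 * hits / N;

    % Parallel version: the iterations are independent, so parfor can split
    % them across workers; hitsPar is accumulated as a reduction variable.
    hitsPar = 0;
    parfor k = 1:N
        p = rand(1, 2);
        hitsPar = hitsPar + (p(1)^2 + p(2)^2 <= 1);
    end
    piParallel = 4 * hitsPar / N;

    fprintf('serial: %.4f, parallel: %.4f\n', piSerial, piParallel);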

Learning Sequence

  • Parallel Computing with MATLAB
  • Using MATLAB on bwUniCluster 2.0
  • Digression: Use MATLAB in Jupyter Notebooks on bwUniCluster 2.0 (Live Demo)

After this course, participants will…

  • understand the basic concept of parallel computing
  • know about the limitations of parallel computing
  • have gained knowledge about the types of parallel programming
  • be able to write parallel code properly
  • know about parallelization, vectorization and optimization (a short vectorization sketch follows this list)
  • have a detailed understanding of the most important commands on bwUniCluster 2.0
  • have gained knowledge of how to use MATLAB in Jupyter Notebooks
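The following short sketch illustrates what is meant by vectorization in MATLAB; it is an illustrative example only, not taken from the course material.

    % Minimal sketch: the same element-wise computation written as an explicit
    % loop and in vectorized form; the vectorized variant avoids the
    % per-iteration interpreter overhead and is typically much faster.
    x = linspace(0, 2*pi, 1e6);

    % Loop version
    yLoop = zeros(size(x));
    for k = 1:numel(x)
        yLoop(k) = sin(x(k))^2 + cos(x(k))^2;
    end

    % Vectorized version: element-wise operators act on the whole array at once
    yVec = sin(x).^2 + cos(x).^2;

    assert(max(abs(yLoop - yVec)) < 1e-12);    % both give the same result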

Prerequisites:

  • A general knowledge of programming as well as a background in MATLAB programming is useful for understanding this course. MATLAB should already be installed on the participants' computers.
  • A basic background in Linux as well as its most important concepts and tools should be known, e.g.
    • shell and shell commands (→ confident use of the command line),
    • secure shell (SSH),
    • the handling of files and scripts,
    • the structure of the system,
    • user and rights management, and
    • creating simple batch scripts with an editor such as nano, vi or emacs. If you still notice deficits in this respect, we refer you to https://www.tuxcademy.org/product/lxes/.
  • Experience with connecting to bwUniCluster 2.0 as well as with its file systems, data transfer, batch system and environment module system, e.g., from participation in the previous course “Introduction to HPC-Cluster - bwUniCluster 2.0” - https://training.bwhpc.de/goto.php?target=crs_664&client_id=BWHPC

Date & Location

Online Webex, HLRS, University of Stuttgart, Germany

  • Online Kick Off: 24.03.2022, 09:00 – 10:30 CET
  • Self-Study Phase: 24.03.2022, 10:30 – 15:00 CET
  • Online Closing: 24.03.2022, 15:00 – 16:30 CET

The Webex invitation will be sent directly by e-mail to registered participants.

Registration and further information

Further upcoming courses in 2022 that may be of interest to you:

Can you not fit our course dates into your busy schedule? Please let us know. We will offer additional course dates if there are more prospective attendees.

