Following the Getting Started section should get you settled.
Remote Access
To access the cluster remotely, use one of the following options:
- ssh max-display.desy.de - no VPN or SSH tunnel required
- https://max-display.desy.de:3443/ - just point your browser to the URL, log in with the school account, and start an XFCE4 graphical session.
- Fetch a FastX client from https://www.starnet.com/fastx/current-client?version=2.4.15, install it on your device, and configure a web connection using https://max-display.desy.de:3443/ as the URL.
Please note: eduroam might not allow connections on port 3443, since it is a non-standard port. If your local machine is connected to eduroam and max-display.desy.de is not reachable, use ssh or the FastX client configured for an SSH connection to max-display001, 002, or 003. Sessions persist just as with the web connection, but they are not discoverable via max-display.desy.de, so you need to remember which machine you were connected to.
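For example, a plain SSH login could look like the following sketch; schoolXX is a placeholder for your actual school account, and the fully qualified names of the individual display nodes are an assumption:

ssh schoolXX@max-display.desy.de      # schoolXX is a placeholder account name

# if port 3443 is blocked (e.g. on eduroam), log in to a specific display node instead
ssh schoolXX@max-display003.desy.de   # or max-display001 / max-display002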
Batch Jobs
School accounts usually only have access to the following partitions:
- maxcpu - the general purpose partition. Nodes are frequently all in use, so expect some waiting time.
- maxgpu - the GPU partition contains only about 20 state-of-the-art GPUs and is usually quite overbooked.
- allcpu - a general purpose partition with more than 400 nodes. It is very well suited for short jobs; long-running jobs might be terminated at any time by jobs with higher priority.
- allgpu - like the allcpu partition, but with GPUs.
To select a partition, use for example sbatch --partition=allgpu ... or salloc --partition=maxcpu. To monitor jobs and partitions, use sinfo and squeue. For more details, please visit the Maxwell documentation.
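As an illustration, a minimal batch script might look like the sketch below; the partition, time limit, resource requests, and file names are placeholders to adapt to your job:

#!/bin/bash
#SBATCH --partition=maxcpu        # one of the partitions listed above
#SBATCH --time=00:30:00           # wall-clock limit
#SBATCH --nodes=1
#SBATCH --job-name=hello-maxwell
#SBATCH --output=hello-%j.out     # %j expands to the job id

# the actual payload of the job
hostname

Submit it with sbatch hello.sh and monitor it with squeue -u $USER.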
In some cases nodes might be reserved for school purposes. To list reservations, use 'scontrol show reservation'. If there is a reservation listing your school account, you can use it:
scontrol show reservation
> ReservationName=hpc-school StartTime=2020-04-20T16:35:11 EndTime=2020-05-01T17:00:00 Duration=11-00:24:49
> Nodes=max-wng003 NodeCnt=1 CoreCnt=32 Features=(null) PartitionName=all Flags=IGNORE_JOBS,SPEC_NODES
> Users=school01,school02,...

# note the name of the reservation and partition:
sbatch --partition=<partitionName> --reservation=<reservationName> ...
IDE
There are a number of plain editors available for code development, like vim, emacs, nedit etc. For more advanced setups, atom, code, and the JetBrains applications are also available.
For JetBrains pycharm, clion:

module load maxwell jetbrains
pycharm
Please note: most environments are configured using modules. If the module command is not found, run: source /etc/profile.d/modules.sh
Compiler
Schools will usually use the GNU compilers like gcc or gfortran. The standard compilers come in version 4.8.x. If that's not sufficient, there are a couple of alternatives. The simplest:
module load maxwell
module avail gcc      # show available GNU compilers, also containing gfortran
module load gcc/8.2   # initialize environment for gcc v8.2

# To discover all available versions, for example of gfortran:
xwhich gfortran       # lists all it can find, including the commands required to set the environment
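Once a compiler module is loaded, compilation works as usual. A small sketch, assuming a placeholder source file hello.f90:

module load maxwell gcc/8.2
gcc --version                       # should now report version 8.2
gfortran -O2 -o hello hello.f90     # hello.f90 is a placeholder source file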
MPI
There are various MPI implementations available on Maxwell: mpich, mvapich, openmpi, and Intel- and PGI-compiled openmpi and mvapich, each in various versions. For most purposes, standard openmpi is the simplest choice. Recommended:
module avail mpi                  # lists standard MPI variants
module load mpi/openmpi-x86_64    # your prime choice if you don't have special requirements
module load mpi/openmpi3-x86_64   # a newer version (version 2 is actually retired)

# The above versions might fail on modern hardware (ConnectX-6 IB HCAs).
# If you see lots of errors referring to UCX:
module load maxwell openmpi/3
# or
module load maxwell openmpi/4

# to get a complete list:
module avail mpi                  # lists standard MPI variants
xwhich mpicc                      # lists (almost) all available variants
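A minimal end-to-end sketch, assuming a placeholder MPI source file mpi_hello.c and the modules listed above; the partition, node, and task counts are placeholders:

module load mpi/openmpi-x86_64
mpicc -o mpi_hello mpi_hello.c    # mpi_hello.c is a placeholder MPI source file

# run with 8 ranks on 2 nodes via Slurm
sbatch --partition=maxcpu --nodes=2 --ntasks-per-node=4 \
       --wrap="mpirun ./mpi_hello"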
Disk Space
The home directories on Maxwell have a hard limit of 20 GB. That should be sufficient for all regular purposes. In case more space is needed, use the command mk-beegfs. It will create a folder under /beegfs/desy/user/<school-account> with ample space. Please note: both home and beegfs space will be cleaned up after the end of an event.
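For example, assuming the path pattern given above:

mk-beegfs                        # creates the BeeGFS scratch folder once
ls -ld /beegfs/desy/user/$USER   # the folder is named after your school account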
Instructors can use /beegfs/desy/group/school for (semi-)persistent storage of materials. The folder is accessible to all school accounts.
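For instance, an instructor could deposit material like this (my_material is a placeholder directory):

cp -r my_material /beegfs/desy/group/school/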
Support
Having trouble? Drop a message to maxwell.service@desy.de