Back to Main FAQ

Discovery Cluster

Where does the content of the terminal (standard out and standard error) go for a scheduled job (non-interactive)?

After the job starts, two files are created in the directory from which you submitted the job. Those files are named STDIN.o&lt;job-id&gt; and STDIN.e&lt;job-id&gt;

  • The .e file contains any errors the job generates (standard error, STDERR).
  • The .o file contains the output of the job along with prologue and epilogue information (standard output, STDOUT).
  • STDIN is the name of the job because qsub (mksub for DartFS) received the commands from STandard INput.
  • The prologue shows the requested resources and the epilogue shows the received resources.
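For example, a minimal sketch of this workflow (the job ID 12345 and the commands inside the job are hypothetical, and the exact qsub options depend on your allocation):

```shell
# Submit a short job by piping commands to qsub (mksub on DartFS);
# with no script file, the scheduler names the job STDIN.
echo "hostname; date" | qsub -l nodes=1:ppn=1

# Suppose the scheduler returned job ID 12345. After the job finishes,
# two files appear in the directory the job was submitted from:
ls STDIN.o12345 STDIN.e12345

# STDIN.o12345 holds the job's standard output plus the prologue/epilogue;
# STDIN.e12345 holds anything written to standard error.
cat STDIN.o12345
```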
How much memory do my jobs get? How do I assign more memory for a job?

Each ‘core’ comes with 4GB of RAM on cells E–K and 8GB of RAM on cell M.

In your PBS script, you can specify the number of nodes and cores your job requires:

#PBS -l nodes=1:ppn=4

In this example, the job will be assigned one node and 4 cores, i.e. 4 × 4GB = 16GB of RAM if it runs on cells E–K and 4 × 8GB = 32GB on cell M.

Note that even if your job needs only one core but requires more RAM, you must request the appropriate number of cores, which may remain unused.
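As an illustration, here is a sketch of a job script for a single-threaded program that needs roughly 16GB of RAM on cells E–K (the job name, walltime, and program are hypothetical):

```shell
#!/bin/bash
# Request 4 cores on one node solely to obtain 4 x 4GB = 16GB of RAM;
# the program itself uses one core, and the other three stay idle.
#PBS -l nodes=1:ppn=4
#PBS -l walltime=01:00:00
#PBS -N bigmem-job

# Run from the directory the job was submitted from
cd $PBS_O_WORKDIR
./my_single_threaded_program
```

Submit the script with `qsub` (or `mksub` on DartFS) as usual.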

How do I launch research software applications (Matlab, Stata, R) on the Andes and Polaris high-performance Linux machines (HPCs)?

Python

What Version of Python Should I Use on the HPC Systems?

There are multiple versions of Python available on the HPC systems. The system version, /usr/bin/python, is an older release (v2.7); it has no additional Python packages installed and does not use any high-performance libraries. If you are doing a large amount of data processing or scientific computing, we recommend that you use the Anaconda distribution of Python.

There are two versions of Python installed on the HPC systems that we recommend if you want a standard Anaconda environment and do not need to install any other Python packages. The modules are called python/anaconda3 and python/anaconda2, and you can load one by typing the command module load python/anaconda3 or module load python/anaconda2.

If you need to install additional Python packages that are not part of the base Anaconda python environment, please create your own conda environment as described here: http://dartgo.org/conda-python

Please contact Research Computing if you need help installing your own conda environment.
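The steps above can be sketched as follows (the environment name "myenv" and the packages are only examples; older conda releases use `source activate` rather than `conda activate`, and the site-specific details are at the link above):

```shell
# Load the Anaconda module, then create a personal conda environment
# with the packages your work needs (names here are illustrative)
module load python/anaconda3
conda create --name myenv python=3 numpy pandas

# Activate the environment and install additional packages as needed
source activate myenv
conda install scipy

# Use the environment, then deactivate when finished
python -c "import numpy; print(numpy.__version__)"
source deactivate
```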

Shared Systems, Andes and Polaris

No FAQs Found

Virtual Machines and Cloud Systems

No FAQs Found
