eResearch HPC

UTS researchers can access High Performance Computing (HPC) via the eResearch HPCC (High Performance Computing Cluster).

The goals are:

  • provide a shared resource across the UTS research community
  • provide a test-bed for larger HPCC projects destined for Intersect/NCI.

Documentation

Documentation on how to use the HPCC is available at https://hpc.research.uts.edu.au.

Access to the Cluster

The eResearch team will need to give you access. Email eResearch-IT@uts.edu.au to introduce yourself and describe your requirements. Once you have access, read the HPC Getting Started page.
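
Once your account is set up, access to HPC clusters like this is typically over SSH to the login node. The snippet below is a minimal sketch only: the hostname and username are placeholders, not the actual login details, which are documented on the Getting Started page.

    # Connect to the cluster's login node over SSH.
    # NOTE: the hostname below is a placeholder -- use the real one
    # from the Getting Started page at https://hpc.research.uts.edu.au.
    ssh your-uts-id@login.hpc.example.uts.edu.au

    # Avoid running heavy computation on the login node; submit work
    # to the compute nodes through the scheduler instead.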

Cluster Hardware

The HPCC consists of:

  • Twenty compute nodes, plus a login node and a head node.
  • The number of cores per node ranges from 24 to 64, for a total of a little over 700 cores.
  • Most nodes have 256 GB of RAM, while some have 512 GB for applications that require more memory. Total distributed memory is about 5.6 TB.
  • Some nodes contain GPUs: either dual NVIDIA Tesla K80s or a single Tesla P100 (see the example resource request after this list).
  • Most nodes have at least 3 TB of locally attached disk; some have 6 TB.
  • 700 TB of Isilon storage shared with other eResearch infrastructure.
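
To make use of this hardware, work is submitted to the compute nodes through a batch scheduler rather than run directly on the login node. The script below is a minimal sketch assuming a PBS-style scheduler; the resource keywords (ncpus, mem, ngpus), the requested amounts, and the program name are illustrative assumptions and are site-dependent, so consult https://hpc.research.uts.edu.au for the exact syntax.

    #!/bin/bash
    # Minimal PBS-style job script (illustrative only).
    # Requests 8 cores, 64 GB of RAM and 1 GPU on a single node;
    # resource names and limits vary with site configuration.
    #PBS -N example-job
    #PBS -l select=1:ncpus=8:mem=64gb:ngpus=1
    #PBS -l walltime=02:00:00

    cd "$PBS_O_WORKDIR"              # start where the job was submitted from
    ./my_program --input data.in     # placeholder for your actual workload

Assuming a PBS-style scheduler, a script like this would be submitted with "qsub job.sh" and monitored with "qstat -u $USER".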

Acknowledging use of the HPCC

We would appreciate the following text, or similar, being used as an acknowledgement:

"Computational facilities were provided by the UTS eResearch High Performance Computer Cluster."