Research

Computing Resources

The Department of Mathematics has extensive computer facilities for both research and course work, including workstations for graduate students and faculty as well as department servers and clusters for course instruction and research. Master's students share personal Linux workstations in a laboratory setting, while each Ph.D. student is provided with a dedicated desktop workstation.

New SMU research computing cluster (Maneframe)

  • 1104 batch worker nodes, connected with an InfiniBand high-speed network; all are 8-core Intel Xeon X5560 2.8 GHz nodes; 20 of these have 192 GB of RAM each, and the remainder have 24 GB each
  • One 1.2 PB parallel file system (Lustre) is attached to all nodes.
  • OS: Scientific Linux 6 (64 bit)
  • Usage: research
  • Scheduler: SLURM
  • Software: GNU compilers, PGI compilers, hdf5, fftw, mvapich2, python, ...

For access to this system, have a faculty member contact Amit Kumar (ahkumar@smu.edu).
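Since Maneframe uses the SLURM scheduler, jobs are submitted as batch scripts. The following is a minimal sketch; the module name and executable are assumptions for illustration, not site defaults:

```shell
#!/bin/bash
# Hypothetical SLURM batch script for an MPI job on Maneframe.
#SBATCH -J testjob            # job name
#SBATCH -N 2                  # number of nodes
#SBATCH --ntasks-per-node=8   # 8 MPI ranks per 8-core node
#SBATCH -t 01:00:00           # wall-clock limit (hh:mm:ss)
#SBATCH -o testjob.%j.out     # stdout file (%j expands to the job ID)

module load mvapich2          # assumed module name for the MPI stack
srun ./my_mpi_program         # launch the MPI ranks under SLURM
```

A script like this would be submitted with `sbatch jobscript.sh`, and queued jobs can be checked with `squeue -u $USER`.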

SMU research computing cluster (SMUHPC)

  • 163 batch worker nodes, connected with a gigabit ethernet network:
    • 107 are 8-core Intel Xeon 2.53 GHz nodes with 48 GB RAM each
    • 56 are 12-core Intel Xeon 2.8 GHz nodes with 72 GB RAM each
  • 48 MPI nodes, connected with an InfiniBand high-speed network:
    • 16 are 8-core Intel Xeon 2.53 GHz nodes with 48 GB RAM each
    • 32 are 12-core Intel Xeon 2.8 GHz nodes with 72 GB RAM each
  • 2 high-memory data analysis and shared-memory parallel nodes, each with 2 quad-core Intel Xeon 2.66 GHz processors, 144 GB RAM, and 3 TB of local disk space.
  • 1 GPU computing node, with 2 quad-core Intel Xeon 2.26 GHz processors, and 2 NVIDIA GTX 295 cards (each has 960 GPU cores).
  • One 320 TB parallel file system (Lustre) is attached to all nodes.
  • OS: Scientific Linux 5.5 (64 bit)
  • Usage: research
  • Scheduler: Condor
  • Software: Matlab, GNU compilers, PGI compilers, NAG compilers, hdf5, fftw, mpich2, mvapich2, python, R, ...

For more information on the SMUHPC cluster, see the SMU HPC Wiki. For access to this system, contact Amit Kumar (ahkumar@smu.edu).
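Because SMUHPC uses Condor rather than SLURM, jobs are described in a submit file and handed to `condor_submit`. A minimal sketch, assuming a serial executable named `hello` already exists in the working directory (all file names here are illustrative):

```shell
# Hypothetical Condor workflow: write a submit description file, then queue it.
cat > hello.submit <<'EOF'
universe   = vanilla
executable = hello
output     = hello.out
error      = hello.err
log        = hello.log
queue
EOF
condor_submit hello.submit    # queue one copy of the job
```

Job status can then be monitored with `condor_q`.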

Departmental servers

Zeno:

  • CPU: two 16-core AMD Interlagos CPUs at 2.1 GHz (32 cores total)
  • RAM: 32 GB @ 1333 MHz
  • OS: RHEL 6
  • Usage: research and instruction
  • Software: Matlab, Mathematica, X-Maple, gcc 4.4 compiler suite (gcc, g++, gfortran), MPICH2
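Since Zeno is a standalone shared-memory server (no batch scheduler is listed), MPI programs built with the installed MPICH2 stack can be compiled and launched directly. A minimal sketch, assuming the MPICH2 wrappers are on the default path:

```shell
# Hypothetical session on Zeno: build and run a small MPI test with MPICH2.
cat > hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of ranks */
    printf("rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
mpicc -O2 hello.c -o hello    # MPICH2 wrapper around the gcc 4.4 suite
mpiexec -n 32 ./hello         # one rank per core on the 32-core node
```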

For access to these systems, contact Prof. Daniel Reynolds.