The Center for Scientific Computation provides the main resource for
scientific computing collaboration at Southern Methodist University.
Members of the CSC have access to the large-scale SMU computing
cluster. This cluster currently has:
- 163 batch worker nodes, connected with a gigabit Ethernet network:
  - 107 are 8-core nodes (856 total cores), each with 48 GB of memory and 250 GB of local disk space.
  - 56 are 12-core nodes (672 total cores), each with 72 GB of memory and 500 GB of local disk space.
- 48 parallel nodes, connected with an InfiniBand high-speed network for MPI-based parallel computing:
  - 16 are 8-core nodes (128 total cores), each with 48 GB of memory and 500 GB of local disk space.
  - 32 are 12-core nodes (384 total cores), each with 72 GB of memory and 500 GB of local disk space.
- 2 high-memory nodes for data analysis and shared-memory parallel computing, each with 8 cores, 144 GB of RAM, and 3 TB of local disk space.
- 1 GPU computing node, with 8 CPU cores, 6 GB of RAM, and 2 NVIDIA GTX 295 cards. Each GPU card has 960 GPU cores and 3585 MB of RAM.
- One 320 TB parallel Lustre file system is attached to all nodes.
- OS: Scientific Linux 5.5 (64-bit).
- Scheduler: Condor.
- The software stack for the full cluster includes a variety of high-performance mathematical and scientific libraries, as well as the GNU, NAG, and PGI compiler suites.
- The two high-memory nodes also have MATLAB installed for interactive data analysis.
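Batch work on the cluster is submitted through the Condor scheduler listed above. As a rough illustration only, a minimal Condor submit description might look like the following; the executable name, input file, and resource request are hypothetical placeholders, not site-specific settings:

```
# sample.sub -- illustrative Condor submit description (names are hypothetical)
universe       = vanilla
executable     = my_analysis        # hypothetical user program
arguments      = input.dat          # hypothetical input file
request_memory = 4 GB               # stay within a worker node's per-core share
output         = job.$(Cluster).out
error          = job.$(Cluster).err
log            = job.log
queue
```

Such a file would typically be submitted with `condor_submit sample.sub` and monitored with `condor_q`; consult the cluster's own documentation for the exact queues and limits in force.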
For information on obtaining a user account on the SMU computing cluster, see this page (SMU internal).
The SMUHPC cluster was an instrumental resource in SMU's contribution to the discovery of the Higgs boson, as featured in a news story on CBS.
Faculty are encouraged to contribute to the cluster via the CSC's Faculty Partnership Program.