Center for Research Computing

The mission of the Center for Scientific Computation at Southern Methodist University is to stimulate interdisciplinary education and research in simulation-based engineering and science.

We are motivated by the fact that computer simulation has become an essential component of research in most disciplines in engineering and science, and that advances in computing, networking, and data storage technologies are likely to accelerate this trend. At the same time, it is recognized that our nation's position of dominance in this area can no longer be taken for granted. Thus we believe that the education of the next generation of computational scientists and engineers is an urgent challenge which SMU must accept. 

Goals of the CSC

  • Education and training focused on high-performance computing algorithms, software, and hardware.
  • Formation of interdisciplinary research teams addressing cutting-edge applications.
  • Rapid communication of faculty and student research accomplishments.

ManeFrame II (M2)

SMU’s new high-performance compute cluster will dramatically increase the computational capacity and performance that SMU provides to its researchers. The new cluster features state-of-the-art CPUs, accelerators, and networking technologies, significantly more memory per node, and advanced interactive GPU-accelerated remote desktop experiences. The cluster is also much more energy efficient, making it more economical to run and more environmentally friendly.

The new cluster will provide a similar interactive experience for researchers currently using ManeFrame. Similarities include the CentOS 7 operating system (replacing Scientific Linux 6; both are Red Hat Enterprise Linux derivatives), the SLURM resource scheduler, and the Lmod environment module system. Updated but familiar development toolchains will also be available, including the GCC, Intel, and PGI compiler suites. Optimized high-level programming environments such as MATLAB, Python, and R will be installed as well, in addition to the domain-specific software packages that SMU researchers depend on for their work.
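For researchers new to SLURM and Lmod, the typical workflow is to load the needed environment modules and submit a batch script to the scheduler. Below is a minimal sketch of such a job script; the partition and module names are illustrative assumptions, so check `module avail` and `sinfo` on M2 for the actual ones.

```bash
#!/bin/bash
#SBATCH --job-name=example        # name shown in the job queue
#SBATCH --partition=standard      # hypothetical partition name; see the M2 docs
#SBATCH --nodes=1                 # request a single node
#SBATCH --ntasks=1                # one task on that node
#SBATCH --time=00:10:00           # ten-minute wall-clock limit
#SBATCH --mem=4G                  # memory requested for the job

# Load a compiler toolchain with Lmod (exact module names vary)
module purge
module load gcc

# Compile and run a small program
gcc -O2 -o hello hello.c
./hello
```

The script would be submitted with `sbatch job.sh`, and `squeue -u $USER` reports its status while it waits or runs.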

More information on ManeFrame II and its use can be found here. If you currently use ManeFrame I, you can find more information about transitioning to ManeFrame II here.

                             ManeFrame I (retired)   ManeFrame II (2017)     ManeFrame II (2019)
Computational Ability        104 TFLOPS              630 TFLOPS              870 TFLOPS
Number of Nodes              1,104                   349                     354
Intel CPU Cores (AVX2)       8,832                   11,088                  11,276
Total Accelerator Cores      0                       132,608                 275,968
Total Memory                 29.2 TB (29,856 GB)     116.5 TB (119,336 GB)   120 TB (122,880 GB)
Node Interconnect Bandwidth  20 Gb/s                 100 Gb/s                100 Gb/s
Scratch Space                1.2 PB (1,229 TB)       1.4 PB (1,434 TB)       2.8 PB (2,867 TB)
Archive Capabilities         No                      Yes                     Yes
Operating System             Scientific Linux 6      CentOS 7                CentOS 7
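The capacity figures above use binary units (1 TB = 1,024 GB), which the parenthetical GB totals make easy to check. A quick sketch of that conversion for the total-memory row:

```python
# Total-memory figures from the table, in GB (binary units: 1 TB = 1,024 GB)
memory_gb = {
    "ManeFrame I": 29_856,
    "ManeFrame II (2017)": 119_336,
    "ManeFrame II (2019)": 122_880,
}

for system, gb in memory_gb.items():
    tb = gb / 1024  # convert GB to TB using binary units
    print(f"{system}: {gb:,} GB = {tb:.1f} TB")
```

Running this reproduces the table's TB values (29.2, 116.5, and 120.0 TB, respectively).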

Request an Account

SMU faculty/staff can request an account by filling out the New Account Form. If you are a student or postdoc, please ask your supervisor, sponsor, or adviser to submit the New Account Form on your behalf.

If you are an external research collaborator, please have your sponsor request a sponsored account. Once the account has been created, the sponsor can request an M2 account via the New Account Form.

More information on account management can be found here.



Here you can find information on how to effectively use SMU’s HPC resources. Topics covered in the documentation include:

For questions about using resources or setting up accounts, please email the SMU HPC Admins with "HPC" in the subject line.

Spring 2020

The Spring 2020 Center for Research Computing (CRC) workshop series will provide a hands-on experience guiding researchers from the basics of using SMU's supercomputing resources to advanced parallelization and application-specific usage. The topics cover information researchers need to quickly begin using the advanced compute capabilities provided with the cluster. New users are encouraged to take advantage of the introductory "Introduction to Using M2" workshop, which will be given once a month during the semester.

Workshops will be given weekly on Tuesdays in Fondren Library East Room 110 from 2:00 to 4:00 PM. There will not be a workshop on March 17.

Register here!

Date Workshop
January 21 M2 Introduction
January 28 Introduction to LAPACK and BLAS
February 4 Text Mining with Python on M2 (Led by Dr. Eric Godat)
February 11 Using the New HPC Portal
February 18 Using GitHub
February 25 Writing Portable Accelerator Code with KOKKOS, RAJA, and OCCA
March 3 M2 Introduction
March 10 Introduction to Parallelization Using MPI
March 17 No Workshop (Spring Break)
March 24 Writing High Performance Python Code
March 31 Creating Portable Environments with Docker and Singularity
April 7 M2 Introduction
April 14 Introduction to Parallelization Using OpenMP and OpenACC
April 21 Profiling Applications on M2
April 28 Improving Code Vectorization

Report HPC Usage 

Faculty are encouraged to report their usage of SMU’s HPC facilities here. The data will be used to assess and document usage of these resources.