The primary purposes of the Center are to: (i) provide a state-of-the-art research computing infrastructure for SMU faculty and students; (ii) provide training and support to faculty and students in the use of the Center's resources and external facilities, including awarding certificates in high-performance computing based on completion of assignments associated with the Center's workshops; (iii) provide a mechanism for faculty governance of advanced research computing and related educational activities; (iv) stimulate multidisciplinary research involving computation; (v) support and develop educational programs involving computation; (vi) publicize faculty and student research involving computation; and (vii) engage with local government and industry on relevant research projects.
ManeFrame II (M2)
SMU’s new high-performance compute cluster will dramatically increase the computational capacity and performance that SMU provides to its researchers. The new cluster features state-of-the-art CPUs, accelerators, and networking technologies; significantly more memory per node; and an interactive, GPU-accelerated remote desktop experience. The cluster is also much more energy efficient, making it more economical to run and more environmentally friendly.
The new cluster will provide a similar interactive experience for researchers currently using ManeFrame. Similarities include the CentOS 7 operating system (replacing Scientific Linux 6; both are Red Hat Enterprise Linux derivatives), the SLURM resource scheduler, and the Lmod environment module system. Additionally, updated but familiar development tool chains will be available, including the GCC, Intel, and PGI compiler suites. Optimized high-level programming environments such as MATLAB, Python, and R will also be installed, in addition to the domain-specific software packages that SMU researchers depend on for their work.
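To sketch what that SLURM-plus-Lmod workflow looks like in practice, the example below is a minimal batch script that loads a compiler module and runs a trivial job. This is a hedged illustration, not an official M2 template: the job parameters, the `gcc` module name, and the source file `hello.c` are assumptions, and actual module and partition names should be checked with `module avail` and `sinfo` on the cluster itself.

```shell
#!/bin/bash
# Minimal SLURM batch script sketch for an M2-style cluster.
# All names and limits below are illustrative assumptions.
#SBATCH --job-name=hello-m2      # job name shown in the queue
#SBATCH --nodes=1                # request a single node
#SBATCH --ntasks=1               # run a single task
#SBATCH --time=00:05:00          # five-minute wall-clock limit

# Start from a clean environment, then load a compiler
# toolchain through Lmod (module name is an assumption).
module purge
module load gcc

# Compile and run a trivial program (hello.c is assumed to exist).
gcc -O2 -o hello hello.c
./hello
```

Saved as, say, `hello.sbatch`, the script would be submitted with `sbatch hello.sbatch` and monitored with `squeue -u $USER`. Since this is a scheduler configuration fragment that only runs under SLURM, no standalone test is included.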
More information on ManeFrame II and use can be found here. If you currently use ManeFrame I, you can find more information about transitioning to ManeFrame II here.
|                             | ManeFrame I (retired) | ManeFrame II (2017)   | ManeFrame II (2019)  |
|-----------------------------|-----------------------|-----------------------|----------------------|
| Computational Ability       | 104 TFLOPS            | 630 TFLOPS            | 870 TFLOPS           |
| Number of Nodes             | 1,104                 | 349                   | 354                  |
| Intel CPU Cores (AVX2)      | 8,832                 | 11,088                | 11,276               |
| Total Accelerator Cores     | 0                     | 132,608               | 275,968              |
| Total Memory                | 29.2 TB (29,856 GB)   | 116.5 TB (119,336 GB) | 120 TB (122,880 GB)  |
| Node Interconnect Bandwidth | 20 Gb/s               | 100 Gb/s              | 100 Gb/s             |
| Scratch Space               | 1.2 PB (1,229 TB)     | 1.4 PB (1,434 TB)     | 2.8 PB (2,867 TB)    |
| Operating System            | Scientific Linux 6    | CentOS 7              | CentOS 7             |
Request an Account
SMU faculty/staff can request an account by filling out the New Account Form. If you are a student or postdoc, please ask your supervisor, sponsor, or adviser to submit the New Account Form.
If you are an external research collaborator, please have your sponsor request a sponsored account at firstname.lastname@example.org. Once the account has been created, the sponsor can request an M2 account via the New Account Form.
More information on account management can be found here.
Here you can find information on how to effectively use SMU’s HPC resources. Topics covered in the documentation include:
For questions about using resources or setting up accounts please email the SMU HPC Admins with "HPC" in the subject line.
CRC HPC Summer Workshop Series
The Summer 2020 Center for Research Computing workshop series will provide a hands-on experience that will guide researchers from the basics of using SMU’s supercomputing resources to advanced parallelization and application specific usage. The topics will cover information useful for researchers to quickly begin to use the advanced compute capabilities provided with the cluster. New users are encouraged to take advantage of the introductory “HPC Usage and Development Introduction” workshop on July 6.
The workshops will be given weekly on Mondays, July 6 through August 3, from 9:00 AM to 4:00 PM. Register here!
July 6: HPC Usage and Development Introduction
July 13: Python Development on M2
July 20: R Development on M2
July 27: OpenMP and MPI Development on M2
August 3: Accelerator Development on M2