SURGE XR Research Lab
The SURGE XR Lab (Simulation User Research Game Experience) is a research lab dedicated to advancing research in Virtual Reality (VR), Augmented Reality (AR), Extended Reality (XR), and Mixed Reality (MR). It is run by Dr. Aleshia Hayes, SMU Guildhall Clinical Associate Professor, who founded the lab in 2016 at Purdue Fort Wayne.
Research Goals
SURGE XR Lab research focuses on the use of emergent technologies, applying VR/AR/XR/MR and game technologies to better understand and resolve key educational and social challenges, while improving entrepreneurship and STEM learning among diverse communities.
Current and Past Projects
Past projects, including those listed below, have been developed to benefit adult education and training programs, including programs for ROTC cadets, manufacturing and sales learners, and job interview trainees.
A Training Tool to Help Teachers Recognize and Reduce Bias in Their Classroom Behaviors and Increase Interpersonal Competence
Students whose teachers communicate effectively nonverbally are more motivated to learn and demonstrate more academic progress. Nonverbal communication is a skill that can be improved with guidance and reflection. This project works with students and teachers to prototype virtual reality training modules that track teacher movement and compare teachers' nonverbal behaviors with transformed nonverbal behaviors that more effectively engage students. It also gives students from underrepresented backgrounds the opportunity to observe and participate in our research. This NSF Early Grant for Exploratory Research (EAGER) will contribute to the fields of education and learning technologies by:
- exploring how nonverbal behaviors are expressed, and can best be transformed, in virtual reality classrooms;
- examining how skills learned in virtual reality environments can transfer to teaching in physical classrooms; and
- designing guidelines for aiding self-reflection on nonverbal behavior and on how that behavior is interpreted in the classroom by both teachers and students.

Natural Disaster Simulator(s) - Haiti (Earthquake), Houston (Flood)
In this STEM VR simulation game, a student is introduced to an area struck by a natural disaster (post-earthquake Haiti in the first iteration) and offered opportunities to rebuild after the disaster. During the simulation, the player is "dropped into" different roles and learns about the career fields that keep our military running. The simulation also aims to teach students about the lives of natural disaster victims and to build empathy for them. At the end of the experience, the game AI generates scores and recommends military careers the student might be interested in, based on the choices made. Along the way, the student is given lessons on the causes of the disaster and the reasons some areas are more susceptible to damage (weaker infrastructure, etc.).
Goals:
- "Headfake" players into caring about other communities and natural disasters
- Gain empathy for victims' experiences and losses
- Develop critical thinking
- Understand the levels of work involved in disaster recovery
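The end-of-experience scoring described above (the game AI tallying a player's choices and recommending career fields) could work along these lines. This is a minimal illustrative sketch only; the choice names, career fields, and the simple tallying rule are assumptions, not the lab's actual model:

```python
from collections import Counter

# Hypothetical mapping from in-game recovery choices to military
# career fields they expose the player to (illustrative only).
CHOICE_TO_CAREERS = {
    "triage_survivors": ["Combat Medic", "Public Health"],
    "restore_power_grid": ["Power Production Specialist", "Engineer"],
    "clear_debris": ["Engineer", "Heavy Equipment Operator"],
    "coordinate_supplies": ["Logistics Specialist"],
}

def recommend_careers(choices, top_n=2):
    """Tally the career fields touched by the player's choices and
    return the top_n most frequent as recommendations."""
    tally = Counter()
    for choice in choices:
        tally.update(CHOICE_TO_CAREERS.get(choice, []))
    return [career for career, _ in tally.most_common(top_n)]

# A player who mostly cleared debris and restored power leans
# toward engineering-related fields.
print(recommend_careers(["clear_debris", "restore_power_grid", "clear_debris"]))
# → ['Engineer', 'Heavy Equipment Operator']
```

A production version would presumably weight choices by performance scores rather than simple counts, but the shape (choices in, ranked career recommendations out) matches the mechanism described.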
SURGE into STEM XR Camp
This in-person summer camp, led by Dr. Aleshia Hayes and the SURGE XR Lab, provided an opportunity for young girls to be introduced to STEM careers. The hands-on experience promoted self-efficacy with emerging technology while allowing participants to learn about design thinking and user experience testing principles.

Levels of Fidelity and Training
This project substitutes avatars of high or low fidelity when training students' interview skills. With the support of Samsung, we are creating high-fidelity (stereoscopically rendered) avatars to test in virtual space. The pilot compares the user experience of social presence across varying levels of fidelity in computer-mediated communication. Students will complete interview preparation training in one of the two fidelity conditions, with trained professional trainers.

ROTC VR Simulation
We worked with the Purdue ROTC team to create a simulation that allows an ROTC cadet to practice leading squads through various combat situations using hand gestures, voice commands, and AI, providing a secluded practice environment. The simulation was developed using the HTC VIVE, Leap Motion, WizDish, and Unreal Engine 4.
Engaging the community, teachers, and K-12 students with VR, and exploring ways to support curricula with existing VR tools
