REAI 2026 Conference

Resource-Efficient AI (REAI) Conference:
From Embodied to Quantum Intelligence

April 21–22, 2026
Moody Hall Auditorium
Southern Methodist University
Dallas, Texas
Registration opens in early February.

As artificial intelligence systems expand across embodied and autonomous platforms, edge deployments, distributed learning environments, and emerging quantum hardware, resource efficiency is becoming a defining factor in what forms of intelligence are possible. By treating resource efficiency as a governing design requirement, this conference brings together communities working on deployable, scalable, and sustainable AI systems.

Track 1: Resource-Efficient Foundation Models & Digital AI Systems

This track focuses on the design, training, and deployment of resource-efficient foundation models and large-scale digital AI systems. Topics include parameter-efficient fine-tuning, model compression and distillation, and adaptive inference. Retrieval-augmented architectures are also of interest, as they offer a resource-efficient alternative to scaling foundation models purely through parameter count. Emphasis is placed on methods that make large language and multimodal models usable in practice by reducing training, adaptation, and inference costs without relying on brute-force scaling.
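
As one concrete illustration of parameter-efficient fine-tuning, the sketch below adds a trainable low-rank (LoRA-style) update to a frozen linear layer; the layer size, rank, and scaling factor are illustrative assumptions rather than a prescribed configuration.

```python
# Sketch: LoRA-style parameter-efficient fine-tuning of a single layer.
# Layer size, rank, and scaling are illustrative assumptions.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank update."""
    def __init__(self, in_features, out_features, rank=8, alpha=16.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)   # pretrained weights stay frozen
        self.base.bias.requires_grad_(False)
        self.lora_a = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x):
        # Base output plus the scaled low-rank correction (B A) x.
        return self.base(x) + self.scaling * (x @ self.lora_a.T @ self.lora_b.T)

layer = LoRALinear(768, 768, rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"trainable parameters: {trainable} of {total} ({100 * trainable / total:.1f}%)")
```

Because only the two small low-rank matrices receive gradients, adaptation cost and optimizer state shrink dramatically relative to full fine-tuning.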

Track 2: Federated & Distributed AI Systems

Federated AI challenges conventional machine learning by shifting the primary bottleneck from computation to communication in large-scale, distributed systems. This track focuses on resource-efficient learning in settings characterized by limited bandwidth, heterogeneous devices, intermittent connectivity, and decentralized data. In addition to federated learning, the track welcomes work on multi-agent systems with shared state and in-network computation, where state synchronization and data movement dominate the system cost. Topics include communication-efficient optimization under system and data heterogeneity. Emphasis is placed on approaches that scale learning and coordination without requiring tight synchronization or global system visibility.
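
For readers new to the area, the minimal federated averaging sketch below illustrates the communication pattern this track is concerned with: clients train locally and exchange only model parameters, never raw data. The client count, data distributions, and step sizes are illustrative assumptions.

```python
# Sketch: federated averaging (FedAvg-style) on simulated, non-IID clients.
# Only model vectors cross the network; raw data never leaves a client.
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, x, y, lr=0.01, steps=5):
    """A few local gradient steps on a least-squares loss."""
    for _ in range(steps):
        grad = x.T @ (x @ w - y) / len(y)
        w = w - lr * grad
    return w

# Simulated clients with heterogeneous (shifted) data distributions.
dim, n_clients = 10, 4
clients = []
for c in range(n_clients):
    x = rng.normal(size=(50, dim)) + c
    y = x @ rng.normal(size=dim)
    clients.append((x, y))

global_w = np.zeros(dim)
for rnd in range(20):
    # Each communication round transmits one model vector per client.
    local_models = [local_update(global_w.copy(), x, y) for x, y in clients]
    global_w = np.mean(local_models, axis=0)      # server-side averaging

mse = np.mean([np.mean((x @ global_w - y) ** 2) for x, y in clients])
print(f"mean client MSE after 20 rounds: {mse:.4f}")
```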

Track 3: Physical & Embodied AI Systems

This track focuses on AI systems that operate in the physical world under strict constraints on energy, compute, memory, and latency. Topics include on-device and edge AI, real-time and adaptive inference, model–sensor–hardware co-design, and learning-based control under resource limitations. The track also welcomes work on digital twins and simulation-based pipelines that enable efficient training, validation, and deployment of physical AI systems while reducing real-world data collection and experimentation costs. Emphasis is placed on approaches that decrease reliance on compute- or data-intensive trial-and-error in the physical world.
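
As a small example of the memory-versus-accuracy trade-off that on-device deployment often involves, the sketch below applies symmetric int8 post-training quantization to a weight matrix; the random tensor and its size stand in for a trained model and are illustrative assumptions.

```python
# Sketch: symmetric per-tensor int8 post-training quantization.
# The random matrix stands in for trained weights; sizes are illustrative.
import numpy as np

def quantize_int8(w):
    """Map float weights to int8 with a single per-tensor scale."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.default_rng(0).normal(size=(256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
err = np.abs(w - dequantize(q, scale)).mean()
print(f"memory: {w.nbytes} B -> {q.nbytes} B, mean abs error: {err:.4f}")
```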

Track 4: Quantum & Hybrid Classical–Quantum AI Systems

Quantum AI refers to machine learning and optimization methods that leverage quantum computing components, often in combination with classical systems, to perform learning, inference, or search tasks. Current quantum computing operates in the noisy intermediate-scale quantum (NISQ) era, characterized by limited numbers of noisy qubits, shallow circuit depths, and the absence of full error correction. These NISQ-era limitations directly constrain which AI models can be trained and deployed. This track explores NISQ-era quantum machine learning, hybrid workflows, noise-aware optimization, and practical benchmarking of quantum–AI systems.
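
The sketch below illustrates a hybrid classical–quantum workflow in miniature: a classical optimizer tunes the parameter of an exactly simulated one-qubit variational circuit using the parameter-shift rule. The circuit, loss target, and step size are illustrative assumptions; on NISQ hardware the expectation value would come from noisy, shot-limited measurements rather than exact simulation.

```python
# Sketch: hybrid classical-quantum loop with a simulated one-qubit circuit.
# The classical optimizer updates the circuit parameter; the "quantum" part
# is an exact state-vector simulation standing in for hardware execution.
import numpy as np

def expectation(theta):
    """<Z> after applying RY(theta) to |0>, by exact simulation (equals cos(theta))."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0] ** 2 - state[1] ** 2

def parameter_shift_grad(theta, shift=np.pi / 2):
    """Gradient of <Z> with respect to theta via the parameter-shift rule."""
    return 0.5 * (expectation(theta + shift) - expectation(theta - shift))

theta, target, lr = 0.1, -1.0, 0.4           # drive <Z> toward the target value
for step in range(100):
    # Chain rule for the loss (<Z> - target)^2.
    grad = 2.0 * (expectation(theta) - target) * parameter_shift_grad(theta)
    theta -= lr * grad                        # classical update of the quantum parameter

print(f"theta = {theta:.3f}, <Z> = {expectation(theta):.3f}, target = {target}")
```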

Program Committee

Dr. Neena Imam
Peter O'Donnell Jr. Director of the O'Donnell Data Science and Research Computing Institute
Southern Methodist University
nimam@smu.edu

Dr. Suku Nair
Vice Provost for Research and Chief Innovation Officer
Southern Methodist University
nair@smu.edu

Dr. Nader Jalili
Mary and Richard Templeton Dean of Lyle School of Engineering
Southern Methodist University
njalili@smu.edu

Mike Truty
Technical Program Manager, ML Data
Google