(all talks to be virtual)
To join these talks, please use this link:
Meeting ID: 984 3666 2983 Passcode: 405237
Contact person: Alejandro Aceves, email@example.com
February 3, 2021 3:30-4:30PM
“Weighted Fourier analysis and dispersive equations”
Brian J. Choi, Postdoctoral Fellow, Dept. of Mathematics, SMU
This talk is a quick summary of my PhD thesis, in which I apply the theory of multilinear weighted Fourier estimates to nonlinear dispersive equations in order to tackle problems in regularity, well-posedness, and pointwise convergence of solutions. Dispersion of waves is a ubiquitous physical phenomenon that arises, among other settings, in shallow-water propagation, nonlinear optics, quantum mechanics, and plasma physics. A natural tool for understanding the related physics is to study waves/signals simultaneously from both the physical and spectral perspectives. Specifically, we will treat nonlinearities as multilinear operator perturbations which, by the method of spacetime Fourier transforms, exhibit smoothing properties in norms defined to reflect the dispersive natures of the solutions. The model equation of my thesis is the quantum Zakharov system, which can be viewed as a variation on the cubic nonlinear Schrödinger equation. We investigate the model in various contexts (adiabatic limits, nonlinear Schrödinger limits, semi-classical limits). I additionally study a variation of Carleson's Fourier convergence problem in the context of pointwise convergence of the full Schrödinger operator with non-zero potential.
February 24, 2021 3:30-4:30PM
“Modeling in the time of the pandemic”
Joceline Lega, Professor, Department of Mathematics, The University of Arizona http://math.arizona.edu/~lega
When COVID-19 started to spread, modelers around the world rallied to provide forecasts and longer-term scenarios to guide the public health response. The pace was fast and the methods varied. However, the types of questions faced by developers were the same as for any model, and I will start by reviewing the various decisions that guide such endeavors. I will then present our work on EpiCovDA, a minimalist model developed by former graduate student Hannah Biegel, which combines simple nonlinear dynamics with data assimilation to provide short-term forecasts. EpiCovDA's predictions contribute to the CDC ensemble model and the last part of the talk will describe the open-science forecasting community that was fostered by the CDC mathematical modeling team and its flu forecasting centers of excellence.
(NEW DATE) March 10, 2021 3:30-4:30PM
“Wave-mediated Kuramoto-like synchronization of bouncing droplets”
Andre Nachbin, Professor of Mathematics, IMPA, Brazil http://www.impa.br/~nachbin
Couder and Fort (PRL 2006) discovered that a fluid droplet bouncing on the surface of a vertically vibrating silicone oil bath forms a wave-particle system referred to as a hydrodynamic pilot-wave system. Such an object had previously been imagined only in the quantum realm. Much research has been done since this discovery, and many problems related to uncertainty have emerged. The main focus of this talk is on the nonlinear dynamics of oscillators that are coupled by the underlying Faraday wavefield. We will briefly discuss our PDE/fluid-dynamic modeling as well as the numerical method. Computationally, we display regimes where two oscillating droplets, confined to separate wells, exhibit correlated features even when separated by a large distance. The particles’ phase-space dynamics is described in a holistic fashion and may not be decomposed into separate subsystems. We detect “coherence” when the bouncing droplets behave as nonlinearly coupled oscillators that spontaneously synchronize, as in the celebrated Kuramoto model for phase oscillators. The droplet coupling is dynamic and implicit, being wave-mediated, as opposed to the Kuramoto model, where the phase coupling is explicit and pre-defined. We also discover a new regime where “coherence” emerges in a statistical fashion. Recent references are Nachbin, Milewski & Bush, Phys. Rev. Fluids (2017), Nachbin, Chaos (2018), and Nachbin, Fluids (2020).
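For readers unfamiliar with the point of comparison, the classical Kuramoto model mentioned in the abstract couples N phase oscillators through an explicit, pre-defined sinusoidal interaction; above a critical coupling strength the phases spontaneously synchronize. A minimal numerical sketch (this is the textbook model, not the speaker's wave-mediated system; all parameter values are illustrative):

```python
import numpy as np

def kuramoto_step(theta, omega, K, dt):
    """One Euler step of the classical Kuramoto model:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    N = len(theta)
    coupling = (K / N) * np.sum(np.sin(theta[None, :] - theta[:, None]), axis=1)
    return theta + dt * (omega + coupling)

def order_parameter(theta):
    """Kuramoto order parameter r in [0, 1]; r near 1 means synchronized."""
    return abs(np.mean(np.exp(1j * theta)))

rng = np.random.default_rng(0)
N = 100
theta = rng.uniform(0, 2 * np.pi, N)   # random initial phases
omega = rng.normal(0.0, 0.1, N)        # natural frequencies
K = 2.0                                # coupling well above the sync threshold
for _ in range(5000):
    theta = kuramoto_step(theta, omega, K, dt=0.01)
print(order_parameter(theta))  # close to 1: the oscillators have synchronized
```

Setting K = 0 decouples the oscillators and the order parameter stays small; the contrast with the wave-mediated, implicit coupling of the bouncing droplets is the subject of the talk.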
March 11, 2021 3:45-4:45PM
“Gaps and effective gaps in Floquet materials”
Amir Sagiv, Chu Assistant Professor of Applied Mathematics, Columbia University
Applying time-periodic forcing is a common technique for effectively changing material properties. A well-known example is the transformation of graphene from a conductor to an insulator ("Floquet topological insulator") by applying to it a time-dependent magnetic potential. We will see how this phenomenon is derived from certain reduced models of graphene. We will then turn to the first-principles, continuum model of graphene. There, not only does the standard derivation fail, but it is unclear what the corresponding mathematical statement is. We will introduce the notion of an "effective gap", or low-oscillations gap, and prove its existence in forced graphene. This new notion distinguishes a part of the energy spectrum in a quantitative way. It implies that the medium is approximately insulating for a class of physically likely wavepackets. Based on joint work with M. I. Weinstein.
March 24, 2021 3:30-4:30PM
“Decision-making in a spiking neuronal network”
Jonathan Rubin, Professor and Chair, Department of Mathematics, University of Pittsburgh
It’s easy to think abstractly about how a decision might be made, and there are established mathematical frameworks for representing this process. There’s a complication, however: decisions are made by brains, consisting of biological neurons and other components, which imposes some very significant constraints on how the decision-making process is implemented. In this talk, I will briefly describe some classical perspectives on simple decision-making. I will describe our approach to unifying two of these perspectives by modeling the dynamics of a spiking neuronal network and fitting the large amounts of “data” that it generates. I will also describe a serious complication that arises, courtesy of some discordant experimental results, and how this leads us to some possible new insights about the function of a certain brain region (the basal ganglia) and the emergence of oscillations.
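One of the classical frameworks for simple decision-making alluded to above is the drift-diffusion model, in which noisy evidence accumulates until it reaches a decision bound. The sketch below is a generic textbook version, not the speaker's spiking-network model, and every parameter value is illustrative:

```python
import numpy as np

def drift_diffusion_trial(drift, noise, bound, dt, rng, max_steps=100000):
    """Simulate one decision: evidence x accumulates with constant drift
    plus Gaussian noise until it hits +bound ('A') or -bound ('B').
    Returns (choice, reaction_time)."""
    x, t = 0.0, 0.0
    for _ in range(max_steps):
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x >= bound:
            return "A", t
        if x <= -bound:
            return "B", t
    return None, t  # no decision within the time limit

rng = np.random.default_rng(1)
choices = [drift_diffusion_trial(0.5, 1.0, 1.0, 0.001, rng)[0]
           for _ in range(500)]
print(choices.count("A") / len(choices))  # positive drift favors choice A
```

The model yields both choice probabilities and reaction-time distributions, which is why it is a standard abstraction to compare against biologically detailed spiking networks.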
April 7, 2021 3:30-4:30PM
“Data-driven model discovery and physics-informed learning”
Joint talk with the DCII Cluster on Machine Learning and Control Theory
Nathan Kutz, Robert Bolles and Yasuko Endo Professor of Applied Mathematics and Adjunct Professor of Electrical Engineering and Physics, University of Washington
A major challenge in the study of dynamical systems and boundary value problems is that of model discovery: turning data into reduced-order models that are not just predictive, but provide insight into the nature of the underlying system that generated the data. We introduce a number of data-driven strategies for discovering nonlinear multiscale dynamical systems and their embeddings from data. We consider two canonical cases: (i) systems for which we have full measurements of the governing variables, and (ii) systems for which we have incomplete measurements. For systems with full state measurements, we show that the recent sparse identification of nonlinear dynamical systems (SINDy) method can discover governing equations with relatively little data, and we introduce a sampling method that allows SINDy to scale efficiently to problems with multiple time scales, noise, and parametric dependencies. For systems with incomplete observations, we show that the Hankel alternative view of Koopman (HAVOK) method, based on time-delay embedding coordinates and the dynamic mode decomposition, can be used to obtain a linear model and Koopman-invariant measurement systems that nearly perfectly capture the dynamics of nonlinear systems and boundary value problems. Neural networks are used in targeted ways to aid in the model-reduction process. Together, these approaches provide a suite of mathematical strategies for reducing the data required to discover and model nonlinear multiscale systems.
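At its core, SINDy performs sparse regression of measured derivatives onto a library of candidate terms. The following minimal one-dimensional sketch of sequential thresholded least squares (a simplification of the published method; the cubic library and ground-truth system are chosen purely for illustration) recovers a known polynomial right-hand side from noiseless data:

```python
import numpy as np

def sindy(X, Xdot, threshold=0.1, n_iter=10):
    """Sequential thresholded least squares, the core of SINDy:
    find sparse coefficients Xi with Xdot ~= Theta(X) @ Xi."""
    # Candidate library for a 1-D state: [1, x, x^2, x^3].
    Theta = np.column_stack([np.ones_like(X), X, X**2, X**3])
    Xi = np.linalg.lstsq(Theta, Xdot, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0          # prune small coefficients...
        big = ~small
        if big.any():            # ...and refit on the surviving terms
            Xi[big] = np.linalg.lstsq(Theta[:, big], Xdot, rcond=None)[0]
    return Xi

# Synthetic data from the known system dx/dt = -2x + 0.5x^3.
x = np.linspace(-1.5, 1.5, 200)
xdot = -2.0 * x + 0.5 * x**3
coeffs = sindy(x, xdot)
print(coeffs)  # ≈ [0, -2, 0, 0.5]: the sparse model is recovered
```

In practice the derivatives are estimated from noisy time series, which is where the sampling strategies mentioned in the abstract become important.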
April 28, 2021 3:30-4:30PM
“Mathematical models and methods in computational neurology”
Pedro Maia, Assistant Professor, Department of Mathematics, University of Texas at Arlington
The emerging field of computational neurology provides an important window of opportunity for modeling complex biophysical phenomena, for scientific computing, for understanding functionality disruption in neural networks, and for applying machine-learning methods to diagnosis and personalized medicine. In this talk, I will illustrate some of our latest results across different spatial scales, spanning a broad array of mathematical techniques such as: (i) numerical methods for nonlinear PDEs for solving inhomogeneous active cable equations, (ii) spike-train metrics for quantifying information loss in compromised neural signals, (iii) applied dynamical systems for modeling biological neural networks, (iv) decision-making models, (v) applied inverse-problem techniques for finding the origins of neurodegeneration, and (vi) data-driven methods in medical imaging.