Learning nonlinear operators using deep neural networks for diverse applications
It is widely known that neural networks (NNs) are universal approximators of continuous functions. However, a less well-known but powerful result is that an NN can also accurately approximate any nonlinear continuous operator. This universal approximation theorem for operators is suggestive of the structure and potential of deep neural networks (DNNs) in learning continuous operators or complex systems from streams of scattered data. In this talk, I will present the deep operator network (DeepONet) for learning various explicit operators, such as integrals and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations. More generally, DeepONet can learn multiscale operators spanning many scales, trained on diverse sources of data simultaneously. We will demonstrate the effectiveness of DeepONet on multiphysics and multiscale problems.
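To make the idea concrete, below is a minimal sketch of the DeepONet forward pass: a branch network encodes the input function u through its values at m fixed sensor locations, a trunk network encodes the query location y, and the operator output G(u)(y) is the dot product of the two latent vectors. All network sizes, the sensor layout, and the helper names here are illustrative assumptions, and the weights are random (untrained), not the speaker's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(params, x):
    # Forward pass through a small fully connected network with tanh activations.
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def init_params(sizes, rng):
    # Randomly initialized (untrained) weights and zero biases for each layer.
    return [(rng.normal(0.0, 0.1, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

m = 50   # number of fixed sensor locations sampling the input function u
p = 20   # width of the latent space shared by the branch and trunk nets

branch = init_params([m, 40, p], rng)   # encodes u(x_1), ..., u(x_m)
trunk  = init_params([1, 40, p], rng)   # encodes the query location y

def deeponet(u_sensors, y):
    # G(u)(y) is approximated by the inner product <branch(u), trunk(y)>.
    b = mlp(branch, u_sensors)           # latent vector, shape (p,)
    t = mlp(trunk, np.atleast_1d(y))     # latent vector, shape (p,)
    return float(b @ t)

# Evaluate the untrained operator network on u(x) = sin(x) at y = 0.3.
xs = np.linspace(0.0, 1.0, m)
out = deeponet(np.sin(xs), 0.3)
```

In practice the branch and trunk parameters would be trained jointly on pairs of input functions and observed operator outputs; this sketch only shows the architecture's structure.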
Lu Lu (陆路) is an Assistant Professor in the Department of Chemical and Biomolecular Engineering at the University of Pennsylvania. Prior to joining Penn, he was an Applied Mathematics Instructor in the Department of Mathematics at the Massachusetts Institute of Technology from 2020 to 2021. He has a multidisciplinary research background, with experience in applied mathematics, physics, computational biology, and computer science. His current research interest lies in physics-informed deep learning and its applications to engineering, physical, and biological problems. More broadly, his interests include multiscale modeling and high-performance computing. Lu obtained his Ph.D. in Applied Mathematics and master's degrees in Engineering, Applied Mathematics, and Computer Science at Brown University, and bachelor's degrees in Mechanical Engineering, Economics, and Computer Science at Tsinghua University.
To date, Lu Lu has published more than 25 papers in top-tier journals such as Science Advances, Nature Reviews Physics, Proceedings of the National Academy of Sciences, Nature Machine Intelligence, SIAM Review, and SIAM Journal on Scientific Computing.