Mathematics colloquium
Venue: Ramanujan Hall, Department of Mathematics
Speaker: Tapabrata Maiti, Michigan State University
Host: Ayan Bhattacharya
Date: Wednesday, May 17, 2023
Time: 10:00 am
Title: Statistical Foundation of Deep Learning: Application to Big Data
Abstract: Deep learning profoundly impacts science and society owing to its impressive empirical success in data-driven artificial intelligence. A key characteristic of deep learning is that accuracy empirically scales with the size of the model and the amount of training data. Over the past decade, this property has enabled dramatic improvements in state-of-the-art learning architectures across various fields. However, owing to the lack of a mathematical/statistical foundation, these developments remain limited to specific applications and do not generalize to the broader class of applications that demand high confidence. This lack of foundation is most evident in limited-training-sample regimes, when deep learning is applied to statistical estimation and inference. We attempt to develop statistically principled reasoning and theory to validate the application of deep learning, thereby paving the way for interpretable deep learning. Our approach builds on Bayesian statistical theory and methodology and on scalable computation. We illustrate the methodology with a wide range of applications.