Talk
Speaker: Ruddarraju Amrutha
Host: Manoj Kumar Keshari
Title: Elementary Symplectic Groups and their Generalizations
Time, day and date: 11:30 AM – 12:30 PM, Wednesday, March 18
Venue: Ramanujan Hall
Abstract: A. A. Suslin proved that the elementary linear group of size at least 3 is a normal subgroup of the group of invertible matrices. Suslin also proved a relative version of this result with respect to an ideal. V. I. Kopeiko proved a symplectic analogue of this result: the elementary symplectic group is a normal subgroup of the symplectic group, where both are defined with respect to the standard skew-symmetric matrix. I will talk about a generalization of Kopeiko's result to any invertible skew-symmetric matrix of Pfaffian 1.
A stronger result holds when the ring is Euclidean or semi-local: over such rings, the elementary symplectic group with respect to an invertible skew-symmetric matrix is equal to the symplectic group with respect to that matrix. I will talk about these equality results.
Finally, I will talk about a generalization of the elementary symplectic group with respect to an invertible alternating matrix to the case of a projective module, namely the Vaserstein group of a symplectic module. This Vaserstein group is a normal subgroup of the group of isometries of the symplectic module.
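For orientation, the standard setting referred to in the abstract can be written as follows (this is the conventional notation, not taken from the talk itself):

```latex
\[
  \psi_1 = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix},
  \qquad
  \psi_n = \psi_{n-1} \perp \psi_1 ,
\]
\[
  \mathrm{Sp}_{2n}(R) \;=\; \{\, \alpha \in \mathrm{GL}_{2n}(R) \;:\; \alpha^{t}\,\psi_n\,\alpha = \psi_n \,\},
\]
```

where the elementary symplectic group $\mathrm{ESp}_{2n}(R) \subseteq \mathrm{Sp}_{2n}(R)$ is the subgroup generated by the elementary symplectic matrices. Kopeiko's theorem says $\mathrm{ESp}_{2n}(R)$ is normal in $\mathrm{Sp}_{2n}(R)$; the talk replaces $\psi_n$ by an arbitrary invertible skew-symmetric matrix of Pfaffian 1.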
Statistics and Probability seminar
Speaker: Dr. Abhinek Shukla, Centre for Biomedical Data Science, Duke-NUS Medical School
Host: Koushik Saha
Title: Improving Inference in Stochastic Gradient Descent Based on Equal Batch-Size Batch-Means Estimator
Time, day and date: 12:00 PM – 1:00 PM, Wednesday, March 18
Venue: Online (https://meet.google.com/wjs-deag-ysh)
Abstract: Stochastic gradient descent (SGD) is a widely used technique for solving various optimization problems ranging from regression on a large dataset employed in statistics, to training deep neural networks for high dimensional models in machine learning. Performing tasks such as inference in SGD is a challenging problem due to its time-inhomogeneous Markovian nature. An underlying asymptotic normality of the averaged SGD (ASGD) estimator allows for the construction of a batch-means estimator of the asymptotic covariance matrix. In contrast to the existing increasing batch-size (IBS) strategy proposed for reducing correlation between far-apart batches, we propose a memory efficient equal batch-size (EBS) estimator and show consistency of the proposed batch-means estimator under mild conditions. The proposed EBS technique offers bias-correction of the variance at no additional cost to memory and is also shown to outperform the IBS estimator in extensive simulations. Further, since joint inference for high dimensional problems may be difficult, we present marginal-friendly simultaneous confidence intervals, and demonstrate improved predictions based on the proposed covariance estimators of ASGD.
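A minimal sketch of the idea behind the abstract, under simplifying assumptions: run SGD with a polynomially decaying step size, take the averaged iterate (ASGD), split the iterate trajectory into contiguous batches of equal size, and use the sample covariance of the batch means (scaled by the batch length) to estimate the asymptotic covariance of the ASGD estimator. The decay exponent, batch count, and the quadratic test objective below are illustrative choices, not the speaker's algorithm or tuning.

```python
import numpy as np

def asgd_with_ebs(grad, theta0, n_steps, lr0, alpha=0.51, n_batches=20, seed=0):
    """ASGD with an equal batch-size (EBS) batch-means covariance estimate.

    grad(theta, rng) should return an unbiased noisy gradient at theta.
    Returns the averaged iterate and an estimate of its asymptotic
    covariance matrix (up to the 1/n scaling).
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    d = theta.size
    iterates = np.empty((n_steps, d))
    for t in range(n_steps):
        eta = lr0 * (t + 1) ** (-alpha)      # decaying step size
        theta = theta - eta * grad(theta, rng)
        iterates[t] = theta

    theta_bar = iterates.mean(axis=0)        # ASGD estimator

    # Equal batch-size batch means: split the trajectory into n_batches
    # contiguous batches of length m and average within each batch.
    m = n_steps // n_batches
    batch_means = iterates[: m * n_batches].reshape(n_batches, m, d).mean(axis=1)
    diffs = batch_means - theta_bar
    # Scaled sample covariance of the batch means estimates the
    # asymptotic covariance matrix of the ASGD iterate.
    cov_hat = m * (diffs.T @ diffs) / (n_batches - 1)
    return theta_bar, cov_hat

# Usage on a toy quadratic objective f(theta) = 0.5 * ||theta - theta_star||^2
# with additive Gaussian gradient noise (a stand-in for minibatch noise):
theta_star = np.array([1.0, -2.0])

def noisy_grad(theta, rng):
    return (theta - theta_star) + rng.standard_normal(theta.shape)

theta_bar, cov_hat = asgd_with_ebs(noisy_grad, np.zeros(2), n_steps=5000, lr0=0.5)
```

The estimated covariance can then be plugged into coordinatewise (marginal) confidence intervals for the averaged iterate, which is the kind of inference the talk addresses for high-dimensional problems.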
Mathematics Colloquium
Speaker: Sudarshan R. Gurjar, IIT Bombay
Title: Belyi-type theorems for vector bundles
Time, day and date: 4:00 PM – 5:00 PM, Wednesday, March 18
Venue: Ramanujan Hall
Abstract: A classical theorem of Belyi states that a non-singular, irreducible, complex projective curve is defined over a number field if and only if it admits a non-constant morphism to CP^1 branched over at most 3 points. I will discuss analogous theorems for vector bundles, and for vector bundles with connections and Higgs fields.
This is based on some joint work with Indranil Biswas.
CACAAG seminar
Speaker: Madhusudan Manjunath, IIT Bombay
Title: A Gentle Introduction to Non-Archimedean Analytic Geometry à la Berkovich
Time, day and date: 5:30 PM – 6:30 PM, Wednesday, March 18
Venue: Ramanujan Hall
Abstract: A gentle introduction to Berkovich spaces, with an emphasis on intuition, motivation and examples.