
Joint Chinese-Russian Mathematical Online Colloquium

PROGRAM
11:00 (GMT+3)
Weinan E
Peking University

Bio: Weinan E is a professor in the School of Mathematical Sciences and the Center for Machine Learning Research (CMLR) at Peking University. He is also a professor in the Department of Mathematics and the Program in Applied and Computational Mathematics at Princeton University. His main research interests are numerical algorithms, machine learning, and multiscale modeling, with applications to chemistry, materials science, and fluid mechanics. Weinan E was awarded the ICIAM Collatz Prize in 2003, the SIAM Kleinman Prize in 2009, the SIAM von Karman Prize in 2014, the SIAM-ETH Peter Henrici Prize in 2019, and the ACM Gordon Bell Prize in 2020. He is a member of the Chinese Academy of Sciences and a fellow of SIAM, AMS, and IOP. Weinan E was an invited plenary speaker at ICM 2022, and an invited speaker at ICM 2002 in Beijing, ICIAM 2007, and the AMS National Meeting in 2003. He has also been an invited speaker at APS, ACS, and AIChE annual meetings, and at the American Conference on Theoretical Chemistry.

The Mathematical Theory of Neural Network-based Machine Learning.

The task of supervised learning is to approximate a function using a given set of data. In low dimensions, its mathematical theory has been established in classical numerical analysis and approximation theory in which the function spaces of interest (the Sobolev or Besov spaces), the order of the error and the convergence rate of the gradient-based algorithms are all well-understood. Direct extension of such a theory to high dimensions leads to estimates that suffer from the curse of dimensionality as well as degeneracy in the over-parametrized regime.

In this talk, we attempt to put forward a unified mathematical framework for analyzing neural network-based machine learning in high dimensions (and the over-parametrized regime). We illustrate this framework using kernel methods, shallow network models, and deep network models. For each of these methods, we identify the right function spaces (for which the optimal complexity estimates and direct and inverse approximation theorems hold), prove optimal generalization error estimates, and study the behavior of gradient descent dynamics.
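As a toy illustration of the supervised learning task described above (not taken from the talk itself), the following sketch uses one of the model classes mentioned, a kernel method, to approximate a one-dimensional function from a finite data set; the target function, sample size, bandwidth, and regularization are all illustrative choices.

```python
# Toy sketch: kernel ridge regression approximating f(x) = sin(2*pi*x)
# from 30 noisy samples. All parameter values here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(X, Y, bandwidth=0.2):
    """Gram matrix K[i, j] = exp(-(x_i - y_j)^2 / (2 * bandwidth^2))."""
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

# Training data: noisy samples of the target function on [0, 1].
X_train = rng.uniform(0.0, 1.0, 30)
y_train = np.sin(2.0 * np.pi * X_train) + 0.01 * rng.normal(size=30)

# Solve (K + lam * I) alpha = y for the kernel expansion coefficients.
lam = 1e-6
K = gaussian_kernel(X_train, X_train)
alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)

# Evaluate the learned function on a dense test grid.
X_test = np.linspace(0.0, 1.0, 100)
y_pred = gaussian_kernel(X_test, X_train) @ alpha
err = np.max(np.abs(y_pred - np.sin(2.0 * np.pi * X_test)))
print(f"max approximation error: {err:.3f}")
```

In one dimension this works well with few samples; the point of the theory discussed in the talk is precisely that such guarantees do not extend directly to high dimensions without the right choice of function space.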


12:00 (GMT+3)
Lev Beklemishev
Steklov Mathematical Institute of RAS

Bio: Lev Beklemishev is a principal researcher at the Steklov Mathematical Institute of the Russian Academy of Sciences, head of the Department of Mathematical Logic, and an academician of the Russian Academy of Sciences (2019). He graduated from the Faculty of Mathematics and Mechanics of Lomonosov Moscow State University in 1989 and defended his Ph.D. thesis in 1992. He was awarded the Moscow Mathematical Society prize in 1994 and an Alexander von Humboldt Fellowship (Germany) in 1998. His research interests are in mathematical logic, in particular proof theory, formal arithmetic, provability logic, and modal logic.

Reflection algebras and conservativity spectra of theories.

Turing introduced progressions of theories obtained by iterating the process of extending a theory by its consistency assertion. Generalized Turing progressions can be used to characterize the theorems of a given arithmetical theory of quantifier complexity level $\Pi^0_n$, for any specific $n$. Such characterizations reveal a lot of information about a theory; in particular, they yield consistency proofs, bounds on provable transfinite induction, and bounds on provably recursive functions.
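For reference, the standard form of Turing's construction extends a base theory $T$ by transfinite recursion on ordinal notations:

$$
T_0 = T, \qquad
T_{\alpha+1} = T_\alpha + \mathrm{Con}(T_\alpha), \qquad
T_\lambda = \bigcup_{\alpha < \lambda} T_\alpha \ \text{for limit } \lambda,
$$

where $\mathrm{Con}(T_\alpha)$ is the arithmetized consistency assertion for $T_\alpha$. The generalized progressions mentioned in the abstract replace $\mathrm{Con}$ by stronger reflection principles.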

The conservativity spectrum of an arithmetical theory is a sequence of ordinals characterizing its theorems of quantifier complexity levels $\Pi_1$, $\Pi_2$, etc. by iterated reflection principles. We describe the class of all such sequences and show that it bears a natural structure of an algebraic model of a strictly positive modal logic, the reflection calculus with conservativity modalities.


The meeting will be held in the form of a webinar on the Zoom platform.

Pre-registration for the event is not required.

Link to the conference:

https://us06web.zoom.us/j/84285799938?pwd=TkFCK2wxSXNWTFFDbFZ0RmRxWithdz09

Meeting ID: 842 8579 9938

Passcode: 987654