FINTECH THRUST SEMINAR | Nonlinear Monte Carlo and deep neural networks can approximate semilinear partial differential equations without the curse of dimensionality
Abstract:
Partial differential equations (PDEs) are among the most universal tools for modeling problems in nature and in man-made complex systems. Nearly all traditional approximation algorithms for PDEs in the literature suffer from the so-called "curse of dimensionality" in the sense that the number of computational operations required to achieve a given approximation accuracy grows exponentially in the dimension of the considered PDE. With such algorithms it is impossible to approximately compute solutions of high-dimensional PDEs even on the fastest currently available computers. In the specific situation of linear parabolic PDEs and approximations at a fixed space-time point, the curse of dimensionality of deterministic methods can be overcome by means of Monte Carlo approximation algorithms and the Feynman-Kac formula. In this talk we show that deep neural networks (DNNs) have the fundamental capacity to approximate solutions of semilinear PDEs with Lipschitz nonlinearities with the number of real parameters of the approximating DNN growing at most polynomially in both the reciprocal of the prescribed approximation accuracy and the PDE dimension. Our arguments are strongly based on suitable nonlinear Monte Carlo methods for such PDEs. In the second part of the talk we present and analyze acceleration techniques for stochastic gradient descent optimization in the context of deep learning approximations for PDEs and optimal control problems.
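To illustrate the dimension-independence mentioned in the abstract, the following minimal sketch (not taken from the talk; function names and the test problem are illustrative assumptions) uses the Feynman-Kac formula to estimate, via plain Monte Carlo, the solution of the d-dimensional heat equation at a single space-time point. The cost per sample grows only linearly in the dimension d, in contrast to grid-based methods whose cost grows exponentially in d.

```python
import numpy as np

def heat_mc(g, x, t, n_samples=100_000, seed=None):
    """Monte Carlo Feynman-Kac estimate of u(t, x) for the heat equation
    u_t = (1/2) * Laplacian(u) with initial condition u(0, .) = g.
    Feynman-Kac gives u(t, x) = E[g(x + W_t)], W a d-dim Brownian motion."""
    rng = np.random.default_rng(seed)
    d = len(x)
    # Sample W_t ~ N(0, t * I_d); per-sample cost is O(d), not exponential in d.
    samples = x + np.sqrt(t) * rng.standard_normal((n_samples, d))
    return g(samples).mean()

# Illustrative test case: g(y) = ||y||^2, for which the exact solution is
# u(t, x) = ||x||^2 + d * t, so the estimator can be checked directly.
d, t = 100, 1.0
x = np.zeros(d)
g = lambda y: (y ** 2).sum(axis=1)
est = heat_mc(g, x, t, n_samples=200_000, seed=0)
exact = (x ** 2).sum() + d * t  # = 100.0 here
```

The Monte Carlo error decays like n_samples**(-1/2) independently of d, which is precisely the property that nonlinear Monte Carlo methods extend to semilinear PDEs.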
Arnulf Jentzen (*November 1983) is a professor at the Chinese University of Hong Kong, Shenzhen (CUHK-Shenzhen) (since 2021) and a professor at the University of Münster (since 2019). In 2004 he started his undergraduate studies in mathematics at Goethe University Frankfurt in Germany, in 2007 he received his diploma degree there, and in 2009 he completed his PhD in mathematics at the same university. The core research topics of his research group are machine learning approximation algorithms, computational stochastics, numerical analysis for high-dimensional partial differential equations (PDEs), stochastic analysis, and computational finance. Currently, he serves on the editorial boards of several scientific journals such as the Annals of Applied Probability, the Journal of Machine Learning, the SIAM Journal on Scientific Computing, the SIAM Journal on Numerical Analysis, and the SIAM/ASA Journal on Uncertainty Quantification. His research activities have been recognized through several major awards such as the Felix Klein Prize of the European Mathematical Society (EMS) (2020), an ERC Consolidator Grant from the European Research Council (ERC) (2022), the Joseph F. Traub Prize for Achievement in Information-Based Complexity (IBC) (2022), and a Frontier of Science Award in Mathematics (jointly with Jiequn Han and Weinan E) by the International Congress of Basic Science (ICBS) (2024). Details on the activities of his research group can be found on the webpage http://www.ajentzen.de