Colloquium
Finite Element versus Finite Neuron Methods
Speaker: Jinchao Xu, Professor of Applied Mathematics and Computational Sciences, King Abdullah University of Science and Technology (KAUST)
Date and time: Wednesday, November 5, 2025, 11:30AM - 1:00PM (lunch in Room 1307 at 11:30am; talk in Room 1327 at 12:00pm)
Location: Yale Institute for Foundations of Data Science, Kline Tower, 13th Floor, Room 1327, New Haven, CT 06511, and via webcast: https://yale.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=15bec68c-b95c-460c-a56c-b38501215023
Talk summary: This talk presents a unified framework connecting Barron and Sobolev spaces to analyze the approximation properties of ReLU$^k$ neural networks. It establishes both classical and new sharp approximation rates, showing that for functions in the relevant Barron space, ReLU$^k$ networks can achieve high accuracy without the curse of dimensionality. The same convergence rate is obtained in the Sobolev space $H^{(d+2k+1)/2}$ for linearized ReLU$^k$ networks, and these rates are compared with similar results achievable for classical global polynomial spaces. A key insight is that the ReLU$^k$ Barron space and the Sobolev space $H^{(d+2k+1)/2}$ exhibit comparable metric entropy and complexity. An interesting consequence is that a piecewise linear finite element space, when defined in terms of ReLU neural networks, avoids the curse of dimensionality for sufficiently smooth functions, whereas the classical linear finite element space still suffers from it. The talk further introduces a bit-centric perspective, showing that parameter count alone is not a reliable measure of model complexity or approximability. Collectively, these results bridge finite element analysis and deep learning theory, offering new mathematical insights into scientific machine learning.
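To make the objects in the abstract concrete, below is a minimal sketch, in Python with NumPy, of a shallow ReLU$^k$ network $f(x) = \sum_i a_i \max(0, w_i \cdot x + b_i)^k$, with the inner weights fixed at random and only the outer coefficients solved for by least squares, which is one common reading of the "linearized" networks mentioned above. All parameter choices here are illustrative assumptions, not the speaker's construction.

# Minimal sketch (illustrative assumptions, not the speaker's construction):
# a shallow ReLU^k network  f(x) = sum_i a_i * max(0, w_i . x + b_i)^k,
# with random fixed inner weights and least-squares outer weights.
import numpy as np

def relu_k(t, k):
    # ReLU^k activation: max(0, t) raised to the k-th power.
    return np.maximum(0.0, t) ** k

def shallow_relu_k(x, W, b, a, k):
    # Evaluate sum_i a_i * relu_k(w_i . x + b_i) for each row of x.
    return relu_k(x @ W.T + b, k) @ a

rng = np.random.default_rng(0)
n, k = 50, 2                              # n neurons, ReLU^2 activation
x = np.linspace(-1.0, 1.0, 200)[:, None]  # 1-d sample points
target = np.sin(np.pi * x[:, 0])          # a smooth target function

W = rng.standard_normal((n, 1))           # fixed random inner weights
b = rng.uniform(-1.0, 1.0, n)             # fixed random biases
features = relu_k(x @ W.T + b, k)         # (200, n) feature matrix
a, *_ = np.linalg.lstsq(features, target, rcond=None)  # outer weights

err = np.max(np.abs(shallow_relu_k(x, W, b, a, k) - target))
print(f"max error with {n} ReLU^{k} neurons: {err:.2e}")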
Speaker bio: Xu is Professor of Applied Mathematics and Computational Sciences at KAUST, and Director of the KAUST Lab for Scientific Computing and Machine Learning. His primary research interests include the design, analysis, and application of numerical methods for scientific computing—particularly finite element and multigrid methods—as well as machine learning, including deep neural networks and large language models.

His representative contributions include pioneering theory and algorithms in multigrid and domain decomposition methods; the development of the FASP software package; the formulation of the subspace correction framework; and foundational work on the mathematics of deep learning. He has also contributed to machine learning through MgNet, which bridges ideas from numerical analysis and convolutional neural networks, and AceGPT, an LLM developed specifically for Arabic. Several influential theories and algorithms are named after him (X), including the BPX preconditioner, HX preconditioner, XZ identity, and the MWX element.

He was an invited speaker at the International Congress on Industrial and Applied Mathematics (ICIAM) in 2007 and at the International Congress of Mathematicians (ICM) in 2010. He is a Fellow of the Society for Industrial and Applied Mathematics (SIAM), the American Mathematical Society (AMS), the American Association for the Advancement of Science (AAAS), the European Academy of Sciences (EURASC), and Academia Europaea.
