Yiping Lu

I am a Courant Instructor at the Courant Institute of Mathematical Sciences, New York University, and an incoming tenure-track assistant professor in Industrial Engineering & Management Sciences at Northwestern University. I received my Ph.D. in applied math from Stanford University in 2023 and my Bachelor's degree in applied math from Peking University in 2019. The long-term goal of my research is to develop a hybrid scientific research discipline that combines domain knowledge (differential equations, stochastic processes, control, …), machine learning, and (randomized) experiments. To this end, I work on an interdisciplinary research approach spanning probability and statistics, machine learning, numerical algorithms, control theory, signal processing/inverse problems, and operations research. I was a recipient of the Conference on Parsimony and Learning (CPAL) Rising Star Award in 2024, the Rising Star in Data Science award from the University of Chicago in 2022, the Stanford Interdisciplinary Graduate Fellowship in 2021, and the SenseTime Scholarship for undergraduates in AI in 2019.

Research Interest:

  • Statistical Learning Theory, Deep Learning Theory
  • Functional Analysis, Kernel Operators and Approximation Theory
  • Stochastic Simulation, Monte Carlo Methods, Stochastic Control, Optimal Transport
  • Robust Machine Learning, Uncertainty Estimation, Model Calibration
  • Application:
    • Scientific Machine Learning, AI4Science (Scaling laws, Hybrid methods, …)
    • Diffusion Processes (Simulation-free methods for control)
    • Foundation models

Advertisement: I am happy to host (remote) undergraduate/graduate visitors and am looking for Ph.D. students and postdocs. Prospective students: see here. Prospective summer interns for 2024: see here. You can also find information about my research here.

To anyone: I would appreciate (anonymous) feedback about anything!

Here’s my research statement and my latest CV.

Fine-grained research interests: Scientific Machine Learning (AI4Science), Stochastic Simulation, Machine Learning Theory (RKHS, Empirical Processes, Deep Learning), Inverse Problems, Robust Machine Learning

Contact: yiping [dot] lu [at] northwestern [dot] edu, yiping [dot] lu [at] nyu [dot] edu

Office: Office 0926, Warren Weaver Hall (CIWW), 251 Mercer St, New York, NY 10012

[*New*] AAAI 2024 Tutorial: Recent Advances in Physics-Informed Machine Learning (website, slides)

Research Interest and Highlights

Debiasing Machine Learning Algorithm for Scientific Computing

1. [*New*] Jose Blanchet, Haoxuan Chen, Yiping Lu, Lexing Ying. When can Regression-Adjusted Control Variates Help? Rare Events, Sobolev Embedding and Minimax Optimality. NeurIPS 2023 (alphabetical order)

Experiment Design

1. [*New*] Yiping Lu, Jiajin Li, Lexing Ying, Jose Blanchet. Synthetic Principal Component Design: Fast Covariate Balancing with Synthetic Controls.

Algorithms and Statistics of Scientific Machine Learning

1. [*New*] Jikai Jin, Yiping Lu, Jose Blanchet, Lexing Ying. Minimax Optimal Kernel Operator Learning via Multilevel Training. ICLR 2023

2. Yiping Lu, Haoxuan Chen, Jianfeng Lu, Lexing Ying, Jose Blanchet. Machine Learning For Elliptic PDEs: Fast Rate Generalization Bound, Neural Scaling Law and Minimax Optimality. International Conference on Learning Representations (ICLR), 2022

3. Yiping Lu, Jose Blanchet, Lexing Ying. Sobolev Acceleration and Statistical Optimality for Learning Elliptic Equations via Gradient Descent. NeurIPS 2022

4. Zichao Long, Yiping Lu*, Xianzhong Ma, Bin Dong. “PDE-Net: Learning PDEs From Data.” Thirty-fifth International Conference on Machine Learning (ICML), 2018

Robust Machine Learning

  1. Wenlong Ji, Yiping Lu, et al. An Unconstrained Layer-Peeled Perspective on Neural Collapse. ICLR 2022

  2. Dinghuai Zhang, Tianyuan Zhang*, Yiping Lu*, Zhanxing Zhu, Bin Dong. “You Only Propagate Once: Painless Adversarial Training Using Maximal Principle.” 33rd Annual Conference on Neural Information Processing Systems (NeurIPS), 2019 (equal contribution)

  3. [*New*] Yiping Lu, Wenlong Ji, Zach Izzo, et al. Importance Tempering: Group Robustness for Overparameterized Models. arXiv preprint arXiv:2209.08745, 2022

Optimal Control Formulation of Deep Learning

  1. Yihang Chen, Fanghui Liu, Yiping Lu, Grigorios Chrysos, Volkan Cevher. Generalization Guarantees of Deep ResNets in the Mean-Field Regime. International Conference on Learning Representations (ICLR), 2024

  2. Yiping Lu, Chao Ma, Yulong Lu, Jianfeng Lu, Lexing Ying. “A Mean-field Analysis of Deep ResNet and Beyond: Towards Provable Optimization Via Overparameterization From Depth.” Thirty-seventh International Conference on Machine Learning (ICML), 2020

  3. Dinghuai Zhang, Tianyuan Zhang*, Yiping Lu*, Zhanxing Zhu, Bin Dong. “You Only Propagate Once: Painless Adversarial Training Using Maximal Principle.” 33rd Annual Conference on Neural Information Processing Systems (NeurIPS), 2019 (equal contribution)

  4. Yiping Lu*, Aoxiao Zhong*, Quanzheng Li, Bin Dong. “Beyond Finite Layer Neural Network: Bridging Deep Architectures and Numerical Differential Equations.” Thirty-fifth International Conference on Machine Learning (ICML), 2018 (equal contribution)

Inverse Problem and Image Processing

1. Xiaoshuai Zhang, Yiping Lu*, Jiaying Liu, Bin Dong. “Dynamically Unfolding Recurrent Restorer: A Moving Endpoint Control Method for Image Restoration.” Seventh International Conference on Learning Representations (ICLR), 2019 (equal contribution)

2. Bin Dong, Haochen Ju, Yiping Lu, Zuoqiang Shi. “CURE: Curvature Regularization For Missing Data Recovery.” SIAM Journal on Imaging Sciences, 13(4), 2169–2188, 2020 (alphabetical order)

My Erdős Number = 4

Yiping Lu -> Lexing Ying -> David L. Donoho -> Charles Kam-tai Chui -> Paul Erdős

Yiping Lu -> Jose H. Blanchet -> Martin I. Reiman -> Fan Chung -> Paul Erdős