Ruichen Jiang
About me
I am a 5th-year Ph.D. student in the Department of ECE at UT Austin, advised by Prof. Aryan Mokhtari.
Before UT, I received a B.E. degree in Electronic Engineering and a B.S. degree in Mathematics, both from Tsinghua University, in 2020.
My research focuses on convex and non-convex optimization, particularly on using online learning techniques to design optimization methods. Recently, I have been working on min-max optimization, second-order and higher-order methods, and bilevel optimization.
Selected Works
All Conference Papers
Adaptive and Optimal Second-order Optimistic Methods for Minimax Optimization
Ruichen Jiang, Ali Kavis, Qiujiang Jin, Sujay Sanghavi, and Aryan Mokhtari
NeurIPS 2024
Stochastic Newton Proximal Extragradient Method
Ruichen Jiang, Michał Dereziński, and Aryan Mokhtari
NeurIPS 2024
Non-asymptotic Global Convergence Analysis of BFGS with the Armijo-Wolfe Line Search
Qiujiang Jin, Ruichen Jiang, and Aryan Mokhtari
NeurIPS 2024 (Spotlight)
An Accelerated Gradient Method for Simple Bilevel Optimization with Convex Lower-level Problem
Jincheng Cao, Ruichen Jiang, Erfan Yazdandoost Hamedani, and Aryan Mokhtari
NeurIPS 2024
Krylov Cubic Regularized Newton: A Subspace Second-Order Method with Dimension-Free Convergence Rate
Ruichen Jiang, Parameswaran Raman, Shoham Sabach, Aryan Mokhtari, Mingyi Hong, and Volkan Cevher
AISTATS 2024
Accelerated Quasi-Newton Proximal Extragradient: Faster Rate for Smooth Convex Optimization
Ruichen Jiang and Aryan Mokhtari
NeurIPS 2023 (Spotlight)
Projection-Free Methods for Stochastic Simple Bilevel Optimization with Convex Lower-level Problem
Jincheng Cao, Ruichen Jiang, Nazanin Abolfazli, Erfan Yazdandoost Hamedani, and Aryan Mokhtari
NeurIPS 2023
Online Learning Guided Curvature Approximation: A Quasi-Newton Method with Global Non-Asymptotic Superlinear Convergence
Ruichen Jiang, Qiujiang Jin, and Aryan Mokhtari
COLT 2023
A Conditional Gradient-based Method for Simple Bilevel Optimization with Convex Lower-level Problem
Ruichen Jiang, Nazanin Abolfazli, Aryan Mokhtari, and Erfan Yazdandoost Hamedani
AISTATS 2023
Future Gradient Descent for Adapting the Temporal Shifting Data Distribution in Online Recommendation System
Mao Ye, Ruichen Jiang, Haoxiang Wang, Dhruv Choudhary, Xiaocong Du, Bhargav Bhushanam, Aryan Mokhtari, Arun Kejariwal, and Qiang Liu
UAI 2022
Antenna Efficiency in Massive MIMO Detection
Ruichen Jiang and Ya-Feng Liu
IEEE SPAWC 2021
Achieving Cooperative Diversity in Over-the-Air Computation via Relay Selection
Ruichen Jiang, Sheng Zhou, and Kaibin Huang
IEEE VTC-Fall 2020
Cluster-Based Cooperative Digital Over-the-Air Aggregation for Wireless Federated Edge Learning
Ruichen Jiang and Sheng Zhou
IEEE/CIC ICCC 2020
All Preprints
Online Learning Guided Quasi-Newton Methods with Global Non-Asymptotic Convergence
Ruichen Jiang and Aryan Mokhtari
arXiv preprint, 2024
Convergence Analysis of Adaptive Gradient Methods under Refined Smoothness and Noise Assumptions
Devyani Maladkar, Ruichen Jiang, and Aryan Mokhtari
arXiv preprint, 2024
Non-asymptotic Global Convergence Rates of BFGS with Exact Line Search
Qiujiang Jin, Ruichen Jiang, and Aryan Mokhtari
arXiv preprint, 2024
An Inexact Conditional Gradient Method for Constrained Bilevel Optimization
Nazanin Abolfazli, Ruichen Jiang, Aryan Mokhtari, and Erfan Yazdandoost Hamedani
arXiv preprint, 2023
Generalized Optimistic Methods for Convex-Concave Saddle Point Problems
Ruichen Jiang and Aryan Mokhtari
arXiv preprint, 2022
Tightness and Equivalence of Semidefinite Relaxations for MIMO Detection [Companion Report]
Ruichen Jiang, Ya-Feng Liu, Chenglong Bao, and Bo Jiang
arXiv preprint, 2021