Halyun Jeong
University of California, Los Angeles
Email: hajeong@math.ucla.edu
I am an assistant adjunct professor in the Department of Mathematics at UCLA. My research mentor is Professor Deanna Needell.
Previously, I was a PIMS postdoctoral fellow at the University of British Columbia, working with Ozgur Yilmaz, Yaniv Plan, and Michael Friedlander. I received my Ph.D. in Mathematics from the Courant Institute of Mathematical Sciences at New York University in 2017.
Research
My research interests span the mathematical aspects of signal processing and machine learning, including the geometry of high-dimensional data sets, nonlinear signal recovery such as compressed sensing, and computationally efficient optimization.
As a postdoc at UBC, I worked on the concentration of random matrices on sets, performance analysis of iterative algorithms for one-bit compressed sensing, and the manifold identification properties of proximal gradient methods and gauge-dual based algorithms.
For my Ph.D. thesis, I studied phase retrieval and the quantization of phaseless measurements, and analyzed a randomized A/D conversion algorithm that eliminates spectral artifacts.
Publications
- Stochastic Gradient Descent for Streaming Linear and Rectified Linear Systems with Massart Noise, Submitted [arXiv link]
(joint work with Deanna Needell and Elizaveta Rebrova)
- Nearly Optimal Bounds for Cyclic Forgetting, Neural Information Processing Systems (NeurIPS), 2023
(joint work with Mark Kong, William Swartworth, Deanna Needell, and Rachel Ward)
- Linear Convergence of Reshuffling Kaczmarz Methods With Sparse Constraints, Submitted [arXiv link]
(joint work with Deanna Needell)
- Federated Gradient Matching Pursuit, to appear in IEEE Transactions on Information Theory [1st version arXiv link]
(joint work with Deanna Needell and Jing Qin)
- Polar Deconvolution of Mixed Signals, IEEE Transactions on Signal Processing, 2022 [journal link]
(joint work with Zhenan Fan, Babhru Joshi, and Michael P. Friedlander)
- NBIHT: An Efficient Algorithm for 1-bit Compressed Sensing with Optimal Error Decay Rate, IEEE Transactions on Information Theory, 2022 [journal link]
(joint work with Michael P. Friedlander, Yaniv Plan, and Ozgur Yilmaz)
- Sub-Gaussian Matrices on Sets: Optimal Tail Dependence and Applications, Communications on Pure and Applied Mathematics, 2021 [journal link]
(joint work with Xiaowei Li, Yaniv Plan, and Ozgur Yilmaz)
- Atomic Decomposition Via Polar Alignment: The Geometry of Structured Optimization, Foundations and Trends in Optimization, Vol. 3, pp. 280-366, 2020 [journal link] [pdf]
(joint work with Zhenan Fan, Michael P. Friedlander, and Yifan Sun)
- Non-Gaussian Random Matrices on Sets: Optimal Tail Dependence and Applications, Proceedings of International Conference on Sampling Theory and Applications (SampTA), 2019
(joint work with Xiaowei Li, Yaniv Plan, and Ozgur Yilmaz)
- Are We There Yet? Manifold Identification of Gradient-Related Proximal Methods, Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019 [link]
(joint work with Yifan Sun, Julie Nutini, and Mark Schmidt)
- Convergence of the Randomized Kaczmarz Method for Phase Retrieval, Preprint
(joint work with Sinan Gunturk)
Future Talks in 2024
- [May 2024] Invited talk at a minisymposium of the SIAM Conference on Applied Linear Algebra (LA24).
- [April 2024] Invited talk at the Applied Mathematics Seminar, University of California, Irvine.
Teaching at UCLA
Math 156: Machine Learning
Math 151B: Numerical Methods
Math 170E: Probability and Statistics: Probability
Math 170S: Probability and Statistics: Statistics
Teaching at UBC
Teaching at NYU
Fall 2016: Calculus 1 recitation
Fall 2015: Honors III (Fourier analysis) recitation
Fall 2014: Algebra and Calculus (Precalculus) recitation