"In God we trust, others must provide data."

Bio

I received my PhD from the Graduate Group in Applied Mathematics at the University of California, Davis, where I was advised by Professor Jesús A. De Loera. I work in optimization, applied convex geometry, and mathematical data science. I am currently a CAM Assistant Professor (postdoc) in the UCLA Mathematics Department, where my mentor is Professor Deanna Needell. In July 2021, I will join the Harvey Mudd College Mathematics Department as a tenure-track assistant professor!

CV



Contact

Email: j(lastname)@math.ucla.edu.
Office: MSB 7354

Mail: Department of Mathematics
University of California, Los Angeles
Box 951555
Los Angeles, CA 90095-1555, USA





Recent News:

[Feb. '21]

Applications are now open for my funded summer undergraduate research project, Kaczmarz Methods for Large-scale Data Analysis! The project will be run in conjunction with the UCLA CAM REU and is partially funded by Harvey Mudd College. Applications from undergraduates at any institution are welcome and can be submitted through MathPrograms! If you are a Claremont Colleges student, you can additionally apply through HMC URO.

[Jan. '21]

Our paper On a Guided Nonnegative Matrix Factorization (with student Josh Vendrow) was accepted to the 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)! In it, we propose an approach based on the nonnegative matrix factorization (NMF) model, dubbed Guided NMF, that incorporates user-designed seed word supervision. Our experimental results demonstrate the promise of this model and show that it is competitive with similar methods while requiring only very little supervision information!
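
For readers new to the model, here is a minimal NumPy sketch of classical NMF via the standard Lee-Seung multiplicative updates, the unsupervised factorization that Guided NMF builds on; the seed-word supervision term is the paper's contribution and is not reproduced here.

import numpy as np

def nmf(X, r, iters=200, seed=0, eps=1e-10):
    """Classical NMF: factor nonnegative X (m x n) as W @ H with
    W (m x r) and H (r x n) nonnegative, via Lee-Seung multiplicative
    updates for the Frobenius objective ||X - WH||_F^2. This is the
    base model only, not the paper's guided variant."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)  # update representations
        W *= (X @ H.T) / (W @ H @ H.T + eps)  # update dictionary/topics
    return W, H

In topic modeling, X is a word-document matrix: the columns of W are topics, and the columns of H give each document's topic weights.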

[Dec. '20]

Our paper Greed Works: An Improved Analysis of Sampling Kaczmarz-Motzkin (with Anna Ma) was accepted for publication in the SIAM Journal on Mathematics of Data Science (SIMODS)! In this work, we present an improved convergence analysis of the Sampling Kaczmarz-Motzkin (SKM) family of methods on consistent systems of linear equations. Our analysis illustrates the advantage of using greedier members of this family and offers intuition for why Motzkin's (maximal residual) method often converges faster than the Randomized Kaczmarz method! We additionally specialize our analysis to two specific forms of linear systems, including average consensus systems.
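
As a rough illustration only (not the paper's implementation), the SKM iteration can be sketched in a few lines of NumPy: sample beta rows uniformly at random, then project the iterate onto the hyperplane of the sampled row with the largest residual. Setting beta = 1 recovers the Randomized Kaczmarz method, while beta = m recovers Motzkin's maximal-residual method.

import numpy as np

def skm(A, b, beta=10, iters=2000, seed=0):
    """Sampling Kaczmarz-Motzkin for a consistent system Ax = b:
    sample `beta` rows, project onto the one with maximal residual."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    row_norms_sq = np.sum(A * A, axis=1)
    for _ in range(iters):
        sample = rng.choice(m, size=beta, replace=False)
        residuals = A[sample] @ x - b[sample]
        i = sample[np.argmax(np.abs(residuals))]  # greediest sampled row
        x -= (A[i] @ x - b[i]) / row_norms_sq[i] * A[i]
    return x

# Toy consistent system: SKM recovers x_true up to small error.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 50))
x_true = rng.standard_normal(50)
print(np.linalg.norm(skm(A, A @ x_true) - x_true))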

[Nov. '20]

Our paper Data-driven Algorithm Selection and Tuning in Optimization and Signal Processing was accepted for publication in the Annals of Mathematics and Artificial Intelligence! In this paper, we train machine learning methods to automatically improve the performance of optimization and signal processing algorithms. As a proof of concept, we use our approach to improve two popular subroutines in data science: stochastic gradient descent and greedy methods in compressed sensing!

[Oct. '20]

We (with student Edwin Chau) submitted the paper On Application of Block Kaczmarz Methods in Matrix Factorization! In this work, we discuss and test a block Kaczmarz solver that replaces the least-squares subroutine in the common alternating scheme for matrix factorization. This variant trades a small increase in factorization error for significantly faster algorithmic performance. In doing so, we find block sizes that produce a solution comparable to that of the least-squares solver for only a fraction of the runtime and working memory requirement!
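
As a hypothetical sketch of the idea (not the paper's solver, which uses block updates), here is an alternating factorization X ≈ UV in which each least-squares subproblem is replaced by a handful of cheap single-row Kaczmarz steps:

import numpy as np

def kaczmarz_steps(A, b, x, steps, rng):
    """Run a few randomized Kaczmarz projections toward A x ≈ b."""
    norms = np.sum(A * A, axis=1) + 1e-12
    for _ in range(steps):
        i = rng.integers(A.shape[0])
        x -= (A[i] @ x - b[i]) / norms[i] * A[i]
    return x

def alternating_factorization(X, r, outer=50, inner=20, seed=0):
    """X ≈ U @ V, with the exact least-squares subroutine of the
    usual alternating scheme replaced by Kaczmarz updates."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((r, n))
    for _ in range(outer):
        for i in range(m):            # row i of U solves V.T @ u ≈ X[i]
            U[i] = kaczmarz_steps(V.T, X[i], U[i], inner, rng)
        for j in range(n):            # column j of V solves U @ v ≈ X[:, j]
            V[:, j] = kaczmarz_steps(U, X[:, j], V[:, j], inner, rng)
    return U, V

Because each Kaczmarz step touches only one row of the subproblem at a time, the inner solves require a fraction of the working memory of a full least-squares solve, which is the trade-off described above.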

[Oct. '20]

We (with student Sixian Li) submitted the paper Semi-supervised NMF Models for Topic Modeling in Learning Tasks! In this work, we propose several new semi-supervised NMF (SSNMF) models and show that they arise naturally as maximum likelihood estimators given a generative factorization model and assumed distributions of uncertainty in the observed data. We develop training methods for the general forms of these models and illustrate how to apply them to classification tasks; our experiments show that these methods are very promising, achieving high classification accuracy on the 20 Newsgroups data while also producing a coherent topic model and classifying in a low-dimensional space!




Teaching:

MATH 156: Machine Learning

Schedule:

I am speaking at the Colorado State University Data Science seminar on March 11, 2021 at noon PST (Zoom coordinates available on request)!

I am speaking at the AMS Spring Southeastern Sectional Meeting in the "Graphs in Data Science" session on March 13-14, 2021.

I am speaking in the "Moving Randomized Linear Algebra from Theory to Practice" minisymposium at the SIAM Conference on Applied Linear Algebra (LA21) in New Orleans, LA from May 17-21, 2021.

I am participating in the AMS MRC Finding Needles in Haystacks: Approaches to Inverse Problems using Combinatorics and Linear Algebra, which has been rescheduled for June 6-12, 2021 in West Greenwich, RI.

I am participating in the focus program on “Data Science, Approximation Theory, and Harmonic Analysis” at the Fields Institute in May/June 2022 and will be speaking during the Focus Week on “Computational Harmonic Analysis and Linear Algebra”.

Useful Links:

HMC Mathematics
UCLA Mathematics
UCLA Women in Math
UC Davis GGAM
UC Davis Mathematics