About Me

[Image: visualization of the residual of an iterative projection method for linear inequalities]

I am an Assistant Professor in the Mathematics Department at Harvey Mudd College. My research focuses on mathematical data science, optimization, and applied convex geometry; I apply tools from probability, combinatorics, and convex geometry to problems in data science and optimization. Areas in which I have been active recently include randomized numerical linear algebra, combinatorial methods for convex optimization, tensor decomposition for topic modeling, network consensus and ranking problems, and community detection on graphs and hypergraphs. My research is supported by NSF DMS #2211318, “Tensor Models, Methods, and Medicine”.

Before starting at HMC, I received my PhD from the Graduate Group in Applied Mathematics at the University of California, Davis, where I was fortunate to be advised by Professor Jesús A. De Loera. I then was a CAM Assistant Professor (postdoc) in the University of California, Los Angeles (UCLA) Mathematics Department, where my exceptional postdoctoral mentor was Professor Deanna Needell.


Recent News

April ‘24: We (with collaborators Alejandra Castillo, Iryna Hartsock, Paulina Hoyos, Lara Kassab, Alona Kryshchenko, Kamila R. Larripa, Deanna Needell, Shambhavi Suryanarayanan, and Karamatou Yacoubou-Djima) submitted our paper Randomized Iterative Methods for Tensor Regression Under the t-product! In this paper, we extend variants of the Kaczmarz and Gauss-Seidel methods to tensor regression under the t-product, which also yields novel insights into the standard matrix-vector and matrix-matrix regression settings. In addition, we survey the related work in the matrix-vector and tensor regression literature and provide a suite of numerical experiments that illustrate the strengths and weaknesses of our proposed methods, including demonstrating their application to image deblurring!
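For context, the update at the heart of these methods in the classical matrix-vector setting is the randomized Kaczmarz iteration for Ax = b. Below is a minimal NumPy sketch of that classical iteration (not the paper's t-product tensor variant; the function name and parameter choices are illustrative only):

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=1000, seed=0):
    """Minimal sketch of the classical randomized Kaczmarz iteration for Ax = b:
    sample a row with probability proportional to its squared norm, then
    orthogonally project the current iterate onto that row's hyperplane."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)  # squared row norms ||a_i||^2
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        a = A[i]
        x = x + (b[i] - a @ x) / row_norms_sq[i] * a
    return x
```

The methods in the paper carry this row-action projection structure over to tensor regression under the t-product.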

March ‘24: We (with students Nestor Coria and Jaime Pacheco) submitted our paper On Quantile Randomized Kaczmarz for Linear Systems with Time-Varying Noise and Corruption! In this paper, we consider solving systems of linear equations which have been perturbed by adversarial corruptions with the quantile randomized Kaczmarz (QRK) method. Previously, QRK was known to converge on large-scale systems of linear equations suffering from static corruptions. We proved that QRK converges even for systems corrupted by time-varying perturbations. This is an important regime, as many applications where linear systems arise involve distributed data access, and the noise or corruption introduced into the system can vary across time and across accesses!
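The quantile-based row selection idea is simple to sketch. Here is a rough NumPy illustration for a static system (the sample size, quantile level, and selection rule are illustrative assumptions, not the exact method analyzed in the paper):

```python
import numpy as np

def quantile_rk(A, b, q=0.5, sample_size=50, iters=2000, seed=0):
    """Sketch of a quantile randomized Kaczmarz iteration: at each step,
    sample a batch of rows, compute their absolute residuals, and only
    project onto a row whose residual is below the batch's q-quantile,
    so rows with unusually large (possibly corrupted) residuals are skipped."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        S = rng.choice(m, size=min(sample_size, m), replace=False)
        res = np.abs(A[S] @ x - b[S])
        thresh = np.quantile(res, q)
        admissible = S[res <= thresh]
        i = rng.choice(admissible)
        a = A[i]
        x = x + (b[i] - a @ x) / (a @ a) * a
    return x
```

In this sketch, rows whose residuals exceed the sampled quantile, which is where corruptions tend to appear, are simply never projected onto in that iteration.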

March ‘24: I am serving as a SIAM representative on the Joint Taskforce on Data Science Modeling Curriculum, a shared effort between ACM, ASA, MAA, and SIAM. The task force extends the work of the ACM Data Science Task Force toward a complete data science model curriculum, broadening it into a multidisciplinary effort with representatives from computing, statistics, applied mathematics, and possibly other societies. I am excited to be a part of this great effort!

February ‘24: Our (with collaborators Bryan Curtis, Luyining Gan, Rachel Lawrence, and Sam Spiro) paper “Zero Forcing with Random Sets” appeared in Discrete Mathematics! In this paper, we investigate the probability that a randomly sampled set of vertices of a given graph (each vertex included in the set independently with probability p) will serve as a zero forcing set for the graph. This work additionally resolves a conjecture of Boyer et al.
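The color-change rule behind zero forcing is easy to state in code: a filled vertex with exactly one unfilled neighbor forces that neighbor to become filled. Here is a small Python sketch of the rule, together with a Monte Carlo estimate of the probability in question (the function names and example graph are my own illustration, not the paper's machinery):

```python
import random

def is_zero_forcing_set(adj, filled):
    """Apply the zero forcing color-change rule until it stabilizes:
    a filled vertex with exactly one unfilled neighbor forces that neighbor.
    Returns True if every vertex of the graph ends up filled."""
    filled = set(filled)
    changed = True
    while changed:
        changed = False
        for v in list(filled):
            unfilled = [u for u in adj[v] if u not in filled]
            if len(unfilled) == 1:
                filled.add(unfilled[0])
                changed = True
    return len(filled) == len(adj)

def estimate_forcing_probability(adj, p, trials=10_000, seed=0):
    """Monte Carlo estimate of the probability that a p-random vertex set
    (each vertex included independently with probability p) is zero forcing."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample = {v for v in adj if rng.random() < p}
        hits += is_zero_forcing_set(adj, sample)
    return hits / trials

# Example: a path on four vertices; either endpoint alone is a zero forcing set.
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(estimate_forcing_probability(path, p=0.5))
```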

December ‘23: I was recently elected secretary of the SIAM Activity Group on Data Science (SIAG-DATA). The aim of SIAG-DATA is to advance the mathematics of data science, to highlight the importance and benefits of data science, to bring data science innovations to other areas of applied mathematics, and to identify and explore the connections between data science and other applied sciences. One of the main activities of this group is the biennial conference on the Mathematics of Data Science (MDS). I am also currently serving on the organizing committee of this conference! I am excited to help lead SIAG-DATA and continue to advance the many applications of mathematics in data science.

December ‘23: My group’s (with Paulina Hoyos Restrepo (UTA), Alona Kryshchenko (CSUCI), Kamila R. Larripa (Cal Poly Humboldt), Shambhavi Suryanarayanan (Princeton), and Karamatou Yacoubou-Djima (Wellesley)) proposal was accepted to the AIM SQuaRE program! We will make a week-long research visit in February 2024 to study randomized column-slice-action methods for tensor problems. We are very excited for this opportunity!

December ‘23: My group’s (with Paulina Hoyos Restrepo (UTA), Alona Kryshchenko (CSUCI), Deanna Needell (UCLA), Shambhavi Suryanarayanan (Princeton), and Karamatou Yacoubou-Djima (Wellesley)) proposal was accepted to Collaborate@ICERM! We will make a week-long research visit in Summer 2024 to study randomized algorithms for tensor problems with factorized operators or data. We are very excited for this opportunity!

December ‘23: My group’s (with Paulina Hoyos Restrepo (UTA), Alona Kryshchenko (CSUCI), Shambhavi Suryanarayanan (Princeton), and Karamatou Yacoubou-Djima (Wellesley)) proposal was accepted to SLMath Summer Research in Mathematics (SRiM)! We will make a two-week research visit in Summer 2024 to study column-slice-action methods for tensor regression. We are very excited for this opportunity!

November ‘23: I am co-editing a topical collection on tensor methods in La Matematica (the flagship journal of the Association for Women in Mathematics) with Anna Konstorum (IDA/CCS) and Anna Ma (UCI)! Research in tensor (multidimensional array) analysis is accelerating, driven by the increasing complexity of data sets arising from fields such as biomedical engineering, networks, and the physical and social sciences, which naturally lend themselves to a tensor structure. We invite research, survey, and review articles on novel theoretical, computational, and real-world application progress in tensor analysis. Our goal is to bring new developments in tensor analysis together into a Topical Collection at La Matematica, giving interested readers a broad and up-to-date opportunity to learn about new work in the field, lowering the barrier to entry, and featuring a wide array of research and researchers in this area.

November ‘23: I am co-organizing a workshop titled “Women in Randomized Numerical Linear Algebra” at the Institute for Pure and Applied Mathematics (IPAM) in summer 2025 with Malena Espanol (ASU), Anna Ma (UCI), and Deanna Needell (UCLA). We are excited to bring together an amazing group of researchers focused on this exciting topic!

October ‘23: Our paper (with students Tyler Will, Joshua Vendrow, Runyu Zhang, Mengdi Gao, and Eli Sadovnik, and colleagues Denali Molitor and Deanna Needell) Neural Nonnegative Matrix Factorization for Hierarchical Multilayer Topic Modeling was accepted to the journal Sampling Theory, Signal Processing, and Data Analysis! In this paper, we introduce a new model based on nonnegative matrix factorization (NMF) for detecting latent hierarchical structure in data, which we call Neural NMF. This model frames hierarchical NMF as a neural network, and we provide theoretical results which allow us to train Neural NMF via a natural backpropagation method. We illustrate the promise of this model with several numerical experiments on benchmark datasets and real-world data.
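As loose context for the hierarchical idea, a greedy two-layer hierarchical NMF baseline can be assembled from off-the-shelf NMF by factoring the learned topic matrix a second time. The sketch below uses scikit-learn and toy data; it is not Neural NMF, which instead trains the layers jointly via backpropagation, and the ranks and data are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy document-term matrix (nonnegative counts); real data would replace this.
rng = np.random.default_rng(0)
X = rng.poisson(1.0, size=(200, 50)).astype(float)

# Layer 1: X ~= W1 @ H1 with 10 fine-grained topics.
nmf1 = NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0)
W1 = nmf1.fit_transform(X)
H1 = nmf1.components_

# Layer 2: factor the topic matrix itself, H1 ~= W2 @ H2, grouping the 10
# topics into 3 coarse super-topics, so overall X ~= W1 @ W2 @ H2.
nmf2 = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W2 = nmf2.fit_transform(H1)
H2 = nmf2.components_

print("relative reconstruction error:",
      np.linalg.norm(X - W1 @ W2 @ H2) / np.linalg.norm(X))
```

This sequential, layer-by-layer factorization is the simple baseline that hierarchical methods like Neural NMF aim to improve on by optimizing all layers together.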