About Me

I am a Research Lead at JPMorgan Chase & Co., working in the AI Research team. At JPM, my research focuses on deep generative models for time-series data as part of the Synthetic Data initiative. I also explore robust uncertainty quantification for deep learning methods, with applications to time-series forecasting.

Recent News

05-01-2025: Our paper titled LSCD: Lomb-Scargle Conditioned Diffusion for Time Series Imputation was accepted to ICML 2025!

04-30-2025: Our paper titled Mixup Regularization: A Probabilistic Perspective was accepted to UAI 2025!

04-06-2025: Our paper titled Fusion of Information in Multiple Particle Filtering in the Presence of Unknown Static Parameters was accepted to ICASSP 2025!

03-13-2025: Our journal paper titled Variational Neural Stochastic Differential Equations with Change Points was accepted to TMLR!

Educational Background

Previously, I earned my Ph.D. in Electrical Engineering from Stony Brook University, where I was advised by Monica Bugallo and co-advised by Petar Djuric. My dissertation, titled “Bayesian Learning for Complex Systems under Model Uncertainty,” focused on developing Monte Carlo methods for Bayesian inference in high-dimensional probabilistic models with regime switching. On the theoretical side, I developed adaptive importance sampling methods based on stochastic optimization and reinforcement learning for scalable inference. On the practical side, I applied these methodologies to ecological data, collaborating with Heather Lynch from the Department of Ecology & Evolution at Stony Brook University on age-structured modeling of penguin population dynamics. During my time at Stony Brook, the Institute for Advanced Computational Science supported my research and interdisciplinary collaborations through its Junior Researcher award.

Teaching Activity

In my free time, I teach courses at my alma mater, Stony Brook University. Most recently, in Spring 2024, I taught the course “ESE 531: Statistical Learning and Inference,” which covers the foundations of mathematical statistics (both classical and Bayesian paradigms).