Research

I work on the science of deep learning. I've had the privilege of learning to do research from Sam Gershman and Aditi Raghunathan.

I'm interested in understanding what systems centered around foundation models will look like in five years, and how they will shape our lives in ways we can't yet imagine. This usually involves studying these models scientifically, running experiments across the stack: from pretraining to evals and beyond.

My path into deep learning was a meandering one. Early in college I was entranced by pure math and neuroscience, and both led me to machine learning, albeit from opposite directions. Eventually, my research interests converged to what they are now. I am easily fascinated, and have published work on topics ranging from language model pretraining to quantization to high-dimensional probability and mouse olfaction.