Research

I’m currently a PhD student in the Department of Statistics at the University of Oxford, studying as part of the StatML CDT. I’m supervised by Arnaud Doucet and George Deligiannidis, and my work centres on developing a theoretical understanding of generative modelling techniques, particularly diffusion models and variational autoencoders.

During my first year of the CDT, I worked on a couple of research projects. The first aimed to generalise denoising diffusion models to arbitrary state spaces, with particular applications to discrete spaces. The second focused on infinite-width limits of neural networks and their use in studying inductive biases and training dynamics. Since then, I’ve also worked on variational and importance weighted autoencoders, investigating the properties of variational bounds in high dimensions.

Publications

From Denoising Diffusions to Denoising Markov Models. Joe Benton, Yuyang Shi, Valentin De Bortoli, George Deligiannidis, Arnaud Doucet. arXiv preprint, arXiv:2211.03595.

Alpha-divergence Variational Inference Meets Importance Weighted Auto-Encoders: Methodology and Asymptotics. Kamélia Daudel, Joe Benton*, Yuyang Shi*, Arnaud Doucet. arXiv preprint, arXiv:2210.06226.

Polysemanticity and Capacity in Neural Networks. Adam Scherlis, Kshitij Sachan, Adam S. Jermyn, Joe Benton, Buck Shlegeris. arXiv preprint, arXiv:2210.01892.

A Continuous Time Framework for Discrete Denoising Models. Andrew Campbell, Joe Benton, Valentin De Bortoli, Tom Rainforth, George Deligiannidis, Arnaud Doucet. Advances in Neural Information Processing Systems, 2022.