Alex Beatson

abeatson -at-

Google Scholar  /  Twitter  /  Resume

I'm a PhD student in computer science at Princeton University, advised by Ryan P. Adams.

I'm broadly interested in deep learning, probabilistic modeling, stochastic estimation, and numerical methods. My main focus is developing machine learning methods to accelerate and automate numerical modeling, simulation, and design. Recent work has focused on randomized gradient estimation (trading off bias, variance, and computation) for optimization problems in ML and engineering, and on ML methods for accelerating finite element analysis and PDE-constrained optimization.

This year I'm organizing a NeurIPS workshop on machine learning for engineering: ML4Eng.

I spent two fun summers at Google: at Google Brain in Zurich, working on AutoML for generative models with Sylvain Gelly, Olivier Teytaud, and Karol Kurach; and with the speech recognition team in New York, working on transfer learning with Pedro Moreno.

Previously, I received my Master's from Princeton, advised by Han Liu. I did my B.E. at the University of Canterbury in New Zealand, where I worked with Geoff Chase on signal processing for ventilators in the intensive care unit.

Recent papers

Learning composable energy surrogates for PDE order reduction
Alex Beatson, Jordan T. Ash, Geoffrey Roeder, Tianju Xue, Ryan P. Adams.
NeurIPS, 2020. Oral presentation.

Randomized automatic differentiation
Deniz Oktay, Nick McGreivy, Joshua Aduol, Alex Beatson, Ryan P. Adams.
In submission, 2020.

A data-driven computational scheme for the nonlinear mechanical properties of cellular mechanical metamaterials under large deformation
Tianju Xue, Alex Beatson, Maurizio Chiaramonte, Geoffrey Roeder, Jordan T. Ash, Yigit Menguc, Sigrid Adriaenssens, Ryan P. Adams, Sheng Mao.
Soft Matter, 2020.

Amortized finite element analysis for fast PDE-constrained optimization
Tianju Xue, Alex Beatson, Sigrid Adriaenssens, Ryan P. Adams.
ICML, 2020. Preprint at ICLR DeepDiffEq, 2020.

SUMO: Unbiased estimation of log marginal probability for latent variable models
Yucen Luo, Alex Beatson, Mohammad Norouzi, Jun Zhu, David Duvenaud, Ryan P. Adams, Ricky T. Q. Chen.
ICLR, 2020. Spotlight presentation.

Efficient optimization of loops and limits with randomized telescoping sums
Alex Beatson, Ryan P. Adams.
ICML, 2019.

Amortized Bayesian meta-learning
Sachin Ravi, Alex Beatson.
ICLR, 2019.

Continual learning in generative adversarial nets
Ari Seff, Alex Beatson, Daniel Suo, Han Liu.
arXiv, 2017.

Blind attacks on machine learners
Alex Beatson, Zhaoran Wang, Han Liu.
NIPS, 2016.