Alex Beatson

abeatson -at-

Google Scholar  /  Twitter  /  Resume

I'm a PhD student in computer science at Princeton University, advised by Ryan P. Adams.

I develop machine learning methods to accelerate and automate engineering modeling, simulation, and design. My favorite tools are deep learning, probabilistic modeling, stochastic estimation, and numerical methods. My recent work has focused on randomized gradient estimation (trading off bias, variance, and computation) for optimization problems in ML and engineering, and on ML methods for accelerating finite element analysis and PDE-constrained optimization. If you want to chat about research, don't hesitate to get in touch.
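To give a flavor of the randomized estimation mentioned above: a classic trick is Russian-roulette (randomized truncation) estimation, where you sample how many terms of an infinite series to evaluate and reweight by the sampling probability so the estimate stays unbiased in expectation. The sketch below is purely illustrative; the series, the geometric sampling distribution, and the function name are my own toy choices, not the method of any paper listed here.

```python
import random

def single_term_estimator(delta, q=0.5):
    """Single-sample unbiased estimate of sum_{n>=1} delta(n).

    Draws a random truncation level N ~ Geometric(q) and reweights
    the N-th term by its sampling probability, so the expectation
    equals the full infinite sum (a Russian-roulette estimator).
    """
    # Sample N with P(N = n) = q * (1 - q)**(n - 1)
    n = 1
    while random.random() > q:
        n += 1
    prob = q * (1 - q) ** (n - 1)
    return delta(n) / prob

# Example: estimate sum_{n>=1} 3**-n = 0.5 by averaging many
# cheap single-term estimates.
random.seed(0)
mean = sum(single_term_estimator(lambda n: 3.0 ** -n)
           for _ in range(100_000)) / 100_000
```

The choice of sampling distribution controls the bias–variance–computation trade-off: heavier tails evaluate deeper (more expensive) terms more often but keep the per-sample variance in check.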

This year I'm organizing a NeurIPS workshop on machine learning for engineering: ML4Eng.

I spent two fun summers at Google: at Google Brain in Zurich, working on AutoML for generative models with Sylvain Gelly, Olivier Teytaud, and Karol Kurach; and with the speech recognition team in New York, working on transfer learning with Pedro Moreno. I occasionally consult for the private equity firm Rosemark Capital.

Previously, I received my Master's degree from Princeton, advised by Han Liu. I did my B.Eng. at the University of Canterbury in New Zealand, where I worked with Geoff Chase on signal processing for ventilators in the intensive care unit.

Recent papers

Randomized automatic differentiation
Deniz Oktay, Nick McGreivy, Joshua Aduol, Alex Beatson, Ryan P. Adams.
In submission, 2020.

Learning composable energy surrogates for PDE order reduction
Alex Beatson, Jordan T. Ash, Geoffrey Roeder, Tianju Xue, Ryan P. Adams.
In submission, 2020.

A data-driven computational scheme for the nonlinear mechanical properties of cellular mechanical metamaterials under large deformation
Tianju Xue, Alex Beatson, Maurizio Chiaramonte, Geoffrey Roeder, Jordan T. Ash, Yigit Menguc, Sigrid Adriaenssens, Ryan P. Adams, Sheng Mao.
Soft Matter, 2020.

Amortized finite element analysis for fast PDE-constrained optimization
Tianju Xue, Alex Beatson, Sigrid Adriaenssens, Ryan P. Adams.
ICML, 2020. Preprint at ICLR DeepDiffEq, 2020.

SUMO: Unbiased estimation of log marginal probability for latent variable models
Yucen Luo, Alex Beatson, Mohammad Norouzi, Jun Zhu, David Duvenaud, Ryan P. Adams, Ricky T. Q. Chen.
ICLR, 2020.

Efficient optimization of loops and limits with randomized telescoping sums
Alex Beatson, Ryan P. Adams.
ICML, 2019.

Amortized Bayesian meta-learning
Sachin Ravi, Alex Beatson.
ICLR, 2019.

Continual learning in generative adversarial nets
Ari Seff, Alex Beatson, Daniel Suo, Han Liu.
arXiv, 2017.

Blind attacks on machine learners
Alex Beatson, Zhaoran Wang, Han Liu.
NeurIPS, 2016.