JAX: Accelerated machine learning research via composable function transformations in Python
JAX had its initial open-source release in December 2018 (https://github.com/google/jax). It is currently used by several groups of researchers for a wide range of advanced applications, from studying the spectra of neural networks, to probabilistic programming and Monte Carlo methods, to scientific applications in physics and biology. Users appreciate JAX above all for its ease of use and flexibility.
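The "composable function transformations" of the title can be illustrated with a minimal sketch: JAX provides transformations such as jax.grad (automatic differentiation), jax.jit (XLA compilation), and jax.vmap (automatic vectorization) that apply to ordinary Python functions and compose with one another. The function f below is an arbitrary example chosen for illustration, not from the talk itself.

```python
import jax
import jax.numpy as jnp

def f(x):
    # An ordinary Python function on JAX arrays (illustrative example).
    return x ** 2

df = jax.grad(f)            # transform: derivative of f
fast_df = jax.jit(df)       # transform: compile the derivative with XLA
batched_df = jax.vmap(df)   # transform: map the derivative over a batch

print(df(3.0))                      # gradient of x**2 at x=3 is 6
print(batched_df(jnp.arange(3.0)))  # gradients at [0, 1, 2]
```

Because each transformation returns another plain function, they compose freely, e.g. jax.jit(jax.vmap(jax.grad(f))).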
Bio: Dougal Maclaurin is a research scientist at Google. He works on programming languages and systems for machine learning, particularly the Python library JAX. He started Autograd, a system for automatic differentiation in Python, which has inspired the design of several systems, including PyTorch, MinPy, Torch Autograd, and Julia Autograd. He is a co-founder of Day Zero Diagnostics, a startup developing a sequencing-based diagnostic for drug-resistant infections. He received his Ph.D. from Harvard in 2016, working with Ryan Adams on methods for machine learning. His work on scalable MCMC, "Firefly Monte Carlo", was recognized with the Best Paper award at UAI 2014.
Lunch for talk attendees will be available at 12:00pm.
To request accommodations for a disability, please contact Emily Lawrence, firstname.lastname@example.org, 609-258-4624 at least one week prior to the event.