After many years in the making, we're open-sourcing Facebook's Adaptive Experimentation platform, Ax, and BoTorch, a library for Bayesian optimization research, built on PyTorch. ai.facebook.com/blog/open-sour…
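For a sense of what the Ax side looks like in practice, here is a minimal sketch of its high-level optimization loop, assuming the `ax` package is installed; the quadratic objective and parameter names are placeholder choices, not anything from the announcement.

```python
from ax import optimize

# Toy 2D minimization: Ax picks the next point to try, we evaluate it,
# and the loop repeats under the hood (Bayesian optimization via BoTorch).
best_parameters, best_values, experiment, model = optimize(
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-10.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [-10.0, 10.0]},
    ],
    # Placeholder objective: distance to the point (1, -2).
    evaluation_function=lambda p: (p["x1"] - 1.0) ** 2 + (p["x2"] + 2.0) ** 2,
    minimize=True,
)

print(best_parameters)
```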
Extrapolating the spectacular performance of GPT-3 into the future suggests that the answer to life, the universe and everything is just 4.398 trillion parameters.
“Our models perform equivalently to backprop on ML benchmarks, while utilising only local and (mostly) Hebbian plasticity. Our method raises the potential that standard ML algorithms could in principle be directly implemented in neural circuitry.” 🧠
openreview.net/forum?id=PdauS…
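For readers unfamiliar with the term, here is a minimal illustration of what a purely "local, Hebbian" plasticity rule means: each weight changes based only on the activity of the two units it connects, with no backpropagated error signal. This is a generic textbook-style sketch for context, not the paper's algorithm.

```python
import torch

def hebbian_update(pre, post, weights, lr=1e-3):
    """Local Hebbian-style update: delta_w is the (batch-averaged) outer
    product of post- and pre-synaptic activity, so each synapse only
    'sees' the two units it connects."""
    # pre:  (batch, n_in)  activations entering the layer
    # post: (batch, n_out) activations leaving the layer
    delta = post.t() @ pre / pre.shape[0]
    return weights + lr * delta

# Toy usage with random activity in a single linear layer.
pre = torch.randn(32, 100)
weights = torch.randn(10, 100) * 0.01
post = pre @ weights.t()
weights = hebbian_update(pre, post, weights)
```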