Fritz Obermeyer (@ftzo) 's Twitter Profile
Fritz Obermeyer

@ftzo

Inference engineer @Positron_AI. Maintains @PyroAi. Bayesian pragmatist

ID: 242272620

Joined: 24-01-2011 11:27:29

235 Tweets

1.1K Followers

289 Following

Theofanis Karaletsos (@tkaraletsos) 's Twitter Profile Photo

Sharing a description of TyXe arxiv.org/abs/2110.00276, a little Pyro-based BNN library we designed with Hippolyt Ritter, which started with his internship at Uber AI Labs and was presented last year at #ProbProg2020. 1/5

Pyro (@pyroai) 's Twitter Profile Photo

Pyro 1.8 is released: 4 new tutorials including single-cell transcriptomics, SARS-CoV-2 lineage growth, and drawing models with pyro.render_model(); 4 new autoguides and improvements to AutoMultivariateNormal. Thanks to Vitalii Kleshchevnikov, PhD, and Yerdos Ordabayev! github.com/pyro-ppl/pyro/…
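
As a rough illustration of the autoguides mentioned above, here is a minimal sketch of fitting an AutoMultivariateNormal guide with SVI; the model and data below are invented for illustration, not taken from the release.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoMultivariateNormal
from pyro.optim import Adam

def model(data):
    # Toy model: unknown location and scale of Gaussian observations.
    loc = pyro.sample("loc", dist.Normal(0.0, 10.0))
    scale = pyro.sample("scale", dist.LogNormal(0.0, 1.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(loc, scale), obs=data)

data = torch.randn(100) * 2.0 + 3.0       # invented observations
guide = AutoMultivariateNormal(model)     # full-covariance Gaussian posterior
svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())

for step in range(1000):
    svi.step(data)
```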

Fritz Obermeyer (@ftzo) 's Twitter Profile Photo

Dear GISAID Initiative, you've done a great job collecting the world's largest database of SARS-CoV-2 genomes. It would be great if that database were publicly available. My team would love to do Omicron research, but our epicov feed broke in November and we can no longer access genomes.

Pyro (@pyroai) 's Twitter Profile Photo

We are pleased to release NumPyro 0.9.0 which includes the new Stein variational inference, 4 new distributions, 5 new tutorials and examples, and many enhancements and bug fixes. github.com/pyro-ppl/numpy…

Pyro (@pyroai) 's Twitter Profile Photo

Pyro 1.8.1 is released: Update to PyTorch 1.11; new tutorial on Bayesian workflow; animated plots in GP tutorial; render_model() now shows params; new distributions and samplers. Thanks Nipun Batra, Karm Patel, and others! github.com/pyro-ppl/pyro/…
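
For reference, a minimal sketch of calling pyro.render_model() with the new render_params option; the model below is invented for illustration.

```python
import torch
import pyro
import pyro.distributions as dist

def model(data):
    # Toy regression with one param site and one latent site.
    weight = pyro.param("weight", torch.tensor(1.0))
    bias = pyro.sample("bias", dist.Normal(0.0, 1.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(weight * data + bias, 1.0), obs=data)

# render_params=True includes pyro.param sites in the rendered graph;
# pass filename="model.svg" to write the figure to disk (requires graphviz).
graph = pyro.render_model(model, model_args=(torch.randn(10),), render_params=True)
```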

Fritz Obermeyer (@ftzo) 's Twitter Profile Photo

Martin Jankowiak found a beautifully clever sparse inference method for our viral GWAS, now on 6.9 million SARS-CoV-2 genomes. Joint work with Jacob Lemieux biorxiv.org/content/10.110…

Generate:Biomedicines (@generate_biomed) 's Twitter Profile Photo

Today we introduced Chroma, a generative model that creates new proteins & protein complexes given geometric & functional constraints. It learns to transform unstructured, random 3D shapes into #protein molecules, which can have tens of thousands of atoms. ow.ly/Txn750LShFp

Pyro (@pyroai) 's Twitter Profile Photo

If you need a first draft Pyro model, try asking ChatGPT. For example: How can I use Pyro to learn user preferences among features based on sparse pairwise comparison data, using variational inference?
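
A hedged first-draft sketch of the kind of model that prompt might yield: a Bradley-Terry-style pairwise comparison model fit with variational inference. Every name and number below is invented for illustration, not an official Pyro example.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoNormal
from pyro.optim import Adam

num_features = 5
# Each row is one comparison: (preferred_feature, other_feature).
comparisons = torch.tensor([[0, 1], [2, 1], [0, 3], [4, 3]])

def model(comparisons):
    # Latent per-feature utilities.
    utility = pyro.sample(
        "utility", dist.Normal(torch.zeros(num_features), 1.0).to_event(1)
    )
    # Logit of preferring the first item over the second in each pair.
    logits = utility[comparisons[:, 0]] - utility[comparisons[:, 1]]
    with pyro.plate("pairs", len(comparisons)):
        pyro.sample(
            "choice",
            dist.Bernoulli(logits=logits),
            obs=torch.ones(len(comparisons)),
        )

guide = AutoNormal(model)
svi = SVI(model, guide, Adam({"lr": 0.05}), loss=Trace_ELBO())
for step in range(2000):
    svi.step(comparisons)
```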

Pyro (@pyroai) 's Twitter Profile Photo

Pyro 1.8.5 is released with fixes to support PyTorch 2, new conditional inverse and compose TransformModules, and a substitute handler. Thanks to many new contributors! github.com/pyro-ppl/pyro/…

Andrew Beam (@andrewlbeam) 's Twitter Profile Photo

1/n: We are excited to share that our paper on Chroma, a general-purpose diffusion model for proteins, is out today in Nature! nature.com/articles/s4158… A couple of my favorite highlights in the 🧵 below 👇

Pyro (@pyroai) 's Twitter Profile Photo

Pyro 1.9 is released, adding type hints by Yerdos Ordabayev, a Zuko normalizing flows tutorial by François Rozet, a simple RandomWalkKernel, new tutorials, and many bug fixes. github.com/pyro-ppl/pyro/…

Pyro (@pyroai) 's Twitter Profile Photo

Pyro 1.9.1 is released with a log_prob() implementation for the Lévy-Stable distribution, a WeighedPredictive class, PyroModuleList, bug fixes, and improved type hints. Thanks to Erik Tollerud (@[email protected]), Dario Coscia, Martin Bubel, Ben Zickel, Kipper Fletez-Brant, and others! github.com/pyro-ppl/pyro/…
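
A minimal sketch of the new log_prob(), assuming Pyro >= 1.9.1; the parameter values below are arbitrary.

```python
import torch
import pyro.distributions as dist

# Lévy alpha-stable distribution; stability in (0, 2], skew in [-1, 1].
d = dist.Stable(stability=torch.tensor(1.7), skew=torch.tensor(0.0))
# Previously Stable.log_prob() raised NotImplementedError; it now returns log densities.
log_probs = d.log_prob(torch.tensor([-1.0, 0.0, 2.5]))
```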

Positron AI (@positron_ai) 's Twitter Profile Photo

Positron is proud to share our latest inference performance versus the GPU-based competition:
✅ 70% faster token generation on Llama3.1-8B
✅ 1/3 power usage on Llama3.1-8B
✅ 51% cost savings versus DGX-H100 💸
(Yes, IYKYK: less than half the cost.)

Edward Kmett (@kmett) 's Twitter Profile Photo

GPUs made training massive models possible, but inference needs more memory capacity, better memory bandwidth utilization, more power efficiency, and an architecture built bottom-up with transformers in mind. To that end, I'm excited to share that Positron just raised a $51.6M Series