Bryan Briney (@bryanbriney)'s Twitter Profile

Associate Professor at Scripps Research, studying antibody responses to immunization and infection. Big fan of open science and dogs.

ID: 40954944

Joined: 18-05-2009 20:06:32

686 Tweets

518 Followers

554 Following

bioRxiv Immunology (@biorxiv_immuno):

Conformational ensemble-based framework enables rapid development of Lassa virus vaccine candidates biorxiv.org/cgi/content/sh… #biorxiv_immuno

Biology+AI Daily (@biologyaidaily):

Conformational Ensemble-Based Framework Enables Rapid Development of Lassa Virus Vaccine Candidates

• This study introduces an AI-driven framework combining subsampled AlphaFold2 and ProteinMPNN to stabilize the conformationally dynamic Lassa virus glycoprotein complex (GPC), a
Seunghyun Seo (@seunghyunseo7):

The concept of critical batch size is quite simple. Let’s assume we have a training dataset with 1M tokens. If we use a batch size of 10, we can update the model parameters 100,000 times. On the other hand, if we increase the batch size to 100, the number of update steps decreases to 10,000 (scaling as 1/n).
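The arithmetic in the tweet can be sketched in a few lines (the function name is illustrative, not from any library):

```python
# Step-count vs. batch-size trade-off: for a fixed token budget, the
# number of optimizer updates falls inversely with the batch size.
def num_update_steps(total_tokens: int, batch_size: int) -> int:
    """Parameter updates in one pass over the training data."""
    return total_tokens // batch_size

print(num_update_steps(1_000_000, 10))   # 100000 updates at batch size 10
print(num_update_steps(1_000_000, 100))  # 10000 updates at batch size 100
```

This is the trade-off behind the critical batch size: past a certain batch size, the loss improvement per token stops compensating for the reduced number of updates.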

Biology+AI Daily (@biologyaidaily):

ABCFold: Easier Running and Comparison of AlphaFold 3, Boltz-1, and Chai-1

- Structural biology has seen a revolution with deep learning-based protein structure predictors like AlphaFold 3, Boltz-1, and Chai-1. However, running and comparing these models efficiently remains a
Michael Baym (@baym):

Thrilled that our work on this problem with Karel Břinda, Zamin Iqbal, and others is out in Nature Methods today! We used phylogenetic compression (described in the thread) to compress every microbe ever sequenced onto a flash drive so that it can be searched with a laptop!

Brian Naughton (@btnaughton):

Since BoltzDesign1 just got published, people might also be interested in github.com/escalante-bio/… by Nick Boyd

It has a nicely flexible system for including custom losses. It's also surprisingly little code on top of Boltz (and the Jaxified Joltz).
Diego del Alamo (@ddelalamo):

Our several-years-old fix to ProteinMPNN's tendency to make weird antibody CDR seqs is finally out. We run an antibody LM in parallel & added its logits to ProteinMPNN's, fixing most issues we encountered. It also increased % of HER2-binding trastuzumab designs >10-fold
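The logit-combination idea described here can be sketched as follows; the function name, the equal weighting, and the softmax sampling step are assumptions for illustration, not the authors' exact recipe:

```python
import numpy as np

def combined_probs(mpnn_logits, ablm_logits, weight=1.0):
    """Per-position amino-acid distribution from summed logits.

    mpnn_logits, ablm_logits: arrays of shape (length, 20), one row of
    logits per designed position (20 amino acids). Adding the antibody
    LM's logits (hypothetical weighting) biases sampling toward
    antibody-like CDR sequences.
    """
    logits = np.asarray(mpnn_logits) + weight * np.asarray(ablm_logits)
    z = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    p = np.exp(z)
    return p / p.sum(axis=-1, keepdims=True)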
Biology+AI Daily (@biologyaidaily):

Better antibodies engineered with a GLIMPSE of human data

1. GLIMPSE-1 is a new antibody language model trained solely on paired human antibody sequences. It achieves state-of-the-art results in humanization tasks, rivaling models trained on far larger and more diverse
Raiees Andrabi (@andrabi_raiees):

A fantastic collaborative effort with the BatistaLab (Ragon/Harvard), ShawLab(Penn) and IrvineLab (Scripps) where we show how targeted immunogen design can induce HIV broadly neutralizing antibody responses! biorxiv.org/content/10.110…

Soumith Chintala (@soumithchintala):

considering Muon is so popular and validated at scale, we've just decided to welcome a PR for it in PyTorch core by default.

If anyone wants to take a crack at it... github.com/pytorch/pytorc…
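For readers unfamiliar with Muon: its core step orthogonalizes the momentum of each 2-D weight matrix via a quintic Newton-Schulz iteration before applying the update. A minimal NumPy sketch for a square matrix, with coefficients as circulated in the reference implementation (treat the details as illustrative, not PyTorch's eventual API):

```python
import numpy as np

def newton_schulz_orthogonalize(g: np.ndarray, steps: int = 5) -> np.ndarray:
    """Approximately map a square matrix to a near-orthogonal one.

    This is the heart of Muon's update; the quintic coefficients push
    all singular values toward 1 without an explicit SVD.
    """
    a, b, c = 3.4445, -4.7750, 2.0315
    x = g / (np.linalg.norm(g) + 1e-7)  # normalize so singular values <= 1
    for _ in range(steps):
        s = x @ x.T
        x = a * x + (b * s + c * s @ s) @ x
    return x
```

In Muon this iteration runs on the momentum buffer of each hidden-layer weight matrix (non-square matrices need a transpose convention not shown here), and the orthogonalized result replaces the raw gradient in the parameter step.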
PapersAnon (@papers_anon):

Mixture of Raytraced Experts

Stacked MoE architecture that can dynamically select sequences of experts, producing computational graphs of variable width and depth. Allows predictions with increasing accuracy as the computation cycles through the experts' sequence.

Links below
Mikhail Shugay (@antigenomics):

"Enhancing sequence alignment of adaptive immune receptors through multi-task deep learning" by Gur Yaari lab. Very promising: speed boost from GPU acceleration + solving problems of greedy aligners in capturing V(D)J rearrangement uncertainty academic.oup.com/nar/article/53…

Kyle Tretina, Ph.D. (@allthingsapx):

🚀Protenix‑Mini: trims 32 / 48 early Pairformers and swaps AF‑style 200‑step diffusion for a 2‑step ODE (η = 1, γ₀ = 0)👀

This cuts ≈85% inference FLOPs & 70% memory, with only ~2% LDDT hit on RecPDB🔥
Liliang Ren (@liliang_ren):

We’re open-sourcing the pre-training code for Phi4-mini-Flash, our SoTA hybrid model that delivers 10× faster reasoning than Transformers — along with μP++, a suite of simple yet powerful scaling laws for stable large-scale training. 🔗 github.com/microsoft/Arch… (1/4)

Kevin K. Yang 楊凱筌 (@kevinkaichuang):

In 1965, Margaret Dayhoff published the Atlas of Protein Sequence and Structure, which collated the 65 proteins whose amino acid sequences were then known.

Inspired by that Atlas, today we are releasing the Dayhoff Atlas of protein sequence data and protein language models.
GAMA Miguel Angel 🐦‍⬛🔑 (@miangoar):

For protein science/bioinformatics:
MMseqs
Diamond
FoldSeek
BioTite
ColabFold
BioPython
pdb-tools
Seqkit
hh-suite
IQ-Tree
MAFFT
TMAlign
PyRosetta (just released in 2024)
DSSP
BLAST
ProteinMPNN
RFDiff
AlphaFold2
ESM
BindCraft
RDKit
Boltz
HMMER
GROMACS

Which others?