Yousuf A. Khan (@theyousufkhan)'s Twitter Profile
Yousuf A. Khan

@theyousufkhan

Scientist @Stanford. CryoEM/ET, AAA+, RNA, Recoding, ML. Formerly: DeepMind AlphaFold, EvoscaleAI, Churchill Scholar @Cambridge_Uni & seen on @Netflix

ID: 1083072925295611906

Link: https://www.ncbi.nlm.nih.gov/myncbi/1DaKepGFnknwBd/bibliography/public/

Joined: 09-01-2019 18:48:02

1.1K Tweets

596 Followers

450 Following

jack morris (@jxmnop)'s Twitter Profile Photo

excited to finally share on arxiv what we've known for a while now: All Embedding Models Learn The Same Thing. embeddings from different models are SO similar that we can map between them based on structure alone, without *any* paired data. feels like magic, but it's real: 🧵
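The claim above (shared structure across embedding models, no paired data needed) can be illustrated with a toy sketch. This is not the paper's method, just an assumption-labeled demo: two hypothetical "models" are random linear views of the same latent semantics, and their pairwise-similarity matrices end up highly correlated even though the raw vector spaces differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration (NOT the paper's actual method): two hypothetical
# embedding "models" are different random linear projections of the
# same shared latent semantics.
n_texts, d_latent, d_a, d_b = 200, 32, 256, 192
latent = rng.normal(size=(n_texts, d_latent))        # shared semantics
emb_a = latent @ rng.normal(size=(d_latent, d_a))    # model A's space
emb_b = latent @ rng.normal(size=(d_latent, d_b))    # model B's space

def cosine_sim_matrix(x):
    """Row-normalize, then take all pairwise cosine similarities."""
    x = x / np.linalg.norm(x, axis=1, keepdims=True)
    return x @ x.T

# Compare the *structure* (pairwise similarities), not the raw vectors:
# the spaces are incompatible, but the similarity patterns agree.
sim_a = cosine_sim_matrix(emb_a)
sim_b = cosine_sim_matrix(emb_b)
iu = np.triu_indices(n_texts, k=1)                   # off-diagonal pairs
corr = np.corrcoef(sim_a[iu], sim_b[iu])[0, 1]
print(f"structural correlation between the two models: {corr:.3f}")
```

The high correlation is what makes an unpaired mapping between spaces conceivable at all: if the relational structure matches, it can serve as the alignment signal.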

Yousuf A. Khan (@theyousufkhan)'s Twitter Profile Photo

This is a really fucking great thread. Confirms a suspicion that many have had for a while, in a quantitative way. TL;DR: after a certain point, the type of model doesn't matter if you have enough parameters

EIRNA Bio (@eirnabio)'s Twitter Profile Photo

Our CSO, Prof. Pasha Baranov, discusses the limitations of RNA-Seq and the necessity of Ribo-seq for a complete understanding. 📹 Watch the full video on YouTube here: youtu.be/YxrLpdYEoJ8 ✉️ Contact us at [email protected] #HarnessingTranslatomics

Jonas Adler (@jonasaadler)'s Twitter Profile Photo

Unclear how many Swedish followers I have, but still: I'll be in Almedalen June 23-26 on Google's behalf, talking about AlphaFold, Gemini, and AGI in general. If anyone wants to meet up, let me know! googlesessions.se/program

jack morris (@jxmnop)'s Twitter Profile Photo

when people were working on BERT i always found these types of visualizations compelling. seeing the attention mechanism in action is so cool

why are they not popular anymore? do our models have too many layers for us to understand now? or are attention maps just not useful?

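For context on what those visualizations actually plot: one attention head's map is just a row-stochastic matrix over token positions, softmax(QKᵀ/√d). A minimal sketch, with random Q/K standing in for a trained model's learned projections (the token list and dimensions are illustrative, not from any real checkpoint):

```python
import numpy as np

rng = np.random.default_rng(0)

# One head's attention map: softmax(Q K^T / sqrt(d)).
# Random Q/K stand in for learned projections of token embeddings.
tokens = ["the", "cat", "sat", "on", "the", "mat"]
d = 8
Q = rng.normal(size=(len(tokens), d))
K = rng.normal(size=(len(tokens), d))

scores = Q @ K.T / np.sqrt(d)
# Numerically stable softmax over each row.
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)   # each row sums to 1

# Text "heatmap": how strongly each position attends to every other.
for tok, row in zip(tokens, weights):
    print(f"{tok:>4}: " + " ".join(f"{w:.2f}" for w in row))
```

In a deep model there is one such matrix per head per layer, which is part of why single-map visualizations scale poorly past BERT-sized networks.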
Cursor (@cursor_ai)'s Twitter Profile Photo

Cursor 1.0 is out now! Cursor can now review your code, remember its mistakes, and work on dozens of tasks in the background.

Gabriele Corso (@gabricorso)'s Twitter Profile Photo

Excited to unveil Boltz-2, our new model capable not only of predicting structures but also binding affinities! Boltz-2 is the first AI model to approach the performance of FEP simulations while being more than 1000x faster! All open-sourced under MIT license! A thread… 🤗🚀

Gabriel Rocklin (@grocklin)'s Twitter Profile Photo

These results are amazing: AF2 struggles to predict structures of non-ideal de novo proteins, but fine-tuning on 10k design models (no experimental validation) improves this, AND the fine-tuning *works better* if we only tune on an experimentally stable subset (6k) of the 10k models

Aviv Spinner (@avivspinner)'s Twitter Profile Photo

What would our data landscapes look like if we could biochemically characterize 100s, 1000s, 10^n evolutionary sequences? Papers like this make strides towards that moonshot dream!! Such neat work from Margaux Pinney, Ph.D. lab! science.org/doi/10.1126/sc…

Pascal Notin (@notinpascal)'s Twitter Profile Photo

🚨 New paper 🚨 RNA modeling just got its own Gym! 🏋️ Introducing RNAGym, large-scale benchmarks for RNA fitness and structure prediction. 🧵 1/9

Bharti Singal, PhD (@singalbharti)'s Twitter Profile Photo

Congratulations to Yousuf A. Khan and co-authors on the latest work (rdcu.be/euBhY), which reveals a unique “side loading” mechanism for Sec18/NSF-driven SNARE disassembly—offering fresh insight into membrane fusion! I’m proud to have collaborated on this work.

Jay Bhattacharya, MD, PhD (@nihdirector_jay)'s Twitter Profile Photo

This 4th of July, I’m reflecting on the freedoms that fuel scientific rigor and public health. From all of us at NIH, wishing you a safe and healthy Independence Day!