Jungyoon Lee (@yololulu_)'s Twitter Profile
Jungyoon Lee

@yololulu_

Master's @ Mila & Ex-finance Girlie @ JPMorgan | Generative Models, AI4Science, ML for chemistry

ID: 1496813760123650053

Link: https://www.linkedin.com/in/jungyoon-lee-6a9aab139/
Joined: 24-02-2022 11:46:39

32 Tweets

78 Followers

201 Following

Olexandr Isayev 🇺🇦🇺🇸 (@olexandr) 's Twitter Profile Photo

AI in drug discovery is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it.

Kirill Neklyudov (@k_neklyudov) 's Twitter Profile Photo

🧵(1/5) Have you ever wanted to combine different pre-trained diffusion models but don't have time or data to retrain a new, bigger model? 🚀 Introducing SuperDiff 🦹‍♀️ – a principled method for efficiently combining multiple pre-trained diffusion models solely during inference!
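
The tweet above is about combining pretrained diffusion models purely at inference time. As a rough illustration of the general idea (not SuperDiff's exact density-based reweighting), here is a minimal sketch that mixes two score networks inside a standard reverse-SDE sampling loop; `score_a`, `score_b`, and the fixed weight `w` are placeholders.

```python
# Hedged sketch: mix the score estimates of two pretrained diffusion models
# inside an Euler-Maruyama reverse-SDE loop (VP parameterization). The fixed
# weight `w` stands in for SuperDiff's principled, density-based reweighting.
import torch

def sample_with_mixed_scores(score_a, score_b, x_T, betas, w=0.5):
    """score_a, score_b: callables (x, t) -> score estimate with x's shape.
    x_T: initial Gaussian noise; betas: 1-D tensor of noise-schedule values."""
    x = x_T
    for t in reversed(range(len(betas))):
        beta_t = betas[t]
        s = w * score_a(x, t) + (1.0 - w) * score_b(x, t)   # mixed score
        x_mean = x + beta_t * (0.5 * x + s)                 # reverse-SDE drift step
        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x = x_mean + torch.sqrt(beta_t) * noise
    return x
```

The paper's contribution, as I understand the announcement, is deriving the combination rule (e.g., logical OR/AND of the models) from quantities estimated along the trajectory rather than using a hand-picked constant weight as above.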

Rob Brekelmans (@brekelmaniac) 's Twitter Profile Photo

I wrote a thing about "RL or control as Bayesian inference", which encompasses:
- RLHF and controlled generation in LLMs
- Finetuning or guidance in diffusion models
- Diffusion samplers from general unnormalized densities
- Sequential Monte Carlo sampling for all of the above

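A common thread across the items above is treating the base model as a prior and the reward (or guidance term) as a likelihood, so the controlled model is a posterior p(x) ∝ p_base(x) · exp(r(x)/β). A minimal self-normalized importance sampling sketch of that idea follows; `base_sample` and `reward` are placeholders, not any specific library's API.

```python
# Minimal sketch of "control as inference": treat the base model as a prior and
# reweight its samples by exp(reward / beta), targeting p(x) ∝ p_base(x) * exp(r(x)/beta).
import math, random

def posterior_samples(base_sample, reward, n=1000, beta=1.0):
    xs = [base_sample() for _ in range(n)]          # draws from the prior/base model
    logw = [reward(x) / beta for x in xs]           # log importance weights
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]           # stabilized, unnormalized weights
    # One SMC-style multinomial resampling step according to the weights.
    return random.choices(xs, weights=w, k=n)
```
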
Avery Ryoo (@averyryoo) 's Twitter Profile Photo

Looking for an undergrad volunteer who's interested in SSMs + transformers for neural decoding/BCIs at Mila! Strong coding + PyTorch skills are a must. Please DM/email me your CV + interests (priority given to those based in Montréal). Thanks! 🧠🤖

Joey Bose (@bose_joey) 's Twitter Profile Photo

Did you miss the ICML deadline? Why not consider submitting your cool work to our "Frontiers of Probabilistic Inference: Learning meets Sampling" workshop at #ICLR2025

NEW UPDATED DEADLINE: Feb 10th AoE
website: sites.google.com/view/fpiworksh…

Joey Bose (@bose_joey) 's Twitter Profile Photo

New work on scaling Boltzmann generators!!!! We went back to the basics for this one:
1.) no explicit equivariance
2.) more scalable transformer-based flows than flow matching!!!
Work co-led with @charliebtan
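
For context on how Boltzmann generators use flows in general (a generic sketch, not the architecture or training scheme of the paper above): a normalizing flow provides exact sample likelihoods, so its samples can be importance-reweighted toward the Boltzmann density exp(-U(x)/kT). Here `flow` and `energy` are placeholders.

```python
# Generic Boltzmann-generator reweighting (not the paper's model): a normalizing
# flow gives samples together with exact log-likelihoods, so importance weights
# w(x) ∝ exp(-U(x)/kT) / q(x) correct expectations toward the Boltzmann target.
# `flow` (returning x and log|det dx/dz|) and `energy` are placeholders.
import torch

def standard_normal_logprob(z):
    return -0.5 * (z ** 2 + torch.log(torch.tensor(2.0 * torch.pi))).sum(dim=-1)

def boltzmann_log_weights(flow, energy, n_samples, dim, kT=1.0):
    z = torch.randn(n_samples, dim)                  # base Gaussian samples
    x, log_det = flow(z)                             # push forward through the flow
    log_q = standard_normal_logprob(z) - log_det     # change of variables: log q(x)
    log_w = -energy(x) / kT - log_q                  # unnormalized log importance weights
    return x, log_w
```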

Kirill Neklyudov (@k_neklyudov) 's Twitter Profile Photo

SuperDiff goes super big!
- Spotlight at #ICLR2025!🥳
- Stable Diffusion XL pipeline on HuggingFace huggingface.co/superdiff/supe… made by Viktor Ohanesian
- New results for molecules in camera-ready arxiv.org/abs/2412.17762
Let's celebrate with a prompt guessing game in the thread👇

Arne Schneuing (@rneschneuing) 's Twitter Profile Photo

The code & camera-ready version of our #ICLR2025 paper on "Multi-domain Distribution Learning for De Novo Drug Design" are now available!
📚 Paper: openreview.net/forum?id=g3VCI…
💻 Code: github.com/LPDI-EPFL/Drug…
(1/4)

Gina El Nesr (@ginaelnesr) 's Twitter Profile Photo

Protein function often depends on protein dynamics. To design proteins that function like natural ones, how do we predict their dynamics?

Hannah Wayment-Steele and I are thrilled to share the first big, experimental datasets on protein dynamics and our new model: Dyna-1! 🧵

VantAI (@vant_ai) 's Twitter Profile Photo

Announcing Neo-1: the world’s most advanced atomistic foundation model, unifying structure prediction and all-atom de novo generation for the first time - to decode and design the structure of life 🧵(1/10)

Majdi Hassan (@majdi_has) 's Twitter Profile Photo

(1/n)🚨You can train a model solving DFT for any geometry almost without training data!🚨 Introducing Self-Refining Training for Amortized Density Functional Theory — a variational framework for learning a DFT solver that predicts the ground-state solutions for different
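
The tweet is cut off, but the "almost without training data" claim presumably relies on a variational objective: if the loss is an energy functional evaluated on the network's own predictions, no precomputed ground-state labels are needed. The sketch below is very schematic and heavily hedged; all names (`energy_functional`, `sample_geometries`, `model`) are placeholders, not the paper's actual objective or architecture.

```python
# Schematic "label-free, variational" training loop: the network maps a geometry
# to a trial solution, and the loss is an energy functional evaluated on that
# prediction, so no precomputed ground-state data is required.
import torch

def train_amortized_solver(model, energy_functional, sample_geometries,
                           steps=1000, lr=1e-4):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        geom = sample_geometries()                    # e.g. randomly perturbed conformers
        trial = model(geom)                           # predicted trial solution
        loss = energy_functional(trial, geom).mean()  # variational energy as the loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```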

Marta Skreta (@martoskreto) 's Twitter Profile Photo

🧵(1/6) Delighted to share our ICML 2025 spotlight paper: the Feynman-Kac Correctors (FKCs) in Diffusion.

Picture this: it's inference time and we want to generate new samples from our diffusion model. But we don't want to just copy the training data – we may want to sample
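
The thread is truncated here, but the general recipe behind inference-time corrections of this kind is sequential Monte Carlo: propagate a population of particles with the pretrained sampler and reweight/resample them with an extra potential. The sketch below is that generic recipe, with placeholder `reverse_step` and `log_potential`, not the Feynman-Kac Correctors' exact weights.

```python
# Generic SMC correction of a diffusion sampler: propagate N particles with the
# pretrained reverse process, accumulate log-weights from an extra potential,
# and resample when the effective sample size degenerates.
import torch

def smc_corrected_sampling(reverse_step, log_potential, x_T, num_steps):
    x = x_T.clone()                                  # particle batch of shape (N, ...)
    log_w = torch.zeros(x.shape[0])
    for t in reversed(range(num_steps)):
        x = reverse_step(x, t)                       # base diffusion update
        log_w = log_w + log_potential(x, t)          # Feynman-Kac-style reweighting
        w = torch.softmax(log_w, dim=0)
        ess = 1.0 / (w ** 2).sum()                   # effective sample size
        if ess < 0.5 * x.shape[0]:
            idx = torch.multinomial(w, num_samples=x.shape[0], replacement=True)
            x, log_w = x[idx], torch.zeros_like(log_w)
    return x
```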

Kirill Neklyudov (@k_neklyudov) 's Twitter Profile Photo

(1/n) Sampling from the Boltzmann density better than Molecular Dynamics (MD)? It is possible with PITA 🫓 Progressive Inference Time Annealing! A spotlight at the GenBio Workshop @ ICML 2025!

PITA learns from "hot," easy-to-explore molecular states 🔥 and then cleverly "cools"

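The "train hot, then cool" idea can be illustrated with the standard annealing identity: samples drawn at temperature T_hot can be reweighted toward a colder Boltzmann target with log-weights -E(x) * (1/T_cold - 1/T_hot). This is only the generic identity, not PITA's learned progressive schedule; `energy` is a placeholder.

```python
# Generic temperature-annealing reweighting (not PITA's learned schedule).
import torch

def annealing_log_weights(energy, x_hot, T_hot, T_cold):
    """Log importance weights for moving samples from temperature T_hot to T_cold."""
    return -energy(x_hot) * (1.0 / T_cold - 1.0 / T_hot)
```
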
Alex Tong (@alexandertong7) 's Twitter Profile Photo

A bit of backstory on PITA: the project started with a key goal—to fix the inherent bias in prior diffusion samplers (like iDEM!). PITA leverages importance sampling to guarantee correctness. This commitment to unbiasedness is what gives PITA its power. See thread for details👇
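
The unbiasedness Alex mentions comes from importance sampling in its textbook form: reweight proposal samples by target/proposal density ratios so expectations under the target are estimated consistently. A toy, self-contained example (illustrative densities only, unrelated to PITA's actual estimator):

```python
# Self-normalized importance sampling: estimate E_target[x] for a shifted
# Gaussian target N(1, 1) using samples from a standard Gaussian proposal.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):        # unnormalized target density: N(1, 1)
    return -0.5 * (x - 1.0) ** 2

def log_proposal(x):      # proposal density: N(0, 1)
    return -0.5 * x ** 2

x = rng.standard_normal(100_000)                 # draws from the proposal
log_w = log_target(x) - log_proposal(x)          # log density ratios
w = np.exp(log_w - log_w.max())                  # stabilized weights
estimate = (w * x).sum() / w.sum()               # self-normalized estimate of E_target[x]
print(estimate)                                  # ≈ 1.0 up to Monte Carlo noise
```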

Joey Bose (@bose_joey) 's Twitter Profile Photo

🎉Personal update: I'm thrilled to announce that I'm joining Imperial College London as an Assistant Professor of Computing starting January 2026. My future lab and I will continue to work on building better Generative Models 🤖, the hardest

Kirill Neklyudov (@k_neklyudov) 's Twitter Profile Photo

1/ Where do Probabilistic Models, Sampling, Deep Learning, and Natural Sciences meet? 🤔 The workshop we're organizing at #NeurIPS2025!
📢 FPI@NeurIPS 2025: Frontiers in Probabilistic Inference – Learning meets Sampling
Learn more and submit → fpiworkshop.org

Mathilde Papillon🦋 mathildepapillon.bsky.social (@mathildepapillo) 's Twitter Profile Photo

Our illustrated guide to non-Euclidean ML is finally published!
Check it out for
⭐️ gorgeous figures (with new additions!) on topology, algebra, and geometry in the field
⭐️ broken-down tables for easy reading
⭐️ accessible text, additional refs, and more
iopscience.iop.org/article/10.108…

Pranam Chatterjee (@pranamanam) 's Twitter Profile Photo

Lots of hype around multimodal FMs, virtual cells (and labs?), all-atom design... I really think core algorithms (not just scale/integration) will solve the next problems in AIxBio. Take Transition Path Sampling: it models transitions for dynamics, optimization, and cell fate. 👇
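
For readers unfamiliar with Transition Path Sampling, its core Monte Carlo update is the "shooting" move, sketched below under heavy simplification (a full implementation also needs an acceptance probability and a real integrator; `propagate`, `in_A`, `in_B`, and `perturb` are placeholders).

```python
# Simplified TPS "shooting" move: perturb the state at a random point of the
# current A->B path, regenerate the trajectory forward and backward, and keep
# the new path only if it still connects A and B.
import random

def shooting_move(path, propagate, in_A, in_B, perturb):
    i = random.randrange(1, len(path) - 1)          # pick an interior shooting point
    x_new = perturb(path[i])                        # e.g. resample momenta
    fwd = propagate(x_new, steps=len(path) - 1 - i)        # list of future states
    bwd = propagate(x_new, steps=i, reverse=True)          # list of past states
    new_path = list(reversed(bwd)) + [x_new] + fwd
    if in_A(new_path[0]) and in_B(new_path[-1]):    # still a reactive path?
        return new_path                             # accept
    return path                                     # reject, keep the old path
```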

Phil Fradkin (@phil_fradkin) 's Twitter Profile Photo

The news is out! We're starting Blank Bio to build a computational toolkit assisted by RNA foundation models. If you want to see me flip between being eerily still and overly animated, check out the video below! The core hypothesis is that RNA is the most customizable molecule