Weilong Chen (@vollon3)'s Twitter Profile
Weilong Chen

@vollon3

PhD Student @TU_Muenchen. Previously: @TheDPTechnology | @chalmersuniv | @VolvoGroup | BSc. NUDT '18
🦋: weilong30.bsky.social

ID: 1103932792608481280

Joined: 08-03-2019 08:17:42

28 Tweets

63 Followers

934 Following

Simon Olsson (@smnlssn):

The Chalmers AI4Science speakers for the spring term have just been announced; please check the homepage for all the details: psolsson.github.io/AI4ScienceSemi…

charliebtan (@charliebtan):

New preprint! 🚨 We scale equilibrium sampling to hexapeptide (in Cartesian coordinates!) with Sequential Boltzmann generators! 📈 🤯

Work with Joey Bose, Chen Lin, Leon Klein, Michael Bronstein and Alex Tong

Thread 🧵 1/11
Peter Holderrieth (@peholderrieth):

Our MIT class “6.S184: Introduction to Flow Matching and Diffusion Models” is now available on YouTube!

We teach state-of-the-art generative AI algorithms for images, videos, proteins, etc., together with the mathematical tools to understand them.

diffusion.csail.mit.edu

(1/4)
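
For context on what the course covers: flow matching trains a neural velocity field to transport noise samples to data along simple interpolation paths. Below is a minimal, generic conditional flow matching training loss in PyTorch; the `velocity_net` module and tensor shapes are illustrative assumptions, not the course's reference code.

```python
# Minimal conditional flow matching sketch (generic illustration, not the course's code).
# Assumes a user-defined velocity_net(x, t) returning a velocity with the same shape as x.
import torch

def flow_matching_loss(velocity_net, x1):
    """One training step of conditional flow matching with linear interpolation paths.

    x1: batch of data samples, shape (B, D).
    Path: x_t = (1 - t) * x0 + t * x1 with x0 ~ N(0, I),
    whose conditional target velocity is simply x1 - x0.
    """
    x0 = torch.randn_like(x1)                 # noise endpoint of the path
    t = torch.rand(x1.shape[0], 1)            # uniform time in [0, 1], broadcast over features
    xt = (1 - t) * x0 + t * x1                # point on the interpolation path
    target_v = x1 - x0                        # conditional velocity of that path
    pred_v = velocity_net(xt, t)              # model's predicted velocity
    return ((pred_v - target_v) ** 2).mean()  # regress the model onto the target
```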
Chaitanya K. Joshi @ICLR2025 🇸🇬 (@chaitjo):

Introducing All-atom Diffusion Transformers

— towards Foundation Models for generative chemistry, from my internship with the FAIR Chemistry team at AI at Meta

There are a couple of ML ideas in here which I think are new and exciting 👇
Ross (@rssrwn):

Very excited to share our work on FLOWR! We really focused on making gen models for SBDD more practical with:
1. Much faster generation times
2. Improved 3D pose quality
3. The ability to condition on info from reference binders, e.g. fragments, interactions, or scaffolds.

Christopher Kolloff (@chrisdkolloff):

New preprint alert 🚨
How can you guide diffusion and flow-based generative models when data is scarce but you have domain knowledge? We introduce Minimum Excess Work, a physics-inspired method for efficiently integrating sparse constraints.
Thread below 👇 arxiv.org/abs/2505.13375
Aditi Krishnapriyan (@ask1729):

1/ Generating transition pathways (e.g., folded ↔ unfolded protein) is a huge challenge: we tackle this by combining the scalability of pre-trained, score-based generative models and statistical mechanics insights: no training required! To appear at #ICML2025

Simon Olsson (@smnlssn):

New pre-print from PhD student Hang Zou on warm-starting the variational quantum eigensolver using flows: Flow-VQE! Flow-VQE is parameter transfer on steroids: it learns how to solve a family of related problems, dramatically reducing the aggregate compute cost!
Jorge Bravo (@bravo_abad):

Enhancing machine learning potentials via transfer learning

Transfer learning enables models to capitalize on previously acquired insights, thereby significantly reducing data requirements, training times, and computational costs. Sebastien Röcken and Julija Zavadlav introduce a
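
The quoted text is cut off above, but as a rough illustration of the transfer-learning recipe it describes (independent of Röcken and Zavadlav's specific setup): a common approach is to initialise a machine-learning potential from weights pre-trained on a large, cheaper dataset and fine-tune only part of the network on a small amount of higher-fidelity target data. A hedged PyTorch sketch, where the model's `encoder`/`head` attributes, the checkpoint path, and the data loader are all hypothetical placeholders:

```python
# Generic transfer-learning sketch for an ML potential (illustrative only;
# the model structure, checkpoint, and data loader are hypothetical placeholders).
import torch

def fine_tune(model, pretrained_path, target_loader, lr=1e-4, epochs=10):
    # Start from weights learned on a large, lower-fidelity dataset.
    model.load_state_dict(torch.load(pretrained_path))

    # Freeze the representation layers; only adapt the readout head,
    # which typically needs far less target data to converge.
    for p in model.encoder.parameters():
        p.requires_grad = False

    opt = torch.optim.Adam(model.head.parameters(), lr=lr)
    for _ in range(epochs):
        for coords, energies in target_loader:   # small high-fidelity dataset
            loss = ((model(coords) - energies) ** 2).mean()
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```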
Tony RuiKang OuYang (@tonyrkouyang):

Excited to share our new paper accepted by ICML 2025 👉 “PTSD: Progressive Tempering Sampler with Diffusion”, which aims to make sampling from unnormalised densities more efficient than state-of-the-art methods like parallel tempering.

Check our thread below 👇
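
For context on the baseline named in the tweet: in standard parallel tempering, replicas are simulated at different temperatures and periodically attempt to exchange configurations via a Metropolis criterion, so hot replicas help cold ones cross energy barriers. A minimal NumPy sketch of the swap step (the `energy` function is a placeholder, and this illustrates plain parallel tempering, not the PTSD method itself):

```python
# Standard parallel tempering swap move (illustrative; energy() is a placeholder).
import numpy as np

def attempt_swap(x_i, x_j, beta_i, beta_j, energy, rng=np.random.default_rng()):
    """Metropolis criterion for exchanging configurations between two replicas
    at inverse temperatures beta_i and beta_j."""
    delta = (beta_i - beta_j) * (energy(x_i) - energy(x_j))
    if np.log(rng.random()) < delta:   # accept with probability min(1, exp(delta))
        return x_j, x_i                # swap the configurations
    return x_i, x_j                    # reject: keep them where they are
```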
Google DeepMind (@googledeepmind):

What if you could not only watch a generated video, but explore it too? 🌐 Genie 3 is our groundbreaking world model that creates interactive, playable environments from a single text prompt. From photorealistic landscapes to fantasy realms, the possibilities are endless. 🧵

MolSS Reading Group (@molss_group):

This coming Tuesday (Aug 19th), we will have Maximilian Stupp talking about “Energy-Based Coarse-Graining in Molecular Dynamics: A Flow-Based Framework Without Data” (arxiv.org/abs/2504.20940) 🚀, from 4pm to 5pm (UK time). Join us via Zoom: us05web.zoom.us/j/7780256206?p…

MolSS Reading Group (@molss_group):

This coming Tuesday (Aug 26th), we will have Yuchen Zhu talking about “Beyond Euclidean data: Lie group and multimodal diffusion models” 🚀, from 5pm to 6pm (UK time). Join us via Zoom: us05web.zoom.us/j/7780256206?p… See more information below 👇

MolSS Reading Group (@molss_group):

MolSS is coming back after the ICLR submission deadline 🚀 This coming Tuesday (7th Oct), we will have Christopher Kolloff and Tobias Höppe talking about “Minimum Excess Work”, from 4pm to 5pm (UK time) 🔥 Join us via Zoom: us05web.zoom.us/j/7780256206?p…

Simon Olsson (@smnlssn):

New pre-print from the lab -- Marginal Girsanov Reweighting: Stable Variance Reduction via Neural Ratio Estimation. We introduce Marginal Girsanov Reweighting (MGR), a way to get more stable Girsanov weights for long time horizons and large systems.
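
For background on the quantity MGR stabilizes: plain Girsanov reweighting assigns each simulated path an importance weight between two SDEs that share a diffusion coefficient but differ in drift, and these weights become high-variance for long trajectories and large systems. A minimal 1D Euler-Maruyama sketch of the standard per-path log-weight (the generic textbook estimator, not the MGR method; the drift functions and sigma are placeholders):

```python
# Standard Girsanov log-weight for reweighting a 1D path simulated under drift b
# onto the process with drift b_tilde (same noise amplitude sigma).
# Illustrative only: this is the plain estimator, not Marginal Girsanov Reweighting.
import numpy as np

def girsanov_log_weight(xs, dt, b, b_tilde, sigma):
    """xs: trajectory of shape (T+1,), simulated with Euler-Maruyama under drift b.
    b and b_tilde are vectorized callables evaluated at each trajectory point."""
    x, x_next = xs[:-1], xs[1:]
    dx = x_next - x
    u = (b_tilde(x) - b(x)) / sigma    # drift difference, scaled by the noise amplitude
    dw = (dx - b(x) * dt) / sigma      # increments of the driving Brownian motion under b
    # log dQ/dP = sum(u * dW) - 0.5 * sum(u^2 * dt)
    return np.sum(u * dw) - 0.5 * np.sum(u ** 2 * dt)
```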
Simon Olsson (@smnlssn):

New preprint out!
We present "Transferable Generative Models Bridge Femtosecond to Nanosecond Time-Step Molecular Dynamics,"
Biology+AI Daily (@biologyaidaily):

Enhancing Sampling for Efficient Learning of Coarse-Grained Machine Learning Potentials

1. A new method for improving the accuracy and efficiency of coarse-grained molecular dynamics simulations has been proposed by Weilong Chen and colleagues. The study introduces an enhanced