Julius Berner @ICLR‘25 (@julberner) 's Twitter Profile
Julius Berner @ICLR‘25

@julberner

Research scientist @nvidia | postdoc @caltech | PhD @univienna | former research intern @MetaAI and @nvidia | views are my own

ID: 1039458967112613888

Link: https://jberner.info · Joined: 11-09-2018 10:21:44

136 Tweets

930 Followers

286 Following

Jean Kossaifi (@jeankossaifi) 's Twitter Profile Photo

Introducing NeuralOperator 1.0: a Python library that aims to democratize neural operators for scientific applications by providing all the tools for learning neural operators in PyTorch: state-of-the-art models, built-in trainers for a quick start, and modular neural operator

Jean Kossaifi (@jeankossaifi) 's Twitter Profile Photo

This release was long in the making and the result of a large group effort. Check out our white paper: arxiv.org/abs/2412.10354 With Zongyi Li, Nikola Kovachki, David Pitt, Miguel Liu-Schiaffini, @Robertljg, Boris Bonev, Kamyar Azizzadenesheli, Julius Berner and Prof. Anima Anandkumar
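For readers unfamiliar with neural operators: the core building block in models like the Fourier Neural Operator is a spectral convolution: transform the input to Fourier space, keep only the lowest frequency modes, scale them by learned weights, and transform back. Below is a toy pure-Python sketch of that idea (naive DFT, random stand-in weights); it is a conceptual illustration, not the NeuralOperator library's actual API.

```python
import cmath
import random

def dft(x):
    """Naive discrete Fourier transform of a real sequence."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def spectral_conv(x, weights, n_modes):
    """Fourier layer: filter in frequency space, keeping only the n_modes lowest modes."""
    X = dft(x)
    out = [0j] * len(X)
    for k in range(n_modes):  # truncate: only the lowest modes carry parameters
        out[k] = weights[k] * X[k]
    return idft(out)

random.seed(0)
n, n_modes = 32, 4
x = [random.gauss(0, 1) for _ in range(n)]
w = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n_modes)]
y = spectral_conv(x, w, n_modes)
print(len(y))  # output lives on the same grid as the input
```

Because the learned weights are attached to frequency modes rather than grid points, the same layer can in principle be evaluated at any input resolution, which is the sense in which neural operators learn maps between function spaces.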

malkin1729 (@felineautomaton) 's Twitter Profile Photo

Happy to share our latest work on #diffusion models without data: building theoretical bridges between existing methods, analysing their continuous-time asymptotics, and showing some cool practical implications. arxiv.org/abs/2501.06148 #MachineLearning 1/9

Marcin Sendera (@marcinsendera) 's Twitter Profile Photo

Happy to share one of my latest works! If you are interested in diffusion samplers, please take a look🙃! Many thanks to all my colleagues for their intensive work and fruitful collaboration, especially to malkin1729 for leading this project! Stay tuned for future ones!

Lorenz Richter @ICLR'25 (@lorenz_richter) 's Twitter Profile Photo

Our new work arxiv.org/pdf/2501.06148 studies connections between discrete and continuous time diffusion samplers - after all, most things are very much related (GFlowNets, optimal control, path measures, PDEs). This allows us to reach faster convergence by randomized time steps.
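The idea of randomized time steps can be illustrated on a toy SDE. The sketch below is my own illustration, not the paper's algorithm: it integrates the Ornstein-Uhlenbeck process dX = -X dt + sqrt(2) dW with Euler-Maruyama on a randomly drawn (non-uniform) time grid; the stationary law is N(0, 1).

```python
import math
import random

def sample_ou_randomized(n_steps, T=5.0, seed=0):
    """Simulate dX = -X dt + sqrt(2) dW on [0, T] via Euler-Maruyama,
    using a randomized time grid instead of uniform steps."""
    rng = random.Random(seed)
    # Randomized grid: sorted uniform draws give random step sizes.
    grid = sorted(rng.uniform(0, T) for _ in range(n_steps - 1))
    times = [0.0] + grid + [T]
    x = 3.0  # start deliberately far from the stationary mean
    for t0, t1 in zip(times, times[1:]):
        dt = t1 - t0
        x += -x * dt + math.sqrt(2 * dt) * rng.gauss(0, 1)
    return x

samples = [sample_ou_randomized(200, seed=s) for s in range(2000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(round(mean, 2), round(var, 2))  # should be close to 0 and 1
```

The only change versus a standard discretization is the random grid; in the paper's continuous-time view, such randomization can reduce discretization bias of the sampler.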

Prof. Anima Anandkumar (@animaanandkumar) 's Twitter Profile Photo

Thrilled to announce our paper "Robust Representation Consistency Model (rRCM)" was accepted to #ICLR2025. It combines contrastive learning with consistency training to enhance robust representation learning and sets a new standard in certified robustness. Jiachen Lei Julius Berner

pan tom (@daspantom) 's Twitter Profile Photo

We are hiring an intern to work on generative models for proteins for 6-12 months; the internship typically results in a publication. Find out what science is like in a bio-ML industry research lab. Bonus: float along the Rhine to work (a perfectly normal commute here): tinyurl.com/3x7ef6z5

Shreyas Padhy (@shreyaspadhy) 's Twitter Profile Photo

Thanks for the kind words Arnaud Doucet ! I wanted to shout-out some other great work in the same vein as us - arxiv.org/abs/2501.06148 (Julius Berner @ICLR‘25, Lorenz Richter, Marcin Sendera et al) arxiv.org/abs/2410.02711 (Michael Albergo @ICLR2025 et al) arxiv.org/abs/2412.07081 (Junhua Chen et al)

Lorenz Richter @ICLR'25 (@lorenz_richter) 's Twitter Profile Photo

Our new work arxiv.org/pdf/2503.01006 extends the theory of diffusion bridges to degenerate noise settings, including underdamped Langevin dynamics (with Denis Blessing, Julius Berner). This enables more efficient diffusion-based sampling with substantially fewer discretization steps.
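For intuition, underdamped Langevin dynamics augments the state with a velocity, which often mixes faster than its overdamped counterpart. Here is a toy sketch (a simple Euler discretization targeting a standard Gaussian; not the paper's degenerate-noise bridge construction):

```python
import math
import random

def underdamped_langevin(n_steps=1500, dt=0.01, gamma=1.0, seed=0):
    """Euler scheme for underdamped Langevin dynamics
        dx = v dt,   dv = (-U'(x) - gamma * v) dt + sqrt(2 * gamma) dW,
    whose x-marginal targets exp(-U(x)). Here U(x) = x^2 / 2, i.e. N(0, 1)."""
    rng = random.Random(seed)
    x, v = 2.0, 0.0
    for _ in range(n_steps):
        v += (-x - gamma * v) * dt + math.sqrt(2 * gamma * dt) * rng.gauss(0, 1)
        x += v * dt  # noise enters only through v: the diffusion is degenerate
    return x

# Run many independent chains; the empirical law of x should approach N(0, 1).
xs = [underdamped_langevin(seed=s) for s in range(300)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(round(mean, 1), round(var, 1))
```

Note that noise drives only the velocity coordinate, so the diffusion matrix is singular; that degeneracy is exactly the setting the paper's extended bridge theory has to handle.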

Julius Berner @ICLR‘25 (@julberner) 's Twitter Profile Photo

Reach out if you’re also at #ICLR25 and drop by our posters: 1️⃣Diffusion samplers & SMC: x.com/julberner/stat… 2️⃣Contrastive learning & consistency models: x.com/animaanandkuma… 3️⃣Underdamped diffusions (also come to our oral at the FPI workshop): x.com/lorenz_richter…

Arash Vahdat (@arashvahdat) 's Twitter Profile Photo

🔥📔 This week at #ICLR2025, our fundamental generative AI research team (GenAIR) is (co-)presenting 11 papers, 6 of which were developed or led primarily by our team members. Below, I am listing our main papers with a one-sentence summary.

Karsten Kreis (@karsten_kreis) 's Twitter Profile Photo

🔥 I'm at ICLR'25 in Singapore this week - happy to chat! 📜 With wonderful co-authors, I'm co-presenting 4 main conference papers and 3 GEMBio Workshop papers (gembio.ai), and I contribute to a panel (synthetic-data-iclr.github.io). 🧵 Overview in thread. (1/n)

Alex Tong (@alexandertong7) 's Twitter Profile Photo

FPI workshop off to a great start with Emtiyaz Khan talking about Adaptive Bayesian Intelligence! Come check it out in Peridot 202-203 #FPIWorkshop #ICLR25

Prof. Anima Anandkumar (@animaanandkumar) 's Twitter Profile Photo

Excited to introduce our latest work, Guided Diffusion Sampling on Function Spaces (FunDPS) (arxiv.org/abs/2505.17004) - a discretization-agnostic generative framework for solving PDE-based forward and inverse problems. Diffusion-based posterior sampling on function spaces: Our

Wenda Chu (@wendachu32619) 's Twitter Profile Photo

I'm excited to share our new work, DAPS, an inference-time diffusion sampler for solving general inverse problems without fine-tuning. DAPS significantly improves sample quality across multiple image restoration tasks, particularly in complicated nonlinear inverse problems. DAPS