Guan-Horng Liu (@guanhorng_liu) 's Twitter Profile
Guan-Horng Liu

@guanhorng_liu

Research Scientist @MetaAI (FAIR NYC) • Schrödinger Bridge, diffusion, flow, stochastic optimal control • prev ML PhD @GeorgiaTech 🚀

ID: 908526953061392385

Link: http://ghliu.github.io • Joined: 15-09-2017 03:04:40

202 Tweets

749 Followers

324 Following

Gabriel Peyré (@gabrielpeyre) 's Twitter Profile Photo

In case you are wondering, this paper proves that, in general, diffusion models do not define optimal transport maps. The proof is not straightforward though (diffusion maps are optimal maps in 1D, for radial measures, and for Gaussians ...) cvgmt.sns.it/media/doc/pape…

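The Gaussian case mentioned in parentheses is easy to check numerically. Below is a minimal 1D sketch (my own toy, not from the paper): for Gaussian data under a VP-type forward SDE, the probability-flow ODE map agrees with the closed-form monotone OT map between the same two marginals. The constants beta, m0, s0 are arbitrary choices.

```python
# Toy check (1D, Gaussian): the probability-flow ODE map of a VP diffusion
# coincides with the optimal transport (monotone) map between its marginals.
# Minimal sketch, not from the paper; beta, m0, s0 are arbitrary choices.
import numpy as np

beta, m0, s0 = 1.0, 2.0, 0.5        # constant noise schedule; data ~ N(m0, s0^2)

def moments(t):
    """Mean/variance of the VP-SDE marginal p_t for Gaussian data."""
    a = np.exp(-0.5 * beta * t)     # signal coefficient alpha_t
    return a * m0, a**2 * s0**2 + (1.0 - a**2)

def flow_map(x, T=1.0, n=4000):
    """Integrate the probability-flow ODE dx/dt = -b/2 x - b/2 * score."""
    dt, t = T / n, 0.0
    for _ in range(n):
        mu, v = moments(t)
        score = -(x - mu) / v       # Gaussian marginal => closed-form score
        x = x + dt * (-0.5 * beta * x - 0.5 * beta * score)
        t += dt
    return x

x = np.linspace(m0 - 3 * s0, m0 + 3 * s0, 7)   # test points under p_0
muT, vT = moments(1.0)
ot = muT + np.sqrt(vT / s0**2) * (x - m0)      # 1D Gaussian OT (monotone) map
print(np.max(np.abs(flow_map(x) - ot)))        # small: Euler error only
```

In 1D the flow map here is affine and increasing, hence it is the monotone rearrangement, i.e. the OT map; per the tweet, the paper shows this agreement fails in general.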
Alexander H. Liu (@alex_h_liu) 's Twitter Profile Photo


Presenting 2 works at #ICLR tomorrow!

📃Generative Pre-training for Speech with Flow Matching
📍5/9 (Wed) Hall B #68, 10:45am-12:45pm

📃Listen, Think, and Understand
📍5/9 (Wed) Hall B #60, 4:30pm-6:30pm

Please stop by if you're interested! More details...👇
Wei Deng (@dwgreyman) 's Twitter Profile Photo

[1/4] Glad to see the Variational Schrödinger Diffusion Model arxiv.org/pdf/2405.04795 is accepted at ICML'24. We made Schrödinger diffusion more scalable by linearizing the forward diffusion via variational inference and deriving an appealing closed-form update of the backward scores.

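For context on why a linearized forward diffusion helps: a linear SDE has Gaussian transition kernels in closed form, which is what makes score targets tractable. A generic sketch of that fact (not the VSDM algorithm itself; the constants a, s, x0 are arbitrary):

```python
# Generic illustration (not the paper's algorithm): a *linear* forward SDE
# dx = a x dt + s dW has a Gaussian transition kernel in closed form.
import numpy as np

a, s, x0, T, n, paths = -0.8, 0.7, 1.5, 1.0, 1000, 200_000
rng = np.random.default_rng(0)
dt = T / n
x = np.full(paths, x0)
for _ in range(n):                          # Euler-Maruyama simulation
    x = x + a * x * dt + s * np.sqrt(dt) * rng.standard_normal(paths)

mean = np.exp(a * T) * x0                        # closed-form transition mean
var = s**2 * (np.exp(2 * a * T) - 1) / (2 * a)   # closed-form transition variance
print(x.mean(), mean)                       # empirical vs analytic mean
print(x.var(), var)                         # empirical vs analytic variance
# => score target d/dx log p_T(x | x0) = -(x - mean) / var is explicit
```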
Frank Nielsen (@frnknlsn) 's Twitter Profile Photo


Well-known: Kullback-Leibler divergence between two normals p and q amounts to a Bregman divergence
⬇️
New: KLD between *truncated* normals p and q with support(p) ⊆ support(q) amounts to a generalized Bregman pseudo-divergence with two generators
👉 mdpi.com/1099-4300/24/3…
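The "well-known" half is easy to verify numerically: on natural parameters of an exponential family, KL(p:q) equals the Bregman divergence B_F(θ_q : θ_p) of the log-normalizer F. A small check for 1D Gaussians (my own sketch; the paper treats the harder truncated case):

```python
# Numerical check: for exponential families, KL(p:q) = B_F(theta_q : theta_p),
# the Bregman divergence of the log-normalizer F on natural parameters.
# Shown here for 1D Gaussians.
import numpy as np

def natural(mu, var):                      # theta = (mu/var, -1/(2 var))
    return np.array([mu / var, -0.5 / var])

def F(th):                                 # Gaussian log-normalizer
    return -th[0]**2 / (4 * th[1]) + 0.5 * np.log(-np.pi / th[1])

def gradF(th):                             # expectation params (E[x], E[x^2])
    mu, var = -th[0] / (2 * th[1]), -0.5 / th[1]
    return np.array([mu, mu**2 + var])

def bregman(tq, tp):
    return F(tq) - F(tp) - gradF(tp) @ (tq - tp)

mu_p, var_p, mu_q, var_q = 0.3, 1.5, -1.0, 0.8
kl = (0.5 * np.log(var_q / var_p)
      + (var_p + (mu_p - mu_q)**2) / (2 * var_q) - 0.5)   # closed-form KL
print(kl, bregman(natural(mu_q, var_q), natural(mu_p, var_p)))  # equal
```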
Guan-Horng Liu (@guanhorng_liu) 's Twitter Profile Photo

🚨🚨 Don't forget to submit your manuscripts to the #Structured #Probabilistic #Inference & #Generative #Modeling workshop in #ICML2024! 🗓️ Deadline: May 25, AoE 👀 Link & info: spigmworkshop2024.github.io/submissions/

Guan-Horng Liu (@guanhorng_liu) 's Twitter Profile Photo

🚨🚨 The deadline for the #SPIGM workshop at #ICML2024 is now May 27, AoE, with the page limit relaxed from 4 to 8 pages. For any questions, join our Slack (faster) or send us an email (slower). Slack invite link: see spigmworkshop2024.github.io ✉️: [email protected]

Keenan Crane (@keenanisalive) 's Twitter Profile Photo

Interested in learning about differential geometry and its connection to geometric computing? All material from the Carnegie Mellon University course on #DiscreteDifferentialGeometry has been collected on a new webpage (videos, code, exercises, etc.). Check it out! geometry.cs.cmu.edu/ddg

Yuchen Zhu (@yuchen4975) 's Twitter Profile Photo

(1/6) Can we generate data on a manifold without worrying about its curved geometry 🤔? The Trivialized Diffusion Model (TDM) provides a way for Lie groups. It introduces a trivialized momentum variable that allows a Euclidean score function (easier to learn and use) 🎯! arxiv.org/abs/2405.16381
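For a feel of what "trivialization" buys: quantities live in the flat Lie algebra and are pushed onto the group with the exponential map, so iterates never leave the manifold. A toy random walk on SO(3) in that style (my own illustration, not the TDM algorithm):

```python
# Toy sketch of "trivialization" (not TDM itself): random motion on SO(3)
# driven by Euclidean noise in the Lie algebra so(3) =~ R^3, pushed to the
# group with the matrix exponential, so the state never leaves the manifold.
import numpy as np
from scipy.linalg import expm

def hat(w):                                # R^3 -> so(3) (skew-symmetric)
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

rng = np.random.default_rng(0)
R = np.eye(3)                              # start at the identity rotation
dt = 0.01
for _ in range(1000):
    xi = np.sqrt(dt) * rng.standard_normal(3)   # Euclidean "momentum" noise
    R = R @ expm(hat(xi))                  # left-trivialized update on SO(3)

print(np.allclose(R.T @ R, np.eye(3), atol=1e-8))  # still a rotation: True
print(np.linalg.det(R))                    # ~ 1.0
```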

Lorenz Richter (@lorenz_richter) 's Twitter Profile Photo

I gave a talk on our latest work on the connections between dynamical systems, PDEs, control, and path space measures for sampling from densities at the Fields Institute in Toronto last week (with Julius Berner, Jingtong (Jeff) Sun). You can find the recording here: youtube.com/watch?v=ue8liZ…

Brandon Amos (@brandondamos) 's Twitter Profile Photo

📢 In our new UAI 2024 paper, we do neural optimal transport with costs defined by a Lagrangian (e.g., for physical knowledge, constraints, and geodesics) Paper: arxiv.org/abs/2406.00288 JAX Code: github.com/facebookresear… (w/ A. Pooladian, C. Domingo-Enrich, Ricky T. Q. Chen)
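To illustrate what a Lagrangian-defined cost means (a toy sketch only; the paper's method is neural and amortized in JAX, and this mass function m is an invented example): c(x, y) is the minimal action of a path from x to y, here minimized directly over a discretized path with finite-difference gradient descent.

```python
# Toy Lagrangian ground cost: c(x, y) = min over paths of the action of
# L(x, xdot) = 0.5 * m(x) * |xdot|^2, with a costly "bump" the path avoids.
import numpy as np

c = np.array([0.0, 0.3])                    # center of the costly bump

def m(p):                                   # position-dependent mass term
    d = p - c
    return 1.0 + 4.0 * np.exp(-8.0 * (d @ d))

def action(path, dt):
    seg = np.diff(path, axis=0)             # path: (K+1, 2) array of points
    mid = 0.5 * (path[:-1] + path[1:])
    return sum(0.5 * m(q) * (v @ v) / dt for q, v in zip(mid, seg))

def lagrangian_cost(x, y, K=16, iters=2000, lr=0.003, eps=1e-5):
    path, dt = np.linspace(x, y, K + 1), 1.0 / K
    for _ in range(iters):                  # finite-difference gradient
        base = action(path, dt)             # descent on interior points
        grad = np.zeros_like(path)
        for i in range(1, K):
            for d in range(2):
                pert = path.copy()
                pert[i, d] += eps
                grad[i, d] = (action(pert, dt) - base) / eps
        path -= lr * grad
    return action(path, dt), path

x, y = np.array([-1.0, 0.0]), np.array([1.0, 0.0])
cost, path = lagrangian_cost(x, y)
print(action(np.linspace(x, y, 17), 1.0 / 16), cost)  # straight line vs optimized
```

The optimized action comes out below the straight-line action because the path bends away from the bump, which is exactly the kind of prior knowledge a Lagrangian cost encodes.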

Keenan Crane (@keenanisalive) 's Twitter Profile Photo

Need to solve PDEs, and struggle with meshing? Heard about "Walk on Spheres," but didn't know where to start? Check out the awesome intro course by Rohan Sawhney and Bailey Miller, just posted from #SGP2024: youtube.com/watch?v=1u-5b4…
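The core of Walk on Spheres fits in a few lines. A standard textbook-style sketch (not code from the course): solve the Laplace equation on the unit disk with harmonic boundary data, so the exact solution is the boundary function itself.

```python
# Walk on Spheres for the Laplace equation on the unit disk: jump to a
# uniform point on the largest sphere inside the domain until the walk is
# within eps of the boundary, then average the boundary values.
import numpy as np

rng = np.random.default_rng(0)

def g(p):                                  # boundary data (harmonic in 2D)
    return p[0]**2 - p[1]**2

def walk_on_spheres(p, eps=1e-4, n_walks=20_000):
    total = 0.0
    for _ in range(n_walks):
        x = p.copy()
        while True:
            r = 1.0 - np.linalg.norm(x)    # distance to the disk boundary
            if r < eps:                    # close enough: read off boundary
                total += g(x / np.linalg.norm(x))
                break
            theta = rng.uniform(0, 2 * np.pi)
            x = x + r * np.array([np.cos(theta), np.sin(theta)])
    return total / n_walks

p = np.array([0.3, 0.4])
print(walk_on_spheres(p), g(p))            # Monte Carlo estimate vs exact -0.07
```

No mesh is ever built: each walk only needs the distance to the boundary, which is the whole appeal of the method.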

Yuanqi Du (@yuanqid) 's Twitter Profile Photo

If you are traveling to ICML Conference 2024 next week, don't miss our workshop on structured prob. inference and generative modeling on Friday in room Lehar 3! We have a stellar list of speakers to share the frontiers of sampling, Bayesian inference, generative models, and beyond!

Dinghuai Zhang 张鼎怀 (@zdhnarsil) 's Twitter Profile Photo

I am going to attend #ICML2024 and organize SPIGM workshop on July 26th ✈️ If you wanna talk about generative modeling, probabilistic inference, etc, please consider joining us 🙉 Check our schedule at spigmworkshop2024.github.io

Brandon Amos (@brandondamos) 's Twitter Profile Photo

Some related papers for our recent Lagrangian OT: 0. On amortizing convex conjugates for OT 1. Neural Lagrangian Schrödinger Bridge 2a. Deep Generalized Schrödinger Bridge 2b. DGSB Matching 3. Wasserstein Lagrangian Flows 4. Metric learning via OT A 🧵 summarizing these ❤️

James Thornton (@jamestthorn) 's Twitter Profile Photo

Is optimal transport always optimal? By learning ground costs (ICNNs+flows) one can structure OT maps to use prior information Brilliant PhD student, Samuel Howard, will be presenting Cost-Parameterized Monge Maps at the Differentiable AE workshop differentiable.xyz/papers-2024/pa…

Frank Nielsen (@frnknlsn) 's Twitter Profile Photo


At the maximum likelihood estimator,
observed Fisher information = Fisher information

From the 2nd-order Taylor expansion of the likelihood:

- likelihood curvature = Fisher information
- radius of osculating circle = variance of the MLE for large sample size

👉 doi.org/10.3390/e22101…
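A quick numerical companion (my own example, using a Bernoulli model, not taken from the paper): the observed information -l''(p̂) coincides with n·I(p̂) at the MLE, and the MLE's variance approaches 1/(n·I(p)) for large n.

```python
# Bernoulli check: observed information equals n * I(p_hat) at the MLE, and
# Var(p_hat) matches the asymptotic 1 / (n * I(p)) over many replications.
import numpy as np

rng = np.random.default_rng(0)
p_true, n, reps = 0.3, 2000, 20_000
xbar = rng.binomial(n, p_true, size=reps) / n          # MLEs over replications

def observed_info(phat, n):       # -l''(phat) for the Bernoulli log-likelihood
    return n * (phat / phat**2 + (1 - phat) / (1 - phat)**2)

def fisher_info(p):                # I(p) = 1 / (p (1 - p)) per observation
    return 1.0 / (p * (1.0 - p))

phat = xbar[0]
print(observed_info(phat, n), n * fisher_info(phat))   # identical at the MLE
print(xbar.var(), 1.0 / (n * fisher_info(p_true)))     # ~= asymptotic variance
```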
Guan-Horng Liu (@guanhorng_liu) 's Twitter Profile Photo

📢 Life update: I've joined AI at Meta (#FAIR #NYC) as #ResearchScientist this week 🍎! Extremely grateful to everyone who's supported me along the way 🙂 I'll keep working on flow/diffusion for structural problems, Schrödinger bridges, optimization, stochastic control, & more 🏃🏻‍♂️