Théo Uscidda (@theo_uscidda) 's Twitter Profile
Théo Uscidda

@theo_uscidda

PhD @ENSAEparis | currently @AmazonScience fundamental research | past @FlatironInst, @fabian_theis lab @HelmholtzMunich.

ID: 1716834060306382848

Link: https://theouscidda6.github.io/ | Joined: 24-10-2023 15:08:41

53 Tweets

246 Followers

234 Following

Tom Sander @NeurIPS (@rednastom) 's Twitter Profile Photo

🎉Exciting news from AI at Meta FAIR! We've released a Watermark Anything Model under the MIT license! It was announced yesterday: ai.meta.com/blog/meta-fair… Great project with Pierre Fernandez et al. ! We're close to hitting 1,000 stars on GitHub. Give it a try: github.com/facebookresear… 🚀

arXiv math.ST Statistics Theory (@mathstb) 's Twitter Profile Photo

Nayel Bettache: Bivariate Matrix-valued Linear Regression (BMLR): Finite-sample performance under Identifiability and Sparsity Assumptions arxiv.org/abs/2412.17749 arxiv.org/pdf/2412.17749

CNRS Sciences informatiques (@cnrsinformatics) 's Twitter Profile Photo

#Optimisation | Gabriel Peyré, CNRS research director at the DMA, speaks at the optimisation conference about what #optimal transport brings to our understanding of #genomics. ➡️ youtube.com/watch?v=vQOF-3… 🤝 École normale supérieure | PSL

Arnaud Doucet (@arnauddoucet1) 's Twitter Profile Photo

Speculative sampling accelerates inference in LLMs by drafting future tokens which are verified in parallel. With Valentin De Bortoli , Alexandre Galashov & Arthur Gretton, we extend this approach to (continuous-space) diffusion models: arxiv.org/abs/2501.05370
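For readers unfamiliar with the technique: speculative sampling uses a cheap draft model to propose several tokens, then the target model verifies them all at once, accepting each with probability min(1, p/q) and resampling rejections from the residual distribution. A minimal toy sketch (not the paper's diffusion-model extension; all names and distributions here are illustrative):

```python
import numpy as np

def speculative_step(p_target, p_draft, k, rng):
    """One round of toy speculative sampling over a small vocabulary.

    p_target / p_draft map a context (list of token ids) to a probability
    vector. The draft model proposes k tokens autoregressively; the target
    model then verifies them, accepting each with prob min(1, p/q) and
    resampling rejections from the residual max(p - q, 0), which keeps
    the output exactly distributed according to the target model.
    """
    # Draft phase: sample k tokens cheaply from the draft model.
    proposals = []
    for _ in range(k):
        q = p_draft(proposals)
        proposals.append(int(rng.choice(len(q), p=q)))

    # Verification phase (a single batched target-model pass in practice).
    accepted = []
    for i, tok in enumerate(proposals):
        p = p_target(proposals[:i])
        q = p_draft(proposals[:i])
        if rng.random() < min(1.0, p[tok] / q[tok]):
            accepted.append(tok)
        else:
            # Rejected: resample from the normalized residual max(p - q, 0).
            residual = np.maximum(p - q, 0.0)
            residual /= residual.sum()
            accepted.append(int(rng.choice(len(residual), p=residual)))
            break
    return accepted

# Usage: skewed target vs. uniform draft over a 4-token vocabulary.
target = lambda ctx: np.array([0.7, 0.1, 0.1, 0.1])
draft = lambda ctx: np.array([0.25, 0.25, 0.25, 0.25])
tokens = speculative_step(target, draft, k=3, rng=np.random.default_rng(0))
```

Full implementations also sample one bonus token from the target model when all k proposals are accepted; that step is omitted here for brevity.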

Théo Uscidda (@theo_uscidda) 's Twitter Profile Photo

Our work on geometric disentangled representation learning has been accepted to ICLR 2025! 🎊 See you in Singapore if you want to understand this GIF better :)

Explainable Machine Learning (@explainableml) 's Twitter Profile Photo

(3/4) Disentangled Representation Learning with the Gromov-Monge Gap. A fantastic work by Théo Uscidda and Luca Eyring, with Karsten Roth, Fabian Theis, Zeynep Akata, and Marco Cuturi. 📖 [Paper]: arxiv.org/abs/2407.07829

Explainable Machine Learning (@explainableml) 's Twitter Profile Photo

(4/4) Disentangled Representation Learning with the Gromov-Monge Gap. Luca Eyring will present GMG, a novel regularizer that matches prior distributions with minimal geometric distortion. 📍 Hall 3 + Hall 2B #603 🕘 Sat Apr 26, 10:00 a.m.–12:30 p.m.

Fabian Theis (@fabian_theis) 's Twitter Profile Photo

1/ Excited to share CellFlow, a new approach for complex perturbation modeling in single-cell genomics based on flow matching. From cytokine screens to cell fate and organoid engineering, we show CellFlow’s broad power across many diverse tasks. 👉 Paper: biorxiv.org/content/10.110…
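CellFlow builds on flow matching; as a generic illustration (this is not the CellFlow code, and the linear "network" is purely a stand-in), the conditional flow matching objective regresses a velocity field onto straight-line interpolation targets between noise and data:

```python
import numpy as np

def flow_matching_loss(theta, x0, x1, t):
    """Conditional flow matching with straight-line interpolation paths.

    x_t = (1 - t) * x0 + t * x1 interpolates noise x0 toward data x1,
    and the regression target is the constant velocity x1 - x0. A real
    model would be a neural network v_theta(x_t, t); here a linear map
    theta (which ignores t) stands in for it purely for illustration.
    """
    xt = (1.0 - t)[:, None] * x0 + t[:, None] * x1   # (n, d) interpolants
    v_pred = xt @ theta.T                            # predicted velocities
    v_target = x1 - x0                               # straight-line targets
    return float(np.mean((v_pred - v_target) ** 2))

# Usage: noise/data pairs in 2-D, random interpolation times.
rng = np.random.default_rng(1)
x0 = rng.normal(size=(8, 2))   # "noise" samples
x1 = rng.normal(size=(8, 2))   # "data" samples
t = rng.uniform(size=8)
loss = flow_matching_loss(np.zeros((2, 2)), x0, x1, t)
```

Minimizing this loss over model parameters yields a velocity field whose ODE transports the noise distribution onto the data distribution.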

Luca Eyring @ICLR (@lucaeyring) 's Twitter Profile Photo

Catch me at #ICLR2025 today - I’ll be presenting our work on Quadratic OT for Representation Learning, the Gromov-Monge Gap, at Hall 2B, poster #603 during the morning session (10:00–12:30)!

Xavier Gonzalez (@xavierjgonzalez) 's Twitter Profile Photo

The NeurIPS Conference Overleaf has crashed. Any chance we could just merge the full-paper and supplemental deadlines into a single deadline of May 22 (like last year)?