Nicholas Krämer (@pnkraemer)'s Twitter Profile
Nicholas Krämer

@pnkraemer

Probabilistic numerics, state-space models, differentiable linear algebra, and of course a healthy dose of figure-making.

ID: 1235551063782158336

Website: https://pnkraemer.github.io · Joined: 05-03-2020 13:01:48

115 Tweets

543 Followers

337 Following

Stefan Sommer (@stefanhsommer)'s Twitter Profile Photo

Stochastic bunnies and stochastic spheres: A function space perspective on stochastic shape evolution Libby Thomas Besnier arxiv.org/abs/2302.05382 A new intrinsic stochastic model for shape evolutions. The stochasticity is modelled in a Sobolev space of maps from the ...

Vector Institute (@vectorinst)'s Twitter Profile Photo

Did you know Vector’s new Postdoc Fellow Agustinus Kristiadi moved all the way from Tübingen, Germany to Toronto this past month? Learn more about why he decided to join Vector in his full YouTube interview here🧵👉 youtu.be/ql68hhFZ6vs #VectorResearcherFridays

Federico Bergamin (@fedebergamin)'s Twitter Profile Photo

On my way to #NeurIPS23 for the first time. I’ll be there presenting our work "Riemannian Laplace approximations for Bayesian Neural Networks". Work done together with Pablo Moreno-Muñoz, Søren Hauberg, and Georgios Arvanitidis. I’ll be at poster #1223 on Wednesday from 5 pm (Session 4).

Anshuk Uppal (@sigmabayesian)'s Twitter Profile Photo

On my way 🚨✈️🚨 to NOLA. I'll be presenting our work on scalable implicit VI on Wed from 5pm (#1313). I am also looking for internships for 2024 in fundamental or applied generative modelling and uncertainty quantification. #NeurIPS2023

Runa Eschenhagen (@runame_)'s Twitter Profile Photo

How can Kronecker-Factored Approximate Curvature (K-FAC) be generalised to modern deep learning architectures like transformers, graph, and convolutional neural networks? Find out in our #NeurIPS2023 spotlight paper! (1/10) arxiv.org/abs/2311.00636

Pablo Moreno-Muñoz (@pablorenoz)'s Twitter Profile Photo

We’ll be presenting our work in poster session 5 (#1214, Thursday 10:45) @ #NeurIPS2023 — Come chat with us if you want to know more about the role of masking in SSL and its connection with Bayesian principles! ⚖️🔮

Paul Jeha @ICLR 2025 (@jeha_paul)'s Twitter Profile Photo

Excited to share that our paper on reducing variance in diffusion models with control variates has been published at the SPIGM workshop at ICML. Come check it out! Thanks a lot to Will Grathwohl, Jes Frellsen, @carlhenrikek, and Michael Riis for the collaboration! openreview.net/pdf?id=YqFIzHA…

Stefan Sommer (@stefanhsommer)'s Twitter Profile Photo

Score matching for bridges can be learned without time-reversal arxiv.org/abs/2407.15455 w/ Libby Moritz Schauer We learn grad log p(t,x; T,y) for a target y directly without reversing time by combining score matching with adjoint diffusions (Milstein 2004) that give the

Federico Bergamin (@fedebergamin)'s Twitter Profile Photo

Heading to Vancouver for NeurIPS to present our paper “On Conditional Diffusion Models for PDE Simulation”. I'll be there together with Sasha and Cristiana Diaconu at poster 2500 during Thursday’s late afternoon session. Looking forward to exciting discussions and meeting new people!

Nicholas Krämer (@pnkraemer)'s Twitter Profile Photo

🛩️ On my way to #NeurIPS2024 and excited to chat about (ML applications of) linear algebra, differentiable programming, and probabilistic numerics! Feel free to DM if you’d like to meet up, hang out, and/or discuss any of these topics 😊

Nicholas Krämer (@pnkraemer)'s Twitter Profile Photo

Poster session happening *today* at 4:30 local time. *East* Exhibit Hall. Poster #3511. Looking forward to presenting this work! See you there? 🙂