Jiajun He (@jiajunhe614)'s Twitter Profile
Jiajun He

@jiajunhe614

PhD student in machine learning

ID: 1819685032916615168

Website: https://jiajunhe98.github.io/ | Joined: 03-08-2024 10:41:45

18 Tweets

37 Followers

61 Following

Mingtian@ICLR2025 (@mingtianzhang)'s Twitter Profile Photo

Reverse KL typically suffers from the mode collapse problem. We propose minimizing the reverse KL in the diffusion space, which significantly improves mode coverage of the learned generative models.
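The mode-seeking behaviour behind this claim is easy to show numerically. A minimal sketch (an illustration of mode collapse in general, not the paper's method): against a bimodal target, a density that collapses onto a single mode attains a lower reverse KL than a broad density that covers both modes, because reverse KL heavily penalises mass placed where the target is small.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Gaussian pdf N(x; mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Bimodal target density on a grid
x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]
p = 0.5 * gauss(x, -3, 0.5) + 0.5 * gauss(x, 3, 0.5)

def reverse_kl(q):
    """Numerical KL(q || p) on the grid."""
    mask = q > 1e-12
    return np.sum(q[mask] * np.log(q[mask] / p[mask])) * dx

q_collapsed = gauss(x, 3, 0.5)   # sits on a single mode of p
q_broad = gauss(x, 0, 3.0)       # covers both modes, plus the low-density gap

# KL(q_collapsed || p) is about log 2 (it only "pays" the mixture weight),
# while q_broad is punished heavily for mass in the gap between the modes.
print(reverse_kl(q_collapsed), reverse_kl(q_broad))
```

So the collapsed density "wins" under reverse KL despite ignoring half the target mass, which is exactly the failure mode the tweet refers to.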

Zijing Ou (@jzinou)'s Twitter Profile Photo

🚀 We propose a new way to estimate the denoising covariance in diffusion models, which can lead to lower estimation error, better FID and likelihood tradeoff with fewer steps, and improved generation diversity. ArXiv link: arxiv.org/abs/2406.10808. More details below👇

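For intuition about what a "denoising covariance" is (this sketch is not the paper's estimator, and all numbers in it are made-up illustrative choices): for Gaussian data the denoising posterior p(x0 | xt) has a closed form, so a Monte-Carlo estimate of the posterior variance can be checked against the exact value.

```python
import numpy as np

rng = np.random.default_rng(0)
s, sigma = 2.0, 1.0                      # data std and noise std (made-up values)

# x0 ~ N(0, s^2); xt = x0 + sigma * eps is the noised observation
x0 = rng.normal(0.0, s, size=200_000)
xt = x0 + rng.normal(0.0, sigma, size=x0.shape)

# Exact denoising posterior for this Gaussian case:
#   p(x0 | xt) = N(xt * s^2/(s^2+sigma^2),  s^2*sigma^2/(s^2+sigma^2))
post_var_exact = s**2 * sigma**2 / (s**2 + sigma**2)

# Monte-Carlo check: condition on xt falling in a small window and
# measure the spread of the matching clean samples x0.
center = 1.5
sel = np.abs(xt - center) < 0.05
post_var_mc = np.var(x0[sel])
print(post_var_exact, post_var_mc)       # the two should roughly agree
```

A learned covariance estimator for real diffusion models plays the role of `post_var_mc` here, for targets where no closed form exists.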
Shreyas Padhy (@shreyaspadhy)'s Twitter Profile Photo

Check out this paper with some really interesting insights, led by the excellent Jiajun He and Yuanqi Du. TL;DR: neural density samplers really need guidance imposed through Langevin annealing to work well.
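As a rough illustration of the Langevin-annealing idea (my own sketch, not the paper's sampler), annealed Langevin dynamics on a 1D Gaussian mixture keeps samples spread over both modes: early high-noise levels let chains cross between modes, and later low-noise levels sharpen them onto the target. The noise schedule and step sizes below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
mus = np.array([-3.0, 3.0])  # modes of the target mixture

def score(x, sigma):
    """Score of p_sigma(x) = 0.5 N(x; -3, sigma^2) + 0.5 N(x; 3, sigma^2)."""
    d = x[:, None] - mus[None, :]
    logw = -0.5 * d**2 / sigma**2
    w = np.exp(logw - logw.max(axis=1, keepdims=True))  # stable softmax weights
    w /= w.sum(axis=1, keepdims=True)
    return -(w * d).sum(axis=1) / sigma**2

x = rng.normal(0.0, 1.0, size=5000)        # 5000 chains, initialised near 0
for sigma in [3.0, 2.0, 1.0, 0.5]:         # anneal the noise level high -> low
    step = 0.1 * sigma**2                  # step size shrinks with the noise level
    for _ in range(100):
        x = x + step * score(x, sigma) + np.sqrt(2 * step) * rng.normal(size=x.shape)

# Annealing lets chains cross between modes early on, so both end up covered.
print((x > 0).mean())   # fraction of samples on the right mode, roughly 0.5
```

Running the final low-noise level alone from the same initialisation would trap most chains near the starting point; the annealing schedule is what provides the "guidance".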