Artëm Sobolev (@art_sobolev)'s Twitter Profile
Artëm Sobolev

@art_sobolev

Preserving the entropy. Ex @BayesGroup

ID: 111886381

Link: http://artem.sobolev.name/ · Joined: 06-02-2010 13:37:42

843 Tweets

847 Followers

547 Following

Kirill Neklyudov (@k_neklyudov)'s Twitter Profile Photo

Our #icml2020 paper "Involutive MCMC: a Unifying Framework" is now available on arxiv arxiv.org/abs/2006.16653. It describes many MCMC algorithms from a single perspective. Work with Max Welling, Evgenii Egorov, Dmitry Vetrov

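The paper's unifying recipe, as I understand it: augment the state with an auxiliary variable, apply a deterministic involution (a map that is its own inverse), and accept or reject with a Metropolis-Hastings test. A minimal sketch under those assumptions, recovering random-walk MH on a 1-D target from the involution f(x, v) = (x + v, -v); the function names here are mine, not the paper's:

```python
import math
import random

def involutive_mcmc_step(x, log_p, rng):
    """One involutive MCMC step targeting p(x), here specialized to random-walk MH."""
    # Auxiliary variable v ~ N(0, 1); the joint target is p(x) * N(v; 0, 1).
    v = rng.gauss(0.0, 1.0)
    # Involution f(x, v) = (x + v, -v): applying it twice gives back (x, v),
    # and its Jacobian determinant has absolute value 1.
    x_new, v_new = x + v, -v
    # MH acceptance on the joint density; N(v; 0, 1) is symmetric in v,
    # so the auxiliary terms cancel and only the ratio p(x_new)/p(x) remains.
    log_alpha = log_p(x_new) - log_p(x)
    if rng.random() < math.exp(min(0.0, log_alpha)):
        return x_new
    return x
```

Other samplers in the paper's catalogue differ only in the choice of auxiliary distribution and involution.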
Artëm Sobolev (@art_sobolev)'s Twitter Profile Photo

Finally someone scaled up VAEs! Since Glow (and autoregressive models) we knew likelihood-based methods can generate realistic images at the expense of requiring lots of parameters, so the same should have been possible for VAEs. And now we know it is!

Miles Cranmer (@milescranmer)'s Twitter Profile Photo

Here's a condensed version of the matplotlib cheatsheets so it can fit a desktop background (github.com/matplotlib/che…) Full image: drive.google.com/file/d/1kwYFaR… and vectorized .svg, with the non-standard fonts outlined: drive.google.com/file/d/1b2LtZU… Thanks Nicolas P. Rougier et al for making it!

Artëm Sobolev (@art_sobolev)'s Twitter Profile Photo

Really liked this talk by Kilian Weinberger at the ML-RSA workshop #NeurIPS2020 on why merely beating the SotA isn't something one should strive for, and why one should keep digging until a simple explanation of what makes a method SotA is found. slideslive.com/38938218/the-i…

Matplotlib (@matplotlib)'s Twitter Profile Photo

We have this awesome function called subplot_mosaic where you can pass a layout with axes identified by name:
fig, axd = plt.subplot_mosaic(
    """
    ABD
    CCD
    """)

matplotlib.org/stable/tutoria…
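A self-contained version of the snippet above (using the non-interactive Agg backend so it runs headless; note that the pyplot wrapper returns a (figure, dict) pair, with the dict mapping each mosaic letter to its Axes):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Each letter names one axes; repeating a letter makes that axes span cells.
# Here C spans two columns and D spans two rows.
fig, axd = plt.subplot_mosaic(
    """
    ABD
    CCD
    """
)
for name, ax in axd.items():
    ax.set_title(name)  # label each panel by its mosaic letter
fig.savefig("mosaic.png")
```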
Ekaterina Lobacheva (@katelobacheva)'s Twitter Profile Photo

Come by our poster "On the Periodic Behavior of Neural Network Training with Batch Normalization and Weight Decay" tomorrow at #NeurIPS2021 poster session 6! With Maxim Kodryan, Nadia Chirkova, Andrey Malinin, and Dmitry Vetrov. Poster: nips.cc/virtual/2021/p…

Artëm Sobolev (@art_sobolev)'s Twitter Profile Photo

You can play chess with #ChatGPT, but it appears it's not very good at it. Curious to see how much better it could get with simple fine-tuning on a large base of chess plays.

Artëm Sobolev (@art_sobolev)'s Twitter Profile Photo

Just stumbled upon this gem by John Schulman – a very interesting read on estimating KL divergences in practice and possible bias-variance tradeoffs joschu.net/blog/kl-approx…
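The headline trick in that post, as I read it: the naive Monte Carlo estimate of KL(p‖q), the sample mean of log p(x) − log q(x) for x ~ p, is unbiased but can be noisy; the alternative (r − 1) − log r with r = q(x)/p(x) is also unbiased (since E_p[r] = 1) while being nonnegative per-sample, which usually lowers variance. A sketch comparing both on two Gaussians:

```python
import math
import random

def kl_estimates(logp, logq, sampler, n, rng):
    """Monte Carlo estimates of KL(p || q) from n samples x ~ p.

    Returns (naive, alt) where
      naive = mean of log p(x) - log q(x)            (unbiased, can be noisy)
      alt   = mean of (r - 1) - log r, r = q(x)/p(x) (unbiased, nonnegative per-sample)
    """
    naive_sum = 0.0
    alt_sum = 0.0
    for _ in range(n):
        x = sampler(rng)
        log_r = logq(x) - logp(x)  # log of r = q(x)/p(x)
        naive_sum += -log_r        # log p(x)/q(x)
        alt_sum += (math.exp(log_r) - 1.0) - log_r
    return naive_sum / n, alt_sum / n
```

With p = N(0, 1) and q = N(μ, 1) the true divergence is μ²/2, so the estimates can be checked against a closed form (shared normalizing constants cancel in the log-ratio, so unnormalized log-densities suffice here).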

ksa 🏴‍☠️ (@kosa12matyas)'s Twitter Profile Photo

Best paper I've read so far this month: 

All elementary functions (sin, cos, tan, exp, log, powers, roots, hyperbolic functions, π, e, and even basic arithmetic) can be generated from just one binary operator:
eml(x, y) = exp(x) − ln(y)
…plus the constant 1.
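A quick numerical sanity check of the definition (just the operator and two immediate consequences, not the paper's full constructions): since ln 1 = 0, eml(x, 1) = exp(x), and rearranging the definition recovers ln y = exp(x) − eml(x, y) for any fixed x.

```python
import math

def eml(x, y):
    """The single binary operator from the paper: eml(x, y) = exp(x) - ln(y)."""
    return math.exp(x) - math.log(y)

# With the constant 1, exp comes for free because ln(1) = 0:
#   exp(x) = eml(x, 1)
# and ln can be read off the definition by rearranging:
#   ln(y) = exp(x) - eml(x, y)   for any fixed x, e.g. x = 1.
```

Deriving the rest (arithmetic, trig, π, e) from eml alone is the substance of the paper and is not reproduced here.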