Jack Mayo (@jackjmayo1) 's Twitter Profile
Jack Mayo

@jackjmayo1

ML theory PhD with @tverven. Interested mostly in parameter-free online learning, (contextual) bandits and RL. Working on something new.

ID: 1417761970183364610

Website: http://jackjmayo.nl · Joined: 21-07-2021 08:23:07

76 Tweets

281 Followers

644 Following

Luca Viano (@lucaviano4) 's Twitter Profile Photo

We are organizing a Multi-Agent RL summer school in Lausanne :) Looking forward to the talks by Panayotis Mertikopoulos, Julia Olkhovskaya, Kaiqing Zhang, Chi Jin, Ioannis Panageas, Caglar Gulcehre, Giorgia Ramponi, and Maryam Kamgarpour!! Dec 2 @ELLISUnConf, Dec 3-7 @NeurIPS. Apply here: sites.google.com/view/marl-scho…

ARLET (@arlet_workshop) 's Twitter Profile Photo

🧵 Thrilled to announce the #ICML RL workshop 'Aligning RL Experimentalists and Theorists'! We will have several talks and a panel delivered by a super lineup of speakers: Martha White, Sham Kakade, Amy Zhang, Dylan Foster, Niao He, Sergey Levine, and Mengdi Wang. 1/3

Csaba Szepesvari (@csabaszepesvari) 's Twitter Profile Photo

Now that the #COLT2024 decisions are out, I'd like to announce a workshop we are organizing that will take place just before COLT. The workshop theme is RL Theory. All are welcome! Details here: rltheory-workshop.github.io Please spread the word!

Antoine Moulin (@antoine_mln) 's Twitter Profile Photo

Sent your best RL research to #NeurIPS2024? We'd love to see it at ARLET :) The submission deadline is May 29th! arlet-workshop.github.io

Max Welling (@wellingmax) 's Twitter Profile Photo

We are on the lookout for a SE lead to join us on our exciting journey to discover new materials for carbon capture. linkedin.com/posts/mghissas…

Sebastien Bubeck (@sebastienbubeck) 's Twitter Profile Photo

Lots of progress on bandit convex optimization recently arxiv.org/abs/2406.18672 arxiv.org/abs/2406.06506 arxiv.org/abs/2302.05371, I wish I could follow it more closely ... looks like Conjecture 1 from arxiv.org/abs/1607.03084 is going to be resolved soon!!!

John Hopfield (@hopfieldjohn) 's Twitter Profile Photo

This award is very hard to respond to. I have received many hundred congratulatory notes, from former students, post-docs, Princeton University juniors and seniors, funding agencies and foundations, authors, signature collectors, amateurs, elementary school neural network

Deedy (@deedydas) 's Twitter Profile Photo

All languages convey information at a similar rate when spoken (39 bits/s). Languages that are spoken faster have less information density per syllable! One of the coolest results in linguistics.

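The arithmetic behind the claim is a simple product: information rate = syllables per second × bits per syllable. A minimal sketch, using made-up numbers (not the actual measured values for any language) chosen so both products land near 39 bits/s:

```python
# Illustrative only: the syllable rates and per-syllable information
# densities below are hypothetical stand-ins, not real measurements.
languages = {
    #                (syllables/sec, bits/syllable)
    "fast_language": (7.8, 5.0),   # spoken quickly, less info per syllable
    "slow_language": (5.2, 7.5),   # spoken slowly, more info per syllable
}

for name, (syll_per_sec, bits_per_syll) in languages.items():
    rate = syll_per_sec * bits_per_syll  # bits per second
    print(f"{name}: {rate:.1f} bits/s")  # both come out to 39.0 bits/s
```

The trade-off is visible directly: the faster language packs fewer bits into each syllable, so the per-second rate ends up the same.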
Jack Mayo (@jackjmayo1) 's Twitter Profile Photo

Barcelona has become a veritable center of RL Theory, in no small part through the efforts of Gergely and his group. If you're closing out an MSc and have an interest in the topics mentioned, I could scarcely imagine a better setting in which to study them.

Jack Mayo (@jackjmayo1) 's Twitter Profile Photo

Great piece by Gene Li outlining some foundational open problems in RL theory, doubling conveniently as a guide to the more animated debates you'll overhear at an RL Theory Virtual Seminar. Especially handy if (like me) you're accustomed to the comforts of fixed state spaces.

Jack Mayo (@jackjmayo1) 's Twitter Profile Photo

Really cool talking with DeGatchi on the Scraping Bits podcast. We covered (convex & contextual) bandits, online convex optimisation, regret bounds, adaptive rates, and a whole host of assorted (& roughly associated) topics. Check it out below👇 open.spotify.com/episode/6b2pMb…

Jack Mayo (@jackjmayo1) 's Twitter Profile Photo

Quite correct. Indeed, the fundamental bottleneck - one that doesn't improve by simply throwing compute at a problem - is data quality. Here's a nice experiment we ran at kurtos.ai. With a reasonable online selection procedure, one can drastically improve the amount

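For intuition about what an "online selection procedure" can look like in general, here is a minimal sketch of one common heuristic: keep only the examples the current model finds hard (high loss) as they stream past. Everything here (the `current_loss` stand-in, the threshold, the synthetic stream) is hypothetical for illustration; nothing reflects the actual procedure used in the experiment mentioned above.

```python
import random

def current_loss(example):
    # Stand-in for evaluating a model's loss on one example;
    # here the loss is just a number attached to the example.
    return example["loss"]

def select_online(stream, threshold=0.5):
    """Yield only the examples whose loss exceeds the threshold."""
    for example in stream:
        if current_loss(example) > threshold:
            yield example

# Synthetic data stream of 10 examples with random losses in [0, 1).
random.seed(0)
stream = [{"id": i, "loss": random.random()} for i in range(10)]

kept = list(select_online(stream))
print(f"kept {len(kept)} of {len(stream)} examples")
```

Because selection happens one example at a time, the same loop works on an unbounded stream without ever holding the full dataset in memory.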