
POURCEL Guillaume
@guillaumeap
PhD student @univgroningen, intern @InriaScool, @FlowersINRIA. CogSci, AI. Inspired by brains (make RNNs behave like autograd) and behavior (open-ended goals)
ID: 942422904276488195
https://guillaumepourcel.github.io/ 17-12-2017 15:55:04
248 Tweets
116 Followers
796 Following


The (true) story of the development and inspiration behind the "attention" operator, the one in "Attention Is All You Need" that introduced the Transformer. From personal email correspondence with the author 🇺🇦 Dzmitry Bahdanau @ NeurIPS ~2 years ago, published here and now (with permission) following
Joel Lehman So when I first worked on unsupervised environment design, I was hoping to mitigate KU. There's a section in that paper dealing with the connection to "decisions under ignorance" (KU under another name). arxiv.org/abs/2012.02096 The open-ended complexity surprised me!
We've seen nothing yet! Hosted a 9-13 yo vibe-coding event w. Robert Keus 👨🏼‍💻 this w-e (h/t Anton Osika – eu/acc, Lovable Build). Takeaway? AI is unleashing a generation of wildly creative builders beyond anything I'd have imagined, and they grow up *knowing* they can build anything!
Self-Improving Language Models for Evolutionary Program Synthesis: A Case Study on ARC-AGI by Pourcel Julien @ICML, Cédric and Pierre-Yves Oudeyer. Another example of ARC-AGI as a research playground with general applicability.

I’m attending #ICML this week! We’ll be presenting MAGELLAN during the poster session on Thursday with Carta Thomas & Clément ROMAC @ ICML 2025. If you’re not in Vancouver, we recorded a talk presenting the paper last week; it’s available on YouTube (link below).

