Edoardo Cetin (@edo_cet)'s Twitter Profile
Edoardo Cetin

@edo_cet

ML Ph.D., Research Scientist @SakanaAILabs. Previously intern @AIatMeta (FAIR), #Twitter (Cortex team), Toyota, and Goldman Sachs.

ID: 1520719228264976384

Website: https://aladoro.github.io/ · Joined: 01-05-2022 10:58:18

31 Tweets

484 Followers

58 Following

Michael Bronstein @ICLR2025 🇸🇬 (@mmbronstein)'s Twitter Profile Photo

New post with Edoardo Cetin, Ben Chamberlain, and jjh on hyperbolic RL (we are presenting the paper at ICLR tomorrow). This is the first time we use an AI-generated illustration from Stability AI, courtesy of the prompt wizard hardmaru. towardsdatascience.com/hyperbolic-dee…

Edoardo Cetin (@edo_cet)'s Twitter Profile Photo

Super excited to share that I joined Sakana AI as a Research Scientist! Looking forward to working with an amazing team to develop new nature-inspired methods and tackle some of AI's most relevant challenges ^^

Sakana AI (@sakanaailabs)'s Twitter Profile Photo

Introducing An Evolved Universal Transformer Memory

sakana.ai/namm

Neural Attention Memory Models (NAMMs) are a new kind of neural memory system for Transformers that not only boost their performance and efficiency but are also transferable to other foundation models,
Sakana AI (@sakanaailabs)'s Twitter Profile Photo

We’re excited to introduce Transformer², a machine learning system that dynamically adjusts its weights for various tasks! sakana.ai/transformer-sq… Adaptation is a remarkable natural phenomenon, like how the octopus can blend in with its environment, or how the brain rewires

Sakana AI (@sakanaailabs)'s Twitter Profile Photo

We’re excited to introduce Text-to-LoRA: a Hypernetwork that generates task-specific LLM adapters (LoRAs) based on a text description of the task. Catch our presentation at #ICML2025! Paper: arxiv.org/abs/2506.06105 Code: github.com/SakanaAI/Text-… Biological systems are capable of

Sakana AI (@sakanaailabs)'s Twitter Profile Photo

Introducing Reinforcement-Learned Teachers (RLTs): Transforming how we teach LLMs to reason with reinforcement learning (RL). Blog: sakana.ai/rlt Paper: arxiv.org/abs/2506.08388 Traditional RL focuses on “learning to solve” challenging problems with expensive LLMs and

Sakana AI (@sakanaailabs)'s Twitter Profile Photo

We’re excited to introduce ShinkaEvolve: An open-source framework that evolves programs for scientific discovery with unprecedented sample-efficiency. Blog: sakana.ai/shinka-evolve/ Code: github.com/SakanaAI/Shink… Like AlphaEvolve and its variants, our framework leverages LLMs to

Sakana AI (@sakanaailabs)'s Twitter Profile Photo

Introducing DroPE: Extending the Context of Pretrained LLMs by Dropping Their Positional Embeddings pub.sakana.ai/DroPE/ We are releasing a new method called DroPE to extend the context length of pretrained LLMs without the massive compute costs usually associated with

Sakana AI (@sakanaailabs)'s Twitter Profile Photo

We’re excited to introduce Doc-to-LoRA and Text-to-LoRA, two related research projects exploring how to make LLM customization faster and more accessible. pub.sakana.ai/doc-to-lora/ By training a Hypernetwork to generate LoRA adapters on the fly, these methods allow models to instantly