Tilde (@tilderesearch)'s Twitter Profile
Tilde

@tilderesearch

Building the interpreter models to optimize AI deployments.

ID: 1815151852784332800

Website: https://tilderesearch.com · Joined: 21-07-2024 22:28:26

35 Tweets

1.1K Followers

3 Following



Mixture‑of‑Experts (MoE) powers many frontier models like R1, K2, & Qwen3

⚡️ To make frontier-scale MoE models accessible to train, we open-source MoMoE, a hyper-performant MoE implementation built for training and inference, outpacing the fastest existing ones by up to:

- 70%
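
For readers unfamiliar with the architecture the tweet refers to, below is a minimal, illustrative PyTorch sketch of what a Mixture-of-Experts layer with top-k routing does. It is not MoMoE's implementation; every name, dimension, and hyperparameter here (TopKMoE, d_model=64, n_experts=8, top_k=2) is an assumption chosen only to make the example self-contained.

# Illustrative sketch of a Mixture-of-Experts layer with top-k routing.
# NOT MoMoE's implementation -- a plain PyTorch reference for the concept only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMoE(nn.Module):
    def __init__(self, d_model: int, d_ff: int, n_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model) -- flatten batch/sequence dims before calling.
        logits = self.router(x)                         # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # pick k experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen experts
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            rows, slots = (idx == e).nonzero(as_tuple=True)  # tokens routed to expert e
            if rows.numel() == 0:
                continue
            out[rows] += weights[rows, slots].unsqueeze(-1) * expert(x[rows])
        return out


# Example: 16 tokens of width 64 through 8 experts, 2 active per token.
tokens = torch.randn(16, 64)
layer = TopKMoE(d_model=64, d_ff=256, n_experts=8, top_k=2)
print(layer(tokens).shape)  # torch.Size([16, 64])

The point of the sketch is the sparsity: only top_k of the n_experts feed-forward blocks run for any given token, which is what lets MoE models scale parameter count without a proportional increase in per-token compute. Fast implementations replace the per-expert Python loop above with fused, batched kernels.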