Mika Senghaas (@mikasenghaas)'s Twitter Profile
Mika Senghaas

@mikasenghaas

research intern @primeintellect, msc data science @epfl

ID: 837406685488611330

Link: https://mikasenghaas.de · Joined: 02-03-2017 20:58:06

5 Tweets

97 Followers

171 Following

Prime Intellect (@primeintellect)'s Twitter Profile Photo

Today we’re launching INTELLECT-2: The first decentralized 32B-parameter RL training run open to join for anyone with compute — fully permissionless. Scaling towards frontier reasoning across coding, math and science.

Mika Senghaas (@mikasenghaas)'s Twitter Profile Photo

wrote a little something on our learnings from decentralizing inference and open-sourced 3 research codebases. tl;dr: optimizing inference under decentralized constraints is worthwhile, non-trivial, and far from solved. excited to be building this with the team! more soon, when we …

Justus Mattern (@matternjustus)'s Twitter Profile Photo

Very excited to soon release SYNTHETIC-2, partially powered by consumer grade GPUs. Very confident that what we’ve planned for this dataset will be incredibly useful for the open source community

samsja (@samsja19)'s Twitter Profile Photo

We wrote an extensive blog post on large-scale pipelined inference and released a vLLM integration to connect any machines over the internet to serve a model. Will be the foundation of our next SYNTHETIC-2 run (and later allow consumer GPUs to join RL runs)
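The core idea behind pipelined inference across machines is simple to sketch: partition a model's layers into contiguous stages and stream activations from one stage to the next. The toy below is not the vLLM integration the tweet referses to; it is a minimal in-process illustration of the partitioning and stage-to-stage flow, with hypothetical names (`split_layers`, `Stage`) and plain matrix layers standing in for transformer blocks.

```python
import numpy as np

def split_layers(layers, n_stages):
    """Partition a list of layers into contiguous, near-equal pipeline stages."""
    k, r = divmod(len(layers), n_stages)
    stages, i = [], 0
    for s in range(n_stages):
        size = k + (1 if s < r else 0)  # spread the remainder over early stages
        stages.append(layers[i:i + size])
        i += size
    return stages

class Stage:
    """One pipeline stage: holds a slice of layers and applies them in order.

    In a real deployment each Stage would run on a different machine and
    send its output activations to the next host over the network; here the
    stages are simply chained in-process.
    """
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        for w in self.layers:
            x = np.tanh(x @ w)  # stand-in for a transformer block
        return x

rng = np.random.default_rng(0)
layers = [rng.standard_normal((8, 8)) * 0.1 for _ in range(10)]
stages = [Stage(s) for s in split_layers(layers, 3)]  # stage sizes: 4, 3, 3

x = rng.standard_normal((1, 8))
for st in stages:  # activations flow stage to stage, as over the wire
    x = st.forward(x)
print(x.shape)  # (1, 8)
```

In a decentralized setting the interesting problems start where this sketch stops: balancing stages across heterogeneous GPUs, tolerating slow or dropped hosts, and hiding inter-machine latency by keeping many requests in flight.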

Prime Intellect (@primeintellect)'s Twitter Profile Photo

Releasing INTELLECT-2: We’re open-sourcing the first 32B parameter model trained via globally distributed reinforcement learning: • Detailed Technical Report • INTELLECT-2 model checkpoint primeintellect.ai/blog/intellect…

Mika Senghaas (@mikasenghaas)'s Twitter Profile Photo

with SYNTHETIC-2 we scaled heterogeneity across not 1, not 2, but 3 axes in a single shot. we run on a variety of models, tasks and hardware, all while allowing completely permissionless compute contributions. on the surface it might just seem like a dataset release, but so much …