aranku (@aranku_)'s Twitter Profile
aranku

@aranku_

technically member of staff @magicailabs

ID: 236321797

Link: https://github.com/aranku · Joined: 10-01-2011 09:39:49

22 Tweets

293 Followers

563 Following

Magic (@magicailabs)'s Twitter Profile Photo

Meet LTM-1: LLM with *5,000,000 prompt tokens* That's ~500k lines of code or ~5k files, enough to fully cover most repositories. LTM-1 is a prototype of a neural network architecture we designed for giant context windows.
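The tweet's conversion from prompt tokens to lines of code and files implies roughly 10 tokens per line and 100 lines per file. A minimal sketch of that back-of-envelope arithmetic (both ratios are assumptions inferred from the stated figures, not published tokenizer statistics):

```python
# Back-of-envelope conversion from context-window size to code volume.
# TOKENS_PER_LINE is inferred from the tweet's "5,000,000 tokens ~= 500k
# lines" figure; LINES_PER_FILE from "~500k lines or ~5k files".
# Real tokenizers and codebases vary.
TOKENS_PER_LINE = 10
LINES_PER_FILE = 100

def context_to_lines(prompt_tokens: int) -> int:
    """Approximate lines of code that fit in a context window."""
    return prompt_tokens // TOKENS_PER_LINE

def context_to_files(prompt_tokens: int) -> int:
    """Approximate number of source files that fit in a context window."""
    return context_to_lines(prompt_tokens) // LINES_PER_FILE

print(context_to_lines(5_000_000))  # 500000
print(context_to_files(5_000_000))  # 5000
```

With these assumed ratios, a 5M-token window reproduces the tweet's ~500k lines and ~5k files.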

Eric Steinberger (@ericsteinb)'s Twitter Profile Photo

AI with long-term memory! *A lot* of work left to do but happy to share a little more about what we've been up to. It's been incredibly fulfilling to work with a wonderful team and the trust of our backers towards this milestone. Thank you for the opportunity <3

Magic (@magicailabs)'s Twitter Profile Photo

We've raised $117M from Nat Friedman and others to build an AI software engineer. Code generation is both a product and a path to AGI, requiring new algorithms, lots of CUDA, frontier-scale training, RL, and a new UI. We are hiring!

Magic (@magicailabs)'s Twitter Profile Photo

LTM-2-Mini is our first model with a 100 million token context window. That’s 10 million lines of code, or 750 novels. Full blog: magic.dev/blog/100m-toke… Evals, efficiency, and more ↓
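The "750 novels" comparison implies a per-novel token count that can be recovered from the tweet's own numbers. A quick sketch (the resulting ~133k tokens per novel is an inference from the stated ratio, consistent with a typical ~100k-word novel, not a figure from the source):

```python
# Back-of-envelope: what "100 million tokens = 750 novels" implies
# about tokens per novel. Real books vary widely in length.
CONTEXT_TOKENS = 100_000_000
NOVELS = 750

tokens_per_novel = CONTEXT_TOKENS // NOVELS
print(tokens_per_novel)  # 133333
```

That works out to roughly 133k tokens per novel, about what a 100k-word book yields under common subword tokenizers.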

Magic (@magicailabs)'s Twitter Profile Photo

Excited to announce we’re building an Applied Team focused on post-training. Come explore what's possible with our new (and still unreleased) LTM2 models and their 100M token context window. Apply here: magic.dev/careers/5652b4…

Eric Steinberger (@ericsteinb)'s Twitter Profile Photo

We're hiring for a new team aiming to train AI SWEs to robustly complete long-horizon work on an unrestricted computer via the GUI. Today's models excel at small, Olympiad-style coding tasks but struggle in complex codebases and aren't easy to integrate into existing enterprise