Armen Aghajanyan
@armenagha
ex-RS FAIR/MSFT
ID: 1515424688
14-06-2013 05:43:07
591 Tweets
11.1K Followers
266 Following
The Alpaca moment of Large Multimodal Models! Can we build native LMMs just like Llama for simple multimodal generation? Introducing Anole: the first open-source, autoregressive native LMM for multimodal generation. Building on Chameleon by AI at Meta: github.com/GAIR-NLP/anole
Armen Aghajanyan Excited that we have deciphered most of the cryptic posts
New from AI at Meta: Efficient Early-Fusion Pre-training with Mixture of Modality-Aware Experts. With a 1T-token training budget, MoMa 1.4B achieves FLOPs savings of 3.7x. Authors Victoria X Lin, Akshat Shrivastava, and Liang Luo will be on alphaXiv this week to answer your questions.