Inferact (@inferact) 's Twitter Profile
Inferact

@inferact

ID: 1998160192903745536

Joined: 08-12-2025 22:38:37

58 Tweets

3.3K Followers

3 Following

Roy (@jasonlovelu):

Been loving the vLLM journey since 2023 and this wonderfully warm community. Proud to work with a brilliant team and keep pushing vLLM to be the best open source inference engine in the world. 💙

Jason Cui (@jasonscui):

We're excited to lead the $150M seed round for Inferact and to support vLLM and the future of inference. The vLLM community is already thriving and will continue to be a critical inference backbone. Congrats Simon Mo, Woosuk Kwon, Kaichao You, and Roger Wang!

David Bloom (@daveybloom):

When exceptional talent meets a compelling vision, it's an easy decision to invest. This team proved themselves on campus while growing vLLM. Now we're excited to support them as they build something special. Let's get to work! Woosuk Kwon Simon Mo

Michael Goin (@mgoin_):

I'm excited for the latest growth of vLLM with the announcement of Inferact! The past two years working on vLLM have been life-changing. Looking forward to collaborating on even bigger things in open source. Let's go vLLM!

Yusen DAI | 戴雨森 (@yusen):

Very excited to partner with Inferact in support of their mission to build the inference engine for AI. ZhenFund is proud to have been an early supporter of vLLM. Huge congrats to Simon Mo, Woosuk Kwon, Kaichao You, Roger Wang, Ion Stoica, and the rest of the founding …

Lightspeed (@lightspeedvp):

Inferact CEO Simon Mo says the AI infrastructure buildout is misunderstood: "The clusters being built for training—six months later, they'll be used entirely for inference." "Inference will start to eat up that capacity, and consume all the newly provisioned energy."

Lightspeed (@lightspeedvp):

Inferact Co-Founder Simon Mo on AI economics: "You build the data centers, the training cluster, fund the training run, produce a model… but at that point, there is no value created." "Only delivering inference is the point where you can actually capitalize on this …"

Hao Zhang (@haozhangml):

Big congrats on Inferact! Since we initiated vLLM’s earliest research push back in 2023, it has been incredible to watch vLLM become the OSS inference engine for so many teams. Building a project like this takes persistence across everything: research breakthroughs, …

Woosuk Kwon (@woosuk_k):

Thank you so much! I still remember the day Hao Zhang suggested working on LLM inference back in 2022. vLLM truly wouldn’t exist without you.

The House Fund (@thehousevc):

We backed @Inferact at inception, based on the Berkeley research project vLLM. Today, they announced a $150M seed led by a16z and @LightspeedVP, with @Sequoia and The House Fund — one of the largest seed rounds ever. What started in a lab is now the open-source inference …

Bogomil Balkansky (@bogiebalkansky):

It's wonderful to see the creators of vLLM start a company, Inferact. vLLM has been capturing the hearts and minds of the technical community for years, and a company based on it means more innovation from the brilliant minds behind it: Simon Mo, Woosuk Kwon, and the …