vLLM (@vllm_project)'s Twitter Profile
vLLM

@vllm_project

A high-throughput and memory-efficient inference and serving engine for LLMs. Join slack.vllm.ai to discuss together with the community!
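For context, a minimal sketch of offline generation with vLLM's Python API; the model name and sampling settings below are illustrative placeholders, not something taken from this profile:

    # Minimal offline-inference sketch using vLLM's Python API.
    # "facebook/opt-125m" is only an example model choice.
    from vllm import LLM, SamplingParams

    prompts = ["Hello, my name is"]
    sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

    llm = LLM(model="facebook/opt-125m")              # load the model into the engine
    outputs = llm.generate(prompts, sampling_params)  # batched generation

    for output in outputs:
        print(output.outputs[0].text)                 # generated continuation

The same engine can also be exposed as an OpenAI-compatible HTTP server (for example via the "vllm serve" command) for online serving.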

ID: 1774187564276289536

Link: https://github.com/vllm-project/vllm
Joined: 30-03-2024 21:31:01

327 Tweets

12.12K Followers

15 Following

πŸ’‘ vLLM @ Open Source AI Week!
1⃣ Wednesday, Oct 23 & Thursday, Oct 24: vLLM @ PyTorch Conference 2025
πŸš€ Explore vLLM at PyTorch Conference 2025!
πŸ“… Sessions to catch:
1. Easy, Fast, Cheap LLM Serving for Everyone – Simon Mo, Room 2004/2006
2. Open Source Post-Training Stack: