Julien Chaumond (@julien_c)'s Twitter Profile
Julien Chaumond

@julien_c

Co-founder and CTO at @huggingface 🤗. ML/AI for everyone, building products to propel communities fwd. @Stanford + @Polytechnique

ID: 16141659

https://huggingface.co · Joined 05-09-2008 07:56:27

16K Tweets

58K Followers

1K Following

Wauplin (@wauplin)

Say hello to `hf`: a faster, friendlier Hugging Face CLI ✨

We are glad to announce a long-awaited quality-of-life improvement: the Hugging Face CLI has been officially renamed from huggingface-cli to hf!

So... why this change?
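
For anyone updating scripts, here is a rough before/after sketch of the rename. The new entry point groups auth-related commands under `hf auth`; exact subcommand names are best confirmed with `hf --help` on your installed huggingface_hub version.

```
# Before: the long-form entry point
huggingface-cli login
huggingface-cli whoami

# After: the same actions under the shorter `hf` entry point,
# with auth-related commands grouped under `hf auth`
hf auth login
hf auth whoami
```
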
Lysandre (@lysandrejik)

This is not simply a speed-up, but also a memory reduction: run bigger models, faster. We're working with kernel providers like Unsloth AI, Liger Kernel, Red Hat AI, vLLM, ggml and others so that the kernels they build can be reused across runtimes, straight from their Hub org.

Arthur Zucker (@art_zucker)

With the latest release, I want to make sure I get this message to the community: we are listening! 

At Hugging Face we are very ambitious and we want `transformers` to accelerate the ecosystem and enable all hardware and platforms!
Let's build AGI together 🫣
Unbloat and Enable!
Quentin Lhoest 🤗 (@lhoestq)

Announcing 🤗 hf jobs 🚨 Tools + CLI to run compute jobs on Hugging Face infra. For training, fine-tuning, or your own scripts... Select any GPU. Run many jobs in parallel. Compatible with `uv`. Pay as you go. Try it today: pip install -U huggingface_hub
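
As a rough sketch of that workflow (the GPU flavor name below is an assumption; check `hf jobs run --help` for the flags and flavors actually available in your huggingface_hub version):

```
# Install or upgrade the client that ships the `hf` CLI
pip install -U huggingface_hub

# Run a command inside a Docker image on Hugging Face infra (CPU by default)
hf jobs run python:3.12 python -c "print('Hello from HF infra')"

# Request GPU hardware via a flavor flag (flavor name is an assumption)
hf jobs run --flavor a10g-small ubuntu nvidia-smi
```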

Charlie Marsh (@charliermarsh)

The new Hugging Face jobs CLI is powered by uv 🤗

You can use `hf jobs uv run` to initiate a job from a standalone Python script.
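
A minimal sketch of that, assuming a standalone `train.py` (a hypothetical placeholder) that declares its dependencies in PEP 723 inline metadata so uv can resolve them on the fly:

```
# Submit a standalone Python script as a job on Hugging Face infra;
# uv handles the script's dependencies, no project setup required
hf jobs uv run train.py
```
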
clem 🤗 (@clementdelangue)

How much are you using Hugging Face's CLI? Mostly to upload and download models and datasets?

We just revamped it (welcome to `hf`!) and added the capability to run jobs directly on our infra. Useful?
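
For context, the upload/download workflows mentioned above look roughly like this with the renamed CLI (repo ids and paths are placeholders; verify flags with `hf download --help` and `hf upload --help`):

```
# Download a model repo, or a dataset via --repo-type
hf download gpt2
hf download --repo-type dataset squad

# Upload a local folder to a repo you own
hf upload my-username/my-model ./checkpoints
```
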
👋 Jan (@jandotai)

Jan v0.6.6 is out: Jan now runs fully on llama.cpp.

- Cortex is gone, local models now run on Georgi Gerganov's llama.cpp
- Toggle between llama.cpp builds
- Hugging Face added as a model provider
- Hub enhanced
- Images from MCPs render inline in chat

Update Jan or grab the latest.

Thomas Wolf (@thom_wolf)

Long-form AI reading is back and we’ve just dropped the ultimate summer read.

Inspired by the likes of Stripe Press, we’re proud to announce the first book from HF Press: a carefully crafted, book-length PDF edition of the Ultra-Scale Playbook.

Over 200 dense pages to learn the …
Julien Chaumond (@julien_c)

50 (!) LLMs released these past 2-3 weeks.

But the real kicker is when you think of this:

It is the most releases we’ve seen so far, but the least releases we’ll see in the future 🤯