Eric L. Buehler (@ericlbuehler)'s Twitter Profile
Eric L. Buehler

@ericlbuehler

ID: 1676401205747105797

Link: https://github.com/EricLBuehler · Joined: 05-07-2023 01:23:04

44 Tweets

67 Followers

163 Following

David (drbh) Holtz (@justdrbh)

Excited to share the Kernel Hub, optimized CUDA kernels, plug-and-play from the Hugging Face Hub.
No boilerplate, just speed.
 huggingface.co/blog/hello-hf-…
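A minimal sketch of that plug-and-play flow, following the announcement (the `kernels-community/activation` repo and its `gelu_fast` function are the example names from the post; treat exact details as illustrative):

```python
import torch
from kernels import get_kernel

# Fetch a precompiled kernel straight from the Hugging Face Hub;
# no local CUDA toolchain or compilation step is needed.
activation = get_kernel("kernels-community/activation")

x = torch.randn((1024, 1024), dtype=torch.float16, device="cuda")
out = torch.empty_like(x)

# The loaded module exposes the kernel's functions directly.
activation.gelu_fast(out, x)
```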
clem 🤗 (@clementdelangue)

You can now add Hugging Face to Cursor to find models, datasets, papers, apps, ...

Vibe coding a website is cool, but imagine if the new AI-powered code editors turned everyone into an AI builder able to train AI themselves. How cool would that be?
Thien Tran (@gaunernst)

Was quite surprised that the Qwen3 impl in HF transformers looks very clean. And some layers use kernels from the HF kernels hub 👀
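Part of why it stays clean: layers keep a plain PyTorch forward and are merely marked as replaceable by a Hub kernel. A sketch, assuming the `use_kernel_forward_from_hub` decorator from the `kernels` library (the layer shown is illustrative, not the exact Qwen3 code):

```python
import torch
import torch.nn as nn
from kernels import use_kernel_forward_from_hub

# Marking the class lets a matching precompiled Hub kernel replace its
# forward when the model is kernelized; otherwise the plain PyTorch
# implementation below runs unchanged.
@use_kernel_forward_from_hub("RMSNorm")
class RMSNorm(nn.Module):
    def __init__(self, hidden_size: int, eps: float = 1e-6):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(hidden_size))
        self.variance_epsilon = eps

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        variance = hidden_states.pow(2).mean(-1, keepdim=True)
        hidden_states = hidden_states * torch.rsqrt(variance + self.variance_epsilon)
        return self.weight * hidden_states
```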
Arthur Zucker (@art_zucker)

With the recent efforts around isolating kernels, using Flash Attention X (2, 3, ...) should be as simple as this:

It will give you a lightweight install! You just need `pip install kernels`.
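As a sketch of what "this simple" might look like in practice, transformers can point its attention implementation at a kernels repo on the Hub instead of a locally compiled flash-attn (the model and repo ids below are assumptions):

```python
from transformers import AutoModelForCausalLM

# Pull the flash-attention kernel from the Hub at load time;
# only `pip install kernels` is required, no local compilation.
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen3-0.6B",
    attn_implementation="kernels-community/flash-attn",
)
```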
Lysandre (@lysandrejik)

The new transformers release comes w/ a surprise: kernels support ⚡️ It integrates deeply with precompiled kernels on the HF Hub.

- opt-in, automatic kernels for your hardware and software
- kernels like FA2/3 w/o compilation
- community-built kernels, for inference & training
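A hedged sketch of the opt-in flow, assuming the `kernelize` helper and `Mode` enum from the `kernels` library (names taken from its docs; treat exact details as illustrative):

```python
import torch
from kernels import Mode, kernelize
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen3-0.6B", torch_dtype=torch.float16
).to("cuda")

# Swap decorated layers for precompiled Hub kernels that match the
# current device, dtype, and mode (inference vs. training).
model = kernelize(model, mode=Mode.INFERENCE)
```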

Andrej Karpathy (@karpathy)

I am (slowly) re-reading the Tolkien legendarium (of which Lord of the Rings is a small part). The whole body of work is so incredible and there's nothing else like it... it dilutes other worlds of fiction. Wait - your story doesn't have a comprehensive history/mythology spanning

David (drbh) Holtz (@justdrbh)

writing CUDA kernels is fun. getting them to actually ship is pain. we built kernel-builder so you can skip the pain → huggingface.co/blog/kernel-bu…

Sayak Paul (@risingsayak)

Wrote an FA3 attention processor for Qwen Image using the 🤗 Kernels library. The process is so enjoyable!

Stuff cooking, stuff coming 🥠

gist.github.com/sayakpaul/ff71…
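A rough sketch of the idea behind the gist: fetch an FA3 build from the Hub with `kernels` and call it inside a custom diffusers-style attention processor. The repo id, the `flash_attn_func` return shape, and the processor details below are all assumptions; the actual gist may differ:

```python
import torch
from kernels import get_kernel

# Assumed Hub repo hosting a precompiled FlashAttention-3 build.
fa3 = get_kernel("kernels-community/vllm-flash-attn3")

class FA3AttnProcessor:
    """Illustrative processor: q/k/v projections, FA3 attention, output proj."""

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, **kwargs):
        query = attn.to_q(hidden_states)
        key = attn.to_k(hidden_states)
        value = attn.to_v(hidden_states)

        # FA3 expects (batch, seq_len, num_heads, head_dim).
        b, s, _ = query.shape
        query, key, value = (
            t.view(b, s, attn.heads, -1) for t in (query, key, value)
        )

        # Assumed to return (output, softmax_lse); we keep the output.
        out = fa3.flash_attn_func(query, key, value, causal=False)[0]
        out = out.reshape(b, s, -1)

        # Output projection (attn.to_out is [Linear, Dropout] in diffusers).
        return attn.to_out[0](out)
```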
SpaceX (@spacex)

View of Starship landing burn and splashdown on Flight 10, made possible by SpaceX’s recovery team. Starship made it through reentry with intentionally missing tiles, completed maneuvers to intentionally stress its flaps, had visible damage to its aft skirt and flaps, and still