Ivan Fioravanti ᯅ (@ivanfioravanti)'s Twitter Profile
Ivan Fioravanti ᯅ

@ivanfioravanti

Co-founder and CTO of @CoreViewHQ. GenAI/LLM addicted. Apple MLX, Ollama, Microsoft 365, Azure, Kubernetes. Investor in innovation.

ID: 43874767

Joined: 01-06-2009 12:21:31

23K Tweets

12K Followers

1K Following

N8 Programs (@n8programs)'s Twitter Profile Photo

MLX-LM has extreme alpha. It’s a unified inference and training framework that works across MLX and CUDA and constantly gets new features without losing modularity. It will only get better.

elie (@eliebakouch)'s Twitter Profile Photo

Always feels weird seeing the decentralized baseline outperforming the centralized baseline? I can vaguely remember something similar on DiLoCo when you don't fully decay the learning rate (not 100% sure about this). Any thoughts on why this is happening? Teknium (e/λ) emozilla Bowen Peng

Ivan Fioravanti ᯅ (@ivanfioravanti)'s Twitter Profile Photo

Can't run mistralai/Ministral-3-8B-Instruct-2512 locally with MLX, skill issue? 🤷🏻‍♂️ ValueError: Tokenizer class TokenizersBackend does not exist or is not currently imported.

Cursor (@cursor_ai)'s Twitter Profile Photo

The new Codex model is available in Cursor! It's free to use until December 11th. We worked with OpenAI to optimize Cursor's agent harness for the model. cursor.com/blog/codex-mod…

SkalskiP (@skalskip92)'s Twitter Profile Photo

Basketball AI YouTube tutorial is finally live! Over 1000 hours of work compressed into 37 minutes. Link below; likes and comments please.

Ivan Fioravanti ᯅ (@ivanfioravanti)'s Twitter Profile Photo

Why is it so hard to interact in a positive and constructive way on X? We are all here to share and learn from each other. Personally, I’ve learned so many amazing things and discovered so many amazing people here! 🔥

Ivan Fioravanti ᯅ (@ivanfioravanti)'s Twitter Profile Photo

“I am increasingly convinced that the important question is not only what these models can do, but how we learn to work with them.” Thanks for sharing your experience in detail! 🙏

N8 Programs (@n8programs)'s Twitter Profile Photo

mlx-lm can be used for much more than standard LLM training/inference. Here, I use mlx-lm to train an autoregressive 8M transformer on MNIST to generate digits, and further refine it via GRPO w/ self-classification. I use mlx-lm's BatchedGenerate backend for both the GRPO step
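The GRPO refinement mentioned above rests on group-relative advantages: several samples are generated per prompt, each is scored (here, by self-classification), and each score is normalized against its own group's statistics rather than a learned value function. A minimal sketch of that advantage step, for illustration only; `group_relative_advantages` is a hypothetical helper, not mlx-lm's actual API:

```python
# Illustrative sketch of GRPO's group-relative advantage computation.
# This is not code from mlx-lm; it only demonstrates the normalization idea.
from statistics import mean, stdev

def group_relative_advantages(rewards, eps=1e-8):
    """Normalize each sample's reward against its group's statistics.

    In GRPO, a group of completions is sampled per prompt and scored;
    each completion's advantage is its reward minus the group mean,
    divided by the group standard deviation (eps avoids division by zero).
    """
    mu = mean(rewards)
    sigma = stdev(rewards) if len(rewards) > 1 else 0.0
    return [(r - mu) / (sigma + eps) for r in rewards]

# Example: four generated digits scored by a self-classifier.
# Above-average samples get positive advantage, below-average negative.
advs = group_relative_advantages([1.0, 0.2, 0.8, 0.2])
```

Because advantages are centered within each group, they sum to roughly zero, so the policy update pushes probability toward the better-than-average samples in a group and away from the worse ones, with no critic network required.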