Mohamed Baioumy (@mo_baioumy)'s Twitter Profile
Mohamed Baioumy

@mo_baioumy

Building @exolabs
PhD in AI & Robotics @oxfordrobots

ID: 805458739

Joined: 05-09-2012 21:47:26

834 Tweets

7.7K Followers

769 Following

EXO Labs (@exolabs)

Controlling my AI assistant from the indestructible Nokia 3310. Sends SMS directly to my AI home cluster with access to all my files, emails, calendar and other apps. Try it yourself, link below.
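The tweet includes no code, but the pipeline it describes (SMS in → local model → SMS out) can be sketched roughly as below, assuming a Twilio-style inbound-SMS webhook and an OpenAI-compatible chat endpoint served by the home cluster. The endpoint URL, port, and model name are placeholders, not exo's actual configuration, and the file/email/calendar access mentioned in the tweet would live behind that endpoint and is not shown.

```python
# Rough sketch of an SMS -> local LLM bridge, assuming a Twilio-style
# inbound-SMS webhook and an OpenAI-compatible chat endpoint exposed by
# the home cluster. URL, port and model name are placeholders.
import requests
from flask import Flask, request
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)

LLM_ENDPOINT = "http://localhost:8000/v1/chat/completions"  # placeholder
MODEL_NAME = "llama-3.1-8b"                                 # placeholder

@app.route("/sms", methods=["POST"])
def incoming_sms():
    # Twilio delivers the SMS text in the "Body" form field.
    user_text = request.form.get("Body", "")

    # Forward the message to the local model.
    reply = requests.post(
        LLM_ENDPOINT,
        json={
            "model": MODEL_NAME,
            "messages": [{"role": "user", "content": user_text}],
        },
        timeout=60,
    ).json()["choices"][0]["message"]["content"]

    # Return the model's answer as the SMS reply (TwiML).
    twiml = MessagingResponse()
    twiml.message(reply[:1600])  # keep the reply SMS-friendly
    return str(twiml)

if __name__ == "__main__":
    app.run(port=5000)
```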

EXO Labs (@exolabs)

2024 wrapped
- 34 VC rejections
- a few investors said yes, including Naval
- 1 co-founder quit
- pivoted 3 times
- launched repo, hit #1 trending in the world and 18k stars
- backdoor attempt on repo through innocent-looking PR (North Korea?)
- approached twice for acquisition
- …

EXO Labs (@exolabs)

Introducing EXO Private Search: privacy-preserving web search for local LLMs. Augments local LLMs with real-time search using Linearly Homomorphic Encryption:
- Live data from X, crypto, Wikipedia (more coming)
- 100,000x less data transfer than client sync
- <2s round-trip time

Mohamed Baioumy (@mo_baioumy)

Local LLMs are private but lack real-time data like weather, stock prices, or the latest posts on X. Using homomorphic encryption, we can augment local LLMs with web search while maintaining end-to-end privacy.
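Neither tweet gives implementation details, but the core trick behind this kind of private lookup can be illustrated with a toy example: the client encrypts a one-hot query vector under an additively (linearly) homomorphic scheme, the server computes the encrypted dot product with its database without learning which row was requested, and only the client can decrypt the result. The sketch below uses textbook Paillier with tiny demo primes; it illustrates the principle only and is not EXO's protocol or anywhere near production security.

```python
# Toy private lookup with an additively homomorphic scheme (textbook
# Paillier, tiny insecure demo primes). Illustrates the principle only;
# this is not EXO's protocol. Requires Python 3.9+.
import math
import random

# --- Paillier key setup (demo-sized primes; real keys are 2048+ bits) ---
p, q = 293, 433
n = p * q                      # public modulus
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # private exponent
g = n + 1

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # private scaling factor

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# --- Server-side database (e.g., current prices, as small integers) ---
database = [101, 57, 42, 7777, 13]

# --- Client: encrypt a one-hot selection vector for index 3 ---
wanted = 3
query = [encrypt(1 if i == wanted else 0) for i in range(len(database))]

# --- Server: homomorphic dot product; it never learns `wanted` ---
# Multiplying ciphertexts adds plaintexts, and E(m)^d = E(m*d), so this
# computes E(sum_i query_i * database_i) = E(database[wanted]).
result = 1
for c, d in zip(query, database):
    result = (result * pow(c, d, n2)) % n2

# --- Client: decrypt the single returned ciphertext ---
assert decrypt(result) == database[wanted]
print("retrieved:", decrypt(result))
```

The "100,000x less data transfer" figure in the announcement presumably comes from this shape of protocol: the client sends one encrypted query and receives one ciphertext, rather than syncing the whole index locally.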

EXO Labs (@exolabs)

What if we could train an open-source AI model on 1,000 Macs? EXO is excited to announce EXO Gym, an open research competition for low-bandwidth distributed training algorithms with access to up to 1,000 Macs. Today, every frontier AI model is trained on clusters of NVIDIA

EXO Labs (@exolabs)

What if we could connect all the dark compute across the globe to build the world's biggest AI data center? Most of the compute in the world is dark: phones, laptops, Teslas, PS5s, TVs. These devices have powerful GPUs but are mostly sitting idle. Today, EXO Labs is

EXO Labs (@exolabs)

Giving an AI agent access to my iPhone. Your phone knows you better than anyone. What if your AI agent could go through your phone to truly understand you? The EXO Agent uses iPhone mirroring to look through your apps including YouTube/Netflix watch history, X likes and photos.

EXO Labs (@exolabs)

First look at SPARTA, a distributed AI training algorithm that avoids synchronization by randomly exchanging sparse sets of parameters (<0.1%) asynchronously between GPUs. Preliminary results and details of our Mac Mini training run are available now (link below). SPARTA
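Based only on the description above (workers randomly exchange a sparse subset, under 0.1%, of their parameters instead of synchronizing gradients), a minimal simulation of the idea might look like the following. The index-selection rule, the averaging step, and the synchronous loop are assumptions made for illustration, not the actual SPARTA algorithm.

```python
# Toy simulation of the sparse-exchange idea: each step, workers average
# a tiny random subset (well under 0.1%) of their parameters instead of
# synchronizing full gradients. Assumptions for illustration only.
import numpy as np

NUM_WORKERS = 4
NUM_PARAMS = 1_000_000
EXCHANGE_FRACTION = 0.0005      # 0.05% of parameters per exchange

rng = np.random.default_rng(0)

# Each worker starts from a slightly different copy of the model.
params = [rng.normal(size=NUM_PARAMS) for _ in range(NUM_WORKERS)]

def local_training_step(theta):
    # Stand-in for a real gradient step: small random perturbation.
    return theta - 0.01 * rng.normal(size=theta.shape)

for step in range(100):
    # Each worker trains independently (no gradient sync).
    params = [local_training_step(theta) for theta in params]

    # Sparse exchange: pick a random sliver of indices and replace each
    # worker's values there with the mean across workers.
    k = int(EXCHANGE_FRACTION * NUM_PARAMS)
    idx = rng.choice(NUM_PARAMS, size=k, replace=False)
    mean_slice = np.mean([theta[idx] for theta in params], axis=0)
    for theta in params:
        theta[idx] = mean_slice

# Workers drift apart from local steps but are pulled together by the
# sparse exchanges; report the residual divergence.
spread = np.mean([np.linalg.norm(theta - params[0]) for theta in params[1:]])
print(f"mean parameter distance to worker 0: {spread:.3f}")
```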

Matt Beton (@mattbeton)

Training an LLM on 8 M4 Mac Minis. Ethernet interconnect between Macs is 100x slower than NVLink, so Macs can’t synchronise model gradients every training step. I got DiLoCo running so the Macs synchronise once every 1000 training steps, using 1000x less communication than DDP.
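A rough sketch of the pattern described here, on a synthetic problem: each worker runs many local optimizer steps, and only at each synchronisation point do the workers communicate, averaging their parameter deltas ("pseudo-gradients") and applying them to shared weights with an outer optimizer. The inner plain-SGD loop, the outer momentum step, and the toy least-squares objective are simplifications for illustration; this is not the exo/DiLoCo codebase.

```python
# Toy sketch of a DiLoCo-style loop: workers take many local steps and
# only communicate every SYNC_EVERY steps, averaging parameter deltas and
# applying them with an outer optimizer. Simplified for illustration.
import numpy as np

rng = np.random.default_rng(0)

NUM_WORKERS = 8          # e.g., 8 Mac Minis
DIM = 50
SYNC_EVERY = 1000        # local steps between synchronisations
OUTER_ROUNDS = 20
INNER_LR, OUTER_LR, OUTER_MOMENTUM = 0.01, 0.7, 0.5

# Each worker holds its own shard of a least-squares problem ||A x - b||^2
# generated from a shared ground truth, so local optima roughly agree.
x_true = rng.normal(size=DIM)
A = [rng.normal(size=(200, DIM)) for _ in range(NUM_WORKERS)]
b = [A[w] @ x_true + 0.1 * rng.normal(size=200) for w in range(NUM_WORKERS)]

global_x = np.zeros(DIM)       # shared ("global") parameters
outer_velocity = np.zeros(DIM)

for round_ in range(OUTER_ROUNDS):
    deltas = []
    for w in range(NUM_WORKERS):
        x = global_x.copy()
        # --- inner loop: purely local, no communication ---
        for _ in range(SYNC_EVERY):
            grad = 2 * A[w].T @ (A[w] @ x - b[w]) / len(b[w])
            x -= INNER_LR * grad
        deltas.append(global_x - x)   # this worker's pseudo-gradient

    # --- outer step: the only communication, once per SYNC_EVERY steps ---
    pseudo_grad = np.mean(deltas, axis=0)
    outer_velocity = OUTER_MOMENTUM * outer_velocity + pseudo_grad
    global_x -= OUTER_LR * outer_velocity

    loss = np.mean([np.mean((A[w] @ global_x - b[w]) ** 2)
                    for w in range(NUM_WORKERS)])
    print(f"round {round_:2d}  mean loss {loss:.4f}")
```

The communication saving follows directly from the structure: one exchange of a DIM-sized delta per SYNC_EVERY local steps, versus one gradient exchange per step under DDP.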

Alex Cheema - e/acc (@alexocheema)

8 months ago, exo was a hackathon project. Today it's front page of The Wall Street Journal. We're a real company now (I guess..?), we raised some money from a few investors like Naval, hit #1 trending on GitHub, published at ICML, shipped an enterprise product, and we're

Alex Cheema - e/acc (@alexocheema)

Apple have given me early access to 2 maxed out M3 Ultra 512GB Mac Studios ahead of the public release. I will run the full DeepSeek R1 (8-bit) using EXO Labs or die trying. The 1TB(!!) of Unified Memory should be enough for all 671B parameters + context.
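The memory claim can be sanity-checked with quick arithmetic. The pooled-capacity figure treats the two 512GB machines as roughly 1,024 GiB of unified memory, and the headroom estimate ignores OS/runtime overhead and the KV cache, which depend on context length and DeepSeek's attention layout; these are rough assumptions, not measured figures.

```python
# Back-of-the-envelope check of the memory claim (rough assumptions only).
PARAMS = 671e9                 # DeepSeek R1 total parameters
BYTES_PER_PARAM = 1            # 8-bit quantisation

weights_gib = PARAMS * BYTES_PER_PARAM / 2**30
pooled_gib = 2 * 512           # two 512GB M3 Ultra Mac Studios, ~1,024 GiB

print(f"weights at 8-bit: {weights_gib:,.0f} GiB")                   # ~625 GiB
print(f"headroom for KV cache etc.: {pooled_gib - weights_gib:,.0f} GiB")  # ~399 GiB
```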

Tycho van der Ouderaa (@tychovdo)

This past spring, I spent time with the EXO Labs team to work on a new DL optimizer and wire up clusters of Macs for distributed TRAINING on Apple Silicon. If you’re at ICML, be sure to come by the ES-FoMo@ICML2025 workshop (posters 1-2:30pm) this Saturday. I’ll be there to share some
