Josh Lehman (@jlehman_)'s Twitter Profile
Josh Lehman

@jlehman_

Partner @ Martian Engineering | ex-ED @urbitfoundation | ex-CTO @starcity (YC S16)

ID: 131996258

Link: https://martian.engineering · Joined: 12-04-2010 00:31:05

702 Tweets

1.1K Followers

1.1K Following

Josh Lehman (@jlehman_):

Got my Limitless pendant on Friday and had a great first experience, but not the one I was expecting. I didn't bother setting it up until today because I bought it for work and didn't do much of that over Easter weekend. I didn't work today, but curiosity was gnawing at me,

Josh Lehman (@jlehman_):

Now with support for o3. Plug in an API key (assuming you've got API access) and it'll use o3 any time you're within its context limits, and otherwise fall back to Gemini 2.5 Pro.
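The routing rule described in the tweet can be sketched as follows. This is a minimal illustration, not the tool's actual implementation; the model names are taken from the tweet, and the context-limit figure and function name are assumptions for the example.

```python
def pick_model(prompt_tokens: int, o3_context_limit: int = 200_000) -> str:
    """Choose a model for a request (hypothetical sketch of the fallback rule).

    Prefer o3 while the prompt fits within its context window; otherwise
    fall back to Gemini 2.5 Pro. The 200k limit here is an assumed placeholder.
    """
    if prompt_tokens <= o3_context_limit:
        return "o3"
    return "gemini-2.5-pro"


# Example: a short prompt routes to o3, an oversized one falls back.
print(pick_model(5_000))    # within the assumed o3 limit
print(pick_model(500_000))  # exceeds it, so Gemini 2.5 Pro
```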

Josh Lehman (@jlehman_):

Replying to Soren Larson: The idea here is mainly that there exist other models that appear to be smarter than 3.7 Sonnet at this point, but Claude Code + Sonnet (necessarily) is better at banging out the implementation than other tools. x.com/tryfoundergg/s…

meatball times (@meatballtimes):

fix the teachers, the grades, the curriculum, teach more practical skills, and let them use AI, and people won't cheat. but after you've done all that, what is left of universities? very possibly nothing, and THAT's the real reason educators are freaking out

Josh Lehman (@jlehman_):

Great post from Thomas H. Ptacek at Fly.io:

"but you have no idea what the code is"

Are you a vibe coding Youtuber? Can you not read code? If so: astute point. Otherwise: what the fuck is wrong with you?

(link below)

Pluralis Research (@pluralishq):

We've reached a major milestone in fully decentralized training: for the first time, we've demonstrated that a large language model can be split and trained across consumer devices connected over the internet - with no loss in speed or performance.