Josh Clemm (@joshclemm)'s Twitter Profile
Josh Clemm

@joshclemm

Software engineer building AI. VP Eng at @Dropbox. Formerly @UberEats, @LinkedIn. Founder of Draft Punk and @YaddleAI.

ID: 132739629

Website: https://joshclemm.com · Joined: 14-04-2010 02:01:58

383 Tweets

2.2K Followers

94 Following

Josh Clemm (@joshclemm):

Costco's worth $500 trillion? This is a great example of why retrieval quality needs to be rock solid before working on generative AI. LLMs still treat any passed-in context as gospel, so your RAG pipeline needs to be super precise.
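The point above can be sketched in a few lines: filter retrieved passages by a relevance score before anything reaches the model, rather than blindly stuffing top-k results into the prompt. This is a minimal toy sketch (bag-of-words cosine similarity, a made-up threshold), not a production retriever.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], threshold: float = 0.3) -> list[str]:
    """Return only passages whose similarity clears the threshold;
    low-relevance passages never enter the LLM's context."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    return [d for score, d in sorted(scored, reverse=True) if score >= threshold]
```

Since the model treats whatever it is given as ground truth, dropping marginal passages at this stage matters more than any prompt wording downstream.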

Hiten Shah (@hnshah):

Bad AI products come from teams who don’t use AI. Good ones come from teams who feel the friction, fight through the flaws, and keep building. You don’t design this stuff. You uncover it by living with it. By using AI daily and figuring out how to make it work right.

Josh Clemm (@joshclemm):

Ethan Mollick LLM memory is still a blunt instrument. The real challenge isn't storing more. It's knowing what to use when. Same problem as RAG: context selection. Right now, models treat all context (memory or fresh input) as gospel. That's gotta change.

Josh Clemm (@joshclemm):

Another day, another paper showing LLMs seriously struggle to say "I don't know" when there's insufficient context (aka abstention). Instead they just hallucinate. And reasoning LLMs are even worse. Better retrieval (RAG) is still the best way to get better quality.
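One practical workaround for weak model-side abstention is to enforce it in the pipeline: if retrieval finds nothing that clears a confidence bar, return "I don't know" without ever calling the model. A minimal sketch, where `score_fn`, `llm`, and the `min_score` cutoff are all hypothetical placeholders:

```python
def answer_with_abstention(question, passages, score_fn, min_score=0.5, llm=None):
    """Only call the model when retrieval found sufficient context;
    otherwise abstain explicitly rather than letting it hallucinate."""
    scored = [(score_fn(question, p), p) for p in passages]
    best_score, best_passage = max(scored, default=(0.0, None))
    if best_score < min_score:
        return "I don't know: no passage in the index supports an answer."
    # 'llm' is a hypothetical callable; a real pipeline would call a model API here
    prompt = (
        "Answer ONLY from the context below; say 'I don't know' otherwise.\n\n"
        f"Context: {best_passage}\n\nQ: {question}"
    )
    return llm(prompt) if llm else prompt
```

The abstention decision lives in code, where it is deterministic, instead of hoping the model declines on its own.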

tobi lutke (@tobi):

I really like the term “context engineering” over prompt engineering. It describes the core skill better: the art of providing all the context for the task to be plausibly solvable by the LLM.

Josh Clemm (@joshclemm):

tobi lutke Hiten Shah This is why RAG isn't dead. Selecting the right context (retrieval) is really the key difference in output quality. LLMs still massively struggle with bad or too much context. Humans still need to provide the right context!
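The "too much context" half of this can be sketched as a greedy packing step: keep the best-scoring passages until a token budget runs out, and drop everything else. A toy illustration (whitespace-split as a crude token count, made-up budget), not a tokenizer-accurate implementation:

```python
def pack_context(passages_with_scores, budget_tokens=1000):
    """Greedy context packing: take the best-scoring passages until the
    (rough, whitespace-token) budget is spent; skip what doesn't fit."""
    chosen, used = [], 0
    for score, text in sorted(passages_with_scores, reverse=True):
        n = len(text.split())  # crude token estimate
        if used + n > budget_tokens:
            continue
        chosen.append(text)
        used += n
    return "\n\n".join(chosen)
```

Capping context this way forces an explicit ranking decision about what the model sees, instead of dumping everything retrieved into the prompt.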