
Josh Clemm
@joshclemm
Software engineer building AI. VP Eng at @Dropbox. Formerly @UberEats, @LinkedIn. Founder of Draft Punk and @YaddleAI.
ID: 132739629
https://joshclemm.com 14-04-2010 02:01:58
383 Tweets
2.2K Followers
94 Following

Replying to Ethan Mollick
LLM memory is still a blunt instrument. The real challenge isn’t storing more. It’s knowing what to use when. Same problem as RAG: context selection. Right now, models treat all context (memory or fresh input) as gospel. That’s gotta change.
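
That selection step can be sketched in a few lines; this is a toy illustration, not any particular product's memory system. Stored memories are scored against the current query and only the ones that clear a relevance bar get injected into the prompt. The word-overlap scorer stands in for a real embedding model, and every name and threshold below is made up:

```python
import re

def words(text: str) -> set[str]:
    """Lowercased alphanumeric tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def relevance(query: str, memory: str) -> float:
    """Fraction of query words that the memory covers (0..1)."""
    q = words(query)
    return len(q & words(memory)) / len(q) if q else 0.0

def select_memories(query: str, memories: list[str],
                    min_score: float = 0.1, max_items: int = 3) -> list[str]:
    """Inject only memories relevant enough to the query, capped in number."""
    scored = sorted(((relevance(query, m), m) for m in memories), reverse=True)
    return [m for score, m in scored[:max_items] if score >= min_score]

memories = [
    "User prefers concise answers",
    "User is planning a trip to Japan in May",
    "User's favorite editor is Vim",
]
print(select_memories("What should I pack for Japan?", memories))
# -> ['User is planning a trip to Japan in May']
```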


Replying to tobi lutke and Hiten Shah
This is why RAG isn't dead. Selecting the right context (retrieval) is really the key difference in output quality. LLMs still massively struggle with bad or too much context. Humans still need to provide the right context!
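
The retrieval side of the same point can be sketched too, again purely as illustration. Assuming a retriever has already scored candidate chunks, weak matches get dropped and the rest is capped by a rough token budget, since too much context hurts output quality as much as bad context. The chunk text, similarity scores, and limits here are all hypothetical:

```python
def build_context(scored_chunks: list[tuple[float, str]],
                  min_similarity: float = 0.75,
                  token_budget: int = 1500) -> str:
    """Assemble prompt context from (similarity, text) pairs, best first."""
    selected, used = [], 0
    for score, text in sorted(scored_chunks, reverse=True):
        if score < min_similarity:
            break  # everything after this is an even weaker match
        cost = len(text.split())  # crude token estimate: whitespace words
        if used + cost > token_budget:
            break  # too much context hurts as much as bad context
        selected.append(text)
        used += cost
    return "\n\n".join(selected)

chunks = [
    (0.91, "Refund policy: orders can be returned within 30 days."),
    (0.84, "Returns require the original receipt or order number."),
    (0.52, "Our company was founded in 2004 in Seattle."),  # weak match, dropped
]
print(build_context(chunks))
```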