Sampath Chanda (@chandasampath)'s Twitter Profile
Sampath Chanda

@chandasampath

Tweets about #CV #DL #ML #SystemDesign Applied Scientist @Amazon @CarnegieMellon @BitsPilaniIndia

ID: 2909477538

Link: https://bio.link/sampathchanda
Joined: 25-11-2014 01:10:04

457 Tweets

181 Followers

885 Following

#CVPR2025 (@cvpr)'s Twitter Profile Photo

Networking like a pro! Follow the Inigo technique: • Greeting: "Hello" • Introduction: "My name is Inigo Montoya." • Connection: "You killed my father." • Call to action: "Prepare to die." #CVPR2024

Sampath Chanda (@chandasampath)'s Twitter Profile Photo

What company has the dumbest customer service agents? PNC Bank! The agent can't understand a simple request, repeatedly says they'll help but doesn't, and refuses to transfer to a supervisor. What can we do? What do you say? PNC Bank Help PNC Bank

Shashwat Goel (@shashwatgoel7)'s Twitter Profile Photo

Paper fresh off the press: The Illusion of Diminishing Returns: Measuring Long Horizon Execution in LLMs. Are small models the future of agentic AI? Is scaling LLM compute not worth the cost due to diminishing returns? Are autoregressive LLMs doomed, and thinking an illusion?

Sampath Chanda (@chandasampath)'s Twitter Profile Photo

Deepseek-OCR says: "A picture is worth a thousand words!" It can compress text context by converting it into visual tokens instead of text tokens.

Reece Shuttleworth (@reeceshuttle)'s Twitter Profile Photo

🧵 LoRA vs full fine-tuning: same performance ≠ same solution. Our NeurIPS '25 paper 🎉 shows that LoRA and full fine-tuning, even when equally well fit, learn structurally different solutions, and that LoRA forgets less and can be made even better (less forgetting) by a simple

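For readers unfamiliar with the parameterization the tweet compares against full fine-tuning, here is a minimal NumPy sketch of the standard LoRA idea (not the paper's code): the pretrained weight W stays frozen, and only a low-rank update B @ A of rank r is trained. The dimensions and alpha value below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 512, 512, 8  # rank r << min(d_in, d_out)

W = rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)  # frozen pretrained weight
A = rng.standard_normal((r, d_in)) / np.sqrt(d_in)      # trainable down-projection
B = np.zeros((d_out, r))                                # zero-init: update starts at 0
alpha = 16.0                                            # LoRA scaling factor

def lora_forward(x):
    # y = W x + (alpha / r) * B (A x); only A and B receive gradients
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# Because B is zero-initialized, the adapted model matches the base model at init.
assert np.allclose(lora_forward(x), W @ x)

print(f"trainable params: LoRA {A.size + B.size} vs full {W.size}")
```

The zero initialization of B is why LoRA training starts exactly at the pretrained model, and the r(d_in + d_out) trainable parameters (here 8,192 vs 262,144) are what make the low-rank solution structurally different from a full-rank update.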
Naval (@naval)'s Twitter Profile Photo

Networking is overrated. Become first and foremost a person of value and the network will be available whenever you need it.