niszoig
@kartikc14
ID: 1177919974490329089
http://kartikchincholikar.github.io 28-09-2019 12:16:28
424 Tweets
67 Followers
593 Following
Amazing that Jürgen Schmidhuber gave this talk back in 2012, months before the AlexNet paper was published. In 2012, many of the things he discussed were dismissed as funny, a joke, but the same talk today would sit at the center of AI debate and controversy. Full talk:
Slides for my lecture “LLM Reasoning” at Stanford CS 25: dennyzhou.github.io/LLM-Reasoning-… Key points: 1. Reasoning in LLMs simply means generating a sequence of intermediate tokens before producing the final answer. Whether this resembles human reasoning is irrelevant. The crucial
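The tweet's key point can be made concrete with a toy sketch: "reasoning" here just means the model emits intermediate tokens before the final-answer tokens. Below is a minimal illustration (hypothetical helper, not from the Stanford slides) where a multi-step arithmetic solver emits each intermediate result as text before the answer, mimicking a chain-of-thought trace.

```python
# Toy illustration of "reasoning = intermediate tokens before the answer".
# A solver applies operations to a running total and emits each step as a
# text token, then the final answer, like a chain-of-thought trace.

def solve_with_trace(expr_steps):
    """expr_steps: list of (op, value) pairs applied to a running total."""
    total = 0
    trace = []
    for op, val in expr_steps:
        if op == "+":
            total += val
        elif op == "*":
            total *= val
        trace.append(f"{op}{val} -> {total}")  # intermediate "reasoning" token
    trace.append(f"answer: {total}")           # final-answer token
    return trace

print(solve_with_trace([("+", 3), ("*", 4), ("+", 2)]))
# each intermediate token is checkable before the final answer is produced
```

The point of the sketch: whether or not this resembles human reasoning, the mechanism is just extra generated tokens that expose (and condition on) intermediate state.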
Gary Marcus Gary’s right — “distribution shift” is one of AI’s biggest unsolved problems. In simple terms, it means this: when an AI system is trained on one type of data but then encounters something even slightly different in the real world, its performance often collapses. It’s like
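The collapse under distribution shift described in this tweet is easy to demonstrate with a toy classifier. In this sketch (stdlib only, all names hypothetical), a threshold "model" is fit on one data distribution, then evaluated on test inputs that have been shifted slightly; in-distribution accuracy is near perfect, while shifted accuracy falls to chance.

```python
import random

random.seed(0)

# Toy training data: scalar feature x, label 1 iff x comes from the "high" mode.
train = [(random.gauss(0, 0.5), 0) for _ in range(100)] + \
        [(random.gauss(2, 0.5), 1) for _ in range(100)]

# "Model": a threshold halfway between the two class means.
m0 = sum(x for x, y in train if y == 0) / 100
m1 = sum(x for x, y in train if y == 1) / 100
threshold = (m0 + m1) / 2

def predict(x):
    return 1 if x > threshold else 0

def accuracy(data):
    return sum(predict(x) == y for x, y in data) / len(data)

test_iid = [(random.gauss(0, 0.5), 0) for _ in range(100)] + \
           [(random.gauss(2, 0.5), 1) for _ in range(100)]
# Covariate shift: every test input is moved by +3, labels unchanged.
test_shifted = [(x + 3, y) for x, y in test_iid]

print(accuracy(test_iid))      # near 1.0 on data like the training set
print(accuracy(test_shifted))  # collapses toward 0.5: everything exceeds the threshold
```

The shifted inputs all land above the learned threshold, so the model predicts one class for everything; this is the "slightly different real-world data" failure mode in miniature.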
Peter Voss 🤷♂️ while having to deal with size, occlusions, and photometric variations that have nothing to do with the meaning.
Dileep George I know it’s popular to hate tokenizers, but visual representations (which are also tokenized) bring a lot of messiness as well: aspect ratios, cropping, resolution, brightness, etc. Sure, models learn to deal with that, but it takes a lot of data to make them robust to these variations.
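To see the "visual representations are also tokenized" point, here is a minimal ViT-style patchification sketch (hypothetical helper, NumPy only). Two captures of the same scene at different sizes yield different token counts, and the non-divisible image gets silently cropped before the model sees anything, which is exactly the messiness the tweet describes.

```python
import numpy as np

def patchify(img, patch=16):
    """ViT-style visual "tokenization": split an image into flat patch tokens."""
    h, w, c = img.shape
    h2, w2 = h // patch * patch, w // patch * patch  # crop the remainder pixels
    img = img[:h2, :w2]
    tokens = img.reshape(h2 // patch, patch, w2 // patch, patch, c)
    tokens = tokens.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * c)
    return tokens

# Same scene, two capture settings -> different numbers of visual tokens,
# and the odd-sized image loses pixels to cropping before tokenization.
wide = np.zeros((480, 640, 3), dtype=np.uint8)  # 4:3 aspect ratio
tall = np.zeros((500, 375, 3), dtype=np.uint8)  # not divisible by 16, gets cropped

print(patchify(wide).shape)  # (1200, 768): a 30x40 grid of 16x16x3 patches
print(patchify(tall).shape)
```

Resolution changes the token count, aspect ratio changes the grid shape, and cropping discards content, so the model needs lots of data to become invariant to all of it.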
A new episode of The Information Bottleneck podcast!🎙️ This week we talked with Randall Balestriero, assistant professor at Brown University, about Joint Embedding Predictive Architectures (JEPA) 🥳🥳🥳
Building fully native Android apps is so easy!
1. Install Android Studio
2. Open Android Studio and start a blank project
3. Install Google Antigravity
4. Point Google Antigravity at the folder created by Android Studio
5. Set the agent to Gemini 3 Flash and tell Google Antigravity what you