Ogban (@ogbanugot) 's Twitter Profile
Ogban

@ogbanugot

How is it possible that we have subjective experience?

ID: 65780323

Joined: 15-08-2009 00:38:50

9.9K Tweets

605 Followers

592 Following

Ogban (@ogbanugot) 's Twitter Profile Photo

Here is some recommended reading for you. Max Bennett does a great job weaving through time the key components of the brain’s evolution and intelligence. It pairs well with Gerald Schneider’s more technical “Brain Structure and Its Origins”.

Alex Cheema - e/acc (@alexocheema) 's Twitter Profile Photo

While Apple has been positioning M4 chips for local AI inference with their unified memory architecture, NVIDIA just undercut them massively.

Stacking Project Digits personal computers is now the most affordable way to run frontier LLMs locally.

The 1 petaflop headline feels
Ogban (@ogbanugot) 's Twitter Profile Photo

This is a huge oversimplification. Time-series models have always existed. The breakthrough in autoregressive Transformers is not next-token prediction; it is learning to use all parts of the previous sequence to make that prediction.
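The distinction can be made concrete with a minimal sketch of causal self-attention (illustrative only; identity projections stand in for learned query/key/value weights): every position's output is a weighted mix over *all* earlier positions, which is exactly what older one-step time-series models lacked.

```python
import numpy as np

def causal_self_attention(x):
    """Single-head self-attention with a causal mask: the output at each
    position is a softmax-weighted mix of ALL earlier positions, not just
    the most recent one. Projections are identity maps for brevity."""
    seq_len, d = x.shape
    q, k, v = x, x, x                              # stand-ins for learned W_q, W_k, W_v
    scores = q @ k.T / np.sqrt(d)                  # (seq_len, seq_len) similarities
    mask = np.triu(np.ones((seq_len, seq_len)), k=1).astype(bool)
    scores[mask] = -np.inf                         # forbid attending to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights                    # each row mixes all prior positions

x = np.random.default_rng(0).normal(size=(5, 8))  # toy sequence: 5 tokens, dim 8
out, attn = causal_self_attention(x)
```

The last row of `attn` puts nonzero weight on every previous token, so the prediction there draws on the whole history rather than a fixed window.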

Philipp Koellinger (@pkoellinger) 's Twitter Profile Photo

How to fix scientific publishing 1 - The current business model in scientific publishing is absurd: you either pay to read, or you pay when your work is accepted for publication. Meanwhile, the scientists who do the review work get nothing - not even recognition. DeSci Labs

Ogban (@ogbanugot) 's Twitter Profile Photo

Pantheon’s characters are emulating sentient brains, yet when they display the code supposedly doing all that, it’s feed-forward neural nets written in Python? 🫠
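For reference, the kind of feed-forward net the show's screens depict fits in a few lines of Python (a minimal sketch, not the show's actual code): a fixed, stateless chain of affine maps and nonlinearities, which is the joke about it standing in for a sentient brain.

```python
import numpy as np

def feed_forward(x, weights, biases):
    """Plain feed-forward net: a fixed chain of affine maps + ReLUs.
    No recurrence, no internal state -- same input, same output, always."""
    for W, b in zip(weights, biases):
        x = np.maximum(W @ x + b, 0.0)  # one ReLU layer
    return x

rng = np.random.default_rng(0)
sizes = [4, 8, 3]                       # toy architecture: 4 -> 8 -> 3
Ws = [rng.normal(size=(m, n)) for n, m in zip(sizes, sizes[1:])]
bs = [np.zeros(m) for m in sizes[1:]]
x0 = rng.normal(size=4)
y = feed_forward(x0, Ws, bs)
```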

Noam Brown (@polynoamial) 's Twitter Profile Photo

6 years ago AI pioneer and now Turing Award winner Richard Sutton distilled 75 years of AI into a simple Bitter Lesson: general methods that scale with data and compute ultimately win. With the rise of AI agents it's an important lesson to keep in mind: cs.utexas.edu/~eunsol/course…

Vivek Galatage (@vivekgalatage) 's Twitter Profile Photo

The way neural networks shape and transform data is pure geometry!

Chris Olah’s 2014 blog on manifolds & topology still holds up as a brilliant explainer. It's a must-read for anyone learning about AI 

#DeepLearning #AI

PS: The whole blog is a treasure trove - explore it all!
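The geometric picture in that blog can be sketched in a few lines (an illustrative toy, not Olah's code): a single layer is an affine map, which rotates, stretches, and shifts the input space, followed by a pointwise nonlinearity that bends it.

```python
import numpy as np

def layer(points, W, b):
    """One neural-net layer viewed geometrically: an affine map (stretch,
    shear, shift via W and b) followed by a pointwise tanh that bends space."""
    return np.tanh(points @ W.T + b)

# A ring of 2-D points; the layer deforms it into a squashed, bent curve.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
ring = np.stack([np.cos(theta), np.sin(theta)], axis=1)

W = np.array([[2.0, 0.0], [0.5, 1.5]])   # linear stretch + shear
b = np.array([0.2, -0.1])                # translation
warped = layer(ring, W, b)               # every point stays inside (-1, 1)^2
```

Stacking such layers is what lets a network untangle intertwined data manifolds, which is the blog's central point.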
Andy Keller (@t_andy_keller) 's Twitter Profile Photo

In the physical world, almost all information is transmitted through traveling waves -- why should it be any different in your neural network? Super excited to share recent work with the brilliant Mozes Jacobs: "Traveling Waves Integrate Spatial Information Through Time" 1/14

Ogban (@ogbanugot) 's Twitter Profile Photo

Language in humans evolved as a meta-program to strengthen theory of mind: it forces the brain to continuously engage in self-referential abstraction and reasoning.

ARC Prize (@arcprize) 's Twitter Profile Photo

Today we are announcing ARC-AGI-2, an unsaturated frontier AGI benchmark that challenges AI reasoning systems (same relative ease for humans).

Grand Prize: 85%, ~$0.42/task efficiency

Current Performance:
* Base LLMs: 0%
* Reasoning Systems: <4%
Alessandro Gozzi (@gozziale.bsky.social) (@gozzi_ale) 's Twitter Profile Photo

Breathing isn’t just for life—it’s a brain pacemaker! This excellent review reveals how each breath sends sensory waves that globally synchronize neural networks - with implications for "functional connectivity". Congrats to the authors! 👉 doi.org/10.1038/s41583…

Donald Hoffman (@donalddhoffman) 's Twitter Profile Photo

”It’s now clear that a great deal of information carried by touch neurons converges in the spinal cord and brainstem before reaching the cognitive parts of the brain, suggesting that the touch signals are processed earlier…” quantamagazine.org/touch-our-most…

Quanta Magazine (@quantamagazine) 's Twitter Profile Photo

The computer scientist Ryan Williams recently devised a mathematical procedure for transforming any algorithm — no matter what it does — into a form that uses much less computing space. “I just thought I was losing my mind.” quantamagazine.org/for-algorithms…

Raphaël Millière (@raphaelmilliere) 's Twitter Profile Photo

Transformer-based neural networks achieve impressive performance on coding, math & reasoning tasks that require keeping track of variables and their values. But how can they do that without explicit memory? 📄 Our new ICML paper investigates this in a synthetic setting! 🧵 1/13