Maxwell Ramstead (@mjdramstead)'s Twitter Profile
Maxwell Ramstead

@mjdramstead

Chief Science Officer @noumenal_labs and Honorary Research Fellow @UCLIoN. Free energy principle, active inference, Bayesian mechanics, artificial intelligence

ID: 950844713657094145

https://www.noumenal.ai/ · Joined 09-01-2018 21:40:20

8.8K Tweets

4.4K Followers

1.1K Following

David Shapiro ⏩ (@daveshapi)

Guys, Moore's law just became irrelevant. Beff – e/acc started out working on quantum computing, realized "this is dumb, there's a better way," and then just launched a thermodynamic processor. It's 10,000x more efficient at converting joules into tokens. This means

New American Industrial Alliance (@newindustrials)

Huge news from Gill Verdon and the Extropic team. The next wave of compute is here - and we are honored to have welcomed Extropic as one of our earliest NAIA members last year. Here’s to the future - and the infra it will take to Reindustrialize America.

AI Leaks and News (@aileaksandnews)

Extropic has unveiled its thermodynamic sampling units, or TSUs. TSUs differ from CPUs and GPUs by producing samples from a programmable distribution. This new form of computing is powered by their products XTR-0, X0, and Z1. The era of thermodynamic computing is here.
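
For readers unfamiliar with the idea, "producing samples from a programmable distribution" can be pictured in ordinary software as Gibbs sampling from a Boltzmann distribution whose energy function is set by user-chosen parameters. The sketch below is a hedged illustration of that concept in plain Python/NumPy only; it is not Extropic's hardware, firmware, or API, and the energy form and parameter values are assumptions made for demonstration.

```python
# Illustrative software analogue only (not Extropic's hardware or API): "sampling
# from a programmable distribution" pictured as Gibbs sampling from a Boltzmann
# distribution p(s) ∝ exp(-E(s)), whose energy E(s) = -0.5 * s·J·s - h·s is
# "programmed" by choosing couplings J and biases h over binary spins s ∈ {-1, +1}.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sample(J, h, n_sweeps=200):
    """Single-site Gibbs sampling; J is symmetric with zero diagonal."""
    n = len(h)
    s = rng.choice(np.array([-1, 1]), size=n)
    for _ in range(n_sweeps):
        for i in range(n):
            # Local field felt by spin i from its neighbours and its bias.
            field = J[i] @ s + h[i]
            p_up = 1.0 / (1.0 + np.exp(-2.0 * field))  # P(s_i = +1 | all other spins)
            s[i] = 1 if rng.random() < p_up else -1
    return s

# "Program" a distribution by choosing couplings and biases, then draw samples.
n = 8
J = np.triu(rng.normal(scale=0.5, size=(n, n)), k=1)
J = J + J.T                        # symmetric, zero diagonal
h = rng.normal(scale=0.1, size=n)
samples = np.array([gibbs_sample(J, h) for _ in range(50)])
print("empirical mean of each spin:", samples.mean(axis=0))
```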

Trevor McCourt (@trevormccrt1)

There is absolutely no fundamental reason we build AI the way we do today. There certainly is a radically different approach that is orders of magnitude more energy efficient. I’m going to find it before I die arxiv.org/abs/2510.23972

Brian Gordon (@gordonbrianr)

Active Inference is exciting as a paradigm because it’s an architecture that can deliver genuine world models (‘deeper’ than an LLM’s linguistic gloss). This is a technological trajectory that can get us to a kind of AI that could legitimately be understood as AGI.
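
As a rough, hedged illustration of what a "world model" means in this setting (and not a claim about any particular active inference implementation): in the discrete-state formulation, the agent carries a generative model, typically a likelihood mapping from hidden states to observations plus a transition model over hidden states, and updates its beliefs about those hidden states as observations arrive. The matrices and observation sequence below are invented purely for demonstration.

```python
# Toy sketch of the belief-updating core of a discrete-state world model, in the
# spirit of active inference's generative models (likelihood + transitions).
# All numbers here are made up for illustration.
import numpy as np

A = np.array([[0.9, 0.2],   # A[o, s] = P(observation o | hidden state s); columns sum to 1
              [0.1, 0.8]])
B = np.array([[0.7, 0.3],   # B[s_next, s] = P(next state | current state) for one action
              [0.3, 0.7]])

belief = np.array([0.5, 0.5])           # prior over the two hidden states

def update(belief, obs):
    """Predict with the transition model, then condition on the observation (Bayes' rule)."""
    predicted = B @ belief
    posterior = A[obs] * predicted
    return posterior / posterior.sum()

for obs in [0, 0, 1]:                   # a short, invented observation sequence
    belief = update(belief, obs)
    print("posterior over hidden states:", belief)
```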

The Open Source Press (@theospress)

Today, Extropic did something most AI hardware companies won't: they open-sourced the foundation of their entire technological approach. THRML is a Python library for simulating and accelerating generative AI models through thermodynamic computing. At first, that might sound

Trevor McCourt (@trevormccrt1)

Liron Shapira 1. You clearly don't know anything about probabilistic machine learning. That's ok; it's a graduate-level topic, and I wouldn't expect you to. 2. The 10,000x claim is actually ultra-specific and backed by a paper with a ~40-page appendix of machine learning and physics

Peter Schillinger (@pschilliorange)

Beff – e/acc I hear an independent researcher replicated the DTM experiments using thrml, and now that's open-sourced too 👀 github.com/pschilliOrange…

Giacomo Pedretti (@gpedretti90)

My 2c about Extropic's paper. 1 - Yes, it's a small-scale demonstration, BUT they solved two of the most important issues of probabilistic computing: (a) scaling beyond a single global energy and avoiding exponentially long mixing times, and (b) avoiding bulky RNGs. 1/n
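
To unpack point (a) a little (a hedged, generic sketch, not the construction in Extropic's paper): the usual worry with a single global energy is that one Markov chain over the entire state has to mix, which can take exponentially long. On a sparse graph, the variables can instead be split into blocks, such as the two colours of a checkerboard, and each block resampled in parallel using only local fields, so no global energy ever needs to be evaluated. The toy two-colour Gibbs update below illustrates that structure in NumPy; the lattice size and coupling are made up.

```python
# Toy illustration of blocked (two-colour) Gibbs sampling on a nearest-neighbour
# Ising grid: each colour class is conditionally independent given the other, so a
# whole block is resampled at once from local fields only -- no global energy needed.
# This is a generic textbook construction, not the scheme in Extropic's paper.
import numpy as np

rng = np.random.default_rng(0)
L, coupling = 16, 0.4                     # made-up lattice size and coupling strength
spins = rng.choice(np.array([-1, 1]), size=(L, L))

# Checkerboard masks: "black" and "white" sites form the two blocks.
ii, jj = np.indices((L, L))
black = (ii + jj) % 2 == 0

def local_field(s):
    """Sum of the four nearest neighbours (periodic boundaries), times the coupling."""
    return coupling * (np.roll(s, 1, 0) + np.roll(s, -1, 0) +
                       np.roll(s, 1, 1) + np.roll(s, -1, 1))

def half_sweep(s, mask):
    """Resample every spin in one colour block in parallel from its local field."""
    p_up = 1.0 / (1.0 + np.exp(-2.0 * local_field(s)))
    flips = np.where(rng.random((L, L)) < p_up, 1, -1)
    s[mask] = flips[mask]
    return s

for _ in range(100):                      # alternate the two blocks
    spins = half_sweep(spins, black)
    spins = half_sweep(spins, ~black)

print("mean magnetisation:", spins.mean())
```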

Noumenal Labs (@noumenal_labs)

We’re honored to be backed by cyber•Fund and featured in this post on their investment thesis: cyber.fund/content/noumen… Digital AI stacks simulate randomness on hardware meant to erase it. In contrast, our approach treats noise as computation, enabling the radical acceleration