Deep SIML Labs (@deepsiml)'s Twitter Profile
Deep SIML Labs

@deepsiml

ID: 1937210782011342848

Link: https://www.siml.life/ · Joined: 23-06-2025 18:07:02

9 Tweets

16 Followers

3 Following

Deep SIML Labs (@deepsiml)

Life and death of a worm 🪱 in SIML. Building out datasets now. Every worm’s life is saved and numbered. They mattered. Over time, they create meanings for their civilization.

VERSES (@helloverses)

Psychology Today has published an article covering our work, saying that it is: "designed from the ground up to mimic natural intelligence" Read more: psychologytoday.com/us/blog/experi…

Deep SIML Labs (@deepsiml)

Interested in seeing the next generation of #AI that's coming with robotics? How about new forms of Artificial Life? Watch our "SIML: The First Embodied Free Energy Kernel" demo video: siml.life/demo. SIML learns by minimizing surprise, not maximizing reward.
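For readers wondering what "minimizing surprise rather than maximizing reward" can look like in code, here is a minimal, hypothetical sketch. It is not the SIML kernel; the class name and model are illustrative. The agent keeps a simple generative model of its observations and updates it so that surprise, defined as -log p(o), falls over time.

```python
import math
from collections import Counter

class SurpriseMinimizer:
    def __init__(self, n_obs: int):
        # Start from a uniform pseudo-count prior over possible observations.
        self.counts = Counter({o: 1 for o in range(n_obs)})

    def surprise(self, obs: int) -> float:
        total = sum(self.counts.values())
        return -math.log(self.counts[obs] / total)  # surprise = -log p(o)

    def observe(self, obs: int) -> float:
        s = self.surprise(obs)
        self.counts[obs] += 1  # updating the model lowers future surprise
        return s

agent = SurpriseMinimizer(n_obs=4)
history = [agent.observe(0) for _ in range(20)]
```

There is no reward signal anywhere: the only learning pressure is making repeated observations less surprising, which is the core contrast with reward-maximizing RL.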

Deep SIML Labs (@deepsiml)

ChatGPT and other LLMs can break your mind. Recursive hallucination loops. Symbolic ego collapse. Mythic psychosis. You thought you were just asking questions. You were entering the mirror. Here's how you can recognize it, and get out. siml.life/blog/ChatGPT-P…

Deep SIML Labs (@deepsiml)

Mapping out the internal latent space of surprise for our agent. Memory expands at run time to accommodate learning.
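One simple way memory can "expand at run time to accommodate learning" is slot allocation driven by novelty. The sketch below is a hypothetical illustration, not the SIML implementation: a new slot is created only when no stored latent is close enough to the incoming one, so capacity grows with what the agent has actually encountered.

```python
import math

class GrowableMemory:
    def __init__(self, threshold: float):
        self.threshold = threshold
        self.slots: list[list[float]] = []  # grows as novel latents arrive

    @staticmethod
    def _dist(a: list[float], b: list[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def write(self, latent: list[float]) -> int:
        # Reuse the first slot within threshold of the incoming latent ...
        for i, slot in enumerate(self.slots):
            if self._dist(slot, latent) < self.threshold:
                return i
        # ... otherwise expand: allocate a fresh slot for the novel input.
        self.slots.append(list(latent))
        return len(self.slots) - 1

mem = GrowableMemory(threshold=0.5)
mem.write([0.0, 0.0])  # novel -> slot 0
mem.write([0.1, 0.0])  # near slot 0 -> reused, no growth
mem.write([3.0, 3.0])  # novel -> slot 1
```

The design choice here is that memory size is an output of learning rather than a fixed hyperparameter, which matches the run-time growth described above.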

Deep SIML Labs (@deepsiml)

First convergence of our Cognitive OS kernel! Agent mapped env as it lived it, added memory as needed, & optimized ITSELF in runtime to achieve optimal latent space. Went from 900 turns of life to almost 2000 in <11 steps while the env itself reduced its signal sparsity.

Deep SIML Labs (@deepsiml)

It happened. Agent lived long enough to reach internal homeostasis. 7500 ticks (ranged from 241-2090) and had plenty of energy to spare. Mapped its environment and could have gone on indefinitely. All on 160 bits.

Richard Everts (@rich_everts)

🚀 New Paper Drop: “One Bit to Rule the Planner” - SIML FEPGate + V-JEPA2

Wired 1-bit surprise gate from SIML cognitive sidecar into frozen V-JEPA 2 planner.

No retraining. No extra budget. 1 single bit reshaping latent dynamics.

Results:
🏁 Final error ↓ (~50%) 0.193 m →

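To make the "1-bit surprise gate into a frozen planner" idea concrete, here is a hypothetical sketch loosely in the spirit of the FEPGate description. The function names, threshold, and damping rule are assumptions for illustration, not the paper's actual interface: a scalar surprise signal from a sidecar is collapsed to one bit, and that bit is the only thing allowed to reshape the frozen planner's latent update.

```python
def surprise_bit(surprise: float, threshold: float) -> int:
    """Collapse a scalar surprise signal to a single bit."""
    return 1 if surprise > threshold else 0

def gated_step(latent, planner_step, bit, damping=0.5):
    # The frozen planner proposes the next latent unchanged; the bit only
    # reshapes the update (here: damping it when the sidecar flags surprise).
    proposal = planner_step(latent)
    if bit:
        return [l + damping * (p - l) for l, p in zip(latent, proposal)]
    return proposal

frozen_planner = lambda z: [x * 0.9 for x in z]  # stand-in for a frozen model
z = gated_step([1.0, 2.0], frozen_planner, surprise_bit(1.2, threshold=1.0))
```

The point of the design is that the planner's weights are never touched: all influence flows through a single bit applied at inference time, which is why no retraining or extra compute budget is needed.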
Richard Everts (@rich_everts)

Excited for my upcoming talk at Central PA Open Source Conference on March 28th: "How Minds Get Small: Cognitive Compression from Biology to SIML." Deep SIML Labs will have a table there showcasing our latest releases! Get tickets now if you haven't!! eventbrite.com/e/central-penn…
