Maxime Chevalier (@love2code)'s Twitter Profile
Maxime Chevalier

@love2code

💖 ➞ λ: Compiler designer working on YJIT, a JIT compiler inside CRuby. Casual audiophile. Follow me for code reviews, stock picks and dating advice 💖🌈 🇨🇦

ID: 423065127

Link: https://pointersgonewild.com · Joined: 28-11-2011 01:40:25

6.6K Tweets

17.17K Followers

290 Following

Maxime Chevalier (@love2code)

This benchmark does a very good job of demonstrating how poorly the current generation of models generalize to data they haven't been extensively trained on. There's room for lots more innovation yet!

Maxime Chevalier (@love2code)

Tried to ask the Gemini Code CLI to optimize some triangle rasterization code, but there were many visual glitches and Gemini couldn't seem to fix the problem. Found the fix: I had to ask "please try to do the calculations in a way that is numerically robust" in the prompt 😆
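
For illustration, a minimal Rust sketch of one way to make triangle rasterization numerically robust: evaluate exact integer edge functions per pixel instead of stepping floating-point edge slopes, so adjacent triangles classify their shared edge identically and no cracks or double-drawn pixels appear. The names and framebuffer layout here are hypothetical; this is not the code Gemini produced.

```rust
// Signed area of the parallelogram (a, b, p). Exact in integer arithmetic,
// which is what makes the inside/outside test robust.
fn edge(ax: i64, ay: i64, bx: i64, by: i64, px: i64, py: i64) -> i64 {
    (bx - ax) * (py - ay) - (by - ay) * (px - ax)
}

/// Fill a triangle with a single color into `fb`, a width*height buffer
/// of packed u32 pixels. Works for either winding order.
fn fill_triangle(
    fb: &mut [u32], width: i64, height: i64,
    v0: (i64, i64), v1: (i64, i64), v2: (i64, i64), color: u32,
) {
    let min_x = v0.0.min(v1.0).min(v2.0).max(0);
    let max_x = v0.0.max(v1.0).max(v2.0).min(width - 1);
    let min_y = v0.1.min(v1.1).min(v2.1).max(0);
    let max_y = v0.1.max(v1.1).max(v2.1).min(height - 1);

    for y in min_y..=max_y {
        for x in min_x..=max_x {
            let w0 = edge(v1.0, v1.1, v2.0, v2.1, x, y);
            let w1 = edge(v2.0, v2.1, v0.0, v0.1, x, y);
            let w2 = edge(v0.0, v0.1, v1.0, v1.1, x, y);
            // Inside when all three edge functions agree in sign.
            if (w0 >= 0 && w1 >= 0 && w2 >= 0) || (w0 <= 0 && w1 <= 0 && w2 <= 0) {
                fb[(y * width + x) as usize] = color;
            }
        }
    }
}
```

The floating-point version tends to glitch precisely at shared edges, where tiny rounding differences flip the inside/outside decision from one triangle to the next.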

Maxime Chevalier (@love2code)

IMO this will ultimately turn out to be the wrong approach. Your robot dataset will quickly become stale if you modify the robot platform. You want to be able to learn directly from videos of humans doing tasks and then fine-tune in simulation.

Maxime Chevalier (@love2code)

One issue I have with Gemini Code CLI is that it tends to make changes you didn't ask for. I go and rename a method, and it undoes the change. I sometimes have to repeatedly tell it to leave some things alone 😆

Maxime Chevalier (@love2code)

I optimized my rasterization code and was able to get ~2200 polygons rendering in real-time in my toy programming language. This is running in an interpreter using a single thread. The fact that it's flat shaded, not textured, makes it not too hard to get good performance.
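
As a rough illustration of why flat shading is cheap: the lighting math runs once per face, so the per-pixel work reduces to storing a precomputed color (no texture fetch, no per-pixel interpolation). A minimal Rust sketch with hypothetical names, unrelated to the actual Plush renderer:

```rust
#[derive(Clone, Copy)]
struct Vec3 { x: f32, y: f32, z: f32 }

fn dot(a: Vec3, b: Vec3) -> f32 {
    a.x * b.x + a.y * b.y + a.z * b.z
}

/// Compute one Lambertian color for an entire triangle from its face normal.
/// Both vectors are assumed to be normalized.
fn flat_shade(face_normal: Vec3, light_dir: Vec3, base_rgb: (u8, u8, u8)) -> u32 {
    let intensity = dot(face_normal, light_dir).max(0.0).min(1.0);
    let scale = |c: u8| (c as f32 * intensity) as u32;
    // Pack into 0xAARRGGBB; this single value is then written for every
    // covered pixel of the triangle.
    0xFF00_0000 | (scale(base_rgb.0) << 16) | (scale(base_rgb.1) << 8) | scale(base_rgb.2)
}
```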

Maxime Chevalier (@love2code)

Let's make building datacenters much harder than it needs to be. After all, why not? We need a new investment fad after quantum computing.

Maxime Chevalier (@love2code)

Brainstorming low poly city game designs with the AI (Grok Imagine) 🤔 You can now visualize/storyboard game ideas without writing any code or modeling anything. Seems like an awesome design tool. Even more so if you're a lone coder.

Maxime Chevalier (@love2code)

Using simple optimizations like inline caching and code patching, I was able to make my toy programming language do raytracing 3.5x faster. I wrote a blog post about it! :) pointersgonewild.com/2025-10-12-opt…
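
For readers unfamiliar with the technique, here is a minimal Rust sketch of a monomorphic inline cache at an interpreter call site: the slow hash-table lookup runs once, then the call site is patched so later calls with the same receiver class take a single-compare fast path. The names and data structures are hypothetical, not the actual Plush implementation described in the post.

```rust
use std::collections::HashMap;

type ClassId = u32;
type MethodId = u32;

struct Object {
    class_id: ClassId,
    // the object's own fields are omitted for brevity
}

/// One cache entry stored in (or next to) the call-site bytecode.
/// A fresh call site starts as { cached_class: None, cached_method: 0 }.
struct InlineCache {
    cached_class: Option<ClassId>,
    cached_method: MethodId,
}

/// Full method lookup: one hash lookup for the class, one for the name.
fn slow_lookup(
    tables: &HashMap<ClassId, HashMap<String, MethodId>>,
    class_id: ClassId,
    name: &str,
) -> MethodId {
    tables[&class_id][name]
}

/// Monomorphic inline cache: hit = one compare, miss = full lookup + patch.
fn lookup_method(
    ic: &mut InlineCache,
    tables: &HashMap<ClassId, HashMap<String, MethodId>>,
    receiver: &Object,
    name: &str,
) -> MethodId {
    if ic.cached_class == Some(receiver.class_id) {
        return ic.cached_method; // fast path: cache hit
    }
    // Slow path: do the lookup once, then patch the cache so the next call
    // with the same receiver class skips it.
    let method = slow_lookup(tables, receiver.class_id, name);
    ic.cached_class = Some(receiver.class_id);
    ic.cached_method = method;
    method
}
```

Code patching takes the same idea further: instead of updating a data structure, the runtime rewrites the instruction at the call site so the fast path is baked directly into the code.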

Maxime Chevalier (@love2code)

Today I vibe-coded a procedural 808-style drum machine in Plush, my toy programming language 🧸🎵🥁🤖. Gemini Code CLI wrote ~95% of the code, but tweaks and manual debugging were required. I'm sharing the code of this program as CC0 public domain: github.com/maximecb/plush…
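
For context, an 808-style kick is commonly synthesized as a sine oscillator whose pitch and amplitude both decay exponentially. A minimal Rust sketch of that idea (a generic illustration, not the Plush code in the linked repo):

```rust
/// Generate one 808-style kick hit as mono f32 samples in [-1, 1].
fn synth_808_kick(sample_rate: f32, duration_s: f32) -> Vec<f32> {
    let n = (sample_rate * duration_s) as usize;
    let mut out = Vec::with_capacity(n);
    let mut phase = 0.0f32;

    for i in 0..n {
        let t = i as f32 / sample_rate;
        // Pitch sweeps quickly from ~150 Hz down toward ~50 Hz.
        let freq = 50.0 + 100.0 * (-t * 18.0).exp();
        // Amplitude decays over the length of the hit.
        let amp = (-t * 6.0).exp();
        phase += 2.0 * std::f32::consts::PI * freq / sample_rate;
        out.push(amp * phase.sin());
    }
    out
}
```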

Maxime Chevalier (@love2code)

The more I play with coding LLMs, the more I feel like those predictions for AGI and no more coding jobs in two years are dumb takes by clueless people with a vested interest in receiving AI investments. Valuable tools for sure, but "true" AGI might still be 10-20 years away.

Maxime Chevalier (@love2code)

Vibe coding of the day: a spectrum analyzer in Plush, using FFT and mel frequency bins :) (sorry for the potato audio quality, macOS screen recording can only record audio using the laptop mic) github.com/maximecb/plush…
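
The underlying math is roughly: take the FFT magnitude of an audio frame, then group the bins into bands spaced evenly on the mel scale (mel = 2595 * log10(1 + f/700)). A minimal Rust sketch using the rustfft crate; the Plush version in the repo is a separate implementation, and windowing is skipped here for brevity.

```rust
use rustfft::{num_complex::Complex, FftPlanner};

/// Convert a frequency in Hz to the mel scale.
fn mel(hz: f32) -> f32 {
    2595.0 * (1.0 + hz / 700.0).log10()
}

/// Turn one frame of audio samples into `num_bands` mel-binned magnitudes.
fn mel_spectrum(samples: &[f32], sample_rate: f32, num_bands: usize) -> Vec<f32> {
    let n = samples.len();
    let mut planner = FftPlanner::<f32>::new();
    let fft = planner.plan_fft_forward(n);

    let mut buf: Vec<Complex<f32>> =
        samples.iter().map(|&s| Complex { re: s, im: 0.0 }).collect();
    fft.process(&mut buf);

    let max_mel = mel(sample_rate / 2.0);
    let mut bands = vec![0.0f32; num_bands];

    // Only the first n/2 bins are unique for real-valued input.
    for (i, c) in buf.iter().take(n / 2).enumerate() {
        let hz = i as f32 * sample_rate / n as f32;
        let band = ((mel(hz) / max_mel) * num_bands as f32) as usize;
        bands[band.min(num_bands - 1)] += c.norm();
    }
    bands
}
```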

Ian Hanschen (@furan)

Amiga used stacked protoboard "towers" to prototype the custom chips - mostly a combination of 74xx TTL and PAL. Photo credit: Martin Becker

Maxime Chevalier (@love2code)

Asked Grok a programming question and it suddenly started answering in Spanish. I guess the probability of switching language mid-document is nonzero in the training data? LLMs have the weirdest failure modes.