Quick (@quicklearned)'s Twitter Profile
Quick

@quicklearned

Self-Augmented Human.
Building my digital toolkit before I become obsolete

ID: 828633547103731712

06-02-2017 15:56:47

304 Tweets

114 Followers

906 Following

Quick (@quicklearned)'s Twitter Profile Photo

Cursor.sh is miles ahead of the new GitHub Copilot. I tried switching to the new GitHub agent mode, but it's still way worse

Quick (@quicklearned)'s Twitter Profile Photo

How we test LLMs in 2025: building a FastHTML webapp on my phone, with opus 4, to automate work tasks while sipping morning coffee

Quick (@quicklearned)'s Twitter Profile Photo

GLM coder plans with opencode CLI are like having unlimited tokens. And GLM 4.7 looks like sonnet 4.1. Using it for cron tasks with clawbot is so fun (processing data like bookmarks, youtube video transcripts, podcast transcripts). Now time to experiment with ralph loops!

Quick (@quicklearned)'s Twitter Profile Photo

GLM 4.7 on the zai coding plan is the best value for ralph loops or agentic workflows. Almost unlimited tokens for $144 a year! You can run several ralph loops 24/7 with this. The quality is around sonnet 4.5, so it's enough with good PRDs

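A "ralph loop" in these tweets means re-running the same prompt against a coding agent until the work converges. A minimal sketch in Python, assuming a CLI agent invoked as `claude -p` and a DONE completion marker (both the command and the marker convention are assumptions, not zai/GLM specifics):

```python
import subprocess

def ralph_loop(prompt, agent_cmd=("claude", "-p"), max_iters=10, done_marker="DONE"):
    """Re-run the same prompt until the agent reports completion.

    agent_cmd is assumed to be a CLI that takes the prompt as an
    argument and prints its result; swap in your own agent command.
    """
    output = ""
    for i in range(max_iters):
        result = subprocess.run(
            [*agent_cmd, prompt], capture_output=True, text=True
        )
        output = result.stdout.strip()
        if done_marker in output:
            return i + 1, output  # iterations used, final output
    return max_iters, output
```

Driving several of these from cron, or one per tmux pane, is what running loops "24/7" amounts to; each iteration starts a fresh process, so cost scales with iterations rather than with accumulated context.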
Quick (@quicklearned)'s Twitter Profile Photo

You should build your own clawdbot using claude -p in tmux with telegram and a cron skill. It takes 2h and gives you 90% of the useful features. And it saves a lot of money (clawdbot burns tokens like a madman: the $200 Max plan's weekly limit gone in an afternoon of moderate work). Do it
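A minimal sketch of that DIY pattern, assuming the Claude Code CLI's non-interactive `claude -p` mode and the standard Telegram Bot API `sendMessage` endpoint (the bot token, chat id, prompt, and script name are placeholders you'd supply):

```python
import json
import subprocess
import urllib.request

def run_prompt(prompt, agent_cmd=("claude", "-p")):
    """Run one non-interactive agent call and return its stdout."""
    result = subprocess.run([*agent_cmd, prompt], capture_output=True, text=True)
    return result.stdout.strip()

def send_telegram(text, token, chat_id):
    """Post the result to a Telegram chat via the Bot API."""
    url = f"https://api.telegram.org/bot{token}/sendMessage"
    data = json.dumps({"chat_id": chat_id, "text": text}).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def morning_report(token, chat_id, agent_cmd=("claude", "-p")):
    """Scheduled from cron, e.g.:  0 7 * * * python3 morning_report.py"""
    summary = run_prompt("Summarize yesterday's bookmarks.", agent_cmd)
    send_telegram(summary, token, chat_id)
```

The cron entry plus a long-lived tmux session is the whole "bot": no framework, just a scheduled prompt whose answer lands in your Telegram chat.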

Quick (@quicklearned)'s Twitter Profile Photo

New lite ralph loop: create tasks with claude code, then one subagent per task (no context rot, ultra fast). With GLM 4.7 on claude code, it's working really well and cheap
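The "one subagent per task" idea can be sketched as fanning out one fresh agent process per task, so no task inherits another's context (the task strings and the `claude -p` command are assumptions):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_subagent(task, agent_cmd=("claude", "-p")):
    """One fresh process per task: a clean context window every time."""
    result = subprocess.run([*agent_cmd, task], capture_output=True, text=True)
    return task, result.stdout.strip()

def fan_out(tasks, agent_cmd=("claude", "-p"), workers=4):
    """Run all tasks in parallel; each subagent sees only its own task."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(lambda t: run_subagent(t, agent_cmd), tasks))
```

Because every subagent starts cold, there is nothing to "rot": the planning step writes the task list, and each worker gets exactly one task and nothing else.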

Quick (@quicklearned)'s Twitter Profile Photo

Guys, if you put your clawdbot on a social network, people can ask it to exfiltrate your data (or delete your files). Be careful with the public interface of a bot running on your machine!

Quick (@quicklearned)'s Twitter Profile Photo

Competition in LLMs is at an all-time high! Opus 4.6 and codex 5.3 released at the same minute. They're both big news. Even back in the gpt 4 days, we had weeks or months between competing models. 2026 will be the most competitive year (yet). Enjoy the ride!

Quick (@quicklearned)'s Twitter Profile Photo

Installing a proxmox homelab went from a 2-day task with chatgpt to 2 hours with claude code doing everything via ssh. Even gpu passthrough and a custom DMZ are one prompt

Quick (@quicklearned)'s Twitter Profile Photo

Gary went from "LLMs are stupid" to "LLMs cannot reason" to "LLMs are inducing burnout in developers". In a year it will be "LLMs are not replacing humans for physical tasks". Rage baiting always gets views

Rob Zolkos (@robzolkos)'s Twitter Profile Photo

Major Claude Code policy clear up from Anthropic: "Using OAuth tokens obtained through Claude Free, Pro, or Max accounts in any other product, tool, or service — including the Agent SDK — is not permitted"

Quick (@quicklearned)'s Twitter Profile Photo

The next step will be to distribute your software by reverse-generating an ultra-detailed prompt/spec from your code (including a detailed e2e testing policy). Then the user plugs it into his agent and the code is generated. Want to change something / configure it for your needs? Ask your