Jacek Migdal (@jakozaur)'s Twitter Profile
Jacek Migdal

@jakozaur

Co-founder of Quesma. Database gateway. Also, spontaneous "let's fix the world" policy insights. Ex-Sumo Logic ($0 to $300M ARR). 🇵🇱

ID: 709235256

Website: http://jacek.migdal.pl · Joined: 21-07-2012 16:36:00

2.2K Tweets

1.1K Followers

915 Following

Jacek Migdal (@jakozaur)'s Twitter Profile Photo

Outbox pattern. <a href="/gunnarmorling/">Gunnar Morling 🌍</a> seems to be burnt by dual writes. For example, if you add two rows into different SQL tables in a single transaction, there are no atomic guarantees after Debezium CDC puts that into Kafka.

The Outbox pattern fixes that by adding a new table.
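A minimal sketch of the idea in Python with SQLite (table and column names are illustrative, not Debezium's actual schema): the business row and the event destined for Kafka are written in the same transaction, so a CDC tool tailing the outbox table can never observe a half-applied change.

```python
import json
import sqlite3

# Outbox pattern sketch: the business write (orders) and the event
# row (outbox) commit atomically, or not at all. A CDC pipeline such
# as Debezium would read only the outbox table and publish to Kafka.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT);
    CREATE TABLE outbox (id INTEGER PRIMARY KEY AUTOINCREMENT,
                         topic TEXT, payload TEXT);
""")

def place_order(order_id, item):
    with conn:  # one transaction: both inserts commit or neither does
        conn.execute("INSERT INTO orders (id, item) VALUES (?, ?)",
                     (order_id, item))
        conn.execute("INSERT INTO outbox (topic, payload) VALUES (?, ?)",
                     ("orders", json.dumps({"id": order_id, "item": item})))

place_order(1, "blueberries")
rows = conn.execute("SELECT topic, payload FROM outbox").fetchall()
print(rows)
```

Because consumers see only committed outbox rows, the dual-write race disappears; the trade-off is an extra table and a relay that forwards (and eventually prunes) outbox entries.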
Andrej Karpathy (@karpathy)'s Twitter Profile Photo

+1 for "context engineering" over "prompt engineering". People associate prompts with short task descriptions you'd give an LLM in your day-to-day use. When in every industrial-strength LLM app, context engineering is the delicate art and science of filling the context window…

Jacek Migdal (@jakozaur)'s Twitter Profile Photo

Great talk on how to optimize Cursor vibe coding by Grzegorz Kossakowski and Patryk Kabaj at AI Thinkers Warsaw. TL;DR: ask it to prepare a plan, answer its clarifying questions, save the plan to markdown, and then kick off the big job.

Jacek Migdal (@jakozaur)'s Twitter Profile Photo

Your life is not that complex. I mean in terms of the tokens an LLM needs. Maciej Cielecki from 10Clouds at AI Thinkers Warsaw. Markdown files as a database is all you need.

Jacek Migdal (@jakozaur)'s Twitter Profile Photo

Early-stage startups are a lot of trial and error. Most things do not perform well enough, but you still need the energy to execute relentlessly. Occasionally, though, I discover something extraordinary: started last Friday, got six customer interviews in the next two days.

Jason Cohen (@asmartbear)'s Twitter Profile Photo

Unless you have 10,000 customers:

Approximately no one knows who you are.
Approximately no one knows your brand.
Approximately no one knows your product.

You can change everything, any time. Take advantage of that.

Jacek Migdal (@jakozaur)'s Twitter Profile Photo

Europe is back, with a $200M Series A round for Lovable. Meanwhile, Poland's Psyho, the last natural neural network standing, beat OpenAI's artificial neural networks in a hardcore algorithmic competition.

Jacek Migdal (@jakozaur)'s Twitter Profile Photo

Two customer discovery calls done, four scheduled from outreach and intros, plus a few commits powered by Claude Code. All during a business trip abroad today between meetings and a flight.

The Zen of an early-stage startup founder.
Hubert Thieblot (@hthieblot)'s Twitter Profile Photo

Unpopular opinion, but this is what I’d do if I started a new company today:
• 6-year vesting (companies take longer to build)
• 2-year cliff (most breakups happen in the first 2 years)
• Founding engineers/growth members get 5%+ but on the same terms as founders

Andrzej Kubisiak (@kubisiaka)'s Twitter Profile Photo

Supposedly pictures with 😺 and 🐶 grab attention⁉️

🚸 If so, I haven't recently seen a better chart showing the decline in interest in parenthood 👇

Via <a href="/xyz_oficjalnie/">XYZ</a>
Jacek Migdal (@jakozaur)'s Twitter Profile Photo

Can GPT-oss:20b on my MacBook outperform GPT-5? Sometimes.

Prompt: How many bs in blueberry? 
GPT-5: “3” 🤦
GPT-oss:20b: “2” ✅ (under 16 GB RAM)

Why? GPT-5 is a mixture of models with a smart router. Sometimes the router underestimates a problem and picks the wrong expert.
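There is also a deeper reason letter-counting trips up LLMs: they see tokens, not characters, so "blueberry" may arrive as one or two tokens with no letter-level view. Plain string code has no such blind spot, as this one-liner check shows:

```python
# Count occurrences of "b" in "blueberry" the boring, reliable way.
# str.count scans the characters directly, which a token-based model
# never gets to do.
word = "blueberry"
count = word.count("b")
print(count)  # → 2
```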