Christian Langreiter (@chl)'s Twitter Profile
Christian Langreiter

@chl

how hard could it be?

ID: 801282

Link: http://langreiter.com · Joined: 28-02-2007 16:37:34

4.4K Tweets

1.1K Followers

6.6K Following

Taelin (@victortaelin)'s Twitter Profile Photo

So for random cosmic reasons I've slept 8h for 3 days straight, and I can't even begin describing how good it feels. I suddenly feel an overwhelming desire to spend the rest of the day watching my childhood anime again, and just being happy that I exist on such a beautiful planet

François Fleuret (@francoisfleuret)'s Twitter Profile Photo

My belief that GPT with minor structural changes will be "enough" went up 5x over the last two months. Transformers really are amazing.

George (@georgejrjrjr)'s Twitter Profile Photo

> GPT model stopped speaking Croatian
> Nobody could figure out why. Turns out
> Croatian users were much more prone to downvote messages

Christian Langreiter (@chl)'s Twitter Profile Photo

“in fact, what labs are doing is not alignment but something more like exorcism. the latent space is teeming with fragments of unwelcome possibility, birthed from the vast, rich compost of the internet, all of them entangled with each other in impossibly intricate ways.”

Abdullah Hamdi (@eng_hemdi)'s Twitter Profile Photo

We introduce a differentiable renderer that directly optimizes an unstructured soup of triangles, delivering SOTA image quality and real-time rendering performance with triangles!

The key innovation, inside-windowing, allows for smooth triangle optimization

3/n

Andrej Karpathy (@karpathy)'s Twitter Profile Photo

The race for LLM "cognitive core" - a few billion param model that maximally sacrifices encyclopedic knowledge for capability. It lives always-on and by default on every computer as the kernel of LLM personal computing. Its features are slowly crystallizing:

- Natively multimodal

Chip Huyen (@chipro)'s Twitter Profile Photo

Very useful tips on tool use and memory from Manus's context engineering blog post.

Key takeaways:

1. Reversible compact summary

Most models allow 128K context, which can easily fill up after a few turns when working with data like PDFs or web pages. When the context gets

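The "reversible compact summary" idea can be sketched in a few lines: when a message grows too large, swap it for a short summary plus a key into an external store, so the full text can be restored if the agent needs it again. This is a minimal illustration, not Manus's actual implementation; all class and method names here are made up for the example.

```python
# Sketch of a reversible compact summary for agent context management.
# Assumption: long tool outputs (web pages, PDFs) dominate context size,
# so we replace them with a summary stub + restore key. Names are illustrative.

class ReversibleContext:
    def __init__(self, max_chars=1000):
        self.max_chars = max_chars   # per-message size threshold before compaction
        self.store = {}              # external store: key -> full original text
        self.messages = []           # the in-context message list

    def add(self, role, text):
        self.messages.append({"role": role, "content": text})

    def compact(self, summarize):
        """Replace oversized messages with a summary plus a restore key."""
        for i, msg in enumerate(self.messages):
            if len(msg["content"]) > self.max_chars and "key" not in msg:
                key = f"blob-{i}"
                self.store[key] = msg["content"]  # keep the full text recoverable
                self.messages[i] = {
                    "role": msg["role"],
                    "content": f"[summary, restore with {key}] "
                               + summarize(msg["content"]),
                    "key": key,
                }

    def restore(self, key):
        """Bring the full original text back on demand."""
        return self.store[key]
```

The point of the design is that compaction is lossless from the agent's perspective: unlike plain truncation, any detail dropped from the summary can still be fetched later via the key.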