João Gabriel (@joaogabrieljunq)'s Twitter Profile
João Gabriel

@joaogabrieljunq

PhD Student @ufg_oficial | AI Engineer at Panoplai

ID: 1898711015618928640

Joined: 09-03-2025 12:22:54

10 Tweets

8 Followers

316 Following

João Gabriel (@joaogabrieljunq):

Not every researcher is a good one — or a good reviewer. Still, we need academics in academia and business minds in the market. It’s about trying, being part of the process, and learning to live with it.

Eleanor Berger (@intellectronica):

(Replying to Paul Graham) Moderately intelligent humans have _a lot_ of implicit context; that's why they can figure out what is needed from a couple of words or even a non-verbal gesture. What we call "prompt engineering" is primarily about providing AI with context. We'll need more of that, not less.

Eleanor Berger (@intellectronica):

If software engineering leverage increases dramatically thanks to AI-assistance, but still requires some engineers to drive (looks likely currently), I think we'll see work shifting to indies getting paid per outcome faster than companies adjust to high salaries.

Dev Crítico (@devcritico):

If you look closely, the foundation of software development today is made of "old" things: Linux, RDBMS, SQL, processes, and queues.

Gabriele Berton (@gabriberton):

Some advice to anyone starting a PhD in ML, or things that I heard from more experienced researchers and I tried to follow: 1) focus on a real problem. Something tangible, that can benefit people. Talk to industry folks if you're looking for open problems. Talk to the end (1/8)

BentoML - Infrastructure for Building AI Systems (@bentomlai):

🤔 What is KV cache offloading and why does it matter for LLM inference? #LLMs use the KV cache to accelerate inference by avoiding recomputation of keys and values after the prefill phase; however, the cache grows linearly with requests and has to be evicted due to the GPU memory limit. KV
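The idea in the tweet above can be sketched in a few lines. This is a toy model, not BentoML's or any real inference engine's implementation: the class name, the dict-based "GPU"/"CPU" tiers, and the capacity counted in cache entries are all illustrative assumptions. It only shows the policy the tweet describes: keep recent KV entries in a fast tier, spill older requests to a slower tier when the budget is exceeded, and restore them on reuse instead of recomputing.

```python
# Toy sketch of KV-cache offloading (illustrative only; real engines
# manage paged GPU tensors, not Python dicts of strings).

class KVCacheOffloader:
    def __init__(self, gpu_capacity):
        self.gpu_capacity = gpu_capacity  # max KV entries kept on "GPU"
        self.gpu = {}  # request_id -> list of per-token KV entries (fast tier)
        self.cpu = {}  # offloaded entries (slow tier, stands in for host RAM)

    def _gpu_size(self):
        # Total number of cached entries currently in the fast tier.
        return sum(len(entries) for entries in self.gpu.values())

    def append(self, request_id, kv_entry):
        """Add one token's KV entry; offload other requests if over budget."""
        self.gpu.setdefault(request_id, []).append(kv_entry)
        while self._gpu_size() > self.gpu_capacity:
            # Spill some other request's entries to the slow tier
            # rather than discarding them (offload, not plain eviction).
            victim = next(rid for rid in self.gpu if rid != request_id)
            self.cpu.setdefault(victim, []).extend(self.gpu.pop(victim))

    def fetch(self, request_id):
        """Bring a request's KV entries back to the fast tier for reuse."""
        entries = self.cpu.pop(request_id, []) + self.gpu.get(request_id, [])
        self.gpu[request_id] = entries
        return entries
```

Usage: with `gpu_capacity=3`, appending two entries for request "a" and then two for request "b" pushes the fast tier over budget, so "a" is offloaded to the slow tier; a later `fetch("a")` restores its entries without recomputing them, which is the latency win the tweet is pointing at.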