Andre Martins (@andre_t_martins)'s Twitter Profile
Andre Martins

@andre_t_martins

NLP/ML researcher in Lisbon (@andre-t-martins.bsky.social)

ID: 3314842991

Joined: 09-06-2015 11:21:25

639 Tweets

2.2K Followers

392 Following

Slator (@slatornews)'s Twitter Profile Photo

.Unbabel exposes 🔎 how using the same metrics for both training and evaluation can create misleading ⚠️ #machinetranslation performance estimates, and proposes MINTADJUST to solve this. José Maria Pombal Ricardo Rei Andre Martins #translation #xl8 #MT slator.ch/UnbabelBiasAIT…

José Maria Pombal (@zmprcp)'s Twitter Profile Photo

New paper out 🚀 Zero-shot Benchmarking: A Framework for Flexible and Scalable Automatic Evaluation of Language Models: arxiv.org/abs/2504.01001. We present a framework and release a repository for creating reliable benchmarks for (V)LM tasks quickly and fully automatically.

Andre Martins (@andre_t_martins)'s Twitter Profile Photo

US big tech companies with AI teams located in Europe are asking their employees to sign non-compete agreements to prevent AI talent from leaving. This is blocking AI innovation in Europe and should be stopped. Non-compete clauses should be illegal. European Commission Henna Virkkunen

Instituto de Telecomunicações (@itnewspt)'s Twitter Profile Photo

📢Instituto de Telecomunicações and ELLIS Unit Lisbon invite you to the talk “Reviving Encoder Models: Making Old New, Iterate” with Pierre Colombo, Professor at CentraleSupélec and Equall. 🎉 Don’t miss it! See you there! #NLP #AI #MachineLearning #TechTalk Andre Martins

José Maria Pombal (@zmprcp)'s Twitter Profile Photo

We just released M-Prometheus, a suite of strong open multilingual LLM judges at 3B, 7B, and 14B parameters! Check out the models and training data on Huggingface: huggingface.co/collections/Un… and our paper: arxiv.org/abs/2504.04953

Andre Martins (@andre_t_martins)'s Twitter Profile Photo

If you're attending #AISTATS2025 stop by our poster #45 (Sunday May 4th 3pm)! And check out the nice interactive website which Margarida put together: …ns-conformal-predictors.streamlit.app

Dennis Fucci (@dennisfucci)'s Twitter Profile Photo

🎉 Excited to share our paper “Different Speech Translation Models Encode and Translate Speaker Gender Differently” was accepted at #ACL2025 (main)! ✍🏼 Big thanks to amazing co-authors: Marco Gaido, Matteo Negri, Luisa Bentivogli, Andre Martins, Giuseppe Attanasio! 📄 Preprint out soon!

ELLIS Unit Lisbon (@lisbon_ellis)'s Twitter Profile Photo

Attending ICML Conference July 13th - 19th? Stop by our posters! Click the links 👇 for more info:
S. Santos et al: tinyurl.com/fbpm5rpt
Gonçalves et al: tinyurl.com/4ye6y2bt
Duarte et al: tinyurl.com/mryuvst4
P. Santos et al: tinyurl.com/3w57mbtu

UTTER (@utterproject)'s Twitter Profile Photo

🚀 Proud moment! Prof. Andre Martins represented UTTER & #EuroLLM at #GTCParis + #VivaTech2025, showcasing their role in Europe’s sovereign AI future. And the highlight? Both projects were featured in Jensen Huang’s keynote! 🙌 #EU #NVIDIA #LLMs #AIResearch

Marcos Treviso (@marcostreviso)'s Twitter Profile Photo

Sparse attention isn't just effective... It's also fast! ✅ Our work builds on AdaSplash, an accelerated sparse attention kernel that will be presented as an Oral at ICML 2025 🎤 📦 Code: github.com/deep-spin/adas…

ELLIS Unit Lisbon (@lisbon_ellis)'s Twitter Profile Photo

Attending ACL 2025 in Vienna? Check out our Unit's papers 👇
Peters, Martins arxiv.org/abs/2403.03923
Zaranis et al. arxiv.org/abs/2410.10995
Fucci et al. arxiv.org/abs/2506.02172
Gomes et al. arxiv.org/abs/2504.01225
Pombal et al. arxiv.org/abs/2412.04205

Andre Martins (@andre_t_martins)'s Twitter Profile Photo

Attending #ICML2025? Come see our oral presentation on "AdaSplash: Adaptive Sparse Flash Attention" today at 15:30 (Oral 2D Efficient ML) or catch us in the poster session at 16:30 (East Exhibition Hall A-B #E-3305). With Nuno Gonçalves and Marcos Treviso.

Andre Martins (@andre_t_martins)'s Twitter Profile Photo

The sparsemax paper has now reached 1,000 citations, and it keeps bearing fruit. Two recent sparse attention examples: long-context efficiency with AdaSplash (arxiv.org/abs/2502.12082) and better length generalization with ASEntmax (arxiv.org/abs/2506.16640). Check the story below!

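For readers unfamiliar with it, sparsemax (Martins & Astudillo, 2016) is the Euclidean projection of a score vector onto the probability simplex, which can assign exact zeros to low-scoring entries, unlike softmax. A minimal pure-Python sketch for illustration only; the works above rely on optimized GPU kernels, not this reference implementation:

```python
def sparsemax(z):
    """Project scores z onto the probability simplex (sparsemax).

    Returns a probability vector that sums to 1 and may contain
    exact zeros for entries far below the largest scores.
    """
    z_sorted = sorted(z, reverse=True)  # scores in descending order
    cum = 0.0          # running cumulative sum of sorted scores
    k_z, cssv = 0, 0.0  # support size and its cumulative sum
    for k, zk in enumerate(z_sorted, start=1):
        cum += zk
        # entry k is in the support iff 1 + k * z_(k) > sum_{j<=k} z_(j)
        if 1.0 + k * zk > cum:
            k_z, cssv = k, cum
    tau = (cssv - 1.0) / k_z  # threshold subtracted from every score
    return [max(zi - tau, 0.0) for zi in z]

# A large gap between scores yields a fully sparse output:
print(sparsemax([2.0, 1.0, -1.0]))  # [1.0, 0.0, 0.0]
```

Sparse attention methods apply this transformation (or its entmax generalizations) in place of softmax, so attention weights for irrelevant tokens become exactly zero rather than merely small.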
José Maria Pombal (@zmprcp)'s Twitter Profile Photo

I'll be at ACL presenting our work, A Context-aware Framework for Translation-mediated Conversations (arxiv.org/pdf/2412.04205) in the Machine Translation session, 28 Jul, 14:00-15:30, room 1.85. Come check it out if you're interested in bilingual chat MT!