Ferdinand Schlatt (@fschlatt1)'s Twitter Profile
Ferdinand Schlatt

@fschlatt1

PhD Student, efficient and effective neural IR models 🧠🔎

ID: 924666754055499776

Joined: 29-10-2017 15:58:28

70 Tweets

137 Followers

199 Following

Andrew Parry (@mrparryparry)'s Twitter Profile Photo

🚨 New Pre-Print! 🚨 Reviewer 2 has once again asked for DL’19; what can you say in rebuttal? We have re-annotated DL’19 in the style of classic evaluation stability studies. Work done with Maik Fröbe, Harry Scells, Ferdinand Schlatt, Guglielmo Faggioli, Saber Zerhoudi, Sean MacAvaney, Eugene Yang 🧵

tomaarsen (@tomaarsen)'s Twitter Profile Photo

I've just ported the excellent monoELECTRA-{base, large} reranker models from Ferdinand Schlatt & the research network Webis Group to Sentence Transformers! These models were introduced in the Rank-DistiLLM paper, and distilled from LLMs like RankZephyr and RankGPT4. Details in 🧵

Andrew Parry (@mrparryparry)'s Twitter Profile Photo

Now accepted at #SIGIR2025! Looking forward to discussing evaluation with LLMs at #ECIR2025 this week and, of course, in Padua! In the meantime, skim this thread.

Antonio Mallia (@antonio_mallia)'s Twitter Profile Photo

It was a really pleasant surprise to learn that our paper “Efficient Constant-Space Multi-Vector Retrieval” aka ConstBERT, co-authored with Sean MacAvaney and Nicola Tonellotto received the Best Short Paper Honourable Mention at ECIR 2025! #ECIR2025 #IR #Pinecone

Ferdinand Schlatt (@fschlatt1)'s Twitter Profile Photo

Thank you Carlos Lassance for the shout-out to Lightning IR in the LSR tutorial at #SIGIR2025! If you want to fine-tune your own LSR models, check out our framework at github.com/webis-de/light…

Glasgow IR Group (@ir_glasgow)'s Twitter Profile Photo

Now it’s Andrew Parry presenting the reproducibility efforts of a large team of researchers on the shelf life of test collections #sigir2025

Andrew Parry (@mrparryparry)'s Twitter Profile Photo

Really like this work; if you haven't read it yet, have a look: arxiv.org/pdf/2502.20937. PSA from Ian Soboroff: move to Krippendorff's Alpha if you want to check annotator agreement!
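
For nominal judgments, Krippendorff's Alpha measures agreement as α = 1 − D_o/D_e (observed vs. expected disagreement) and, unlike raw overlap, handles missing ratings and corrects for chance. A minimal pure-Python sketch of the coincidence-matrix formulation (illustrative only; not a library implementation):

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's Alpha for nominal data.

    `units` is a list of items, each a list of labels assigned by the
    annotators who rated that item (use None for a missing rating).
    """
    # Coincidence matrix: every ordered pair of labels within a unit
    # contributes 1 / (m - 1), where m is the number of ratings there.
    o = Counter()
    for ratings in units:
        vals = [v for v in ratings if v is not None]
        m = len(vals)
        if m < 2:
            continue  # items with fewer than 2 ratings carry no information
        for c, k in permutations(vals, 2):
            o[(c, k)] += 1 / (m - 1)
    # Marginal totals per label.
    n_c = Counter()
    for (c, _k), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())
    # Observed vs. expected disagreement (delta = 1 on label mismatch).
    d_o = sum(w for (c, k), w in o.items() if c != k)
    d_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    return 1.0 if d_e == 0 else 1 - d_o / d_e
```

Perfect agreement yields α = 1, chance-level labeling yields α ≈ 0, and systematic disagreement goes negative, which is what makes it a more honest check than simple percent agreement.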

Ferdinand Schlatt (@fschlatt1)'s Twitter Profile Photo

Want to know how to make bi-encoders more than 3x faster with a new backbone encoder model? Check out our talk on the Token-Independent Text Encoder (TITE) #SIGIR2025 in the efficiency track. It pools token vectors within the model to improve efficiency: dl.acm.org/doi/10.1145/37…

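
The efficiency gain from pooling token vectors inside the encoder can be illustrated with a rough back-of-the-envelope cost model. Everything below is a toy sketch under assumed parameters (a hypothetical halve-every-two-layers schedule, attention cost only), not TITE's actual configuration:

```python
# Toy cost model for in-model token pooling. Per-layer self-attention
# cost is taken as proportional to seq_len**2; FFN and other linear
# terms are ignored for simplicity.

def attention_cost(num_layers, seq_len, pool_every=None):
    """Sum of per-layer attention costs; optionally halve the sequence
    length after every `pool_every` layers."""
    total, n = 0, seq_len
    for layer in range(num_layers):
        total += n * n
        if pool_every and (layer + 1) % pool_every == 0:
            n = max(1, n // 2)
    return total

baseline = attention_cost(12, 256)               # no pooling
pooled = attention_cost(12, 256, pool_every=2)   # hypothetical schedule
speedup = baseline / pooled
```

Under this toy model the pooled encoder does roughly 4.5x less attention work; real-world speedups depend on the actual pooling schedule, the linear-cost components, and hardware.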
Webis Group (@webis_de)'s Twitter Profile Photo

Happy to share that our paper "The Viability of Crowdsourcing for RAG Evaluation" received the Best Paper Honourable Mention at #SIGIR2025! Very grateful to the community for recognizing our work on improving RAG evaluation. 📄 webis.de/publications.h…

Webis Group (@webis_de)'s Twitter Profile Photo

Honored to win the ICTIR Best Paper Honorable Mention Award for "Axioms for Retrieval-Augmented Generation"! Our new axioms are integrated with ir_axioms: github.com/webis-de/ir_ax… Nice to see axiomatic IR gaining momentum.

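
An axiom in this line of work is typically formalized as a pairwise preference: given a query and two documents, it says which document should rank higher whenever its precondition holds. A minimal sketch of the classic TFC1 term-frequency axiom (illustrative only; this is not the ir_axioms API, and real TFC1 additionally requires the documents to be of comparable length):

```python
# Illustrative axiomatic preference: TFC1 says that, all else being
# equal, the document containing more occurrences of the query terms
# should rank higher.

def tfc1_preference(query_terms, doc_a, doc_b):
    """Return 1 if doc_a is preferred, -1 if doc_b is, 0 if neither."""
    tf_a = sum(doc_a.split().count(t) for t in query_terms)
    tf_b = sum(doc_b.split().count(t) for t in query_terms)
    return (tf_a > tf_b) - (tf_a < tf_b)
```

Axioms like this can be aggregated over document pairs to diagnose whether a ranker (or, in the RAG setting, a generated answer's grounding) behaves consistently with basic retrieval intuitions.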