Miles Williams (@miles_wil)'s Twitter Profile
Miles Williams

@miles_wil

PhD student at @SheffieldNLP

ID: 1511078683720794113

Link: https://github.com/mlsw · Joined: 04-04-2022 20:30:21

11 Tweets

127 Followers

221 Following

Nikos Aletras (@nikaletras)'s Twitter Profile Photo

If you're interested in memory-efficient pre-trained LMs, check out Huiyin's new paper on vocabulary- and tokenization-independent Hashformer models (to appear at #EMNLP2022) 👇
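The tweet doesn't spell out the method, but the general hash-embedding idea behind vocabulary-independent models can be sketched: rather than a tokenizer-specific lookup table, each token string is hashed into a few buckets whose vectors are summed. A minimal sketch under that assumption (all names hypothetical; not the Hashformer paper's exact architecture):

```python
import hashlib
import torch
import torch.nn as nn

class HashEmbedding(nn.Module):
    """Vocabulary-free embedding: tokens are hashed into buckets, so no
    tokenizer-specific lookup table is needed. Generic hash-embedding
    sketch, not the paper's exact method."""

    def __init__(self, num_buckets: int = 4096, dim: int = 256, num_hashes: int = 2):
        super().__init__()
        self.num_buckets = num_buckets
        self.num_hashes = num_hashes
        self.table = nn.Embedding(num_buckets, dim)

    def _bucket(self, token: str, seed: int) -> int:
        # Stable hash of the raw token string; any uniform hash works.
        digest = hashlib.md5(f"{seed}:{token}".encode()).digest()
        return int.from_bytes(digest[:8], "little") % self.num_buckets

    def forward(self, tokens: list[str]) -> torch.Tensor:
        # Sum the bucket vectors from each hash function per token.
        ids = torch.tensor(
            [[self._bucket(t, s) for s in range(self.num_hashes)] for t in tokens]
        )
        return self.table(ids).sum(dim=1)  # shape: (len(tokens), dim)

emb = HashEmbedding()
vectors = emb(["memory", "efficient", "LMs"])
print(vectors.shape)  # torch.Size([3, 256])
```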

EngineeringSheffield (@sheffunieng)'s Twitter Profile Photo

The generative AI race has a dirty secret. Dr Nafise Sadat Moosavi, from Sheffield Comp Sci, comments on the need to make large language models, like the ones used by Google and Microsoft, more efficient. wired.co.uk/article/the-ge…

Leon Derczynski ✍🏻 🍂🍁 (@leonderczynski)'s Twitter Profile Photo

Efficient NLP methods: an up-to-date survey, to appear in TACL. We cover efficiency w.r.t.:
* Data
* Model design
* Pre-training
* Fine-tuning
* Inference & compression
* Hardware utilization
* Evaluation
* Model selection
This was a blast to co-produce! arxiv.org/abs/2209.00099

Cass Zhixue (@casszzx)'s Twitter Profile Photo

โœ‚๏ธ Pruning Parameters = Pruning Hallucinations๐ŸŒช๏ธ! Our latest paper reveals the sweet spot: up to 50% pruned, the more you prune, the lower hallucination risk. It's a buy one get one free for #LLMs. Nikos Aletras George Chrysostomou Miles Williams ๐Ÿ˜Ž arxiv.org/abs/2311.09335

Cass Zhixue (@casszzx)'s Twitter Profile Photo

#PhDstudentship available in #LLMs at Sheffield NLP, one of the UK's largest #NLP research centres (Sheffield Comp Sci)
* 3.5 yrs tuition waiver & stipend
* Topics: knowledge editing, model compression, or your own proposal!
DM for details! Apply now: shorturl.at/xCM36 #PhD #AI #PhDposition

Nikos Aletras (@nikaletras)'s Twitter Profile Photo

A fully-funded 3.5y scholarship (UK/Home applicants) for a PhD in #NLProc is available! Broad topic: efficient and robust alignment of LLMs. Application deadline: Jun 10th. More info: findaphd.com/phds/project/e…

Nikos Aletras (@nikaletras)'s Twitter Profile Photo

Job opportunity at Sheffield NLP 🚨: I'm looking for a #postdoc (24 months) in #NLProc. The (very) broad topic is addressing LLM limitations (e.g. hallucinations, "reasoning", interpretability, etc.). If you are interested, drop me an email or DM. Apply: jobs.ac.uk/job/DHP918/res…

Cass Zhixue (@casszzx)'s Twitter Profile Photo

๐Ÿ” Lighter is better! ๐Ÿ“š Our latest TACL paper reveals some fascinating aha moments for #LLMs. Check it out! ๐Ÿš€ #NLProc

Nikos Aletras (@nikaletras)'s Twitter Profile Photo

We're looking for a #PhD student to work on multimodal LLMs. This is a fully-funded scholarship (including stipend), open to home and international candidates. Deadline: 29/1/2025. Please spread the word! #nlproc

Nikos Aletras (@nikaletras)'s Twitter Profile Photo

Synthetic calibration data (for pruning and quantization) generated by the LLM itself is a better approximation of the pre-training data distribution than "external" data. Really cool work by Miles (Miles Williams) and George (George Chrysostomou), to be presented at #NAACL2025. Link to the paper 👇

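As a rough illustration of the idea described above: sample calibration sequences from the model itself rather than from an external corpus, then hand them to whatever calibration-based pruning or quantization routine is used downstream. A minimal sketch (model name, sampling settings, and the downstream consumer are assumptions, not the paper's code):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical setup: any causal LM works; the calibration consumer
# (pruning/quantization routine) is left as a placeholder.
name = "gpt2"
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name)

@torch.no_grad()
def self_generated_calibration_data(n_samples=8, max_new_tokens=128):
    """Sample sequences from the model itself (instead of an external
    corpus) to use as calibration data for pruning/quantization."""
    samples = []
    for _ in range(n_samples):
        # Seed generation with the BOS token only, so the samples
        # reflect the model's own output distribution.
        ids = torch.tensor([[tok.bos_token_id]])
        out = model.generate(
            ids,
            do_sample=True,          # stochastic sampling, not greedy
            max_new_tokens=max_new_tokens,
            pad_token_id=tok.eos_token_id,
        )
        samples.append(out)
    return samples

calib = self_generated_calibration_data()
# `calib` would then feed a calibration-based method, e.g. collecting
# activation statistics for quantization or importance scores for pruning.
```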
Nikos Aletras (@nikaletras)'s Twitter Profile Photo

Gutted to miss #NAACL2025 😭 but Miles (Miles Williams) will be there presenting the following papers:
📄 Main: Self-Calibration for Language Model Quantization and Pruning
📄 RepL4NLP: Vocabulary-Level Memory Efficiency for LM Fine-Tuning
Check them out!
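On the second title: fine-tuning data usually touches only a small slice of the vocabulary, so embedding rows for unseen tokens (and their optimizer state) could be dropped or frozen. A hedged sketch of that general idea (hypothetical helper, not necessarily the paper's method):

```python
import torch

def used_vocab_mask(token_id_batches, vocab_size):
    """Mark which vocabulary entries occur in the fine-tuning data.
    Illustrative sketch only, not necessarily the paper's method."""
    mask = torch.zeros(vocab_size, dtype=torch.bool)
    for ids in token_id_batches:
        mask[ids.flatten()] = True
    return mask

# Toy example: a 50k vocabulary, but the task data uses only ~2k tokens.
vocab_size = 50_000
batches = [torch.randint(0, 2_000, (8, 128)) for _ in range(10)]
mask = used_vocab_mask(batches, vocab_size)
print(f"{int(mask.sum())} / {vocab_size} embedding rows actually needed")
# Unused rows could be dropped or frozen, shrinking the embedding matrix
# and its optimizer state during fine-tuning.
```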