AdapterHub Bert (@adapterhubbert)'s Twitter Profile
AdapterHub Bert

@adapterhubbert

Tweeting about new Adapters on @AdapterHub | Bot 🤖

ID: 1268888048705601537

Link: https://adapterhub.ml/explore | Joined: 05-06-2020 12:51:13

24 Tweets

86 Followers

319 Following

AdapterHub Bert (@adapterhubbert):

New Adapter by Nick @[email protected]: Adapter for AraBERT (aubmindlab/bert-base-arabert) trained for 3 epochs to classify Arabic by dialect, on samples from the University of British Columbia and Johns Hopkins University. Check it out at adapterhub.ml/adapters/mapme…

AdapterHub Bert (@adapterhubbert):

New Adapter by Nirant, @meghanabhange: Adapter for Hinglish Sentiment Analysis, based on SemEval 2020 Task 9. Check it out at adapterhub.ml/adapters/niran…
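
For readers who want to try one of these announced adapters, here is a minimal sketch (not an official snippet from the hub pages), assuming the `adapters` library; the base model and adapter identifier below are placeholders and should be replaced with the ones listed on the adapter's page at adapterhub.ml/explore.

```python
# Minimal sketch, assuming the `adapters` library (successor of adapter-transformers);
# the base model and adapter identifier are placeholders taken from the adapter's hub page.
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

base_model = "bert-base-multilingual-cased"           # placeholder: use the adapter's base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoAdapterModel.from_pretrained(base_model)

# Download the small adapter weights and activate them on top of the frozen base model.
adapter_name = model.load_adapter("sentiment/hinglish@nirant")  # placeholder adapter ID
model.set_active_adapters(adapter_name)

inputs = tokenizer("Yeh movie bahut achhi thi!", return_tensors="pt")
print(model(**inputs).logits)
```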

Andreas Rücklé (@arueckle):

Check out our paper "AdapterDrop"! We find that Adapters can train 60% faster than full fine-tuning. With AdapterDrop we increase inference speed by up to 36% for 8 parallel tasks. With Gregor Geigle, @Maxxx216, Tilman Beck, Jonas Pfeiffer, Nils Reimers, Iryna Gurevych, AdapterHub. arxiv.org/pdf/2010.11918…

Jonas Pfeiffer (@pfeiffjo):

In our new paper we show that randomly dropping out adapters during training results in a robust, dynamically scalable model! Adapter weights can also be shared across layers 😲 Check it out 👇
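
A rough illustration of the AdapterDrop idea from the two tweets above, assuming the `adapters` library: adapter modules are simply not added to the lower transformer layers (the `leave_out` option), so those layers can be shared across tasks at inference time. The adapter name and the choice of dropped layers are illustrative, not the paper's exact setup.

```python
# Rough sketch of AdapterDrop-style training, assuming the `adapters` library.
# Leaving adapters out of the first n layers lets those layers be shared across tasks.
import adapters
from adapters import SeqBnConfig
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)  # add adapter support to a plain Hugging Face model

# Pfeiffer-style bottleneck adapter, skipped in the first five layers (illustrative choice).
config = SeqBnConfig(leave_out=[0, 1, 2, 3, 4])
model.add_adapter("my_task", config=config)  # "my_task" is a placeholder name
model.train_adapter("my_task")               # freeze the base model, train only the adapter
```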

AdapterHub Bert (@adapterhubbert):

New Adapter by @Hannah70676760: Adapter trained on the CoNLL2003 dataset for named entity recognition. Check it out at adapterhub.ml/adapters/ukp/b…

AdapterHub Bert (@adapterhubbert):

25 new Adapters for model distilbert-base-uncased by Clifton Poth for tasks sentiment/sst-2, rc/race, rc/multirc and more! Here's one to check out: adapterhub.ml/adapters/ukp/d…

AdapterHub Bert (@adapterhubbert):

4 new Adapters for model facebook/bart-large by Clifton Poth for tasks sum/cnn_dailymail and sum/xsum. Here's one to check out: adapterhub.ml/adapters/ukp/f…

AdapterHub Bert (@adapterhubbert):

16 new Adapters for model facebook/bart-base by Clifton Poth for tasks nli/qnli, lingaccept/cola, sts/mrpc and more! Here's one to check out: adapterhub.ml/adapters/ukp/f…

AdapterHub Bert (@adapterhubbert):

22 new Adapters for model bert-base-multilingual-cased by Jonas Pfeiffer for tasks wikiann/en, zh_yue/wiki, cs/wiki and more! Here's one to check out: adapterhub.ml/adapters/ukp/b…

AdapterHub Bert (@adapterhubbert):

16 new Adapters for model gpt2 by Hannah for tasks sentiment/sst-2, nli/qnli, nli/rte and more! Here's one to check out: adapterhub.ml/adapters/ukp/g…

AdapterHub Bert (@adapterhubbert):

New Adapter by Clifton Poth: Adapter for mbart-large-cc25 in the Pfeiffer architecture with reduction factor 2, trained on the WMT16 Romanian-English translation task. Check it out at adapterhub.ml/adapters/ukp/f…
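
As a hedged sketch of how such an adapter could be set up for training with the `adapters` library (the configuration follows the tweet; the adapter name is illustrative and the training loop is omitted):

```python
# Sketch only: a Pfeiffer-style (sequential bottleneck) adapter with reduction factor 2
# on mbart-large-cc25, assuming the `adapters` library. The adapter name is illustrative.
import adapters
from adapters import SeqBnConfig
from transformers import MBartForConditionalGeneration

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")
adapters.init(model)

# reduction_factor=2 sets the bottleneck to half the hidden size, i.e. a comparatively
# large adapter, matching the configuration described in the tweet above.
model.add_adapter("wmt16_ro_en", config=SeqBnConfig(reduction_factor=2))
model.train_adapter("wmt16_ro_en")  # only the adapter parameters receive gradients
```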

AdapterHub Bert (@adapterhubbert):

New Adapter by Kalpesh Krishna: This adapter has been trained on the English formality classification GYAFC dataset and tested with other language adapters (like Hindi) for zero-shot transfer. Check it out at adapterhub.ml/adapters/marti…

AdapterHub Bert (@adapterhubbert):

New Adapter by @kabirahuja004: A Pfeiffer adapter stacked on top of a language adapter for the NLI task. Check it out at adapterhub.ml/adapters/kabir…
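
The stacking setup mentioned in the last two tweets can be expressed with the library's Stack composition. A minimal sketch, assuming the `adapters` library and placeholder adapter IDs (the real IDs are on the linked hub pages):

```python
# Minimal sketch of stacking a task adapter on top of a language adapter for
# zero-shot cross-lingual transfer; adapter identifiers are placeholders.
from adapters import AutoAdapterModel
from adapters.composition import Stack

model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

lang = model.load_adapter("hi/wiki@ukp")       # language adapter (placeholder ID)
task = model.load_adapter("nli/multinli@ukp")  # task adapter (placeholder ID)

# In every layer, run the language adapter first, then the task adapter on top.
model.active_adapters = Stack(lang, task)
```

Swapping `lang` for a different language adapter while keeping `task` fixed is what gives the zero-shot transfer described in the formality tweet above.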

AdapterHub Bert (@adapterhubbert):

12 new Adapters for models xlm-roberta-large, xlm-roberta-base, bert-base-multilingual-cased by Yifan Hou for tasks mlki/es, mlki/ts, mlki/tp and more! Here's one to check out: adapterhub.ml/adapters/mlki/…

AdapterHub (@adapterhub):

🚀 Wanna help us shape the future of the adapters library? 🔎 Take our survey! ✏️ 🧠 Your input is crucial in planning our next steps. Share your thoughts in a 5 min. survey and help us enhance features of the library and extend it based on your needs! --> forms.gle/yf8Cd6HC4qWkMD…

AdapterHub (@adapterhub):

📢 New preprint 🎉 We - the AdapterHub team - present the M2QA benchmark to evaluate joint domain and language transfer! 🔬 Key highlight: We show that adapter-based methods on small language models can reach the performance of Llama 3 on M2QA! 🚀 👇
