Joshua Maynez (@maynez_joshua) 's Twitter Profile
Joshua Maynez

@maynez_joshua

Research scientist at @GoogleAI. Focusing on Natural Language Generation.

ID: 1433487960645804033

Joined: 02-09-2021 17:52:35

7 Tweets

89 Followers

36 Following

Sebastian Gehrmann (@sebgehr) 's Twitter Profile Photo

Listing issues in NLG evaluations turned into a 25 page survey!

In “Repairing the Cracked Foundation: A Survey of Obstacles in Evaluation Practices for Generated Text”, <a href="/ThiboIbo/">Thibault Sellam</a>, <a href="/eaclark07/">Elizabeth Clark</a>, and I cover 250+ papers. 
📄Link: arxiv.org/abs/2202.06935 

Want to learn more?👇
Google AI (@googleai) 's Twitter Profile Photo

Introducing the 540 billion parameter Pathways Language Model. Trained on two Cloud #TPU v4 pods, it achieves state-of-the-art performance on benchmarks and shows exciting capabilities like mathematical reasoning, code writing, and even explaining jokes. goo.gle/3j6eMnK

Joshua Maynez (@maynez_joshua) 's Twitter Profile Photo

Had the chance to collaborate on this latest Large Language Model work at Google. The capabilities of this model are impressive and, for NLG, it shows the first glimpses of few-shot generation on long-context tasks.

Joshua Maynez (@maynez_joshua) 's Twitter Profile Photo

Can we scale QA models to low-resource languages with just a few examples? We propose using pre-trained language models (PLMs) in few-shot settings to build large multilingual training data sets. QAmeleon: Multilingual QA with Only 5 Examples: arxiv.org/abs/2211.08264
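The idea of using a PLM few-shot to synthesize multilingual QA training data can be sketched roughly as follows. This is a minimal illustration, not the QAmeleon code: the prompt format, field names, and the placeholder for the PLM call are all assumptions made for the example.

```python
# Hedged sketch: build a few-shot prompt that asks a pre-trained language model
# to synthesize question-answer pairs for new passages in a target language.
# The completions returned by the PLM would then be filtered and used as
# synthetic training data for a multilingual QA model.

def build_fewshot_qa_prompt(seed_examples, passage, language):
    """Format a handful of seed (passage, question, answer) triples as
    in-context demonstrations, then append the new passage for the PLM
    to complete with a question and answer."""
    parts = [f"Generate a question and answer in {language} for each passage.\n"]
    for ex in seed_examples:
        parts.append(
            f"Passage: {ex['passage']}\n"
            f"Question: {ex['question']}\n"
            f"Answer: {ex['answer']}\n"
        )
    # The PLM is expected to continue the final block with
    # "<question> Answer: <answer>".
    parts.append(f"Passage: {passage}\nQuestion:")
    return "\n".join(parts)

# Tiny illustrative seed set (the paper's setting uses ~5 human examples
# per language; these strings are invented for the sketch).
seeds = [
    {"passage": "El Amazonas es el río más caudaloso del mundo.",
     "question": "¿Cuál es el río más caudaloso del mundo?",
     "answer": "El Amazonas"},
]

prompt = build_fewshot_qa_prompt(seeds, "La Luna orbita la Tierra.", "Spanish")
# `prompt` would be sent to a PLM; its completion yields a synthetic QA pair.
```

In the actual pipeline, many such prompts over unlabeled passages would produce a large synthetic dataset, which is then used to fine-tune a QA model for the low-resource language.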