Language Processing Lab @ UChicago
@uchicagolanglab
We work on sentence processing, experimental semantics and pragmatics, and experimental syntax. PI: Dr. Ming Xiang
ID: 1369841500884983808
https://lucian.uchicago.edu/blogs/lpl/ 11-03-2021 02:44:09
81 Tweets
466 Followers
165 Following
[email protected] - help us out by flooding Kent's VC with emails. Linguistics has been suffocated over the years by a university marketing strategy we have no control over. Despite everything, we've produced the best students, who have gone on to do PhDs at the best depts in the world.
Very happy to announce the publication of Eszter Ronai (NorthwesternLinguist) & Ming Xiang's new paper "What could have been said? Alternatives and variability in pragmatic inferences" in Journal of Memory and Language! authors.elsevier.com/a/1idTA_,1j5AB…
We have a postdoc position open, on a project about QUD processing. Starting date July 1 2024 or after. Drop me an email if you are interested! ([email protected])
Six UChicago faculty members have been newly elected to the American Academy of Arts & Sciences, one of the nation’s oldest and most prestigious honorary societies. These scholars have made breakthroughs in fields ranging from linguistics to fundamental biology. Learn more: ms.spr.ly/6011YKi7S
How far can the implicit learning account go? Check out our new paper (with Richard Futrell), accepted to #CogSci2024: "A hierarchical Bayesian model for syntactic priming". Preprint: arxiv.org/pdf/2405.15964
What does it mean for language comprehension to be “good-enough”? With Richard Futrell, we present a computational formalization of shallow and deep processing using rate-distortion theory in our new #CogSci2024 paper: arxiv.org/abs/2405.08223 (1/n)
Work with Language Processing Lab @ UChicago on processing appositive relative clauses (ARCs) vs. restrictive relative clauses (RRCs) is out: doi.org/10.1111/cogs.1… ARCs typically contain side-commentary information; does a distractor in an ARC lead to an absence of the agreement attraction effect? (1/8)
Very proud of this work with Sanghee J. Kim