Linda Drijvers
@lindadrijvers
cognitive neuroscientist studying brain oscillations, multimodal language comprehension & production | PI @combrainlab | Donders Institute & @MPI_NL
ID:147520018
http://lindadrijvers.nl · Joined 24-05-2010
1.2K Tweets
2.3K Followers
1.1K Following
New preprint w/ Cecilia Husta and Antje Meyer: we used RIFT to study the attentional distribution between speech planning and comprehension. Read the exciting results here! Communicative Brain lab biorxiv.org/content/10.110…
New paper together with Miguel Rubianes et al. on the effect of subliminal processing of facial information on syntactic processing! And more coming soon!
Another great paper led by Marlijn ter Bekke! Gestures speed up responses to questions, check it out here!
Gestures speed up responses to questions. New paper by Marlijn ter Bekke, Linda Drijvers & Judith Holler
doi.org/10.1080/232737…
Now out in Cognitive Science (w/ Marlijn ter Bekke and Judith Holler): hand gestures have predictive potential, and typically start before their lexical affiliates! Read it below!
Can gestures be used for prediction? Check out our paper (with Marlijn ter Bekke + Judith Holler) below - and excited that more work is on the way soon! :)
New open PhD position in my lab at the Donders Institute!
We are looking for a motivated PhD candidate with an interest in the computational underpinnings of body augmentation and embodiment.
Deadline: 14 February, 2024
Link: ru.nl/en/working-at/…
Happy to share that this project is finally out! We (w/ Jan Mathijs Schoffelen, Peter Hagoort and Linda Drijvers) investigated attention and audiovisual integration during multimodal communication, using rapid invisible frequency tagging (RIFT) and MEG. Communicative Brain lab
Want to do a PhD on the processing of multimodal communicative signals in noise, in individuals who are hard-of-hearing or suffer from central auditory processing disorder? Come work with James Trujillo (and me)! Apply below :-)
Our review (w/ Sara Mazzini) on oscillations in audiovisual language is now out! If you can't access the pdf, send us a message. Check it here!
Thrilled to finally share this work! We used MEG & Rapid Invisible Frequency Tagging (RIFT) in visual search & show (1) V1/V2 behaves like a priority map and (2) alpha oscillations impose a *blanket inhibition* on all visual inputs. The Centre for Human Brain Health
biorxiv.org/content/10.110…
Our protocol on combining dual-EEG and audiovisual recordings to study naturalistic human communication is now available on STAR Protocols! star-protocols.cell.com/protocols/2813 Linda Drijvers Judith Holler
The first project of my PhD is out! In the midst of the pandemic, we set out to investigate whether delta- and theta-band responses to individual words were affected by being placed in sentence context. 1/9
andrea e. martin
jneurosci.org/content/43/26/…