Elizabeth Mieczkowski (@beth_miecz)'s Twitter Profile
Elizabeth Mieczkowski

@beth_miecz

Studying multi-agent collaboration with @CoCoSci_Lab & @Velez_CoLab.
PhD Student @PrincetonCS.
Prev: @CornellCIS 2021, Lab Tech @MITBrainandCog @KanwisherLab.

ID: 1189705328381906944

Link: https://emieczkowski.github.io/
Joined: 31-10-2019 00:47:15

84 Tweets

284 Followers

299 Following

Chantal Valdivia-Moreno (@talvaldivia)'s Twitter Profile Photo

Excited to share my (first!) first-author preprint with Stephanie Fine Sasse @HilaryKLambert @KatieAMcLaughlin Leah Somerville Erik Nook 🏳️‍🌈 : Emotion word production develops in tandem with general verbal fluency and reveals key dimensions organizing emotion concepts osf.io/urf2w

Chi Jin (@chijinml)'s Twitter Profile Photo

Everyone's talking about Claude 3.7 playing Pokémon Red, but stay tuned—we've got some exciting work on LLMs for Pokémon battles coming soon! 😃🔥

Gianluca Bencomo (@gianlucabencomo)'s Twitter Profile Photo

New pre-print! In this work, we explore the extent to which different inductive biases can be instantiated among disparate neural architectures, specifically Transformers, CNNs, MLPs, and LSTMs. Link: arxiv.org/abs/2502.20237 (1/4)

Seth Karten (@sethkarten)'s Twitter Profile Photo

Can a Large Language Model (LLM) with zero Pokémon-specific training achieve expert-level performance in competitive Pokémon battles? Introducing PokéChamp, our minimax LLM agent that reaches top 30%-10% human-level Elo on Pokémon Showdown! New paper on arXiv and code on github!
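A minimal sketch of the idea a "minimax LLM agent" names: depth-limited minimax search in which leaf states are scored by a model-based value function. Every name here (`legal_moves`, `apply_move`, `llm_value`) is a hypothetical placeholder for illustration and is not taken from the PokéChamp paper or repository.

```python
# Illustrative toy: depth-limited minimax with a model-based leaf evaluator.
# This is NOT PokéChamp's implementation; all helpers are assumed stand-ins.

def llm_value(state) -> float:
    """Placeholder leaf evaluation. A real agent might prompt an LLM with the
    battle state here; this stub just returns a neutral score."""
    return 0.0

def minimax(state, depth, maximizing, legal_moves, apply_move, value=llm_value):
    """Return (best_score, best_move) for the player to act in `state`."""
    moves = legal_moves(state)
    if depth == 0 or not moves:
        return value(state), None

    best_score = float("-inf") if maximizing else float("inf")
    best_move = None
    for move in moves:
        score, _ = minimax(apply_move(state, move), depth - 1, not maximizing,
                           legal_moves, apply_move, value)
        if (maximizing and score > best_score) or (not maximizing and score < best_score):
            best_score, best_move = score, move
    return best_score, best_move
```

In an actual battle agent the leaf scores (and candidate moves) would presumably come from prompting an LLM with the game state; the toy above only shows where such calls would slot into the search.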

Alexander Ku (@alex_y_ku)'s Twitter Profile Photo

(1/11) Evolutionary biology offers a powerful lens into Transformers' learning dynamics! Two learning modes in Transformers (in-weights & in-context) mirror adaptive strategies in evolution. Crucially, environmental predictability shapes both systems similarly.
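As a toy analogy for the two modes (not the paper's model): "in-weights" learning stores task knowledge in fitted parameters, while "in-context" learning adapts to examples shown at inference time without updating those parameters. The numpy sketch below, with assumed hypothetical tasks y = 2x and y = -3x, only illustrates that distinction.

```python
import numpy as np

rng = np.random.default_rng(0)

# "In-weights" learning: parameters fitted once, during training, on a fixed
# task (here, y = 2x). They do not change at test time.
x_train = rng.normal(size=(100, 1))
w_in_weights = np.linalg.lstsq(x_train, 2 * x_train[:, 0], rcond=None)[0]

# "In-context" learning: at test time a few examples of a *new* task
# (here, y = -3x) are provided, and the prediction adapts from that context
# alone, leaving the stored weights untouched.
x_ctx = rng.normal(size=(5, 1))
y_ctx = -3 * x_ctx[:, 0]
w_from_context = np.linalg.lstsq(x_ctx, y_ctx, rcond=None)[0]

x_query = np.array([[1.0]])
print("in-weights prediction :", x_query @ w_in_weights)    # ~ 2.0 (old task)
print("in-context prediction :", x_query @ w_from_context)  # ~ -3.0 (new task)
```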

Pramod RT/ಪ್ರಮೋದ್ ರಾ ತಾ (@pramodrt9)'s Twitter Profile Photo

Thrilled to announce our new publication titled 'Decoding predicted future states from the brain's physics engine' with Elizabeth Mieczkowski, Cyn X. Fang, Nancy Kanwisher, and Josh Tenenbaum. science.org/doi/full/10.11… (1/n)

Cognition (@cognitionjourn)'s Twitter Profile Photo

“People Evaluate Idle Collaborators Based on their Impact on Task Efficiency” 📢 New from: Elizabeth Mieczkowski, Cameron Rouse Turner, Natalia Vélez, & Tom Griffiths sciencedirect.com/science/articl… TL;DR: Sometimes it's acceptable not to help with group work 🧵👇

Elizabeth Mieczkowski (@beth_miecz)'s Twitter Profile Photo

So excited our paper is now out in Cognition! Huge thanks to our editor and reviewers 🧠 Their thoughtful suggestions inspired Experiments 3 & 4, including a striking inverse correlation between idleness judgments and speed-up predictions.