wichmann
@wichmann
@zondervan.bksy.social (wichmann)
ID: 15582074
24-07-2008 14:04:19
5.5K Tweets
161 Followers
149 Following
For years since the GPT-2 paper, emergent in-context learning (ICL) from 'next-token' training has been treated as something deeply tied to human language. But … is it? Thrilled to share our latest result: Genomic 🧬 models trained only on