Shawn O'Neil (@shawntoneil)'s Twitter Profile
Shawn O'Neil

@shawntoneil

Assistant Prof @ CU Anschutz Med School | TISLab | Biomedical Informatics | Data Engineering | ML | Bioinformatics | Curious Learner & Passionate Educator

ID: 1643679738731565056

Link: https://tislab.org · Joined: 05-04-2023 18:20:08

16 Tweets

17 Followers

12 Following

Yann LeCun (@ylecun)'s Twitter Profile Photo

In this interview with Le Monde, Yoshua Bengio expresses his fears of some catastrophe scenarios that could be enabled by progress in AI. One such scenario he is worried about is a flood of disinformation and political propaganda on social networks. He says that we have a "moral …

Shawn O'Neil (@shawntoneil)'s Twitter Profile Photo

“We are, of course, rapidly integrating chatbots into our email, calendar, and office suite, and fully intend to monetize that data stream. Don’t say we didn’t warn you.”

Sven Dorkenwald (@sdorkenw)'s Twitter Profile Photo

We are releasing a whole-brain connectome of the fruit fly, including ~130k annotated neurons and tens of millions of typed synapses! Explore the connectome: codex.flywire.ai Reconstruction paper: biorxiv.org/content/10.110… Annotation paper: biorxiv.org/content/10.110… 1/6

Brendan Dolan-Gavitt (@moyix)'s Twitter Profile Photo

Church’s lambda calculus and the Turing machine are equally powerful but differ in the fact that Turing machines use mutable state. To this day, there is a rift between functional and imperative programming languages, because of the separation of Church and state.
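
The pun rests on a real distinction. As a loose illustration (all names below are invented for the example), the same sum can be written imperatively, with state mutated in place, or functionally, with no mutation at all:

from functools import reduce

def sum_imperative(xs):
    total = 0              # mutable state, updated in place (Turing-machine flavor)
    for x in xs:
        total += x
    return total

def sum_functional(xs):
    # no value is ever overwritten; each step builds a new one (lambda-calculus flavor)
    return reduce(lambda acc, x: acc + x, xs, 0)

assert sum_imperative([1, 2, 3]) == sum_functional([1, 2, 3]) == 6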

Yann LeCun (@ylecun)'s Twitter Profile Photo

This is huge: Llama-v2 is open source, with a license that authorizes commercial use! This is going to change the landscape of the LLM market. Llama-v2 is available on Microsoft Azure and will be available on AWS, Hugging Face and other providers. Pretrained and fine-tuned …
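
As a rough sketch of the Hugging Face availability mentioned above (not an official recipe): this assumes the gated checkpoint id meta-llama/Llama-2-7b-hf, an accepted Meta license plus huggingface-cli login, and the accelerate package installed so device_map works.

from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "meta-llama/Llama-2-7b-hf"  # assumed checkpoint id; access is gated

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")  # needs accelerate

inputs = tokenizer("Open-source licenses matter because", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output[0], skip_special_tokens=True))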

Vaibhav Adlakha (@vaibhav_adlakha)'s Twitter Profile Photo

We introduce LLM2Vec, a simple approach to transform any decoder-only LLM into a text encoder. We achieve SOTA performance on MTEB in the unsupervised and supervised category (among the models trained only on publicly available data). 🧵1/N Paper: arxiv.org/abs/2404.05961

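For intuition only, here is a minimal sketch of the baseline idea of using a decoder-only LM as a text encoder by mean-pooling its hidden states. The actual LLM2Vec recipe from the paper (enable bidirectional attention, masked next-token prediction, then unsupervised contrastive learning) is not reproduced here, and gpt2 is just a small stand-in checkpoint.

import torch
from transformers import AutoModel, AutoTokenizer

MODEL = "gpt2"  # stand-in; any decoder-only checkpoint works in principle

tokenizer = AutoTokenizer.from_pretrained(MODEL)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModel.from_pretrained(MODEL).eval()

def embed(texts):
    """One fixed-size vector per input text, via attention-masked mean pooling."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state   # (batch, tokens, hidden)
    mask = batch["attention_mask"].unsqueeze(-1)    # (batch, tokens, 1)
    summed = (hidden * mask).sum(dim=1)             # zero out padding positions
    return summed / mask.sum(dim=1).clamp(min=1)    # mean over real tokens only

vecs = embed(["a whole-brain connectome", "fruit fly neurons"])
print(vecs.shape)  # torch.Size([2, 768]) for gpt2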