jack morris (@jxmnop) 's Twitter Profile
jack morris

@jxmnop

researcher @meta // getting my phd in nlp @cornell_tech 🚠 // academic optimist // master of the semicolon

ID: 783098774130401280

Link: http://jxmo.io · Joined: 04-10-2016 00:17:51

3.3K Tweets

17.17K Followers

788 Following

jack morris (@jxmnop) 's Twitter Profile Photo

fun idea I tested out this morning: Language model fine-tuning in embedding space

here's the idea: learn a model of *embeddings* of a certain text distribution; then, to generate text, sample embedding and map back to text with vec2text

this lets us generate language without
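a minimal sketch of that pipeline, under toy assumptions: a single Gaussian stands in for the learned model of the embedding distribution, and random vectors stand in for real text embeddings. the final inversion step (mapping a sampled vector back to a string with vec2text) is only indicated in a comment, since it needs the real library:

```python
import numpy as np

def fit_embedding_model(embeddings: np.ndarray):
    # toy "model" of the text distribution's embeddings: one Gaussian
    mu = embeddings.mean(axis=0)
    cov = np.cov(embeddings, rowvar=False)
    return mu, cov

def sample_embeddings(mu, cov, k, rng):
    # draw k new embeddings from the learned distribution
    return rng.multivariate_normal(mu, cov, size=k)

rng = np.random.default_rng(0)
corpus_emb = rng.normal(size=(100, 8))  # stand-in for real text embeddings
mu, cov = fit_embedding_model(corpus_emb)
samples = sample_embeddings(mu, cov, k=4, rng=rng)
# in the tweet's pipeline, each row of `samples` would now be
# inverted back to text with vec2text
```

in practice the "model of embeddings" could be anything generative (a diffusion model, a flow, etc.); the Gaussian here is just the smallest thing that makes the sample-then-invert structure concrete.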
jack morris (@jxmnop) 's Twitter Profile Photo

TIL that the volume of the unit sphere goes up until a dimensionality of exactly 5, and then tends towards 0 with infinite dimensions

(as a human who experiences the world in three dimensions, i find this very unpleasant)
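the claim is easy to check numerically: the volume of the unit n-ball has the closed form V_n = π^(n/2) / Γ(n/2 + 1), which a few lines of Python can evaluate (a quick sketch, not from the original tweet):

```python
import math

def unit_ball_volume(n: int) -> float:
    # closed form: V_n = pi^(n/2) / Gamma(n/2 + 1)
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1)

# volume rises until n = 5 (V_5 ~ 5.26), then decays toward zero
peak = max(range(1, 21), key=unit_ball_volume)  # -> 5
```

sanity checks: n = 2 gives π (the unit disk) and n = 3 gives 4π/3, while by n = 20 the volume has already dropped below 0.03.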
jack morris (@jxmnop) 's Twitter Profile Photo

let's get some Friday standies goin:

yesterday
- work at meta office
- gym (leg day)
- coffee w twitter mutual
- cooked a lasagna

today
- work from home
- deep thinking about text embeddings
- check on vec2text models that i trained
- head to tahoe w friends

no blockers

jack morris (@jxmnop) 's Twitter Profile Photo

it’s gone long enough without happening that i am tweeting this research project into existence: i am wholly convinced that images can be *exactly* recovered from their embeddings, and the fact that no one has done this so far is simply a skill issue

a year ago now we showed

jack morris (@jxmnop) 's Twitter Profile Photo

a big hope i have for future AI is to rediscover, clean, and polish all the digital cruft I’ve left in my wake over the years

imagine being able to dispatch 100 copies of yourself to finish and document repositories, edit and synthesize all the notes you’ve written

sounds nice

jack morris (@jxmnop) 's Twitter Profile Photo

there are three types of AI people in SF:
- idealist (care about conceptual aesthetics; in it for the beauty of it all; novelty >> function)
- grinder/tech bro (love NVIDIA/TSLA; mostly just in it for the $$)
- doomer (think AI might kill us; earnestly trying to save the world)

jack morris (@jxmnop) 's Twitter Profile Photo

TIME Magazine has rightly named famed deep learning pioneer ptrblock as the most influential person in Artificial Intelligence.

jack morris (@jxmnop) 's Twitter Profile Photo

On why you should read more (research papers): one of the most valuable problems we could solve as a community is idea deduplication at the meta-project level

the counterintuitive consequence is that the most effective way for researchers to cut through the AI hype and be more

CLS (@chengleisi) 's Twitter Profile Photo

Automating AI research is exciting! But can LLMs actually produce novel, expert-level research ideas?

After a year-long study, we obtained the first statistically significant conclusion: LLM-generated ideas are more novel than ideas written by expert human researchers.
jack morris (@jxmnop) 's Twitter Profile Photo

funniest thing about San francisco is that its location was chosen for a Spanish fort in the 1700s

it’s the least hospitable location on the entire bay (hilly, foggy, windy, marshy) and yet everyone just still lives here anyway

thanks, spaniards

(beautiful city though)
jack morris (@jxmnop) 's Twitter Profile Photo

holy cow 

guy who writes (very helpful!) blogs about logistic regression, activation functions, validation sets, et cetera has an h-index of 59

that's higher than almost all the researchers i've ever met

interesting strategy 🤔
jack morris (@jxmnop) 's Twitter Profile Photo

a little-known fact about vec2text is that the reason it works well is test-time compute

our model iteratively refines its guesses and then re-embeds them ("Feedback") –– improves massively w/ more forward passes

more people should be trying this
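the refine-and-re-embed loop can be sketched abstractly. everything here is a toy stand-in (assumptions, not the real vec2text components): "text" is itself a vector, the embedder is a fixed linear map, and the corrector nudges the guess along the embedding-space residual, but the shape of the loop, and the fact that more forward passes shrink the reconstruction error, carries over:

```python
import numpy as np

# toy stand-ins for the real components:
A = np.diag([1.0, 0.5, 2.0])                 # "embedder": fixed linear map
embed = lambda x: A @ x
correct = lambda x, resid: x + 0.3 * (A.T @ resid)  # "corrector": residual step

def invert_with_feedback(target_emb, steps, dim=3):
    """vec2text-style 'Feedback' loop: re-embed the current guess,
    compare against the target embedding, propose a correction."""
    guess = np.zeros(dim)
    for _ in range(steps):
        resid = target_emb - embed(guess)  # re-embed and measure the gap
        guess = correct(guess, resid)
    return guess

x_true = np.array([1.0, -2.0, 0.5])
few = invert_with_feedback(embed(x_true), steps=3)
many = invert_with_feedback(embed(x_true), steps=50)
# more forward passes -> smaller reconstruction error
```

with these stand-ins the loop is literally gradient descent on the embedding-space distance, which is the cleanest way to see why extra test-time compute keeps helping: each pass spends another model call to close the remaining gap.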
jack morris (@jxmnop) 's Twitter Profile Photo

strawberry is cool 🍓

but isn't "more test-time compute => better performance" already a well-known phenomenon that holds on existing models? 

i took the figure on the right from Charlie Snell's recent paper on inference-time scaling laws, but have seen this idea many places