Eirikur Agustsson (@etagust) 's Twitter Profile
Eirikur Agustsson

@etagust

Senior Research Scientist, Gemini Team, Google Deepmind

ID: 534042677

Link: https://scholar.google.com/citations?user=Uhvyua4AAAAJ&hl=en
Joined: 23-03-2012 08:26:57

198 Tweets

1.1K Followers

510 Following

Eirikur Agustsson (@etagust) 's Twitter Profile Photo

Excited to share our latest work on Video Compression using Transformers (VCT), check it out! PS: Fabian, Sergi and I will be attending CVPR, looking forward to catching up with everyone 😊

Dustin Tran (@dustinvtran) 's Twitter Profile Photo

It’s surprising that in 2022, there remains little movement away from LaTeX toward a new language. It has some of the most unintuitive designs and syntax you’d expect in a language today.

Eirikur Agustsson (@etagust) 's Twitter Profile Photo

Apparently Twitter isn't confident that a tweet by the Chairman @ Twitter about the world's wealthiest person is about "Business personalities"? 😅

Eirikur Agustsson (@etagust) 's Twitter Profile Photo

This Icelandic company just received a €100M+ EU grant to build a massive CO2 storage hub in my home town, Hafnarfjordur, Iceland. Super cool tech that basically turns CO2 into stone by pumping carbonated water into lava!

Eirikur Agustsson (@etagust) 's Twitter Profile Photo

A brief life update: 🇨🇭➡️🇮🇸 After 10 wonderful years in Switzerland, I have finally returned to my home country, Iceland. Goodbye Google Zurich, and hello Google Iceland 🥶🌋

Fabian Mentzer (@mentzer_f) 's Twitter Profile Photo

1/ Excited to share that our work on masked transformers for compression has been accepted at #ICCV2023! We show two things: a) how a simple transformer-based architecture can lead to state-of-the-art image compression results: arxiv.org/abs/2304.07313

AK (@_akhaliq) 's Twitter Profile Photo

Finite Scalar Quantization: VQ-VAE Made Simple

paper page: huggingface.co/papers/2309.15…

We propose to replace vector quantization (VQ) in the latent representation of VQ-VAEs with a simple scheme termed finite scalar quantization (FSQ), where we project the VAE representation down to a
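The abstract above sketches the core idea: each latent dimension is bounded and then rounded to a small fixed set of values, so the implicit codebook is just the product grid of those values. A minimal numpy illustration of that quantization step follows. This is my own toy reconstruction from the description, not the paper's code: the `tanh` bound and odd level counts are simplifying assumptions, and the straight-through gradient used for training is omitted.

```python
import numpy as np

def fsq(z, levels):
    """Toy finite scalar quantization: bound each latent dimension,
    then round it to one of `levels[i]` integer values.
    Assumes odd level counts so the grid is symmetric around 0."""
    z = np.asarray(z, dtype=np.float64)
    half = (np.asarray(levels) - 1) / 2.0   # e.g. levels=5 -> values in {-2,...,2}
    bounded = half * np.tanh(z)             # squash dim i into [-half[i], half[i]]
    return np.round(bounded)                # nearest grid point = code for that dim

# Implicit codebook size = product of levels, e.g. [7, 5, 5, 5] -> 875 codes.
codes = fsq([0.3, -4.0, 10.0, 0.0], levels=[7, 5, 5, 5])
# -> array([ 1., -2.,  2.,  0.])
```

Because the codebook is a fixed grid rather than learned vectors, there is no commitment loss or codebook-collapse issue to manage, which is the simplification the title refers to.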
Emilien Dupont (@emidup) 's Twitter Profile Photo

We build neural codecs from a *single* image or video, achieving compression performance close to SOTA models trained on large datasets, while requiring ~100x fewer FLOPs for decoding ⚡ #CVPR2024

c3-neural-compression.github.io
Jeff Dean (@jeffdean) 's Twitter Profile Photo

What a way to celebrate one year of incredible Gemini progress -- #1🥇across the board on overall ranking, as well as on hard prompts, coding, math, instruction following, and more, including with style control on.

Thanks to the hard work of everyone in the Gemini team and