Ujjwal Upadhyay
@theujjwal9
Vision Language Models | Medical Imaging | Neuroscience
ID: 833094373894156289
https://ujjwal9.com · Joined 18-02-2017 23:22:31
500 Tweets
84 Followers
472 Following
Log-linear attention — a new type of attention proposed by the Massachusetts Institute of Technology (MIT) which is:
- as fast and efficient as linear attention
- as expressive as softmax attention
It uses a small but growing set of memory slots whose count increases logarithmically with the sequence length. Here's how it works:
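To make the "logarithmically many memory slots" idea concrete, here is a minimal toy sketch (not the paper's exact algorithm): the prefix of length T is split into power-of-two buckets via its binary decomposition, each bucket keeps a linear-attention style summary state S = K_b^T V_b, and the query combines those O(log T) summaries. The `level_weight` hook is a hypothetical stand-in for the learned per-level scalars of the real method; with a constant weight of 1.0 this reduces exactly to plain linear attention.

```python
import numpy as np

def binary_buckets(T):
    """Binary decomposition of prefix length T into power-of-two
    bucket sizes, e.g. 13 -> [8, 4, 1]. Count is O(log T)."""
    sizes, bit = [], 1 << (max(T, 1).bit_length() - 1)
    while bit:
        if T & bit:
            sizes.append(bit)
        bit >>= 1
    return sizes

def log_linear_attention_sketch(q, K, V, level_weight=lambda lvl: 1.0):
    """Toy sketch: the query attends to O(log T) bucket summaries
    instead of all T tokens. Each bucket stores a linear-attention
    state S = K_b^T V_b (a d x d_v matrix); `level_weight` is an
    illustrative placeholder for learned per-level scalars."""
    out = np.zeros(V.shape[1])
    start = 0
    for lvl, sz in enumerate(binary_buckets(len(K))):
        S = K[start:start + sz].T @ V[start:start + sz]  # bucket summary
        out += level_weight(lvl) * (q @ S)
        start += sz
    return out
```

With uniform weights the bucket summaries sum to the full state K^T V, so the output matches vanilla linear attention while only O(log T) states are ever touched per query; the expressiveness of the real method comes from making the per-level weights data-dependent.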