Andreas Auer (@andauer)'s Twitter Profile
Andreas Auer

@andauer

ML PhD Student @ ELLIS Unit Linz - JKU - ML Institute |
Time Series - Sequence Modeling - Uncertainty |
🏠 based in Vienna

ID: 1522868377647263746

Link: https://apointa.github.io/
Joined: 07-05-2022 09:18:29

19 Tweets

270 Followers

223 Following

Kajetan Schweighofer (@kschweig_)'s Twitter Profile Photo

πŸš€ Excited to share our latest research on quantifying the predictive uncertainty of machine learning models. QUAM searches for adversarial models (not adversarial examples!) to better estimate the epistemic uncertainty, the uncertainty about chosen model parameters. 1/5

Sepp Hochreiter (@hochreitersepp)'s Twitter Profile Photo

Super paper on reliable prediction for time series at NeurIPS. Conformal prediction adapted to time series yields excellent results.
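For context on the technique the tweet refers to: below is a minimal sketch of standard *split* conformal prediction around point forecasts. This is the generic textbook recipe, not the specific time-series adaptation from the NeurIPS paper (which must additionally deal with the non-exchangeability of time-series data); function and variable names are illustrative.

```python
import numpy as np

def split_conformal_interval(cal_preds, cal_targets, test_preds, alpha=0.1):
    """Split conformal prediction intervals around point forecasts.

    Generic sketch: calibrate on held-out residuals, then widen every
    test prediction by the conformal quantile. Assumes exchangeable
    data; time-series methods relax exactly this assumption.
    """
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(cal_targets - cal_preds)
    n = len(scores)
    # Conformal quantile with the finite-sample (n + 1) correction.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level, method="higher")
    # Symmetric interval: guaranteed >= (1 - alpha) marginal coverage
    # under exchangeability.
    return test_preds - q, test_preds + q
```

With `alpha=0.1`, the returned intervals cover the true target at least 90% of the time on exchangeable data, regardless of how good or bad the underlying forecaster is.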

Fabian Paischer (@paischerfabian)'s Twitter Profile Photo

Interested in a semantic memory for reinforcement learning? I was recently invited to a podcast to talk about our #NeurIPS2023 paper: Semantic HELM (arxiv.org/abs/2306.09312). In case you are interested, you can stream the episode here: open.spotify.com/episode/4n2lmC…

Johannes Brandstetter (@jo_brandstetter)'s Twitter Profile Photo

xLSTM is out -- putting LSTM networks on steroids to become a more than serious LLM competitor. How? Via exponential gating and enhanced (cell state) memory capacities. Does it work? Oh, yeah πŸš€πŸš€ arxiv.org/abs/2405.04517

Sepp Hochreiter (@hochreitersepp)'s Twitter Profile Photo

I am so excited that xLSTM is out. LSTM is close to my heart - for more than 30 years now. With xLSTM we close the gap to existing state-of-the-art LLMs. With NXAI we have started to build our own European LLMs. I am very proud of my team. arxiv.org/abs/2405.04517

Maximilian Beck (@maxmbeck)'s Twitter Profile Photo

The #xLSTM is finally live! What an exciting day! How far do we get in language modeling with the LSTM compared to State-of-the-Art LLMs? I would say pretty, pretty far! How? We extend the LSTM with Exponential Gating and parallelizable Matrix Memory! arxiv.org/pdf/2405.04517

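For readers curious what "exponential gating" means in practice: here is a minimal NumPy sketch of one sLSTM-style recurrent step in the spirit of the xLSTM paper (arxiv.org/abs/2405.04517). The weight layout, names, and the running-max stabilizer formulation are assumptions for illustration, not the authors' reference implementation; the parallelizable matrix-memory (mLSTM) variant is not shown.

```python
import numpy as np

def slstm_step(x_t, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One sLSTM-style step with exponential input/forget gates.

    Illustrative sketch: exp() gates replace the classic sigmoids,
    a normalizer state n and stabilizer state m keep the recurrence
    numerically well-behaved.
    """
    # Pre-activations for cell input z and gates i, f, o (stacked in W, R, b).
    z_tilde, i_tilde, f_tilde, o_tilde = np.split(W @ x_t + R @ h_prev + b, 4)
    z = np.tanh(z_tilde)
    o = 1.0 / (1.0 + np.exp(-o_tilde))   # sigmoid output gate
    # Exponential gates, stabilized by a running-max state m so the
    # exponentials never overflow (only their ratio matters).
    m = np.maximum(f_tilde + m_prev, i_tilde)
    i = np.exp(i_tilde - m)
    f = np.exp(f_tilde + m_prev - m)
    c = f * c_prev + i * z               # cell state
    n = f * n_prev + i                   # normalizer state
    h = o * (c / n)                      # normalized hidden state
    return h, c, n, m
```

Because the input gate is exponential rather than sigmoid-bounded, a strongly relevant new input can dominate the cell state in a single step, which is the gating behavior the tweet highlights.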
Johannes Brandstetter (@jo_brandstetter)'s Twitter Profile Photo

Introducing Vision-LSTM - making xLSTM read images 🧠 It works ... pretty, pretty well πŸš€πŸš€ But convince yourself :) We are happy to share code already! πŸ“œ: arxiv.org/abs/2406.04303 πŸ–₯️: nx-ai.github.io/vision-lstm/ All credits to my stellar PhD student Benedikt Alkin
