Erik Bekkers (@erikjbekkers) 's Twitter Profile
Erik Bekkers

@erikjbekkers

Associate Prof @AmlabUva @UvA_Amsterdam @Ellis_Amsterdam | @ELLISforEurope Scholar | Geometric and Group Equivariant Deep Learning

ID: 1169879005408817153

Link: https://scholar.google.com/citations?hl=en&user=yeWrfR4AAAAJ&view_op=list_works&sortby=pubdate | Joined: 06-09-2019 07:45:35

784 Tweets

4.4K Followers

959 Following

Rianne van den Berg (@vdbergrianne) 's Twitter Profile Photo

Our deep-learned exchange-correlation functional Skala is finally available to try out 🎉 Tell us what works and where Skala can be improved!

Floor Eijkelboom (@feijkelboom) 's Twitter Profile Photo

We asked the same question: how can we combine the strengths of continuous and discrete approaches? Similar to CDCD, in our work, Purrception, we extend Variational FM to model VQ latents through continuous-discrete transport for image generation :D 👉 arxiv.org/abs/2510.01478

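For readers unfamiliar with flow matching (FM), here is a minimal Python sketch of vanilla conditional flow matching, the continuous-transport objective these works build on. It is not Purrception's variational FM over VQ latents with continuous-discrete transport; `velocity_net` and the tensor shapes are hypothetical placeholders.

```python
# Minimal sketch of vanilla conditional flow matching (not Purrception's method).
import torch

def flow_matching_loss(velocity_net, x0: torch.Tensor, x1: torch.Tensor) -> torch.Tensor:
    """x0: noise samples (B, D); x1: data, e.g. continuous embeddings of VQ codes (B, D)."""
    t = torch.rand(x0.shape[0], 1)            # one time per sample, uniform in [0, 1]
    xt = (1.0 - t) * x0 + t * x1              # linear interpolant between noise and data
    target_velocity = x1 - x0                 # velocity of the straight-line path
    pred_velocity = velocity_net(xt, t)       # hypothetical network: (x_t, t) -> velocity
    return ((pred_velocity - target_velocity) ** 2).mean()
```
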
Sharvaree Vadgama (@sharvvadgama) 's Twitter Profile Photo

Presenting our second work in the ‘Equivariance vs scale’ discussion: 🔹 Platonic Transformers. Safe to say, equivariance is not dead. 😄 To learn more, check this 👇

Mohammad Niaz (@mohammad_niaz94) 's Twitter Profile Photo

What if you could equip transformers with symmetry without slowing them down? 💡 Our new paper extends RoPE transformers to symmetry groups. The key breakthrough: It achieves this with zero computational overhead, improving performance even on non-equivariant tasks.

Rishabh Anand 🧬 (@rishabh16_) 's Twitter Profile Photo

🚨 New preprint! Love using Transformers and scaling? Love equivariance and inductive bias? They needn’t be at odds!! Introducing the Platonic Transformer, a geometric Transformer framework that hijacks RoPE to achieve E(3) equivariance at no additional cost ⚡️ We are as fast

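For context, below is a minimal NumPy sketch of standard 1D rotary position embeddings (RoPE), the mechanism these tweets describe extending to symmetry groups. This is the ordinary RoPE construction only, not the Platonic Transformer's group-equivariant version; shapes and names are illustrative.

```python
# Minimal sketch of standard (1D) RoPE; the group-equivariant extension is not shown here.
import numpy as np

def rope(x: np.ndarray, positions: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Rotate feature pairs of x (seq_len, d) by position-dependent angles."""
    seq_len, d = x.shape
    half = d // 2
    freqs = base ** (-np.arange(half) / half)        # one frequency per feature pair
    angles = positions[:, None] * freqs[None, :]     # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # 2D rotation applied to each (x1, x2) pair
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)

# Attention scores between rotated queries/keys depend only on relative position,
# which is the relative structure the equivariant extensions build on.
q = rope(np.random.randn(8, 64), np.arange(8, dtype=float))
k = rope(np.random.randn(8, 64), np.arange(8, dtype=float))
scores = q @ k.T
```
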
Rishabh Anand 🧬 (@rishabh16_) 's Twitter Profile Photo

I will be presenting this at MIT Jameel Clinic for AI & Health’s #MoML2025 this Wednesday, 22nd Oct. Come by our poster to chat about how you can incorporate PlatoFormers into your pipeline TODAY

Rishabh Anand 🧬 (@rishabh16_) 's Twitter Profile Photo

Replying to MIT Jameel Clinic for AI & Health: I think it's absolutely remarkable that we can achieve E(3) equivariance without any of the gimmicky modules or expensive operations commonly seen in geometric GNN counterparts. We hope inductive bias can finally enter its scaling era 🔥

David Wessels (@dafidofff) 's Twitter Profile Photo

Recent discussions have largely focused on scaling versus geometry. This is another perfect example showing that geometry can be made scalable if we as GDL people start to take scaling seriously. Let's take the best of both worlds 🦾🦾

Erik Bekkers (@erikjbekkers) 's Twitter Profile Photo

Time to scale up equivariant architectures!❤️Absolutely wonderful work Max! I wouldn't go as far as saying efficiency is everything, bc equivariance is ;), but for sure this seems like an indisputably winning combo! Scale+equiv ftw imo😁 & great to see Triton paying off like this

Alejandro García (@algarciacast) 's Twitter Profile Photo

✨CAMERA READY UPDATE✨ with new cool plots in which we show how our Equivariant Neural Eikonal Solver can be used for path planning on Riemannian manifolds. Check our paper here: arxiv.org/pdf/2505.16035 And see you at NeurIPS 🥰

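For context, and independent of this paper's specific equivariant architecture, the standard Riemannian eikonal equation that such neural solvers approximate is, for a travel-time field $T$ from a source point $x_s$ under metric $g$:

$$
\|\nabla T(x)\|_{g(x)^{-1}} \;=\; \sqrt{\nabla T(x)^\top\, g(x)^{-1}\, \nabla T(x)} \;=\; 1, \qquad T(x_s) = 0,
$$

whose viscosity solution is the geodesic distance to $x_s$; shortest paths for planning are then obtained by following $-\nabla T$ back to the source.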