Matthijs Pals
@matthijs_pals
Using deep learning to elucidate neural representations and dynamics @MackeLab
ID: 1460663812303032336
16-11-2021 17:39:45
Want to train neuroscience models consisting of single cells, recurrent neural networks (RNNs), or huge feedforward networks - all with detailed biophysics? Michael Deistler's Jaxley has your back! 👇
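The tweet doesn't show code, so as a rough sense of what a "single cell with detailed biophysics" involves, here is a toy single-compartment Hodgkin-Huxley neuron in plain numpy (standard textbook parameters, forward Euler integration; this is a generic sketch, not Jaxley's API):

```python
import numpy as np

def hh_step(V, m, h, n, I_ext, dt=0.01):
    """One forward-Euler step of the classic Hodgkin-Huxley equations (V in mV, dt in ms)."""
    # gating-variable rate functions (standard HH parameterization)
    a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
    a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
    # membrane currents (uA/cm^2): sodium, potassium, leak
    I_Na = 120.0 * m**3 * h * (V - 50.0)
    I_K = 36.0 * n**4 * (V + 77.0)
    I_L = 0.3 * (V + 54.387)
    # Euler updates, membrane capacitance C = 1 uF/cm^2
    V = V + dt * (I_ext - I_Na - I_K - I_L)
    m = m + dt * (a_m * (1.0 - m) - b_m * m)
    h = h + dt * (a_h * (1.0 - h) - b_h * h)
    n = n + dt * (a_n * (1.0 - n) - b_n * n)
    return V, m, h, n

def simulate(I_ext=10.0, T=50.0, dt=0.01):
    # start near the resting state at -65 mV
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    Vs = []
    for _ in range(int(T / dt)):
        V, m, h, n = hh_step(V, m, h, n, I_ext, dt)
        Vs.append(V)
    return np.array(Vs)

V = simulate()                                  # 50 ms with 10 uA/cm^2 drive
spikes = np.sum((V[1:] > 0) & (V[:-1] <= 0))    # upward crossings of 0 mV
```

The point of doing this in a JAX-based simulator rather than numpy is that the whole integration becomes differentiable, so channel and morphology parameters can be trained with gradients.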
Back in 2022, Roxana Zeraati & I organized a Cosyne workshop on neural timescales, and after working on it for the last 2 years together, it's now a review paper! arxiv.org/abs/2409.02684 w/ Anna Levina & Jakob Macke (2nd blogpost to turn into a real review paper this year lol)
We’re at Bernstein Conference next week with lots of new work to share: 10 posters, 1 workshop talk, and don’t miss Jakob Macke’s invited talk on Wednesday! If you’re excited about machine learning for (neuro)science, come chat with us—we’re hiring PhD students & postdocs!
At Poster III-69, Matthijs Pals will explain how to fit RNNs to neural data - and use them as generative models. Want to understand the fitted models? We show how to obtain all fixed points in low-rank piecewise-linear RNNs.
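A rank-1 ReLU sketch conveys the idea behind exhaustive fixed-point search: in a piecewise-linear network every ReLU region is linear, so one can enumerate regions, solve a linear (here scalar) equation per region, and keep only the self-consistent solutions. A minimal numpy sketch for the assumed fixed-point condition x = m (nᵀ relu(x)) + b of a rank-1 network (the poster's exact formulation may differ):

```python
import numpy as np

def fixed_points_rank1_relu(m, n, b):
    """All fixed points of x = m * (n @ relu(x)) + b, a rank-1 ReLU RNN.

    Substituting x* = kappa * m + b reduces the problem to the scalar
    piecewise-linear equation kappa = n @ relu(kappa * m + b). Each
    linearization region (fixed ReLU active set) yields one linear
    equation; a solution counts only if it lies inside its own region.
    """
    # kappa values where some unit switches on/off (region boundaries)
    bps = sorted(-b[i] / m[i] for i in range(len(m)) if m[i] != 0.0)
    edges = np.concatenate(([-1e9], bps, [1e9]))
    fps = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        kmid = 0.5 * (lo + hi)
        active = (kmid * m + b) > 0            # ReLU active set in this region
        slope = np.sum(n[active] * m[active])  # d(rhs)/d(kappa), constant per region
        const = np.sum(n[active] * b[active])
        if np.isclose(slope, 1.0):
            continue                           # degenerate: no isolated solution
        kappa = const / (1.0 - slope)
        if lo <= kappa <= hi:                  # consistency with the region
            fps.append(kappa * m + b)
    return fps

# Hypothetical 2-unit example with three fixed points (two stable, one saddle-like):
m = np.array([1.0, -1.0])
n = np.array([0.8, -0.8])
b = np.array([1.0, 1.0])
fps = fixed_points_rank1_relu(m, n, b)  # kappa = -4, 0, 4
```

For rank R the same region-wise enumeration applies, with an R-dimensional linear solve per region instead of a scalar one.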
Want to stay up-to-date with exciting #ml4science #ai4science research happening in our lab? Come have a look at Machine Learning in Science.bsky.social :).
🎉Finally published in PLOS Comp Biol! Why do neurons use low-frequency oscillations for encoding? Why not use higher frequencies for better sampling resolution? We identify a speed-precision trade-off driven by noise, showing that theta (3–8 Hz) maximizes bits/s! Check it out 👇
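The trade-off can be caricatured in a few lines. This is a toy model, not the paper's: the 1/f² SNR decay and the snr0 scale are assumptions made purely for illustration. More cycles per second means more samples, but if noise erodes the bits carried per cycle as frequency rises, the information rate in bits/s peaks at an intermediate frequency:

```python
import numpy as np

# Toy speed-precision trade-off. Faster oscillations give more "samples"
# (cycles) per second, but noise is assumed to leave less reliable
# information per cycle at high frequencies, so bits/s = f * bits/cycle
# is maximized at an intermediate frequency.
snr0 = 100.0                                    # assumed SNR scale (toy value)
f_grid = np.linspace(1.0, 20.0, 2000)           # candidate frequencies (Hz)
bits_per_cycle = np.log2(1.0 + snr0 / f_grid**2)  # assumed noise-limited precision
rate = f_grid * bits_per_cycle                  # information rate in bits/s
f_opt = f_grid[np.argmax(rate)]                 # frequency maximizing bits/s
```

With these toy numbers the optimum falls near 5 Hz, inside the 3–8 Hz theta band, but its location depends entirely on the assumed noise scaling; the paper derives the trade-off from an actual encoding model.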