sparsenn (@sparsenn)'s Twitter Profile

Sparsity in Neural Networks Workshop

May 5th 2023
sparseneural.net

Link: https://www.sparseneural.net/
Joined: 19-05-2021 01:28:57

93 Tweets

429 Followers

6 Following

Sophie M. Fosson (@sophiefosson)'s Twitter Profile Photo

Today, I'm virtually at #ICLR2023 at the SNN workshop @sparsenn to "play the lottery with concave regularizers" 🌈 #sparsity #pruning #LotteryTicketHypothesis

Anand Subramoney (@anandsubramoney)'s Twitter Profile Photo

I'm presenting a poster at the Sparse Neural Network workshop @sparsenn at #ICLR2023 on "Efficient Real Time Recurrent Learning through combined activity and parameter sparsity". Come by if you're around! Link to paper: arxiv.org/abs/2303.05641

sparsenn (@sparsenn)'s Twitter Profile Photo

Our second spotlight, presented by Shiwei Liu: "Ten Lessons We Have Learned in the New 'Sparseland': A Short Handbook for Sparse Neural Network Researchers"

sparsenn (@sparsenn)'s Twitter Profile Photo

The second session has just started with the talk by Aakanksha Chowdhery, “Efficiently Scaling Transformer Inference”. #ICLR2023

Vithu Thangarasa (@vithursant19)'s Twitter Profile Photo

I'll be presenting our paper at the #ICLR2023 Sparsity workshop @sparsenn with my co-authors Abhay Gupta and Shreyas Saxena! Great to see so many experts in the #sparsity field come together to share insights and knowledge. Come by if you're around. arxiv.org/abs/2303.10464

sparsenn (@sparsenn)'s Twitter Profile Photo

We will now have the second round of the spotlight presentations! Starting with "Massive Language Models Can be Accurately Pruned in One-Shot", presented by Elias Frantar. #ICLR2023

sparsenn (@sparsenn)'s Twitter Profile Photo

The next spotlight presentation is “Efficient Backpropagation for Sparse Training with Speedup” presented by Mahdi Nikdan #ICLR2023

sparsenn (@sparsenn)'s Twitter Profile Photo

Our final spotlight “Automatic Noise Filtering with Dynamic Sparse Training in Deep Reinforcement Learning” is presented by Bram Grooten #ICLR2023

sparsenn (@sparsenn)'s Twitter Profile Photo

Excited to start the next part of the workshop! Now we have the breakout sessions, where attendees discuss and brainstorm various topics in sparsity with each other! #ICLR2023

Vithu Thangarasa (@vithursant19)'s Twitter Profile Photo

Martha White brought up a great point during the panel discussion at the @sparsenn workshop: “rather than taking existing architectures, which we’ve been working with for many years and are kind of designed for dense architectures, and then saying...that’s our gold standard...(1/2)

Thiago Serra (@thserra.bsky.social) (@thserra)'s Twitter Profile Photo

Come see us at the virtual poster session of the @sparsenn workshop at #iclr2023, starting 11 AM ET. Our work is on the effect of sparsification on the number of linear regions of neural networks, and how understanding this effect may help pruning with better accuracy (Poster 30)

Gintare Karolina Dziugaite (@gkdziugaite)'s Twitter Profile Photo

Check out a new pruning library, JaxPruner! Excited to see how it will impact those already working in network pruning and quantization, and attract new people interested in trying out / applying these methods in new domains 🚀
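[Editor's note: to give a flavor of what "trying out these methods" can look like, here is a minimal sketch of one-shot magnitude pruning written directly in JAX. This is an illustration only, not JaxPruner's actual API; the function name magnitude_prune and its parameters are invented for this sketch.]

    import jax
    import jax.numpy as jnp

    def magnitude_prune(params, sparsity=0.9):
        """Illustrative sketch (NOT JaxPruner's API): zero out the
        smallest-magnitude entries of each weight matrix, keeping the
        top (1 - sparsity) fraction by absolute value."""
        def prune_leaf(w):
            if w.ndim < 2:  # leave biases and scalars dense
                return w
            k = max(1, int(w.size * (1.0 - sparsity)))  # weights to keep
            thresh = jnp.sort(jnp.abs(w).ravel())[-k]   # k-th largest magnitude
            return jnp.where(jnp.abs(w) >= thresh, w, 0.0)
        return jax.tree_util.tree_map(prune_leaf, params)

    # Toy usage: prune a small parameter tree to 50% sparsity.
    params = {"w": jnp.array([[0.1, -2.0, 0.3], [1.5, -0.05, 0.7]]),
              "b": jnp.zeros(3)}
    pruned = magnitude_prune(params, sparsity=0.5)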

Vithu Thangarasa (@vithursant19)'s Twitter Profile Photo

Jeff Dean discussed the importance of sparse computation, adaptive computation, and dynamically changing neural networks at @sparsenn. He thinks "dense models are going to give way to these efficient sparse models"...I agree 💯
