Abdulkadir Canatar (@canatar_a) 's Twitter Profile
Abdulkadir Canatar

@canatar_a

Research Fellow @FlatironCCN. Theoretical Neuroscience, Machine Learning, Physics. Prev: @Harvard and @sabanciu

ID: 1504453316

Joined: 11-06-2013 15:55:00

157 Tweets

369 Followers

667 Following

Simons Foundation (@simonsfdn) 's Twitter Profile Photo

Congratulations to SueYeon Chung on being awarded the Klingenstein-Simons Fellowship Award in Neuroscience! SueYeon Chung is an associate research scientist & project leader at Flatiron CCN and an assistant professor at the NYU Center for Neural Science. Read more here: simonsfoundation.org/2023/07/05/neu…

Benjamin S Ruben (@benjaminsruben) 's Twitter Profile Photo

How can you mitigate double descent without relying on task-tuned regularization? In my new preprint with Cengiz Pehlevan, we show that ensembling over models of different sizes does the trick. (1/n) arxiv.org/abs/2307.03176
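A minimal sketch of the idea, assuming a toy random-feature ridge setup (this is not code from the preprint; the model family, widths, and data below are illustrative choices): fit several models of different widths with a small fixed regularizer and average their test predictions.

```python
# Illustrative sketch only: size-ensembling with random-feature ridge regressors.
import numpy as np

rng = np.random.default_rng(0)

def random_feature_ridge(X_tr, y_tr, X_te, width, lam=1e-6):
    """Fit ridge regression on fixed random ReLU features of a given width."""
    d = X_tr.shape[1]
    W = rng.normal(size=(d, width)) / np.sqrt(d)   # frozen random projection
    phi_tr = np.maximum(X_tr @ W, 0.0)             # ReLU features
    phi_te = np.maximum(X_te @ W, 0.0)
    A = phi_tr.T @ phi_tr + lam * np.eye(width)    # small fixed (not task-tuned) ridge
    beta = np.linalg.solve(A, phi_tr.T @ y_tr)
    return phi_te @ beta

# Toy regression data
n, d = 200, 30
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.5 * rng.normal(size=n)
X_tr, y_tr, X_te, y_te = X[:100], y[:100], X[100:], y[100:]

# Ensemble over models of *different* sizes by averaging their predictions
widths = [50, 100, 400, 1600]
preds = np.mean([random_feature_ridge(X_tr, y_tr, X_te, m) for m in widths], axis=0)
print("ensemble test MSE:", np.mean((preds - y_te) ** 2))
```

The averaging step is the only ingredient specific to the size ensemble; any predictor family whose capacity is indexed by a width-like parameter could stand in for the random-feature models here.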

SueYeon Chung (@s_y_chung) 's Twitter Profile Photo

🎉Happy to share an exciting new theory paper from our group, just published in Physical Review Letters as an Editor's Suggestion📖, and also featured in Physics Magazine. Congratulations Albert Wakhloo on this milestone! Title: Linear Classification of Neural Manifolds with Correlated Variability

Tansu Daylan (@tansudaylan) 's Twitter Profile Photo

Today I am thrilled to embark on a journey as an assistant professor of physics. I just sat down in my office, took a deep breath, and sketched a long-term work plan. It feels great to have bold dreams full of opportunities and challenges. Let the good times roll.

Jenelle Feather (@jenellefeather) 's Twitter Profile Photo

How do the spectral properties of a model influence neural prediction benchmarks? Check out our *new* paper, "A Spectral Theory of Neural Prediction and Alignment," accepted to #NeurIPS2023 as a spotlight! arxiv.org/abs/2309.12821 w/ Abdulkadir Canatar, SueYeon Chung, Albert Wakhloo 🧵1/11

SueYeon Chung (@s_y_chung) 's Twitter Profile Photo

Wouldn't it be great if we could reliably figure out why some ANNs predict neural data better than others? When models show similar neural predictivity, how can we differentiate between them? We show a practical & theoretical solution to this, using geometry and generalization error theory.

Dmitri "Mitya" (@chklovskii) 's Twitter Profile Photo

A biologically plausible neural network for online whitening with slow (synaptic plasticity) and fast (gain control) learning, accepted as a spotlight at #NeurIPS2023! A collaboration with the Simoncelli group led by the brilliant Lyndon Duong and David Lipshutz. arxiv.org/abs/2308.13633

Blake Bordelon ☕️🧪👨‍💻 (@blake__bordelon) 's Twitter Profile Photo

Hot off the presses: ResNet hyperparameter transfer across depth and width! TL;DR: transfer for LR+schedules, momentum, L2 reg., etc. for wide ResNets and ViTs, with and without Batch/LayerNorm. w/ Lorenzo Noci, Mufan (Bill) Li, Boris Hanin, Cengiz Pehlevan. arxiv.org/abs/2309.16620

Ş. Furkan Öztürk (@sfurkanozturk61) 's Twitter Profile Photo

Check out my most recent paper on the origins of homochirality—just published in Nature Communications today! #homochirality #chirality This paper marks the conclusion of a four-part series on this subject. nature.com/articles/s4146…

Alex Atanasov (@abatanasov) 's Twitter Profile Photo

Very happy to share this NeurIPS 2023 work with Nikhil Vyas, Blake Bordelon ☕️🧪👨‍💻, Sab Sainathan, @DepenKenpachi, and Cengiz Pehlevan on the consistent behavior of feature-learning networks across large widths: arxiv.org/abs/2305.18411. What is large-width consistency? Read on! 1/n

Jenelle Feather (@jenellefeather) 's Twitter Profile Photo

At #NeurIPS2023? Interested in brains, neural networks, and geometry? Come by our **Spotlight Poster** Tuesday @ 5:15PM (#1914) on A Spectral Theory of Neural Prediction and Alignment. w/ Abdulkadir Canatar, SueYeon Chung, Albert Wakhloo

Alex Williams (@itsneuronal) 's Twitter Profile Photo

Short thread about our work at #NeurIPS2023. Topics include:

 - representational similarity
 - high-D covariance estimates
 - noise correlations / stochastic representation
 - optimal transport
 - scientific applications of deep nets to audio data + social neuro

Details below👇
SueYeon Chung (@s_y_chung) 's Twitter Profile Photo

Just arrived in New Orleans for #NeurIPS2023 this week. Topics I am excited about: neuro-AI, neural manifolds (representation geometry), stat physics for machine learning, interpretability, relational & causal representations. My group is presenting their awesome work👇 (1/n)

SueYeon Chung (@s_y_chung) 's Twitter Profile Photo

🔥Lots of new theories on day 3 of Cosyne:

 - [3-100] capacity for nonlinear classification of manifolds
 - [3-105] theory of multitask learning (optimal repr geometry + geometric measures for data analysis)
 - [3-167] tuning diversity shapes efficient representation geometry

🧵👇🏻

Alex Atanasov (@abatanasov) 's Twitter Profile Photo

[1/n] Thrilled that this project with Jacob Zavatone-Veth and @cpehlevan is finally out! Our group has spent a lot of time studying high dimensional regression and its connections to scaling laws. All our results follow easily from a single central theorem 🧵 arxiv.org/abs/2405.00592

Simons Foundation (@simonsfdn) 's Twitter Profile Photo

It is with great sadness that the Simons Foundation announces the death of its co-founder and chair emeritus, James Harris Simons. Jim was an award-winning mathematician, a legendary investor and a generous philanthropist. simonsfoundation.org/2024/05/10/sim…

Simons Institute for the Theory of Computing (@simonsinstitute) 's Twitter Profile Photo

We mourn the loss of our friend and founding benefactor, Jim Simons. Jim was visionary, brilliant, and generous beyond measure. He has left an indelible mark on our field.

SueYeon Chung (@s_y_chung) 's Twitter Profile Photo

NYU-CDS article about MMCR (Maximum Manifold Capacity Representations): The key principle of MMCR is to maximize the number of image manifolds (generated by nuisance variations) that can be linearly decoded in the representations, hence maximizing the efficiency of the…
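A hedged sketch of what such an objective might look like in code, assuming the nuclear-norm-of-centroids formulation commonly used to summarize MMCR (this is not code from the paper or the article; the tensor shapes and function name are illustrative):

```python
# Hedged sketch of an MMCR-style objective (nuclear norm of manifold centroids).
import torch

def mmcr_style_loss(z: torch.Tensor) -> torch.Tensor:
    # z: (batch, views, dim), where 'views' are nuisance-augmented copies of each image
    z = torch.nn.functional.normalize(z, dim=-1)   # embeddings on the unit sphere
    centroids = z.mean(dim=1)                      # one centroid per image manifold
    # Maximizing the nuclear norm of the centroid matrix favors centroids that are
    # spread across many directions while each manifold's views stay aligned.
    return -torch.linalg.matrix_norm(centroids, ord="nuc")

# Toy usage: 8 images x 4 augmented views x 128-d embeddings
z = torch.randn(8, 4, 128, requires_grad=True)
loss = mmcr_style_loss(z)
loss.backward()
print(loss.item())
```

In this formulation, averaging the normalized views already penalizes within-manifold spread (misaligned views shrink the centroid), so a single nuclear-norm term captures both compression and linear decodability in the spirit described above.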