Neeraj Kumar Pandit (@neeraj_compchem) 's Twitter Profile
Neeraj Kumar Pandit

@neeraj_compchem

PhD Student, University of Göttingen. Interested in computational chemistry and machine learning

ID: 1270629660150452225

Joined: 10-06-2020 08:12:12

63 Tweets

168 Followers

266 Following

🔥 Matt Dancho (Business Science) 🔥 (@mdancho84) 's Twitter Profile Photo

For years, I was hyperparameter tuning XGBoost models wrong. In 3 minutes, I'll share one secret that took me 3 years to figure out.  When I did, it cut my training time 10X. Let's dive in. 

1. XGBoost: XGBoost (eXtreme Gradient Boosting) is a popular machine learning algorithm,
🔥 Matt Dancho (Business Science) 🔥 (@mdancho84) 's Twitter Profile Photo

R-Squared (Lesson 6 of 24): R-squared is one of the most commonly used metrics to measure performance. But it took me 2 years to figure out mistakes that were killing my regression models. In 2 minutes, I'll share how I fixed 2 years of mistakes (and made 50% more accurate models
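For reference, R-squared itself is a one-liner; a minimal sketch with toy numbers (my own example, not from the thread):

```python
import numpy as np

def r_squared(y_true, y_pred):
    """R^2 = 1 - SS_res / SS_tot: fraction of variance explained by the model."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

y_true = [3.0, 5.0, 7.0, 9.0]
y_pred = [2.8, 5.3, 7.1, 8.9]
print(round(r_squared(y_true, y_pred), 4))  # -> 0.9925
```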
🔥 Matt Dancho (Business Science) 🔥 (@mdancho84) 's Twitter Profile Photo

The 10 types of clustering that all data scientists need to know. Let's dive in:

1. K-Means Clustering: This is a centroid-based algorithm, where the goal is to minimize the sum of distances between points and their respective cluster centroid.

2. Hierarchical Clustering: This
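The centroid-based idea in item 1 fits in a few lines. A minimal sketch of Lloyd's algorithm on toy two-blob data (my own example, not from the thread):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's algorithm: alternate assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# two well-separated blobs
X = np.vstack([np.random.default_rng(1).normal(0.0, 0.5, (50, 2)),
               np.random.default_rng(2).normal(5.0, 0.5, (50, 2))])
labels, centroids = kmeans(X, k=2)
```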
David Andrés 🤖📈🐍 (@daansan_ml) 's Twitter Profile Photo

SHAP is a powerful technique in machine learning for interpreting the output of complex models.

Commonly used for ✨Feature Engineering✨

Let's explore SHAP further 🧵 👇
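In practice one would use the `shap` library, but the idea underneath is the Shapley value from game theory: average a feature's marginal contribution over all coalitions of the other features. A hedged pure-Python sketch for a tiny model, where "missing" features are simplified to a fixed background value (my own toy example, not from the thread):

```python
from itertools import combinations
from math import factorial
import numpy as np

def shapley_values(f, x, background):
    """Exact Shapley values for one sample x by enumerating feature coalitions.

    Features outside the coalition are replaced by background values --
    a simplification of how SHAP marginalizes missing features out.
    """
    n = len(x)
    base = np.asarray(background, dtype=float)

    def value(subset):
        z = base.copy()
        for i in subset:
            z[i] = x[i]
        return f(z)

    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for r in range(n):
            for S in combinations(others, r):
                # Shapley weight: |S|! (n - |S| - 1)! / n!
                w = factorial(r) * factorial(n - r - 1) / factorial(n)
                phi[i] += w * (value(S + (i,)) - value(S))
    return phi

# toy linear model: for linear f, phi_i = w_i * (x_i - background_i)
w = np.array([2.0, -1.0, 0.5])
f = lambda z: float(w @ z)
x = np.array([1.0, 2.0, 3.0])
bg = np.zeros(3)
phi = shapley_values(f, x, bg)
```

The attributions sum to `f(x) - f(background)` (the "additivity" property), which is what makes them usable for feature engineering and selection.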
SPP 2363 (@spp2363) 's Twitter Profile Photo

That’s a wrap! 🎁 A big Thank you to all speakers and participants making the #LeopoldinAIchem a truly special event!🥳 The shared knowledge, connections, and ideas have ignited a bright future💡📈

Neeraj Kumar Pandit (@neeraj_compchem) 's Twitter Profile Photo

Three days with three different weathers in Halle, yet each day with inspiring talks at the Leopoldina Symposium on Molecular Machine Learning. Thanks to all the speakers and SPP 2363 for organizing. #LeopoldinAIchem

Bastian Grossenbacher-Rieck (@pseudomanifold) 's Twitter Profile Photo

🎉 Stoked that Ernst Röell's first Ph.D. work "Differentiable Euler Characteristic Transforms for Shape Classification" was accepted at ICLR 2024 #ICLR2024 👉 A new, super fast topological layer (based on the Euler Characteristic Transform) #geometry #topology #MachineLearning 🧵1/n

Corin Wagen (@corinwagen) 's Twitter Profile Photo

great demonstration of why handling conformers correctly matters in predicting reaction selectivity, from Rubén Laplaza and Clémence Corminboeuf - in computational chemistry it's easy to be right, or wrong, for bad reasons! (dx.doi.org/10.26434/chemr…)

Machine Learning in Chemistry (@ml_chem) 's Twitter Profile Photo

Performance Assessment of Universal Machine Learning Interatomic Potentials: Challenges and Directions for Materials’ Surfaces #machinelearning #compchem pubs.acs.org/doi/abs/10.102…

Günter Klambauer (@gklambauer) 's Twitter Profile Photo

SCIKIT-FINGERPRINTS

Unlike in computer vision, in computational chemistry extracted low-level features are still competitive (if not SOTA) for many tasks.

P: arxiv.org/abs/2407.13291
C: github.com/scikit-fingerp…
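Fingerprints are compared with the Tanimoto (Jaccard) coefficient, the classic similarity measure for these binary features. A minimal sketch on toy bit vectors (not scikit-fingerprints' actual API):

```python
import numpy as np

def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two binary fingerprints."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 1.0

fp1 = np.array([1, 0, 1, 1, 0, 0, 1, 0])
fp2 = np.array([1, 0, 1, 0, 0, 1, 1, 0])
print(round(tanimoto(fp1, fp2), 3))  # -> 0.6
```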
Machine Learning Street Talk (@mlstreettalk) 's Twitter Profile Photo

This is Sayash Kapoor calmly dismantling AI scaling laws hype, during our discussion of his article he published with Arvind Narayanan earlier today. This interview slapped. #ICML2024 is a wrap!

Albert Solé-Daura (@asoledaura) 's Twitter Profile Photo

Our work applying Marcus theory to estimate energy-transfer kinetics is now available online in Chemical Science! Check it out here: doi.org/10.1039/D4SC03… Maseras Group ICIQ

Bingqing Cheng (@chengbingqing) 's Twitter Profile Photo

Bothered by the lack of long-range interactions in ML potentials? Meet Latent Ewald Summation—our solution to fix "shortfalls" in short-ranged ML potentials for electrostatic and dielectric systems, with only a modest computational cost! arxiv.org/abs/2408.15165

Paolo Pellizzoni (@pa0l0_p) 's Twitter Profile Photo

New paper accepted at NeurIPS 2024!
On the Expressivity and Sample Complexity of Node-Individualized Graph Neural Networks

📜Paper: openreview.net/pdf?id=8APPypS…

1. Message-passing GNNs are limited by the expressive power of the 1-Weisfeiler-Leman (color refinement) test for graph
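The 1-Weisfeiler-Leman (color refinement) test that bounds message-passing GNNs is short to sketch: iteratively recolor each node by its own color plus the multiset of its neighbors' colors. A toy example (my own, not from the paper) showing the classic failure mode on regular graphs:

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL color refinement on an adjacency list; returns the final color histogram."""
    n = len(adj)
    colors = [0] * n  # start with uniform colors
    for _ in range(rounds):
        # new color = (own color, sorted multiset of neighbor colors)
        sigs = [(colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in range(n)]
        relabel = {s: i for i, s in enumerate(sorted(set(sigs)))}
        colors = [relabel[s] for s in sigs]
    return Counter(colors)

# C6 (one 6-cycle) vs 2 x C3 (two triangles): both are 2-regular, so 1-WL
# produces identical color histograms and cannot tell them apart.
c6 = [[1, 5], [0, 2], [1, 3], [2, 4], [3, 5], [4, 0]]
two_c3 = [[1, 2], [0, 2], [0, 1], [4, 5], [3, 5], [3, 4]]
print(wl_colors(c6) == wl_colors(two_c3))  # non-isomorphic graphs collide
```

Any graph pair that 1-WL cannot distinguish, a standard message-passing GNN cannot distinguish either, which is what node individualization schemes aim to fix.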
Kevin M Jablonka (@kmjablonka) 's Twitter Profile Photo

Chemists often combine many different techniques to elucidate structures. Adrian Mirza has been building a system that mimics this using machine-learning models and genetic algorithms.