Hariharan Ramasubramanian (@rf_hari)'s Twitter Profile
Hariharan Ramasubramanian

@rf_hari

PhD student at @CarnegieMellon | Prev. Research Intern at @Applied4Tech, @argonne, @genentech

ID: 1432864325677498373

Joined: 01-09-2021 00:34:23

14 Tweets

70 Followers

392 Following

Tian Xie (@xie_tian)'s Twitter Profile Photo

Announcing ICLR 2023 workshop on ML4Materials: from molecules to materials. We hope to bring together the ML and materials science communities to tackle unique challenges in modeling materials, building on the success of modeling molecules and proteins. ml4materials.com

Sterling Crispin 🕊️ (@sterlingcrispin)'s Twitter Profile Photo

I’m really excited about Clifford Algebra and Geometric Algebra research for AI. I’m going to do my best to explain it in an approachable way for anyone, even if you don’t know how neural networks work and you don’t know a lot of math. The paper I’m breaking down has an

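To make the idea concrete, here is a minimal sketch of the geometric (Clifford) product in 2D, which the tweet's topic builds on; this is background illustration, not code from the paper being discussed:

```python
def geometric_product_2d(a, b):
    """Geometric product of two 2D vectors a and b.

    Returns (scalar, bivector): ab = a.b + a^b, where the scalar part
    is the dot product and the bivector coefficient (the e1 e2
    component) is the signed area spanned by a and b.
    """
    scalar = a[0] * b[0] + a[1] * b[1]    # inner product a.b
    bivector = a[0] * b[1] - a[1] * b[0]  # wedge product a^b
    return scalar, bivector

# Orthogonal unit vectors: the dot part vanishes, leaving a pure bivector
s, bv = geometric_product_2d((1.0, 0.0), (0.0, 1.0))
# s == 0.0, bv == 1.0  (the unit bivector e1 e2)
```

The point neural-network researchers exploit is that this single product carries both the rotation-invariant part (the scalar) and the orientation part (the bivector) of a pair of vectors at once.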
Anatole von Lilienfeld (@profvlilienfeld)'s Twitter Profile Photo

Q: How to alter your training set selection if you only care about 1 prediction? A: Bias by similarity and train on the fly (basically kNN on steroids 🤩). Reaching chemical accuracy for a given query in QM9 this way implies 2 orders of magnitude less data than random training

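The "kNN on steroids" idea above can be sketched in a few lines: pick the training points most similar to the single query you care about, then fit a model only on those. The sketch below uses Euclidean distance in descriptor space and distance-weighted kNN regression as the on-the-fly model; the paper's actual similarity measure and learner may differ:

```python
import numpy as np

def predict_for_query(X_train, y_train, x_query, k=50):
    """Illustrative 'train on the fly' prediction for one query:
    select the k training points closest to the query in descriptor
    space, then predict with a distance-weighted average over just
    those points (a stand-in for any local model)."""
    d = np.linalg.norm(X_train - x_query, axis=1)  # similarity = -distance
    idx = np.argsort(d)[:k]                        # k most similar points
    w = 1.0 / (d[idx] + 1e-12)                     # closer points weigh more
    return float(np.sum(w * y_train[idx]) / np.sum(w))

# Toy usage with a hypothetical 3-feature descriptor
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X.sum(axis=1)                                  # toy target property
pred = predict_for_query(X, y, np.zeros(3), k=10)
```

Because only the k neighbors of the query ever enter the fit, the effective training-set size for that one prediction is far smaller than a randomly drawn global training set, which is the data saving the tweet describes.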
Alexandre Duval (@aduvalinho)'s Twitter Profile Photo

Super excited to finally release our "Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems"! 🤗 Link: arxiv.org/pdf/2312.07511… Written with Chaitanya K. Joshi, Simon Mathis, Victor Schmidt 💀🐔, Santiago Miret, Fragkiskos Malliaros, Taco Cohen, Pietro Lio, Yoshua Bengio, Michael Bronstein 😍 See thread below 👇 (1/8)

Nature Computational Science (@natcomputsci)'s Twitter Profile Photo

📢 Mingda Li and colleagues propose a virtual node graph neural network to enable the prediction of materials properties with variable output dimension. MIT School of Engineering, MIT Science, MIT Chemistry, MIT EECS, MIT Nuclear Science and Engineering, Oak Ridge Lab. nature.com/articles/s4358… ➡️ rdcu.be/dNzJh
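The core trick the announcement describes, decoupling output dimension from the input graph, can be illustrated with a small sketch: append extra "virtual" nodes connected to every real atom, and read the prediction off the virtual nodes. This is an illustrative sketch of the general virtual-node idea, not the authors' architecture:

```python
import numpy as np

def add_virtual_nodes(adj, n_virtual):
    """Append n_virtual virtual nodes to an adjacency matrix, each
    connected to every real node. A readout over the virtual nodes
    then yields an output of dimension n_virtual, chosen
    independently of the input structure's size."""
    n = adj.shape[0]
    out = np.zeros((n + n_virtual, n + n_virtual), dtype=adj.dtype)
    out[:n, :n] = adj   # keep the real-atom graph unchanged
    out[:n, n:] = 1     # real -> virtual edges
    out[n:, :n] = 1     # virtual -> real edges
    return out

# 3-atom chain augmented with 2 virtual nodes
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
A_aug = add_virtual_nodes(A, 2)  # shape (5, 5)
```

Message passing on the augmented graph lets every virtual node aggregate information from the whole structure, so a per-virtual-node readout produces a variable-length property vector (e.g., a spectrum sampled at `n_virtual` points).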

Davide Donadio (@nanophononics)'s Twitter Profile Photo

We have written a tutorial manuscript about using GPUMD, TDEP, and kALDo together to get accurate temperature-dependent thermal conductivity and elastic properties. Great work by Dylan Folkner (first paper!), Zekun Chen, Florian Knoop, Giuseppe Barbalinardo, Nanotheory @UCD

Nick McGreivy (@nmcgreivy)'s Twitter Profile Photo

Our new paper in Nature Machine Intelligence tells a story about how, and why, ML methods for solving PDEs do not work as well as advertised. We find that two reproducibility issues are widespread. As a result, we conclude that ML-for-PDE solving has reached overly optimistic conclusions.

Michael Galkin (@michael_galkin)'s Twitter Profile Photo

ICLR 2025 submissions are now available on OpenReview, here are some fresh GNNs and Geometric learning subs that caught my attention (and haven't appeared during ICML/NeurIPS cycles). Based on the abstracts, but PDFs should be there shortly 🧵 Thread! 1/n