Ramanarayan Mohanty (@ramanarayanm)'s Twitter Profile
Ramanarayan Mohanty

@ramanarayanm

Research Scientist at Intel Labs, India. Machine learning, Deep learning, Graph AI, Computer Vision, Hyperspectral Imaging.

ID: 2840497129

Joined: 04-10-2014 17:49:48

1.1K Tweets

199 Followers

1.1K Following

Jean de Nyandwi (@jeande_d)'s Twitter Profile Photo

A Survey on Efficient Training of Transformers

Transformers are the one architecture driving AI research in most modalities. They require large compute resources, but more and more efficient training approaches are being proposed.

A survey on the topic: arxiv.org/abs/2302.01107
elvis (@omarsar0)'s Twitter Profile Photo

Stanford MLSys Seminar Series

This seminar series has an incredible amount of knowledge and tips on a wide range of topics in ML.

Just finished watching the OPT episode which is a really good watch if you are training LLMs or just want youtube.com/playlist?list=…
Sebastian Raschka (@rasbt)'s Twitter Profile Photo

You love using PyTorch for Deep Learning but want it a bit more organized, so it's easier to take advantage of more advanced features?

Great news: Unit 5 is finally live! In Unit 5, I'll show you how to train PyTorch models with the Lightning Trainer!

🔗 lightning.ai/pages/courses/…
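
Not from the course itself, but a rough sketch of the workflow Unit 5 covers (assuming Lightning 2.x, with a made-up toy model and random data standing in for a real dataset): subclass LightningModule, define training_step and configure_optimizers, and hand everything to the Trainer.

```python
# Minimal sketch (not from the course): training a toy model with the Lightning Trainer.
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
import lightning as L  # Lightning 2.x


class TinyClassifier(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(16, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.layer(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


if __name__ == "__main__":
    # Random tensors stand in for a real dataset.
    ds = TensorDataset(torch.randn(256, 16), torch.randint(0, 2, (256,)))
    trainer = L.Trainer(max_epochs=2, accelerator="auto")
    trainer.fit(TinyClassifier(), DataLoader(ds, batch_size=32))
```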
Konstantin Rusch@ICLR2025 🇸🇬 (@tk_rusch)'s Twitter Profile Photo

Oversmoothing is one of the most prominent issues in GNNs 🔥
However, there are some ambiguities in this context. That is why, together with Michael Bronstein and S. Mishra, we decided to write a survey providing a formal definition of oversmoothing.

📝 arxiv.org/abs/2303.10993

1/6
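
Paraphrasing the flavor of definition the survey formalizes (a simplified sketch; the paper's exact statement includes degree normalization and precise assumptions): oversmoothing is exponential, layer-wise decay of a node-similarity measure, with the Dirichlet energy of the node features as the canonical choice.

```latex
% Simplified sketch of the definition (paraphrased, not quoted from the paper).
% A GNN oversmooths w.r.t. a node-similarity measure \mu if, for constants C_1, C_2 > 0,
\[
  \mu\bigl(\mathbf{X}^{(n)}\bigr) \;\le\; C_1 e^{-C_2 n} \qquad \text{for all layers } n,
\]
% with the Dirichlet energy as a canonical choice of \mu (degree normalization omitted here):
\[
  \mathcal{E}\bigl(\mathbf{X}^{(n)}\bigr)
  = \frac{1}{|\mathcal{V}|} \sum_{i \in \mathcal{V}} \sum_{j \in \mathcal{N}_i}
    \bigl\lVert \mathbf{x}^{(n)}_i - \mathbf{x}^{(n)}_j \bigr\rVert^2 .
\]
```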
Jean de Nyandwi (@jeande_d)'s Twitter Profile Photo

The Little Book of Deep Learning

A very concise/brief book on deep learning. Covers almost any topic you'd want to know today: foundations, efficient computation, model architectures, training models, synthesis (generative AI), etc.

And it's free: fleuret.org/public/lbdl.pdf
Sanyam Bhutani (@bhutanisanyam1)'s Twitter Profile Photo

“Transformers from scratch” by Brandon Rohrer 🤖 

This is one of the best write-ups: it starts from zero and explains every single detail of the model architecture.

Whether or not you need a refresher, I would still highly recommend reading it:

e2eml.school/transformers.h…
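
As a companion (my own illustration, not excerpted from the article): a minimal NumPy sketch of the scaled dot-product attention the write-up builds up to; the shapes and toy inputs are made up.

```python
# Scaled dot-product attention: attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np


def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V                                # weighted sum of values


# Toy usage: 4 tokens, 8-dimensional queries/keys/values.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)    # (4, 8)
```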
Yi Tay (@yitayml)'s Twitter Profile Photo

So many misconceptions about architectures (esp encoder-decoder vs decoder-only), partially due to the nomenclature being confusing.

- EncDec, PrefixLMs, and causal dec-onlys are *all* autoregressive. Even T5/UL2's objective is autoregressive.
- All 3 archs are not that different. People
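
One way to see the "not that different" point (my illustration, not from the thread): a causal decoder-only model and a PrefixLM differ mainly in the attention mask over the prefix/input span, while both keep an autoregressive objective on the target tokens.

```python
# Causal vs PrefixLM attention masks (True = token i may attend to token j).
import numpy as np


def causal_mask(n):
    # Decoder-only: token i attends to tokens 0..i.
    return np.tril(np.ones((n, n), dtype=bool))


def prefix_lm_mask(n, prefix_len):
    # PrefixLM: the prefix attends to itself bidirectionally; targets stay causal.
    mask = causal_mask(n)
    mask[:prefix_len, :prefix_len] = True
    return mask


print(causal_mask(4).astype(int))
print(prefix_lm_mask(4, prefix_len=2).astype(int))
```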

FAIR Chemistry (@opencatalyst)'s Twitter Profile Photo

📢 We're thrilled to announce the 3rd Open Catalyst Challenge at the NeurIPS Conference 2023! This year's focus: computing adsorption energy, which builds on our past challenges and moves closer to practical applications. Visit our website to learn more! opencatalystproject.org/challenge.html 🧵1/8

Ramanarayan Mohanty (@ramanarayanm)'s Twitter Profile Photo

Worst service by Tata Play d2h in Sobha Dream Acres, Bangalore. Relocation to a different flat within the same society has already taken more than 3 weeks; there is still no connection, and nobody cares despite mails and calls. #worstservice #tataplayd2h haritnagpal Tata Play Tata Play Binge

Ramanarayan Mohanty (@ramanarayanm)'s Twitter Profile Photo

Worst service by Godrej Interio, Whitefield, Bangalore. Booked a dining table and made the complete payment; they committed to deliver it within 2 weeks, but after 10 days they still haven't allotted the table and are providing false information. Nadir Godrej Godrej Appliances Godrej Interio India @GodrejGroup

elvis (@omarsar0)'s Twitter Profile Photo

A Survey on Retrieval-Augmented Language Models

This paper covers the most important recent developments in RAG and RAU systems. It includes evolution, taxonomy, and an analysis of applications. 

There is also a section on how to enhance different components of these systems.
AI at Meta (@aiatmeta)'s Twitter Profile Photo

📝 New from FAIR: An Introduction to Vision-Language Modeling.

Vision-language models (VLMs) are an area of research that holds a lot of potential to change our interactions with technology; however, there are many challenges in building these types of models. Together with a set
elvis (@omarsar0)'s Twitter Profile Photo

After reading 100s of AI papers this week, it's clear how useful small language models will be and how important it is to efficiently enhance reasoning and understanding in LLMs.

If you are looking for some weekend reads, here are a few notable AI papers I read this week:

-
elvis (@omarsar0)'s Twitter Profile Photo

Beautiful visual guide to quantization, which is becoming a super important technique for compressing LLMs. 

This is such a fun guide with lots of visuals to build intuition about quantization. Highly recommended!
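
Not from the guide, but a toy sketch of the core idea it visualizes: symmetric absmax quantization maps float weights to int8 with a single scale factor, and dequantizing recovers an approximation of the original tensor.

```python
# Toy symmetric absmax int8 quantization of a weight tensor.
import numpy as np


def quantize_int8(w):
    scale = np.abs(w).max() / 127.0                      # largest magnitude maps to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale


def dequantize(q, scale):
    return q.astype(np.float32) * scale


w = np.random.default_rng(0).normal(size=(4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
print(np.abs(w - dequantize(q, scale)).max())            # small reconstruction error
```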
Evan Luthra (@evanluthra)'s Twitter Profile Photo

#RatanTata is the World's Biggest Donor.

He has donated ₹829,734 crore.

Built multiple free hospitals, schools & saved millions of lives.

Today, on his death, the whole world is crying.

Some unheard instances of Mr. Tata that will make you cry: 🧵
Ramanarayan Mohanty (@ramanarayanm)'s Twitter Profile Photo

Why Do You Keep Thinking About Someone Who’s No Longer in Your Life? | Mindfulness | Buddha Mind youtu.be/nUJ0pf1-7-k. A must-watch video. Loved it. #mindfulness #motivational journey #life-lessons