Bao Pham (@baophamhq)'s Twitter Profile
Bao Pham

@baophamhq

ML PhD Student at @rpi. Interested in unnormalized probabilistic and modern Hopfield-related models.

ID: 1723130980154724353

Website: http://bhqpham.com · Joined: 11-11-2023 00:10:28

101 Tweets

130 Followers

131 Following

Bao Pham (@baophamhq)'s Twitter Profile Photo

Energy-based modeling is typically aimed at learning a target data distribution. But, by using the rules of modern Hopfield networks, it can also be used to design novel dynamical neural systems whose dynamics are dictated by a global energy function operating in a latent space. For example, we

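For readers unfamiliar with the mechanics, here is a minimal sketch of the modern (dense) Hopfield update that such energy-based dynamics build on; the `memories` matrix, `beta`, and the toy dimensions are illustrative assumptions, not details from the tweet.

```python
import numpy as np

def hopfield_energy(x, memories, beta=8.0):
    # Global energy of a modern Hopfield network (Ramsauer et al., 2020):
    # E(x) = -(1/beta) * logsumexp(beta * Xi @ x) + 0.5 * ||x||^2
    scores = beta * memories @ x
    lse = (np.log(np.sum(np.exp(scores - scores.max()))) + scores.max()) / beta
    return -lse + 0.5 * x @ x

def hopfield_update(x, memories, beta=8.0):
    # One retrieval step: a softmax-weighted sum of stored patterns.
    # Iterating this rule monotonically decreases hopfield_energy.
    scores = beta * memories @ x
    p = np.exp(scores - scores.max())
    p /= p.sum()
    return memories.T @ p

# Toy example: retrieve a stored pattern from a noisy query.
rng = np.random.default_rng(0)
memories = rng.standard_normal((5, 16))            # 5 stored patterns
query = memories[2] + 0.3 * rng.standard_normal(16)
for _ in range(3):
    query = hopfield_update(query, memories)
```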
Bao Pham (@baophamhq) 's Twitter Profile Photo


During training, diffusion models are taught to be effective denoisers, like Associative Memory systems. At what point do these models stop being denoisers and start behaving like data generators?

To learn about how these models transition from being Associative Memory systems to
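As background for the denoiser framing, here is a minimal sketch of the denoising objective that diffusion training reduces to; `model` is a hypothetical denoiser network and the noise scale is an illustrative assumption.

```python
import numpy as np

def denoising_loss(model, x0, sigma):
    # Corrupt clean data x0 with Gaussian noise at scale sigma, then
    # score the model on how well it recovers x0. Early in training
    # this is exactly an associative-memory-style task: map a
    # corrupted pattern back to a stored one.
    eps = np.random.default_rng(0).standard_normal(x0.shape)
    x_noisy = x0 + sigma * eps
    x_hat = model(x_noisy, sigma)  # hypothetical denoiser network
    return float(np.mean((x_hat - x0) ** 2))

# Usage with a trivial stand-in "model" that returns its input:
x0 = np.ones((4, 8))
print(denoising_loss(lambda x, s: x, x0, sigma=0.5))
```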
Krishna Balasubramanian (@krizna_b)'s Twitter Profile Photo

Adding on to this #ICML2025 tutorial on Associative memories, here is a thread about recent work on simultaneously performing memorization and generalization with Dense Associative Memories

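For context, the Dense Associative Memory energy (Krotov & Hopfield, 2016) that this line of work builds on replaces the quadratic Hopfield energy with a rapidly growing separation function; the polynomial choice of F below is the standard example, not a detail from the thread.

```latex
% Dense Associative Memory energy over K stored patterns \xi_\mu:
E(\sigma) = -\sum_{\mu=1}^{K} F\!\left(\xi_\mu^{\top} \sigma\right),
\qquad F(z) = z^{n}
% Larger n sharpens the energy minima around individual patterns,
% raising storage capacity and trading generalization for memorization.
```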
Luca Ambrogioni (@lucaamb)'s Twitter Profile Photo


1/3) I am biased, but I think this is going to be big!

CoVAE: Consistency Training of Variational Autoencoders

We unify consistency models with VAEs to obtain a powerful and elegant generative autoencoder!

The brainchild of the brilliant Gianluigi Silvestri (who is looking for jobs!)
Dmitry Krotov (@dimakrotov)'s Twitter Profile Photo


Thanks to everyone who came to our Tutorial yesterday. It was fun! I will host an additional Q&A session at the IBM Research booth in the West Exhibition Hall A today between 4:30pm and 6pm.

If you want to chat about Associative Memory, Energy Transformers, diffusion models, AI &
Luca Ambrogioni (@lucaamb)'s Twitter Profile Photo


Consistency Variational Autoencoders (CoVAE) follow naturally from β-VAEs.

A family of β-VAEs (with increasing β) can be organized as a sequence of latent encodings with decreasing SNR.

This implicit definition of a 'forward process' is used to define a consistency-style loss!
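For reference, the β-VAE objective behind this construction is the standard one below; reading the KL weight β as a noise level is the tweet's framing, and the notation is otherwise conventional.

```latex
% Standard beta-VAE objective; increasing beta pulls the posterior
% q_phi(z|x) toward the prior p(z), lowering the latent SNR and
% tracing out an implicit "forward process" over encodings.
\mathcal{L}_{\beta} =
\mathbb{E}_{q_{\phi}(z \mid x)}\!\left[-\log p_{\theta}(x \mid z)\right]
+ \beta \, D_{\mathrm{KL}}\!\left(q_{\phi}(z \mid x) \,\|\, p(z)\right)
```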
Kempner Institute at Harvard University (@kempnerinst)'s Twitter Profile Photo

New in the #DeeperLearningBlog: #KempnerInstitute researchers Binxu Wang 🐱 and John J. Vastola explain their work uncovering the linear Gaussian structure in diffusion models and the potential to use it to enhance performance. bit.ly/4lCauDv #AI #DiffusionModels
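As a one-line reminder of why Gaussian structure makes scores linear (general background, not a summary of the blog post): for a Gaussian noised marginal, the score is affine in x, so the ideal denoiser is a linear map.

```latex
% If the noised marginal is Gaussian, p_t = \mathcal{N}(\mu_t, \Sigma_t),
% its score is linear in x, so the optimal denoiser is affine:
\nabla_x \log p_t(x) = -\Sigma_t^{-1} \, (x - \mu_t)
```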

Bao Pham (@baophamhq)'s Twitter Profile Photo


How to build a factual but creative system? It is a question surrounding memory and creativity in modern ML systems. My colleagues from IBM Research and MIT-IBM Watson AI Lab are hosting the Memory and Vision Workshop at #ICCV2025, which explores the intersection between memory and generative
Zexue He (@zexuehe)'s Twitter Profile Photo


📢 𝐂𝐚𝐥𝐥 𝐟𝐨𝐫 𝐏𝐚𝐩𝐞𝐫𝐬 – 𝐌𝐞𝐦𝐕𝐢𝐬 @ ICCV | 𝐇𝐨𝐧𝐨𝐥𝐮𝐥𝐮, 𝐇𝐚𝐰𝐚𝐢𝐢 🌺
Topics: 
🧠 Memory-augmented models
🎥 Temporal & long-context vision
🤖 Multimodal & scalable systems
and more on 𝐦𝐞𝐦𝐨𝐫𝐲 + 𝐯𝐢𝐬𝐢𝐨𝐧 ...

👉OpenReview Submission:
Memory and Vision Workshop (@memvis_iccv25)'s Twitter Profile Photo

Good news! 📢 The MemVis #ICCV2025 submission deadline is extended to 𝟭𝟬 𝗔𝘂𝗴𝘂𝘀𝘁 -- more time to send us your best work on memory & vision!

Georgia Channing (@cgeorgiaw)'s Twitter Profile Photo


Not sure if the physics community is aware, but Polymathic AI has been quietly putting 10TB of physics simulation data on Hugging Face.

⛓️‍💥huggingface.co/collections/po…
Symmetry and Geometry in Neural Representations (@neur_reps)'s Twitter Profile Photo

📢 Reminder! The submission deadline for NeurReps 2025 is Friday, August 22 ‼️ Come join us in San Diego! 🏄 Submit your work on how mathematical structure shapes computation in the brain and in AI systems: neurreps.org/call-for-papers

Surya Ganguli (@suryaganguli)'s Twitter Profile Photo

Very excited to lead this new Simons Foundation collaboration on the physics of learning and neural computation to develop powerful tools from physics, math, CS, stats, neuro and more to elucidate the scientific principles underlying AI. See our website for more: physicsoflearning.org

Memory and Vision Workshop (@memvis_iccv25)'s Twitter Profile Photo

⏰ Just 2 days left to submit to #MemVis @ #ICCV2025! We welcome works on state-space models, diffusion, retrieval, lifelong learning, multimodal memory, and more. 📝 Formats: ≤4p abstract or full ICCV paper. Don’t miss it ▶️ sites.google.com/view/memvis-ic…

Luca Ambrogioni (@lucaamb)'s Twitter Profile Photo


1/2) I am very happy to finally share something I have been working on and off for the past year:

"The Information Dynamics of Generative Diffusion"

This paper connects entropy production, the divergence of vector fields, and spontaneous symmetry breaking in a unified framework.
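One standard identity behind this kind of connection (general background, not the paper's specific result): under the probability-flow ODE \dot{x} = v_t(x), the differential entropy of the evolving density changes at a rate set by the divergence of the vector field.

```latex
% Continuity equation: \partial_t p_t = -\nabla \cdot (p_t v_t).
% Differentiating H(p_t) = -\int p_t \log p_t \, dx then gives
\frac{d}{dt} H(p_t) = \mathbb{E}_{p_t}\!\left[\nabla \cdot v_t(x)\right]
```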