YingTang (@yingtangphysics)'s Twitter Profile
YingTang

@yingtangphysics

Physicist. AI for physics, stochastic dynamics, statistical physics, generative models. Professor at UESTC, Chengdu.

ID: 879187959139844096

Link: https://jamestang23.github.io/ · Joined: 26-06-2017 04:01:58

48 Tweets

87 Followers

289 Following

YingTang (@yingtangphysics)'s Twitter Profile Photo

Great work by Aleksandra Walczak and Thierry Mora's group. Glad that our previous work on how the dynamics of signaling transmits information helped motivate further theoretical studies on this topic. nature.com/articles/s4146…

Perimeter Institute (@perimeter)'s Twitter Profile Photo

Can large language models (LLMs) like ChatGPT help advance quantum computing? Yes! In a paper released today, PI's Roger Melko and visiting fellow Juan Felipe Carrasquilla Álvarez describe how the same algorithmic structure used in LLMs might do just that! Full article: hubs.ly/Q02hdRgd0

YingTang (@yingtangphysics)'s Twitter Profile Photo

Our exploration of a neural-network approach to uncovering dynamical phase transitions in nonequilibrium statistical mechanics is out, with some updates: nature.com/articles/s4146…

Sebastian Goldt (@sebastiangoldt)'s Twitter Profile Photo

🚨Deadline approaching soon🚨 Apply for our four-year, fully funded PhD programme at SISSA before Feb 23rd; details below 👇 or via DM !

Physical Review Letters (@physrevlett)'s Twitter Profile Photo

The tensor-network message-passing method reduces the error in the calculation of local observables by several orders of magnitude compared with state-of-the-art techniques go.aps.org/3IYvItD

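To give a flavor of the message-passing idea behind this result: on a tree-structured graph, belief propagation computes local observables exactly. A minimal sketch of that classical ancestor (my own toy example, not the PRL paper's tensor-network algorithm): magnetizations of an open ferromagnetic Ising chain.

```python
import numpy as np

# Generic belief propagation on a chain (a tree, so BP is exact here).
L, J, h = 10, 0.5, 0.2
spins = np.array([1.0, -1.0])
pair = np.exp(J * np.outer(spins, spins))   # bond factor e^{J s s'}
field = np.exp(h * spins)                   # site factor e^{h s}

# Messages passed inward from each end of the chain.
left = [np.ones(2)]
for _ in range(L - 1):
    left.append(pair @ (left[-1] * field))
right = [np.ones(2)]
for _ in range(L - 1):
    right.append(pair @ (right[-1] * field))
right = right[::-1]

# Local belief at site i combines both incoming messages and the field.
mags = []
for i in range(L):
    belief = left[i] * right[i] * field
    belief /= belief.sum()
    mags.append(float(belief @ spins))
print(np.round(mags, 4))
```

On loopy graphs BP is only approximate, which is where tensor-network corrections of the kind the paper studies come in.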
Lei Wang (@wangleiphy)'s Twitter Profile Photo

CrystalFormer generates crystalline materials with an autoregressive transformer

paper: arxiv.org/abs/2403.15734
code: github.com/deepmodeling/C…

YingTang (@yingtangphysics)'s Twitter Profile Photo

This work is more about opening a problem: even for the simplest noise-induced transition in a bistable potential, prevailing methods such as SINDy and FORCE prove inadequate. Leveraging reservoir computing offers one way out, motivating extensions of the other methods. nature.com/articles/s4146…
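As a hedged illustration of the reservoir-computing route, here is a generic echo state network trained by ridge regression on a toy noise-driven bistable SDE (a minimal setup of my own choosing, not the paper's exact experiments):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noise-driven bistable system dx = (x - x^3) dt + sigma dW.
dt, sigma, T = 0.01, 0.3, 20000
x = np.empty(T)
x[0] = 1.0
for t in range(T - 1):
    x[t + 1] = x[t] + (x[t] - x[t]**3) * dt \
               + sigma * np.sqrt(dt) * rng.standard_normal()

# Echo state network: fixed random reservoir, train only a linear readout.
N = 200
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state property

states = np.zeros((T, N))
for t in range(T - 1):
    states[t + 1] = np.tanh(W @ states[t] + W_in * x[t])

# Ridge regression: reservoir state at time t -> observation x[t]
# (one-step-ahead prediction, since states[t] has only seen x up to t-1).
washout = 500
S, y = states[washout:], x[washout:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)

rmse = np.sqrt(np.mean((S @ W_out - y) ** 2))
print("one-step RMSE:", rmse)
```

Only the readout is trained; the fixed recurrent reservoir supplies the nonlinear memory that methods fitting an explicit dynamical model can lack near noise-induced transitions.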

Soon Hoe Lim (@shoelim8)'s Twitter Profile Photo

📣 We are organizing a winter school on "Physics of ML & ML for Physics" from Jan 13-24 at Nordita in Stockholm, quite in line with this year's Nobel Prize! Application deadline is Nov 10: indico.fysik.su.se/event/8856/ Help me spread the word! #ML #AI #NobelPrize #winterschool

Bruno Loureiro (@_brloureiro)'s Twitter Profile Photo

🚨 1 week left to register for the Les Houches workshop "Towards a theory for typical-case algorithmic hardness" 🌐 leshouches-algorithms.github.io Also a good occasion to recall the important role that Les Houches played in the history of Statistical Physics & Computer Science 🤓👇

Lei Wang (@wangleiphy)'s Twitter Profile Photo

Autoregressive model: alphabets, actions, and atoms wangleiphy.github.io/lectures/AAA-h… This lecture tries to offer a unified perspective on LLMs, RL, and atomistic modeling!
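The unifying object is the chain-rule factorization p(x_1..x_T) = ∏_t p(x_t | x_<t): tokens for LLMs, actions for RL, atoms for materials generation. A toy sketch (my own, using a fixed bigram table as a stand-in for a learned transformer conditional):

```python
import numpy as np

rng = np.random.default_rng(0)

# First-order (bigram) conditional over a 3-symbol vocabulary.
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.3, 0.3, 0.4]])   # P[i, j] = p(next = j | current = i)

def sample(length, start=0):
    """Ancestral sampling: draw each symbol from p(x_t | x_{t-1})."""
    seq = [start]
    for _ in range(length - 1):
        seq.append(int(rng.choice(3, p=P[seq[-1]])))
    return seq

def log_prob(seq):
    """Exact sequence log-likelihood from the same factorization."""
    return sum(np.log(P[a, b]) for a, b in zip(seq, seq[1:]))

s = sample(10)
print(s, log_prob(s))
```

Swapping the table for a neural conditional and the symbols for tokens, actions, or atom types changes nothing about the sampling and likelihood structure.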

Soon Hoe Lim (@shoelim8)'s Twitter Profile Photo

📣 We are hiring! Want to move to beautiful Stockholm 🇸🇪 and work on cutting-edge ML research? Join our group and help push the frontiers of machine learning! academicjobsonline.org/ajo/jobs/30017 📍 Apply now / spread the word! #ML #AI #Postdoc #Nordita #KTH #Stockholm

YingTang (@yingtangphysics)'s Twitter Profile Photo

Flow matching has become a dominant paradigm in generative modeling. We develop Quantum Flow Matching, with diverse applications: generate states with target magnetization and entanglement entropy; probe nonequilibrium free energy and superdiffusion. arxiv.org/abs/2508.12413

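For readers new to flow matching, here is a minimal classical 1D sketch (my own toy illustration, not the paper's quantum construction): regress a velocity field on linear interpolation paths, then integrate it to transport one distribution to another.

```python
import numpy as np

rng = np.random.default_rng(0)

# Learn a velocity field transporting N(0, 1) to N(2, 1).
n = 20000
x0 = rng.standard_normal(n)           # source samples
x1 = 2.0 + rng.standard_normal(n)     # target samples
t = rng.uniform(0.0, 1.0, n)

xt = (1 - t) * x0 + t * x1            # linear interpolation path
u = x1 - x0                           # conditional velocity target

# Least-squares fit of v(x, t) on features (x, t, 1) -- a linear
# stand-in for the neural network used in practice.
A = np.stack([xt, t, np.ones(n)], axis=1)
w, *_ = np.linalg.lstsq(A, u, rcond=None)

# Generate: integrate dx/dt = v(x, t) with forward Euler from the source.
x = rng.standard_normal(5000)
for k in range(100):
    tk = k / 100
    x += 0.01 * (w[0] * x + w[1] * tk + w[2])

print("generated mean:", x.mean(), "std:", x.std())
```

The generated samples land near the target mean and spread; the quantum version replaces samples and velocity fields with states and parameterized evolutions.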
Keenan Crane (@keenanisalive)'s Twitter Profile Photo

“Everyone knows” what an autoencoder is… but there's an important complementary picture missing from most introductory material.

In short: we emphasize how autoencoders are implemented—but not always what they represent (and some of the implications of that representation).🧵
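One concrete way to see the representational picture (my own linear illustration, not the thread's): the decoder's image is a low-dimensional surface in data space, and the encoder assigns coordinates on it. For a linear autoencoder the optimum is the PCA subspace, so we can write the encoder/decoder in closed form instead of training one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data concentrated near a hidden 2D subspace of R^10, plus small noise.
d, k, n = 10, 2, 5000
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]
X = rng.standard_normal((n, k)) @ basis.T \
    + 0.05 * rng.standard_normal((n, d))

mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)

encode = lambda x: (x - mean) @ Vt[:k].T   # latent coordinates
decode = lambda z: z @ Vt[:k] + mean       # point on the learned surface

recon_err = np.mean((decode(encode(X)) - X) ** 2)
print("reconstruction MSE:", recon_err)    # ~ the injected noise level
```

A nonlinear autoencoder plays the same game with a curved surface in place of the subspace; the implementation differs, but what is represented is the same kind of object.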
Surya Ganguli (@suryaganguli)'s Twitter Profile Photo

Our new paper: "The geometry and dynamics of annealed optimization in the coherent Ising machine with hidden and planted solutions" arxiv.org/abs/2510.21109
How do algorithms like gradient descent negotiate the complex geometry of high dimensional loss landscapes to find near…
Surya Ganguli (@suryaganguli)'s Twitter Profile Photo

Our new paper "Deriving neural scaling laws from the statistics of natural language" arxiv.org/abs/2602.07488 led by Francesco Cagnetta & Allan Raventós w/ Matthieu Wyart makes a breakthrough! We can predict data-limited neural scaling law exponents from first principles using the…

YingTang (@yingtangphysics)'s Twitter Profile Photo

Deeply saddened by the passing of Nobel laureate Tony Leggett. Once at a conference, the posters were inadvertently placed too low, yet he knelt down to listen to our presentations. His respect for every piece of knowledge will endure alongside his legendary contributions to physics.
