Cheng Han Chiang (姜成翰) (@dcml0714)'s Twitter Profile
Cheng Han Chiang (姜成翰)

@dcml0714

Fourth-year Ph.D. student at National Taiwan University.
Interests: music🎶, 📷photography, Japanese drama. Cat person🐱
Research interests: NLP

ID: 1213772234910654464

Link: https://d223302.github.io/ · Joined: 05-01-2020 10:40:42

125 Tweets

377 Followers

224 Following

Shao-Hua Sun (@shaohua0116)'s Twitter Profile Photo

We invite in-person tutorial proposals to the Asian Conference on Machine Learning (ACML) 2025 in Taipei, Taiwan, on Dec 12, 2025! Share your research with us & visit vibrant Taiwan! #ACML2025 Deadline: Aug 1; notification: Sep 5 CFT: acml-conf.org/2025/tutorial.… Please retweet!

Cheng Han Chiang (姜成翰) (@dcml0714)'s Twitter Profile Photo

RAFT is the building block of TRACT. It is incredible to see that a simple, well-motivated, yet non-trivial method like RAFT can yield meaningful improvements without any training or inference overhead.

Ke-Han Lu (@kehan_lu)'s Twitter Profile Photo

What is the best way to create training data for a Large Audio Language Model? Check out our latest paper: arxiv.org/abs/2507.02768

Cheng Han Chiang (姜成翰) (@dcml0714)'s Twitter Profile Photo

Excited to be at #ACL2025! 🎉 Looking forward to sharing our research and learning from the community. Open to discussions on:
- LLM-as-a-judge: our paper, TRACT, was presented this morning
- Audio-LLM-as-a-judge for speaking style
- STITCH: our newest SLM that can think and …

Yung-Sung Chuang (@yungsungchuang)'s Twitter Profile Photo

This work began as my summer intern project last year and has since grown into a long-term effort in large-scale pretraining. Huge thanks to project leaders Hu Xu, Shang-Wen Li, Yang Li, and Dong Wang for all their dedication! AI at Meta · MIT CSAIL 🧑‍💻 github.com/facebookresear…