Daochen Zha (@zdcfrank) 's Twitter Profile
Daochen Zha

@zdcfrank

MLE @Airbnb | CS Ph.D. @RiceUniversity | Former intern @Meta | AI | ML | Reinforcement learning

ID: 738570184705740800

Link: http://dczha.com | Joined: 03-06-2016 03:17:08

28 Tweets

536 Followers

633 Following

MLCommons (@mlcommons) 's Twitter Profile Photo

The future of #ML is data-centric! That’s why we built #DataPerf, the leaderboard for data. It is the 1st platform and community for data-centric competitions. Together we will break through data limitations and unlock better ML for the world mlcommons.org/en/news/datape…

AI at Meta (@aiatmeta) 's Twitter Profile Photo

Today, Meta researchers, together with the MLCommons working group, are launching DataPerf, the first platform for building data and data-centric AI algorithm leaderboards. We're excited about how DataPerf will help push the data-centric AI field forward ⬇️

Zhaozhuo Xu (@zhaozhuox) 's Twitter Profile Photo

Introducing the first Research On Algorithms & Data Structures (ROADS) to Mega-AI Models Workshop at MLSys 2023! The workshop will focus on algorithmic approaches to address the scalability challenges of AI in the future. CFP: roads2megaai.github.io DDL: May 05

Xia “Ben” Hu (@huxia) 's Twitter Profile Photo

Should we use LLMs or fine-tuned models for downstream tasks? If you are interested in this question, please take a look: Harnessing the Power of LLMs in Practice: A Survey on ChatGPT and Beyond arxiv.org/abs/2304.13712

Alex Ratner (@ajratner) 's Twitter Profile Photo

1/ *Data* is the key differentiator in building LLMs/foundation models. wired.com/story/stack-ov… - as the world realizes this, things are going to get interesting! Meanwhile: Enterprises will be avoiding this mess by training on their own data & knowledge for real AI moats.

AK (@_akhaliq) 's Twitter Profile Photo

Pre-train and Search: Efficient Embedding Table Sharding with Pre-trained Neural Cost Models abs: arxiv.org/abs/2305.01868 github: github.com/daochenzha/neu…

Daochen Zha (@zdcfrank) 's Twitter Profile Photo

Our latest work demonstrates the efficacy of pre-training cost models and searching in addressing sharding challenges in ML Systems. Learn how we partitioned a large ML model across multiple devices in our #MLSys2023 paper. Check it out!
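The "pre-train and search" recipe replaces hand-tuned sharding heuristics with a learned per-table cost predictor plus a plan search. As rough intuition only, the sketch below balances hypothetical predicted table costs across devices with a greedy longest-processing-time heuristic; it is not the paper's neural cost model or its search algorithm, and the cost numbers are made up for illustration.

```python
# Toy greedy sharding: a stand-in for the paper's pipeline, which
# pre-trains a neural cost model and then searches for a good plan.
# `table_costs` plays the role of the learned per-table cost
# predictions (hypothetical numbers, not from the paper).
def shard_tables(table_costs, num_devices):
    loads = [0.0] * num_devices            # predicted load per device
    plan = [[] for _ in range(num_devices)]  # table indices per device
    # Longest-processing-time heuristic: the most expensive remaining
    # table goes to the currently least-loaded device.
    for idx in sorted(range(len(table_costs)), key=lambda i: -table_costs[i]):
        dev = min(range(num_devices), key=lambda d: loads[d])
        plan[dev].append(idx)
        loads[dev] += table_costs[idx]
    return plan, loads

plan, loads = shard_tables([5.0, 4.0, 3.0, 3.0, 2.0, 1.0], num_devices=2)
```

In the paper's setting, the quality of a plan depends on a learned cost model rather than these additive toy costs, and the search explores many candidate plans instead of committing greedily.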

Open Finance@Columbia (@ai4finance) 's Twitter Profile Photo

FinGPT: Democratizing Internet-scale Data for Financial Large Language Models. Paper: arxiv.org/abs/2307.10485 Codes: github.com/AI4Finance-Fou…

Daochen Zha (@zdcfrank) 's Twitter Profile Photo

🚀Join us for the KDD 2023 Tutorial “Data-centric AI: Techniques and Future Perspectives" tomorrow on Aug 8! Learn cutting-edge methods and discuss the future of data-centric AI. You can also join us virtually in Zoom. See more info at dcaitutorial.github.io #KDD2023

Daochen Zha (@zdcfrank) 's Twitter Profile Photo

Sharing the video recording and slides from our recent tutorial on data-centric AI at KDD 2023. Your feedback is appreciated! #AI #datacentric #KDD2023 YouTube: youtu.be/6WjHpFeOgQ0 Slides: dcaitutorial.github.io/files/data-cen…

Qiaoyu Tan (@qiaoyu_tan) 's Twitter Profile Photo

📢📊 Exciting opportunity alert! The call for papers is now open for the Data-Centric AI Workshop at #WWW2024. 🌐🤖 Join us in shaping the future of data-centric AI at #WWW2024. More info at: dcai-workshop.github.io Submission deadline: February 10, 2024

Jiayi Yuan (@jiayiyuan99) 's Twitter Profile Photo

Excited to introduce KIVI🥝, the first 2bit KV cache quantization breakthrough! 🚀 KIVI can be directly integrated into existing LLMs without any tuning. 📄 Paper: arxiv.org/abs/2402.02750 💻 Code: github.com/jy-yuan/KIVI #KIVI #LLM #AI #MachineLearning

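The general idea behind low-bit KV cache quantization can be sketched generically. The snippet below is a toy asymmetric per-group 2-bit quantizer in NumPy, not KIVI's actual per-channel/per-token scheme (see the paper for that); the group size and the KV-cache stand-in array are illustrative assumptions.

```python
import numpy as np

def quantize_2bit(x, group_size=32):
    # Asymmetric 2-bit quantization: each group of values is mapped to
    # one of 4 levels (0..3) with a per-group scale and zero point.
    x = x.reshape(-1, group_size)
    lo = x.min(axis=1, keepdims=True)
    hi = x.max(axis=1, keepdims=True)
    scale = (hi - lo) / 3.0
    scale[scale == 0] = 1.0  # guard against constant groups
    q = np.clip(np.round((x - lo) / scale), 0, 3).astype(np.uint8)
    return q, scale, lo

def dequantize_2bit(q, scale, lo):
    return q.astype(np.float32) * scale + lo

rng = np.random.default_rng(0)
kv = rng.normal(size=(4, 64)).astype(np.float32)  # stand-in for a KV cache slice
q, scale, lo = quantize_2bit(kv)
kv_hat = dequantize_2bit(q, scale, lo).reshape(kv.shape)
max_err = np.abs(kv - kv_hat).max()  # bounded by scale / 2 per group
```

Storing `q` (2 bits per value, packed) plus per-group `scale` and `lo` is what yields the memory savings relative to a 16-bit cache; KIVI's contribution is making this work at 2 bits for attention KV tensors without fine-tuning.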
Wei Jin (@weisshelter) 's Twitter Profile Photo

【Deadline extended to Feb 15th】 Passionate about advancing AI through data excellence? Join the Data-Centric AI (DCAI) Workshop at WWW 2024 to shape the future of AI! 🌐 Submission guidelines: dcai-workshop.github.io Deadline: Feb 15th, 2024

Daochen Zha (@zdcfrank) 's Twitter Profile Photo

Join us for Data-Centric AI (DCAI) at WWW'24 with 19 paper presentations and a series of keynotes. Check out all the details at dcai-workshop.github.io. Secure your spot at WWW by signing up for the DCAI workshop now: www2024.thewebconf.org. See you at WWW'24! 🌟

Yu-Neng Chuang (@yunengchuang) 's Twitter Profile Photo

Introducing the LTSM-bundle Package! 🌟Thrilled to launch our open-source tool 🔧Assess various crucial designs to train Large Time Series Models (LTSMs), and identify the best training practices 🔗 Paper: arxiv.org/abs/2406.14045 🔗 GitHub: github.com/daochenzha/ltsm
