Zichen Wang (@zichenwangphd)'s Twitter Profile
Zichen Wang

@zichenwangphd

ML Scientist @awscloud. #MachineLearning, Healthcare, Biology. Formerly PhD & research faculty @IcahnMountSinai. Views are my own.

ID: 713877589

Link: https://wangz10.github.io · Joined: 24-07-2012 07:33:23

249 Tweets

261 Followers

445 Following

Zichen Wang (@zichenwangphd)'s Twitter Profile Photo

Had a great time at the #FASEB conference on illuminating the druggable proteome. Nice to catch up with old colleagues and share our research on GraphML for biology.

Nick Erickson (@innixma)'s Twitter Profile Photo

AutoGluon 1.0 is live!! Shatters SOTA, wins 75% vs prior release, 63% win-rate vs best-in-hindsight combination of other methods.
To our knowledge, this is the biggest leap forward in tabular ML in the past 4 years. 

See how we did it: github.com/autogluon/auto…

#AutoML #AutoGluon
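
For context, the AutoGluon tabular workflow this tweet refers to looks roughly like the sketch below, assuming autogluon 1.0+ is installed. The file names and the "label" column are hypothetical placeholders, and preset names can vary between versions.

```python
# Minimal AutoGluon TabularPredictor sketch (assumes autogluon>=1.0 installed;
# "train.csv"/"test.csv" and the "label" column are hypothetical placeholders).
from autogluon.tabular import TabularDataset, TabularPredictor

train_data = TabularDataset("train.csv")   # any pandas-compatible table works
test_data = TabularDataset("test.csv")

# fit() trains and ensembles a portfolio of tabular models automatically;
# presets control the quality/compute trade-off.
predictor = TabularPredictor(label="label").fit(train_data, presets="best_quality")

predictions = predictor.predict(test_data)
leaderboard = predictor.leaderboard(test_data)   # per-model scores on the test set
print(leaderboard.head())
```

fit() handles model selection, stacking, and ensembling internally; leaderboard() shows how each component model scored.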
Balasubramaniam Srinivasan (@balasrini32)'s Twitter Profile Photo

We at AWS AI Research & Education (AIRE) are hiring PhD students as Applied Scientist Interns for Summer '24 to work on AI for Science research projects related to biology, chemistry, and simulations. DM me for more info or apply @ amazon.jobs/en/jobs/242688…

Biology+AI Daily (@biologyaidaily)'s Twitter Profile Photo

LC-PLM: Long-Context Protein Language Model

• LC-PLM introduces a structured state-space model (BiMamba-S) for protein sequences, enabling efficient modeling of long-range dependencies within protein structures, a capability that conventional transformer-based models struggle
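
The BiMamba-S block named above is a bidirectional structured state-space layer. As a rough illustration only, not the LC-PLM implementation, a bidirectional Mamba block can be sketched by scanning the token sequence in both directions with the open-source mamba_ssm package and combining the outputs; the class name, dimensions, and residual wiring here are illustrative assumptions.

```python
# Conceptual sketch of a bidirectional Mamba block in the spirit of BiMamba-S.
# NOT the LC-PLM implementation; it only illustrates scanning a protein token
# sequence with a selective state-space layer in both directions.
# Assumes the `mamba_ssm` package (pip install mamba-ssm) and a CUDA GPU.
import torch
import torch.nn as nn
from mamba_ssm import Mamba


class BiMambaBlock(nn.Module):
    def __init__(self, d_model: int = 512):
        super().__init__()
        self.fwd = Mamba(d_model=d_model)   # scans left-to-right
        self.bwd = Mamba(d_model=d_model)   # scans right-to-left (on flipped input)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) embeddings of amino-acid tokens
        fwd_out = self.fwd(x)
        bwd_out = self.bwd(torch.flip(x, dims=[1]))
        bwd_out = torch.flip(bwd_out, dims=[1])      # re-align to original order
        return self.norm(x + fwd_out + bwd_out)      # residual + both directions


# Usage on a toy batch of long-context protein-token embeddings
block = BiMambaBlock(d_model=512).cuda()
tokens = torch.randn(2, 2048, 512, device="cuda")
out = block(tokens)                                  # (2, 2048, 512)
```

The point of the bidirectional scan is that each position conditions on both upstream and downstream residues while keeping the near-linear scaling in sequence length that makes long contexts tractable.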
Yingheng Wang (@yingheng_wang)'s Twitter Profile Photo

📢 We're excited to introduce LC-PLM, a Long-Context Protein Language Model that redefines the potential of protein sequence modeling! LC-PLM addresses the challenges of modeling longer protein sequences and complex biological interactions. Our model demonstrates superior scaling

Zichen Wang (@zichenwangphd)'s Twitter Profile Photo

Excited to announce that we just released the LC-PLM model weights and related code on GitHub: github.com/amazon-science…

Biology+AI Daily (@biologyaidaily)'s Twitter Profile Photo

Protein Structure Tokenization: Benchmarking and New Recipe  

1/ Recent advances in protein structure tokenization (PST) methods enable direct application of language modeling techniques to protein 3D structures. However, the capabilities and limitations of these methods remain
Zichen Wang (@zichenwangphd)'s Twitter Profile Photo

Check out our recent work led by our outstanding intern Xinyu: we created the first benchmark specifically for protein structure tokenization and present AminoAseed, a novel approach to address the codebook collapse problem in vector quantization.
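
For readers unfamiliar with the codebook collapse problem mentioned above: in vector-quantized structure tokenizers, many codebook entries can end up unused, so the effective vocabulary shrinks. The sketch below is a generic VQ-VAE-style quantizer with a perplexity diagnostic to make that concrete; it is not AminoAseed itself, and all names and dimensions are illustrative.

```python
# Generic vector-quantization layer with a codebook-usage diagnostic, to make
# "codebook collapse" concrete. Standard VQ-VAE-style quantizer, NOT AminoAseed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VectorQuantizer(nn.Module):
    def __init__(self, num_codes: int = 512, dim: int = 128, beta: float = 0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)
        self.codebook.weight.data.uniform_(-1.0 / num_codes, 1.0 / num_codes)
        self.beta = beta

    def forward(self, z: torch.Tensor):
        # z: (batch, seq_len, dim) continuous structure embeddings
        flat = z.reshape(-1, z.shape[-1])
        # nearest codebook entry per embedding (Euclidean distance)
        dists = torch.cdist(flat, self.codebook.weight)
        codes = dists.argmin(dim=-1)
        z_q = self.codebook(codes).view_as(z)

        # codebook + commitment losses (VQ-VAE), then straight-through estimator
        loss = F.mse_loss(z_q, z.detach()) + self.beta * F.mse_loss(z, z_q.detach())
        z_q = z + (z_q - z).detach()

        # usage statistics: low perplexity relative to codebook size = collapse
        probs = torch.bincount(codes, minlength=self.codebook.num_embeddings).float()
        probs = probs / probs.sum()
        perplexity = torch.exp(-(probs * (probs + 1e-10).log()).sum())
        return z_q, codes.reshape(z.shape[:-1]), loss, perplexity
```

A healthy tokenizer keeps codebook perplexity close to the number of codes; a collapsed one concentrates assignments on a handful of entries, which is the failure mode the tweet says AminoAseed targets.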