Jason Peng (@xbpeng4) 's Twitter Profile
Jason Peng

@xbpeng4

Assistant Prof at @SFU and Research Scientist at @NVIDIA

ID: 981980649975300096

Link: http://xbpeng.github.io · Joined: 05-04-2018 19:43:26

171 Tweets

5.5K Followers

31 Following

Zhongyu Li (@zhongyuli4) 's Twitter Profile Photo

Interested in making your bipedal robots into athletes? We summarized our RL work on creating robust & adaptive controllers for general bipedal skills. A 400m dash, running over terrains and against perturbations, targeted jumping, compliant walking: not a problem for bipeds now.🧵👇

Mathis Petrovich (@mathispetrovich) 's Twitter Profile Photo

Just arrived in Seattle for #CVPR2024!😎 Catch me at the #HuMoGen workshop this Tuesday, I will be presenting my latest work "Multi-Track Timeline Control for Text-Driven 3D Human Motion Generation"🚀 mathis.petrovich.fr/stmc

Yi Shi (@yishi_333) 's Twitter Profile Photo

We are excited to present our #SIGGRAPH2024 paper, "Interactive Character Control with Auto-Regressive Motion Diffusion Model (AMDM)," on Friday, August 1st, at 2:10 PM in the Mile High 4 room. Project Page: yi-shi94.github.io/amdm_page

Chen Tessler (@chentessler) 's Twitter Profile Photo

Excited to share our latest work! 🤩 Masked Mimic 🥷: Unified Physics-Based Character Control Through Masked Motion Inpainting. Project page: research.nvidia.com/labs/par/maske… With: Yunrong (Kelly) Guo, Ofir Nabati, Gal Chechik and Jason Peng. SIGGRAPH Asia ➡️ Hong Kong (ACM TOG). 1/

Chen Tessler (@chentessler) 's Twitter Profile Photo

Excited to share our code for physics-based animation! 🥳 "ProtoMotions -- primitive or fundamental types of movement that serve as a basis for more complex motions." github.com/NVlabs/ProtoMo… This contains our recent work on MaskedMimic, plus some additional pieces. 🧵 1/

Zhongyu Li (@zhongyuli4) 's Twitter Profile Photo

Introducing HiLMa-Res: a hierarchical RL framework for quadrupeds to tackle loco-manipulation tasks with sustained mobility! Designed for general learning tasks (vision-based, state-based, real-world data, etc.), the robot can now step over stones🐾/navigate boxes📦/dribble⚽.

Yanjie Ze (@zeyanjie) 's Twitter Profile Photo

We’ve seen humanoid robots walk around for a while, but when will they actually help with useful tasks in daily life? The challenge here is the diversity and complexity of real-world scenes. Our new work tackles this problem via 3D visuomotor policy learning. Using data from

Zixuan Chen (@c___eric417) 's Twitter Profile Photo

Smooth behavior is vital for successful sim2real transfer of RL policies. It is often achieved with smoothness rewards or low-pass filters, which are not easily differentiable and tend to require tedious tuning. We introduce Lipschitz-Constrained Policies (LCP), a simple and
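The contrast drawn above (non-differentiable filters vs. a constraint built into training) can be sketched as a differentiable smoothness penalty: penalize the policy's input-output sensitivity so nearby observations map to nearby actions. This is a minimal illustrative sketch, not the paper's exact formulation; the network sizes and penalty weight are assumptions.

```python
import torch
import torch.nn as nn

# Illustrative policy network (dimensions are placeholder assumptions).
policy = nn.Sequential(nn.Linear(8, 64), nn.Tanh(), nn.Linear(64, 2))

def lipschitz_penalty(policy, obs, weight=0.1):
    """Penalize the gradient of actions w.r.t. observations.

    A bounded input-output gradient is one way to softly encourage a
    Lipschitz-like property, yielding smoother action sequences.
    """
    obs = obs.clone().requires_grad_(True)
    actions = policy(obs)
    # Gradient of the summed actions w.r.t. the observations;
    # create_graph=True keeps the penalty itself differentiable.
    grad = torch.autograd.grad(actions.sum(), obs, create_graph=True)[0]
    return weight * grad.pow(2).sum(dim=-1).mean()

obs = torch.randn(32, 8)          # a batch of observations
loss = lipschitz_penalty(policy, obs)
loss.backward()                    # differentiable, so it can simply be
                                   # added to the usual RL objective
```

Because the penalty is an ordinary differentiable loss term, it avoids the tuning-heavy reward shaping and non-differentiable filtering the tweet mentions.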

Jason Peng (@xbpeng4) 's Twitter Profile Photo

Our team at NVIDIA is recruiting PhD interns for 2025 to work on a wide range of topics in computer animation! If you want to join us to create some of the most capable and realistic simulated characters, then apply here: nvidia.wd5.myworkdayjobs.com/en-US/NVIDIAEx…

BridgeDP Robotics (@brigdgedp_robot) 's Twitter Profile Photo

We bought this humanoid robot and added our control-system magic: not just a robot, but a performer. ... 💃🤖 "Bringing rhythm to real robots!" #BridgeDP_Robotics #AI #Humanoid #EmbodiedAI #Dance #Motion #Control #sim2real

Yanjie Ze (@zeyanjie) 's Twitter Profile Photo

🤖Introducing TWIST: Teleoperated Whole-Body Imitation System. We develop a humanoid teleoperation system to enable coordinated, versatile, whole-body movements, using a single neural network. This is our first step toward general-purpose robots. 🌐humanoid-teleop.github.io

Michael Xu (@mxu_cg) 's Twitter Profile Photo

Interested in simulated characters traversing complex terrains? PARC: Physics-based Augmentation with Reinforcement Learning for Character Controllers Project page: michaelx.io/parc/index.html with: Yi Shi, KangKang Yin, and Jason Peng ACM SIGGRAPH 2025 Conference Paper 1/

Zixuan Chen (@c___eric417) 's Twitter Profile Photo

🚀Introducing GMT — a general motion tracking framework that enables high-fidelity motion tracking on humanoid robots by training a single policy from large, unstructured human motion datasets. 🤖A step toward general humanoid controllers. Project Website:

Haoru Xue (@haoruxue) 's Twitter Profile Photo

🚀 Introducing LeVERB, the first latent whole-body humanoid VLA (upper- & lower-body), trained on sim data and zero-shot deployed. Addressing interactive tasks: navigation, sitting, locomotion with verbal instruction. 🧵 ember-lab-berkeley.github.io/LeVERB-Website/

Jason Peng (@xbpeng4) 's Twitter Profile Photo

As we scale up the amount of data used to train humanoid controllers, cleaning motion data is becoming a significant bottleneck. With StableMotion, we can train motion cleanup models directly on raw corrupted motion data, and use it to automatically clean up the whole dataset.

Jason Peng (@xbpeng4) 's Twitter Profile Photo

I have always been surprised by how few positive samples adversarial imitation learning needs to be effective. With ADD we take this to the extreme! A differential discriminator trained with a SINGLE positive sample can still be effective for a wide range of tasks.
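The single-positive-sample setup described above can be sketched as a tiny adversarial imitation loop: a discriminator is trained to separate one reference sample from the agent's samples, and its output serves as a differentiable imitation signal. This is a hedged toy sketch, not the ADD paper's formulation; all dimensions, learning rates, and step counts are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy discriminator (sizes are placeholder assumptions).
disc = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(disc.parameters(), lr=1e-2)

positive = torch.randn(1, 4)      # the SINGLE positive (reference) sample
agent_batch = torch.randn(16, 4)  # samples produced by the current policy

for _ in range(200):
    # Standard binary-classification objective:
    # the reference sample -> label 1, agent samples -> label 0.
    loss = nn.functional.binary_cross_entropy_with_logits(
        disc(positive), torch.ones(1, 1)
    ) + nn.functional.binary_cross_entropy_with_logits(
        disc(agent_batch), torch.zeros(16, 1)
    )
    opt.zero_grad()
    loss.backward()
    opt.step()

# The discriminator's score can act as a reward: agent samples that
# resemble the lone reference sample score closer to 1.
reward = torch.sigmoid(disc(agent_batch)).detach()
```

Even with one positive example, the classifier defines a usable similarity signal, which is the surprising data-efficiency the tweet points to.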