Quanquan Peng (@quanquanpeng03)'s Twitter Profile
Quanquan Peng

@quanquanpeng03

@SJTU1896 ACM Class 25 | RA @uw_robotics 🤖Robotics, RL/IL, Embodied AI

ID: 1038381219409321985

Link: https://bariona.github.io/ | Joined: 08-09-2018 10:59:09

2 Tweets

84 Followers

445 Following

Jianglong Ye (@jianglong_ye)'s Twitter Profile Photo

How to generate billion-scale manipulation demonstrations easily? Let us leverage generative models! 🤖✨ We introduce Dex1B, a framework that generates 1 BILLION diverse dexterous hand demonstrations for both grasping 🖐️and articulation 💻 tasks using a simple C-VAE model.

Ruihan Yang (@rchalyang)'s Twitter Profile Photo

How can we leverage diverse human videos to improve robot manipulation? Excited to introduce EgoVLA — a Vision-Language-Action model trained on egocentric human videos by explicitly modeling wrist & hand motion. We build a shared action space between humans and robots, enabling

An-Chieh Cheng (@anjjei)'s Twitter Profile Photo

Let your robot peek around corners, size up the gap between chairs, and know exactly where everything sits. 🤖 Our new work SR-3D masters this kind of spatial reasoning. It learns from multiple views to understand distances, layouts, and how objects relate in 3D space.

Guangqi Jiang (@luccachiang)'s Twitter Profile Photo

Ever wanted to enjoy all the privileged information in sim while seamlessly transferring to the real world? How can we correct policy mistakes after deployment? 👉Introducing GSWorld, a photo-realistic real2sim2real simulator with interaction physics and fully open-sourced code.

Xueyan Zou (@xyz2maureen)'s Twitter Profile Photo

As an AI researcher, are you interested in tracking trends from CV/NLP/ML to robotics, and even Nature/Science? Our paper “Real Deep Research for AI, Robotics & Beyond” automates survey generation and trend/topic discovery across fields. 🔥 Explore RDR at realdeepresearch.github.io

Jianglong Ye (@jianglong_ye)'s Twitter Profile Photo

How do we make dexterous hands handle both power and precision tasks with ease? 🫳👌🫰 We introduce Power to Precision (💪➡️🎯), our new paper that optimizes both control and fingertip geometry to unlock robust manipulation from power grasp to fine-grained manipulations. With

Quanquan Peng (@quanquanpeng03)'s Twitter Profile Photo

Really inspiring! Check out this amazing project that co-optimizes software and hardware to produce such dexterous behavior!

Xuanbin Peng (@xuanbin_peng)'s Twitter Profile Photo

What if a humanoid robot could choose how to interact with the environment 🤖 — soft when it needs compliance, stiff when it needs precision, and force-aware when it must push/pull? That’s exactly what our Heterogeneous Meta-Control (HMC) framework enables. Our new framework

Yutong Liang (@yutongliang_)'s Twitter Profile Photo

How far can we push the limit of in-hand manipulation dexterity? Introducing our work on motion capture: DexterCap & DexterHand! DexterCap: a high-fidelity motion capture system for intricate in-hand manipulation motions. DexterHand: a dataset featuring true in-hand

cw j (@cwj99770123)'s Twitter Profile Photo

Can we bridge the Sim-to-Real gap in complex manipulation without explicit system ID? 🤖 Presenting Contact-Aware Neural Dynamics — a diffusion-based framework that grounds simulation with real-world touch. Implicit Alignment: No tedious parameter tuning. Tactile-Driven:

Quanquan Peng (@quanquanpeng03)'s Twitter Profile Photo

How far can humanoid cross-embodiment go? Please check out the latest work! 💪 If you've worked on humanoids, you know how hard it is to get those different robots working at the same time!