Lai Wei (@laiiwei)'s Twitter Profile
Lai Wei

@laiiwei

MS CSE student @UCSanDiego | Robotics learning

ID: 1894472656612986882

Link: https://i-am-future.github.io/ · Joined: 25-02-2025 19:41:04

0 Tweets

1 Follower

17 Following

Guangqi Jiang (@luccachiang):

Ever want to enjoy all the privileged information in sim while seamlessly transferring to the real world? How can we correct policy mistakes after deployment? 👉 Introducing GSWorld, a real2sim2real photo-realistic simulator with interaction physics and fully open-sourced code.

Jianglong Ye (@jianglong_ye):

How do we make dexterous hands handle both power and precision tasks with ease? 🫳👌🫰 We introduce Power to Precision (💪➡️🎯), our new paper that optimizes both control and fingertip geometry to unlock robust manipulation, from power grasps to fine-grained manipulation.

Lai Wei (@laiiwei):

Dexterous hands are increasingly common. Although they offer greater capability than low-DoF grippers, most dexterous hands still fail to perform a simple task that basic grippers handle easily: precise grasping. To bridge this gap, we co-designed both the fingertip geometry and …

Lai Wei (@laiiwei):

Really appreciate the recognition and the chance to be part of this project. I'm currently seeking PhD positions for the coming year; I'd be glad to connect with labs or researchers in robotics and related fields! Personal website 👉 i-am-future.github.io

Xuanbin Peng (@xuanbin_peng):

What if a humanoid robot could choose how to interact with the environment 🤖: soft when it needs compliance, stiff when it needs precision, and force-aware when it must push or pull? That's exactly what our Heterogeneous Meta-Control (HMC) framework enables.
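The tweet doesn't spell out the mechanism, but mode-dependent behavior like this is commonly realized with variable-impedance control, where the same tracking error maps to very different forces depending on the active gains. A minimal sketch under that assumption; the mode names, gain values, and push/pull axis below are all hypothetical, not HMC's actual design:

```python
import numpy as np

# Hypothetical gain presets per interaction mode; the actual HMC gains
# and mode-selection mechanism are not described in the tweet.
MODES = {
    "compliant":   dict(kp=50.0,  kd=5.0,  f_ff=0.0),   # soft: low stiffness
    "stiff":       dict(kp=800.0, kd=40.0, f_ff=0.0),   # precise: high stiffness
    "force_aware": dict(kp=100.0, kd=10.0, f_ff=15.0),  # push/pull: bias force
}

def impedance_command(mode, x, x_des, v, v_des):
    """Cartesian impedance law: F = Kp (x_des - x) + Kd (v_des - v) + F_ff."""
    g = MODES[mode]
    push_axis = np.array([1.0, 0.0, 0.0])  # assumed push/pull direction
    return g["kp"] * (x_des - x) + g["kd"] * (v_des - v) + g["f_ff"] * push_axis
```

Switching `mode` at runtime changes how a given position error turns into force, which is the compliant/stiff/force-aware trichotomy the tweet describes.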

Xiaolong Wang (@xiaolonw):

Most robot learning has focused on simple position control. But think about how a human uses a wrench 🔧: you’re not just rotating in one direction—you’re continuously shaping the forces, pushing and pulling differently as you move. Our robot can do exactly that now.

Xiongyi Cai (@xiongyicai):

How do you teach a robot to do something it has never seen before? 🤖 With human data. Our new Human0 model is co-trained on human and humanoid data. It allows the robot to understand a novel language command and execute it perfectly in the wild without prior practice.
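Co-training on two data sources is often implemented as mixed-batch sampling from both datasets. A minimal sketch of that generic pattern, assuming list-like datasets; the mixing ratio, batch size, and function name are illustrative assumptions, not Human0's actual recipe:

```python
import random

def cotrain_batches(human_data, humanoid_data, human_ratio=0.5, batch_size=32):
    """Yield batches mixing human and humanoid samples at a fixed ratio."""
    while True:
        n_human = int(batch_size * human_ratio)
        batch = (random.sample(human_data, n_human)
                 + random.sample(humanoid_data, batch_size - n_human))
        random.shuffle(batch)  # interleave the two sources within each batch
        yield batch
```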

Rui Yan (@hi_im_ruiyan):

Meet ACE-F, a novel, foldable teleoperation platform for collecting high-quality robot demonstration data across robot embodiments. Using a specialized soft-controller pipeline, we interpret end-effector positional deviations as virtual force signals to provide the user with …
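The deviation-to-force mapping reads like a virtual spring: when the robot lags or is blocked, the gap between commanded and measured end-effector position becomes a force signal. A minimal sketch under that assumption; the stiffness and clipping values are invented for illustration:

```python
import numpy as np

def virtual_force(ee_pos_cmd, ee_pos_meas, stiffness=200.0, f_max=30.0):
    """Turn the commanded-vs-measured end-effector gap into a force signal."""
    f = stiffness * (ee_pos_meas - ee_pos_cmd)  # virtual spring on the deviation
    norm = np.linalg.norm(f)
    if norm > f_max:
        f = f * (f_max / norm)  # clip magnitude before feeding it to the operator
    return f
```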

Xueyan Zou (@xyz2maureen):

I will join Tsinghua University, College of AI, as an Assistant Professor in the coming month. I am actively looking for 2026 spring interns and future PhDs (ping me if you are at #NeurIPS).

It has been an incredible journey of 10 years since I attended an activity organized by …

Quanquan Peng (@quanquanpeng03):

"Cross-embodiment" is a sign of generalization. We’ve seen huge progress in manipulation and navigation — but what about humanoid whole-body control? Can ONE policy control multiple different humanoids? Meet our #ICRA2026 work 🦅EAGLE: Embodiment-Aware Generalist Specialist