Biography

Dr. Yixin Zhu received his Ph.D. from UCLA in 2018, advised by Prof. Song-Chun Zhu. His research builds interactive AI by integrating high-level common sense (functionality, affordance, physics, causality, intent) with raw sensory inputs (pixels and haptic signals), enabling richer representations and cognitive reasoning about objects, scenes, shapes, numbers, and agents. Dr. Zhu directs the PKU CoRe Lab, which works on abstract reasoning, visually grounded reasoning, and interactive reasoning.

[CV] [Dissertation]

Getting Started — a self-study guide covering the full four-year undergraduate AI curriculum
Articles on the Tong Class (通班) — a collection of coverage from official and authoritative outlets
Boya No. 1 (博雅一号, Chengdu) — currently available only to full-time PIs of the Institute for AI and PKU Tong Class students
I am recruiting postdocs on a rolling basis. We offer competitive benefits and a first-class research platform. More information on the Peking University postdoctoral program is available here.

Pre-Prints

UniAct: Unified Motion Generation and Action Streaming for Humanoid Robots
Cross-Scenario Unified Modeling of User Interests at Billion Scale
Vi-TacMan: Articulated Object Manipulation via Vision and Touch
TacMan-Turbo: Proactive Tactile Control for Robust and Efficient Articulated Object Manipulation
AffordX: Generalizable and Slim Affordance Reasoning for Task-oriented Manipulation
Learning to Plan with Personalized Preferences
AlphaChimp: Tracking and Behavior Recognition of Chimpanzees
Dexterous Functional Pre-Grasp Manipulation with Diffusion Policy

Selected Publications

[CHI26] NarrativeLoom: Enhancing Creative Storytelling through Multi-Persona Collaborative Improvisation
[NeurIPS25] DrivAerStar: An Industrial-Grade CFD Dataset for Vehicle Aerodynamic Optimization
[CoRL25] CLONE: Closed-Loop Whole-Body Humanoid Teleoperation for Long-Horizon Tasks
[CogSci25] Probing and Inducing Combinational Creativity in Vision-Language Models
[CogSci25] A Simulation-Heuristics Dual-Process Model for Intuitive Physics
[T-RO25] Tac-Man: Tactile-Informed Prior-Free Manipulation of Articulated Objects