Intelligent Mobile Manipulation (IM^2) Lab
Our mission is to create intelligent mobile-manipulation robots that efficiently perform manipulation tasks in diverse, unstructured environments. We are motivated by the enormous difficulties that arise in handling large numbers of potentially unknown objects under a robot's limited knowledge and sensing capabilities. To realize this mission, we conduct interdisciplinary research at the intersection of task and motion planning, computer vision, and machine learning.

Recent publications

ICLR 2024
CORN: Contact-based Object Representation for Nonprehensile Manipulation of General Unseen Objects. Yoonyoung Cho, Junhyek Han, Yoontae Cho, and Beomjoon Kim. [Paper] [video] [project]

ICLR 2024
An Intuitive Multi-Frequency Feature Representation for SO(3)-Equivariant Networks. Dongwon Son, Jaehyung Kim, Sanghyeon Son, and Beomjoon Kim. [Paper] [video] [project]

CoRL 2023
Preference learning for guiding the tree search in continuous POMDPs. Jiyong Ahn, Sanghyeon Son, Dongryung Lee, Jisu Han, Dongwon Son, and Beomjoon Kim. [Paper] [video] [project]

IROS 2023
Pre- and post-contact policy decomposition for non-prehensile manipulation with zero-shot sim-to-real transfer. Minchan Kim, Junhyek Han, Jaehyung Kim, and Beomjoon Kim. [arXiv] [video] [project]

RSS 2023
Local object crop collision network for efficient simulation of non-convex objects in GPU-based simulators. Dongwon Son and Beomjoon Kim. [arXiv] [video] [project]

IROS 2022
Ohm^2: Optimal hierarchical planner for object search in large environments via mobile manipulation. Yoonyoung Cho*, Donghoon Shin*, and Beomjoon Kim. [pdf]
Recent talks

MIT Embodied Intelligence Seminar: Making Robots See and Manipulate

Learning to reason for robot task and motion planning problems