Robotics and Biology Laboratory

Online Interactive Perception

The code of our perceptual system for Online Interactive Perception of articulated objects, including the methods presented at IROS 2014 and ICRA 2016. The system extracts motion patterns at multiple levels (point-feature motion, rigid-body motion, kinematic-structure motion) and infers the kinematic structure and state of the articulated objects being interacted with. It takes an RGB-D stream of an interaction as input. Optionally, it can reconstruct the shape of the moving parts and use that shape to improve tracking. More information is available on the OMIP wiki.
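The multi-level recursive estimation described above can be illustrated with a toy sketch: each level maintains its own estimate, passes predictions down to the level below, and feeds its filtered output up as a measurement for the level above. This is a hypothetical, simplified 1D illustration of the idea, not the actual OMIP implementation (which is a ROS-based C++ system using recursive Bayesian filters); all class and method names here are invented for the example.

```python
# Toy sketch of a three-level recursive estimation loop
# (feature motion -> rigid-body motion -> kinematic structure).
# Hypothetical code for illustration only; not the OMIP API.
import random

class FeatureTracker:
    """Level 1: tracks point-feature locations (toy 1D features)."""
    def __init__(self, n_features):
        self.positions = [0.0] * n_features

    def update(self, observations, predicted):
        # Fuse raw observations with predictions from the rigid-body level
        # (a fixed-gain blend stands in for a proper filter correction step).
        self.positions = [0.5 * o + 0.5 * p
                          for o, p in zip(observations, predicted)]
        return self.positions

class RigidBodyTracker:
    """Level 2: estimates rigid-body motion from its features."""
    def __init__(self):
        self.pose = 0.0      # toy scalar "pose"
        self.velocity = 0.0

    def predict_features(self, n_features):
        # Predict where each feature should be under the body's motion model;
        # these predictions are handed down to the feature level.
        return [self.pose + self.velocity] * n_features

    def update(self, feature_positions):
        mean = sum(feature_positions) / len(feature_positions)
        self.velocity = mean - self.pose
        self.pose = mean
        return self.pose, self.velocity

class KinematicStructureEstimator:
    """Level 3: classifies the joint from rigid-body motion."""
    def update(self, velocity, threshold=1e-3):
        return "articulated" if abs(velocity) > threshold else "rigid"

random.seed(0)
features = FeatureTracker(n_features=4)
body = RigidBodyTracker()
structure = KinematicStructureEstimator()

# Run the loop over a short synthetic stream of a part sliding at 0.1/step.
joint_type = None
for t in range(10):
    observations = [0.1 * t + random.uniform(-0.01, 0.01) for _ in range(4)]
    predicted = body.predict_features(4)
    positions = features.update(observations, predicted)
    _, vel = body.update(positions)
    joint_type = structure.update(vel)

print(joint_type)  # the moving part is classified as articulated
```

The point of the loop is the information flow, not the toy math: predictions propagate top-down as priors, filtered estimates propagate bottom-up as measurements, which is the recursive multi-level structure the system description refers to.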

Repository: https://github.com/tu-rbo/omip and https://github.com/tu-rbo/omip_msgs

Accompanies papers: "Online Interactive Perception of Articulated Objects with Multi-Level Recursive Estimation Based on Task-Specific Priors" and "An Integrated Approach to Visual Perception of Articulated Objects"

Maintainer: Roberto Martín-Martín

Project: Interactive Perception