Robotics and Biology Laboratory

Robotics Related


Estimating Robust Affordances from Vision by Combining Multiple Models

Patrick Lowin

Efficient interactions with the environment require knowing its affordances. In manipulation tasks, an affordance describes which parts of the environment are graspable and movable. However, estimating such affordances from RGBD data without interaction is inherently ambiguous. These ambiguities manifest as different affordance estimates depending on the cues leveraged, e.g., appearance or geometry. Thus, different models provide only uncertain measurements, which we can fuse to obtain robust estimates. To do so, we recursively estimate beliefs over affordances for multiple existing affordance predictors separately and then fuse their beliefs.
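
The project abstract does not specify the fusion rule; below is a minimal sketch of one common choice, assuming each predictor outputs a per-pixel affordance probability map that is accumulated recursively in log-odds form and fused under an independence assumption. All class and variable names here are illustrative, not the project's actual code.

```python
# Minimal sketch (not the project's implementation): recursive per-pixel
# belief updates for several affordance predictors, fused via log-odds.
import numpy as np


def prob_to_logodds(p, eps=1e-6):
    p = np.clip(p, eps, 1.0 - eps)
    return np.log(p / (1.0 - p))


def logodds_to_prob(l):
    return 1.0 / (1.0 + np.exp(-l))


class RecursiveAffordanceFusion:
    """Keeps one recursive log-odds belief per predictor and fuses them."""

    def __init__(self, num_predictors, image_shape):
        # One belief map per predictor, initialized to 0 log-odds (p = 0.5).
        self.beliefs = np.zeros((num_predictors,) + image_shape)

    def update(self, predictor_idx, prob_map):
        # Recursive Bayesian update (as in occupancy grids):
        # add the measurement's log-odds to the running belief.
        self.beliefs[predictor_idx] += prob_to_logodds(prob_map)

    def fused(self):
        # Fuse predictors by summing their log-odds beliefs,
        # assuming conditionally independent measurements.
        return logodds_to_prob(self.beliefs.sum(axis=0))


# Usage with two hypothetical predictors (appearance- and geometry-based):
fusion = RecursiveAffordanceFusion(num_predictors=2, image_shape=(480, 640))
appearance_probs = np.random.rand(480, 640)   # stand-ins for model outputs
geometry_probs = np.random.rand(480, 640)
fusion.update(0, appearance_probs)
fusion.update(1, geometry_probs)
graspable_map = fusion.fused()                # fused affordance estimate
```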


Distance estimation using fixation and an event camera

Juan Antonio Gómez Daza

Humans navigate and interact with the 3D world using only 2D eye sensors, exploiting regularities in 3D space. Through gaze fixation and specific movements, humans extract relevant 3D properties of the world from 2D sensors that measure changes, much like an event camera does. This project investigates how event cameras can help robots interact with the 3D world as effortlessly as humans do.
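
To make the underlying principle concrete, here is a minimal sketch, under simplifying assumptions that are not part of the project description: a camera translating laterally at a known speed, with per-pixel image velocity estimated from event data (the event-based optical flow step is not shown). Depth then follows from motion parallax.

```python
# Minimal sketch (illustrative only): depth from motion parallax for a
# laterally translating camera, using image velocity derived from events.
import numpy as np


def depth_from_parallax(flow_x, cam_speed, focal_px, eps=1e-6):
    """Depth Z = f * v / u for pure lateral translation.

    flow_x   : per-pixel horizontal image velocity [px/s], e.g. from events
    cam_speed: known lateral camera speed [m/s]
    focal_px : focal length in pixels
    """
    return focal_px * cam_speed / np.maximum(np.abs(flow_x), eps)


# Hypothetical numbers: 0.1 m/s lateral motion, f = 600 px, and a point
# sliding across the image at 30 px/s corresponds to a depth of 2 m.
flow = np.array([[30.0]])
print(depth_from_parallax(flow, cam_speed=0.1, focal_px=600.0))  # ~2.0
```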

Active Learning to Manipulate Objects From Human Demonstration Videos

Adrian Pfisterer

This thesis aims to develop a system that allows robots to acquire manipulation skills directly from human demonstration videos. The novelty of this system is to actively command the robot to perform exploratory actions and gather additional sensory information, rather than relying solely on passively observed information from demonstrations.
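
The abstract does not describe how exploratory actions are chosen; the sketch below shows one generic option, assuming a skill model learned from demonstration videos that predicts an outcome distribution per candidate action, with the most uncertain candidate selected for execution. All function names and the toy model are hypothetical.

```python
# Minimal sketch of the active component only (hypothetical interfaces):
# score candidate exploratory actions by predictive uncertainty and pick
# the most informative one, instead of relying on passive observation alone.
import numpy as np


def entropy(p, eps=1e-9):
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p), axis=-1)


def select_exploratory_action(candidate_actions, predict_outcome_probs):
    """Pick the candidate whose predicted outcome is most uncertain."""
    scores = [entropy(predict_outcome_probs(a)) for a in candidate_actions]
    return candidate_actions[int(np.argmax(scores))]


# Toy stand-in for a model learned from demonstration videos: it returns
# an outcome distribution for each candidate action.
def toy_model(action):
    rng = np.random.default_rng(abs(hash(tuple(action))) % (2**32))
    p = rng.random(3)
    return p / p.sum()


candidates = [(0.1, 0.0), (0.0, 0.2), (-0.1, 0.1)]   # e.g. push directions
next_action = select_exploratory_action(candidates, toy_model)
```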