Research

My interests are in multimodal object representations for autonomous sensorimotor interaction. I am currently working on a robotic agent that learns to grasp and manipulate objects on-the-job.

Keywords: Robotics, computer vision, robotic manipulation, machine learning.

Research Projects


Generalizing Task Parameters

Part of the EU project TOMSY.


Part-based Grasp Generalization

Part of the EU project CogX.

Part of the EU project TOMSY.


Grasping Using Tactile And Visual Data

Part of the EU project CogX.


Autonomous Learning of Object Grasp Models

Part of the EU project PACO-PLUS.


3D Object Pose Estimation and Recognition

Part of the EU project PACO-PLUS.

Overview (Popular Science)

Industrial robots perform repetitive tasks with an accuracy and speed far superior to those of humans. Yet, to date, they have mostly been confined to highly controlled factories designed around them. This confinement stems from the way industrial robots function: they execute programs written for one specific task, in one assumed environment. If the task or the environment changes, the robot must be reprogrammed. Consider, for instance, a car-assembly robot that picks up wheels from a feeder and bolts them to an axle. In this scenario, the location of the feeder and the pick-and-bolt behavior are hard-coded into the robot's program. If we move the robot to another factory where the feeder is placed differently, or to a factory where the robot is expected to remove wheels instead of attaching them, a technician will need to reprogram it before it can work again. In short, today's robots are still far less versatile than humans, which is why we mostly use them in tightly controlled workplaces built for them.


Robots working in controlled vs. uncontrolled environments. Left: an industrial robot fixed at its workstation; right: a household robot in a kitchen.

The robotics research community is currently striving to develop robots that can operate in ordinary factories, houses, offices, or hospitals. Developing such robots is difficult because of the diversity inherent to human environments. Room layouts differ from one building to another. Most common objects come in different sizes and colors, and vary in weight and stiffness. Preprogramming a robot to work out of the box in an arbitrary house or factory is impractical: the robot would need the complete layout of every building it enters, models of all the objects and tools it may have to manipulate or use, and preprogrammed behaviors for every combination of tool and task. In response, the community has moved beyond preprogrammed designs, and it is now developing robots that learn to adapt to new tasks and environments. By observing the environmental effects of their own actions and the actions of others, these robots progressively acquire the knowledge necessary to do their work. Consequently, the “program” that governs a robot's actions evolves over time.

In my own research, I develop a robotic agent that learns to manipulate objects – for instance, to pick up a plate from a table and place it in a dishwasher, or to slide a dish into an oven. By reproducing tasks demonstrated by a human, and by experimenting on its own, the robot learns how to place its hand on various objects in order to grasp them. It also learns to exploit tactile cues to keep a grasp stable: through experimentation, the robot discovers that certain tactile signals indicate that the grasp is wrong, or that the object is slipping away, and that a reactive action is therefore necessary. As the robot becomes familiar with a small set of objects, it progressively abstracts generic skills from its experience. In turn, these skills are transferred to novel objects as they appear, allowing the robot to quickly adapt to a changing environment.
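To give a flavor of the tactile mechanism described above, here is a minimal sketch, not the actual controller used in this research: it assumes each tactile pad reports a scalar pressure reading, and it flags a slip when the total pressure drops sharply between consecutive readings, which would trigger a reactive action such as tightening the grip.

```python
# Minimal sketch of tactile slip detection (illustrative only).
# Assumption: each tactile pad reports a scalar pressure value;
# a sharp drop in total pressure between readings is treated as a slip.

def detect_slip(pressures, prev_pressures, threshold=0.2):
    """Return True when the total pad pressure drops by more than
    `threshold` (as a fraction of the previous total) between readings."""
    prev_total = sum(prev_pressures)
    drop = prev_total - sum(pressures)
    return prev_total > 0 and drop > threshold * prev_total

# Simulated readings from two tactile pads: a stable grasp, then a slip.
readings = [[1.0, 1.0], [0.98, 1.01], [0.5, 0.6]]
slips = [detect_slip(cur, prev)
         for prev, cur in zip(readings, readings[1:])]
print(slips)  # [False, True]
```

In a real controller, a positive detection would trigger a corrective grasp adjustment within the control loop; the threshold here is an arbitrary illustrative value.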


Page last modified: October 09, 2016