Objectives

Implementation of spiking tactile neural encoding on a humanoid robot, and development of decoding strategies for perception and behaviour generation. The robot will serve as a testbed to reproduce biological touch experiments (in collaboration with SISSA) on the perception of multiple stimulus properties arising from active object exploration, including light pressure, vibration, texture, lateral motion, and stretch. The task will be the active exploration and modelling of an object using tactile and visual information. Visual information will be used to form an initial guess of the object shape; tactile information will then refine and complement this guess with features that characterise the local curvature of the object, covering areas that are not visible due to occlusion. The project will use the algorithms developed within research themes 5 and 6 to classify local features of the object, and machine learning models (e.g. Gaussian Processes) to model the object surface and provide an accurate shape estimate. In the final part of the project, we will validate the surface reconstruction method in the context of object grasping, using state-of-the-art techniques that rely on object models. For comparative evaluation, we will use a dataset of objects for which accurate models are available (the Yale-CMU-Berkeley object set, a dataset for benchmarking object manipulation).

Expected Results

An active exploration strategy based on tactile feedback for object modelling, with experimental validation and benchmarking in a grasping scenario.

Planned Secondments

  • SISSA:
    definition of tasks and protocols for psychophysics
  • USFD:
    to learn prediction and action selection
  • PAL:
    to test on a different robotic platform (generalisation)

Enrolments (in Doctoral degree/s)

University of Genova

Supervisors

C. Bartolozzi, L. Natale, M. Diamond, S. Panzeri, T. Prescott, S. Terreri

Tags

ART ROB
ESR11: Active touch and behaviour
This project is funded by the EU Horizon 2020 research and innovation programme under grant agreement No 813713 (NeuTouch)
© 2018 NeuTouch