How Do I Sound Like? Forward Models for Robot Ego-Noise Prediction

A. Pico, G. Schillaci, V. V. Hafner (Humboldt Universität zu Berlin) and B. Lara (Universidad Autónoma del Estado de Morelos)

Frontiers in Robotics and AI, section Humanoid Robotics, June 30, 2016.

Abstract: What do robots sound like? Robot ego-noise, that is, the sound produced by a robot while moving, is an important factor that can affect the way an artificial agent perceives the environment and interacts with it. In robot audition, for example, ego-noise is usually treated as a nuisance because of its effect on the quality of the auditory input signal, as it can severely degrade the performance of processes such as speech recognition. Nonetheless, robot ego-noise can carry very useful information about the robot's embodiment or about the external environment. In this study, we present a mechanism for learning and predicting the auditory consequences of self-generated movements on a custom robotic platform. We present two experiments based on a computational model capable of performing forward predictions. First, we demonstrate that the system can classify motor behaviours by comparing the noise they produce with that of simulated actions. Second, we show that, by using similar processes, the robot can detect unexpected environmental conditions, such as changes in the inclination of the surface it is walking on.
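The classification scheme described in the abstract can be illustrated with a minimal sketch: a forward model predicts the ego-noise features expected for each motor behaviour, and an observed recording is attributed to the behaviour whose prediction it matches most closely. The feature vectors, behaviour names, and lookup-table forward model below are illustrative assumptions, not the paper's actual implementation.

```python
def predict_features(model, motor_command):
    """Forward model: predict audio features for a motor command.
    Here a simple lookup table stands in for a learned model."""
    return model[motor_command]

def classify_behaviour(models, observed):
    """Attribute observed ego-noise to the behaviour whose predicted
    features are closest (smallest Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(models, key=lambda cmd: dist(predict_features(models, cmd), observed))

# Toy forward models: motor command -> expected spectral features
models = {
    "walk_forward": [0.9, 0.2, 0.1],
    "turn_left":    [0.3, 0.8, 0.2],
    "stand_still":  [0.05, 0.05, 0.05],
}

observed = [0.85, 0.25, 0.12]  # noise features recorded during an unknown action
print(classify_behaviour(models, observed))  # -> walk_forward
```

A large prediction error across all behaviours can likewise signal an unexpected environmental condition, such as the surface-inclination change mentioned in the abstract.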

Copyright Notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author’s copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

Paper: Paper_EPIROB_2016_UBER_AP