V. Tourbabin (Ben-Gurion University of the Negev), H. Barfuss (FAU Erlangen-Nuremberg), B. Rafaely (Ben-Gurion University of the Negev), W. Kellermann (FAU Erlangen-Nuremberg)
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brisbane, Australia, April 20-24, 2015.
Abstract: Auditory systems of humanoid robots usually acquire the surrounding sound field by means of microphone arrays. These arrays can undergo motion related to the robot's activity. The conventional approach to dealing with this motion is to stop the robot during sound acquisition. This approach avoids changing the positions of the microphones during the acquisition and reduces the robot's ego-noise. However, stopping the robot can interfere with the naturalness of its behaviour. Moreover, the potential performance improvement due to motion of the sound-acquiring system cannot be attained. This potential is analysed in the current paper. The analysis considers two different types of motion: (i) rotation of the robot's head and (ii) limb gestures. The study presented here combines theoretical analysis with numerical simulation. The results show that rotation of the head improves the high-frequency performance of the microphone array positioned on the head of the robot. This is complemented by the limb gestures, which improve the low-frequency performance of the array positioned on the torso and limbs of the robot.
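A minimal sketch of the underlying idea (not taken from the paper): moving an array lets it sample the sound field at additional spatial positions, effectively enlarging the set of sampling points available for processing. The geometry (an 8-microphone circular head array of 9 cm radius) and the rotation angles below are illustrative assumptions only.

```python
import numpy as np

def array_positions(n_mics, radius, rotation_deg=0.0):
    """2-D positions of n_mics spaced uniformly on a circle of the
    given radius, rotated about the circle's centre by rotation_deg."""
    angles = np.deg2rad(rotation_deg) + 2.0 * np.pi * np.arange(n_mics) / n_mics
    return np.column_stack((radius * np.cos(angles), radius * np.sin(angles)))

def sampling_positions(n_mics, radius, rotations_deg):
    """Union of microphone positions over a sequence of head rotations."""
    pts = np.vstack([array_positions(n_mics, radius, r) for r in rotations_deg])
    # Drop positions that coincide after rotation (round to avoid
    # spurious floating-point duplicates).
    return np.unique(np.round(pts, 6), axis=0)

# Hypothetical scenario: robot kept still vs. three head poses.
static = sampling_positions(8, 0.09, [0.0])
moving = sampling_positions(8, 0.09, [0.0, 15.0, 30.0])

print(len(static), len(moving))  # 8 vs. 24 distinct sampling positions
```

Denser spatial sampling of this kind is what allows a moving array to outperform a static one, with the frequency range of the improvement depending on where the extra positions lie relative to the original aperture.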
©2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.