Ego-Noise Reduction Using a Motor Data-Guided Multichannel Dictionary
A. Schmidt (FAU Erlangen-Nuremberg), A. Deleforge (INRIA Rennes), and W. Kellermann (FAU Erlangen-Nuremberg)
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, South Korea, Oct. 9-14, 2016
[showhide type="Abstract"]Abstract: We address the problem of ego-noise reduction, i.e., suppressing the noise a robot causes by its own motions. Such noise severely degrades the recorded microphone signals and thereby the robot's auditory capabilities. To suppress it, it is intuitive to also exploit motor data, since it provides additional information about the robot's joints and thus about the noise sources. We propose to fuse motor data into a recently proposed multichannel dictionary algorithm for ego-noise reduction. During training, a dictionary is learned that captures the spatial and spectral characteristics of ego-noise. During testing, nonlinear classifiers efficiently associate the robot's current motor state with the relevant sets of entries in the learned dictionary. This reduces the computational load by one third in typical scenarios while achieving at least the same noise reduction performance. Moreover, we propose to train dictionaries on different microphone array geometries and to use them for ego-noise reduction while the head on which the microphones are mounted is moving. In such scenarios, the motor-data-guided approach yields significantly better noise reduction performance.[/showhide]
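To illustrate the core idea of the abstract, the sketch below shows how a motor state could select a reduced set of dictionary atoms before sparse coding. It is a minimal, hypothetical sketch only: scikit-learn's MiniBatchDictionaryLearning stands in for the paper's multichannel dictionary learning, a k-NN classifier stands in for the paper's nonlinear classifiers, and all data, names, and parameters are illustrative, not taken from the paper.

```python
# Minimal, hypothetical sketch of motor-data-guided dictionary selection
# for ego-noise reduction. Not the paper's implementation.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# --- Training ---------------------------------------------------------
# Stand-in training data: magnitude spectra of ego-noise frames, stacked
# across microphone channels (capturing spatial and spectral structure),
# plus the motor state (e.g., joint angles) recorded for each frame.
n_frames, n_channels, n_freq, n_joints = 2000, 4, 128, 12
noise_frames = rng.random((n_frames, n_channels * n_freq))
motor_states = rng.random((n_frames, n_joints))

# Learn an ego-noise dictionary by sparse coding of the stacked spectra.
n_atoms = 100
dico = MiniBatchDictionaryLearning(n_components=n_atoms, random_state=0)
codes = dico.fit_transform(noise_frames)  # sparse activations per frame

# Label each frame with its most strongly activated atom and train a
# nonlinear classifier from motor state to that atom index.
dominant_atom = np.abs(codes).argmax(axis=1)
clf = KNeighborsClassifier(n_neighbors=5).fit(motor_states, dominant_atom)

# --- Testing ----------------------------------------------------------
# For a new frame, the current motor state selects a small candidate set
# of atoms; sparse coding restricted to this subset cuts computation.
test_motor_state = rng.random((1, n_joints))
probs = clf.predict_proba(test_motor_state)[0]
top_k = min(len(clf.classes_), n_atoms // 3)     # keep roughly a third
order = np.argsort(probs)[::-1][:top_k]
candidate_atoms = clf.classes_[order]            # map back to atom indices
sub_dictionary = dico.components_[candidate_atoms]
print(sub_dictionary.shape)                      # (top_k, n_channels * n_freq)
```

Restricting sparse coding to the classifier-selected subset is what would yield the computational saving the abstract reports; the actual atom-selection and noise-reduction steps are detailed in the paper itself.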
Copyright Notice: ©2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Paper: Paper_IROS_2016_FAU_AS