Title: Auditory Perspective Taking
Publication Type: Journal Article
Year of Publication: 2012
Authors: Martinson, E., Brock, D. P.
Journal: IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans
Date Published: 2012
Keywords: Acoustic propagation, auditory displays, human–robot interaction, robot sensing systems
Abstract

Effective communication with a mobile robot using speech is a difficult problem even when the auditory scene can be controlled. Robot self-noise (ego noise), echoes and reverberation, and human interference are all common sources of decreased intelligibility. Moreover, in real-world settings, these problems are routinely aggravated by a variety of sources of background noise. Military scenarios can be punctuated by high-decibel noise from materiel and weaponry that would easily overwhelm a robot's normal speaking volume, and even in nonmilitary settings, fans, computers, alarms, and transportation noise can cause enough interference to make a traditional speech interface unusable. This work presents and evaluates a prototype robotic interface that uses perspective taking to estimate the effectiveness of its own speech presentation and takes steps to improve intelligibility for human listeners.

Refereed Designation: Refereed
Full Text (PDF): http://www.nrl.navy.mil/itd/aic/sites/www.nrl.navy.mil.itd.aic/files/pdfs/martinson-brock-2012.pdf
NRL Publication Release Number: 11-1226-4456
Tags: 3DAudio