Visual and cognitive predictors of visual enhancement in noisy listening conditions

Issue Date
2013-09
Authors
Jansen, Samantha D.
Chaparro, Alex
Downs, David
Keebler, Joseph R.
Citation

Jansen, S. D., Chaparro, A., Downs, D., & Keebler, J. R. (2013). Visual and cognitive predictors of visual enhancement in noisy listening conditions. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, San Diego, California, September 30–October 4, 2013.

Abstract

Researchers have shown that visual and auditory cues interact to improve speech intelligibility under noisy listening conditions. For instance, recent findings demonstrated that simulated cataracts hinder listeners' ability to use visual cues to understand (i.e., speechread) televised spoken sentences. The purpose of this study was to determine which measures of visual, auditory, and cognitive performance predicted participants' ability to speechread televised spoken messages in the presence of background babble. Specifically, 30 young adults with normal visual acuity and hearing sensitivity completed a battery of visual, auditory, and cognitive assessments. Speech intelligibility was tested under two conditions: auditory-only, with no visual input, and auditory-visual, with normal viewing. Speech intelligibility scores were used to calculate average visual enhancement, the average benefit participants gained from viewing visual information in addition to hearing auditory information. Regression analyses indicated that the best predictors of visual enhancement were measures of contrast sensitivity and of executive functioning, including the Digit Symbol Substitution Test and the Trail Making Test, Part B. These results suggest that audiovisual speech integration depends on both low-level sensory information and high-level cognitive processes, particularly those associated with executive functioning.

Description
Click on the DOI link below to access the article (may not be free).
DOI