Show simple item record

dc.contributor.author	Jansen, Samantha D.
dc.contributor.author	Chaparro, Alex
dc.contributor.author	Downs, David
dc.contributor.author	Keebler, Joseph R.
dc.date.accessioned	2016-02-10T20:38:01Z
dc.date.available	2016-02-10T20:38:01Z
dc.date.issued	2013-09
dc.identifier.citation	Jansen, S.D., Chaparro, A., Downs, D., & Keebler, J.R. (2013). Visual and cognitive predictors of visual enhancement in noisy listening conditions. Proceedings from 2013 Human Factors and Ergonomics Society Annual Meeting, San Diego, California, September 30-October 4, 2013.
dc.identifier.other	doi: 10.1177/1541931213571267
dc.identifier.uri	http://doi.org/bcfq
dc.identifier.uri	http://hdl.handle.net/10057/11774
dc.description	Click on the DOI link below to access the article (may not be free).
dc.description.abstract	Researchers have demonstrated that visual and auditory cues interact, improving speech intelligibility under noisy listening conditions. For instance, recent findings demonstrated that simulated cataracts hinder the ability of listeners to utilize visual cues to understand (i.e., speechread) televised speech sentences. The purpose of this study was to determine which measures of visual, auditory, and cognitive performance predicted participants' ability to speechread televised spoken messages in the presence of background babble. Specifically, 30 young adults with normal visual acuity and hearing sensitivity completed a battery of visual, auditory, and cognitive assessments. Speech intelligibility was tested under two conditions: auditory-only with no visual input and auditory-visual with normal viewing. Speech intelligibility scores were used to calculate average visual enhancement, or the average benefit participants gained from viewing visual information in addition to auditory information. Regression analyses demonstrated that the best predictors of visual enhancement were measures of contrast sensitivity and executive functioning, including the Digit Symbol Substitution Test and Trail Making Test, part B. These results suggest that audiovisual speech integration is dependent on both low-level sensory information and high-level cognitive processes, particularly those associated with executive functioning.
dc.language.iso	en_US
dc.relation.ispartofseries	Human Factors and Ergonomics Society Annual Meeting
dc.relation.ispartofseries	2013
dc.title	Visual and cognitive predictors of visual enhancement in noisy listening conditions
dc.type	Conference paper
dc.rights.holder	Human Factors and Ergonomics Society, Inc.


Files in this item


There are no files associated with this item.
