Human bilateral cochlear implant users perform poorly on tasks involving interaural time differences (ITD), a cue that provides important benefits to normal-hearing listeners, especially in challenging acoustic environments, yet the precision of neural ITD coding in acutely deafened, bilaterally implanted cats is essentially normal (Smith and Delgutte, 2007a). One explanation for this discrepancy is that the extended periods of binaural deprivation typically experienced by cochlear implant users degrade neural ITD sensitivity, either by impeding normal maturation of the neural circuitry or by altering it later in life. To test this hypothesis, we recorded from single units in the inferior colliculus of two groups of bilaterally implanted, anesthetized cats that contrast maximally in binaural experience: acutely deafened cats, which had normal binaural hearing until experimentation, and congenitally deaf white cats, which received no auditory input until the experiment. In congenitally deaf cats, only half as many neurons as in acutely deafened cats showed significant rate-based ITD sensitivity to low-rate pulse trains. Among ITD-sensitive neurons, ITD tuning was broader and best ITDs were more variable in congenitally deaf cats, leading to poorer ITD coding within the naturally occurring range. A signal detection model constrained by the observed physiology supports the idea that degraded neural ITD coding resulting from deprivation of binaural experience contributes to poor ITD discrimination by human implantees.
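The signal detection framework referenced above can be illustrated with a minimal sketch: compare spike-count distributions evoked at two ITDs and compute a d′ separability index, so that broader tuning (a shallower rate change per microsecond of ITD) yields a smaller d′ and hence poorer predicted discrimination. All specifics here (the Gaussian tuning-curve shape, Poisson spiking, and every parameter value) are illustrative assumptions, not the model or parameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def dprime(counts_a, counts_b):
    """Signal-detection separability between two spike-count distributions."""
    s = np.sqrt(0.5 * (counts_a.var(ddof=1) + counts_b.var(ddof=1)))
    return 0.0 if s == 0 else abs(counts_a.mean() - counts_b.mean()) / s

def tuning(itd_us, best_itd=200.0, width_us=400.0, peak=20.0, floor=2.0):
    # Hypothetical Gaussian rate-ITD tuning curve (spikes/trial vs ITD in µs);
    # width_us is the illustrative stand-in for tuning breadth.
    return floor + (peak - floor) * np.exp(-0.5 * ((itd_us - best_itd) / width_us) ** 2)

def simulate_counts(itd_us, width_us, n_trials=200):
    # Poisson spike counts around the mean rate at this ITD (an assumption).
    return rng.poisson(tuning(itd_us, width_us=width_us), size=n_trials)

# Compare discriminability of a 100 µs ITD step for narrow vs broad tuning;
# broader tuning mimics the degraded coding seen after binaural deprivation.
for width in (200.0, 800.0):
    d = dprime(simulate_counts(0.0, width), simulate_counts(100.0, width))
    print(f"width {width:.0f} us: d' = {d:.2f}")
```

With these toy parameters the broader tuning curve produces a smaller d′ for the same ITD step, which is the qualitative relationship the abstract's model invokes: shallower rate-ITD functions carry less discriminable information within the natural ITD range.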