Distributed auditory processing is compatible
with the information conveyed by the speech signal
Sarah Hawkins and Ingrid Johnsrude
Phonetics Laboratory, Department of Linguistics, University of Cambridge
sh110@cam.ac.uk   ingrid.johnsrude@queensu.ca
Supported by the Leverhulme Trust, CIHR, NSERC and the Canada Research Chair Programme
The information conveyed by the speech signal is consistent with a distributed, interactive organization of auditory processing.
  • Problem: Models of speech perception often emphasize phonetic or phonological categories (features, phonemes, gestures) that
  • are stable, abstract entities
  • result from stripping irrelevant variation from
    speech
  • are prerequisite to the processing of other
    aspects of speech (grammar and meaning).
  • Neural functional organization account of speech
    perception
  • Distributed
  • Speech comprehension probably relies on multiple
    cortical networks that operate in parallel
  • This functional organization may map onto anatomically segregated processing streams similar to those identified in macaque monkeys.21
  • Functional neuroimaging results are consistent
    with multiple, parallel, cascaded auditory
    streams of processing.22,23,26-28
  • The subcortical auditory system is also highly
    parallel.
  • Neurophysiological studies suggest that information in even core auditory cortex regions is integrated over many time domains
  • This suggests that multiple representations of the input, at many temporal grains, are simultaneously available for processing by higher centres.32,33 (See the illustrative sketch after this list.)
  • Interactive
  • Information flow in the auditory system is not unidirectional. Each cortical feedforward connection has its feedback complement.29-31
  • Anatomy suggests converging influences from
    multiple higher stages of perception on lower
    stages.21
  • Auditory cortical responsivity is context dependent and plastic, and probably driven by feedback from higher-order areas.
  • Behaviourally relevant stimuli produce rapid
    changes in the response characteristics of single
    units in ferret primary auditory cortex.39
  • Humans demonstrate greater fMRI activity to
    learned phonological contrasts in auditory belt
    or parabelt.40
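As a loose, purely illustrative analogy for the point about temporal grains above (not part of the original poster, and not a claim about cortical computation): the minimal Python sketch below derives several short-time spectral representations of the same signal using different analysis windows, so that fine- and coarse-grained descriptions of one input coexist. The function name, window sizes and toy signal are all assumptions.

```python
# Illustrative sketch only: short-time analyses of one signal at several window
# lengths, as an analogy for "multiple representations at many temporal grains".
import numpy as np

def stft_magnitude(signal, sample_rate, window_ms, hop_ms=5.0):
    """Magnitude short-time Fourier transform with a Hann window of window_ms."""
    win_len = max(1, int(sample_rate * window_ms / 1000.0))
    hop = max(1, int(sample_rate * hop_ms / 1000.0))
    window = np.hanning(win_len)
    frames = [
        np.abs(np.fft.rfft(signal[start:start + win_len] * window))
        for start in range(0, len(signal) - win_len + 1, hop)
    ]
    return np.array(frames)  # shape: (num_frames, num_frequency_bins)

if __name__ == "__main__":
    fs = 16000                                # assumed sample rate (Hz)
    t = np.arange(0, 0.5, 1.0 / fs)
    toy_signal = np.sin(2 * np.pi * 200 * t)  # stand-in for a speech signal
    # Fine temporal grain (short window) and coarse temporal grain (long window)
    # are both computed from the same input and are simultaneously available.
    for window_ms in (5.0, 25.0, 200.0):
        rep = stft_magnitude(toy_signal, fs, window_ms)
        print(f"{window_ms:>6.1f} ms window -> representation shape {rep.shape}")
```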

Examples: Grammatical information conveyed by systematic acoustic variation
  • a. About function vs content words: the pronoun + be system
  • Speakers show different assimilation in function and content words, e.g. /m/ assimilates to the place of the next consonant in I'm but not in lime or crime.11
  • I'm blowing / going / watching → [aɪm] / [aɪŋ] / [aɪw]; lime bark, lime goes, crime wave → [aɪm]
  • The acoustic pattern may be used by the listener to inform about the grammatical class of the speech segment being perceived. In its place in an utterance, I'm has few or no acoustic competitors.
  • b. About morphemes: see the /mɪst/ example below.
  • Concluding remarks
  • Fine phonetic detail simultaneously informs about
    perceptual units at multiple linguistic levels
    and thus over different time domains.12,13,35
    Compatible with multiple, parallel,
    hierarchically organized anatomical pathways.
  • Each linguistic category is relational and plastic: each is bound with other elements (larger, smaller), and no element can be described independently of its prosodic, grammatical, and functional context. Supported by interactive organization of stages of auditory processing.
  • Rapid perceptual tuning is manifest in many ways,
    e.g., flexible phonemic category boundaries and
    perceptual learning of degraded speech.1-9 May
    rely on feedback from higher areas to early
    auditory regions.
  • Models of visual object perception posit Bayesian integration of image features and knowledge to determine the most probable interpretation of the current input.38,41-44 A similar mechanism may act to combine information from multiple sources (top down and bottom up) to constrain speech perception (a schematic formulation is sketched after this list).
  • The framework outlined here
  • requires that no unit of speech perception is
    primary
  • emphasizes the knowledge-driven nature of
    perception
  • coherently integrates behavioural,
    neurobiological and linguistic data and theory
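A minimal sketch of the Bayesian integration mentioned above (illustrative only, not taken from the poster; the symbols are assumptions): let x denote the acoustic input and w a candidate linguistic interpretation, with knowledge entering as the prior P(w) and sensory evidence as the likelihood P(x | w). Then

\[ P(w \mid x) \;\propto\; P(x \mid w)\,P(w), \qquad \hat{w} = \arg\max_{w} P(x \mid w)\,P(w) \]

so "top down" knowledge and "bottom up" acoustic detail jointly determine the most probable interpretation of the current input.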

BUT this over-emphasizes phonology, and early loss of detail seems unlikely
  • Phonemic category boundaries are context dependent, and thus plastic
  • Boundaries shift with phonetic context, stimulus distribution and range, and meaning (lexical status, sensible world); shifts are fast.1-9
  • Much phonetic variability is systematic and
    informs about properties other than phonemic
    categories
  • Realization of phonemes is systematically influenced by:10,11
  • 1) allophonic variation
  • position in the syllable (e.g. tip vs pit)
  • boundaries between words (e.g. grey train vs great rain)
  • grammatical status (e.g. productivity of a morpheme; content vs function words)
  • 2) speaker intent and register (discourse function, casualness, rate)
  • 3) talker identity
  • Experiments show listeners use much of this variability.12-17

Some of the perceptual information available from /mɪst/ in mistimes and mistakes
Perceptual information available in the short sound section "mist", from "Tess mistimes it" and "Tess mistakes it". Information about featural, phonemic and lexical identity, and syllabic, morphemic and grammatical structure, is conveyed simultaneously in the fine phonetic detail: events at segment boundaries and longer-term relationships. Prior knowledge is required for linguistic information at all levels to be extracted from sensory input. No unit is identifiable independent of context, and no unit/level is primary. Information is mapped onto prosodic structures linked to grammatical structures.12,13,34 Examples of structures are available at http://kiri.ling.cam.ac.uk/sarah/docs/CNS06trees.pdf
Bold-font nodes in the linguistic structure mark potential perceptual units.
For references, see separate handout or email us