book excerptise:   a book unexamined is wasted paper

Handbook of psycholinguistics

Matthew J. Traxler and Morton Ann Gernsbacher (eds.)

Elsevier/Academic Press, 2006, 1184 pages

ISBN 0123693748, 9780123693747

topics: |  psychology | language | reference | anthology

Excerpts


06 Speech Perception - Information-Theoretic Framework

 	by Keith R. Kluender, Michael Kiefte, 153-199

Reducing Dimensionality


Speech sounds comprise multiple acoustic attributes, many of which are
redundant: one acoustic attribute serves to predict the occurrence of
another. Through experience, perceptual processes come to absorb these
correlations in a way that increases efficiency. When perceptual systems
encode correlations among attributes, there are two consequences. First,
there is a decrease in sensitivity to differences between two sounds that
share the same pattern of correlation among the same set of attributes.
Second, two sounds with different patterns of correlation become easier to
distinguish.

A PCA analogy:

In Principal Component Analysis, one begins with a correlation matrix of
multiple variables, created to assess the degree to which each variable is
correlated with every other variable across many observations. One can then
compute vectors (weighted combinations of variables) that account for as
much shared variance as possible. A limited number of vectors (few relative
to the number of variables) can account for a high percentage of the total
variance across observations.

However, how the PCA analogy fails is itself illuminating.
First, PCA is a linear analysis, and it is well known that sensory processes
are nonlinear. Also, PCA requires that vectors be ordered from most to least
variance accounted for, and these vectors must be orthogonal (eigenvectors).
Independent Component Analysis (ICA) overcomes some deficiencies of PCA,
such as the assumption of normally distributed values.
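The PCA analogy above can be made concrete with a small sketch. The data here are invented purely for illustration: a few correlated "acoustic attributes" generated from a smaller number of latent dimensions, so that a couple of principal components capture most of the variance.

```python
import numpy as np

# Hypothetical data: 200 "speech sounds" described by 6 acoustic
# attributes, where attributes are redundant because they all derive
# from only 2 underlying dimensions (plus a little noise).
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))           # 2 latent dimensions
mixing = rng.normal(size=(2, 6))             # each attribute mixes both
X = latent @ mixing + 0.1 * rng.normal(size=(200, 6))

# PCA via eigendecomposition of the correlation matrix.
R = np.corrcoef(X, rowvar=False)             # 6x6 correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)         # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # descending

# A few components account for most of the total variance, because
# only 2 latent dimensions generated the 6 attributes.
explained = eigvals / eigvals.sum()
print(explained[:2].sum())
```

Each column of `eigvecs` is one of the "weighted combinations of variables" described above; the fraction of variance each accounts for falls off sharply after the true latent dimensionality is exhausted.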

4.3.2. “Phonemes” as correlations?


Through experience with redundant attributes, a simplified encoding of inputs
as patterns of correlation [emerges]... 
These correlation vectors [may be thought to] correspond to the
putative linguistic units called phonemes (Kluender & Lotto, 1999) 175

5. Sound contrasts in the native language


The profound role of experience is especially clear for speech perception.

thousands of languages in use around the world... most without writing
systems.  Across a survey of only 317 languages, Maddieson (1984) describes
558 different consonants, 260 different vowels, and 51 diphthongs used by
talkers around the world. ... 
acoustic distinctions that are communicatively necessary in one
language should be ignored by speakers of another. Experience plays a
critical role in tuning speech perception to the distributions of sounds
within one’s language environment.  Much, if not most, of this development as
a native listener takes place during the first year of life.  177

5.1. Vowels


But how do infants come to hear speech contrasts in a way that is appropriate
to their native language environment?  ...
clearly audible differences such as gender of talker,
speaking rate, emotional state, and other factors have profound effects on
the acoustic signal, which must be overcome...

At least by the age of 6 months, infants can distinguish stimuli by vowel
type even when different instances of the vowel differ considerably between
presentations (Kuhl, 1983).  In a reinforced head turn paradigm, Kuhl trained
infants to turn their heads only when the vowel of the background stimulus
changed during presentation of the closely related vowels [a] (as in ‘tot’)
and [ɔ] (as in ‘taught’) spoken by a male talker.

When tested on novel vowels produced by women and children (adding random
variation in pitch contour in addition to shifting absolute frequencies of
formants), infants provided the correct response on the first trial,
demonstrating that they recognized the novel instances as consistent with
training vowels despite talker changes. 178

[this would agree with the correlation-based formulation of 4.3. -
correlations are discovered] on the basis of attributes that tend to
co-occur. Attributes such as those accompanying changes in talker are
irrelevant to particular consonants and vowels, so they do not play much role
in phonetic distinctions.

later studies: the degree to which infants treat as equivalent acoustically
different instances of the same vowel is critically dependent on their
experience with a particular language. For example, 6-month-old infants
detect differences between vowel sounds differently depending on whether they
lived in an English-speaking (Seattle) or Swedish-speaking (Stockholm) home
(Kuhl, Williams, Lacerda, Stevens, & Lindblom, 1992).

Birds and Computational Studies

Further evidence for the role of experience can be found in experiments in
which performance by European starlings (Sturnus vulgaris), having learned
statistically-controlled distributions of renditions of Swedish and English
vowels, was highly correlated with performance of adult human listeners
(Kluender, Lotto, Holt, & Bloedel, 1998). 

A simple linear association network model, exposed to the same vowels heard
by the birds, accounted for 95% of the variance in avian
responses. Consistent with the principle that consonants and vowels are
defined mostly by what sounds they are not, both human goodness judgments
(Lively, 1993) and starling response rates illustrate an anisotropy: 
peak responses are skewed away from competing vowel sounds more than
they are defined by centroids of vowel distributions. [?]
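A "simple linear association network" of the kind described can be sketched as follows. The architecture and data here are illustrative assumptions, not the published model: input units code acoustic attributes, output units code vowel categories, and the weights are learned with the delta rule.

```python
import numpy as np

rng = np.random.default_rng(1)
n_inputs, n_vowels = 10, 2
W = np.zeros((n_vowels, n_inputs))

# Invented training data: noisy renditions of two vowel prototypes.
prototypes = rng.normal(size=(n_vowels, n_inputs))
lr = 0.05
for _ in range(500):
    v = rng.integers(n_vowels)
    x = prototypes[v] + 0.3 * rng.normal(size=n_inputs)  # one rendition
    t = np.eye(n_vowels)[v]           # one-hot target category
    y = W @ x                         # linear response
    W += lr * np.outer(t - y, x)      # delta rule weight update

# After training, each output unit responds most strongly to its own
# vowel prototype.
r0 = W @ prototypes[0]
r1 = W @ prototypes[1]
print(r0, r1)
```

Nothing about such a network is specific to speech; the point in the text is that a purely associative learner, exposed to the same statistically controlled vowel distributions, reproduces most of the variance in the birds' responses.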

5.2. Consonants


Werker and her colleagues (Werker, Gilbert, Humphrey, & Tees, 1981; Werker &
Logan, 1985; Werker & Lalonde, 1988; Werker & Tees, 1983; Werker & Tees,
1984a, 1984b) demonstrate that, as a function of experience with consonants
in their native language, infants’ tendency to respond to differences between
some consonants that are not in their language begins to attenuate.  The
series of studies by Werker and Lalonde (1988) permits a relatively complete
description of the phenomenon. They exploited the fact that speakers of
English and Hindi use place of articulation somewhat differently for stop
consonants.
While English uses three places of articulation for voiced stop
consonants: labial, alveolar, and velar (e.g. /b/, /d/, and /g/,
respectively), Hindi uses four: labial, dental, retroflex, and
velar (e.g. /b/, /d/ (dental; dil), /D/, and /g/, respectively). They created
a synthetic series that varied perceptually from /b/ to /d/ (for
native-English speaking adults) and from /b/ to /d/ to /D/ (for native-Hindi
speaking adults).  179

Using the reinforced head turn procedure (as in Kuhl):
* 6- to 8-month-old English infants responded to the [b]-[d] contrast and
  also to the [d]-[D] contrast;
* 11- to 13-month-old English infants responded reliably only to the English
  [b]-[d] contrast, and not to the Hindi [d]-[D] contrast.

Essentially, 6- to 8-month-old infants responded in a manner typical of
native-Hindi adults, while 11- to 13-month-olds responded like native-English
adults, treating both dental and retroflex stops as the same.

 
  [Figure: When hearing dental and retroflex Hindi stops (dil vs. DAmar),
  6- to 8-month-old infants from English-speaking homes respond in a manner
  typical of native-Hindi adults and 11- to 12-month-old Hindi infants.
  Before they are a year old, English infants start treating both consonants
  the same. Figure based on Werker and Lalonde, 1988. p. 180]

Werker and her colleagues have found analogous results in studies using
different consonant contrasts from different languages... For vowels and
consonants, perception of speech is shaped during the first year of life in
ways that respect the statistics of the linguistic environment.

Maturation: development of the larynx


It is impossible for small developing vocal tracts to produce adult-like
sounds of a language, owing to differences in supralaryngeal anatomy and
control (Kent & Miolo, 1995...; Kent et al., 2005).

The infant vocal tract begins as a single tube, not unlike that of a
chimpanzee, which facilitates simultaneous drinking and breathing but
hampers production of many speech sounds. The larynx begins too high and
the vocal tract too short; both undergo drastic restructuring across the
first 6 years...

What is a neotalker to do? Mimicking the speech sounds of adults is not an
option. It may, however, be possible to produce sounds that differ in ways
similar to how adult speech sounds differ. For example, shorter vocal tracts
have resonances at higher frequencies than longer vocal tracts, so center
frequencies of formants are much higher. Children's formants therefore
cannot approximate the same frequencies as adults'.

However, the child's production can preserve acoustic contrasts proportional
to those heard from adult talkers; e.g., the same relative change in formant
frequency can be produced irrespective of vocal tract size. The problem, so
recast, becomes inherently relative.
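The relativity point can be illustrated with the standard quarter-wave-tube idealization of the vocal tract (a simplifying assumption; real tracts are not uniform tubes, and the lengths below are rough figures):

```python
# Resonances of a uniform tube closed at the glottis and open at the
# lips: F_n = (2n - 1) * c / (4 * L).  Halving the tube length doubles
# every formant, but relations *among* formants are unchanged.
C = 35000.0  # approximate speed of sound in warm moist air, cm/s

def formants(length_cm, n=3):
    """First n resonances (Hz) of a uniform quarter-wave tube."""
    return [(2 * k - 1) * C / (4 * length_cm) for k in range(1, n + 1)]

adult = formants(17.0)   # roughly an adult male vocal tract length
child = formants(8.5)    # roughly half the adult length

print(adult)   # absolute frequencies, ~515 Hz upward
print(child)   # twice the adult values across the board

# The ratio F2/F1 is identical for both tract lengths, so a child can
# preserve the *relative* contrasts of adult speech even though the
# absolute frequencies cannot match.
print(adult[1] / adult[0], child[1] / child[0])
```

Under this idealization the formant pattern scales uniformly with tract length, which is exactly why a problem recast in relative terms is solvable for the small talker.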



17: Psycholinguistics Electrified II (1994–2005) p.659

    Marta Kutas, Cyma K. Van Petten, Robert Kluender

Much discussion in psycholinguistic and linguistic literatures on
the extent to which there is a basic distinction between literal and
non-literal language representations and processes. Extremes:

   - Unified view: the dichotomy between literal and figurative thought or
	language is a psychological illusion; single set of
	processes is responsible for the processing of both
   - Differently comprehended: figurative language is unusual and special,
	and engages different comprehension processes

	Katz, A. N., Cacciari, C., Gibbs, R. W., Jr., & Turner,
	M. (1998). Figurative language and thought.  New York: Oxford
	University Press.

Very few electrophysiological investigations of non-literal language
processing, specifically of jokes and metaphors.  early reports that one
subtle communicative deficit in patients with damage to the right hemisphere
is difficulty understanding non-literal language [Brownell 1990] but recent
claim that right- and left-hemisphere patients are more similar than
different. [Gagnon 2003]

  [Brownell, H., Simpson, T., Bihrle, A., & Potter, H. (1990). Appreciation
    of metaphoric alternative word meanings by left and right brain-damaged
    patients. Neuropsychologia, 28, 375–383.
  Gagnon, L., Goulet, P., Giroux, F., & Joanette, Y. (2003). Processing of
    metaphoric and nonmetaphoric alternative meanings of words after right-
    and left-hemisphere lesions. Brain and Language, 87, 217–226.]

5.5.2. Metaphors


current models of metaphor comprehension: mostly assume that
same operations are involved in literal and metaphorical language
comprehension, but that metaphorical language especially taxes certain
operations (see Katz et al., 1998).  

But several sources of behavioral evidence indicate that metaphorical
meanings are sometimes available at the same time as literal meanings and may
even compete with each other. Researchers have examined these issues with
ERPs as equivalent reaction times do not necessarily translate into
equivalent processing demands.

No electrophysiological study has yet offered any strong evidence for a
qualitative difference in the way literal and metaphorical language is
processed. The final words of metaphors typically elicit slightly larger N400
amplitudes than equally unexpected (low cloze) words completing literal
statements. This suggests that people invoke the same operations, but also do
experience more difficulty integrating words with a metaphoric than literal
context.

(from Marquez et al., Methods in Cognitive Linguistics, 2007):

Cloze probability: 
   probability that a given word will be produced in a given context on a
   sentence completion task.  

   The word “month” has a high cloze probability in “The bill was due at the
   end of the –,” a low cloze probability in “The skater had trained for many
   years to achieve this –,” and an intermediate cloze probability in
   “Because it was such an important exam, he studied for an entire –.”
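Operationally, cloze probability is just a proportion over completions collected in a norming study. A minimal sketch, with invented responses for the first example frame above:

```python
from collections import Counter

def cloze_probability(responses, word):
    """Proportion of participants who produced `word` for a frame."""
    counts = Counter(r.lower() for r in responses)
    return counts[word.lower()] / len(responses)

# Hypothetical completions of "The bill was due at the end of the ___."
completions = ["month", "month", "month", "week", "month", "year",
               "month", "month", "month", "month"]

print(cloze_probability(completions, "month"))  # high cloze: 0.8
print(cloze_probability(completions, "year"))   # low cloze: 0.1
```

Matching conditions on cloze probability, as in the ERP studies discussed below, means equating this proportion across sentence types so that N400 differences cannot be attributed to expectancy alone.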

Pynte et al. (1996) established that final words of short metaphoric
sentences elicited larger N400s than categorical statements, despite being
matched on cloze probability. Subsequent experiments showed that the ease of
processing metaphoric statements, like literal statements, could be
modulated by prior context. When presented in isolation, relatively familiar
and unfamiliar metaphors elicited equivalent ERPs (e.g., “Those fighters are
LIONS.” versus “Those apprentices are LIONS.”).

Pynte, J., Besson, M., Robichon, F. -H., & Poli, J. (1996). The time-course
    of metaphor comprehension: An event-related potential study. Brain and
    Language, 55, 293–316.

role of preceding context:
unfamiliar metaphor with a useful context (“They are not cowardly. Those
apprentices are LIONS.”) elicited a smaller N400 than a familiar metaphor
preceded by an irrelevant context (“They are not naïve. Those fighters are
LIONS.”).

The metaphors-in-context were not compared to a literal condition to
determine if the enhanced N400 observed for isolated metaphors disappeared
with appropriate context. However, across the multiple experiments, there was
no hint of distinct processing stages during metaphor comprehension.

Tartter and colleagues raise the possibility that, while processing a
metaphorical expression, comprehenders nonetheless do take note of the
anomalous nature of the expression’s literal meaning (Tartter, Gomes,
Dubrovsky, Molholm, & Stewart, 2002).  They suggest this realization may
underlie the phenomenological sense of satisfaction experienced when
confronting a metaphorical statement. 

They compared the ERPs to final words completing the same sentence frame
either literally, metaphorically, or anomalously (e.g., “The flowers were
watered by nature’s RAIN / TEARS / LAUGHTER”, respectively). Cloze
probabilities were higher for the literal endings than the other two
conditions (both near zero). They argue that if context is used to construct
a meaningful interpretation of a metaphorical expression without any
accompanying appreciation that the expression’s literal meaning is anomalous,
then a metaphorical but literally incongruous ending should not elicit an
N400. This construal of the N400 as an anomaly detector is problematic given
that words that fit but are less expected also elicit sizable N400s; semantic
anomalies are neither necessary nor sufficient to elicit N400s.  

Tartter et al. (2002) obtained a three-way amplitude difference in the peak
latency range of the N400: anomalous > metaphorical > literal; however, the
ERPs to literal completions pulled away from the other two conditions
earlier than the differentiation between metaphoric and anomalous
completions. This pattern of results suggests (to us) that semantically
anomalous sentence endings were more difficult to process (as reflected in
larger and longer N400 congruity effect) than the metaphorical endings,
which were in turn more difficult to fit with the prior context (as
reflected in greater N400 activity) than the literal, congruent endings.

This also suggests that metaphors are initially processed much the same
as semantic anomalies, although they are meaningfully resolved in a shorter
duration. However, this latter conclusion is somewhat complicated by the
difference in cloze probability and frequency between the literal and
metaphoric completions.

Tartter, V. C., Gomes, H., Dubrovsky, B., Molholm, S., & Stewart,
    R. V. (2002). Novel metaphors appear anomalous at least momentarily:
    Evidence from N400. Brain and Language, 80, 488–509.  
Coulson, S., & Van Petten, C. (2002). Conceptual integration and metaphor: An
    event-related potential study. Memory and Cognition, 30, 958–968.
Kazmerski, V. A., Blasko, D. G., & Dessalegn-Banchiamlack, G. (2003). ERP and
    behavioral evidence of individual differences in metaphor
    comprehension. Memory and Cognition, 31, 673–689. 

Coulson and Van Petten (2002) - significant analytic and empirical step -
hypothesize that the same conceptual operations important for understanding
metaphors are often also engaged during the comprehension of literal
statements.  These include establishing mappings and recruiting background
information, or, more specifically, looking for correspondences in attributes
and relations between the target and source domains, setting up the mappings,
aligning them, selecting some, and suppressing others.

By using sentences describing situations where one object was substituted,
mistaken for, or used to represent another (the literal mapping condition,
e.g., “He used cough syrup as an INTOXICANT.”), they created sentences
requiring mappings between two objects and the domains in which they commonly
occur, albeit with less effort than for a metaphor (e.g., “He knows that
power is a strong INTOXICANT.”), but more than for a simple literal statement
with fewer or no mappings (e.g., “He knows that whiskey is a strong
INTOXICANT.”).  ERPs elicited by sentence-final words showed graded N400
activity, with metaphor > literal mapping > literal, although the three
conditions were matched in cloze probability. These data indicate that
although literal and figurative language may engage qualitatively similar
processes, increasing the burdens on mapping and conceptual integration can
make metaphors more difficult to process.

Kazmerski et al. (2003) examined individual differences in metaphor
comprehension, and found that both vocabulary and working memory capacity
were important factors as individuals determined whether a metaphoric
statement was literally untrue (as compared to false statements without
metaphoric interpretations, e.g., “The beaver is a LUMBERJACK.”  versus “The
rumor was a LUMBERJACK.”).

High IQ participants showed greater interference, presumably because the
figurative meaning was extracted without voluntary effort (Kazmerski, Blasko,
& Dessalegn-Banchiamlack, 2003). Lower IQ participants had equivalent N400s
for the metaphoric and anomalous statements, suggesting that they had no
additional trouble rejecting metaphorical sentences as untrue. Thus, although
individuals with lower IQs clearly understood the metaphors in an off-line
task, the on-line evidence provided by the ERP seems to indicate that
metaphorical processing is not always obligatory or automatic.


Contents

    Preface, Matthew J. Traxler					vii-viii
01: Observations on the Past and Future of Psycholinguistics 		  1-18
	Alan Garnham, Simon Garrod, Anthony Sanford

Section 1: Language Production

02: Properties of Spoken Language Production 				 21-59
	Zenzi M. Griffin, Victor S. Ferreira
03: Syntax and Production 						 61-91
	Fernanda Ferreira, Paul E. Engelhardt
04: Speech Disorders 							 93-124
	Gary Weismer
05: Functional Neuroimaging of Speech Production 			125-150
	Thomas A. Zeffiro, Jennifer L. Frymiare

Section 2: Language Comprehension

06: Speech Perception within a Biologically Realistic
	    Information-Theoretic Framework 				153-199
	Keith R. Kluender, Michael Kiefte
07: The Perception of Speech 						201-248
	Jennifer S. Pardo, Robert E. Remez
08: Spoken Word Recognition 						249-283
	Delphine Dahan, James S. Magnuson
09: Visual Word Recognition: The Journey from Features to Meaning
	     (A Travel Update) 						285-375
	David A. Balota, Melvin J. Yap, Michael J. Cortese
10: Lexical Processing and Sentence Context Effects 			377-401
	Robin K. Morris
11: Semantic Memory 							403-453
	Beth A. Ober, Gregory K. Shenaut
12: Syntactic Parsing 						455-503
	Martin J. Pickering, Roger P. G. van Gompel
13: Prosody 								505-537
	Shari Speer, Allison Blodgett
14: The Syntax-Semantics Interface: On-Line Composition of Sentence
	    Meaning 							539-579
	Liina Pylkkänen, Brian McElree
15: Constraint Satisfaction Accounts of Lexical and Sentence
	     Comprehension 						581-611
	Maryellen C. MacDonald, Mark S. Seidenberg
16: Eye-Movement Control in Reading 					613-657
	Keith Rayner, Alexander Pollatsek
17: Psycholinguistics Electrified II (1994–2005) 			659-724
	Marta Kutas, Cyma K. Van Petten, Robert Kluender
18: Discourse Comprehension 						725-764
	Rolf A. Zwaan, David N. Rapp
19: Neuroimaging Contributions to the Understanding of Discourse
	    Processes 							765-799
	Robert A. Mason, Marcel Adam Just
20: Comprehension Ability in Mature Readers 				801-833
	Debra L. Long, Clinton L. Johns, Phillip E. Morris
21: Figurative Language 						835-861
	Raymond W. Gibbs Jr., Herbert L. Colston
22: Eye Movements and Spoken Language Comprehension 			863-900
	Michael K. Tanenhaus, John C. Trueswell
23: Perspective Taking and the Coordination of Meaning in Language Use	 901-938
	Dale J. Barr, Boaz Keysar
24: Comprehension Disorders in Aphasia: The Case of Sentences that
	    Require Syntactic Analysis 					939-966
	David Caplan, Gloria Waters
25: Language Processing in Bilingual Speakers 			967-999
	Ana I. Schwartz, Judith F. Kroll
26: Psycholinguistic and Neurolinguistic Perspectives on Sign
	    Languages 							1001-1024
	David P. Corina, Heather P. Knapp

Section 3: Language Development

27: Language Learning in Infancy 					1027-1071
	Anne Fernald, Virginia A. Marchman
28: Acquisition of Syntax and Semantics 				1073-1110
	Stephen Crain, Rosalind Thornton
29: Learning to Read 							1111-1142
	Richard K. Wagner, Joseph K. Torgesen, Shayne B. Piasta
30: Cognitive and Linguistic Issues in the Study of Children with
	    Specific Language Impairment 				1143-1171
	Laurence B. Leonard, Patricia Deevy
    Subject Index 1173-1184


amitabha mukerjee (mukerjee [at-symbol] gmail) 2011 Nov 17