Effects of language experience on the perception of American Sign Language
Introduction
Deaf individuals, unlike hearing individuals, vary considerably in the age of exposure to their first language. Auditory deprivation prevents exposure to a spoken language from birth, and a variety of social and demographic patterns prevent exposure to a signed language from birth for more than 90% of deaf individuals. Children who are exposed to a signed language from birth reach all of the language development milestones at ages similar to those of children acquiring spoken languages (Bonvillian and Folven, 1993; Newport and Meier, 1985; Petitto and Marentette, 1991). As adults, these individuals perform significantly better than deaf adults exposed to signed languages at later ages across a variety of language tasks. There is wide-ranging evidence that variability in the age of exposure to a signed language is related to language performance in adulthood (Boudreault and Mayberry, 2006; Emmorey et al., 1995; Emmorey and Corina, 1990; Mayberry, 1993; Mayberry and Fischer, 1989; Morford, 2003; Newport, 1990). What is not yet clear is whether age of exposure to a signed language affects all aspects of language acquisition and processing, or whether poor performance on a range of language tasks reflects difficulties with specific processes only. Mayberry (1994:84) hypothesized that disruptions in phonological processing may have cascading effects upon subsequent stages of sign language comprehension. This study takes a first step toward investigating this hypothesis by isolating sign perception from other components of sign language processing.
American Sign Language (ASL) is the primary language used in the Deaf communities of the United States and parts of Canada. It has a lexicon and grammar that are distinct from the spoken languages in use in the same communities (e.g., English). ASL is also distinct from other signed languages (e.g., British Sign Language). Signed languages exhibit a level of structure that has been analyzed within the framework of phonological theory. Signed language phonologists investigate the contrastive form units that combine to create lexical and grammatical units in signed languages (Brentari, 1998; Sandler, 1987). Research on signed languages is recent enough that there is still controversy about the minimal units of the language. However, the first widely known phonological analysis, by William Stokoe (Stokoe, 1960; Stokoe et al., 1965), identified a set of phonological parameters that are still core components of phonological descriptions of signed languages: handshape, place of articulation (POA), and movement. Two of these parameters, handshape and POA, were investigated in this study.
Section snippets
A role for language experience in perception?
In spoken language processing, language experience influences the perception of some phoneme contrasts more than others. Stop consonant contrasts that differ only in voice onset time (e.g., /p/–/b/) appear to be fairly impervious to language experience. Human infants as well as non-human animals show a striking similarity to human adults in their greater sensitivity to acoustic variation in the perceptual region of stop consonant boundaries than to acoustic variation within these phoneme categories…
Participants
Thirteen deaf native ASL signers, thirteen deaf non-native signers of ASL, and thirteen hearing second language (L2) signers of ASL participated in the experiment. The data from four deaf non-native signers were excluded from the analysis because they performed at chance on the discrimination task described below. Since participants…
Identification task
The first analysis addresses the nature of the categories that emerged through the identification task. For every handshape and POA contrast and for all groups, participants consistently switched labels at a similar point along the stimulus continuum. The crossover from one category to another was well defined, with one or at most two stimuli that were not assigned to a single category 75% of the time or more. The location of the crossover was not at the same point in the stimulus continuum for…
Discussion
Sign language experience clearly affects the perception of handshape in ASL, but not as we had predicted at the outset of this study. All of the participants, regardless of language background, revealed discontinuities in their ability to discriminate between phonetic variants of handshape primes. Discrimination of handshape stimuli was poorest in regions close to a category prototype, and better for more peripheral phonetic variants as well as for actual phoneme contrasts. The degree to which…
Acknowledgements
We thank the participants in our research, as well as Sarah Hafer for help in collecting data, and Caroline Smith and several anonymous reviewers for very helpful comments on previous versions of the manuscript. Portions of this study were presented at the 2005 Meeting of the Linguistic Society of America in Oakland, CA. This research was supported by NIH Grant R03 DC03865 to Jill P. Morford, and by the National Science Foundation under Grant No. SBE-0541953 awarded to Thomas Allen to establish the NSF Science of Learning Center on Visual Language and Visual Learning (VL2).
References (53)
- et al. Effects of phonological and phonetic factors on cross-language perception of approximants. Journal of Phonetics (1992)
- et al. A perceptual interference account of acquisition difficulties for non-native phonemes. Cognition (2003)
- et al. Preliminaries to a distinctive feature analysis of American Sign Language. Cognitive Psychology (1976)
- et al. The long-lasting advantage of learning sign language in childhood: Another look at the critical period for language acquisition. Journal of Memory and Language (1991)
- et al. Infant sensitivity to distributional information can affect phonetic discrimination. Cognition (2002)
- et al. Infants are sensitive to within-category variation in speech perception. Cognition (2005)
- Maturational constraints on language learning. Cognitive Science (1990)
- et al. When learners surpass their models: The acquisition of American Sign Language from inconsistent input. Cognitive Psychology (2004)
- et al. Cross-language speech perception: Evidence for perceptual reorganization during the first year of life. Infant Behavior and Development (1984)
- et al. Contextual influences on the internal structure of phonetic categories: A distinction between lexical status and speaking rate. Perception & Psychophysics (2001)
- New insights into old puzzles from infants’ categorical discrimination of soundless phonetic units. Language Learning and Development
- The perception of handshapes in American Sign Language. Memory & Cognition
- Perception of VOT and first formant onset by Spanish and English speakers
- Infant perception of non-native consonant contrasts that adults assimilate in different ways. Language and Speech
- The transition from nonreferential to referential language in children acquiring American Sign Language. Developmental Psychology
- Grammatical processing in American Sign Language: Age of first language acquisition effects in relation to syntactic structure. Language and Cognitive Processes
- A prosodic model of sign language phonology
- The acquisition of a new phonological contrast: The case of stop consonants in French-English bilinguals. Journal of the Acoustical Society of America
- PowerLaboratory for Macintosh
- Linguistic experience and phonemic perception in infancy: A crosslinguistic study. Child Development
- Speech perception in infants. Science
- Effects of age of acquisition on grammatical sensitivity: Evidence from on-line and off-line tasks. Applied Psycholinguistics
- Lexical recognition in sign language: Effects of phonetic structure and morphology. Perceptual and Motor Skills
- Categorical perception in American Sign Language. Language and Cognitive Processes
Cited by (44)
The role of the superior parietal lobule in lexical processing of sign language: Insights from fMRI and TMS
2021, Cortex. Citation Excerpt: Similarly, in a previous study by Mayberry et al. (2011) on early and late deaf signers, a positive relationship between the age of onset of sign language acquisition and the level of activation in the occipital cortex was found. With the support of previous behavioral data (e.g., Mayberry & Fischer, 1989; Morford et al., 2008), both the Twomey et al. (2020) and the Mayberry et al. (2011) studies suggest shallower language processing and hypersensitivity to the perceptual properties of signs in late learners. Our results suggest that these greater demands occur not only in the occipital cortex, but also extend to left SPL.
Age of acquisition effects differ across linguistic domains in sign language: EEG evidence
2020, Brain and Language. Citation Excerpt: These findings are parallel to the existing literature on the interaction between the age of sign language acquisition and processing at the interface of phonology and lexicon. For instance, Deaf late L1 learners have been shown to be more sensitive to the visual properties of signs, as compared with native Deaf signers and hearing L2 signers (Best, Mathur, Miranda, & Lillo-Martin, 2010; see also Morford, Grieve-Smith, MacFarlane, Staley, & Waters, 2008 for similar results). Phonological processing appears to be less automatized in late learners, such that they focus on fine-grained phonetic properties of signs, which are ignored by persons acquiring an L1 in infancy.
Language modality shapes the dynamics of word and sign recognition
2019, Cognition. Citation Excerpt: In that case, we expect fewer and/or later fixations to one or both competitors for L2 learners of LSE, reflecting greater processing costs, compared to native signers. Since L2 signers struggle with handshape (Morford, Grieve-Smith, MacFarlane, Staley, & Waters, 2008; Ortega & Morgan, 2015), it is reasonable to expect that processing of this parameter is especially affected. This would be supported by significant group differences on the intercept (fewer fixations) and/or linear term (later fixations) for either competitor, but especially handshape.
The impact of input quality on early sign development in native and non-native language learners
2016, Journal of Child Language