Tuesday, May 26, 2009

Evidence for separate semantic and syntactic processing in deaf native signers.

Adding to evidence from written and spoken language that nonidentical brain networks support semantic and syntactic processing, Capek et al. now report a similar distinction in deaf native signers:
Studies of written and spoken language suggest that nonidentical brain networks support semantic and syntactic processing. Event-related brain potential (ERP) studies of spoken and written languages show that semantic anomalies elicit a posterior bilateral N400, whereas syntactic anomalies elicit a left anterior negativity, followed by a broadly distributed late positivity. The present study assessed whether these ERP indicators index the activity of language systems specific for the processing of aural-oral language or whether they index neural systems underlying any natural language, including sign language. The syntax of a signed language is mediated through space. Thus, the question arises whether the comprehension of a signed language requires neural systems specific for this kind of code. Deaf native users of American Sign Language (ASL) were presented with signed sentences that were either correct or that contained either a semantic or a syntactic error (one of two types of verb agreement errors). ASL sentences were presented at the natural rate of signing while the electroencephalogram was recorded. As predicted on the basis of earlier studies, an N400 was elicited by semantic violations. In addition, signed syntactic violations elicited an early frontal negativity and a later posterior positivity. Crucially, the distribution of the anterior negativity varied as a function of the type of syntactic violation, suggesting a unique involvement of spatial processing in signed syntax. Together, these findings suggest that biological constraints and experience shape the development of neural systems important for language.
