Dr. Evguenia Malaia, University of Texas at Arlington – Sign Language and the Brain

In today’s Academic Minute, Dr. Evie Malaia of the University of Texas at Arlington explains what the features of American Sign Language reveal about how the brain processes language.

Evie Malaia is an assistant professor in the Center for Mind, Brain, and Education at the University of Texas at Arlington. Her work utilizes EEG and fMRI techniques to investigate the neural basis for language processing and the effect of linguistic experience on visual processing, memory, and executive control. She holds a Ph.D. from Purdue University.

About Dr. Malaia

Dr. Evguenia Malaia – Sign Language and the Brain

Languages are complex systems with multiple interrelated levels, and we still do not quite understand how humans developed them. Research on how the brain processes sign languages – visual languages – has shown that there is a relationship between everyday visual processing of moving objects and the linguistic structures of visual languages, suggesting a path from the mental structuring of visual scenes to language development.
    

In most human languages, two categories of verbs are present: those describing a change (buy, arrive), and those describing a homogeneous activity (run, read). In ASL, these two kinds of verbs can be distinguished not only on the basis of their meaning, but also on the basis of their form. When a signer uses a verb describing change, the motion of the hand rapidly decelerates at the end of the sign. Signs describing continuous actions do not have such drastic speed changes in hand motion. From research on human perception we know that all humans use the speed and deceleration of objects to understand what is happening in their surroundings. The use of similar visual features in everyday perception and in sign language comprehension shows that there are general perceptual abilities that languages build upon.

My colleagues and I conducted a neuroimaging study to understand what information signers extract from the rapid change in hand speed at the end of the sign. The verb-signs with high deceleration at the end – those describing changes – activated the precuneus in Deaf signers. This region is known for processing abstract scripts of events. This meant that the Deaf participants were using the rapid deceleration of hand motion in signs describing change to trigger access to long-term memory.
    
The finding that human perceptual ability contributes to the creation of linguistic categories is exciting, because it applies to all languages, both spoken and signed, since spoken languages conceptually distinguish between the same two verb types. Our neuroimaging study uncovered the neural connection between motion recognition, language, and memory access in the human brain.
    
Production support for the Academic Minute comes from Newman’s Own, giving all profits to charity and pursuing the common good for over 30 years, and from Mount Holyoke College.
