Media Contact:
Karla Schuster
University Relations
Hofstra Hall 202
Phone: 516-463-6493/516-633-2088
Fax: 516-463-5146
Date: Nov 06, 2009
Study Co-Authored By Hofstra Professor Finds Words, Gestures Are Decoded by Same Brain Areas
Research May Further Understanding of How Language Evolved
Hofstra University, Hempstead, NY – Your ability to understand Groucho’s punch lines and Harpo’s pantomimes happens in the same parts of your brain, according to new research co-authored by a Hofstra professor and funded by one of the National Institutes of Health, a finding that marks a major step forward in the study of how language evolved.

In a study published in this week’s Early Edition of the Proceedings of the National Academy of Sciences (PNAS), researchers have shown that the regions of the brain responsible for decoding written and spoken words are also involved in interpreting silent gestures. The evidence that both forms of communication are handled by a single, overlapping network in the brain lends credence to the idea that the areas that process language may be a vestige of regions that have existed for millions of years, adapting over time from interpreting gestures to understanding words as well.
“Our study illuminates the functions of yet another part of the still mysterious brain, functions that have eluded us from time immemorial,” said Dr. Patrick Gannon, a physical anthropologist, evolutionary neurobiologist and chairman of Science Education at the Hofstra University School of Medicine in partnership with North Shore-LIJ Health System, which opens in Fall 2011.
“It shows us that it doesn’t matter to the brain what channel our person-to-person communications go out on, or come in from, be it the hands and eyes, or the voice box and ears,” Gannon said. “The brain still processes the information in its language areas so we can readily ‘get the picture,’ so to speak. This also might explain why we can all so easily understand the meaning of everyday gestures like ‘check, please’ used to communicate to a waiter across a crowded restaurant, or ‘shush’ and ‘calm down’ used by teachers in a noisy classroom.”
The study was funded by the National Institute on Deafness and Other Communication Disorders (NIDCD) and conducted in partnership with its scientists and a researcher from San Diego State University.
“In babies, the ability to communicate through gestures precedes spoken language, and you can predict a child’s language skills based on the repertoire of his or her gestures during those early months,” said James F. Battey, M.D., Ph.D., director of the NIDCD. “These findings not only provide compelling evidence regarding where language may have come from, they help explain the interplay that exists between language and gesture as children develop their language skills.”
Besides offering new clues about the evolution of language, researchers say the findings may help in the treatment of aphasia, a disorder that hinders a person’s ability to produce or understand language.
The study aimed to find out whether non-language-related gestures – the hand and body movements that convey meaning on their own, without needing to be translated into words – are processed in the same parts of the brain as language. Two types of gestures were considered: pantomimes, which mimic actions, such as juggling balls, and emblems, which signify abstract concepts, such as holding up one finger to convey “hold on just a minute.”
While inside a functional MRI scanner, 20 healthy, English-speaking volunteers – nine men and 11 women – watched video clips of a person either acting out one of the two types of gestures or speaking the phrases that the gestures represent. As controls, volunteers also watched clips of the person making meaningless gestures or speaking a jumble of pseudowords that would not be interpreted as language. A mirror attached to the head coil enabled each volunteer to watch the video projected on the scanner room wall. Scientists measured brain activity for each stimulus and looked for similarities and differences, as well as any communication between individual parts of the brain.
The result: both the gesture and spoken-language stimuli strongly activated the inferior frontal and posterior temporal areas – the long-recognized language areas of the brain.
Current thinking in the study of language is that, like a smart search engine that places the most suitable Web site at the top of its results, the posterior temporal region of the brain serves as a storehouse of words from which the inferior frontal gyrus selects the most appropriate match. The researchers suggest that, rather than being limited to deciphering words alone, these regions may be able to apply meaning to any incoming symbols, be they words, gestures, images, sounds or objects.
“Our findings also lend support to the notion that these brain regions support a mode of communication that may have been in place at the roots of humankind millions of years ago, and as such, were likely involved with the evolutionary origins of language as we now know it,” Gannon said. “The concept of a gestural origin of spoken language is not new, but it remains a tantalizing account of why gestures were, and still are, a very important part of our everyday life.”
The National Institutes of Health – the nation’s medical research agency – includes 27 institutes and centers and is a component of the U.S. Department of Health and Human Services. It is the primary federal agency for conducting and supporting basic, clinical and translational medical research, and it investigates causes, treatments and cures for both common and rare diseases.
Hofstra University is a dynamic private institution where students can choose from more than 140 undergraduate and 155 graduate programs in the liberal arts and sciences, business, communication, education and allied health services, and honors studies, as well as a School of Law.
###