By Kevin Chan
As we are about to transition from mainly studying the visual sense in our perception class to studying the auditory sense, I thought of writing about a topic that involves both audio and vision. Because of that, I chose a study that is, in a way, a transition as well from the visual to the auditory. In a nutshell, the study tested whether adding visual information about articulatory gestures (such as lip movements) could enhance the perception of speech sounds.
To start off, the brain integrates two sources of information for speech comprehension: information from vision (lip movements) and from audition (linguistic sounds). The question, then, is this: can audiovisual integration facilitate the perception of a second language?
The methodology was simple. There was an audio-only trial, a video-only trial, and an audiovisual trial. All participants had been exposed to either Spanish or Catalan as their second language, and the stimuli were simple Spanish and Catalan phonemes. Each trial consisted of the presentation of one disyllabic stimulus for a duration of 800 ms. The task was for the participants to press, as fast (and accurately, of course) as possible, the button for the correct syllable of the stimulus.
The results of the study indicate that the addition of visual information about the speakers' gestures (the pictures of the lips moving) enhanced the ability to discriminate sounds in a second language. This actually contrasts with previous studies, which reported an improvement in overall comprehension based on audiovisual inputs. Therefore, integrating visual gestures with auditory information can produce a specific improvement in phonological processing.
A sound suggestion would be to test this study cross-culturally. In the said study, the languages used were Spanish and Catalan. How applicable would this be to other languages?
For example, let's look at Chinese, a language very close to my heart. Chinese has tones: the pitch contour of a syllable can change the meaning of a word. Two different words can be "spelled out" (although spelling in Chinese works differently) in completely the same way, yet mean different things because of their tones. "Mai" can mean both buy and sell depending on the tone: mài (with a falling tone) means sell, while mǎi (said as if you are asking a question) means buy. The question now is whether visuals would be able to enhance this, given that Chinese relies so strongly on audition. If you watch a Chinese person say "mai" with either tone, the lip movements would probably look very much alike. How then is this study applicable to that?
It is funny, because even after taking a course in psycholinguistics (Psychology 145), I did not know this. I did not think that visuals such as these had a profound effect on comprehension. A whole chapter on this could be included in the textbook that we used for the course.
I think a great application of this study is for those who are hearing impaired. Since we now know that visual speech information such as articulatory gestures can greatly enhance the comprehension of spoken messages (which fits with the motor theory of speech perception), we could somehow devise a system that focuses on a person's mouth (I'm just thinking out loud). For example, in the news, there could be a window in the lower part of the screen that zooms in on the mouth of the reporter. By doing so, people who are hearing impaired can look at the mouth, which will enhance their comprehension.
Also, this could serve people who do not have a hearing disability as well. Companies that make instructional materials for learning languages (such as Rosetta Stone) could apply the implications of the said study. They should move away from materials that are purely audio (such as learning CDs) and focus on materials that are audiovisual in nature. Similarly, they could include a small window zoomed in on the mouth of the main speaker in their instructional audiovisual materials.
Furthermore, for those who are trying to learn a new language, it may be a good idea to look at the mouths of people who speak that particular language. This study is actually perfect for me, since I am currently taking Spanish 10 this semester. That means I should look at my Spanish professor's mouth while she talks, for it might actually make me speak better Spanish! ¡Voy a intentarlo! (I will try that!)
SOURCE:
Navarra, J., & Soto-Faraco, S. (2007). Hearing lips in a second language: Visual articulatory information enables the perception of second language sounds. Psychological Research, 71, 4-12.