McGurk effect

The McGurk effect is a perceptual phenomenon that demonstrates an interaction between hearing and vision in speech perception. The illusion occurs when the auditory component of one sound is paired with the visual component of another sound, leading to the perception of a third sound.[1] The visual information a person gets from seeing a person speak changes the way they hear the sound.[2][3] If a person is getting poor-quality auditory information but good-quality visual information, they may be more likely to experience the McGurk effect.[4] Integration abilities for audio and visual information may also influence whether a person will experience the effect; people who are better at sensory integration have been shown to be more susceptible to it.[2] People are affected differently by the McGurk effect depending on many factors, including brain damage and other disorders.

Background

The effect was first described in a 1976 paper by Harry McGurk and John MacDonald, "Hearing Lips and Seeing Voices", published in Nature on 23 December 1976.[5] It was discovered by accident when McGurk and his research assistant, MacDonald, asked a technician to dub a video with a different phoneme from the one spoken while they were conducting a study on how infants perceive language at different developmental stages. When the video was played back, both researchers heard a third phoneme rather than the one spoken or mouthed in the video.[6]

This effect may be experienced when a video of one phoneme's production is dubbed with a sound-recording of a different phoneme being spoken. Often, the perceived phoneme is a third, intermediate phoneme. As an example, the syllables /ba-ba/ are spoken over the lip movements of /ga-ga/, and the perception is of /da-da/. McGurk and MacDonald originally believed that this resulted from the common phonetic and visual properties of /b/ and /g/.[7] Two types of illusion in response to incongruent audiovisual stimuli have been observed: fusions ('ba' auditory and 'ga' visual produce 'da') and combinations ('ga' auditory and 'ba' visual produce 'bga').[8] This is the brain's effort to provide the consciousness with its best guess about the incoming information.[9] The information coming from the eyes and ears is contradictory, and in this instance, the eyes (visual information) have had a greater effect on the brain, and thus the fusion and combination responses have been created.[9]
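A stimulus of this kind can be reproduced by muxing an audio recording of one syllable onto a video of a different syllable being articulated, leaving the picture untouched. Below is a minimal sketch, assuming ffmpeg is installed; the file names (ga_video.mp4, ba_audio.wav, mcgurk_dada.mp4) are hypothetical placeholders for illustration, not materials from the original study.

```python
import subprocess

def make_mcgurk_stimulus(video_path: str, audio_path: str, out_path: str) -> None:
    """Replace the audio track of a talking-face video with a different recording.

    Dubbing an auditory /ba-ba/ recording onto a visual /ga-ga/ clip yields a
    classic "fusion" stimulus, typically heard as /da-da/.
    """
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", video_path,   # visual track: speaker mouthing /ga-ga/
            "-i", audio_path,   # auditory track: recording of /ba-ba/
            "-map", "0:v:0",    # keep the video stream from the first input
            "-map", "1:a:0",    # take the audio stream from the second input
            "-c:v", "copy",     # leave the video frames untouched
            "-shortest",        # end at the shorter of the two streams
            out_path,
        ],
        check=True,
    )

# Hypothetical file names, for illustration only.
make_mcgurk_stimulus("ga_video.mp4", "ba_audio.wav", "mcgurk_dada.mp4")
```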

Vision is the primary sense for humans,[2] but speech perception is multimodal, meaning that it involves information from more than one sensory modality, in particular audition and vision. The McGurk effect arises during phonetic processing because the integration of audio and visual information happens early in speech perception.[7] The McGurk effect is very robust; that is, knowledge about it seems to have little effect on one's perception of it. This is different from certain optical illusions, which break down once one "sees through" them. Some people, including those who have been researching the phenomenon for more than twenty years, experience the effect even when they are aware that it is taking place.[8][10] With the exception of people who can identify most of what is being said from lip reading alone, most people are quite limited in their ability to identify speech from visual-only signals.[2] A more extensive phenomenon is the ability of visual speech to increase the intelligibility of heard speech in a noisy environment.[2] Visible speech can also alter the perception of perfectly audible speech sounds when the visual speech stimuli are mismatched with the auditory speech.[2] Speech perception is normally thought of as an auditory process;[2] however, our use of sensory information is immediate, automatic, and to a large degree unconscious,[10] so, despite what is widely accepted, speech is not only something we hear.[10] Speech is perceived by all of the senses working together (seeing, touching, and listening to a face move).[10] The brain is often unaware of the separate sensory contributions of what it perceives,[10] and therefore, when it comes to recognizing speech, it cannot differentiate whether it is seeing or hearing the incoming information.[10]

The McGurk effect has also been examined in relation to witness testimony. Wareham and Wright's 2005 study showed that inconsistent visual information can change the perception of spoken utterances, suggesting that the McGurk effect may influence everyday perception. Not limited to syllables, the effect can occur in whole words[7][11] and can affect daily interactions without people being aware of it. Research in this area can address theoretical questions and also has therapeutic and diagnostic relevance for those with disorders involving the audiovisual integration of speech cues.[12]

Factors

Internal

Brain damage

Both hemispheres of the brain contribute to the McGurk effect.[13] They work together to integrate speech information that is received through the auditory and visual senses. A McGurk response is more likely to occur in right-handed individuals, for whom the face has privileged access to the right hemisphere and words to the left hemisphere.[13] In people who have had callosotomies, the McGurk effect is still present but significantly slower.[13] In people with lesions to the left hemisphere of the brain, visual features often play a critical role in speech and language therapy.[12] People with lesions in the left hemisphere of the brain show a greater McGurk effect than normal controls;[12] visual information strongly influences speech perception in these people.[12] There is a lack of susceptibility to the McGurk illusion if left hemisphere damage has resulted in a deficit in visual segmental speech perception.[14] People with right hemisphere damage show impairment on both visual-only and audio-visual integration tasks, although they are still able to integrate the information to produce a McGurk effect.[14] Integration appears only if visual stimuli are used to improve performance when the auditory signal is impoverished but audible.[14] A McGurk effect is therefore exhibited in people with damage to the right hemisphere of the brain, but it is not as strong as in a normal group.

Dyslexia

Dyslexic individuals exhibit a smaller McGurk effect than normal readers of the same chronological age, but the same effect as readers matched for reading level.[15] Dyslexics differ particularly for combination responses, not fusion responses.[15] The smaller McGurk effect may be due to the difficulties dyslexics have in perceiving and producing consonant clusters.[15]

Specific language impairment

Children with specific language impairment show a significantly weaker McGurk effect than the average child.[16] They use less visual information in speech perception, or have reduced attention to articulatory gestures, but have no trouble perceiving auditory-only cues.[16]

Autism spectrum disorders

Children with autism spectrum disorders (ASD) show a significantly reduced McGurk effect compared with children without ASD.[17] However, if the stimulus is nonhuman (for example, a tennis ball bouncing to the sound of a bouncing beach ball), they score similarly to children without ASD.[17] Younger children with ASD show a greatly reduced McGurk effect; however, this difference diminishes with age, and as individuals grow up the effect they show becomes closer to that of those without ASD.[18] It has been suggested that the weakened McGurk effect seen in people with ASD is due to deficits in identifying the auditory and visual components of speech rather than in integrating them (although distinguishing speech components as speech components may be isomorphic to integrating them).[19]

Language-learning disabilities

Adults with language-learning disabilities exhibit a much smaller McGurk effect than other adults.[20] These people are not as influenced by visual input as most people,[20] so people with poor language skills tend to produce a smaller McGurk effect. One explanation for the smaller effect in this population is that there may be uncoupled activity between anterior and posterior regions of the brain, or between the left and right hemispheres.[20] A cerebellar or basal ganglia etiology is also possible.

Alzheimer’s disease

Patients with Alzheimer's disease (AD) exhibit a smaller McGurk effect than those without.[21] A reduced size of the corpus callosum, producing a hemispheric disconnection, is often involved.[21] The visual stimulus has less influence in patients with AD, which is one reason for the weakened McGurk effect.[21]

Schizophrenia

The McGurk effect is less pronounced in individuals with schizophrenia than in non-schizophrenic individuals, although the difference is not significant in adults.[22] Schizophrenia slows the development of audiovisual integration and prevents it from reaching its developmental peak, although no degradation is observed.[22] People with schizophrenia are more likely to rely on auditory cues than on visual cues in speech perception.[22]

Aphasia

People with aphasia show impaired perception of speech in all conditions (visual-only, auditory-only, and audio-visual) and therefore exhibit a small McGurk effect.[23] The greatest difficulty for aphasics is in the visual-only condition, showing that they rely more on auditory stimuli in speech perception.[23]

Bipolar disorder

The limited evidence available to date shows no apparent difference between individuals with bipolar disorder and those without with respect to the McGurk effect. However, preliminary data suggest that people with bipolar disorder are much poorer at lipreading than healthy individuals. This may suggest that the neural pathways formed and activated in the integration of auditory and visual speech information differ in individuals with bipolar disorder compared with people without any mental disorder.[24]

External

Cross-dubbing

Discrepancy in vowel category significantly reduced the magnitude of the McGurk effect for fusion responses.[25] Auditory /a/ tokens dubbed onto visual /i/ articulations were more compatible than the reverse.[25] This could be because /a/ has a wide range of articulatory configurations whereas /i/ is more limited,[25] which makes it much easier for subjects to detect discrepancies in the stimuli.[25] /i/ vowel contexts produce the strongest effect, while /a/ produces a moderate effect, and /u/ has almost no effect.[26]

Mouth visibility

The McGurk effect is stronger when the right side of the speaker's mouth (on the viewer's left) is visible.[27] People tend to get more visual information from the right side of a speaker's mouth than from the left side or even the whole mouth.[27] This relates to the hemispheric attention factors discussed in the brain damage section above.

Visual distractors

The McGurk effect is weaker when there is a visual distractor present that the listener is attending to.[28] Visual attention modulates audiovisual speech perception.[28] Another form of distraction is movement of the speaker. A stronger McGurk effect is elicited if the speaker's face/head is motionless, rather than moving.[29]

Syllable structure

A strong McGurk effect can be seen for click-vowel syllables compared to weak effects for isolated clicks.[30] This shows that the McGurk effect can happen in a non-speech environment.[30] Phonological significance is not a necessary condition for a McGurk effect to occur; however, it does increase the strength of the effect.[30]

Gender

Females show a stronger McGurk effect than males: women show significantly greater visual influence on auditory speech than men do for brief visual stimuli, but no difference is apparent for full stimuli.[29] Another aspect of gender is the use of male versus female faces and voices as stimuli; there is no difference in the strength of the McGurk effect between the two.[31] If a male face is dubbed with a female voice, or vice versa, there is still no difference in the strength of the McGurk effect.[31] Knowing that the voice you hear is different from the face you see – even of a different gender – does not eliminate the McGurk effect.[10]

Familiarity

Subjects who are familiar with the faces of the speakers are less susceptible to the McGurk effect than those who are unfamiliar with them.[2][26] There is, however, no such difference with regard to voice familiarity.[26]

Expectation

Semantic congruency has a significant impact on the McGurk illusion.[32] The effect is experienced more often and rated as clearer in the semantically congruent condition than in the incongruent condition.[32] When a person expects a certain visual or auditory stimulus based on the semantic information leading up to it, the McGurk effect is greatly increased.[32]

Self-influence

The McGurk effect can be observed when the listener is also the speaker or articulator.[33] When looking at oneself in a mirror and articulating the visual stimuli while listening to a different auditory stimulus, a strong McGurk effect can be observed.[33] In the other condition, where the listener speaks the auditory stimuli softly while watching another person articulate conflicting visual gestures, a McGurk effect can still be seen, although it is weaker.[33]

Temporal synchrony

Temporal synchrony is not necessary for the McGurk effect to be present.[34] Subjects are still strongly influenced by the auditory stimulus even when it lags the visual stimulus by 180 milliseconds, the point at which the effect begins to weaken.[34] There is less tolerance for asynchrony when the auditory stimulus precedes the visual stimulus.[34] To produce a significant weakening of the McGurk effect, the auditory stimulus has to precede the visual stimulus by 60 milliseconds or lag it by 240 milliseconds.[2]
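The asynchrony manipulation can be illustrated by shifting the audio samples while leaving the video untouched. The following is a minimal sketch, assuming the audio has already been loaded as a one-dimensional NumPy array of samples; the function name and the commented usage are illustrative assumptions, not code from the cited studies.

```python
import numpy as np

def shift_audio(samples: np.ndarray, sample_rate: int, offset_ms: float) -> np.ndarray:
    """Shift an audio waveform relative to its (unchanged) video track.

    A positive offset_ms delays the audio (audio lags the video); a negative
    offset_ms advances it (audio leads the video). The output keeps the
    original length, padded with silence at the shifted end.
    """
    shift = int(round(sample_rate * offset_ms / 1000.0))
    shift = max(-len(samples), min(len(samples), shift))  # guard against huge offsets
    out = np.zeros_like(samples)
    if shift > 0:                    # audio lag: prepend silence
        out[shift:] = samples[:len(samples) - shift]
    elif shift < 0:                  # audio lead: drop the first |shift| samples
        out[:shift] = samples[-shift:]
    else:
        out[:] = samples
    return out

# Offsets reported above to weaken the effect significantly:
# audio leading by 60 ms, or lagging by 240 ms.
# `audio` and `rate` would come from a loaded stimulus, e.g. scipy.io.wavfile.read.
# leading = shift_audio(audio, rate, -60.0)
# lagging = shift_audio(audio, rate, 240.0)
```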

Physical task diversion

The McGurk effect is greatly reduced when attention is diverted to a tactile task (touching something).[35] Touch is a sensory modality like vision and audition; increasing attention to touch therefore decreases attention to the auditory and visual senses.

Gaze

The eyes do not need to fixate on any particular point for audio and visual information to be integrated in speech perception.[36] The McGurk effect is unchanged wherever on the speaker's face the listener fixates,[36] but it does not appear if the listener focuses beyond the speaker's face.[2] For the McGurk effect to become insignificant, the listener's gaze must deviate from the speaker's mouth by at least 60 degrees.[36]

Other languages

Whatever the language, all listeners rely on visual information in speech perception to some degree, but the intensity of the McGurk effect differs across languages. Listeners of Dutch,[37] English, Spanish, German, Italian, and Turkish[38] experience a robust McGurk effect, whereas Japanese and Chinese listeners experience a weaker one.[39] Most cross-linguistic research on the McGurk effect has compared English and Japanese; a smaller McGurk effect occurs in Japanese listeners than in English listeners.[37][40][41][42][43][44] The cultural practice of face avoidance among Japanese people may diminish the McGurk effect, as may the tone and syllabic structures of the language.[37] This could also be why Chinese listeners are less susceptible to visual cues and, similarly to Japanese listeners, produce a smaller effect than English listeners.[37] Studies also show that Japanese listeners do not show a developmental increase in visual influence after the age of six, as English-speaking children do.[40][41] Japanese listeners identify incompatibility between visual and auditory stimuli better than English listeners do.[37][41] This greater ability could relate to the absence of consonant clusters in Japanese.[37][42] Regardless, listeners of all languages resort to visual stimuli when speech is unintelligible, and the McGurk effect then applies to them equally.[37][42] The McGurk effect works with listeners of every language that has been tested.[10]

Hearing impairment

Experiments have been conducted with hard-of-hearing individuals and individuals who have received cochlear implants. These individuals tend to weigh visual information from speech more heavily than auditory information.[45] Compared with normal-hearing individuals, this weighting does not differ unless the stimulus contains more than one syllable, such as a word.[45] In the McGurk paradigm, cochlear-implant users produce the same responses as normal-hearing individuals when an auditory bilabial stimulus is dubbed onto a visual velar stimulus,[45] but when an auditory dental stimulus is dubbed onto a visual bilabial stimulus, the responses are quite different. The McGurk effect is still present in individuals with impaired hearing or cochlear implants, although it differs in some respects.

Infants

By measuring an infant's attention to certain audiovisual stimuli, a response consistent with the McGurk effect can be recorded.[2][10][46][47][48] From just minutes to a couple of days after birth, infants can imitate adult facial movements, and within weeks of birth they can recognize lip movements and speech sounds.[49] At this point the integration of audio and visual information can happen, but not at a proficient level.[49] The first evidence of the McGurk effect can be seen at four months of age;[46][47] however, more evidence is found for five-month-olds.[2][10][48][50] Through the process of habituating an infant to a certain stimulus and then changing the stimulus (or part of it, such as ba-voiced/va-visual to da-voiced/va-visual), a response that simulates the McGurk effect becomes apparent.[10][48] The strength of the McGurk effect displays a developmental pattern, increasing throughout childhood and into adulthood.[47][48]

See also

  • Duplex perception
  • Ideasthesia
  • Lip reading
  • Motor theory of speech perception
  • Multisensory integration
  • Speech perception
  • Viseme
  • Yanny or Laurel

References

  1. ^ Nath, A. R.; Beauchamp, M. S. (Jan 2012). "A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion". NeuroImage. 59 (1): 781–787. doi:10.1016/j.neuroimage.2011.07.024. PMC 3196040. PMID 21787869.
  2. ^ a b c d e f g h i j k l Calvert, Gemma; Spence, Charles; Stein, Barry E. (2004). The handbook of multisensory processes. Cambridge, Mass.: MIT Press. ISBN 978-0-262-03321-3. OCLC 54677752.
  3. ^ Boersma, Paul (2011). "A constraint-based explanation of the McGurk effect" (PDF). Retrieved 19 October 2013.
  4. ^ Massaro, D. W.; Cohen, M. M. (Aug 2000). "Tests of auditory-visual integration efficiency within the framework of the fuzzy logical model of perception". Journal of the Acoustical Society of America. 108 (2): 784–789. Bibcode:2000ASAJ..108..784M. doi:10.1121/1.429611. PMID 10955645.
  5. ^ McGurk H., MacDonald J. (1976). "Hearing lips and seeing voices". Nature. 264 (5588): 746–748. Bibcode:1976Natur.264..746M. doi:10.1038/264746a0. PMID 1012311. S2CID 4171157.
  6. ^ "The McGurk Effect: Hearing lips and seeing voices". Retrieved 2 October 2011.
  7. ^ a b c Barutchu, Ayla; Crewther; Kiely; Murphy (2008). "When /b/ill with /g/ill becomes /d/ill: Evidence for a lexical effect in audiovisual speech perception". European Journal of Cognitive Psychology. 20 (1): 1–11. doi:10.1080/09541440601125623. S2CID 144852134.
  8. ^ a b Colin, C.; Radeau, M.; Deltenre, P. (2011). "Top-down and bottom-up modulation of audiovisual integration in speech" (PDF). European Journal of Cognitive Psychology. 17 (4): 541–560. doi:10.1080/09541440440000168. S2CID 38546088.
  9. ^ a b O’Shea, M. (2005). The Brain: A Very Short Introduction. Oxford University Press.
  10. ^ a b c d e f g h i j k Rosenblum, L. D. (2010). See what I'm saying: The extraordinary powers of our five senses. New York, NY: W. W. Norton & Company Inc.
  11. ^ Gentilucci, M.; Cattaneo, L. (2005). "Automatic audiovisual integration in speech perception". Experimental Brain Research. 167 (1): 66–75. doi:10.1007/s00221-005-0008-z. PMID 16034571. S2CID 20166301.
  12. ^ a b c d Schmid, G.; Thielmann, A.; Ziegler, W. (2009). "The influence of visual and auditory information on the perception of speech and non-speech oral movements in patients with left hemisphere lesions". Clinical Linguistics and Phonetics. 23 (3): 208–221. doi:10.1080/02699200802399913. PMID 19283578. S2CID 40852629.
  13. ^ a b c Baynes, K.; Fummell, M.; Fowler, C. (1994). "Hemispheric contributions to the integration of visual and auditory information in speech perception". Perception and Psychophysics. 55 (6): 633–641. doi:10.3758/bf03211678. PMID 8058451.
  14. ^ a b c Nicholson, K.; Baum, S.; Cuddy, L.; Munhall, K. (2002). "A case of impaired auditory and visual speech prosody perception after right hemisphere damage". Neurocase. 8 (4): 314–322. doi:10.1093/neucas/8.4.314. PMID 12221144.
  15. ^ a b c Bastien-Toniazzo, M.; Stroumza, A.; Cavé, C. (2009). "Audio-visual perception and integration in developmental dyslexia: An exploratory study using the McGurk effect". Current Psychology Letters. 25 (3): 2–14.
  16. ^ a b Norrix, L.; Plante, E.; Vance, R.; Boliek, C. (2007). "Auditory-visual integration for speech by children with and without specific language impairment". Journal of Speech, Language, and Hearing Research. 50 (6): 1639–1651. doi:10.1044/1092-4388(2007/111). PMID 18055778.
  17. ^ a b Mongillo, E.; Irwin, J.; Whalen, D.; Klaiman, C. (2008). "Audiovisual processing in children with and without autism spectrum disorders". Journal of Autism and Developmental Disorders. 38 (7): 1349–1358. doi:10.1007/s10803-007-0521-y. PMID 18307027. S2CID 8265591.
  18. ^ Taylor, N.; Isaac, C.; Milne, E. (2010). "A comparison of the development of audiovisual integration in children with autism spectrum disorders and typically developing children" (PDF). Journal of Autism and Developmental Disorders. 40 (11): 1403–1411. doi:10.1007/s10803-010-1000-4. PMID 20354776. S2CID 38370295.
  19. ^ Williams, J. H. G.; Massaro, D. W.; Peel, N. J.; Bosseler, A.; Suddendorf, T. (2004). "Visual–auditory integration during speech imitation in autism". Research in Developmental Disabilities. 25 (6): 559–575. doi:10.1016/j.ridd.2004.01.008. PMID 15541632.
  20. ^ a b c Norrix, L.; Plante, E.; Vance, R. (2006). "Auditory-visual speech integration by adults with and without language-learning disabilities". Journal of Communication Disorders. 39 (1): 22–36. doi:10.1016/j.jcomdis.2005.05.003. PMID 15950983.
  21. ^ a b c Delbeuck, X.; Collette, F.; Van der Linden, M. (2007). "Is Alzheimer's disease a disconnection syndrome? Evidence from a crossmodal audio-visual illusory experiment". Neuropsychologia. 45 (14): 3315–3323. doi:10.1016/j.neuropsychologia.2007.05.001. hdl:2268/4759. PMID 17765932. S2CID 8675718.
  22. ^ a b c Pearl, D.; Yodashkin-Porat, D.; Nachum, K.; Valevski, A.; Aizenberg, D.; Sigler, M.; Weizman, A.; Kikinzon, L. (2009). "Differences in audiovisual integration, as measured by McGurk phenomenon, among adult and adolescent patients with schizophrenia and age-matched healthy control groups". Comprehensive Psychiatry. 50 (2): 186–192. doi:10.1016/j.comppsych.2008.06.004. PMID 19216897.
  23. ^ a b Youse, K.; Cienkowski, K.; Coelho, C. (2004). "Auditory-visual speech perception in an adult with aphasia". Brain Injury. 18 (8): 825–834. doi:10.1080/02699000410001671784. PMID 15204322. S2CID 8579511.
  24. ^ Yordamlı, A.; Erdener, D. (2018). "Auditory–visual speech integration in bipolar disorder: a preliminary study". Languages. 3 (4): 38. doi:10.3390/languages3040038.
  25. ^ a b c d Green, K. P.; Gerdeman, A. (1995). "Cross-modal discrepancies in coarticulation and the integration of speech information: The McGurk effect with mismatched vowels". Journal of Experimental Psychology: Human Perception and Performance. 21 (6): 1409–1426. doi:10.1037/0096-1523.21.6.1409. PMID 7490588.
  26. ^ a b c Walker, S.; Bruce, V.; O'Malley, C. (1995). "Facial identity and facial speech processing: Familiar faces and voices in the McGurk effect". Perception & Psychophysics. 57 (8): 1124–1133. doi:10.3758/bf03208369. PMID 8539088.
  27. ^ a b Nicholls, M.; Searle, D.; Bradshaw, J. (2004). "Read my lips: Asymmetries in the visual expression and perception of speech revealed through the McGurk effect". Psychological Science. 15 (2): 138–141. doi:10.1111/j.0963-7214.2004.01502011.x. PMID 14738522. S2CID 9849205.
  28. ^ a b Tiippana, K.; Andersen, T. S.; Sams, M. (2004). "Visual attention modulates audiovisual speech perception". European Journal of Cognitive Psychology. 16 (3): 457–472. doi:10.1080/09541440340000268. S2CID 8825535.
  29. ^ a b Irwin, J. R.; Whalen, D. H.; Fowler, C. A. (2006). "A sex difference in visual influence on heard speech". Perception and Psychophysics. 68 (4): 582–592. doi:10.3758/bf03208760. PMID 16933423.
  30. ^ a b c Brancazio, L.; Best, C. T.; Fowler, C. A. (2006). "Visual influences on perception of speech and nonspeech vocal-tract events". Language and Speech. 49 (1): 21–53. doi:10.1177/00238309060490010301. PMC 2773261. PMID 16922061.
  31. ^ a b Green, K.; Kuhl, P.; Meltzoff, A.; Stevens, E. (1991). "Integrating speech information across talkers, gender, and sensory modality: Female faces and male voices in the McGurk effect". Perception and Psychophysics. 50 (6): 524–536. doi:10.3758/bf03207536. PMID 1780200.
  32. ^ a b c Windmann, S (2004). "Effects of sentence context and expectation on the McGurk illusion". Journal of Memory and Language. 50 (1): 212–230. doi:10.1016/j.jml.2003.10.001.
  33. ^ a b c Sams, M.; Mottonen, R.; Sihvonen, T. (2005). "Seeing and hearing others and oneself talk". Cognitive Brain Research. 23 (1): 429–435. doi:10.1016/j.cogbrainres.2004.11.006. PMID 15820649.
  34. ^ a b c Munhall, K.; Gribble, P.; Sacco, L.; Ward, M. (1996). "Temporal constraints on the McGurk effect". Perception and Psychophysics. 58 (3): 351–362. doi:10.3758/bf03206811. PMID 8935896.
  35. ^ Alsius, A.; Navarra, J.; Soto-Faraco, S. (2007). "Attention to touch weakens audiovisual speech integration". Experimental Brain Research. 183 (1): 399–404. doi:10.1007/s00221-007-1110-1. PMID 17899043. S2CID 8430729.
  36. ^ a b c Paré, M.; Richler, C.; Hove, M.; Munhall, K. (2003). "Gaze behavior in audiovisual speech perception: The influence on ocular fixations on the McGurk effect". Perception and Psychophysics. 65 (4): 533–567. doi:10.3758/bf03194582. PMID 12812278.
  37. ^ a b c d e f g Sekiyama, K (1997). "Cultural and linguistic factors in audiovisual speech processing: The McGurk effect in Chinese subjects". Perception and Psychophysics. 59 (1): 73–80. doi:10.3758/bf03206849. PMID 9038409.
  38. ^ Erdener, D. (December 2015). "The McGurk illusion in Turkish" (PDF). Turkish Journal of Psychology. 30 (76): 19–31.
  39. ^ Bavo, R.; Ciorba, A.; Prosser, S.; Martini, A. (2009). "The McGurk phenomenon in Italian listeners". Acta Otorhinolaryngologica Italica. 29 (4): 203–208. PMC 2816368. PMID 20161878.
  40. ^ a b Hisanaga, S., Sekiyama, K., Igasaki, T., Murayama, N. (2009). Audiovisual speech perception in Japanese and English: Inter-language differences examined by event-related potentials. Retrieved from http://www.isca-speech.org/archive_open/avsp09/papers/av09_038.pdf
  41. ^ a b c Sekiyama, K.; Burnham, D. (2008). "Impact of language on development of auditory-visual speech perception". Developmental Science. 11 (2): 306–320. doi:10.1111/j.1467-7687.2008.00677.x. PMID 18333984.
  42. ^ a b c Sekiyama, K.; Tohkura, Y. (1991). "McGurk effect in non-English listeners: Few visual effects for Japanese subjects hearing Japanese syllables of high auditory intelligibility". Journal of the Acoustical Society of America. 90 (4, Pt 1): 1797–1805. Bibcode:1991ASAJ...90.1797S. doi:10.1121/1.401660. PMID 1960275.
  43. ^ Wu, J. (2009). Speech perception and the McGurk effect: A cross cultural study using event-related potentials (Dissertation). University of Louisville.
  44. ^ Gelder, B.; Bertelson, P.; Vroomen, J.; Chin Chen, H. (1995). "Inter-language differences in the McGurk effect for Dutch and Cantonese listeners".
  45. ^ a b c Rouger, J.; Fraysse, B.; Deguine, O.; Barone, P. (2008). "McGurk effects in cochlear-implanted deaf subjects". Brain Research. 1188: 87–99. doi:10.1016/j.brainres.2007.10.049. PMID 18062941. S2CID 8858346.
  46. ^ a b Bristow, D.; Dehaene-Lambertz, G.; Mattout, J.; Soares, C.; Gliga, T.; Baillet, S.; Mangin, J. F. (2009). "Hearing faces: How the infant brain matches the face it sees with the speech it hears". Journal of Cognitive Neuroscience. 21 (5): 905–921. CiteSeerX 10.1.1.147.8793. doi:10.1162/jocn.2009.21076. PMID 18702595. S2CID 11759568.
  47. ^ a b c Burnham, D.; Dodd, B. (2004). "Auditory-Visual Speech Integration by Prelinguistic Infants: Perception of an Emergent Consonant in the McGurk Effect". Developmental Psychobiology. 45 (4): 204–220. doi:10.1002/dev.20032. PMID 15549685.
  48. ^ a b c d Rosenblum, L. D.; Schmuckler, M. A.; Johnson, J. A. (1997). "The McGurk effect in infants". Perception & Psychophysics. 59 (3): 347–357. doi:10.3758/bf03211902. PMID 9136265.
  49. ^ a b Woodhouse, L.; Hickson, L.; Dodd, B. (2009). "Review of visual speech perception by hearing and hearing-impaired people: Clinical implications". International Journal of Language & Communication Disorders. 44 (3): 253–270. doi:10.1080/13682820802090281. PMID 18821117.
  50. ^ Kushnerenko, E.; Teinonen, T.; Volein, A.; Csibra, G. (2008). "Electrophysiological evidence of illusory audiovisual speech percept in human infants". Proceedings of the National Academy of Sciences of the United States of America. 105 (32): 11442–11445. Bibcode:2008PNAS..10511442K. doi:10.1073/pnas.0804275105. PMC 2516214. PMID 18682564.

Bibliography

  • McGurk, H.; MacDonald, J. (1976). "Hearing lips and seeing voices". Nature. 264 (5588): 746–748. Bibcode:1976Natur.264..746M. doi:10.1038/264746a0. PMID 1012311. S2CID 4171157.
  • Wright, Daniel; Wareham, Gary (2005). "Mixing sound and vision: The interaction of auditory and visual information for earwitnesses of a crime scene". Legal and Criminological Psychology. 10 (1): 103–108. doi:10.1348/135532504x15240. S2CID 18781846.

External links

  • A constraint-based explanation of the McGurk effect – a write-up by Paul Boersma of the University of Amsterdam; PDF available from the author's academic webpage.
  • Try The McGurk Effect! – Horizon: Is Seeing Believing? – BBC Two
  • McGurk Effect (with explanation)
