
Neuroscience of music

The neuroscience of music is the scientific study of brain-based mechanisms involved in the cognitive processes underlying music. These behaviours include music listening, performing, composing, reading, writing, and ancillary activities. It also is increasingly concerned with the brain basis for musical aesthetics and musical emotion. Scientists working in this field may have training in cognitive neuroscience, neurology, neuroanatomy, psychology, music theory, computer science, and other relevant fields.

The cognitive neuroscience of music represents a significant branch of music psychology, and is distinguished from related fields such as cognitive musicology in its reliance on direct observations of the brain and use of brain imaging techniques like functional magnetic resonance imaging (fMRI) and positron emission tomography (PET).

Elements of music

Pitch

Sounds consist of waves of air molecules that vibrate at different frequencies. These waves travel to the basilar membrane in the cochlea of the inner ear. Different frequencies of sound will cause vibrations in different locations of the basilar membrane. We are able to hear different pitches because each sound wave with a unique frequency corresponds to a different location along the basilar membrane. This spatial arrangement of sounds and their respective frequencies being processed in the basilar membrane is known as tonotopy. When the hair cells on the basilar membrane move back and forth due to the vibrating sound waves, they release neurotransmitters and trigger action potentials that travel down the auditory nerve. The auditory nerve then leads to several layers of synapses at numerous clusters of neurons, or nuclei, in the auditory brainstem. These nuclei are also tonotopically organized, and the process of achieving this tonotopy after the cochlea is not well understood.[1] This tonotopy is in general maintained up to primary auditory cortex in mammals.[2]
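
As an illustration of the tonotopic place code described above, the following Python sketch maps pure-tone frequencies to approximate positions along the basilar membrane using Greenwood's place–frequency function. The constants are approximate published values for the human cochlea, and the code is a simplified model, not a description of any study cited here.

```python
import numpy as np

# Approximate constants of Greenwood's place-frequency function for the
# human cochlea (A in Hz; position expressed as a fraction of
# basilar-membrane length, 0 = apex, 1 = base).
A, a, k = 165.4, 2.1, 0.88

def frequency_to_place(freq_hz):
    """Approximate basilar-membrane position that responds most strongly
    to a pure tone of the given frequency (the tonotopic place code)."""
    return np.log10(freq_hz / A + k) / a

# Higher frequencies map toward the base of the membrane.
for f in [110, 440, 1760, 7040]:
    print(f"{f:5d} Hz -> relative place {frequency_to_place(f):.2f}")
```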

A widely postulated mechanism for pitch processing in the early central auditory system is the phase-locking and mode-locking of action potentials to frequencies in a stimulus. Phase-locking to stimulus frequencies has been shown in the auditory nerve,[3][4] the cochlear nucleus,[3][5] the inferior colliculus,[6] and the auditory thalamus.[7] By phase- and mode-locking in this way, the auditory brainstem is known to preserve a good deal of the temporal and low-passed frequency information from the original sound; this is evident from measurements of the auditory brainstem response using EEG.[8] This temporal preservation is one way to argue directly for the temporal theory of pitch perception, and to argue indirectly against the place theory of pitch perception.
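
The degree of phase-locking in a spike train is commonly quantified with the Goldberg–Brown vector strength. The sketch below uses synthetic spike times rather than real recordings to show how a phase-locked train yields a value near 1 while a random train yields a value near 0; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def vector_strength(spike_times, freq):
    """Goldberg-Brown vector strength: 1 = perfect phase locking, 0 = none."""
    phases = 2 * np.pi * freq * spike_times
    return np.abs(np.mean(np.exp(1j * phases)))

freq = 440.0                      # stimulus frequency (Hz)
n_spikes, duration = 2000, 5.0    # hypothetical spike train over 5 s

# Phase-locked train: spikes cluster around one phase of each stimulus cycle.
cycles = rng.integers(0, int(freq * duration), n_spikes)
locked = (cycles + rng.normal(0.25, 0.05, n_spikes)) / freq

# Unlocked train: spikes at uniformly random times.
unlocked = rng.uniform(0, duration, n_spikes)

print("locked  :", round(vector_strength(locked, freq), 2))   # near 1
print("unlocked:", round(vector_strength(unlocked, freq), 2)) # near 0
```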

The primary auditory cortex is one of the main areas associated with superior pitch resolution.

The right secondary auditory cortex has finer pitch resolution than the left. Hyde, Peretz and Zatorre (2008) used functional magnetic resonance imaging (fMRI) in their study to test the involvement of right and left auditory cortical regions in the frequency processing of melodic sequences.[9] As well as finding superior pitch resolution in the right secondary auditory cortex, specific areas found to be involved were the planum temporale (PT) in the secondary auditory cortex, and the primary auditory cortex in the medial section of Heschl's gyrus (HG).

Many neuroimaging studies have found evidence of the importance of right secondary auditory regions in aspects of musical pitch processing, such as melody.[10] Many of these studies, such as one by Patterson, Uppenkamp, Johnsrude and Griffiths (2002), also find evidence of a hierarchy of pitch processing. In an fMRI study, Patterson et al. (2002) used spectrally matched sounds that produced no pitch, fixed pitch or melody, and found that all conditions activated HG and PT. Sounds with pitch activated more of these regions than sounds without. When a melody was produced, activation spread to the superior temporal gyrus (STG) and planum polare (PP). These results support the existence of a pitch processing hierarchy.

Absolute pitch

Musicians possessing perfect pitch can identify the pitch of musical tones without external reference.

Absolute pitch (AP) is defined as the ability to identify the pitch of a musical tone or to produce a musical tone at a given pitch without the use of an external reference pitch.[11][12] Neuroscientific research has not discovered a distinct activation pattern common for possessors of AP. Zatorre, Perry, Beckett, Westbury and Evans (1998) examined the neural foundations of AP using functional and structural brain imaging techniques.[13] Positron emission tomography (PET) was utilized to measure cerebral blood flow (CBF) in musicians possessing AP and musicians lacking AP. When presented with musical tones, similar patterns of increased CBF in auditory cortical areas emerged in both groups. AP possessors and non-AP subjects demonstrated similar patterns of left dorsolateral frontal activity when they performed relative pitch judgments. However, in non-AP subjects activation in the right inferior frontal cortex was present whereas AP possessors showed no such activity. This finding suggests that musicians with AP do not need access to working memory devices for such tasks. These findings imply that there is no specific regional activation pattern unique to AP. Rather, the availability of specific processing mechanisms and task demands determine the recruited neural areas.

Melody

Studies suggest that individuals are capable of automatically detecting a difference or anomaly in a melody such as an out of tune pitch which does not fit with their previous music experience. This automatic processing occurs in the secondary auditory cortex. Brattico, Tervaniemi, Näätänen, and Peretz (2006) performed one such study to determine if the detection of tones that do not fit an individual's expectations can occur automatically.[14] They recorded event-related potentials (ERPs) in nonmusicians as they were presented with unfamiliar melodies with either an out of tune pitch or an out of key pitch while participants were either distracted from the sounds or attending to the melody. Both conditions revealed an early frontal error-related negativity independent of where attention was directed. This negativity originated in the auditory cortex, more precisely in the supratemporal lobe (which corresponds with the secondary auditory cortex) with greater activity from the right hemisphere. The negativity response was larger for pitch that was out of tune than that which was out of key. Ratings of musical incongruity were higher for out of tune pitch melodies than for out of key pitch. In the focused attention condition, out of key and out of tune pitches produced late parietal positivity. The findings of Brattico et al. (2006) suggest that there is automatic and rapid processing of melodic properties in the secondary auditory cortex.[14] The finding that pitch incongruities were detected automatically, even in processing unfamiliar melodies, suggests that there is an automatic comparison of incoming information with long term knowledge of musical scale properties, such as culturally influenced rules of musical properties (common chord progressions, scale patterns, etc.) and individual expectations of how the melody should proceed.
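
ERP effects such as the early negativity reported by Brattico et al. are obtained by averaging many EEG epochs time-locked to standard and deviant tones and comparing the resulting waveforms. The sketch below illustrates this general procedure on placeholder, single-channel data; the sampling rate, event times and time window are illustrative values, not those of the cited study.

```python
import numpy as np

def erp(eeg, onsets, sfreq, tmin=-0.1, tmax=0.4):
    """Average EEG epochs time-locked to stimulus onsets (single channel).
    eeg: 1-D array of samples; onsets: event times in seconds."""
    pre, post = int(-tmin * sfreq), int(tmax * sfreq)
    epochs = np.stack([eeg[int(t * sfreq) - pre: int(t * sfreq) + post]
                       for t in onsets])
    # Baseline-correct each epoch using the pre-stimulus interval.
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)
    return epochs.mean(axis=0)

# Hypothetical single-channel recording and event lists (in seconds).
sfreq = 250.0
eeg = np.random.randn(int(600 * sfreq))          # placeholder data
standard_onsets = np.arange(1.0, 590.0, 1.0)     # in-key tones
deviant_onsets = np.arange(1.5, 590.0, 10.0)     # out-of-tune tones

# An early negativity of the kind described above would appear as a
# negative deflection in this difference wave shortly after tone onset.
difference_wave = erp(eeg, deviant_onsets, sfreq) - erp(eeg, standard_onsets, sfreq)
```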

Rhythm

The belt and parabelt areas of the right hemisphere are involved in processing rhythm.[15] Rhythm is a strong repeated pattern of movement or sound. When individuals are preparing to tap out a rhythm of regular intervals (1:2 or 1:3) the left frontal cortex, left parietal cortex, and right cerebellum are all activated. With more difficult rhythms such as a 1:2.5, more areas in the cerebral cortex and cerebellum are involved.[16] EEG recordings have also shown a relationship between brain electrical activity and rhythm perception. Snyder and Large (2005)[17] performed a study examining rhythm perception in human subjects, finding that activity in the gamma band (20–60 Hz) corresponds to the beats in a simple rhythm. Two types of gamma activity were found by Snyder & Large: induced gamma activity, and evoked gamma activity. Evoked gamma activity was found after the onset of each tone in the rhythm; this activity was found to be phase-locked (peaks and troughs were directly related to the exact onset of the tone) and did not appear when a gap (missed beat) was present in the rhythm. Induced gamma activity, which was not found to be phase-locked, was also found to correspond with each beat. However, induced gamma activity did not subside when a gap was present in the rhythm, indicating that induced gamma activity may serve as a sort of internal metronome independent of auditory input.
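
The distinction between evoked (phase-locked) and induced (non-phase-locked) gamma activity comes down to the order of averaging: evoked activity survives averaging the raw epochs first, whereas induced activity is estimated from single trials. The sketch below shows one common way of making this split on placeholder data; the band limits and filter settings are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def gamma_power(x, sfreq, band=(20.0, 60.0)):
    """Band-pass the signal in the gamma range and return its amplitude envelope."""
    b, a = butter(4, band, btype="bandpass", fs=sfreq)
    return np.abs(hilbert(filtfilt(b, a, x)))

# trials: hypothetical epochs of shape (n_trials, n_samples), one per beat.
sfreq = 500.0
trials = np.random.randn(100, int(0.6 * sfreq))   # placeholder data

# Evoked gamma: only phase-locked activity survives averaging across trials first.
evoked_gamma = gamma_power(trials.mean(axis=0), sfreq)

# Induced gamma: average the single-trial envelopes, which retains
# non-phase-locked activity; subtracting the evoked part isolates it.
induced_gamma = gamma_power(trials, sfreq).mean(axis=0) - evoked_gamma
```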

Tonality

Tonality describes the relationships between the elements of melody and harmony – tones, intervals, chords, and scales. These relationships are often characterized as hierarchical, such that one of the elements dominates or attracts another. They occur both within and between every type of element, creating a rich and time-varying perception between tones and their melodic, harmonic, and chromatic contexts. In one conventional sense, tonality refers to just the major and minor scale types – examples of scales whose elements are capable of maintaining a consistent set of functional relationships. The most important functional relationship is that of the tonic note (the first note in a scale) and the tonic chord (the first note in the scale with the third and fifth note) with the rest of the scale. The tonic is the element which tends to assert its dominance and attraction over all others, and it functions as the ultimate point of attraction, rest and resolution for the scale.[18]
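
The tonal hierarchy described above can be made concrete with the probe-tone profiles reported by Krumhansl (1990),[18] in which the tonic receives the highest rating, followed by the fifth and third. The sketch below uses approximate major-key profile values to estimate the key of a melody by correlating its pitch-class duration histogram with the profile rotated to each possible tonic (the idea behind Krumhansl–Schmuckler-style key finding); the toy melody histogram is hypothetical.

```python
import numpy as np

PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Approximate major-key probe-tone ratings from Krumhansl (1990): the tonic
# is rated highest, then the fifth and third, reflecting the tonal hierarchy.
MAJOR_PROFILE = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                          2.52, 5.19, 2.39, 3.66, 2.29, 2.88])

def estimate_major_key(pc_durations):
    """Correlate a 12-element pitch-class duration histogram with the major
    profile rotated to every possible tonic and return the best match."""
    scores = [np.corrcoef(pc_durations, np.roll(MAJOR_PROFILE, tonic))[0, 1]
              for tonic in range(12)]
    return PITCH_CLASSES[int(np.argmax(scores))]

# Toy histogram of note durations per pitch class for a C-major melody.
melody = np.array([4.0, 0, 2.0, 0, 3.0, 1.5, 0, 3.5, 0, 1.0, 0, 0.5])
print(estimate_major_key(melody))   # expected: "C"
```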

The right auditory cortex is primarily involved in perceiving pitch, and parts of harmony, melody and rhythm.[16] One study by Petr Janata found that there are tonality-sensitive areas in the medial prefrontal cortex, the cerebellum, the superior temporal sulci of both hemispheres and the superior temporal gyri (which show a skew towards the right hemisphere).[19] Hemispheric asymmetries in the processing of dissonant/consonant sounds have been demonstrated. ERP studies have shown larger evoked responses over the left temporal area in response to dissonant chords, and over the right temporal area in response to consonant chords.[20]

Music production and performance

Motor control functions

Musical performance usually involves at least three elementary motor control functions: timing, sequencing, and spatial organization of motor movements. Accuracy in timing of movements is related to musical rhythm. Rhythm, the pattern of temporal intervals within a musical measure or phrase, in turn creates the perception of stronger and weaker beats.[21] Sequencing and spatial organization relate to the expression of individual notes on a musical instrument.

These functions and their neural mechanisms have been investigated separately in many studies, but little is known about their combined interaction in producing a complex musical performance.[21] The study of music requires examining them together.

Timing

Although neural mechanisms involved in timing movement have been studied rigorously over the past 20 years, much remains controversial. The ability to phrase movements in precise time has been attributed to a neural metronome or clock mechanism where time is represented through oscillations or pulses.[22][23][24][25] An opposing view holds that timing is instead an emergent property of the kinematics of movement itself.[24][25][26] Kinematics is defined as parameters of movement through space without reference to forces (for example, direction, velocity and acceleration).[21]

Functional neuroimaging studies, as well as studies of brain-damaged patients, have linked movement timing to several cortical and sub-cortical regions, including the cerebellum, basal ganglia and supplementary motor area (SMA).[21] Specifically the basal ganglia and possibly the SMA have been implicated in interval timing at longer timescales (1 second and above), while the cerebellum may be more important for controlling motor timing at shorter timescales (milliseconds).[22][27] Furthermore, these results indicate that motor timing is not controlled by a single brain region, but by a network of regions that control specific parameters of movement and that depend on the relevant timescale of the rhythmic sequence.[21]

Sequencing

Motor sequencing has been explored in terms of either the ordering of individual movements, such as finger sequences for key presses, or the coordination of subcomponents of complex multi-joint movements.[21] Implicated in this process are various cortical and sub-cortical regions, including the basal ganglia, the SMA and the pre-SMA, the cerebellum, and the premotor and prefrontal cortices, all involved in the production and learning of motor sequences but without explicit evidence of their specific contributions or interactions amongst one another.[21] In animals, neurophysiological studies have demonstrated an interaction between the frontal cortex and the basal ganglia during the learning of movement sequences.[28] Human neuroimaging studies have also emphasized the contribution of the basal ganglia for well-learned sequences.[29]

The cerebellum is arguably important for sequence learning and for the integration of individual movements into unified sequences,[29][30][31][32][33] while the pre-SMA and SMA have been shown to be involved in organizing or chunking of more complex movement sequences.[34][35] Chunking, defined as the re-organization or re-grouping of movement sequences into smaller sub-sequences during performance, is thought to facilitate the smooth performance of complex movements and to improve motor memory.[21] Lastly, the premotor cortex has been shown to be involved in tasks that require the production of relatively complex sequences, and it may contribute to motor prediction.[36][37]

Spatial organization

Few studies of complex motor control have distinguished between sequential and spatial organization, yet expert musical performances demand not only precise sequencing but also spatial organization of movements. Studies in animals and humans have established the involvement of parietal, sensory–motor and premotor cortices in the control of movements, when the integration of spatial, sensory and motor information is required.[38][39] Few studies so far have explicitly examined the role of spatial processing in the context of musical tasks.

Auditory-motor interactions

Feedforward and feedback interactions

An auditory–motor interaction may be loosely defined as any engagement of or communication between the two systems. Two classes of auditory-motor interaction are "feedforward" and "feedback".[21] In feedforward interactions, it is the auditory system that predominantly influences the motor output, often in a predictive way.[40] An example is the phenomenon of tapping to the beat, where the listener anticipates the rhythmic accents in a piece of music. Another example is the effect of music on movement disorders: rhythmic auditory stimuli have been shown to improve walking ability in Parkinson's disease and stroke patients.[41][42]

Feedback interactions are particularly relevant in playing an instrument such as a violin, or in singing, where pitch is variable and must be continuously controlled. If auditory feedback is blocked, musicians can still execute well-rehearsed pieces, but expressive aspects of performance are affected.[43] When auditory feedback is experimentally manipulated by delays or distortions,[44] motor performance is significantly altered: asynchronous feedback disrupts the timing of events, whereas alteration of pitch information disrupts the selection of appropriate actions, but not their timing. This suggests that disruptions occur because both actions and percepts depend on a single underlying mental representation.[21]

Models of auditory–motor interactions

Several models of auditory–motor interactions have been advanced. The model of Hickok and Poeppel,[45] which is specific for speech processing, proposes that a ventral auditory stream maps sounds onto meaning, whereas a dorsal stream maps sounds onto articulatory representations. They and others[46] suggest that posterior auditory regions at the parieto-temporal boundary are crucial parts of the auditory–motor interface, mapping auditory representations onto motor representations of speech, and onto melodies.[47]

Mirror/echo neurons and auditory–motor interactions

The mirror neuron system has an important role in neural models of sensory–motor integration. There is considerable evidence that mirror neurons respond both when an action is performed and when it is observed. One proposed explanation for this understanding of actions is that visual representations of actions are mapped onto the observer's own motor system.[48]

Some mirror neurons are activated both by the observation of goal-directed actions, and by the associated sounds produced during the action. This suggests that the auditory modality can access the motor system.[49][50] While these auditory–motor interactions have mainly been studied for speech processes, with a focus on Broca's area and the vPMC, experiments since around 2011 have begun to shed light on how these interactions are involved in musical performance. Results point to a broader involvement of the dPMC and other motor areas.[21] The literature has shown a highly specialized cortical network in the skilled musician's brain that codes the relationship between musical gestures and their corresponding sounds. The data hint at the existence of an audiomotor mirror network involving the right superior temporal gyrus, the premotor cortex, the inferior frontal and inferior parietal areas, among other areas.[51]

Music and language

Certain aspects of language and melody have been shown to be processed in near identical functional brain areas. Brown, Martinez and Parsons (2006) examined the neurological structural similarities between music and language.[52] Utilizing positron emission tomography (PET), the findings showed that both linguistic and melodic phrases produced activation in almost identical functional brain areas. These areas included the primary motor cortex, supplementary motor area, Broca's area, anterior insula, primary and secondary auditory cortices, temporal pole, basal ganglia, ventral thalamus and posterior cerebellum. Differences were found in lateralization tendencies, as language tasks favoured the left hemisphere, but the majority of activations were bilateral, producing significant overlap across modalities.[52]

Syntactical information mechanisms in both music and language have been shown to be processed similarly in the brain. Jentschke, Koelsch, Sallat and Friederici (2008) conducted a study investigating the processing of music in children with specific language impairments (SLI).[53] Children with typical language development (TLD) showed ERP patterns different from those of children with SLI, which reflected their challenges in processing music-syntactic regularities. Strong correlations between the ERAN (Early Right Anterior Negativity—a specific ERP measure) amplitude and linguistic and musical abilities provide additional evidence for the relationship of syntactical processing in music and language.[53]

However, production of melody and production of speech may be subserved by different neural networks. Stewart, Walsh, Frith and Rothwell (2001) studied the differences between speech production and song production using transcranial magnetic stimulation (TMS).[54] Stewart et al. found that TMS applied to the left frontal lobe disturbs speech but not melody supporting the idea that they are subserved by different areas of the brain. The authors suggest that a reason for the difference is that speech generation can be localized well but the underlying mechanisms of melodic production cannot. Alternatively, it was also suggested that speech production may be less robust than melodic production and thus more susceptible to interference.[54]

Language processing is a function more of the left side of the brain than the right side, particularly Broca's area and Wernicke's area, though the roles played by the two sides of the brain in processing different aspects of language are still unclear. Music is also processed by both the left and the right sides of the brain.[52][55] Recent evidence further suggests shared processing between language and music at the conceptual level.[56] It has also been found that, among music conservatory students, the prevalence of absolute pitch is much higher for speakers of tone language, even controlling for ethnic background, showing that language influences how musical tones are perceived.[57][58]

Musician vs. non-musician processing

Professional pianists show less cortical activation for complex finger movement tasks due to structural differences in the brain.

Differences

Brain structure differs distinctly between musicians and non-musicians. Gaser and Schlaug (2003) compared brain structures of professional musicians with non-musicians and discovered gray matter volume differences in motor, auditory and visual-spatial brain regions.[59] Specifically, positive correlations were discovered between musician status (professional, amateur and non-musician) and gray matter volume in the primary motor and somatosensory areas, premotor areas, anterior superior parietal areas and in the inferior temporal gyrus bilaterally. This strong association between musician status and gray matter differences supports the notion that musicians' brains show use-dependent structural changes.[60] Given the distinct differences in several brain regions, it is unlikely that these differences are innate; rather, they are attributed to the long-term acquisition and repetitive rehearsal of musical skills.

Brains of musicians also show functional differences from those of non-musicians. Krings, Topper, Foltys, Erberich, Sparing, Willmes and Thron (2000) utilized fMRI to study brain area involvement of professional pianists and a control group while performing complex finger movements.[61] Krings et al. found that the professional piano players showed lower levels of cortical activation in motor areas of the brain. It was concluded that fewer neurons needed to be activated in the piano players because of long-term motor practice, which results in different cortical activation patterns. Koeneke, Lutz, Wustenberg and Jancke (2004) reported similar findings in keyboard players.[62] Skilled keyboard players and a control group performed complex tasks involving unimanual and bimanual finger movements. During task conditions, strong hemodynamic responses in the cerebellum were shown by both non-musicians and keyboard players, but non-musicians showed the stronger response. This finding indicates that different cortical activation patterns emerge from long-term motor practice. This evidence supports previous data showing that musicians require fewer neurons to perform the same movements.

Musicians have been shown to have a significantly more developed left planum temporale, and have also been shown to have greater word memory.[63] Chan's study controlled for age, grade point average and years of education and found that, when given a 16-word memory test, the musicians recalled on average one to two more words than their non-musical counterparts.

Similarities

Studies have shown that the human brain has an implicit musical ability.[64][65] Koelsch, Gunter, Friederici and Schoger (2000) investigated the influence of preceding musical context, task relevance of unexpected chords and the degree of probability of violation on music processing in both musicians and non-musicians.[64] Findings showed that the human brain unintentionally extrapolates expectations about impending auditory input. Even in non-musicians, the extrapolated expectations are consistent with music theory. The ability to process information musically supports the idea of an implicit musical ability in the human brain. In a follow-up study, Koelsch, Schroger, and Gunter (2002) investigated whether ERAN and N5 could be evoked preattentively in non-musicians.[65] Findings showed that both ERAN and N5 can be elicited even in a situation where the musical stimulus is ignored by the listener indicating that there is a highly differentiated preattentive musicality in the human brain.

Gender differences

Minor neurological differences regarding hemispheric processing exist between brains of males and females. Koelsch, Maess, Grossmann and Friederici (2003) investigated music processing through EEG and ERPs and discovered gender differences.[66] Findings showed that females process music information bilaterally and males process music with a right-hemispheric predominance. However, the early negativity of males was also present over the left hemisphere. This indicates that males do not exclusively utilize the right hemisphere for musical information processing. In a follow-up study, Koelsch, Grossman, Gunter, Hahne, Schroger and Friederici (2003) found that boys show lateralization of the early anterior negativity in the left hemisphere but found a bilateral effect in girls.[67] This indicates a developmental effect as early negativity is lateralized in the right hemisphere in men and in the left hemisphere in boys.

Handedness differences

It has been found that subjects who are left-handed, particularly those who are also ambidextrous, perform better than right-handers on short-term memory for pitch.[68][69] It was hypothesized that this handedness advantage arises because left-handers have more duplication of storage in the two hemispheres than right-handers. Other work has shown that there are pronounced differences between right-handers and left-handers (on a statistical basis) in how musical patterns are perceived when sounds come from different regions of space. This has been found, for example, in the octave illusion[70][71] and the scale illusion.[72][73]

Musical imagery

Musical imagery refers to the experience of replaying music by imagining it inside the head.[74] Musicians show a superior ability for musical imagery due to intense musical training.[75] Herholz, Lappe, Knief and Pantev (2008) investigated the differences in neural processing of a musical imagery task in musicians and non-musicians. Utilizing magnetoencephalography (MEG), Herholz et al. examined differences in the processing of a musical imagery task with familiar melodies in musicians and non-musicians. Specifically, the study examined whether the mismatch negativity (MMN) can be based solely on imagery of sounds. The task involved participants listening to the beginning of a melody, continuing the melody in their head, and finally hearing a correct or incorrect tone as a further continuation of the melody. The imagery of these melodies was strong enough to obtain an early preattentive brain response to unanticipated violations of the imagined melodies in the musicians. These results indicate that trained musicians rely on similar neural correlates for imagery and perception. Additionally, the findings suggest that modification of the imagery mismatch negativity (iMMN) through intense musical training results in a superior ability for imagery and preattentive processing of music.

Perceptual musical processes and musical imagery may share a neural substrate in the brain. A PET study conducted by Zatorre, Halpern, Perry, Meyer and Evans (1996) investigated cerebral blood flow (CBF) changes related to auditory imagery and perceptual tasks.[76] These tasks examined the involvement of particular anatomical regions as well as functional commonalities between perceptual processes and imagery. Similar patterns of CBF changes provided evidence supporting the notion that imagery processes share a substantial neural substrate with related perceptual processes. Bilateral neural activity in the secondary auditory cortex was associated with both perceiving and imagining songs. This implies that processes within the secondary auditory cortex underlie the phenomenological impression of imagined sounds. The supplementary motor area (SMA) was active in both imagery and perceptual tasks, suggesting covert vocalization as an element of musical imagery. CBF increases in the inferior frontal polar cortex and right thalamus suggest that these regions may be related to retrieval and/or generation of auditory information from memory.

Emotion

Music is able to create an intensely pleasurable experience that can be described as "chills".[77] Blood and Zatorre (2001) used PET to measure changes in cerebral blood flow while participants listened to music that they knew to give them the "chills" or any sort of intensely pleasant emotional response. They found that as these chills increase, many changes in cerebral blood flow are seen in brain regions such as the amygdala, orbitofrontal cortex, ventral striatum, midbrain, and the ventral medial prefrontal cortex. Many of these areas appear to be linked to reward, motivation, emotion, and arousal, and are also activated in other pleasurable situations.[77] The resulting pleasure responses enable the release of dopamine, serotonin, and oxytocin. The nucleus accumbens (a part of the striatum) is involved both in music-related emotions and in rhythmic timing.

According to the National Institutes of Health, children and adults who are suffering from emotional trauma have been able to benefit from the use of music in a variety of ways.[78] The use of music in a therapeutic way has been essential in helping children who struggle with focus, anxiety, and cognitive function. Music therapy has also helped children cope with autism, pediatric cancer, and pain from treatments.

Emotions induced by music activate similar frontal brain regions compared to emotions elicited by other stimuli.[60] Schmidt and Trainor (2001) discovered that valence (i.e. positive vs. negative) of musical segments was distinguished by patterns of frontal EEG activity.[79] Joyful and happy musical segments were associated with increases in left frontal EEG activity whereas fearful and sad musical segments were associated with increases in right frontal EEG activity. Additionally, the intensity of emotions was differentiated by the pattern of overall frontal EEG activity. Overall frontal region activity increased as affective musical stimuli became more intense.[79]
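
Frontal EEG asymmetry effects of the kind reported by Schmidt and Trainor are typically summarized with an asymmetry index computed from alpha-band power at homologous left and right frontal electrodes. The sketch below computes one common version of this index on placeholder data; the channel names, sampling rate and band limits are illustrative assumptions, not details of the cited study.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(x, sfreq, band=(8.0, 13.0)):
    """Mean power spectral density in the alpha band for one channel."""
    freqs, psd = welch(x, fs=sfreq, nperseg=int(2 * sfreq))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Hypothetical left/right frontal channels (e.g. F3 and F4) recorded while
# a participant listens to a musical excerpt.
sfreq = 256.0
f3 = np.random.randn(int(60 * sfreq))   # placeholder data
f4 = np.random.randn(int(60 * sfreq))

# A common frontal asymmetry index: ln(right alpha) - ln(left alpha).
# Since alpha power is inversely related to cortical activation, positive
# values are usually read as relatively greater left frontal activity, the
# pattern associated above with positively valenced music.
asymmetry = np.log(alpha_power(f4, sfreq)) - np.log(alpha_power(f3, sfreq))
```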

When unpleasant melodies are played, the posterior cingulate cortex activates, which indicates a sense of conflict or emotional pain.[16] The right hemisphere has also been found to be correlated with emotion, and can likewise activate areas in the cingulate in times of emotional pain, specifically social rejection (Eisenberger). This evidence, along with observations, has led many music theorists, philosophers and neuroscientists to link emotion with tonality. One reason is that the tones in music resemble the tones of human speech, which convey emotional content. The vowels in the phonemes of a song are elongated for dramatic effect, and musical tones can be heard as exaggerations of normal verbal tonality.

Memory

Neuropsychology of musical memory

Musical memory involves both explicit and implicit memory systems.[80] Explicit musical memory is further differentiated between episodic (where, when and what of the musical experience) and semantic (memory for music knowledge including facts and emotional concepts). Implicit memory centers on the 'how' of music and involves automatic processes such as procedural memory and motor skill learning – in other words skills critical for playing an instrument. Samson and Baird (2009) found that the ability of musicians with Alzheimer's Disease to play an instrument (implicit procedural memory) may be preserved.

Neural correlates of musical memory

A PET study looking into the neural correlates of musical semantic and episodic memory found distinct activation patterns.[81] Semantic musical memory involves the sense of familiarity of songs. The semantic memory for music condition resulted in bilateral activation in the medial and orbital frontal cortex, as well as activation in the left angular gyrus and the left anterior region of the middle temporal gyri. These patterns support the functional asymmetry favouring the left hemisphere for semantic memory. Left anterior temporal and inferior frontal regions that were activated in the musical semantic memory task produced activation peaks specifically during the presentation of musical material, suggesting that these regions are somewhat functionally specialized for musical semantic representations.

Episodic memory of musical information involves the ability to recall the former context associated with a musical excerpt.[81] In the condition invoking episodic memory for music, activations were found bilaterally in the middle and superior frontal gyri and precuneus, with activation predominant in the right hemisphere. Other studies have found the precuneus to become activated in successful episodic recall.[82] As it was activated in the familiar memory condition of episodic memory, this activation may be explained by the successful recall of the melody.

A dynamic and distributed brain network appears to subserve pitch memory processes. Gaab, Gaser, Zaehle, Jancke and Schlaug (2003) examined the functional anatomy of pitch memory using functional magnetic resonance imaging (fMRI).[83] An analysis of performance scores in a pitch memory task revealed a significant correlation between good task performance and activity in the supramarginal gyrus (SMG) as well as the dorsolateral cerebellum. Findings indicate that the dorsolateral cerebellum may act as a pitch discrimination processor and the SMG may act as a short-term pitch information storage site. Left-hemisphere regions were found to be more prominent in the pitch memory task than right-hemisphere regions.

Therapeutic effects of music on memory

Musical training has been shown to aid memory. Altenmüller et al. studied the difference between active and passive musical instruction and found that over a longer (but not a short) period of time, the actively taught students retained much more information than the passively taught students. The actively taught students were also found to have greater cerebral cortex activation. The passively taught students also benefited: they, along with the active group, displayed greater left-hemisphere activity, which is typical of trained musicians.[84]

Research suggests we listen to the same songs repeatedly because of musical nostalgia. One major study, published in the journal Memory & Cognition, found that music enables the mind to evoke memories of the past, known as music-evoked autobiographical memories.[85]

Attention

Treder et al.[86] identified neural correlates of attention when listening to simplified polyphonic music patterns. In a musical oddball experiment, they had participants shift selective attention to one out of three different instruments in music audio clips, with each instrument occasionally playing one or several notes deviating from an otherwise repetitive pattern. Contrasting attended versus unattended instruments, ERP analysis shows subject- and instrument-specific responses including P300 and early auditory components. The attended instrument could be classified offline with high accuracy. This indicates that attention paid to a particular instrument in polyphonic music can be inferred from ongoing EEG, a finding that is potentially relevant for building more ergonomic music-listening-based brain-computer interfaces.[86]
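
Decoding which instrument a listener attends to, as in Treder et al., amounts to training a classifier on ERP features extracted from single epochs. The sketch below shows a generic pipeline of this kind on placeholder data using linear discriminant analysis; the array shapes, labels and classifier settings are illustrative and do not reproduce the cited study's method.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical epoched EEG: one epoch per note, shape
# (n_epochs, n_channels, n_samples), with labels marking whether the
# note came from the attended instrument.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 32, 100))     # placeholder data
attended = rng.integers(0, 2, 200)               # 1 = attended instrument

# Flatten each epoch into a channels-by-time feature vector, as is common
# in simple ERP classification pipelines, and score an LDA classifier.
X = epochs.reshape(len(epochs), -1)
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, X, attended, cv=5)
print("mean cross-validated accuracy:", scores.mean())
```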

Development

Musical four-year-olds have been found to have greater left-hemisphere intrahemispheric coherence.[84] Musicians have been found to have more developed anterior portions of the corpus callosum in a study by Cowell et al. in 1992. This was confirmed by a study by Schlaug et al. in 1995 that found that classical musicians between the ages of 21 and 36 have significantly greater anterior corpora callosa than non-musical controls. Schlaug also found a strong correlation between musical exposure before the age of seven and a large increase in the size of the corpus callosum.[84] These fibers join the left and right hemispheres and indicate increased relaying between both sides of the brain. This suggests a merging of the spatial-emotional-tonal processing of the right hemisphere and the linguistic processing of the left hemisphere. This extensive relaying across many different areas of the brain might contribute to music's ability to aid in memory function.

Impairment

Focal hand dystonia

Focal hand dystonia is a task-related movement disorder associated with occupational activities that require repetitive hand movements.[87] Focal hand dystonia is associated with abnormal processing in the premotor and primary sensorimotor cortices. An fMRI study examined five guitarists with focal hand dystonia.[88] The study reproduced task-specific hand dystonia by having guitarists use a real guitar neck inside the scanner and perform a guitar exercise to trigger abnormal hand movements. The dystonic guitarists showed significantly more activation of the contralateral primary sensorimotor cortex as well as a bilateral underactivation of premotor areas. This activation pattern represents abnormal recruitment of the cortical areas involved in motor control. Even in professional musicians, widespread bilateral cortical region involvement is necessary to produce complex hand movements such as scales and arpeggios. The abnormal shift from premotor to primary sensorimotor activation directly correlates with guitar-induced hand dystonia.

Music agnosia

Music agnosia, an auditory agnosia, is a syndrome of selective impairment in music recognition.[89] Three cases of music agnosia were examined by Dalla Bella and Peretz (1999): C.N., G.L., and I.R. All three of these patients suffered bilateral damage to the auditory cortex, which resulted in musical difficulties while speech understanding remained intact. Their impairment is specific to the recognition of once familiar melodies. They are spared in recognizing environmental sounds and in recognizing lyrics. Peretz (1996) has studied C.N.'s music agnosia further and reports an initial impairment of pitch processing and spared temporal processing.[90] C.N. later recovered in pitch processing abilities but remained impaired in tune recognition and familiarity judgments.

Musical agnosias may be categorized based on the process which is impaired in the individual.[91] Apperceptive music agnosia involves an impairment at the level of perceptual analysis involving an inability to encode musical information correctly. Associative music agnosia reflects an impaired representational system which disrupts music recognition. Many of the cases of music agnosia have resulted from surgery involving the middle cerebral artery. Patient studies have amassed a large amount of evidence demonstrating that the left side of the brain is more suitable for holding long-term memory representations of music and that the right side is important for controlling access to these representations. Associative music agnosias tend to be produced by damage to the left hemisphere, while apperceptive music agnosia reflects damage to the right hemisphere.

Congenital amusia

Congenital amusia, otherwise known as tone deafness, is a term for lifelong musical problems which are not attributable to intellectual disability, lack of exposure to music, deafness, or brain damage after birth.[92] Amusic brains have been found in fMRI studies to have less white matter and thicker cortex than controls in the right inferior frontal cortex. These differences suggest abnormal neuronal development in the auditory cortex and inferior frontal gyrus, two areas which are important in musical-pitch processing.

Studies on those with amusia suggest different processes are involved in speech tonality and musical tonality. Congenital amusics lack the ability to distinguish between pitches and so are, for example, unmoved by dissonance or by the wrong key being played on a piano. They also cannot be taught to remember a melody or to recite a song; however, they are still capable of hearing the intonation of speech, for example, distinguishing between "You speak French" and "You speak French?" when spoken.

Amygdala damage

Damage to the amygdala may impair recognition of scary music.

Damage to the amygdala can cause selective emotional impairments in musical recognition. Gosselin, Peretz, Johnsen and Adolphs (2007) studied S.M., a patient with bilateral damage to the amygdala and an otherwise undamaged temporal lobe, and found that S.M. was impaired in the recognition of scary and sad music.[93] S.M.'s perception of happy music was normal, as was her ability to use cues such as tempo to distinguish between happy and sad music. It appears that damage specific to the amygdala can selectively impair recognition of scary music.

Selective deficit in music reading

Specific musical impairments may result from brain damage leaving other musical abilities intact. Cappelletti, Waley-Cohen, Butterworth and Kopelman (2000) reported a single case study of patient P.K.C., a professional musician who sustained damage to the left posterior temporal lobe as well as a small right occipitotemporal lesion.[94] After sustaining damage to these regions, P.K.C. was selectively impaired in the areas of reading, writing and understanding musical notation but maintained other musical skills. The ability to read aloud letters, words, numbers and symbols (including musical ones) was retained. However, P.K.C. was unable to read musical notes on the staff, regardless of whether the task involved naming them with the conventional letters or singing or playing them. Yet despite this specific deficit, P.K.C. retained the ability to remember and play familiar and new melodies.

Auditory arrhythmia

Arrhythmia in the auditory modality is defined as a disturbance of rhythmic sense and includes deficits such as the inability to rhythmically perform music, the inability to keep time to music and the inability to discriminate between or reproduce rhythmic patterns.[95] A study investigating the elements of rhythmic function examined Patient H.J., who acquired arrhythmia after sustaining a right temporoparietal infarct.[95] Damage to this region impaired H.J.'s central timing system, which was essentially the basis of his global rhythmic impairment. H.J. was unable to generate steady pulses in a tapping task. These findings suggest that keeping a musical beat relies on functioning in the right temporal auditory cortex.

References

  1. ^ Kandler, Karl; Clause, Amanda; Noh, Jihyun (2009). "Tonotopic reorganization of developing auditory brainstem circuits". Nature Neuroscience. 12 (6): 711–7. doi:10.1038/nn.2332. PMC 2780022. PMID 19471270.
  2. ^ Arlinger, S; Elberling, C; Bak, C; Kofoed, B; Lebech, J; Saermark, K (1982). "Cortical magnetic fields evoked by frequency glides of a continuous tone". Electroencephalography and Clinical Neurophysiology. 54 (6): 642–53. doi:10.1016/0013-4694(82)90118-3. PMID 6183097.
  3. ^ a b Köppl, Christine (1997). "Phase Locking to High Frequencies in the Auditory Nerve and Cochlear Nucleus Magnocellularis of the Barn Owl, Tyto alba". Journal of Neuroscience. 17 (9): 3312–21. doi:10.1523/JNEUROSCI.17-09-03312.1997. PMC 6573645. PMID 9096164.
  4. ^ Dreyer, A.; Delgutte, B. (2006). "Phase Locking of Auditory-Nerve Fibers to the Envelopes of High-Frequency Sounds: Implications for Sound Localization". Journal of Neurophysiology. 96 (5): 2327–41. doi:10.1152/jn.00326.2006. PMC 2013745. PMID 16807349.
  5. ^ Laudanski, J.; Coombes, S.; Palmer, A. R.; Sumner, C. J. (2009). "Mode-Locked Spike Trains in Responses of Ventral Cochlear Nucleus Chopper and Onset Neurons to Periodic Stimuli". Journal of Neurophysiology. 103 (3): 1226–37. doi:10.1152/jn.00070.2009. PMC 2887620. PMID 20042702.
  6. ^ Liu, L.-F.; Palmer, AR; Wallace, MN (2006). "Phase-Locked Responses to Pure Tones in the Inferior Colliculus". Journal of Neurophysiology. 95 (3): 1926–35. doi:10.1152/jn.00497.2005. PMID 16339005.
  7. ^ Wallace, M. N.; Anderson, L. A.; Palmer, A. R. (2007). "Phase-Locked Responses to Pure Tones in the Auditory Thalamus". Journal of Neurophysiology. 98 (4): 1941–52. doi:10.1152/jn.00697.2007. PMID 17699690. S2CID 10052217.
  8. ^ Skoe, Erika; Kraus, Nina (2010). "Auditory Brain Stem Response to Complex Sounds: A Tutorial". Ear and Hearing. 31 (3): 302–24. doi:10.1097/AUD.0b013e3181cdb272. PMC 2868335. PMID 20084007.
  9. ^ Hyde, Krista L.; Peretz, Isabelle; Zatorre, Robert J. (2008). "Evidence for the role of the right auditory cortex in fine pitch resolution". Neuropsychologia. 46 (2): 632–9. doi:10.1016/j.neuropsychologia.2007.09.004. PMID 17959204. S2CID 12414672.
  10. ^ Patterson, Roy D; Uppenkamp, Stefan; Johnsrude, Ingrid S; Griffiths, Timothy D (2002). "The Processing of Temporal Pitch and Melody Information in Auditory Cortex". Neuron. 36 (4): 767–76. doi:10.1016/S0896-6273(02)01060-7. PMID 12441063. S2CID 2429799.
  11. ^ Deutsch, D. (2013). "Absolute pitch". In D. Deutsch (Ed.), The Psychology of Music (3rd ed.): 141–182. doi:10.1016/B978-0-12-381460-9.00005-5. ISBN 9780123814609.
  12. ^ Takeuchi, Annie H.; Hulse, Stewart H. (1993). "Absolute pitch". Psychological Bulletin. 113 (2): 345–61. doi:10.1037/0033-2909.113.2.345. PMID 8451339.
  13. ^ Zatorre, Robert J.; Perry, David W.; Beckett, Christine A.; Westbury, Christopher F.; Evans, Alan C. (1998). "Functional anatomy of musical processing in listeners with absolute pitch and relative pitch". Proceedings of the National Academy of Sciences of the United States of America. 95 (6): 3172–7. Bibcode:1998PNAS...95.3172Z. doi:10.1073/pnas.95.6.3172. PMC 19714. PMID 9501235.
  14. ^ a b Brattico, Elvira; Tervaniemi, Mari; Näätänen, Risto; Peretz, Isabelle (2006). "Musical scale properties are automatically processed in the human auditory cortex". Brain Research. 1117 (1): 162–74. doi:10.1016/j.brainres.2006.08.023. PMID 16963000. S2CID 8401429.
  15. ^ . Oxford Dictionaries | English. Archived from the original on September 27, 2016. Retrieved 2019-05-31.
  16. ^ a b c Tramo, M. J. (2001). "Biology and Music: Enhanced: Music of the Hemispheres". Science. 291 (5501): 54–6. doi:10.1126/science.1056899. PMID 11192009. S2CID 132754452.
  17. ^ Snyder, Joel S.; Large, Edward W. (2005). "Gamma-band activity reflects the metric structure of rhythmic tone sequences". Cognitive Brain Research. 24 (1): 117–26. doi:10.1016/j.cogbrainres.2004.12.014. PMID 15922164.
  18. ^ Krumhansl, Carol (1990). Cognitive Foundations of Musical Pitch. New York: Oxford University Press. ISBN 978-0-19-514836-7.[page needed]
  19. ^ Janata, P.; Birk, JL; Van Horn, JD; Leman, M; Tillmann, B; Bharucha, JJ (2002). "The Cortical Topography of Tonal Structures Underlying Western Music". Science. 298 (5601): 2167–70. Bibcode:2002Sci...298.2167J. doi:10.1126/science.1076262. PMID 12481131. S2CID 3031759.
  20. ^ Proverbio, Alice Mado; Orlandi, Andrea; Pisanu, Francesca (2016). "Brain processing of consonance/dissonance in musicians and controls: a hemispheric asymmetry revisited". European Journal of Neuroscience. 44 (6): 2340–2356. doi:10.1111/ejn.13330. ISSN 1460-9568. PMID 27421883. S2CID 3899594.
  21. ^ a b c d e f g h i j k Zatorre, R. J.; Halpern, A. R. (2005). "Mental concerts: musical imagery and auditory cortex". Neuron. 47 (1): 9–12. doi:10.1016/j.neuron.2005.06.013. PMID 15996544. S2CID 1613599.
  22. ^ a b Buhusi, C. V.; Meck, W. H. (2005). "What makes us tick? Functional and neural mechanisms of interval timing". Nature Reviews Neuroscience. 6 (10): 755–765. doi:10.1038/nrn1764. PMID 16163383. S2CID 29616055.
  23. ^ Ivry, R. B.; Spencer, R. M. (2004). "The neural representation of time". Curr. Opin. Neurobiol. 14 (2): 225–232. doi:10.1016/j.conb.2004.03.013. PMID 15082329. S2CID 10629859.
  24. ^ a b Spencer, R. M.; Zelaznik, H. N.; Diedrichson, J.; Ivry, R. B. (2003). "Disrupted timing of discontinuous but not continuous movements by cerebellar lesions". Science. 300 (5624): 1437–1439. Bibcode:2003Sci...300.1437S. doi:10.1126/science.1083661. PMID 12775842. S2CID 16390014.
  25. ^ a b Wing, A. M. (2002). "Voluntary timing and brain function: an information processing approach". Brain Cogn. 48 (1): 7–30. doi:10.1006/brcg.2001.1301. PMID 11812030. S2CID 5596590.
  26. ^ Mauk, M. D.; Buonomano, D. V. (2004). "The neural basis of temporal processing". Annu. Rev. Neurosci. 27: 307–340. doi:10.1146/annurev.neuro.27.070203.144247. PMID 15217335.
  27. ^ Lewis, P. A.; Miall, R. C. (2003). "Distinct systems for automatic and cognitively controlled time measurement: evidence from neuroimaging". Curr. Opin. Neurobiol. 13 (2): 250–255. doi:10.1016/s0959-4388(03)00036-9. PMID 12744981. S2CID 328258.
  28. ^ Graybiel, A. M. (2005). "The basal ganglia: learning new tricks and loving it". Curr. Opin. Neurobiol. 15 (6): 638–644. doi:10.1016/j.conb.2005.10.006. PMID 16271465. S2CID 12490490.
  29. ^ a b Doyon, J.; Penhune, V. B.; Ungerleider, L. G. (2003). "Distinct contribution of the cortico-striatal and corticocerebellar systems to motor skill learning". Neuropsychologia. 41 (3): 252–262. doi:10.1016/s0028-3932(02)00158-6. PMID 12457751. S2CID 1855933.
  30. ^ Penhune, V. B.; Doyon, J. (2005). "Cerebellum and M1 interaction during early learning of timed motor sequences". NeuroImage. 26 (3): 801–812. doi:10.1016/j.neuroimage.2005.02.041. PMID 15955490. S2CID 14531779.
  31. ^ Hikosaka, O.; Nakamura, H.; Sakai, K.; Nakahara, H. (2002). "Central mechanisms of motor skill learning". Curr. Opin. Neurobiol. 12 (2): 217–222. doi:10.1016/s0959-4388(02)00307-0. PMID 12015240. S2CID 12354147.
  32. ^ Thach, W. T. (1998). "A role for the cerebellum in learning movement coordination". Neurobiol. Learn. Mem. 70 (1–2): 177–188. doi:10.1006/nlme.1998.3846. PMID 9753595. S2CID 29972449.
  33. ^ Garraux, G.; et al. (2005). "Shared brain areas but not functional connections in controlling movement timing and order". J. Neurosci. 25 (22): 5290–5297. doi:10.1523/jneurosci.0340-05.2005. PMC 6724991. PMID 15930376.
  34. ^ Sakai, K.; Hikosaka, O.; Nakamura, H. (2004). "Emergence of rhythm during motor learning". Trends Cogn. Sci. 8 (12): 547–553. doi:10.1016/j.tics.2004.10.005. PMID 15556024. S2CID 18845950.
  35. ^ Kennerley, S. W.; Sakai, K.; Rushworth, M. F. (2004). "Organization of action sequences and the role of the pre-SMA". J. Neurophysiol. 91 (2): 978–993. doi:10.1152/jn.00651.2003. PMID 14573560. S2CID 7763911.
  36. ^ Janata, P.; Grafton, S. T. (2003). "Swinging in the brain: shared neural substrates for behaviors related to sequencing and music". Nature Neuroscience. 6 (7): 682–687. doi:10.1038/nn1081. PMID 12830159. S2CID 7605155.
  37. ^ Schubotz, R. I.; von Cramon, D. Y. (2003). "Functional-anatomical concepts of human premotor cortex: evidence from fMRI and PET studies". NeuroImage. 20 (Suppl. 1): S120–S131. doi:10.1016/j.neuroimage.2003.09.014. PMID 14597305. S2CID 10198110.
  38. ^ Johnson, P. B.; Ferraina, S.; Bianchi, L.; Caminiti, R. (1996). "Cortical networks for visual reaching: physiological and anatomical organization of frontal and parietal lobe arm regions". Cerebral Cortex. 6 (2): 102–119. doi:10.1093/cercor/6.2.102. PMID 8670643.
  39. ^ Rizzolatti, G.; Luppino, G.; Matelli, M. (1998). "The organization of the cortical motor system: new concepts". Electroencephalogr. Clin. Neurophysiol. 106 (4): 283–296. doi:10.1016/s0013-4694(98)00022-4. PMID 9741757.
  40. ^ Large, E. W.; Palmer, C. (2002). "Perceiving temporal regularity in music". Cogn Sci. 26: 1–37. doi:10.1207/s15516709cog2601_1.
  41. ^ Thaut, M. H.; McIntosh, G. C.; Rice, R. R. (1997). "Rhythmic facilitation of gait training in hemiparetic stroke rehabilitation". J. Neurol. Sci. 151 (2): 207–212. doi:10.1016/s0022-510x(97)00146-9. PMID 9349677. S2CID 2515325.
  42. ^ McIntosh, G. C.; Brown, S. H.; Rice, R. R.; Thaut, M. H. (1997). "Rhythmic auditory-motor facilitation of gait patterns in patients with Parkinson's disease". J. Neurol. Neurosurg. Psychiatry. 62 (1): 22–26. doi:10.1136/jnnp.62.1.22. PMC 486690. PMID 9010395.
  43. ^ Repp, B. H. (1999). "Effects of auditory feedback deprivation on expressive piano performance". Music Perception. 16 (4): 409–438. doi:10.2307/40285802. JSTOR 40285802.
  44. ^ Pfordresher, P. Q.; Palmer, C. (2006). "Effects of hearing the past, present, or future during music performance". Percept. Psychophys. 68 (3): 362–376. doi:10.3758/bf03193683. PMID 16900830.
  45. ^ Hickok, G.; Poeppel, D. (2004). "Dorsal and ventral streams: a framework for understanding aspects of the functional anatomy of language". Cognition. 92 (1–2): 67–99. doi:10.1016/j.cognition.2003.10.011. PMID 15037127. S2CID 635860.
  46. ^ Scott, S. K.; Johnsrude, I. S. (2003). "The neuroanatomical and functional organization of speech perception". Trends Neurosci. 26: 100–107.
  47. ^ Hickok, G.; Buchsbaum, B.; Humphries, C.; Muftuler, T. (2003). "Auditory–motor interaction revealed by fMRI: speech, music, and working memory in area SPT". J. Cogn. Neurosci. 15 (5): 673–682. doi:10.1162/089892903322307393. PMID 12965041.
  48. ^ Rizzolatti, G.; Fogassi, L.; Gallese, V. (2001). "Neurophysiological mechanisms underlying the understanding and imitation of action". Nature Reviews Neuroscience. 2 (9): 661–670. doi:10.1038/35090060. PMID 11533734. S2CID 6792943.
  49. ^ Kohler, E.; et al. (2002). "Hearing sounds, understanding actions: action representation in mirror neurons". Science. 297 (5582): 846–848. Bibcode:2002Sci...297..846K. doi:10.1126/science.1070311. PMID 12161656. S2CID 16923101.
  50. ^ Keysers, C.; et al. (2003). "Audiovisual mirror neurons and action recognition". Exp. Brain Res. 153 (4): 628–636. doi:10.1007/s00221-003-1603-5. PMID 12937876. S2CID 7704309.
  51. ^ Mado Proverbio, Alice; Calbi, Marta; Manfredi, Mirella; Zani, Alberto (2014-07-29). "Audio-visuomotor processing in the Musician's brain: an ERP study on professional violinists and clarinetists". Scientific Reports. 4: 5866. Bibcode:2014NatSR...4E5866M. doi:10.1038/srep05866. ISSN 2045-2322. PMC 5376193. PMID 25070060.
  52. ^ a b c Brown, Steven; Martinez, Michael J.; Parsons, Lawrence M. (2006). "Music and language side by side in the brain: A PET study of the generation of melodies and sentences". European Journal of Neuroscience. 23 (10): 2791–803. doi:10.1111/j.1460-9568.2006.04785.x. PMID 16817882. S2CID 15189129.
  53. ^ a b Jentschke, Sebastian; Koelsch, Stefan; Sallat, Stephan; Friederici, Angela D. (2008). "Children with Specific Language Impairment Also Show Impairment of Music-syntactic Processing". Journal of Cognitive Neuroscience. 20 (11): 1940–51. doi:10.1162/jocn.2008.20135. PMID 18416683. S2CID 6678801.
  54. ^ a b Stewart, Lauren; Walsh, Vincent; Frith, Uta; Rothwell, John (2001). "Transcranial Magnetic Stimulation Produces Speech Arrest but Not Song Arrest" (PDF). Annals of the New York Academy of Sciences. 930 (1): 433–5. Bibcode:2001NYASA.930..433S. doi:10.1111/j.1749-6632.2001.tb05762.x. PMID 11458860. S2CID 31971115.
  55. ^ Koelsch, Stefan; Gunter, Thomas C.; v Cramon, D.Yves; Zysset, Stefan; Lohmann, Gabriele; Friederici, Angela D. (2002). "Bach Speaks: A Cortical "Language-Network" Serves the Processing of Music". NeuroImage. 17 (2): 956–66. doi:10.1006/nimg.2002.1154. hdl:11858/00-001M-0000-0010-9FF9-6. PMID 12377169.
  56. ^ Daltrozzo, Jérôme; Schön, Daniele (2009). "Conceptual Processing in Music as Revealed by N400 Effects on Words and Musical Targets". Journal of Cognitive Neuroscience. 21 (10): 1882–92. doi:10.1162/jocn.2009.21113. PMID 18823240. S2CID 10848425.
  57. ^ Deutsch, Diana; Henthorn, Trevor; Marvin, Elizabeth; Xu, Hongshuai (2006). "Absolute pitch among American and Chinese conservatory students: Prevalence differences, and evidence for a speech-related critical period". The Journal of the Acoustical Society of America. 119 (2): 719–22. Bibcode:2006ASAJ..119..719D. doi:10.1121/1.2151799. PMID 16521731.
  58. ^ Deutsch, Diana; Dooley, Kevin; Henthorn, Trevor; Head, Brian (2009). "Absolute pitch among students in an American music conservatory: Association with tone language fluency". The Journal of the Acoustical Society of America. 125 (4): 2398–403. Bibcode:2009ASAJ..125.2398D. doi:10.1121/1.3081389. PMID 19354413.
  59. ^ Gaser, C; Schlaug, G (2003). "Brain structures differ between musicians and non-musicians". The Journal of Neuroscience. 23 (27): 9240–5. doi:10.1523/JNEUROSCI.23-27-09240.2003. PMC 6740845. PMID 14534258.
  60. ^ a b Croom, Adam M. (2012). "Music, Neuroscience, and the Psychology of Well-Being: A Précis". Frontiers in Psychology. 2: 393. doi:10.3389/fpsyg.2011.00393. PMC 3249389. PMID 22232614.
  61. ^ Krings, Timo; Töpper, Rudolf; Foltys, Henrik; Erberich, Stephan; Sparing, Roland; Willmes, Klaus; Thron, Armin (2000). "Cortical activation patterns during complex motor tasks in piano players and control subjects. A functional magnetic resonance imaging study". Neuroscience Letters. 278 (3): 189–93. doi:10.1016/S0304-3940(99)00930-1. PMID 10653025. S2CID 6564482.
  62. ^ Koeneke, Susan; Lutz, Kai; Wüstenberg, Torsten; Jäncke, Lutz (2004). "Long-term training affects cerebellar processing in skilled keyboard players". NeuroReport. 15 (8): 1279–82. doi:10.1097/01.wnr.0000127463.10147.e7. PMID 15167549. S2CID 14517466.
  63. ^ Chan, Agnes S.; Ho, Yim-Chi; Cheung, Mei-Chun (1998). "Music training improves verbal memory". Nature. 396 (6707): 128. Bibcode:1998Natur.396..128C. doi:10.1038/24075. ISSN 1476-4687. PMID 9823892. S2CID 4425221.
  64. ^ a b Koelsch, Stefan; Gunter, Tomas; Friederici, Angela D.; Schröger, Erich (2000). "Brain Indices of Music Processing: "Nonmusicians" are Musical". Journal of Cognitive Neuroscience. 12 (3): 520–41. doi:10.1162/089892900562183. PMID 10931776. S2CID 6205775.
  65. ^ a b Koelsch, Stefan; Schroger, Erich; Gunter, Thomas C. (2002). "Music matters: Preattentive musicality of the human brain". Psychophysiology. 39 (1): 38–48. doi:10.1111/1469-8986.3910038. hdl:11858/00-001M-0000-0010-C96E-D. PMID 12206294.
  66. ^ Koelsch, Stefan; Maess, Burkhard; Grossmann, Tobias; Friederici, Angela D. (2003). "Electric brain responses reveal gender differences in music processing". NeuroReport. 14 (5): 709–13. doi:10.1097/00001756-200304150-00010. hdl:11858/00-001M-0000-0010-B017-B. PMID 12692468.
  67. ^ Koelsch, Stefan; Grossmann, Tobias; Gunter, Thomas C.; Hahne, Anja; Schröger, Erich; Friederici, Angela D. (2003). "Children Processing Music: Electric Brain Responses Reveal Musical Competence and Gender Differences". Journal of Cognitive Neuroscience. 15 (5): 683–93. doi:10.1162/jocn.2003.15.5.683. hdl:11858/00-001M-0000-0010-A3D0-1. PMID 12965042. S2CID 10553168.
  68. ^ Deutsch, D (February 1978). "Pitch memory: An advantage for the left-handed". Science. 199 (4328): 559–560. Bibcode:1978Sci...199..559D. doi:10.1126/science.622558. PMID 622558.
  69. ^ Deutsch, Diana (1980). "Handedness and Memory for Tonal Pitch" (PDF). In Herron, Jeannine (ed.). Neuropsychology of Lefthandedness. pp. 263–71.
  70. ^ Deutsch, Diana (1974). "An auditory illusion". Nature. 251 (5473): 307–9. Bibcode:1974Natur.251..307D. doi:10.1038/251307a0. PMID 4427654. S2CID 4273134.
  71. ^ Deutsch, Diana (1983). "The octave illusion in relation to handedness and familial handedness background". Neuropsychologia. 21 (3): 289–93. doi:10.1016/0028-3932(83)90047-7. PMID 6877583. S2CID 3063526.
  72. ^ Deutsch, Diana (1975). "Two-channel listening to musical scales". The Journal of the Acoustical Society of America. 57 (5): 1156–60. Bibcode:1975ASAJ...57.1156D. doi:10.1121/1.380573. PMID 1127169.
  73. ^ Deutsch, D. (1999). "Grouping mechanisms in music" (PDF). In Deutsch, D. (ed.). The psychology of music (2nd ed.). pp. 299–348.
  74. ^ Halpern, Andrea R. (2001). "Cerebral Substrates of Musical Imagery". Annals of the New York Academy of Sciences. 930 (1): 179–92. Bibcode:2001NYASA.930..179H. doi:10.1111/j.1749-6632.2001.tb05733.x. PMID 11458829. S2CID 31277594.
  75. ^ Herholz, Sibylle C.; Lappe, Claudia; Knief, Arne; Pantev, Christo (2008). "Neural basis of music imagery and the effect of musical expertise". European Journal of Neuroscience. 28 (11): 2352–60. doi:10.1111/j.1460-9568.2008.06515.x. PMID 19046375. S2CID 205513912.
  76. ^ Zatorre, Robert J.; Halpern, Andrea R.; Perry, David W.; Meyer, Ernst; Evans, Alan C. (1996). "Hearing in the Mind's Ear: A PET Investigation of Musical Imagery and Perception". Journal of Cognitive Neuroscience. 8 (1): 29–46. doi:10.1162/jocn.1996.8.1.29. PMID 23972234. S2CID 11312311.
  77. ^ a b Blood, A. J.; Zatorre, R. J. (2001). "Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion". Proceedings of the National Academy of Sciences. 98 (20): 11818–11823. Bibcode:2001PNAS...9811818B. doi:10.1073/pnas.191355898. PMC 58814. PMID 11573015.
  78. ^ Collins, Francis S.; Fleming, Renée; Rutter, Deborah; Iyengar, Sunil; Tottenham, Nim; Patel, Aniruddh D.; Limb, Charles; Johnson, Julene K.; Holochwost, Steven J. (2018-03-21). "NIH/Kennedy Center Workshop on Music and the Brain: Finding Harmony". Neuron. 97 (6): 1214–1218. doi:10.1016/j.neuron.2018.02.004. ISSN 0896-6273. PMC 6688399. PMID 29566791.
  79. ^ a b Schmidt, Louis A.; Trainor, Laurel J. (2001). "Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions". Cognition & Emotion. 15 (4): 487–500. doi:10.1080/02699930126048. S2CID 5557258.
  80. ^ Baird, Amee; Samson, Séverine (2009). "Memory for Music in Alzheimer's Disease: Unforgettable?". Neuropsychology Review. 19 (1): 85–101. doi:10.1007/s11065-009-9085-2. PMID 19214750. S2CID 14341862.
  81. ^ a b Platel, Hervé; Baron, Jean-Claude; Desgranges, Béatrice; Bernard, Frédéric; Eustache, Francis (2003). "Semantic and episodic memory of music are subserved by distinct neural networks". NeuroImage. 20 (1): 244–56. doi:10.1016/S1053-8119(03)00287-8. PMID 14527585. S2CID 17195548.
  82. ^ Kapur, Shitij; Craik, Fergus I. M.; Jones, Corey; Brown, Gregory M.; Houle, Sylvain; Tulving, Endel (1995). "Functional role of the prefrontal cortex in retrieval of memories: A PET study". NeuroReport. 6 (14): 1880–4. doi:10.1097/00001756-199510020-00014. PMID 8547589. S2CID 21792266.
  83. ^ Gaab, Nadine; Gaser, Christian; Zaehle, Tino; Jancke, Lutz; Schlaug, Gottfried (2003). "Functional anatomy of pitch memory—an fMRI study with sparse temporal sampling". NeuroImage. 19 (4): 1417–26. doi:10.1016/S1053-8119(03)00224-6. PMID 12948699. S2CID 1878442.
  84. ^ a b c Burriss, Kathleen Glascott; Strickland, Susan J. (2001). "Review of Research: Music and the Brain in Childhood Development". Childhood Education. 78 (2): 100–103. doi:10.1080/00094056.2002.10522714. S2CID 219597861.
  85. ^ Schulkind, Matthew D.; Hennis, Laura Kate; Rubin, David C. (1999-11-01). "Music, emotion, and autobiographical memory: They're playing your song". Memory & Cognition. 27 (6): 948–955. doi:10.3758/BF03201225. hdl:10161/10143. ISSN 1532-5946. PMID 10586571. S2CID 34931829.
  86. ^ a b Treder, Matthias; Purwins, Hendrik; Miklody, Daniel; Sturm, Irene; Blankertz, Benjamin (2014). "Decoding auditory attention to instruments in polyphonic music using single-trial EEG classification" (PDF). Journal of Neural Engineering. 11 (2): 026009. Bibcode:2014JNEng..11b6009T. doi:10.1088/1741-2560/11/2/026009. PMID 24608228. S2CID 35135614.
  87. ^ Chen, R; Hallett, M (1998). "Focal dystonia and repetitive motion disorders". Clinical Orthopaedics and Related Research (351): 102–6. PMID 9646753.
  88. ^ Pujol, J.; Roset-Llobet, J.; Rosinés-Cubells, D.; Deus, J.; Narberhaus, B.; Valls-Solé, J.; Capdevila, A.; Pascual-Leone, A. (2000). "Brain Cortical Activation during Guitar-Induced Hand Dystonia Studied by Functional MRI". NeuroImage. 12 (3): 257–67. doi:10.1006/nimg.2000.0615. PMID 10944408. S2CID 24205160.
  89. ^ Dalla Bella, Simone; Peretz, Isabelle (1999). "Music Agnosias: Selective Impairments of Music Recognition After Brain Damage". Journal of New Music Research. 28 (3): 209–216. doi:10.1076/jnmr.28.3.209.3108.
  90. ^ Peretz, Isabelle (1996). "Can We Lose Memory for Music? A Case of Music Agnosia in a Nonmusician". Journal of Cognitive Neuroscience. 8 (6): 481–96. doi:10.1162/jocn.1996.8.6.481. PMID 23961980. S2CID 25846736.
  91. ^ Ayotte, J. (2000). "Patterns of music agnosia associated with middle cerebral artery infarcts". Brain. 123 (9): 1926–38. doi:10.1093/brain/123.9.1926. PMID 10960056.
  92. ^ Peretz, Isabelle (2008). "Musical Disorders: From Behavior to Genes". Current Directions in Psychological Science. 17 (5): 329–333. doi:10.1111/j.1467-8721.2008.00600.x. S2CID 15242461.
  93. ^ Gosselin, Nathalie; Peretz, Isabelle; Johnsen, Erica; Adolphs, Ralph (2007). "Amygdala damage impairs emotion recognition from music". Neuropsychologia. 45 (2): 236–44. doi:10.1016/j.neuropsychologia.2006.07.012. PMID 16970965. S2CID 14537793.
  94. ^ Cappelletti, M.; Waley-Cohen, H.; Butterworth, B.; Kopelman, M. (2000). "A selective loss of the ability to read and to write music". Neurocase. 6 (4): 321–332. doi:10.1080/13554790008402780. S2CID 144572937.
  95. ^ a b Wilson, Sarah J; Pressing, Jeffrey L; Wales, Roger J (2002). "Modelling rhythmic function in a musician post-stroke". Neuropsychologia. 40 (8): 1494–505. doi:10.1016/S0028-3932(01)00198-1. PMID 11931954. S2CID 16730354.

External links

  • MusicCognition.info - A Resource and Information Center

Nature Reviews Neuroscience 2 9 661 670 doi 10 1038 35090060 PMID 11533734 S2CID 6792943 Kohler E et al 2002 Hearing sounds understanding actions action representation in mirror neurons Science 297 5582 846 848 Bibcode 2002Sci 297 846K doi 10 1126 science 1070311 PMID 12161656 S2CID 16923101 Keysers C et al 2003 Audiovisual mirror neurons and action recognition Exp Brain Res 153 4 628 636 doi 10 1007 s00221 003 1603 5 PMID 12937876 S2CID 7704309 Mado Proverbio Alice Calbi Marta Manfredi Mirella Zani Alberto 2014 07 29 Audio visuomotor processing in the Musician s brain an ERP study on professional violinists and clarinetists Scientific Reports 4 5866 Bibcode 2014NatSR 4E5866M doi 10 1038 srep05866 ISSN 2045 2322 PMC 5376193 PMID 25070060 a b c Brown Steven Martinez Michael J Parsons Lawrence M 2006 Music and language side by side in the brain A PET study of the generation of melodies and sentences European Journal of Neuroscience 23 10 2791 803 doi 10 1111 j 1460 9568 2006 04785 x PMID 16817882 S2CID 15189129 a b Jentschke Sebastian Koelsch Stefan Sallat Stephan Friederici Angela D 2008 Children with Specific Language Impairment Also Show Impairment of Music syntactic Processing Journal of Cognitive Neuroscience 20 11 1940 51 doi 10 1162 jocn 2008 20135 PMID 18416683 S2CID 6678801 a b Stewart Lauren Walsh Vincent Frith UTA Rothwell John 2006 Transcranial Magnetic Stimulation Produces Speech Arrest but Not Song Arrest PDF Annals of the New York Academy of Sciences 930 1 433 5 Bibcode 2001NYASA 930 433S doi 10 1111 j 1749 6632 2001 tb05762 x PMID 11458860 S2CID 31971115 Koelsch Stefan Gunter Thomas C v Cramon D Yves Zysset Stefan Lohmann Gabriele Friederici Angela D 2002 Bach Speaks A Cortical Language Network Serves the Processing of Music NeuroImage 17 2 956 66 doi 10 1006 nimg 2002 1154 hdl 11858 00 001M 0000 0010 9FF9 6 PMID 12377169 Daltrozzo Jerome Schon Daniele 2009 Conceptual Processing in Music as Revealed by N400 Effects on Words and Musical Targets Journal of Cognitive Neuroscience 21 10 1882 92 doi 10 1162 jocn 2009 21113 PMID 18823240 S2CID 10848425 Deutsch Diana Henthorn Trevor Marvin Elizabeth Xu Hongshuai 2006 Absolute pitch among American and Chinese conservatory students Prevalence differences and evidence for a speech related critical period The Journal of the Acoustical Society of America 119 2 719 22 Bibcode 2006ASAJ 119 719D doi 10 1121 1 2151799 PMID 16521731 Deutsch Diana Dooley Kevin Henthorn Trevor Head Brian 2009 Absolute pitch among students in an American music conservatory Association with tone language fluency The Journal of the Acoustical Society of America 125 4 2398 403 Bibcode 2009ASAJ 125 2398D doi 10 1121 1 3081389 PMID 19354413 Gaser C Schlaug G 2003 Brain structures differ between musicians and non musicians The Journal of Neuroscience 23 27 9240 5 doi 10 1523 JNEUROSCI 23 27 09240 2003 PMC 6740845 PMID 14534258 a b Croom Adam M 2012 Music Neuroscience and the Psychology of Well Being A Precis Frontiers in Psychology 2 393 doi 10 3389 fpsyg 2011 00393 PMC 3249389 PMID 22232614 Krings Timo Topper Rudolf Foltys Henrik Erberich Stephan Sparing Roland Willmes Klaus Thron Armin 2000 Cortical activation patterns during complex motor tasks in piano players and control subjects A functional magnetic resonance imaging study Neuroscience Letters 278 3 189 93 doi 10 1016 S0304 3940 99 00930 1 PMID 10653025 S2CID 6564482 Koeneke Susan Lutz Kai Wustenberg Torsten Jancke Lutz 2004 Long term training affects cerebellar processing in skilled keyboard players 
NeuroReport 15 8 1279 82 doi 10 1097 01 wnr 0000127463 10147 e7 PMID 15167549 S2CID 14517466 Chan Agnes S Ho Yim Chi Cheung Mei Chun 1998 Music training improves verbal memory Nature 396 6707 128 Bibcode 1998Natur 396 128C doi 10 1038 24075 ISSN 1476 4687 PMID 9823892 S2CID 4425221 a b Koelsch Stefan Gunter Tomas Friederici Angela D Schroger Erich 2000 Brain Indices of Music Processing Nonmusicians are Musical Journal of Cognitive Neuroscience 12 3 520 41 doi 10 1162 089892900562183 PMID 10931776 S2CID 6205775 a b Koelsch Stefan Schroger Erich Gunter Thomas C 2002 Music matters Preattentive musicality of the human brain Psychophysiology 39 1 38 48 doi 10 1111 1469 8986 3910038 hdl 11858 00 001M 0000 0010 C96E D PMID 12206294 Koelsch Stefan Maess Burkhard Grossmann Tobias Friederici Angela D 2003 Electric brain responses reveal gender differences in music processing NeuroReport 14 5 709 13 doi 10 1097 00001756 200304150 00010 hdl 11858 00 001M 0000 0010 B017 B PMID 12692468 Koelsch Stefan Grossmann Tobias Gunter Thomas C Hahne Anja Schroger Erich Friederici Angela D 2003 Children Processing Music Electric Brain Responses Reveal Musical Competence and Gender Differences Journal of Cognitive Neuroscience 15 5 683 93 doi 10 1162 jocn 2003 15 5 683 hdl 11858 00 001M 0000 0010 A3D0 1 PMID 12965042 S2CID 10553168 Deutsch D February 1978 Pitch memory An advantage for the left handed Science 199 4328 559 560 Bibcode 1978Sci 199 559D doi 10 1126 science 622558 PMID 622558 Deutsch Diana 1980 Handedness and Memory for Tonal Pitch PDF In Herron Jeannine ed Neuropsychology of Lefthandedness pp 263 71 Deutsch Diana 1974 An auditory illusion Nature 251 5473 307 9 Bibcode 1974Natur 251 307D doi 10 1038 251307a0 PMID 4427654 S2CID 4273134 Deutsch Diana 1983 The octave illusion in relation to handedness and familial handedness background Neuropsychologia 21 3 289 93 doi 10 1016 0028 3932 83 90047 7 PMID 6877583 S2CID 3063526 Deutsch Diana 1975 Two channel listening to musical scales The Journal of the Acoustical Society of America 57 5 1156 60 Bibcode 1975ASAJ 57 1156D doi 10 1121 1 380573 PMID 1127169 Deutsch D 1999 Grouping mechanisms in music PDF In Deutsch D ed The psychology of music 2nd ed pp 299 348 Halpern Andrea R 2006 Cerebral Substrates of Musical Imagery Annals of the New York Academy of Sciences 930 1 179 92 Bibcode 2001NYASA 930 179H doi 10 1111 j 1749 6632 2001 tb05733 x PMID 11458829 S2CID 31277594 Herholz Sibylle C Lappe Claudia Knief Arne Pantev Christo 2008 Neural basis of music imagery and the effect of musical expertise European Journal of Neuroscience 28 11 2352 60 doi 10 1111 j 1460 9568 2008 06515 x PMID 19046375 S2CID 205513912 Zatorre Robert J Halpern Andrea R Perry David W Meyer Ernst Evans Alan C 1996 Hearing in the Mind s Ear A PET Investigation of Musical Imagery and Perception Journal of Cognitive Neuroscience 8 1 29 46 doi 10 1162 jocn 1996 8 1 29 PMID 23972234 S2CID 11312311 a b Blood A J Zatorre R J 2001 Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion Proceedings of the National Academy of Sciences 98 20 11818 11823 Bibcode 2001PNAS 9811818B doi 10 1073 pnas 191355898 PMC 58814 PMID 11573015 Collins Francis S Fleming Renee Rutter Deborah Iyengar Sunil Tottenham Nim Patel Aniruddh D Limb Charles Johnson Julene K Holochwost Steven J 2018 03 21 NIH Kennedy Center Workshop on Music and the Brain Finding Harmony Neuron 97 6 1214 1218 doi 10 1016 j neuron 2018 02 004 ISSN 0896 6273 PMC 6688399 PMID 29566791 a b 
Schmidt Louis A Trainor Laurel J 2001 Frontal brain electrical activity EEG distinguishes valence and intensity of musical emotions Cognition amp Emotion 15 4 487 500 doi 10 1080 02699930126048 S2CID 5557258 Baird Amee Samson Severine 2009 Memory for Music in Alzheimer s Disease Unforgettable Neuropsychology Review 19 1 85 101 doi 10 1007 s11065 009 9085 2 PMID 19214750 S2CID 14341862 a b Platel Herve Baron Jean Claude Desgranges Beatrice Bernard Frederic Eustache Francis 2003 Semantic and episodic memory of music are subserved by distinct neural networks NeuroImage 20 1 244 56 doi 10 1016 S1053 8119 03 00287 8 PMID 14527585 S2CID 17195548 Kapur Shitij Craik Fergus I M Jones Corey Brown Gregory M Houle Sylvain Tulving Endel 1995 Functional role of the prefrontal cortex in retrieval of memories A PET study NeuroReport 6 14 1880 4 doi 10 1097 00001756 199510020 00014 PMID 8547589 S2CID 21792266 Gaab Nadine Gaser Christian Zaehle Tino Jancke Lutz Schlaug Gottfried 2003 Functional anatomy of pitch memory an fMRI study with sparse temporal sampling NeuroImage 19 4 1417 26 doi 10 1016 S1053 8119 03 00224 6 PMID 12948699 S2CID 1878442 a b c Burriss Kathleen Glascott Strickland Susan J 2001 Review of Research Music and the Brain in Childhood Development Childhood Education 78 2 100 103 doi 10 1080 00094056 2002 10522714 S2CID 219597861 Schulkind Matthew D Hennis Laura Kate Rubin David C 1999 11 01 Music emotion and autobiographical memory They re playing your song Memory amp Cognition 27 6 948 955 doi 10 3758 BF03201225 hdl 10161 10143 ISSN 1532 5946 PMID 10586571 S2CID 34931829 a b Treder Matthias Purwins Hendrik Miklody Daniel Sturm Irene Blankertz Benjamin 2014 Decoding auditory attention to instruments in polyphonic music using single trial EEG classification PDF Journal of Neural Engineering 11 2 026009 Bibcode 2014JNEng 11b6009T doi 10 1088 1741 2560 11 2 026009 PMID 24608228 S2CID 35135614 Chen R Hallett M 1998 Focal dystonia and repetitive motion disorders Clinical Orthopaedics and Related Research 351 102 6 PMID 9646753 Pujol J Roset Llobet J Rosines Cubells D Deus J Narberhaus B Valls Sole J Capdevila A Pascual Leone A 2000 Brain Cortical Activation during Guitar Induced Hand Dystonia Studied by Functional MRI NeuroImage 12 3 257 67 doi 10 1006 nimg 2000 0615 PMID 10944408 S2CID 24205160 Dalla Bella Simone Peretz Isabelle 1999 Music Agnosias Selective Impairments of Music Recognition After Brain Damage Journal of New Music Research 28 3 209 216 doi 10 1076 jnmr 28 3 209 3108 Peretz Isabelle 1996 Can We Lose Memory for Music A Case of Music Agnosia in a Nonmusician Journal of Cognitive Neuroscience 8 6 481 96 doi 10 1162 jocn 1996 8 6 481 PMID 23961980 S2CID 25846736 Ayotte J 2000 Patterns of music agnosia associated with middle cerebral artery infarcts Brain 123 9 1926 38 doi 10 1093 brain 123 9 1926 PMID 10960056 Peretz Isabelle 2008 Musical Disorders From Behavior to Genes Current Directions in Psychological Science 17 5 329 333 doi 10 1111 j 1467 8721 2008 00600 x S2CID 15242461 Gosselin Nathalie Peretz Isabelle Johnsen Erica Adolphs Ralph 2007 Amygdala damage impairs emotion recognition from music Neuropsychologia 45 2 236 44 doi 10 1016 j neuropsychologia 2006 07 012 PMID 16970965 S2CID 14537793 Cappelletti M Waley Cohen H Butterworth B Kopelman M 2000 A selective loss of the ability to read and to write music Neurocase 6 4 321 332 doi 10 1080 13554790008402780 S2CID 144572937 a b Wilson Sarah J Pressing Jeffrey L Wales Roger J 2002 Modelling rhythmic function in a musician post 
stroke Neuropsychologia 40 8 1494 505 doi 10 1016 S0028 3932 01 00198 1 PMID 11931954 S2CID 16730354 External links editMusicCognition info A Resource and Information Center Retrieved from https en wikipedia org w index php title Neuroscience of music amp oldid 1193750483, wikipedia, wiki, book, books, library,
