Tuesday, 12 February 2008



Music and Language: dissociation between rule-crunching and memory-retrieval systems

I have previously written about how concepts are stored in the brain: they involve rule-based systems (A is a bachelor if A is single AND A is male) and memory-based systems (prototypes and exemplars). I have also looked at how language involves both a rule system (the syntax of the language) and a memory system (semantics, or word meanings), and how normal language comprehension as well as production engages both types of systems.

It is a popular paradigm in cognitive-linguistics research to present unexpected words in sentences (such as "I'll have my coffee with milk and concrete") while monitoring brain activity using ERP; the presentation of an unexpected word leads to an N400 peak over temporal lobe areas. This violation of semantics is differentiated from a violation of the sentence's syntax, which instead produces changed activity in the frontal lobes.

"Up until now, researchers had found that the processing of rules

relies on an overlapping set of frontal lobe structures in music

and language. However, in addition to rules, both language and

music crucially require the memorization of arbitrary information

such as words and melodies," says the study's principal

investigator, Michael Ullman, Ph.D., professor of neuroscience,

psychology, neurology and linguistics.

For the first time, similar results have been obtained for music. If one assumes that changing an in-key note in a familiar melody is akin to an unexpected word in a sentence, then the same N400 peak is observed. Likewise, if a violation of harmonic rules, such as an out-of-key note in an unfamiliar melody, is akin to a violation of linguistic syntax, then here too similar changes in frontal lobe activity are observed.

The subjects listened to 180 snippets of melodies. Half of the

melodies were segments from tunes that most participants would

know, such as "Three Blind Mice" and "Twinkle, Twinkle Little

Star." The other half included novel tunes composed by Miranda.

Three versions of each well-known and novel melody were created:

melodies containing an in-key deviant note (which could only be

detected if the melody was familiar, and therefore memorized);

melodies that contained an out-of-key deviant note (which violated

rules of harmony); and the original (control) melodies.

For listeners familiar with a melody, an in-key deviant note

violated the listener's memory of the melody - the song sounded

musically "correct" and didn't violate any rules of music, but it

was different than what the listener had previously memorized. In

contrast, in-key "deviant" notes in novel melodies did not violate

memory (or rules) because the listeners did not know the tune.

Out-of-key deviant notes constituted violations of musical rules in

both well-known and novel melodies. Additionally, out-of-key

deviant notes violated memory in well-known melodies.

Miranda and Ullman examined the brain waves of the participants who

listened to melodies in the different conditions, and found that

violations of rules and memory in music corresponded to the two

patterns of brain waves seen in previous studies of rule and memory

violations in language. That is, in-key violations of familiar (but

not novel) melodies led to a brain-wave pattern similar to one

called an "N400" that has previously been found with violations of

words (such as, "I'll have my coffee with milk and concrete").

Out-of-key violations of both familiar and novel melodies led to a

brain-wave pattern over frontal lobe electrodes similar to patterns

previously found for violations of rules in both language and

music. Finally, out-of-key violations of familiar melodies also led

to an N400-like pattern of brain activity, as expected because

these are violations of memory as well as rules.

"This tells us that these two aspects of music, that is rules and

memorized melodies, depend on two different brain systems - brain

systems that also underlie rules and memorized information in

language," Ullman says. "The findings open up exciting new ways of

thinking about and investigating the relationship between language

and music, two fundamental human capacities."

To me this seems exciting. My thesis has been that men are better at rule-based things (syntax and harmony), while women are better at memory-based things (semantics and melody), so I would like to know whether the authors observed any gender effects. If so, this would be further evidence for the abstract-vs-concrete gender difference theory.

Labels: categorization, language, music
