(Credit: Ramon/Flickr)

Brain scans show jazz musicians ‘speak’ music

When jazz musicians “trade fours” they use parts of the brain linked to the structure of spoken language, but not those tied to spoken meaning.

“Trading fours” is a jazz pattern in which musicians play brief alternating solos, usually four bars in length, introducing new melodies in response to each other’s ideas, and then elaborating on and modifying them over the course of a performance.

The fMRI brain scans suggest that the brain regions that process syntax aren't limited to analyzing spoken language. Rather, the brain uses those syntactic areas to process communication in general, including communication through music.

“When two jazz musicians seem lost in thought while ‘trading fours,’ they aren’t simply waiting for their turn to play,” says Charles Limb, an associate professor of otolaryngology-head and neck surgery at Johns Hopkins University who is also on the faculty of the Peabody Conservatory.

“Instead, they are using the syntactic areas of their brain to process what they are hearing so they can respond by playing a new series of notes that hasn’t previously been composed or practiced.”

Limb, a surgeon, brain researcher, and jazz saxophonist, says the work sheds new light on the complex relationship between music and language.

Syntax vs. semantics

“Until now, studies of how the brain processes auditory communication between two individuals have been done only in the context of spoken language. But looking at jazz lets us investigate the neurological basis of interactive, musical communication as it occurs outside of spoken language,” he says.

“We’ve shown in this study that there is a fundamental difference between how meaning is processed by the brain for music and language. Specifically, it’s syntactic and not semantic processing that is key to this type of musical communication. Meanwhile, conventional notions of semantics may not apply to musical processing by the brain.”

For the study, published in PLOS ONE, researchers recruited 11 men highly proficient in jazz piano performance. During each 10-minute session of trading fours, one musician lay on his back inside an MRI machine with a plastic piano keyboard resting on his lap while his legs were elevated with a cushion.

A pair of mirrors was placed so the musician could look directly up while in the MRI machine and see the placement of his fingers on the keyboard. The keyboard had no metal parts that would be attracted to the large magnet in the MRI.

Improvisational exchanges between the musicians activated areas of the brain linked to syntactic processing for language, called the inferior frontal gyrus and posterior superior temporal gyrus. In contrast, the musical exchange deactivated brain structures involved in semantic processing, called the angular gyrus and supramarginal gyrus.

The Dana Foundation and the Brain Science Institute of the Johns Hopkins University School of Medicine funded the research.

Source: Johns Hopkins University

1 Comment

You are free to share this article under the Creative Commons Attribution-NoDerivs 3.0 Unported license.

  1. Patty Carlson

    As a film score composer for over 12 years, my responsibility was to compose music that heightened the audience's emotional response to the visual content of the film. I defined the language of music as a mathematical sequence of structural form and motion. Altering how the subject of music is understood is critically important to further studies of communication and higher brain function. I respectfully submit my work to this community at http://www.pianologic.com.
