Decoding music: interpreting the language of pitch
Anna Mack
Imagine processing music as a language: each note is translated into a letter, a phrase into a word, and a piece into a story. To most, music is a language because the brain can perceive it as such. As the brain receives sound waves, it organizes the signals tonotopically by frequency (McDermott & Oxenham, 2008). As it interprets the complexity of music, the brain recognizes the movement of a melody as if it were a voice conveying words.
Using relative pitch, a mechanism that recognizes a pitch's relation to the pitches around it, the brain can perceive intonation changes, which carry the pitch patterns of spoken language, and discern shifts in musical intervals, which shape the tonality and texture of music (McDermott & Oxenham, 2008). The story of music is perceived from the changes between letters. Trained musicians, however, may recognize pitch patterns differently. Instead of identifying the relationships between tones, they process music as if it were an alphabet, assigning labels to individual tones. People with perfect pitch use a mechanism termed absolute pitch: the memorization of pitches and pitch labels (Leipold et al., 2019). Instead of relying on comparisons between two successive tones, a person with perfect pitch can recall a previously perceived pitch and the label associated with it (Leipold et al., 2019). In theory, this means the labeling of pitches spells the words that ultimately create the musical story.
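As a rough illustration not drawn from the sources, the two mechanisms can be contrasted computationally: relative pitch measures the step between two tones, while absolute pitch maps a single tone directly to a stored label. The sketch below assumes equal temperament with A4 tuned to 440 Hz; the function names and the MIDI-style numbering are illustrative choices, not anything proposed in the cited studies.

```python
import math

def interval_semitones(f1, f2):
    """Relative pitch: the size of the step between two tones, in semitones."""
    return 12 * math.log2(f2 / f1)

# Absolute pitch: a stored lookup from a tone to its label
# (equal temperament, A4 = 440 Hz assumed).
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_label(freq, a4=440.0):
    """Round a frequency to the nearest equal-tempered note and name it."""
    semitones_from_a4 = round(12 * math.log2(freq / a4))
    midi = 69 + semitones_from_a4  # A4 is MIDI note 69
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

print(interval_semitones(440.0, 660.0))  # ~7 semitones: a perfect fifth
print(note_label(440.0))                 # "A4"
```

A listener with only relative pitch can report the seven-semitone jump but not that the tones were A4 and E5; a listener with absolute pitch retrieves the labels directly.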
With the exception of individuals with amusia, also known as tone-deafness, all individuals innately have relative pitch perception (McDermott & Oxenham, 2008). Because memory retrieval is comparatively simple, absolute pitch perception may be more neurally efficient than relative pitch when recalling pitches (Leipold et al., 2019). The brain's ability to translate the musical language through two coexisting neural mechanisms, relative pitch and absolute pitch, displays the complexity of music processing, especially with musical training.
Unlike other important dimensions of music, such as timbre and rhythm, pitch relies heavily on perception (McDermott & Oxenham, 2008). Pitch is defined as the "perceptual correlate of periodicity in sounds" (McDermott & Oxenham, 2008). In other words, pitch is the brain's interpretation of the repeating waveforms within a sound, which often have harmonic spectra: the frequencies of successive harmonics are all integer multiples of a common fundamental frequency (F0). These harmonic spectra are how a vibrating string can produce octave, fifth, third, and seventh harmonics. The cochlea transduces sound into neural signals, filtering the varying frequencies within a single sound by their place along its length, an arrangement referred to as tonotopic organization (McDermott & Oxenham, 2008). Although tonotopic organization is an important feature of relaying auditory information, recalling a pitch by its distinct frequency almost always requires musical training (Leipold et al., 2019). In contrast, behavioral studies of melody transposition in young infants imply an innate feature of the auditory system: relative pitch (McDermott & Oxenham, 2008). While absolute pitch detection is learned through musical training, relative pitch is inherent to human auditory processing.
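The harmonic-spectrum relationship above can be made concrete with a short sketch. The choice of F0 = 110 Hz (the A2 string on a guitar) is an assumed example, not one taken from the cited papers.

```python
def harmonic_spectrum(f0, n_harmonics=7):
    """Frequencies of the first n harmonics of a periodic sound:
    integer multiples of the fundamental frequency F0."""
    return [f0 * n for n in range(1, n_harmonics + 1)]

# Assumed example: a string vibrating at F0 = 110 Hz (A2).
print(harmonic_spectrum(110.0))
# The 2nd harmonic (220 Hz) sits an octave above F0, and the 3rd (330 Hz)
# a fifth above that, which is where the octave and fifth harmonics
# mentioned in the text come from.
```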
The exact location and process of relative pitch are unknown. Under functional imaging, pitch shifts activate temporal regions, most often in the right hemisphere (McDermott & Oxenham, 2008). Moreover, damage to particular areas of the brain can impair music perception, suggesting a specialized area for relative pitch (McDermott & Oxenham, 2008). "Frequency-shift detectors" are theoretical pitch-selective neurons that distinguish pitch shifts, explaining how listeners can discern shifts within pure-tone chords even when the individual pitches cannot be detected. Interestingly, non-human animals require extensive training and neural adaptation to recognize transposed melodies; as such, animals' sound perception may innately rely on absolute pitch rather than relative pitch (McDermott & Oxenham, 2008).
Perfect pitch is characterized by two processes: memorizing pitches with labels of some sort (e.g., note letters in Western music) and labeling newly transduced sounds through association with those memorized pitches (Leipold et al., 2019). In these highly trained musicians, pitch processing may be described as effortless, as they simply retrieve the labels associated with each pitch. In comparison, pitch identification in musicians with relative pitch engaged the presupplementary motor area (preSMA), indicating imagery of pitches. This imagery could reflect the recall of previously heard pitches, in contrast to musicians with absolute pitch, who recall and label pitches with ease (Leipold et al., 2019). In functional imaging of individuals with perfect pitch and relative pitch, absolute pitch appears to recruit similar areas with lower oxygen usage, suggesting greater neural efficiency, while labeling pitches more accurately than relative pitch (Leipold et al., 2019).
By studying music like an alphabet, with labels associated with pitches, the brain creates a processing system, perfect pitch, that aids our innate, preexisting system of relative pitch. Musical training leads to neural adaptation beyond the possibility of acquiring a different pitch processing system: structural and functional brain differences in musicians support an effect of musical training on perceptual abilities not specific to relative pitch, such as basic frequency discrimination (McDermott & Oxenham, 2008). The importance of encoding auditory information drove the development of a system focused on relative pitch, a system unlike those of other animals. To comprehend music even further, some musicians' auditory systems developed an alternative system to memorize and identify pitches.
Various studies have also been conducted with mouse models to determine the effects that different tones and pitches of music can have on neural development. They found that different tones of music were able to regulate neural development, specifically through the expression of the BDNF gene and its downstream pathways (Wang et al., 2023). Additionally, various other genes in the hippocampus and the prefrontal cortex, including PI3K, AKT, ERK, and MAPK, were up-regulated (Wang et al., 2023). These genes also play major roles in other processes and are heavily implicated in cancer cells. The studies demonstrated that D-tone music best promotes the development of neurons in the brain during the early stages of life (Wang et al., 2023).
About the Author Anna Mack ('29) is a freshman at Harvard College.
References
Leipold, S., Brauchli, C., Greber, M., & Jäncke, L. (2019). Absolute and relative pitch processing in the human brain: Neural and behavioral evidence. Brain Structure and Function, 224(5), 1723–1738. https://doi.org/10.1007/s00429-019-01872-2
McDermott, J. H., & Oxenham, A. J. (2008). Music perception, pitch, and the auditory system. Current Opinion in Neurobiology, 18(4), 452–463. https://doi.org/10.1016/j.conb.2008.09.005
Wang, J., Wang, J., Wang, Y., Chai, Y., Li, H., Miao, D., Liu, H., Li, J., & Bao, J. (2023). Music with different tones affects the development of brain nerves in mice in early life through BDNF and its downstream pathways. International Journal of Molecular Sciences, 24(9), 8119. https://doi.org/10.3390/ijms24098119