Music can intensify moments of elation and moments of despair. It can connect people and it can divide them. The prospect of psychologists turning their lens on music might give a person the heebie-jeebies, however, conjuring up an image of humorless people in white lab coats manipulating sequences of beeps and boops to make grand pronouncements about human musicality.
In truth, that is roughly how music psychology worked in its early days. In the early 1900s, elaborate mechanical apparatuses were devised that purported to determine a child’s musical potential by measuring their perceptual acuity on tasks such as loudness discrimination. But measuring musical aptitude through performance on an acoustic judgment task overlooks factors that might be even more essential, such as the degree to which music captivates a child’s attention or inspires them to move.
But contemporary work on music perception embraces a variety of disciplines and methodologies, from anthropology to musicology to neuroscience, to try to understand the relationship between music and the human mind. Researchers use motion capture systems to record people’s movements as they dance, analyzing the gestures’ relationship to the accompanying sound. They use eye tracking to measure changes in infants’ attentiveness as musical features or contexts vary. They place electrodes on the scalp to measure changes in electrical activity, or use neuroimaging to make inferences about the neural processes that underlie diverse types of musical experiences, from jazz improvisation to trance-like states to simply feeling a beat.
One particularly interesting recent example from Nori Jacoby and Josh McDermott takes its methodological inspiration from the game of telephone. In this familiar diversion, one person whispers a sentence to another, who in turn whispers it to someone else. The whispered exchanges continue until the message winds its way back to the original speaker, who often finds the content radically transformed. In Jacoby and McDermott’s task, participants hear a random rhythm and are asked to reproduce it as closely as possible by tapping along with it. The pattern of taps they produce is then played back to them as the next rhythmic sequence they’re asked to match, and so on, in a kind of tapping telephone. Over the course of this iterative process, tapping patterns settle into rhythmic structures common in the participants’ musical environment. By examining how these structures change from place to place, this research can illuminate the interrelationship between culture and a person’s sense of time.
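To make the iterative logic concrete, here is a minimal Python sketch of such a “tapping telephone” loop, with a simulated listener standing in for a human participant. The bias-toward-simple-ratios model, the function names, and all parameter values are illustrative assumptions for this sketch, not Jacoby and McDermott’s published procedure or analysis.

```python
# Illustrative sketch of an iterated rhythm-reproduction loop.
# The "listener" below is a crude simulation, not real tapping data.
import random


def random_seed_rhythm(n_intervals=3, total=2.0):
    """Generate a random rhythm as inter-onset intervals summing to `total` seconds."""
    cuts = sorted(random.uniform(0, total) for _ in range(n_intervals - 1))
    points = [0.0] + cuts + [total]
    return [b - a for a, b in zip(points, points[1:])]


def simulated_reproduction(rhythm, bias=0.3, noise=0.05):
    """Hypothetical stand-in for a participant's tap-back: nudge each interval
    toward the nearest simple fraction of the pattern length, then add motor noise."""
    total = sum(rhythm)
    simple_targets = [total * f for f in (0.25, 1 / 3, 0.5, 2 / 3, 0.75)]
    reproduced = []
    for interval in rhythm:
        target = min(simple_targets, key=lambda t: abs(t - interval))
        nudged = (1 - bias) * interval + bias * target
        reproduced.append(max(0.05, nudged + random.gauss(0, noise)))
    # Rescale so the pattern keeps the same overall duration.
    scale = total / sum(reproduced)
    return [i * scale for i in reproduced]


# Each generation's output becomes the stimulus for the next trial.
rhythm = random_seed_rhythm()
for generation in range(10):
    rhythm = simulated_reproduction(rhythm)

ratios = [round(i / sum(rhythm), 2) for i in rhythm]
print("Interval proportions after 10 generations:", ratios)
```

Run repeatedly, a loop like this tends to drift from the arbitrary random seed toward patterns built from simple proportions, which is the intuition behind using the iterated design to reveal the rhythmic structures a listener carries with them.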
This design relies on a relatively widespread musical behavior, rhythmic tapping, rather than a potentially unfamiliar and awkward task like providing overt ratings of acoustic features. By randomly generating the initial seed rhythms, the task also avoids the kinds of inadvertent cultural biases that can creep into the selection of materials for experiments in music perception. These strengths have allowed the design to be used successfully in places around the world where musical styles and practices differ from one another, offering insight into the interrelationships between people’s perceptual tendencies and the characteristics of the music they listen to most frequently.
Through close partnership with people who make music as well as people who study it from the perspective of the humanities, scientific approaches to music are increasingly focused on aspects of musical experience that extend beyond beeps and boops, like exploring the cognitive processes that underlie the experience of musical groove, or trying to understand the role singing plays in interactions between infants and their caregivers.
At several sites around the world, including the McMaster Institute for Music and the Mind in Hamilton, Ontario, and the Max Planck Institute for Empirical Aesthetics in Frankfurt, Germany, concert halls have been specially constructed to allow both for musical performances and for scientific inquiry. They feature seats outfitted with physiological measurement systems, stages that incorporate motion capture, and other tools that make it possible to study the experiences people have at public performances and in participatory music-making sessions. This work has even fed back into new kinds of composition and performance, in which features of the sound change in response to audience members’ heart rates as the concert progresses.
Some lines of research explore potential clinical applications of music, ranging from benefits for gait and coordination in movement disorders, to memory effects in dementia, to speech facilitation in aphasia. Other lines of research identify new ways to make music, ranging from Rebecca Fiebrink’s systems for creating music out of intuitive hand gestures to Gil Weinberg’s robots that can improvise along with human partners. Still other studies try to peel apart the influence of exposure and context to understand the way music does and does not communicate across social boundaries.
With its long, sometimes checkered, but uncommonly close relationship among people in the arts, humanities, and sciences, research in music cognition has the potential to serve as a model for integrating multiple intellectual frameworks to address humanity’s big questions.
Featured image credit: “Music Melody” by MIH83. CC0 via Pixabay.