
A neuromechanistic model for rhythmic beat generation

Abstract
When listening to music, humans can easily identify and move to the beat. Numerous experimental studies have identified brain regions that may be involved in beat perception and representation. Several theoretical and algorithmic approaches have been proposed to account for this ability. Related to, but distinct from, the question of how we perceive a beat is the question of how we learn to generate and hold a beat. In this paper, we introduce a neuronal framework for a beat generator that is capable of learning isochronous rhythms over a range of frequencies relevant to music and speech. Our approach combines ideas from error-correction and entrainment models to investigate the dynamics of how a biophysically based neuronal network model synchronizes its period and phase to match those of an external stimulus. The model makes novel use of ongoing faster gamma rhythms to form a set of discrete clocks that provide estimates, but not exact information, of how well the beat generator’s spike times match those of a stimulus sequence. The beat generator is endowed with plasticity, allowing it to quickly learn and thereby adjust its spike times to achieve synchronization. Our model makes generalizable predictions about the existence of asymmetries in the synchronization process, as well as specific predictions about resynchronization times after changes in stimulus tempo or phase. Analysis of the model demonstrates that accurate rhythmic timekeeping can be achieved over a range of frequencies relevant to music, in a manner that is robust to changes in parameters and to the presence of noise.

Author summary
Music is integral to the human experience and is appreciated across a wide range of cultures. Although many features distinguish different musical traditions, rhythm is central to nearly all. Most humans can detect and move along to the beat through finger or foot tapping, hand clapping, or other bodily movements. Yet many people have a hard time “keeping a beat,” or say they have “no sense of rhythm.” There appears to be a disconnect between our ability to perceive a beat and our ability to produce one, as a drummer does as part of a musical group. Producing a beat requires beat generation: the process by which we learn to keep track of the specific time intervals between beats, as well as to execute the motor movement needed to produce the sound associated with each beat. In this paper, we begin to explore the neural mechanisms that may be responsible for our ability to generate and keep a beat. We develop a computational model of how different neurons cooperate to learn a beat and keep it, even after the stimulus is removed, across a range of frequencies relevant to music. Our dynamical-systems model leads to predictions for how the brain may react when learning a beat. Our findings and techniques should be widely applicable to those interested in understanding how the brain processes time, particularly in the context of music.
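The abstract’s core mechanism — a beat generator that corrects its period and phase using timing errors quantized by a faster gamma clock — can be illustrated with a minimal sketch. This is a hypothetical toy model, not the paper’s biophysical network: the names `GAMMA_PERIOD`, `ALPHA_PERIOD`, and `ALPHA_PHASE`, the gain values, and the update rule are all illustrative assumptions in the spirit of standard two-parameter error-correction models.

```python
# Toy sketch (not the paper's model): a beat generator (BG) entrains its
# period and phase to an isochronous stimulus. The timing error is only
# available at the resolution of a ~40 Hz gamma clock, so the BG gets an
# estimate of the asynchrony, not its exact value. All parameters are
# illustrative assumptions.

GAMMA_PERIOD = 25.0   # ms, one cycle of the gamma "clock" (~40 Hz)
ALPHA_PERIOD = 0.5    # illustrative gain for period correction
ALPHA_PHASE = 0.5     # illustrative gain for phase correction

def learn_beat(stim_period, bg_period, n_beats=40):
    """Return the BG period (ms) after entraining to a stimulus of
    period stim_period (ms), starting from bg_period (ms)."""
    stim_t = 0.0   # next stimulus onset time
    bg_t = 0.0     # next BG spike time
    for _ in range(n_beats):
        # Asynchrony between the BG spike and the stimulus onset...
        error = bg_t - stim_t
        # ...but the BG only counts whole gamma cycles: a coarse,
        # discrete estimate of the true error.
        coarse = round(error / GAMMA_PERIOD) * GAMMA_PERIOD
        # Error-correction updates: adjust the period, and shift the
        # next spike time (phase) toward the stimulus.
        bg_period -= ALPHA_PERIOD * coarse
        bg_t += bg_period - ALPHA_PHASE * coarse
        stim_t += stim_period
    return bg_period

# The BG period converges to the stimulus period; any residual phase
# offset is smaller than one gamma cycle, reflecting the quantized
# error readout.
print(learn_beat(stim_period=600.0, bg_period=500.0))
print(learn_beat(stim_period=400.0, bg_period=500.0))
```

Because the error estimate is quantized, the model can lock its period exactly while a sub-gamma-cycle phase offset remains, which is one way to read the abstract’s claim that the clocks provide “estimates, but not exact information.”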

Related Results

The Beats
This volume is the first-ever collection devoted to teaching Beat literature in high school to graduate-level classes. Essays address teaching topics such as the history of the cen...
Testing beat perception without sensory cues to the beat: the Beat-Drop Alignment Test (BDAT)
Beat perception can serve as a window into internal time-keeping mechanisms, auditory-motor interactions, and aspects of cognition. A popular test of beat perception, the Beat Alig...
Interactions between Rhythmic and Discrete Components in a Bimanual Task
An asymmetric bimanual task was investigated in which participants performed a rhythmic movement with their dominant arm and initiated a second movement with their nondominant arm ...
Modern Inspiration from the Beat Generation
The segment of my work containing song lyrics was inspired by a question: Was the Beat Movement an isolated period of thought, tied to one time and location that generated the Beat...
The Rhythmic Pattern of Tifa in Cakalele Dance
ABSTRACT Jeremy Giovan. 2020. The Rhythmic Pattern of Tifa in Cakalele Dance. Research. Department of Music Education, Faculty of Language and Art, Jakarta State University. ...
Buddhism and the Beats
“For the beat generation,” Stephen Prothero wrote in 1991, “dissertation time is here. Magazine and newspaper critics have gotten in their jabs. Now scholars are starting to analyz...
Musical Time: Rhythm, Meter, and Tempo
Abstract “It’s got a good beat—you can dance to it!” This was for a long time seemingly the first, foremost, and perhaps only consideration useful in judging the val...
