Humans have a remarkable capacity for coordination. Our ability to interact and act jointly in groups is crucial to our success as a species. Joint Action (JA) research has often concerned itself with simplistic behaviors in highly constrained laboratory tasks, but there has been growing interest in understanding complex coordination in more open-ended contexts. In this regard, collective music improvisation has emerged as a fascinating model domain for studying basic JA mechanisms in an unconstrained and highly sophisticated setting. A number of empirical studies have begun to elucidate coordination mechanisms underlying joint musical improvisation, but these findings have yet to be cashed out in a working computational model. The present work fills this gap by presenting Tonal Emergence, an idealized agent-based model of improvised musical coordination. Tonal Emergence models the coordination of notes played by improvisers to generate harmony (i.e., tonality) by simulating agents that stochastically generate notes biased towards maximizing harmonic consonance given their partner’s previous notes. The model replicates an interesting empirical result from a previous study of professional jazz pianists: feedback loops of mutual adaptation between interacting agents support the production of consonant harmony. The model is further explored to show how complex tonal dynamics, such as the production and dissolution of stable tonal centers, are supported by agents that are characterized by (i) a tendency to strive toward consonance, (ii) stochasticity, and (iii) a limited memory for previously played notes. Tonal Emergence thus provides a grounded computational model to simulate and probe the coordination mechanisms underpinning one of the more remarkable feats of human cognition: collective music improvisation.
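The three agent properties named above (consonance-seeking, stochasticity, limited memory) can be illustrated with a minimal sketch. This is a hypothetical toy implementation, not the published Tonal Emergence model: the `CONSONANCE` weights, `memory_size`, and `noise` parameters are illustrative assumptions, and the actual model's note representation and consonance measure may differ.

```python
import random

# Hypothetical consonance weights for pitch-class intervals (0-11 semitones);
# higher = more consonant (unison, fifth, fourth, thirds rank high).
CONSONANCE = {0: 1.0, 1: 0.1, 2: 0.3, 3: 0.6, 4: 0.7, 5: 0.8,
              6: 0.2, 7: 0.9, 8: 0.5, 9: 0.6, 10: 0.3, 11: 0.1}

class Agent:
    def __init__(self, memory_size=4, noise=0.3, rng=None):
        self.memory = []                 # limited memory of partner's recent notes
        self.memory_size = memory_size
        self.noise = noise               # stochasticity: weight of uniform mixing
        self.rng = rng or random.Random()

    def hear(self, note):
        """Store a partner note, forgetting the oldest beyond memory_size."""
        self.memory.append(note)
        self.memory = self.memory[-self.memory_size:]

    def play(self):
        """Sample a pitch class, biased toward consonance with remembered notes."""
        weights = []
        for candidate in range(12):
            if self.memory:
                fit = sum(CONSONANCE[(candidate - m) % 12]
                          for m in self.memory) / len(self.memory)
            else:
                fit = 1.0
            # Mix consonance-seeking with uniform noise.
            weights.append((1 - self.noise) * fit + self.noise / 12)
        return self.rng.choices(range(12), weights=weights)[0]

# Coupled condition: each agent hears what the other just played,
# closing the feedback loop of mutual adaptation.
a, b = Agent(), Agent()
for _ in range(16):
    b.hear(a.play())
    a.hear(b.play())
```

Under this coupling, each agent's output feeds the other's memory, so consonance-seeking compounds across turns; removing one `hear` call would approximate a unidirectional (overdub-like) regime.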
Setzler, M., & Goldstone, R. L. (2020). Coordination and consonance between interacting, improvising musicians. Open Mind: Discoveries in Cognitive Science, 4, 88–101. https://doi.org/10.1162/opmi_a_00036
Joint action (JA) is ubiquitous in our cognitive lives. From basketball teams to teams of surgeons, humans often coordinate with one another to achieve some common goal. Idealized laboratory studies of group behavior have begun to elucidate basic JA mechanisms, but little is understood about how these mechanisms scale up in the more sophisticated and open-ended JA that occurs in the wild. We address this gap by examining coordination in a paragon domain for creative joint expression: improvising jazz musicians. Coordination in jazz music subserves an aesthetic goal: the generation of a collective musical expression comprising coherent, highly nuanced musical structure (e.g., rhythm, harmony). In our study, dyads of professional jazz pianists improvised in a “coupled,” mutually adaptive condition and an “overdubbed” condition that precluded mutual adaptation, as occurs in common studio recording practices. Using a model of musical tonality, we quantify the flow of rhythmic and harmonic information between musicians as a function of interaction condition. Our analyses show that mutually adapting dyads achieve greater temporal alignment and produce more consonant harmonies. These musical signatures of coordination were preferred by independent improvisers and naive listeners, who gave higher quality ratings to coupled interactions despite being blind to condition. We present these results and discuss their implications for music technology and JA research more generally.
Setzler, M., & Goldstone, R. L. (2020). Quantifying emergent, dynamic tonal coordination in collaborative musical improvisation. Proceedings of the 42nd Annual Conference of the Cognitive Science Society (pp. 461–466). Toronto, Canada: Cognitive Science Society.
Groups of interacting individuals often coordinate in service of abstract goals, such as the alignment of mental representations in conversation, or the generation of new ideas in group brainstorming sessions. What are the mechanisms and dynamics of abstract coordination? This study examines coordination in a sophisticated paragon domain: collaboratively improvising jazz musicians. Remarkably, freely improvising jazz ensembles collectively produce coherent tonal structure (i.e., melody and harmony) in real-time performance without previously established harmonic forms. We investigate how tonal structure emerges out of interacting musicians, and how this structure is constrained by underlying patterns of coordination. Dyads of professional jazz pianists were recorded improvising in two conditions of interaction: a ‘coupled’ condition in which they could mutually adapt to one another, and an ‘overdubbed’ condition which precluded mutual adaptation. Using a computational model of musical tonality, we show that this manipulation affected the directed flow of tonal information amongst pianists, who could mutually adapt to one another’s notes in coupled trials, but not in overdubbed trials. Consequently, musicians were better able to harmonize with one another in coupled trials, and this ability increased throughout the course of improvised performance. We present these results and discuss their implications for music technology and joint action research more generally.
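The idea of scoring how well two pianists harmonize can be sketched as a simple moment-to-moment consonance measure over their pitch-class streams. This is an illustrative stand-in, not the paper's actual tonality model; the interval weights and the `dyad_consonance` function are assumptions introduced here.

```python
# Hypothetical interval-consonance weights over pitch-class intervals (semitones).
CONSONANCE = {0: 1.0, 1: 0.1, 2: 0.3, 3: 0.6, 4: 0.7, 5: 0.8,
              6: 0.2, 7: 0.9, 8: 0.5, 9: 0.6, 10: 0.3, 11: 0.1}

def dyad_consonance(stream_a, stream_b):
    """Mean pairwise consonance of simultaneously sounded pitch classes."""
    scores = [CONSONANCE[(a - b) % 12] for a, b in zip(stream_a, stream_b)]
    return sum(scores) / len(scores)

# One pianist plays C-E-G while the other plays E-G-C.
score = dyad_consonance([0, 4, 7], [4, 7, 0])
```

Tracking such a score over a sliding window would show harmonization improving (or not) across a performance, which is the kind of time-course comparison the coupled/overdubbed manipulation makes possible.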
Setzler, M., & Goldstone, R. L. (2019). Patterns of coordination in simultaneously and sequentially improvising jazz musicians. Proceedings of the 41st Annual Conference of the Cognitive Science Society. (pp. 1035-1040). Montreal, Canada: Cognitive Science Society.
In Joint Action (JA) tasks, individuals must coordinate their actions so as to achieve some desirable outcome at the group level. Group function is an emergent outcome of ongoing, mutually constraining interactions between agents. Here we investigate JA in dyads of improvising jazz pianists. Participants’ musical output is recorded in one of two conditions: a real condition, in which two pianists improvise together as they typically would, and a virtual condition, in which a single pianist improvises along with a “ghost partner” – a recording of another pianist taken from a previous real trial. The conditions are identical except that in real trials subjects are mutually coupled to one another, whereas there is only unidirectional influence in virtual trials (i.e., recording to musician). We quantify ways in which the rhythmic structures spontaneously produced in these improvisations are shaped by mutual coupling of co-performers. Musical signatures of underlying coordination patterns are also shown to parallel the subjective experience of improvisers, who preferred playing in trials with bidirectional influence despite not explicitly knowing which condition they had played in. These results illuminate how mutual coupling shapes emergent, group-level structure in the creative, open-ended and fundamentally collaborative domain of expert musical improvisation.
When a musical tone is sounded, most listeners are unable to identify its pitch by name. Those listeners who can identify pitches are said to have absolute pitch perception (AP). A limited subset of musicians possesses AP, and it has been debated whether musicians’ AP interferes with their ability to perceive tonal relationships between pitches, or relative pitch (RP). The present study tested musicians’ discrimination of relative pitch categories, or intervals, by placing absolute pitch values in conflict with relative pitch categories. AP listeners perceived intervals categorically, and their judgments were not affected by absolute pitch values. These results indicate that AP listeners do not infer interval identities from the absolute pitch values of the component tones, and that RP categories are salient musical concepts in both RP and AP musicianship.