On Junitaki Falls (2017) Trio for solo instrument and two AI performers. Dur: 35'
- Submitting institution
-
De Montfort University
- Unit of assessment
- 33 - Music, Drama, Dance, Performing Arts, Film and Screen Studies
- Output identifier
- 33099
- Type
- J - Composition
- Month
- -
- Year
- 2017
- URL
-
-
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
-
- Research group(s)
-
-
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- Yes
- Additional information
- Music-AI has a tradition of focusing on mind-based models, and more recently machine learning, to create real-time music-making systems (e.g. Pachet’s Flow Machines, Google’s Magenta project). While ‘mind’-based research serves the valuable purpose of building a virtual performer/composer that ‘thinks’ like a musician, it fails to capture one of the most important aspects of musicianship: embodiment. On Junitaki Falls was a practice-based research project that aimed to develop knowledge of the processes and complexities of embodiment when a human and a behavioural AI work together to create live music performance. The research imperative synthesised the pivotal theories of musicking (Small 1998) and behavioural AI (Brooks 1987), with creativity philosophy from ‘The Muse in the Machine’ (Gelernter 1994).
A composition was created for a solo musician (Christopher Redgate). A central director controlled a dynamic visual score shown on a laptop screen and coordinated the computer performers. Sonic “memories” of Redgate’s performances were stored in a database, and these became the source material for future iterations. Crucially, the embodied AI behaviour needed to feel intuitive inside the live musicking (flow), be meaningful to Redgate (familiar/inspiring “thought-trains”), and do something creative in this realm (embodied creativity). The result was a technical solution that emphasised the symbiosis between the behaviours of the code and the recorded media in real-time musicking, over symbolic traits of reasoning and logic. Over 18 months of iterative design and deployment, increasingly complex beta versions were developed with Redgate to the point where he felt the embodied AI was ‘being there with him’.
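The director-and-memory architecture described above can be sketched in miniature. This is a hypothetical illustration only, not the released code: the class and method names (SonicMemory, Director, iterate) are assumptions, and the real system worked with live audio and a visual score rather than symbolic note lists.

```python
import random

class SonicMemory:
    """Stores sonic 'memories' (here, symbolic phrases) from past performances."""
    def __init__(self):
        self.phrases = []  # each phrase: a list of (pitch_hz, duration_s) events

    def store(self, phrase):
        # archive a phrase so it can seed future iterations
        self.phrases.append(phrase)

    def recall(self):
        # retrieve a past phrase as source material, if any exist
        return random.choice(self.phrases) if self.phrases else None

class Director:
    """Central director: archives the soloist's material and feeds
    recalled memories back to the computer performers."""
    def __init__(self, memory):
        self.memory = memory

    def iterate(self, live_phrase):
        # store what the soloist just played, then answer with a memory
        self.memory.store(live_phrase)
        return self.memory.recall()

memory = SonicMemory()
director = Director(memory)
response = director.iterate([(440.0, 0.5), (523.3, 0.25)])
```

In this toy loop the database grows with every pass, so each iteration draws on an ever-richer pool of the performer’s own material, echoing how the piece used Redgate’s recorded performances as source material for subsequent versions.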
Redgate premiered the piece at the 60th-anniversary festival of music at Tempo Reale, Italy. It has been released on CD, with an open-source release of the code; analysed in an academic book chapter; and featured in an artist-profile film.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -