Cosmologies for piano and three-dimensional electronics (Electroacoustic composition supplied on USB)
- Submitting institution
-
City, University of London
- Unit of assessment
- 33 - Music, Drama, Dance, Performing Arts, Film and Screen Studies
- Output identifier
- 1265
- Type
- J - Composition
- Month
- March
- Year
- 2020
- URL
-
-
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
0
- Research group(s)
-
-
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Cosmologies, for piano and three-dimensional (3D) electronics, explodes the space inside the piano out to the space of the concert hall, situating the listener inside the virtual instrument to experience its secret inner life. While applications of artificial intelligence (AI) research to audio are ubiquitous today, this is among the first projects to apply AI to the embodied spatial presence of live instruments and performers. It introduces a novel approach to interactive electronics, connecting research in music information retrieval (MIR) with spatialization using higher-order ambisonics (HOA) and machine learning (ML), enabling the computer to “learn” from measured radiation patterns of acoustic instruments and apply them to 3D audio in real and deferred time. MIR techniques are used to transcribe piano samples and field recordings into a detailed instrumental score (attached), interpreted live alongside the 3D interactive electronics. The premiere by pianist Alvise Sinivia at Paris’s Centre Georges Pompidou was the first public performance worldwide to use the EM32 Eigenmike 32-channel microphone array for live amplification and processing, resulting in a live 3D projection of the piano interior diffused over a 27.2-channel loudspeaker dome surrounding the audience. The work was produced through a STARTS (Science + Technology + ARTS) Residency at IRCAM, where I collaborated with computer music researchers Jean Bresson, Diemo Schwarz, and Thibaut Carpentier. The project was simultaneously the first fully realized project involving the new software environments Cat-Spat, developed by me, and OM#, developed by Bresson, and their connections to the Spat5 and MuBu libraries developed by Carpentier and Schwarz. The performance is documented in a video and a binaural recording rendering the immersive concert experience over headphones (offered for assessment).
The technological contributions of this research were reported in a conference presentation at the 2020 Ateliers du Forum and published in the STARTS Residency Report (offered as contextual information).
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -