IDIOSYNCRASIES for contrabass clarinet and live electronics
- Submitting institution
- University of Keele
- Unit of assessment
- 33 - Music, Drama, Dance, Performing Arts, Film and Screen Studies
- Output identifier
- 780
- Type
- J - Composition
- Month
- -
- Year
- 2018
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- -
- Research group(s)
- -
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- "Idiosyncrasies for contrabass clarinet and live electronics was premiered at the 3rd International Conference and Festival for Sensor Augmented Bassclarinet Research SABRE, (Zurich University of the Arts) – Konzertsaal, Zürick, (2019) and received subsequent performances at NoiseFloor Festval (Stoke-on-Trent 2019), Festival IKLECTIK, London (2019); Rarescale concert (Keele, 2019) and was programmed in the Midlands New Music Symposium & Nottingham New Music Weekend (2020, postponed due to Covid-19).
This piece seeks to expand the repertoire and the technical performance praxis of the generally underused contrabass clarinet. The research explored the use of air pressure to control real-time sound processing in live performance via the Sensor Augmented Bass Clarinet Research (SABRe) device. The work was written to develop new mapping possibilities for the incoming air-pressure data (captured by the device as the performer plays).
Mapping trials established new ways of using this sensor technology. Two key examples (illustrated by the sketch after this list) include:
• Sharp outbursts of air pressure were mapped to randomly dispersed frequency clouds.
• Continuous high air pressure spatialised the sound in a circular motion across eight loudspeakers surrounding the audience.
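A minimal sketch of how such mappings might be realised, for illustration only: this is hypothetical Python control code, not the software used in the piece, and it assumes breath-pressure samples already normalised to the range 0–1 and a ring of eight loudspeakers driven by a simple equal-power cosine panning law.

    import math
    import random

    NUM_SPEAKERS = 8          # loudspeaker ring surrounding the audience
    BURST_THRESHOLD = 0.8     # normalised pressure treated as a sharp outburst
    SUSTAIN_THRESHOLD = 0.6   # level above which sustained pressure drives spatialisation

    def frequency_cloud(n=12, lo=200.0, hi=4000.0):
        """Return n randomly dispersed frequencies (Hz) for one 'cloud' event."""
        return [random.uniform(lo, hi) for _ in range(n)]

    def speaker_gains(azimuth):
        """Equal-power gains for a phantom source at `azimuth` (radians)
        on a ring of NUM_SPEAKERS loudspeakers (cosine panning law)."""
        gains = []
        for i in range(NUM_SPEAKERS):
            spk = 2 * math.pi * i / NUM_SPEAKERS
            # angular distance between source and speaker, wrapped to [0, pi]
            d = abs((azimuth - spk + math.pi) % (2 * math.pi) - math.pi)
            gains.append(max(0.0, math.cos(d)))
        norm = math.sqrt(sum(g * g for g in gains)) or 1.0
        return [g / norm for g in gains]

    def map_pressure(pressure, prev_pressure, azimuth, rotation_rate=0.05):
        """Map one normalised breath-pressure sample to control events."""
        events = {}
        # sharp outburst: a rise through the threshold triggers a frequency cloud
        if pressure > BURST_THRESHOLD and prev_pressure <= BURST_THRESHOLD:
            events["cloud"] = frequency_cloud()
        # sustained high pressure: rotate the source around the speaker ring
        if pressure > SUSTAIN_THRESHOLD:
            azimuth = (azimuth + rotation_rate) % (2 * math.pi)
            events["gains"] = speaker_gains(azimuth)
        return events, azimuth

All names here (the thresholds, rotation_rate, the panning law) are assumptions chosen for the sketch; in practice such mappings would live inside the live-electronics environment rather than a standalone script.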
The device and the mapping strategies explored have produced an original method for musical expression that complements the written score by using information obtained from the musician's natural performance gestures. This approach removes the need for obtrusive, visible technology that encumbers the musician, and avoids the additional, unnaturally exaggerated movements a performer would otherwise need to make to trigger further audio or sound processing. The method and device application overcome earlier limitations of apparatus that inhibited the 'free execution of movements', as well as the limits of methods that rely on spatially fixed sensors, such as those associated with video capture (Goebl, Dixon, De Poli et al. 2008). This work demonstrates technology-enhanced expressiveness, creating music in the moment, using software that responds to the performer's unwitting gestures and air pressure.
"
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -