Hollow Vertices: A performance environment for live coding and interactive media
- Submitting institution
-
Falmouth University
- Unit of assessment
- 32 - Art and Design: History, Practice and Theory
- Output identifier
- 182
- Type
- I - Performance
- Venue(s)
- Brisbane, Australia
- Open access status
- Out of scope for open access requirements
- Month of first performance
- July
- Year of first performance
- 2016
- URL
-
-
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
2
- Research group(s)
-
D - Digital Creativity
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- Hollow Vertices is an improvisatory audio-visual performance environment. Its sonic components are co-created through live coding in SuperCollider, a real-time audio synthesis language, producing dense percussive and ambient textures. The two sound sources are linked through a custom-built network over which each performer has control of the other's code, introducing developmental elements into the composition. These are combined with an amplified clarinetist using a custom-programmed pedal board in Max/MSP to drive live audio effects.
A projected image displays the performers' live code, framed by a second projection of video content manipulated in real time through an internal video feedback process programmed in CoGe VJ. The video feedback is processed by custom-built effects that transform the content into new video feedback abstractions. The effects are programmed so that unpredictable visual outcomes, or glitches, appear; these glitches aid the transitions between visual aesthetics during the performance. Some visual effects are programmed to interact rhythmically with the composition, while others are controlled manually as the piece is improvised.
The composition converges disciplines and mediums by augmenting sensorial modalities through human-computer interaction. This is realised by employing different programming languages and combinations of instruments, and by reacting to the collective output while maintaining awareness of individual contributions to the composition. Live coding in SuperCollider provides an interactive musical experience that responds to the visual and clarinet output in real time.
Research output: streaming videos of performances at VIVO Media Arts Centre, Vancouver, Canada, in 2015 and at NIME, Brisbane, Australia, in 2016.
Contextual evidence: programme notes, software code, and two journal articles.
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -