
Solaris: Jazz-AI Quartet Performance

Published on Aug 29, 2024

Project Description

Artistic

This project asks a simple question: what happens when you give a creative AI George Russell's Lydian Chromatic Concept (LCC) (Russell 1959), and ask it to play and co-create with human improvisers? But the question goes much deeper than that ... it's asking the creative AI to co-create WITH human improvisers. This presumes there is awareness on both sides of the human-AI collaboration; that stuff is being communicated, and sensed, and there is a shared journey through the flow of musicking (Small 1998).

Through our experiments with Solaris we have been surprised by its behaviours, interactions and apparent co-creativity, “not because an AI can play jazz with humans, but because it produces musical phrases that surprise, challenging the limits of the improvisation grammar typically found in jazz, while maintaining a recognisable structure” (Poltronieri & Vear 2024). It is these traits that we argue offer insights into the nature of AI awareness (Poltronieri & Vear 2024), as the constant tension between what the AI is doing and the human’s natural ability to infer meaning with it pushes human creative boundaries and challenges traditional models of musical understanding.

Technological concepts

Simply put, this digital score is a communications interface of musical ideas between musicians, utilizing the creative potential of digital technology. The communications interface is an AI agent playing with human musicians and providing two screens of information. The musical ideas embedded in this system are deconstructed jazz and popular songs, restructured using a specific theory of jazz improvisation. The creative potential of digital technology is the creative AI and its agency, behaviour and stimulation inside musicking, and its presence as a behavioural agent (Bown 2009) that produces two screens of information: an abstract representation of its "state of being" (see Figure 1) and the mechanics of the score through structure, form, harmony and tonal gravity.
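The harmonic mechanics rest on Russell's LCC, whose core move is to order tones by tonal gravity along a stack of fifths above a Lydian tonic; the first seven tones of that stack form the Lydian parent scale. A minimal sketch of this ordering principle (the names `PITCHES` and `lydian_scale` are illustrative, not part of the Solaris codebase):

```python
# Hedged sketch of the LCC's ordering principle; PITCHES and lydian_scale are
# illustrative names of our own, not taken from the Solaris implementation.
PITCHES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def lydian_scale(tonic: str) -> list:
    """First seven tones of the stack of fifths above the tonic (the tones
    with the strongest tonal gravity), sorted back into stepwise scale order."""
    start = PITCHES.index(tonic)
    stack = [(start + 7 * i) % 12 for i in range(7)]  # ascending perfect fifths
    return [PITCHES[pc] for pc in sorted(stack, key=lambda pc: (pc - start) % 12)]

print(lydian_scale("F"))  # → ['F', 'G', 'A', 'B', 'C', 'D', 'E']
```

Stacking fifths from F (F C G D A E B) and re-sorting stepwise yields F Lydian, which is the sense in which the system can weigh a tone's "gravity" by its distance up the stack.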

The digital score operates as a realtime, closed-loop interactive system. It takes input percepts from the musicking world through audio input (of the live music; NB it is also listening to itself as a form of self-awareness), uses these to activate an AI-Factory containing four deep learning models, decides whether to make a gesture based on weighting and rules, calculates a harmonic response using Russell's LCC (Russell 1959) and an awareness of its own tonal gravity, and outputs a sound that becomes part of the musicking world, which is heard by the human musicians who (maybe) use it as stimulus for their next moment of playing.
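One pass of that loop can be sketched as follows; every component here is a stand-in stub under our own naming (the real AI-Factory models, weightings and audio I/O are not shown):

```python
# Illustrative skeleton of the closed loop described above. Every function is a
# hedged stand-in, not the Solaris code: models, weights and audio I/O are stubbed.
import random

def listen() -> float:
    """Stand-in percept: a level drawn from the live (and self-produced) audio."""
    return random.random()

def ai_factory(percept):
    """Stub for the AI-Factory of four deep learning models, each predicting."""
    return [percept * w for w in (0.9, 0.7, 0.5, 0.3)]

def decide_gesture(predictions, threshold=0.5):
    """Weighted rule: gesture only when the mean prediction crosses a threshold."""
    return sum(predictions) / len(predictions) > threshold

def harmonic_response(tonic_pc):
    """Choose a pitch class from the stack of fifths above the current Lydian
    tonic, favouring tones nearer the tonic (stronger tonal gravity)."""
    stack = [(tonic_pc + 7 * i) % 12 for i in range(7)]
    return random.choices(stack, weights=range(7, 0, -1), k=1)

def step(tonic_pc):
    """One pass of the loop: percept -> models -> decision -> (maybe) sound."""
    percept = listen()
    predictions = ai_factory(percept)
    if decide_gesture(predictions):
        return harmonic_response(tonic_pc)  # sound fed back into the musicking world
    return None  # the AI stays tacet this moment

print(step(5))  # pitch class to sound, or None
```

Running `step` repeatedly closes the loop: each emitted pitch re-enters the room, is heard by the humans and by the system itself, and shapes the next percept.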

The AI is built using an approach called Embodied Intelligence (Vear 2022) and exemplifies the application of the Human-AI Musicking Framework (Vear et al 2023). This uses Creative AI datasets and algorithms to make music from inside the flow. Embodied Intelligence differs from other Music-AI approaches insofar as it does not use AI to construct the physical phenomena of music (sound waves), nor the meta-workings of music composition (sequencing of notes). Instead, the AI powers the realtime generation of core impetus from within the dynamic flow of live music-making.

Ultimately, this digital score is designed to co-create with human musicians and stimulate the generation of complex music based on unusual time-signatures in post-modern deconstructions of popular and jazz music. It is designed to be a hybrid between John Zorn, Steve Coleman, and mathcore. It aims to highlight the creativity of AI within given frameworks and matrices, and equally to highlight the creativity of humans collaborating with AI.

Type of submission

  • Performance 2, at an Oxford University performance space, will feature a flexible stage and electronics/projected visuals set-up suitable for music with more complex technical requirements. No live performers are provided, but musicians are welcome to perform their own works, or provide their own performers (this should be indicated in the submission)

Technical/Stage Requirements

Equipment

A large projector and screen, positioned upstage (at the back, behind the main stage area), ideally spanning the full stage width, or whatever you have available.

Full range stereo speakers.

Onstage power (we will bring UK-EU adaptors)

Lines to front of house (or a sub-mix on stage, whichever is more easily supported):

  • 6 channels from the AI

  • Stereo from the hybrid drum kit

We hope to receive technical assistance with:

A front of house engineer

Some help ensuring that the projector works.

Program Notes

  • Solaris is a digital score for AI and a human musician that questions the nature of awareness. It uses realtime AI and deep learning prediction to create a free-jazz, AI, math-music performance. The digital score is both an agent expressing itself by joining in with the "musicking" (Small 1998) through symbolic rules, and a visual notation tool. This notation operates on two levels: a) it presents a poetic image of the AI's state-of-being; b) it is a graphic music score for the humans to use and interpret. This project asks a simple question: what happens when you give a creative AI George Russell's Lydian Chromatic Concept (LCC) (Russell 1959) and ask it to play and co-create with human improvisers? This seems simple enough: program a symbolic rules-based system that improvises jazz using Russell's LCC as its guiding rules, and give it an over-arching form such as a jazz tune. But the question goes much deeper than that ... it is asking the creative AI to co-create WITH human improvisers. This presumes there is awareness on both sides of the human-AI collaboration; that stuff is being communicated, and sensed, and there is a shared journey through the flow of musicking.

Media

  • Figure 1: Screenshots from Solaris during a performance of Giant Steps, highlighting the visual representation of the AI’s “state of being”.

  • LIVE DEMO: https://solarisjazz.bandcamp.com/album/tree-mountains-live-demo

Acknowledgements

  • The Digital Score project (DigiScore) is funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant agreement No. ERC-2020-COG – 101002086).
