What’s that sound? A duet of humans and AI in concert

Artificial Intelligence is becoming part of everyday life, but will it have a role in creating music? Dr Charles Martin of the School of Computing has been working with his students and Dr Alexander Hunter of the School of Music to design “intelligent instruments” and play them on a live stage.

ANU Computing student Sandy Ma performing.

Music is a uniquely human endeavour, the auditory window into the soul, discovered through improvisation, honed through collaboration.

In recent years, researchers at The Australian National University (ANU) have introduced artificial intelligence (AI) technology to help diagnose diseases, co-pilot rescue helicopters, and theorise about the origin of the Universe. To date, however, there are no mainstream musical instruments that apply generative AI. That may be about to change.

“We’re creating new kinds of musical instruments,” said Dr Charles Martin of the ANU College of Engineering, Computing & Cybernetics (CECC). “A really good way to explore how they might behave is to try them out in a free improvisation performance.”

Dr Martin was recently joined on stage by two of his students and Dr Alexander Hunter of the School of Music for a concert demonstrating these intelligent instruments. With cinematic stage lighting and processed video projections, the quartet of musicians unveiled six new instruments, each embedded with AI and augmented reality components.

“The AI model is designed to stay quiet when we’re performing, then respond when we stop,” Dr Martin said. “We were inspired by free improvisation, looking at ways of exploring new sounds and understanding how these AI systems work.”
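
The following is a minimal sketch, not the ensemble's actual code, of that call-and-response behaviour: the generative model holds back while the human is playing and only answers once the input has been quiet for a moment. The `listen`, `generate_phrase`, and `play` functions and the thresholds are illustrative placeholders.

```python
# A minimal sketch (not the ensemble's actual code) of the call-and-response
# behaviour described above: the model stays quiet while the human plays and
# answers once the input has been silent for a short while. `listen`,
# `generate_phrase`, and `play` are hypothetical placeholders.
import time

SILENCE_THRESHOLD = 0.02   # assumed input level below which we count silence
SILENCE_SECONDS = 1.5      # how long the human must pause before the AI responds

def call_and_response_loop(listen, generate_phrase, play):
    """Alternate between following the human and letting the model respond."""
    last_activity = time.time()
    while True:
        level = listen()                      # current input level, 0.0-1.0
        now = time.time()
        if level > SILENCE_THRESHOLD:
            last_activity = now               # the human is playing: stay quiet
        elif now - last_activity > SILENCE_SECONDS:
            play(generate_phrase())           # the human has stopped: respond
            last_activity = time.time()       # then hand control back
        time.sleep(0.01)                      # poll roughly every 10 ms
```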

The concert’s most haunting moments came when undergraduate Computing and Music student Sandy Ma performed by slowly pressing her hands upon a quilt embedded with a BeagleBone computing system. The human/computer duet unspooled a nebulous soundscape that felt and sounded like a scene from a movie, a guarded venture into a digital underworld where dissonant sounds — somewhere between whale song and Mechagodzilla — waited in ambush.

ANU Undergraduate Student Sandy Ma performing on a quilt embedded with a BeagleBone computing system.

The lightest touch of her hand would cause the roaring behemoth to run amok. Then, she’d touch the quilt in another place, and the beast would be corralled momentarily in a new envisaging of call and response, from human to machine and back again.
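
The article does not describe the quilt's internals, but a hedged sketch of how such a fabric instrument might work is a loop on the embedded BeagleBone that reads pressure pads and forwards their values as OSC messages to a synthesiser. The `read_touch_sensors` function and the OSC addresses below are assumptions for illustration, not details from the performance.

```python
# A hedged illustration only: this assumes pressure pads read on the embedded
# BeagleBone, forwarded as OSC messages to a synthesiser running elsewhere.
# `read_touch_sensors` and the OSC addresses are made up for this sketch.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 5000)    # synth assumed to listen here

def read_touch_sensors():
    """Placeholder for reading pressure values (0.0-1.0) from the quilt's pads."""
    return [0.0, 0.3, 0.0, 0.8]                # e.g. four pads under the fabric

def send_gesture():
    for pad, pressure in enumerate(read_touch_sensors()):
        if pressure > 0.05:                    # ignore sensor noise
            # each pad drives one voice; pressure shapes its amplitude
            client.send_message(f"/quilt/pad/{pad}", pressure)
```

Decoupling sensing from sound generation in this way is a common pattern in embedded instrument design; whether the quilt actually works like this is left open here.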

Dr Martin explained that the ethereal sound of the concert stemmed from the “classical experimental music aesthetic” of the human performers, not their robot partners. He has been performing with Dr Hunter since 2014 in a band called Andromeda is Coming. “We’re interested in the gestural features that are possible as an electronic musician,” Dr Martin said. “But we’re developing instruments that could be played by other musicians who might find more conventional ways of applying them.”

Creative computing as an artistic practice

Sandy said she came to ANU to study computing and the arts in a double degree. “My previous practice was largely involved in ceramics and sculpture, and I thought I needed to keep the creative and computing separate,” she said. But when she took Dr Martin’s Honours course, Sound and Music Computing, she realised that she could pursue her twin passions in concert: “creative computing as an artistic practice”, she said.

Her Honours research explores touch perception as a bridge between musical performers, both human and AI. She hopes to present it at the New Interfaces for Musical Expression conference in the Netherlands this September.

Yichen Wang also became interested in human/computer interaction when she took Dr Martin’s Honours course. She is now conducting PhD research under his supervision.

During the concert, she debuted an AR instrument she has designed to display an array of cubes layered over her actual surroundings. Above her, a projector displayed what she saw, along with ghostly renditions of her arms and hands as she created music by touching the floating cubes.

“Because the AR systems overlay her instruments on the real world, it gives her a way of interacting very naturally with other musicians,” Dr Martin said. “She can see what they're doing through her headset, as well as seeing her own instrument.”

Although the imagery from her performance was undeniably stunning, she felt the music left something to be desired.

“I tried to make musical information play through the AR interface but it was hard,” Yichen said. “It played in a very quick, non-human way.”

PhD candidate Yichen Wang performs on an instrument she designed to employ augmented reality.

She is now taking some time to “cognitively process the information and think about new approaches for the improvisation setting”.

Leaping the real-time hurdle

Obstacles for human/computer interaction in the musical sphere have ranged from the technical, to the ethical, to the philosophical.

What are the ethical and intellectual property implications, for instance, when existing music is used to train large language models (LLMs) on music composition?

Dr Charles Martin performing.

Dr Martin and his students have cleared that hurdle by using improvisation to train and test their models. But this created a new challenge. “It’s not acceptable, as it might be in an image or text generation context, to wait several seconds for a generative AI response,” Dr Martin writes in his paper, Generative AI for Musicians.

He proposes a “small-data approach where artists generate, curate, and train generative AI models” rather than using previously generated music. This is what he and his students are exploring with their continued research and periodic performances.
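
As an illustration of that small-data idea, the sketch below trains a deliberately tiny recurrent model on a single performer's own recorded control signal, then asks it to continue the most recent gesture. The model size, window length, and the synthetic stand-in data are assumptions for illustration, not the paper's exact configuration.

```python
# A sketch of the "small-data" idea under stated assumptions: a deliberately
# tiny recurrent model is trained on one performer's own recorded control
# signal (here random values stand in for real gesture data), then asked to
# continue the most recent gesture. Not the paper's exact architecture.
import numpy as np
from tensorflow import keras

WINDOW = 32  # length of gesture history the model sees

def make_windows(gestures, window=WINDOW):
    """Slice a recorded control signal into (history, next-value) pairs."""
    xs, ys = [], []
    for i in range(len(gestures) - window):
        xs.append(gestures[i:i + window])
        ys.append(gestures[i + window])
    return np.array(xs)[..., None], np.array(ys)

# stand-in for a few minutes of the artist's own improvised control data in [0, 1]
gestures = np.random.rand(2000).astype("float32")
x, y = make_windows(gestures)

model = keras.Sequential([
    keras.layers.Input(shape=(WINDOW, 1)),
    keras.layers.LSTM(32),                    # small on purpose: trains quickly
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=5, batch_size=64, verbose=0)

# at performance time, the model continues the gesture from recent history
next_value = model.predict(gestures[-WINDOW:][None, :, None], verbose=0)
```

Keeping the model this small is one way to meet the real-time constraint he describes: a prediction returns in milliseconds rather than the several seconds tolerated in image or text generation.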

“When you’re onstage making music, you want your performance to keep going, you don’t want it to break,” Dr Martin said. “So, there’s an element of risk, a feeling of pressure to get these computing systems to work reliably and in real time.”

Live performance, thus, replicates real-world requirements for computing solutions. “We want our computers to behave reliably, to be efficient, to be timely. Computer music gives students a really interesting angle on this across multiple computing disciplines.”

The concert featured duets, trios, and quartets in which human performers navigated improvisations with intelligent instruments and the musicians operating them.

“In the realm of experimental improvised music, it’s hard to distinguish between a human musician and an intelligent instrument responding,” Sandy said. “But honestly, I really don’t care. A collaborator is a collaborator, whether it’s digital or human. Both are bringing valid input to the performance experience.”

Dr Charles Martin of the School of Computing, Dr Alexander Hunter of the School of Music, and ANU Computing students Sandy Ma and Yichen Wang.

Explore the Honours course, Sound and Music Computing at the ANU School of Computing.

Learn more about the Laptop Ensemble at ANU and how you can be a part of it.
