The idea of this project is to create an intelligent musical instrument that predicts a performer's musical interactions to help them create music. This work involves encapsulating machine learning models of creative interaction in a playable instrument that runs on an augmented reality platform such as Meta Quest or Microsoft HoloLens.
The challenge here is not just to create an ML model, but to build it into an interactive system that might be useful for a musician, or just for a curious and creative human!
As part of this project, you will conceptualise, create, and evaluate a musical system. You’ll need to be comfortable learning new languages and should enjoy working with physical hardware. It would be advantageous to have taken Sound and Music Computing.
One focus of this project is the use of augmented reality systems such as HoloLens, Quest, or others. You would use Unity or other 3D interactive software development tools to create your system.
For an Honours/Masters project we would expect you to create a working prototype that includes an ML model and enables interactive sound or music to be created. You would also need to complete some form of formal evaluation. This project could be the basis of a wider PhD project.
Please read information about joining the Sound, Music, and Creative Computing Lab before applying for this project.
How to Apply
To apply for this project, contact Charles Martin and include:
- your CV
- your unofficial transcript (if you are an ANU student)
Make sure you specify the skills and accomplishments you have that would help you complete this project.
Useful Papers and Resources:
- Understanding Musical Predictions With an Embodied Interface for Musical Machine Learning
- Performing with a Generative Electronic Music Controller
- An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks
- A Physical Intelligent Instrument using Recurrent Neural Networks
- A Small-Data Mindset for Generative AI Creative Work (Vigliensoni et al.)