Polygence Scholar 2022

Pierre Pang

Ecole Moser Geneve · Class of 2023 · Geneva, Geneva

Project Portfolio

To what extent can machine learning find a suitable musical accompaniment for a given melody?

Started Mar. 16, 2022


Abstract or project description

Music generation using machine learning and AI has been a topic of interest over the past few years. Music is a complex task for AI: it unfolds over time, is heavily shaped by human intuition, and is often composed polyphonically, with all instruments interdependent. At the same time, musical data has several features that suit a machine learning model: it follows a strict tempo, is organized around pitch, and, within a given genre or style, exhibits recurring patterns in how pieces are constructed. Currently, most multi-track music generation models rely on CNNs (Convolutional Neural Networks), and music is usually generated from very general training data, so a user cannot generate music around a specific melody.
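The abstract does not specify a data representation, but symbolic models of this kind commonly encode music as a piano roll: a pitch-by-time grid whose regular structure is what makes convolutional layers a natural fit. The sketch below only illustrates that assumed representation; the constants, the melody_to_piano_roll helper, and the example notes are hypothetical.

```python
import numpy as np

# A piano roll is a binary matrix of shape (pitches, time steps):
# rows index MIDI pitch, columns index a fixed subdivision of the beat,
# and a 1 marks that the pitch sounds at that step.

N_PITCHES = 128          # full MIDI pitch range (assumed)
STEPS_PER_BAR = 16       # sixteenth-note resolution, one bar (assumed)

def melody_to_piano_roll(notes, n_bars=1):
    """Convert a list of (pitch, start_step, duration_steps) into a piano roll."""
    roll = np.zeros((N_PITCHES, n_bars * STEPS_PER_BAR), dtype=np.float32)
    for pitch, start, dur in notes:
        roll[pitch, start:start + dur] = 1.0
    return roll

# Example: a C-major arpeggio (C4, E4, G4), one beat each.
melody = [(60, 0, 4), (64, 4, 4), (67, 8, 4)]
roll = melody_to_piano_roll(melody)
print(roll.shape)  # (128, 16)
```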

In this paper, we present a VAE-based (Variational Autoencoder) machine learning model that generates a musical accompaniment for a user-given melody. The main melody is supplied by the user, and the VAE works with a Convolutional Neural Network to find an accompaniment that fits it. Unlike existing models, which generally generate music from scratch after training on many samples, ours lets any user enrich a melody of their own. We trained the model on different genres to produce different styles of accompaniment. Not only does it perform better than state-of-the-art CNNs, but it also gives the user more influence over the output, since they provide the initial melodic idea.
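As a rough illustration of how a melody-conditioned VAE with convolutional layers could be wired together, here is a minimal PyTorch sketch. The layer sizes, the single-bar 128x16 piano rolls, the single accompaniment track, and the names ConvEncoder, ConvDecoder, and vae_loss are all assumptions for illustration, not the authors' actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Assumed shapes: single-bar piano rolls of 128 pitches x 16 time steps,
# one channel for the melody and one for the accompaniment track.
PITCHES, STEPS, LATENT = 128, 16, 32

class ConvEncoder(nn.Module):
    """Encodes a (melody, accompaniment) pair into a latent Gaussian."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=4, stride=2, padding=1),   # -> 16 x 64 x 8
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2, padding=1),  # -> 32 x 32 x 4
            nn.ReLU(),
        )
        self.fc_mu = nn.Linear(32 * 32 * 4, LATENT)
        self.fc_logvar = nn.Linear(32 * 32 * 4, LATENT)

    def forward(self, melody, accomp):
        h = self.conv(torch.cat([melody, accomp], dim=1)).flatten(1)
        return self.fc_mu(h), self.fc_logvar(h)

class ConvDecoder(nn.Module):
    """Decodes a latent sample plus the melody back into an accompaniment roll."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(LATENT + PITCHES * STEPS, 32 * 32 * 4)
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),  # -> 16 x 64 x 8
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=4, stride=2, padding=1),   # -> 1 x 128 x 16
        )

    def forward(self, z, melody):
        h = self.fc(torch.cat([z, melody.flatten(1)], dim=1))
        return self.deconv(h.view(-1, 32, 32, 4))  # logits over the accompaniment roll

def vae_loss(logits, target, mu, logvar):
    """Binary cross-entropy reconstruction term plus KL divergence to N(0, I)."""
    recon = F.binary_cross_entropy_with_logits(logits, target, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# One forward/backward pass on random data, just to show the wiring.
enc, dec = ConvEncoder(), ConvDecoder()
melody = torch.rand(8, 1, PITCHES, STEPS).round()
accomp = torch.rand(8, 1, PITCHES, STEPS).round()
mu, logvar = enc(melody, accomp)
z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
loss = vae_loss(dec(z, melody), accomp, mu, logvar)
loss.backward()
```

At generation time, a model along these lines would sample z from the prior and decode it together with the user's melody, which is what gives the user direct influence over the result.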