EECS's Centre for Digital Music is starting four collaborative projects to develop new ways of using AI in music.
The partnership projects include working with AudioStrip, RoEx, Session, and Stage.
Backed by Innovate UK, these partnerships combine the disruptive ambition of startups with the world-leading expertise of Queen Mary’s Centre for Digital Music.
Simon Dixon, Director of the UKRI Centre for Doctoral Training in Artificial Intelligence and Music at Queen Mary University of London, comments:
“Our Centre for Digital Music has grown into a world-leading, multidisciplinary research group, responsible for numerous spinout companies and business partnerships with companies large and small.
Industry partnerships like these allow us to achieve more real-world impact from our research, and give companies access to researchers working at the very edge of what’s possible.”
Innovate UK Creative Catalyst: AI in the Music Industry
Music source separation aims to decompose a piece of music into its constituent components, yielding separated stems for each musical instrument. Separating an arbitrary number of instruments from an audio mixture at commercial quality remains a challenging problem.
This partnership project with AudioStrip will address this challenge by developing advanced machine-learning algorithms that automatically detect musical instruments for high-quality audio source separation.
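The masking idea behind source separation can be illustrated with a toy example. The sketch below is an illustration only, not AudioStrip's method: it separates a synthetic two-tone mixture by applying a frequency mask in the FFT domain, whereas real systems learn such masks per instrument with neural networks.

```python
import numpy as np

def separate_by_band(mix, sr, split_hz):
    """Toy 'source separation': split a mono mixture into low/high stems
    using a hard frequency mask. Real separators learn soft, per-instrument
    masks; this only illustrates the masking concept."""
    spec = np.fft.rfft(mix)
    freqs = np.fft.rfftfreq(len(mix), 1 / sr)
    low_mask = freqs < split_hz
    low = np.fft.irfft(spec * low_mask, n=len(mix))    # keep bins below the split
    high = np.fft.irfft(spec * ~low_mask, n=len(mix))  # keep the remainder
    return low, high

# Mixture of a 110 Hz "bass" and an 880 Hz "lead" (1 second at 8 kHz)
sr = 8000
t = np.arange(sr) / sr
bass = np.sin(2 * np.pi * 110 * t)
lead = 0.5 * np.sin(2 * np.pi * 880 * t)
low, high = separate_by_band(bass + lead, sr, split_hz=400)
```

Because the two masks partition the spectrum, the recovered stems sum back to the original mixture exactly.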
AudioStrip specialise in machine-learning-based source separation technology: they take a music file and separate it into its individual instrumental and vocal parts.
Basil Woods, Co-Founder and CEO of AudioStrip, comments:
“This technology is sweeping the music industry. AudioStrip will offer more advanced tools for precise separation of individual elements in audio files.
By partnering with Queen Mary, we aim to elevate music source separation technology beyond industry benchmarks, making it an indispensable tool for DJs, independent artists, producers, and licensors.
Our goal is to automatically identify musical elements from any given song - including vocal, instrumental, drums, bass, piano, electric guitar, acoustic guitar, and synthesizer - and extract them into independent tracks without losing quality.”
The path from a raw, unmixed track to a professionally produced release is a steep climb, especially for emerging creators. This partnership project with RoEx seeks to revolutionise music production and create new revenue streams and engagement activities in the music industry.
RoEx is a Queen Mary spinout democratising music production through AI-powered mixing, in essence aiming to do for music what Instagram did for photography.
Dave Ronan, Founder and CEO of RoEx, comments:
“We want to encapsulate the expertise of a professional mix engineer using AI. This innovative venture navigates the complexity of discerning musical instruments and audio effects to automate a significant and challenging aspect of music production.
Our approach is enriched by the world-class expertise from our partners at Queen Mary University of London, known for their innovative research in audio engineering, audio signal processing, and machine learning.
We envision a future where the fusion of AI with human expertise not only enhances the creative process but also cultivates a collaborative and ethical ecosystem in the ever-evolving digital soundscape of music production.”
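One small, concrete sub-task of automatic mixing is level balancing. The toy sketch below is an illustration only, not RoEx's system: it scales each stem to a common RMS loudness, whereas a real AI mixing engine would also handle EQ, compression, panning and much more.

```python
import numpy as np

def auto_balance(stems, target_rms=0.1):
    """Toy automatic level balancing: scale each stem so its RMS loudness
    matches a common target. A crude stand-in for one sub-task of AI mixing."""
    balanced = {}
    for name, audio in stems.items():
        rms = np.sqrt(np.mean(audio ** 2))
        gain = target_rms / rms if rms > 0 else 0.0
        balanced[name] = audio * gain
    return balanced

# Two sine-wave "stems" recorded at very different levels
t = np.arange(8000) / 8000
stems = {
    "vocals": 0.9 * np.sin(2 * np.pi * 220 * t),
    "bass":   0.2 * np.sin(2 * np.pi * 55 * t),
}
balanced = auto_balance(stems)
```

After balancing, both stems sit at the same RMS level regardless of their original gain.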
Innovate UK: Collaborative AI Solutions to improve productivity in key sectors
Writer's block is a well-known affliction that many famous songwriters have described. This partnership will develop generative-AI creator tools to help songwriters, musicians, and producers overcome writer's block and generate sample sounds.
The project will integrate cutting-edge language models, trained specifically on lyrics, literature, and poetry, to build an AI collaborator that helps artists write lyrics.
The aim is to create AI functionality that seamlessly integrates with the creative process - as if the AI were another musician in the studio with you.
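The idea of an AI lyric collaborator can be illustrated at toy scale with a bigram model. The sketch below is a hypothetical illustration only; the project's actual tools would use large trained language models, not a word-pair table.

```python
import random
from collections import defaultdict

def build_bigram_model(lines):
    """Toy bigram 'language model' over lyric lines: for each word, record
    which words followed it in the training lines."""
    model = defaultdict(list)
    for line in lines:
        words = line.lower().split()
        for a, b in zip(words, words[1:]):
            model[a].append(b)
    return model

def suggest(model, start, length=5, seed=0):
    """Suggest a lyric continuation by repeatedly sampling a next word
    that followed the current word in the training data."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# A tiny, made-up training corpus
lines = ["the night is young", "the night is ours", "hold the night close"]
model = build_bigram_model(lines)
line = suggest(model, "the", length=4, seed=1)
```

Every suggested word pair is one the model actually saw in the corpus, which is the essence (at miniature scale) of statistical language modelling.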
The project is led by Session, a company founded by musicians, songwriters, and producers (including ABBA’s Björn Ulvaeus) as a space for artists to collaborate globally in real time. Queen Mary will be leading the research and creation of new generative-AI methods.
Doug Imrie, CEO of Session’s parent company Salt, comments:
"We at Salt are excited to be working with such an AI innovator as Queen Mary University of London on this important step in the evolution of next-generation tools for creators and the music industry.
These tools will bring powerful new functionality across Salt's entire product line, adding greater support for creators and their musical works from the moment of creation, through to distribution and remuneration.
We are also grateful to Innovate UK for their support of Salt's continued mission to ensure that creators' rights and works are protected across the music industry.”
For music creators to be credited and rewarded when a recording is streamed online, record labels need to collate and share credits data describing the artists, songwriters, and producers who contributed. This process today is manually intensive, with the potential for delays and gaps in data.
Stage designs, builds and operates cloud-native data platforms which ensure that music creators and other rightsholders receive credits and royalties quickly, accurately and transparently. Their Connex project will deliver a new service for record labels of any size to collate and share credits data detailing all contributors and their roles – with vastly improved speed, efficiency, quality and security.
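At its core, credits collation means merging contributor lists from multiple sources and removing duplicates. The sketch below shows that merge step in miniature; the field names are hypothetical and not Connex's actual schema.

```python
def merge_credits(submissions):
    """Toy credits collation: merge per-recording contributor lists from
    multiple submissions, deduplicating (name, role) pairs. A simplified
    sketch of the workflow described; field names are illustrative."""
    merged = {}
    for sub in submissions:
        pairs = merged.setdefault(sub["recording_id"], set())
        for c in sub["contributors"]:
            pairs.add((c["name"], c["role"]))
    return {rid: sorted(pairs) for rid, pairs in merged.items()}

# Two overlapping submissions for the same recording
submissions = [
    {"recording_id": "R1", "contributors": [
        {"name": "A. Artist", "role": "vocalist"},
        {"name": "P. Producer", "role": "producer"},
    ]},
    {"recording_id": "R1", "contributors": [
        {"name": "A. Artist", "role": "vocalist"},   # duplicate entry
        {"name": "S. Writer", "role": "songwriter"},
    ]},
]
credits = merge_credits(submissions)
```

The duplicate vocalist credit is collapsed, leaving one clean contributor list per recording.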
This new partnership, which also includes Session, will create a set of AI capabilities that add ground-breaking new features and even greater automation to these workflows. Queen Mary will be leading the research and creation of new generative-AI methods, and Session will be testing with creators.
Paul Dickinson, CEO of Stage, comments:
“We intend to streamline how music labels collate authoritative recording data on who did what, where and when.
We believe that this unique partnership of music industry knowledge, leading technology and pioneering research will develop and deploy Natural Language Processing and AI audio analysis techniques to add powerful new features to Connex, benefitting music creators everywhere.”
Learn more about partnering with Queen Mary at: Collaborate with us - Queen Mary University of London (qmul.ac.uk)