At a time when the music industry is anticipating the impact of generative AI, DAACI has become a key player in the fast-developing field.
In the past week, the start-up has confirmed new product launches – including Natural Edits, an editing tool for the global sync market – and an education partnership with BIMM University.
DAACI is an assistive composition platform that aims to empower music creators to unlock their creativity with artificial intelligence. The system is based on PhD research with the UKRI Centre for Doctoral Training in Artificial Intelligence and Queen Mary University of London.
The tech start-up supports the Human Artistry Campaign in the development of ethical AI that protects the rights of composers, producers, and artists across the world.
To find out more about DAACI’s role in the AI music landscape, Music Week spoke to CEO Rachel Lyske about the potential for the technology in music…
What is DAACI’s vision for music and AI - how are you helping creatives utilise the technology?
“Music is such an incredibly special and beautiful thing. It’s integral to the human experience. It helps us communicate with each other, express ourselves, and release emotions. The music that really moves us is important because it’s made by people who share those emotions and experiences. Musicians are also technologists – we look to tech to help express ourselves. We use microphones to be heard, DAWs to compose, and records to listen to and to allow others to experience what we have created. AI is a powerful tool, and this is never more true than when we’re in control of it. Our own creativity leads us to unleash AI’s potential as a creative tool. Will it inspire me? Will it teach me something? Will it increase my output, or offer me a different perspective or solution?
“My brother Joe [Dr Joe Lyske] had all of that in mind when he came up with the concept for what would become DAACI. As a composer, he wanted a tool that would allow him to pilot the process of writing music without having to have his hands on every one of the controls. He needed a system that would effectively understand what he was trying to achieve musically and actively help him to get that result. DAACI is all of the above and it’s human-led, letting creatives be creative. The decisions are down to them or there’s just no point. And we’re putting creatives in the driver's seat of this new technology. We’re building it because we as musicians want to use it.”
You recently made acquisitions of fellow start-ups MXX and WiSL – how will this empower artists and rights-holders as they navigate this new technology?
“We’ve acquired a serious suite of technology that draws together all aspects of the creative process, powered by DAACI’s core technology. From how we discover and edit music through to composing and production, we are making tools that will not just elevate the status quo but go far beyond it, with music that is both dynamic and responsive. It’s a whole new world of possibilities. We want to offer artists, creatives, and rights-holders choices, functionality and ease. What it means is we have the most powerful collection of tools all in one place that people can use however they want, and it’s all connected. This isn’t about our power – it’s about empowering others. How they use it is up to them.”
Does assistive AI ensure that the human element remains at the forefront of music creation? Given the concern of groups such as the Human Artistry Campaign, how is DAACI distinct from generative AI platforms?
“DAACI’s technologies are both assistive and generative, but what differentiates our generative AI is that it is the only one that categorically relies on human input. We start with our own unique method of encoding a composer’s musical ideas. That ensures that the human element remains at the forefront of all music creation. The AI is just an assistant - just a collaborator. The future of music and AI needs music makers and technology to work in harmony. That’s why DAACI stands with the Human Artistry Campaign in the development of ethical AI that protects the rights of composers, producers and artists across the world.”
AI has had a lot of coverage with concern expressed about the impact on music and copyright. Do you think perceptions of the technology are too negative - will AI benefit the music industry overall?
“AI is not new, and it’s not just one thing either; it covers a whole host of technologies and uses. It’s also here to stay, so we have to be proactive in the way we approach it and shape it in a way that we like. Negative perceptions are valid if we aren’t in control and the technologies are designed to replace us, but the way we approach AI is all about starting with the human and expanding from there.
“I think it’s a really positive sign that so many people are passionate and vocal about getting this right. The music industry is a series of technological steps; this is the next one. AI offers new solutions and possibilities that didn’t exist before, and these can be very beneficial if we guide it that way. For our part, DAACI has been working with groups across the music industry to share the benefits of AI. If we work together, we can make sure everyone in the music-making process is supported and compensated while continuing to create beautiful, high-quality music.”
How important was your inclusion in the Abbey Road Red incubator programme? And how is DAACI working with the music industry and music creators?
“The team at Abbey Road are incredibly supportive and sensitive to what we are striving to achieve with DAACI. We are building our system as a creative tool for the very people who walk their halls, so advice, guidance and feedback from within the industry are crucial to our mission. We have an incredibly powerful system, and together with the industry we can build something special that is value-adding for artists, composers, producers and music makers of all kinds. We have been able to speak to incredible people within Abbey Road Studios and UMG, pitch our ideas and technology, and get their feedback and steer on how we should develop it, whilst respecting artist rights and ethics and so much more.
“We have also had the recent privilege of completing the 12-week-long ASCAP Labs Music and AI Challenge. It’s so important to take this kind of opportunity to interact with music makers at the top of their game, to learn and to listen to feedback. We’ve actively pursued opportunities with groups and industry leaders like ASCAP, Abbey Road Studios and The Ivors Academy because, by working together, they recognise DAACI as being at the forefront of creator-first AI. That’s an incredible privilege and their trust means a lot.”
How is the company expanding in terms of staffing and any global presence?
“I am so fortunate and proud to be part of such a unique team of talented and passionate musicians, researchers and developers. The team is growing fast, and so is our presence globally. Appearances at industry conferences and events – like our co-head of research Dr Nadine Kroher speaking on an AI-related panel in Bergen, the Ivors Global Summit in London, and the ASCAP Experience in LA, where I joined the ASCAP board in conversation – all help raise the profile of DAACI’s work in music and AI.
“It’s also worth mentioning that we have many people reach out to us on socials and the website. We love the curiosity, openness and questions that they’re bringing and it’s a great feeling to share our passion and vision.”
What are the plans for DAACI? What developments and features are on the way?
“It’s an exciting time for DAACI! We are revealing elements of the DAACI system in various product teasers and releases over the next year. Natural Edits, which will debut in a B2B setting, is a simple yet powerful tool that allows users to discover and edit tracks for sync. We’re also very excited to release our first B2C teaser product for all music creators. This will be the first in a series of launches that will allow users to integrate the DAACI system into their workflow. We can’t wait to see what people do with our tools, and to hear the music they’ll make.”