They broke down their process in a 2017 paper posted to the arXiv preprint server. They start by feeding the AI model short segments of music, a few seconds at a time. As this training goes on, the AI learns the music's identifying features and starts to produce more and more detailed samples of its own, including riffs and sectional transitions.

In the paper, Carr and Zukowski wrote that they were initially surprised by the result: the songs would destabilize and fall apart. "While we set out to achieve a realistic recreation of the original data, we were delighted by the aesthetic merit of its imperfections," they wrote.

And it produced another pleasant surprise: "Solo vocalists become a lush choir of ghostly voices, rock bands become crunchy cubist-jazz, and cross-breeds of multiple recordings become a surrealist chimera of sound."

The music the livestream is based on comes from Archspire, a Vancouver-based technical death metal band. The duo characterize their work with Dadabots as working toward "eliminating humans from black metal." For each project, they have Dadabots analyze "subsets of a single artist's discography" and generate new work from it; they then curate the best-sounding tracks into an album. Besides the YouTube stream, they have released 10 different albums based on the music of metal and experimental groups like Aepoch, Battles and Meshuggah. While the output doesn't sound totally human (the vocals in each track are distorted gibberish, notes are held without room for breaths, and some of the guitar riffs are at speeds most people couldn't achieve), the general feel and instrumentals are convincing, especially to the untrained ear.
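The "short segments of music, a few seconds at a time" approach described above can be sketched in a few lines. This is only an illustration of the data-preparation idea, not Dadabots' actual code; the function names, segment length, and sample rate here are assumptions for the example.

```python
import numpy as np

def make_training_segments(audio, sample_rate=16000, seconds=4):
    """Split a raw mono waveform into fixed-length chunks a few
    seconds long -- the short segments the model trains on."""
    seg_len = sample_rate * seconds
    n = len(audio) // seg_len          # drop the trailing remainder
    return audio[: n * seg_len].reshape(n, seg_len)

def next_sample_pairs(segment):
    """Autoregressive targets: given samples up to time t, the model
    must predict the sample at t, one step at a time."""
    return segment[:-1], segment[1:]

# Hypothetical stand-in for 60 seconds of mono audio at 16 kHz
audio = np.random.randn(16000 * 60)
segments = make_training_segments(audio)
inputs, targets = next_sample_pairs(segments[0])
```

Each `(inputs, targets)` pair shifts the waveform by one sample, so the model learns to continue the audio it has seen so far; longer, more structured output (riffs, transitions) emerges as training progresses.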