AI Music Is Here. Can It Change the Future of Art?
As more artists, including musicians and actors, feel compelled to use intellectual property law to protect their work and likenesses against AI replication, the world risks losing access to their talent.
This month, American composer and musician Holly Herndon released a cover of Dolly Parton's Jolene on her YouTube channel. The cover, however, was not performed by Herndon. It is credited to Holly+, an Artificial Intelligence (AI) program. Trained on Herndon's own voice, Holly+ was created to explore musical possibilities that Herndon herself couldn't, such as singing in multiple languages and adapting to complex vocal styles beyond her physical range. With the release of Jolene sung by Holly+, Herndon signaled the beginning of a new phase in the use of AI in music.
Herndon’s vision exemplifies how artists can achieve new possibilities when they adapt AI for artistic pursuits. On the other hand, the backlash against AI — in part generated by non-consensual use of artists’ works — could strengthen legal protections on existing art, making access to art even more restrictive.
This brings us to a major ethical, and perhaps legal, question that artists have grappled with for quite some time: that of artists' likenesses and skills being used without their consent to create programs that surpass their own artistic capabilities. As The New York Times reported, "What makes the new breed of A.I. tools different, some critics believe, is not just that they're capable of producing beautiful works of art with minimal effort. It's how they work… by scraping millions of images from the open web, then teaching algorithms to recognize patterns and relationships in those images and generate new ones in the same style. That means that artists who upload their works to the internet may be unwittingly helping to train their algorithmic competitors."
On one hand, AI presents artists with the possibility of going beyond the restrictions imposed by their physical capabilities. On the other, it could create a situation where intellectual property and copyright laws are invoked to protect publicly available art, restricting people's access to music itself. We saw this earlier with open source code, and many note that the unethical use of publicly available data to train AI can bolster the push toward copyright protections.
The use of AI in art has had a divided reception. In September, for instance, many artists cried foul when an AI-generated piece won an art competition in Colorado. While a large portion of the general public has embraced easy-to-access AI art tools like DALL-E Mini and Midjourney, many artists and creators remain uncomfortable with the proliferation of AI in art. They believe it will eventually lead to "real" artists losing their jobs and work to computer programs.
Related on The Swaddle:
An AI Rapper Perpetuated Racist Stereotypes, Showing How Tech Commodifies Culture
For Jason Allen, who created the winning Colorado image, it was simply a way of testing the limits of what AI-generated art could do. Herndon had a similar idea in mind while creating Holly+. "There's a narrative around a lot of this stuff that it's scary dystopian," she said in an interview with Wired, adding, "I'm trying to present another side: This is an opportunity." Herndon has made Holly+ available for everyone to use and collaborate with, excited at the prospect of her likeness and creation reaching artistic heights she physically couldn't scale herself.
Herndon is also concerned about the fallout of artists trying to aggressively protect their work from being used by computer programs. "We had a practice run in the last century [with creating legal protections for artists and their intellectual property] and we messed a lot of it up," she tells Wired, adding, "I could see people signing away contracts right now that could have really detrimental impacts on their future ability to make work as themselves." This adds to the larger concern about intellectual property itself as a safeguard for art and other creative endeavours.
Often, intellectual property rights rest not with independent creators but with the companies — labels, publishers, production companies — they work for. This at times prevents artists from being able to even access their own work. At other times, despite the fact that all art is derivative, it restricts other artists from meaningfully engaging with pre-existing art. According to Nina Paley, a multimedia artist and filmmaker and one of the strongest critics of copyright and intellectual property laws, "Ideas can flow in and they flow out, and they change little as they go along and that's called innovation or progress. But thanks to copyright… I often hear people engaged in creator pursuits ask 'am I allowed to use this? I don't want to get in trouble.' And it's the threat of trouble that is dictating our choices about what we express."
The push and pull over artists invoking intellectual property laws to protect their work or skills from AI, then, could end up restricting access to art more than AI itself does. However, artists like Herndon and Allen believe that the issue is not AI itself but the companies that build these systems. As Allen told The New York Times, "It shouldn't be an indictment of the technology itself. The ethics isn't in the technology. It's in the people." Herndon advocates a similar view. "It's often pitted as if it's us versus these evil companies," she tells Wired, adding, "[but] they want this problem to be solved. They want to have consensual training sets."
Amlan Sarkar is a staff writer at TheSwaddle. He writes about the intersection between pop culture and politics. You can reach him on Instagram @amlansarkr.