43 min

AI & The Art of Music (Creative Next: AI Automation at Work)


AI is being used by musicians such as this episode's guest, Claire Evans, a member of the band YACHT. Their latest album, Chain Tripping, leveraged machine learning for the music, lyrics, and more.
Artists are making the most of machine learning, using the technology both in the creation of their art and as a cultural touchpoint for expression, exploration, and commentary. While the Internet and other emerging technologies have long had a disruptive impact on musicians and others who create with audio, Claire Evans and her band YACHT (Young Americans Challenging High Technology) are at the vanguard of discovering how these technologies will shape art and music in the future.
 
Memorable Quotes
“Since we only really learn by doing things and making things, we figured that the most efficient way for us to get a sort of bodily understanding of what the hell AI is and what it's doing and what it means for artists and for all of us was to try to make something with it.”
“I think when we first started this project, we naively thought we could just kind of hand our back catalog to some algorithm, and the algorithm would analyze that and spit out new songs that would be new YACHT songs. And the project, the art, would be about committing to that, whatever it was. As soon as we started working on this, we realized that we're not there yet, thank God. Algorithms can't just spit out pop songs. If they could, the airwaves would be full of them.”
“If you listen to the record it sounds like an interesting experimental rock or pop record. It doesn't sound like generative, you know, plausible nonsense. It sounds like songs, and that's because there was very much a human in the loop. We used the machine learning model to facilitate the process of generating source material, and then from that source material we built songs the way that we would always build songs as humans in a studio playing music.”
“I was projecting my own meaning onto words that I didn't write. And trying to sort of cobble together some kind of meaning to the songs that made it possible for me to sort of perform and convey them with my voice. And so, it's oddly democratizing, because now the fans, the listeners, and the band, are all trying to figure out what it all means at the same time. And we were going to have as many interpretations of what it means as there are people to listen to it.”
“It also has no consideration of the body, right. It doesn't ‘know’ what it feels like to play any of these melodies on the guitar or on the keyboard, or if it's physically challenging to do. All it knows is the MIDI data that it's been fed in the training process. So, a lot of these melodies sounded odd, but simple enough to play. But then when we sat down to actually play them, we found that they were extremely challenging, because they forced us to acknowledge the embodied habits that we bring with us as players into the studio.”
“I like to think of some of these machine learning models being like a camera of their individual disciplines. I mean, a text-generating model that's able to make perfect texts. Maybe that just becomes the camera of writing. And we have to completely step outside of our comfort zone to reinvent what writing means in the 21st century. And what an exciting proposition that is for an artist.”
“There's also something really interesting about the reflective quality of AI as it works today. I mean, you build a machine learning model by feeding it lots of information, training data. And in the context of music that information is historic. It's the history of music. It's a corpus of millions of notes, or a corpus of millions of words, of song lyrics from musicians and artists that we love. Or ourselves. So this idea that we could use an emerging technology not only to learn to understand it, but also maybe learn something about ourselves in the process.”
“Maybe in ten years we won't even be making music…”

