Despite having lived in Berlin for several years, composer Holly Herndon still retains the slight twang of her native Tennessee. It’s one of the first things I notice upon meeting her at the east London hotel where she’s staying during a visit. The accent, slow and soft, gives her an unexpectedly friendly air, at odds with the image so often projected by her chosen industry. Holly is doing the press rounds but, despite it being near the end of her day, welcomes me warmly, and what ensues is a fascinating, enthusiastic conversation about her practice, particularly the recent release of her album Proto, made in collaboration with an artificial intelligence.
Holly is an established name in the electronic music world. Her first real breakthrough came in 2012 with the release of Movement, which preceded Chorus (2014), Home (2014), and Platform (2015). But this most recent work has turned the heads of arts and culture fans across the board. The album, which as always has been made in collaboration with Holly’s partner Mat Dryhurst, features 13 tracks and contributions from artists like Martine Syms and Jlin, and sees Holly fronting and conducting a choir comprised of both human and AI voices. To produce the album, Holly and Mat created (or gave birth to) what they call an “AI baby”, affectionately named Spawn, who listened to group vocalisations and taught herself to manipulate sound.
With no learning parameters set for Spawn, the pair used openly available machine-learning tools, pushing standard code to manipulate sound directly. As a result, Spawn learned from Mat, Holly, and collaborator and developer Jules LaPlace. Proto, in turn, represents a compelling step forward in the creative applications of artificial intelligence.
A far cry from the problematic Alexas of the world, Spawn is not limited to rehashing or remixing existing content but actually creates sound herself, and she is a key character in the conception of Proto. As the album goes on, we hear her grow and learn, turning baby babbles into developed sounds, not only revealing the human that exists behind every AI or piece of tech, but also warning us that it’s time to get smart about how we create artificial intelligence and what we do with it. Ultimately, we are in control of it, and that means we have a responsibility.
INT: When and why did you become interested in working with AI? And how did that lead to the creation of Spawn?
HH: It’s a continuation of a lot of the work I was doing as a computer musician. I’ve always been interested in algorithmic music, computer music, and AI felt like the next frontier in that. And then my partner, Mat Dryhurst, and I were awarded a grant in Germany in honour of Beethoven (as one is). The idea became, “What could we spend this money doing, that normally we would be too scared to do?” Because we wouldn’t know what the outcome of that experiment would be. And so this seemed like the perfect opportunity. Luckily, it turned into something that we were able to use on our album, but when we started it, we didn’t know where it was going.
INT: And since that first experimentation, how has Spawn developed? Has she become significantly more intelligent?
HH: I don’t think she’s become any more intelligent. I think that we know how to work within her limitations better. And we’ve become more intelligent.
“I’ve always been interested in algorithmic music, computer music, and AI felt like the next frontier” – Holly Herndon
INT: Why did you decide Spawn was a she?
HH: That’s a question that keeps coming up – it’s on everyone’s mind. Like, what’re the politics behind that? It’s actually as simple as this: the first voice she was trained on was mine. So the first things I heard out of her “mouth” were a reflection of myself, and so I felt like she was a daughter. It’s not political.
INT: I don’t know if I even thought it was political – it’s more that you’ve given her a distinctive character, which I find really interesting.
HH: Yeah, we decided on the child metaphor pretty early on, even though we think the anthropomorphising of AI and technology is somewhat problematic and not that interesting – we purposefully didn’t give her a visual avatar for that reason. But we wanted to use the metaphor of raising a child to give the process some gravity and some weight. We felt that metaphor was a nice way to do that while also drawing attention to the fact that, whatever AI you’re dealing with, it’s not an embodied being and it doesn’t have a wider context. It only has access to the information that you and the community around you are giving it and putting in front of it – very much like raising a child. This is all too often made invisible with the AI that we encounter.
INT: So we’re usually meeting it at the point when all that “raising” has already happened?
HH: Exactly. Revealing that also means seeing ourselves as implicated in its development. This isn’t just some alien species that landed here; we can’t be like, “AI is gonna do what AI is gonna do”. No, we’re developing it. We can say what it will be.
Still from Godmother: Holly Herndon & Jlin (feat. Spawn)
“We wanted to see her as something that’s always evolving and always becoming, instead of a fixed entity” – Holly Herndon
INT: Do you think there’s an element of trying to make her something that people can connect with?
HH: I think using the metaphor of the child, as well as giving her a name, makes people connect with the concept. And that was purposeful, of course. Because otherwise we’re talking about neural network architecture, and that’s very dry. Instead, it opens up the conversation to something more societal and philosophical.
INT: Have you found yourself connecting with her over time?
HH: I think maybe I’m more amenable, more drawn to Spawn. And I’ve learned what she likes to be fed, how to work with her.
INT: That’s such a strange feedback loop – is that what she likes, or what you like? What about the differences between working with the human choir and with Spawn?
HH: They’re very different, obviously, but we tried to position Spawn as an ensemble member from the get-go, because a lot of the AI being developed now is for automated writing purposes, automated composing. We didn’t want to see Spawn in that kind of role. We wanted to see her in a more performative role, so framing her as an ensemble member helped clarify that a little bit. And while she interprets what we give her much as a human would, it’s different. It’s much slower, because there’s a rendering time.
INT: What actually is that process of interpretation? How does that work?
HH: Well, Spawn is not real-time. So we’re usually cutting and editing performed experiences – performed rehearsals or whatever it may be – together. We’ll record that, edit it into a training set, and then feed that to Spawn. That will then be rendered, and she’ll output an audio file. So it’s not so much this idea of her sitting there and listening. Even the way the press release is worded, it sounds like she’s cooking or something. Somebody actually asked me how Spawn cooks. I was like, “Oh, wow. Okay, no.”
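The non-real-time workflow Herndon describes – record, edit into a training set, train, then render the whole output at once – can be sketched in miniature. The snippet below is a deliberately toy illustration: all names are hypothetical, and the “model” is just a moving average standing in for the neural networks a system like Spawn would actually use. The point is the shape of the pipeline, not the maths.

```python
# A toy, non-real-time pipeline: record -> edit into a training set ->
# "train" -> render the entire output in one pass (no live interaction).
# Hypothetical names throughout; the model is a stand-in, not Spawn's.

def edit_into_training_set(recordings):
    """Cut and splice recorded clips into one training sequence,
    dropping empty takes (the 'editing' step)."""
    return [sample for clip in recordings if clip for sample in clip]

def train(training_set, window=2):
    """'Learn' a trivial model: the moving average of the training audio."""
    return [
        sum(training_set[i:i + window]) / window
        for i in range(len(training_set) - window + 1)
    ]

def render(model):
    """Non-real-time output: the whole result is produced at once,
    like Spawn outputting a finished audio file after rendering."""
    return list(model)

# Recorded 'rehearsals' as bare sample values; the middle take is empty.
recordings = [[0.0, 1.0, 0.0], [], [1.0, 1.0, 0.0, 1.0]]
training_set = edit_into_training_set(recordings)
output = render(train(training_set))
print(output)  # → [0.5, 0.5, 0.5, 1.0, 0.5, 0.5]
```

The key property the sketch preserves is the one Herndon emphasises: nothing happens live – the output only exists after the full record/edit/train/render cycle has run.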
INT: In terms of your creative agency within the project, how has working with an artificial intelligence affected that for you?
HH: Well, it’s an inherently collaborative process. When I made Movement, I started out working completely alone with my laptop – it was very isolated. Then with Platform, I started collaborating with other people, sometimes online, sometimes offline, and also with Mat in real-time. And this album felt like opening the practice up to full collaboration in all aspects. When I was making my first record, I felt more protectionist, like I needed to prove myself, and there wasn’t room to open it up to anyone else. For example, being a woman in electronic music, people would come up to me after my shows and ask who wrote my beats. But as I became more established, felt more comfortable in my own skin, and as the political landscape of society changed, I’ve found myself fully embracing collaboration. And working with this AI is almost exponentially collaborative.
INT: You’ve previously spoken about how people struggle to grasp who has made your music, because it’s so collaborative. How has the response been to Proto?
HH: I think a lot of people confuse Spawn performing the work that Mat and I feed her with her writing it. I just try to make it clear that she’s an ensemble member. But it’s also an age-old conversation in ensemble dynamics. Where does the sheet music end, where does the composer end, where does the conductor end, where does the interpreter end? These things overlap in a lot of ways. Ensemble members interpreted certain lines differently to what I had in my mind; there is always a kind of co-writing happening when you’re working with performers.
“We can’t be like, ‘AI is gonna do what AI is gonna do’. No, we’re developing it. We can say what it will be” – Holly Herndon
INT: Has that process been similar with Spawn? Has that happened?
HH: In a way. She’s way more interpretive, more surprising.
INT: Which, again, is very reminiscent of a child. I’m fascinated by all of that – do you know Mediengruppe Bitnik?
HH: Yeah! They’re friends of mine.
INT: That element of unpredictability reminds me of their project Random Darknet Shopper.
HH: Yeah, the legal implications of their work are so interesting. I love them, because they set up these systems and bring these grey areas to the forefront, where you have to scratch your head and be like, “Huh, legal precedent doesn’t really work for this.”
INT: Speaking holistically here – in your own practice, but also in the world of creativity – what benefits do you think artificial intelligence and machine learning can offer us?
HH: I think, in a wider, philosophical view, we can stop thinking about human rationale and intellect as the singular thing in the universe, and understand that there may be intelligence beyond us. So how can we learn from other, non-human intelligences? For music specifically, I think that in the same way the digitally processed voice has helped transcend the physical limitations of my body, machine learning will further that ability. I think there might be a way it can help facilitate a group logic, a kind of distributed group intelligence. I also think it could simply help us iterate ideas faster, so that we can focus on other musical parameters.
INT: What about on the flip side of that?
HH: I think we should avoid training AI on the existing canon. So far, instead of making new training sets, and instead of trying to take AI to a new place, a lot of work has been focussed on “let’s train it on Bach and then have new pieces of Bach forever”. I feel like that can really get us into an aesthetic and creative cul-de-sac of rehashing and recycling ourselves. Culturally, we struggle with that anyway – a kind of retromania and nostalgia. Of course, we’re always building on a shared language, we’re never entirely starting from scratch, but I think it’s important that it continues to build and that we don’t get stuck in a loop.
INT: Considering how we understand AI, the collaboration that takes place is akin to one with a creative partner. So do you see it as more than a tool?
HH: You could classify it as a tool, I suppose, though I think it’s different to what’s come before. Maybe it’s a new kind of tool. It’s not simply like algorithmic music; it’s an entirely different logic. One way that I like to explain it is: with algorithmic music, the composer sets up the parameters, and there might be randomisation in there. So you might be surprised by some of the outputs or where it goes, but you set up the parameters, you set up the rules. With machine learning, you can provide a canon, or training set, that has the rules embedded in its own logic – in the way that it was formed – and the computer deciphers the rules from that canon. So it’s a different approach. You’re not setting up the rules yourself; you’re extracting them from something that already exists. That’s why there is a propensity to get stuck in whatever came before.
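The distinction Herndon draws can be made concrete with a toy sketch (hypothetical names throughout, and no claim that this is how Spawn works). In the algorithmic approach the composer writes the rules by hand; in the learning approach the rules are extracted from an existing “canon” – which is exactly why the output can’t escape the note-to-note moves that canon already contains.

```python
import random

# Melodies as MIDI-ish note numbers. Two toy generation approaches.

def algorithmic(start=60, steps=8, seed=1):
    """Composer sets the rules explicitly: each step moves the melody
    up or down by a whole tone. The rules exist before any music does."""
    rng = random.Random(seed)
    notes = [start]
    for _ in range(steps - 1):
        notes.append(notes[-1] + rng.choice([-2, 2]))
    return notes

def learn_rules(canon):
    """Machine-learning-style: extract note-transition rules from
    existing material (a first-order Markov model of the canon)."""
    rules = {}
    for a, b in zip(canon, canon[1:]):
        rules.setdefault(a, []).append(b)
    return rules

def generate(rules, start, steps=8, seed=1):
    """Generate by following only the transitions found in the canon."""
    rng = random.Random(seed)
    notes = [start]
    for _ in range(steps - 1):
        notes.append(rng.choice(rules.get(notes[-1], [notes[-1]])))
    return notes

canon = [60, 62, 64, 62, 60, 64, 62, 60]   # the "training set"
melody = generate(learn_rules(canon), start=60)
print(melody)
```

Every consecutive pair in `melody` also appears somewhere in `canon` – a miniature of the “aesthetic cul-de-sac” Herndon warns about when AI is trained only on an existing canon.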
INT: So, away from the computer, how does your collaboration with Mat enhance the project? Particularly from a visual standpoint.
HH: Mat and I are creative partners, and we’re also partners outside of work. We’ve been together now for ten years, and I feel like we have this insane hive mind. But he, of course, has his own stuff going on – he lectures at NYU and teaches a class there called “Surviving the Future”. As a result, he thinks about the world from a more infrastructural viewpoint. And so we complement each other in the way that we work, which is really crucial. He brings so much to the project, particularly in terms of the visual aesthetic, using machine-vision techniques that marry the visuals and the sound.
INT: You mentioned earlier that you didn’t give Spawn any kind of visual identity. What was the decision behind that?
HH: It’s already problematic enough to anthropomorphise Spawn through the child metaphor, and we were afraid that if we gave her an avatar, we would further that, even if it was a non-humanoid kind of avatar. We wanted to see her as something that’s always evolving and always becoming, instead of a fixed entity. Of course, we could have created some sort of generative animation, but we didn’t want there to be any fixed concept of what she would look like. We really wanted to focus on the sound, and felt that giving her too much of a look would pacify that. Instead, we’ve chosen to represent her through the people that made her.
“I don’t think she’s become any more intelligent. I think that we know how to work within her limitations better. And we’ve become more intelligent” – Holly Herndon
INT: What’s the significance of that?
HH: One of the huge political issues is that we erase the human labour that goes into making AI, but also the rare-earth minerals that go into the machines processing and crunching the data. So we just wanted to try and make that a bit more visible.
INT: Moving forward, how do you intend to work with Spawn in the future?
HH: We’re trying to figure out how to get Spawn to really perform, in real-time. Because right now there’s this long rendering time, and I don’t want to present any kind of bullshit, fake-magic thing. I want to be really honest about it. That’s one reason why we left Godmother in its raw state – it has this roughness to it, the honest sound of what AI sounds like. So with the live performance, we’re trying to figure out real-time systems, because it has so much potential. For example, when I think about Melodyne or Auto-Tune: when that stuff was not real-time, it tended to be used in more conservative ways, just for perfecting pitch. But as soon as it became real-time, artists started using it in the studio straight through the plugin – they wouldn’t even record a clean channel, because it became part of their voice in a cyborg kind of way. And then their flows started mutating with the Auto-Tune software, and you had this entirely new cadence developed through that.
That’s when I get really excited. First, you have the technology which tries to make the human more perfect and machine-readable. And then the human pushes it to the next level, and you really hear that human expression, but through this technology, and then there’s a whole new aesthetic that’s developed through that. Yeah, that’s when I get really excited, so I’m hoping that we can get to that point with Spawn. That’d be great.
About the Author
Ruby joined the It’s Nice That team as an editorial assistant in September 2017 after graduating from the Graphic Communication Design course at Central Saint Martins. In April 2018, she became a staff writer and in August 2019, she was made associate editor.