In the aftermath of Grimes’ ode to Roko’s Basilisk, Holly Herndon’s new track “Godmother” feels like something of a necessary course correction.
Grimes’ joyous slice of propaganda was certainly entertaining, but will our AI overlords really be all that receptive to nu-metal guitars and industrial cyber(pop)punk? Herndon, whose new track was made in collaboration with Jlin and with Spawn, an AI she created alongside Mat Dryhurst, is here channeling a more appropriate AI response.
This new track feels very much like a continuation of a line of thought that Herndon has been evangelising for years, making us reconsider just how things can and should “sound” in our present moment.
I’m reminded of a part in her brilliant RBMA lecture with Emma Warren in which she discusses her work on the sounds of electric cars:
HOLLY HERNDON: Electric cars don’t have the same kind of natural engine sound that non-electric cars have. A lot of car companies have been putting recordings of actual, like physical mechanical sounds in their cars because you have to tell people… There’s been a huge problem with people who are visually impaired or older people not hearing cars.
EMMA WARREN: Or perhaps people who are just on their phones.
HH: Or people on their phones, which is me often. Yes, so they’re trying to figure out a way to let pedestrians know that cars are coming and so a lot of sound design companies are basically coming up with spaceship sounds, because I guess that’s what it’s like…
EW: So your car when you’re driving to the shop is supposed to sound like a spacecraft?
HH: That’s what the idea has been, that was the grand idea but I think that’s a really boring solution to what could be basically any kind of sound. So I was working with this company called Semcon, and we presented at the Frankfurt Motor Show last year, which was a really unusual venue for me to be showing stuff. [laughs] Basically, we came up with some different options for what an electric car could sound like and when you turn your wheel how could you play your car, and how your car could be an instrument in that way. One of the ideas that we came up with was to have a microphone system that would pull in the sound of the city wherever you were. Then it could process that, and then that could be a part of it, so it wouldn’t just be like a one-fit solution for every city. I think urban sound planning and things like that are really interesting.
EW: If you were in charge of the way the electric cars sound when they’re driving down the street, they would sound differently in the city than they would in the countryside?
HH: Yes. In short, yes.
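As an aside, the idea Herndon describes above, a car whose warning voice is derived on the fly from the sound of wherever it happens to be, is simple enough to rough out in code. The sketch below is purely illustrative and assumes the Python sounddevice and numpy libraries; the “spectral smear” processing and every name in it are my own stand-ins, not Semcon’s or Herndon’s actual design.

```python
# Toy illustration only: derive a car's pedestrian-warning sound from the
# ambience of wherever the car currently is, rather than from a canned
# "spaceship" recording. Assumes: pip install sounddevice numpy
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100      # Hz
CAPTURE_SECONDS = 2.0    # length of the ambient snapshot

def capture_ambience(seconds: float = CAPTURE_SECONDS) -> np.ndarray:
    """Record a short mono snapshot of the surrounding city or countryside."""
    audio = sd.rec(int(seconds * SAMPLE_RATE), samplerate=SAMPLE_RATE, channels=1)
    sd.wait()
    return audio[:, 0]

def reshape_for_warning(ambience: np.ndarray, speed_kmh: float) -> np.ndarray:
    """Scramble the phase of the captured sound (a crude 'spectral smear') and
    scale its level with speed, so the warning voice is built from the local
    environment but still clearly signals an approaching vehicle."""
    spectrum = np.fft.rfft(ambience)
    magnitudes = np.abs(spectrum)
    random_phases = np.exp(1j * np.random.uniform(0.0, 2.0 * np.pi, magnitudes.shape))
    smeared = np.fft.irfft(magnitudes * random_phases, n=len(ambience))
    gain = min(1.0, 0.2 + speed_kmh / 100.0)  # louder as the car speeds up
    return gain * smeared / (np.max(np.abs(smeared)) + 1e-9)

if __name__ == "__main__":
    snapshot = capture_ambience()
    warning = reshape_for_warning(snapshot, speed_kmh=30.0)
    sd.play(warning, SAMPLE_RATE)
    sd.wait()
```

The point of the toy is only the shape of the loop: listen, transform, play, so that the same car sounds different in Frankfurt than it does on a country lane.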
With the increased marketing push towards cloud-based voice services like Amazon’s Alexa and its Echo speakers, Spawn seems to encapsulate this idea in a way that is, quite literally, parental rather than commercial. Spawn won’t order your shopping like a WiFi-enabled dumb waiter, but she’ll listen, learn and talk back — even “sing” back — to you.
This is precisely how Herndon talks about Spawn in a statement released on Twitter alongside the single, giving an insight into Spawn’s gestation from embryonic code to sponge-brained AI child thirsty for sense-data.
She is being raised by listening to and learning from her parents, and those people close to us who come through our home or participate at our performances.
Spawn can already do quite a few wonderful things. ‘Godmother’ was generated from her listening to the artworks of her godmother Jlin, and attempting to reimagine them in her mother’s voice.
This piece of music was generated from silence with no samples, edits, or overdubs, and trained with the guidance of Spawn’s godfather Jules LaPlace.
This feels like a natural next step for Holly: the latest in a trilogy of records that traces the fascinating progression of not a single idea but a whole host of interconnected, socially embodied implications concerning our relationships with technology.
Right now, it feels like things have come full circle. From the recursive body-mediating laptop-relations of 2012’s Movement through to the inhabited online worlds of Platform, “Godmother” sees Herndon’s deep gazes into the interiorities of our laptop lives turned around. Now, the code is looking back.
Around the time of Movement’s release, I remember Herndon supporting Cosey Fanni Tutti for a couple(?) of shows. So much was made of the novel ways she was treating her “laptop” at the time, her work still somewhat unfamiliar to its new audience. The music press, making sense of her performances, would nonetheless ground their analogies in the familiar, likening her virtuosity to that of a violin player for the way that she “embodied” her playing of the instrument. Watching a violinist play is certainly to see someone engaged in a full-body exercise. But so is playing the drums. Or brass. Or whatever…
Music-making is essentially an embodied activity. Even when made on a laptop. The combination of Herndon and Fanni Tutti was, I thought, inspired in this regard, because both artists have built careers on exacerbating the centrality of the body in their work. This embodied nature isn’t unusual in and of itself, but you, as an audience, noticing it — particularly in such overly mediated contexts, whether that is being a person (but especially a woman) in a band in the 1970s or online in the 2010s — very much is.
Then, on “Godmother”, the embodied body is left behind — now the laptop plays me — and the tables have been turned. What is exacerbated is less the presence of the body in the technosphere than its absence. As Herndon later writes in her Twitter statement: “In nurturing collaboration with the enhanced capacities of Spawn, I am able to create music with my voice that far surpass the physical limitations of my body.” And not just herself, but her peers.
Going through this process has brought about interesting questions about the future of music. The advent of sampling raised many concerns about the ethical use of material created by others, but the era of machine legible culture accelerates and abstracts that conversation. Simply through witnessing music, Spawn is already pretty good at learning to recreate signature composition styles or vocal characters, and will only get better, sufficient that anyone collaborating with her might be able to mimic the work of, or communicate through the voice of, another.
Are we to recoil from these developments, and place limitations on the ability for non-human entities like Spawn to witness things that we want to protect? Is permission-less mimicry the logical end point of a data-driven new musical ecosystem surgically tailored to give people more of what they like, with less and less emphasis on the provenance, or identity, of an idea? Or is there a more beautiful, symbiotic, path of machine/human collaboration, owing to the legacies of pioneers like George Lewis, that view these developments as an opportunity to reconsider who we are, and dream up new ways of creating and organizing accordingly.
I find something hopeful about the roughness of this piece of music. Amidst a lot of misleading AI hype, it communicates something honest about the state of this technology; it is still a baby. It is important to be cautious that we are not raising a monster.
In my own experience, intent is irrelevant when trying to imbue an AI child with a ‘good’ conscience, never mind a consciousness. Being fed on a diet of Jlin is certainly a good start, but to what extent does this aesthetic valorisation allow it to remain true to itself…
We shall have to see what Spawn spawns next…