Financial Times
When AI takes on Eurovision: can a computer write a hit song?

Nic Fildes

Imagine assembling a crack team of musicologists to compose the perfect Eurovision hit, only to end up with a song that crescendos as a robotic voice urges listeners to “kill the government, kill the system”.

That was the experience of a team of Dutch academics who, after an experiment in songwriting using artificial intelligence algorithms, inadvertently created a new musical genre: Eurovision Technofear.

The team – Can AI Kick It – used AI techniques to generate a hit predictor based on the melodies and rhythms of more than 200 classics from the Eurovision Song Contest, an annual celebration of pop music and kitsch. These included Abba’s “Waterloo” (Sweden’s 1974 winner) and Loreen’s “Euphoria” (2012, also Sweden).

But to generate the lyrics for the song “Abuss”, which they hoped to enter in the inaugural AI Song Contest this year, the team also used a separate AI system – one based on the social-media platform Reddit. It was this that resulted in a rallying cry for a revolution. 

Like the notorious Tay chatbot developed by Microsoft in 2016 that started spewing racist and sexist sentiments after being trained on Twitter, the fault lay with the human sources of data, not the algorithms.
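
The mechanics behind that kind of failure are simple to illustrate. The Dutch team's actual lyric model is not detailed in the article, but even a toy text generator makes the point: the sketch below, a plain Markov chain over a made-up corpus (both purely illustrative, not the team's method), can only recombine the words it is fed, so whatever tone lives in the source data comes straight back out.

```python
import random
from collections import defaultdict

def build_chain(corpus_lines, order=2):
    """Map each run of `order` consecutive words to the words that follow it."""
    chain = defaultdict(list)
    for line in corpus_lines:
        words = line.lower().split()
        for i in range(len(words) - order):
            key = tuple(words[i:i + order])
            chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Walk the chain to produce a lyric-like string of words."""
    random.seed(seed)
    key = random.choice(list(chain))
    out = list(key)
    for _ in range(length - len(key)):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

# Tiny made-up corpus, purely for illustration: swap in scraped forum posts
# and the generated "lyrics" will echo their tone, good or bad.
corpus = [
    "hold my hand and watch the fire rise",
    "we dance all night until the morning light",
    "the system falls and we rise again",
]
print(generate(build_chain(corpus), length=15, seed=42))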

“We do not condone these lyrics!” stresses Janne Spijkervet, a student who worked with Can AI Kick It and ran the lyric generator. She says the Dutch team nevertheless decided to keep the anarchist sentiment to show the perils of applying AI even to the relatively risk-free environment of europop. 

The use of AI in music composition is now on the cusp of the mainstream as more musicians and songwriters look for tools that inspire different types of music. The AI Song Contest, organised by Dutch broadcaster VPRO, is one of the first events to take the process of using algorithms to compose original music out of academia and avant-garde experimentation and into the commercial world.

The competition is inspired by Eurovision and has gained greater prominence since the cult event, which was due to take place in Rotterdam this month, was cancelled due to the coronavirus pandemic. The European Broadcasting Union, which organises the 64-year-old contest, has endorsed the computer-based version and will act as a vote supervisor.

The AI version is a much smaller affair, with only 13 entries in its first incarnation (compared with the 41 countries that were due to compete in Rotterdam). However, the quirks of the original contest have come to the fore in the AI world as well. Alongside “Abuss”, which its creators describe as atonal and creepy, sits an Australian entry with the same sheen as a chart-topping dance hit but with a distorted subliminal AI-generated chorus of koalas, kookaburras and Tasmanian devils.

Meanwhile, the song “I’ll Marry You, Punk Come”, composed by German team Dadabots x Portrait XO, used seven neural networks in its creation. The resulting piece of music blends lyrics drawn from babble generated from 1950s a cappella music with AI-generated death-metal vocal styles and a chromatic bass line spat out of a neural network trained on Bach’s canon.

The contest will be judged along the same lines as the established competition, with a public vote tallied against the opinions of a panel of expert judges. Ed Newton-Rex, who founded the British AI composition start-up Jukedeck, is one of them. He explains that the panel will be looking at the process of how machine learning was applied as well as creative uses of algorithms – such as the “koala synth” – and the quality of the song. The judges will also be asked to factor “Eurovisioness” into their thinking, although he admits, “I have no idea what that means.”

VPRO does not expect “billions of people” to tune into the event but says that many Eurovision fans will follow the livestreamed announcement of the winner on May 12. The hope is that the computer version could itself prove a hit and pave the way for AI to influence Eurovision proper through song composition or, over time, robotic performance. “That is my dream,” says Karen Van Dijk, the VPRO producer who came up with the concept.


A performance given by the Sex Pistols at Manchester’s Lesser Free Trade Hall in 1976 – where, legend has it, almost everyone in the tiny audience went on to form their own band – became known as the “gig that changed the world”, and was deemed a genesis point for a musical revolution. The equivalent for AI music took place in the winter of 2019 in Delft, the picturesque Dutch town known for its fine pottery and as the birthplace of the painter Johannes Vermeer. The city’s university was hosting the 20th conference of the International Society for Music Information Retrieval when a proposition was put to the academics in attendance.

Van Dijk announced that she was organising the first “Eurovision for computers” and needed entries. The idea had come to her when Holland’s Duncan Laurence won the Eurovision Song Contest in 2019: amid her euphoria, Van Dijk pondered whether AI could be harnessed to lock in more hit songs for the country. “I was naive. I thought we could create the next Eurovision hit with the press of a button,” she says.

Van Dijk arrived in Delft bearing data gifts. An Israeli composer had created a spoof Eurovision song the year before, called “Blue Jeans and Bloody Tears”, using a cache of data extracted from the Eurovision catalogue. That data was bought by VPRO and provided to the entrants as a stimulus for their own experimentation. For some, it also allowed them to rekindle pop-star ambitions.

Tom Collins, a music lecturer at the University of York, and his wife Nancy Carlisle, an academic at Lehigh University in Pennsylvania, had a garage band called The Love Rats when they were doctoral students. When Collins heard about the AI Song Contest, he was inspired to “dust off his code” and get the band back together by using AI to write a song. He initially worked with Imogen Heap, the English singer-songwriter and audio engineer, but coronavirus-related travel restrictions halted those efforts. Instead, he and Carlisle worked over a weekend on “Hope Rose High”, which he describes as an “eerie power ballad” inspired by the lockdown.

The husband-and-wife team turned to an AI lyric engine called “theselyricsdonotexist” to generate robotic poetry with an optimistic feel. Carlisle says the AI’s suggested lyric “and then the mist will dance” seemed ridiculous until she listened again to some of her favourite songs and started hearing what sounded like nonsense. “Radiohead don’t make a lot of sense but I still love them,” she admits. Collins adds that the mist lyric also fits with the Eurovision theme: “You can imagine the massive smoke machines kicking in.”

While the duo did not enter the contest with the aim of winning, others saw an opportunity to test whether AI could be used not just to write a song but to pen a hit. Ashley Burgoyne, a lecturer in computational musicology at the University of Amsterdam and a member of the team behind “Abuss”, used the “Blue Jeans” dataset to create a “Eurovision hit predictor”.

That data suggested that melodies with hooks of three to seven notes and songs with simple rhythmic patterns scored the highest. It also showed that a certain level of atonality – where it is hard for the ear to identify the key – was crucial to Eurovision success. Yet Burgoyne believes that despite a handful of “stinkers” being included in the data, the results reflected a paucity of the negative information that is needed to successfully train the system – in this case, songs that didn’t reach the finals.
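
Burgoyne’s complaint is, at heart, a class-imbalance problem. His actual features, data and model are not described in detail, so the sketch below is only a hypothetical illustration (the feature names, numbers and labels are invented): without the 0-labelled non-finalists, a classifier of this kind has nothing to contrast the hits against and cannot learn what sets them apart.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [hook length in notes, rhythmic complexity 0-1, atonality 0-1].
# Values are invented purely for illustration.
songs = np.array([
    [4,  0.2, 0.3],   # a "Waterloo"-like finalist
    [6,  0.3, 0.4],   # a "Euphoria"-like finalist
    [12, 0.8, 0.1],   # hypothetical song that never reached the final
    [2,  0.9, 0.0],   # hypothetical non-finalist
])
reached_final = np.array([1, 1, 0, 0])

# Burgoyne's point: remove the 0-labelled rows and the model has nothing
# negative to learn from, so "hit prediction" collapses.
model = LogisticRegression().fit(songs, reached_final)

candidate = np.array([[5, 0.25, 0.35]])        # features of a new melody
print(model.predict_proba(candidate)[0, 1])    # estimated probability of a "hit"
```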

He compared the issue to Netflix recommendations that suggest “a load of crap” after you have watched a high-quality TV series. “If you believe quality exists, then AI isn’t good at finding it. How do you define [what is] a good song even in the world of Eurovision?” he says. 


The use of subliminal voices supposedly encouraging devil worship in heavy metal music was a cause célèbre in the 1980s. Few would have expected that subliminal Tasmanian Devil voices would be influencing europop 30 years later. 

Caroline Pegram, head of innovation at Uncanny Valley, the music technology company behind the Australian entry, wanted to pay homage to the wildlife that had been killed during the 2019-20 bushfires in Australia. A zookeeper friend gave her videos of Tasmanian devils “going absolutely wild”; the team blended the screeches with the sounds of koalas and laughing kookaburras and used the audio to train a sound-generating neural network built with technology developed by Google’s creative AI research project Magenta. They called it the “koala synth”.

It proved that AI can create unexpected results. “It was a happy accident. Everyone thought I was insane — literally insane — but the koalas have sent out a positive message and it is a strong and catchy sound,” says Pegram. 

The koala synth adds a new Antipodean angle to the Eurovision story: Australia has only competed in the contest since 2015, when the European Broadcasting Union first allowed it to enter.

Justin Shave, who produced the song, explains that the DDSP (differentiable digital signal processing) technology behind it has since been used to generate the sounds of violins, trumpets and even a choir of drunken men. “That one didn’t work so well,” he admits.
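
DDSP’s core trick is that the synthesizer itself is written as ordinary, differentiable maths, so a neural network can learn to drive it from an example recording. The Magenta library’s real API is not reproduced here; the fragment below is only a bare numpy sketch of the kind of harmonic oscillator bank such a model controls, with the pitch and amplitude envelopes invented for illustration.

```python
import numpy as np

def harmonic_synth(f0_hz, amplitudes, sample_rate=16000):
    """Render audio from a fundamental-frequency envelope and per-harmonic
    amplitude envelopes - the sort of control signals a DDSP-style model
    predicts from an input sound. Inputs are assumed to already be at
    audio rate, to keep the sketch short."""
    n_samples = len(f0_hz)
    n_harmonics = amplitudes.shape[1]
    phase = 2 * np.pi * np.cumsum(f0_hz) / sample_rate  # running phase
    audio = np.zeros(n_samples)
    for k in range(1, n_harmonics + 1):
        audio += amplitudes[:, k - 1] * np.sin(k * phase)
    return audio / n_harmonics

# Toy example (values invented): a one-second swoop from 220 Hz to 440 Hz
# with harmonics that fade over time.
sr = 16000
f0 = np.linspace(220, 440, sr)
amps = np.linspace(1.0, 0.2, sr)[:, None] * (1.0 / np.arange(1, 9))[None, :]
wave = harmonic_synth(f0, amps, sample_rate=sr)
```

In the full system, a network predicts those envelopes from features of a source sound (a violin, a trumpet or indeed a screeching marsupial) and, because the rendering is differentiable, the whole chain can be trained end to end.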

Unlike the more academic entrants, Uncanny Valley comes from a musical background, having produced songs for Aphex Twin and Sia. The group had already planned to enter an AI-composed song in the main song contest. 

They now hope that the AI Song Contest will help to dispel concerns in some parts of the traditional music community that the technology could lead to musicians losing their jobs if computers take over.

Geoff Taylor, chief executive of the BPI, the UK’s music trade body, and head of the Brit Awards, says the “new horizons” of AI are exciting but urges caution.

“We also need to guard against the risk that AI might in certain respects be deployed to supplant human creativity or undermine the cultural economy driven by artists. Such an outcome would leave our societies and our cultures worse off,” he says.

His fears have been stoked as some of the world’s largest technology companies, including Google and TikTok owner ByteDance, have moved into the compositional space. But Anna Huang, a resident at Google’s Magenta and a judge on the AI Song Contest, says Big Tech is attracted to AI musical composition by scientific curiosity, not a desire to take over the music world.

“Music is a very complex domain. In contrast to language, which is a single sequence, music comprises arrangement, timbre, multiple instruments, harmony and is perceptually driven. It is also very referential,” she says.

AI could also have a democratising impact on the creation of new music, says Huang. She cites her own experience at high school in Hong Kong, when some of her classmates were already composing for full orchestras. Huang was a musician too and believed that computer science could develop new methods of musical composition, something AI can potentially deliver.

That was demonstrated via an interactive Google Doodle launched in March last year that encouraged users to input a simple melody. The AI, developed by Magenta, then generated harmonies in the style of Bach. Within two days, the lighthearted doodle had created 55 million snippets of music.

Newton-Rex, who sold his company to China’s ByteDance last year, says musicians need to see AI as a tool to stimulate creativity – a spur that helps new ideas or disrupts habits – rather than a threat. “Every time I sit down at the piano, I play the same thing,” he says, adding that AI is already creeping into sophisticated drum machines, arpeggiators and mastering software, and that it will always need human curation. “What does AI music sound like? It sounds like nothing without a human element.”

Next week, the AI Song Contest may push the use of this technology on to the main stage and dispel some of the concerns about its growing influence. As Pegram says: “Some musicians fear we will end up building machines pumping out terrible music – but we need to rage with the machine, not against it.”

Nic Fildes is the FT’s telecoms correspondent


Copyright The Financial Times Limited 2020

