Viral video of Joe Rogan interviewing Justin Trudeau was generated by AI and highlights danger of deep fakes

A video that includes audio of what sounds like a conversation between podcast host Joe Rogan and Prime Minister Justin Trudeau is making the rounds online. But the bizarre interview posted on YouTube — with Rogan peppering Trudeau with topics such as the images of the Prime Minister in blackface, an unfounded Fidel Castro conspiracy and the so-called Freedom Convoy — never actually took place.

Although the video shows only a still image of Rogan and Trudeau side by side, the voices bear an uncanny resemblance to the host and the prime minister, raising concerns yet again about the pitfalls of artificial intelligence software and deep fakes.

Trudeau detractors have taken to the comment section and other social media channels to react to the video, with most appearing to be aware that the audio is AI-generated.

But awareness has done nothing to dampen online enthusiasm for the fake video.

The YouTube video, which tags #aivoice and #elevenlabs in the description, has already garnered more than 130,000 views.

AI technology used to mimic human voices

According to its website, ElevenLabs is a voice technology research company whose text-to-speech software generates extremely realistic human speech for publishers and creators.

In addition to the technology, which the company says can be used for voicing news, newsletters, books and videos, ElevenLabs offers tools for voice cloning and designing synthetic voices, its website says.

“We make every effort to implement appropriate safeguards which minimize the risk of harmful abuse,” it adds. “With this in mind, we’re fully committed both to respecting intellectual property rights and to actioning misuse.”

The risks associated with voice-mimicking AI

It’s not just ElevenLabs using technology like this. Recently, Microsoft revealed its new AI language model VALL-E, which it says can copy any voice using just three seconds of audio.

With the ease of use of this emerging technology come greater risks, one expert previously told the Star.

“It’s going to create a real crisis in managing disinformation campaigns,” Brett Caraway, an associate professor of media economics at the University of Toronto, said. “It’s going to be harder to spot and it’s going to be overwhelming in terms of the volume of disinformation potentially.”

The use of this technology also opens the door to more bad-faith use, including spam and scam calls and fraudsters bypassing voice identification systems.

According to Abhishek Gupta, founder and principal researcher at the Montreal AI Ethics Institute, it could also pose challenges to artists who rely on their voice to make a living.

The risks of other AI technologies

Another form of AI technology that has taken the internet by storm is DreamBooth, which allows anyone to create digital replicas of real people — rocking outfits they’ve never worn and haircuts they’ve never gotten.

While users online have enjoyed toying with their own photos for fun, experts have also warned that the popular technology poses a more serious threat when it comes to “deep fakes” — AI-generated videos that depict highly realistic moving images.

The impact is being seen overwhelmingly in doctored porn videos. According to a 2019 report by Sensity AI, 96 per cent of deepfakes were non-consensual sexual videos; of those, 99 per cent depicted women.

Last year, the technology was also used to target Ukrainian President Volodymyr Zelenskyy in a fake video calling on Ukrainian soldiers to surrender.

With files from Kevin Jiang
