NotJordanPeterson.com, a website that used machine learning to produce deepfake audio clips that sound exactly like the author of the best-seller "12 Rules for Life", took down its AI voice generator shortly after the Canadian professor called for action against the technology on his blog.
"In light of Dr. Peterson's response to the technology demonstrated by this site, which you can read here, and out of respect for Dr. Peterson, the functionality of the site will be disabled for the time being", the site's message read.
However, another update suggests that the audio spoofing website may be operational again.
In a blog post titled "I Didn't Say That", about the dangers of digital trickery, Peterson expressed concern that the technology may soon become indistinguishable from the real thing and raised the question of its legality.
"We need to seriously consider the idea that someone's voice is an integral part of their identity, of their reality, of their person - and that stealing that voice is [a] genuinely criminal act, regardless (perhaps) of intent. [...] Are we entering a future where the only credible source of information will be direct personal contact? [...] I can't imagine what the world will be like when we will truly be unable to distinguish the real from the unreal, or exercise any control whatsoever on what videos reveal about behaviors we never engaged in, or audio avatars broadcasting any opinion at all about anything at all", Peterson wrote.
He called on legislators to criminalise the unauthorised production of deepfakes, "at least in the case where the fake is being used to defame, damage, or deceive".
"Wake up. The sanctity of your voice, and your image, is at serious risk. It's hard to imagine a more serious challenge to the sense of shared, reliable reality that keeps us linked together in relative peace. The Deep Fake artists need to be stopped, using whatever legal means are necessary, as soon as possible", he added.
The website, created by AI researcher Chris Vigorito, allowed anyone to type a message of 280 characters or fewer and have an AI model of Peterson read it aloud, converting the written text into the psychologist's voice.
Peterson's public speeches have repeatedly been used to train machine learning models that make him say - or even rap - things he never said.
US lawmakers have already sounded the alarm over deepfakes' potential impact on the 2020 presidential election, with several pieces of legislation circulating through Congress.
For instance, Senator Ben Sasse (R-NE) has proposed rules that would make it illegal to "maliciously" produce and distribute deepfakes, while Rep. Yvette Clarke (D-NY) introduced a bill in June that would require the creators of AI-generated trickery to disclose that their content was fabricated, by including some kind of watermark.