Artists Resurrect Richard Nixon with Deepfake Tech to Give Alternate Moon Landing Speech

CC0 / Two members of the Apollo 11 lunar landing mission participate in a simulation of deploying and using lunar tools on the surface of the Moon during a training exercise on April 22, 1969
William Safire drafted a solemn and poetic elegy for Richard Nixon to deliver in the event that the Moon landing went horribly wrong. In a new immersive installation, a simulated Nixon delivers Safire's speech, using deepfake audio and video technologies to present an alternative history of the Moon landing.

The video was presented last weekend at the International Documentary Film Festival Amsterdam (IDFA) and showed a deepfake Nixon speaking to installation viewers in a faithfully recreated 1960s-era living room, complete with a vintage television set, wallpaper, furniture, and the decade’s TV ads. The whole installation is called “In Event of Moon Disaster”.

“The Moon landing is one of the most memorable historical events, at least within the last 50 to 100 years, so we thought it would be interesting to create an alternative history of this seminal event,” one of the installation’s co-creators, Francesca Panetta, a journalist and fellow at the MIT Center for Advanced Virtuality, told Motherboard. “Just as people say, ‘Where were you on 9/11?’ or ‘Where were you when JFK was shot?’, people ask, ‘Where were you on the day of the Moon landing?’ What happens if we use deepfake technologies to provide this alternative history, but using a real documentary archive piece, which is this Bill Safire speech written for Richard Nixon in case the astronauts had not been able to make it back to Earth?”

According to Panetta, deepfake technology has made history even more fragile than it was before, so rather than exploring deepfakes within the context of current news, the team was thinking about what it meant to retroactively rewrite a past event.

“It was a lot harder than the popular perception of deepfake creation suggests,” co-creator Halsey Burgund, a sound artist and fellow at the MIT Open Documentary Lab, said. “This is a two-part deepfake creation. One part is the visuals of Nixon speaking, and the other is his synthetic voice.”

To synthesize Nixon’s voice, the team worked with a Ukrainian company called Respeecher, which uses speech-to-speech synthetic voice production: a voice actor’s recording of the speech is fed into an AI model, which outputs the same speech with the same performative components (pacing, inflection) but in the target person’s voice. To get the visuals right, the team worked with Canny AI, the same team behind the deepfake Mark Zuckerberg video, who filmed an actor reading the speech and selected the target videos of Nixon they wanted to use.
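Respeecher’s actual system is a proprietary neural model, but the core idea of speech-to-speech conversion, keeping the source actor’s performance while substituting the target’s voice characteristics, can be loosely illustrated with a toy signal-processing sketch. Everything below (the function name, the frame-based energy trick, the fixed “timbre” waveform) is an illustrative assumption, not Respeecher’s method:

```python
import numpy as np

def toy_voice_conversion(source, target_timbre, frame=64):
    """Toy sketch of speech-to-speech conversion: retain the source's
    per-frame energy envelope (a crude stand-in for the actor's
    'performance') while resynthesizing each frame from a fixed
    target waveform (a crude stand-in for the target's 'timbre')."""
    n_frames = len(source) // frame
    out = np.zeros(n_frames * frame)
    # Normalize one frame of the target timbre to unit RMS so that
    # each output frame's loudness is dictated entirely by the source.
    shaped = np.asarray(target_timbre[:frame], dtype=float)
    shaped = shaped / (np.sqrt(np.mean(shaped ** 2)) + 1e-12)
    for i in range(n_frames):
        seg = source[i * frame:(i + 1) * frame]
        energy = np.sqrt(np.mean(seg ** 2))  # performance component kept
        out[i * frame:(i + 1) * frame] = energy * shaped
    return out
```

A real system would transfer far richer performance features (pitch contour, timing, articulation) through a learned neural mapping rather than a frame-energy copy; the sketch only shows the separation of “what is said and how” from “whose voice says it”.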

“Since the aim is creating a more discerning public around deepfakes, because forensic technologies that can automatically detect them for end users are not always available, there is a newspaper within the setting that actually explains how deepfakes are made and what the issues around them are,” said D. Fox Harrell, director of the MIT Center for Advanced Virtuality. “And there is a lot of discussion around convolutional neural networks [like the one used by Respeecher] and algorithmic bias, and how these techniques can mislead. And I think that’s something important about this project.”

Panetta said that the team is currently making a digital version of In Event of Moon Disaster, which they plan to release to the public in the spring of 2020.
