At Popular Mechanics: This Nixon Deepfake Is an Alternate Reality Where Apollo 11 Fails. (YouTube).
The contingency speech – “In Event of Moon Disaster” – is real, but the deepfake was created by the MIT Media Lab “to illustrate just how dangerous the AI-edited content can be if shared online.” So they created a doctored video on a subject beloved of conspiracy nutters … and shared it online.
To create the deepfake, the MIT team used deep learning, a type of artificial intelligence, to edit the video footage and employed a voice actor to build the voice of Nixon. Alongside Canny AI, an Israeli startup, the researchers studied video dialogue replacement strategies to replicate the movement of Nixon’s lips while speaking, helping to match up his mouth to the fake speech. The final product is a truly believable video of Nixon telling the U.S. public that the moon landing mission had failed.
I’m not convinced. By the moral panic, I mean. For one thing, the technology has no application to historical persons before the advent of television (and talkies). There’s no point deepfaking Dreyfus or lip-syncing Luther. Yes, the digitised ghost of Sir John Kerr could be made to thank the CIA for its advice in 1975, but games of that kind – as infantile and foolish as they may be – would only be believed by a few.
And what, exactly, is novel about gullible, stupid minorities believing preposterous things? Greens believe Scott Morrison can reduce the earth’s temperature but won’t do it out of meanness. Hollywood has been deepfaking history for decades. The only population that might be incited by a deepfake to do awful deeds is in the Middle East. But Palestinians were just as incited by a man in a mouse suit. Remember Farfour?