No, Bruce Willis did not sell his face to deepfake firm Deepcake
A spokesperson for Bruce Willis has come forward and told the BBC that the actor has not sold the rights to his face to a deepfake company called Deepcake.
Willis has "no partnership or agreement" with the firm, the spokesperson said on Sunday.
In addition, a representative of Deepcake said that Willis alone currently holds the rights to his face.
The denial comes after several media outlets, including the Daily Mail and the Telegraph, reported that Willis had agreed to have his image used in deepfake videos.
The rumors began last year, when Willis worked with Deepcake on an advert for Megafon, a Russian telecoms company. The advert used deepfake technology to recreate Willis's appearance.
"What he definitely did is that he gave us his consent (and a lot of materials) to make his Digital Twin," Deepacke told the BBC.
Deepcake’s website even boasts a recommendation attributed to Willis: "I liked the precision of my character. It's a great opportunity for me to go back in time. The neural network was trained on content of Die Hard and Fifth Element, so my character is similar to the images of that time."
However, Willis's agent told the BBC that the actor is not at all involved with the company. "Please know that Bruce has no partnership or agreement with this Deepcake company,” he said.
The emergence and development of deepfakes
The controversy highlights some of the challenges posed by the recent emergence and rapid development of deepfake technology. Work of this kind is not entirely new: while nowhere near the sophistication of current systems, deepfake-style research in the field of "computer vision" dates back to the 1990s.
A subfield of computer science, computer vision combines AI with the processing of digital images and video; deepfakes apply it to create new synthetic media. One notable early academic project, the Video Rewrite program, was published in 1997. It used machine learning to automate facial reanimation, modifying existing footage of a person speaking to match new, doctored dialogue.
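To make the idea concrete, here is a minimal sketch of the architecture behind many modern face-swap deepfakes: a single shared encoder paired with one decoder per identity, so that swapping decoders at inference time re-renders one person's expression with another person's face. This is an illustrative toy with random, untrained weights, not Deepcake's or Video Rewrite's actual system; all names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a 64-value "face" compressed to an 8-value latent code.
FACE_DIM, LATENT_DIM = 64, 8

# One shared encoder, plus a separate decoder per identity.
# Weights are random here; a real system learns them from many
# frames of each person so the latent space captures pose and expression.
encoder = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.1
decoder_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1
decoder_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1

def encode(face):
    """Compress a flattened face image into a shared latent code."""
    return np.tanh(encoder @ face)

def decode(latent, decoder):
    """Render a latent code back into a face with one identity's decoder."""
    return decoder @ latent

# A frame of person A, encoded once...
frame_a = rng.standard_normal(FACE_DIM)
latent = encode(frame_a)

# ...then decoded two ways: as a reconstruction, and as a face swap.
reconstruction = decode(latent, decoder_a)  # person A as themselves
swapped = decode(latent, decoder_b)         # A's expression, B's face

print(reconstruction.shape, swapped.shape)
```

The key design point is that the encoder is shared across identities while the decoders are not, which is why a deepfake can transfer expressions between faces it was trained on.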
More recent academic and amateur work in this field has focused on making the process simpler, faster and more accessible, and the technology has since cropped up in a variety of applications.
After Darth Vader actor James Earl Jones retired from playing the famous character, AI firm Respeecher reportedly used archival materials and a proprietary algorithm to replicate Vader’s voice.
The technology was used in Disney's Star Wars spin-off Obi-Wan Kenobi, where it reproduced Vader's speech and even made him sound younger.
Although some people are worried about the progress of deepfakes, their use so far has largely been confined to entertainment. Moreover, as the technology evolves, so do techniques for detecting it, which may limit the credibility of dangerous deepfakes, such as those of political figures.