Microsoft Debuts Deepfake Detection Tool to 'Combat Disinformation'
Microsoft has developed a new tool to pinpoint deepfakes — synthetic, computer-generated media designed to pass as convincing footage of a real person — according to a blog post on Microsoft's website.
However, deepfakes are becoming more advanced by the day.
Microsoft debuts novel deepfake detection tool
Microsoft's software analyzes photos and videos to produce a confidence score indicating the likelihood that the material was artificially generated. The company said it wants the technology to help "combat disinformation," the BBC reports.
However, an expert warned the tool may soon be outdated because of the rapid advancement of deepfake technology. To confront this issue, Microsoft also announced a separate system designed to help content producers embed hidden code in their footage, so that any subsequent changes can be flagged.
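Microsoft has not published the details of that system, but the general idea — a signed fingerprint that breaks whenever the footage is edited — can be sketched with a keyed cryptographic hash. The key, function names, and byte strings below are illustrative assumptions for this sketch, not Microsoft's actual implementation:

```python
import hashlib
import hmac

# Illustrative only: a stand-in for a producer's real signing key.
SECRET_KEY = b"producer-signing-key"

def sign_footage(footage: bytes) -> str:
    """Producer side: compute a keyed hash (the 'hidden code') for the footage."""
    return hmac.new(SECRET_KEY, footage, hashlib.sha256).hexdigest()

def verify_footage(footage: bytes, signature: str) -> bool:
    """Viewer side: the footage passes only if it still matches its signature."""
    expected = hmac.new(SECRET_KEY, footage, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

original = b"...raw video bytes..."
tag = sign_footage(original)

print(verify_footage(original, tag))         # True: footage is unmodified
print(verify_footage(original + b"x", tag))  # False: edited after signing
```

Because any change to the bytes changes the hash, even a single altered frame would cause verification to fail and the edit to be flagged. Real provenance systems attach such signatures as metadata and use public-key certificates rather than a shared secret.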
Face-swapping deepfakes work with fewer images
Deepfakes entered the public space early in 2018, when a developer adapted cutting-edge artificial intelligence techniques to create software that swaps one person's face for another.
The technique works by feeding a computer many still images of one person alongside video footage of another. Software can then generate new video, placing the first person's face where the second's was while adapting facial expressions for lip-sync and other subtle human motions.
The process has since been simplified and now requires far fewer images, widening access to more users. Several apps need only a single selfie to insert a user's face into clips from a Hollywood film.
Deepfake detection software is a moving goalpost
However, concerns have grown about the potential to abuse this process and create clips that seem real but are in fact fabrications. Imagine public figures made to say or do things completely divorced from reality, for political or other purposes.
As computing giants like Microsoft and Apple push the industry forward, deepfake detection will likely remain a moving goalpost: for every fake that is caught, more sophisticated deepfakes will follow. For better or worse, it's a challenging time for ideas about reality and authenticity.