- Microsoft on Tuesday announced the launch of Microsoft Video Authenticator, a tool designed to spot when videos have been manipulated using deepfake technology.
- Deepfakes are typically videos that have been altered using AI software, often to replace one person’s face with another, or to change the movement of a person’s mouth to make it look like they said something they didn’t.
- Microsoft said it’s inevitable that deepfake technology will adapt to evade detection, but that the tool can still be useful in the run-up to the election.
Microsoft is trying to head off deepfake disinformation ahead of the 2020 election by launching new authenticator tech.
In a blog post on Tuesday, Microsoft announced the launch of a new tool called Microsoft Video Authenticator, which can analyze photos or videos to give “a percentage chance, or confidence score, that the media is artificially manipulated.” For videos, it can provide this score in real time on each frame as the video plays.