You probably don’t need us to tell you how damaging deepfakes can be. Not only do they cast doubt on video as a medium, they can actively harm both the person being faked and anyone who believes the fake is real.
But there’s apparently a simple way to see if a video is a deepfake or real.
Turning quickly sideways.
Trouble is, it’s the person in the video who has to turn sideways, not the viewer. That doesn’t fix the deepfake problem overall, because the check only works when the person in the video obliges us with a profile view.
The findings came from Metaphysic.ai, a specialist in 3D animation, rendering and deepfake technology.
A report they published highlights that current deepfake technology cannot cope with a quick sideways turn. If the person in the video looks sharply to the side, the model cannot adapt fast enough to the change in orientation, producing visible flickering and distortion on the faked face or body and revealing whether the person is real or a deepfake.
"Most 2D-based facial alignment algorithms assign only 50-60 percent of the number of landmarks from a front-on face view to a profile view," said Metaphysic.ai contributor Martin Anderson, who wrote the study's blog post.
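That landmark drop-off suggests a simple heuristic for live video: watch for frames where the face-alignment model suddenly reports far fewer landmarks than its frontal baseline, since those are the moments deepfake artifacts tend to appear. The sketch below is purely illustrative and assumes made-up landmark counts; a real detector (such as a 68-point 2D alignment model) would supply the per-frame counts.

```python
# Hypothetical liveness heuristic based on the article's observation that
# 2D facial alignment models keep only ~50-60% of their landmarks when a
# face turns to profile. All numbers here are illustrative assumptions.

FRONTAL_LANDMARKS = 68   # common output size for 2D alignment models
PROFILE_RATIO = 0.60     # per the report: profile views keep ~50-60%

def is_profile_view(detected_landmarks: int,
                    baseline: int = FRONTAL_LANDMARKS) -> bool:
    """True when the face is likely turned sideways -- the moment
    where deepfake flicker and distortion tend to show up."""
    return detected_landmarks / baseline <= PROFILE_RATIO

def frames_to_inspect(landmark_counts):
    """Indices of video frames worth examining for deepfake artifacts."""
    return [i for i, n in enumerate(landmark_counts)
            if is_profile_view(n)]

# Simulated counts as a subject turns from frontal to profile and back:
counts = [68, 67, 55, 41, 34, 40, 66]
print(frames_to_inspect(counts))  # -> [4, 5]
```

This only flags *where* to look; a human (or a downstream detector) would still need to check the flagged frames for the flickering and distortion the report describes.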
While this is no help against pre-recorded deepfakes, it could be a way to counter the growing volume of fraud that relies on live deepfakes.
Particularly the fake job interview deepfakes the FBI warned us about in June, when the problem grew serious enough that the bureau released a public service announcement about deepfaked remote job applications.
Scammers were holding fake job interviews to harvest personal information to later use in fraud and identity theft.
Asking the interviewer to quickly turn their head 90 degrees to one side may need a little explaining mid-interview, but it’s a simple way to check they are real.
This finding can help with live video deepfakes, or any case where the presenter or subject is aware enough to provide that quick side view. For the majority of more harmful deepfakes, though, it offers little solace.
Hopefully, other solutions will arise as we delve deeper into the depths of deepfakes!