Fact from fiction

It's getting more and more difficult to separate fact from fiction. A few years ago, my mother-in-law would send me things that I knew without question were false. The date was wrong, the site or source was not legitimate, or the image was so badly altered that within seconds of looking at it, I would be able to tell her to stop worrying. I could assure her with confidence that no one was going to throw eggs on her windscreen from a bridge on the highway in order to abduct her dog for ransom. Especially as she doesn't have one.
Howard Feldman

This is no longer that simple. I spend three hours a day on radio and write a number of columns a week. Over the last few months, I have found it increasingly difficult to verify information. The Tembisa 10, the story of an alleged decuplet birth, is a case in point. It was accepted as fact and reported around the world. But it was fiction, and it was rightfully condemned by SANEF, the South African National Editors' Forum.

And that was just a story, a tale that we believed. What about the deliberate attempt to mislead with the use of deepfakes? In order to understand this better, I asked the Intelligent Data team at Synthesis Software Technologies to assist.

The subject of deep faking

First up, I checked in with Marais Neethling to get the basics. “Deepfaking is the process of creating a realistic and believable forgery of a photograph or video, usually depicting people.” Archana Arakkal put it this way: “Deepfakes fall under the realm of deep learning, which enables specific algorithms to create fake images and videos that human beings cannot distinguish from authentic ones.”

In other words, if you see a video of Joe Biden standing on the White House lawn speaking about a recent trip to Saturn, for example, you will only know it’s fake because no one has been there yet. The likeness and the imagery will be so real that it will be virtually impossible to tell that it’s fake.

But much like criminals leave DNA at a crime scene, so do deepfakes leave clues behind. “Just like AI is used to create the deepfakes, it can be used to detect the ‘fingerprint’ of deepfake generator algorithms, left behind in the forged images,” says Neethling.

The subject of deepfakes is important not only to providers of news but also to social media platforms. A recent Eyewitness News article reported on a Facebook announcement that its new software runs deepfakes through a network to search for imperfections left during the manufacturing process, which the scientists say alter an image's digital "fingerprint."

"In digital photography, fingerprints are used to identify the digital camera used to produce an image," the scientists said.

"Similar to device fingerprints, image fingerprints are unique patterns left on images... that can equally be used to identify the generative model that the image came from."

"Our research pushes the boundaries of understanding in deepfake detection," they said.

Identifying deep fakes

Microsoft late last year unveiled software that can help spot deepfake photos or videos, adding to an arsenal of programs designed to fight the hard-to-detect images ahead of the US presidential election.

The company's Video Authenticator software analyses an image or each frame of a video, looking for evidence of manipulation that could be invisible to the naked eye.

According to Arakkal, it is important to be able to identify deepfakes due to the rise in malicious use of these technologies. “The malicious use of deepfakes poses a general threat to security and privacy. An example of such a case is generating fake satellite imagery to confuse military analysts.”

She explained further that there are several studies running concurrently that aim to tackle the challenge of detecting these deepfakes using AI. One example is the use of specific features in a video or image that are only prevalent in deepfakes, such as eye blinking and head pose estimation.
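
As a rough illustration of the eye-blinking cue she mentions, here is a small Python sketch. It assumes that six eye landmarks per frame have already been extracted by some facial-landmark detector (that step is left out, so the input is hypothetical); it then computes the standard eye aspect ratio per frame and flags clips whose blink rate is implausibly low. It is a simplified demonstration of the idea, not a production detector.

import numpy as np

def eye_aspect_ratio(eye):
    # Eye aspect ratio from six (x, y) landmarks around one eye; low values mean a closed eye.
    eye = np.asarray(eye, dtype=np.float64)
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(eye_landmarks_per_frame, closed_threshold=0.2):
    # Count closed-then-reopened sequences across the frames of a clip.
    blinks, closed = 0, False
    for eye in eye_landmarks_per_frame:
        if eye_aspect_ratio(eye) < closed_threshold:
            closed = True
        elif closed:
            blinks += 1
            closed = False
    return blinks

def looks_suspicious(eye_landmarks_per_frame, fps, min_blinks_per_minute=5):
    # People typically blink well over five times a minute, so a talking-head
    # clip with far fewer blinks is worth a closer look.
    minutes = len(eye_landmarks_per_frame) / (fps * 60.0)
    return count_blinks(eye_landmarks_per_frame) < min_blinks_per_minute * minutes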

In other words, it seems to take one to know one. And one to catch one.

We haven’t heard the last of deepfakes. They will be with us for a while and will undoubtedly need a universally recognised stamp to indicate that the message is one. If not, not only will the world become an unbearably confusing place, but in no time at all I will be receiving messages from my mother-in-law who will be thrilled that Cyril Ramaphosa took the time to send her video birthday wishes.

About Howard Feldman

Howard Feldman is one of SA's leading entrepreneurs and Head of Marketing & People at Synthesis. His experience is global and extensive, spanning more than 20 years of working as a businessman, philanthropist and social commentator. Feldman was the chairperson of the Board of the South African Jewish Report, the only weekly Jewish newspaper in Africa, and he is a global keynote speaker, business strategist, author and morning drive show host.
