Deepfake Technology: Is it A Threat or Not?


Deepfakes already have a bad reputation for being used in manipulation, but can they also offer some benefits?

How would it feel to see your great-grandmother blinking and smiling in one of her old photographs? It might creep you out, no doubt, but would it also make you nostalgic? According to MyHeritage, it just might.

The genealogy startup MyHeritage has recently introduced a new feature called Deep Nostalgia that allows users to animate the faces in family photos. According to MyHeritage, over 1 million photos were animated in the first 48 hours alone. One of their blogs says, "Users have responded with wonder and emotion: some were awed to see ancestors they'd never met — some from over 100 years ago — move, blink, and smile, while others were moved to tears witnessing their lost loved ones in motion after so many years with only still photos to remember them by."

The website admits in its FAQs that some people may find these videos creepy. Even so, the feature has become a trend, with many users animating photos of long-lost loved ones.

According to MyHeritage, they licensed the technology from D-ID, an Israeli company specializing in video reenactment using deep learning.

The startup says it deliberately excluded speech to prevent the feature from being abused to create deepfakes of living people. Yet it has already produced a promotional video, complete with speech and audio, that reanimates Abraham Lincoln. Should that not count as a deepfake? Deepfake technology has attracted intense negative attention and concern over the spread of fake news, so clear boundaries need to be drawn before we can decide whether it is a threat.

How do Deepfakes work?

Deepfakes use AI-based techniques to manipulate images, audio, and video so that they appear authentic. Machine learning systems can synthesize such media quickly and at minimal cost. A common approach uses Generative Adversarial Networks (GANs): two neural networks trained against each other on real footage capturing a person's actual voice, behavior, and expressions. One model, the generator, learns from the dataset and fabricates images; the second, the discriminator, grades those fabrications against real examples, and the feedback between the two steadily improves the forgeries. That said, not all deepfakes involve GANs; some are produced with other AI techniques or even with conventional, non-AI editing.
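The generator-versus-discriminator dynamic described above can be sketched in a few lines of code. The toy below is a minimal 1-D GAN in NumPy, not anything resembling a real deepfake pipeline: the "real footage" is just samples from a Gaussian, and the model shapes, learning rate, and step count are all illustrative assumptions.

```python
# Toy 1-D GAN: a linear generator learns to imitate "real" data
# (a Gaussian) by fooling a logistic-regression discriminator.
import numpy as np

rng = np.random.default_rng(0)
REAL_MEAN, REAL_STD = 4.0, 1.25   # stand-in for "real footage"

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0   # generator:      G(z) = a*z + b  (fabricates samples)
w, c = 0.1, 0.0   # discriminator:  D(x) = sigmoid(w*x + c)  (grades them)

lr, batch = 0.02, 64
for step in range(3000):
    # Discriminator update: push D(real) -> 1 and D(fake) -> 0.
    real = rng.normal(REAL_MEAN, REAL_STD, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_w = np.mean(-(1 - d_real) * real) + np.mean(d_fake * fake)
    grad_c = np.mean(-(1 - d_real)) + np.mean(d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator update: push D(G(z)) -> 1 (non-saturating loss),
    # i.e. improve the fabrications until the grader is fooled.
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_fake = sigmoid(w * fake + c)
    a -= lr * np.mean(-(1 - d_fake) * w * z)
    b -= lr * np.mean(-(1 - d_fake) * w)

samples = a * rng.normal(0.0, 1.0, 1000) + b
print(f"generated mean={samples.mean():.2f} (target {REAL_MEAN})")
```

After training, the generator's output distribution drifts toward the real one, which is the same feedback loop that, scaled up to deep networks and video frames, produces convincing deepfakes.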

Deepfakes pose threats in many ways: deepfake audio used in financial extortion, fraudsters impersonating celebrities and politicians to spread fake news, the creation of non-consensual pornography, and more.

Back in 2019, a video of Nancy Pelosi, Speaker of the United States House of Representatives, made the rounds on social media, in which she appeared to speak unusually slowly and with an altered pitch. The video was later identified as fake: it had been slowed down from the original to make her speech seem slurred. Notably, such a crude edit requires no AI at all; manipulations of this kind are sometimes called "cheapfakes" or "shallowfakes". The video was aimed at casting her in a negative light, and it was far from an isolated incident; many such manipulated videos have circulated.

Any Positives By Chance?

Deepfake technology uses AI to simulate human appearance and behavior in video, and it is infamous for spreading misinformation. Some fields, however, can genuinely benefit from it. The film industry can use deepfakes to edit footage without reshooting scenes, or to recreate on screen actors who have passed away. Training and educational content can use synthesized presenters to produce material without filming a human. Used within ethical bounds, the technology can have a positive impact.

The new feature launched by MyHeritage, for example, appeals to many people, but only as long as it stays within those bounds. If it is used to spread misinformation in any way, it could come under strict scrutiny: regulations on deepfake technology already exist, and many legislatures are moving to criminalize non-consensual deepfakes.



Analytics Insight
www.analyticsinsight.net