The Deep Fakes Challenge Overview

Deep fakes can also take the form of "cheap fakes," made with less expensive software, such as a simple Photoshop edit of a video

Deep fakes are videos that are digitally edited or manipulated, usually with the help of machine-learning algorithms, to create human bodies and/or faces that look and sound real. They can be used for amusement, as in comedian Jordan Peele's deep-fake video in which he impersonated President Barack Obama. Other beneficial uses include portraying actors and actresses who have passed away in films as if they were still living. Deep fakes, however, raise concerns about darker or criminal uses, in which these recordings deliberately convince viewers that someone said or did something wholly made up. The results could be disastrous: consider the repercussions of a sophisticated hoax showing a world leader declaring war on a rival country. The threat is exacerbated when deep fakes are combined with other attack types, such as physical or cyber attacks, and by how quickly a deep fake can propagate across the internet and social media platforms.

Deep fakes can also take the form of "cheap fakes": attempted alterations made with less expensive software, such as a simple Photoshop edit of a video or image. Thanks to the growth of social media and of widely available tools and technologies, anyone looking to experiment with video and picture alteration now has a variety of options. An alleged "cheap fake" of Nancy Pelosi, the Speaker of the House, surfaced last year: the video was deliberately slowed down to give the impression that Speaker Pelosi was intoxicated or slurring her words. Even after it was shown that the video had been altered, it was still shared widely and viewed millions of times on social media. Finally, text-based deep-faking, in which a user can easily edit a transcript with pre-made software to add new language or remove genuine material, has started a cycle that may eventually make deep-faking a common household pastime. Deep fakes have emerged as a go-to technique in the expanding disinformation arsenals of both nation-states and non-state actors.

Managing the threat of deep fakes and related technologies has major ramifications. First, the potential threat posed by cutting-edge technologies like deep fakes increases the relevance of diplomacy. One could readily imagine a convincing fake video showing Kim Jong Un or another prominent figure threatening to launch an attack. This underscores how important it is for diplomats to be able to communicate with foreign governments, particularly adversaries, in order to confirm the veracity of, or more often immediately dismiss, deep-fake videos meant to raise tensions; given the speed of modern conflict, phony photographs and videos could have real-world repercussions. Second, there is a risk that governments and the public, facing a constant barrage of distorted videos and images, will simply stop paying close attention. That becomes a problem when one of these videos turns out to be real and the authorities are slow to act. Third, security officials and others whose job is to evaluate data and spot trends will suffer, because it now takes so much more time and effort just to confirm whether something is real, leaving less bandwidth for actual analysis. This remains true even though new technologies have been developed to help analysts distinguish "signals from noise."


Analytics Insight
www.analyticsinsight.net