Is Biometrics the New AI Toolset to Execute Cybercrimes?

Cyberattacks have extended to new avenues through the exploitation of biometric data.

Two months ago, a group of hackers hijacked a facial recognition system run by the Chinese government to issue fake tax invoices. According to the South China Morning Post report, "Prosecutors in Shanghai said a criminal group duped that platform's identity verification system by using manipulated personal information and high-definition photographs, which were bought from an online black market, so its registered shell company can issue fake tax invoices to clients." The wide availability of image-manipulation apps and AI technology has made it possible to exploit and manipulate biometrics to commit fraud.

Biometrics and Frauds

Biometrics is considered one of the best tools for ensuring security and detecting cybercrime. Because of its value in authentication and fraud reduction, it is widely used in the form of fingerprints, facial recognition, voice recognition, and more. However, advances in technology have also paved the way for more sophisticated crimes and identity theft in companies, especially in retail. Criminals are now technically sophisticated, designing their own AI systems, self-learning algorithms, and other tools to illegally access data and vulnerable systems. Cyber fraud has thus expanded to new avenues and can easily deceive us through manipulated biometric data.

Last year, the biometrics of a woman from Kolkata were used illegally to commit forgery and other crimes in Rajasthan. According to the TOI report, the victim gave her thumb impression to obtain a mobile SIM card, and the same biometric data was exploited to commit the fraud. Such cases of identity theft through misused and manipulated biometrics are becoming increasingly common in the retail and financial sectors.

Another striking incident, reported by The Wall Street Journal, dates back to 2019, when criminals used artificial intelligence-based software to impersonate the voice of a company's chief executive and demand a transfer of US$243,000. Such AI-driven voice-spoofing attacks have the potential to enable even more serious crimes.

How Are They Committed?

According to a study by Accenture, there are two major types of biometric fraud: impersonation and obfuscation. Impersonation occurs when an impostor spoofs biometric information to pretend to be a specific individual and then commits crimes or illegal transactions. Obfuscation, on the other hand, is a method in which the criminal manipulates their own biometric data to avoid recognition. The report suggests that the most commonly targeted biometric modalities are fingerprint, facial recognition, and voice recognition, but others, including iris, vein, and even DNA-based data, could also be exploited to commit fraud.
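To make the distinction concrete, the minimal sketch below frames both fraud types in terms of a hypothetical matcher's similarity score and acceptance threshold. The matcher, feature vectors, and threshold value are illustrative assumptions, not any real system's API.

```python
# Toy illustration of how impersonation and obfuscation target a matcher's
# accept/reject decision. All values and functions are hypothetical.

ACCEPT_THRESHOLD = 0.80  # assumed decision threshold for a match


def similarity(probe, reference):
    """Cosine-style similarity between two feature vectors (placeholder matcher)."""
    dot = sum(p * r for p, r in zip(probe, reference))
    norm = (sum(p * p for p in probe) ** 0.5) * (sum(r * r for r in reference) ** 0.5)
    return dot / norm if norm else 0.0


def is_accepted(probe, enrolled):
    return similarity(probe, enrolled) >= ACCEPT_THRESHOLD


victim_template = [0.9, 0.1, 0.4]

# Impersonation: a spoofed probe (e.g. built from stolen photos) is crafted to
# push the score ABOVE the threshold, so the impostor is accepted as the victim.
spoofed_probe = [0.88, 0.12, 0.41]
print(is_accepted(spoofed_probe, victim_template))  # True

# Obfuscation: a criminal distorts their own presentation to push the score
# BELOW the threshold against their enrolled record, avoiding recognition.
distorted_probe = [0.1, 0.9, -0.3]
print(is_accepted(distorted_probe, victim_template))  # False
```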

Another modern technology used for cyber fraud is the deepfake. By creating fake videos and voice messages, criminals can easily target their victims. The technology enables users to swap faces in a video or image and make the result appear real; voice-spoofing fraud can also be considered a deepfake crime. Deepfakes are widely used in identity theft and in manipulating facial recognition systems. Ransomware attacks can also leverage these disruptive technologies, and we have reached an era in which such technology is pervasively misused. Using biometrics to initiate cybercrime in retail can cause huge financial losses, drastically affecting financial institutions and businesses.

Curbing Biometric Fraud

To avoid fraud that uses and manipulates biometric data, experts suggest multiple checks and authentication through multiple biometric methods. Companies should adopt better spoof-detection strategies and conduct detailed, thorough analyses of sensitive data and financial transactions. AI itself has the capability to identify threats and detect fraud, and new developments in this area will also help in the fight against cyber fraud.
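As a rough illustration of what such layered checks might look like, the sketch below combines matching across multiple biometric modalities with a liveness (spoof-detection) score before a sensitive transaction is approved. All names, scores, and thresholds are hypothetical assumptions, not a specific vendor's API.

```python
# Minimal sketch of layered biometric checks: a transaction is approved only if
# enough independent modalities both match the enrolled template AND pass a
# liveness (anti-spoofing) check. Thresholds and values are illustrative only.

from dataclasses import dataclass


@dataclass
class ModalityResult:
    name: str           # e.g. "face", "fingerprint", "voice"
    match_score: float  # similarity to the enrolled template, 0..1
    liveness: float     # anti-spoofing score, 0..1


def approve_transaction(results,
                        match_threshold=0.85,
                        liveness_threshold=0.90,
                        min_modalities=2):
    """Approve only if enough modalities both match and pass liveness checks."""
    passed = [
        r for r in results
        if r.match_score >= match_threshold and r.liveness >= liveness_threshold
    ]
    return len(passed) >= min_modalities


# Example: a high-definition photo may fool the face matcher but fail liveness,
# so the transaction is still rejected.
checks = [
    ModalityResult("face", match_score=0.93, liveness=0.40),   # spoofed photo
    ModalityResult("voice", match_score=0.88, liveness=0.95),  # genuine sample
]
print(approve_transaction(checks))  # False: only one modality fully passes
```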
