How AI Voice Scammers Are Luring Indians In


Check out this article to find out how AI voice scammers are luring in Indians.

According to a recent McAfee survey, almost 47% of Indians have been the victim of an AI voice scam or know someone who has. Earlier this week, after quitting Google, artificial intelligence (AI) pioneer Geoffrey Hinton warned about the rise of misinformation, saying that it is becoming increasingly difficult to distinguish between content generated by humans and content generated by AI.

The world has been enchanted by AI's rapid development and ever-expanding possibilities, but the dangers have stayed in the background.

McAfee, a cybersecurity company, recently surveyed approximately 7,000 people across seven countries, including India. The results showed that more than half of Indians are unable to distinguish a genuine voice from an AI-cloned one, and that AI voice scammers are luring Indians into money-laundering schemes, cyber fraud, and other scams. Swindlers can easily use voice cloning to send phony voice messages to a victim's friends and family, pretending to need help and getting them to send money.

The findings, published in the report The Artificial Imposter, revealed that almost half (47%) of Indian adults have experienced, or know someone who has experienced, an AI voice scam, nearly double the global average (25%). The survey found that 83% of Indian victims reported losing money, with 48% losing more than ₹50,000.

"Artificial intelligence opens up a world of possibilities, but as with any technology, it can always be misused maliciously by the wrong people. Steve Grobman, CTO of McAfee, stated in a statement, "This is what we're seeing today with the accessibility and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways."

A person's voice is one of the ways we remember them; it is a unique feature, and familiarity with it helps us trust the person on the other end. However, voice is now also one of the most heavily targeted forms of data. Over 80% of Indian adults send voice notes or share voice data online at least once a week. Even when platforms promise privacy, cybercriminals can exploit security flaws to clone people's voices, making shared voice data a risky asset.

Deepfakes, or manipulated videos and images, are becoming increasingly popular. While they are often used for humor, they have also been used to spread misinformation. For instance, a video of an attack on migrant workers in Bihar in March was widely circulated, prompting strong social media responses. The video was later found to be a deepfake. The use of AI to manipulate media can make it hard to tell the difference between real and fake, which can have serious consequences.

According to the McAfee survey, when fraudsters use AI technology to clone voices and send a phony voicemail or voice note, the majority of Indians lack confidence in their ability to recognize the cloned version. More than half of respondents indicated that they would respond to such a message, particularly if they believed the request for financial assistance came from a parent, partner, or child.

Messages most likely to elicit a response were those claiming the sender had been robbed (70%), was involved in a car accident (69%), had lost their phone or wallet (65%), or needed help while travelling abroad (62%).




Analytics Insight
www.analyticsinsight.net