Technology is Spying on You: How Can Artificial Intelligence Prevent It?

Companies use "bossware" to listen in on employees while they are near their computers. A variety of "spyware" apps can record phone calls, and home devices such as Amazon's Echo can capture everyday conversations. A new technique known as Neural Voice Camouflage now offers a defense: as you speak, it generates custom audio noise in the background that confuses the artificial intelligence (AI) systems transcribing recorded speech.

The new technology relies on what is known as an "adversarial attack." It uses machine learning, in which algorithms find patterns in data, to tweak sounds in such a way that an AI, but not a human, mistakes them for something else. In essence, you use one AI to deceive another.
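To make the idea concrete, here is a minimal sketch of an adversarial perturbation in the classic fast-gradient-sign style, using a toy stand-in model and an arbitrary noise budget rather than a real speech recognizer; the actual Neural Voice Camouflage system targets automatic speech recognition and is considerably more sophisticated.

```python
# Minimal adversarial-attack sketch (FGSM-style) on a toy audio classifier.
# The model, labels, and epsilon are illustrative assumptions, not the
# researchers' setup.
import torch
import torch.nn as nn

# Hypothetical stand-in for a speech model: maps 1 second of 16 kHz audio
# to logits over 10 made-up word classes.
model = nn.Sequential(nn.Flatten(), nn.Linear(16_000, 10))

def adversarial_perturbation(waveform, true_label, epsilon=0.002):
    """Return a quiet perturbation that pushes the model away from the
    correct label while sounding like faint background noise."""
    waveform = waveform.clone().requires_grad_(True)
    logits = model(waveform)
    loss = nn.functional.cross_entropy(logits, true_label)
    loss.backward()
    # Step in the direction that increases the model's error.
    return epsilon * waveform.grad.sign()

# Example: one second of "speech" (random here) and its correct class.
audio = torch.randn(1, 16_000) * 0.1
label = torch.tensor([3])
noise = adversarial_perturbation(audio, label)
attacked = audio + noise  # sounds nearly identical, but can fool the model
```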

That is harder than it sounds in practice: to conceal speech as it happens, a machine-learning AI would ordinarily need the entire sound clip before it could work out how to adjust it, which is too late for a live conversation.

In the latest study, the researchers trained a neural network, a machine-learning system inspired by the brain, to predict the future effectively. They trained it on hours of recorded speech so that it can continually process 2-second audio snippets and conceal what is likely to be said next.
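The sketch below illustrates that streaming, predict-the-future idea in rough form. The window sizes, the tiny predictor network, and the perturbation scale are assumptions for illustration, not the authors' actual architecture.

```python
# Streaming sketch: a small network looks at the last 2 seconds of audio and
# emits a perturbation to play over the *next* chunk, so the defense never
# has to wait for the full recording. All sizes below are assumptions.
import torch
import torch.nn as nn

SAMPLE_RATE = 16_000
CONTEXT = 2 * SAMPLE_RATE      # 2 seconds of past audio
CHUNK = SAMPLE_RATE // 2       # perturbation covers the next 0.5 s (assumed)

# Hypothetical predictor: past context in, future perturbation out.
predictor = nn.Sequential(
    nn.Flatten(),
    nn.Linear(CONTEXT, 256),
    nn.ReLU(),
    nn.Linear(256, CHUNK),
    nn.Tanh(),                 # keep the perturbation bounded
)

def stream_camouflage(audio_stream):
    """Yield a quiet perturbation for each incoming chunk, computed only
    from audio that has already been heard."""
    buffer = torch.zeros(1, CONTEXT)
    for chunk in audio_stream:              # chunk: tensor of shape (1, CHUNK)
        noise = 0.01 * predictor(buffer)    # predicted from the past only
        yield noise
        # Slide the context window forward to include the new chunk.
        buffer = torch.cat([buffer[:, CHUNK:], chunk], dim=1)

# Example with fake microphone input: four half-second chunks.
fake_stream = (torch.randn(1, CHUNK) * 0.1 for _ in range(4))
for perturbation in stream_camouflage(fake_stream):
    print(perturbation.shape)               # torch.Size([1, 8000])
```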

For example, if somebody has just said "enjoy the big feast," there is no way to predict exactly what will be said next. But by taking into account what has just been spoken, as well as the characteristics of the speaker's voice, the system generates noise that will disrupt a range of possible words that might follow. In this case, that covered what the same speaker actually said next: "that's being prepared." Human listeners perceive the audio camouflage as background noise and have no difficulty recognizing the spoken words. Machines, however, make mistakes.

The researchers tested the camouflage against automatic speech recognition (ASR) software. By comparison, speech concealed by plain white noise or by a competing adversarial attack produced error rates of just 12.8 percent and 20.5 percent, respectively.

Even when the ASR system was trained to transcribe speech that had been disrupted by Neural Voice Camouflage, its error rate remained at 52.5 percent. Short words, like "the," were the hardest to disrupt in general, but they are also the least revealing parts of a conversation.
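Error rates like these are typically measured as word error rate (WER): the insertions, deletions, and substitutions in the machine's transcript, divided by the number of words actually spoken. A small worked example, using made-up sentences rather than the study's data:

```python
# Word error rate (WER) via standard edit-distance dynamic programming over
# words. The sentences below are illustrative only, not the study's data.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("enjoy the big feast that is being prepared",
                      "enjoy the fig fast that is prepared"))  # 0.375
```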

The technology was also tested in the real world: the researchers played a speech recording mixed with the camouflage over a set of speakers in the same room as a microphone. The camouflage still worked.
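A rough sketch of such an over-the-air check, assuming the sounddevice package and a simple additive mix of speech and camouflage (the paper's exact playback setup is not described here):

```python
# Over-the-air test sketch: play speech plus camouflage through loudspeakers
# while recording with a microphone, then hand the recording to the ASR
# system under test. The additive mix and sample rate are assumptions.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 16_000

def play_and_record(speech: np.ndarray, camouflage: np.ndarray) -> np.ndarray:
    """Play the mixed audio over the speakers and capture what a microphone
    in the same room hears."""
    mixed = speech + camouflage
    recording = sd.playrec(mixed, samplerate=SAMPLE_RATE, channels=1)
    sd.wait()  # block until playback and recording finish
    return recording.squeeze()

# The captured audio would then be transcribed by the ASR system under test.
```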

According to Mia Chiquier, a computer engineer at Columbia University who conducted the research, this is simply the first step toward protecting privacy in the face of AI.

Chiquier explains that the program's predictive component has great potential for other applications that require real-time processing, such as driverless cars. Brains also operate through anticipation; you feel surprised when your brain gets a prediction wrong. "We're replicating the way people do things," Chiquier says.

"There's something pleasant about the form it integrates predicting the future, a classic ML problem, with another problem of confrontational ML," says Andrew Owens. Bo Li was astonished by the new approach's ability to defeat the fortified ASR system.
