
Why Is MIT's 'Liquid' Neural Network a Revolutionary Innovation?

Preetipadma

What shortcomings of traditional neural networks are addressed by MIT's Liquid Neural Network

The tech world is brimming with updates about the latest innovations on the artificial intelligence front. Applications like machine learning, computer vision, deep learning, natural language processing and neural networks are being deployed and are influencing the bottom line of several industry verticals. Among these, neural networks have drawn great interest in the scientific community because they are inspired by the way biological nervous systems process information. By emulating how the human brain functions, they help in developing computational models capable of recognizing patterns and, ultimately, interpreting new information.

The term neural network derives from the work of the neuroscientist Warren S. McCulloch and the logician Walter Pitts, who are credited with developing the first conceptual model of an artificial neural network. In their work, they described a neuron as a single cell living in a network of cells that receives inputs, processes those inputs, and generates an output.

Some of the general uses of a neural network are:

• Pattern recognition

• Image or object detection

• Anomaly detection

• Time series forecasting

• Signal processing

MIT's Breakthrough in Neural Networks

However, since neural networks cannot be programmed directly for a given application, they must first be trained on input data. Recently, a team of researchers at MIT designed a neural network that continues to adapt and learn from incoming data. Simply put, this new model is capable of adapting its underlying behavior after the initial training phase.

Despite numerous significant improvements across several iterations of neural network algorithms, extracting meaningful information by adapting to real-time input has remained a challenging task. To address this, MIT researchers arrived at a new solution to advance artificial intelligence: a neural network fluid enough to learn on the job. Dubbed a 'liquid network,' the innovation, the team believes, can enable decision-making based on data streams that change over time. Applications include controlling robots in real time, video processing, medical diagnosis and autonomous driving.

According to TechCrunch, in the training phase, neural network algorithms are provided with a large volume of relevant target data to hone their inference capabilities and are rewarded for correct responses to optimize performance. This implies that, after the training phase, the behavior of the neural network becomes 'fixed.' As a result, such networks are often unable to adjust to changes and tweaks in the incoming data stream.

As per a statement by Ramin Hasani, a postdoc in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), the fluidity of this 'liquid' network makes it more resilient to unexpected or noisy data, such as when heavy rain obscures the view of a camera on a self-driving car. Here, the liquid neural network can better adjust to the shift in circumstances and maintain a high level of performance. Further, this neural network is more interpretable, which makes it more flexible.

"The liquid network skirts the inscrutability common to other neural networks. By changing the representation of a neuron, you can explore some degrees of complexity you couldn't explore otherwise," he adds. Ramin is also the lead author of the study. He drew inspiration for this model, from the microscopic nematode, C. elegans. Though, it has 302 neurons in its nervous system, it can still generate unexpectedly complex dynamics.

Ramin programmed his neural network with meticulous attention to how C. elegans neurons activate and communicate with each other via electrical impulses.
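For readers curious about what such a neuron might look like in code, the sketch below is a simplified, Euler-discretized version of the liquid time-constant update described in Hasani's published work on liquid networks. It is illustrative only; the layer sizes, weights, and variable names are assumptions, not the team's actual implementation.

```python
# Minimal sketch of a liquid time-constant (LTC) style neuron update,
# assuming a simple Euler discretization. Sizes and names are illustrative,
# not the MIT implementation.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_neurons = 4, 8                                   # illustrative sizes
W_in = rng.normal(scale=0.5, size=(n_neurons, n_inputs))     # input weights
W_rec = rng.normal(scale=0.5, size=(n_neurons, n_neurons))   # recurrent weights
bias = np.zeros(n_neurons)
tau = np.ones(n_neurons)                                      # base time constants
A = np.ones(n_neurons)                                        # target/bias state

def ltc_step(x, u, dt=0.1):
    """One Euler update of the hidden state x given the input u at this step."""
    # f couples the current input and state; its value also modulates the
    # effective time constant, which is what makes the dynamics "liquid".
    f = np.tanh(W_in @ u + W_rec @ x + bias)
    dxdt = -(1.0 / tau + f) * x + f * A
    return x + dt * dxdt

# Process a stream of inputs one step at a time; the state keeps adapting
# because the governing equations are re-evaluated on every incoming sample.
x = np.zeros(n_neurons)
for t in range(100):
    u = np.sin(0.1 * t) * np.ones(n_inputs)   # stand-in input stream
    x = ltc_step(x, u)
```

The key design point the sketch tries to convey is that the neuron's time constant is not a fixed number learned once and frozen; it depends on the current input, so the cell's response characteristics shift as the data stream shifts.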

The research will be presented at February's AAAI Conference on Artificial Intelligence. Other co-authors from MIT include Daniela Rus, CSAIL director and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science, and Ph.D. student Alexander Amini. Mathias Lechner of the Institute of Science and Technology Austria and Radu Grosu of the Vienna University of Technology have also co-authored this study.

The team focused on time-series adaptability. In other words, instead of building the neural network on training data comprising a number of snapshots, or static moments fixed in time, they used time-series data: sequences of images rather than isolated slices. Ramin explains that time-series data is both ubiquitous and vital to our understanding of the world, and that the real world is all about sequences. Even our perception works this way: we do not perceive isolated images, we perceive sequences of images. Hence, time-series data actually creates our reality. Video processing, financial data, and medical diagnostic applications are primary situations that rely on time-series data.
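To make the snapshot-versus-sequence distinction concrete, the short sketch below shows one common way such streams are framed as training data: overlapping windows of past values paired with the value that follows. The helper function and the sine-wave stand-in are purely illustrative and are not taken from the MIT study.

```python
# A small sketch of the difference between "snapshot" data and a
# time-series framing: instead of independent samples, the model sees
# overlapping windows of consecutive values from a stream.
import numpy as np

signal = np.sin(np.linspace(0, 20, 200))   # stand-in sensor/video stream

def to_sequences(series, window=10):
    """Turn a 1-D stream into (past window, next value) training pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])     # a short sequence of past values
        y.append(series[i + window])       # the value to predict next
    return np.array(X), np.array(y)

X, y = to_sequences(signal)
print(X.shape, y.shape)                    # (190, 10) and (190,)
```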

Moreover, the small number of highly expressive neurons makes it easy to peer into the 'black box' of the network's decision-making and diagnose why the network made a certain characterization. This means the liquid network is more open to observation and study by researchers than traditional neural networks. It is also more economical to build and train.

When pitted against other state-of-the-art time-series algorithms, the liquid neural network performed better by a few percentage points at accurately predicting future values in datasets ranging from atmospheric chemistry to traffic patterns. Ramin now plans to enhance the system and optimize it for practical industrial applications.

The research was funded by Boeing, the National Science Foundation, the Austrian Science Fund, and Electronic Components and Systems for European Leadership.
