"How the magnets interact gives us all the information we need; the laws of physics themselves become the computer." Kilian Stenning quotes in his research paper "Reconfigurable training and reservoir computing in an artificial spin-vortex ice via spin-wave fingerprinting", published in Nature magazine. He made this statement in the paper in an effort to explain how magnet spins can be put to use to power AI systems. What is that motivated Killian and the team to apply magnetic energy to make AI predictions? Artificial intelligence, by its very versatility, has become an indispensable technology even if it is largely at the nascent stage. The interesting part is, that its acceptance is because of the predictions it can make, ignoring the carbon footprint it has. The fact is that even the simplest decision taken by AI or ML algorithm consumes a lot of energy. Higher the efficiency of the model higher the energy consumption. To put things in perspective, let us consider the example of Megatron ML, a language model similar to GPT-3, trained on 45 terabytes of data, NVIDIA had to run 512 V100 GPUs for nine days. Considering the fact that a V100 GPU consumes approximately 250 watts, the project must have required around 27,648 kWh for training, whereas an average household requires 10,649 kWh annually.
A team led by Imperial College London researchers has found a new way of harnessing magnetic energy to emulate the computations that the conventional neural networks of an ML system perform. Their method relies on networks of nanomagnets to make time-series predictions. The mathematics used to design neural networks was originally inspired by the physics of magnetic interactions; the scientists have now found a way to use those physical interactions directly instead of simulating them.
A nanomagnetic state can be understood as the spin value a particular magnet takes in an excited state. When a group of nanomagnets is placed in a magnetic field, the magnets pass through different spin states and interact with one another in a pattern, scaling the spins up into nanopatterned arrays. To make a prediction, the scientists said, they designed a technique that counts the number of magnets in each state after the field has passed through, and reads the answer off from those counts. Compared with electronic computation, which consumes energy every time electrons pass through a circuit, the nanomagnets' energy cost is negligible. Dr. Jack Gartside, co-first author of the study, said: "We've been trying to crack the problem of how to input data, ask a question, and get an answer out of magnetic computing for a long time. Now we've proven it can be done, it paves the way for getting rid of the computer software that does the energy-intensive simulation."
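The scheme the paper's title points to is reservoir computing: the nanomagnet array acts as a fixed physical "reservoir" that transforms the input signal, and only a lightweight linear readout over the observed states is trained. A minimal software sketch of that idea, with a random simulated reservoir standing in for the magnets (the team's actual reservoir is the physical spin dynamics, not this toy), might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the nanomagnet array: a fixed random recurrent
# "reservoir" whose state vector plays the role of the magnet-state counts.
n_reservoir = 200
W_in = rng.uniform(-0.5, 0.5, size=n_reservoir)
W = rng.normal(size=(n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale for stable dynamics

def run_reservoir(inputs):
    """Drive the reservoir with a 1-D signal and collect its states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in * u)
        states.append(x.copy())
    return np.array(states)

# Time-series task: predict the next sample of a sine wave.
t = np.arange(600)
signal = np.sin(0.1 * t)
states = run_reservoir(signal[:-1])
targets = signal[1:]

# Only the linear readout is trained (ridge regression) -- the reservoir
# itself, like the magnet array, is never modified during training.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_reservoir),
                        states.T @ targets)

pred = states[200:] @ W_out
rmse = np.sqrt(np.mean((pred - targets[200:]) ** 2))
print(rmse)  # small prediction error after discarding the transient
```

The appeal of the physical version is that the expensive part of this loop, the reservoir update, happens for free in the magnets' own dynamics; only the cheap readout runs in software.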
Real-time processing that leverages the power of Edge AI will need to go mainstream for automation and artificial intelligence to be of real value to humanity. While Edge AI brings low latency and can significantly reduce network traffic, it is held back by challenges so daunting that many companies do not even consider it. Edge AI depends heavily on hardware, and the market does not yet have the capacity to produce standardized units that can support energy-guzzling AI algorithms. It also requires seamless hardware integration, a tedious task that current equipment can barely handle. Nanomagnetic computation, with magnets integrated into conventional computers, could significantly improve the energy efficiency of such systems.