The new technology, devised by researchers from Imperial College London, has the potential to reduce the energy cost of artificial intelligence (AI), which is now doubling every 3.5 months globally.
The international team published the first proof in the journal Nature Nanotechnology that networks of nanomagnets can be used to perform AI-like processing. According to the researchers, nanomagnets can carry out “time-series prediction” tasks, such as predicting and regulating insulin levels in diabetic patients.
Artificial intelligence that employs “neural networks” aims to mimic the way neurons in the brain interact to process and store information. Much of the mathematics that powers neural networks was originally devised by physicists to describe the way magnets interact.
However, it was previously difficult to use magnets directly for computing, because researchers did not know how to put data in and get information out of them.
Consequently, conventional silicon-based computers were used to simulate the magnet interactions that imitate the brain. The team can now process and store data directly on the magnets themselves, eliminating the need for a software simulation and potentially saving a great deal of energy.
Depending on their orientation, nanomagnets can exist in different “states”. When a magnetic field is applied to a network of nanomagnets, the states of the magnets change depending on the properties of the input field and the states of neighboring magnets.
The team, led by researchers in Imperial’s Department of Physics, then devised a method for counting the number of magnets in each state once the field has passed through, providing the “answer.”
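The field-driven state changes and counting readout described above can be illustrated with a toy numerical model. This is a simplified sketch, not the authors’ physical system: a small grid of binary magnets that flip under an applied field plus an assumed nearest-neighbour coupling, with the answer read out by counting magnets in each state.

```python
import numpy as np

# Toy illustration (not the published model): a grid of binary
# "nanomagnets" that flip under an applied field plus the influence
# of their nearest neighbours. The coupling strength of 0.5 is an
# arbitrary assumption chosen for this sketch.

rng = np.random.default_rng(0)
N = 8                                       # grid side length
state = rng.choice([-1, 1], size=(N, N))    # each magnet up (+1) or down (-1)

def apply_field(state, field):
    """One relaxation sweep: each magnet aligns with the applied field
    plus the summed orientation of its four nearest neighbours."""
    neighbours = (np.roll(state, 1, 0) + np.roll(state, -1, 0) +
                  np.roll(state, 1, 1) + np.roll(state, -1, 1))
    return np.sign(field + 0.5 * neighbours)

# Drive the array with a sequence of input fields...
for field in [0.8, -1.2, 0.3]:
    state = apply_field(state, field)

# ...then "read out" by counting how many magnets are in each state.
n_up = int((state == 1).sum())
n_down = int((state == -1).sum())
print(n_up, n_down)
```

The final counts depend on both the input sequence and the neighbour interactions, which is what lets the array’s final state encode an “answer” to the input.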
We’ve been trying to crack the problem of how to input data, ask a question, and get an answer out of magnetic computing for a long time. Now we’ve proven it can be done, it paves the way for getting rid of the computer software that does the energy-intensive simulation.
Dr. Jack Gartside, Study Co-First Author, Imperial College London
How the magnets interact gives us all the information we need; the laws of physics themselves become the computer.
Kilian Stenning, Study Co-First Author, Imperial College London
Team leader Dr. Will Branford stated, “It has been a long-term goal to realise computer hardware inspired by the software algorithms of Sherrington and Kirkpatrick. It was not possible using the spins on atoms in conventional magnets, but by scaling up the spins into nanopatterned arrays we have been able to achieve the necessary control and readout.”
Slashing Energy Cost
From speech recognition to self-driving automobiles, AI is currently employed in a variety of applications. However, training AI to perform even seemingly basic tasks requires a great deal of energy: teaching an AI to solve a Rubik’s cube took the energy equivalent of two nuclear power plants running for an hour.
In silicon-chip computers, much of the energy required is wasted in the inefficient transport of electrons during processing and memory storage.
Nanomagnets, on the other hand, do not rely on the physical flow of particles such as electrons to process and transmit information. Instead, they process and transfer information as a “magnon” wave, in which each magnet affects the state of its neighbors.
This means far less energy is wasted, and information can be processed and stored simultaneously rather than in separate operations, as in conventional computers. Because of this, nanomagnetic computing could be up to 100,000 times more efficient than conventional computing.
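The published study uses the magnet array as a “reservoir computer” for time-series prediction, a paradigm in which the physical system is left untrained and only a simple linear readout is fitted. A minimal numerical sketch of that paradigm, with a random tanh reservoir standing in for the magnet array (the reservoir size, weight scales, and sine-wave task are all illustrative assumptions):

```python
import numpy as np

# Minimal reservoir-computing sketch: a fixed random nonlinear
# "reservoir" is driven by an input series, and only a linear
# readout is trained on the recorded reservoir states.

rng = np.random.default_rng(1)
n_res = 100                                          # reservoir size (assumed)
W_in = rng.normal(scale=0.5, size=n_res)             # fixed input weights
W = rng.normal(scale=0.8 / np.sqrt(n_res),           # fixed recurrent weights,
               size=(n_res, n_res))                  # scaled for stability

def run_reservoir(u):
    """Drive the reservoir with input series u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)              # nonlinear state update
        states.append(x.copy())
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t)
X = run_reservoir(u[:-1])                            # states while driving
y = u[1:]                                            # target: the next value

# Train only the readout, via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

In the nanomagnetic version, the magnet array plays the role of the random reservoir, so the expensive state-update simulation is replaced by the physics of the magnets themselves.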
AI at the Edge
After the success of the initial study, the next step for the scientists involves using real-world data, such as ECG signals, to instruct the system and eventually turn it into a working computer. Magnetic devices might eventually be incorporated into traditional computers to increase energy efficiency for high-processing workloads.
Due to their energy efficiency, they might be fueled by renewable energy and used to do “AI at the edge,” which involves processing data on the spot, such as at weather stations in Antarctica, rather than transmitting it back to massive data centers.
They might also be employed in wearable devices to analyze biometric data from the body, like predicting and regulating insulin levels in diabetic patients or detecting irregular heartbeats.
Gartside, J. C., et al. (2022) Reconfigurable training and reservoir computing in an artificial spin-vortex ice via spin-wave fingerprinting. Nature Nanotechnology. doi.org/10.1038/s41565-022-01091-7.