In conventional computer architectures, the physical separation of processing and memory units wastes considerable energy because data must be repeatedly shuttled between them. Reservoir computing, a brain-inspired approach, addresses this by integrating memory and processing.
Reservoir Computing
Traditional computing consumes large amounts of electricity because its separate units for data storage and processing require information to be constantly shuttled between the two, producing heat and wasting energy.
This is a particular challenge for machine learning, which must process vast datasets. Training a single large AI model can generate hundreds of tonnes of carbon dioxide.
Reservoir computing is a form of brain-inspired computing that exploits the intrinsic physical properties of a material to reduce energy use. As a neuromorphic approach, physical reservoir computing aims to remove the need for distinct memory and processing units and enable more efficient ways of processing data.
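The idea is easiest to see in its software form, the echo state network: a fixed, random recurrent network (the "reservoir", which a physical material can replace) provides memory and nonlinear mixing, and only a simple linear readout is trained. Below is a minimal illustrative sketch; all sizes and parameter values are arbitrary choices for demonstration, not taken from the study.

```python
# Minimal echo state network: a software stand-in for a physical reservoir.
# Only the linear readout is trained; the reservoir itself stays fixed.
import numpy as np

rng = np.random.default_rng(0)

N = 100                                    # reservoir size (arbitrary)
W_in = rng.uniform(-0.5, 0.5, N)           # fixed random input weights
W = rng.normal(0.0, 1.0, (N, N))           # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9

# Toy task needing memory: reproduce the input from 5 steps earlier.
T, delay, warm = 2000, 5, 100
u = rng.uniform(-1, 1, T)
target = np.roll(u, delay)

# Drive the reservoir with the input and record its states.
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Train only the readout (ridge regression), discarding a warm-up period.
S, y = states[warm:], target[warm:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)

err = np.sqrt(np.mean((S @ W_out - y) ** 2))
print(f"RMSE on 5-step recall: {err:.3f}")
```

Because training touches only the readout, the expensive recurrent dynamics can be delegated to a physical system, which is what makes the scheme attractive for low-energy hardware.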
Physical reservoir computing not only offers a more sustainable alternative to conventional computing but can also be integrated into existing circuitry to add energy-efficient capabilities.
The approach is currently limited by a lack of reconfigurability: a material's physical properties may make it excel at one subset of computing tasks while performing poorly at others.
Harnessing the Potential of Twisted Magnets
Scientists from University College London and Imperial College London are a step closer to brain-inspired computing, using chiral, or twisted, magnets as their computational medium. The research team used a vector network analyzer to measure the chiral magnets' energy absorption across a range of magnetic field strengths and temperatures.
Collaborators at UCL, led by Professor Hidekazu Kurebayashi, identified a promising set of materials for powering unconventional computing. These materials are special because they can support a rich and varied range of magnetic textures. Meanwhile, a group at Imperial College London designed a neuromorphic computing architecture that leverages the properties of these complex materials to meet the demands of a diverse set of challenging tasks.
The experts found that by applying an external magnetic field and changing the temperature, the physical properties of these materials can be adapted to suit different machine-learning tasks. They also discovered that different magnetic phases of the chiral magnets excelled at different kinds of computing tasks, indicating that reconfiguring physical phases can directly tailor neuromorphic computing performance.
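A software analogy makes this reconfigurability concrete: rescaling the same random echo state network to different spectral radii (loosely analogous to tuning field or temperature to select a magnetic phase) changes how much memory the reservoir retains, and therefore which tasks it suits. The sketch below is an illustration under these assumptions, not a model of the magnets themselves.

```python
# "Reconfiguring" a software reservoir: the same random network, rescaled
# to different spectral radii, retains input history for different lengths
# of time. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
N, T, warm = 100, 2000, 100
u = rng.uniform(-1, 1, T)
W_in = rng.uniform(-0.5, 0.5, N)
W_raw = rng.normal(0.0, 1.0, (N, N))
W_raw /= np.max(np.abs(np.linalg.eigvals(W_raw)))  # unit spectral radius

def recall_error(radius, delay):
    """RMSE of a trained readout recalling the input from `delay` steps ago."""
    W = radius * W_raw                     # "reconfigure" the reservoir
    x, states = np.zeros(N), np.zeros((T, N))
    for t in range(T):
        x = np.tanh(W @ x + W_in * u[t])
        states[t] = x
    S, y = states[warm:], np.roll(u, delay)[warm:]
    W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)
    return np.sqrt(np.mean((S @ W_out - y) ** 2))

# Weak recurrence forgets quickly; near-critical recurrence remembers longer.
for radius in (0.1, 0.95):
    print(f"radius={radius}: 10-step recall RMSE = {recall_error(radius, 10):.3f}")
```

The weakly coupled setting fails at long-delay recall while the near-critical one succeeds, mirroring the finding that different physical phases suit different tasks.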
According to lead author Dr. Oscar Lee of the London Centre for Nanotechnology at UCL, the work brings researchers closer to realizing the full potential of physical reservoirs: computers that require significantly less energy and can adapt their computational properties to perform optimally across different tasks, much as the human brain does.
In the future, the research team plans to identify materials and device architectures that are commercially viable and scalable.