A deep belief neural network based on silicon memristive synapses
Neural network models have recently become more accurate and more sophisticated, which drives up the energy consumed to train and run them on conventional computers. Developers around the world are therefore working on alternative, "brain-like" hardware that can handle the heavy computational loads of artificial intelligence systems more efficiently.
Researchers from the Technion - Israel Institute of Technology and the Peng Cheng Laboratory have recently created a new neuromorphic computing system that supports generative and graph-based deep learning models, including deep belief neural networks (DBNs).
Their work was presented in the journal Nature Electronics. The system is based on silicon memristors: energy-efficient devices that both store and process information. We have previously covered the use of memristors in artificial intelligence; the scientific community has been working on neuromorphic computing for quite some time, and memristors look very promising for it.
Memristors are electronic components that can switch or regulate the flow of electric current in a circuit while also remembering how much charge has passed through them, so their resistance depends on that history. They are well suited to running artificial intelligence models because, in both behavior and structure, they resemble the synapses of the human brain more than conventional memory blocks and processors do.
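To make that "remembered charge" idea concrete, here is a minimal Python sketch of a classic charge-controlled memristor model (linear ion drift, in the spirit of the HP Labs formulation). This illustrates the general concept only, not the Y-Flash device used in this work, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal charge-controlled memristor model (linear ion drift).
# All parameter values are illustrative, not those of any real device.
R_ON, R_OFF = 100.0, 16e3   # fully-ON / fully-OFF resistance, ohms
Q_MAX = 1e-4                # charge needed to switch fully, coulombs

def memristance(q):
    """Resistance as a function of the total charge that has flowed through."""
    w = np.clip(q / Q_MAX, 0.0, 1.0)       # normalized internal state
    return R_ON * w + R_OFF * (1.0 - w)

# Drive the device with a sinusoidal voltage and integrate the charge:
# the resistance at any instant depends on the whole current history.
t = np.linspace(0.0, 2e-3, 2000)
dt = t[1] - t[0]
v = 1.2 * np.sin(2 * np.pi * 1e3 * t)
q = 0.0
for vk in v:
    i = vk / memristance(q)   # Ohm's law with a state-dependent resistance
    q += i * dt               # the device "remembers" the accumulated charge

print(f"final resistance: {memristance(q):.1f} ohms")
```

Plotting current against voltage for such a model yields the pinched hysteresis loop that is the classic signature of memristive behavior.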
At the moment, however, memristors are used mainly for analog computing and far less often in AI designs. Because working with them remains quite expensive, memristive technology has not yet become widespread in the neuromorphic field.
Professor Shahar Kvatinsky and his colleagues from the Technion and the Peng Cheng Laboratory decided to work around this limitation. Since memristors are not widely available, the researchers instead used a commercially available flash technology developed by Tower Semiconductor and engineered its behavior to resemble a memristor. They also deliberately tested their system on a DBN, an older theoretical concept in machine learning, chosen because a DBN requires no data conversion: its inputs and outputs are binary and inherently digital.
The scientists' idea was to use binary neurons, whose inputs and outputs take values of 0 or 1. The study investigated two-terminal floating-gate memristive synaptic devices fabricated in a standard CMOS manufacturing process; the resulting silicon-based artificial synapses were dubbed silicon synapses. Because the neural states are fully binarized, the neural circuit design is simplified: expensive analog-to-digital and digital-to-analog converters (ADCs and DACs) are no longer required.
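The sketch below, shown for illustration only, captures why binary neurons remove the need for converters: in a restricted Boltzmann machine with binary units, the only analog quantity is the weighted sum (physically, the array's output current), and stochastically thresholding it yields a binary state again. The dimensions, weight values, and function names here are assumptions, not the paper's circuit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 12 visible and 8 hidden binary neurons; the weight
# matrix stands in for an array of analog synaptic conductances.
n_vis, n_hid = 12, 8
W = rng.normal(0.0, 0.1, size=(n_vis, n_hid))
b_hid = np.zeros(n_hid)

def sample_hidden(v_binary):
    """One stochastic binary update: binary in -> binary out.

    The weighted sum is analog (the array's output current), but it is
    immediately reduced to a 0/1 state, so no DAC is needed at the input
    and no ADC at the output.
    """
    pre = v_binary @ W + b_hid              # analog "current" per hidden neuron
    p = 1.0 / (1.0 + np.exp(-pre))          # sigmoid firing probability
    return (rng.random(n_hid) < p).astype(np.int8)

v = rng.integers(0, 2, size=n_vis)          # binary input vector (0/1)
h = sample_hidden(v)
print(v, "->", h)
```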
Silicon synapses offer many advantages: analog conductance tuning, high endurance, long retention times, predictable cyclic degradation, and moderate device-to-device variation.
Kvatinsky and his colleagues used these silicon synapses to build a DBN consisting of three 19×8 memristive restricted Boltzmann machines, implemented with two 12×8 memristor arrays.
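The article does not describe the training procedure, but a DBN of this kind is conventionally built by stacking restricted Boltzmann machines and pretraining them greedily, layer by layer, with contrastive divergence. The following Python sketch shows that textbook scheme; the layer sizes, hyperparameters, and toy data are illustrative assumptions, not the mapping used on the memristor arrays.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary-binary restricted Boltzmann machine, trained with CD-1."""
    def __init__(self, n_vis, n_hid):
        self.W = rng.normal(0.0, 0.1, size=(n_vis, n_hid))
        self.bv = np.zeros(n_vis)
        self.bh = np.zeros(n_hid)

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.bh)
        return (rng.random(p.shape) < p).astype(float), p

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.bv)
        return (rng.random(p.shape) < p).astype(float), p

    def cd1_step(self, v0, lr=0.05):
        h0, ph0 = self.sample_h(v0)         # positive phase
        v1, _ = self.sample_v(h0)           # one-step reconstruction
        _, ph1 = self.sample_h(v1)          # negative phase
        # Contrastive-divergence update: positive minus negative statistics.
        self.W += lr * (v0[:, None] * ph0[None, :] - v1[:, None] * ph1[None, :])
        self.bv += lr * (v0 - v1)
        self.bh += lr * (ph0 - ph1)

# Greedy layer-wise pretraining: each RBM learns on the binary hidden
# samples of the one below it. Layer sizes here are illustrative only.
layer_sizes = [19, 8, 8, 8]                 # -> three stacked RBMs
rbms = [RBM(a, b) for a, b in zip(layer_sizes, layer_sizes[1:])]

data = (rng.random((200, layer_sizes[0])) < 0.3).astype(float)  # toy binary data
for rbm in rbms:
    for _ in range(5):                      # a few epochs per layer
        for x in data:
            rbm.cd1_step(x)
    data = np.stack([rbm.sample_h(x)[0] for x in data])  # feed up the stack
```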
The system was tested on a modified version of the MNIST dataset, where the network built on Y-Flash-based memristors reached a recognition accuracy of 97.05%.
In the future, the developers plan to scale up this architecture, apply it more widely, and explore additional memristive technologies.
The architecture the scientists presented offers a viable new solution for running restricted Boltzmann machines and other DBNs. In the future, it may become the basis for similar neuromorphic systems and help further improve the energy efficiency of AI.
You can check out the MATLAB code for a deep-learning memristive network based on a bipolar floating-gate memristor (Y-Flash device) on GitHub.