Running artificial intelligence computations will become faster and cheaper, and consume less energy

New hardware offers faster computation for artificial intelligence, with much less energy


The amount of time, energy, and money required to train ever more sophisticated neural network models is soaring as researchers push the limits of machine learning. A new branch of artificial intelligence known as analog deep learning promises faster computation with a fraction of the energy.


Programmable resistors are the key building blocks in analog deep learning, much as transistors are in digital processors. By repeating arrays of programmable resistors in complex layers, researchers can create a network of analog artificial "neurons" and "synapses" that performs computations just like a digital neural network. The network can then be trained to carry out complex AI tasks such as image recognition and natural language processing.
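To make the idea concrete, here is a minimal simulation sketch (not code from the researchers) of how a crossbar array of programmable resistors can perform a layer's worth of multiply-accumulate operations at once: weights are stored as conductances, Ohm's law does the multiplications, and Kirchhoff's current law does the sums. The array size and conductance range below are illustrative assumptions.

```python
import numpy as np

# Illustrative simulation of one analog crossbar layer.
# Each weight is stored as a resistor conductance G (in siemens). Applying
# input voltages V to the rows produces column currents I = G^T @ V:
# Ohm's law performs every multiplication and Kirchhoff's current law
# performs every sum, all in parallel.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(128, 64))  # 128 inputs x 64 outputs (assumed sizes)
V = rng.uniform(0.0, 0.2, size=128)          # input voltages in volts (assumed range)

I = G.T @ V     # one accumulated current per output "neuron"
print(I.shape)  # (64,)
```

In a physical array this entire matrix-vector product happens in a single analog step, which is where the speed and energy advantages over shuttling digital weights in and out of memory come from.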


A multidisciplinary team of MIT researchers set out to test the speed limits of a human-made analog synapse they had developed previously. In the fabrication process they used a practical inorganic material that allows their devices to run 1 million times faster than prior versions, which is also about 1 million times faster than the synapses in the human brain. Moreover, the inorganic material makes the resistor extremely energy-efficient. Unlike the materials used in earlier versions of their device, the new material is compatible with silicon fabrication processes. This advance has enabled the fabrication of nanometer-scale devices and could open the way to integration into commercial computing hardware for deep-learning applications.


"With that key insight, and the very powerful nanofabrication techniques we have at MIT.nano, we have been able to put these pieces together and demonstrate that these devices are intrinsically very fast and operate with reasonable voltages," says senior author Jesús A. del Alamo, the Donner Professor in MIT's Department of Electrical Engineering and Computer Science (EECS). "This work has really put these devices at a point where they now look really promising for future applications."


"The working mechanism of the device is the electrochemical insertion of the smallest ion, the proton, into an insulating oxide to modulate its electronic conductivity. Because we are working with very thin devices, we could accelerate the motion of this ion by using a strong electric field, and push these ionic devices to the nanosecond operation regime," explains senior author Bilge Yildiz, the Breene M. Kerr Professor in the departments of Nuclear Science and Engineering and Materials Science and Engineering.


"The action potential in biological cells rises and falls with a timescale of milliseconds since the voltage difference of about 0.1 volt is constrained by the stability of water," says senior author Ju Li, the Battelle Energy Alliance Professor of Nuclear Science and Engineering and professor of materials science and engineering, "Here we apply up to 10 volts across a special solid glass film of nanoscale thickness that conducts protons, without permanently damaging it. And the stronger the field, the faster the ionic devices."


"Once you have an analog processor, you will no longer be training networks everyone else is working on. You will be training networks with unprecedented complexities that no one else can afford to, and therefore vastly outperform them all. In other words, this is not a faster car, this is a spacecraft," adds lead author and MIT postdoc Murat Onen.


These programmable resistors vastly increase the speed at which a neural network is trained, while drastically reducing the cost and energy of doing so. This could help scientists develop deep learning models much more quickly, which could then be applied in areas such as self-driving cars, fraud detection, and medical image analysis.
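As a loose illustration of why training benefits, a weight update on such hardware would nudge each resistor's conductance in place with programming pulses, rather than reading and rewriting digital weights in memory. The update rule, pulse granularity, and array sizes in this sketch are hypothetical assumptions, not the device's actual programming scheme:

```python
import numpy as np

# Hypothetical in-place analog weight update: quantize the desired
# gradient step into whole programming pulses, each shifting a
# resistor's conductance by a fixed increment dG.
def analog_update(G, x, err, lr=0.1, dG=1e-7):
    """Move conductances toward G - lr * outer(x, err), pulse by pulse."""
    target_step = -lr * np.outer(x, err)  # desired conductance change
    pulses = np.round(target_step / dG)   # quantize into whole pulses
    return G + pulses * dG                # apply the pulsed update

rng = np.random.default_rng(1)
G = rng.uniform(1e-6, 1e-4, size=(8, 4))  # assumed small array
x, err = rng.standard_normal(8), rng.standard_normal(4)
G = analog_update(G, x, err)
```

Because every resistor can update simultaneously and the devices switch in nanoseconds, the time per training step need not grow with network size in principle, which is the source of the speed and energy savings described above.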

