New hardware offers faster computation for artificial intelligence, with much less energy

As scientists push the frontiers of machine learning, the amount of time, energy, and money required to train ever more sophisticated neural network models is growing. A new branch of artificial intelligence known as analog deep learning promises faster computation using a fraction of the energy. Programmable resistors are the essential building blocks of analog deep learning, much as transistors are in digital processors. By repeating arrays of programmable resistors in multiple layers, researchers can build a network of analog artificial "neurons" and "synapses" that executes computations much like a digital neural network. The network can then be trained to perform complex AI tasks such as image recognition and natural language processing. A multidisciplinary team of MIT researchers set out to test the speed limits of a previously developed form of the human-made analog...
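To make the neuron-and-synapse analogy concrete, the sketch below simulates how a crossbar array of programmable resistors performs a neural layer's core computation. Each resistor's conductance acts as a synaptic weight: by Ohm's law and Kirchhoff's current law, the currents summed on each output line equal a matrix-vector product of conductances and input voltages. This is a minimal illustrative simulation under stated assumptions, not the MIT team's implementation; the function name `crossbar_layer` and all values are hypothetical.

```python
import numpy as np

def crossbar_layer(voltages, conductances):
    """Simulate one analog crossbar layer.

    A programmable resistor sits at each row-column crossing; its
    conductance G[i, j] plays the role of a synaptic weight. Applying
    input voltages along the rows yields, by Ohm's law (I = G * V) and
    Kirchhoff's current law (column currents sum), an output current
    vector equal to G^T @ V -- the same multiply-accumulate performed
    by a layer of a digital neural network.
    """
    return conductances.T @ voltages

# Hypothetical example: a 4-input, 3-output layer.
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # programmable conductances (siemens)
v = rng.uniform(-1.0, 1.0, size=4)       # input voltages (volts)

currents = crossbar_layer(v, G)          # analog multiply-accumulate
digital = G.T @ v                        # equivalent digital computation
assert np.allclose(currents, digital)    # same result, computed "in physics"
```

In real analog hardware, this product is produced in a single step by the physics of the array itself rather than by shuttling weights between memory and a processor, which is where the promised gains in speed and energy come from.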