Researchers at the University of Michigan say they have built a memristor-based reservoir computing system that reduces training time and increases the capacity of similar neural networks. They note that using memristors lowers space requirements and allows easier integration with existing silicon-based electronics.
The team employed a special memristor that retains only recent events, allowing them to bypass an expensive training process while still giving the network recall capability. When the reservoir is fed a dataset, it identifies important time-related features of the data and passes them to a second network in a simple format. That second network then needs only the kind of training used for simpler neural networks: adjusting the weights on the features and outputs the first network hands it until it reaches an acceptable level of error.
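The division of labor described above, a fixed reservoir that extracts temporal features and a small trained readout, can be sketched in software with a conventional echo state network. This is an illustration of reservoir computing in general, not the Michigan team's memristor hardware; the network sizes, input scaling, and delayed-recall task are all hypothetical choices for the demo.

```python
import numpy as np

# Hypothetical demo sizes: one input channel, 100 reservoir nodes.
rng = np.random.default_rng(0)
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the spectral radius below 1 so the reservoir has fading memory,
# i.e. it retains only events in the near history.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the fixed (never-trained) reservoir with input sequence u."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task needing short-term recall: reproduce the input from 5 steps ago.
T, delay = 2000, 5
u = rng.uniform(-1, 1, T)
target = np.roll(u, delay)

X = run_reservoir(u)[delay:]   # reservoir states = extracted features
Y = target[delay:]

# Only the linear readout is trained -- one ridge-regression solve,
# no backpropagation through the reservoir itself.
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

pred = X @ w_out
nrmse = np.sqrt(np.mean((pred - Y) ** 2)) / np.std(Y)
print(f"delay-{delay} recall NRMSE: {nrmse:.3f}")
```

The reservoir matrices `W_in` and `W` are random and stay frozen; all learning happens in the single least-squares fit of `w_out`, which is why training is cheap compared with end-to-end recurrent networks.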
"The beauty of reservoir computing is that while we design it, we don't have to train it," says Michigan professor Wei Lu.
From The Michigan Engineer News Center
Abstracts Copyright © 2017 Information Inc., Bethesda, Maryland, USA