D. Joksas, Erwei Wang, Nikolaos Barmpatsalos, W. H. Ng, A. Kenyon, G. Constantinides, A. Mehonic
13. 12. 2021.

Nonideality‐Aware Training for Accurate and Robust Low‐Power Memristive Neural Networks

Recent years have seen a rapid rise in the use of artificial neural networks for a number of cognitive tasks. The ever‐increasing computing requirements of these structures have contributed to a desire for novel technologies and paradigms, including memristor‐based hardware accelerators. Solutions based on memristive crossbars and analog data processing promise to improve the overall energy efficiency. However, memristor nonidealities can lead to the degradation of neural network accuracy, while attempts to mitigate these negative effects often introduce design trade‐offs, such as those between power and reliability. In this work, the authors design nonideality‐aware training of memristor‐based neural networks capable of dealing with the most common device nonidealities. The feasibility of using high‐resistance devices that exhibit high I‐V nonlinearity is demonstrated—by analyzing experimental data and employing nonideality‐aware training, it is estimated that the energy efficiency of memristive vector‐matrix multipliers is improved by almost three orders of magnitude (from 0.715 TOP s⁻¹ W⁻¹ to 381 TOP s⁻¹ W⁻¹) while maintaining similar accuracy. It is shown that associating the parameters of neural networks with individual memristors allows these devices to be biased toward less conductive states through regularization of the corresponding optimization problem, while modifying the validation procedure leads to more reliable estimates of performance. The authors demonstrate the universality and robustness of this approach when dealing with a wide range of nonidealities.
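The idea of associating network parameters with device conductances and penalizing high conductance can be illustrated with a minimal sketch. The PyTorch code below is not the authors' implementation: the differential‐pair weight mapping, the clamping scheme, the conductance range, and the regularization strength are all assumptions chosen for illustration only.

```python
import torch
import torch.nn as nn

class MemristiveLinear(nn.Module):
    """Illustrative fully connected layer whose weights are realized as
    differential pairs of memristor conductances (G_pos - G_neg)."""

    def __init__(self, in_features, out_features, g_max=1e-4):
        super().__init__()
        # Trainable conductances; clamped to [0, g_max] in the forward pass.
        self.g_pos = nn.Parameter(torch.rand(in_features, out_features) * g_max)
        self.g_neg = nn.Parameter(torch.rand(in_features, out_features) * g_max)
        self.g_max = g_max

    def forward(self, x):
        g_pos = self.g_pos.clamp(0.0, self.g_max)
        g_neg = self.g_neg.clamp(0.0, self.g_max)
        # Effective weight is the conductance difference of each device pair.
        return x @ (g_pos - g_neg)

    def conductance_penalty(self):
        # L1-style penalty on conductances biases devices toward
        # less conductive (high-resistance) states.
        return (self.g_pos.clamp(0.0, self.g_max).sum()
                + self.g_neg.clamp(0.0, self.g_max).sum())

# Training step sketch: task loss plus conductance regularization.
layer = MemristiveLinear(784, 10)
optimizer = torch.optim.Adam(layer.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

x = torch.rand(32, 784)            # dummy batch of inputs
y = torch.randint(0, 10, (32,))    # dummy labels
lam = 1.0                          # regularization strength (illustrative)

optimizer.zero_grad()
loss = criterion(layer(x), y) + lam * layer.conductance_penalty()
loss.backward()
optimizer.step()
```

In this sketch, the penalty term competes with the task loss, so increasing the (hypothetical) coefficient `lam` pushes conductance pairs toward zero, i.e. toward higher‐resistance, lower‐power states.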

