
Use of Magnetoresistive Random-Access Memory as Approximate Memory for Training Neural Networks

2018-10-25
Nicolas Locatelli, Adrien F. Vincent, Damien Querlioz

Abstract

Hardware neural networks that implement synaptic weights with embedded non-volatile memory, such as spin-torque magnetoresistive random-access memory (ST-MRAM), are a major lead for low-energy artificial intelligence. In this work, we propose an approximate storage approach for their memory. We show that this strategy grants effective control of the bit error rate by modulating the programming pulse amplitude or duration. Accounting for device variability, we evaluate the resulting energy savings and show how they translate when training a hardware neural network. In an image recognition example, 74% of the programming energy can be saved at the cost of only 1% in recognition performance.
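The approximate-storage idea above can be sketched as a simple bit-flip model: synaptic weights are quantized to fixed-point codes, and each stored bit flips independently with a probability set by the bit error rate (which the paper controls through the programming pulse amplitude or duration). The function below is an illustrative assumption for intuition only, not the authors' device-level model; the 8-bit quantization scheme and the name `store_approximate` are invented here.

```python
import random

def store_approximate(weights, ber, bits=8, seed=0):
    """Simulate writing quantized weights to an approximate memory:
    each stored bit flips independently with probability `ber`.
    Hypothetical sketch, not the paper's ST-MRAM simulation."""
    rng = random.Random(seed)
    levels = 2 ** bits - 1
    noisy = []
    for w in weights:
        # Quantize a weight in [-1, 1] to an unsigned fixed-point code.
        code = int((w + 1.0) / 2.0 * levels)
        # Each bit of the stored code may flip with probability `ber`.
        for b in range(bits):
            if rng.random() < ber:
                code ^= 1 << b
        # Read back: convert the (possibly corrupted) code to a weight.
        noisy.append(code / levels * 2.0 - 1.0)
    return noisy

weights = [0.5, -0.25, 0.0, 0.9]
# ber=0 reproduces the weights up to quantization error;
# raising ber trades accuracy for lower programming energy.
print(store_approximate(weights, ber=0.0))
print(store_approximate(weights, ber=0.01))
```

Training with such a model lets one measure how recognition accuracy degrades as the bit error rate rises, which is the trade-off behind the 74% energy saving quoted above.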

URL

https://arxiv.org/abs/1810.10836

PDF

https://arxiv.org/pdf/1810.10836
