Abstract
A proof-of-concept hardware neural network for analog built-in self-test is presented. The network can be reconfigured into any one-hidden-layer topology within the limits imposed by the available numbers of inputs and neurons. Analog operation of the synapses and neurons, combined with digital weight storage, allows fast computation, low power consumption, and short training cycles. The network is trained chip-in-the-loop with a simulated-annealing-based parallel weight perturbation algorithm. Its effectiveness in learning to separate nominal from faulty circuits is investigated in two case studies: a Butterworth filter and an operational amplifier. The results are compared with those of software neural networks of equivalent topology, and limitations concerning practical applicability are discussed.
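To make the training approach named in the abstract concrete, the following is a minimal software sketch of parallel weight perturbation with simulated-annealing acceptance for a one-hidden-layer network. It is an illustration only, not the chip-in-the-loop implementation described in the paper: all names (`train_pwp`, `mlp_forward`), the step size, the cooling schedule, and the toy "nominal vs. faulty" data are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, w1, w2):
    # one-hidden-layer network with tanh hidden activations
    return np.tanh(x @ w1) @ w2

def loss(w1, w2, X, y):
    # mean squared error between network output and +/-1 targets
    p = mlp_forward(X, w1, w2)
    return float(np.mean((p - y) ** 2))

def train_pwp(X, y, n_hidden=4, steps=2000, step_size=0.05,
              T0=1.0, cooling=0.999):
    # Parallel weight perturbation with simulated-annealing acceptance:
    # every weight is perturbed simultaneously by a random +/- step,
    # the whole candidate is kept if the loss improves, and kept with
    # probability exp(-dL / T) otherwise; T decays geometrically.
    w1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden))
    w2 = rng.normal(0.0, 0.5, (n_hidden, 1))
    cur = loss(w1, w2, X, y)
    best = cur
    T = T0
    for _ in range(steps):
        d1 = step_size * rng.choice([-1.0, 1.0], w1.shape)
        d2 = step_size * rng.choice([-1.0, 1.0], w2.shape)
        cand = loss(w1 + d1, w2 + d2, X, y)
        dL = cand - cur
        if dL < 0 or rng.random() < np.exp(-dL / T):
            w1, w2, cur = w1 + d1, w2 + d2, cand
            best = min(best, cur)
        T *= cooling
    return w1, w2, best

# toy 2-D classification: separate "nominal" (inside unit circle)
# from "faulty" (outside) points, labeled +1 / -1
X = rng.normal(0.0, 1.0, (80, 2))
y = (np.linalg.norm(X, axis=1) < 1.0).astype(float)[:, None] * 2.0 - 1.0
w1, w2, final = train_pwp(X, y)
```

In the hardware setting, only the loss evaluation changes: instead of a software forward pass, candidate weights are loaded into the chip's digital storage and the error is measured from the analog outputs, which is what makes this gradient-free scheme attractive for chip-in-the-loop training.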