SABiNN: Shift-Accumulator Based Binarized Neural Network
A power-efficient hardware model called SABiNN (Shift-Accumulator Based Binarized Neural Network) has been developed that converts all the parameters of the network, such as its weights and biases, into binary values (+1 and -1/0). The SABiNN model is inspired by the Binarized Neural Network (BNN) model developed by Matthieu Courbariaux, Yoshua Bengio, and their team. SABiNN significantly reduces the power consumption of the model and decreases the percentage of resource utilization when embedded onto re-programmable hardware. On-chip memories, such as SRAMs and DRAMs, are not used during inference, since every weight and bias of the network is a fixed binary value.
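As a rough sketch of the idea (illustrative only, not the published hardware design; the function names `binarize` and `binarized_dense` are made up for this example), a dense layer with weights restricted to {+1, -1} needs no multipliers: each multiply-accumulate collapses into an add or a subtract.

```python
import numpy as np

def binarize(w):
    # Map real-valued weights to {+1, -1} by sign.
    # (Zero maps to +1 here; the choice for 0 is a design decision.)
    return np.where(w >= 0, 1, -1).astype(np.int8)

def binarized_dense(x, w_bin, b):
    # With weights in {+1, -1}, each multiply-accumulate reduces to
    # an add or a subtract, so no hardware multipliers are required.
    acc = np.zeros(w_bin.shape[1])
    for j in range(w_bin.shape[1]):
        for i in range(x.shape[0]):
            acc[j] += x[i] if w_bin[i, j] == 1 else -x[i]
    return acc + b
```

The accumulate-only inner loop is what an accumulator-based datapath implements directly; since the binarized weights are fixed constants, they can be baked into the logic instead of being fetched from SRAM/DRAM.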
More information about SABiNN can be found in: github
DeepSAC: Shift Accumulator Based Deep Neural Network
In the DeepSAC method, a targeted NN model is first trained and all of its weights are extracted. Then, compression techniques such as n-bit quantization and pruning are applied to reduce the size of the selected NN model. After compression, the next task is to convert the parameters into signed powers of two so that multipliers can be replaced with shifters. After a successful conversion, the weights are plugged back into the original NN model, and a forward pass is executed to check that accuracy is maintained.
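The power-of-two conversion step can be sketched as follows (an illustrative approximation, not the exact DeepSAC procedure; `quantize_pow2` and `shift_mul` are hypothetical names): each weight magnitude is rounded to the nearest power of two, after which multiplying an integer activation by a weight becomes a bit shift plus an optional sign flip.

```python
import numpy as np

def quantize_pow2(w, eps=1e-8):
    # Round each weight magnitude to the nearest power of two, keeping
    # the sign, so w ~= sign * 2**exp for an integer exponent.
    sign = np.sign(w)
    exp = np.round(np.log2(np.abs(w) + eps)).astype(int)
    return sign * (2.0 ** exp), exp, sign

def shift_mul(x_int, exp, sign):
    # Multiplication by +/- 2**exp on an integer activation is just a
    # left or right bit shift followed by an optional negation.
    shifted = x_int << exp if exp >= 0 else x_int >> -exp
    return shifted if sign >= 0 else -shifted
```

Re-running the forward pass with the `quantize_pow2` weights (in place of the originals) is how the accuracy check in the last step would be performed.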
More information about DeepSAC can be found in: github