Understanding Neural Network Binarization with Forward and Backward Proximal Quantizers
NeurIPS 2023
Abstract
In neural network binarization, BinaryConnect (BC) and its variants are
considered the standard. These methods apply the sign function in their forward
pass and their respective gradients are backpropagated to update the weights.
However, the derivative of the sign function is zero wherever it is defined, which
consequently freezes training. Therefore, implementations of BC (e.g., BNN)
usually replace the derivative of sign in the backward computation with
identity or other approximate gradient alternatives. Although this practice
works well empirically, it is largely a heuristic or "training trick." We aim
to shed some light on these training tricks from the optimization
perspective. Building on existing theory for ProxConnect (PC, a generalization
of BC), we (1) equip PC with different forward-backward quantizers and obtain
ProxConnect++ (PC++) that includes existing binarization techniques as special
cases; (2) derive a principled way to synthesize forward-backward quantizers
with automatic theoretical guarantees; (3) illustrate our theory by proposing
an enhanced binarization algorithm BNN++; (4) conduct image classification
experiments on CNNs and vision transformers, and empirically verify that BNN++
generally achieves competitive results on binarizing these models.
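
For concreteness, the forward-backward pattern described in the abstract (the sign function in the forward pass, with the identity substituted for its true derivative in the backward pass, i.e., the straight-through estimator commonly used in BNN implementations) can be sketched as follows. This is an illustrative PyTorch-style sketch under that assumption, not the authors' released code; the name SignSTE is hypothetical.

    import torch

    class SignSTE(torch.autograd.Function):
        # Forward quantizer: binarize the weights with the sign function.
        @staticmethod
        def forward(ctx, w):
            return torch.sign(w)

        # Backward surrogate: pass the incoming gradient through unchanged
        # (straight-through), since the true derivative of sign is zero
        # wherever it is defined and would freeze training.
        @staticmethod
        def backward(ctx, grad_output):
            return grad_output

    # Usage: binary_w = SignSTE.apply(real_valued_w)

PC++ generalizes exactly this split by allowing other choices of forward and backward quantizers in place of sign and the identity.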