DE^3-BERT: Distance-Enhanced Early Exiting for BERT based on Prototypical Networks
CoRR (2024)
Abstract
Early exiting has demonstrated its effectiveness in accelerating the
inference of pre-trained language models like BERT by dynamically adjusting the
number of layers executed. However, most existing early exiting methods only
consider local information from an individual test sample to determine their
exiting indicators, failing to leverage the global information offered by
the sample population. This leads to suboptimal estimation of prediction
correctness, resulting in erroneous exiting decisions. To bridge the gap, we
explore the necessity of effectively combining both local and global
information to ensure reliable early exiting during inference. Purposefully, we
leverage prototypical networks to learn class prototypes and devise a distance
metric between samples and class prototypes. This enables us to utilize global
information for estimating the correctness of early predictions. On this basis,
we propose a novel Distance-Enhanced Early Exiting framework for BERT
(DE^3-BERT). DE^3-BERT implements a hybrid exiting strategy that
supplements classic entropy-based local information with distance-based global
information to enhance the estimation of prediction correctness for more
reliable early exiting decisions. Extensive experiments on the GLUE benchmark
demonstrate that DE^3-BERT consistently outperforms state-of-the-art models
under different speed-up ratios with minimal storage or computational overhead,
yielding a better trade-off between model performance and inference efficiency.
Additionally, an in-depth analysis further validates the generality and
interpretability of our method.
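The hybrid exiting strategy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the entropy term and the distance-to-prototype term are taken from the abstract, but the Euclidean distance metric, the min-over-prototypes reduction, the weighted-sum fusion with weight `alpha`, and the threshold value are all assumptions made for the sketch.

```python
import numpy as np

def entropy(probs):
    # Local signal: Shannon entropy of the layer's softmax prediction.
    probs = np.clip(probs, 1e-12, 1.0)
    return float(-np.sum(probs * np.log(probs)))

def prototype_distance(h, prototypes):
    # Global signal: distance from the sample's hidden state h to the
    # nearest learned class prototype. Euclidean distance and the
    # min-over-classes reduction are assumptions of this sketch.
    dists = np.linalg.norm(prototypes - h, axis=1)
    return float(np.min(dists))

def hybrid_exit_score(probs, h, prototypes, alpha=0.5):
    # Hypothetical fusion rule: weighted sum of normalized entropy and
    # prototype distance. The paper's exact combination is not given
    # in the abstract. Lower score = more trustworthy early prediction.
    norm_entropy = entropy(probs) / np.log(len(probs))
    return alpha * norm_entropy + (1 - alpha) * prototype_distance(h, prototypes)

def should_exit(probs, h, prototypes, threshold=0.3, alpha=0.5):
    # Exit at the current layer when the hybrid score falls below a
    # (hypothetical) threshold chosen for the target speed-up ratio.
    return hybrid_exit_score(probs, h, prototypes, alpha) < threshold
```

A confident prediction whose hidden state lies near a class prototype yields a low score and triggers an early exit; an uncertain prediction far from all prototypes continues to deeper layers.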