Self Supervised Bert for Legal Text Classification

ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)(2023)

Abstract
Critical BERT-based text classification tasks, such as legal text classification, require large amounts of accurately labeled data. Legal text classification faces two non-trivial problems: labeling legal data is a sensitive process that can only be carried out by skilled professionals, and legal text is prone to privacy issues, so not all of the data can be made available in the public domain. This limits the diversity of the available textual data, and to account for this data paucity we propose a self-supervision approach for training Legal-BERT classifiers. We exploit the BERT text classifier’s knowledge of the class boundaries and perform gradient ascent with respect to the class logits, generating synthetic latent texts through activation maximization. The main advantages of our model over existing state-of-the-art methods are that it is easy to train; it does not require much data, instead using the synthesized data as fake samples; and it has lower variance, which helps it generate texts with good sample quality and diversity. We show the efficacy of the proposed method on the ECHR Violation (Multi-Label) Dataset and the Over-ruling Task Dataset.
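The abstract's core mechanism can be sketched as follows. This is a minimal, illustrative toy, not the authors' code: a small linear classification head stands in for the trained Legal-BERT classifier, and gradient ascent on a chosen class logit (activation maximization) pushes a latent vector toward that class region, producing a synthetic latent sample. All names and the toy setup below are assumptions for illustration.

```python
# Toy sketch (assumption: illustrative stand-in, not the paper's method)
# of activation maximization: gradient ascent on a class logit to
# synthesize a latent sample in that class's region.
import numpy as np

rng = np.random.default_rng(0)
num_classes, latent_dim = 3, 8

# Frozen weights of a stand-in classification head; these play the role
# of the trained classifier's knowledge of the class boundaries.
W = rng.normal(size=(num_classes, latent_dim))
b = rng.normal(size=num_classes)

def logits(z):
    """Class logits of a latent vector z under the linear head."""
    return W @ z + b

def synthesize_latent(target_class, z_init, steps=200, lr=0.1):
    """Ascend the target-class logit to produce a synthetic latent sample.

    For this linear head, d(logit_target)/dz is simply W[target_class];
    with a real BERT classifier the gradient would come from
    backpropagation through the network instead.
    """
    z = z_init.copy()
    for _ in range(steps):
        z += lr * W[target_class]  # gradient-ascent step on the logit
    return z

z_init = rng.normal(size=latent_dim)      # random start in latent space
z_fake = synthesize_latent(target_class=1, z_init=z_init)
before, after = logits(z_init)[1], logits(z_fake)[1]  # target logit rises
```

In the paper's framing, such synthesized latents would serve as the fake samples used to train the Legal-BERT classifier in a self-supervised fashion; the sketch only shows the ascent step itself.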
Keywords
Text classification,Self-supervision,BERT,Legal Text