Focal Training and Tagger Decouple for Grammatical Error Correction

ACL (2023)

Abstract
In this paper, we investigate how to improve tagging-based Grammatical Error Correction (GEC) models. We address two issues with current tagging-based approaches: the label imbalance issue and the tagging entanglement issue. We propose to down-weight the loss of well-classified labels using Focal Loss, and to decouple the error detection layer from the label tagging layer through an extra self-attention-based matching module. Experiments on three recent Chinese Grammatical Error Correction datasets show that our proposed methods are effective. We further analyze hyper-parameter choices for Focal Loss and for inference tweaking.
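The abstract's core idea of down-weighting well-classified labels can be illustrated with the standard Focal Loss formulation, FL(p_t) = -(1 - p_t)^γ · log(p_t). The sketch below is a minimal, generic implementation of that formula, not the authors' code; the function name and the γ value are illustrative assumptions.

```python
import math

def focal_loss(p_true, gamma=2.0):
    """Focal Loss for one prediction (illustrative sketch).

    p_true : the model's predicted probability for the correct label.
    gamma  : focusing parameter; gamma = 0 recovers plain cross-entropy,
             larger gamma shrinks the loss of well-classified labels,
             shifting training weight toward hard (often rare) labels.
    """
    return -((1.0 - p_true) ** gamma) * math.log(p_true)

# A well-classified label contributes almost nothing to the loss,
# while a poorly classified one keeps a substantial loss.
easy_loss = focal_loss(0.95)  # near zero: (1 - 0.95)^2 scales it down
hard_loss = focal_loss(0.30)  # remains large
```

This down-weighting is why Focal Loss is a common remedy for label imbalance: in tagging-based GEC, the overwhelming majority of tokens carry the "keep" label, and without re-weighting they dominate the gradient.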