NeighBERT: Medical Entity Linking Using Relation-Induced Dense Retrieval

Journal of Healthcare Informatics Research (2024)

Abstract
Medical entity linking (MEL), a common task in clinical natural language processing, involves detecting mentions in text and linking each mention to an entity in a knowledge base. MEL remains unsolved in part because of ambiguity in language: the same text can be resolved to several named entities, a problem that is exacerbated in the text found in electronic health records. Recent work has shown that transformer-based deep learning models outperform previous linking methods. We introduce NeighBERT, a custom pre-training technique that extends BERT (Devlin et al. [1]) by encoding how entities are related within a knowledge graph. This relational context, which the original BERT lacks, helps resolve the ambiguity found in clinical text. In our experiments, NeighBERT improves on the state-of-the-art precision, recall, and F1-score by 1–3 points for named entity recognition and 10–15 points for MEL on two widely used clinical datasets.
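The abstract does not include code, but the idea of relation-induced dense retrieval can be sketched in a few lines: represent each knowledge-base entity by its name together with its relational neighborhood in the graph, embed both mentions and entities into a shared vector space, and link a mention to its nearest entity. The sketch below is a minimal illustration under stated assumptions: the tiny knowledge graph is invented, and a deterministic hashed bag-of-words embedder stands in for the BERT-style encoder the paper actually uses.

```python
import hashlib
import numpy as np

def embed(text, dim=64):
    # Placeholder for a BERT-style encoder: a deterministic
    # hashed bag-of-words vector, L2-normalized.
    v = np.zeros(dim)
    for tok in text.lower().split():
        bucket = int(hashlib.md5(tok.encode()).hexdigest(), 16) % dim
        v[bucket] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

# Toy knowledge graph (invented): entity -> (relation, neighbor) edges.
kb = {
    "Myocardial infarction": [("is_a", "heart disease"), ("finding_site", "myocardium")],
    "Cerebral infarction":   [("is_a", "stroke"), ("finding_site", "brain")],
}

# Index each entity by its name plus its relational neighborhood,
# loosely mimicking relation-aware entity representations.
entity_vecs = {
    name: embed(name + " " + " ".join(f"{r} {n}" for r, n in edges))
    for name, edges in kb.items()
}

def link(mention):
    """Return the KB entity whose relation-enriched vector is closest."""
    m = embed(mention)
    return max(entity_vecs, key=lambda e: float(m @ entity_vecs[e]))

print(link("infarction of the brain"))
```

The relational edges matter here: the surface mention "infarction of the brain" shares only one token with either entity name, but the `finding_site brain` edge pulls it toward the correct entity. NeighBERT bakes this neighborhood signal into pre-training rather than into the index text, but the retrieval step is analogous.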
Keywords
Natural language processing, Knowledge graph, Information search and retrieval, Deep learning, Medical entity linking, Biomedical