BERT model fine-tuning for text classification in knee OA radiology reports

OSTEOARTHRITIS AND CARTILAGE (2020)

Abstract
Purpose: Traditional Natural Language Processing (NLP) techniques capture relationships between adjacent or nearby words well. However, clinical text data such as patient notes or radiology reports require capturing interactions between distant words. BERT (Bidirectional Encoder Representations from Transformers), a deep neural network architecture recently introduced by Google, overcomes this challenge through its ability to model long-range word interactions. While BERT has dramatically improved performance on general-domain NLP tasks such as optimizing search results, its performance on domain-specific tasks such as analyzing biomedical data is still being explored.
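The abstract describes fine-tuning BERT for text classification of radiology reports. The sketch below illustrates one common way such fine-tuning is done, assuming a Hugging Face `transformers` setup with `bert-base-uncased`, binary labels, and toy report snippets; the paper's actual model variant, label scheme, data, and hyperparameters are not given in the abstract, so all of those choices here are illustrative assumptions.

```python
# Minimal sketch of BERT fine-tuning for report classification.
# Assumptions (not from the paper): bert-base-uncased, binary labels,
# toy example texts, and typical hyperparameters (lr=2e-5, 3 epochs).
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertTokenizerFast, BertForSequenceClassification

# Illustrative report snippets and labels (0 = no OA finding, 1 = OA finding).
texts = [
    "Mild joint space narrowing of the medial compartment.",
    "No acute osseous abnormality. Joint spaces are preserved.",
    "Severe tricompartmental osteoarthritis with large osteophytes.",
    "Unremarkable knee radiograph.",
]
labels = torch.tensor([1, 0, 1, 0])

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Tokenize the reports into fixed-length input IDs and attention masks.
enc = tokenizer(texts, padding=True, truncation=True, max_length=128,
                return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"], labels)
loader = DataLoader(dataset, batch_size=2, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

# Standard fine-tuning loop: the classification head and the pretrained
# encoder are updated jointly via the cross-entropy loss on the labels.
model.train()
for epoch in range(3):
    for input_ids, attention_mask, y in loader:
        optimizer.zero_grad()
        out = model(input_ids=input_ids.to(device),
                    attention_mask=attention_mask.to(device),
                    labels=y.to(device))
        out.loss.backward()
        optimizer.step()

# Inference on a new report.
model.eval()
with torch.no_grad():
    new_enc = tokenizer(["Moderate osteophyte formation is noted."],
                        return_tensors="pt").to(device)
    pred = model(**new_enc).logits.argmax(dim=-1)
```

In practice, fine-tuning a domain-specific variant pretrained on biomedical or clinical text (e.g., BioBERT or ClinicalBERT) is a common alternative starting point when working with radiology reports; whether the paper uses such a variant is not stated in the abstract.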