Machine learning to increase the efficiency of a literature surveillance system: a performance evaluation

crossref (2023)

### Abstract
Background: Given the suboptimal performance of Boolean searching for identifying methodologically sound and clinically relevant studies in large bibliographic databases such as MEDLINE, exploring the performance of machine learning (ML) tools is warranted.

Objective: Using a large, internationally recognized dataset of articles tagged for methodological rigor, we trained and tested binary classification models to predict the probability of clinical research articles being of high methodologic quality, to support a literature surveillance program.

Materials and Methods: Using an automated machine learning approach, over 12,000 models were trained on a dataset of 97,805 articles indexed in PubMed from 2012 to 2018, which were manually appraised for rigor by highly trained research associates with expertise in research methods and critical appraisal. Because the dataset is unbalanced, with more articles that do not meet criteria for rigor, we used the unbalanced dataset as well as over- and under-sampled datasets. Models that maintained sensitivity for high rigor at 99% while maximizing specificity were selected, tested on a retrospective set of 30,424 articles from 2020, and validated prospectively in a blinded study of 5253 articles.

Results: The final selected algorithm, combining a model trained on each dataset, maintained high sensitivity and achieved 57% specificity in the retrospective validation test and 53% in the prospective study. The number of articles that needed to be read to find one that met appraisal criteria was 3.68 (95% CI 3.52 to 3.85) in the prospective study, compared with 4.63 (95% CI 4.50 to 4.77) when relying on Boolean searching alone.

Conclusions: ML models improved the efficiency of detecting high-quality clinical research publications by approximately 25% for literature surveillance and subsequent dissemination to clinicians and other evidence users.
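The selection criterion described above — fix sensitivity for high-rigor articles at 99% and maximize specificity — together with the "number needed to read" (NNR, the reciprocal of precision) can be sketched in plain Python. This is an illustrative reconstruction, not the authors' pipeline; the function names, toy data, and the exact tie-breaking of the threshold are assumptions.

```python
def pick_threshold(probs, labels, target_sensitivity=0.99):
    """Return the highest score threshold that still classifies at least
    target_sensitivity of the positive (high-rigor) articles as positive."""
    positives = sorted(p for p, y in zip(probs, labels) if y == 1)
    # We may miss at most this many positives and still meet the target.
    allowed_misses = int(len(positives) * (1 - target_sensitivity))
    # Scores >= threshold are classified positive, so the first
    # `allowed_misses` (lowest-scoring) positives fall below it.
    return positives[allowed_misses]

def evaluate(probs, labels, threshold):
    """Compute sensitivity, specificity, and number needed to read (NNR)."""
    tp = sum(1 for p, y in zip(probs, labels) if p >= threshold and y == 1)
    fp = sum(1 for p, y in zip(probs, labels) if p >= threshold and y == 0)
    tn = sum(1 for p, y in zip(probs, labels) if p < threshold and y == 0)
    fn = sum(1 for p, y in zip(probs, labels) if p < threshold and y == 1)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    nnr = (tp + fp) / tp  # articles flagged per high-rigor article found
    return sensitivity, specificity, nnr
```

On a held-out set, a lower NNR at fixed sensitivity is exactly the efficiency gain the study reports: readers screen fewer flagged articles to find each one that passes appraisal.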
### Competing Interest Statement

McMaster University, a not-for-profit institution, has contracts, managed by the Health Information Research Unit (supervised by AI, RBH, and LL), with several professional and commercial publishers to supply newly published studies and systematic reviews that are critically appraised for research methods and assessed for clinical relevance through the McMaster Premium Literature Service (McMaster PLUS). TN, RP, CC, and CL are partly paid through these contracts, and RBH receives remuneration for supervisory time and royalties. WA, EA, GF, LC, and MA are not affiliated with McMaster PLUS.

### Funding Statement

EB and WA were supported by Mitacs through the Mitacs Accelerate program. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

### Author Declarations

- I confirm all relevant ethical guidelines have been followed, and any necessary IRB and/or ethics committee approvals have been obtained. — Not Applicable. Details of the IRB/oversight body that provided approval or exemption: ethics approval is not required for this study as no patients are involved.
- I confirm that all necessary patient/participant consent has been obtained and the appropriate institutional forms have been archived, and that any patient/participant/sample identifiers included were not known to anyone (e.g., hospital staff, patients, or participants themselves) outside the research group, so cannot be used to identify individuals. — Not Applicable
- I understand that all clinical trials and any other prospective interventional studies must be registered with an ICMJE-approved registry, such as ClinicalTrials.gov. I confirm that any such study reported in the manuscript has been registered and the trial registration ID is provided (note: if posting a prospective study registered retrospectively, please provide a statement in the trial ID field explaining why the study was not registered in advance). — Not Applicable
- I have followed all appropriate research reporting guidelines, such as any relevant EQUATOR Network research reporting checklist(s) and other pertinent material, if applicable. — Not Applicable

### Data Availability

Data sharing is possible via collaboration agreements between the authors and those requesting access. The HEDGES article database is proprietary, owned by Dr Haynes via McMaster University, and can be made available through academic collaboration agreements or commercial contracts.
### Keywords

literature surveillance system, performance evaluation, system performance