PD24 Robust Real-World Evidence Generation In Comparative Effects Studies – NICE’s Methods Guidance

Stephen Duffield, Seamus Kent, Manuj Sharma, Lynne Kincaid, Vandana Ayyar-Gupta, Shaun Rowark, Pall Jonsson

International Journal of Technology Assessment in Health Care (2022)

Abstract
Introduction: Recent reviews have shown that many real-world evidence (RWE) studies suffer from avoidable methodological flaws. Meanwhile, the National Institute for Health and Care Excellence (NICE) is seeing an increase in RWE submissions in Health Technology Appraisals and is keen to support the use of this evidence. However, limited guidance exists for the development and assessment of RWE, risking both missed opportunities for unbiased evidence generation and inconsistent decision making based on that evidence. As part of its RWE framework, NICE has developed methods guidance to provide clear expectations for the conduct and reporting of non-randomized comparative effects studies using real-world data.

Methods: A conceptual model and draft framework were developed based on established international best practices in RWE and observational research. These were refined with focused literature searches, for example, on the use of external control arm studies. We then engaged with external stakeholders to incorporate their feedback and develop case studies. A reporting template was developed and tested on multiple use cases.

Results & Conclusions: The guidance stresses the central importance of a target trial approach to study design, e.g., adopting an active-comparator, new-user design where possible. Target trial emulation is a useful tool to improve the quality and transparency of RWE studies, helping to overcome selection and confounding biases. Various other study design and analytical approaches are outlined for addressing confounding bias and biases due to missing data, measurement error, or misclassification, which are common challenges in RWE. Alongside traditional approaches to sensitivity analysis, the framework promotes quantitative bias analysis, which comprises a range of methods to assess and communicate the potential impact of residual bias on study findings by quantifying the direction, magnitude, and uncertainty of that bias.
A reporting template, based on common methodological pitfalls, is provided to help evidence developers consider key areas of bias in their work and to inform reviewers of any approaches used to investigate or resolve these.
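To make the idea of quantitative bias analysis concrete: one widely used method is the E-value, which expresses how strongly an unmeasured confounder would need to be associated with both treatment and outcome (on the risk-ratio scale) to fully explain away an observed effect. The abstract does not prescribe a specific method, so this is only an illustrative sketch of one standard technique, not NICE's recommended implementation.

```python
import math

def e_value(rr: float) -> float:
    """E-value for an observed risk ratio (VanderWeele & Ding).

    Returns the minimum strength of association, on the risk-ratio
    scale, that an unmeasured confounder would need with both the
    treatment and the outcome to fully explain away the observed
    effect. Formula: RR + sqrt(RR * (RR - 1)); protective effects
    (RR < 1) are inverted before applying the formula.
    """
    if rr <= 0:
        raise ValueError("risk ratio must be positive")
    if rr < 1:
        rr = 1.0 / rr  # invert protective effects first
    return rr + math.sqrt(rr * (rr - 1.0))

# An observed risk ratio of 2.0 could only be fully explained away by
# an unmeasured confounder associated with both treatment and outcome
# by a risk ratio of at least ~3.41 each.
print(round(e_value(2.0), 2))  # 3.41
```

Reporting such a value alongside the main estimate communicates, in a single number, how robust a non-randomized finding is to residual confounding.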
Keywords
comparative effects studies, evidence, methods guidance, real-world