Basic Information
Views: 84
Career Trajectory
Biography
He is interested in how statistical tools can be used to better understand psychological and educational outcomes—e.g., what is this child’s reading ability?—that are challenging to measure and yet ubiquitous in education as well as the social and biomedical sciences more generally. The following are examples of the kinds of questions that he gets excited about studying (with the essential help of many collaborators):
--As response time is collected in a broader range of psychometric studies, how should it be used? Response time is a potentially valuable piece of information about the response process, but it is not always clear how it should be interpreted. One issue we have focused on is whether the speed-accuracy tradeoff—the observation that people make more error-prone responses when hurried—arises absent priming in observational data (“Speed-accuracy tradeoff? Not so fast: Marginal changes in speed have inconsistent relationships with accuracy in real-world settings”). A related question is whether response time can potentially be used to study group differences on low-stakes examinations (“Differences in time usage as a competing hypothesis for observed group differences in accuracy with an application to observed gender differences in PISA data”).
--How can we better index prediction quality from models of binary outcomes? Our work leverages the fact that prediction quality can be translated into statements about weighted coins, and introduces an index whose interpretation is consistent across a range of outcomes. Alongside our development of this idea in a generic setting (“InterModel Vigorish (IMV): A novel approach for quantifying predictive accuracy when outcomes are binary”), we have also shown how it can be used in both item response theory (“The InterModel Vigorish as a lens for understanding (and quantifying) the value of item response models for dichotomously coded items”) and structural equation models (“The InterModel Vigorish for Model Comparison in Confirmatory Factor Analysis with Binary Outcomes”).
--Can we use psychometric approaches to further our understanding of what we are learning from large experiments in education? Conventional approaches to understanding treatment effects frequently focus on outcomes that are composites of individual items. We ask whether we can identify and observe item-level variation in treatment sensitivity (“Heterogeneity of item-treatment interactions masks complexity and generalizability in randomized controlled trials”). Such variation may offer useful information about the nature of the intervention and the skills it is affecting.
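The weighted-coin idea behind the IMV can be sketched numerically. The following is a minimal illustration, not the authors' reference implementation: the function names and the bisection routine are our own, and we assume the definition as we understand it, in which a model's mean per-observation log-likelihood is mapped to the accuracy w of an equivalent weighted coin, with the IMV given by (w1 - w0)/w0 for an enhanced model relative to a baseline.

```python
import math

def coin_equivalent(mean_loglik, tol=1e-10):
    """Find the weighted-coin probability w in [0.5, 1) whose per-toss
    log-likelihood, w*log(w) + (1-w)*log(1-w), equals mean_loglik.
    Uses bisection; the left-hand side is increasing on [0.5, 1)."""
    lo, hi = 0.5, 1.0 - 1e-12
    for _ in range(200):
        mid = (lo + hi) / 2
        val = mid * math.log(mid) + (1 - mid) * math.log(1 - mid)
        if val < mean_loglik:
            lo = mid  # coin not accurate enough; move right
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

def imv(y, p_baseline, p_enhanced):
    """InterModel Vigorish: relative gain in coin-equivalent accuracy
    when moving from a baseline to an enhanced model of binary outcomes.
    y is a list of 0/1 outcomes; the p_* are predicted probabilities."""
    def mean_ll(p):
        return sum(yi * math.log(pi) + (1 - yi) * math.log(1 - pi)
                   for yi, pi in zip(y, p)) / len(y)
    w0 = coin_equivalent(mean_ll(p_baseline))
    w1 = coin_equivalent(mean_ll(p_enhanced))
    return (w1 - w0) / w0
```

For instance, a chance-level baseline (p = 0.5 everywhere) maps to w0 = 0.5, so the IMV then reads as the proportional improvement in equivalent coin accuracy delivered by the enhanced model.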
Research Interests
Publications: 162
Julian M Siebert, Phaedra Bell, Nuria Gutierrez, Mónica Zegers, Eric Falke, Benjamin Domingue, Yaacov Petscher, Hugh William Catts, Lucy Yan, Lillian Durán, Gorno Tempini Maria Luisa
Crossref (2024)
Journal of Educational and Behavioral Statistics (2024)
Jennifer Guevara, Carlos Sánchez, Jessica Organista-Montaño, Benjamin W. Domingue, Nan Guo, Pervez Sultan
BJA Open (2024): 100269
Journal of Educational and Behavioral Statistics (2023)
Educational Measurement: Issues and Practice, no. 3 (2023): 50-64
Ellen Y Wang, Kristin M Kennedy, Lijin Zhang, Daniel Qian, Ty Forbes, Michelle Zuniga-Hernandez, Brian S-K Li, Benjamin Domingue, Thomas J Caruso
JAMIA Open, no. 3 (2023): ooad076
Social Science Research Network (2023)
The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences, no. 9 (2023): 1466-1473
Journal of Educational and Behavioral Statistics (2023)
Data Disclaimer
The data on this page come from publicly available internet sources, partner publishers, and automated AI-based analysis. We make no commitment or guarantee regarding the validity, accuracy, correctness, reliability, completeness, or timeliness of the data on this page. If you have questions, you can contact us by email: report@aminer.cn