Department of Statistics

The role of measurement in the replicability of empirical findings

Funded by the German Research Foundation (DFG)

The first part of the project found that reporting on item-based measures is often nontransparent and incomplete. It also investigated how different modifications of measures, such as dropping items, affect replicability and the heterogeneity of effect sizes.
The current second part of the project has two objectives: 1) increasing transparency and improving reporting practices on measurement in published research, and 2) providing hands-on advice to replicators and meta-scientists who have to deal with heterogeneity in measurement across studies.

To achieve the first objective, we will develop and evaluate the Measures Checklist, a checklist that requires authors to report which measure they used, whether they modified it, and if so, how. To aid completion of the checklist, we will develop the Measures Shiny app, which will fill out the checklist automatically from the manuscript’s text. Authors will then check and revise the responses and submit them together with their manuscript.

The second objective will be achieved in two parts. A) We will develop a taxonomy of modifications and their differing impact on replication success and on the heterogeneity of effect sizes, along with the Modifications Shiny App, which will allow researchers to evaluate the potential impact of a specific modification. B) We will investigate how severely measurement invariance (MI) can be violated, due to differences in other study characteristics either between the original study and a replication study or between different replication studies, without affecting replicability. We will address this using three complementary approaches: an analytical investigation that quantifies the bias in effects when MI is violated, a simulation study that extends this to factor score estimation, and an empirical analysis of existing data sets. This research will inform the development of a tool that allows researchers to judge the impact of MI violations on replication success.
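The intuition behind the analytical and simulation work can be illustrated with a toy simulation. This is a minimal sketch, not the project's actual analysis: the item model, parameter values, and variable names are all our own assumptions. It shows how a scalar-invariance violation (one item intercept shifting between groups in the replication only) moves the sum-score effect size away from the original study's, even though the latent effect is identical in both studies.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # large n so sampling noise is negligible

def sum_scores(eta, loadings, intercepts, rng):
    """Generate item responses from a one-factor model and sum them."""
    noise = rng.normal(size=(len(eta), len(loadings)))
    items = intercepts + np.outer(eta, loadings) + noise
    return items.sum(axis=1)

def cohens_d(a, b):
    """Standardized mean difference with a pooled SD."""
    pooled = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (b.mean() - a.mean()) / pooled

true_d = 0.5                                  # latent effect, same in both studies
loadings = np.array([0.8, 0.8, 0.8, 0.8])     # assumed four-item measure
intercepts = np.zeros(4)

# Original study: full measurement invariance across groups
ctrl = sum_scores(rng.normal(0, 1, n), loadings, intercepts, rng)
treat = sum_scores(rng.normal(true_d, 1, n), loadings, intercepts, rng)
d_original = cohens_d(ctrl, treat)

# Replication: scalar invariance violated -- one item's intercept
# shifts by 0.5 in the treatment group only
shifted = intercepts.copy()
shifted[0] += 0.5
ctrl_r = sum_scores(rng.normal(0, 1, n), loadings, intercepts, rng)
treat_r = sum_scores(rng.normal(true_d, 1, n), loadings, shifted, rng)
d_replication = cohens_d(ctrl_r, treat_r)

print(round(d_original, 2), round(d_replication, 2))  # ~0.42 vs ~0.56
```

Under these assumed parameters the intercept shift inflates the replication's observed effect by roughly a third, which is exactly the kind of bias the planned analytical investigation would quantify in general form.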

PIs: Susanne Frick and Eunike Wetzel (RPTU University Kaiserslautern-Landau)
