Influence of reported study design characteristics on intervention effect estimates from randomized, controlled trials

Jelena Savović, Hayley E Jones, Douglas G Altman, Ross J Harris, Peter Jüni, Julie Pildal, Bodil Als-Nielsen, Ethan M Balk, Christian Gluud, Lise Lotte Gluud, John P A Ioannidis, Kenneth F Schulz, Rebecca Beynon, Nicky J Welton, Lesley Wood, David Moher, Jonathan J Deeks, Jonathan A C Sterne

    Research output: Contribution to journal › Article › peer-review


    Published evidence suggests that aspects of trial design lead to biased intervention effect estimates, but findings from different studies are inconsistent. This study combined data from 7 meta-epidemiologic studies and removed overlaps to derive a final data set of 234 unique meta-analyses containing 1973 trials. Outcome measures were classified as "mortality," "other objective," or "subjective," and Bayesian hierarchical models were used to estimate associations of trial characteristics with average bias and between-trial heterogeneity. Intervention effect estimates seemed to be exaggerated in trials with inadequate or unclear (vs. adequate) random-sequence generation (ratio of odds ratios, 0.89 [95% credible interval {CrI}, 0.82 to 0.96]) and with inadequate or unclear (vs. adequate) allocation concealment (ratio of odds ratios, 0.93 [CrI, 0.87 to 0.99]). Lack of or unclear double-blinding (vs. double-blinding) was associated with an average 13% exaggeration of intervention effects (ratio of odds ratios, 0.87 [CrI, 0.79 to 0.96]), and between-trial heterogeneity was increased for such studies (SD increase in heterogeneity, 0.14 [CrI, 0.02 to 0.30]). For each characteristic, average bias and increases in between-trial heterogeneity were driven primarily by trials with subjective outcomes, with little evidence of bias in trials with objective and mortality outcomes. This study is limited by incomplete trial reporting, and findings may be confounded by other study design characteristics. Bias associated with study design characteristics may lead to exaggeration of intervention effect estimates and increases in between-trial heterogeneity in trials reporting subjectively assessed outcomes.
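    The ratio of odds ratios (ROR) reported above compares the pooled intervention effect in trials with a given design flaw against the pooled effect in trials without it; on the convention used here, an ROR below 1 means flawed trials report larger (more beneficial-looking) effects. A minimal sketch of this comparison, using hypothetical pooled odds ratios that are not taken from the study:

```python
# Hypothetical pooled odds ratios for illustration only (not from the paper):
# the pooled OR among trials with inadequate/unclear allocation concealment,
# and the pooled OR among trials with adequate concealment.
or_inadequate = 0.62
or_adequate = 0.70

# Ratio of odds ratios: values below 1 indicate that trials with the
# design flaw yield exaggerated (more beneficial-looking) effect estimates.
ror = or_inadequate / or_adequate
print(round(ror, 2))  # prints 0.89
```

    In the study itself, such RORs were estimated jointly across all 234 meta-analyses with Bayesian hierarchical models rather than from a single pair of pooled estimates.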
    Original language: English
    Pages (from-to): 429-38
    Number of pages: 10
    Journal: Annals of Internal Medicine
    Issue number: 6
    Publication status: Published - 18 Sept 2012


    • Randomized Controlled Trials as Topic
    • Odds Ratio
    • Double-Blind Method
    • Humans
    • Bayes Theorem
    • Bias (Epidemiology)
    • Research Design
    • Meta-Analysis as Topic


