Influence of reported study design characteristics on intervention effect estimates from randomized, controlled trials

Savović, Jelena; Jones, Hayley E; Altman, Douglas G; Harris, Ross J; Jüni, Peter; Pildal, Julie; Als-Nielsen, Bodil; Balk, Ethan M; Gluud, Christian; Gluud, Lise Lotte; Ioannidis, John P A; Schulz, Kenneth F; Beynon, Rebecca; Welton, Nicky J; Wood, Lesley; Moher, David; Deeks, Jonathan J; Sterne, Jonathan A C (2012). Influence of reported study design characteristics on intervention effect estimates from randomized, controlled trials. Annals of Internal Medicine, 157(6), pp. 429-438. Philadelphia, PA: American College of Physicians.

Savovic AnnInternMed 2012.pdf - Published Version (PDF, 231 kB). Restricted to registered users only. Available under license: publisher holds copyright.

Published evidence suggests that aspects of trial design lead to biased intervention effect estimates, but findings from different studies are inconsistent. This study combined data from 7 meta-epidemiologic studies and removed overlaps to derive a final data set of 234 unique meta-analyses containing 1973 trials. Outcome measures were classified as "mortality," "other objective," or "subjective," and Bayesian hierarchical models were used to estimate associations of trial characteristics with average bias and between-trial heterogeneity. Intervention effect estimates seemed to be exaggerated in trials with inadequate or unclear (vs. adequate) random-sequence generation (ratio of odds ratios, 0.89 [95% credible interval {CrI}, 0.82 to 0.96]) and with inadequate or unclear (vs. adequate) allocation concealment (ratio of odds ratios, 0.93 [CrI, 0.87 to 0.99]). Lack of or unclear double-blinding (vs. double-blinding) was associated with an average 13% exaggeration of intervention effects (ratio of odds ratios, 0.87 [CrI, 0.79 to 0.96]), and between-trial heterogeneity was increased for such studies (SD increase in heterogeneity, 0.14 [CrI, 0.02 to 0.30]). For each characteristic, average bias and increases in between-trial heterogeneity were driven primarily by trials with subjective outcomes, with little evidence of bias in trials with objective and mortality outcomes. This study is limited by incomplete trial reporting, and findings may be confounded by other study design characteristics. Bias associated with study design characteristics may lead to exaggeration of intervention effect estimates and increases in between-trial heterogeneity in trials reporting subjectively assessed outcomes.
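The "ratio of odds ratios" (RoR) used in the abstract compares the average intervention effect in trials with inadequate or unclear methodology to that in trials with adequate methodology; an RoR below 1 means the less rigorous trials show a more extreme apparent benefit. A minimal sketch of the arithmetic, using entirely hypothetical 2x2 trial counts (not data from the study):

```python
def odds_ratio(a, b, c, d):
    """OR = (a/b) / (c/d) for a 2x2 table:
    a = events in treatment,  b = non-events in treatment,
    c = events in control,    d = non-events in control."""
    return (a / b) / (c / d)

# Hypothetical trial with adequate allocation concealment
or_adequate = odds_ratio(30, 70, 40, 60)

# Hypothetical trial with inadequate/unclear concealment,
# showing a more extreme (exaggerated) apparent benefit
or_inadequate = odds_ratio(24, 76, 40, 60)

# RoR < 1 indicates exaggerated benefit in the less rigorous trial,
# mirroring the direction of the study's findings (e.g., RoR 0.93
# for allocation concealment) -- the numbers here are illustrative only.
ror = or_inadequate / or_adequate
print(round(ror, 2))  # prints 0.74
```

The study itself pooled many such comparisons across 234 meta-analyses with Bayesian hierarchical models rather than computing single-pair ratios as above.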

Item Type: Journal Article (Original Article)


Division: 04 Faculty of Medicine > Pre-clinic Human Medicine > Institute of Social and Preventive Medicine

UniBE Contributor: Jüni, Peter


Subjects: 600 Technology > 610 Medicine & health




Publisher: American College of Physicians





Date Deposited: 04 Oct 2013 14:35
Last Modified: 09 Sep 2017 10:35



URI: (FactScience: 220701)
