Discussion paper

Scand J Work Environ Health 2012;38(3):282-290

https://doi.org/10.5271/sjweh.3201 | Published online: 21 Oct 2011, Issue date: May 2012

Synthesizing study results in a systematic review

by Verbeek J, Ruotsalainen J, Hoving JL

A single study rarely suffices to underpin treatment or policy decisions, which creates a strong demand for systematic reviews. Review authors need a method to synthesize the results of several studies, whether or not a statistical method is used. In this article, we provide arguments for how studies should be combined in a review. To combine studies, authors should judge their similarity, basing this judgement on the working mechanism of the intervention or exposure and on whether that mechanism can be assumed to be similar across populations and follow-up times. The same judgement applies to the control interventions. Similar studies can be combined in either a meta-analysis or a narrative synthesis. Other methods, such as vote counting, levels-of-evidence synthesis, or best-evidence synthesis, are better avoided because they may produce biased results. We support our arguments by re-analysing a systematic review: in its original form, the review concluded there was strong evidence of no effect, whereas our re-analysis found evidence of an effect. We provide a flow-chart to guide authors through the synthesis and assessment process.
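Because the abstract contrasts meta-analysis with methods such as vote counting, a minimal sketch of what statistical pooling involves may help readers less familiar with the technique. The snippet below shows a generic fixed-effect, inverse-variance meta-analysis; the effect estimates and standard errors are invented for illustration and are not the data from the review re-analysed in this article.

```python
# Illustrative sketch only: a generic fixed-effect, inverse-variance meta-analysis.
# The per-study numbers below are hypothetical, not taken from the article.
import math

# Hypothetical per-study results: (effect estimate, standard error)
studies = [
    (0.30, 0.15),
    (0.10, 0.20),
    (0.25, 0.10),
]

# Inverse-variance weights: w_i = 1 / SE_i^2
weights = [1.0 / se**2 for _, se in studies]

# Pooled estimate: sum(w_i * y_i) / sum(w_i)
pooled = sum(w * y for (y, _), w in zip(studies, weights)) / sum(weights)

# Standard error of the pooled estimate: sqrt(1 / sum(w_i))
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval under a normal approximation
ci_low = pooled - 1.96 * pooled_se
ci_high = pooled + 1.96 * pooled_se

print(f"Pooled effect: {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f})")
```

Unlike vote counting, which only tallies how many studies were "positive" or "negative", pooling of this kind weights each study by its precision and yields an overall estimate with a confidence interval.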
