Inter-rater agreement in the assessment of exposure to carcinogens in the offshore petroleum industry
  1. Kjersti Steinsvåg¹,
  2. Magne Bråtveit¹,
  3. Bente E Moen¹,
  4. Hans Kromhout²

  1. University of Bergen, Department of Public Health and Primary Health Care, Section for Occupational Medicine, Bergen, Norway
  2. Utrecht University, Institute for Risk Assessment Sciences, Utrecht, the Netherlands

  Correspondence to: Kjersti Steinsvåg, University of Bergen, Department of Public Health and Primary Health Care, Section for Occupational Medicine, Kalfarveien 31, N-5018 Bergen, Norway; kjersti.steinsvag{at}isf.uib.no

Abstract

Objectives: To evaluate the reliability of an expert team assessing exposure to carcinogens in the offshore petroleum industry and to study how the information provided influenced the agreement among raters.

Methods: Eight experts individually assessed the likelihood of exposure for combinations of 17 carcinogens, 27 job categories and four time periods (1970–1979, 1980–1989, 1990–1999 and 2000–2005). Each rater assessed 1836 combinations based on summary documents on the carcinogenic agents, which included descriptions of sources of exposure and products, descriptions of the work processes carried out within the different job categories, and monitoring data. Inter-rater agreement was calculated using Cohen’s kappa index and single and average score intraclass correlation coefficients (ICC(2,1) and ICC(2,8), respectively). Differences in inter-rater agreement by time period, rater, International Agency for Research on Cancer (IARC) group and amount of information provided were subsequently studied.
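As an illustration of the agreement statistics named above, the sketch below computes mean pairwise Cohen’s kappa and two-way random-effects ICCs (Shrout and Fleiss ICC(2,1) and ICC(2,k)) for a ratings matrix. The data generated here are random placeholders with the same dimensions as the study design (1836 combinations, 8 raters), not the study’s expert assessments, and the function names are illustrative only.

```python
import numpy as np
from itertools import combinations

def cohen_kappa(r1, r2, categories):
    """Cohen's kappa for two raters' categorical scores."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_o = np.mean(r1 == r2)                        # observed agreement
    p_e = sum(np.mean(r1 == c) * np.mean(r2 == c)  # chance agreement from marginals
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

def mean_pairwise_kappa(ratings, categories):
    """Average Cohen's kappa over all pairs of raters.
    `ratings` is an (n_items x k_raters) array of category codes."""
    k = ratings.shape[1]
    return np.mean([cohen_kappa(ratings[:, i], ratings[:, j], categories)
                    for i, j in combinations(range(k), 2)])

def icc_two_way_random(ratings):
    """Shrout & Fleiss ICC(2,1) and ICC(2,k): two-way random effects,
    absolute agreement, single and average scores."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_total = ((x - grand) ** 2).sum()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between items
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_err = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    icc_single = (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
    icc_average = (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)
    return icc_single, icc_average

# Hypothetical example: 8 raters scoring 1836 agent/job/period combinations
# on an ordinal 0/1/2 scale (e.g. unlikely / possible / probable exposure).
rng = np.random.default_rng(0)
scores = rng.integers(0, 3, size=(1836, 8))
print("mean pairwise kappa:", mean_pairwise_kappa(scores, categories=[0, 1, 2]))
print("ICC(2,1), ICC(2,8):", icc_two_way_random(scores))
```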

Results: Overall, 18% of the combinations were rated as possible exposure and 14% as probable exposure. Stratified by the 17 carcinogenic agents, the prevalence of probable exposure ranged from 3.8% for refractory ceramic fibres to 30% for crude oil. The overall mean kappa was 0.42 (ICC(2,1) = 0.62; ICC(2,8) = 0.93). Providing limited quantitative measurement data was associated with lower agreement than for equally well described carcinogens for which no sampling data were available.
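For reference, the reported single and average score ICCs are mutually consistent under the Spearman–Brown relation for k = 8 raters (a standard reliability identity, not quoted from the paper):

$$
\mathrm{ICC}(2,8) \;=\; \frac{k \,\mathrm{ICC}(2,1)}{1 + (k-1)\,\mathrm{ICC}(2,1)}
\;=\; \frac{8 \times 0.62}{1 + 7 \times 0.62} \;\approx\; 0.93
$$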

Conclusion: The overall kappa and single score ICC indicate that the raters agreed on exposure estimates well above chance level. The levels of inter-rater agreement were higher than in other comparable studies. The average score ICC indicates reliable mean estimates and implies that a sufficient number of raters was involved. The raters seemed to have enough documentation on which to base their estimates, but provision of limited monitoring data led to more incongruence among raters. Having real exposure data, with the inherent variability of such data, apparently makes estimating exposure in a rigid semiquantitative manner more difficult.

  • IARC, International Agency for Research on Cancer
  • ICC, intraclass correlation coefficients
  • reliability
  • Cohen’s kappa index
  • intraclass correlation coefficients

Footnotes

  • Published Online First 16 January 2007

  • The Norwegian Oil Industry Association funded this study.

  • Competing interests: none declared