
Oral Session 12 – Exposure assessment


D. C. Glass1, C. N. Gray2, D. J. Jolley3, C. Gibbons3, M. R. Sim1, L. Fritschi4.1Monash University, Melbourne, Australia; 2Deakin University, Geelong Australia; 3Deakin University, Burwood, Australia; 4University of Western Australia, Perth, Australia

A nested case–control study investigated whether an excess of lymphohaematopoietic cancers identified among the male members of Health Watch was associated with benzene exposure. Cases identified between 1981 and 1999 (n = 79) were age matched to five controls each, and their benzene exposure was retrospectively estimated using occupational histories and local, site-specific information. The study used the arithmetic mean of measured base estimate (BE) data supplied by Australian petroleum companies to calculate BE levels for various jobs. These were validated by comparison with literature values and then systematically allocated, in a database, to periods of time in each subject's occupational history. Matched analyses showed that leukaemia risk increased at cumulative exposures above 2 ppm-years and at exposure intensities of the highest exposed job above 0.8 ppm. For the 13 case sets with >8 ppm-years cumulative exposure, the odds ratio was 11.3 (95% confidence interval 2.85 to 45.1). Risk of non-Hodgkin's lymphoma and multiple myeloma was not associated with exposure to benzene. The cumulative exposures in this study were much lower than the 50–200 ppm-years commonly quoted as being associated with risk of leukaemia. Subjects in the study had an average exposure period of 20 years (range 4 to 42). More highly exposed tasks (BE >1 ppm), such as handling benzene or BTX (benzene, toluene, xylene mixture), drum filling, and ship or tanker loading, are usually performed for only part of the week. As a result, cumulative exposures were low for the study subjects: approximately one third of the cumulative estimates were <1 ppm-year and nearly 85% were ⩽10 ppm-years. The addition of occasional high exposures, e.g. as a result of spillages, increased exposure for 25% of subjects, but for most of these the increase was modest (<5% of total exposure).
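The cumulative exposure metric used here (job-specific BE intensity multiplied by duration, summed over the occupational history) can be sketched as follows. The job titles, BE values, and years below are invented for illustration; they are not the study's data:

```python
# Illustrative sketch of a cumulative benzene exposure calculation.
# Job titles, base estimates (BEs), and durations are hypothetical.
jobs = [
    # (job title, base estimate in ppm, years in job)
    ("terminal operator", 0.2, 12),
    ("drum filling", 1.5, 3),
    ("tanker loading", 1.2, 2),
]

# Cumulative exposure: sum of intensity x duration over the history (ppm-years)
cumulative = sum(be * years for _, be, years in jobs)

# Intensity of the highest exposed job (ppm)
peak_job_intensity = max(be for _, be, _ in jobs)

# The study's thresholds: >2 ppm-years cumulative, >0.8 ppm peak-job intensity
elevated_risk_band = cumulative > 2 and peak_job_intensity > 0.8

print(f"cumulative = {cumulative:.1f} ppm-years, "
      f"peak job = {peak_job_intensity:.1f} ppm")
```

Occasional high-exposure events such as spillages would be added as extra increments to `cumulative` in the same units.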


M. Friesen1, H. Davies1, K. Teschke1,2, S. Marion2, P. Demers1,2.1School of Occupational & Environmental Hygiene, University of British Columbia, Vancouver, Canada; 2Department of Health Care & Epidemiology, University of British Columbia, Vancouver, Canada

Introduction: Exposure misclassification may occur when non-specific exposure measurement methods are used, resulting in attenuation of the dose–response relationship. We describe the development of job exposure matrices (JEMs) for non-specific and wood-specific dust exposures and use them to examine associations with respiratory morbidity in a sawmill worker cohort.

Methods: The non-specific dust JEM was constructed from a predictive model that used 1399 inhalable dust measurements collected between 1981 and 1997. For the wood dust JEM, a model was constructed after applying adjustment factors to the measurements to account for the presence of non-wood dusts. Hospital discharge records for the cohort (n = 7557, employed between 1985 and 1998) were available from 1985 to 1998.

Results: The non-specific dust and wood dust exposure models explained 45% and 54% of the variability in exposure, respectively. The two metrics were moderately correlated for the JEM mill/job/time-period cells (Pearson r = 0.55) and for cumulative exposure (partial correlation = 0.43, adjusted for exposure duration). We observed a dose–response relationship between cumulative exposure to wood dust and hospitalisation for chronic obstructive pulmonary disease (n = 133), but not with non-specific dust exposure. Standardised rate ratios (SRR) for cumulative wood dust versus non-specific dust exposure, respectively, were: 1.4 (95% CI 0.79 to 2.6) v 0.70 (0.31 to 1.6) for 5–10 mg/m3-years; 2.0 (1.1 to 3.4) v 0.82 (0.40 to 1.7) for 10–20 mg/m3-years; and 1.6 (0.96 to 2.8) v 0.69 (0.35 to 1.4) for >20 mg/m3-years (reference: <5 mg/m3-years). For asthma hospitalisation while employed, we observed a stronger relationship for >30 days of exposure to wood dust levels above 1 mg/m3 (2.2, 0.93 to 5.4) than for non-specific dust (1.5, 0.66 to 3.3).

Conclusions: The relationships between health outcomes and wood dust were significantly attenuated when a non-specific dust metric was used. This study demonstrates the importance of developing exposure metrics as specific to the disease-causing agent as possible, particularly when the composition of mixed exposures varies by work area and job.
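The construction described in Methods, applying job-specific adjustment factors to non-specific dust estimates to obtain wood-specific JEM cells and then accumulating exposure over a work history, can be sketched as follows. All mill names, jobs, levels, and adjustment factors are hypothetical:

```python
# Hypothetical sketch of deriving a wood-specific dust JEM from a
# non-specific (inhalable) dust JEM via adjustment factors, then
# computing cumulative exposure for one worker. Values are invented.

# Non-specific dust JEM: (mill, job) -> inhalable dust level in mg/m3
jem_total_dust = {
    ("mill_A", "sawyer"): 2.0,
    ("mill_A", "planer"): 3.5,
}

# Adjustment factor: estimated wood fraction of total dust per cell
wood_fraction = {
    ("mill_A", "sawyer"): 0.9,
    ("mill_A", "planer"): 0.5,
}

# Wood-specific JEM: apply the adjustment factor cell by cell
jem_wood_dust = {cell: level * wood_fraction[cell]
                 for cell, level in jem_total_dust.items()}

# One worker's history: (mill, job, years in job)
history = [("mill_A", "sawyer", 10), ("mill_A", "planer", 5)]

# Cumulative wood dust exposure in mg/m3-years
cum_wood = sum(jem_wood_dust[(m, j)] * yrs for m, j, yrs in history)
print(f"cumulative wood dust = {cum_wood:.2f} mg/m3-years")
```

In a full JEM the cells would also be indexed by calendar time period, as in the abstract's mill/job/time-period cells.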


A. Baccarelli1,2, R. Pfeiffer2, D. Consonni1, A. C. Pesatori1, M. Bonzini1, D. Patterson3, P. A. Bertazzi1, M. T. Landi2.1Department of Occupational and Environmental Health, University of Milan, Milan, Italy; 2Division of Cancer Epidemiology and Genetics, NCI/NIH/DHHS, Bethesda, MD, USA; 3Centers for Disease Control and Prevention, Atlanta, GA, USA

Introduction: In environmental and occupational research, it is common to find levels of exposure to a specific agent that are below the limit of detection of the method of measurement employed. Proper treatment of below limit measurements is critical to avoid bias and unnecessary loss of information.

Methods: We reviewed possible statistical strategies for treating below limit measurements, including deletion, simple substitution, distributional methods, and distribution based imputation. We applied these different methods in a recent investigation on chloracne conducted on the Seveso population exposed to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), in which 55.6% of the measurements were below limit.

Results: Theoretical statistical studies indicate that distribution based imputation of below limit measurements is a robust and reliable method that can produce estimates with negligible bias even when up to 80% of the samples are below limit. Although simple substitution methods (such as substituting limit/2 for the below limit data) are widely employed, their use may be adequate only when the percentage of below limit measurements is low (e.g., <15% of the samples). We used distribution based imputation to estimate mean plasma TCDD concentrations in the Seveso study. Multiple imputations were performed to obtain correct variance estimates. When mean plasma TCDD levels obtained using distribution based multiple imputation were taken as referents, the other procedures showed relative bias ranging from 23% to 330%.
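A minimal sketch of distribution based multiple imputation, assuming lognormally distributed exposure: fit the distribution to the censored data by maximum likelihood (density for detects, cumulative probability at the limit for non-detects), then repeatedly draw the non-detects from the fitted distribution truncated below the limit. This is an illustrative reimplementation on simulated data, not the authors' code:

```python
# Distribution based multiple imputation for below-limit values,
# assuming a lognormal exposure distribution (illustrative sketch).
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(42)

# Simulated lognormal data; LOD chosen so roughly half are non-detects
true_mu, true_sigma, lod = 0.0, 1.0, 1.0
x = rng.lognormal(true_mu, true_sigma, size=2000)
obs = x[x >= lod]                 # detected (measured) values
n_nd = int((x < lod).sum())       # number of non-detects

def neg_loglik(params):
    """Censored lognormal likelihood: pdf for detects, cdf(LOD) for non-detects."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)     # parameterised on log scale to stay positive
    z = (np.log(obs) - mu) / sigma
    ll_det = norm.logpdf(z) - np.log(obs * sigma)
    ll_nd = n_nd * norm.logcdf((np.log(lod) - mu) / sigma)
    return -(ll_det.sum() + ll_nd)

res = minimize(neg_loglik, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])

def impute_once():
    """Draw non-detects from the fitted lognormal truncated to (0, LOD)."""
    p_lod = norm.cdf((np.log(lod) - mu_hat) / sigma_hat)
    u = rng.uniform(0.0, p_lod, size=n_nd)   # inverse-CDF sampling
    return np.exp(mu_hat + sigma_hat * norm.ppf(u))

# Multiple imputation: analyse each completed dataset, then combine
means = [np.concatenate([obs, impute_once()]).mean() for _ in range(10)]
imputed_mean = float(np.mean(means))
print(f"mu_hat={mu_hat:.3f}  sigma_hat={sigma_hat:.3f}  mean={imputed_mean:.3f}")
```

In a real analysis the between-imputation variability of the ten estimates would also enter the combined variance (Rubin's rules), which is what yields the correct variance estimates mentioned above.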

Conclusions: We recommend the use of distribution based multiple imputation in investigations including a considerable proportion of below limit observations. The procedure can be performed using standard statistical software, allows any statistical analysis to be conducted on the imputed datasets, and results in valid statistical inference that properly reflects the presence of below limit measurements.


T. J. Keegan1, K. M. Venables1, L. M. Carpenter1, C. Brooks1, V. Beral1, M. J. Nieuwenhuijsen2, P. Doyle3, N. E. S. Maconochie3, T. Fletcher3.1Department of Public Health, University of Oxford, Oxford, UK; 2Imperial College, London, UK; 3London School of Hygiene and Tropical Medicine, London, UK

Introduction: The UK has carried out trials relating to chemical warfare defence involving members of the armed forces at a facility at Porton Down since the First World War. Epidemiological research has been commissioned to assess the long term health of approximately 20 000 subjects who took part between 1939 and 1989. The design of this cohort study is reported separately. Here, we describe the methodology for the detailed exposure assessment.

Methods: The methodology was developed using a pre-existing database on individual exposure, together with data from our pilot study (ICOH, 2004, Utrecht). The pilot study included 150 representative individuals, for whom we reviewed all available records from the Porton Down experimental archive.

Results: We estimate that several hundred individual chemicals were used in the study period. Most of the potential exposure subcohorts were small, with sparse quantitative exposure data. However, there were two large subcohorts with quantitative exposure data: (a) exposed to vesicants, such as mustard, and (b) exposed to nerve agents, such as sarin. There were approximately 11 000 vesicant and 3000 nerve agent exposures, although the numbers of individuals exposed are yet to be established. Pilot data suggest that vesicant exposures were predominantly dermal, with acute effects seen across the range of a semi-quantitative erythema scale. Nerve agent exposures were both dermal and inhalant, and pilot biological effect data indicated depression of blood cholinesterase ranging from 0 to 47%.

Conclusions: For the vesicant and nerve agent subcohorts, detailed information will be abstracted on intensity and/or duration of exposure, acute biological effect, and exposure modifiers. For other chemicals, only name and selected modifiers will be abstracted. A database for data abstraction has been developed and data entry screens will be demonstrated.
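One way a record in such an abstraction database might be structured is sketched below. All field names and values are invented for illustration; they are not the study's actual schema:

```python
# Hypothetical record structure for abstracting exposure data on the
# vesicant and nerve agent subcohorts. Field names are invented.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ExposureRecord:
    subject_id: str
    agent_class: str                     # "vesicant" or "nerve agent"
    agent_name: str                      # e.g. "mustard", "sarin"
    route: str                           # "dermal", "inhalation", or "both"
    intensity: Optional[float] = None    # dose/concentration, where recorded
    duration_min: Optional[float] = None
    erythema_grade: Optional[int] = None # semi-quantitative acute-effect scale
    cholinesterase_depression_pct: Optional[float] = None
    modifiers: list = field(default_factory=list)  # e.g. protective equipment

# Example entry: a nerve agent exposure with an acute biological effect
rec = ExposureRecord("S-0001", "nerve agent", "sarin", "both",
                     cholinesterase_depression_pct=21.0)
```

For chemicals outside the two large subcohorts, only `agent_name` and selected `modifiers` fields would be populated, matching the abstraction plan above.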
