Selecting an exposure lag period

Epidemiology. 1995 Jul;6(4):387-90. doi: 10.1097/00001648-199507000-00010.

Abstract

In epidemiology, there is an inclination to regard larger estimates of exposure effect as more credible. For example, higher relative risks or rate ratios are often emphasized as a criterion for choosing among various hypothesized exposure-lag values. An alternative criterion for this choice might be based on a goodness-of-fit measure. We present examples, based on hypothetical data, in which an exposure-lag parameter is estimated by trial-and-error fitting: we compare the behavior of the likelihood-ratio goodness-of-fit statistic, obtained over the assigned values of the parameter, with that of the relative risk. We show that the highest-estimate and likelihood-based goodness-of-fit criteria can be inconsistent with each other. Concern about the validity of the highest-estimate criterion prompts us to recommend that it not be used for the estimation of exposure-weighting parameters, which should preferably be based on a priori biological knowledge or on goodness-of-fit criteria.
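
The trial-and-error fitting described above can be illustrated with a minimal sketch. The simulated cohort, the Poisson rate model, the candidate lag values, and the effect size below are all assumptions made for illustration; they are not the authors' data, models, or code. The point of the sketch is only that the lag maximizing the log-likelihood (goodness-of-fit criterion) need not be the lag yielding the largest rate ratio (highest-estimate criterion).

    # Sketch: compare the highest-estimate and goodness-of-fit criteria
    # across candidate exposure lags on hypothetical simulated data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)

    n, years = 2000, 30
    true_lag = 10                                      # assumed "true" lag (years)
    exposure = rng.gamma(1.0, 1.0, size=(n, years))    # yearly exposure increments

    # Cumulative exposure, ignoring the most recent `true_lag` years, drives the rate.
    relevant = exposure[:, : years - true_lag].sum(axis=1)
    person_time = np.full(n, 5.0)                      # hypothetical person-years
    cases = rng.poisson(0.01 * np.exp(0.03 * relevant) * person_time)

    def fit_for_lag(lag):
        """Fit a Poisson rate model with exposure lagged by `lag` years;
        return the estimated rate ratio per unit exposure and the log-likelihood."""
        lagged = exposure[:, : years - lag].sum(axis=1)
        X = sm.add_constant(lagged)
        res = sm.GLM(cases, X, family=sm.families.Poisson(),
                     offset=np.log(person_time)).fit()
        return np.exp(res.params[1]), res.llf

    for lag in range(0, 21, 5):
        rr, llf = fit_for_lag(lag)
        print(f"lag={lag:2d}  rate ratio per unit exposure={rr:.3f}  "
              f"log-likelihood={llf:.1f}")

Scanning the printed output, one would pick the lag with the largest log-likelihood under the goodness-of-fit criterion, whereas the highest-estimate criterion would pick the lag with the largest rate ratio; as the abstract notes, these two choices can disagree.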

MeSH terms

  • Case-Control Studies
  • Epidemiologic Methods
  • Humans
  • Models, Statistical*
  • Probability
  • Risk Assessment*
  • Sensitivity and Specificity