A simple data set showing that an infinite coefficient is not always obvious. This was sent to us as a query about the S-Plus software: `why does the robust variance fail?' The answer is that standard errors of an infinite coefficient don't really make sense, and that an approximate jackknife value will always be far from infinity.
    > coxph(Surv(t1, t2, status) ~ x1 + x2 + cluster(id))

         coef exp(coef) se(coef) robust se     z       p
    x1   7.64      2085     25.3     0.732 10.44 0.0e+00
    x2   5.85       347     25.3     1.151  5.08 3.8e-07

    Likelihood ratio test=9.84  on 2 df, p=0.0073  n= 50
Both x1 and x2 are binary covariates. A table showing the number of event/censored observations for each covariate combination is

                  x2
                0      1
             +------------
       x1  0 |  1/7    1/7
           1 | 24/10   0/0

There is no obvious "no hazard" column or row, of the sort that usually causes infinite coefficients in a main-effects model, nor even a 0/x cell (no events) of the sort that would cause this in a model with interactions.
However, detailed examination shows that the single event in the (0,0) cell of the table happens to occur at the largest time point in the entire data set. The likelihood of the model as a whole would be unchanged if this observation were censored: its contribution to the score statistic is (covariate value of the subject with the event minus the mean covariate value over the risk set), which is identically 0 because the risk set at the largest time contains only that one observation.
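The zero contribution can be verified directly. The sketch below computes the Cox score term for a single event, using made-up times and covariate values (everything here is illustrative, not the data set above): the event at the largest time has a risk set containing only itself, so its term vanishes for every value of beta.

```python
import math

# Toy data: times, event indicators, and a binary covariate (illustrative).
times  = [1, 2, 3, 5, 9]   # event/censoring times
status = [1, 0, 1, 0, 1]   # 1 = event, 0 = censored
x      = [0, 1, 1, 0, 1]   # binary covariate

def score_contribution(k, beta):
    """Score term for the event at index k: x[k] minus the
    risk-weighted mean of x over the risk set at time times[k]."""
    risk = [i for i in range(len(times)) if times[i] >= times[k]]
    w = [math.exp(beta * x[i]) for i in risk]
    xbar = sum(wi * x[i] for wi, i in zip(w, risk)) / sum(w)
    return x[k] - xbar

# The event at time 9 is the last time point: its risk set is just
# itself, so its score contribution is exactly 0 for any beta.
print(score_contribution(4, beta=2.0))   # -> 0.0
```

By contrast, the event at the smallest time has the full sample in its risk set, so its score term is generally nonzero.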
A pair of covariates with the pattern

        no risk         positive risk
        positive risk   no risk

will have both coefficients infinite in the bivariate model, while both may be finite in the univariate models.
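To see why this pattern drives both coefficients to infinity, one can evaluate the Breslow partial log-likelihood on a toy data set whose events all fall in the off-diagonal cells, and watch the likelihood increase without bound as both coefficients grow together. The data and function below are illustrative assumptions, not the original data:

```python
import math

# Toy data (time, status, x1, x2): events only in cells (1,0) and (0,1),
# censored subjects in cell (0,0), nothing in cell (1,1).
data = [
    (1, 1, 1, 0), (2, 1, 1, 0), (3, 1, 1, 0),   # events, cell (1,0)
    (2, 1, 0, 1), (3, 0, 0, 1),                 # one event, cell (0,1)
    (3, 0, 0, 0), (3, 0, 0, 0), (3, 0, 0, 0),   # censored, cell (0,0)
]

def cox_loglik(b1, b2):
    """Breslow partial log-likelihood for the toy data."""
    ll = 0.0
    for t, s, x1, x2 in data:
        if s == 1:
            eta = b1 * x1 + b2 * x2
            denom = sum(math.exp(b1 * u1 + b2 * u2)
                        for tt, _, u1, u2 in data if tt >= t)
            ll += eta - math.log(denom)
    return ll

# The log-likelihood keeps climbing as both coefficients grow together,
# so the maximum is attained only "at infinity".
for m in (0, 2, 4, 8):
    print(m, round(cox_loglik(m, m), 4))
```

Each event term has the form m - log(a*e^m + b), where b counts the zero-hazard (0,0) subjects still at risk; as long as b > 0 the term is strictly increasing in m, so no finite maximizer exists.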