Models for interval-censored survival data presenting a fraction of “cure” or “immune” patients have recently been proposed in the literature, particularly extending the mixture cure model to interval-censored data. However, little is known about the goodness-of-fit of such models. In a mixture cure model, the survival distribution of the entire population is improper and is expressed in terms of the survival distribution of uncured individuals, i.e. the latency part of the model, and the probability of experiencing the event of interest, i.e. the incidence part. To validate a mixture cure model, the assumptions made on both parts need to be checked: the survival distribution of uncured individuals, the link function used in the incidence, and the linearity of the covariates used in both parts of the model. In this work, we investigate the Cox-Snell and deviance residuals and show how they can be adapted and used to perform diagnostic checks when all subjects are right- or interval-censored and some subjects are cured with unknown cure status. A large simulation study investigates the ability of these residuals to detect departures from the assumptions of the mixture model. The developed techniques are applied to a real data set on Alzheimer’s disease.
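As a minimal sketch of the model structure described above, the code below builds the improper population survival of a mixture cure model from a logistic incidence part and an exponential latency part, and computes the corresponding Cox-Snell residual (the cumulative hazard evaluated at the observed time). The logistic link, the exponential latency, and all coefficient values are illustrative assumptions, not the specification used in the paper.

```python
import math

def incidence(z, beta0=-0.5, beta1=1.0):
    """Incidence part: probability of being uncured given covariate z.
    Logistic link with illustrative coefficients (assumption)."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * z)))

def latency_survival(t, rate=0.2):
    """Latency part: survival of uncured subjects.
    Exponential distribution chosen purely for illustration."""
    return math.exp(-rate * t)

def population_survival(t, z):
    """Improper population survival: the cured fraction (1 - incidence)
    never experiences the event, so survival plateaus above zero."""
    p = incidence(z)
    return (1.0 - p) + p * latency_survival(t)

def cox_snell_residual(t, z):
    """Cox-Snell residual: -log of the population survival at t.
    Under a correctly specified model, these residuals behave like a
    (censored) unit-exponential sample."""
    return -math.log(population_survival(t, z))
```

A standard diagnostic use is to plot the Nelson-Aalen estimate of the cumulative hazard of the residuals against the residuals themselves; closeness to the 45-degree line suggests an adequate fit. Note that, because the population survival plateaus at the cure fraction, the residuals are bounded above by -log(1 - incidence(z)).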
ASJC Scopus subject areas
- Statistics and Probability
- Health Information Management