Accelerated failure time model

In the statistical area of survival analysis, an accelerated failure time model (AFT model) is a parametric model that provides an alternative to the commonly used proportional hazards models. Whereas a proportional hazards model assumes that the effect of a covariate is to multiply the hazard by some constant, an AFT model assumes that the effect of a covariate is to multiply the predicted event time by some constant. AFT models can therefore be framed as linear models for the logarithm of the survival time.
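In the usual textbook notation (the symbols below are standard but are not defined elsewhere in this article), the model can be written as

    S(t \mid x) = S_0\!\left(t\, e^{-\beta' x}\right), \qquad \log T = \mu + \beta' x + \sigma\varepsilon,

where S_0 is a baseline survival function, T is the event time, x is the covariate vector and \varepsilon follows a fixed standard distribution (extreme value for the Weibull model, logistic for the log-logistic, normal for the log-normal). A covariate therefore stretches or shrinks the time scale by the acceleration factor e^{\beta' x}.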

Comparison with proportional hazards models

The biggest difference is that AFT models are always fully parametric, i.e. a probability distribution must be specified, as there is no known equivalent of Cox's semi-parametric proportional hazards model. The choice of origin from which to measure time at risk is important in all parametric survival models.

Unlike proportional hazards models, the regression parameter estimates from AFT models are robust to the presence of unmeasured confounders. They are also less affected by the choice of probability distribution (Lambert, Philippe; Collett, Dave; Kimber, Alan; Johnson, Rachel (2004), "Parametric accelerated failure time models with random effects and an application to kidney transplant survival", Statistics in Medicine 23: 3177–3192, doi:10.1002/sim.1876).

The results of AFT models are easily interpreted (Kay, Richard; Kinnersley, Nelson (2002), "On the use of the accelerated failure time model as an alternative to the proportional hazards model in the treatment of time to event data: A case study in influenza", Drug Information Journal 36: 571–579, http://findarticles.com/p/articles/mi_qa3899/is_200207/ai_n9139743). For example, the results of a clinical trial with mortality as the endpoint could be interpreted as a certain percentage increase in future life expectancy on the new treatment compared to the control. So a patient could be informed that he would be expected to live (say) 15% longer if he took the new treatment. Hazard ratios can prove harder to explain in layman's terms.
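As a rough numerical illustration of where such a statement comes from (the 15% figure above is hypothetical, not a result from any particular trial): if the fitted treatment coefficient in an AFT model is \hat\beta = \log(1.15) \approx 0.14, the estimated acceleration factor is

    e^{\hat\beta} = 1.15,

so every quantile of the survival distribution, including the median survival time, is estimated to be 15% longer under the new treatment than under the control.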

A wider range of probability distributions can be used in AFT models than in parametric proportional hazards models, including distributions that have unimodal hazard functions.

Distributions used in AFT models

To be used in an AFT model, a distribution must have a parameterisation that includes a scale parameter. The logarithm of the scale parameter is then modelled as a linear function of the covariates.
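The following sketch illustrates that idea concretely. It is only a minimal example under stated assumptions: the simulated data, the single binary covariate, the variable names and the choice of a Weibull distribution are all illustrative and not taken from this article. The log of the scale parameter is modelled as a linear function of the covariate, and the parameters are estimated by maximising a right-censored likelihood.

```python
# Minimal sketch: Weibull AFT model fitted by maximum likelihood with right censoring.
# The data are simulated; column names and parameter values are illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated data: one binary covariate (e.g. treatment), true time ratio exp(0.3)
n = 500
x = rng.integers(0, 2, size=n)             # covariate
true_shape = 1.5                           # Weibull shape k
true_scale = np.exp(2.0 + 0.3 * x)         # scale lambda_i = exp(b0 + b1 * x_i)
t_event = true_scale * rng.weibull(true_shape, size=n)
t_cens = rng.exponential(25.0, size=n)     # independent right-censoring times
time = np.minimum(t_event, t_cens)
event = (t_event <= t_cens).astype(float)  # 1 = observed event, 0 = censored

def neg_log_lik(params):
    """Negative log-likelihood of a Weibull AFT model with right censoring."""
    b0, b1, log_k = params
    k = np.exp(log_k)                      # shape kept positive via log transform
    lam = np.exp(b0 + b1 * x)              # log(scale) linear in the covariate
    z = (time / lam) ** k
    # log f(t) contributes for events, log S(t) for censored observations
    log_f = np.log(k) - np.log(lam) + (k - 1) * (np.log(time) - np.log(lam)) - z
    log_S = -z
    return -np.sum(event * log_f + (1 - event) * log_S)

fit = minimize(neg_log_lik, x0=np.array([0.0, 0.0, 0.0]), method="Nelder-Mead")
b0_hat, b1_hat, log_k_hat = fit.x
print("estimated time ratio for x=1 vs x=0:", np.exp(b1_hat))
print("estimated Weibull shape:", np.exp(log_k_hat))
```

Estimating the shape on the log scale keeps it positive without explicit constraints, and exponentiating the covariate coefficient gives the estimated time ratio discussed above.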

The log-logistic distribution provides the most commonly used AFT model. Unlike the Weibull distribution, it can exhibit a non-monotonic hazard function that increases at early times and decreases at later times. It is similar in shape to the log-normal distribution, but its cumulative distribution function has a simple closed form, which becomes important computationally when fitting data with censoring.
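For reference, the standard closed forms of the log-logistic distribution with scale α and shape p (notation chosen here, not defined in this article) are

    S(t) = \left[1 + (t/\alpha)^{p}\right]^{-1}, \qquad h(t) = \frac{(p/\alpha)\,(t/\alpha)^{p-1}}{1 + (t/\alpha)^{p}}.

For p > 1 the hazard rises to a single peak and then declines, and the simple form of S(t) is what makes the likelihood contributions of censored observations easy to evaluate.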

The Weibull distribution (including the exponential distribution as a special case) can be parameterised as either a proportional hazards model or an AFT model, and is the only family of distributions to have this property. The results of fitting a Weibull model can therefore be interpreted in either framework.
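A short derivation, in the same standard notation as above, makes the equivalence explicit. If the Weibull scale is modelled as in an AFT model, \lambda(x) = \lambda_0 e^{\beta' x}, then the hazard is

    h(t \mid x) = \frac{k}{\lambda(x)}\left(\frac{t}{\lambda(x)}\right)^{k-1} = h_0(t)\, e^{-k\beta' x},

so the same fitted model is also a proportional hazards model whose log hazard ratios equal the AFT coefficients multiplied by -k, where k is the Weibull shape parameter.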

Other distributions suitable for AFT models include the log-normal, gamma and inverse Gaussian distributions, although they are less popular than the log-logistic, partly because their cumulative distribution functions do not have closed forms.

References

*Collett, D. (2003), Modelling Survival Data in Medical Research (2nd ed.), CRC Press, ISBN 1584883251.

Further reading

Articles

*Bradburn, MJ; Clark, TG; Love, SB; Altman, DG (2003), "Survival Analysis Part II: Multivariate data analysis – an introduction to concepts and methods", British Journal of Cancer 89: 431–436, doi:10.1038/sj.bjc.6601119.
*Hougaard, Philip (1999), "Fundamentals of Survival Data", Biometrics 55 (1): 13–22, http://links.jstor.org/sici?sici=0006-341X%28199903%2955%3A1%3C13%3AFOSD%3E2.0.CO%3B2-5.

Books

*Cox, David Roxbee; Oakes, D. (1984), Analysis of Survival Data, CRC Press, ISBN 041224490X.
*Marubini, Ettore; Valsecchi, Maria Grazia (1995), Analysing Survival Data from Clinical Trials and Observational Studies, Wiley, ISBN 0470093412.

