The Kolmogorov-Lilliefors Test tests whether a random variable follows a distribution from a certain family (e.g. Gaussian) whose parameters are unknown. Once estimators computed from the data are plugged into the candidate distribution, Donsker's theorem is no longer valid, so the Kolmogorov-Smirnov quantiles no longer work: keeping them with plug-in estimators would make the test too conservative. The Kolmogorov-Lilliefors test therefore comes with its own table of quantiles.
Let $X_1, \dots, X_n$ be i.i.d. random variables with cdf $F$. Let $\hat{F}_0$ be a continuous cdf from the family we are testing against, where the parameters of $\hat{F}_0$ (e.g. $\mu$ and $\sigma^2$ for the Gaussian) are replaced by their estimators computed from the $X_i$s under the assumption that they come from a distribution of that family ($\hat{\mu}$ and $\hat{\sigma}^2$ for the Gaussian). The test tells us whether the $X_i$s come from a distribution of that family.
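As a minimal sketch of this setup (the Gaussian family is assumed, and the variable names and the use of `scipy.stats.norm` are our own choices), the plug-in cdf $\hat{F}_0$ is built from the sample estimates:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=100)  # illustrative sample X_1, ..., X_n

# Plug-in estimates, assuming the Gaussian family
mu_hat = x.mean()
sigma2_hat = x.var(ddof=1)  # sample variance

# \hat{F}_0: the candidate Gaussian cdf with the estimated parameters plugged in
F0_hat = norm(loc=mu_hat, scale=np.sqrt(sigma2_hat)).cdf
print(F0_hat(mu_hat))  # 0.5: half the mass of \hat{F}_0 lies below the estimated mean
```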
$$H_0: F = \hat{F}_0 \quad \text{vs.} \quad H_1: F \neq \hat{F}_0$$
Let $F_n$ be the empirical cdf of the sample $X_1, \dots, X_n$. If $F = \hat{F}_0$, then $F_n(t) \approx \hat{F}_0(t)$ for all $t \in \mathbb{R}$.
The test statistic is:
$$\tilde{T}_n = \sqrt{n} \, \sup_{t \in \mathbb{R}} \left| F_n(t) - \hat{F}_0(t) \right|$$
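A sketch of computing $\tilde{T}_n$ (the function name `kl_statistic` is ours): since $F_n$ is a step function that jumps at the order statistics, the supremum is attained at a data point, so it suffices to compare $\hat{F}_0(X_{(i)})$ with $i/n$ and $(i-1)/n$ at each sorted observation.

```python
import numpy as np
from scipy.stats import norm

def kl_statistic(x):
    """Kolmogorov-Lilliefors statistic sqrt(n) * sup_t |F_n(t) - F0_hat(t)| for the
    Gaussian family, with mu and sigma^2 replaced by estimates from the sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    F0 = norm.cdf(x, loc=x.mean(), scale=x.std(ddof=1))  # \hat{F}_0 at the order statistics
    i = np.arange(1, n + 1)
    d_plus = np.max(i / n - F0)         # largest gap just after each jump of F_n
    d_minus = np.max(F0 - (i - 1) / n)  # largest gap just before each jump of F_n
    return np.sqrt(n) * max(d_plus, d_minus)

rng = np.random.default_rng(0)
print(kl_statistic(rng.normal(loc=2.0, scale=3.0, size=100)))
```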
The Kolmogorov-Lilliefors test with asymptotic level $\alpha$ is defined as:

$$\Psi_\alpha = \mathbb{1}\{\tilde{T}_n > q_\alpha\},$$

where $q_\alpha$ is the $(1-\alpha)$-quantile of $\tilde{T}_n$ under $H_0$.
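One way to carry out the test, sketched with our own names and reusing the `kl_statistic` function above: since the Kolmogorov-Smirnov quantiles cannot be reused, $q_\alpha$ is estimated here by Monte Carlo under $H_0$ (simulating from any Gaussian works, because the statistic is location-scale invariant). A ready-made Lilliefors test is also available in `statsmodels.stats.diagnostic.lilliefors`.

```python
import numpy as np

def kl_quantile(n, alpha=0.05, n_sim=20_000, seed=0):
    """Monte Carlo estimate of q_alpha, the (1 - alpha)-quantile of T~_n under H_0."""
    rng = np.random.default_rng(seed)
    sims = [kl_statistic(rng.standard_normal(n)) for _ in range(n_sim)]  # kl_statistic from the sketch above
    return np.quantile(sims, 1 - alpha)

def kl_test(x, alpha=0.05):
    """Psi_alpha = 1{T~_n > q_alpha}: returns 1 (reject Gaussianity) or 0."""
    return int(kl_statistic(x) > kl_quantile(len(x), alpha))
```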
The quantiles of the Kolmogorov-Lilliefors test are smaller than those of the Kolmogorov-Smirnov test. This is because $\hat{F}_0$ uses parameters estimated from the same sample, so it fits the data more closely than a fixed cdf would and $\tilde{T}_n$ tends to be smaller under $H_0$; reusing the larger Kolmogorov-Smirnov quantiles would therefore make the test too conservative.
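A small simulation (our own illustration, not one of the published tables) makes the gap concrete: for the same sample size, the Monte Carlo $(1-\alpha)$-quantile of the statistic with estimated parameters sits well below the quantile of the classical Kolmogorov-Smirnov statistic computed against the true cdf.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, n_sim, alpha = 50, 20_000, 0.05
i = np.arange(1, n + 1)

ks_sims, kl_sims = [], []
for _ in range(n_sim):
    x = np.sort(rng.standard_normal(n))
    # Kolmogorov-Smirnov: compare F_n with the true null cdf N(0, 1)
    F_true = norm.cdf(x)
    ks_sims.append(np.sqrt(n) * max(np.max(i / n - F_true), np.max(F_true - (i - 1) / n)))
    # Kolmogorov-Lilliefors: compare F_n with \hat{F}_0, parameters estimated from the same sample
    F_hat = norm.cdf(x, loc=x.mean(), scale=x.std(ddof=1))
    kl_sims.append(np.sqrt(n) * max(np.max(i / n - F_hat), np.max(F_hat - (i - 1) / n)))

print("KS q_alpha:", np.quantile(ks_sims, 1 - alpha))  # near the Kolmogorov quantile (about 1.36)
print("KL q_alpha:", np.quantile(kl_sims, 1 - alpha))  # noticeably smaller
```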