This function fits the semi-supervised mixture model multiple times. It is called by `mixtura` and `scrutor`.
```r
fit.wrap(y, z, dist, phi, pi, gamma, starts = 1, it.em = 100, epsilon = 1e-04)
```
Argument | Description
---|---
`y` | observations: numeric vector of length n
`z` | class labels: integer vector of length n
`dist` | distributional assumption: character ("norm" for the Gaussian; see the parametrisations below)
`phi` | dispersion parameters: numeric vector
`pi` | zero-inflation parameter(s): numeric vector
`gamma` | offset: numeric vector
`starts` | restarts of the EM algorithm
`it.em` | (maximum) number of iterations in the EM algorithm
`epsilon` | convergence criterion for the EM algorithm
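As a minimal sketch of how the EM control arguments fit together, using data simulated as in the example further below. The specific values for `starts`, `it.em` and `epsilon` are illustrative, and reading `epsilon` as the change in log-likelihood between iterations is an assumption, not something stated by this page.

```r
set.seed(1)

# data simulation (as in the example below)
n <- 100
z <- rep(0:1, each = n/2)              # true class labels
y <- rnorm(n = n, mean = 2*z, sd = 1)  # observations
z[(n/4):n] <- NA                       # unlabelled observations

# five restarts of the EM algorithm, at most 200 iterations each,
# with a tighter convergence criterion (illustrative values);
# assumes fit.wrap is available from its package
fit <- fit.wrap(y, z, dist = "norm", starts = 5, it.em = 200, epsilon = 1e-06)
```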
This function returns the parameter estimates, the posterior probabilities, and the likelihood:

Component | Description
---|---
`posterior` | probability of belonging to class 1: numeric vector of length n
`converge` | path of the log-likelihood: numeric vector with maximum length it.em
`estim0` | parameter estimates under H0: data frame
`estim1` | parameter estimates under H1: data frame
`loglik0` | log-likelihood under H0: numeric
`loglik1` | log-likelihood under H1: numeric
`lrts` | likelihood-ratio test statistic: positive numeric
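Continuing from the sketch above, the returned components can be accessed by name. The 0.5 cut-off is an arbitrary illustration, and the note that the last element of `converge` coincides with `loglik1` is an observation from the example output below.

```r
fit$lrts                              # likelihood-ratio test statistic
fit$estim1                            # parameter estimates under H1
tail(fit$converge, 1)                 # final log-likelihood of the EM path
class1 <- which(fit$posterior > 0.5)  # observations assigned to class 1 at a 0.5 cut-off
```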
The distributions are parametrised as follows:

* Gaussian: y ~ N(mean, sd^2), with E[y] = mean and Var[y] = sd^2
* Negative binomial: y ~ NB(mu, phi), with E[y] = mu and Var[y] = mu + phi*mu^2
* Zero-inflated negative binomial: y ~ ZINB(mu, phi, pi), with E[y] = (1-pi)*mu
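The negative binomial parametrisation above matches base R's `mu`/`size` convention with `size = 1/phi`, and the zero-inflated variant mixes a point mass at zero with that negative binomial. The following sketch only illustrates these moment formulas by simulation; it is not the package's internal sampler.

```r
set.seed(1)
n <- 1e5; mu <- 5; phi <- 0.2; pi <- 0.3  # 'pi' shadows the constant, mirroring the argument name

# negative binomial: E[y] = mu, Var[y] = mu + phi*mu^2
y.nb <- rnbinom(n, mu = mu, size = 1/phi)
c(mean(y.nb), mu)                  # empirical vs theoretical mean
c(var(y.nb), mu + phi*mu^2)        # empirical vs theoretical variance

# zero-inflated negative binomial: structural zeros with probability pi
y.zinb <- ifelse(runif(n) < pi, 0, rnbinom(n, mu = mu, size = 1/phi))
c(mean(y.zinb), (1 - pi)*mu)       # empirical vs theoretical mean
```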
```r
# data simulation
n <- 100
z <- rep(0:1,each=n/2)
y <- rnorm(n=n,mean=2*z,sd=1)
z[(n/4):n] <- NA

# model fitting
fit.wrap(y,z,dist="norm")
#> $posterior
#>   [1] 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000
#>   [6] 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000
#>  [11] 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000
#>  [16] 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000
#>  [21] 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.8425921935
#>  [26] 0.4084662331 0.0066437359 0.0245016147 0.0312726042 0.7853968391
#>  [31] 0.2318178842 0.0131493443 0.0449175639 0.0276646813 0.0368652934
#>  [36] 0.0325069881 0.3195242749 0.0004680964 0.2854726258 0.0387377356
#>  [41] 0.1510768584 0.3550874100 0.6020975215 0.2035236657 0.0697451110
#>  [46] 0.0713482334 0.3921015126 0.0026858865 0.3711345096 0.0049133548
#>  [51] 0.8218491385 0.8852113235 0.9470413234 0.8238578622 0.9904841501
#>  [56] 0.8625968713 0.9894537082 0.8687773658 0.5464222455 0.8422754448
#>  [61] 0.1874638518 0.9559626410 0.9323836723 0.5652770911 0.9764486990
#>  [66] 0.7183888664 0.9665881190 0.3501853431 0.6524783755 0.9383541242
#>  [71] 0.9226385563 0.9973239048 0.6918441633 0.9535939895 0.4754974138
#>  [76] 0.6311521557 0.3533258240 0.7460639013 0.3347035229 0.7490640459
#>  [81] 0.9821081960 0.9441602989 0.5505044669 0.8397911658 0.7766631154
#>  [86] 0.9445805976 0.9201395562 0.7524431750 0.9078371861 0.9357469764
#>  [91] 0.9552854771 0.9716816087 0.9469883654 0.9366126681 0.5172012021
#>  [96] 0.9727968251 0.9750004007 0.9630710243 0.3459891189 0.4238540425
#>
#> $converge
#>  [1] -175.6058 -174.4763 -173.5012 -172.5741 -171.7009 -170.9523 -170.3971
#>  [8] -170.0464 -169.8550 -169.7608 -169.7160 -169.6936 -169.6810 -169.6729
#> [15] -169.6672 -169.6630 -169.6598 -169.6573 -169.6555 -169.6540 -169.6529
#> [22] -169.6520 -169.6513 -169.6508 -169.6504 -169.6501 -169.6499 -169.6497
#> [29] -169.6495 -169.6494 -169.6493
#>
#> $estim0
#>   p0     mean0      sd0 p1 mean1 sd1
#> 1  1 0.9049049 1.431286  0   NaN NaN
#>
#> $estim1
#>        p0      mean0      sd0      p1    mean1      sd1
#> 1 0.41292 0.01860373 1.093092 0.58708 2.005021 1.007327
#>
#> $loglik0
#> [1] -177.827
#>
#> $loglik1
#> [1] -169.6493
#>
#> $lrts
#> [1] 16.35543
```
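In this output, the fit under H0 collapses to a single class (`estim0` has p0 = 1, so the class-1 parameters are NaN), whereas `estim1` recovers means close to the simulated values 0 and 2. The test statistic equals twice the difference of the returned log-likelihoods: 2 * (-169.6493 - (-177.827)) is approximately 16.36, matching `lrts`.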