Message/Author
 
I need to run a cross-validation of my analysis on a different sample. My original CFA was conducted on the sample for Company A; now I need to estimate this model's fit on the sample for Company B. In other words, I need two steps: 1) estimate the measurement model on sample 1; 2) cross-validate: estimate the fit of the same model on sample 2 with all parameters fixed to the values estimated in step 1. I do not see an easy way to do this in Mplus (estimate model parameters on one sample, then check this model against the data in the second sample). Can such cross-validation be done automatically? Thank you!
 
You can use the SVALUES option of the OUTPUT command to obtain input with starting values that are the ending values of an analysis, for example, of sample 1. You can then change the asterisks (*) to @ to fix the parameters at those values rather than use them as starting values.
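As a sketch of this two-step workflow (file names, variable names, factor names, and all numeric values below are hypothetical, not from the poster's analysis), the Company B input is built by pasting the SVALUES section from the Company A output and changing each * to @:

```
! Run 1: calibration on sample 1 (Company A)
TITLE:    CFA on Company A;
DATA:     FILE = companyA.dat;
VARIABLE: NAMES = y1-y6;
MODEL:    f1 BY y1-y3;
          f2 BY y4-y6;
OUTPUT:   SVALUES;

! Run 2: validation on sample 2 (Company B);
! the MODEL command is the SVALUES output from run 1
! with every * changed to @ (values here are made up)
TITLE:    Cross-validation on Company B;
DATA:     FILE = companyB.dat;
VARIABLE: NAMES = y1-y6;
MODEL:    f1 BY y1@1 y2@0.85 y3@0.79;
          f2 BY y4@1 y5@0.91 y6@0.88;
          f1 WITH f2@0.42;
          ! ...plus the intercepts, residual variances, and
          ! factor variances from the SVALUES section, all with @
```

With every parameter fixed, the run 2 chi-square and fit indices describe how well the Company A solution reproduces the Company B data.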
 
Thank you, this method worked very well. I have one more question after performing the analysis: why does fixing a parameter (particularly a covariance between latent constructs) increase the degrees of freedom of the model? The number of parameters estimated goes down, but the degrees of freedom go up. Is there any text (or intuitive explanation) that would explain this?
 
Any time one parameter is fixed, the degrees of freedom increase by one. The degrees of freedom are the number of free parameters in the H1 model minus the number of free parameters in the H0 model.
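A worked example (the numbers are illustrative, not from the poster's model): with p = 6 observed variables and no mean structure,

```latex
% H1 (unrestricted) free parameters: p(p+1)/2 = 6*7/2 = 21.
% A two-factor CFA (H0) with 4 free loadings, 6 residual variances,
% 2 factor variances, and 1 factor covariance has q = 13 free
% parameters, so
\[ df = \frac{p(p+1)}{2} - q = 21 - 13 = 8 . \]
% Fixing the factor covariance leaves q = 12 free parameters:
\[ df' = 21 - 12 = 9 . \]
```

Intuitively, each fixed parameter is one more restriction the data can contradict, so the model is tested against one more constraint.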
 
Is it possible to compute a cross-validation index (CVI) like that of Cudeck and Browne (1983) in Mplus using random samples drawn from a population? Is the CVI calculated using the formula provided in the Cudeck and Browne (1983) MBR article?
 
The MacCallum et al. (1994) MBR article discusses this index and shows how it is computed using the ML fitting function, the minimization of which is reported in TECH5 of the Mplus output.
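For orientation, a sketch of the idea using the standard ML discrepancy function (see the cited articles for the exact expression they use): writing \(S_B\) for the validation-sample covariance matrix, \(\hat{\Sigma}_A\) for the model-implied covariance matrix at the calibration-sample estimates, and \(p\) for the number of observed variables, the index is the fitting function evaluated across samples:

```latex
\[
  \mathrm{CVI}
    = F_{\mathrm{ML}}\bigl(S_B, \hat{\Sigma}_A\bigr)
    = \ln\bigl|\hat{\Sigma}_A\bigr| - \ln\bigl|S_B\bigr|
      + \operatorname{tr}\bigl(S_B \hat{\Sigma}_A^{-1}\bigr) - p .
\]
```

Because the parameters are fixed at the calibration values, no estimation is needed in the validation sample; the function is simply evaluated.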
 
The TECH5 output indicates that the "F" or discrepancy function is negative, which it cannot be: Cudeck and Browne state that the "discrepancy function is a nonnegative scalar-valued function of two covariance matrices S and Sigma." Is there something special about the EM procedure that generates a negative value? Below the EM algorithm iterations is a second set of iterations using a different estimation procedure (Quasi-Newton iterations); is the final iteration value the correct F for the procedure? That number is positive.
 
Please send the output to Support.
 
Mplus outputs the sample covariance matrix with a default format of F15.8. We want the lower-triangular variance/covariance matrix in FORMAT = F5.2, but this option did not override the default. We wish to use the lower variance/covariance matrices for the sample and the implied population (Sigma) to compute the cross-validation index using Cudeck and Browne's (1983) formula (also reproduced, with some modification, by Whittaker 2006). Is there any way to override the default F15.8 format?
 
The FORMAT option of the SAVEDATA command applies only to the analysis variables, not to matrices.
|
QianLi Xue posted on Thursday, June 13, 2019 - 7:16 am
 
Can Mplus do k-fold cross-validation of a CFA model?
 
Not automatically.
 
QianLi Xue posted on Monday, June 17, 2019 - 9:01 am
 
Do you have a reference on how to do k-fold cross-validation of a CFA?
 
No. Try SEMNET.