18 Simple Structure

Learning Objectives

After completing this session, you will be able to:

  • explain Promax rotation.

As a special note, did we really achieve simple structure? Rotation helps us approach simple structure, but if the interrelationships among the items do not lend themselves to simple structure, the only remedy is to modify the model. In this case we chose to remove Item 2 from our model.

Promax Rotation

Promax rotation begins with a Varimax (orthogonal) rotation and then raises the loadings to the power Kappa, which shrinks the small loadings toward zero far more than the large ones. Promax also runs faster than Direct Oblimin: in our example Promax took 3 iterations while Direct Quartimin (Direct Oblimin with Delta = 0) took 5 iterations.
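As a minimal sketch, the Promax counterpart of our FACTOR syntax would look something like the following; the value in parentheses after PROMAX is Kappa (4 is the SPSS default):

FACTOR
  /VARIABLES q01 q02 q03 q04 q05 q06 q07 q08
  /MISSING LISTWISE
  /ANALYSIS q01 q02 q03 q04 q05 q06 q07 q08
  /PRINT INITIAL EXTRACTION ROTATION
  /CRITERIA FACTORS(2) ITERATE(100)
  /EXTRACTION PAF
  /ROTATION PROMAX(4)
  /METHOD=CORRELATION.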

Quiz

True or False

  1. Varimax, Quartimax and Equamax are three types of orthogonal rotation and Direct Oblimin, Direct Quartimin and Promax are three types of oblique rotations.

Answers: 1. T.

Generating Factor Scores

Suppose the Principal Investigator is happy with the final factor analysis, the two-factor Direct Quartimin solution. She has a hypothesis that SPSS Anxiety and Attribution Bias predict student scores in an introductory statistics course, so she would like to use the factor scores as predictors in a new regression analysis. Since a factor is by nature unobserved, we first need to predict, or generate, plausible factor scores. In SPSS there are three methods of factor score generation: Regression, Bartlett, and Anderson-Rubin.

Generating factor scores using the Regression Method in SPSS

In order to generate factor scores, run the same factor analysis model but click on Factor Scores (Analyze – Dimension Reduction – Factor – Factor Scores). Then check Save as variables, pick the Method and optionally check Display factor score coefficient matrix.

The code pasted into the SPSS Syntax Editor looks like this:

FACTOR
  /VARIABLES q01 q02 q03 q04 q05 q06 q07 q08
  /MISSING LISTWISE
  /ANALYSIS q01 q02 q03 q04 q05 q06 q07 q08
  /PRINT INITIAL EXTRACTION ROTATION FSCORE
  /PLOT EIGEN ROTATION
  /CRITERIA FACTORS(2) ITERATE(100)
  /EXTRACTION PAF
  /CRITERIA ITERATE(100) DELTA(0)
  /ROTATION OBLIMIN
  /SAVE REG(ALL)
  /METHOD=CORRELATION.

Here we picked the Regression approach after fitting our two-factor Direct Quartimin solution. After generating the factor scores, SPSS adds two new variables, FAC1_1 and FAC2_1 (the first and second factors), to the end of your variable list, which you can view in Data View. The figure below shows these scores for the first 5 participants. They are now ready to be entered into another analysis as predictors.
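For example, to test the Principal Investigator's hypothesis, the saved scores could be entered as predictors in an ordinary regression. The sketch below assumes a hypothetical outcome variable named statscore (the student's score in the introductory statistics course), which is not part of this dataset:

REGRESSION
  /STATISTICS COEFF R ANOVA
  /DEPENDENT statscore
  /METHOD=ENTER FAC1_1 FAC2_1.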

For those who want to understand how the scores are generated, we can refer to the Factor Score Coefficient Matrix. These are essentially the regression weights that SPSS uses to generate the scores. We know that the ordered pair of factor scores for the first participant is (−0.880, −0.113), and that the 8 item scores for the first participant are 2, 1, 4, 2, 2, 2, 3, 1. However, what SPSS actually uses is the standardized scores, which are easily obtained in SPSS via Analyze – Descriptive Statistics – Descriptives – Save standardized values as variables. The standardized scores are −0.452, −0.733, 1.32, −0.829, −0.749, −0.2025, 0.069, −1.42. Using the Factor Score Coefficient Matrix, we multiply the participant's standardized scores by the corresponding column of coefficients. For the first factor:

(0.284)(−0.452) + (−0.048)(−0.733) + (−0.171)(1.32) + (0.274)(−0.829) + (0.197)(−0.749) + (0.048)(−0.2025) + (0.174)(0.069) + (0.133)(−1.42) = −0.880,

which matches FAC1_1 for the first participant. Repeating the same procedure with the second column of coefficients gives FAC2_1.

Factor Score Coefficient Matrix
Item   Factor 1   Factor 2
1       0.284      0.005
2      -0.048     -0.019
3      -0.171     -0.045
4       0.274      0.045
5       0.197      0.036
6       0.048      0.095
7       0.174      0.814
8       0.133      0.028
Extraction Method: Principal Axis Factoring. Rotation Method: Oblimin with Kaiser Normalization. Factor Scores Method: Regression.
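You can verify this by hand in SPSS. The sketch below standardizes the items and applies the first column of coefficients from the table above; DESCRIPTIVES with /SAVE creates the standardized variables Zq01 through Zq08, and fac1_check is simply a hypothetical name for the hand-computed score, which should match FAC1_1:

DESCRIPTIVES VARIABLES=q01 q02 q03 q04 q05 q06 q07 q08
  /SAVE.
COMPUTE fac1_check = 0.284*Zq01 - 0.048*Zq02 - 0.171*Zq03 + 0.274*Zq04
  + 0.197*Zq05 + 0.048*Zq06 + 0.174*Zq07 + 0.133*Zq08.
EXECUTE.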

The second table is the Factor Score Covariance Matrix:

Factor Score Covariance Matrix
Factor      1        2
1       1.897    1.895
2       1.895    1.990
Extraction Method: Principal Axis Factoring. Rotation Method: Oblimin with Kaiser Normalization. Factor Scores Method: Regression.

This table can be interpreted as the covariance matrix of the estimated factor scores; however, it equals the observed covariance matrix of the saved scores only when the factors are orthogonal. For example, if we obtain the raw covariance matrix of the saved factor scores we get:

Correlations
                     FAC1_1   FAC2_1
FAC1_1  Covariance    0.777    0.604
FAC2_1  Covariance    0.604    0.870

You will notice that these values are much lower. Let’s compare the same two tables but for Varimax rotation:

Factor Score Covariance Matrix
Factor      1        2
1       0.670    0.131
2       0.131    0.805
Extraction Method: Principal Axis Factoring. Rotation Method: Varimax with Kaiser Normalization. Factor Scores Method: Regression.

If you compare these elements to the Covariance table below, you will notice they are the same.

Correlations
                     FAC1_1   FAC2_1
FAC1_1  Covariance    0.670    0.131
FAC2_1  Covariance    0.131    0.805

Note that with the Bartlett and Anderson-Rubin methods you will not obtain the Factor Score Covariance Matrix.
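Even so, you can compute the covariance of the saved scores yourself for any of the methods. A minimal sketch using the CORRELATIONS procedure, whose XPROD option adds covariances (and cross-products) to the output, reproduces the raw covariance tables shown above:

CORRELATIONS
  /VARIABLES=FAC1_1 FAC2_1
  /STATISTICS XPROD.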

Regression, Bartlett and Anderson-Rubin compared

Each of the three methods has its pluses and minuses. The Regression method maximizes the correlation (and hence validity) between the factor scores and the underlying factor, but the scores can be somewhat biased, and even with an orthogonal solution the factor scores can still be correlated with one another. With Bartlett's method, each factor score correlates highly with its own factor and not with the others, and the scores are unbiased estimates of the true factor scores. Unbiased means that, over repeated sampling, the average of the estimated factor scores equals the average of the true factor scores. The Anderson-Rubin method rescales the factor scores so that they are uncorrelated with the other factors and with the other factor scores. Because Anderson-Rubin imposes a correlation of zero between factor scores, it is not the best choice for oblique rotations. Additionally, Anderson-Rubin scores are biased.

In summary, if you do an orthogonal rotation you can pick any of the three methods: use Bartlett if you want unbiased scores, use the Regression method if you want to maximize validity, and use Anderson-Rubin if you want the factor scores themselves to be uncorrelated with the other factor scores. If you do an oblique rotation, it is preferable to stick with the Regression method; do not use Anderson-Rubin for oblique rotations.
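In the pasted FACTOR syntax shown earlier, the choice among the three methods is controlled by the /SAVE subcommand. Keeping everything else the same, the relevant line is one of the following (ALL saves scores for all retained factors):

  /SAVE REG(ALL)
  /SAVE BART(ALL)
  /SAVE AR(ALL)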

Quiz

True or False

  1. If you want the highest correlation of the factor score with the corresponding factor (i.e., highest validity), choose the regression method.
  2. Bartlett scores are unbiased whereas Regression and Anderson-Rubin scores are biased.
  3. Anderson-Rubin is appropriate for orthogonal but not for oblique rotation because factor scores will be uncorrelated with other factor scores.

Answers: 1. T, 2. T, 3. T

License


Business Research Methods by Icfai Business School is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
