### 5.1 Example of Multiple Regression Analysis

Research Question : What are the factors that influence the economic performance of a company? Economic performance is measured by the return on capital employed.
Methodology : Multiple Regression Analysis
Dataset : FIN.DAT
##### SYNTAX
```
$RUN REGRESSN
$FILES
PRINT  = FIN2.LST
DICTIN = FIN.DIC
DATAIN = FIN.DAT
$SETUP
MULTIPLE REGRESSION ANALYSIS
MDHANDLING=20 -
PRINT=(DICT,MATRIX)
METHOD=STANDARD -
DEPVAR=V2 VARS=(V3-V14) PARTIALS=(V4,V6)
```
##### Extract from Computer output

After filtering, 40 cases were read from the input data file.

Multiple Regression Analysis
Number of variables = 13

Number of cases = 40

General statistics

```
  Variable                                Standard              Range
   Number       Sum          Average     Deviation        Max          Min      Variable name

      3      1220.00000     30.50000     22.94028      91.0000        .0000     GEARRAT
      4      6107.00000    152.67500     83.14257     346.0000        .0000     CAPINT
      5      1501.00000     37.52500     99.04415     625.0000     -32.0000     WCFTDT
      6     17400.00000    435.00000    101.31469     629.0000        .0000     LOGSALE
      7     17481.00000    437.02500     57.28762     625.0000      341.0000    LOGASST
      8      7241.00000    181.02500    194.75408    1298.0000       29.0000    CURRAT
      9      4761.00000    119.02500    203.46316    1298.0000       14.0000    QUIKRAT
     10      1336.00000     33.40000     19.63879      94.0000        .0000     NFATAST
     11      1062.00000     26.55000     17.27597      74.0000        .0000     INVTAST
     12      2087.00000     52.17500     27.45522     110.0000        .0000     FATTOT
     13      1283.00000     32.07500     31.34865     183.0000        .0000     PAYOUT
     14      1029.00000     25.72500     31.16045     147.0000      -58.0000    WCFTCL
      2       573.00000     14.32500     13.47816      57.0000      -18.0000    RETCAP
```

Total correlation matrix, R(i,j)

```
Variable      3        4        5        6        7        8        9       10       11       12       13       14        2

    3     1.00000
    4      .25408  1.00000
    5     -.24469  -.15269  1.00000
    6      .25469   .44428   .17024  1.00000
    7      .03872  -.16479   .52682   .56174  1.00000
    8     -.33094  -.35385   .16414  -.64180  -.04598  1.00000
    9     -.32016  -.35849   .17734  -.66285  -.02466   .98477  1.00000
   10     -.04826  -.03200   .26213   .35414   .28465  -.26450  -.20830  1.00000
   11      .19325   .21390  -.22603   .16725  -.27516  -.20229  -.34905  -.36819  1.00000
   12     -.0435    .10083   .14592   .36966   .14491  -.29820  -.26876   .84863  -.19877  1.00000
   13     -.16966  -.07478   .06026   .19473   .15331  -.15550  -.19910   .20157   .10792   .08638  1.00000
   14     -.55195  -.23789   .32532  -.31345   .18291   .70115   .70703   .03454  -.32794  -.01796  -.05759  1.00000
    2     -.17029   .30785   .07720   .29985   .15358  -.09854  -.11095  -.32696   .11418  -.25051  -.03738   .32642  1.00000
```

Analysis 1

Standard regression dependent variable is V2 RETCAP

The partial correlation matrix
Variables held constant: 4 6

```
Variable      3        5        7        8        9       10       11       12       13       14        2

    3     1.00000
    5     -.27302  1.00000
    7     -.04828   .47985  1.00000
    8     -.21347   .34952   .53320  1.00000
    9     -.197     .38335   .61543   .97390  1.00000
   10     -.12105   .17064  -.01927  -.07684   .01671  1.00000
   11      .13581  -.23179  -.44482  -.11173  -.31292  -.44481  1.00000
   12     -.14312   .07365  -.15024  -.09390  -.04169   .83264  -.27670  1.00000
   13     -.20755  -.02046  -.05949  -.06026  -.11533   .10765   .11015   .00185  1.00000
   14     -.50536   .39053   .47616   .68298   .69899   .14231  -.28136   .10313  -.01801  1.00000
    2     -.31124   .08531   .11754   .15278   .14616  -.46072   .03702  -.40171  -.06739   .50166  1.00000
```

```
Standard error of estimate                 7.371
F ratio for the regression                 8.618
Multiple correlation coefficient          .89049     adjusted  .83723
Fraction of explained variance (RSQD)     .79297     adjusted  .70095
Determinant of the correlation matrix     .79478E-05
Residual degrees of freedom (N-K-1)       27
Constant term                             19.797
```

```
                                                Partial
Var.no.     B     Sigma(B)    Beta   Sigma(Beta)  RSQD   Marg RSQD  T-ratio  Cov.ratio  Variable name
   3     -.0287    .0719    -.0488     .1224      .0059    .0012      .3991     .4881    GEARRAT
   4     -.0135    .0231    -.0831     .1426      .0124    .0026      .5827     .6231    CAPINT
   5      .0075    .0154     .0548     .1130      .0086    .0018      .4848     .3997    WCFTDT
   6      .1143    .0356     .8594     .2673      .2769    .0793     3.2155     .8927    LOGSALE
   7     -.0796    .0475    -.3384     .2020      .0942    .0215     1.6756     .8120    LOGASST
   8     -.2470    .0925   -3.5690    1.3373      .2087    .0546     2.6688     .9957    CURRAT
   9      .2116    .0953    3.1944    1.4388      .1544    .0378     2.2202     .9963    QUIKRAT
  10     -.5052    .1419    -.7362     .2068      .3195    .0972     3.5606     .8206    NFATAST
  11      .2711    .1877     .3475     .2406      .0717    .0160     1.4442     .8676    INVTAST
  12     -.0109    .0910    -.0221     .1854      .0005    .0001      .1194     .7769    FATTOT
  13      .0342    .0459     .0796     .1067      .0202    .0043      .7462     .3265    PAYOUT
  14      .4242    .0682     .9806     .1576      .5891    .2968     6.2219     .6913    WCFTCL
```

##### INTERPRETATION

IDAMS reports analysis specifications:
Number of variables = 13
Number of cases = 40

Descriptive statistics of predictor variables
Matrix of correlation coefficients between the variables
Matrix of second order partial correlation coefficients; Variables V4 and V6 held constant.

The most striking feature of the analysis is the extremely low value of the determinant of the correlation matrix, .79478E-05, which is very close to zero. This indicates severe multicollinearity.

The covariance ratio of a variable is the squared multiple correlation coefficient, R2, of that variable with the other p-1 predictor variables in the equation. It is a measure of intercorrelation: variables that contribute to multicollinearity can be identified from high values of the covariance ratio.

Covariance ratios of several predictor variables are particularly high:

LOGSALE: 0.8927
LOGASST: 0.8120
CURRAT:  0.9957
NFATAST: 0.8206
INVTAST: 0.8676

The standard error of estimate (7.371) is about one half of the mean of the dependent variable (14.325), implying that the reliability of prediction by the multiple regression model is poor.

The regression model explains 70% of the adjusted variance of the dependent variable.

As we shall see in example EX5(2), a regression model with just three variables in the equation explains 80% of the adjusted variance in the dependent variable.

The F ratio of the full-scale model, though statistically significant, is much less than that for the reduced model (see Example EX5(2)).

```
                                        F ratio      df
Full-scale model with 12 predictors      8.618    (12, 27)
Reduced model with 3 predictors         54.080     (3, 36)
```

In view of the high multicollinearity, there is hardly any point in interpreting the statistical significance of individual predictors.
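The two multicollinearity diagnostics used above, the determinant of the correlation matrix and the covariance ratios, are easy to verify outside IDAMS. Below is a minimal Python/NumPy sketch; the data are synthetic stand-ins for FIN.DAT, and the only identity relied on is that the diagonal of the inverted correlation matrix contains the variance inflation factors:

```python
import numpy as np

def multicollinearity_diagnostics(X):
    """X: (n_cases, p) matrix of predictor values.
    Returns the determinant of the predictor correlation matrix and
    each predictor's covariance ratio, i.e. the R-squared of that
    predictor regressed on the other p-1 predictors."""
    R = np.corrcoef(X, rowvar=False)      # p x p correlation matrix
    det = np.linalg.det(R)
    # diag(R^-1) holds the variance inflation factors (VIFs),
    # and R-squared_j = 1 - 1/VIF_j.
    cov_ratio = 1.0 - 1.0 / np.diag(np.linalg.inv(R))
    return det, cov_ratio

# Illustrative data only, not the FIN.DAT values: two nearly
# collinear predictors plus one independent predictor.
rng = np.random.default_rng(0)
z = rng.normal(size=(40, 1))
X = np.hstack([z + 0.1 * rng.normal(size=(40, 1)),
               z + 0.1 * rng.normal(size=(40, 1)),
               rng.normal(size=(40, 1))])
det, cov_ratio = multicollinearity_diagnostics(X)
print(det)        # near zero, signalling multicollinearity
print(cov_ratio)  # near 1 for the two collinear columns
```

A determinant near zero together with covariance ratios near 1 reproduces the pattern flagged in the printout above.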

### 5.2 Example of Stepwise Multiple Regression Analysis

Research Question : What are the factors that influence the economic performance of a company? Economic performance is measured by the return on capital employed.
Methodology : Stepwise Multiple Regression Analysis
Dataset : FIN.DAT
##### SYNTAX
```
$RUN REGRESSN
$FILES
PRINT  = FINANCE.LST
DICTIN = FIN.DIC
DATAIN = FIN.DAT
$SETUP
MULTIPLE REGRESSION ANALYSIS
MDHANDLING=20 -
PRINT=(DICT,MATRIX)
METHOD=STEP -
DEPVAR=V2 -
VARS=(V3-V14) -
FINRATIO=4.0 -
FOUTRATIO=3.9 -
PRINT=STEP
```
##### EXTRACT FROM COMPUTER PRINTOUT

After filtering 40 cases read from the input data file

Number of variables = 13

Number of cases = 40

General statistics

```  Variable                          Standard                            Range
Number        Sum          Average       Deviation       Max           Min          Variable name

3         6.51000        .16275         .29387          .6300        -1.2800        WCFTCL
4         5.50000        .13750         .26862          .6000        -1.2800        WCFTDT
5       186.62000       4.66550         .56091         5.7600         3.8500        LOGSALE
6       178.48000       4.46200         .54956         5.7800         3.5400        LOGASST
7        12.18000        .30450         .29506         1.7800          .0000        GEARRAT
8        72.61000       1.81525         .96339         5.4400          .3600        CAPINT
9        12.48000        .31200         .16801          .7200          .0400        NFATAST
10        18.64000        .46600         .24250         1.1600          .0700        FATTOT
11        10.31000        .25775         .12751          .5000          .0000        INVTAST
12        20.99000        .52475         .67732         4.2100          .0000        PAYOUT
13        32.78000        .81950         .43338         2.6300          .2400        QUIKRAT
14        56.81000       1.42025         .57876         3.9800          .5400        CURRAT
2         5.89000        .14725         .13832          .3800         -.5000        RETCAP
```

Total correlation matrix, R(i,j)

```
Variable      3        4        5        6        7        8        9       10       11       12       13       14        2

    3     1.00000
    4      .97571  1.00000
    5      .1495    .10485  1.00000
    6      .28584   .23865   .92288  1.00000
    7     -.8059   -.85831  -.23171  -.26941  1.00000
    8     -.27071  -.25965   .32128  -.02663   .04959  1.00000
    9      .23904   .28834  -.15964  -.02112   .01890  -.28992  1.00000
   10      .22706   .25003  -.17268  -.06891   .06415  -.27656   .89277  1.00000
   11     -.29647  -.29789   .21327   .00413   .08117   .47623  -.51601  -.45605  1.00000
   12      .06847   .06442  -.10710  -.14761  -.20794   .04481  -.04754   .02995   .00440  1.00000
   13      .36545   .27387  -.11104   .03927  -.24832  -.35899  -.17360  -.13197  -.23884   .15625  1.00000
   14      .26125   .17287   .06019   .13924  -.22736  -.18670  -.28630  -.25807   .20761   .13312   .85640  1.00000
    2      .83467   .84840   .18609   .22485  -.79247  -.04618  -.00229  -.03428  -.15417  -.02805   .14635   .07227  1.00000
```

Step No. 0

Dependent variable is V2 RETCAP

F-level to enter = 4.000
F-level to remove = 3.900
Standard error of Y = .1383

Variable numbers
3 4 5 6 7 8 9 10 11 12 13 14 2
**************** Listing of marginal R-squares for all potential predictors ***

```
Step no.  Var. no.  Variable name   Marg RSQD   Previously in (*)
    0         3     WCFTCL            .6967
    0         4     WCFTDT            .7198
    0         5     LOGSALE           .0346
    0         6     LOGASST           .0506
    0         7     GEARRAT           .6280
    0         8     CAPINT            .0021
    0         9     NFATAST           .0000
    0        10     FATTOT            .0012
    0        11     INVTAST           .0238
    0        12     PAYOUT            .0008
    0        13     QUIKRAT           .0214
    0        14     CURRAT            .0052
```

Step No 1

Variable entered 4 WCFTDT

F-level 97.610

T-level 9.880

```
Standard error of estimate                .7418E-01
F ratio for the regression                97.610
Multiple correlation coefficient          .84840     adjusted  .84404
Fraction of explained variance (RSQD)     .71979     adjusted  .71241
Determinant of the correlation matrix     1.0000
Residual degrees of freedom (N-K-1)       38
Constant term                             .87180E-01
```

```
                                                Partial
Var.no.     B     Sigma(B)    Beta   Sigma(Beta)  RSQD   Marg RSQD  T-ratio  Cov.ratio  Variable name
   4      .4369    .0442     .8484     .0859      .7198    .7198     9.8798     .0000    WCFTDT
```

**************** Listing of marginal R-squares for all potential predictors ***

```
Step no.  Var. no.  Variable name   Marg RSQD   Previously in (*)
    1         3     WCFTCL            .0010
    1         4     WCFTDT            .7198          *
    1         5     LOGSALE           .0095
    1         6     LOGASST           .0005
    1         7     GEARRAT           .0157
    1         8     CAPINT            .0325
    1         9     NFATAST           .0665
    1        10     FATTOT            .0648
    1        11     INVTAST           .0107
    1        12     PAYOUT            .0069
    1        13     QUIKRAT           .0080
    1        14     CURRAT            .0057
```

Step No 2

Variable entered 9 NFATAST

F-level 11.513

T-level 3.393

```
Standard error of estimate                .6565E-01
F ratio for the regression                68.063
Multiple correlation coefficient          .88673     adjusted  .88019
Fraction of explained variance (RSQD)     .78628     adjusted  .77473
Determinant of the correlation matrix     .91686
Residual degrees of freedom (N-K-1)       37
Constant term                             .15086
```

```
                                                Partial
Var.no.     B     Sigma(B)    Beta   Sigma(Beta)  RSQD   Marg RSQD  T-ratio  Cov.ratio  Variable name
   4      .4769    .0409     .9261     .0794      .7863    .7863    11.6673     .0831    WCFTDT
   9     -.2217    .0653    -.2693     .0794      .2373    .0665     3.3930     .0831    NFATAST
```

**************** Listing of marginal R-squares for all potential predictors ***

```
Step no.  Var. no.  Variable name   Marg RSQD   Previously in (*)
    2         3     WCFTCL            .0004
    2         4     WCFTDT            .7863          *
    2         5     LOGSALE           .0022
    2         6     LOGASST           .0000
    2         7     GEARRAT           .0003
    2         8     CAPINT            .0153
    2         9     NFATAST           .0665          *
    2        10     FATTOT            .0032
    2        11     INVTAST           .0004
    2        12     PAYOUT            .0102
    2        13     QUIKRAT           .0277
    2        14     CURRAT            .0321
```

Step No 3

Variable entered 14 CURRAT

F-level 6.367

T-level 2.523

```
Standard error of estimate                .6135E-01
F ratio for the regression                54.080
Multiple correlation coefficient          .90465     adjusted  .89625
Fraction of explained variance (RSQD)     .81840     adjusted  .80327
Determinant of the correlation matrix     .77647
Residual degrees of freedom (N-K-1)       36
Constant term                             .23146
```

```
                                                Partial
Var.no.     B     Sigma(B)    Beta   Sigma(Beta)  RSQD   Marg RSQD  T-ratio  Cov.ratio  Variable name
   4      .5048    .0398     .9803     .0772      .8174    .8128    12.6937     .1542    WCFTDT
   9     -.2805    .0654    -.3407     .0794      .3385    .0929     4.2917     .1996    NFATAST
  14     -.0465    .0184    -.1947     .0772      .1503    .0321     2.5233     .1531    CURRAT
```

**************** Listing of marginal R-squares for all potential predictors ***

```
Step no.  Var. no.  Variable name   Marg RSQD   Previously in (*)
    3         3     WCFTCL            .0028
    3         4     WCFTDT            .8128          *
    3         5     LOGSALE           .0017
    3         6     LOGASST           .0001
    3         7     GEARRAT           .0007
    3         8     CAPINT            .0065
    3         9     NFATAST           .0929          *
    3        10     FATTOT            .0032
    3        11     INVTAST           .0000
    3        12     PAYOUT            .0068
    3        13     QUIKRAT           .0008
    3        14     CURRAT            .0321          *
```

Completed 3 steps of regression

##### INTERPRETATION
IDAMS reports analysis specifications:

Number of variables = 13
Number of cases = 40
Descriptive statistics of all predictor variables
Correlation matrix of predictor + dependent variables

Step No. 0: No variables are in the model yet. The correlation matrix shows that V4 (WCFTDT) has the highest correlation with the dependent variable (.84840) and the highest marginal RSQD (.7198); hence V4 is entered at Step 1.

Step No. 1: Variable V4 enters with F ratio = 97.610 >> FINRATIO. With V4 in the equation, V9 (NFATAST) is now the best candidate, since it has the highest marginal RSQD.

Step No. 2: Variable V9 enters with F ratio = 11.513. With V4 and V9 in the equation, the best candidate is now V14 (CURRAT), since it has the highest marginal RSQD after V4 and V9.

Step No. 3: V14 enters with F ratio = 6.367. After this step no other variable qualifies for entry into the regression equation.

The fitted regression model is:

RETCAP = .23146 + .5048 WCFTDT - .2805 NFATAST - .0465 CURRAT

Adequacy of the fitted model:

F ratio = 54.080, df (3, 36), p < .001: highly significant.

Standard error of estimate = .06135, which is quite low, implying high reliability of estimation.

Determinant of the correlation matrix = .77647. A value close to 0 indicates multicollinearity; a value close to 1 indicates no multicollinearity. Recall that for the full-scale model the determinant of the correlation matrix was .79478E-05, which is close to 0, implying high multicollinearity.

The standard error of the estimate is now much less than that for the full-scale model:

Full-scale model: 7.371
Reduced model:    0.06135
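The stepwise logic the printout walks through (enter the candidate with the largest marginal R-squared as long as its F-to-enter exceeds FINRATIO) can be sketched in a few lines of Python/NumPy. This is an illustration of the general forward-selection method, not IDAMS's exact implementation, and the data are synthetic stand-ins for FIN.DAT:

```python
import numpy as np

def r_squared(X, y):
    """R-squared of the least-squares fit of y on X (with intercept)."""
    A = np.column_stack([np.ones(len(y)), X])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def forward_stepwise(X, y, f_to_enter=4.0):
    """Greedy forward selection: at each step enter the candidate with
    the largest marginal R-squared, provided its F-to-enter exceeds
    f_to_enter (the analogue of FINRATIO=4.0)."""
    n, p = X.shape
    selected, r2 = [], 0.0
    while len(selected) < p:
        candidates = [j for j in range(p) if j not in selected]
        # marginal R-squared = gain in R-squared from adding variable j
        gains = {j: r_squared(X[:, selected + [j]], y) - r2 for j in candidates}
        best = max(gains, key=gains.get)
        k = len(selected) + 1                  # predictors after entry
        f = gains[best] / ((1 - (r2 + gains[best])) / (n - k - 1))
        if f < f_to_enter:
            break                              # no candidate qualifies
        selected.append(best)
        r2 += gains[best]
    return selected, r2

# Synthetic data: y depends on columns 2 and 4 only.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 6))
y = 0.9 * X[:, 2] - 0.5 * X[:, 4] + 0.3 * rng.normal(size=40)
selected, r2 = forward_stepwise(X, y)
print(selected, round(r2, 3))
```

With these data the two truly predictive columns enter first and selection stops once no remaining candidate clears the F-to-enter threshold, mirroring the three-step run above.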

### 5.3 Example of Multiple Classification Analysis Program

Research Question : What is the influence of institutional setting and rank of academic scientists on the percentage of work time devoted to teaching?
Methodology : Multiple Classification Analysis
Dataset : ANJU.DAT
##### SYNTAX
```
$RUN MCA
$FILES
PRINT  = MCA.LST
DICTIN = ANJU.DIC
DATAIN = ANJU.DAT
$SETUP
INCLUDE V9 = 1-3
TIME SPENT ON TEACHING
PRINT=CDICT
DEPVAR=V2 CONVARS=(V9,V14) PRINT=TABLES
DEPVAR=V2 CONVARS=(V9,V14) OUTLIERS=EXCL, OUTDIS=2.0
```
##### Extract from Computer output

After filtering, 1055 cases were read from the input data file.
3 cases contained illegal characters and were treated according to the BADDATA specification.

Analysis - 1 TIME SPENT ON TEACHING

Dependent variable
Name: v262:teaching *
Include outliers? YES

a { Test for convergence: PCTMEAN, .00500

Print coefficients? NO
Number of predictors: 2
Predictor list
Variable Name   Number of codes
9 v204:rank * 3
14 sv:inst type * 4
```
Number of cases eliminated
   due to dependent variable requirements   17
   due to weight requirements                0
   due to predictor requirements             0
Number of cases remaining: 1038
Number of outlying cases:     0
```

Weighted frequency table # 1
Column var 14: sv:inst type
Row var 9: v204:rank

```
                    sv:inst type (V14)
v204:rank (V9)     1      2      3      4

      1           97    147     39     80
      2          123     96     17    130
      3          148     42      2    117
```

Results based on test 2, iteration 3

Dependent variable statistics

Dependent variable (y) = 2: v262:teaching
Mean = 41.935450
Standard deviation = 18.106530
Sum of Y = 43529.000
Sum of Y square = 2165385.0
Total sum of squares = 339976.70
Explained sum of square = 72102.410
Residual sum of squares = 267874.30
Number of cases = 1038
Predictor summary statistics

Predictor 9: v204:rank

```
Class  Label       No of   Sum of   Percents   Class     Deviation from   Coefficient    Adjusted     Stand. dev.
                   cases   weights             mean      grand mean                      mean

  1    prof         363      363      35.0     34.8237    -7.1117600      -4.9859400    36.949510    16.076364
  2    reader       366      366      35.3     42.4399      .50443920       .32019400   42.255650    16.957495
  3    lecturer     309      309      29.8     49.6926     7.7571050       5.4780100    47.413460    18.412111
```

Eta-square = .10896630      Beta-square = .53927850E-01

Eta = .33010040        Beta = .23222370

Predictor 14: sv:inst type

```
Class  Label       No of   Sum of   Percents   Class     Deviation from   Coefficient    Adjusted     Stand. dev.
                   cases   weights             mean      grand mean                      mean

  1    type1        368      368      35.5     48.2283     6.2928090       5.2969040    47.232360    17.572491
  2    type2        285      285      27.5     36.2246    -5.7108900      -4.0543360    37.881110    14.869104
  3    type3         58       58       5.6     18.9828   -22.952690      -19.882820     22.052630     9.8808540
  4    type4        327      327      31.5     43.9021     1.9666890       1.0991750    43.034630    17.741054
```

Eta-square = .16380060 Beta-square = .11275420

Eta = .40472280       Beta = .33578890

Analysis summary statistics

R-squared (unadjusted) = proportion of variation explained by fitted model = .21208
Adjustment for degrees of freedom = 1.00484

Dependent variable 2: v262:teaching

Listing of Betas in descending order

```
Rank   Var. no.   Name           Beta
  1       14      sv:inst type   .33578890
  2        9      v204:rank      .23222370
```

##### INTERPRETATION

IDAMS reports analysis specifications:

Number of cases read = 1055
Number of cases eliminated due to bad data/missing values in the dependent variable = 17
Number of cases processed = 1038

a = Parameters for convergence

Variables:

Dependent:  V2 (v262) : Time spent on teaching
Predictors: V9 : Rank (3 categories)
            V14 : Type of institutions (4 categories)

Bivariate frequency table
V9 (rows) × V14 (columns)
Rank × Type of institutions

Dependent variable statistics
(A)  Grand mean: 41.935450
     Standard deviation: 18.106530

Predictor Summary Statistics
First predictor V9 : Rank
(B)  Class mean = Average time spent by the academic scientist of a given category

Deviation from the grand mean = Class mean – Grand mean

Coefficient = Deviation of the class mean from the grand mean after holding constant the other predictors (in this case V14 )

Adjusted mean = Grand mean + Coefficient

(a) Grand mean = 41.935450
V9, Class 1 (Prof.):
(b) Class mean = 34.8237
(c) Deviation = (b) - (a) = -7.1117600
(d) Coefficient = -4.9859400
(e) Adjusted mean = (a) + (d) = 41.935450 - 4.985940 = 36.949510
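The coefficients above come from fitting the additive model y ≈ grand mean + effect(rank) + effect(institution type). One way to obtain such coefficients is to alternate class-mean adjustments until they settle, sketched below with made-up data rather than ANJU.DAT (IDAMS's actual iteration details may differ):

```python
import numpy as np

def mca_coefficients(y, p1, p2, iters=100):
    """Fit y ~ grand_mean + a[p1] + b[p2] by alternately re-estimating
    each class coefficient from the residuals left by the other
    predictor (a sketch of an iterative additive fit)."""
    gm = y.mean()
    a = {c: 0.0 for c in np.unique(p1)}
    b = {c: 0.0 for c in np.unique(p2)}
    for _ in range(iters):
        for c in a:
            m = p1 == c
            a[c] = (y[m] - gm - np.array([b[k] for k in p2[m]])).mean()
        for c in b:
            m = p2 == c
            b[c] = (y[m] - gm - np.array([a[k] for k in p1[m]])).mean()
    # MCA convention: coefficients average to zero over the cases,
    # so fold the case-weighted means back into the grand mean.
    sa = np.mean([a[k] for k in p1])
    sb = np.mean([b[k] for k in p2])
    a = {c: v - sa for c, v in a.items()}
    b = {c: v - sb for c, v in b.items()}
    return gm + sa + sb, a, b

# Made-up teaching-time percentages for 8 hypothetical cases.
y  = np.array([30.0, 35, 40, 45, 38, 48, 33, 44])
p1 = np.array([1, 1, 2, 2, 1, 2, 1, 2])   # e.g. rank class
p2 = np.array([1, 2, 1, 2, 2, 2, 1, 1])   # e.g. institution type
gm, a, b = mca_coefficients(y, p1, p2)
print(gm + a[1])   # adjusted mean for class 1 of the first predictor
```

The final line mirrors step (e) above: adjusted mean = grand mean + coefficient.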

Eta square = .10896630
Eta square indicates that approximately 11 per cent of the variance in the dependent variable (i.e. time spent on teaching) is explained by V9 (rank)

Eta square is an approximate measure of the relationship between V9 (rank) and V2 (percentage of time spent on teaching)

Eta square adjusted indicates that approximately 10.7% of the variance in the dependent variable (V2) is explained by V9, while holding constant the other predictor.

Unadjusted deviation sum of squares = Sum of squares for the predictor based on unadjusted deviations from grand mean.

Adjusted deviations sum of squares = As above but based on adjusted deviations of the predictor from the grand mean.
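Eta square itself is just the between-class sum of squares divided by the total sum of squares. A small self-contained check with hypothetical scores (not the ANJU.DAT values):

```python
import numpy as np

def eta_square(y, classes):
    """Between-class sum of squares / total sum of squares
    for a single categorical predictor."""
    gm = y.mean()
    total_ss = ((y - gm) ** 2).sum()
    between_ss = sum((classes == c).sum() * (y[classes == c].mean() - gm) ** 2
                     for c in np.unique(classes))
    return between_ss / total_ss

# Hypothetical teaching-time percentages for three ranks.
y = np.array([30.0, 35, 32, 42, 44, 40, 50, 52, 48])
rank = np.array([1, 1, 1, 2, 2, 2, 3, 3, 3])
print(eta_square(y, rank))
```

A value near 1 means the class means account for most of the variation; a value near 0 means the predictor explains almost nothing.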

Summary Statistics

R2 unadjusted indicates that approximately 21% of the variance in the dependent variable is explained by the fitted model.
Adjustment for degrees of freedom = 1.00484

There is not much difference between the unadjusted and adjusted values of R2, which implies that there is hardly any interaction between the two predictors.

The predictors are ranked by Beta values, which show relative importance of the two predictors.

The institutional setting of an academic scientist appears to have a greater effect on the time spent on teaching than rank does.

Analysis - 2 TIME SPENT ON TEACHING

Dependent variable
Name: v262:teaching *
Subscript: 2
Max. code: 9999999.000
Include MD1? NO
Include MD2? NO
Include outliers? NO

a { Test for convergence: PCTMEAN, .00500

Print coefficients? NO
Number of predictors: 2
Predictor list
Variable   Name            Number of codes
    9      v204:rank *            3
   14      sv:inst type *         4

```
Number of cases eliminated
   due to dependent variable requirements   71
   due to weight requirements                0
   due to predictor requirements             0
Number of cases remaining: 984
Number of outlying cases:  54
```

Results based on test 2, iteration 3

Dependent variable statistics

Dependent variable (y) = 2: v262:teaching
Mean = 40.723580
Standard deviation = 15.807100
Sum of Y = 40072.000
Sum of Y square = 1877492.0
Total sum of squares = 245616.80
Explained sum of square = 51493.110
Residual sum of squares = 194123.70
Number of cases = 984

Predictor summary statistics

Predictor 9: v204:rank

```
Class  Label       No of   Sum of   Percents   Class     Deviation from   Coefficient    Adjusted     Stand. dev.
                   cases   weights             mean      grand mean                      mean

  1    prof         352      352      35.8     34.4375    -6.2860760      -4.6923460    36.031230    14.398938
  2    reader       353      353      35.9     41.4731      .74951230       .57253610   41.296110    15.608127
  3    lecturer     279      279      28.4     47.7061     6.9825180       5.1957000    45.919270    14.615196
```

Eta-square = .11281920   Beta-square = .62690240E-01

Eta = .33588570    Beta = .25038020

Predictor 14: sv:inst type

```
Class  Label       No of   Sum of   Percents   Class     Deviation from   Coefficient    Adjusted     Stand. dev.
                   cases   weights             mean      grand mean                      mean

  1    type1        342      342      34.8     45.9649     5.2413370       4.3874430    45.111020    14.975011
  2    type2        280      280      28.5     36.5429    -4.1807180      -2.6519350    38.071640    14.179025
  3    type3         54       54       5.5     20.2593   -20.464320      -17.698160     23.025420     8.9825516
  4    type4        308      308      31.3     42.2922     1.5686320       .64200960    41.365590    15.220176
```

Eta-square = .15333510   Beta-square = .10420150

Eta = .39158030   Beta = .32280260

Analysis summary statistics
R-squared (unadjusted) = proportion of variation explained by fitted model = .20965
Adjustment for degrees of freedom = 1.00511

Dependent variable 2: v262:teaching

Listing of Betas in descending order

```
Rank   Var. no.   Name           Beta
  1       14      sv:inst type   .32280260
  2        9      v204:rank      .25038020
```

##### INTERPRETATION

In this analysis outlier cases have been eliminated.
IDAMS reports analysis specifications:

Number of cases read = 1055
Number of cases eliminated due to bad data/missing values = 17
Number of cases eliminated as outliers = 54
Total number of cases eliminated = 71
Number of cases processed = 984
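The outlier screening behind these counts is driven by the OUTDIS=2.0 parameter in the setup: a case is set aside when its residual from the fitted model lies too far from the typical residual. A sketch of such a filter (illustrative numbers; the exact distance measure IDAMS uses may differ):

```python
import numpy as np

def exclude_outliers(y, fitted, outdis=2.0):
    """Keep cases whose residual lies within `outdis` standard
    deviations of the mean residual (cf. OUTLIERS=EXCL, OUTDIS=2.0)."""
    resid = y - fitted
    return np.abs(resid - resid.mean()) <= outdis * resid.std()

y      = np.array([40.0, 42, 38, 41, 95, 39])   # one wild value
fitted = np.full_like(y, 40.0)                  # stand-in model predictions
keep = exclude_outliers(y, fitted)
print(keep)   # the case with value 95 is flagged for exclusion
```

After filtering, the model is refitted on the retained cases only, which is exactly what Analysis 2 does with its 984 remaining cases.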

a = Parameters for convergence

Variables:
Dependent:  V2 (v262) : Time spent on teaching
Predictors: V9 : Rank (3 categories)
            V14 : Type of institutions (4 categories)

Dependent variable statistics
(C)      Grand mean : 40.723580
Standard Deviation: 15.807100

Predictor Summary Statistics
First predictor V9 : Rank
(D)  Class mean = 34.4375 (average time spent by an academic scientist of the given category)
Deviation from the grand mean = Class mean – Grand mean

Coefficient = Deviation of the class mean from the grand mean after holding constant the other predictors (in this case V14 )

Adjusted mean = Grand mean + Coefficient

(a) Grand mean = 40.723580
V9, Class 1 (Prof.):
(b) Class mean = 34.4375
(c) Deviation = (b) - (a) = -6.2860760
(d) Coefficient = -4.6923460
(e) Adjusted mean = (a) + (d) = 40.723580 - 4.692346 = 36.031230

Eta square = .11281920
Eta square indicates that approximately 11.3 per cent of the variance in the dependent variable (i.e. time spent on teaching) is explained by V9 (rank)

Eta square is an approximate measure of the relationship between V9 (rank) and V2 (percentage of time spent on teaching)

Eta square adjusted indicates that approximately 11.1% of the variance in the dependent variable (V2) is explained by V9, while holding constant other predictors.

Unadjusted deviation sum of squares = Sum of squares for the predictor based on unadjusted deviations from the grand mean.

Adjusted deviations sum of squares = As above, but based on adjusted deviations of the predictor from the grand mean.

Summary Statistics
R2 unadjusted indicates that approximately 21% of the variance in the dependent variable is explained by the fitted model.
Adjustment for degrees of freedom = 1.00511