Suppose that the regression involves two sets of variables, $X_1$ and $X_2$. Thus,
$$ y = X\beta + \varepsilon = X_1\beta_1 + X_2\beta_2 + \varepsilon. $$
The normal equations are
$$ \begin{bmatrix} X_1'X_1 & X_1'X_2 \\ X_2'X_1 & X_2'X_2 \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \end{bmatrix} = \begin{bmatrix} X_1'y \\ X_2'y \end{bmatrix}. $$
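To make the system concrete, here is a minimal NumPy sketch (the data-generating process is purely illustrative) that assembles the partitioned normal equations, solves $(X'X)b = X'y$ directly, and checks the result against a standard least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100
X1 = rng.standard_normal((n, 2))
X2 = rng.standard_normal((n, 3))
y = rng.standard_normal(n)

X = np.hstack([X1, X2])

# Assemble the partitioned normal equations and solve (X'X) b = X'y.
XtX = np.block([[X1.T @ X1, X1.T @ X2],
                [X2.T @ X1, X2.T @ X2]])
Xty = np.concatenate([X1.T @ y, X2.T @ y])
b_normal = np.linalg.solve(XtX, Xty)

# Same coefficients as a direct least-squares fit of y on [X1, X2].
b_lstsq = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.allclose(b_normal, b_lstsq))   # True
```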
Theorem (Orthogonal Partitioned Regression). In the multiple linear least squares regression of $y$ on two sets of variables $X_1$ and $X_2$, if the two sets of variables are orthogonal, that is, $X_1'X_2 = 0$, then the separate coefficient vectors can be obtained by separate regressions of $y$ on $X_1$ alone and $y$ on $X_2$ alone.
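As a quick numerical check (a minimal sketch; the seed, dimensions, and coefficient values are illustrative), the following constructs two exactly orthogonal blocks of regressors and confirms that the joint regression and the two separate regressions return the same coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Build two orthogonal blocks: residualize Z on X1 so that X1'X2 = 0
# by construction (up to floating point).
X1 = rng.standard_normal((n, 2))
Z = rng.standard_normal((n, 3))
X2 = Z - X1 @ np.linalg.solve(X1.T @ X1, X1.T @ Z)

y = X1 @ np.array([1.0, -2.0]) + X2 @ np.array([0.5, 0.0, 3.0]) \
    + rng.standard_normal(n)

# Joint regression of y on [X1, X2].
b = np.linalg.lstsq(np.hstack([X1, X2]), y, rcond=None)[0]

# Separate regressions of y on X1 alone and on X2 alone.
b1 = np.linalg.lstsq(X1, y, rcond=None)[0]
b2 = np.linalg.lstsq(X2, y, rcond=None)[0]

print(np.allclose(b[:2], b1), np.allclose(b[2:], b2))   # True True
```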
Theorem (Frisch-Waugh-Lovell Theorem). In the linear least squares regression of the vector $y$ on two sets of variables, $X_1$ and $X_2$, the subvector $b_2$ is the set of coefficients obtained when the residuals from a regression of $y$ on $X_1$ alone are regressed on the set of residuals obtained when each column of $X_2$ is regressed on $X_1$.
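The theorem is easy to verify numerically. Below is a minimal sketch (again with an illustrative data-generating process, deliberately making $X_2$ correlated with $X_1$): the subvector $b_2$ from the full regression matches the coefficients from regressing the residualized $y$ on the residualized columns of $X_2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X1 = rng.standard_normal((n, 2))
X2 = rng.standard_normal((n, 3)) + X1[:, [0]]   # correlated with X1 on purpose
y = X1 @ np.array([1.0, -1.0]) + X2 @ np.array([2.0, 0.5, -0.3]) \
    + rng.standard_normal(n)

# Full regression: b2 is the coefficient block on X2 in y on [X1, X2].
b = np.linalg.lstsq(np.hstack([X1, X2]), y, rcond=None)[0]
b2_full = b[2:]

def resid(A, v):
    """Residuals from regressing v (vector or matrix) on A."""
    return v - A @ np.linalg.lstsq(A, v, rcond=None)[0]

# FWL: residualize y and each column of X2 on X1, then regress
# residuals on residuals.
y_star = resid(X1, y)
X2_star = resid(X1, X2)
b2_fwl = np.linalg.lstsq(X2_star, y_star, rcond=None)[0]

print(np.allclose(b2_full, b2_fwl))   # True
```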
Theorem (Change in the Sum of Squares When a Variable is Added to a Regression). If $e'e$ is the sum of squared residuals when $y$ is regressed on $X$ and $u'u$ is the sum of squared residuals when $y$ is regressed on $X$ and $z$, then
$$ u'u = e'e - c^2 (z_*'z_*) \le e'e, $$
where $c$ is the coefficient on $z$ in the long regression of $y$ on $[X, z]$ and $z_* = M_X z$ is the vector of residuals when $z$ is regressed on $X$.
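A short numerical verification of this identity (a sketch; the data are simulated purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
X = rng.standard_normal((n, 3))
z = rng.standard_normal(n) + X[:, 0]
y = X @ np.array([1.0, 2.0, -1.0]) + 0.5 * z + rng.standard_normal(n)

def sum_sq_resid(A, v):
    """Sum of squared residuals and residual vector from regressing v on A."""
    r = v - A @ np.linalg.lstsq(A, v, rcond=None)[0]
    return r @ r, r

ee, _ = sum_sq_resid(X, y)                 # short regression: y on X
Xz = np.column_stack([X, z])
uu, _ = sum_sq_resid(Xz, y)                # long regression: y on [X, z]

c = np.linalg.lstsq(Xz, y, rcond=None)[0][-1]   # coefficient on z
_, z_star = sum_sq_resid(X, z)             # residuals of z on X

print(np.isclose(uu, ee - c**2 * (z_star @ z_star)))   # True
```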
We want to know how the variation of $y$ is explained by the variation of $x$. Letting $M^0 = I - \frac{1}{n}\,\mathbf{i}\mathbf{i}'$ denote the idempotent matrix that transforms observations into deviations from their sample means, the total variation decomposes as
$$ y'M^0 y = b'X'M^0 X b + e'e, $$
that is, the total sum of squares (SST) equals the regression sum of squares (SSR) plus the error sum of squares (SSE). We can obtain a measure of how well the regression line fits the data by using the coefficient of determination:
$$ R^2 = \frac{SSR}{SST} = \frac{b'X'M^0 X b}{y'M^0 y} = 1 - \frac{e'e}{y'M^0 y}. $$
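In code, the decomposition and both forms of $R^2$ can be checked directly (a sketch; it assumes the regression includes a constant, which the decomposition requires):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])  # constant included
y = X @ np.array([0.5, 1.0, -2.0]) + rng.standard_normal(n)

b = np.linalg.lstsq(X, y, rcond=None)[0]
yhat = X @ b
e = y - yhat

ss_total = (y - y.mean()) @ (y - y.mean())           # y'M0y      (SST)
ss_reg = (yhat - yhat.mean()) @ (yhat - yhat.mean()) # b'X'M0Xb   (SSR)
ss_err = e @ e                                       # e'e        (SSE)

print(np.isclose(ss_total, ss_reg + ss_err))     # SST = SSR + SSE
print(ss_reg / ss_total, 1.0 - ss_err / ss_total)  # two equal forms of R^2
```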
Theorem (Change in $R^2$ When a Variable is Added to a Regression). Let $R^2_{Xz}$ be the coefficient of determination in the regression of $y$ on $X$ and an additional variable $z$, let $R^2_X$ be the same for the regression of $y$ on $X$ alone, and let $r^*_{yz}$ be the partial correlation between $y$ and $z$, controlling for $X$. Then
$$ R^2_{Xz} = R^2_X + \left(1 - R^2_X\right) r^{*2}_{yz}, $$
where the partial correlation $r^*_{yz}$ is the simple correlation between $y_* = M_X y$ and $z_* = M_X z$, so that the square of the partial correlation coefficient is
$$ r^{*2}_{yz} = \frac{(z_*'y_*)^2}{(z_*'z_*)(y_*'y_*)}. $$
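The identity can be verified numerically as well (a sketch with simulated data; $X$ must contain a constant for the $R^2$ identity to hold exactly):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])  # includes a constant
z = rng.standard_normal(n)
y = X @ np.array([0.5, 1.0, -1.0]) + 0.8 * z + rng.standard_normal(n)

def resid(A, v):
    return v - A @ np.linalg.lstsq(A, v, rcond=None)[0]

def r2(A, v):
    e = resid(A, v)
    return 1.0 - (e @ e) / ((v - v.mean()) @ (v - v.mean()))

R2_X = r2(X, y)
R2_Xz = r2(np.column_stack([X, z]), y)

# Squared partial correlation: simple correlation of the two residual vectors.
y_star, z_star = resid(X, y), resid(X, z)
r_yz2 = (z_star @ y_star) ** 2 / ((z_star @ z_star) * (y_star @ y_star))

print(np.isclose(R2_Xz, R2_X + (1.0 - R2_X) * r_yz2))   # True
```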
The adjusted $R^2$ (adjusted for degrees of freedom), which incorporates a penalty for adding regressors, is computed as follows:
$$ \bar{R}^2 = 1 - \frac{e'e/(n-K)}{y'M^0 y/(n-1)}. $$
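A direct implementation (a minimal sketch; `adjusted_r2` is an illustrative helper name and assumes $X$ contains a constant):

```python
import numpy as np

def adjusted_r2(X, y):
    """Adjusted R^2 of a regression of y on X (X assumed to include a constant)."""
    n, K = X.shape
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    sst = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (e @ e / (n - K)) / (sst / (n - 1))

# Example: adding an irrelevant regressor never lowers R^2,
# but it can lower the adjusted R^2.
rng = np.random.default_rng(5)
n = 50
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([1.0, 2.0]) + rng.standard_normal(n)
Xz = np.column_stack([X, rng.standard_normal(n)])   # irrelevant column
print(adjusted_r2(X, y), adjusted_r2(Xz, y))
```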