This is another partial \(F\) test. Here, the alternatives are:
\(H_0\): \(\beta_q=\beta_{q+1}=\cdots =\beta_{p-1}=0\)
\(H_{\alpha}\): not all of the \(\beta_k\) in \(H_0\) equal zero
\[ F^* = \frac{SSR(X_q,\cdots,X_{p-1}|X_1,\cdots, X_{q-1})}{p-q}\div \frac{SSE(X_1,\cdots,X_{p-1})}{n-p}\]
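As a quick illustration (not part of the original notes), the sketch below computes this partial \(F^*\) by fitting the full and reduced models with ordinary least squares; the helper names (`sse`, `partial_F`), the design matrix `X`, the response `y`, and the index `q` are all hypothetical.

```python
# Minimal sketch of the partial F test F* above (hypothetical helper names).
import numpy as np
from scipy import stats

def sse(y, X):
    """Error sum of squares from an OLS fit of y on X (intercept added)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return resid @ resid

def partial_F(y, X, q):
    """F* for H0: beta_q = beta_{q+1} = ... = beta_{p-1} = 0.

    X holds the predictors X1,...,X_{p-1} as columns; q is 1-based, 2 <= q <= p-1.
    """
    n, k = X.shape
    p = k + 1                                # number of regression coefficients
    sse_full = sse(y, X)                     # SSE(X1,...,X_{p-1}), df = n - p
    sse_red = sse(y, X[:, :q - 1])           # SSE(X1,...,X_{q-1}), df = n - q
    extra_ssr = sse_red - sse_full           # SSR(Xq,...,X_{p-1} | X1,...,X_{q-1})
    F_star = (extra_ssr / (p - q)) / (sse_full / (n - p))
    return F_star, stats.f.sf(F_star, p - q, n - p)   # statistic and p-value
```

For example, with four predictors (\(p=5\)) and \(q=3\), `partial_F(y, X, 3)` tests whether \(X_3\) and \(X_4\) can be dropped from the model.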
When tests about regression coefficients are desired that do not involve testing whether one or several \(\beta_k\) equal zero, extra sums of squares cannot be used, and the general linear test approach requires separate fittings of the full and reduced models. For instance, for the full model (1) containing three \(X\) variables, consider:
\(H_0\): \(\beta_1=\beta_2\)
\(H_{\alpha}\): \(\beta_1\neq\beta_2\)
The procedure would be to fit the full model (1), and then the reduced model
\[ Y = \beta_0 + \beta_c(X_1 + X_2) + \beta_3X_3 + \varepsilon \]
where \(\beta_c\) denotes the common value of \(\beta_1\) and \(\beta_2\) under \(H_0\). The general linear test statistic \(F^*\) then has 1 and \(n-4\) degrees of freedom.
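A short sketch of this fit (hypothetical variable names, not from the notes): the reduced model is obtained by regressing \(Y\) on the single combined predictor \(X_1 + X_2\) together with \(X_3\), and the two error sums of squares are compared.

```python
# Minimal sketch of the general linear test of H0: beta1 = beta2 (hypothetical names).
import numpy as np
from scipy import stats

def sse(y, X):
    """Error sum of squares from an OLS fit of y on X (intercept added)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return resid @ resid

def test_common_slope(y, x1, x2, x3):
    n = len(y)
    sse_full = sse(y, np.column_stack([x1, x2, x3]))   # full model (1), df = n - 4
    sse_red = sse(y, np.column_stack([x1 + x2, x3]))   # reduced model, df = n - 3
    F_star = ((sse_red - sse_full) / 1) / (sse_full / (n - 4))   # 1 and n - 4 df
    return F_star, stats.f.sf(F_star, 1, n - 4)
```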
Another example where extra sums of squares cannot be used is the following test for regression model (1):
\(H_0\): \(\beta_1=3\) and \(\beta_3=5\)
\(H_{\alpha}\): not both equalities in \(H_0\) hold
Here, the reduced model (3) would be
\[ Y - 3X_{1} - 5X_{3} = \beta_0 + \beta_2X_2 + \varepsilon \]
Note that \(Y - 3X_{1} -5 X_{3}\) is the new response variable in the reduced model (3), since \(\beta_1X_1\) and \(\beta_3X_3\) are known constants under \(H_0\). We then use the general linear test statistic \(F^*\) with 2 and \(n-4\) degrees of freedom.
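The same idea in code (again only a sketch with hypothetical names): form the transformed response \(Y - 3X_1 - 5X_3\), fit the reduced model on \(X_2\) alone, and compare with the full fit using 2 and \(n-4\) degrees of freedom.

```python
# Minimal sketch of the test of H0: beta1 = 3, beta3 = 5 (hypothetical names).
import numpy as np
from scipy import stats

def sse(y, X):
    """Error sum of squares from an OLS fit of y on X (intercept added)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return resid @ resid

def test_fixed_coefficients(y, x1, x2, x3):
    n = len(y)
    sse_full = sse(y, np.column_stack([x1, x2, x3]))   # full model (1), df = n - 4
    y_new = y - 3 * x1 - 5 * x3                        # new response under H0
    sse_red = sse(y_new, x2)                           # reduced model (3), df = n - 2
    F_star = ((sse_red - sse_full) / 2) / (sse_full / (n - 4))   # 2 and n - 4 df
    return F_star, stats.f.sf(F_star, 2, n - 4)
```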
See HW#4 for more details.
Equation (4) is very important.