Radhakrishna Rao, C. (1973) *Unified theory of least squares*. Communications in Statistics, 1(1), pp. 1-8. ISSN 0361-0918

Full text not available from this repository.

Official URL: http://www.tandfonline.com/doi/abs/10.1080/0361092...

Related URL: http://dx.doi.org/10.1080/03610927208827002

## Abstract

Let (Y, Xβ, σ²G) be the Gauss-Markoff model, where E(Y) = Xβ and D(Y) = E(Y − Xβ)(Y − Xβ)′ = σ²G, and A′ denotes the transpose of a matrix A. Further let β̂ be a stationary point (supposed to exist for all Y) of (Y − Xβ)′M(Y − Xβ), i.e., a point where its derivative with respect to β is the zero vector. It is shown that if p′β̂ is the BLUE of p′β for every p ∈ S(X′), the linear space generated by the columns of X′, and f⁻¹(Y − Xβ̂)′M(Y − Xβ̂), with f = R(G : X) − R(X), is an unbiased estimator of σ², where R(V) denotes the rank of V, then it is necessary and sufficient that M is a symmetric g-inverse of (G + XUX′), where U is any symmetric matrix such that S(G : X) = S(G + XUX′). The method is valid whether or not G is singular and whether or not X has full rank. A simple choice of U is always U = k²I, k ≠ 0.
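The estimator in the abstract can be sketched numerically. The following is a minimal illustration in NumPy, not the paper's own code: it takes U = k²I (the simple choice noted above), uses the Moore-Penrose pseudoinverse as one particular symmetric g-inverse of (G + k²XX′), and solves the stationarity equations X′MXβ = X′MY. The function name `unified_ls` and the default k = 1 are assumptions for the sketch.

```python
import numpy as np

def unified_ls(Y, X, G, k=1.0):
    """Sketch of Rao's unified least-squares estimator (assumed helper).

    Takes M as a symmetric g-inverse of (G + k^2 X X'), i.e. the choice
    U = k^2 I.  A stationary point of (Y - Xb)' M (Y - Xb) solves the
    equations X'M X b = X'M Y.  The construction is meant to work even
    when G is singular or X lacks full column rank.
    """
    n, p = X.shape
    # Moore-Penrose pseudoinverse: one symmetric g-inverse of (G + XUX').
    M = np.linalg.pinv(G + k**2 * (X @ X.T))
    # Stationarity equations; pinv picks a particular solution when
    # X'MX is singular (beta_hat itself need not be unique then).
    beta_hat = np.linalg.pinv(X.T @ M @ X) @ (X.T @ M @ Y)
    resid = Y - X @ beta_hat
    # Degrees of freedom f = R(G : X) - R(X), as in the abstract.
    f = np.linalg.matrix_rank(np.hstack([G, X])) - np.linalg.matrix_rank(X)
    sigma2_hat = float(resid @ M @ resid) / f  # unbiased for sigma^2
    return beta_hat, sigma2_hat
```

With G = I the model reduces to the classical full-rank case, so β̂ should agree with ordinary least squares and σ̂² with the usual residual-variance estimator e′e/(n − p).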

| Item Type: | Article |
|---|---|
| Source: | Copyright of this article belongs to Taylor and Francis Group. |
| Keywords: | Least Squares Theory; G-inverse; Linear Estimation; Gauss-Markoff Model; Singular Dispersion Matrix; Unified Theory for Linear Models; Linear Hypotheses |
| ID Code: | 71688 |
| Deposited On: | 28 Nov 2011 04:04 |
| Last Modified: | 28 Nov 2011 04:04 |
