Radhakrishna Rao, C. (1947). *Minimum variance and the estimation of several parameters*. Mathematical Proceedings of the Cambridge Philosophical Society, 43(2), pp. 280-283. ISSN 0305-0041.

Full text not available from this repository.

Official URL: http://journals.cambridge.org/production/action/cj...

Related URL: http://dx.doi.org/10.1017/S0305004100023471

## Abstract

With the help of certain inequalities concerning the elements of the dispersion matrix of a set of statistics, and of the information matrix, the following results have been proved. Some of these inequalities are extensions of results given by Fisher (1) in the case of a single parameter.

(i) Efficient statistics are explicit functions of the minimal set of sufficient statistics.

(ii) Functions of the minimal set of sufficient statistics, satisfying the property of uniqueness defined in the text, are best unbiased estimates. Under certain conditions estimates possessing exactly the minimum possible variance can be obtained by the method of maximum likelihood.

(iii) In large samples maximum likelihood estimates supply efficient statistics in the case of several parameters.

(iv) The importance of replacing the sample by an exhaustive set of sufficient statistics (referred to in this paper as the minimal set) as a first step in any methodological problem has been stressed by R. A. Fisher in various articles and lectures. The above discussion supplies a formal demonstration of this view so far as the problem of estimation is concerned.
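The inequalities the abstract refers to relate the dispersion (covariance) matrix of a set of estimates to the inverse of the information matrix, the multiparameter form of what is now called the Cramér-Rao bound. As a small numerical sketch (not taken from the paper; the distribution, sample size, and simulation setup below are illustrative assumptions): for a normal sample, the information matrix for (μ, σ²) is diagonal with entry n/σ² for μ, so the variance of any unbiased estimate of μ is at least σ²/n, and the maximum likelihood estimate (the sample mean) attains it.

```python
import numpy as np

# Illustrative setup (hypothetical values, not from the paper):
# n observations from N(mu, sigma^2); the information matrix for
# (mu, sigma^2) is diag(n/sigma^2, n/(2*sigma^4)), so the lower
# bound on Var(mu_hat) for an unbiased estimate is sigma^2 / n.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 3.0, 2.0, 50, 20000

# The maximum likelihood estimate of mu is the sample mean;
# simulate many samples and record the estimate for each.
mu_hats = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)

crlb = sigma**2 / n          # bound from the information matrix
empirical = mu_hats.var()    # observed variance of the MLE across trials

print(f"CRLB = {crlb:.4f}, empirical Var(mu_hat) = {empirical:.4f}")
```

Across the simulated trials the empirical variance of the sample mean matches the bound σ²/n closely, consistent with result (ii): a function of the minimal sufficient statistics attaining the minimum possible variance.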

| Item Type | Article |
|---|---|
| Source | Copyright of this article belongs to Cambridge University Press. |
| ID Code | 71474 |
| Deposited On | 25 Nov 2011 12:39 |
| Last Modified | 25 Nov 2011 12:39 |
