

 Teor. Veroyatnost. i Primenen., 1975, Volume 20, Issue 3, Pages 463–487 (Mi tvp3161)

Asymptotically optimal tests in mathematical statistics

A. A. Borovkov

Novosibirsk

Abstract: Let $X=(x_1,…,x_n)$ be a sample from a distribution $F(x)$, and $\Phi=\{F^\theta\}_{\theta\in\Theta}$, $\Theta\subset R^k$, be a parametric set of distribution functions (d.f.).
We consider asymptotically optimal tests for the following problems.
1. Testing the hypothesis $H_1=\{F=F_1\}$, $F_1\in\Phi$ against $H_2=\{F\ne F_1,F\in\Phi\}$.
2. Testing the hypothesis $H_1=\{F=F_1\}$ against $H_2=\{F\ne F_1\}$ with the help of grouping observations in intervals. In other words, we consider problem 1 in the special case when $\Theta$ consists of either discrete d.f.'s with discontinuities at fixed points or of d.f.'s with constant densities on fixed intervals.
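The grouping device in problem 2 can be sketched in a few lines: fixed cut points reduce the sample to multinomial cell counts, and hypotheses about $F$ are then tested through the induced cell probabilities. The cut points and sample below are illustrative assumptions, not constructions from the paper.

```python
import numpy as np

# Grouping observations in fixed intervals (a minimal sketch):
# the sample is replaced by the vector of counts over the cells
# determined by fixed cut points, i.e. by a multinomial statistic.
rng = np.random.default_rng(0)
x = rng.normal(size=200)                  # sample of size n = 200
cuts = np.array([-1.0, 0.0, 1.0])         # fixed interval boundaries
cells = np.searchsorted(cuts, x)          # cell index in 0..3 for each x_i
counts = np.bincount(cells, minlength=4)  # multinomial counts over 4 cells
freqs = counts / counts.sum()             # empirical cell frequencies
```

Any test of $H_1$ versus $H_2$ is then a function of `counts` alone, which is what makes the grouped problems tractable as finite-dimensional multinomial problems.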
3. Testing the complex hypothesis $H_1=\{F\in\Phi_\alpha\}$ against $H_2=\{F\in\Phi-\Phi_\alpha\}$ where $\Phi_\alpha=\{F^{(\alpha,\beta)}\}_{\beta\in B}$ is a set of $F^\theta\in\Phi$ with a fixed part $\alpha$ of coordinates of the vector $\theta=(\alpha,\beta)$ ($\alpha$ and $\beta$ are subvectors of $\theta$). Here the set $\Theta$ is of the form $A\times B$, $A$ and $B$ being the sets of values of $\alpha$ and $\beta$ respectively.
For instance, this class contains the problem of testing the hypothesis $H_1=\{\mathbf Ex_i=0\}$ that a sample $X$ from a normal population with unknown variance has zero mean.
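The normal-mean example just mentioned is the classical one-sample $t$-test; a minimal sketch (the sample parameters are illustrative assumptions):

```python
import numpy as np

# Testing H1: E x_i = 0 for a normal sample with unknown variance.
# The studentized mean is the standard statistic for this problem.
rng = np.random.default_rng(3)
x = rng.normal(loc=0.0, scale=2.0, size=40)       # sample, variance unknown
n = len(x)
t_stat = x.mean() / (x.std(ddof=1) / np.sqrt(n))  # reject H1 if |t_stat| is large
```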
4. Testing, with the help of grouping, the hypothesis $H_1$ that $F$ belongs to some parametric set $\Phi=\{F^\theta\}_{\theta\in\Theta}$ against the alternative $H_2=\{F\notin\Phi\}$.
5. Two-sample problems. Let $Y=(y_1,…,y_m)$ be a sample from an unknown distribution $G$ and let $\Phi_\alpha$ be the above subset of $\Phi$. We consider the problem of testing the complex hypothesis $H_1=\bigcup_\alpha\{F\in\Phi_\alpha,G\in\Phi_\alpha\}$ against the complementary alternative.
For instance, the well-known Behrens–Fisher problem of testing the equality of the mean values in the samples $X$ and $Y$ from normal populations with unknown variances belongs to this class of problems.
6. Testing the hypothesis of homogeneity $H_1=\{F=G\}$ in the two-sample problem, when in the above conditions, $\alpha=\theta$ and $\Phi_\alpha=\Phi_\theta$ consists of the element $F^\theta$ only.
7. Testing homogeneity $H_1=\{F=G\}$ in the general case with the help of grouping observations (without the assumption that $F$ and $G$ belong to the same parametric set).
8. The problem of pattern recognition. Let $X_1,…,X_s$ be $s$ samples from unknown d.f.'s $F_1,…,F_s$, respectively, all belonging to a parametric set $\Phi$. A sample $Y$ is known to come from one of $F_1,…,F_s$. It is required to determine from which d.f. the sample $Y$ was drawn.
9. The problem of pattern recognition in the general case with the help of grouping observations and without the assumption that $F_1,…,F_s$ belong to the same parametric set.
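A hypothetical sketch of the pattern-recognition problem 8: each training sample $X_j$ estimates its own distribution (here a normal location family $N(\mu_j,1)$, an illustrative choice, not the paper's construction), and $Y$ is assigned to the d.f. that maximizes the likelihood of $Y$.

```python
import numpy as np

# Maximum-likelihood assignment of Y to one of s fitted distributions.
# Family N(mu_j, 1) and all sample parameters are assumptions for
# illustration only.
rng = np.random.default_rng(2)
samples = [rng.normal(loc=mu, size=50) for mu in (-2.0, 0.0, 2.0)]
y = rng.normal(loc=2.0, size=30)            # known to match one of the F_j

mus = [s.mean() for s in samples]           # fitted location parameters
loglik = [-0.5 * np.sum((y - mu) ** 2) for mu in mus]  # up to a constant
j_hat = int(np.argmax(loglik))              # decide: Y was drawn from F_{j_hat}
```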
We find optimal tests for the above problems on the basis of Bayes' approach. A remarkable fact is that the form of the optimal decision function depends, asymptotically, only weakly on the a priori distributions and the loss functions, provided these characteristics are continuous and do not vary too fast. This makes it possible to construct universal tests (independent of the a priori characteristics) whose losses are asymptotically equivalent to those of the optimal tests.
We also discuss the accuracy of the statements of the problems and the form of the tests obtained. For instance, for the simplest problem 1 the asymptotically optimal test has the following form: the hypothesis $H_1=\{F=F_1\}$ should be rejected if $\max\limits_{\theta\in\Theta}L(X\mid\theta)-L(X\mid\theta_1)>c$, where $L(X\mid\theta)$ is the logarithmic likelihood function (under some smoothness conditions), $\theta_1$ is the parameter corresponding to $F_1$, and the constant $c$ determines the significance level of the test.
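The rejection rule for problem 1 can be sketched as follows, with a normal location family $N(\theta,1)$ as an illustrative choice (the family and sample are assumptions; the paper treats a general smooth parametric set):

```python
import numpy as np

def log_lik(x, theta):
    # Logarithmic likelihood of an N(theta, 1) sample (illustrative family).
    return -0.5 * np.sum((x - theta) ** 2) - 0.5 * len(x) * np.log(2 * np.pi)

def lrt_statistic(x, theta1):
    # max over Theta = R of L(X | theta) is attained at the sample mean,
    # so the statistic is L(X | theta_hat) - L(X | theta_1).
    theta_hat = x.mean()
    return log_lik(x, theta_hat) - log_lik(x, theta1)

rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=1.0, size=100)
t = lrt_statistic(x, theta1=0.0)  # reject H1 = {theta = theta_1} if t > c
```

For this family the statistic collapses to $\tfrac{n}{2}(\bar x-\theta_1)^2$, which makes the role of the threshold $c$ as a significance-level knob explicit.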

Full text: PDF file (1457 kB)

English version:
Theory of Probability and its Applications, 1976, 20:3, 447–469


Citation: A. A. Borovkov, “Asymptotically optimal tests in mathematical statistics”, Teor. Veroyatnost. i Primenen., 20:3 (1975), 463–487; Theory Probab. Appl., 20:3 (1976), 447–469

Citation in format AMSBIB
\Bibitem{Bor75}
\by A.~A.~Borovkov
\paper Asymptotically optimal tests in mathematical statistics
\jour Teor. Veroyatnost. i Primenen.
\yr 1975
\vol 20
\issue 3
\pages 463--487
\mathnet{http://mi.mathnet.ru/tvp3161}
\mathscinet{http://www.ams.org/mathscinet-getitem?mr=381085}
\zmath{https://zbmath.org/?q=an:0352.62020}
\transl
\jour Theory Probab. Appl.
\yr 1976
\vol 20
\issue 3
\pages 447--469
\crossref{https://doi.org/10.1137/1120055}