Theory of Probability and its Applications

Theory of Probability and its Applications, 2024, Volume 69, Issue 4, Pages 791–799
DOI: https://doi.org/10.4213/tvp5606
(Mi tvp5606)
 

Short Communications

Almost sure results for generalized normal distribution

Wen L.a, Zhuang Z.b

a School of Mathematics and Big Data, Foshan University, Foshan, P. R. China
b Department of Mathematics, Hong Kong University of Science and Technology, Hong Kong, P. R. China
Abstract: We obtain almost sure convergence results characterizing fluctuations to infinity, or to a fixed point, for generalized normally distributed random variables.
Keywords: generalized normal distribution, almost sure convergence.
Funding: This research is supported by the Guangdong Provincial Philosophy and Social Science Planning Project (GD24DWQYJ04).
Received: 23.10.2022
Revised: 08.06.2023
Accepted: 17.09.2023
Published: 25.10.2024
English version:
Theory of Probability and its Applications, 2025, Volume 69, Issue 4, Pages 630–636
DOI: https://doi.org/10.1137/S0040585X97T992185
Publication type: Article

1. Introduction and main results

A random variable $X$ is said to have the normal distribution with parameters $\mu$ and $\sigma^2$ if its probability density function (p.d.f.) is given by

$$ \begin{equation*} f(x\mid \mu,\sigma)=\frac{1}{\sqrt{2\pi}\,\sigma}\exp\biggl\{-\frac{(x-\mu)^2}{2\sigma^2}\biggr\}, \end{equation*} \notag $$
for $-\infty<x<\infty$, $-\infty<\mu<\infty$, $\sigma>0$. We say that $X$ obeys the standard normal distribution if $\mu=0$ and $\sigma=1$.

In probability theory and statistics, the normal distribution is the most significant probability distribution. Throughout the physical sciences and economics, a large number of random variables are either exactly or approximately normally distributed, and the normal distribution can also be used to approximate other probability distributions. Many properties related to the normal distribution have therefore been studied. For example, let $\{X_n,\, n\geqslant1\}$ be a sequence of independent random variables with the same distribution as $X$, and suppose that $X$ obeys the standard normal distribution. Then it is well known that

$$ \begin{equation} \limsup_{n\to\infty}\frac{X_n}{\sqrt{2\log n}}=1\quad \text{a.s.} \end{equation} \tag{1.1} $$
(see [2; Example 2.1.2(a)]) and the cluster set (set of all limit points) of the sequence $\{X_n/\sqrt{2\log n},\, n\geqslant2\}$ is $[-1,1]$ with probability one (see [4]), where and in the following, $\log x=\log_e(\max\{0, x\})$ for any real number $x$, and set $\log 0=-\infty$.
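The growth rate $\sqrt{2\log n}$ in (1.1) can be illustrated numerically. The following Monte Carlo sketch (an illustration only, not part of the paper) scales the maximum of the first $n$ standard normal variables; convergence is logarithmically slow, so the ratio is somewhat below $1$ even for large $n$:

```python
import numpy as np

# Illustration of the sqrt(2 log n) growth rate in (1.1): the maximum of
# the first n standard normal variables, divided by sqrt(2 log n), should
# be close to 1 (convergence is logarithmically slow).
rng = np.random.default_rng(3)
n = 1_000_000
x = rng.standard_normal(n)
ratio = x.max() / np.sqrt(2 * np.log(n))
print(ratio)  # somewhat below 1 due to the slow convergence
```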

Formula (1.1), characterizing the fluctuations of the normal sequence, is very simple and elegant. It has been strengthened to

$$ \begin{equation} \lim_{n\to\infty}\frac{\max_{1\leqslant k\leqslant n}X_k}{\sqrt{2\log n}}=1\quad \text{a.s.}, \end{equation} \tag{1.2} $$
see [9; Example viii], or see [2; Example 2.1.2(b)].

Formula (1.2) is a special case of the almost sure asymptotic properties of extremes of independent and identically distributed random variables. For more details, one can refer to [9] and [5].

Although the normal distribution is the most popular distribution in probability and statistics, many new distributions have been proposed to meet various needs. For example, Nadarajah [7] introduced the generalized normal distribution.

A random variable $X$ is said to have the generalized normal distribution with parameters $\mu$, $\sigma$, and $s$ if its p.d.f. is given by

$$ \begin{equation} f(x|\mu,\sigma,s)=\frac{s}{2\sigma \Gamma(1/s)}\exp\biggl\{-\biggl|\frac{x-\mu}{\sigma}\biggr|^s\biggr\}, \end{equation} \tag{1.3} $$
for $-\infty<x<\infty$, $-\infty<\mu<\infty$, $\sigma>0$, and $s>0$, where $\Gamma(\,{\cdot}\,)$ is the Gamma function.
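Sampling from density (1.3) is straightforward via a change of variables: if $X$ has p.d.f. (1.3) with $\mu=0$, $\sigma=1$, then $|X|^s\sim\operatorname{Gamma}(1/s,1)$, so a gamma draw with a random sign recovers $X$. The sketch below (an illustration only; the helper name `gennorm_sample` is ours, not from the paper) uses this fact:

```python
import numpy as np

def gennorm_sample(n, mu=0.0, sigma=1.0, s=2.0, rng=None):
    """Sample from density (1.3): if G ~ Gamma(1/s, 1), then
    mu + sigma * sign * G**(1/s) has p.d.f. (1.3)."""
    rng = np.random.default_rng(0) if rng is None else rng
    g = rng.gamma(1.0 / s, 1.0, size=n)
    sign = rng.choice([-1.0, 1.0], size=n)
    return mu + sigma * sign * g ** (1.0 / s)

# Sanity check: E|X - mu|^s = sigma^s * E[Gamma(1/s, 1)] = sigma^s / s.
x = gennorm_sample(200_000, s=3.0)
print(np.mean(np.abs(x) ** 3))  # sample mean, close to 1/3
```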

The generalized normal distribution, also known as the generalized Gaussian distribution, exponential power distribution, or generalized error distribution, is a generalization of the normal distribution; the normal distribution is a special case of it. The exponential power distribution was first proposed by Subbotin [11], and Vianelli [13] provided an analytical form of the generalized error distribution density function. Box and Tiao [1] referred to this distribution as an exponential power distribution. Tadikamalla [12] provided a sampling method for the exponential power distribution. Nadarajah [7] studied relevant properties of the generalized normal distribution, such as the extreme value distribution, the failure function, moments, and order statistics, discussed maximum likelihood estimation, and gave the information matrix. Nadarajah [8] pointed out that the generalized normal distribution proposed by him coincides with the exponential power distribution. Li and Li [6] provided higher-order expansions for the normalized maximum density of generalized error distributions and used them to derive higher-order expansions for moments of extremes. Li and Nadarajah [10] studied the exact maximum likelihood estimators of the location and scale parameters of the generalized Gaussian distribution for shape parameter $s=3,4,5$.

Due to the importance of the generalized normal distribution, the goal of this paper is to extend (1.1) and (1.2) from the normal distribution to the generalized normal distribution. We now state the main results; their proofs are detailed in the next section.

Theorem 1.1. Let $\{X, X_n,\, n\geqslant1\}$ be a sequence of independent and identically distributed random variables, and suppose that $X$ has p.d.f. (1.3). Then

$$ \begin{equation*} \limsup_{n\to\infty}\frac{X_n-\mu}{(\log n)^{1/s}}=\sigma\quad\textit{a.s.}, \end{equation*} \notag $$
and the cluster set of the sequence $\{(X_n-\mu)/(\log n)^{1/s},\, n\geqslant2\}$ is $[-\sigma,\sigma]$ with probability one.

Theorem 1.2. Under the condition of Theorem 1.1, we have

$$ \begin{equation} \lim_{n\to\infty}\frac{\max_{1\leqslant k\leqslant n}(X_k-\mu)}{(\log n)^{1/s}}=\sigma\quad\textit{a.s.} \end{equation} \tag{1.4} $$
and
$$ \begin{equation} \lim_{n\to\infty}\Bigl[\max_{1\leqslant k\leqslant n}(X_k-\mu)\Bigr]^{1/(\log\log n)}=e^{1/s}\quad\textit{a.s.} \end{equation} \tag{1.5} $$
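Limit (1.4) is easy to probe numerically. The following Monte Carlo sketch (an illustration only, not part of the proof) takes $\mu=0$, $\sigma=1$, $s=2$ and samples density (1.3) via the gamma representation $|X|^s\sim\operatorname{Gamma}(1/s,1)$:

```python
import numpy as np

# Monte Carlo check of (1.4) for mu = 0, sigma = 1, s = 2:
# max_{k<=n} X_k / (log n)^{1/s} should be close to sigma = 1.
rng = np.random.default_rng(1)
s, n = 2.0, 1_000_000
g = rng.gamma(1.0 / s, 1.0, size=n)                   # G ~ Gamma(1/s, 1)
x = rng.choice([-1.0, 1.0], size=n) * g ** (1.0 / s)  # density (1.3)
ratio = x.max() / np.log(n) ** (1.0 / s)
print(ratio)  # approaches 1 slowly (logarithmic corrections)
```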

Theorems 1.1 and 1.2 give the convergence rates in $\limsup_{n\to\infty}(X_n-\mu)=\infty$ a.s. Since $\liminf_{n\to\infty}|X_n-a|=0$ a.s. for any real number $a$, we can also give the convergence rate of this relation. First, we give a general result.

Theorem 1.3. Let $\{Y,\,Y_n,\, n\geqslant1\}$ be a sequence of independent and identically distributed random variables with

$$ \begin{equation} \lim_{y\to 0^+}\frac{1}{y}\,\mathbf P\{|Y|<y\}=c_0\in(0,+\infty). \end{equation} \tag{1.6} $$
Then, we have
$$ \begin{equation} \liminf_{n\to\infty}\frac{\log|Y_n|}{\log n}=-1\quad\textit{a.s.} \end{equation} \tag{1.7} $$
and
$$ \begin{equation} \lim_{n\to\infty}\frac{\min_{1\leqslant k\leqslant n}\log |Y_k|}{\log n}=-1\quad\textit{a.s.} \end{equation} \tag{1.8} $$
Denote by $D$ the cluster set of the sequence $\{\log|Y_n|/\log n,\, n\geqslant2\}$ with probability one. Then $[-1,0]\subset D$ and $[-1,0]=D$ under the additional condition $\mathbf E|Y|^\delta<\infty$ for all $\delta>0$.

Remark 1.1. By (1.6), $\mathbf P\{Y=0\}=0$, hence $\log |Y_n|$ are well defined for all $n\geqslant1$.

Remark 1.2. Let $Y$ be a random variable with p.d.f. $g(y)$, and suppose that $g(y)$ is continuous in a neighborhood of zero with $g(0)\ne 0$. Then (1.6) holds with $c_0=2g(0)$.
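The constant $c_0=2g(0)$ in Remark 1.2 can be estimated empirically. A small sketch (ours, not from the paper) for the standard normal density, where $2g(0)=2/\sqrt{2\pi}\approx 0.7979$:

```python
import numpy as np

# Empirical check of Remark 1.2 for the standard normal density:
# P{|Y| < y} / y -> c0 = 2 g(0) = 2 / sqrt(2 pi) ~ 0.7979 as y -> 0+.
rng = np.random.default_rng(4)
y = 0.01
samples = rng.standard_normal(2_000_000)
est = np.mean(np.abs(samples) < y) / y
print(est)  # near 0.7979
```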

By Theorem 1.3, we have the following corollary.

Corollary 1.1. Under the condition of Theorem 1.1, we have, for any $a\in (-\infty, +\infty)$,

$$ \begin{equation*} \liminf_{n\to\infty}\frac{\log|X_n-a|}{\log n}=-1\quad\textit{a.s.} \end{equation*} \notag $$
and the cluster set of the sequence $\{\log|X_n-a|/\log n,\, n\geqslant2\}$ is $[-1,0]$ with probability one. Furthermore,
$$ \begin{equation*} \lim_{n\to\infty}\frac{\min_{1\leqslant k\leqslant n}\log |X_k-a|}{\log n}=-1 \quad\textit{a.s.} \end{equation*} \notag $$
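The liminf in Corollary 1.1 can also be checked by simulation. A sketch (an illustration only) with $\mu=0$, $\sigma=1$, $s=2$, $a=1$, again sampling density (1.3) through its gamma representation:

```python
import numpy as np

# Monte Carlo check of Corollary 1.1 with mu = 0, sigma = 1, s = 2, a = 1:
# min_{k<=n} log|X_k - a| / log n should be near -1 for large n.
rng = np.random.default_rng(2)
s, n, a = 2.0, 1_000_000, 1.0
g = rng.gamma(1.0 / s, 1.0, size=n)
x = rng.choice([-1.0, 1.0], size=n) * g ** (1.0 / s)  # density (1.3)
ratio = np.log(np.abs(x - a)).min() / np.log(n)
print(ratio)  # roughly -1 (fluctuations shrink slowly)
```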

Remark 1.3. Corollary 1.1 is new even in the normally distributed case.

Throughout this paper, $C$ always stands for a positive constant whose value is of no importance and may differ from one place to another.

2. Lemmas and proofs of main results

Let $\{A_n,\, n\,{\geqslant}\,1\}$ be a sequence of events. As usual the abbreviation $\{A_n\ {\rm i.o.}\}$ stands for the case that the events $A_n$ occur infinitely often. That is,

$$ \begin{equation*} \{A_n\text{ i.o.}\}=\bigcap^\infty_{n=1}\bigcup^\infty_{k=n}A_k. \end{equation*} \notag $$

The following lemmas are needed. The first follows from [2; Example 1.6.25(a)], and the second is due to [7; Lemma 2.3].

Lemma 2.1. Let $\{b_n,\, n\geqslant1\}$ be a nondecreasing sequence of positive real numbers such that $\lim_{n\to\infty}b_n=\infty$ and let $\{V_n,\, n\geqslant1\}$ be a sequence of random variables. Then

$$ \begin{equation*} \mathbf P\Bigl\{\max_{1\leqslant k\leqslant n}V_k>b_n\textit{ i.o.}\Bigr\}=\mathbf P\{V_n> b_n \textit{ i.o.}\}. \end{equation*} \notag $$

Lemma 2.2. Let $X$ be a random variable with the p.d.f. as (1.3). Then

$$ \begin{equation*} \mathbf P\{X-\mu>x\}\sim \frac{1}{2\Gamma(1/s)} \biggl(\frac{x}{\sigma}\biggr)^{1-s}\exp\biggl\{-\biggl(\frac{x}{\sigma}\biggr)^s\biggr\} \end{equation*} \notag $$
as $x\to+\infty$.
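For completeness, the asymptotics in Lemma 2.2 can be checked directly from density (1.3) for $\mu=0$, $\sigma=1$ (the general case follows by rescaling): by L'Hôpital's rule applied to the tail integral,
$$ \begin{equation*} \lim_{x\to+\infty}\frac{\displaystyle\int_x^\infty \frac{s}{2\Gamma(1/s)}e^{-t^s}\,dt}{\dfrac{1}{2\Gamma(1/s)}\,x^{1-s}e^{-x^s}} =\lim_{x\to+\infty}\frac{-s e^{-x^s}}{\bigl[(1-s)x^{-s}-s\bigr]e^{-x^s}}=1, \end{equation*} \notag $$
which is exactly the stated equivalence $\mathbf P\{X>x\}\sim \frac{1}{2\Gamma(1/s)}x^{1-s}e^{-x^s}$.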

Proof of Theorem 1.1. Note that $(X-\mu)/\sigma$ has p.d.f. (1.3) with $\mu=0$ and $\sigma=1$, so we can assume that $\mu=0$ and $\sigma=1$. Hence it suffices to prove that
$$ \begin{equation} \limsup_{n\to\infty}\frac{X_n}{(\log n)^{1/s}}=1\quad\text{a.s.} \end{equation} \tag{2.1} $$
and the cluster set of the sequence $\{X_n/(\log n)^{1/s},\, n\geqslant2\}$ is $[-1,1]$ with probability one.

To prove (2.1), it is enough to show that, for all $\varepsilon\in (0,1)$,

$$ \begin{equation*} \limsup_{n\to\infty}\frac{X_n}{(\log n)^{1/s}}\leqslant (1+\varepsilon)^{1/s}\quad\text{a.s.} \end{equation*} \notag $$
and
$$ \begin{equation*} \limsup_{n\to\infty}\frac{X_n}{(\log n)^{1/s}}\geqslant (1-\varepsilon)^{1/s}\quad\text{a.s.}, \end{equation*} \notag $$
which are equivalent to
$$ \begin{equation} \mathbf P\{X_n> (1+\varepsilon)^{1/s}(\log n)^{1/s}\text{ i.o.}\}=0 \end{equation} \tag{2.2} $$
and
$$ \begin{equation} \mathbf P\{X_n> (1-\varepsilon)^{1/s}(\log n)^{1/s}\text{ i.o.}\}=1, \end{equation} \tag{2.3} $$
respectively. In view of the Borel–Cantelli lemma, to prove (2.2), it suffices to prove that
$$ \begin{equation} \sum^\infty_{n=2}\mathbf P\{X> (1+\varepsilon)^{1/s}(\log n)^{1/s}\}=\sum^\infty_{n=2}\mathbf P\{X_n> (1+\varepsilon)^{1/s}(\log n)^{1/s}\}<\infty. \end{equation} \tag{2.4} $$
By Lemma 2.2,
$$ \begin{equation*} \mathbf P\{X> (1+\varepsilon)^{1/s}(\log n)^{1/s}\}\leqslant C(\log n)^{1/s-1}n^{-(1+\varepsilon)} \end{equation*} \notag $$
and it is obvious that $\sum^\infty_{n=1}(\log n)^{1/s-1}n^{-(1+\varepsilon)}<\infty$. Thus (2.4) holds, and hence (2.2) holds.

In view of the Borel–Cantelli lemma, to prove (2.3), it suffices to prove that

$$ \begin{equation} \sum^\infty_{n=2}\mathbf P\{X> (1-\varepsilon)^{1/s}(\log n)^{1/s}\}=\sum^\infty_{n=2}\mathbf P\{X_n> (1-\varepsilon)^{1/s}(\log n)^{1/s}\}=\infty. \end{equation} \tag{2.5} $$
By Lemma 2.2,
$$ \begin{equation*} \mathbf P\{X> (1-\varepsilon)^{1/s}(\log n)^{1/s}\}> C(\log n)^{1/s-1}n^{-(1-\varepsilon)} \end{equation*} \notag $$
holds when $n$ is large enough, and it is obvious that
$$ \begin{equation*} \sum^\infty_{n=1}(\log n)^{1/s-1}n^{-(1-\varepsilon)}=\infty. \end{equation*} \notag $$
Thus (2.5) holds, and hence (2.3) holds.

Note that $-X$ has the same distribution as $X$, thus by (2.1),

$$ \begin{equation*} \limsup_{n\to\infty}\frac{-X_n}{(\log n)^{1/s}}=-1\quad\text{a.s.}, \end{equation*} \notag $$
i.e.,
$$ \begin{equation} \liminf_{n\to\infty}\frac{X_n}{(\log n)^{1/s}}=-1\quad\text{a.s.} \end{equation} \tag{2.6} $$
Therefore, to prove that the cluster set of the sequence $\{X_n/(\log n)^{1/s},\, n\,{\geqslant}\,2\}$ is $[-1,1]$ with probability one, it is enough to prove that, for any $\alpha\in (0,1)$, there exists a subsequence $\{m(n),n\geqslant1\}$ of $\{1,2,\dots\}$, such that
$$ \begin{equation} \limsup_{n\to\infty}\frac{X_{m(n)}}{(\log m(n))^{1/s}}=\alpha\quad\text{a.s.} \end{equation} \tag{2.7} $$
and
$$ \begin{equation} \liminf_{n\to\infty}\frac{X_{m(n)}}{(\log m(n))^{1/s}}=-\alpha\quad\text{a.s.} \end{equation} \tag{2.8} $$
In fact, for any subsequence $\{m(n),\, n\geqslant1\}$ of $\{1,2,\dots\}$, $\{X_{m(n)},\, n\geqslant1\}$ is a sequence of independent random variables with the same distribution as $X$; by (2.1) and (2.6),
$$ \begin{equation*} \limsup_{n\to\infty}\frac{X_{m(n)}}{(\log n)^{1/s}}=1\quad\text{a.s.} \end{equation*} \notag $$
and
$$ \begin{equation*} \liminf_{n\to\infty}\frac{X_{m(n)}}{(\log n)^{1/s}}=-1\quad\text{a.s.}, \end{equation*} \notag $$
which imply (2.7) and (2.8) by setting $m(n)=\lfloor n^{1/\alpha^s}\rfloor$, and the fact that $\alpha^s\log m(n)\sim \log n$, where $\lfloor x\rfloor$ denotes the integer part of the real number $x$. The proof is completed.

Proof of Theorem 1.2. We also assume that $\mu=0$ and $\sigma=1$. Hence, we can rewrite (1.4) as
$$ \begin{equation} \lim_{n\to\infty}\frac{\max_{1\leqslant k\leqslant n}X_k}{(\log n)^{1/s}}=1\quad\text{a.s.} \end{equation} \tag{2.9} $$

To prove (2.9), it is enough to show that, for all $\varepsilon\in (0,1)$,

$$ \begin{equation*} \limsup_{n\to\infty}\frac{\max_{1\leqslant k\leqslant n}X_k}{(\log n)^{1/s}}\leqslant (1+\varepsilon)^{1/s} \quad\text{a.s.} \end{equation*} \notag $$
and
$$ \begin{equation*} \liminf_{n\to\infty}\frac{\max_{1\leqslant k\leqslant n}X_k}{(\log n)^{1/s}}\geqslant (1-\varepsilon)^{1/s} \quad\text{a.s.}, \end{equation*} \notag $$
which are equivalent to
$$ \begin{equation} \mathbf P\Bigl\{\max_{1\leqslant k\leqslant n}X_k> (1+\varepsilon)^{1/s}(\log n)^{1/s}\text{ i.o.}\Bigr\}=0 \end{equation} \tag{2.10} $$
and
$$ \begin{equation} \mathbf P\Bigl\{\max_{1\leqslant k\leqslant n}X_k<(1-\varepsilon)^{1/s}(\log n)^{1/s}\text{ i.o.}\Bigr\}=0, \end{equation} \tag{2.11} $$
respectively. By Lemma 2.1, (2.10) follows from (2.2). In view of the Borel–Cantelli lemma, to prove (2.11), it suffices to prove that
$$ \begin{equation} \sum^\infty_{n=2}\mathbf P\Bigl\{\max_{1\leqslant k\leqslant n}X_k< (1-\varepsilon)^{1/s}(\log n)^{1/s}\Bigr\}<\infty. \end{equation} \tag{2.12} $$
Using the elementary inequality $1-x<e^{-x}$, for $x>0$,
$$ \begin{equation*} \begin{aligned} \, &\mathbf P\Bigl\{\max_{1\leqslant k\leqslant n}X_k< (1-\varepsilon)^{1/s}(\log n)^{1/s}\Bigr\} =[1-\mathbf P\{X>(1-\varepsilon)^{1/s}(\log n)^{1/s}\}]^n \\ &\qquad<\exp\bigl\{-n\, \mathbf P\{X>(1-\varepsilon)^{1/s}(\log n)^{1/s}\}\bigr\} \end{aligned} \end{equation*} \notag $$
and by Lemma 2.2,
$$ \begin{equation*} n\, \mathbf P\{X>(1-\varepsilon)^{1/s}(\log n)^{1/s}\}\sim \frac{1}{2\Gamma(1/s)}\, (1-\varepsilon)^{1/s-1}(\log n)^{1/s-1}n^\varepsilon >n^{\varepsilon/2}, \end{equation*} \notag $$
the last inequality holds for large enough $n$. Then (2.12) follows from $\sum^\infty_{n=2}e^{-n^{\varepsilon/2}}<\infty$, and hence (2.11) holds.

We now prove (1.5). By (1.4),

$$ \begin{equation*} \begin{aligned} \, &\lim_{n\to\infty}\frac{\log\max_{1\leqslant k\leqslant n}(X_k-\mu)}{\log\log n} \\ &\qquad=\lim_{n\to\infty}\frac{\log[\max_{1\leqslant k\leqslant n}(X_k-\mu)/(\log n)^{1/s}]+\log(\log n)^{1/s}}{\log\log n}=\frac{1}{s}\quad\text{a.s.}, \end{aligned} \end{equation*} \notag $$
which implies that (1.5) holds. The proof is completed.

Proof of Theorem 1.3. In view of the Borel–Cantelli lemma, to prove (1.7), it suffices to prove that, for every $\varepsilon\in (0,1)$,
$$ \begin{equation} \sum^\infty_{n=2}\mathbf P\{\log|Y_n|<-(1+\varepsilon)\log n\}<\infty \end{equation} \tag{2.13} $$
and
$$ \begin{equation} \sum^\infty_{n=2}\mathbf P\{\log|Y_n|<-(1-\varepsilon)\log n\}=\infty. \end{equation} \tag{2.14} $$
By (1.6), for $n$ large enough, we have
$$ \begin{equation*} \mathbf P\{\log|Y_n|<-(1+\varepsilon)\log n\}=\mathbf P\{|Y|<n^{-(1+\varepsilon)}\}\leqslant Cn^{-(1+\varepsilon)} \end{equation*} \notag $$
and
$$ \begin{equation*} \mathbf P\{\log|Y_n|<-(1-\varepsilon)\log n\}=\mathbf P\{|Y|<n^{-(1-\varepsilon)}\}\geqslant Cn^{-(1-\varepsilon)}. \end{equation*} \notag $$
Then (2.13) and (2.14) hold from the fact that
$$ \begin{equation*} \sum^\infty_{n=2}n^{-(1+\varepsilon)}<\infty\quad\text{and}\quad \sum^\infty_{n=2}n^{-(1-\varepsilon)}=\infty, \end{equation*} \notag $$
respectively.

We now prove (1.8). Set $Z=1/|Y|$ and $Z_n=1/|Y_n|$; then, by (1.6),

$$ \begin{equation*} \lim_{z\to+\infty}z\, \mathbf P\{Z>z\}=c_0, \end{equation*} \notag $$
which implies that
$$ \begin{equation*} \lim_{n\to\infty}\frac{\max_{1\leqslant k\leqslant n}\log Z_k}{\log n}= 1\quad\text{a.s.} \end{equation*} \notag $$
by using [3; Theorem 2.3], and hence (1.8) holds.

To prove that $[-1,0]\subset D$, it suffices to show that, for any $\alpha\in (0,1)$, there exists a subsequence $\{m(n),\, n\geqslant1\}$ of $\{1,2,\dots\}$ such that

$$ \begin{equation} \liminf_{n\to\infty}\frac{\log|Y_{m(n)}|}{\log m(n)}=-\alpha\quad\text{a.s.} \end{equation} \tag{2.15} $$
In fact, for any subsequence $\{m(n),\, n\geqslant1\}$ of $\{1,2,\dots\}$, $\{Y_{m(n)},\, n\geqslant1\}$ is a sequence of independent random variables with the same distribution as $Y$; by (1.7),
$$ \begin{equation*} \liminf_{n\to\infty}\frac{\log|Y_{m(n)}|}{\log n}=-1\quad\text{a.s.}, \end{equation*} \notag $$
which implies (2.15) by setting $m(n)=\lfloor n^{1/\alpha}\rfloor$ and the fact $\alpha \log m(n)\sim \log n$.

Assume that $\mathbf E|Y|^\delta<\infty$ for all $\delta>0$. Then we have, for a fixed $\delta>0$ and any $\varepsilon>0$,

$$ \begin{equation*} \sum^\infty_{n=1}\mathbf P\{|Y|>\varepsilon n^{1/\delta}\}<\infty, \end{equation*} \notag $$
which implies that $n^{-1/\delta}|Y_n|\to 0$ a.s. in view of the Borel–Cantelli lemma, and hence
$$ \begin{equation*} \limsup_{n\to\infty}\frac{\log |Y_n|}{\log n}\leqslant \frac{1}{\delta}\quad\text{a.s.} \end{equation*} \notag $$
Letting $\delta\to\infty$, we obtain
$$ \begin{equation*} \limsup_{n\to\infty}\frac{\log |Y_n|}{\log n}\leqslant 0\quad\text{a.s.} \end{equation*} \notag $$
Therefore, $[-1,0]=D$. The proof is completed.

Proof of Corollary 1.1. Set $Y=X-a$ and $Y_n=X_n-a$, for all $n\geqslant1$. Then the p.d.f. of $Y$ is $g(y)=f(y+a\,|\,\mu, \sigma, s)$. Obviously, $g(y)$ is continuous for all $y\in(-\infty,\infty)$, and $g(0)=f(a\,|\,\mu, \sigma, s)\ne 0$. Meanwhile, it is easy to show that $\mathbf E|Y|^\delta<\infty$ for all $\delta>0$. Then, by Remark 1.2 and Theorem 1.3, Corollary 1.1 holds. The proof is completed.

Acknowledgments

The authors would like to thank the referees for the helpful comments.

References

1. G. E. P. Box, G. C. Tiao, Bayesian inference in statistical analysis, Addison-Wesley Series in Behavioral Science: Quantitative Methods, Addison-Wesley Publishing Co., Reading, MA–London–Don Mills, ON, 1973, xviii+588 pp.
2. T. K. Chandra, The Borel–Cantelli lemma, SpringerBriefs Stat., Springer, Heidelberg, 2012, xii+106 pp.
3. Shuhua Chang, Deli Li, Yongcheng Qi, A. Rosalsky, "A method for estimating the power of moments", J. Inequal. Appl., 2018, 54, 14 pp.
4. Ping-yan Chen, "Some limit results for moving sums of stable random variables", Northeast. Math. J., 20:1 (2004), 5–12
5. J. Galambos, The asymptotic theory of extreme order statistics, 2nd ed., R. E. Krieger Publishing Co., Inc., Malabar, FL, 1987, xvi+414 pp.
6. Chunqiao Li, Tingting Li, "Density expansions of extremes from general error distribution with applications", J. Inequal. Appl., 2015 (2015), 356, 15 pp.
7. S. Nadarajah, "A generalized normal distribution", J. Appl. Stat., 32:7 (2005), 685–694
8. S. Nadarajah, "Acknowledgement of priority: the generalized normal distribution", J. Appl. Stat., 33:9 (2006), 1031–1032
9. S. I. Resnick, R. J. Tomkins, "Almost sure stability of maxima", J. Appl. Probab., 10:2 (1973), 378–401
10. Rui Li, S. Nadarajah, "The true maximum-likelihood estimators for the generalized Gaussian distribution with $p = 3,4,5$", Comm. Statist. Theory Methods, 46:18 (2017), 8821–8835
11. M. T. Subbotin, "On the law of frequency of error", Mat. Sb., 31:2 (1923), 296–301
12. P. R. Tadikamalla, "Random sampling from the exponential power distribution", J. Amer. Statist. Assoc., 75:371 (1980)
13. S. Vianelli, "La misura della variabilità condizionata in uno schema generale delle curve normali di frequenza", Statistica, 23:4 (1963), 447–474
