Despite the relevance of the binomial distribution to probability theory and applied statistical inference, its higher-order moments are poorly understood. Existing formulas are either not general enough or not sufficiently structured and simplified for the intended applications.
This paper introduces novel formulas for binomial moments in the form of polynomials in the variance rather than in the success probability. The obtained formulas are arguably better structured and simpler, and they exhibit superior numerical properties compared with those in prior work. In addition, the paper presents algorithms to derive these formulas, along with a working implementation in Python's symbolic algebra package.
The novel approach is a combinatorial argument coupled with clever algebraic simplifications that rely on symmetrization theory. As an interesting byproduct, asymptotically sharp estimates for central binomial moments are established, improving upon previously known partial results.
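For intuition, the following is a minimal SymPy sketch (not the paper's implementation) that computes central moments of the binomial distribution from its moment generating function and verifies the classical variance-form expressions for the third and fourth central moments; all symbol names are illustrative.

```python
import sympy as sp

n, p, t = sp.symbols('n p t', positive=True)
q = 1 - p
v = n * p * q  # variance of Binomial(n, p)

# Central MGF E[exp(t (X - n p))] of X ~ Binomial(n, p)
cmgf = sp.exp(-n * p * t) * (q + p * sp.exp(t)) ** n

def central_moment(k):
    """k-th central moment, obtained by differentiating the central MGF at t = 0."""
    return sp.simplify(sp.diff(cmgf, t, k).subs(t, 0))

# Classical variance-form identities: mu_3 = v (1 - 2 p), mu_4 = 3 v^2 + v (1 - 6 v / n)
assert sp.simplify(central_moment(3) - v * (1 - 2 * p)) == 0
assert sp.simplify(central_moment(4) - (3 * v ** 2 + v * (1 - 6 * v / n))) == 0
print("variance-form identities for mu_3 and mu_4 verified")
```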
The Entropic Value-at-Risk (EVaR) measure is a convenient coherent risk measure. Due to certain difficulties in finding its analytical representation, it was previously calculated explicitly only for the normal distribution. We overcome these difficulties and calculate the EVaR measure for the Poisson, compound Poisson, Gamma, Laplace, exponential, chi-squared, inverse Gaussian and normal inverse Gaussian distributions with the help of the Lambert W function, a special function which, generally speaking, has two branches.
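As a point of reference, here is a minimal numerical sketch of EVaR for the Poisson case, computed directly from the dual representation $\inf_{z>0} z^{-1}\log (M(z)/\alpha )$ under one common convention for the confidence level; it is not the closed-form Lambert W expression derived in the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def evar_poisson(lam, alpha):
    """EVaR of X ~ Poisson(lam) at confidence level 1 - alpha, via the dual
    representation inf_{z > 0} z^{-1} * log(M(z) / alpha), where
    M(z) = exp(lam * (exp(z) - 1)) is the Poisson moment generating function."""
    objective = lambda z: (lam * (np.exp(z) - 1.0) - np.log(alpha)) / z
    res = minimize_scalar(objective, bounds=(1e-8, 50.0), method="bounded")
    return res.fun

print(evar_poisson(lam=3.0, alpha=0.05))  # numerical value; the paper expresses it in closed form via Lambert W
```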
This work obtains sharp closed-form exponential concentration inequalities of Bernstein type for the ubiquitous beta distribution, improving upon sub-Gaussian and sub-gamma bounds previously studied in this context.
The proof leverages a novel and handy recursion of order 2 for the central moments of the beta distribution, obtained from the hypergeometric representations of the moments; this recursion is useful for deriving explicit expressions for central moments and various tail approximations.
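For comparison, the following SymPy sketch computes low-order central moments of the beta distribution directly from the standard raw-moment formula (a direct computation, not the order-2 recursion developed here) and checks the familiar variance formula.

```python
import sympy as sp

a, b = sp.symbols('a b', positive=True)

def raw_moment(k):
    """E[X^k] for X ~ Beta(a, b): prod_{i=0}^{k-1} (a + i) / (a + b + i)."""
    return sp.Mul(*[(a + i) / (a + b + i) for i in range(k)])

def central_moment(k):
    """k-th central moment via the binomial expansion of E[(X - E X)^k]."""
    m1 = raw_moment(1)
    return sp.simplify(sum(sp.binomial(k, j) * raw_moment(j) * (-m1) ** (k - j)
                           for j in range(k + 1)))

# Sanity check: Var(X) = a b / ((a + b)^2 (a + b + 1))
assert sp.simplify(central_moment(2) - a * b / ((a + b) ** 2 * (a + b + 1))) == 0
print(central_moment(3))  # explicit third central moment in terms of a and b
```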
Suitable families of random variables having power series distributions are considered, and their asymptotic behavior in terms of large (and moderate) deviations is studied. Two examples of fractional counting processes are presented, in which the normalizations of the involved power series distributions can be expressed in terms of the Prabhakar function. The first example allows us to consider the counting process in [Integral Transforms Spec. Funct. 27 (2016), 783–793], while the second is inspired by a model studied in [J. Appl. Probab. 52 (2015), 18–36].
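To make the objects concrete, here is a small Python sketch of the three-parameter Mittag-Leffler (Prabhakar) function evaluated by truncating its defining series, together with a power series distribution normalized by it; the particular coefficients below are illustrative and do not reproduce the specific fractional counting processes of the cited references.

```python
import math

def prabhakar(alpha, beta, gamma, z, terms=200):
    """Prabhakar function E^gamma_{alpha, beta}(z) = sum_k (gamma)_k z^k / (k! Gamma(alpha k + beta)),
    evaluated by truncating the defining series (z > 0 assumed; purely illustrative)."""
    total = 0.0
    for k in range(terms):
        log_term = (math.lgamma(gamma + k) - math.lgamma(gamma)
                    - math.lgamma(k + 1) - math.lgamma(alpha * k + beta)
                    + k * math.log(z))
        total += math.exp(log_term)
    return total

def power_series_pmf(k, theta, alpha, beta, gamma):
    """A power series distribution P(X = k) = a_k theta^k / f(theta) whose normalization
    f(theta) = E^gamma_{alpha, beta}(theta) is a Prabhakar function,
    with illustrative coefficients a_k = (gamma)_k / (k! Gamma(alpha k + beta))."""
    log_num = (math.lgamma(gamma + k) - math.lgamma(gamma)
               - math.lgamma(k + 1) - math.lgamma(alpha * k + beta)
               + k * math.log(theta))
    return math.exp(log_num) / prabhakar(alpha, beta, gamma, theta)

# The probabilities sum to one by construction of the normalization:
print(sum(power_series_pmf(k, 0.7, 0.8, 1.0, 1.2) for k in range(60)))  # ~ 1.0
```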
We study convexity properties of the Rényi entropy as a function of $\alpha >0$ on finite alphabets. We also describe the robustness of the Rényi entropy on finite alphabets, and it turns out that the rate of the respective convergence depends on the initial alphabet. We establish convergence of the perturbed entropy when the initial distribution is uniform but the number of events increases to $\infty $, and we prove that the limit of the Rényi entropy of the binomial distribution is equal to the Rényi entropy of the Poisson distribution.
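The binomial-to-Poisson statement can be checked numerically; the sketch below (assuming the classical Poisson limit regime $p=\lambda /n$) compares the Rényi entropy of $\mathrm{Binomial}(n,\lambda /n)$ with that of $\mathrm{Poisson}(\lambda )$ as $n$ grows.

```python
import numpy as np
from scipy.stats import binom, poisson

def renyi_entropy(pmf, alpha):
    """Rényi entropy H_alpha(P) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    p = pmf[pmf > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

lam, alpha = 2.0, 0.5
h_poisson = renyi_entropy(poisson.pmf(np.arange(200), lam), alpha)
for n in (10, 100, 1000, 10000):
    h_binom = renyi_entropy(binom.pmf(np.arange(n + 1), n, lam / n), alpha)
    print(n, h_binom, abs(h_binom - h_poisson))  # the gap shrinks as n grows
```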
In this paper we provide a systematic exposition of basic properties of integrated distribution and quantile functions. We define these transforms in such a way that they characterize any probability distribution on the real line and are Fenchel conjugates of each other. We show that uniform integrability, weak convergence and tightness admit a convenient characterization in terms of integrated quantile functions. As an application we demonstrate how some basic results of the theory of comparison of binary statistical experiments can be deduced using integrated quantile functions. Finally, we extend the area of application of the Chacon–Walsh construction in the Skorokhod embedding problem.
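To illustrate the conjugacy numerically (under one common convention; the paper fixes its own normalization so that the two transforms are exact Fenchel conjugates of each other), the sketch below compares $\int_{0}^{t}q(s)\,ds$ with the Fenchel conjugate of $x\mapsto \int_{-\infty }^{x}F(y)\,dy$ for the standard normal distribution.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

# Integrated distribution function IF(x) = int_{-inf}^x F(y) dy = E[(x - X)^+]
def integrated_cdf(x):
    return quad(norm.cdf, -40.0, x)[0]

# Integrated quantile function IQ(t) = int_0^t q(s) ds
def integrated_quantile(t):
    return quad(norm.ppf, 1e-12, t)[0]

# Fenchel conjugate of IF at t: sup_x { t x - IF(x) }
def conjugate_integrated_cdf(t):
    res = minimize_scalar(lambda x: integrated_cdf(x) - t * x,
                          bounds=(-10.0, 10.0), method="bounded")
    return t * res.x - integrated_cdf(res.x)

for t in (0.1, 0.5, 0.9):
    print(t, integrated_quantile(t), conjugate_integrated_cdf(t))  # the two columns nearly coincide
```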
Let $\{\xi _{1},\xi _{2},\dots \}$ be a sequence of independent random variables, and let $\eta $ be a counting random variable independent of this sequence. In addition, let $S_{0}:=0$ and $S_{n}:=\xi _{1}+\xi _{2}+\cdots +\xi _{n}$ for $n\geqslant 1$. We consider conditions on the random variables $\{\xi _{1},\xi _{2},\dots \}$ and $\eta $ under which the distribution functions of the random maximum $\xi _{(\eta )}:=\max \{0,\xi _{1},\xi _{2},\dots ,\xi _{\eta }\}$ and of the random maximum of sums $S_{(\eta )}:=\max \{S_{0},S_{1},S_{2},\dots ,S_{\eta }\}$ belong to the class of consistently varying distributions. In our consideration, the random variables $\{\xi _{1},\xi _{2},\dots \}$ are not necessarily identically distributed.
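As a concrete illustration of the distribution class involved, the sketch below uses one standard definition of consistent variation, $\lim_{y\downarrow 1}\liminf_{x\to \infty }\overline{F}(xy)/\overline{F}(x)=1$, and shows numerically that a Pareto tail behaves accordingly while a light (exponential) tail does not; this is only a finite-$x$ illustration of an asymptotic property.

```python
import numpy as np

# One standard definition: F is consistently varying if
#   lim_{y -> 1+} liminf_{x -> inf} bar F(x y) / bar F(x) = 1.
# Finite-x illustration with a Pareto tail (bar F(x) = x^(-2)) versus an Exp(1) tail.

pareto_ratio = lambda x, y: (x * y) ** (-2.0) / x ** (-2.0)   # = y^(-2), independent of x
exp_ratio = lambda x, y: np.exp(-x * (y - 1.0))               # -> 0 as x -> inf for any fixed y > 1

x = 1e6
for y in (1.1, 1.01, 1.001):
    print(y, pareto_ratio(x, y), exp_ratio(x, y))
# The Pareto column approaches 1 as y -> 1 (consistent variation);
# the exponential column is essentially 0 for every fixed y > 1.
```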
Let $\{\xi _{1},\xi _{2},\dots \}$ be a sequence of independent random variables, and let $\eta $ be a counting random variable independent of this sequence. We consider conditions on $\{\xi _{1},\xi _{2},\dots \}$ and $\eta $ under which the distribution function of the random sum $S_{\eta }=\xi _{1}+\xi _{2}+\cdots +\xi _{\eta }$ belongs to the class of consistently varying distributions. In our consideration, the random variables $\{\xi _{1},\xi _{2},\dots \}$ are not necessarily identically distributed.
Let $\{\xi _{1},\xi _{2},\dots \}$ be a sequence of independent random variables (not necessarily identically distributed), and let $\eta $ be a counting random variable independent of this sequence. We obtain sufficient conditions on $\{\xi _{1},\xi _{2},\dots \}$ and $\eta $ under which the distribution function of the random sum $S_{\eta }=\xi _{1}+\xi _{2}+\cdots +\xi _{\eta }$ belongs to the class of $\mathcal{O}$-exponential distributions.
We obtain the distance between the exact and approximate distributions of partial maxima of a random sample under power normalization. It is observed that the Hellinger distance and the variational distance between the exact and approximate distributions of partial maxima under power normalization are the same as the corresponding distances under linear normalization.
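For orientation, here is a minimal numerical sketch in the more familiar linear-normalization setting: the variational and Hellinger distances between the distribution of the normalized maximum of $n$ iid Exp(1) variables and its Gumbel limit. The paper's results concern power normalization, but the two distances compared are the same.

```python
import numpy as np
from scipy.integrate import quad

n = 50  # sample size

def exact_density(x):
    """Density of M_n - log n, where M_n is the maximum of n iid Exp(1) variables."""
    z = x + np.log(n)
    if z <= 0:
        return 0.0
    return n * np.exp(-z) * (1.0 - np.exp(-z)) ** (n - 1)

def gumbel_density(x):
    return np.exp(-x - np.exp(-x))

lo, hi = -np.log(n), 30.0
# Variational (total variation) distance: 0.5 * int |f - g|
tv = 0.5 * quad(lambda x: abs(exact_density(x) - gumbel_density(x)), -10.0, hi, points=[lo])[0]
# Hellinger distance with the convention H^2 = 1 - int sqrt(f g)
hellinger = np.sqrt(max(0.0, 1.0 - quad(lambda x: np.sqrt(exact_density(x) * gumbel_density(x)),
                                        -10.0, hi, points=[lo])[0]))
print(f"variational distance ~ {tv:.4f}, Hellinger distance ~ {hellinger:.4f}")
```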