We study convexity properties of the Rényi entropy as a function of $\alpha > 0$ on finite alphabets. We also describe the robustness of the Rényi entropy on finite alphabets, and it turns out that the rate of the respective convergence depends on the initial alphabet. We establish convergence of the disturbed entropy when the initial distribution is uniform while the number of events increases to $\infty$, and prove that the limit of the Rényi entropy of the binomial distribution equals the Rényi entropy of the Poisson distribution.
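The binomial-to-Poisson limit above can be checked numerically for the standard Rényi entropy $H_\alpha(p) = \frac{1}{1-\alpha}\log\sum_i p_i^\alpha$ ($\alpha \neq 1$). The following sketch (not from the paper; the parameter choices $\lambda = 2$, $\alpha = 2$ are illustrative assumptions) compares the Rényi entropy of $\mathrm{Bin}(n, \lambda/n)$ with that of $\mathrm{Pois}(\lambda)$ as $n$ grows:

```python
import math

def renyi(p, alpha):
    # Rényi entropy of order alpha != 1 for a discrete distribution p
    return math.log(sum(q ** alpha for q in p)) / (1.0 - alpha)

def binom_pmf(n, theta):
    # pmf of Bin(n, theta) on {0, ..., n}
    return [math.comb(n, k) * theta ** k * (1.0 - theta) ** (n - k)
            for k in range(n + 1)]

def poisson_pmf(lam, kmax):
    # pmf of Pois(lam), truncated at kmax (negligible tail for lam = 2)
    return [math.exp(-lam) * lam ** k / math.factorial(k)
            for k in range(kmax + 1)]

lam, alpha = 2.0, 2.0  # illustrative choices, not taken from the paper
h_pois = renyi(poisson_pmf(lam, 50), alpha)
for n in (10, 100, 1000):
    h_bin = renyi(binom_pmf(n, lam / n), alpha)
    print(n, h_bin, abs(h_bin - h_pois))
```

The printed gap shrinks as $n$ increases, consistent with the stated convergence of $H_\alpha(\mathrm{Bin}(n, \lambda/n))$ to $H_\alpha(\mathrm{Pois}(\lambda))$.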
This paper is an extended version of an earlier note [10]. The concept of weighted entropy takes into account the values of different outcomes, i.e., it makes entropy context-dependent through a weight function. We analyse analogues of the Fisher information inequality and the entropy power inequality for the weighted entropy, and discuss connections with the weighted Lieb splitting inequality. The concepts of rates of weighted entropy and information are also discussed.
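For reference, the weighted entropy discussed here is commonly defined (in the Belis–Guiaşu form; the notation below is an assumption, as the paper's own conventions are not shown in this excerpt) by attaching a nonnegative weight function $\phi$ to the Shannon sum:

\[
H^w_\phi(X) \;=\; -\sum_{i} \phi(x_i)\, p_i \log p_i ,
\qquad p_i = \mathbb{P}(X = x_i),\ \ \phi(x_i) \ge 0 ,
\]

so that taking $\phi \equiv 1$ recovers the ordinary Shannon entropy, while a non-constant $\phi$ emphasises the outcomes deemed more valuable.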