Random Variables

The Experts below are selected from a list of 290,013 Experts worldwide ranked by the ideXlab platform

Przemysław Matuła - One of the best experts on this subject based on the ideXlab platform.

Xiangchen Wang - One of the best experts on this subject based on the ideXlab platform.

  • Complete convergence and almost sure convergence of weighted sums of Random Variables
    Journal of Theoretical Probability, 1995
    Co-Authors: Bhaskara M Rao, Tiefeng Jiang, Xiangchen Wang
    Abstract:

    Let $r > 1$. For each $n \geq 1$, let $\{X_{nk}, -\infty < k < \infty\}$ be independent Random Variables. We provide some very relaxed conditions which will guarantee $$\sum_{n \geqslant 1} n^{r - 2} P\{ |\sum_{k = - \infty }^\infty X_{nk} | \geqslant \varepsilon \} < \infty $$ for every $\varepsilon > 0$. This result is used to establish some results on complete convergence for weighted sums of independent Random Variables. The main idea is that we devise an effective way of combining a certain maximal inequality of Hoffmann-Jørgensen with rates of convergence in the Weak Law of Large Numbers to establish results on complete convergence of weighted sums of independent Random Variables. New results, as well as simple new proofs of known ones, illustrate the usefulness of our method in this context. We show further that this approach can be used in the study of almost sure convergence for weighted sums of independent Random Variables. Convergence rates in the almost sure convergence of some summability methods of i.i.d. Random Variables are also established.
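As a quick numerical illustration of the series above (a Monte Carlo sketch only, not the authors' proof technique), one can take the classical array $X_{nk} = X_k/n$ for $1 \leq k \leq n$ and $X_{nk} = 0$ otherwise, with i.i.d. standard normal $X_k$; the choices $r = 2$, $\varepsilon = 0.5$, the sample size, and the cutoff below are all arbitrary:

```python
import numpy as np

# Monte Carlo sketch (illustrative only): estimate partial sums of the
# complete-convergence series sum_{n>=1} n^(r-2) * P(|S_n| >= eps), where
# S_n = (X_1 + ... + X_n)/n and the X_k are i.i.d. standard normal. Here
# P(|S_n| >= eps) decays exponentially in n, so the partial sums settle fast.

rng = np.random.default_rng(0)
r, eps, trials = 2.0, 0.5, 20_000

total = 0.0
for n in range(1, 101):
    s_n = rng.standard_normal((trials, n)).sum(axis=1) / n  # sample mean
    p_n = np.mean(np.abs(s_n) >= eps)  # Monte Carlo estimate of P(|S_n| >= eps)
    total += n ** (r - 2) * p_n
    if n in (10, 50, 100):
        print(f"partial sum up to n={n:3d}: {total:.4f}")
```

With $r = 2$ this is the classical Hsu-Robbins setting, where finiteness of the series is equivalent to the $X_k$ having zero mean and finite variance.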

Siddharth Muthukrishnan - One of the best experts on this subject based on the ideXlab platform.

  • Entropy power inequality for a family of discrete Random Variables
    International Symposium on Information Theory, 2011
    Co-Authors: Naresh Sharma, Siddharth Muthukrishnan
    Abstract:

    It is known that the Entropy Power Inequality (EPI) always holds if the Random Variables have density. Not much work has been done to identify discrete distributions for which the inequality holds with the differential entropy replaced by the discrete entropy. Harremoës and Vignat showed that it holds for the pair (B(m, p), B(n, p)), m, n ∈ ℕ (where B(n, p) is a Binomial distribution with n trials, each with success probability p), for p = 0.5. In this paper, we considerably expand the set of Binomial distributions for which the inequality holds and, in particular, identify n₀(p) such that for all m, n ≥ n₀(p), the EPI holds for (B(m, p), B(n, p)). We further show that the EPI holds for the discrete Random Variables that can be expressed as the sum of n independent and identically distributed (IID) discrete Random Variables for large n.

  • Entropy power inequality for a family of discrete Random Variables
    2011 IEEE International Symposium on Information Theory Proceedings, 2011
    Co-Authors: Naresh Sharma, Smarajit Das, Siddharth Muthukrishnan
    Abstract:

    It is known that the Entropy Power Inequality (EPI) always holds if the Random Variables have density. Not much work has been done to identify discrete distributions for which the inequality holds with the differential entropy replaced by the discrete entropy. Harremoës and Vignat showed that it holds for the pair $(B(m,p), B(n,p))$, $m, n \in \mathbb{N}$ (where $B(n,p)$ is a Binomial distribution with $n$ trials, each with success probability $p$), for $p = 0.5$. In this paper, we considerably expand the set of Binomial distributions for which the inequality holds and, in particular, identify $n_0(p)$ such that for all $m, n \geq n_0(p)$, the EPI holds for $(B(m,p), B(n,p))$. We further show that the EPI holds for the discrete Random Variables that can be expressed as the sum of $n$ independent and identically distributed (IID) discrete Random Variables for large $n$.
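Since independent $X \sim B(m,p)$ and $Y \sim B(n,p)$ satisfy $X + Y \sim B(m+n, p)$, every entropy in the discrete EPI can be computed exactly from Binomial pmfs. The sketch below is our own check, not code from the paper; it uses the standard one-dimensional form $e^{2H(X+Y)} \geq e^{2H(X)} + e^{2H(Y)}$ with entropies in nats, and the parameter choices are arbitrary:

```python
import numpy as np
from scipy.stats import binom

def binom_entropy(n, p):
    """Discrete Shannon entropy (nats) of B(n, p), from the exact pmf."""
    pmf = binom.pmf(np.arange(n + 1), n, p)
    pmf = pmf[pmf > 0]          # drop zero-probability atoms before taking logs
    return -np.sum(pmf * np.log(pmf))

def epi_holds(m, n, p):
    """Check e^{2H(X+Y)} >= e^{2H(X)} + e^{2H(Y)} for independent
    X ~ B(m, p) and Y ~ B(n, p), using the fact that X + Y ~ B(m + n, p)."""
    lhs = np.exp(2 * binom_entropy(m + n, p))
    rhs = np.exp(2 * binom_entropy(m, p)) + np.exp(2 * binom_entropy(n, p))
    return lhs >= rhs

print(epi_holds(3, 5, 0.5))   # p = 0.5: proven for all m, n (Harremoes-Vignat)
print(epi_holds(20, 30, 0.3)) # biased p: expected once m, n exceed n_0(p)
```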

Varun Jog - One of the best experts on this subject based on the ideXlab platform.

  • An entropy inequality for symmetric Random Variables
    International Symposium on Information Theory, 2018
    Co-Authors: Jing Hao, Varun Jog
    Abstract:

    We establish a lower bound on the entropy of weighted sums of (possibly dependent) Random Variables $(X_1, X_2, \dots, X_n)$ possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of $(X_1, X_2, \dots, X_n)$. We show that for $n \geq 3$, the lower bound is tight if and only if $X_i$'s are i.i.d. Gaussian Random Variables. For $n = 2$ there are numerous other cases of equality apart from i.i.d. Gaussians, which we completely characterize. Going beyond sums, we also present an inequality for certain linear transformations of $(X_1, \dots, X_n)$. Our primary technical contribution lies in the analysis of the equality cases, and our approach relies on the geometry and the symmetry of the problem.

  • An entropy inequality for symmetric Random Variables
    arXiv: Information Theory, 2018
    Co-Authors: Jing Hao, Varun Jog
    Abstract:

    We establish a lower bound on the entropy of weighted sums of (possibly dependent) Random Variables $(X_1, X_2, \dots, X_n)$ possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of $(X_1, X_2, \dots, X_n)$. We show that for $n \geq 3$, the lower bound is tight if and only if $X_i$'s are i.i.d. Gaussian Random Variables. For $n=2$ there are numerous other cases of equality apart from i.i.d. Gaussians, which we completely characterize. Going beyond sums, we also present an inequality for certain linear transformations of $(X_1, \dots, X_n)$. Our primary technical contribution lies in the analysis of the equality cases, and our approach relies on the geometry and the symmetry of the problem.
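The abstracts above do not state the bound explicitly, so purely as a reading aid we assume the normalized form $h\left(\sum_i a_i X_i\right) \geq \frac{1}{n} h(X_1, \dots, X_n)$ for unit-norm weights $\sum_i a_i^2 = 1$ (our assumption, chosen so that the stated i.i.d. Gaussian equality case works out). Under that reading, tightness for i.i.d. Gaussians follows from the closed-form Gaussian entropies:

```python
import numpy as np

# Closed-form check of the claimed equality case, under our ASSUMED reading
# of the bound: h(sum_i a_i X_i) >= h(X_1, ..., X_n) / n with sum_i a_i^2 = 1.
# For i.i.d. X_i ~ N(0, sigma2): sum_i a_i X_i ~ N(0, sigma2 * sum_i a_i^2),
# and the joint entropy of n independent coordinates is n times the marginal.

def gaussian_entropy(var):
    """Differential entropy (nats) of a scalar N(0, var)."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

n, sigma2 = 4, 2.0
a = np.full(n, 1.0 / np.sqrt(n))                 # unit-norm weight vector
lhs = gaussian_entropy(sigma2 * np.sum(a ** 2))  # h(sum_i a_i X_i)
rhs = n * gaussian_entropy(sigma2) / n           # (1/n) * joint entropy
print(np.isclose(lhs, rhs))                      # True: the bound is tight here
```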

Chaoming Hwang - One of the best experts on this subject based on the ideXlab platform.