The experts below are selected from a list of 303 experts worldwide ranked by the ideXlab platform.

Sangun Park - One of the best experts on this subject based on the ideXlab platform.

  • Kullback-Leibler Information of Consecutive Order Statistics
    Communications for Statistical Applications and Methods, 2015
    Co-Authors: Ilmun Kim, Sangun Park
    Abstract:

    Calculating the Kullback-Leibler information of consecutive order statistics is complicated because it depends on a multi-dimensional integral. Park (2014) gave a representation of the Kullback-Leibler information of the first r order statistics in terms of the hazard function, simplifying the r-fold integral to a single integral. In this paper, we first express the Kullback-Leibler information in terms of the reversed hazard function. We then generalize the result of Park (2014) to an arbitrary block of consecutive order statistics, deriving a single-integral form of its Kullback-Leibler information; its relation to the Fisher information of order statistics is also discussed, with numerical examples provided.
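The multi-dimensional integral mentioned above can also be evaluated by plain simulation. The sketch below is only an illustration, not Park's single-integral representation; the exponential model, the function name, and all parameters are our own choices. It estimates the Kullback-Leibler information of the first r order statistics between two exponential models by Monte Carlo, using the joint density of the first r order statistics, n!/(n-r)! [1-F(x_r)]^(n-r) ∏ f(x_i):

```python
import math
import random

def kl_first_r_exponential(rate_f, rate_g, n, r, n_sims=200_000, seed=0):
    """Monte Carlo estimate of the Kullback-Leibler information between the
    joint densities of the first r order statistics of n i.i.d. draws, under
    two exponential models f (rate rate_f) and g (rate rate_g).

    The combinatorial constant n!/(n-r)! is the same for both models and
    cancels, so the log-ratio reduces to a sum over the r observed values
    plus a censoring (survival-ratio) term at x_r; the expectation is taken
    under f by simulation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        xs = sorted(rng.expovariate(rate_f) for _ in range(n))[:r]
        x_r = xs[-1]
        # log f(x)/g(x) for each exponential observation
        obs = sum(math.log(rate_f / rate_g) - (rate_f - rate_g) * x for x in xs)
        # censoring term: log [S_f(x_r)/S_g(x_r)]^(n-r), with S(x) = e^{-rate*x}
        cens = (n - r) * (-(rate_f - rate_g) * x_r)
        total += obs + cens
    return total / n_sims
```

For r = n this reduces to the ordinary sample KL divergence, n·[log(λf/λg) + λg/λf − 1], which gives a quick sanity check of the simulation.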

  • Equal Fisher information in order statistics
    Sankhya, 2004
    Co-Authors: Sangun Park, Gang Zheng
    Abstract:

    Any collection of order statistics from two different probability distributions may contain equal Fisher information about a scalar parameter. We derive a necessary and sufficient condition under which two distributions have equal Fisher information in every order statistic; this condition can therefore be used to define an equivalence relation on parametric distributions. Within the location (scale) family of distributions, we show that this equivalence relation uniquely determines the parametric family by the values of the Fisher information about the location (scale) parameter in any order statistic. The results are used to derive some location-scale distributions and to obtain a simple characterization in terms of the Fisher information in the sequence of minimum order statistics.

  • Fisher Information in Order Statistics
    Journal of the American Statistical Association, 1996
    Co-Authors: Sangun Park
    Abstract:

    When we have n independent and identically distributed observations, how the Fisher information is distributed among the order statistics is an interesting question. The recipe for the Fisher information in order statistics is simple, but the detailed calculation is known to be complicated. An indirect approach, using a decomposition of the Fisher information in order statistics, simplifies the calculation, and some recurrence relations for the Fisher information in order statistics are derived that facilitate it further. The Fisher information in the first r order statistics is an r-fold multiple integral, but the decomposition reduces it to a double integral, and a recurrence relation further simplifies the double integral to a sum of single integrals. An “information plot” is suggested, from which the Fisher information in any set of consecutive order statistics of a parametric distribution can be read at once.
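As a small numerical illustration of Fisher information in a single order statistic (our own worked example, not taken from the paper): for an exponential sample, the minimum of n observations is again exponential with rate nλ, and the Fisher information about the rate contained in the minimum works out to 1/λ², the same as in one unordered observation, for every n. A Monte Carlo sketch, estimating the information as the variance of the score:

```python
import random

def fisher_info_minimum_exponential(rate, n, n_sims=200_000, seed=1):
    """Monte Carlo Fisher information about the rate of an exponential,
    contained in the minimum of n i.i.d. observations.

    The minimum of n Exp(rate) draws is Exp(n*rate), with log-density
    log n + log rate - n*rate*x, so the score is 1/rate - n*x and the
    Fisher information is the variance of the score."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_sims):
        x = min(rng.expovariate(rate) for _ in range(n))
        scores.append(1.0 / rate - n * x)
    m = sum(scores) / n_sims
    return sum((s - m) ** 2 for s in scores) / n_sims
```

Running this for several values of n returns roughly 1/rate² each time, independent of n, matching the closed-form calculation above.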

  • The entropy of consecutive order statistics
    IEEE Transactions on Information Theory, 1995
    Co-Authors: Sangun Park
    Abstract:

    Calculating the entropy of a set of consecutive order statistics is more complicated than calculating the entropy of an individual order statistic, which was studied by Wong and Chan (1990). We provide some fundamental relations for the entropy of consecutive order statistics that are very useful for computation. We first consider the decomposition of the entropy of order statistics and derive some recurrence relations for the first r order statistics. We also establish a dual principle for the entropy of order statistics, which yields a dual relation from any given relation on the entropy of order statistics.
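For a concrete entropy of an individual order statistic (a hand-worked example under our own choice of model, not from the paper): the minimum of n Uniform(0,1) draws has density n(1−x)^(n−1), and its entropy has the closed form 1 − 1/n − ln n. A Monte Carlo sketch that recovers this value as −E[log f(X)]:

```python
import math
import random

def entropy_min_uniform(n, n_sims=200_000, seed=2):
    """Monte Carlo entropy of the minimum of n i.i.d. Uniform(0,1) draws.
    The minimum has density f(x) = n*(1-x)**(n-1) (a Beta(1, n) law), and
    the entropy is estimated as -E[log f(X)] over simulated minima."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        x = min(rng.random() for _ in range(n))
        total += math.log(n) + (n - 1) * math.log(1.0 - x)
    return -total / n_sims
```

For n = 5 the closed form gives 1 − 1/5 − ln 5 ≈ −0.809, and the simulation agrees to Monte Carlo accuracy.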

H. Zahedi - One of the best experts on this subject based on the ideXlab platform.

  • Information properties of order statistics and spacings
    IEEE Transactions on Information Theory, 2004
    Co-Authors: Nader Ebrahimi, Ehsan S. Soofi, H. Zahedi
    Abstract:

    We explore properties of the entropy, Kullback-Leibler information, and mutual information for order statistics. The probability integral transformation plays a pivotal role in developing our results. We provide bounds for the entropy of order statistics and some results that relate entropy ordering of order statistics to other well-known orderings of random variables. We show that the discrimination information between order statistics and the data distribution, the discrimination information among the order statistics, and the mutual information between order statistics are all distribution free and are computable using the distributions of the order statistics of samples from the uniform distribution. We also discuss information properties of spacings for uniform and exponential samples and provide a large-sample distribution-free result on the entropy of spacings. The results show interesting symmetries of information orderings among order statistics.
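The distribution-free claim can be checked directly: the log density ratio between the i-th order statistic and the parent, log[f_{i:n}(x)/f(x)] = log C + (i−1) log F(x) + (n−i) log(1−F(x)), depends on the data only through F(x), which for an order statistic follows a Beta law whatever the parent is. A sketch (function names and the two parent models are our own choices) estimating this discrimination information from uniform and from exponential samples, which agree up to Monte Carlo error:

```python
import math
import random

def kl_order_stat_vs_parent(sample, i, n, cdf, n_sims=100_000, seed=3):
    """Monte Carlo estimate of the discrimination (KL) information between
    the i-th order statistic of an n-sample and the parent distribution.
    `sample(rng)` draws one observation; `cdf` is the parent CDF."""
    rng = random.Random(seed)
    # log of the binomial-type constant n! / ((i-1)! (n-i)!)
    log_c = math.lgamma(n + 1) - math.lgamma(i) - math.lgamma(n - i + 1)
    total = 0.0
    for _ in range(n_sims):
        x = sorted(sample(rng) for _ in range(n))[i - 1]
        u = cdf(x)  # probability integral transform: u ~ Beta(i, n-i+1)
        total += log_c + (i - 1) * math.log(u) + (n - i) * math.log(1.0 - u)
    return total / n_sims

# Same (i, n) under two different parents: the estimates agree.
kl_uniform = kl_order_stat_vs_parent(lambda r: r.random(), 2, 5,
                                     lambda x: x, seed=3)
kl_expo = kl_order_stat_vs_parent(lambda r: r.expovariate(1.0), 2, 5,
                                  lambda x: 1.0 - math.exp(-x), seed=4)
```

Both estimates are near the closed-form value log C + (i−1)[ψ(i)−ψ(n+1)] + (n−i)[ψ(n−i+1)−ψ(n+1)], confirming that only (i, n) matters.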

Dan Schonfeld - One of the best experts on this subject based on the ideXlab platform.

  • On the convergence and roots of order-statistics filters
    IEEE Transactions on Signal Processing, 1995
    Co-Authors: Mohammed A. Charif-chefchaouni, Dan Schonfeld
    Abstract:

    We propose a comprehensive theory of the convergence and characterization of roots of order-statistics filters. Conditions for the convergence of iterated order-statistics filters are proposed. Criteria for the morphological characterization of roots of order-statistics filters are also proposed.

  • Morphological representation of order-statistics filters
    IEEE Transactions on Image Processing, 1995
    Co-Authors: Mohammed A. Charif-chefchaouni, Dan Schonfeld
    Abstract:

    We propose a comprehensive theory for the morphological bounds on order-statistics filters (and their repeated iterations). Conditions are derived for morphological openings and closings to serve as lower and upper bounds, respectively, on order-statistics filters (and their repeated iterations). Under various assumptions, morphological open-closings and close-openings are also shown to serve as tighter lower and upper bounds, respectively, on iterations of order-statistics filters. Simulations applying these results to image restoration are finally provided.

  • Morphological bounds on order-statistics filters
    Image Algebra and Morphological Image Processing IV, 1993
    Co-Authors: Mohammed A. Charif-chefchaouni, Dan Schonfeld
    Abstract:

    In this paper, we investigate the morphological bounds on order-statistics (median) filters (and their repeated iterations). Conditions are derived for morphological openings and closings to serve as lower and upper bounds, respectively, on order-statistics (median) filters (and their repeated iterations). Under various assumptions, morphological open-closings (open-close-openings) and close-openings (close-open-closings) are also shown to serve as tighter lower and upper bounds, respectively, on iterations of order-statistics (median) filters. Conditions for the convergence of iterated order-statistics (median) filters are proposed, as are criteria for the morphological characterization of their roots.

Nader Ebrahimi - One of the best experts on this subject based on the ideXlab platform.

  • Information properties of order statistics and spacings
    IEEE Transactions on Information Theory, 2004
    Co-Authors: Nader Ebrahimi, Ehsan S. Soofi, H. Zahedi
    Abstract:

    We explore properties of the entropy, Kullback-Leibler information, and mutual information for order statistics. The probability integral transformation plays a pivotal role in developing our results. We provide bounds for the entropy of order statistics and some results that relate entropy ordering of order statistics to other well-known orderings of random variables. We show that the discrimination information between order statistics and the data distribution, the discrimination information among the order statistics, and the mutual information between order statistics are all distribution free and are computable using the distributions of the order statistics of samples from the uniform distribution. We also discuss information properties of spacings for uniform and exponential samples and provide a large-sample distribution-free result on the entropy of spacings. The results show interesting symmetries of information orderings among order statistics.

Ana M. Valle - One of the best experts on this subject based on the ideXlab platform.