Usual Estimator

14,000,000 Leading Edge Experts on the ideXlab platform

The Experts below are selected from a list of 183 Experts worldwide ranked by ideXlab platform

William E. Strawderman - One of the best experts on this subject based on the ideXlab platform.

  • On Bayes and unbiased estimators of loss
    Annals of the Institute of Statistical Mathematics, 2003
    Co-Authors: Dominique Fourdrinier, William E. Strawderman
    Abstract:

    We consider estimation of loss for generalized Bayes or pseudo-Bayes estimators of a multivariate normal mean vector, θ. In 3 and higher dimensions, the MLE X is UMVUE and minimax but is inadmissible. It is dominated by the James–Stein estimator and by many others. Johnstone (1988, On inadmissibility of some unbiased estimates of loss, Statistical Decision Theory and Related Topics, IV (eds. S. S. Gupta and J. O. Berger), Vol. 1, 361–379, Springer, New York) considered the estimation of loss for the usual estimator X and the James–Stein estimator. He found improvements over the Stein unbiased estimator of risk. In this paper, for a generalized Bayes point estimator of θ, we compare generalized Bayes estimators to unbiased estimators of loss. We find, somewhat surprisingly, that the unbiased estimator often dominates the corresponding generalized Bayes estimator of loss for priors which give minimax estimators in the original point estimation problem. In particular, we give a class of priors for which the generalized Bayes estimator of θ is admissible and minimax but for which the unbiased estimator of loss dominates the generalized Bayes estimator of loss. We also give a general inadmissibility result for a generalized Bayes estimator of loss.
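A quick Monte Carlo sketch (in Python; the dimension, seed, and replication count are illustrative choices, not from the paper) of the basic domination fact this abstract builds on: in p ≥ 3 dimensions the James–Stein estimator has smaller squared-error risk than the usual estimator X.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 10              # dimension; domination requires p >= 3
theta = np.zeros(p)
n_rep = 20000

# X ~ N(theta, I_p): the "usual" estimator of theta is X itself (the MLE)
x = rng.normal(theta, 1.0, size=(n_rep, p))

# James-Stein estimator: shrink X toward the origin by a data-dependent factor
norm2 = np.sum(x ** 2, axis=1, keepdims=True)
js = (1.0 - (p - 2) / norm2) * x

risk_usual = np.mean(np.sum((x - theta) ** 2, axis=1))   # close to p
risk_js = np.mean(np.sum((js - theta) ** 2, axis=1))     # strictly smaller
```

The risk gap is largest at θ = 0, as simulated here, but the James–Stein estimator never does worse than X at any θ when p ≥ 3.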

  • Estimation of a parameter vector restricted to a cone
    Statistics & Probability Letters, 2002
    Co-Authors: Idir Ouassou, William E. Strawderman
    Abstract:

    We study estimation of a location vector restricted to a convex cone when the dimension, p, is at least 3. We find estimators which improve on the “usual” estimator (the MLE in the normal case) in the general case of a spherically symmetric distribution with unknown scale. The improved estimators may be viewed as Stein-type shrinkage estimators on the set where the usual unbiased estimator (in the unrestricted case) satisfies the restriction. The improved procedures have the extremely strong property of improving on the “usual” estimator uniformly and simultaneously for all spherically symmetric distributions.
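To make the “usual” restricted estimator concrete, here is a small sketch using the nonnegative orthant as the convex cone (the specific cone, mean vector, and normal model are illustrative assumptions; the shrinkage improvements in the paper are not reproduced here). In the normal case the restricted MLE is the projection of X onto the cone:

```python
import numpy as np

rng = np.random.default_rng(3)
p = 5
# true mean lies in the nonnegative orthant, a convex cone
theta = np.array([0.0, 0.5, 1.0, 2.0, 0.0])
n_rep = 20000

x = rng.normal(theta, 1.0, size=(n_rep, p))

# "Usual" estimator under the cone restriction: project X onto the cone.
# For the nonnegative orthant, projection is coordinatewise clipping at zero.
proj = np.maximum(x, 0.0)

risk_unrestricted = np.mean(np.sum((x - theta) ** 2, axis=1))
risk_proj = np.mean(np.sum((proj - theta) ** 2, axis=1))
```

Clipping never moves the estimate further from a nonnegative θ, so the projection already beats the unrestricted X coordinatewise; the paper's estimators improve further by shrinking on the set where X itself satisfies the restriction.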

  • Sequential estimation of the variance of a normal distribution
    Sequential Analysis, 1995
    Co-Authors: Jayalakshmi Natarajan, Robert Wood Johnson, William E. Strawderman
    Abstract:

    In estimating the variance of a normal distribution when the loss function is essentially squared error, a sequential version of Stein's estimator is used to show the existence of sequential estimators which are better in both risk (expected loss) and sample size than the usual estimator with a given fixed sample size.
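The fixed-sample ingredient behind this comparison can be illustrated without the sequential stopping rule (the sketch below shows only the scalar-shrinkage step; the sample size, variance, and seed are illustrative): under squared-error loss, dividing the centered sum of squares by n + 1 rather than n − 1 gives a smaller MSE than the usual unbiased estimator.

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma2 = 10, 4.0
n_rep = 20000

x = rng.normal(0.0, np.sqrt(sigma2), size=(n_rep, n))
ss = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2, axis=1)

usual = ss / (n - 1)    # unbiased, the usual estimator
shrunk = ss / (n + 1)   # minimizes MSE among constant multiples of ss

mse_usual = np.mean((usual - sigma2) ** 2)    # ~ 2 * sigma2**2 / (n - 1)
mse_shrunk = np.mean((shrunk - sigma2) ** 2)  # ~ 2 * sigma2**2 / (n + 1)
```

The divisor n + 1 accepts a small downward bias in exchange for a variance reduction that more than compensates, which is the non-sequential core of Stein-type variance estimation.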

  • Stein estimation for non-normal spherically symmetric location families in three dimensions
    Journal of Multivariate Analysis, 1992
    Co-Authors: Stefan S. Ralescu, Ann Cohen Brandwein, William E. Strawderman
    Abstract:

    We consider estimation of a location vector in the presence of a known or unknown scale parameter in three dimensions. The technique of proof is Stein's integration by parts, and it is used to cover several cases (e.g., non-unimodal distributions) for which previous results were known only in four and higher dimensions. Additionally, we give a necessary and sufficient condition on the shrinkage constant for improvement on the usual estimator for the spherical uniform distribution.

  • A James–Stein type estimator for combining unbiased and possibly biased estimators
    Journal of the American Statistical Association, 1991
    Co-Authors: Edwin J Green, William E. Strawderman
    Abstract:

    We present a method for combining unbiased sample data with possibly biased auxiliary information. The estimator we derive is similar in spirit to the James–Stein estimator. We prove that the estimator dominates the sample mean under quadratic loss. When the auxiliary information is unbiased, our estimator has risk slightly greater than the usual combined estimator. As the bias increases, however, the risk of the usual estimator is unbounded, while the risk of our estimator is bounded by the risk of the sample mean. We show how our estimator can be considered an approximation to the best linear combination of the sample data and the auxiliary information, allude to how it can be derived as an empirical Bayes estimator, and suggest a method for constructing confidence sets. Finally, the performance of our estimator is compared to that of the sample mean and the usual combined estimator using real forestry data.
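A rough Monte Carlo sketch of the idea (this is not the authors' exact estimator; the positive-part James–Stein form, dimension, and bias level are illustrative assumptions): shrinking the unbiased data toward a biased auxiliary value keeps the risk below that of the data alone, because the data-dependent weight discounts the auxiliary value when it sits far from the data.

```python
import numpy as np

rng = np.random.default_rng(2)
p = 9
theta = np.full(p, 3.0)   # true mean vector
aux = theta + 1.5         # biased auxiliary information (hypothetical bias)
n_rep = 20000

x = rng.normal(theta, 1.0, size=(n_rep, p))   # unbiased data, X ~ N(theta, I)

# Positive-part James-Stein-type shrinkage of X toward the auxiliary value.
# The weight shrinks hard when X is close to aux and backs off automatically
# when aux is far from the data, i.e., when it is badly biased.
d2 = np.sum((x - aux) ** 2, axis=1, keepdims=True)
shrunk = aux + np.maximum(0.0, 1.0 - (p - 2) / d2) * (x - aux)

risk_usual = np.mean(np.sum((x - theta) ** 2, axis=1))     # close to p
risk_shrunk = np.mean(np.sum((shrunk - theta) ** 2, axis=1))
```

Unlike a fixed-weight average of data and auxiliary value, whose risk grows without bound as the auxiliary bias grows, this adaptive form stays bounded by the risk of the data alone, which is the property the abstract emphasizes.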

Kazuhiro Ohtani - One of the best experts on this subject based on the ideXlab platform.

M. Z. Anis - One of the best experts on this subject based on the ideXlab platform.

  • Estimating the Mean of Normal Distribution with Known Coefficient of Variation
    American Journal of Mathematical and Management Sciences, 2008
    Co-Authors: M. Z. Anis
    Abstract:

    In many situations, the coefficient of variation is known even though the mean and variance may not be known. This additional information on the coefficient of variation can be used to improve upon the usual estimator of the unknown mean. Three biased but simple estimators for the mean of the normal distribution when the coefficient of variation is known are proposed and their properties are studied. The performance of these estimators is compared with some other existing methods, and it turns out that the new proposed estimators compete well.
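One classical way to exploit a known coefficient of variation c = σ/μ is a Searls-type rescaling of the sample mean (a standard construction shown here for illustration; it is not necessarily one of the three estimators proposed in the paper, and the parameter values are arbitrary). It trades a small bias for a larger variance reduction:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, c, n = 5.0, 0.5, 10   # true mean, known CV c = sigma/mu, sample size
sigma = c * mu
n_rep = 20000

x = rng.normal(mu, sigma, size=(n_rep, n))
xbar = x.mean(axis=1)

# Searls-type estimator: among multiples d * xbar, MSE is minimized at
# d = 1 / (1 + c**2 / n), which is computable because c is known.
searls = xbar / (1.0 + c ** 2 / n)

mse_usual = np.mean((xbar - mu) ** 2)     # ~ sigma**2 / n
mse_searls = np.mean((searls - mu) ** 2)  # smaller
```

The gain grows with c²/n, so the known coefficient of variation is most valuable for noisy data and small samples.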

Edwin J Green - One of the best experts on this subject based on the ideXlab platform.

Nels Grevstad - One of the best experts on this subject based on the ideXlab platform.