Statistical Inference

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 91,143 Experts worldwide, ranked by the ideXlab platform

Mh Aliabadi - One of the best experts on this subject based on the ideXlab platform.

  • A multi-fidelity modelling approach to the Statistical Inference of the equivalent initial flaw size distribution for multiple-site damage
    International Journal of Fatigue, 2019
    Co-Authors: Llewellyn Morse, Zahra Sharif Khodaei, Mh Aliabadi
    Abstract:

    A new methodology for the Statistical Inference of the Equivalent Initial Flaw Size Distribution (EIFSD) using the Dual Boundary Element Method (DBEM) is proposed. As part of the Inference, Bayesian updating is used to calibrate the EIFS based on data obtained from simulated routine inspections of a structural component from a fleet of aircraft. An incremental crack growth procedure making use of the DBEM is employed for the modelling of the simultaneous growth of cracks in the structure due to fatigue. Multi-fidelity modelling, in the form of Co-Kriging, is used to create surrogate models that act in place of the DBEM model for the expensive Monte Carlo sampling procedure required for the Statistical Inference of the EIFSD. The proposed methodology is applied to a numerical example featuring a long fuselage lap joint splice in the presence of Multiple Site Damage (MSD). Results show that the EIFSD can be accurately estimated within 10% error with data from just 50 inspections. The employed Co-Kriging models proved to be effective substitutes for the DBEM model, providing significant reductions in the computational cost associated with the implementation of the proposed Statistical Inference methodology.
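    The inspection-driven Bayesian updating loop described in the abstract can be sketched in miniature. Everything below is a toy stand-in, not the paper's method: `crack_growth` is placeholder physics in place of the DBEM simulation, a quadratic polynomial fit stands in for the Co-Kriging surrogate, and the likelihood is a crude summary-statistic comparison. Only the structure, a cheap surrogate inside a Metropolis sampler calibrating the EIFS log-mean, mirrors the methodology.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for the expensive DBEM crack-growth model: maps an
    # initial flaw size a0 to a crack length after a fixed number of load cycles.
    def crack_growth(a0):
        return a0 * np.exp(0.8) + 0.01  # placeholder physics, not the real model

    # Cheap surrogate fitted to a handful of expensive runs, standing in for the
    # Co-Kriging model used in the paper.
    train_a0 = np.linspace(0.01, 0.5, 8)
    coeffs = np.polyfit(train_a0, crack_growth(train_a0), 2)

    def surrogate(a0):
        return np.polyval(coeffs, a0)

    # Simulated fleet inspections: the true EIFS is lognormal with log-mean mu_true.
    mu_true, sigma = np.log(0.1), 0.2
    fleet_a0 = rng.lognormal(mu_true, sigma, size=50)
    observed = crack_growth(fleet_a0) + rng.normal(0.0, 0.005, size=50)

    # Fixed standard-normal draws make the Monte Carlo likelihood deterministic in mu.
    z = rng.normal(size=500)

    def log_posterior(mu):
        a0 = np.exp(mu + sigma * z)   # Monte Carlo sample of candidate EIFS
        pred = surrogate(a0)          # surrogate replaces the expensive model
        # Crude summary-statistic likelihood on the mean crack length.
        return -(pred.mean() - observed.mean()) ** 2 / (2 * 0.005 ** 2)

    # Metropolis sampler: the Bayesian updating step of the inference.
    mu, chain = np.log(0.05), []
    for _ in range(2000):
        prop = mu + rng.normal(0.0, 0.05)
        if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(mu):
            mu = prop
        chain.append(mu)

    mu_hat = float(np.mean(chain[500:]))  # posterior estimate of the EIFS log-mean
    ```

    The payoff the abstract reports shows up even here: every likelihood evaluation calls the cheap surrogate, so the Monte Carlo cost of the sampler is decoupled from the cost of the high-fidelity model.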

  • A multi-fidelity modelling approach to the Statistical Inference of the equivalent initial flaw size distribution for multiple-site damage
    'Elsevier BV', 2018
    Co-Authors: Morse L, Sharif Khodaei Z, Mh Aliabadi
    Abstract:

    A new methodology for the Statistical Inference of the Equivalent Initial Flaw Size Distribution (EIFSD) using the Dual Boundary Element Method (DBEM) is proposed. As part of the Inference, Bayesian updating is used to calibrate the EIFS based on data obtained from simulated routine inspections of a structural component from a fleet of aircraft. An incremental crack growth procedure making use of the DBEM is employed for the modelling of the simultaneous growth of cracks in the structure due to fatigue. Multi-fidelity modelling, in the form of Co-Kriging, is used to create surrogate models that act in place of the DBEM model for the expensive Monte Carlo sampling procedure required for the Statistical Inference of the EIFSD. The proposed methodology is applied to a numerical example featuring a long fuselage lap joint splice in the presence of multiple site damage (MSD). Results show that the EIFSD can be accurately estimated with data from 50 inspections. The employed Co-Kriging models proved to be effective substitutes for the DBEM model, providing significant reductions in the computational cost associated with the implementation of the proposed Statistical Inference methodology.

Tengyuan Liang - One of the best experts on this subject based on the ideXlab platform.

  • Statistical Inference for the population landscape via moment adjusted stochastic gradients
    Journal of The Royal Statistical Society Series B-statistical Methodology, 2019
    Co-Authors: Tengyuan Liang
    Abstract:

    Modern Statistical Inference tasks often require iterative optimization methods to compute the solution. Convergence analysis from an optimization viewpoint informs us only how well the solution is approximated numerically but overlooks the sampling nature of the data. In contrast, recognizing the randomness in the data, statisticians are keen to provide uncertainty quantification, or confidence, for the solution obtained by using iterative optimization methods. The paper makes progress along this direction by introducing moment‐adjusted stochastic gradient descent: a new stochastic optimization method for Statistical Inference. We establish non‐asymptotic theory that characterizes the Statistical distribution for certain iterative methods with optimization guarantees. On the Statistical front, the theory allows for model misspecification, with very mild conditions on the data. For optimization, the theory is flexible for both convex and non‐convex cases. Remarkably, the moment adjusting idea, motivated by ‘error standardization’ in statistics, achieves a similar effect to acceleration in first‐order optimization methods that are used to fit generalized linear models. We also demonstrate this acceleration effect in the non‐convex setting through numerical experiments.
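    The core mechanic, rescaling each stochastic gradient by a running second-moment estimate before stepping, can be illustrated schematically. This is not the authors' exact algorithm: the model (plain least squares), the choice of moment matrix (a running covariate second moment), and all step sizes are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic linear regression data: y = X @ theta_star + noise.
    n, d = 2000, 3
    theta_star = np.array([1.0, -2.0, 0.5])
    X = rng.normal(size=(n, d))
    y = X @ theta_star + rng.normal(0.0, 0.1, size=n)

    theta = np.zeros(d)
    M = np.eye(d)            # running second-moment estimate of the covariates
    eta, batch = 1.0, 32

    for t in range(1, 2001):
        idx = rng.integers(0, n, size=batch)
        Xb, yb = X[idx], y[idx]
        g = Xb.T @ (Xb @ theta - yb) / batch        # stochastic gradient
        M = 0.99 * M + 0.01 * (Xb.T @ Xb) / batch   # update the moment estimate
        # Moment-adjusted step: rescale the gradient by the inverse moment matrix.
        theta -= (eta / t) * np.linalg.solve(M, g)
    ```

    The adjustment acts as a data-driven preconditioner: directions in which the covariates carry little signal receive proportionally larger steps, which is one intuition for the acceleration-like effect the abstract describes.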

  • Statistical Inference for the population landscape via moment adjusted stochastic gradients
    arXiv: Machine Learning, 2017
    Co-Authors: Tengyuan Liang
    Abstract:

    Modern Statistical Inference tasks often require iterative optimization methods to approximate the solution. Convergence analysis from optimization only tells us how well we are approximating the solution deterministically, but overlooks the sampling nature of the data. However, due to the randomness in the data, statisticians are keen to provide uncertainty quantification, or confidence, for the answer obtained after certain steps of optimization. Therefore, it is important yet challenging to understand the sampling distribution of the iterative optimization methods. This paper makes some progress along this direction by introducing a new stochastic optimization method for Statistical Inference, the moment adjusted stochastic gradient descent. We establish non-asymptotic theory that characterizes the Statistical distribution of the iterative methods, with good optimization guarantees. On the Statistical front, the theory allows for model misspecification, with very mild conditions on the data. For optimization, the theory is flexible for both the convex and non-convex cases. Remarkably, the moment adjusting idea, motivated by "error standardization" in statistics, achieves a similar effect to Nesterov's acceleration in optimization, for certain convex problems as in fitting generalized linear models. We also demonstrate this acceleration effect in the non-convex setting through experiments.

Llewellyn Morse - One of the best experts on this subject based on the ideXlab platform.

  • A multi-fidelity modelling approach to the Statistical Inference of the equivalent initial flaw size distribution for multiple-site damage
    International Journal of Fatigue, 2019
    Co-Authors: Llewellyn Morse, Zahra Sharif Khodaei, Mh Aliabadi
    Abstract:

    A new methodology for the Statistical Inference of the Equivalent Initial Flaw Size Distribution (EIFSD) using the Dual Boundary Element Method (DBEM) is proposed. As part of the Inference, Bayesian updating is used to calibrate the EIFS based on data obtained from simulated routine inspections of a structural component from a fleet of aircraft. An incremental crack growth procedure making use of the DBEM is employed for the modelling of the simultaneous growth of cracks in the structure due to fatigue. Multi-fidelity modelling, in the form of Co-Kriging, is used to create surrogate models that act in place of the DBEM model for the expensive Monte Carlo sampling procedure required for the Statistical Inference of the EIFSD. The proposed methodology is applied to a numerical example featuring a long fuselage lap joint splice in the presence of Multiple Site Damage (MSD). Results show that the EIFSD can be accurately estimated within 10% error with data from just 50 inspections. The employed Co-Kriging models proved to be effective substitutes for the DBEM model, providing significant reductions in the computational cost associated with the implementation of the proposed Statistical Inference methodology.

Zahra Sharif Khodaei - One of the best experts on this subject based on the ideXlab platform.

  • A multi-fidelity modelling approach to the Statistical Inference of the equivalent initial flaw size distribution for multiple-site damage
    International Journal of Fatigue, 2019
    Co-Authors: Llewellyn Morse, Zahra Sharif Khodaei, Mh Aliabadi
    Abstract:

    A new methodology for the Statistical Inference of the Equivalent Initial Flaw Size Distribution (EIFSD) using the Dual Boundary Element Method (DBEM) is proposed. As part of the Inference, Bayesian updating is used to calibrate the EIFS based on data obtained from simulated routine inspections of a structural component from a fleet of aircraft. An incremental crack growth procedure making use of the DBEM is employed for the modelling of the simultaneous growth of cracks in the structure due to fatigue. Multi-fidelity modelling, in the form of Co-Kriging, is used to create surrogate models that act in place of the DBEM model for the expensive Monte Carlo sampling procedure required for the Statistical Inference of the EIFSD. The proposed methodology is applied to a numerical example featuring a long fuselage lap joint splice in the presence of Multiple Site Damage (MSD). Results show that the EIFSD can be accurately estimated within 10% error with data from just 50 inspections. The employed Co-Kriging models proved to be effective substitutes for the DBEM model, providing significant reductions in the computational cost associated with the implementation of the proposed Statistical Inference methodology.

Jan-willem Romeijn - One of the best experts on this subject based on the ideXlab platform.

  • Bayesian Statistical Inference
    2011
    Co-Authors: Rolf Haenni, Jan-willem Romeijn, Gregory Wheeler, Jon Williamson
    Abstract:

    Bayesian statistics is much more easily connected to the inferential problem of Schema (1.1) than classical statistics. The feature that distinguishes Bayesian Statistical Inference from classical statistics is that it also employs probability assignments over Statistical hypotheses. It is therefore possible to present a Bayesian Statistical procedure as an Inference concerning probability assignments over hypotheses. Recall that we called the Inference of probability assignments over data on the assumption of a Statistical hypothesis direct. Because in Bayesian Inference we derive a probability assignment over hypotheses on the basis of data, it is sometimes called indirect Inference.
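    The direct/indirect distinction can be made concrete with a minimal coin-tossing example (the hypotheses and numbers are hypothetical, chosen only for illustration). Direct inference computes the probability of the data under each Statistical hypothesis; indirect, Bayesian inference then turns those likelihoods, via Bayes' rule, into a probability assignment over the hypotheses themselves.

    ```python
    from fractions import Fraction

    # Two statistical hypotheses about a coin's chance of heads.
    hypotheses = {"fair": Fraction(1, 2), "biased": Fraction(3, 4)}
    prior = {"fair": Fraction(1, 2), "biased": Fraction(1, 2)}

    data = "HHTH"  # observed tosses

    # Direct inference: probability assignment over the data, given a hypothesis.
    def likelihood(p, tosses):
        out = Fraction(1)
        for toss in tosses:
            out *= p if toss == "H" else 1 - p
        return out

    # Indirect inference: Bayes' rule yields a probability assignment
    # over the hypotheses, on the basis of the data.
    joint = {h: prior[h] * likelihood(p, data) for h, p in hypotheses.items()}
    total = sum(joint.values())
    posterior = {h: j / total for h, j in joint.items()}
    ```

    Here the direct likelihoods are 1/16 (fair) and 27/256 (biased); the indirect step renormalizes the prior-weighted likelihoods into a posterior of 16/43 versus 27/43.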

  • Theory change and Bayesian Statistical Inference
    Philosophy of Science, 2005
    Co-Authors: Jan-willem Romeijn
    Abstract:

    This paper addresses the problem that Bayesian Statistical Inference cannot accommodate theory change, and proposes a framework for dealing with such changes. It first presents a scheme for generating predictions from observations by means of hypotheses. An example shows how the hypotheses represent the theoretical structure underlying the scheme. This is followed by an example of a change of hypotheses. The paper then presents a general framework for hypothesis change, and proposes the minimization of the distance between hypotheses as a rationality criterion. Finally, the paper discusses the import of this for Bayesian Statistical Inference.
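    One common way to make "minimize the distance between hypotheses" precise is Kullback-Leibler divergence; this is an illustrative choice, and the paper's own criterion may differ. In the sketch below, a theory change forbids one outcome, and the feasible probability assignment closest in KL divergence to the old one turns out to be the old assignment renormalized over the remaining outcomes.

    ```python
    import numpy as np

    # Old hypothesis: a discrete probability assignment over three outcomes.
    p_old = np.array([0.5, 0.3, 0.2])

    # Theory change: the new theory rules out the third outcome entirely.
    # Among all assignments satisfying that constraint, the minimizer of
    # KL(q || p_old) is p_old restricted to the allowed outcomes, renormalized.
    allowed = np.array([True, True, False])
    q_new = np.where(allowed, p_old, 0.0)
    q_new = q_new / q_new.sum()

    def kl(q, p):
        """KL divergence of q from p, with the 0*log(0) = 0 convention."""
        mask = q > 0
        return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))
    ```

    Any other feasible assignment, say (0.7, 0.3, 0), sits strictly farther from the old hypothesis in KL terms, which is the sense in which renormalization is the minimal rational revision under this distance.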