Total Probability

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 324 Experts worldwide, ranked by the ideXlab platform

S Parolai - One of the best experts on this subject based on the ideXlab platform.

  • Total Probability Theorem Versus Shakeability: A Comparison between Two Seismic‐Hazard Approaches Used in Central Asia
    Seismological Research Letters, 2015
    Co-Authors: D Bindi, S Parolai
    Abstract:

    The comparison of seismic‐hazard maps produced in different countries, or computed for the same country at different times, is often hampered by the difficulty of properly accounting for the differences among the implemented methodologies. An example of such difficulty is the comparison between the hazard maps computed during the Cold War period for the former Soviet Union, which included vast regions exposed to high seismic hazard (e.g., the central Asian countries and the Caucasus region), and recent assessments carried out for the same regions following approaches developed in Western countries (e.g., Ullah et al., 2015). These comparisons should take into account the differences in the underlying methodologies developed in the former Soviet Union and, in several cases, still in use. In the Western countries, the process of formalizing seismic‐hazard assessment within a probabilistic framework (probabilistic seismic‐hazard assessment [PSHA]) was developed during the 1960s at the Universidad Nacional Autonoma de Mexico (UNAM) and at the Massachusetts Institute of Technology (MIT) (Rosenblueth, 1964; Esteva, 1967, 1968, 1970; Cornell, 1968). With the works of Cornell (1971) and Merz and Cornell (1973), PSHA was finally formalized within the context of the total probability theorem, which accounted for ground‐motion variability, and its use then became widespread through the implementation and dissemination of the EQRISK software (McGuire, 1976). A comprehensive review of the early development of PSHA can be found in Bommer and Abrahamson (2006) and in McGuire (2008). On the other hand, the development of a probabilistic framework for seismic‐hazard assessment in the former Soviet Union (hereinafter the USSR [Union of Soviet Socialist Republics]) was initiated in the 1940s with the works of Medvedev (1947) and developed by Riznichenko in the 1960s (e.g., Riznichenko, 1965, 1992). To quantitatively represent the …
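
The PSHA formulation mentioned in the abstract rests on the total probability theorem: the annual rate of exceeding a ground-motion level is assembled from exceedance probabilities conditional on each source scenario. A minimal sketch follows, with an entirely hypothetical source model and attenuation coefficients (none are taken from the cited works):

```python
import math

# Hypothetical discretized source model: (magnitude, distance_km, annual_rate).
scenarios = [
    (5.0, 20.0, 0.10),
    (6.0, 30.0, 0.02),
    (7.0, 50.0, 0.005),
]

def p_exceed_given_scenario(a, m, r_km):
    """P(PGA > a | m, r) from a lognormal ground-motion model (toy coefficients)."""
    ln_median = -3.0 + 1.0 * m - 1.7 * math.log(r_km)  # illustrative attenuation
    sigma = 0.6                                        # ground-motion variability
    z = (math.log(a) - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))  # standard-normal survival function

def annual_exceedance_rate(a):
    """Total probability theorem: weight conditional exceedance by scenario rates."""
    return sum(rate * p_exceed_given_scenario(a, m, r) for m, r, rate in scenarios)

for a in (0.05, 0.1, 0.2):  # PGA levels in g
    print(f"PGA > {a:.2f} g: annual exceedance rate = {annual_exceedance_rate(a):.4e}")
```

The ground-motion variability term is exactly what the total-probability formalization of Cornell (1971) and Merz and Cornell (1973) added: the conditional exceedance probability is integrated over the scatter of the ground-motion model rather than taken at its median.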

D Bindi - One of the best experts on this subject based on the ideXlab platform.

  • Total Probability Theorem Versus Shakeability: A Comparison between Two Seismic‐Hazard Approaches Used in Central Asia
    Seismological Research Letters, 2015
    Co-Authors: D Bindi, S Parolai

Igor Baseski - One of the best experts on this subject based on the ideXlab platform.

  • An Efficient Method to Calculate the Failure Rate of Dynamic Systems with Random Parameters Using the Total Probability Theorem
    SAE International Journal of Materials and Manufacturing, 2015
    Co-Authors: Monica Majcher, Zissimos P Mourelatos, Igor Baseski, Vasileios Geroulas, Amandeep Singh
    Abstract:

    Using the total probability theorem, we propose a method to calculate the failure rate of a linear vibratory system with random parameters excited by stationary Gaussian processes. The response of such a system is non-stationary because of the randomness of the input parameters. A space-filling design, such as optimal symmetric Latin hypercube sampling or maximin, is first used to sample the input parameter space. For each design point, the output process is stationary and Gaussian. We present two approaches to calculate the corresponding conditional probability of failure. A Kriging metamodel is then created between the input parameters and the output conditional probabilities, allowing us to estimate the conditional probabilities for any set of input parameters. The total probability theorem is finally applied to calculate the time-dependent probability of failure and the failure rate of the dynamic system. The proposed method is demonstrated using a vibratory system. Our approach can be easily extended to non-stationary Gaussian input processes.
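
The final aggregation step can be sketched as follows. The conditional upcrossing-rate model (Rice's formula with a toy frequency-dependent response) stands in for the paper's two conditional-probability approaches, and every parameter value here is an assumption:

```python
import math
import random

random.seed(0)
B = 3.0  # failure threshold on the response amplitude (assumed)

def conditional_upcrossing_rate(omega_n):
    """Upcrossing rate of level B for a stationary Gaussian response, conditional
    on the random natural frequency omega_n (Rice's formula; the response model
    is a toy stand-in for the paper's conditional-probability computation)."""
    sigma = 1.0 / omega_n          # illustrative response standard deviation
    nu0 = omega_n / (2 * math.pi)  # zero-upcrossing rate of the response
    return nu0 * math.exp(-B**2 / (2 * sigma**2))

# Total probability theorem: average the conditional rate over the random
# input parameter (a natural frequency with an assumed uniform spread).
samples = [random.uniform(0.8, 1.2) for _ in range(10_000)]
failure_rate = sum(conditional_upcrossing_rate(w) for w in samples) / len(samples)
print(f"unconditional failure rate ≈ {failure_rate:.4e} per unit time")
```

In the paper the brute-force averaging above is replaced by a Kriging metamodel fitted over a space-filling design, so each conditional probability need only be computed at a handful of design points.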

  • Time-Dependent Reliability Analysis Using the Total Probability Theorem
    Journal of Mechanical Design, 2015
    Co-Authors: Zissimos P Mourelatos, Monica Majcher, Vijitashwa Pandey, Igor Baseski
    Abstract:

    A new reliability analysis method is proposed for time-dependent problems whose limit-state functions of input random variables and input random processes are explicit in time, using the total probability theorem and the concept of a composite limit state. The input random processes are assumed Gaussian. They are expressed in terms of standard normal variables using a spectral decomposition method. The total probability theorem is employed to calculate the time-dependent probability of failure using time-dependent conditional probabilities, which are computed accurately and efficiently in the standard normal space using the first-order reliability method (FORM) and a composite limit state of linear instantaneous limit states. If the dimensionality of the total probability theorem integral is small, it can easily be calculated using Gauss quadrature numerical integration. Otherwise, simple Monte Carlo simulation (MCS) or adaptive importance sampling is used, based on a Kriging metamodel of the conditional probabilities. An example from the literature on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates all developments.
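
The Gauss-quadrature evaluation of the total probability theorem integral can be sketched for a single standard-normal input variable. The conditional failure probability below is a hypothetical stand-in for the FORM/composite-limit-state computation, with an assumed reliability-index model:

```python
import math
import numpy as np

def conditional_pf(t, z):
    """P(failure in [0, t] | Z = z) for a standard-normal parameter z: a
    hypothetical stand-in for the FORM / composite-limit-state computation."""
    beta = 3.0 - 0.4 * z - 0.1 * t               # toy time-dependent reliability index
    return 0.5 * math.erfc(beta / math.sqrt(2))  # Phi(-beta)

def total_pf(t, n_points=16):
    """Total probability theorem integral E_Z[Pf(t | Z)] via Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_points)
    # The substitution z = sqrt(2) * x maps the Hermite weight exp(-x^2) to N(0, 1).
    vals = [conditional_pf(t, math.sqrt(2) * x) for x in nodes]
    return float(np.dot(weights, vals)) / math.sqrt(math.pi)

for t in (0.0, 5.0, 10.0):
    print(f"Pf over [0, {t:4.1f}] = {total_pf(t):.4e}")
```

With many random variables the tensor-product quadrature grid grows exponentially, which is why the abstract falls back on MCS or adaptive importance sampling over a Kriging metamodel in higher dimensions.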

  • Time-Dependent Reliability Analysis Using the Total Probability Theorem
    Design Automation Conference, 2014
    Co-Authors: Zissimos P Mourelatos, Monica Majcher, Vijitashwa Pandey, Igor Baseski
    Abstract:

    A new reliability analysis method is proposed for time-dependent problems whose limit-state functions of input random variables and input random processes are explicit in time, using the total probability theorem and the concept of a composite limit state. The input random processes are assumed Gaussian. They are expressed in terms of standard normal variables using a spectral decomposition method. The total probability theorem is employed to calculate the time-dependent probability of failure using a time-dependent conditional probability, which is computed accurately and efficiently in the standard normal space using FORM and a composite limit state of linear instantaneous limit states. If the dimensionality of the total probability theorem integral (equal to the number of input random variables) is small, it can easily be calculated using Gauss quadrature numerical integration. Otherwise, simple Monte Carlo simulation or adaptive importance sampling is used, based on a pre-built Kriging metamodel of the conditional probability. An example from the literature on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates all developments.

Zissimos P Mourelatos - One of the best experts on this subject based on the ideXlab platform.

  • An Efficient Method to Calculate the Failure Rate of Dynamic Systems with Random Parameters Using the Total Probability Theorem
    SAE International Journal of Materials and Manufacturing, 2015
    Co-Authors: Monica Majcher, Zissimos P Mourelatos, Igor Baseski, Vasileios Geroulas, Amandeep Singh

  • Time-Dependent Reliability Analysis Using the Total Probability Theorem
    Journal of Mechanical Design, 2015
    Co-Authors: Zissimos P Mourelatos, Monica Majcher, Vijitashwa Pandey, Igor Baseski

  • Time-Dependent Reliability Analysis Using the Total Probability Theorem
    Design Automation Conference, 2014
    Co-Authors: Zissimos P Mourelatos, Monica Majcher, Vijitashwa Pandey, Igor Baseski

Weijun Yang - One of the best experts on this subject based on the ideXlab platform.

  • An Approach Based on the Theorem of Total Probability for Reliability Analysis of RC Columns with Random Eccentricity
    Structural Safety, 2013
    Co-Authors: Youbao Jiang, Weijun Yang
    Abstract:

    The load and resistance factors for reinforced concrete (RC) columns in many design codes are based largely on reliability calibrations. In these calibrations, a fixed-eccentricity criterion is often used, and a simple linear failure function is built accordingly. However, the eccentricity of an RC column is random under combined random horizontal and vertical loads, and in this case the limit-state function may become complex. Consequently, reliability calculations that retain the fixed-eccentricity criterion may contain large errors. Considering the random distribution of eccentricity, a practical reliability analysis approach is proposed for RC columns based on the theorem of total probability. The essential steps of this approach include the solution of a probabilistic eccentricity model, together with probabilistic resistance and axial-force models within a given range of eccentricity; these solutions are then studied one by one. A main advantage of this approach is that it explicitly gives the ratio of the conditional failure probability within a given range of eccentricity to the total failure probability, which helps in designing RC columns rationally. Finally, the reliability of several examples is calculated with the proposed approach. The results indicate that the approach is as accurate as the Monte Carlo method.
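
The partitioning logic behind the theorem of total probability can be sketched as follows. The resistance model, eccentricity distribution, and thresholds are all illustrative assumptions, not values from the paper:

```python
import random

random.seed(1)

def failed(e, noise):
    """Hypothetical limit state: resistance drops with eccentricity e (meters)."""
    resistance = 1.0 - 1.8 * e + noise  # toy resistance model
    return resistance < 0.60            # toy load effect

def sample_eccentricity():
    return abs(random.gauss(0.05, 0.03))  # assumed eccentricity distribution

N = 100_000
draws = [(sample_eccentricity(), random.gauss(0.0, 0.15)) for _ in range(N)]

# Theorem of total probability over eccentricity ranges:
#   Pf = sum_i P(failure | e in range_i) * P(e in range_i)
ranges = [(0.00, 0.05), (0.05, 0.10), (0.10, 0.20), (0.20, 10.0)]
pf_total = 0.0
for lo, hi in ranges:
    in_range = [(e, n) for e, n in draws if lo <= e < hi]
    if not in_range:
        continue
    p_range = len(in_range) / N
    p_fail_given = sum(failed(e, n) for e, n in in_range) / len(in_range)
    pf_total += p_fail_given * p_range
    print(f"e in [{lo:.2f}, {hi:.2f}): P={p_range:.3f}, conditional Pf={p_fail_given:.4f}")

pf_mc = sum(failed(e, n) for e, n in draws) / N  # direct Monte Carlo check
print(f"total Pf = {pf_total:.4f}  (direct Monte Carlo: {pf_mc:.4f})")
```

Because both estimates partition the same sample, they agree by construction; the per-range printout mirrors the paper's conditional-to-total failure-probability ratio, showing which eccentricity ranges drive the risk.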