Safety Factor Approach

The Experts below are selected from a list of 201 Experts worldwide ranked by ideXlab platform

Fred Van Keulen - One of the best experts on this subject based on the ideXlab platform.

  • Partial safety factor approach to the design of submarine pressure hulls using nonlinear finite element analysis
    Finite Elements in Analysis and Design, 2013
    Co-Authors: John R Mackay, Fred Van Keulen
    Abstract:

    A framework for the design of submarine pressure hulls using nonlinear finite element (FE) analysis is presented in order to improve upon the conventional analytical-empirical design procedure. A numerical methodology is established that allows the collapse pressure of a hull to be predicted with controlled accuracy. The methodology is characterized by quasi-static incremental analysis, including material and geometric nonlinearities, of FE models constructed from shell elements. The numerical methodology is used with ANSYS to predict the results of 47 collapse experiments on small-scale ring-stiffened cylinders representative of submarine hulls. A probabilistic analysis is applied to the experimental-numerical comparisons in order to estimate the accuracy of the FE methodology and derive a partial safety factor (PSF) for design. It is demonstrated that a high level of accuracy, within 10% with 95% confidence, can be achieved if the prescribed FE methodology is followed. Furthermore, it is shown that the PSF for design does not need to be very large, even if a high degree of statistical confidence is built in. The designer can be 99.5% confident that the FE error has been accounted for by dividing the predicted collapse pressure by a PSF of 1.134.
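
The design rule at the end of the abstract above is a simple division: design collapse pressure = FE-predicted collapse pressure / PSF. The sketch below shows one way such a PSF could be derived from experiment-to-prediction ratios; the ratio values, the plain normal-distribution fit, and the 6.2 MPa prediction are illustrative assumptions, not the paper's actual data or statistical treatment (which yields PSF = 1.134 at 99.5% confidence).

```python
import numpy as np
from scipy import stats

# Hypothetical ratios of experimental to FE-predicted collapse pressure
# (modelling uncertainty); the paper uses 47 real experimental-numerical comparisons.
xm = np.array([0.97, 1.02, 0.95, 1.05, 0.99, 1.01, 0.93, 1.04, 0.98, 1.00])

# Fit a normal distribution to the ratios and take a low percentile, so that a
# design prediction divided by the PSF is exceeded by the true collapse
# pressure with the chosen confidence (here 99.5%, as quoted in the abstract).
mu, sigma = xm.mean(), xm.std(ddof=1)
x_lower = stats.norm.ppf(0.005, loc=mu, scale=sigma)  # 0.5th percentile of the ratio
psf = 1.0 / x_lower

p_fe = 6.2                # MPa, hypothetical FE-predicted collapse pressure
p_design = p_fe / psf     # design collapse pressure after applying the PSF
print(f"PSF = {psf:.3f}, design collapse pressure = {p_design:.2f} MPa")
```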

Gerald Oeser - One of the best experts on this subject based on the ideXlab platform.

  • What’s the penalty for using the Square Root Law of inventory centralisation?
    International Journal of Retail & Distribution Management, 2019
    Co-Authors: Gerald Oeser
    Abstract:

    The square root law (SRL) is a popular model for assessing inventory levels when changing the number of warehouses. Previous empirical research, however, has shown that its underlying assumptions mostly do not hold in practice. This raises the question of how inaccurate the SRL’s results are; the paper aims to discuss this issue. In 26 company cases of reducing the number of warehouses, the estimation error of the SRL is analysed irrespective of its underlying assumptions. The analysis reveals an average estimation error for total inventory of 27.85 per cent (median 27.84 per cent), but a high variability across the cases. The SRL seems mostly to overestimate inventory savings from centralisation and inventory increases from decentralisation. Managers should only use the SRL if inventory depends on the number of warehouses in their situation, i.e. if they use the economic order or production quantity policy and a safety factor approach. Suggestions for coping with the SRL’s estimation error are given. This paper is based on the 26 cases that could be found in a thorough literature review in the ten most widely spoken languages and that contained, or allowed the deduction of, the necessary information; to enable wider generalisations, this sample could be extended. Most past research has been more theoretical in nature. This research is the first to investigate the SRL’s estimation error using a variety of company cases and how to cope with this error.
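
For reference, the square root law itself is a one-line formula: total inventory scales with the square root of the number of warehouses, X_new = X_old * sqrt(n_new / n_old). A minimal sketch follows, with hypothetical inventory figures and a simple relative-error measure (the paper's own error definition may differ).

```python
import math

def srl_inventory(current_inventory: float, current_warehouses: int,
                  new_warehouses: int) -> float:
    """Square root law: X_new = X_old * sqrt(n_new / n_old)."""
    return current_inventory * math.sqrt(new_warehouses / current_warehouses)

# Hypothetical centralisation from 8 warehouses down to 2
x_old = 10_000.0                       # total inventory before the change (units)
x_srl = srl_inventory(x_old, 8, 2)     # SRL prediction: 5,000 units
x_actual = 6_400.0                     # hypothetical inventory actually observed

# A relative estimation error of the kind the paper analyses
error = abs(x_srl - x_actual) / x_actual
print(f"SRL estimate: {x_srl:.0f} units, error vs. actual: {error:.1%}")
```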

John R. Mackay - One of the best experts on this subject based on the ideXlab platform.

Kenny S. Crump - One of the best experts on this subject based on the ideXlab platform.

  • Cancer Risk Assessment and the Biostatistical Revolution of the 1970s—A Reflection
    Dose-response : a publication of International Hormesis Society, 2018
    Co-Authors: Kenny S. Crump
    Abstract:

    Before around 1960, assessment of risk from exposure to toxic substances, including risk of cancer, was generally implemented using the NOAEL-safety factor approach that essentially assumed that an...
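
The NOAEL-safety factor approach mentioned here amounts to dividing a no-observed-adverse-effect level by a product of safety (uncertainty) factors. A minimal sketch, assuming the customary tenfold factors for interspecies and intraspecies variability and a hypothetical NOAEL:

```python
def reference_dose(noael: float, interspecies_uf: float = 10.0,
                   intraspecies_uf: float = 10.0,
                   additional_uf: float = 1.0) -> float:
    """Acceptable exposure as the NOAEL divided by a product of safety factors."""
    return noael / (interspecies_uf * intraspecies_uf * additional_uf)

# Hypothetical animal NOAEL of 50 mg/kg-day with the customary 10 x 10 factors
print(reference_dose(50.0))  # -> 0.5 mg/kg-day
```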

  • Use of threshold and mode of action in risk assessment.
    Critical reviews in toxicology, 2011
    Co-Authors: Kenny S. Crump
    Abstract:

    Under current guidelines, exposure guidelines for toxicants are determined by following one of two different tracks depending on whether the toxicant's mode of action (MOA) is believed to involve an exposure threshold. Although not denying the existence of thresholds, this paper points out problems with how the threshold concept and MOA are used in risk assessment. Thresholds are frequently described using imprecise terms that imply some unspecified increase in risk, which robs them of any meaning (any reasonable dose response will satisfy such a definition) and tacitly implies a value judgment about how large a risk is acceptable. MOA is generally used only to inform a threshold's existence and not its value. Often MOA is used only to conclude that the adverse effect requires an upstream cellular or biochemical response for which a threshold is simply assumed. Data to inform MOA often come from animals, which complicates evaluation of the role of human variation in genetic and environmental conditions, and the possible interaction of the toxicant with processes already producing background toxicity in humans. In response to these and other problems with the current two-track approach, this paper proposes a modified point-of-departure/safety-factor approach to setting exposure guidelines for all toxicants. MOA and the severity of the toxic effect would be addressed using safety factors calculated from guidelines established by consensus and based on scientific judgment. The method normally would not involve quantifying low-dose risk, and would not require a threshold determination, although MOA information regarding the likelihood of a threshold could be used in setting safety factors.
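
A minimal sketch of the proposed point-of-departure/safety-factor idea follows. The factor categories and values are illustrative assumptions, not the consensus-based guidelines the paper calls for, and the 12 mg/kg-day point of departure is hypothetical.

```python
# Illustrative safety-factor tables; the paper envisions values set by consensus,
# so these numbers and categories are assumptions for the sketch only.
SEVERITY_FACTOR = {"mild": 3.0, "moderate": 10.0, "severe": 30.0}
MOA_FACTOR = {"threshold-like": 3.0, "uncertain": 10.0, "low-dose-linear": 30.0}

def exposure_guideline(pod: float, severity: str, moa: str,
                       human_variability_uf: float = 10.0) -> float:
    """Guideline = point of departure / (severity x MOA x human-variability factors)."""
    return pod / (SEVERITY_FACTOR[severity] * MOA_FACTOR[moa] * human_variability_uf)

# Hypothetical point of departure (e.g. a BMDL) of 12 mg/kg-day
print(exposure_guideline(12.0, severity="moderate", moa="uncertain"))  # -> 0.012
```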

  • The linearized multistage model and the future of quantitative risk assessment
    Human & experimental toxicology, 1996
    Co-Authors: Kenny S. Crump
    Abstract:

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear, the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for carcinogens having a non-linear mode of action; instead, dose-response modelling would be used in the experimental range to calculate an LED10* (a statistical lower bound on the dose corresponding to a 10% increase in risk), and safety factors would be applied to the LED10* to determine acceptable exposure levels for humans. This approach is very similar to the one presently used by USEPA for non-carcinogens. Rather than using one approach for carcinogens believed to have a linear mode of action and a different approach for all other health effects, it is suggested herein that it would be more appropriate to use an approach conceptually similar to the 'LED10*-safety factor' approach for all health effects, and not to routinely develop quantitative risk estimates from animal data.
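
To make the two alternatives concrete, the sketch below evaluates extra risk under a two-term multistage dose-response, locates the dose giving a 10% increase in risk, and then applies a safety factor to it in the spirit of the LED10*-safety factor approach. The coefficients, the factor of 100, and the use of a point estimate rather than statistical bounds (the LED10*, or the LMS upper bound on the linear term) are simplifying assumptions.

```python
import math
from scipy.optimize import brentq

# Hypothetical multistage coefficients; a real LMS fit would estimate these from
# bioassay data and use an upper confidence bound on the linear term q1.
q1, q2 = 0.02, 0.001  # per (mg/kg-day) and per (mg/kg-day)^2

def extra_risk(dose: float) -> float:
    """Extra risk over background under a two-term multistage model."""
    return 1.0 - math.exp(-(q1 * dose + q2 * dose ** 2))

# ED10: the dose giving 10% extra risk. The LED10* in the abstract is a
# statistical lower bound on this quantity, which this point estimate is not.
ed10 = brentq(lambda d: extra_risk(d) - 0.10, 1e-6, 1e3)

slope = 0.10 / ed10        # LMS-style low-dose linear slope (risk per mg/kg-day)
guideline = ed10 / 100.0   # LED10/safety-factor alternative, assuming a factor of 100
print(f"ED10 = {ed10:.2f} mg/kg-day, slope = {slope:.4f}, guideline = {guideline:.3f}")
```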