Nonlinear Regression


The Experts below are selected from a list of 297 Experts worldwide ranked by ideXlab platform

Ronald E Brown - One of the best experts on this subject based on the ideXlab platform.

  • Detecting outliers when fitting data with nonlinear regression – a new method based on robust nonlinear regression and the false discovery rate
    BMC Bioinformatics, 2006
    Co-Authors: Harvey J Motulsky, Ronald E Brown
    Abstract:

    Nonlinear regression, like linear regression, assumes that the scatter of data around the ideal curve follows a Gaussian or normal distribution. This assumption leads to the familiar goal of regression: to minimize the sum of the squares of the vertical or Y-value distances between the points and the curve. Outliers can dominate the sum-of-the-squares calculation and lead to misleading results. However, we know of no practical method for routinely identifying outliers when fitting curves with nonlinear regression. We describe a new method for identifying outliers when fitting data with nonlinear regression. We first fit the data using a robust form of nonlinear regression, based on the assumption that scatter follows a Lorentzian distribution. We devised a new adaptive method that gradually becomes more robust as the method proceeds. To define outliers, we adapted the false discovery rate approach to handling multiple comparisons. We then remove the outliers and analyze the data using ordinary least-squares regression. Because the method combines robust regression and outlier removal, we call it the ROUT method. When analyzing simulated data, where all scatter is Gaussian, our method detects (falsely) one or more outliers in only about 1–3% of experiments. When analyzing data contaminated with one or several outliers, the ROUT method performs well at outlier identification, with an average false discovery rate less than 1%. Our method, which combines a new method of robust nonlinear regression with a new method of outlier identification, identifies outliers from nonlinear curve fits with reasonable power and few false positives.
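    The three-step ROUT procedure described above can be sketched in Python. This is an illustrative reimplementation, not the authors' exact algorithm: the exponential-decay model, the simulated data, and the Q = 1% threshold are all made up for the example, and SciPy's "cauchy" loss stands in for the paper's Lorentzian-based robust fit.

```python
import numpy as np
from scipy import stats
from scipy.optimize import least_squares, curve_fit

rng = np.random.default_rng(0)

def model(x, a, b):
    return a * np.exp(-b * x)           # hypothetical decay model

x = np.linspace(0.0, 5.0, 40)
y = model(x, 3.0, 0.8) + rng.normal(0.0, 0.1, x.size)
y[7] += 2.5                             # plant one gross outlier

# Step 1: robust fit; SciPy's "cauchy" loss corresponds to Lorentzian scatter.
fit = least_squares(lambda p: model(x, *p) - y, x0=[1.0, 1.0], loss="cauchy")
resid = model(x, *fit.x) - y

# Step 2: flag outliers with a Benjamini-Hochberg FDR test on residuals
# standardized by a robust scale estimate (median absolute deviation).
scale = 1.4826 * np.median(np.abs(resid))
p = 2.0 * stats.norm.sf(np.abs(resid) / scale)
order = np.argsort(p)
n, Q = p.size, 0.01                     # Q: the desired false discovery rate
passed = p[order] <= Q * np.arange(1, n + 1) / n
k = passed.nonzero()[0].max() + 1 if passed.any() else 0
keep = np.ones(n, dtype=bool)
keep[order[:k]] = False                 # indices flagged as outliers

# Step 3: refit the cleaned data by ordinary least squares.
popt, _ = curve_fit(model, x[keep], y[keep], p0=fit.x)
print("outliers flagged:", np.where(~keep)[0])
```

    With the seed above, the planted point at index 7 is flagged and the refit recovers parameters close to the true (3.0, 0.8).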


L. K. Jones - One of the best experts on this subject based on the ideXlab platform.

Harvey J Motulsky - One of the best experts on this subject based on the ideXlab platform.

  • Detecting outliers when fitting data with nonlinear regression – a new method based on robust nonlinear regression and the false discovery rate
    BMC Bioinformatics, 2006
    Co-Authors: Harvey J Motulsky, Ronald E Brown
    Abstract:

    Nonlinear regression, like linear regression, assumes that the scatter of data around the ideal curve follows a Gaussian or normal distribution. This assumption leads to the familiar goal of regression: to minimize the sum of the squares of the vertical or Y-value distances between the points and the curve. Outliers can dominate the sum-of-the-squares calculation and lead to misleading results. However, we know of no practical method for routinely identifying outliers when fitting curves with nonlinear regression. We describe a new method for identifying outliers when fitting data with nonlinear regression. We first fit the data using a robust form of nonlinear regression, based on the assumption that scatter follows a Lorentzian distribution. We devised a new adaptive method that gradually becomes more robust as the method proceeds. To define outliers, we adapted the false discovery rate approach to handling multiple comparisons. We then remove the outliers and analyze the data using ordinary least-squares regression. Because the method combines robust regression and outlier removal, we call it the ROUT method. When analyzing simulated data, where all scatter is Gaussian, our method detects (falsely) one or more outliers in only about 1–3% of experiments. When analyzing data contaminated with one or several outliers, the ROUT method performs well at outlier identification, with an average false discovery rate less than 1%. Our method, which combines a new method of robust nonlinear regression with a new method of outlier identification, identifies outliers from nonlinear curve fits with reasonable power and few false positives.


  • Fitting models to biological data using linear and nonlinear regression: a practical guide to curve fitting
    2004
    Co-Authors: Harvey J Motulsky, Arthur Christopoulos
    Abstract:

    Contents: Fitting data with nonlinear regression. Fitting data with linear regression. Models. How nonlinear regression works. Confidence intervals of the parameters. Comparing models. How does a treatment change the curve? Fitting radioligand and enzyme kinetics data. Fitting dose-response curves. Fitting curves with GraphPad Prism.
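    One of the chapters listed above, fitting dose-response curves, can be illustrated with a minimal sketch of a four-parameter logistic (Hill) fit. The book works in GraphPad Prism; the Python code and simulated data below are stand-ins for illustration, and the asymptotic standard errors echo the chapter on confidence intervals of the parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(logdose, bottom, top, logec50, hillslope):
    # four-parameter logistic (variable-slope sigmoid) on log10 dose
    return bottom + (top - bottom) / (1.0 + 10.0 ** ((logec50 - logdose) * hillslope))

rng = np.random.default_rng(1)
logdose = np.linspace(-9.0, -4.0, 11)        # log10 molar concentrations
y = hill(logdose, 5.0, 95.0, -6.5, 1.0) + rng.normal(0.0, 3.0, logdose.size)

popt, pcov = curve_fit(hill, logdose, y, p0=[0.0, 100.0, -6.0, 1.0])
se = np.sqrt(np.diag(pcov))                  # asymptotic standard errors
lo, hi = popt - 1.96 * se, popt + 1.96 * se  # approximate 95% confidence intervals
print("logEC50 estimate:", popt[2])
```

    The fitted logEC50 lands near the true value of -6.5; the interval (lo[2], hi[2]) quantifies its uncertainty.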

Dag Tjøstheim - One of the best experts on this subject based on the ideXlab platform.

  • Estimation in Nonlinear Regression with Harris Recurrent Markov Chains
    Annals of Statistics, 2016
    Co-Authors: Degui Li, Dag Tjøstheim
    Abstract:

    In this paper, we study parametric nonlinear regression under the Harris recurrent Markov chain framework. We first consider the nonlinear least squares estimators of the parameters in the homoskedastic case, and establish asymptotic theory for the proposed estimators. Our results show that the convergence rates for the estimators rely not only on the properties of the nonlinear regression function, but also on the number of regenerations for the Harris recurrent Markov chain. Furthermore, we discuss the estimation of the parameter vector in a conditional volatility function, and apply our results to nonlinear regression with I(1) processes, deriving an asymptotic distribution theory which is comparable to that obtained by Park and Phillips (2001). Some numerical studies, including simulation and an empirical application, are provided to examine the finite-sample performance of the proposed approaches and results.
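    A toy simulation can illustrate the paper's setting under simplifying assumptions: a Gaussian random walk (an I(1), hence null recurrent, chain) as the regressor, a hypothetical logistic response function, and homoskedastic errors. This sketches only the estimation step, not the paper's asymptotic theory.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(42)
n = 2000
x = np.cumsum(rng.normal(size=n))              # random walk: a null recurrent chain

def g(x, a, b):
    return a / (1.0 + np.exp(-b * x))          # hypothetical nonlinear response

y = g(x, 2.0, 0.5) + rng.normal(0.0, 0.2, n)   # homoskedastic errors

theta, _ = curve_fit(g, x, y, p0=[1.0, 1.0])   # nonlinear least squares
print("NLS estimates of (a, b):", theta)
```

    How fast these estimates improve with n is exactly what the paper characterizes: the rate depends on how often the chain regenerates, not simply on the sample size.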

  • Nonlinear Regression with Harris Recurrent Markov Chains
    2012
    Co-Authors: Degui Li, Dag Tjøstheim
    Abstract:

    In this paper, we study parametric nonlinear regression under the Harris recurrent Markov chain framework. We first consider the nonlinear least squares estimators of the parameters in the homoskedastic case, and establish asymptotic theory for the proposed estimators. Our results show that the convergence rates for the estimators rely not only on the properties of the nonlinear regression function, but also on the number of regenerations for the Harris recurrent Markov chain. We also discuss the estimation of the parameter vector in a conditional volatility function and its asymptotic theory. Furthermore, we apply our results to nonlinear regression with I(1) processes and establish an asymptotic distribution theory which is comparable to that obtained by Park and Phillips (2001). Some simulation studies are provided to illustrate the proposed approaches and results.

Degui Li - One of the best experts on this subject based on the ideXlab platform.

  • Estimation in Nonlinear Regression with Harris Recurrent Markov Chains
    Annals of Statistics, 2016
    Co-Authors: Degui Li, Dag Tjøstheim
    Abstract:

    In this paper, we study parametric nonlinear regression under the Harris recurrent Markov chain framework. We first consider the nonlinear least squares estimators of the parameters in the homoskedastic case, and establish asymptotic theory for the proposed estimators. Our results show that the convergence rates for the estimators rely not only on the properties of the nonlinear regression function, but also on the number of regenerations for the Harris recurrent Markov chain. Furthermore, we discuss the estimation of the parameter vector in a conditional volatility function, and apply our results to nonlinear regression with I(1) processes, deriving an asymptotic distribution theory which is comparable to that obtained by Park and Phillips (2001). Some numerical studies, including simulation and an empirical application, are provided to examine the finite-sample performance of the proposed approaches and results.

  • Nonlinear Regression with Harris Recurrent Markov Chains
    2012
    Co-Authors: Degui Li, Dag Tjøstheim
    Abstract:

    In this paper, we study parametric nonlinear regression under the Harris recurrent Markov chain framework. We first consider the nonlinear least squares estimators of the parameters in the homoskedastic case, and establish asymptotic theory for the proposed estimators. Our results show that the convergence rates for the estimators rely not only on the properties of the nonlinear regression function, but also on the number of regenerations for the Harris recurrent Markov chain. We also discuss the estimation of the parameter vector in a conditional volatility function and its asymptotic theory. Furthermore, we apply our results to nonlinear regression with I(1) processes and establish an asymptotic distribution theory which is comparable to that obtained by Park and Phillips (2001). Some simulation studies are provided to illustrate the proposed approaches and results.