Default Model

The experts below are selected from a list of 3,396 experts worldwide, ranked by the ideXlab platform.

Matthias Scherer - One of the best experts on this subject based on the ideXlab platform.

  • How to Construct a Portfolio-Default Model?
    Financial Engineering with Copulas Explained, 2020
    Co-Authors: Matthias Scherer
    Abstract:

    We discuss different ways in which models for dependent default times can be constructed. Such models are required for many tasks; examples include the pricing of portfolio-credit derivatives, the risk management of credit portfolios, and computations in the context of counterparty credit risk. The chapter also illustrates how copulas can be used to create dependence between pre-specified marginal laws, which is one of the core applications of Sklar's Theorem (Theorem 1.2.4 in the book).
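
    A minimal sketch of the coupling step that Sklar's Theorem justifies: sample a vector U from some copula, then push each coordinate through a marginal quantile function. The one-factor Gaussian copula, the correlation, and the default intensities below are our own illustrative assumptions, not an example from the book.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(seed=1)
        d, n = 5, 10_000                 # number of names, number of samples
        rho = 0.4                        # pairwise Gaussian-copula correlation (assumed)
        lam = np.full(d, 0.02)           # marginal default intensities (assumed)

        # Step 1: sample U from the copula (one-factor Gaussian for concreteness).
        Sigma = rho * np.ones((d, d)) + (1.0 - rho) * np.eye(d)
        Z = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
        U = norm.cdf(Z)                  # uniform marginals, Gaussian dependence

        # Step 2 (Sklar): tau_i = F_i^{-1}(U_i) with exponential marginals F_i.
        tau = -np.log1p(-U) / lam        # dependent default times

        print("P(all names survive 5y):", np.mean((tau > 5.0).all(axis=1)))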

  • A tractable multivariate default model based on a stochastic time-change
    International Journal of Theoretical and Applied Finance, 2020
    Co-Authors: Matthias Scherer
    Abstract:

    A stochastic time-change is applied to introduce dependence to a portfolio of credit-risky assets whose default times are modeled as random variables with arbitrary distribution. The dependence structure of the vector of default times is completely separated from its marginal default probabilities, making the model analytically tractable. This separation is achieved by restricting the time-change to suitable Lévy subordinators, which preserve the marginal distributions. Jump times of the Lévy subordinator are interpreted as times of excess default clustering. Relevant for practical implementations is that the parameters of the time-change allow for an intuitive economic interpretation and can be calibrated independently of the marginal default probabilities. On a theoretical level, a so-called time normalization allows one to compute the resulting copula of the default times. Moreover, the exact portfolio-loss distribution and an approximation for large portfolios under a homogeneous-portfolio assumption are derived. Given these results, the pricing of complex portfolio derivatives is possible in closed form. Three different implementations of the model are proposed, based on a compound Poisson subordinator, a Gamma subordinator, and an Inverse Gaussian subordinator. Using two parameters to adjust the dependence structure in each case, the model is capable of capturing the full range of dependence patterns from independence to complete comonotonicity. A simultaneous calibration to portfolio-CDS spreads and CDO tranche spreads is carried out to demonstrate the model's applicability.
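
    A hedged simulation sketch of the common-clock mechanism described above, assuming a Gamma subordinator and unit-exponential trigger levels; the parameter values are ours, and the paper's time normalization and calibration are not reproduced.

        import numpy as np

        rng = np.random.default_rng(seed=2)
        d, n_paths, steps, T = 10, 2_000, 500, 10.0
        dt = T / steps
        a, b = 1.0, 1.0                          # Gamma subordinator parameters (assumed)

        t_grid = np.linspace(dt, T, steps)
        # Independent Gamma(a*dt, b) increments cumulated into non-decreasing paths.
        Lam = np.cumsum(rng.gamma(shape=a * dt, scale=b, size=(n_paths, steps)), axis=1)

        E = rng.exponential(size=(n_paths, d))   # independent unit-exponential triggers
        hit = Lam[:, :, None] >= E[:, None, :]   # (paths, time, names)
        first = hit.argmax(axis=1)               # first grid index where the clock passes E_i
        tau = np.where(hit.any(axis=1), t_grid[first], np.inf)

        # A large jump of the clock can cross several triggers at once, which is
        # the excess default clustering interpreted in the abstract.
        print("P(all names default within 5y):", np.mean((tau <= 5.0).all(axis=1)))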

  • A multivariate default model with spread and event risk
    Applied Mathematical Finance, 2013
    Co-Authors: Pablo Olivares, Steffen Schenk, Matthias Scherer
    Abstract:

    We present a new portfolio default model based on a conditionally independent and identically distributed (CIID) structure of the default times. It combines an intensity-based ansatz in the spirit of Duffie and Gârleanu (2001, Risk and valuation of collateralized debt obligations, Financial Analysts Journal, 57(1), 41-59) with the Lévy subordinator concept introduced in Mai and Scherer (2009, A tractable multivariate default model based on a stochastic time-change, International Journal of Theoretical and Applied Finance, 12(2), 227-249). We aim at exploiting the computational advantages of the CIID framework for evaluating multiname credit derivatives, while incorporating two central drivers for credit products. More precisely, we allow for both a dynamic evolution of the portfolio credit default swap (CDS) spread (unlike static copula models) and cataclysmic events allowing for simultaneous defaults (unlike intensity-based portfolio loss processes). While the former feature is considered crucial for consistently hedging credit products, the second property takes into account default clusters and the market's fear of extreme events. For applications, the model is approximated by a related top-down representation of the portfolio loss process. It is shown how to coherently calculate hedging deltas for collateralized debt obligations (CDOs) with respect to portfolio CDS and how to consistently calibrate the model to the two products. Both tasks solely require the computation of one-dimensional (Laplace inversion) integrals and can be carried out within fractions of a second. Illustrating the stability and functionality of the pricing approach, the new model and the models it is related to are calibrated to a daily time series of iTraxx Europe index CDS and CDOs. We find the fitting results of the presented model to be very promising and conclude that it may be used for the dynamic pricing and hedging of credit derivatives.
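
    To make the "computational advantages of the CIID framework" concrete, here is a hedged sketch using a Vasicek-style Gaussian factor rather than the paper's mechanism: conditional on the common factor, defaults are i.i.d., so the portfolio-loss distribution is a binomial mixture obtained from a one-dimensional quadrature.

        import numpy as np
        from scipy.stats import norm, binom

        d = 125                      # number of names (iTraxx-like size, illustrative)
        p, rho = 0.05, 0.3           # marginal default probability, factor loading (assumed)
        c = norm.ppf(p)              # default threshold

        # Quadrature over the common factor M ~ N(0, 1).
        m, w = np.polynomial.hermite_e.hermegauss(80)
        w = w / np.sqrt(2.0 * np.pi)             # normalize probabilists' weights

        # Conditional default probability given M = m; names are i.i.d. given M.
        p_m = norm.cdf((c - np.sqrt(rho) * m) / np.sqrt(1.0 - rho))

        # Portfolio-loss pmf: mix the conditional binomial over the factor.
        k = np.arange(d + 1)
        loss_pmf = (w[:, None] * binom.pmf(k[None, :], d, p_m[:, None])).sum(axis=0)

        print("P(no defaults):", loss_pmf[0], " E[#defaults]:", (k * loss_pmf).sum())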

  • Shot-noise driven multivariate default models
    European Actuarial Journal, 2012
    Co-Authors: Matthias Scherer, Ludwig Schmid, Thorsten Schmidt
    Abstract:

    The recent financial crisis, responsible for massive accumulations of credit events, emphasizes the urgent need for adequate portfolio default models. Due to the high dimensionality of real credit portfolios, balancing flexibility and numerical tractability is of utmost importance. To this end, a multivariate default model with interesting stylized properties is introduced in the following way: a non-decreasing shot-noise process serves as common stochastic clock, and individual default times are defined as the first-passage times of the common clock across independent exponentially distributed threshold levels. The resulting default model has a dynamic stochastic representation, contagion effects, a positive probability of joint defaults, the ability to separate univariate marginal laws from the dependence structure, and the option of efficient pricing routines under a "large homogeneous groups" assumption. Beyond credit, the model is well suited for insurance portfolios that are subject to catastrophe risks and for the pricing of catastrophe derivatives.
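
    A hedged simulation sketch of this first-passage construction, using the instantaneous-response special case of a shot-noise clock (compound Poisson plus drift) so that joint defaults at shot times are visible; all rates and shot sizes are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(seed=3)
        d, T, steps = 10, 10.0, 500
        mu, shot_rate, shot_scale = 0.05, 0.5, 0.5   # drift, shot intensity, mean shot size (assumed)
        t = np.linspace(0.0, T, steps)

        def one_path():
            n = rng.poisson(shot_rate * T)
            Tk = rng.uniform(0.0, T, n)              # shot arrival times
            Jk = rng.exponential(shot_scale, n)      # shot sizes
            # Non-decreasing common clock: drift plus instantaneous shots.
            Lam = mu * t + (Jk[None, :] * (t[:, None] >= Tk[None, :])).sum(axis=1)
            E = rng.exponential(size=d)              # exponential threshold levels
            hit = Lam[:, None] >= E[None, :]
            return np.where(hit.any(axis=0), t[hit.argmax(axis=0)], np.inf)

        tau = np.array([one_path() for _ in range(2_000)])
        # One large shot can cross several thresholds simultaneously: joint defaults.
        joint = [(np.unique(r[np.isfinite(r)], return_counts=True)[1] >= 2).any() for r in tau]
        print("P(at least one simultaneous default):", np.mean(joint))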

  • Default models based on scale mixtures of Marshall-Olkin copulas: properties and applications
    Metrika, 2012
    Co-Authors: German Bernhart, Marcos Escobar Anel, Matthias Scherer
    Abstract:

    We present a unification of the Archimedean and the Lévy-frailty copula model for portfolio default models. The new default model exhibits a copula known as a scale mixture of Marshall-Olkin copulas, and an investigation of the dependence structure reveals that desirable properties of both original models are combined. This allows for a wider range of dependence patterns, while the analytical tractability is retained. Furthermore, simultaneous defaults and default clustering are incorporated. In addition, a hierarchical extension is presented which allows for a heterogeneous dependence structure. Finally, the model is applied to the pricing of CDO contracts. For this purpose, an efficient Laplace transform inversion approach is developed. Supporting a separation of marginal default probabilities and dependence structure, the model can be calibrated to CDS contracts in a first step. In a second step, the calibration of several parametric families to CDO contracts demonstrates a good fitting quality, which further emphasizes the suitability of the approach.
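
    For context, a minimal sketch of the exponential-shock construction behind Marshall-Olkin copulas, the family being scale-mixed above: every nonempty subset of names carries an exponential shock, and a name defaults at the first shock that hits it. The subset intensities are hypothetical, and the paper's Archimedean mixing step is not shown.

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(seed=4)
        d, n = 3, 50_000                             # d kept tiny: there are 2^d - 1 shocks
        subsets = [s for r in range(1, d + 1) for s in combinations(range(d), r)]
        rates = {s: 0.1 / len(s) for s in subsets}   # hypothetical shock intensities

        tau = np.full((n, d), np.inf)
        for s in subsets:
            shock = rng.exponential(1.0 / rates[s], size=n)
            for i in s:
                tau[:, i] = np.minimum(tau[:, i], shock)

        # Simultaneous defaults occur exactly when a joint shock arrives first.
        print("P(tau_1 == tau_2):", np.mean(tau[:, 0] == tau[:, 1]))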

Thorsten Schmidt - One of the best experts on this subject based on the ideXlab platform.

  • Shot-noise driven multivariate default models
    European Actuarial Journal, 2012
    Co-Authors: Matthias Scherer, Ludwig Schmid, Thorsten Schmidt
    Abstract:

    The recent financial crisis, responsible for massive accumulations of credit events, emphasizes the urgent need for adequate portfolio default models. Due to the high dimensionality of real credit portfolios, balancing flexibility and numerical tractability is of utmost importance. To this end, a multivariate default model with interesting stylized properties is introduced in the following way: a non-decreasing shot-noise process serves as common stochastic clock, and individual default times are defined as the first-passage times of the common clock across independent exponentially distributed threshold levels. The resulting default model has a dynamic stochastic representation, contagion effects, a positive probability of joint defaults, the ability to separate univariate marginal laws from the dependence structure, and the option of efficient pricing routines under a "large homogeneous groups" assumption. Beyond credit, the model is well suited for insurance portfolios that are subject to catastrophe risks and for the pricing of catastrophe derivatives.

Horacio Sapriza - One of the best experts on this subject based on the ideXlab platform.

  • Quantitative properties of sovereign default models: solution methods matter
    Review of Economic Dynamics, 2020
    Co-Authors: Juan Carlos Hatchondo, Leonardo Martinez, Horacio Sapriza
    Abstract:

    We study the sovereign default model that has been used to account for the cyclical behavior of interest rates in emerging market economies. This model is often solved using the discrete state space technique with evenly spaced grid points. We show that this method requires a large number of grid points to avoid generating spurious interest rate movements, which makes the discrete state space technique significantly less efficient than using Chebyshev polynomials or cubic spline interpolation to approximate the value functions. We show that the inefficiency of the discrete state space technique is more severe for parameterizations that feature a high sensitivity of the bond price to the borrowing level at the borrowing levels observed most frequently in the simulations. In addition, we find that the efficiency of the discrete state space technique can be greatly improved by (i) finding the equilibrium as the limit of the equilibrium of the finite-horizon version of the model, instead of iterating separately on the value and bond price functions, and (ii) concentrating grid points at asset levels at which the bond price is more sensitive to the borrowing level and at levels that are observed more often in the model simulations. Our analysis is also relevant for the study of other credit markets. Keywords: emerging economies; sovereign debt; default; numerical methods.
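
    A hedged numerical illustration of point (ii): on a stand-in bond-price curve that is steep in one region (not the paper's equilibrium object), a grid concentrated near the steep region attains a smaller cubic-spline interpolation error than an evenly spaced grid of the same size.

        import numpy as np
        from scipy.interpolate import CubicSpline

        q = lambda b: 1.0 / (1.0 + np.exp(40.0 * (b - 0.3)))   # stand-in price, steep near b = 0.3

        b_eval = np.linspace(0.0, 1.0, 10_000)
        n_grid = 25

        even = np.linspace(0.0, 1.0, n_grid)
        # Concentrate nodes around b = 0.3 via a cubic warp of an even grid on [-1, 1].
        u = np.linspace(-1.0, 1.0, n_grid)
        concentrated = 0.3 + np.where(u < 0, 0.3, 0.7) * u ** 3  # monotone map onto [0, 1]

        for name, grid in [("even", even), ("concentrated", concentrated)]:
            err = np.max(np.abs(CubicSpline(grid, q(grid))(b_eval) - q(b_eval)))
            print(f"{name:>12s} grid, max interpolation error: {err:.5f}")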

  • Quantitative Properties of Sovereign Default Models: Solution Methods Matter
    Review of Economic Dynamics, 2010
    Co-Authors: Juan Carlos Hatchondo, Leonardo Martinez, Horacio Sapriza
    Abstract:

    We study the sovereign default model that has been used to account for the cyclical behavior of interest rates in emerging market economies. This model is often solved using the discrete state space technique with evenly spaced grid points. We show that this method requires a large number of grid points to avoid generating spurious interest rate movements, which makes the discrete state space technique significantly less efficient than using Chebyshev polynomials or cubic spline interpolation to approximate the value functions. We show that the inefficiency of the discrete state space technique is more severe for parameterizations that feature a high sensitivity of the bond price to the borrowing level at the borrowing levels observed most frequently in the simulations. In addition, we find that the efficiency of the discrete state space technique can be greatly improved by (i) finding the equilibrium as the limit of the equilibrium of the finite-horizon version of the model, instead of iterating separately on the value and bond price functions, and (ii) concentrating grid points at asset levels at which the bond price is more sensitive to the borrowing level and at levels that are observed more often in the model simulations. Our analysis questions the robustness of results in the sovereign default literature and is also relevant for the study of other credit markets.

  • Online Appendix to "Quantitative properties of sovereign default models: solution methods"
    2010
    Co-Authors: Juan Carlos Hatchondo, Leonardo Martinez, Horacio Sapriza
    Abstract:

    This document describes how we evaluate the accuracy of the solution of the baseline sovereign default model using the test proposed by den Haan and Marcet (1994). We show that the solutions obtained using Chebyshev collocation and cubic spline interpolation approximate the equilibrium with reasonable accuracy, and we illustrate the challenges that arise when the test is applied to the solution obtained using the discrete state space technique.
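
    A hedged sketch of the den Haan-Marcet testing idea with stand-in residuals rather than the model's equilibrium conditions: under an accurate solution, residuals are orthogonal to time-t instruments, so the quadratic form below is asymptotically chi-squared and its simulated tail frequencies should be close to nominal.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(seed=5)
        T, k, n_sim = 2_000, 3, 500
        stats = []
        for _ in range(n_sim):
            z = rng.standard_normal(T)                     # stand-in simulated state
            u = rng.standard_normal(T)                     # stand-in residuals (accurate case)
            h = np.column_stack([np.ones(T), z, z ** 2])   # time-t instruments
            M = h.T @ u / T
            A = (h * u[:, None]).T @ (h * u[:, None]) / T  # covariance estimate
            stats.append(T * M @ np.linalg.solve(A, M))

        stats = np.array(stats)
        print("lower-tail share:", np.mean(stats < chi2.ppf(0.05, k)),
              "upper-tail share:", np.mean(stats > chi2.ppf(0.95, k)))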

  • Quantitative Properties of Sovereign Default Models: Solution Methods Matter, Working Paper 10-04
    2010
    Co-Authors: Juan Carlos Hatchondo, Leonardo Martinez, Horacio Sapriza
    Abstract:

    We study the sovereign default model that has been used to account for the cyclical behavior of interest rates in emerging market economies. This model is often solved using the discrete state space technique with evenly spaced grid points. We show that this method requires a large number of grid points to avoid generating spurious interest rate movements, which makes the discrete state space technique significantly less efficient than using Chebyshev polynomials or cubic spline interpolation to approximate the value functions. We show that the inefficiency of the discrete state space technique is more severe for parameterizations that feature a high sensitivity of the bond price to the borrowing level at the borrowing levels observed most frequently in the simulations. In addition, we find that the efficiency of the discrete state space technique can be greatly improved by (i) finding the equilibrium as the limit of the equilibrium of the finite-horizon version of the model, instead of iterating separately on the value and bond price functions, and (ii) concentrating grid points at asset levels at which the bond price is more sensitive to the borrowing level and at levels that are observed more often in the model simulations. Our analysis is also relevant for the study of other credit markets. (WP 10-04 replaces earlier versions listed as WP 09-13 and WP 06-11.)

Edward J. Calabrese - One of the best experts on this subject based on the ideXlab platform.

  • The EPA Cancer Risk Assessment Default Model Proposal: Moving Away From the LNT.
    Dose-response, 2018
    Co-Authors: Edward J. Calabrese, Jaap C. Hanekamp, Dima Yazji Shamoun
    Abstract:

    This article strongly supports the Environmental Protection Agency proposal to make significant changes in its cancer risk assessment principles and practices by moving away from the use of the linear no-threshold (LNT) dose-response as the default model. An alternative approach is proposed based on model uncertainty, which integrates the most scientifically supportable features of the threshold, hormesis, and LNT models to identify the doses that optimize population-based responses (i.e., maximize health benefits/minimize health harm). This novel approach to cancer risk assessment represents a significant improvement over the current LNT default method from scientific and public health perspectives.
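
    An illustrative sketch of the model-uncertainty idea: average risk predictions from LNT, threshold, and hormetic dose-response forms and locate the dose minimizing the averaged risk. All curves, units, and weights are hypothetical placeholders, not the article's quantitative proposal.

        import numpy as np

        dose = np.linspace(0.0, 10.0, 1_001)                   # arbitrary dose units
        lnt = 0.01 * dose                                      # linear no-threshold
        threshold = 0.01 * np.clip(dose - 3.0, 0.0, None)      # zero risk below a threshold
        hormetic = 0.01 * dose - 0.02 * dose * np.exp(-dose)   # low-dose benefit (negative risk)

        weights = {"LNT": 0.3, "threshold": 0.3, "hormetic": 0.4}  # assumed model weights
        avg_risk = (weights["LNT"] * lnt + weights["threshold"] * threshold
                    + weights["hormetic"] * hormetic)

        # The dose optimizing the population-based response under model averaging.
        print("risk-minimizing dose:", dose[np.argmin(avg_risk)])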

  • From Muller to mechanism: how LNT became the default model for cancer risk assessment
    Environmental Pollution, 2018
    Co-Authors: Edward J. Calabrese
    Abstract:

    This paper summarizes the historical and scientific foundations of the linear no-threshold (LNT) cancer risk assessment model. The story of cancer risk assessment is an extraordinary one: it was based on an initially incorrect gene-mutation interpretation by Muller, the application of this incorrect assumption in the derivation of the LNT single-hit model, and a series of actions by leading radiation geneticists during the 1946-1956 period, including a National Academy of Sciences (NAS) Biological Effects of Atomic Radiation (BEAR) I Genetics Panel (Anonymous, 1956), to sustain the LNT belief via a series of deliberate obfuscations, deceptions and misrepresentations that provided the basis of modern cancer risk assessment policy and practices. The reaffirmation of the LNT model by a subsequent and highly influential NAS Biological Effects of Ionizing Radiation (BEIR) I Committee (NAS/NRC, 1972) using mouse data has now been found to be inappropriate, based on the discovery of a significant documented error in the historical control group that led to incorrect estimations of risk in the low-dose zone. Correction of this error by the original scientists, and the application of the adjusted/corrected data to the BEIR I (NAS/NRC, 1972) report, indicates that the data would have supported a threshold rather than the LNT model. Thus, cancer risk assessment has a poorly appreciated, complex and seriously flawed history that has undermined the policies and practices of regulatory agencies in the U.S. and worldwide to the present time.

  • Hormesis outperforms threshold model in National Cancer Institute antitumor drug screening database
    Toxicological Sciences, 2006
    Co-Authors: Edward J. Calabrese, John Staudenmayer, Edward J Stanek, George R Hoffmann
    Abstract:

    Which dose-response model best explains low-dose responses is a critical issue in toxicology, pharmacology, and risk assessment. The present paper utilized the U.S. National Cancer Institute yeast screening database, which contains 56,914 dose-response studies representing the replicated effects of 2,189 chemically diverse possible antitumor drugs on cell proliferation in 13 different yeast strains. Multiple evaluation methods indicated that the observed data are inconsistent with the threshold model while supporting the hormetic model. Hormetic response patterns were observed approximately four times more often than would be expected by chance alone. The data call for the rejection of the threshold model for low-dose prediction, and they support the hormetic model as the default model for scientific interpretation of low-dose toxicological responses.
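
    A hedged sketch of the kind of head-to-head comparison the study performs: fit a threshold model and a hormetic model (a simplified Brain-Cousens form) to a dose-response series and compare residual error. The data below are synthetic and purely illustrative.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(seed=6)
        dose = np.array([0.0, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
        # Synthetic response (% of control): mild low-dose stimulation, high-dose decline.
        resp = np.array([100.0, 106, 110, 104, 80, 45, 20]) + rng.normal(0, 2, 7)

        def threshold_model(d, c, t, s):           # flat up to t, linear decline after
            return c - s * np.clip(d - t, 0.0, None)

        def hormetic_model(d, c, f, e):            # simplified Brain-Cousens shape
            return (c + f * d) / (1.0 + (d / e) ** 2)

        p_thr, _ = curve_fit(threshold_model, dose, resp, p0=[100.0, 3.0, 3.0], maxfev=10_000)
        p_hor, _ = curve_fit(hormetic_model, dose, resp, p0=[100.0, 20.0, 3.0], maxfev=10_000)

        for name, model, p in [("threshold", threshold_model, p_thr),
                               ("hormetic", hormetic_model, p_hor)]:
            sse = np.sum((resp - model(dose, *p)) ** 2)
            print(f"{name:>9s} model, SSE: {sse:.1f}")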

  • Hormesis: from marginalization to mainstream. A case for hormesis as the default dose-response model in risk assessment
    Toxicology and Applied Pharmacology, 2004
    Co-Authors: Edward J. Calabrese
    Abstract:

    The paper provides an account of how the hormetic dose response has emerged in recent years as a serious dose-response model in toxicology and risk assessment after decades of extreme marginalization. In addition to providing the toxicological basis of this dose-response revival, the paper reexamines the concept of a default dose-response model in toxicology and risk assessment and argues that the hormetic model satisfies the criteria for such a default model (e.g., generalizability, frequency, applicability to risk assessment endpoints, false positive/negative potential, requirements for hazard assessment, reliability of risk estimates, capacity for validation of risk estimates, public health implications of risk estimates) better than its chief competitors, the threshold and linear-at-low-dose models. The selection of the hormetic model as the default model in risk assessment for noncarcinogens, and specifically for carcinogens, would have a profound impact on the practice of risk assessment and its societal implications.

Phiroz M. Bhagat - One of the best experts on this subject based on the ideXlab platform.

  • Embedding Theoretical Models in Neural Networks
    1992 American Control Conference, 1992
    Co-Authors: Mark A. Kramer, Michael L. Thompson, Phiroz M. Bhagat
    Abstract:

    A novel method for incorporating constraints and default models into neural networks is presented. The method involves a parallel arrangement of a default model and a radial basis function network. The training procedure accounts for equality and inequality constraints that must be satisfied for all future inputs to the network. In the case of linear equality constraints and no inequality constraints, training reduces to a quadratic problem possessing an analytical solution. The extrapolation properties of the model-based network are controllable to a greater extent than in previous network models.
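
    A minimal reconstruction of the parallel arrangement (our sketch, not the authors' code): the radial basis function network learns only the residual of a theoretical default model, and in the unconstrained least-squares case the weight training is exactly the quadratic problem with an analytical solution mentioned above.

        import numpy as np

        rng = np.random.default_rng(seed=7)
        x = rng.uniform(-3.0, 3.0, size=(200, 1))
        y = np.sin(x[:, 0]) + 0.3 * x[:, 0] + rng.normal(0.0, 0.05, 200)  # noisy plant data

        f_default = lambda x: 0.3 * x[:, 0]            # theoretical default model (assumed)

        centers = np.linspace(-3.0, 3.0, 15)[:, None]  # RBF centers and width (assumed)
        width = 0.8
        Phi = np.exp(-((x - centers.T) ** 2) / (2.0 * width ** 2))  # Gaussian basis, (200, 15)

        # Fit RBF weights to the residual y - f_default(x): an ordinary least-squares,
        # i.e. quadratic, problem with a closed-form solution.
        w, *_ = np.linalg.lstsq(Phi, y - f_default(x), rcond=None)

        y_hat = f_default(x) + Phi @ w
        print("train RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
        # Far from the training data the Gaussian terms vanish, so predictions
        # revert to the default model - the controllable extrapolation noted above.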