Risk Measurement


The Experts below are selected from a list of 162,000 Experts worldwide ranked by the ideXlab platform

Sandeep Juneja - One of the best experts on this subject based on the ideXlab platform.

  • kernel smoothing for nested estimation with application to portfolio Risk Measurement
    Operations Research, 2017
    Co-Authors: Jeff L Hong, Sandeep Juneja, Guangwu Liu
    Abstract:

    Nested estimation involves estimating an expectation of a function of a conditional expectation via simulation. This problem has lately received increasing attention among researchers due to its broad applicability, particularly in portfolio Risk Measurement and in pricing complex derivatives. In this paper, we study a kernel smoothing approach. We analyze its asymptotic properties, and present efficient algorithms for practical implementation. While asymptotic results suggest that the kernel smoothing approach is preferable to nested simulation only for low-dimensional problems, we propose a decomposition technique for portfolio Risk Measurement, through which a high-dimensional problem may be decomposed into low-dimensional ones that allow an efficient use of the kernel smoothing approach. Numerical studies show that, with the decomposition technique, the kernel smoothing approach works well for a reasonably large portfolio with 200 Risk factors. This suggests that the proposed methodology may serve...
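The kernel-smoothing idea can be sketched on a toy nested problem (this is an illustrative example of the general technique, not the authors' algorithm; the model X ~ N(0,1), Y = X + noise and all parameter choices are assumptions): estimate E[max(E[Y|X], 0)] using a single inner draw per outer scenario and a Nadaraya-Watson estimate of the conditional expectation in place of an inner simulation loop.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_smoothing_nested(n=2000, h=0.2):
    # Toy nested problem: estimate E[max(E[Y | X], 0)] where X ~ N(0,1)
    # and Y = X + noise, so E[Y | X] = X and the true value is
    # E[max(X, 0)] = 1/sqrt(2*pi) ~= 0.399.
    x = rng.normal(size=n)          # outer risk-factor scenarios
    y = x + rng.normal(size=n)      # ONE inner draw per scenario
    # Nadaraya-Watson estimate of g(x_i) = E[Y | X = x_i] with a
    # Gaussian kernel: smoothing across scenarios replaces the
    # inner simulation loop entirely.
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    g_hat = (w @ y) / w.sum(axis=1)
    return float(np.mean(np.maximum(g_hat, 0.0)))
```

Because the smoothing happens in the space of the conditioning variable, its accuracy degrades with dimension; the decomposition idea in the abstract is precisely about splitting a high-dimensional portfolio into low-dimensional pieces to which a smoother like this can be applied.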

  • nested simulation in portfolio Risk Measurement
    Management Science, 2010
    Co-Authors: Michael B Gordy, Sandeep Juneja
    Abstract:

    Risk Measurement for derivative portfolios almost invariably calls for nested simulation. In the outer step, one draws realizations of all Risk factors up to the horizon, and in the inner step, one reprices each instrument in the portfolio at the horizon conditional on the drawn Risk factors. Practitioners may perceive the computational burden of such nested schemes to be unacceptable and adopt a variety of second-best pricing techniques to avoid the inner simulation. In this paper, we question whether such short cuts are necessary. We show that a relatively small number of trials in the inner step can yield accurate estimates, and we analyze how a fixed computational budget may be allocated to the inner and the outer step to minimize the mean square error of the resultant estimator. Finally, we introduce a jackknife procedure for bias reduction.
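The outer/inner structure described above can be sketched on a toy model (the model, the threshold, and all sample sizes here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)

def nested_estimate(n_outer=2000, n_inner=50, threshold=1.0):
    # Toy model: risk factor Z ~ N(0,1) is drawn to the horizon (outer
    # step); the true conditional loss is E[Z + eps | Z] = Z, but each
    # scenario must be "repriced" from n_inner noisy inner trials.
    # Target: P(loss at horizon > threshold).
    z = rng.normal(size=n_outer)                  # outer step
    eps = rng.normal(size=(n_outer, n_inner))     # inner repricing trials
    loss_hat = (z[:, None] + eps).mean(axis=1)    # inner-step estimates
    return float(np.mean(loss_hat > threshold))
```

For a fixed budget of n_outer * n_inner inner trials, the paper's message is that n_inner can stay surprisingly small: the finite inner sample shows up only as an O(1/n_inner) bias in the exceedance probability, which is traded off against outer-step variance.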

  • nested simulation in portfolio Risk Measurement
    2008
    Co-Authors: Michael B Gordy, Sandeep Juneja
    Abstract:

    Risk Measurement for derivative portfolios almost invariably calls for nested simulation. In the outer step one draws realizations of all Risk factors up to the horizon, and in the inner step one re-prices each instrument in the portfolio at the horizon conditional on the drawn Risk factors. Practitioners may perceive the computational burden of such nested schemes to be unacceptable, and adopt a variety of second-best pricing techniques to avoid the inner simulation. In this paper, we question whether such short cuts are necessary. We show that a relatively small number of trials in the inner step can yield accurate estimates, and analyze how a fixed computational budget may be allocated to the inner and the outer step to minimize the mean square error of the resultant estimator. Finally, we introduce a jackknife procedure for bias reduction and a dynamic allocation scheme for improved efficiency.
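The jackknife step can be sketched on the same kind of toy model (an illustrative section-based variant, not the authors' exact scheme): the plain nested estimator carries an O(1/n_inner) bias, and recombining it with two half-sample estimators cancels the leading bias term.

```python
import numpy as np

rng = np.random.default_rng(2)

def prob_exceed(z, eps, threshold=1.0):
    # Plain nested estimator: fraction of scenarios whose inner-sample
    # mean loss exceeds the threshold.
    return float(np.mean((z[:, None] + eps).mean(axis=1) > threshold))

def jackknife_estimate(n_outer=4000, n_inner=20, threshold=1.0):
    # Bias of the plain estimator is ~ c / n_inner.  The half-sample
    # estimators have bias ~ c / (n_inner / 2) = 2c / n_inner, so
    # 2 * full - (half_a + half_b) / 2 cancels the leading term.
    z = rng.normal(size=n_outer)
    eps = rng.normal(size=(n_outer, n_inner))
    full = prob_exceed(z, eps, threshold)
    half_a = prob_exceed(z, eps[:, : n_inner // 2], threshold)
    half_b = prob_exceed(z, eps[:, n_inner // 2 :], threshold)
    return 2.0 * full - 0.5 * (half_a + half_b)
```

The price of the bias reduction is a modest variance increase, which is why the abstract pairs the jackknife with a budget-allocation analysis.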

Michael B Gordy - One of the best experts on this subject based on the ideXlab platform.

  • nested simulation in portfolio Risk Measurement
    Management Science, 2010
    Co-Authors: Michael B Gordy, Sandeep Juneja
    Abstract:

    Risk Measurement for derivative portfolios almost invariably calls for nested simulation. In the outer step, one draws realizations of all Risk factors up to the horizon, and in the inner step, one reprices each instrument in the portfolio at the horizon conditional on the drawn Risk factors. Practitioners may perceive the computational burden of such nested schemes to be unacceptable and adopt a variety of second-best pricing techniques to avoid the inner simulation. In this paper, we question whether such short cuts are necessary. We show that a relatively small number of trials in the inner step can yield accurate estimates, and we analyze how a fixed computational budget may be allocated to the inner and the outer step to minimize the mean square error of the resultant estimator. Finally, we introduce a jackknife procedure for bias reduction.

  • nested simulation in portfolio Risk Measurement
    2008
    Co-Authors: Michael B Gordy, Sandeep Juneja
    Abstract:

    Risk Measurement for derivative portfolios almost invariably calls for nested simulation. In the outer step one draws realizations of all Risk factors up to the horizon, and in the inner step one re-prices each instrument in the portfolio at the horizon conditional on the drawn Risk factors. Practitioners may perceive the computational burden of such nested schemes to be unacceptable, and adopt a variety of second-best pricing techniques to avoid the inner simulation. In this paper, we question whether such short cuts are necessary. We show that a relatively small number of trials in the inner step can yield accurate estimates, and analyze how a fixed computational budget may be allocated to the inner and the outer step to minimize the mean square error of the resultant estimator. Finally, we introduce a jackknife procedure for bias reduction and a dynamic allocation scheme for improved efficiency.

Giacomo Scandolo - One of the best experts on this subject based on the ideXlab platform.

  • robustness and sensitivity analysis of Risk Measurement procedures
    Quantitative Finance, 2010
    Co-Authors: Rama Cont, Romain Deguest, Giacomo Scandolo
    Abstract:

    Measuring the Risk of a financial portfolio involves two steps: estimating the loss distribution of the portfolio from available observations and computing a "Risk measure" which summarizes the Risk of the portfolio. We define the notion of "Risk Measurement procedure", which includes both of these steps, and introduce a rigorous framework for studying the robustness of Risk Measurement procedures and their sensitivity to changes in the data set. Our results point to a conflict between subadditivity and robustness of Risk Measurement procedures and show that the same Risk measure may exhibit quite different sensitivities depending on the estimation procedure used. Our results illustrate in particular that using recently proposed Risk measures like CVaR/expected shortfall leads to a less robust Risk Measurement procedure than historical Value at Risk. We also propose alternative Risk Measurement procedures which possess the robustness property.
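The qualitative finding, that historical VaR reacts far less to a single extreme data point than historical expected shortfall, is easy to reproduce in a quick experiment (the heavy-tailed sample, the outlier size, and the confidence level are arbitrary illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

def hist_var(losses, alpha=0.99):
    # Historical VaR: empirical alpha-quantile of the loss sample.
    return float(np.quantile(losses, alpha))

def hist_es(losses, alpha=0.99):
    # Historical expected shortfall: mean loss beyond the VaR level.
    q = np.quantile(losses, alpha)
    return float(losses[losses >= q].mean())

losses = rng.standard_t(df=4, size=1000)   # heavy-tailed loss sample
contaminated = np.append(losses, 50.0)     # one extreme observation

var_shift = hist_var(contaminated) - hist_var(losses)
es_shift = hist_es(contaminated) - hist_es(losses)
# The quantile (VaR) barely moves, while expected shortfall jumps by
# roughly outlier / (number of tail observations).
```

This is exactly the robustness/sensitivity trade-off the abstract describes: the averaging in the tail that makes expected shortfall subadditive is also what makes its historical estimator sensitive to single observations.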

  • Robustness and Sensitivity Analysis of Risk Measurement Procedures
    SSRN Electronic Journal, 2008
    Co-Authors: Rama Cont, Romain Deguest, Giacomo Scandolo
    Abstract:

    Measuring the Risk of a financial portfolio involves two steps: estimating the loss distribution of the portfolio from available observations and computing a "Risk measure" which summarizes the Risk of the portfolio. We define the notion of "Risk Measurement procedure", which includes both of these steps, and study the robustness of Risk Measurement procedures and their sensitivity to a change in the data set. After introducing a rigorous definition of 'robustness' of a Risk Measurement procedure, we illustrate the presence of a conflict between subadditivity and robustness of Risk Measurement procedures. We propose a measure of sensitivity for Risk Measurement procedures and compute the sensitivity function of various examples of Risk estimators used in financial Risk management, showing that the same Risk measure may exhibit quite different sensitivities depending on the estimation procedure used. Our results illustrate in particular that using historical Value at Risk leads to a more robust procedure for Risk Measurement than recently proposed alternatives like CVaR. We also propose other Risk Measurement procedures which possess the robustness property.

Paolo Vicig - One of the best experts on this subject based on the ideXlab platform.

  • Financial Risk Measurement with imprecise probabilities
    International Journal of Approximate Reasoning, 2008
    Co-Authors: Paolo Vicig
    Abstract:

    Although financial Risk Measurement is a largely investigated research area, its relationship with imprecise probabilities has been mostly overlooked. However, Risk measures can be viewed as instances of upper (or lower) previsions, thus letting us apply the theory of imprecise previsions to them. After a presentation of some well known Risk measures, including Value-at-Risk (VaR) and coherent and convex Risk measures, we show how their definitions can be generalized and discuss their consistency properties. Thus, for instance, VaR may or may not avoid sure loss, and conditions for this can be derived. This analysis also leads us to consider a very large class of imprecise previsions, which we term convex previsions, generalizing convex Risk measures. Shortfall-based measures and Dutch Risk measures are also investigated. Further, conditional Risks can be measured by introducing conditional convex previsions. Finally, we analyze the role in Risk Measurement of some important notions in the theory of imprecise probabilities, like the natural extension or the envelope theorems.

  • Convex Imprecise Previsions for Risk Measurement
    2003
    Co-Authors: Renato Pelessoni, Paolo Vicig
    Abstract:

    In this paper we introduce convex imprecise previsions as a special class of imprecise previsions, showing that they retain or generalise most of the relevant properties of coherent imprecise previsions but are not necessarily positively homogeneous. The broader class of weakly convex imprecise previsions is also studied and its fundamental properties are demonstrated. The notions of weak convexity and convexity are then applied to Risk Measurement, leading to a more general definition of convex Risk measure than the one already known in Risk Measurement literature.
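A concrete instance of a convex but not positively homogeneous risk measure is the entropic risk measure rho(X) = (1/theta) * log E[exp(-theta X)] (this specific example is chosen here for illustration and is not claimed to appear in the paper):

```python
import numpy as np

rng = np.random.default_rng(4)

def entropic_risk(x, theta=1.0):
    # Entropic risk measure rho(X) = (1/theta) * log E[exp(-theta * X)],
    # applied to a P&L sample x.  It is convex and monotone but NOT
    # positively homogeneous, hence convex rather than coherent.
    return float(np.log(np.mean(np.exp(-theta * x))) / theta)

x = rng.normal(size=100_000)     # P&L sample, X ~ N(0, 1)
r1 = entropic_risk(x)            # closed form for N(0,1): theta/2 = 0.5
r2 = entropic_risk(2.0 * x)      # closed form: 2.0, not 2 * r1 = 1.0
# rho(2X) != 2 * rho(X): doubling the position more than doubles the
# risk charge, which positive homogeneity would forbid.
```

Dropping positive homogeneity in exactly this way is what distinguishes the convex (and weakly convex) previsions of the paper from coherent ones.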

  • Imprecise previsions for Risk Measurement
    International Journal of Uncertainty Fuzziness and Knowledge-Based Systems, 2003
    Co-Authors: Renato Pelessoni, Paolo Vicig
    Abstract:

    In this paper the theory of coherent imprecise previsions is applied to Risk Measurement. We introduce the notion of coherent Risk measure defined on an arbitrary set of Risks, showing that it can be considered a special case of coherent upper prevision. We also prove that our definition generalizes the notion of coherence for Risk measures defined on a linear space of random numbers, given in the literature. Consistency properties of Value-at-Risk (VaR), currently one of the most used Risk measures, are investigated too, showing that it does not necessarily satisfy a weaker notion of consistency called 'avoiding sure loss'. We introduce sufficient conditions for VaR to avoid sure loss and to be coherent. Finally, we discuss ways of modifying incoherent Risk measures into coherent ones.
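The consistency concerns around VaR can be illustrated with the classic two-bond example in which VaR fails subadditivity (the default probability, loss size, and confidence level are the standard textbook choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(5)

def var(losses, alpha=0.95):
    # VaR as the empirical alpha-quantile of the loss distribution.
    return float(np.quantile(losses, alpha))

n = 200_000
# Two independent bonds, each losing 100 with probability 0.04.
l1 = 100.0 * (rng.random(n) < 0.04)
l2 = 100.0 * (rng.random(n) < 0.04)

# Each bond alone: P(loss = 0) = 0.96 > 0.95, so the 95% VaR is 0.
stand_alone = var(l1) + var(l2)
# Pooled portfolio: P(no default) = 0.96**2 ~= 0.9216 < 0.95, so the
# 95% VaR jumps to 100 -- diversification appears to CREATE risk,
# violating subadditivity.
pooled = var(l1 + l2)
```

Failures like this are what motivate replacing or repairing VaR with measures that are coherent, or at least avoid sure loss, as the abstract discusses.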

Tao Ju-chun - One of the best experts on this subject based on the ideXlab platform.