Efficient Frontier

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 360 Experts worldwide ranked by ideXlab platform

Marcos Lopez De Prado - One of the best experts on this subject based on the ideXlab platform.

  • a robust estimator of the Efficient Frontier
    Social Science Research Network, 2016
    Co-Authors: Marcos Lopez De Prado
    Abstract:

    Convex optimization solutions tend to be unstable, to the point of entirely offsetting the benefits of optimization. For example, in the context of financial applications, it is known that portfolios optimized in-sample often underperform the naive (equal-weights) allocation out-of-sample. This instability can be traced back to two sources: (i) noise in the input variables; and (ii) signal structure that magnifies the estimation errors in the input variables. A first innovation of this paper is to introduce the nested clustered optimization algorithm (NCO), a method that tackles both sources of instability. Over the past 60 years, various approaches have been developed to address these two sources of instability. These approaches are flawed in the sense that different methods may be appropriate for different input variables, and it is unrealistic to expect that one method will dominate all the rest under all circumstances. Accordingly, a second innovation of this paper is to introduce MCOS, a Monte Carlo approach that estimates the allocation error produced by various optimization methods on a particular set of input variables. The result is a precise determination of which method is most robust in a particular case. Thus, rather than always relying on one particular approach, MCOS allows users to apply opportunistically whichever optimization method is best suited to a particular setting. Presentation materials are available at: https://ssrn.com/abstract=3469964.
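
The clustering-then-optimize idea behind NCO can be sketched as follows: cluster assets by correlation, solve a minimum-variance problem within each cluster, then allocate across the reduced cluster covariance. This is an illustrative reduction under assumed inputs (a covariance matrix and a cluster count), not the paper's exact algorithm; `min_var_weights` and `nco_weights` are hypothetical helper names.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def min_var_weights(cov):
    """Closed-form minimum-variance weights: w proportional to inv(Sigma) @ 1."""
    inv = np.linalg.inv(cov)
    w = inv.sum(axis=1)
    return w / w.sum()

def nco_weights(cov, n_clusters=2):
    """Sketch of nested clustered optimization: intra-cluster min-var
    portfolios first, then min-var allocation across cluster portfolios."""
    sd = np.sqrt(np.diag(cov))
    corr = cov / np.outer(sd, sd)
    dist = np.sqrt(0.5 * (1 - corr))            # correlation distance
    condensed = dist[np.triu_indices_from(dist, 1)]
    labels = fcluster(linkage(condensed, method="ward"),
                      n_clusters, criterion="maxclust")
    intra = {}
    for k in range(1, n_clusters + 1):
        idx = np.where(labels == k)[0]
        intra[k] = (idx, min_var_weights(cov[np.ix_(idx, idx)]))
    # reduced covariance between the cluster portfolios
    cluster_cov = np.zeros((n_clusters, n_clusters))
    for a in range(1, n_clusters + 1):
        for b in range(1, n_clusters + 1):
            ia, wa = intra[a]
            ib, wb = intra[b]
            cluster_cov[a - 1, b - 1] = wa @ cov[np.ix_(ia, ib)] @ wb
    inter = min_var_weights(cluster_cov)
    w = np.zeros(len(cov))
    for k in range(1, n_clusters + 1):
        idx, wk = intra[k]
        w[idx] = wk * inter[k - 1]
    return w
```

Splitting the problem this way shrinks each matrix inversion to a nearly block-diagonal piece, which is the intuition behind the method's robustness to noise.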

  • the sharpe ratio Efficient Frontier
    Journal of Risk, 2012
    Co-Authors: David H Bailey, Marcos Lopez De Prado
    Abstract:

    We evaluate the probability that an estimated Sharpe ratio exceeds a given threshold in the presence of non-Normal returns. We show that this new uncertainty-adjusted investment skill metric (called the Probabilistic Sharpe Ratio, or PSR) has a number of important applications: First, it allows us to establish the track record length needed for rejecting the hypothesis that a measured Sharpe ratio is below a certain threshold with a given confidence level. Second, it models the trade-off between track record length and undesirable statistical features (e.g., negative skewness with positive excess kurtosis). Third, it explains why track records with those undesirable traits would benefit from reporting performance at the highest sampling frequency for which the IID assumption is not violated. Fourth, it permits the computation of what we call the Sharpe Ratio Efficient Frontier (SEF), which lets us optimize a portfolio under non-Normal, leveraged returns while incorporating the uncertainty derived from track record length. Results can be validated using the Python code in the Appendix. Keywords: Sharpe ratio, Efficient Frontier, IID, Normal distribution, skewness, excess kurtosis, track record. JEL Classifications: C02, G11, G14, D53.
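
The PSR described above has a published closed form: the estimated Sharpe ratio's deviation from the benchmark is standardized by a variance term that depends on sample skewness and kurtosis, then passed through the Normal CDF. The sketch below follows that formula as commonly stated; the authoritative implementation is the Python code in the paper's own Appendix.

```python
import numpy as np
from scipy.stats import norm, skew, kurtosis

def probabilistic_sharpe_ratio(returns, sr_benchmark=0.0):
    """P(true Sharpe ratio > sr_benchmark), adjusting the estimate's
    sampling error for skewness and kurtosis of the return series."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    sr = r.mean() / r.std(ddof=1)     # estimated (per-period) Sharpe ratio
    g3 = skew(r)                      # sample skewness
    g4 = kurtosis(r, fisher=False)    # sample kurtosis (Normal = 3)
    z = (sr - sr_benchmark) * np.sqrt(n - 1) / np.sqrt(
        1.0 - g3 * sr + (g4 - 1.0) / 4.0 * sr ** 2)
    return norm.cdf(z)
```

For Normal returns (zero skewness, kurtosis 3) the denominator collapses toward 1 and PSR reduces to a plain z-test on the Sharpe estimate; negative skewness and excess kurtosis inflate the denominator and lower the reported confidence.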

  • the sharpe ratio Efficient Frontier
    Social Science Research Network, 2012
    Co-Authors: David H Bailey, Marcos Lopez De Prado
    Abstract:

    We evaluate the probability that an estimated Sharpe ratio exceeds a given threshold in the presence of non-Normal returns. We show that this new uncertainty-adjusted investment skill metric (called the Probabilistic Sharpe Ratio, or PSR) has a number of important applications: First, it allows us to establish the track record length needed for rejecting the hypothesis that a measured Sharpe ratio is below a certain threshold with a given confidence level. Second, it models the trade-off between track record length and undesirable statistical features (e.g., negative skewness with positive excess kurtosis). Third, it explains why track records with those undesirable traits would benefit from reporting performance at the highest sampling frequency for which the IID assumption is not violated. Fourth, it permits the computation of what we call the Sharpe Ratio Efficient Frontier (SEF), which lets us optimize a portfolio under non-Normal, leveraged returns while incorporating the uncertainty derived from track record length. Results can be validated using the Python code in the Appendix.

Jesús T. Pastor - One of the best experts on this subject based on the ideXlab platform.

  • estimating and decomposing overall inefficiency by determining the least distance to the strongly Efficient Frontier in data envelopment analysis
    Operational Research, 2020
    Co-Authors: Juan Aparicio, Jesús T. Pastor, Jose L Sainzpardo, Fernando Vidal
    Abstract:

    This paper proposes a new method to measure the economic inefficiency of decision making units, based on the calculation of the least distance to the Pareto-Efficient Frontier in data envelopment analysis. While all previously published approaches to determining least distances to the Efficient Frontier focus exclusively on technical inefficiency, the new methodology opens the door to applications of this approach when market prices, together with inputs and outputs, are available. Finally, the paper empirically illustrates the new method using recent data on mandarin production in an eastern Spanish province.

  • graph productivity change measure using the least distance to the pareto Efficient Frontier in data envelopment analysis
    Omega-international Journal of Management Science, 2017
    Co-Authors: Juan Aparicio, Eva M Garcianove, Magdalena Kapelko, Jesús T. Pastor
    Abstract:

    This paper proposes a new method to measure productivity change of decision making units in the full input-output space. The new approach is based on the calculation of the least distance to the Pareto-Efficient Frontier and hence provides the closest targets for evaluated decision making units to reach the strongly Efficient Frontier with least effort. Another advantage of the new methodology is that it always leads to feasible solutions. The productivity change in the new approach is operationalized as a Luenberger-type indicator in the Data Envelopment Analysis framework and it is decomposed into efficiency change and technical change. The paper empirically illustrates the new method using recent data on the Spanish quality wine sector.

  • estimating and decomposing overall inefficiency by determining the least distance to the strongly Efficient Frontier in data envelopment analysis
    Social Science Research Network, 2017
    Co-Authors: Juan Aparicio, Jesús T. Pastor, Jose L Sainzpardo, Fernando Vidal
    Abstract:

    This paper proposes a new method to measure the economic inefficiency of decision making units, based on the calculation of the least distance to the Pareto-Efficient Frontier in Data Envelopment Analysis (DEA). While all previously published approaches to determining least distances to the Efficient Frontier focus exclusively on technical inefficiency, the new methodology opens the door to applications of this approach when market prices, together with inputs and outputs, are available. Finally, the paper empirically illustrates the new method using recent data on mandarin production in an eastern Spanish province.

  • the determination of the least distance to the strongly Efficient Frontier in data envelopment analysis oriented models modelling and computational aspects
    MPRA Paper, 2016
    Co-Authors: Juan Aparicio, Jose Manuel Cordero, Jesús T. Pastor
    Abstract:

    Determining the least distance to the Efficient Frontier for estimating technical inefficiency, with the consequent determination of closest targets, has been one of the relevant issues in the recent Data Envelopment Analysis literature. This new paradigm contrasts with traditional approaches, which yield furthest targets. In this respect, several techniques have been proposed to implement the new paradigm. One group of these techniques is based on identifying all the Efficient faces of the polyhedral production possibility set and is therefore associated with solving an NP-hard problem. In contrast, a second group proposes different models and particular algorithms that solve the problem while avoiding the explicit identification of all these faces. These techniques have been applied more or less successfully. Nonetheless, the new paradigm is still unsatisfactory and incomplete to a certain extent. One of the remaining challenges relates to measuring technical inefficiency in the context of oriented models, i.e., models that aim at changing inputs or outputs but not both. In this paper, we show that existing techniques for determining the least distance without explicitly identifying the Frontier structure, developed for graph measures that change inputs and outputs at the same time, do not work for oriented models. Consequently, a new methodology for satisfactorily handling these situations is proposed. Finally, the new approach is empirically tested using a recent PISA database consisting of 902 schools.
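
For context, the traditional oriented model that this line of work contrasts with can be written as a small linear program. The sketch below implements the classical output-oriented radial CCR model under constant returns to scale, which projects onto a furthest target; the least-distance alternative discussed in the abstract replaces this projection with a closest-target search over efficient faces. Function name and data layout are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def output_oriented_ccr(X, Y, o):
    """Classical output-oriented radial CCR efficiency of DMU o.
    X: inputs (m x n), Y: outputs (s x n), columns are DMUs.
    Returns phi >= 1; phi = 1 means radially Efficient."""
    m, n = X.shape
    s, _ = Y.shape
    # decision variables: [phi, lambda_1, ..., lambda_n]
    # maximize phi  ->  minimize -phi
    c = np.concatenate(([-1.0], np.zeros(n)))
    # input constraints: sum_j lambda_j * x_ij <= x_io
    A_in = np.hstack([np.zeros((m, 1)), X])
    b_in = X[:, o]
    # output constraints: phi * y_ro - sum_j lambda_j * y_rj <= 0
    A_out = np.hstack([Y[:, [o]], -Y])
    b_out = np.zeros(s)
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]
```

The radial expansion factor phi moves all outputs proportionally until the Frontier is reached, which is precisely why the resulting target can be far from the evaluated unit in the general case.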

  • closest targets and strong monotonicity on the strongly Efficient Frontier in dea
    Omega-international Journal of Management Science, 2014
    Co-Authors: Juan Aparicio, Jesús T. Pastor
    Abstract:

    The determination of closest Efficient targets has attracted the increasing interest of researchers in the recent Data Envelopment Analysis (DEA) literature. Several methods have been introduced in this respect. However, only a few attempts exist that analyze the implications of using closest targets for technical inefficiency measurement. In particular, least-distance measures based on Hölder norms satisfy neither weak nor strong monotonicity on the strongly Efficient Frontier. In this paper, we study Hölder distance functions and show why strong monotonicity fails. Along this line, we provide a solution for output-oriented models that ensures strong monotonicity on the strongly Efficient Frontier. Our approach may also be extended to the most general case, i.e., non-oriented models, under some regularity conditions.

Juan Aparicio - One of the best experts on this subject based on the ideXlab platform.

  • estimating and decomposing overall inefficiency by determining the least distance to the strongly Efficient Frontier in data envelopment analysis
    Operational Research, 2020
    Co-Authors: Juan Aparicio, Jesús T. Pastor, Jose L Sainzpardo, Fernando Vidal
    Abstract:

    This paper proposes a new method to measure the economic inefficiency of decision making units, based on the calculation of the least distance to the Pareto-Efficient Frontier in data envelopment analysis. While all previously published approaches to determining least distances to the Efficient Frontier focus exclusively on technical inefficiency, the new methodology opens the door to applications of this approach when market prices, together with inputs and outputs, are available. Finally, the paper empirically illustrates the new method using recent data on mandarin production in an eastern Spanish province.

  • graph productivity change measure using the least distance to the pareto Efficient Frontier in data envelopment analysis
    Omega-international Journal of Management Science, 2017
    Co-Authors: Juan Aparicio, Eva M Garcianove, Magdalena Kapelko, Jesús T. Pastor
    Abstract:

    This paper proposes a new method to measure productivity change of decision making units in the full input-output space. The new approach is based on the calculation of the least distance to the Pareto-Efficient Frontier and hence provides the closest targets for evaluated decision making units to reach the strongly Efficient Frontier with least effort. Another advantage of the new methodology is that it always leads to feasible solutions. The productivity change in the new approach is operationalized as a Luenberger-type indicator in the Data Envelopment Analysis framework and it is decomposed into efficiency change and technical change. The paper empirically illustrates the new method using recent data on the Spanish quality wine sector.

  • estimating and decomposing overall inefficiency by determining the least distance to the strongly Efficient Frontier in data envelopment analysis
    Social Science Research Network, 2017
    Co-Authors: Juan Aparicio, Jesús T. Pastor, Jose L Sainzpardo, Fernando Vidal
    Abstract:

    This paper proposes a new method to measure the economic inefficiency of decision making units, based on the calculation of the least distance to the Pareto-Efficient Frontier in Data Envelopment Analysis (DEA). While all previously published approaches to determining least distances to the Efficient Frontier focus exclusively on technical inefficiency, the new methodology opens the door to applications of this approach when market prices, together with inputs and outputs, are available. Finally, the paper empirically illustrates the new method using recent data on mandarin production in an eastern Spanish province.

  • the determination of the least distance to the strongly Efficient Frontier in data envelopment analysis oriented models modelling and computational aspects
    MPRA Paper, 2016
    Co-Authors: Juan Aparicio, Jose Manuel Cordero, Jesús T. Pastor
    Abstract:

    Determining the least distance to the Efficient Frontier for estimating technical inefficiency, with the consequent determination of closest targets, has been one of the relevant issues in the recent Data Envelopment Analysis literature. This new paradigm contrasts with traditional approaches, which yield furthest targets. In this respect, several techniques have been proposed to implement the new paradigm. One group of these techniques is based on identifying all the Efficient faces of the polyhedral production possibility set and is therefore associated with solving an NP-hard problem. In contrast, a second group proposes different models and particular algorithms that solve the problem while avoiding the explicit identification of all these faces. These techniques have been applied more or less successfully. Nonetheless, the new paradigm is still unsatisfactory and incomplete to a certain extent. One of the remaining challenges relates to measuring technical inefficiency in the context of oriented models, i.e., models that aim at changing inputs or outputs but not both. In this paper, we show that existing techniques for determining the least distance without explicitly identifying the Frontier structure, developed for graph measures that change inputs and outputs at the same time, do not work for oriented models. Consequently, a new methodology for satisfactorily handling these situations is proposed. Finally, the new approach is empirically tested using a recent PISA database consisting of 902 schools.

  • closest targets and strong monotonicity on the strongly Efficient Frontier in dea
    Omega-international Journal of Management Science, 2014
    Co-Authors: Juan Aparicio, Jesús T. Pastor
    Abstract:

    The determination of closest Efficient targets has attracted the increasing interest of researchers in the recent Data Envelopment Analysis (DEA) literature. Several methods have been introduced in this respect. However, only a few attempts exist that analyze the implications of using closest targets for technical inefficiency measurement. In particular, least-distance measures based on Hölder norms satisfy neither weak nor strong monotonicity on the strongly Efficient Frontier. In this paper, we study Hölder distance functions and show why strong monotonicity fails. Along this line, we provide a solution for output-oriented models that ensures strong monotonicity on the strongly Efficient Frontier. Our approach may also be extended to the most general case, i.e., non-oriented models, under some regularity conditions.

Eric J Gonzales - One of the best experts on this subject based on the ideXlab platform.

  • Efficient Frontier of route choice for modeling the equilibrium under travel time variability with heterogeneous traveler preferences
    Economics of Transportation, 2017
    Co-Authors: Mahyar Amirgholy, Eric J Gonzales
    Abstract:

    Travelers consider both the average duration and the reliability of travel time when choosing their route. However, the relative importance of average travel time and reliability not only depends on the purpose of the trip, but also varies from one person to another. Users seek to minimize their travel costs, leading to an equilibrium condition in which they choose routes such that they cannot reduce the generalized cost of their own trip. In this paper, we adopt the concept of the Efficient Frontier to represent the equilibrium route choice of heterogeneous users in a network under travel time variability. Then, we use the primary properties of the Efficient Frontier to propose a mathematical formulation of the route choice problem for a discrete or continuous distribution of users' sensitivity to variations in route travel times. An analytical algorithm is designed to assign the heterogeneous demand to the network. The efficiency of the proposed algorithm in solving the route choice problem is also compared, in a numerical example, with a classic iterative method using a smoothing factor.
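
The core idea can be illustrated with a small sketch: the Efficient Frontier is the set of routes not dominated in (mean travel time, travel time variability), and a traveler with sensitivity theta picks the frontier route minimizing the generalized cost mean + theta * std. This illustrates the concept only, not the paper's equilibrium formulation; function names and the (mean, std) data layout are assumptions.

```python
def efficient_routes(routes):
    """routes: list of (mean_time, std_time) tuples.
    Returns indices of routes on the Efficient Frontier,
    i.e. not dominated in both mean and variability."""
    frontier = []
    for i, (m_i, s_i) in enumerate(routes):
        dominated = any(
            m_j <= m_i and s_j <= s_i and (m_j < m_i or s_j < s_i)
            for j, (m_j, s_j) in enumerate(routes) if j != i)
        if not dominated:
            frontier.append(i)
    return frontier

def choose_route(routes, theta):
    """Frontier route minimizing generalized cost: mean + theta * std.
    theta encodes a traveler's sensitivity to travel time variability."""
    idx = efficient_routes(routes)
    return min(idx, key=lambda i: routes[i][0] + theta * routes[i][1])
```

Heterogeneity enters through theta: as it sweeps from 0 (time-only) upward (reliability-averse), the chosen route traces out the frontier, which is why the frontier suffices to describe equilibrium route choice for the whole population.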

Enrique Ballestero - One of the best experts on this subject based on the ideXlab platform.

  • mean semivariance Efficient Frontier a downside risk model for portfolio selection
    Applied Mathematical Finance, 2005
    Co-Authors: Enrique Ballestero
    Abstract:

    An ongoing stream in financial analysis proposes mean-semivariance in place of mean-variance as an alternative approach to portfolio selection, since segments of investors are more averse to returns below the mean value than to deviations above and below it. Accordingly, this paper searches for a stochastic programming model in which the portfolio semivariance is the objective function to be minimized subject to standard parametric constraints, which leads to the mean-semivariance Efficient Frontier. The proposed model relies on an empirically tested basis, namely, portfolio diversification and the empirical validity of Sharpe's beta regression equation relating each asset's return to the market. From this basis, the matrix form of the portfolio semivariance is strictly mathematically derived, so an operational quadratic objective function is obtained without resorting to heuristics. Ease of computation is highlighted by a numerical example, which allows one to compare the results from the proposed mean...
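
A scenario-based numerical sketch conveys what mean-semivariance optimization computes: minimize the average squared shortfall below a target, subject to a required expected return. Note the paper instead derives an exact quadratic form via Sharpe's single-index model; the data layout and helper names below are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def semivariance(w, R, target=None):
    """Downside semivariance of portfolio returns below the target
    (defaults to the portfolio's own mean). R: scenarios x assets."""
    p = R @ w
    t = p.mean() if target is None else target
    d = np.minimum(p - t, 0.0)       # only below-target deviations count
    return (d ** 2).mean()

def min_semivariance_portfolio(R, required_mean):
    """Long-only portfolio minimizing downside semivariance subject to
    an expected-return floor; sweeping required_mean traces the
    mean-semivariance Efficient Frontier."""
    n = R.shape[1]
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq",
             "fun": lambda w: R.mean(axis=0) @ w - required_mean}]
    res = minimize(semivariance, np.full(n, 1.0 / n), args=(R,),
                   bounds=[(0.0, 1.0)] * n, constraints=cons)
    return res.x
```

Unlike variance, the semivariance objective penalizes only the left tail, which is exactly the asymmetry in investor preferences the abstract appeals to.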