Mathematical Structure

The Experts below are selected from a list of 175,302 Experts worldwide, ranked by the ideXlab platform.

Maurizio Bagnara - One of the best experts on this subject based on the ideXlab platform.

  • Bayesian calibration of simple forest models with multiplicative Mathematical Structure: a case study with two light use efficiency models in an alpine forest
    Ecological Modelling, 2018
    Co-Authors: Maurizio Bagnara, Marcel Van Oijen, David Cameron, Damiano Gianelle, Federico Magnani, Matteo Sottocornola
    Abstract:

    Forest models are increasingly being used to study ecosystem functioning, through simulation of carbon fluxes and productivity in different biomes and plant functional types all over the world. Several forest models based on the concept of Light Use Efficiency (LUE) rely mostly on a simplified Mathematical Structure and empirical parameters, require little data to run, and their computations are usually fast. However, possible calibration issues must be investigated to ensure reliable results. Here we addressed the important issue of delayed convergence when calibrating LUE models, characterized by a multiplicative Structure, with a Bayesian approach. We tested two models (Prelued and the Horn and Schulz (2011a) model), applying three Markov Chain Monte Carlo-based algorithms with different numbers of iterations, and different sets of prior parameter distributions with increasing information content. The results showed that recently proposed algorithms for adaptive calibration did not confer a clear advantage over the Metropolis–Hastings Random Walk algorithm for the forest models used here, and that a high number of iterations is required to stabilize in the convergence region. This can be partly explained by the multiplicative Mathematical Structure of the models, with high correlations between parameters, and by the use of empirical parameters with neither ecological nor physiological meaning. The information content of the prior distributions did not play a major role in reaching convergence with a lower number of iterations. We conclude that a more careful approach to calibration is needed to avoid potential problems when applying models characterized by a multiplicative Mathematical Structure. Moreover, the calibration proved time consuming and mathematically difficult, so the advantages of using a computationally fast and user-friendly model were lost to the calibration process needed to obtain reliable results.
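
    To make the convergence issue concrete, the following is a minimal, hypothetical sketch (not the Prelued or Horn and Schulz code): a toy multiplicative LUE model, GPP = beta * f1 * f2 * PAR, calibrated with a Metropolis–Hastings Random Walk sampler. The parameter names, noise level and priors are assumptions for illustration. Because the parameters enter the likelihood only through their product, the posterior samples are strongly correlated and the individual chains stabilize slowly, while the product itself is well identified.

    ```python
    # Hypothetical toy example: multiplicative LUE model calibrated with a
    # Metropolis-Hastings random walk. Not the authors' models or code.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "observations" of GPP driven by PAR
    par = rng.uniform(5.0, 30.0, size=200)               # incident PAR
    true_theta = np.array([0.8, 0.7, 0.9])                # beta, f1, f2 (assumed)
    gpp_obs = np.prod(true_theta) * par + rng.normal(0.0, 0.5, size=par.size)

    def log_posterior(theta):
        """Gaussian likelihood plus uniform (0, 2] priors on all parameters."""
        if np.any(theta <= 0.0) or np.any(theta > 2.0):
            return -np.inf
        resid = gpp_obs - np.prod(theta) * par
        return -0.5 * np.sum((resid / 0.5) ** 2)

    # Metropolis-Hastings random walk
    theta = np.array([1.0, 1.0, 1.0])
    logp = log_posterior(theta)
    chain = np.empty((50_000, 3))
    for i in range(chain.shape[0]):
        proposal = theta + rng.normal(0.0, 0.02, size=3)  # random-walk step
        logp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
        chain[i] = theta

    # Only the product beta*f1*f2 is identified: its chain stabilizes quickly,
    # while the individual parameters remain strongly correlated.
    print("posterior correlation of beta and f1:",
          np.corrcoef(chain[10_000:, 0], chain[10_000:, 1])[0, 1])
    print("posterior mean of the product:",
          np.prod(chain[10_000:], axis=1).mean())
    ```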

  • A user-friendly forest model with a multiplicative Mathematical Structure: a Bayesian approach to calibration
    Geoscientific Model Development Discussions, 2014
    Co-Authors: Maurizio Bagnara, Marcel Van Oijen, David Cameron, Damiano Gianelle, Federico Magnani, Matteo Sottocornola
    Abstract:

    Forest models are increasingly being used to study ecosystem functioning, through the reproduction of carbon fluxes and productivity in very different forests all over the world. Over the last two decades, the need for simple and “easy to use” models for practical applications, characterized by few parameters and equations, has become clear, and some have been developed for this purpose. These models aim to represent the main drivers underlying forest ecosystem processes while being applicable to the widest possible range of forest ecosystems. Recently, it has also become clear that model performance should be assessed not only in terms of accuracy of estimations and predictions, but also in terms of estimates of model uncertainties. Therefore, the Bayesian approach has increasingly been applied to calibrate forest models, with the aim of estimating the uncertainty of their results and of comparing their performances. Some forest models, considered to be user-friendly, rely on a multiplicative or quasi-multiplicative Mathematical Structure, which is known to cause problems during the calibration process, mainly due to high correlations between parameters. In a Bayesian framework using Markov Chain Monte Carlo sampling, this is likely to prevent the chains from converging properly and from sampling the correct posterior distribution. Here we show two methods to reach proper convergence when using a forest model with a multiplicative Structure: applying different algorithms with different numbers of iterations during the Markov Chain Monte Carlo sampling, or using a two-step calibration. The results showed that recently proposed algorithms for adaptive calibration do not confer a clear advantage over the Metropolis–Hastings Random Walk algorithm for the forest model used here. Moreover, the calibration remains time consuming and mathematically difficult, so the advantages of using a fast and user-friendly model can be lost to the calibration process needed to obtain reliable results.
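
    As an illustration of the kind of convergence check implied above, the sketch below computes the Gelman-Rubin R-hat statistic over several independent chains. This is a minimal example under assumed inputs, not the diagnostic or two-step procedure used in the paper.

    ```python
    # Minimal Gelman-Rubin R-hat check across independent MCMC chains
    # (illustrative only; not the paper's own diagnostic).
    import numpy as np

    def gelman_rubin(chains):
        """R-hat for one parameter; `chains` has shape (n_chains, n_samples)."""
        m, n = chains.shape
        chain_means = chains.mean(axis=1)
        within = chains.var(axis=1, ddof=1).mean()        # W: within-chain variance
        between = n * chain_means.var(ddof=1)             # B: between-chain variance
        var_hat = (n - 1) / n * within + between / n      # pooled variance estimate
        return np.sqrt(var_hat / within)

    # Example: three chains for a single parameter. Values close to 1.0 suggest
    # the chains have mixed; values well above ~1.1 call for more iterations.
    rng = np.random.default_rng(1)
    chains = rng.normal(loc=[[0.0], [0.1], [-0.1]], scale=1.0, size=(3, 5000))
    print("R-hat:", gelman_rubin(chains))
    ```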

Matteo Sottocornola - One of the best experts on this subject based on the ideXlab platform.

  • Bayesian calibration of simple forest models with multiplicative Mathematical Structure: a case study with two light use efficiency models in an alpine forest
    Ecological Modelling, 2018
    Co-Authors: Maurizio Bagnara, Marcel Van Oijen, David Cameron, Damiano Gianelle, Federico Magnani, Matteo Sottocornola

  • A user-friendly forest model with a multiplicative Mathematical Structure: a Bayesian approach to calibration
    Geoscientific Model Development Discussions, 2014
    Co-Authors: Maurizio Bagnara, Marcel Van Oijen, David Cameron, Damiano Gianelle, Federico Magnani, Matteo Sottocornola

Didier Sornette - One of the best experts on this subject based on the ideXlab platform.

  • Mathematical Structure of quantum decision theory
    Advances in Complex Systems, 2010
    Co-Authors: V I Yukalov, Didier Sornette
    Abstract:

    One of the most complex systems is the human brain, whose formalized functioning is described by decision theory. We present a "Quantum Decision Theory" of decision-making, based on the Mathematical theory of separable Hilbert spaces. This Mathematical Structure captures the effect of superposition of composite prospects, including many incorporated intentions, which allows us to explain a variety of interesting fallacies and anomalies that have been reported to characterize the decision-making of real human beings. The theory describes entangled decision-making, non-commutativity of subsequent decisions, and intention interference of composite prospects. We demonstrate how the violation of Savage's sure-thing principle (the disjunction effect) can be explained as a result of the interference of intentions when making decisions under uncertainty. The conjunction fallacy is also explained by the presence of the interference terms. We demonstrate that all known anomalies and paradoxes documented in the context of classical decision theory are reducible to just a few Mathematical archetypes, all of which find straightforward explanations within the developed quantum approach.
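
    The display below is a schematic rendering, in assumed notation rather than the paper's own formalism, of the key mechanism: when the amplitude of a composite prospect is a superposition of contributions a_k (one per incorporated intention), its probability splits into a classical part and an interference part.

    ```latex
    % Schematic only: a prospect probability as a squared superposed amplitude
    % decomposes into a classical term f and an interference term q. In the
    % theory, the interference terms over a complete set of prospects sum to
    % zero, so positive interference for one prospect is compensated elsewhere.
    \[
      p(\pi) \;=\; \Bigl|\sum_{k} a_k(\pi)\Bigr|^{2}
             \;=\; \underbrace{\sum_{k} \bigl|a_k(\pi)\bigr|^{2}}_{f(\pi)\ \text{(classical)}}
             \;+\; \underbrace{\sum_{k \neq l} a_k(\pi)\,\overline{a_l(\pi)}}_{q(\pi)\ \text{(interference)}} .
    \]
    ```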

  • Mathematical Structure of quantum decision theory
    arXiv: Artificial Intelligence, 2008
    Co-Authors: V I Yukalov, Didier Sornette
    Abstract:

    One of the most complex systems is the human brain, whose formalized functioning is described by decision theory. We present a "Quantum Decision Theory" of decision making, based on the Mathematical theory of separable Hilbert spaces. This Mathematical Structure captures the effect of superposition of composite prospects, including many incorporated intentions, which allows us to explain a variety of interesting fallacies and anomalies that have been reported to characterize the decision making of real human beings. The theory describes entangled decision making, non-commutativity of subsequent decisions, and intention interference of composite prospects. We demonstrate how the violation of Savage's sure-thing principle (the disjunction effect) can be explained as a result of the interference of intentions when making decisions under uncertainty. The conjunction fallacy is also explained by the presence of the interference terms. We demonstrate that all known anomalies and paradoxes documented in the context of classical decision theory are reducible to just a few Mathematical archetypes, all of which find straightforward explanations within the developed quantum approach.
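
    A toy numerical illustration (an assumed construction, not the paper's worked example) of how a constructive interference term can lift the judged probability of a conjunction above its classical value, in the spirit of the conjunction-fallacy explanation above:

    ```python
    # Toy illustration of an interference term inflating a conjunction's
    # judged probability; the amplitudes and phases are made up for the demo.
    import numpy as np

    # Two amplitudes contributing to the composite prospect "A and B",
    # e.g. two uncertain, incorporated intentions behind the same outcome.
    a1 = 0.45 * np.exp(1j * 0.0)
    a2 = 0.35 * np.exp(1j * 0.3)   # nearly in phase -> constructive interference

    classical = abs(a1) ** 2 + abs(a2) ** 2     # f: no interference
    quantum = abs(a1 + a2) ** 2                 # p = f + q
    interference = quantum - classical          # q: cross term

    print(f"classical part f = {classical:.3f}")
    print(f"interference  q = {interference:+.3f}")
    print(f"judged value  p = {quantum:.3f}")   # exceeds f when phases align
    ```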

Gilles Savard - One of the best experts on this subject based on the ideXlab platform.

  • Mathematical Structure of a bilevel strategic pricing model
    European Journal of Operational Research, 2009
    Co-Authors: Patrice Marcotte, Gilles Savard
    Abstract:

    This paper is concerned with the characterization of optimal strategies for a service firm acting in an oligopolistic environment. The decision problem is formulated as a leader-follower game played on a transportation network, where the leader firm selects a revenue-maximizing price schedule that explicitly takes into account the rational behavior of the customers. In the context of our analysis, the follower's problem is associated with a competitive network market involving non-atomic customer groups. The resulting bilevel model can therefore be viewed as a model of product differentiation subject to structural network constraints.
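
    A generic bilevel network-pricing formulation of this type, given as a schematic sketch rather than the paper's exact model (which includes further structure such as customer groups and the oligopolistic setting), reads:

    ```latex
    % Schematic bilevel pricing model: the leader sets tariffs T on its arcs
    % A_1 to maximize revenue, anticipating that customers route their demand
    % at minimum generalized cost over priced arcs A_1 and competing arcs A_2.
    \[
    \begin{aligned}
      \max_{T,\,x,\,y}\quad & \sum_{a \in A_1} T_a\, x_a
          && \text{(leader: tariff revenue)} \\
      \text{s.t.}\quad & (x, y) \in \arg\min_{(x', y') \in \mathcal{F}}
          \;\sum_{a \in A_1} (c_a + T_a)\, x'_a \;+\; \sum_{a \in A_2} d_a\, y'_a
          && \text{(followers: cheapest routing)}
    \end{aligned}
    \]
    % Here \mathcal{F} denotes the flows satisfying conservation of each demand.
    ```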

V I Yukalov - One of the best experts on this subject based on the ideXlab platform.

  • Mathematical Structure of quantum decision theory
    Advances in Complex Systems, 2010
    Co-Authors: V I Yukalov, Didier Sornette

  • Mathematical Structure of quantum decision theory
    arXiv: Artificial Intelligence, 2008
    Co-Authors: V I Yukalov, Didier Sornette