Theory of Probability

The Experts below are selected from a list of 264 Experts worldwide ranked by the ideXlab platform

Chiara Marletto - One of the best experts on this subject based on the ideXlab platform.

Michael Emmett Brady - One of the best experts on this subject based on the ideXlab platform.

  • How J M Keynes's Logical Theory of Probability Totally Refutes All Attacks on the Concept of Probability
    Social Science Research Network, 2017
    Co-Authors: Michael Emmett Brady
    Abstract:

    In a number of recent articles, J Derbyshire has repeated G L S Shackle's original attack on the concept of Probability. He adds nothing new to Shackle's arguments of forty to eighty years ago, which were completely refuted in 1959 by R. Weckstein, who based his refutation of Shackle on a relatively limited understanding of chapters 1-3 and 6 of Keynes's A Treatise on Probability (1921). Shackle never replied to Weckstein's repeated demonstrations in 1959 that Keynes's approach refuted his arguments, which held only against the limiting frequency, relative frequency, propensity and subjective interpretations of Probability. Shackle's arguments failed completely when confronted with Keynes's Theory of logical Probability. J M Keynes's Theory of Probability is a logical, objective, epistemological approach that is based on partial orderings, not complete orderings. This results in non-additive, non-linear, interval-valued probabilities, specified and operationalized in terms of logical propositions about any kind of event. It can easily deal with unique, single, crucial, infrequent, frequent, non-repeatable and irreversible events, as well as path dependence, sensitivity to initial conditions, emergence, complex causation, attractor states, partial uncertainty, complete and total uncertainty, fundamental uncertainty, and irreducible uncertainty, by means of Keynes's weight of the evidence analysis. That analysis, when combined with his interval-valued Probability, demonstrates that Shackle's Theory is a very special Theory that applies only to situations of complete and total uncertainty, which Keynes categorized as ignorance. Not a single one of Derbyshire's objections to the concept of Probability is left standing once Keynes's logical approach to Probability is used.
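    The interval-valued, non-additive probabilities described above can be given a concrete, if simplified, form. The sketch below is only an illustration under assumed representations (the IntervalProb class and the example numbers are hypothetical, not drawn from Keynes or Brady): intervals admit only a partial ordering, and lower bounds over an exhaustive set of outcomes need not sum to 1.

```python
# A minimal sketch, assuming a simple closed-interval representation of
# Keynes-style imprecise probabilities; names and numbers are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class IntervalProb:
    lo: float  # lower probability bound
    hi: float  # upper probability bound

    def precedes(self, other: "IntervalProb") -> bool:
        # Comparable only when the intervals do not overlap: a partial,
        # not complete, ordering of propositions.
        return self.hi <= other.lo

def comparable(a: IntervalProb, b: IntervalProb) -> bool:
    return a.precedes(b) or b.precedes(a)

# Overlapping intervals cannot be ranked against each other.
p1 = IntervalProb(0.3, 0.7)
p2 = IntervalProb(0.4, 0.6)
print(comparable(p1, p2))            # False: no complete ordering

# Non-additivity: lower bounds across an exhaustive partition may sum to < 1.
partition = [IntervalProb(0.2, 0.5), IntervalProb(0.1, 0.4)]
print(sum(p.lo for p in partition))  # 0.3, leaving room for unlisted outcomes
```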

  • On J M Keynes's Rejection, in General, of Ramsey's Subjective Theory of Probability: The Keynes–Townshend Exchanges of 1937 and 1938
    Social Science Research Network, 2017
    Co-Authors: Michael Emmett Brady
    Abstract:

    J M Keynes rejected Ramsey's subjective Theory of Probability in general. He did accept Ramsey's betting quotient approach in the special case where the weight of the evidence, w, equaled one, so that all the probabilities were linear, additive, precise, exact, definite, single-number answers. In general, Keynes's probabilities were indeterminate, interval-valued probabilities that were non-additive and non-linear because the weight, w, was less than 1. F. Y. Edgeworth, Bertrand Russell, and Edwin Bidwell Wilson all recognized the interval-valued nature of Keynes's probabilities (see my Reviewing the Reviewers of Keynes's A Treatise on Probability, 2016). The Keynes–Townshend exchanges provide incontrovertible evidence that Keynes never accepted Ramsey's subjective approach to Probability, because there was no place in that Theory for interval-valued Probability or for the concept of the weight of the evidence, since, for Ramsey, the subjective estimate of a degree of belief is the confidence a decision maker has in the betting odds, while for Keynes it is the degree of rational belief, not the degree of belief. The Keynes–Townshend exchanges, if carefully read and digested, contain the relevant evidence that allows a reader to conclude that Keynes remained an adherent practitioner of his Theory of logical Probability his entire life. He never changed his mind. Overlooking the Keynes–Townshend exchanges of 1937-38 explains why economists and academicians have failed to see the close connections that exist between the General Theory (GT) and the Treatise on Probability (TP).
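    One way to make the role of the weight of the evidence, w, tangible is a toy formula in which the probability interval collapses to a single number when w = 1 and widens toward complete ignorance as w falls toward 0. The mapping below is an assumption chosen purely for illustration, not a formula taken from Keynes, Ramsey, or Brady.

```python
# Illustrative only: an assumed way to tie an evidential weight w in [0, 1]
# to the width of a probability interval around a point estimate p.
def interval_from_weight(p: float, w: float) -> tuple[float, float]:
    """Contract the vacuous interval [0, 1] toward p as w approaches 1."""
    lo = w * p
    hi = w * p + (1.0 - w)
    return lo, hi

print(interval_from_weight(0.6, 1.0))  # (0.6, 0.6): precise, additive special case
print(interval_from_weight(0.6, 0.5))  # ~(0.3, 0.8): indeterminate interval estimate
print(interval_from_weight(0.6, 0.0))  # (0.0, 1.0): complete ignorance
```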

  • Richard E. Braithwaite on J. M. Keynes's A Treatise on Probability and Logical Theory of Probability: Ignorance is Bliss
    Social Science Research Network, 2016
    Co-Authors: Michael Emmett Brady
    Abstract:

    Richard E Braithwaite's October 1931 review article in Mind on Jeffreys's work on Probability also summarized the then-current assessment of Keynes's A Treatise on Probability and logical Theory of Probability. That assessment was based on a complete and total ignorance on Braithwaite's part about what Keynes actually accomplished in A Treatise on Probability. He had no idea what an interval-valued, indeterminate Probability is. He had no idea how Keynes built on Boole's upper-lower bound approach. He had no idea about the concept of the weight of the evidence, w, and how it is connected to the size of the difference between the lower and upper bound. He apparently forgot that Keynes's logical approach to Probability was carefully laid out in 1907 and 1908, some 11 or 12 years before Harold Jeffreys published his articles with Wrinch in 1919. Finally, Braithwaite had no idea how to compare objects using a relation of similarity and/or dissimilarity, which, of course, is the basic requirement for pattern recognition, recognized as fundamental by cognitive psychologists and cognitive scientists if decision makers are to successfully use their intuition and induction. He was totally oblivious to the connection between degrees of similarity and Keynes's logical Probability relations. In short, he was an ignorant fool who failed abysmally, like Hugh Townshend, to make use of the clues Keynes periodically sent his way.
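    The "upper-lower bound approach" credited to Boole above is usually illustrated with bounds on compound propositions: given only the marginal probabilities of A and B, the probability of their conjunction or disjunction is pinned down only to an interval. The sketch below shows the standard Boole-Fréchet bounds; the example numbers are arbitrary.

```python
# A hedged sketch of Boole-style bounding: with only P(A) and P(B) known,
# P(A and B) and P(A or B) can be bounded but not determined exactly.
def and_bounds(p_a: float, p_b: float) -> tuple[float, float]:
    return max(0.0, p_a + p_b - 1.0), min(p_a, p_b)

def or_bounds(p_a: float, p_b: float) -> tuple[float, float]:
    return max(p_a, p_b), min(1.0, p_a + p_b)

print(and_bounds(0.7, 0.6))  # ~(0.3, 0.6): an interval, not a single number
print(or_bounds(0.7, 0.6))   # (0.7, 1.0)
```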

  • The Economic Consequences of G L S Shackle's Ignorance of Keynes's Theory of Probability, Uncertainty, and Decision Making
    SSRN Electronic Journal, 2013
    Co-Authors: Michael Emmett Brady
    Abstract:

    G L S Shackle's justification for creating his non-probabilistic, anti-inductive, deductivist, falsificationist, degree-of-disbelief, potential-surprise, ascendancy-function, focus gain-loss approach to decision making was founded on his misbelief that no Theory of Probability, including J M Keynes's logical approach to Probability, was capable of incorporating a residual hypothesis specifying an unknown additional possible course of action available to the decision maker, so that the summation of the set of all probabilities, each Probability being mapped in a one-to-one, onto correspondence with the set of all possible outcomes, would be less than 1. Shackle believed that all extant theories of Probability required the assumption of additivity. This additivity condition, which would have to hold as a necessary condition in all theories of Probability, including Keynes's, meant that such theories could not deal with real uncertainty, which involved a lack of knowledge of some or all of the possible outcomes and/or consequences that might occur in the future. Shackle completely failed in his lifetime to recognize that Adam Smith, George Boole and J M Keynes had already incorporated non-additivity into their logical approaches to Probability. For instance, Keynes had already succeeded in creating a non-linear, non-additive decision Theory when Shackle was 5 years old. Shackle's ignorance of Keynes's magnum opus, the 1921 A Treatise on Probability (TP), means that his attempt to introduce non-linearity and non-additivity into decision making was equivalent to trying to reinvent the wheel, since Boole and Keynes had already accomplished Shackle's task long before Shackle entered grammar school. Shackle also presented a technically inferior, more difficult and unclear approach when compared to that developed by Boole and Keynes. An examination of Shackle's commentary on Keynes's General Theory (GT) reveals that Shackle simply had no idea what Keynes's approach to Probability, based on interval estimates and equality-inequality constraints, was all about. This has led to a situation in which economists and philosophers in the 20th century had no idea how Keynes operationalized his logical approach to Probability and decision making. For instance, the American, English, and Italian Post Keynesian schools, along with their Institutionalist and heterodox allies, due to their ignorance of Keynes's TP-GT approach, unwittingly substituted Shackle's rival Theory of uncertainty as their foundation instead of building on Keynes's TP-GT approach. The consequences have been catastrophic. Shackle's unsupported claim was that there is only either a state of complete uncertainty or certainty. Keynes, of course, had already correctly defined uncertainty in both the GT and TP as having different gradations, based on his weight of the evidence analysis in chapters 6 and 26 of the TP. Keynes completely rejects Shackle. The major consequence has been the complete rout of these schools' economic and philosophical positions in their intellectual battles with the Benthamite Utilitarians, who completely control almost all university/college economics departments. The remnants of these Post Keynesian schools meet annually and lament their feebleness, lack of relevance and impotence. Of course, this was destined to happen, since Shackle is simply not close to Keynes as either a theorist or technician.
    Four types of belated rear-guard actions that the nearly extinct remnants of Post Keynesianism can consider, as possible alternative courses of action aimed at slowing down the neoclassical juggernaut in the future, are discussed: (a) admit their egregious errors and return to Keynes's GT and TP; (b) ally themselves with Paul Krugman's and Joseph Stiglitz's use of imperfect and asymmetric information analysis; (c) build on the work of Daniel Ellsberg's 1962 dissertation analysis of ambiguity; and/or (d) ally themselves with the econophysics followers of Benoit Mandelbrot and Nassim Nicholas Taleb, with their distinction between the wild risk of the Cauchy distribution and the mild risk of the Normal distribution. There are no other choices, although, given the emphasis these schools of thought place on the importance of one's imagination, they can, of course, imagine themselves spending their last days daydreaming and creating all kinds of kaleidoscopic rival hypotheses in which they are successfully able to challenge the neoclassical schools of thought.
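    The "residual hypothesis" point above is easy to state numerically: if a decision maker can only list some of the possible outcomes, the probability mass assigned to the listed outcomes can be allowed to fall short of 1, with the shortfall reserved for outcomes not yet imagined. The outcome labels and numbers below are hypothetical, used only to illustrate a sub-additive assignment.

```python
# Illustrative sub-additive assignment: listed outcomes receive less than the
# whole unit of probability, the remainder standing in for a residual hypothesis.
listed_outcomes = {"boom": 0.35, "stagnation": 0.30, "recession": 0.20}

residual = 1.0 - sum(listed_outcomes.values())
print(f"mass reserved for unlisted outcomes: {residual:.2f}")  # 0.15

# Strict additivity (residual == 0) is the special case that, per the abstract,
# Shackle wrongly took to be required by every theory of probability.
assert residual >= 0.0
```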

  • Adam Smith's Theory of Probability and the Roles of Risk and Uncertainty in Economic Decision Making
    2013
    Co-Authors: Michael Emmett Brady
    Abstract:

    Adam Smith rejected the use of the mathematical laws of the calculus of probabilities because the basic information-data-knowledge provided in the real world of decision making did not allow a decision maker to specify precise, definite, exact, numerical probabilities or discover the Probability distributions. This means that Smith rejected the classical interpretation of Probability of Laplace and the Bernoulli brothers, the limiting frequency-relative frequency interpretation of Probability, and the personalist, subjectivist, psychological Bayesian approach used by all neoclassical schools of thought, because all of these approaches to Probability claim that ALL probabilities can be represented by a single number between 0 and 1 and that the decision maker knows the Probability distributions. Smith, like Keynes, rejects this immediately. Thus, Smith's inductive or logical concept of Probability, like Keynes's, only approaches mathematical Probability in the limit. Adam Smith recognized that economic decision makers were confronted with knowledge structures that were not sharp and clear, but cloudy and amorphous. However, decision makers were still able to use the concept of Probability in the weaker, interval sense that is thought to have been first advocated by George Boole and later, with much greater force, by John Maynard Keynes in his two Fellowship dissertations, submitted in 1907 and 1908, respectively, and in his A Treatise on Probability (1921). Instead of sharp, definite, determinate, calculated, and exact probabilistic estimates or distributions, inexact, indefinite, indeterminate, and imprecise estimates of probabilities could be derived and used, so that decision makers were able to make choices among different possible options concerning the future in a rational fashion. An important conclusion of this paper is that it was Adam Smith who first explicitly recognized that the mathematical concept of Probability is not applicable, in general, in real world decision making. Smith also rejects the normative and prescriptive roles of mathematical Probability in decision making. Adam Smith applied his approach to Probability and uncertainty by analyzing the economic decisions made by human beings in choosing a particular profession and in organizing various insurance markets to cover the risk of loss. Smith's risk is, however, not the standard deviation of the Normal Probability distribution used by "Modern" economists, since important data/information/knowledge is missing and not available to the decision maker at the point in time when he is required to make a decision.
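    How imprecise estimates can still support a rational choice among options (for example, among professions) can be illustrated with a simple decision rule sometimes called interval dominance: prefer one option only when its worst-case expected payoff beats the other's best case. The probabilities, payoffs, and option names below are hypothetical, not taken from Smith or from this paper.

```python
# A hedged illustration: choosing between two options when only lower and
# upper probabilities of success are available, via interval dominance.
def expected_bounds(p_lo: float, p_hi: float, win: float, lose: float) -> tuple[float, float]:
    """Bounds on expected payoff when the success probability lies in [p_lo, p_hi] (win > lose)."""
    return (p_lo * win + (1 - p_lo) * lose,
            p_hi * win + (1 - p_hi) * lose)

def dominates(a: tuple[float, float], b: tuple[float, float]) -> bool:
    return a[0] >= b[1]  # a's lower bound beats b's upper bound

lottery_career = expected_bounds(0.05, 0.10, win=10_000, lose=-500)  # rare, large prize
steady_trade   = expected_bounds(0.60, 0.80, win=1_500,  lose=-200)

print(lottery_career, steady_trade)             # roughly (25, 550) and (820, 1160)
print(dominates(steady_trade, lottery_career))  # True: a rational choice despite imprecision
```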

David Deutsch - One of the best experts on this subject based on the ideXlab platform.

Judith Rousseau - One of the best experts on this subject based on the ideXlab platform.

  • Rejoinder: Harold Jeffreys’s Theory of Probability Revisited
    Statistical Science, 2009
    Co-Authors: Christian P. Robert, Nicolas Chopin, Judith Rousseau
    Abstract:

    We are grateful to all discussants (Bernardo, Gelman, Kass, Lindley, Senn, and Zellner) of our re-visitation for their strong support in our enterprise and for their overall agreement with our perspective. Further discussions with them and other leading statisticians showed that the legacy of Theory of Probability is alive and lasting.

  • Harold Jeffreys’s Theory of Probability Revisited
    Statistical Science, 2009
    Co-Authors: Nicolas Chopin, Christian P. Robert, Judith Rousseau
    Abstract:

    Published nearly seventy years ago, Jeffreys' Theory of Probability (1939) has had a unique impact on the Bayesian community and is now considered to be one of the main classics in Bayesian Statistics as well as the initiator of the objective Bayes school. In particular, its advances on the derivation of noninformative priors as well as on the scaling of Bayes factors have had a lasting impact on the field. However, the book reflects the characteristics of the time, especially in terms of mathematical rigorousness. In this paper, we point out the fundamental aspects of this reference work, especially the thorough coverage of testing problems and the construction of both estimation and testing noninformative priors based on functional divergences. Our major aim here is to help modern readers in navigating in this difficult text and in concentrating on passages that are still relevant today.
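    Two ideas the abstract singles out, noninformative priors and the scaling of Bayes factors, can be sketched concretely for the simplest case. Below, the Jeffreys prior for a Bernoulli proportion (Beta(1/2, 1/2)) is used to compute a Bayes factor for a point null; this is a standard textbook construction offered as an illustration, not a reproduction of Jeffreys's own worked examples.

```python
# A small sketch: Jeffreys's noninformative Beta(1/2, 1/2) prior for a Bernoulli
# proportion, and the resulting Bayes factor for H0: theta = 0.5.
from scipy.special import betaln
import numpy as np

def log_marginal_beta(k: int, n: int, a: float = 0.5, b: float = 0.5) -> float:
    """Log marginal likelihood of a particular sequence with k successes in n trials
    under theta ~ Beta(a, b); the binomial coefficient is omitted (it cancels below)."""
    return betaln(k + a, n - k + b) - betaln(a, b)

def bayes_factor_01(k: int, n: int) -> float:
    """BF_01 for H0: theta = 0.5 versus H1: theta ~ Beta(1/2, 1/2) (Jeffreys prior)."""
    log_h0 = n * np.log(0.5)          # sequence likelihood under the point null
    log_h1 = log_marginal_beta(k, n)  # marginal sequence likelihood under the Jeffreys prior
    return float(np.exp(log_h0 - log_h1))

print(bayes_factor_01(k=60, n=100))   # on the order of 1: no strong evidence either way
print(bayes_factor_01(k=90, n=100))   # far below 1: strong evidence against theta = 0.5
```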

Alexander N. Varnavsky - One of the best experts on this subject based on the ideXlab platform.

  • Software for interactive lectures on the Theory of Probability and mathematical statistics
    2017 6th Mediterranean Conference on Embedded Computing (MECO), 2017
    Co-Authors: Alexander N. Varnavsky
    Abstract:

    This article proposes an interactive approach to lectures on the Theory of Probability and mathematical statistics using software and mobile technologies. The approach shows how the entire group of students attending a lecture can be involved in collecting data and modeling a queuing system. A mobile application for students' own devices is presented that allows them to interact with and manage the simulated system, together with an Excel macro that collects the data, performs the calculations, and visualizes the results. A program for interactive modeling of a queuing system during lectures is also presented.
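    The paper's classroom demonstration is built around an Excel macro and a mobile application. As a rough stand-in for the same idea, the sketch below simulates a single-server M/M/1 queue in Python using the Lindley recursion for waiting times; the parameters are arbitrary and the code is not taken from the paper.

```python
# A hedged sketch, not the authors' Excel/VBA implementation: simulate customer
# waiting times in an M/M/1 queue with the Lindley recursion.
import random

def simulate_mm1(arrival_rate: float, service_rate: float, n_customers: int, seed: int = 1) -> float:
    """Return the average waiting time in queue over n_customers simulated arrivals."""
    rng = random.Random(seed)
    wait = 0.0
    waits = []
    for _ in range(n_customers):
        service = rng.expovariate(service_rate)       # this customer's service time
        interarrival = rng.expovariate(arrival_rate)  # gap until the next arrival
        waits.append(wait)
        # Lindley recursion: the next customer's wait depends on this customer's
        # wait, service time, and the gap before the next arrival.
        wait = max(0.0, wait + service - interarrival)
    return sum(waits) / len(waits)

# With arrival rate 0.8 and service rate 1.0, queueing theory gives a mean wait
# of lambda / (mu * (mu - lambda)) = 0.8 / (1.0 * 0.2) = 4.0; the simulated
# estimate below should land near that value.
print(simulate_mm1(arrival_rate=0.8, service_rate=1.0, n_customers=200_000))
```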