Sunday, July 14, 2013

A Classification of Types of Probability and Theories of Probability

There are three fundamental conceptual divisions in the way that probability theory has been interpreted:
(1) the Classical interpretation;

(2) the epistemological (or epistemic) interpretation, further divided into
(i.) the logical interpretation;
(ii.) the subjective interpretation (personalism, subjective Bayesianism);
(iii.) the intersubjective interpretation;
(3) the objective interpretation, further divided into
(i.) the frequency interpretation;
(ii.) the propensity interpretation.
(Gillies 2000: 2).
These are basically overarching philosophical interpretations of probability. The Classical interpretation is probably of historical interest only.

Keynes and Harold Jeffreys held the logical interpretation (2.i), though it now seems widely rejected by modern philosophers of probability.

Frequency theorists include John Venn, A. N. Kolmogorov and Richard von Mises.

A subjective, personalist theory of probability was developed by Bruno de Finetti, Frank P. Ramsey, and Leonard J. Savage. From it was developed a theory of decision-making that is still fundamental in neoclassical economics.

Turning from philosophical theories to actual types of probability as a property, although there are different classifications (see Appendix 1), perhaps there are two types, as argued by Rudolf Carnap and Ian Hacking:
(1) Epistemic/epistemological probability
A property of inferred propositions in inductive arguments, depending on the strength of the inductive arguments and the evidence offered in support of them. It is thus a matter of partial logical entailment. This is basically inductive probabilism.

(2) Aleatory probability
Long-run, relative frequency probabilities that are numerical values and that pertain to properties of elements of sets, classes or kinds. (McCann 1994: 27).
Basic notation to express probability is

P(h | e),

where P is the probability,
h is some hypothesis or conclusion, and
e is the evidence or premises.
This is usually read as “the probability of h given evidence e.” Numerical values for probability lie between 0 and 1.

0 denotes impossibility and 1 certainty.
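As an illustrative sketch of this notation (the setup, numbers, and variable names below are hypothetical, not from the text), a probability such as P(h | e) can be estimated as a relative frequency: the proportion of cases in which h holds among those cases where the evidence e holds.

```python
import random

random.seed(42)

# Hypothetical toy model: the hypothesis h is true half the time, and the
# evidence e is more likely to appear when h is true than when it is false.
def draw_case():
    h = random.random() < 0.5                    # hypothesis true or false
    e = random.random() < (0.8 if h else 0.3)    # evidence favours h
    return h, e

cases = [draw_case() for _ in range(20000)]

# Estimate P(h | e): how often h holds among the cases where e holds.
h_given_e = [h for h, e in cases if e]
p_h_given_e = sum(h_given_e) / len(h_given_e)

# Near 0.73 = (0.5 * 0.8) / (0.5 * 0.8 + 0.5 * 0.3), and, like any
# probability, between 0 (impossibility) and 1 (certainty).
print(round(p_h_given_e, 2))
```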

Epistemic/epistemological probability is obviously strongly connected with induction, generally the following types of argument:
(1) induction by simple enumeration;
(2) argument by analogy;
(3) statistical syllogism, and
(4) induction to a particular.
While aleatory probabilities are capable of having numerical values, it seems that many types of inference from inductive arguments are not.

But even many events that look like they might have aleatory probabilities cannot yield them:
“In games of chance, scientific inference is possible because … an aggregate regularity (in fair games) is readily apparent; chance affords an objective, homogeneous, stationary series. In empirically observable series, on the other hand, series chosen from a potentially unstable natural environment, such homogeneity and regularity may not be in evidence. One cannot a priori assume stability; rather one must be alert to the possibly chaotic nature of any empirical series which may, over the short and the long run, generate patterns for which a probability distribution does not exist or one which generates no discernible pattern whatsoever.” (McCann 1994: 32–33).
For example, what use are the time series data on the average daily selling price of a stock in providing an objective numerical value for the probability that this stock will have value y on 15 July 2017? The answer is: they are useless.

The assumption of an objective, homogeneous, stationary process producing events or variables over time, in the past, present and future, is the ergodic hypothesis or ergodic axiom, familiar from neoclassical economics. If the relative frequencies of outcomes of some process converge over a long-run time series, then the process is ergodic (Glickman 2003: 368). But many economic phenomena are non-ergodic, and, for example, non-stationarity is a sufficient condition for non-ergodicity. Therefore objective probabilities do not exist in such processes: past and present time series data are of limited value or just useless for strict prediction or forecasts in terms of numerical value probabilities.
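The difference can be sketched with a toy simulation (a hedged illustration only: the fair coin below stands in for a stationary, ergodic series, and the random walk for a non-stationary one; none of the code is from the text).

```python
import random

random.seed(0)

n = 20000

# Stationary, ergodic series (a fair coin): the relative frequency of heads
# settles toward a stable long-run value as observations accumulate.
coin = [1 if random.random() < 0.5 else 0 for _ in range(n)]
coin_freq = sum(coin) / n
print(round(coin_freq, 2))   # close to 0.5

# Non-stationary series (a simple random walk): its time average depends on
# the particular path taken, so past data yield no stable long-run value.
position, walk = 0, []
for _ in range(n):
    position += random.choice([-1, 1])
    walk.append(position)
walk_mean = sum(walk) / n
print(walk_mean)   # path-dependent; another seed gives a very different number
```

Rerunning the second half with different seeds illustrates the point about non-ergodicity: each path produces a different time average, so no objective long-run probability distribution can be recovered from the series.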

But probabilities – whether (1) objectively numerical or (2) inductive and non-numerical – only form a basis and criterion for decision and action. Decision-making theory must therefore be concerned with how people actually make decisions in particular situations, and must avoid highly abstract, logically incoherent, and empirically false theories.

Other systems of classifying types of probability as a property include the following:
(I.) Wesley Salmon (1967):
(1) Classical;
(2) subjective;
(3) frequency;
(4) logical; and
(5) personal.
(II.) Roy Weatherford (1982):
(1) Classical;
(2) subjective/personal;
(3) frequency; and
(4) logical.
(III.) Leonard J. Savage (1972):
(1) necessarian;
(2) personalist, and
(3) frequentist.
Further Reading
Abrams, Marshall. 2012. “Mechanistic Social Probability: How Individual Choices and Varying Circumstances Produce Stable Social Patterns,” in Harold Kincaid (ed.), The Oxford Handbook of Philosophy of Social Science. Oxford University Press, Oxford. 184–226.

Galavotti, M. C. 2010. “Probability,” in Stathis Psillos and Martin Curd (eds.), The Routledge Companion to Philosophy of Science. Routledge, London and New York. 414–424.

Hartmann, S. and J. Sprenger. 2010. “Bayesian Epistemology,” in Sven Bernecker and Duncan Pritchard (eds.), The Routledge Companion to Epistemology. Routledge, London. 609–620.

Humphreys, Paul. 1998. “Probability, Interpretations of,” in Edward Craig (ed.), Routledge Encyclopedia of Philosophy. Volume 7, Nihilism - Quantum Mechanics. Routledge, London.

Interpretations of Probability, Stanford Encyclopedia of Philosophy, 2002 (rev. 2011)

Loewer, Barry. 1998. “Probability Theory and Epistemology,” in Edward Craig (ed.), Routledge Encyclopedia of Philosophy. Volume 7, Nihilism - Quantum Mechanics. Routledge, London. 705–711.

Skyrms, B. 2010. “Probability, Theories of,” in Jonathan Dancy, Ernest Sosa, and Matthias Steup (eds.), A Companion to Epistemology (2nd edn.). Wiley-Blackwell, Oxford. 622–626.

Gillies, Donald. 2000. Philosophical Theories of Probability. Routledge, London and New York.

Glickman, M. 2003. “Uncertainty,” in J. E. King (ed.), The Elgar Companion to Post Keynesian Economics. E. Elgar Pub., Cheltenham, UK and Northampton, MA. 366–370.

McCann, Charles R. 1994. Probability Foundations of Economic Theory. Routledge, London.

Salmon, Wesley Charles. 1967. Foundations of Scientific Inference. University of Pittsburgh Press, Pittsburgh.

Savage, Leonard Jimmie. 1972. The Foundations of Statistics (2nd rev. edn.). Dover, New York.

Weatherford, Roy. 1982. Philosophical Foundations of Probability Theory. Routledge & Kegan Paul, London.


  1. "For example, what use is the time series data on the average daily selling price of a stock in providing an objective numerical value for the probability that this stock will have value y on the 15 July, 2017? The answer is: it is useless."

    A better example is the EMH, because people actually do claim that it's for real. It states that stock prices do follow a random walk -- as you've said above -- but that they will gravitate toward (if not reflect perfectly) their underlying value.

    The value is derived from an objective measure of their return. This return changes in response to information (for example, a new discovery by a drug company). The EMH holds that the new information is non-ergodic but that the manner in which people interpret it -- that is, the manner in which they set the price -- is ergodic because it follows the underlying value.

    This is actually the trick to a great deal of neoclassical theory. Events are often seen as non-ergodic. But people's, and hence the economy's, response to these events is ergodic.

  2. Here you go, LK:

    1. Very interesting.

      You say:

      "Everything comes down to this: do you believe that the future mirrors the past? Or, put another way: do you believe in deterministic laws that govern the universe and can be understood by human beings?"

      With respect to "do you believe that the future mirrors the past", the answer is -- obviously in significant areas of human history or social life -- "no" or "rarely", or "only in a limited sense." E.g., business cycles -- in the general sense of cycles of expansion and recession -- probably will continue as long as we have some form of capitalism, even though each phase of any cycle has unique properties and variables, not predictable in an objective sense.

      And once one moves from social sciences to natural sciences, then surely the answer must be, yes, there are plenty of processes where the future does mirror the past: cycles of days and nights, cycles of seasons from winter to summer, and so on.

      As Davidson says, the world is a terribly complex mix of ergodic and non-ergodic processes. It is complicated by the existence of processes that behave in an ergodic way in the short term but in a non-ergodic way in the long term.

    2. Well, I meant that -- and I think that I say this right after -- in a very metaphysical sense. I'm talking in terms of Absolutes here.

      The Big Question is: can we understand It All? Are there overarching Laws?

      Or, alternatively, are many of the things you mention simply repetitions, as philosophers like Hume, Kierkegaard and Nietzsche thought (the list is more extensive, but I won't go into it here)? Are the processes that are somewhat ergodic -- I don't think Davidson would claim that they are entirely ergodic -- just repetitions that are completely finite?

      I obviously take the view that they are repetitions and not Laws. I would say that, if I offered Davidson or Keynes my terminology, they would say largely the same thing.

      The neoclassicals and the probability guys, on the other hand, see an overarching structure where I see repetitions. They seek out Laws where I -- and I think you and other Post-Keynesians -- humbly seek out regularities or repetitions.

    3. Philip,

      I believe that you mean those things in a sense that is - too - metaphysical. Despite your harsh rebuttal of my earlier point connecting determinism and the existence of probabilities, I still think that you are wrong in connecting those two concepts.

      It is practically impossible to find out how far determinism goes. For example, the game of pool is deterministic in terms of classical mechanics, yet to what precision? Penrose thought that for mechanical interaction to be completely deterministic it must be infinitely precise in some physical sense; one of the more interesting if amateurish takes on his ideas I've read proposed that the 'universe' has limited precision and that some physical phenomena we observe are the results of round-off errors. But what really matters for the existence of probabilities is the existence of emergent regularities, not the determinism or indeterminism of reality itself.

      In a double-slit experiment the outcome of the next trial is almost certainly non-deterministic, that is, completely uncertain. But in repeated series of experiments an overarching structure emerges: results converge to 50/50. This stochastic process is ergodic, even though the underlying physical phenomenon is indeterminate (or we believe it to be indeterminate).
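That convergence of relative frequencies out of individually unpredictable outcomes can be sketched in a few lines (a hypothetical simulation only: the 50/50 random draw merely stands in for the indeterminate physical event).

```python
import random

random.seed(1)

# Each simulated outcome is unpredictable on its own, yet the relative
# frequency of one outcome stabilises around 0.5 as trials accumulate.
ones = 0
trials = 100000
for _ in range(trials):
    ones += random.randint(0, 1)   # stand-in for the indeterminate event

final_freq = ones / trials
print(round(final_freq, 2))   # about 0.5
```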

      Note that I disagree with your philosophical interpretations of determinism and the existence of probabilities, not with the existence of ontological uncertainty or with the economy being a non-ergodic process. On these matters I'm one hundred percent with Keynes and Davidson and other like-minded people. I hope that I was able to present my objections to you in a constructive manner.

    4. Sorry if that was a tad obscure. Here:

      Hopefully that will clarify.

  3. "Or, put another way: do you believe in deterministic laws that govern the universe and can be understood by human beings?"

    Understanding presupposes an ability to decide the focus of one's inquiry, to consider the evidence, to evaluate alternatives, to reach a judgment. (This was Kant's point). If all of these activities are determined in, say, the manner of a clock's workings, then it seems that these laws could not be *understood* by human beings or any other rational creature.