Behavioral finance FAQ / Glossary (Bayes, Bayesian)


This is a separate page of the B section of the Glossary


Dates of related message(s) in the
Behavioral-Finance group (*):

Year/month, d: developed / discussed,
i: incidental

Bayes, Bayesian probability, learning

00/9i,10i - 01/2i - 02/2i,5i,8i,9i -
03/3i - 04/8i - 08/9d
+ see model,
probabilities, uncertainty, weak
signals, (reaction to) information,
adjustment + probabilities site link

When there is fog ahead,

the trick is to test the track step by step

until you are more certain of the road.


Bayesian probabilities (or conditional probabilities) are probabilities that

* are based on initial subjective (or at least tentative) hypotheses,
* are later adjusted when new information
   (or even... a lack of events) confirms or contradicts them.

They usually apply to a set of "mutually exclusive" hypotheses / probabilities
/ assumptions (it will be fully red, or blue, or blue and red, or another color).

They are flexible, adaptable forecasting tools.

They start with "quantified intuitions" (and
sometimes objective probabilities), and the Bayesian
process is to test and adjust them in real time
as additional information becomes known.

Why be subjective?

Why so often start with impressions, or assumptions
drawn from your hat?

Simply because they are all that is available!
And because there is my brain under my hat!

In many cases, objective probabilities are not available, or not fully reliable.

This uncertainty (see that word) arises:

* When events and situations are new (unprecedented,
    or with only rare precedents).

* All the more when human factors are at play (in economics and
     finance for example).

Then, only subjective probabilities, tentative assumptions, partial
knowledge, testable hypotheses and educated guesses can be used.

Applying subjective / partially subjective probabilities to models that were
built to use fully objective probabilities apparently strays from rationality.
How could half-baked guesses lead to fully scientific / predictive "asset
pricing models"?

Actually, such tentative models, whatever their shortcomings, at
least help test the waters.

And their subjectivity decreases at each adjustment step after such tests.

Bayesian probabilities differ
    from standard probabilities

Past frequencies vs. future hypotheses.

Standard probabilities are usually based on data frequencies observed in the
past, not on assumptions and hypotheses, however educated.

Also, in random time-distributions, the next event is usually assumed to
be independent of prior ones.

On the contrary, the Bayesian law is fully future- / scenario-oriented,
not just driven by past data (if they exist).

For example, it does not trust "back testing" fully, although that procedure
can help make the first estimate...

It also considers the next probability, when a new event occurs (or new
facts appear), as "conditioned" (thus the "conditional probabilities"
appellation), of course on that new fact, but also on the previous estimates.

=> It simply adjusts that prior estimate.

It can also consider that this next probability is conditioned on the occurrence
of some event.

=> If that probable event does not occur, the probability changes also.

How the "Bayes law" (*) expresses this.

(*) from Thomas Bayes, an 18th century mathematician.

The probability -p- of some proposition, after receiving information
that some proposition -q- has occurred, is called its "conditional
probability".

It is written as Pr(p|q) and reads as
      the new probability of -p- given that -q- has occurred.
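The Bayes law can be sketched in a few lines of code. This is a minimal illustration with made-up numbers (the 30%, 80% and 40% figures are assumptions, not from the text): Pr(p|q) = Pr(q|p) * Pr(p) / Pr(q), where Pr(q) expands over the cases "p true" and "p false".

```python
# Bayes' rule: Pr(p|q) = Pr(q|p) * Pr(p) / Pr(q)

def bayes_update(prior_p, lik_q_given_p, lik_q_given_not_p):
    """Return Pr(p|q): the revised probability of p once q is observed."""
    # Pr(q) expands over the two mutually exclusive cases: p true, p false.
    pr_q = lik_q_given_p * prior_p + lik_q_given_not_p * (1 - prior_p)
    return lik_q_given_p * prior_p / pr_q

# Illustrative numbers: a 30% prior, and evidence q that is twice as
# likely to appear when p is true (0.80) as when it is false (0.40).
posterior = bayes_update(0.30, 0.80, 0.40)
print(round(posterior, 3))  # 0.462: the 0.30 prior is revised upward
```

The evidence being twice as likely under p pulls the probability up from 0.30 to about 0.46, a moderate (not total) revision.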

Example: choose your Excalibur!

Suppose that only one among three swords (or spades
if you have no use for swords):

  Excalibur A, Excalibur B and Excalibur C

can make you King Arthur (or a good gardener).

You have to choose 1 of them.

That gives you a 1/3 probability, no?

Knowing that, you decide to play,
and you choose A.

It does not give you the crown (or the garden), sorry.

But it gives you the right to choose between B and C.

Your probability becomes 1/2.

And you discover that, in fact,
the hidden original probability of winning was above 1/3.
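The arithmetic behind the sword example above can be checked exactly. A short sketch using exact fractions: a 1/3 chance on the first pick, plus a 1/2 chance on the second pick in the 2/3 of cases where the first fails.

```python
from fractions import Fraction

# First pick: 1 winning sword among 3.
p_first = Fraction(1, 3)
# If the first pick fails (probability 2/3), the second pick is 1 of 2.
p_second_given_fail = Fraction(1, 2)
# Overall probability of ending up with the right sword across both tries:
p_overall = p_first + (1 - p_first) * p_second_given_fail
print(p_overall)  # 2/3: indeed above the naive 1/3 of a single pick
```

Knowing beforehand that a failed first pick grants a second try, the "hidden original" probability of winning was 2/3 all along.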

How to determine the initial hypotheses
     and on what basis to adjust them.

Educated guess, then looking for more clues.

1) Bayesian probabilities are, at the start,
assumptions / hypotheses about the
odds that one (*) proposition will
prove right or wrong.

Whenever possible, those priors are
not pure fantasies.

     They might be objective probabilities, but
insufficient to cover the specificities of
         the actual situation.

They are often based on experience, on
some signals and flimsy coincidences, or
on deductions, inferences, reasoning.

But they might also be purely intuitive /
subjective (a degree of belief).

2) Each new finding
or event
(or lack of
finding or event) is
compared to its
expected occurrences, so as
to adjust those priors.

This helps to deduce
new probabilities:

* about the real
truth (in static situations),

* or about the final
outcome (in evolving ones).

In practice, every time new information
or a new finding is available, probabilities are adjusted
(see adjustment) into "revised" ones.

This is done by taking into account those initial
probabilities as well as any new relevant element:

* All new / additional facts or events (and
their frequency) that might reinforce,
complement or change the assumptions.

* Also all non-occurrences (**) of events
   (and their frequency) that would confirm or
invalidate a scenario.

* The "residual possibilities" after an
expected event took place (***), or failed
to take place.

This changes:

* the probability numerator of a
    predicted outcome,

* but also the denominator, the sum
    of all possible outcomes.

(*) Actually, there might be one or several propositions, depending on
the scenarios that can be imagined, every one with its own odds.

(**) Among other specificities, and in some applications, Bayesian
statistics help to calculate, from the number of times a probable
event has *not* occurred, the probability that it will occur in
the future.

This is called "inverse probability".

(***) Your cat just died? Sorry, but at least it will not die again!

Those revised probabilities are less based on subjective beliefs than
the initial ones, as objective facts are taken into account.
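This gradual shift from belief to fact can be sketched as a chain of adjustment steps, each folding one observation into the prior. The numbers are purely hypothetical: a 50/50 subjective prior, then three confirming observations, each three times likelier if the hypothesis is true.

```python
def adjust(prior, lik_if_true, lik_if_false):
    """One Bayesian adjustment step: fold a single observation into the prior."""
    evidence = lik_if_true * prior + lik_if_false * (1 - prior)
    return lik_if_true * prior / evidence

# Hypothetical scenario: a purely subjective 50/50 prior, then three
# confirming observations, each 3x likelier if the hypothesis is true.
p = 0.5
for _ in range(3):
    p = adjust(p, 0.6, 0.2)
print(round(p, 3))  # 0.964: the revised figure now leans on facts
```

After each step the weight of the arbitrary starting point shrinks: the sequence runs 0.5, 0.75, 0.9, then about 0.96.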

Bayesian probabilities should not be confused with fuzzy logic (see that phrase),
although there is some similarity in their iterative processes.

Also, there is some relation with computer techniques such as genetic algorithms
or soft computing.

Is it just about waiting for occurrences?
    Or about anticipating them?

Are your plans B, C, D, etc. ready?

We see that a Bayesian probability is an adaptation approach that leads
to changing the probability p when q occurs (or does not occur).

But adaptation, however needed, is not anticipation.

To make decisions there is a need not only to adapt to what
happens, but also to anticipate what could happen.

The anticipation part in the Bayesian "philosophy" is to start with "priors"
and to have an idea about what could confirm or change them.

This is an incentive, whenever possible, to prepare beforehand
a range of various key scenarios (see that
word) and to give them tentative probabilities.

Every key scenario anticipates a possible bifurcation (see that
word) of the chain of events that could dramatically change the
foreseeable situation.

In what cases to use Bayesian probabilities?

In the case of a misty past and of a foggy future.

This mental process, also called Bayesian learning or
Bayesian inference, suits situations of ambiguity /
uncertainty (see those words and their
difference with "risk"),
in which probabilities cannot
be scientifically quantified,
as past data do not exist
or are not reliable.

Here are several cases:

* When historical data about the frequency of events are
unavailable, too scarce, or not trustworthy.

* When situations are new, thus when past data are
either nonexistent, insufficient,
or not relevant. It would be dangerous to use data on past
situations that are not really similar.

That covers all cases where the future might not be fully
identical to the past we know.

Those various situations are frequent in economics and finance, as
human activities are highly prone to uncertainty.

In such cases, Bayesian probabilities help to adjust economic and financial
expectations and decisions.

Are people Bayesian?

Is Joe a good adjuster?

"Adjustment" (of probabilities) is the key in Bayesian thinking.

This rings a bell, as the word is also used in the behavioral finance

"underreaction - adjustment - overreaction" chain (see those words).

When a new event takes place, investors and the market often:

* Underreact: they stay anchored on the old situation and are not
   aware of the new odds.

This delays the market price adjustment,

* Overreact: they look only at the new event without checking whether
it fits or not the previous hypotheses they made on the basis of the old
situation.

This shows that underreactions / overreactions are also under-adjustments /
over-adjustments of prior beliefs and are signs that investors are not
always fully Bayesian
(and also not fully rational).

But use the stuff carefully.

Beware of "false positives".

All this does not mean that Bayesian thinking is fully rational and effective.

As seen above, it often starts with subjective assumptions, hoping they will
become progressively
more objective when tested against events and
adjusted accordingly.

The risk is that some biased assumptions might get
confirmed:

* By a rare random coincidence
   (see small numbers, representativeness),

* Or by selective exposure (see that phrase).

Then the biased mental anchoring gets solidified (see "confirmatory bias").

As a common example, this is sometimes what creates superstition.
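The false-positive trap above is itself a Bayesian computation. A sketch with hypothetical numbers: a superstition-like hypothesis with a 1% base rate, and "confirming" signs that also appear 9% of the time by pure coincidence.

```python
# Hypothetical numbers: a rare hypothesis (1% base rate) and a sign that
# "confirms" it 90% of the time when true, but also shows up 9% of the
# time when it is false (a false positive).
prior = 0.01
p_sign_if_true = 0.90   # the sign appears when the hypothesis is true
p_sign_if_false = 0.09  # ...but also appears, by coincidence, when false
p_sign = p_sign_if_true * prior + p_sign_if_false * (1 - prior)
posterior = p_sign_if_true * prior / p_sign
print(round(posterior, 3))  # 0.092: one "confirming" sign leaves p below 10%
```

With a low base rate, most "confirmations" are coincidences: the revised probability stays under 10%, however striking the sign may feel.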

(*) To find those messages: reach that BF group and, once there,
      1) click "messages", 2) enter your query in "search archives".



This page's last update: 10/08/15


Disclaimer / Legal notice