I've been reading "Ten Great Ideas About Chance", by Persi Diaconis.
It includes a chapter that develops the idea of deriving
probability from statistics, rather than the usual way round.
Apparently this was invented by Bruno de Finetti, whom
the author idolizes.
Anyhow, he discusses how means can imply probabilities,
and repeatedly refers to "the structure of probability" as
an explanation.
He never defines this phrase, and I have no idea what it means.
Can anyone here elaborate?
RichD wrote:
> I've been reading "Ten Great Ideas About Chance", by Persi Diaconis.
> It includes a chapter that develops the idea of deriving
> probability from statistics, rather than the usual way round.
> Apparently this was invented by Bruno de Finetti, whom
> the author idolizes.
> Anyhow, he discusses how means can imply probabilities,
> and repeatedly refers to "the structure of probability" as
> an explanation.
> He never defines this phrase, and I have no idea what it means.
> Can anyone here elaborate?
I haven't seen this, but the author may just be referring to the basic
axioms of probability. It might be that de Finetti works with ideas
about how probabilities or uncertainties (personal probabilities)
derived from data should behave (consistency as new data are added,
etc.), and can then derive the usual axioms of probability on that
basis, for his idea of how a personal probability should behave.
> I've been reading "Ten Great Ideas About Chance", by Persi Diaconis.
> It includes a chapter that develops the idea of deriving
> probability from statistics, rather than the usual way round.
> Apparently this was invented by Bruno de Finetti, whom
> the author idolizes.
> Anyhow, he discusses how means can imply probabilities.
Whittle, P. (2005). Probability via Expectation, 4th ed.:

"We assume a sample space Ω, setting a level of description of the
realization of the system under study. In addition, we postulate that
to each numerical-valued observable X(ω) can be attached a number E(X),
the expected value or expectation of X. The description of the variation
of ω over Ω implied by the specification of these expectations will
be termed a probability process."
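The Whittle program takes the expectation E as the primitive notion and recovers probabilities from it: the probability of an event A is just the expectation of its indicator function, P(A) = E[1_A]. A minimal sketch on a finite sample space (the names and setup here are illustrative, not taken from the book):

```python
# Probability recovered from an expectation functional: P(A) = E[1_A].
# Illustrative sketch of the "probability via expectation" idea on a
# finite sample space -- a single fair die roll.

# The sample space, and an expectation functional defined directly as
# a weighted average of an observable (uniform weights here).
omega = [1, 2, 3, 4, 5, 6]

def E(X):
    """Expectation of an observable X: Omega -> R (uniform weights)."""
    return sum(X(w) for w in omega) / len(omega)

# An event is a subset A of the sample space; its indicator 1_A is
# itself an observable, so P(A) falls out as E[1_A].
def P(A):
    return E(lambda w: 1.0 if w in A else 0.0)

print(P({2, 4, 6}))    # probability of an even roll: 0.5
print(E(lambda w: w))  # expected value of the roll itself: 3.5
```

With E as the starting point, the usual properties of P (nonnegativity, P(Ω) = 1, additivity over disjoint events) follow from linearity and positivity of E applied to indicators.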
> And repeatedly refers to "the structure of probability", as explanation.
Pretty sure we are talking about Kolmogorov's "probability theory as
part of mathematics within the modern theory of measure and integral",
with "a real-valued random variable [being] a measurable function from
the basic set to the real numbers" [as above].
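In Kolmogorov's picture the primitives run the other way: one starts from a probability measure on the basic set, and a random variable is a measurable function from that set to the reals, with expectation defined as the integral of the function against the measure. On a finite space that integral is just a weighted sum; a toy sketch (illustrative setup, not from any of the books mentioned):

```python
# Kolmogorov's measure-theoretic picture on a finite basic set:
# a probability measure on the outcomes of two fair coin tosses,
# and a random variable X as a (trivially measurable) function
# from outcomes to the reals. Illustrative sketch only.

# The measure, as a dict mapping each outcome to its probability.
measure = {"HH": 0.25, "HT": 0.25, "TH": 0.25, "TT": 0.25}
assert abs(sum(measure.values()) - 1.0) < 1e-12

# Random variable: number of heads in the two tosses.
def X(w):
    return w.count("H")

# Expectation as the integral of X against the measure, which on a
# finite space reduces to a weighted sum over outcomes.
EX = sum(X(w) * p for w, p in measure.items())
print(EX)  # 1.0
```

So the two viewpoints in this thread differ only in which object is taken as primitive: Kolmogorov derives E from the measure, while de Finetti and Whittle take E (or the means) as given and recover the measure from it.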