Title: Mathematics at DEC
Moderator: RUSURE::EDP
Created: Mon Feb 03 1986
Last Modified: Fri Jun 06 1997
Last Successful Update: Fri Jun 06 1997
Number of topics: 2083
Total number of notes: 14613
1429.0. "Probability" by JARETH::EDP (Always mount a scratch monkey.) Mon Apr 22 1991 10:53
<<< RUSURE::NOTES1:[NOTES$LIBRARY]MATH.NOTE;7 >>>
-< Mathematics at DEC >-
================================================================================
Note 1283.85 Math Noters Dinner 85 of 92
SMAUG::ABBASI 4 lines 11-APR-1991 15:20
-< but what is the actual value of pi/4 ? >-
--------------------------------------------------------------------------------
ref .-1
PI/4 is not a rational number. What then does it mean to give an
irrational number value to a probability?
/naser
Note 1283.86 Math Noters Dinner 86 of 92
GUESS::DERAMO "Dan D'Eramo" 4 lines 11-APR-1991 16:11
-< ignore him, he's just bein' a Buffon :-) >-
--------------------------------------------------------------------------------
Oops, dropped my toothpick. Oh well, at least it landed
completely within one plank of the wooden floor.
Dan
Note 1283.87 Math Noters Dinner 87 of 92
CADSYS::COOPER "Topher Cooper" 31 lines 11-APR-1991 16:33
-< Good question. >-
--------------------------------------------------------------------------------
RE: .85 (naser)
> What then does [it] mean to give an irrational number value to a
> probability?
An interesting and important question, since it led to the rejection
of the 'classical' probability formulation, which could not answer
it (probability defined in terms of equiprobable primitive events),
and the adoption of the frequentist probability formulation, which
stands as the foundation of classical statistical theory (note the
difference between classical probability theory and classical
statistical theory).
In the frequentist interpretation of probability as formulated by
von Mises and Church, you start with infinite sequences of samples
made under indistinguishable conditions. If the sequence obeys certain
conditions, then it is said to be a random sequence. The probability
of an event (outcome) in that sequence is the limit of the proportion
out of 'n' samples with that outcome as n goes to infinity. Since it
is defined in terms of a limit an irrational value makes perfect sense.
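That limiting proportion can be watched converging numerically. As an illustrative sketch (hypothetical Python, not from the original thread): the probability that a uniform random point in the unit square lands inside the quarter circle is exactly pi/4, an irrational number that arises naturally as a frequency limit.

```python
import random

def estimate_quarter_circle(n, seed=0):
    """Proportion of n uniform points in the unit square that fall
    inside the quarter circle x^2 + y^2 <= 1.  As n -> infinity this
    proportion tends to the irrational probability pi/4 ~= 0.78540."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits / n

# With n = 100_000 the running proportion is typically within a few
# thousandths of pi/4.
```

The finite proportions are all rational, of course; only the limit is irrational, which is exactly why the limit-based definition accommodates it.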
In the subjective interpretation of probability which is the basis of
the Bayesian statistical theory, a probability represents a degree of
rational uncertainty and there is no particular reason why that degree
of uncertainty should not be represented by an irrational number.
(note: whatever the frequentists claim, most people, including
technical people, think of probability in subjective/Bayesian terms.
If you have ever said "That theory isn't very probable given the
available evidence" then you are a closet Bayesian).
Topher
Note 1283.91 Math Noters Dinner 91 of 92
SMAUG::ABBASI 8 lines 19-APR-1991 11:44
-< is 'likelihood' the same as 'probability' ? >-
--------------------------------------------------------------------------------
since we are talking about probabilities, is there a difference between
'likelihood' and 'probability' of an event?
i seem to remember reading some time ago that there is a fine difference
between the terms as used in the probability domain (maybe more in England?),
i don't see a difference in these terms. are they interchangeable?
thanks,
/naser
Note 1283.92 Math Noters Dinner 92 of 92
CADSYS::COOPER "Topher Cooper" 43 lines 19-APR-1991 14:01
-< Commonly yes, technically no. >-
--------------------------------------------------------------------------------
In common usage the terms are essentially synonymous. When a popular
concept "translates" into a formal concept, the name will frequently
be carried over, then having a "technical meaning". That's what
happened with the term "probability". Later a need was found for
a distinct formalization of the popular concept, and another "technical
term" was needed. The obvious thing to do was to use the synonymous
popular term, i.e., likelihood.
Fisher, one of the architects (probably valid to call him *the*
principal architect) of classical statistical theory, made a stab at
dealing with Bayesian probability concepts. He defined a quantity
called the "likelihood", which he said was defined to be proportional
to probability. (For technical reasons, modern classical statisticians
talk about a value of a likelihood function rather than about
likelihood directly). While, for example, it makes no sense (in
the frequency interpretation of probability) to talk about the
probability that a specific coin flip is heads, it does make sense to
talk about the *likelihood* that it is heads. Unfortunately, though
you can talk about the likelihood, you can't know what it is, since
the proportionality constant is unknown and unknowable (since it is
basically the prior probability of Bayesian statistics which the
frequentists believe is ill-defined). What can be spoken of in
classical statistics is the likelihood ratio -- in which case the
proportionality constant disappears.
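The cancellation is easy to see concretely. A sketch (illustrative names, binomial coin-flip model assumed): the likelihood of a heads-probability theta given k heads in n flips is proportional to theta^k * (1-theta)^(n-k), with an unknown constant c. Any c drops out of the ratio.

```python
def likelihood(theta, k, n, c=1.0):
    """Likelihood of heads-probability theta given k heads in n flips,
    defined only up to an unknown proportionality constant c."""
    return c * theta**k * (1.0 - theta)**(n - k)

k, n = 7, 10
# The ratio is the same no matter what constant we plug in:
ratio_a = likelihood(0.7, k, n, c=1.0) / likelihood(0.5, k, n, c=1.0)
ratio_b = likelihood(0.7, k, n, c=42.0) / likelihood(0.5, k, n, c=42.0)
# ratio_a == ratio_b -- the unknowable constant has disappeared.
```

This is why classical statistics can work with likelihood ratios even though the individual likelihoods are, strictly, unknowable.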
Sounds like I'm just talking about the ratio of probabilities, doesn't
it? In a sense I am. But the formalism justifies putting
probabilities together in ways which are otherwise "irrational" in
strict frequentist terms. All this stuff is used frequently by
statisticians for doing things like deciding which of several
estimators of a quantity is best.
Meanwhile, many Bayesians have picked up on this and refer to prior
and a posteriori likelihoods rather than probabilities. In essence they try
to compromise with the frequentists by saying "OK, if you want
probability to mean only something that is defined in terms of
frequency, we'll go along with that. But there is also *likelihood*
which represents degree of uncertainty, of which probability is
a special case." The difference is that, to a subjectivist/Bayesian,
you *can* give specific values to a likelihood without having to deal
with ratios.
Topher
T.R | Title | User | Personal Name | Date | Lines
---|---|---|---|---|---
1429.1 | another slant on likelihood | CSSE::NEILSEN | Wally Neilsen-Steinhardt | Tue Apr 23 1991 14:19 | 32
What Topher says in .0 or 1283.92 is not wrong, but it can be made a bit more
specific.
The following is paraphrased from _Statistics_, Winkler and Hays; I share their
Bayesian outlook.
A likelihood is a particularly interesting example of a conditional probability.
A general conditional probability P(A|B) gives the probability of event A given
the occurrence of event B. When A is an observation and B is some state of
the world, we call P(A|B) a likelihood.
For example, suppose we have a production line which is producing some unknown
fraction of failures. And suppose we take a sample of ten items and find that
two of the sample fail. We can define events
A = two fail out of a sample of ten
B = the true fraction of failures is f
Then P(A|B) is the likelihood that we will see that sample, given that failure
rate. With the usual rules of probability we can compute likelihoods as a
function of sample outcome and true fraction.
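Under the usual binomial assumption, that computation is one line. A sketch (hypothetical function name): P(A|B) for A = "2 failures in a sample of 10" given B = "true failure fraction is f" is C(10,2) * f^2 * (1-f)^8, read as a function of f.

```python
from math import comb

def likelihood_of_f(f, failures=2, sample=10):
    """Binomial probability of seeing `failures` bad items in `sample`
    draws when the true failure fraction is f -- i.e. the likelihood
    of f given the observed sample."""
    return comb(sample, failures) * f**failures * (1.0 - f)**(sample - failures)

# e.g. likelihood_of_f(0.2) ~= 0.302, likelihood_of_f(0.5) ~= 0.044:
# the observed sample is much more likely under f = 0.2 than f = 0.5.
```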
One of Fisher's great contributions was the principle of maximum likelihood,
which says that a good (sometimes the best) estimator for B is that which
maximizes the likelihood of the A actually observed. This together with a bit
of calculus can produce estimator formulas in some quite complicated
situations. Note that since we are maximizing a function, we don't care about
scale factors, so it is often sufficient to work with ratios.
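Maximum likelihood can be demonstrated by brute force on the same example. A self-contained sketch (illustrative names; in this simple case calculus gives the same answer, f-hat = failures/sample):

```python
from math import comb

def likelihood_of_f(f, failures=2, sample=10):
    """Binomial likelihood of failure fraction f given the sample."""
    return comb(sample, failures) * f**failures * (1.0 - f)**(sample - failures)

# Grid search over candidate values of f; the maximizer is the
# familiar estimator failures/sample = 0.2.
grid = [i / 1000 for i in range(1, 1000)]
f_hat = max(grid, key=likelihood_of_f)
```

Note that multiplying the likelihood by any constant would not move the maximizer, which is the point about scale factors above.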
As Topher says, Fisher and other frequentists had a real problem regarding many
of their P(A|B)s as probabilities, so they used a different word and a lot of
different reasoning to arrive at the principle of maximum likelihood.