
Conference rusure::math

Title: Mathematics at DEC
Moderator: RUSURE::EDP
Created: Mon Feb 03 1986
Last Modified: Fri Jun 06 1997
Last Successful Update: Fri Jun 06 1997
Number of topics: 2083
Total number of notes: 14613

1784.0. "Roulette: Red vs. black (excluding the greens)" by KDX200::ROBR (Orange squares suck.) Wed Sep 01 1993 11:16

    
    Can anybody verify the logic in this?  A friend and I were talking the
    other night about what the chances are of playing roulette and having
    red come up 8 times in a row.  His explanation follows (which I don't
    exactly agree with).
    
    
    
    
Let's see if I can explain what I tried to get across Sunday night.

Using a coin toss as an example...

If a coin is tossed twice (2 to the 1st power) you should get 1 head and 1 tail

If a coin is tossed 4 times (2 to the 2nd power) you should get 2 heads and 2 
tails. However there is a good chance that either 2 heads or 2 tails will come
up in a row.

If a coin is tossed 8 times (2 to the 3rd power) you should get four heads and
4 tails. However there is a good chance that either 3 heads or 3 tails will
come up in a row.

Are you with me here....

Extrapolate...
If a coin is tossed 256 times (2 to the 8th power) you should get 128 heads and
128 tails. However there is a good chance that either 8 heads or 8 tails will
come up in a row. I may be wrong but my gut tells me that there should be
an occurrence of BOTH 8 Heads and 8 Tails in this sample.
Please keep in mind that an occurrence of 7 heads in a row should happen twice
in this example: once stopping at 7 and once continuing to 8 (or perhaps
9, depending on where the laws of chaos and strange attractors fit in).

If my logic is sound, then an occurrence of 12 Blacks in a row on a roulette
wheel is far from impossible given a large enough sample.
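
A quick way to check this, rather than trusting my gut, is to simulate it.
Here is a rough sketch in Python (assuming a perfectly fair coin and
independent tosses; the trial count is arbitrary):

import random

TRIALS = 10_000   # simulated 256-toss sessions
TOSSES = 256

def longest_run(seq, symbol):
    # length of the longest unbroken run of `symbol` in seq
    best = cur = 0
    for s in seq:
        cur = cur + 1 if s == symbol else 0
        best = max(best, cur)
    return best

heads8 = tails8 = both8 = 0
for _ in range(TRIALS):
    tosses = [random.choice("HT") for _ in range(TOSSES)]
    h = longest_run(tosses, "H") >= 8
    t = longest_run(tosses, "T") >= 8
    heads8 += h
    tails8 += t
    both8 += h and t

print("fraction with a run of >= 8 heads:", heads8 / TRIALS)
print("fraction with a run of >= 8 tails:", tails8 / TRIALS)
print("fraction with both:               ", both8 / TRIALS)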

Hope I 'splained it.


Larry
T.R  -< Title >-  User "Personal Name"  Date  (Lines)
1784.1  AMCCXN::BERGH "Peter Bergh, (719) 592-5036, DTN 592-5036"  Wed Sep 01 1993 14:35  (93 lines)
           <<< Note 1784.0 by KDX200::ROBR "Orange squares suck." >>>
              -< Roulette: Red vs. black (excluding the greens) >-


    
<    Can anybody verify the logic in this?  A friend and I were talking the
<    other night about what the chances are of playing roulette and having
<    red come up 8 times in a row.  His explanation follows (which I don't
<    exactly agree with).

For simplicity, we'll assume that the coin is exactly even (i.e., that the odds
of getting heads is the same as the odds of getting tails) and that it is
impossible for the coin to get anything other than heads or tails.

We'll also assume that successive coin tosses are independent.  For example, if
you get heads once, you will not then "toss" the coin by turning the other side
up.

< Let's see if I can explain what I tried to Sunday night.

< Using a coin toss as an example...

< If a coin is tossed twice..(2 to the 1st power) you should get 1 head and 1
< tail

No.  You will get exactly one of heads followed by heads (HH), heads followed
by tails (HT), tails followed by heads (TH), or tails followed by tails (TT).
You "should" get 1 head and 1 tail *only* in the sense that it is the most
likely outcome (TH or HT).

From having looked at the possibilities, we get that

	P(HH) == P(HT) == P(TH) == P(TT) == .25

< If a coin is tossed 4 times (2 to the 2nd power) you should get 2 heads and 2 
< tails. However there is a good chance that either 2 heads or 2 tails will come
< up in a row.

	P(HHHH) == P(HHHT) == P(HHTH) == P(HHTT) ==
	P(HTHH) == P(HTHT) == P(HTTH) == P(HTTT) ==
	P(THHH) == P(THHT) == P(THTH) == P(THTT) ==
	P(TTHH) == P(TTHT) == P(TTTH) == P(TTTT) == .0625

Thus:
	P(2 heads and 2 tails) == 6/16
	P(exactly 2 heads in a row) == P(exactly 2 tails in a row) == 5/16
	P(at least 2 heads in a row) == P(at least 2 tails in a row) == 8/16

< If a coin is tossed 8 times (2 to the 3rd power) you should get four heads and
< 4 tails. However there is a good chance that either 3 heads or 3 tails will
< come up in a row.

I'm not going to list all possibilities, but each possible sequence of heads
and tails has a probability of 1 in 256.  The probability of exactly 4 heads
and 4 tails is

	B(8,4)/256 (== .2734375)

where B(n,m) is the binomial coefficient, defined as n!/((m!)*((n-m)!)).
(n! is 1*2*...*n.  For example, 1! == 1, 2! == 2, 3! == 6, and 4! == 24).
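
These counts are small enough to verify by brute force.  A minimal
enumeration sketch in Python (the helper names are just illustrative):

from itertools import product
from math import comb

def max_run(seq, symbol):
    # length of the longest unbroken run of `symbol` in seq
    best = cur = 0
    for s in seq:
        cur = cur + 1 if s == symbol else 0
        best = max(best, cur)
    return best

seqs4 = list(product("HT", repeat=4))           # the 16 equally likely outcomes
print("2 heads and 2 tails:      ", sum(s.count("H") == 2 for s in seqs4), "out of 16")
print("exactly 2 heads in a row: ", sum(max_run(s, "H") == 2 for s in seqs4), "out of 16")
print("at least 2 heads in a row:", sum(max_run(s, "H") >= 2 for s in seqs4), "out of 16")

print("4 heads in 8 tosses: B(8,4)/256 =", comb(8, 4) / 256)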

< Are you with me here....

I'm listening.

< Extrapolate...
< If a coin is tossed 256 times (2 to the 8th power) you should get 128 heads
< and 128 tails. However there is a good chance that either 8 heads or 8 tails
< will come up in a row. I may be wrong but my gut tells me that there should
< be an occurrence of BOTH 8 Heads and 8 Tails in this sample.

The combinations get more complicated than I'm willing to deal with.

< Please keep in mind that an occurrence of 7 heads in a row should happen twice
< in this example: once stopping at 7 and once continuing to 8 (or
< perhaps 9, depending on where the laws of chaos and strange attractors fit in).

It's quite likely that you'll get 7+ heads in a row if you toss 256 times.

< If my logic is sound, then an occurrence of 12 Blacks in a row on a roulette
< wheel is far from impossible given a large enough sample.

I'll not comment on your logic, but 12 heads in a row is virtually certain
(its probability can be made arbitrarily close to 1) given a large enough
sample (but the needed sample size increases without bound as you get closer to
1).

The real question is not whether 12 heads in a row will *eventually* occur, but
whether it will occur within X tosses.

An example (due to G Gamow) is that your terminal will eventually "tunnel"
through your desk and land on the floor.  The problem is the "eventually" part:
it will take many times the lifetime of the universe before such an occurrence
becomes even remotely likely.
1784.2  KDX200::ROBR "Orange squares suck."  Wed Sep 01 1993 15:05  (7 lines)
    
    thanks!  very interesting....  the discussion came up when we were
    discussing various gambling odds.  his roulette system had been to play
    x units on one color, then just keep doubling each time until he
    eventually would win and break even. however he lost 8 times in a row.
    
    
1784.3  -< It --doesn't-- work. >-  VMSDEV::HALLYB "Fish have no concept of fire"  Wed Sep 01 1993 16:45  (37 lines)
>    x units on one color, then just keep doubling each time until he
>    eventually would win and break even. however he lost 8 times in a row.
    
This is quoted from the rec.gambling Frequently Asked Questions (FAQ) list.

Q:S1  Martingale betting systems -- just double your bet until you win
A:S1  (Frank Irwin, Steve Jacobs)
 
  From: "The Eudaemonic Pie" by Thomas A. Bass
 
    The word comes from the French expression "porter les chausses a la
    martingale," which means "to wear one's pants like the natives of
    Martigue," a village in Provence where trousers are fastened at the
    rear.  The expression implies that this style of dress and method of
    betting are equally ridiculous.
 
  The betting scheme merely states that you would want to double your
  bet after each loss.  Beginning with one unit, you would bet two units
  if you lost the first.  Then four, then eight, until you win a bet.  You
  would then revert to a one unit bet.  The theory is that with each win
  you will win all that you lost since the last win, plus one unit.  The
  reality is that you will quickly come to a betting ceiling, governed by
  either your bankroll or the house limit, above which you may not increase
  your bet.  After 9 straight losses (it's happened to me) you would be
  betting 512 units.
 
  In practice, a lot of people get sucked into betting this way because it
  gives the illusion of really working.  This is because most of the time,
  you will end a string of bets with a win.  However, on those rare occasions
  when you do lose, you will lose a lot of money.  So, the end result is that
  you win a small amount almost always, but when you lose you will lose more
  than all of your little wins combined.
 
  The important point to realize is that most games simply cannot be beat
  in the long run.  In games such as craps, roulette, and non-progressive
  slot machines, it is mathematically impossible to gain an advantage over
  the house.
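
A rough simulation along these lines makes the FAQ's point concrete.  This is
a Python sketch only; the 18/38 win chance is an even-money bet on a
double-zero wheel, and the house limit and cycle count are arbitrary:

import random

P_WIN = 18 / 38       # even-money bet on a double-zero wheel
HOUSE_LIMIT = 500     # largest bet the house will take
CYCLES = 100_000      # number of betting cycles to simulate

net = 0
busts = 0
for _ in range(CYCLES):
    bet, lost_so_far = 1, 0
    while True:
        if random.random() < P_WIN:
            net += bet - lost_so_far          # recover all losses, plus one unit
            break
        lost_so_far += bet
        bet *= 2
        if bet > HOUSE_LIMIT:                 # can no longer double: eat the loss
            net -= lost_so_far
            busts += 1
            break

print("cycles that ended in the big loss:", busts)
print("net result over all cycles:       ", net, "units")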
1784.4  AMCCXN::BERGH "Peter Bergh, (719) 592-5036, DTN 592-5036"  Wed Sep 01 1993 16:52  (26 lines)
           <<< Note 1784.2 by KDX200::ROBR "Orange squares suck." >>>

    
<    thanks!  very interesting....  the discussion came up when we were
<    discussing various gambling odds.  his roulette system had been to play
<    x units on one color, then just keep doubling each time until he
<    eventually would win and break even. however he lost 8 times in a row.

"His" [*] roulette system *only* works when

    1.	The bank allows him to keep his bets on black/red/even/odd when the
	zero or double zero comes up

    2.	He has infinite capital.

In that case, he'll be x units up after playing for a long time.

Basically, there are no systems that will work with a finite capital and an
unbiased roulette.  The existence of the zero is enough to guarantee that the
bank wins in the long run (it gives the bank 1/37 (or 2/38, for roulettes with
a zero and a double zero) of the money wagered).
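
For example, a 1-unit bet on red wins 1 unit with probability 18/38 and loses
1 unit with probability 20/38 on a double-zero wheel, so its expectation is
18/38 - 20/38 = -2/38, about -5.3% of the amount wagered; on a single-zero
wheel it is 18/37 - 19/37 = -1/37, about -2.7%.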

=====================

[*] It's a quite old system that gets published with some frequency.  Like most
other systems, it requires infinite capital.
1784.5  KDX200::ROBR "Orange squares suck."  Wed Sep 01 1993 17:25  (23 lines)
    
    Here's something on a similar line...  I play blackjack occasionally.
    The system I was taught recently was explained to me thus:
    
    Bet 1 unit; if you lose, bet 3, then 7, then 15 (each new bet is the
    previous bet times 2, plus 1).  Once you win, go back to 1.  Very
    similar, except it was explained to me that the odds of winning the
    first hand in a string are roughly 50/50.  If you lose, the odds of
    winning the 2nd hand are 75%; if you lose that, the 3rd is 87.5%; and
    so on until the odds are 99%+ after 5 or 6 losing hands.
    
    Now, obviously this takes into account standard blackjack betting
    'rules' (i.e., double on an 11, etc.).  However, I don't know if these
    numbers are true (he said his math professor at Tufts explained this to
    him and has made $2 million playing this system).  I've played it myself
    on a PC game called Beat the House, which is a very accurate blackjack
    game.  I HAVE hit streaks where I've lost 7 hands in a row.
    
    Any ideas if there is truth to this, or, if it's too complex to explain,
    a pointer to something that would?  I'm just wondering if this thing is
    as good as it's made out to be (I chickened out after the 4th tier when
    I used it in Vegas, the result being I limited my losses to only $325
    or so :') ).
    
1784.6  AUSSIE::GARSON "nouveau pauvre"  Wed Sep 01 1993 23:38  (20 lines)
    re .?
    
    Cute etymology of Martingale.  You learn something new every day.
    
    re .4
    
    Actually you don't quite need infinite capital. Given no house limit
    and resources sufficiently larger than the house, you may be able to
    bankrupt them before they bankrupt you (assuming a fair game - which
    roulette isn't). In reality of course there *is* a house limit (and
    your resources are very much less than the house) and Martingale is
    flawed.
    
    re .*
    
    The thing to remember about gambling is that the companies involved are
    not charities. The odds have to be against you. On average you will
    lose. Thus if you want to gamble, think of the money you lose as paying
    for entertainment and ask yourself how else you could entertain
    yourself with the money.
1784.7  RUSURE::EDP "Always mount a scratch monkey."  Thu Sep 02 1993 09:18  (11 lines)
    Re .3:
    
    > In games such as craps, roulette, and non-progressive slot machines,
    > it is mathematically impossible to gain an advantage over the house.
    
    Unless poor maintenance has allowed the pins on the roulette wheel to
    become loose or you are able to measure the ball and wheel position and
    velocity and compute where it is likely to land.
    
    
    				-- edp
1784.8  RUSURE::EDP "Always mount a scratch monkey."  Thu Sep 02 1993 09:29  (39 lines)
    Re .5:
    
    > The system I was taught recently was explained to me as thus:
    
    The important thing to realize is that on EACH play of a game, there is
    an "expected" loss.  This loss is not something you should expect to see
    on any particular play; it is just the _average_ of the losses and wins
    that will occur over time.
    
    Since this average loss exists on each play, any combination of bets
    you make still adds up to an expected loss.  You can, through various
    betting schemes, trade off a large probability of a small win against a
    small probability of a large loss -- but you can never change the fact
    that the average is still a loss.
    
    Yes, you can make your probability of winning as close to certainty as
    you like -- but the cost of doing that is that the amount you risk
    losing increases greatly.
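
    To put rough numbers on that trade-off, here is a small Python sketch
    computing the exact expectation of an 8-step doubling progression on an
    even-money double-zero roulette bet (the step count and the game are
    just an illustration):

# Exact expectation of an 8-step double-after-loss progression on an
# even-money double-zero roulette bet; illustrative numbers only.
p = 18 / 38                      # chance of winning one even-money bet
q = 1 - p
steps = 8

p_small_win = 1 - q ** steps     # cycle ends with a net +1 unit
big_loss = 2 ** steps - 1        # units lost if all 8 bets lose
ev = 1 * p_small_win - big_loss * (1 - p_small_win)

print("probability of the small win:", round(p_small_win, 4))   # roughly 0.99
print("size of the rare loss:       ", big_loss, "units")
print("expected value per cycle:    ", round(ev, 3), "units")   # negative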
    
    If you want to win at blackjack, get a book called Beat the Dealer by
    Thorp.  In blackjack, some casinos' rules allow a player who uses
    optimal strategy to have a slight "expected" win on each play.  And even
    if that isn't possible under a given set of rules, most casinos' rules
    allow a player to have a slight "expected" win on some plays when the
    player knows something about what cards are left in the deck.
    
    In the former case, you need to memorize a medium-sized set of rules
    about how to play.  It's not too hard.  In the latter case, you need to
    count cards and remember more rules about making your bet smaller when
    the deck is bad for you and making it larger when the deck is good for
    you.
    
    
    				-- edp
    
    
Public key fingerprint:  8e ad 63 61 ba 0c 26 86  32 0a 7d 28 db e7 6f 75
To get PGP, FTP /computing/security/software/PGP/pgp23.zip from
src.doc.ic.ac.uk.
1784.9  -< n*2+1 sequence is called "The Grand Martingale" >-  VMSDEV::HALLYB "Fish have no concept of fire"  Thu Sep 02 1993 09:49  (21 lines)
>                     Very similar, except I was explained that the odds of
>    winning the first in a string are roughly 50/50.  If you lose the odds
>    of winning the 2nd hand are 75, if you lose the third is 87.5, and on
>    until the odds are 99%+ after 5 or 6 hands of losing.  
    
    NO!
    
    Say the odds of winning are 50-50.  You lose the first hand.
    The odds of winning the second hand then become ... 50-50.
    Say you lose the second hand.  The odds of winning the _third_ 
    hand then become ... 50-50.  Say you lose the third hand.
    The odds of winning the _fourth_ hand then become ... 50-50.
    
    In general, the odds of winning hand N are 50-50, regardless of
    the results of the previous hands.
    
    You can verify this yourself with a coin-toss game you can hack up
    in a couple minutes.  Which you ought to do if you're thinking about
    putting some REAL MONEY on the line.
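
    Something along these lines would do it (a quick Python sketch, treating
    each hand as a fair 50/50 coin flip, which is the assumption in the
    argument above; the hand count is arbitrary):

import random
from collections import defaultdict

HANDS = 1_000_000
wins_after = defaultdict(int)   # wins following exactly k straight losses
hands_after = defaultdict(int)  # hands played following exactly k straight losses

streak = 0
for _ in range(HANDS):
    win = random.random() < 0.5          # a fair 50/50 "hand"
    hands_after[streak] += 1
    wins_after[streak] += win
    streak = 0 if win else streak + 1

for k in range(6):
    print(f"win rate after {k} straight losses: {wins_after[k] / hands_after[k]:.3f}")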
    
      John
1784.10  -< and i always fall victim to 'free' drinks :') >-  KDX200::ROBR "Orange squares suck."  Thu Sep 02 1993 11:37  (8 lines)
    
    Well, a coin toss is a bit different from blackjack.  That is a
    straightforward, only-two-outcomes-can-happen type of thing.  Seems like
    apples and oranges.  I'm very curious what Thorp has to say though, so
    I will grab that book EDP mentions... I'm familiar with counting cards,
    half decks, true count and all that good stuff, just not very good at
    it :').
    
1784.11  -< Equation. >-  CADSYS::COOPER "Topher Cooper"  Thu Sep 02 1993 12:26  (63 lines)
    The odds here are actually pretty easy to calculate -- if you define
    in the right way what you mean when you talk about how long you keep
    playing.

    Let us talk about "runs" which can be runs of heads or tails (I prefer
    the coin, since then we don't have to "exclude the greens" which is,
    as has been pointed out, an important element in roulette), and let
    us say that we are betting on "heads".  Everything that I say below
    about "head runs" applies equally to "tail runs" of course.

    For our purposes we don't have a "head run" unless we have at least one
    head, so the probability that a given head run will be *at least* of
    length 1 is 1.  The probability that the next toss will be a head is
    1/2, so the probability that the run will be *at least* of length 2 is
    1/2.  Similarly, the probability that the run will be at least of length 3
    is 1/4.  In general, the probability that a given head run will be of
    length at least "n" is:

	    p(n) = 1/2^(n-1)

    Now --

    Let us assume that we have decided to play for at least 2N *runs*.  The
    runs will necessarily alternate between heads and tails, so exactly N
    of the runs will be head runs.

    What is the probability that there will be at least one head run which
    is of length n or greater?  Generally this kind of question is
    controlled by something called the Binomial distribution, but in this
    case we can get the answer without explicit reference to that
    distribution.

    The probability that there will be at least one head run out of the N
    head runs which is at least of length n, is 1 minus the probability
    that none of the N head runs will be at least of length n.

    Let q(n) = 1-p(n) (p(n) defined as above).  The probability that the
    first head run is not of length n or greater is q(n), the probability
    that the first *and* second head run is not of length n or greater is
    q(n)*q(n), and in general the probability that none of the N head runs
    is of length n or greater is:

	    q(n)^N

    So the probability that out of N head runs (2N total runs) there is
    at least one of length n or greater is:

	    1 - (1 - 1/2^(n-1))^N

    By the way, the average length for each of those 2N runs will be about
    2 tosses, so the average length for a series of 2N runs will be about
    4N tosses.  A *very rough* estimate, if you are interested in the
    probability of a head run of length at least n when you make T tosses
    (or T spins), is to replace N in the above formula by T/4 (a series of
    T tosses contains about T/2 runs, about half of them head runs) -- but
    I emphasize that this is only a very rough estimate.
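
    Plugging the base note's 8-in-a-row question into that formula, with the
    rough N = T/4 substitution (a small Python sketch; the toss counts are
    just examples):

def p_run(n, head_runs):
    # probability of at least one head run of length >= n among head_runs head runs
    return 1 - (1 - 0.5 ** (n - 1)) ** head_runs

for tosses in (256, 1024, 8192):
    print(f"{tosses:5d} tosses: P(head run of >= 8) ~ {p_run(8, tosses // 4):.3f}")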

                                     Topher
1784.12  VMSDEV::HALLYB "Fish have no concept of fire"  Thu Sep 02 1993 13:15  (12 lines)
>    well, coin toss, is a bit different than blackjack.  that is a
>    straightforward, only two options can happen type thing.  Seems like
>    apples and oranges.  
    
    Jeez, THIS from the author of .0 ("I'll use coins to exclude the green").
    
    It's not apples and oranges.  Maybe oranges and tangerines.
    In both cases the "betting strategy" will surely not yield
    "a couple million dollars" unless one started out with something
    that large.
    
      John
1784.13  -< Beat the Dealer is rather out of date. >-  CADSYS::COOPER "Topher Cooper"  Thu Sep 02 1993 14:53  (35 lines)
RE: .8 (edp)

>    In the former case, you need to memorize a medium-sized set of rules
>    about how to play.  It's not too hard.  In the latter case, you need to
>    count cards and remember more rules about making your bet smaller when
>    the deck is bad for you and making it larger when the deck is good for
>    you.

    Learning the rules of card-counting is fairly easy.  Learning to use
    them to "beat" a computer program is just as easy.  Learning to use
    them to beat a US casino is *very* hard.  US casinos consider card
    counting to be cheating (they define playing well as cheating), and
    will throw you out if they suspect you are doing it.  And they know
    what to look for.  Get "caught" a few times and your picture will
    circulate and you will find that you cannot get into any casino.  To
    do effective card counting you must be so good at it that you can seem
    to not be paying attention, chat casually with staff and passers by,
    etc.  You must also not move as fast as mathematical strategy says that
    you should when the deck gets "rich" as sudden spurts of higher betting
    is one of the things they look for (especially if you drop back down
    when they bring in a fresh-shuffled deck).

    I have made some money from black-jack card counting.  But not by
    doing it.  I invested in a group of card-counters.  They put people
    at the table (preferably women -- many of the security people and
    dealers "know" that women can't do anything mathematical) who knew
    basic strategy backwards and forwards, and what to do when the deck
    was rich, but who did *no* counting.  The counting was done by
    "bystanders" who would signal when to shift strategy.  Nobody hung
    around for very long either as a player or as a bystander.  They
    packed their hotel bedrooms with people, by the way, to keep down
    expenses.  The incremental return on investment was small enough that
    the whole thing wouldn't have worked if they weren't frugal.

                                          Topher
1784.14  KDX200::ROBR "Orange squares suck."  Thu Sep 02 1993 18:53  (5 lines)
    
    Actually I wasn't the author of .0.  If you go back and check, I was
    posting a mail message from a friend.  Anyway, maybe it's not apples
    and oranges, just an orange and a much more complex orange :').