
Conference rusure::math

Title:                  Mathematics at DEC
Moderator:              RUSURE::EDP
Created:                Mon Feb 03 1986
Last Modified:          Fri Jun 06 1997
Last Successful Update: Fri Jun 06 1997
Number of topics:       2083
Total number of notes:  14613

680.0. "Say "Red" or "Black"" by VIDEO::OSMAN (Eric, dtn 223-6664, weight 146) Fri Mar 20 1987 17:30

    In a previous note, we discussed the game called "Say Red!".  The
    game is that a shuffled standard playing card deck is slowly
    revealed to you one card at a time.
    
    At any point, you say "Red!".  If the next revealed card is
    a heart or diamond (i.e. a red card), you win.  Oh, you must
    make a call before the very last card.
    
    What is the probability of winning ?  The fascinating answer is
    that even though you see the cards as they are revealed, the
    probability of winning is still only 50%.
    
    Here's a new game now.  Instead of only saying "Red!", you may
    say either "Red!" or "Black!", and of course you must call it BEFORE
    the last card.  (Actually, in the previous game, I don't think this
    rule was important.)
    
    Now what is the probability of winning, if played with best
    strategy ?  And what is that strategy ?
    
    /Eric
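
[An editorial sketch, in the spirit of the program that appears later in .3:
with b black and r red cards unseen, and a call allowed any time at least two
cards remain, the best winning probability for the Red-only game works out to
r/(b+r) in every state -- hence exactly 1/2 for a fresh deck, as claimed above.
A small Pascal program that tabulates it:]

program sayred (output);
{ Best-play value of the "Say Red!" game.                         }
{ f[b,r] = probability of winning with b black and r red unseen, }
{ where a call may be made any time at least two cards remain.   }
var
    f:    array [0..26, 0..26] of real;
    b, r: integer;
    guess, pass: real;
begin
for b := 0 to 26 do
    for r := 0 to 26 do
        if b + r >= 2 then
            begin
            guess := r / (b + r);               { say "Red!" right now }
            if b + r = 2 then
                f[b,r] := guess                 { last chance to call }
            else
                begin                           { or look at one more card }
                pass := 0;
                if b > 0 then pass := pass + (b / (b + r)) * f[b-1,r];
                if r > 0 then pass := pass + (r / (b + r)) * f[b,r-1];
                if guess > pass then f[b,r] := guess
                                else f[b,r] := pass
                end
            end;
writeln ('Say Red!, best play, full deck: ', f[26,26])
end.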
680.1. "77/102" by CLT::GILBERT (eager like a child) Sun Mar 22 1987 19:52
680.2. "you've only half-answered it, Peter." by VIDEO::OSMAN (Eric, dtn 223-6664, weight 146) Tue Mar 24 1987 11:46
    I asked two questions, probability of winning, and best strategy.
    You've given an (unexplained!) answer, but haven't answered the
    second question.
    
    Here's some food for thought:
    
    	If you always wait until exactly two cards are left, you have
    	50% probability that both are the same color, which would allow
    	you to win for sure.  The other 50% of the time you have one
    	of each color left, so you have a 50% chance of winning.
    
    	Hence if your strategy is to wait until two cards are left,
    	you expect to win 75% of the time.
    
    	But now, suppose you modify this strategy slightly.  If during
    	the draw, you ever reach an abundance of 80% of one color unseen,
    	call that color.  Otherwise, wait until there are exactly two
    	cards left.
    
    	It won't happen very often that you'll reach an abundance of
    	80% of one color unseen.  But will the non-zero probability
    	of this happening cause this to be a better strategy ?
    
 /Eric
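
[Another editorial sketch, not part of the original exchange: the 80%-abundance
question can be settled by evaluating that fixed strategy exactly, state by
state, with the same sort of recursion Peter uses in the next reply.  A cutoff
above 1.0 never triggers, so eval(1.01) is the plain wait-until-two-cards
strategy and eval(0.80) is the modified one; comparing the two printed numbers
answers Eric's question.]

program evalstrat (output);
{ Exact winning probability of a fixed strategy for Red-or-Black:  }
{ "call the majority colour as soon as its share of the unseen     }
{  cards reaches 'cutoff'; otherwise wait until two cards remain." }
var
    g: array [0..26, 0..26] of real;

function eval (cutoff: real): real;
var
    b, r, n, hi: integer;
    pass: real;
begin
for n := 2 to 52 do                  { n = total cards still unseen }
    for b := 0 to 26 do
        begin
        r := n - b;
        if (r >= 0) and (r <= 26) then
            begin
            if b > r then hi := b else hi := r;
            if n = 2 then
                g[b,r] := hi / 2                 { forced call, last chance }
            else if hi >= cutoff * n then
                g[b,r] := hi / n                 { call the majority colour now }
            else
                begin                            { pass: look at one more card }
                pass := 0;
                if b > 0 then pass := pass + (b / n) * g[b-1,r];
                if r > 0 then pass := pass + (r / n) * g[b,r-1];
                g[b,r] := pass
                end
            end
        end;
eval := g[26,26]
end;

begin
writeln ('wait until two cards remain : ', eval (1.01));
writeln ('call at 80% abundance       : ', eval (0.80));
end.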
680.3. "pass" by CLT::GILBERT (eager like a child) Tue Mar 24 1987 12:53
If you always wait until two cards remain, there are 4 possibilities:

	state	probability   expectation
	 BB	1/2 * 25/51	 1
	 RR	1/2 * 25/51	 1
	 BR	1/2 * 26/51	1/2
	 RB	1/2 * 26/51	1/2

This gives an overall expectation of 25/51 + 1/2 * 26/51 = 38/51.

In fact, this gives the best strategy, as the following program shows.


But wait! you may say.  What if there is just one black card, and 26
red cards -- isn't it better to say 'red'?  No.

Let e(b,r) be the expectation when there are 'b' black cards and 'r'
red cards.  Now
	e(1,1) = 1/2, and
	e(1,2) = max (2/3, 1/3*1 + 2/3*e(1,1)) = max (2/3, 2/3) = 2/3, and
	e(1,3) = max (3/4, 1/4*1 + 3/4*e(1,2)) = max (3/4, 3/4) = 3/4, and
	...
	e(1,n) = max (n/(n+1), 1/(n+1)*1 + n/(n+1)*e(1,n-1)) = n/(n+1).

Furthermore, we can easily show that passing is always at least as good
as guessing.  We know that e(m,n) = e(n,m), and e(n,n) >= 1/2.  Consider
e(b,r) with 1 < b < r.  We have:

	e(b,r) >= r/(b+r)	(guessing the more plentiful color achieves this)
	e(b,r) = max (r/(b+r), b/(b+r)*e(b-1,r) + r/(b+r)*e(b,r-1))
But, applying the same bound to the two smaller states,
	b/(b+r)*e(b-1,r) + r/(b+r)*e(b,r-1)
	    >= b/(b+r) * r/(b+r-1) + r/(b+r) * (r-1)/(b+r-1)
	    =  (b*r + r*r - r)/((b+r)*(b+r-1)) = r/(b+r)

Thus, 
	b/(b+r)*e(b-1,r) + r/(b+r)*e(b,r-1) >= r/(b+r),

and so passing is always at least as good as guessing.

program z (input, output);
{ }
{	The Red/Black game.
{ }
type
	real = double;	{ VAX Pascal extension: use double precision for 'real' }
var
	exp:	array[-1..52,-1..52] of real;	{ exp[b,r] = best expectation, b black and r red unseen }

function rd (a,b: integer): real;	{ real division }
var x,y: real;
begin
x := a; y := b; rd := x/y;
end;

{ max is not predeclared in standard Pascal, so supply integer and real versions }

function imax (a,b: integer): integer;
begin
if a > b then imax := a else imax := b;
end;

function rmax (x,y: real): real;
begin
if x > y then rmax := x else rmax := y;
end;


procedure main;
var
    i,b,r:	integer;
    exp1,exp2:	real;
begin

exp[2,0] := 1;		{ base cases: the forced call with exactly two cards left }
exp[1,1] := 0.5;
exp[0,2] := 1;

for i := 0 to 52 do exp[-1,i] := 0;	{ just define these, as a convenience }
for i := 0 to 52 do exp[i,-1] := 0;	{ since they're always multiplied by 0 }

for i := 3 to 52 do	{ total number of cards remaining }
    begin
    for r := 0 to i do
	begin		{ calculate exp[b,r], where b = i-r }
	{ }
	{ There are three choices --
	{	we say 'red',
	{	we say 'black', or
	{	we pass.
	{ }
	b := i-r;
	exp1 := rd( imax(b,r), i );	{ expectation, if we say red or black }
	exp2 := rd(b,i) * exp[b-1,r] + rd(r,i) * exp[b,r-1];	{ if we pass }
	exp[b,r] := rmax (exp1,exp2);
	if exp1 > exp2 then
	    writeln (b:3, r:3, ' -- say B/R for ', exp1, '(vs ', exp2, ')');
	end;
    end;
writeln ('exp[26,26] = ', exp[26,26]);
end;

begin
main;
end.
680.4. by BEING::POSTPISCHIL (Always mount a scratch monkey.) Tue Mar 24 1987 13:00
    Re .*:
    
    Eric, why don't you submit this to sci.math?
    
    
    				-- edp
680.5. "Recap...?" by CHOVAX::YOUNG (Back from the Shadows Again,) Wed Mar 25 1987 01:40
    Peter:
    
    I didn't read through all the math in .3 (sorry) but are you essentially
    saying that if you have, say, many reds and few blacks that you are
    better off passing than guessing because by the time you get down
    to 2 cards, they are likely to be 2 reds?
    
    
    --  Barry
680.6. "isn't it 75% ?" by VIDEO::OSMAN (Eric, dtn 223-6664, weight 146) Wed Mar 25 1987 11:02
I haven't even been following sci.math.  What's the latest method of
subscribing ?

Peter, given that waiting until the last two cards is the best
strategy, I don't see why the answer isn't 75% exactly.

	RR or BB has 50% probability, BR or RB has 50% probability.

	So, half the time you'll win for sure, and the other half
	you'll win half the time, so (1 + 1/2)/2 = 3/4.

/Eric
680.7. ".7 < .75" by VINO::JMUNZER Wed Mar 25 1987 13:18
Eric:
                                     (r+b)
Given r reds and b blacks, there are (   ) ways to reduce to 2 cards.
                                     ( 2 )

(r)   (b)                      (r)   (b)   (r)   (b)
( ) * ( ) reduce to 1 of each; ( ) * ( ) + ( ) * ( ) reduce to one suit.
(1)   (1)                      (2)   (0)   (0)   (2)

E.g. with a six-card deck, it's probability 3/5 that you get to 1 of each,
and therefore probability 3/10 that you lose the game.

John
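
[Working John's formula through for the full 52-card deck, r = b = 26: there
are 52*51/2 = 1326 ways to pick the last two cards, of which 26*26 = 676 leave
one of each, i.e. probability 676/1326 = 26/51 of being reduced to one of each
color; so waiting until two cards are left wins with probability
1 - 1/2 * 26/51 = 38/51, about .745, the same figure as in .3.]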
680.8. "Another view" by MODEL::YARBROUGH Wed Mar 25 1987 15:43
>Peter, given that waiting until the last two cards is the best
>strategy, I don't see why the answer isn't 75% exactly.
>
>	RR or BB has 50% probability, BR or RB has 50% probability.

Another way of looking at it: the number of color orderings that get down
to 2 like cards is 2 * 50!/(26!*24!), vs. 2 * 50!/(25!*25!) that get down to
two unlike.  The ratio of these is 25/26, so the cases are not equally
probable.  Just close.
680.9. "25!*26! -- Yeesh!" by CLT::GILBERT (eager like a child) Wed Mar 25 1987 20:17
    I looked at it from the perspective of *choosing* the last two cards.
    That's why RR has probability 26/52 * 25/51, which is less than 1/4.
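
[Spelled out: P(RR) = 26/52 * 25/51 = 25/102, and likewise for BB, so the last
two cards are alike with probability 25/51 and unlike with probability 26/51
-- the 25-to-26 ratio of .8 -- and the wait-until-two strategy again wins
25/51 + 1/2 * 26/51 = 38/51 of the time.]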