Deal or No Deal?
July 18, 2006 4:16 PM Subscribe
My workplace is doing a little motivational contest based on the TV show Deal or No Deal (stupidest name ever, IMHO). I believe I've developed a strategy for the game that's not really covered anywhere else. Am I off my rocker, or is this valid? More importantly, how else can I improve on the basic strategy?
I'm not desperate to win, as I don't expect the prizes to be so top-notch that I'd absolutely kick myself for accidentally choosing the $1 case over the $1,000,000 case, so my risk tolerance is exceptionally high. Assume it's effectively infinite for the purposes of the math.
I do want to game the system though, as I'm kind of a twink by nature. Here's what I've come up with so far:
The basic strategy is to take the mean of all the cases remaining, and refuse the deal if the offer is less than the mean, since you stand to win more on average than what's being offered. Most of the time, if not all of the time, the banker will offer you less than the mean, so it's nearly always correct to refuse the deal.
Once you have that mean, you can also count how many of the remaining cases hold prizes below the mean and how many hold prizes above it. That gives you the chance that opening another case (i.e., refusing the deal) will lower your mean, and with it the next offer.
However, it seems that the banker also raises the percentage of the mean that they offer as the game goes on, up to a certain point, to offset this kind of play. If this is a valid strategy, how else can I improve it? If it isn't, where did I go wrong?
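In code terms, the rule above looks roughly like this (a minimal sketch in Python; the board and the $31,000 offer are made-up numbers, not from any real version of the game):

```python
def should_refuse(remaining_values, offer):
    """Refuse the deal when the offer is below the mean of the remaining cases."""
    board_mean = sum(remaining_values) / len(remaining_values)
    below = sum(1 for v in remaining_values if v < board_mean)
    above = sum(1 for v in remaining_values if v > board_mean)
    print(f"mean = {board_mean:,.2f}; cases below mean = {below}, above = {above}")
    return offer < board_mean

# Hypothetical late-game board: five cases left, banker offers $31,000.
remaining = [5, 400, 10_000, 75_000, 200_000]
print("No deal" if should_refuse(remaining, 31_000) else "Deal")
```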
Maybe the rules are different in your version, but in the UK version you very rarely play to the end, so what's in your own box hardly matters at all. What you're betting on is whether you expect the next offer to be bigger or smaller, and since the offers are entirely up to the banker, it's much more about psychology than probability.
In short, your strategy sucks.
posted by cillit bang at 4:35 PM on July 18, 2006
Be sure to weight the values based on their relative likelihoods. If you do, you're not really gaming the system so much as playing optimally. However, in a real game the non-linear value of money makes such strategies flawed (e.g., a 100% chance of getting $20k might be better than a 25% chance of getting $100k, in terms of expected increase in quality of life).
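As a sketch of that comparison, here's the same trade-off computed two ways, once with raw expected value and once with a square-root utility standing in for diminishing returns (the utility function is an assumption for illustration, not anyone's actual preferences):

```python
import math

def expected_value(outcomes):                 # outcomes: list of (probability, dollars)
    return sum(p * x for p, x in outcomes)

def expected_utility(outcomes, u=math.sqrt):  # u: utility function, sqrt by assumption
    return sum(p * u(x) for p, x in outcomes)

sure_thing = [(1.0, 20_000)]                  # 100% chance of $20k
long_shot  = [(0.25, 100_000), (0.75, 0)]     # 25% chance of $100k, else nothing

print(expected_value(sure_thing), expected_value(long_shot))      # 20000.0 25000.0
print(expected_utility(sure_thing), expected_utility(long_shot))  # ~141.4   ~79.1
```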
posted by aubilenon at 4:40 PM on July 18, 2006
There's another thing to consider, namely risk aversion.
Also, remember that different amounts of money mean different things to different people. A prize of $200 might be twice as valuable as a prize of $100. Both of these amounts of money are commonplace; we work with this much several times a week. But a prize of $1 million wouldn't be twice as valuable as a prize of $0.5 million. Those are both (to your average Joe) life-changing amounts of money.
posted by CrunchyFrog at 4:46 PM on July 18, 2006
Response by poster: cillit bang, I'm not even entirely sure of how our version plays, especially in terms of how the banker generates the offer. I'm just interested in playing as mathematically optimally as possible, as Robot Johnny caught.
aubilenon, as far as I know, the "relative likelihoods" of all values are equal, as each value has an equal chance of being in each case. In my own particular case, there is no real difference between the $1 prize and the $1,000,000 prize (though there likely is, I just personally don't care), so my expected quality of life increase is basically nil, regardless of prize.
posted by Imperfect at 4:52 PM on July 18, 2006
There are a couple of issues with your strategy...
The basic strategy is to take the mean of all the cases remaining, and refuse the deal if the offer is less than the mean
Wouldn't this just make it a 50-50 chance of coming out ahead of the banker offer? Unless the banker offer is very, very low ... but even then, your odds wouldn't increase that much. And what if the banker offer IS the mean?
Most of the time, if not all of the time, the banker will offer you ...
Better make sure the banker is consistently playing by these rules. You could end up screwing yourself.
However, it seems that the banker also raises the percentage of the mean that they offer as the game goes on, up to a certain point, to offset this kind of play.
Finally, remember that the television show IS NOT designed to be gambling, in the legal sense. The "banker" is really the show's producers, who have perfect knowledge of all the dollar values before the game even starts. The producers' job is not to play the game back at you. The producers' job is to create drama. So the things you see on the televised show are not based on any fixed algorithm, or even very solid strategy.
posted by frogan at 4:52 PM on July 18, 2006
Response by poster: CrunchyFrog, I'm currently not risk-averse at all. I suppose I should have stated that this strategy is to be applied in my current game only, not in the real version, and that my risk tolerance is effectively infinite.
posted by Imperfect at 4:53 PM on July 18, 2006
you stand to win more on average than what's being offered
I don't understand how this "averaging" works. How do you "average" the odds of suddenly choosing, say, 50K, 100K and 200K in a single round? It seems like you're implying that whatever's on the board has some kind of causal effect on what you're likely to pick next, that it'll all even out in the end. But it won't - you can go from everything to nothing in the blink of an eye, and you don't have an unlimited amount of plays (like, say, a roulette wheel) to wait for the averaging to kick in. But probability stuff is hard, and I've always been crap at it.
posted by obiwanwasabi at 4:55 PM on July 18, 2006
cillit bang, I'm not even entirely sure of how our version plays, especially in terms of how the banker generates the offer. I'm just interested in playing as mathematically optimally as possible, as Robot Johnny caught.
But I'm saying that's completely wrongheaded. The only factor in deciding whether to deal or not is whether you think the next offer will be higher or lower. Since the next offer is more-or-less picked out of thin air, the system can't be modeled mathematically.
posted by cillit bang at 4:59 PM on July 18, 2006
If you care about the prizes, but only barely (which appears to be the case), then as long as there is something on the board you value even slightly more than what the banker is offering, you should refuse the offer.
If you don't care about the prizes at all, then just take the first offer so you can quit playing the stupid game.
posted by willnot at 5:03 PM on July 18, 2006
Since the next offer is more-or-less picked out of thin air, the system can't be modeled mathematically.
That sounds like a challenge to me!
Okay. The term we're dancing around here is Expected Utility (more or less). Essentially what you're doing, when you choose whether or not to continue, is comparing the expected utility of playing on (weighted by how risk-averse you are or are not) to the offer. If you'd rather have $99K for sure than gamble on whatever your next offer might be, go home. This is where the mathematical modeling comes in.
Theoretically, you could track every offer and the mean of the remaining cases at every step of the game. At some point, you're going to have an average offer, expressed as a percentage of the mean of all the cases remaining. Unless the standard deviation is enormous (which I'm pretty sure it's not, based on the episodes I've seen), a good player should be able to predict with a considerable amount of certainty approximately what the next offer will be.
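As a rough sketch of what that tracking might look like in practice, here's one way to turn a history of offer-to-mean ratios into a guess at the next offer; the ratios and the board are invented for illustration, not measured from any real episode:

```python
from statistics import mean, stdev

# Invented history of (offer / mean-of-remaining-cases) ratios, one per round.
observed_ratios = [0.12, 0.25, 0.38, 0.55, 0.71]

def predict_offer(remaining_values, ratio):
    """Guess the next offer as a fraction of the mean of the remaining cases."""
    return ratio * mean(remaining_values)

remaining = [100, 1_000, 50_000, 300_000]     # hypothetical board
next_ratio = observed_ratios[-1]              # assume the latest trend holds
print(f"predicted offer: about ${predict_offer(remaining, next_ratio):,.0f}")
print(f"spread of past ratios (std dev): {stdev(observed_ratios):.2f}")
```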
So. We come to the point at which you've got to play your predictive ideas vs. probability. At this point, it all depends on what's left on the board. If you've got 4 high values and one low value, your next offer is probably going to be lower. If you've got a lot of low values on the board, your offer will probably increase.
So, there's no "trick" that's going to win you the million bucks. There are, however, smart decisions.
As for your comment, CB, about very few games being played to the end: of course! There are only a few situations in which anybody (that wasn't a total idiot) would choose to play to the end. You've really got to have two high-value cases, such that either one is above the player's limit of "a crapload of money" (that's a technical term). There might be one or two other situations in which playing until the end might be advantageous, but I can't think of them right now.
As a note, this article might be of interest to people who are big enough nerds to comment in this thread...
posted by god hates math at 5:27 PM on July 18, 2006
cillit bang, I'm not even entirely sure of how our version plays, especially in terms of how the banker generates the offer. I'm just interested in playing as mathematically optimally as possible, as Robot Johnny caught.
Whether this strategy has any merit depends on what 'rules' they'll be playing under.
In the UK version, for example, the banker does use psychology with offers. In similar offer situations, the amount can vary wildly based on the "banker's" perception of the player and how likely they are to take a certain amount.
posted by wackybrit at 5:33 PM on July 18, 2006
I haven't seen the show on TV, but just from your description I think that at the very least, you should base your decisions on the median of the remaining values rather than the mean. Say, for example, there are 101 boxes total. 100 of those contain $1, but one contains $100,000. That means it's very likely you're going to pick a $1 box, no matter what the average remaining value is. In fact, the median value of the boxes is $1, while the mean box value is about $991. If you get offered even as low as $100 in a situation like that, take it.
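For anyone who wants to check that arithmetic, a quick sketch using the hypothetical 101-box board from the example above:

```python
from statistics import mean, median

# LionIndex's hypothetical board: 101 boxes, 100 holding $1 and one holding $100,000.
boxes = [1] * 100 + [100_000]

print(f"mean:   ${mean(boxes):,.2f}")    # about $991 -- the expected value looks tempting...
print(f"median: ${median(boxes):,.2f}")  # $1.00      -- ...but the typical box holds a dollar
```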
posted by LionIndex at 5:40 PM on July 18, 2006
the wikipedia entry on deal or no deal actually discusses a lot of the issues mentioned here.
posted by juv3nal at 5:47 PM on July 18, 2006
i apologize for the derail but what do you mean when you use the word twink? the only definitions i know make no sense in this context.
posted by phil at 5:55 PM on July 18, 2006
Whether this strategy has any merit depends on what 'rules' they'll be playing under.
Exactly. We have no idea what game we're talking about here. I'm assuming your employer is not actually offering a potential million dollars, so the probabilities and risks here are completely different from those in the game. How do you know who wins, and what does the winner get? With no advertising to sell, what are the goals of the banker in your game?
posted by scottreynen at 6:26 PM on July 18, 2006
Thank God for God Hates Math. This is an expected value question, which is how limit poker should be played (and it plays a big part in no-limit too). The way you will succeed is by playing wisely, which will guarantee returns over the broader sample of everyone who plays wisely, but won't necessarily do you any good in your individual case. Even if the odds are 99:1, you'll still pick wrong one time out of 100.
posted by klangklangston at 6:59 PM on July 18, 2006
I have the same twink question. What in the world do you mean by that here?
posted by jdroth at 7:10 PM on July 18, 2006
i apologize for the derail but what do you mean when you use the word twink?
Seconded.
posted by Civil_Disobedient at 7:40 PM on July 18, 2006
So Imperfect used his Troll Shaman main to help his Tauren Druid get better at office game shows?
posted by jdroth at 10:19 PM on July 18, 2006
Unless the standard deviation is enormous, a good player should be able to predict with a considerable amount of certainty approximately what the next offer will be.
Yes and no. You could probably come up with a reasonable estimate given which boxes are on the table when the offer is made, but the player has to make their decision before the next 3 boxes are opened, and so before that is known. The randomness of what the next boxes will be, multiplied by the randomness of the banker's offers, makes any strategy basically useless. It's like having a strategy for scissors paper stone.
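To put a rough number on how much those three random openings can move the board around, here's a small Monte Carlo sketch; the board is hypothetical and the banker's behavior isn't modeled at all:

```python
import random
from statistics import mean

# Hypothetical eight-case board; three cases get opened before the next offer.
board = [1, 10, 100, 1_000, 10_000, 50_000, 100_000, 250_000]

def means_after_opening(board, n_open=3, trials=20_000):
    """Open n_open random cases per trial and record the mean of what's left."""
    results = []
    for _ in range(trials):
        left = random.sample(board, len(board) - n_open)  # cases still unopened
        results.append(mean(left))
    return results

outcomes = means_after_opening(board)
print(f"mean right now:          {mean(board):,.0f}")
print(f"after 3 random openings: from {min(outcomes):,.0f} to {max(outcomes):,.0f} "
      f"(average {mean(outcomes):,.0f})")
```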
posted by cillit bang at 5:17 AM on July 19, 2006
I cannot believe I missed this question until now. We've discussed the show before, specifically with a view to figuring out how the banker comes up with an offer. A site linked in that thread has a commenter explain that he gets pretty close to the actual offer by discounting the expected value of the remaining cases by a certain percentage each round. The first-round offer would be something like 11% of the expected value, the second-round offer something like 22%. I had suspected some method like this was being used, and I found that in the shows I watched after that, I was able to predict offers pretty well.
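A toy version of that discounting scheme might look like the following; the fractions just extend the 11%/22% pattern linearly, which is a guess, and the board is made up, so treat it as an illustration rather than a model of the real banker:

```python
from statistics import mean

# Guessed round-by-round fractions, extending the 11%/22% pattern linearly.
round_fraction = [0.11, 0.22, 0.33, 0.44, 0.55, 0.66, 0.77, 0.88, 1.00]

def banker_offer(remaining_values, round_number):
    """Toy banker: offer a round-dependent fraction of the mean of the remaining cases."""
    frac = round_fraction[min(round_number - 1, len(round_fraction) - 1)]
    return frac * mean(remaining_values)

remaining = [0.01, 500, 25_000, 400_000, 1_000_000]   # hypothetical mid-game board
print(f"round 4 offer: about ${banker_offer(remaining, 4):,.0f}")
```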
But to answer the question, the way to "win" is to take the offer that is equal to or better than the expected value of the remaining cases. I have seen this happen at least twice on the show, and I believe both times the deal was refused, with the contestants eventually taking home much less money.
Because of the randomness of the game, you will often end up with less than previous offers, but it seems that the optimal strategy is to go farther in order to increase the percentage of expected value you will be offered.
Me, I'd probably use some combination of this strategy and the "would I call this a crapload of money" question.
posted by lackutrol at 12:16 PM on July 19, 2006
...basically useless. It's like having a strategy for scissors paper stone.
Actually, there are strategies that work for rock paper scissors (i.e., they give you a lower proportion of losses than pure chance would) if you're talking about a large number of games against one opponent. It's an exceedingly rare human opponent that will pick completely randomly... unless maybe they're explicitly aware you're trying to exploit that.
posted by juv3nal at 3:32 PM on July 19, 2006
I always check to see how many cases are left, how much money is in the ones left, and what chance I have of knocking a high amount off... I've watched the show for a while and am generally pretty good at picking the best time to deal...
I won $500 on the show a few months ago! I correctly guessed the amount in my case.
Oh, and on another slightly off-topic note: I saw an episode (here in Melbourne) where the contestant knocked out the $200,000, then the $100,000, then the $75,000, then the $50,000 in the first four cases, in that order - so the four highest cases were gone straight away! Damn!
posted by jonathanstrange at 11:30 PM on July 30, 2006