Pirate Democracy [Solution]
This is the answer to the puzzle.
The simple answer is that you can take 951 coins and give 1 coin each to the 49 least fierce pirates.
Here's why:
If there are only two pirates, the fiercer of the two needs only his own vote, so he'll take all thousand coins and leave the least fierce pirate with none.
If there are three pirates, the least fierce pirate will always vote in favour of (thus securing victory for) any offer that gives him one or more coins, because if he rejects it we get the two-pirate situation and he gets nothing.
If there are four pirates, the least fierce pirate will vote in favour of anything that gives him one coin, because he gains nothing by voting against and doesn't care whether it's pirate four or pirate three who gets the other 999 coins.
Likewise, if there are five or six pirates, pirates one and two will vote in favour of getting one coin each with the most fierce pirate getting 998. Pirate one gains nothing by rejecting, and pirate two knows that if he rejects it the four-pirate scenario will arise and he'll get nothing.
This continues up to the point where the 49 least-fierce pirates vote in favour of getting one coin each because they can't gain anything by rejecting it.
Wait! I've had further thoughts!
You can keep all 1000, and have all but one of the other pirates vote in your favour.
If everybody rejects your idea, you get thrown overboard, and then when the next guy proposes he keeps everything, he gets thrown overboard too. This goes all the way down to the second-least-fierce pirate, who only needs his own vote to keep the loot.
So the second-to-least-fierce pirate is the only one who gains anything from no-votes. The others weren't going to get anything anyway, so they might as well just vote in your favour to avoid the risks involved in attempting to throw a fierce pirate overboard.
 Gelsamel
 Lame and emo
 Posts: 8237
 Joined: Thu Oct 05, 2006 10:49 am UTC
 Location: Melbourne, Victoria, Australia
Teaspoon wrote: Wait! I've had further thoughts!
You can keep all 1000, and have all but one of the other pirates vote in your favour.
If everybody rejects your idea, you get thrown overboard, and then when the next guy proposes he keeps everything, he gets thrown overboard too. This goes all the way down to the second-least-fierce pirate, who only needs his own vote to keep the loot.
So the second-to-least-fierce pirate is the only one who gains anything from no-votes. The others weren't going to get anything anyway, so they might as well just vote in your favour to avoid the risks involved in attempting to throw a fierce pirate overboard.
The correct answer is sort of like this one. You still have to keep people voting in your favour; they're not going to vote for you just because they'd get nothing anyway.
Try working it backwards from 2 pirates, then add another.
(Also, the way the person phrased the question is really bad, especially with the "feirce" thing. Just assume that while the 1st pirate is really fierce, the 100th pirate isn't a pussy, just less fierce. Also note they value their life and they want to get the most money they can out of it.)
Okay, I'm going to number the pirates in descending order of fierceness, so I'm number 1 and the least fierce is number 100.
If it gets down to just two pirates, 99 will vote for himself to get all of it and 100 will get nothing. If there are three (or four) pirates, 98 (or 97) can vote that he takes all and 100 will vote in favour because he gains nothing by voting against. In the five-pirate case, 99 will realise that if he votes against 96 taking all, 97 will take all and he'll still get nothing, so he might as well vote in favour because he can't get anything anyway. This works all the way up to the one-hundred-pirate case, where enough pirates will realise that they can't gain by voting no that pirate 1 can take all the loot.
I'm sure it makes sense in my head. Less sure that what I've written is working.
The following explains the first step of solving it. It sort of gives away the whole thing, really, because it's just this first step that is hard.
If you have 2 people left, the situation is split like this
(1 is a yes vote, 0 is a no vote):
Person: 99    100
Coins:  1000  0
Votes:  1     0
And with 50% of the votes, number 99 wins.
However, at 3 people:
Person: 98    99    100
Coins:  1000  0     0
Votes:  1     0     0
99 and 100 both won't vote for it, since A) 99 could get more if 98 dies, and B) 100 can swing both ways, but this has to be a guaranteed victory.
Person: 98    99    100
Coins:  999   1     0
Votes:  1     0     0
This way 99 will not vote for it, because he could get more if 98 dies.
Person: 98    99    100
Coins:  999   0     1
Votes:  1     0     1
This way wins, because 99 is going to vote no no matter what really, since he could always get more if everyone up to 98 dies. And 100 is at least getting a dollar, so he will vote for it.
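These tables can be checked by brute force. Here's a minimal sketch (my own code, not from the thread), assuming, as discussed above, that a voter is only guaranteed to say yes when the offer strictly beats what he'd get once the proposer drowns, and that 50% of the votes wins:

```python
# Brute-force the 3-pirate case: find the split (98, 99, 100) that maximises
# 98's take while still being guaranteed to pass. If 98 drowns, 99 keeps all
# 1000 and 100 gets nothing, so 99 is unbuyable and 100 accepts a single coin.
def best_offer_for_98(coins=1000):
    best = None
    for b in range(coins + 1):          # 99's share
        for c in range(coins + 1 - b):  # 100's share
            a = coins - b - c           # 98 keeps the rest
            yes = 1                     # 98 votes for his own plan
            yes += b > coins            # 99: never satisfiable
            yes += c > 0                # 100: anything beats nothing
            if 2 * yes >= 3 and (best is None or a > best[0]):
                best = (a, b, c)
    return best

print(best_offer_for_98())  # (999, 0, 1): the 999/0/1 split from the table
```

The search confirms that 999/0/1 is the unique take-maximising offer that's certain to pass.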
If a pirate will gain nothing voting either way, how can you say he will then vote yes?
I'd argue that it would have to be a sort of ladder situation (100 gets 50 coins, 99 gets 49, 98 gets 48, etc.) until you get to 50, who gets nothing. Then you keep the rest. Although it seems counterintuitive that the least fierce should get more than the others, the last one is also the last one who would get thrown off.
I don't really have a proof for this; it's just an idea.

I think there may be more to this puzzle than meets the eye. If pirates #99 and #100 collude they can each get more than $1, which would then be very rational. Consider the case where they both vote no to the first, say, 10 proposals. In such a situation, pirate #11 would rationally offer them more than $1 each, because he has no way of telling what they would do (given their previous seemingly irrational behavior) and he does not want to die. In fact, pirate #3 would actually be the first to come to this conclusion. It's a bluffing game now, but I am convinced both pirates #99 and #100 have an expected value higher than $1 if they collude. Since they can both figure that out independently, they will rationally collude.
At this point pirate #2 might accept pirate #1's offer, even if it's $0, just to avoid dying. Here I assume dying is a payoff significantly worse than $0.
I presented this riddle a couple of months ago (in slightly different form) on my blag. The solution is given in excruciating detail in the comments, along with some discussion about the assumptions made.
Our perfectly logical pirates will vote yes when they reach a situation where they don't gain either way, because they recognise the need to maintain a large population of pirates for future raids, and no-votes are more likely to reduce the population.
They will, however, work on increasing their fierceness between raids in an attempt to be the pirate who says "Sod you all, I'm keeping the lot!"
I don't believe collusion would be permitted. If pirates 99 and 100 agree to vote no to everything and then split it evenly in the end, when it finally gets to 99's turn to suggest, his main motivation of maximising personal gain will override the agreement with 100 so he'll decide to give himself everything. 100 will know this, and won't agree to collude.
Now that I think of it, pirates 98 and 100 might be able to agree to exclude 99 and share, because 98 needs 100's support for his suggestion and he'll only get it by giving him more than he'd get if 99 gets to suggest. This demands a rethink. My original solution was closer than the "take it all" idea.

The collusion idea might work, but it's more complicated than you indicate. If 1 and 2 both vote against the first proposal, the second proposal will simply offer 1 coin to pirates 3-51. Though maybe not. The question is now whether it's logical for those pirates to accept, knowing that if they don't, eventually one of the other pirates is going to have to up the offer. In fact, if pirates 1-49 reject the first offer, pirate 99 is going to want to up the offer (since presumably pirate 49 will still act the same, and 99 can't get a majority without at least one of the first 49). Pirate 99 will also still need to offer to 49 pirates, so technically nothing is lost by rejecting the first offer. Pirate 98 only needs 48 pirates in his camp, so arguably pirate 49 should be worried and vote for whatever 99 offers. But of course, 99 knows this, and so will offer the same as 100 offered to 49. But even if 49 knows this, he loses nothing by voting against 100.
However, the logic changes a little with 48. 97 will offer 48 only one coin, but 98 will offer him 2, and 99 will offer him 3. Arguably, in fact, 100 will offer him 4, and will offer 49 2. I think the pattern might continue (every pirate n from 1 to 49 gets 2(50-n) coins). There's probably more circular logic to get around this, though. The problem is that the optimal action for each pirate is dependent on the simultaneous actions of all the other pirates, which are themselves dependent on the action of the original pirate. Since there are a finite number of coins, pirates, and voting options, the total number of offers and reactions is finite, but it's a large finite number. There is presumably an optimal course 100 can take that will please all pirates 1-49 (in that none of them can hope for a better subsequent offer that will receive a majority vote), but I'm not certain such a course exists.
 phlip
 Restorer of Worlds
 Posts: 7461
 Joined: Sat Sep 23, 2006 3:56 am UTC
 Location: Australia
 Contact:
With appropriate assumptions, the most logical solution I can think of is:
The fiercest pirate will keep 951 coins for himself, and offer 1 coin to the third fiercest, 1 to the fifth, 1 to the seventh, etc., and 1 to the ninety-ninth.
I'll edit in the proof in a bit, I just have to work on the wording...
[edit]
OK, for the purposes of argument, I make one major assumption: all pirates are greedy, ruthless, untrustworthy beings, who will always act in their own immediate best interest.
The motivation for this assumption is that it makes the puzzle actually solvable.
Consider the 3-pirate case. Call the pirates 1, 2 and 3, from most to least fierce. Consider if 2 and 3 make an alliance such that they accept 1's offer, but throw him overboard anyway and split his share. That is to say, if 1's offer is a gold to 1, b gold to 2 and c gold to 3, both 2 and 3 will vote against it and 1 will be thrown overboard. 2 will then offer b+a/2 gold to 2 and c+a/2 gold to 3, who will vote for it.
In such a situation, 1 has no good strategy. Regardless of what he offers, 2 and 3 can do better by throwing 1 overboard. 1's optimum strategy is to claim no gold for himself, giving it all to 2 and 3, and hope that 2 and 3 are benevolent enough not to throw him overboard anyway.
This is not necessarily 2 and 3's optimum strategy, but it seems fair to assume that whatever their optimum strategy is, it will leave 1 without a good strategy.
This makes for a less interesting logic puzzle (since the puzzle asks what 1's strategy should be), hence the assumption, which changes the puzzle parameters to be one that has an interesting solution.
With the assumption, if 2 and 3 make this alliance and throw 1 overboard, 2 (being the ruthless, untrustworthy pirate that he is) will then backstab 3 and take all the gold for himself (he can do this, it being the 2-pirate case). 3, being perfectly logical, knows 2 will do this, and will never enter the alliance.
The upshot of all this is that a pirate will vote for an offer if it gives them more gold than they would get if they vote against it. If the amount of gold is the same either way, their vote cannot be trusted. Thus, to ensure a pirate's vote, you must simply offer them more gold than they could otherwise receive. If the next pirate is guaranteed to win if they get to make an offer (which you assume as part of the induction), then all you have to do to guarantee a vote is offer more than the next pirate in line would. Guarantee enough votes, and you guarantee a win, which finishes the inductive step.
So, now we start the inductive process to find this strategy.
The 1-person case is a no-brainer. The pirate keeps all 1000 for him/herself.
The 2-person case is also a no-brainer. The fiercer pirate keeps all 1000.
The 3-person case is simple, and described by others above. The fiercest keeps 999, and offers 1 to the weakest. The weakest will then vote for it, knowing that if he/she votes against it, the middle pirate will take all 1000.
In the 4-person case, the fiercest will take 999, and offer 1 to the third-fiercest. The third-fiercest knows that if he votes the offer down, he'll get nothing because he'll become the second-fiercest of the 3-person case, so he'll vote for it.
In the 5-person case, the fiercest needs another two votes to win, and there are two people who will get offered 0 gold if he's thrown overboard and it becomes the 4-person case. So he/she offers 1 gold to each of them, securing their votes.
And so on. In general terms:
In the n-person case, the fiercest will offer 1 gold to each of the third, fifth, seventh, etc., fiercest pirates, who know that if he is thrown overboard it will become the (n-1)-person case, and they'll be the even-numbered pirates and receive nothing.
By mathematical induction, this strategy will always ensure you get all the votes you need, and you will only pay 1 gold for each vote.
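The whole induction can be run mechanically. Below is a sketch (the function name `pirate_split` is my own), under the same assumptions: a pirate votes yes only if the offer strictly beats his payoff in the next-smaller crew, and a proposal passes with at least 50% of the votes.

```python
# Backward induction for the pirate puzzle. Pirates are indexed 0 (fiercest,
# the proposer) to m-1 (weakest) within the current crew.
def pirate_split(crew, coins=1000):
    outcome = [coins]  # one pirate left: he keeps the lot
    for m in range(2, crew + 1):
        to_buy = (m - 1) // 2  # yes-votes needed on top of the proposer's own
        offer = [0] * m
        # If the proposer drowns, pirate i here becomes pirate i-1 of the
        # previous outcome; anyone who'd get 0 there accepts a single coin.
        broke = [i for i in range(1, m) if outcome[i - 1] == 0]
        for i in broke[:to_buy]:
            offer[i] = 1
        offer[0] = coins - sum(offer)  # proposer keeps the rest
        outcome = offer
    return outcome

split = pirate_split(100)
print(split[0])    # 951
print(split[1:6])  # [0, 1, 0, 1, 0]: coins go to the odd-ranked pirates
```

For 100 pirates this reproduces the claim above: the fiercest keeps 951 and the third, fifth, ..., ninety-ninth fiercest get 1 coin each.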
Last edited by phlip on Tue Oct 10, 2006 5:45 am UTC, edited 1 time in total.
Code:
enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};
void ┻━┻︵╰(ಠ_ಠ ⚠) {exit((int)⚠);}
I agree. The magic number is 951 delightful pieces of golden booty.
I'm a visual person, so I made a chart. The y-axis is the pirate making the decision (01 is the most fierce, to keep convention). The chart represents pirate y giving _ goldies to pirate x. I'll use 100 total gold for this experiment cause it takes up less space.
Person's best answer
     14  13  12  11  10  09  08  07  06  05  04  03  02  01
01    0   1   0   1   0   1   0   1   0   1   0   1   0  94
02    1   0   1   0   1   0   1   0   1   0   1   0  94
03    0   1   0   1   0   1   0   1   0   1   0  95
04    1   0   1   0   1   0   1   0   1   0  95
05    0   1   0   1   0   1   0   1   0  96
06    1   0   1   0   1   0   1   0  96
07    0   1   0   1   0   1   0  97
08    1   0   1   0   1   0  97
09    0   1   0   1   0  98
10    1   0   1   0  98
11    0   1   0  99
12    1   0  99
13    0 100
Why do they vote yes for just 1 gold? In short, because they know that if this person gets voted out, the next person won't offer them any.
Hooks to my brain are well in.
I'm going to number the pirates in ascending order of fierceness for a change.
In the 3-pirate scenario, 3 can take $999 and give $1 to pirate 1, because it beats the 0 that 1 would get from the 2-pirate scenario.
The 4-pirate scenario has a couple of outcomes. 4 can only guarantee a yes-vote through 2,0,0,998 or 0,1,0,999 and will choose the latter to maximise his own gain.
In the 5-pirate scenario, 3 knows that he can't get anything if 5 is eliminated, because the 4-pirate scenario only pays pirates 1 or 2. 1 also knows that he is certain to lose out in the 4-pirate scenario, so 5 has a guaranteed yes on 1,0,1,0,998.
The 6-pirate scenario becomes 0,1,0,1,0,998. The pattern is clearly established at this point. If you're even, you guarantee a yes-vote by paying all other even pirates 1 coin each and keeping the rest. The same goes for odds supporting odds. An even pirate who votes no on a plan that would get him one coin will end up with nothing when the odd pirates all vote in favour of the next plan down.
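Those splits can be checked mechanically against the voting rule the thread has settled on (a voter is only guaranteed to say yes if the offer strictly beats his payoff once the proposer is thrown overboard; the proposer votes for himself; 50% of the votes wins). `passes` is my own name for this check:

```python
# Verify the splits above (ascending fierceness; the proposer's share is the
# last entry). fallback is the outcome of the next-smaller crew: what each
# remaining voter would get if the current proposer were thrown overboard.
def passes(offer, fallback):
    # proposer's own vote, plus every voter strictly better off than fallback
    yes = 1 + sum(o > f for o, f in zip(offer, fallback))
    return 2 * yes >= len(offer)

splits = {
    2: [0, 1000],  # 2-pirate case: the fiercer pirate keeps everything
    3: [1, 0, 999],
    4: [0, 1, 0, 999],
    5: [1, 0, 1, 0, 998],
    6: [0, 1, 0, 1, 0, 998],
}
print(all(passes(splits[n], splits[n - 1]) for n in range(3, 7)))  # True
```

By contrast, a greedy 4-pirate offer like 0,0,0,1000 fails the check, since only the proposer votes for it.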
Teaspoon wrote: I'm going to number the pirates in ascending order of fierceness for a change.
In the 3-pirate scenario, 3 can take $999 and give $1 to pirate 1, because it beats the 0 that 1 would get from the 2-pirate scenario.
The 4-pirate scenario has a couple of outcomes. 4 can only guarantee a yes-vote through 2,0,0,998 or 0,1,0,999 and will choose the latter to maximise his own gain.
In the 5-pirate scenario, 3 knows that he can't get anything if 5 is eliminated, because the 4-pirate scenario only pays pirates 1 or 2. 1 also knows that he is certain to lose out in the 4-pirate scenario, so 5 has a guaranteed yes on 1,0,1,0,998.
The 6-pirate scenario becomes 0,1,0,1,0,998. The pattern is clearly established at this point. If you're even, you guarantee a yes-vote by paying all other even pirates 1 coin each and keeping the rest. The same goes for odds supporting odds. An even pirate who votes no on a plan that would get him one coin will end up with nothing when the odd pirates all vote in favour of the next plan down.
>>Ding ding ding!<<
Gelsamel wrote: >>Ding ding ding!<<
I believe phlip had this solution first (though teaspoon had the correct answer first, just with a slightly wrong solution: giving 1 gold to the 49 least fierce pirates). I then made a graph because I have too much free time.
Thus, I demand* a greater or equal amount of dings for phlip and something for myself. Perhaps a warble.
Though you can make the case that all three of us showed that we reached the end using separate development, I live in the USA, land of the patent, so that means squat.
*request graciously, deargodpleasedon'tshunme
TwoBuy wrote: Gelsamel wrote: >>Ding ding ding!<<
I believe phlip had this solution first (though teaspoon had the correct answer first, just with a slightly wrong solution: giving 1 gold to the 49 least fierce pirates). I then made a graph because I have too much free time.
Honestly, I didn't understand your graph; ASCII graphs are really bad. I knew you had the gist of the pattern, but I wasn't really sure.
Thus, I request graciously a greater or equal amount of dings for phlip and something for myself. Perhaps a warble.
Okay okay, phlip gets "Ding Ding Ding" and so do you; you all got it right.
I knew the answer because I've done this brain teaser before.
But after thinking about it for a bit, wouldn't 99 (or 02, depending on the way you label it) vote "no" no matter how much he is given, because he will always get more if everyone dies? And since he is ruthless he will also vote no even if people above him offer him everything, because he just wants them to die.
The problem with collusion strategies is that there's no way to compel other pirates to keep their promises. Suppose you promise to kill somebody if they renege on their deal, and they do so. There are now three possibilities:
A) Killing them would put you in a worse position.
B) Killing them would do you neither good nor harm.
C) Killing them would help you.
In case C, the bargain's unnecessary: you would kill them regardless. And there's no reason for you to keep a promise NOT to kill them if they've already kept or broken their word. In case B, the bargain may have an effect, but I'm doubtful. In case A, your only incentive to keep your promise is that you made it.
This can work if we assume that keeping and breaking promises signals to the other pirates that you are a promise-keeper or a promise-breaker by nature. But that seems to be outside the scope of pirate behavior posited by the problem.
Gelsamel wrote: But after thinking about it for a bit, wouldn't 99 (or 02, depending on the way you label it) vote "no" no matter how much he is given, because he will always get more if everyone dies? And since he is ruthless he will also vote no even if people above him offer him everything, because he just wants them to die.
But you can guarantee that if it gets down to 3 pirates remaining, the third-weakest pirate will secure the weakest's vote, and the second-weakest will get nothing. So the second-weakest will never have the opportunity to take everything, because he doesn't get the casting vote in the 3-pirate case.
That's what the inductive step says... when considering the n-pirate case, you can guarantee that if the (n-1)th-weakest pirate gets to make an offer s/he'll win. So when considering your offer, the pirates will only compare it to the (n-1)th-weakest pirate's offer.
So, basically, n's offer will only win because everyone knows that (n-1)'s offer will win. And (n-1)'s offer will only win because everyone knows that (n-2)'s offer will win, and so on until 3's offer winning because 2's offer will win, which we know it will.
So not only does everyone have to be a perfect logician, they all have to know everyone's a perfect logician, and to know that everyone knows everyone's a perfect logician... and so on to at least, I think, 200 levels of "knowing"...
Basically, you need to be sure that everyone will come to the same conclusions you have, because you're counting on them knowing that. Including the conclusion that everyone came to the same conclusion.
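The backward induction phlip describes can be checked mechanically. Here is a minimal Python sketch (mine, not from the thread), assuming the rules discussed here: a tied vote passes, and an indifferent pirate votes for blood. With 100 pirates and 1000 coins it reproduces the 951-coin answer above.

```python
def pirate_game(n, coins=1000):
    """Backward induction for the pirate game.

    Pirates are indexed 0 (weakest) to n-1 (fiercest, the proposer).
    A proposal passes if at least half the pirates (including the
    proposer) vote yes; an indifferent pirate votes no (blood wins ties).
    Returns the list of shares under the fiercest pirate's proposal.
    """
    # Base case: with 2 pirates the fiercer one needs only his own vote.
    shares = [0, coins]
    for k in range(3, n + 1):
        prev = shares  # outcome if the current proposer is thrown overboard
        to_buy = (k + 1) // 2 - 1  # yes-votes needed besides the proposer's own
        # Buy the cheapest voters: each must be offered one coin more than
        # he'd get in the (k-1)-pirate outcome to strictly prefer yes.
        cheapest = sorted(range(k - 1), key=lambda i: prev[i])[:to_buy]
        shares = [0] * k
        for i in cheapest:
            shares[i] = prev[i] + 1
        shares[k - 1] = coins - sum(shares)  # proposer keeps the rest
    return shares

shares = pirate_game(100)
print(shares[-1], shares.count(1))  # proposer keeps 951; 49 pirates get 1 coin
```

Note that this only tells you what perfect logicians would do; as phlip says, it silently assumes everyone knows that everyone else will run the same induction.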
Last edited by phlip on Wed Oct 11, 2006 4:20 am UTC, edited 1 time in total.
Code: Select all
enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};
void ┻━┻︵╰(ಠ_ಠ ⚠) {exit((int)⚠);}
What does the non-ownership of a ruth have to do with a pirate wanting to kill other pirates?
I've already argued that all pirates would consider throwing one of the pirates overboard to be a negative outcome, as the removal of any pirate (and especially the fiercest one) will reduce the effectiveness of the crew and may cause further loss of piratey life in their next raid. No pirate will make a choice that reduces his survival chance unless it guarantees him money, so any pirate who won't actually increase their takings by voting no will vote yes.
 Gelsamel
 Lame and emo
 Posts: 8237
 Joined: Thu Oct 05, 2006 10:49 am UTC
 Location: Melbourne, Victoria, Australia
Teaspoon wrote:What does the nonownership of a ruth have to do with a pirate wanting to kill other pirates?
I've already argued that all pirates would consider throwing one of the pirates overboard to be a negative outcome, as the removal of any pirate (and especially the fiercest one) will reduce the effectiveness of the crew and may cause further loss of piratey life in their next raid. No pirate will make a choice that reduces his survival chance unless it guarantees him money, so any pirate who won't actually increase their takings by voting no will vote yes.
No, the pirates are bloodthirsty. I think it was commented before that "all things being equal they would vote for blood".
I think when considering options like that you have to think within the analogy; the pirates and coins are just a good way to present the problem, and the fact that they'd need a lot of pirates to pillage better doesn't come into it.
 phlip
 Restorer of Worlds
 Posts: 7461
 Joined: Sat Sep 23, 2006 3:56 am UTC
 Location: Australia
 Contact:
Teaspoon wrote:I've already argued that all pirates would consider throwing one of the pirates overboard to be a negative outcome, as the removal of any pirate (and especially the fiercest one) will reduce the effectiveness of the crew and may cause further loss of piratey life in their next raid. No pirate will make a choice that reduces his survival chance unless it guarantees him money, so any pirate who won't actually increase their takings by voting no will vote yes.
In such a situation, the best solution is to simply keep all 1000 for yourself, and you'll get a vote from everyone except the second in line. The 0 gold you're offering them is equal to the 0 gold that the second in line would offer them.
Code: Select all
enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};
void ┻━┻︵╰(ಠ_ಠ ⚠) {exit((int)⚠);}
phlip wrote:Teaspoon wrote:I've already argued that all pirates would consider throwing one of the pirates overboard to be a negative outcome, as the removal of any pirate (and especially the fiercest one) will reduce the effectiveness of the crew and may cause further loss of piratey life in their next raid. No pirate will make a choice that reduces his survival chance unless it guarantees him money, so any pirate who won't actually increase their takings by voting no will vote yes.
In such a situation, the best solution is to simply keep all 1000 for yourself, and you'll get a vote from everyone except the second in line. The 0 gold you're offering them is equal to the 0 gold that the second in line would offer them.
Indeed. See my second post in this thread.
phlip's induction proof is the same as my own. My argument for why collusion will not work is that they are pirates, and the problem statement implies that they will renege on a deal if it benefits them. Thus if we start with the limiting case of 2 pirates, pirate #100 knows that if it ever gets to this point, he'll get nothing. Similarly, pirate #99 knows he'll get 1000 if it ever gets to this point. So whatever deal they had up until then is worthless at that point. When we extend to three pirates, all three are aware of this fact and act accordingly.
The scenario is set up such that the only agent killed per proposal is the one who proposed it, and they know the exact order of the proposals. Thus every pirate knows that everyone who comes before him and after him will make a proposal thinking only of himself, effectively forestalling any collusion.
Now change any part of the scenario, such as:
* More than one person being killed
* The proposer has a chance of surviving a failed proposal
* Random order of proposals
And I think you'll have a very different problem. Hmmm... I haven't thought this through entirely, but it seems interesting: If a proposal fails, the pirate who had the largest proposed share is shot (if there's a tie, they all are). What happens then?

 Posts: 563
 Joined: Tue Jul 27, 2010 8:48 am UTC
Re: Pirate Democracy [Solution]
I have a more interesting puzzle: After they found the 1000 coins (and decided the distribution), they now decide to demote the captain. The captain's position is decided by his generosity points. He gets +1 for every vote for him, and -1 for every vote against him. He is the lowest-ranking person if he has 0 generosity points, and he gains a rank for every added generosity point. Now suppose that they find another 1000 coins, and the new captain decides the distribution according to the same rules as before. This cycle repeats infinitely many times (fine, the pirates are immortal, at least to the point of not dying of old age), but the value of the coins decreases by a factor of 0.9 every cycle. How can the captain maximize the value of his coins?
I have discovered a truly marvelous proof of this, which this margin is too narrow to contain.