Lowest price auctions PUZZLE

A forum for good logic/math puzzles.

Moderators: jestingrabbit, Moderators General, Prelates

cjr
Posts: 15
Joined: Wed Jan 10, 2007 6:29 pm UTC

Lowest price auctions PUZZLE

Postby cjr » Sat Mar 17, 2007 2:28 am UTC

This is a question that has bugged the hell out of me (I don't know the answer but discussed it often at the pub). I'm going to rephrase it in slightly mathematical terms, but there are lots of auctions out there that use this system. If someone knows an answer from a journal or paper on auction theory, then please don't be afraid to just post it. Otherwise, get thinking!

There are m players in a game. Each player chooses a non-negative integer. The person with the lowest unique integer wins.

Example: Three players in game - A chooses 3, B chooses 3, C chooses 4. 3 has been chosen twice, so it is invalid, and C wins with 4.

The question: What value n should I pick to maximise my chance of winning the game?

The difficulty, I believe, stems from the fact that a specific optimal integer is optimal for me, therefore is optimal for everybody, therefore (being non-unique) is optimal for no one. But don't give up hope! Can you find useful bounds on the numbers? Can you picture the distribution of guesses? Are we doomed to saying that the optimal outcome is a draw? I don't accept this last proposition, as I ask for the number which most increases my chance of winning.
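For concreteness, the rule can be pinned down in a few lines of Python (a sketch of my own; it treats a board with no unique number at all as a draw, which the problem statement leaves implicit):

```python
from collections import Counter

def winner(picks):
    """Index of the player holding the lowest unique pick, or None on a draw."""
    counts = Counter(picks)
    unique = [p for p in picks if counts[p] == 1]
    if not unique:
        return None  # every number was duplicated: nobody wins
    return picks.index(min(unique))

# The example above: A=3, B=3, C=4, so C (index 2) wins.
print(winner([3, 3, 4]))  # -> 2
```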

Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Sat Mar 17, 2007 3:19 am UTC

Aim for Nash equilibria. If all but one person plays your strategy, you want the person who doesn't play along to be worse off for deviating.

If there are two people, your best choice is 1, and the result is a draw. If either changes to 2, they are screwed.

Three people is a harder problem...

With three people, if two of them go for 50/50 1 and 2, the third who goes 3 has a 50% chance of winning. That's no good.

Suppose the first two players mix over 1/2/3 with probabilities A/B/C. Player #3 can go pure 1.

A'^2 is #3's chance of winning. So A'^2 must be less than or equal to the chance that either of the first two players wins.

The chance that player 2 wins is A.

So A'^2 < A
A' is (1 - A), so 1 - 2A + A^2 < A
1 - 3A + A^2 < 0

This quadratic crosses zero at A = (3 ± sqrt(5))/2 and is convex up.

So A from .382 to 1 prevents the "go 1" strategy from being optimal.

Player #3 going B (pure 2) wins if players 1 and 2 both go A or both go C, and ties if both go B.

To make sure that player 1 or 2 has a higher chance of winning than someone going pure 2, we need:
A^2 + C^2 <= A - A^2
2 A^2 - A + C^2 <= 0

This is also convex up in A.

A in [1 ± sqrt(1 - 8C^2)] / 4, which places a limit on the relationship between A and C.
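As a sanity check (mine, not from the thread), the two win-chance expressions above can be verified by brute enumeration, with (A, B, C) an arbitrary test mix:

```python
from itertools import product

def pure_win_prob(n, probs):
    """Exact win chance of a pure-n player against two independent players
    who each pick 1, 2 or 3 with the given probabilities."""
    total = 0.0
    for (i, pi), (j, pj) in product(enumerate(probs, 1), repeat=2):
        picks = [i, j, n]
        unique = [p for p in picks if picks.count(p) == 1]
        if unique and min(unique) == n:  # n is the lowest unique number
            total += pi * pj
    return total

A, B, C = 0.5, 0.3, 0.2  # arbitrary mix, just for checking the algebra
assert abs(pure_win_prob(1, (A, B, C)) - (1 - A) ** 2) < 1e-12
assert abs(pure_win_prob(2, (A, B, C)) - (A ** 2 + C ** 2)) < 1e-12
```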

...

Heh, this is harder than I'd thought. Maybe Nash's Nobel Prize involved actually producing techniques to solve these problems! :)

Strilanc
Posts: 646
Joined: Fri Dec 08, 2006 7:18 am UTC

Postby Strilanc » Sat Mar 17, 2007 3:26 am UTC

All players are equal in this game, so the best you can hope for is a 1/n chance of winning (assuming the game is replayed when everyone ties), where n is the number of players.

I'm pretty sure picking a random integer between 1 and n guarantees this. That way anyone who picks higher only wins if you match one of the other players (better than 1/n odds). Anyone who plans to pick lower is more likely to match other low-pickers, canceling their advantage.

I haven't actually done the calculations, but this is my instinct. It clearly works for the 2-person case (you expect to win half the time).
Don't pay attention to this signature, it's contradictory.

3.14159265...
Irrational (?)
Posts: 2413
Joined: Thu Jan 18, 2007 12:05 am UTC
Location: Ajax, Canada

Postby 3.14159265... » Sat Mar 17, 2007 4:02 am UTC

91? :D
"The best times in life are the ones when you can genuinely add a "Bwa" to your "ha""- Chris Hastings

MMoto
Posts: 40
Joined: Thu Nov 30, 2006 6:23 pm UTC
Location: Québec, Québec, Canada

Postby MMoto » Sat Mar 17, 2007 6:11 am UTC

Alky wrote:I haven't actually done the calculations, but this is my instinct. It clearly works for the 2-person case (you expect to win half the time).


In the 2-person case, suppose your opponent always picks 1.
Then you lose half the time and draw half the time.
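Exhaustively, with you uniform over {1, 2} and the opponent fixed on 1 (my check, not MMoto's code):

```python
from fractions import Fraction

# You play uniform over {1, 2}; your opponent always plays 1.
outcomes = {"win": Fraction(0), "draw": Fraction(0), "lose": Fraction(0)}
for you in (1, 2):
    picks = [you, 1]
    unique = [p for p in picks if picks.count(p) == 1]
    if not unique:
        result = "draw"          # both picked 1: no unique number
    elif picks.index(min(unique)) == 0:
        result = "win"
    else:
        result = "lose"          # the opponent's 1 is the lowest unique
    outcomes[result] += Fraction(1, 2)

print(outcomes)  # lose half the time, draw half the time, never win
```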

Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Sat Mar 17, 2007 1:11 pm UTC

In a 3-person game, assume one person always picks 1 (and, as suggested above, the other two pick uniformly from {1, 2, 3}).

They win 4/9 of the time.

It isn't enough to have a setup in which each person wins 1/n of the time: you must have a setup in which anyone changing their strategy screws them.

This is what is known as a Nash equilibrium.

See:
http://en.wikipedia.org/wiki/Nash_equilibrium

(The idea (well, the proof that one exists in all finite games) was worth a Nobel prize, IIRC, so don't feel bad if you missed it.)
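The 4/9 figure can be checked exactly, assuming the other two players pick uniformly from {1, 2, 3} as in Strilanc's suggestion (the code is mine, not Yakk's):

```python
from fractions import Fraction
from itertools import product

# One player commits to 1; the other two pick uniformly from {1, 2, 3}.
# Exact win chance of the committed player:
win = Fraction(0)
for i, j in product([1, 2, 3], repeat=2):
    picks = [1, i, j]
    if picks.count(1) == 1:  # their 1 is unique, hence the lowest unique
        win += Fraction(1, 9)

print(win)  # -> 4/9
```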

cjr
Posts: 15
Joined: Wed Jan 10, 2007 6:29 pm UTC

Postby cjr » Sat Mar 17, 2007 1:22 pm UTC

Yes, Yakk, and I think your approach so far has been spot on :) I have a great interest in game theory but no one will teach it to me (I'm a mathematics undergraduate). Can you recommend a text on it?

Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Sat Mar 17, 2007 9:47 pm UTC

I really don't know game theory all that well. As a guess, useful topics for game theory would include: Group, Ring, Probability, Measure, Combinatorics, Graph, Analysis...

Graph theory, because you need the notation.
Analysis, Measure and Probability to deal with continuous non-deterministic situations.
Combinatorics, Group and Ring theory to deal with discrete situations.

Maybe some Optimization (linear programming, network theory) to get efficient solutions to some problems...

Heck, that's a good chunk of math you can pick up at an undergrad level. Hmm -- number theory might not be useful, and I'd be surprised if complex analysis or computer theory showed up as useful...

I've always liked the idea of owning the seminal work, in this case:
The Theory of Games and Economic Behavior by von Neumann (the atom smasher)

Wiki's article has an extensive list of texts:
http://en.wikipedia.org/wiki/Game_theory

In this case, we are looking for a Nash equilibrium, so Nash's works might be useful:

Morgenstern, Oskar and John von Neumann (1947). The Theory of Games and Economic Behavior. Princeton University Press.

Nash, John (1950). "Equilibrium points in n-person games". Proceedings of the National Academy of Sciences of the USA 36(1): 48-49.

...

Part of Nash's equilibrium result revolves around the Kakutani fixed-point theorem, which is clearly analytic in nature, but it is applied to a probability space.

Cosmologicon
Posts: 1806
Joined: Sat Nov 25, 2006 9:47 am UTC
Location: Cambridge MA USA

Postby Cosmologicon » Sat Mar 17, 2007 9:56 pm UTC

If there's more than a couple people, announce loudly that you're picking 2. Show them your slip with 2 written on it, if you can.

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Sun Mar 18, 2007 11:53 am UTC

Cosmologicon wrote:If there's more than a couple people, announce loudly that you're picking 2. Show them your slip with 2 written on it, if you can.

That's really clever if you can actually do it, but I'm assuming that's not a valid strategy.

Nobody else has been hiding spoilers, so I won't either.

Note that there's no reason to expect that a Nash equilibrium strategy is symmetric. If there are 3 players, two players playing 1 and one player playing 2 is a Nash equilibrium, since if either of the players playing 1 changes, they lose, and the player playing 2 is currently winning. If there are 4 players, then 2 players playing 1 and 2 players playing 2 is a Nash equilibrium. Similarly, for 2n players, 2 players each playing 1, 2, ..., n is a Nash equilibrium, and for 2n+1 players, 2 players each playing 1, 2, ..., n and 1 player playing n+1 is a Nash equilibrium.

These are not the only Nash equilibria, but I doubt there are any symmetric equilibria. If one existed, it would have to be very complicated. If n were the highest number that each player can play, then playing n+1 would always be better than playing n, so there can't be a highest number each player can play. This already tells us that a symmetric Nash equilibrium would have to assign nonzero probability to infinitely many numbers. Since each player would then have to have the same odds of winning on infinitely many different choices, it seems unlikely that such an equilibrium exists at all.
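The small asymmetric profiles above can be checked mechanically (my sketch; it checks pure-strategy deviations up to an arbitrary cap of 9, which is enough here since any pick above the current maximum plus one behaves the same):

```python
from collections import Counter

def winner_index(picks):
    """Index of the player with the lowest unique pick, or None on a draw."""
    counts = Counter(picks)
    unique = [p for p in picks if counts[p] == 1]
    return picks.index(min(unique)) if unique else None

def is_pure_nash(profile, deviations=range(1, 10)):
    """True if no single non-winning player can deviate and win."""
    base = winner_index(profile)
    for i in range(len(profile)):
        if base == i:
            continue  # already winning; deviating can't help
        for d in deviations:
            trial = list(profile)
            trial[i] = d
            if winner_index(trial) == i:
                return False
    return True

assert is_pure_nash([1, 1, 2])        # 3 players: the 2-player wins
assert is_pure_nash([1, 1, 2, 2, 3])  # 5 players: the 2n+1 pattern
```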

p.s. Does anyone find my new sig funny? Cause if not I'll change it again.
Last edited by skeptical scientist on Sun Mar 18, 2007 12:04 pm UTC, edited 1 time in total.
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

hermaj
Posts: 6139
Joined: Sun Oct 15, 2006 10:37 am UTC
Location: Sydney, Australia

Postby hermaj » Sun Mar 18, 2007 11:56 am UTC

[vent]

Completely off-topic, but argh, this thread needs a change in title. When I'm going through the forums I continually have to catch myself before deleting it.

[/vent]


...Carry on, chaps.

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Sun Mar 18, 2007 11:57 am UTC

Nobody's stopping you from changing the title to "Lowest price auctions puzzle", which wouldn't look like spam.

hermaj
Posts: 6139
Joined: Sun Oct 15, 2006 10:37 am UTC
Location: Sydney, Australia

Postby hermaj » Sun Mar 18, 2007 12:01 pm UTC

Awesome. Well, I will change it to that. I kind of wanted to say something first so people were not all "Argh, you messed around with my thread".

MMoto
Posts: 40
Joined: Thu Nov 30, 2006 6:23 pm UTC
Location: Québec, Québec, Canada

Postby MMoto » Sun Mar 18, 2007 3:19 pm UTC

skeptical scientist wrote:If there are 4 players, then 2 players playing 1, and 2 players playing 2 is a Nash equilibrium. Similarly, for 2n players, 2 players each playing 1, 2, ..., n is a Nash equilibrium.


These aren't Nash equilibria, since every player would want to switch to n+1 to become the winner.

Token
Posts: 1481
Joined: Fri Dec 01, 2006 5:07 pm UTC
Location: London

Postby Token » Sun Mar 18, 2007 3:43 pm UTC

MMoto wrote:These aren't Nash equilibria, since every player would want to switch to n+1 to become the winner.

That wouldn't work, since as soon as they switch, the other player with their original number now has the lowest unique number.

jestingrabbit
Factoids are just Datas that haven't grown up yet
Posts: 5967
Joined: Tue Nov 28, 2006 9:50 pm UTC
Location: Sydney

Postby jestingrabbit » Sun Mar 18, 2007 8:57 pm UTC

Cosmologicon wrote:If there's more than a couple people, announce loudly that you're picking 2. Show them your slip with 2 written on it, if you can.


This is actually a pretty weak strategy. I expect it was a tongue-in-cheek statement, but I'll treat it seriously. Say there are two other players. They know what you're doing, so they can plan their strategy based on yours. They do this as follows.

Firstly, we'll assume they can't confer, so they have to play the same strategy. Secondly, we'll assume that if it's a draw then your opponents choose again, and that you're stuck with your strategy.

What the "do over" rule comes down to is conditioning the probabilities of a win or loss on the event that there is no draw. Let a1 be the probability that one of your opponents chooses 1, a2 the probability that one of your opponents chooses 2, and a3 the probability that they choose 3. Because every number more than 3 is essentially equivalent, we can assume that

a1+a2+a3=1.

The probability of a draw is the probability that both opponents choose 2, which is a2^2. You win if both opponents choose 1 or both choose 3, which has probability a1^2 + a3^2. So, the chance of you winning conditioned on there not being a draw is

(a1^2 + a3^2) / (1 - a2^2).

A page of calculus later, and you can see that when

(a1, a2, a3)=(1/5, 3/5, 1/5)

your chance of winning is 2/25, the chance of a draw is 9/25, and the chance that one of your opponents wins is 14/25, distributed evenly between them. Your conditioned chance of winning is 1/8 and the chance that a particular one of your opponents wins is 7/16. Furthermore, these opponents minimize your chances when you make that selection.

Similarly, if you always choose 1, the opposing strategy is to choose 1 with probability 3/4 and 2 otherwise. The analysis changes when there are more people, but none of these "I choose number blah" strategies is optimal. I don't have the optimal strategy at this time, and I'm beginning to wonder if it's the sort of thing that you can write down, but there is a 'best in every circumstance' Nash equilibrium opponent. You can get rules for the best strategy using the ideas above, like always choose 1 with probability at least 1/2, but I don't think it's a good way to calculate the best strategy.
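The bookkeeping for the (1/5, 3/5, 1/5) mix can be reproduced exactly (my enumeration, not the page of calculus):

```python
from fractions import Fraction
from itertools import product

a = {1: Fraction(1, 5), 2: Fraction(3, 5), 3: Fraction(1, 5)}
win = draw = Fraction(0)
opp = [Fraction(0), Fraction(0)]        # win probability of each opponent
for x, y in product(a, repeat=2):
    picks = [2, x, y]                   # you are committed to 2
    unique = [p for p in picks if picks.count(p) == 1]
    pr = a[x] * a[y]
    if not unique:
        draw += pr
    elif picks.index(min(unique)) == 0:
        win += pr
    else:
        opp[picks.index(min(unique)) - 1] += pr

assert (win, draw) == (Fraction(2, 25), Fraction(9, 25))
assert opp == [Fraction(7, 25), Fraction(7, 25)]
assert win / (1 - draw) == Fraction(1, 8)   # conditioned on no draw
```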

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Sun Mar 18, 2007 9:16 pm UTC

jestingrabbit, you seem to be assuming superrational strategies on the part of the opponents, or something similar. The only way your strategy works is if you assume that by playing it, somehow everyone else will miraculously play it as well, which is not how games generally work.

jestingrabbit
Factoids are just Datas that haven't grown up yet
Posts: 5967
Joined: Tue Nov 28, 2006 9:50 pm UTC
Location: Sydney

Postby jestingrabbit » Sun Mar 18, 2007 10:20 pm UTC

It's not how games generally work, but it is the assumption behind Nash equilibria.

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Sun Mar 18, 2007 10:36 pm UTC

jestingrabbit wrote:It's not how games generally work, but it is the assumption behind Nash equilibria.

No it's not. Nash equilibria don't assume that everyone plays the same strategy. In fact, the Nash equilibria I described above are decidedly asymmetrical.

parallax
Posts: 157
Joined: Wed Jan 31, 2007 5:06 pm UTC
Location: The Emergency Intelligence Incinerator

Postby parallax » Mon Mar 19, 2007 2:46 am UTC

skeptical scientist wrote:
jestingrabbit wrote:It's not how games generally work, but it is the assumption behind Nash equilibria.

No it's not. Nash equilibria don't assume that everyone plays the same strategy. In fact, the Nash equilibria I described above are decidedly asymmetrical.


The problem with your Nash equilibrium is that it's not a valid strategy. No one knows what the other players are choosing. I'm assuming that you are not allowed to discuss your choices with other people before choosing your number.

In the case of three players, at least, there is no bound to the numbers you should choose. Proof: Assume that there is a highest number that is ever chosen with probability > 0. Let that be N. If two players follow that strategy, then the third player has a strictly better strategy. The third player chooses his number according to the same strategy, except whenever he would choose N, he chooses N+1 instead. The number N would only win when both other players choose the same number less than N, in which case N+1 still wins. Both N and N+1 lose when one player chooses a number less than N and the other player chooses a different number. But N+1 will win when both other players choose N, when N would normally only draw. Therefore, an optimal strategy would involve choosing N+1.

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Mon Mar 19, 2007 3:21 am UTC

parallax wrote:The problem with your Nash equilibrium is that it's not a valid strategy.

In what sense is it not a valid strategy? What constitutes a valid strategy? In multiplayer (more than 2) games, often there is no "correct" strategy, unlike the case with 2-player perfect information games. Many naturally occurring multiplayer games (and also several of the ones for sale in game stores) often result in circumstances during play where one person is put in a "kingmaker" position, where they themselves have no chance of winning, but by their actions can choose which of the other players will be the victor. In order to have a "correct" strategy, you either need a strategy which is in some sense "good" regardless of how the opponents move, or at least for any reasonable moves, which represents an amount of stability that some games, such as this one, just don't possess.

It seems like you are requiring this type of strategy before you will consider it to be valid, but you're completely ignoring the fact that such a strategy may not exist.

In the case of three players, at least, there is no bound to the numbers you should choose. Proof: Assume that there is a highest number that is ever chosen with probability > 0. Let that be N. If two players follow that strategy, then the third player has a strictly better strategy. The third player chooses his number according to the same strategy, except whenever he would choose N, he chooses N+1 instead. The number N would only win when both other players choose the same number less than N, in which case N+1 still wins. Both N and N+1 lose when one player chooses a number less than N and the other player chooses a different number. But N+1 will win when both other players choose N, when N would normally only draw. Therefore, an optimal strategy would involve choosing N+1.

I already mentioned this, but it only applies if you assume the strategies of the players are symmetrical, and isn't a valid argument otherwise. Perhaps there is only one person who ever has a chance of picking N; then N is just as good as N+1 for him, and your "modification" for the other players isn't actually a modification at all.

parallax
Posts: 157
Joined: Wed Jan 31, 2007 5:06 pm UTC
Location: The Emergency Intelligence Incinerator

Postby parallax » Mon Mar 19, 2007 4:33 am UTC

Because that strategy violates the (implied) rule that the players cannot discuss their strategies before choosing their numbers. You can't ensure that 2 players choose '1' and 2 players choose '2' if you have no idea what the other players are choosing, and no way to assign each player a number. A valid strategy for a game in which no communication is possible must be symmetric.

If we do allow communication, then your solution becomes valid, but is there an equivalent equilibrium for the odd-person case?

Cosmologicon
Posts: 1806
Joined: Sat Nov 25, 2006 9:47 am UTC
Location: Cambridge MA USA

Postby Cosmologicon » Mon Mar 19, 2007 6:29 am UTC

jestingrabbit wrote:
Cosmologicon wrote:If there's more than a couple people, announce loudly that you're picking 2. Show them your slip with 2 written on it, if you can.


This is actually a pretty weak strategy. I expect it was a tongue-in-cheek statement, but I'll treat it seriously. Say there are two other players....


Well, I don't think it's so weak, because like skeptical scientist said, your objection requires superrationality. Your result is surely counterintuitive - if one player of a 3-person game announces they're picking 2, you should pick 2 with 60% probability! Anyway, it was tongue-in-cheek in that I think it's a "cheating" answer to the puzzle, so it shouldn't count, not in that I think it's actually a bad strategy.

But anyway, I said it was for "more than a couple people", so you can't go assuming there's only two!

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Mon Mar 19, 2007 6:55 am UTC

I think we do have a symmetric Nash equilibrium, at least in the 3-player case. If each player picks n with probability p_n, then each player is indifferent between picking n-1 and n if p_n satisfies the following:

p_n = (1 - s_{n-1}) - sqrt((1 - s_{n-1})^2 - p_{n-1}^2)

(Where s_n is the sum p_1 + p_2 + ... + p_n of the first n probabilities.)
If this is true for all n, then each player is indifferent between all choices, so is perfectly happy playing the same strategy as everyone else. The recursion relation determines p_n for all n from p_1, which could be anything, so long as we are never forced to take the square root of a negative, and so we get a symmetric Nash equilibrium if for some choice of p_1, the series p_1 + p_2 + p_3 + ... converges to 1 where the p_n are all real, positive, and given by the recursion relation above. By inspection, no term can increase the probability above 1, so the series is guaranteed to converge to some value no greater than 1. If someone can show that it converges to at least 1 for some value of p_1, then we have a symmetric Nash equilibrium.

Notes: for p_{n+1} to be real, we must have (1 - s_n) > p_n, so it must be the case that the probability of choosing greater than n is at least the probability of choosing n. This immediately shows us that p_1 < 1/2. If p_1 is close to 1/2, then p_2 will also be close to 1/2, so p_2 must be less than some value below one half (I don't feel like figuring out the exact value) in order for p_3 (given by the recursion) to be real. The most likely possibilities seem to be either that for any positive p_1 the series eventually goes imaginary, or that for some p_1 the series is always real and sums to 1, for larger values it eventually goes imaginary, and for smaller values it sums to something smaller than 1. But the recursion seems nasty enough that it's hard to do calculations, so I'm not sure how to go about solving it. Perhaps someone else would like to carry the ball from here (or point out any huge errors I've made)?
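Numerically, the recursion is easy to iterate. This sketch (mine) takes it in the form p_{n+1} = (1 - s_n) - sqrt((1 - s_n)^2 - p_n^2), which is consistent with the reality condition (1 - s_n) > p_n above, and reports the partial sum or bails out when a term goes imaginary:

```python
import math

def series_sum(p1, terms=10000):
    """Iterate p_{n+1} = (1 - s_n) - sqrt((1 - s_n)^2 - p_n^2) from p_1;
    return the partial sum of the p_n, or None once a term goes imaginary."""
    p, s = p1, p1
    for _ in range(terms - 1):
        disc = (1 - s) ** 2 - p * p
        if disc < 0:
            return None        # the reality condition (1 - s_n) > p_n failed
        p = (1 - s) - math.sqrt(disc)
        s += p
    return s

# Scan a few starting values; illustrative only, not a convergence proof.
for p1 in (0.2, 0.3, 0.4, 0.45):
    print(p1, series_sum(p1))
```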

parallax
Posts: 157
Joined: Wed Jan 31, 2007 5:06 pm UTC
Location: The Emergency Intelligence Incinerator

Postby parallax » Mon Mar 19, 2007 8:13 am UTC

I can't say for sure whether your math was done correctly. It looks vaguely like some equations I got, but I don't think I was formulating them very well. I am worried about your conclusion, though. If two players choose any strategy with p_1 < 1/2, then the third player (player A) benefits by defecting to the strategy "always choose 1".

Chance that A wins: (1 - p_1)^2
Chance that A draws: p_1^2
Chance that A loses: 2 p_1 (1 - p_1)

The ratio of wins to losses (assuming that draws are irrelevant) is (1 - p_1) : 2 p_1, which is always better than 1:2 when p_1 < 1/2.

So, if your analysis is correct, and my analysis is correct, then this game has no symmetric Nash equilibrium. I'm not sure what that means in terms of game theory, but I imagine it means that there is no optimum strategy. Sort of like how the game "Who can name the largest number?" has no optimum strategy.
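The defection arithmetic checks out numerically (my check), against two opponents who each pick 1 with probability p_1 < 1/2:

```python
def defector_outcomes(p1):
    """Outcome probabilities for a player who always picks 1, against two
    opponents who each independently pick 1 with probability p1."""
    win = (1 - p1) ** 2        # neither opponent picks 1: your 1 is unique
    draw = p1 ** 2             # all three pick 1: no unique number at all
    lose = 2 * p1 * (1 - p1)   # exactly one opponent picks 1; the other
                               # opponent's higher pick is the lowest unique
    return win, draw, lose

for p1 in (0.3, 0.42, 0.49):
    w, d, l = defector_outcomes(p1)
    assert abs(w + d + l - 1) < 1e-12
    assert w / l > 0.5         # beats the symmetric 1:2 win:loss ratio
```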

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Mon Mar 19, 2007 8:24 am UTC

You seem to be considering draws as irrelevant, and are only comparing wins to losses. I'm counting a draw as essentially the same as a loss, so I was only forcing the winning probabilities to be the same, and ignoring the probability of a tie. Essentially, you're analyzing a game where the payoff for a draw is 1/3 of the way between a loss and a win, and I'm analyzing the game where the payoff for a draw is the same as the payoff for a loss. It's not surprising, then, if we come to incompatible conclusions.

parallax
Posts: 157
Joined: Wed Jan 31, 2007 5:06 pm UTC
Location: The Emergency Intelligence Incinerator

Postby parallax » Mon Mar 19, 2007 8:50 am UTC

Yeah, I was assuming that in the case of a draw the game was replayed; or that each player put in $1 and the winner took the whole pot, with everyone getting their dollar back on a draw and losing it otherwise. The payoff matrix was not specified. However, in the case of an auction, a draw would result in no one receiving the item, which could count as a loss.

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Mon Mar 19, 2007 8:52 am UTC

parallax wrote:Yeah, I was assuming that in the case of a draw the game was replayed; or that each player put in $1 and the winner took the whole pot, with everyone getting their dollar back on a draw and losing it otherwise. The payoff matrix was not specified. However, in the case of an auction, a draw would result in no one receiving the item, which could count as a loss.

He didn't specify what to do, so I just chose one so I could analyze the game. Any choice seems more-or-less arbitrary.

I'm actually really curious if the series p_n converges to 1 for the right value of p_1, but I can't see any way of doing it.

parallax
Posts: 157
Joined: Wed Jan 31, 2007 5:06 pm UTC
Location: The Emergency Intelligence Incinerator

Postby parallax » Mon Mar 19, 2007 9:24 am UTC

Well, in the case of draw = loss, p_1 = 1 - sqrt(3)/3 gives the defecting player a 1/3 chance of winning when choosing 1 himself. If your formula is correct, then a p_2 chosen according to it would result in indifference between choosing 1 and 2. In other words, choosing 2 should also give the defector a 1/3 chance of winning. By induction, if the p_n are chosen according to your formula, the defector will always have a 1/3 chance of winning. Therefore, p_1 = 1 - sqrt(3)/3 should give a Nash equilibrium. I guess it is still necessary to prove that the p_n exist for all n and that sum(p_n) = 1, but I think that value of p_1 is the one you should be looking at.

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Mon Mar 19, 2007 9:36 am UTC

parallax wrote:Well, in the case of draw = loss, p_1 = 1 - sqrt(3)/3 gives the defecting player a 1/3 chance of winning when choosing 1 himself. If your formula is correct, then a p_2 chosen according to it would result in indifference between choosing 1 and 2. In other words, choosing 2 should also give the defector a 1/3 chance of winning. By induction, if the p_n are chosen according to your formula, the defector will always have a 1/3 chance of winning. Therefore, p_1 = 1 - sqrt(3)/3 should give a Nash equilibrium. I guess it is still necessary to prove that the p_n exist for all n and that sum(p_n) = 1, but I think that value of p_1 is the one you should be looking at.

You can't arrange it so that everyone has a 1/3 chance of winning, because there is a nonzero chance that everyone loses.

Patashu
Answerful Bignitude
Posts: 378
Joined: Mon Mar 12, 2007 8:54 am UTC

Postby Patashu » Mon Mar 19, 2007 12:07 pm UTC

Cosmologicon wrote:If there's more than a couple people, announce loudly that you're picking 2. Show them your slip with 2 written on it, if you can.


This only works if a draw is equivalent to a loss, I think.

MMoto
Posts: 40
Joined: Thu Nov 30, 2006 6:23 pm UTC
Location: Québec, Québec, Canada

Postby MMoto » Mon Mar 19, 2007 3:20 pm UTC

Token wrote:
MMoto wrote:These aren't Nash equilibria, since every player would want to switch to n+1 to become the winner.

That wouldn't work, since as soon as they switch, the other player with their original number now has the lowest unique number.


Curses. My bad.

Cosmologicon
Posts: 1806
Joined: Sat Nov 25, 2006 9:47 am UTC
Location: Cambridge MA USA

Postby Cosmologicon » Mon Mar 19, 2007 5:22 pm UTC

Patashu wrote:
Cosmologicon wrote:If there's more than a couple people, announce loudly that you're picking 2. Show them your slip with 2 written on it, if you can.


This only works if a draw is equivalent to a loss, I think.


Isn't it? The OP asked, what should I pick to maximise my chance of winning?

Drostie
Posts: 262
Joined: Fri Nov 03, 2006 6:17 am UTC

Postby Drostie » Mon Mar 19, 2007 8:19 pm UTC

I have the feeling that the moment you allow loud announcements that you're picking 2, the other players can collude such that one of them, chosen in rotation, picks 2 in any given game, so that the rest can divide the spoils up amongst themselves.

The effect is to reduce the situation to an N-2 person game, with the same problem as before.

(This would require the game to be indefinitely iterated; but it would work. Everybody has a motive to exclude you, because it means that they win, on average, 1/(N-1) of the games, instead of 1/N of the games.)

cjr
Posts: 15
Joined: Wed Jan 10, 2007 6:29 pm UTC

Postby cjr » Fri Mar 23, 2007 1:27 pm UTC

Hi,

Normally these auctions are played with at least 100 people, so draws really don't turn up. In the three-person case they are more likely, although it is still open what constitutes a good tie-breaker. Pick something of interest :)

Can Nash Equilibria still be realistically found with, say, 200 players?

Also, sorry about the title, I didn't think of the spamological ramifications of it :D
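On the 200-player question: an exact equilibrium looks hopeless, but a quick Monte Carlo gives a feel for where winning numbers land. Here every player draws from a geometric distribution, which is purely an arbitrary stand-in I've chosen, not an equilibrium strategy:

```python
import math
import random
from collections import Counter

def geometric(q):
    """Sample from {1, 2, ...} with P(k) = q * (1 - q)**(k - 1)."""
    return math.floor(math.log(1 - random.random()) / math.log(1 - q)) + 1

def winning_number(picks):
    counts = Counter(picks)
    unique = [p for p in picks if counts[p] == 1]
    return min(unique) if unique else None  # None: no unique number at all

random.seed(1)
results = Counter()
for _ in range(2000):
    results[winning_number([geometric(0.15) for _ in range(200)])] += 1

print(results.most_common(5))  # which numbers win, and how often
```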

Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Fri Mar 23, 2007 2:09 pm UTC

With 200 people playing, I'd be tempted to smash the 2-player.

I mean, multiple people are going to pick 1, so doing that doesn't work.

And going above 2 only works if someone smashes the 2-player.

And, in any case, my chances of winning are pretty damn small...
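Just to put a number on "pretty damn small": here's a quick Monte Carlo sketch, assuming (purely for illustration; this is certainly not the equilibrium) that all 200 players pick uniformly from 1 to 200:

```python
import random

def lowest_unique_winner(picks):
    """Index of the player holding the lowest number that was
    picked exactly once, or None if no number is unique."""
    counts = {}
    for x in picks:
        counts[x] = counts.get(x, 0) + 1
    unique = [x for x in picks if counts[x] == 1]
    return picks.index(min(unique)) if unique else None

# Sanity check against the OP's example: A=3, B=3, C=4 -> C wins.
assert lowest_unique_winner([3, 3, 4]) == 2

random.seed(0)
players, trials = 200, 2000
my_wins = sum(
    lowest_unique_winner([random.randint(1, 200) for _ in range(players)]) == 0
    for _ in range(trials)
)
print(my_wins / trials)  # player 0's empirical win rate
```

Even though some number is usually unique, by symmetry no player under uniform random play can expect better than 1 win in 200.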

User avatar
Cosmologicon
Posts: 1806
Joined: Sat Nov 25, 2006 9:47 am UTC
Location: Cambridge MA USA
Contact:

Postby Cosmologicon » Fri Mar 23, 2007 5:35 pm UTC

If there are lots of people, announce you're picking a somewhat higher number. For 200 people, say 10. Then everyone who actually wants to win will have to try picking a number less than 10. There's still a very small chance that one of them will win and beat you, but the only way they can surely prevent you from winning is by picking 10 and throwing that away, so I still think this is the best strategy. :-)

Of course, one of the other 199 will announce that they're picking 9, and so on down. That makes things slightly harder I suppose....

User avatar
Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Fri Mar 23, 2007 5:39 pm UTC

You are underestimating perversity.

Second, if you announce you are taking 10, why not pick 11? Nobody would pick 11 if 10 is going to win... And the perverse people who pick 10 to screw you will end up cancelling each other!

Of course, other people will work this out too, and some will pick 11. So you should announce that you are picking 10, then pick 12.

Wait...

User avatar
Drostie
Posts: 262
Joined: Fri Nov 03, 2006 6:17 am UTC

Postby Drostie » Sat Mar 24, 2007 4:06 am UTC

I just think that commitment strategies won't work. Let me treat both cases -- formal commitments, where you can't go back on your word, and informal commitments, where you can.

First, suppose that commitments are informal, and hence you can renege on your commitments.

Now, consider a woman (I'm picturing a muscular woman with a Russian accent, but it can be any woman) who hears you say "I am choosing 10," and says, "Fantastic! I am also to choose 10. Let us celebrate with wodka!"

Irrational? No. She is a master strategist at work. She knows that if you don't renege on your commitment, you are now committed to losing for sure. Since the commitments are all informal, nobody will force you to keep the choice of 10. So, if you're rational, you'll choose something else. At the same time, she reasons, nobody else is going to jump into this boondoggle. And since she was the one who tried to scare you away from 10, a fortiori, she is not very likely to budge if you try to scare her away from 10.

So informal commitment just leaves you back where you started, or worse.

Now suppose formal commitments are allowed, and you formally commit to some number N. Then the rest of the players may formally commit to: "We draw straws, and the person who draws the shortest straw will choose N, to punish the commitment."

Would they want to do this? Sure. Just to exemplify this, imagine the 3-player game, with Alice, Bob, and Cosmologicon. Cosmologicon formally commits to the number 2, and must now keep that number -- no reneging.

The strategy of "choose 1" now dominates all other strategies, but if both Alice and Bob choose 1, then they both win 0 -- which was what Cosmologicon intended.

Unfortunately for Cosmologicon, they can now both formally commit to the strategy "We flip a coin; if it's heads, Alice picks 2, if it's tails, Bob picks 2, and the other person picks 1." They will then each win 1/2 of the time, instead of 0. It's not just a way to win; it's the only way for them to win. So they both agree to that contract, and any way they slice it, Cosmologicon loses the game.

So with informal commitments it doesn't really work, and with formal commitments it doesn't really work either.

(I really think that one should approach this problem by finding a stable equilibrium PMF.)
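For the three-player case, here is a hypothetical sketch of what that search might look like (the cap at numbers 1..5, the replicator-style update, the learning rate, and the step count are all my own assumptions, not anything established above):

```python
def win_prob(choice, p):
    """Chance that `choice` wins against two opponents who each
    draw independently from the distribution p over 1..len(p)."""
    K = len(p)
    total = 0.0
    for a in range(1, K + 1):
        for b in range(1, K + 1):
            picks = [choice, a, b]
            # the winning number is the lowest one picked exactly once
            unique = [x for x in picks if picks.count(x) == 1]
            if unique and min(unique) == choice:
                total += p[a - 1] * p[b - 1]
    return total

def symmetric_equilibrium(K=5, steps=5000, lr=0.1):
    """Iterate a replicator-style update toward a symmetric mixed
    strategy where every number in the support wins equally often."""
    p = [1.0 / K] * K
    for _ in range(steps):
        w = [win_prob(c, p) for c in range(1, K + 1)]
        avg = sum(pi * wi for pi, wi in zip(p, w))
        # grow the weight of above-average numbers, shrink the rest
        p = [pi * (1 + lr * (wi - avg)) for pi, wi in zip(p, w)]
        s = sum(p)
        p = [pi / s for pi in p]
    return p

pmf = symmetric_equilibrium()
print([round(x, 3) for x in pmf])
```

Whether this dynamic actually converges for this game is not guaranteed, so treat the printed PMF as a numerical hint toward the equilibrium rather than a proof of it.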

taemyr
Posts: 111
Joined: Fri Jan 26, 2007 12:14 pm UTC

Postby taemyr » Tue Apr 03, 2007 7:23 pm UTC

Yakk wrote:(The idea (well, the proof that one exists in all finite games) was worth a Nobel prize, IIRC, so don't feel bad if you missed it.)


Not really.
The award you are thinking of is the "Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel", which is not one of the original Nobel prizes.

