
Wikipedia:Reference desk/Archives/Mathematics/2006 December 21

Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 21

Calculating probabilities of winning a turn-based two-player duel

I’ve had a problem in the back of my brain for a while now and may as well post it here for input from people smarter than myself.

I want to model a game where two players are in a duel. The game is played in turns, and in each turn both players simultaneously attempt to hit each other. Each player can take a set amount of damage before losing. The first player to knock out the other player is the winner, and a tie is possible if both players knock each other out simultaneously.

The game involves the following stats:

Hit1, Hit2 – The probability that player 1 and player 2, respectively, land a hit on the opponent in a given turn. These probabilities are constant throughout a game.

Dam1, Dam2 – The amount of damage inflicted by player 1 and player 2 on a hit. For example, if Dam1 = 2, then if player 1 hits he will inflict 2 damage.

Health1, Health2 – The total amount of damage each player can take before falling unconscious. If the total damage inflicted on a player is equal to or greater than his health, he loses the game.

So, given the variables above and how the game works, what is the probability that player 1 knocks player 2 unconscious (i.e. player 1 either wins or ties)? Is it possible to produce a matrix or equation for this probability in terms of the game’s variables? The ultimate goal is to calculate these probabilities in a way that supports some sort of handicapping or point system for evaluating how two players compare. This would hopefully eventually allow me, for example, to calculate how much it should cost a player to purchase additional health or additional damage, or to improve their chance to hit or reduce their chance of being hit.

In theory, I could write a computer simulation that approximates the probabilities by running the game 1000 times and reporting the win/loss statistics. However, since this is the “math” reference desk, I’m more interested in looking at mathematical models or solutions. Thanks in advance for the input. Dugwiki 21:36, 22 December 2006 (UTC)[reply]
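For what it's worth, such a simulation is only a few lines of R (R is what gets used further down this thread); the function name simulate_duel and the default of 1000 trials below are illustrative choices only:

# Monte Carlo estimate of P(player 1 wins), P(player 2 wins), P(draw)
simulate_duel <- function(Hit1, Hit2, Dam1, Dam2, Health1, Health2, trials = 1000) {
  tally <- c(P1 = 0, P2 = 0, draw = 0)
  for (i in seq_len(trials)) {
    h1 <- Health1
    h2 <- Health2
    repeat {
      # Both players attack simultaneously each turn.
      if (runif(1) < Hit1) h2 <- h2 - Dam1
      if (runif(1) < Hit2) h1 <- h1 - Dam2
      if (h1 <= 0 && h2 <= 0) { tally["draw"] <- tally["draw"] + 1; break }
      if (h2 <= 0) { tally["P1"] <- tally["P1"] + 1; break }
      if (h1 <= 0) { tally["P2"] <- tally["P2"] + 1; break }
    }
  }
  tally / trials   # estimated proportions of each outcome
}

# e.g. simulate_duel(0.5, 0.5, 1, 1, 10, 10)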

May it happen that both players get hit in the same turn? Then both could lose, or is that a draw? I don't think we can give a useful exact general formula for this. But if the ratio Dam1 : Dam2 is a simple rational, in which case you may as well make the Health and Dam parameters all natural numbers, then it is possible to write a program that computes the exact answer, given the values of the parameters. That is not directly helpful for designing a handicap system, but it should be possible to find approximate formulas that can be validated by comparison against exact answers. If indeed both can take a hit in the same turn, and the initial health is reasonably large so that the game is not always over soon, we can reason as follows. Assume a demised player gets instantly resurrected with infinite health. Then eventually both will kick the bucket; the loser is simply the first to do so. The chances are not altered by this modification, since it takes effect in each game only when we already have a loser. How long will it take player i to fall fatally ill? Write j for player i's opponent. On average player i's health decreases by Hit_j × Dam_j per turn, so this takes about Health_i / (Hit_j × Dam_j) turns. If this quantity is the same for both players, they have about equal chances. As a slight simplification, without any loss of generality, you can replace Health_i by ceiling(Health_i / Dam_j) and from then on treat the damage per hit as 1. Then you can use: average time before player i joins the choir invisible = Health_i / Hit_j. So if, for example, Hit1 = 0.26 and Hit2 = 0.43, you could use Health1 = 13 and Health2 = 8. (13/0.43 = 30.2+ and 8/0.26 = 30.8−.) Intuitively, I think the one with the smaller hit parameter has a slight advantage if these ratios are the same, and some experimentation may help to find a fair compensating correction. But if the expected number of turns is largish, the correction should be small.  --LambiamTalk 01:26, 23 December 2006 (UTC)[reply]
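As a rough sketch of the exact computation suggested above, assuming (as suggested) that Health and Dam are natural numbers and that at least one Hit probability is positive, one could fill in the outcome probabilities for every possible pair of remaining-health values; the function name exact_duel and the layout below are illustrative choices, not anything prescribed above:

# Exact P(player 1 wins), P(player 2 wins), P(draw) by recursion over the
# remaining-health state.  Health decreases monotonically, so the states can
# be filled in increasing order of (h1, h2).
exact_duel <- function(Hit1, Hit2, Dam1, Dam2, Health1, Health2) {
  win1 <- matrix(NA_real_, Health1, Health2)  # P(player 1 wins | state)
  win2 <- matrix(NA_real_, Health1, Health2)  # P(player 2 wins | state)
  draw <- matrix(NA_real_, Health1, Health2)  # P(draw | state)
  # Outcome probabilities once the state has dropped to (a, b); a value at or
  # below zero means that player has already been knocked out.
  outcome <- function(a, b) {
    if (a <= 0 && b <= 0) return(c(0, 0, 1))
    if (b <= 0) return(c(1, 0, 0))
    if (a <= 0) return(c(0, 1, 0))
    c(win1[a, b], win2[a, b], draw[a, b])
  }
  for (h1 in 1:Health1) {
    for (h2 in 1:Health2) {
      # Condition on at least one hit landing: a turn in which neither
      # player hits leaves the state unchanged.
      p_both  <- Hit1 * Hit2
      p_only1 <- Hit1 * (1 - Hit2)
      p_only2 <- (1 - Hit1) * Hit2
      v <- p_both  * outcome(h1 - Dam2, h2 - Dam1) +
           p_only1 * outcome(h1,        h2 - Dam1) +
           p_only2 * outcome(h1 - Dam2, h2)
      v <- v / (p_both + p_only1 + p_only2)
      win1[h1, h2] <- v[1]
      win2[h1, h2] <- v[2]
      draw[h1, h2] <- v[3]
    }
  }
  c(P1 = win1[Health1, Health2],
    P2 = win2[Health1, Health2],
    draw = draw[Health1, Health2])
}

# e.g. exact_duel(0.5, 0.5, 1, 1, 10, 10) -- the parameters in the first row of the table further down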
As a start, I would suggest having a look at negative binomial distribution – each "turn" would be a set of two independent Bernoulli trials (independent up to the point where one player dies, but that is irrelevant, I think), with the probability of 'success' being the attacking player's Hit value. Then you can find the probability that Player i carks it on the kth turn, i.e. the probability that by then there have been r_i hits (the r in the neg binom function) on that player. I'll try and come up with something more concrete later.... – AlbinoMonkey (Talk) 01:56, 23 December 2006 (UTC)[reply]
Okay, here goes. The definition of negative binomial that I was taught and used above is different to the one on WP, so from now on I'm talking about the one defined in the Negative binomial distribution article: the number of failures before r successes in an infinite series of Bernoulli(p) trials. Define the following variables, based on your game stats: p1 = Hit2 (the probability that player 1 will be hit on any turn) and r1 = ceiling(Health1 / Dam2) (the number of hits it will take for player 1 to die). Do the same with p2 = Hit1 and r2 = ceiling(Health2 / Dam1) for player 2.
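In R, that conversion from the game stats to the (r, p) pairs might be written as follows (a small sketch; the helper name neg_binom_params is just an illustrative choice):

# Number of hits needed to fell each player, and each player's per-turn
# chance of being hit.
neg_binom_params <- function(Hit1, Hit2, Dam1, Dam2, Health1, Health2) {
  list(r1 = ceiling(Health1 / Dam2), p1 = Hit2,
       r2 = ceiling(Health2 / Dam1), p2 = Hit1)
}

# e.g. neg_binom_params(0.2, 0.8, 3, 1, 20, 20) gives r1 = 20, p1 = 0.8,
# r2 = 7, p2 = 0.2, matching the second row of the table further down.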


Now we model each turn as two independent Bernoulli trials, where a "success" is that the relevant player is hit. Then X1, the number of turns on which player 1 is not hit before he dies (i.e. before he cops r1 hits), is a negative binomial random variable with parameters r1 and p1. Again, the same for player 2 and a random variable X2.
What you are interested in are the probabilities P(player 1 wins), P(player 2 wins) and P(draw) – that player 1 will win the duel, that player 2 will win, or that it is a draw (both die on the same turn). This involves summing over all possible values of our random variables (0 to ∞), so as far as I can see there is no simple closed formula for it. But you can use the probability mass function and cumulative distribution function to make it a bit easier. First consider P(player 1 wins). Player i dies on turn Xi + ri, so

P(player 1 wins) = P(X2 + r2 < X1 + r1) = Σ_{k≥0} P(X2 = k) · P(X1 > k + r2 − r1).

You can evaluate this using the pmf and cdf given in the negative binomial distribution article. (If you don't have a statistical program, the function NEGBINOMDIST in Microsoft Excel evaluates the pmf.) P(player 2 wins) is evaluated in exactly the same way, but with 1's and 2's swapped everywhere. I won't bother typing the derivation of the formula for P(draw):

P(draw) = P(X2 + r2 = X1 + r1) = Σ_{k≥0} P(X2 = k) · P(X1 = k + r2 − r1).
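A sketch of how those sums might be evaluated in R with dnbinom and pnbinom, truncating the infinite sums at a cutoff kmax (the name duel_probs, the argument order and the cutoff of 1000 are illustrative choices; the tail beyond the cutoff is negligible for parameter values like those in the table below):

duel_probs <- function(r1, p1, r2, p2, kmax = 1000) {
  k <- 0:kmax
  # X1 ~ NegBinom(r1, p1) and X2 ~ NegBinom(r2, p2) count the turns on which
  # each player is *not* hit before copping the fatal hit, so player i dies
  # on turn Xi + ri.
  P1   <- sum(dnbinom(k, r2, p2) * (1 - pnbinom(k + r2 - r1, r1, p1)))  # X2 + r2 <  X1 + r1
  P2   <- sum(dnbinom(k, r1, p1) * (1 - pnbinom(k + r1 - r2, r2, p2)))  # X1 + r1 <  X2 + r2
  draw <- sum(dnbinom(k, r2, p2) * dnbinom(k + r2 - r1, r1, p1))        # X2 + r2 == X1 + r1
  c(P1 = P1, P2 = P2, draw = draw)
}

# e.g. duel_probs(10, 0.5, 10, 0.5) for the (r, p) values in the first row of the table below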
I did a few using R, with the following results:
Hit1 Hit2 Dam1 Dam2 Health1 Health2 r1 p1 r2 p2 P(P1) P(P2) P(draw)
0.5 0.5 1 1 10 10 10 0.5 10 0.5 0.4671 0.4671 0.0658
0.2 0.8 3 1 20 20 20 0.8 7 0.2 0.1268 0.7755 0.0977
0.9 0.1 3 1 30 10 30 0.1 4 0.9 1.0000 0.0000 0.0000
0.26 0.43 1 1 13 8 13 0.43 8 0.26 0.4826 0.4807 0.0367
AlbinoMonkey (Talk) 06:03, 23 December 2006 (UTC)[reply]

Thanks for the feedback guys, it'll give me something to think about this week. :) To reply to a couple of comments I saw above:

- Yes, draws are possible, as Albinomonkey illustrates in his sample calculations. Each turn both players attack simultaneously, and it's possible for both to reach zero health in the same turn.

- You are correct that if the expected time to reach zero health is equal for both players, then the probability of either player winning is (exactly or roughly) 1/2. However, the problem is that it's not clear what happens when the expected times differ. For example, if one player is expected to reach zero in 10 turns and the other in 15 turns, it's obvious that the 15-turn player is more likely to win, but it's not obvious what that probability is.

- Thanks for running the calculations, Albino. A quick sanity check shows you probably got it right, since both players have equal chance of winning if their stats are equal, and that probability is slightly less than 1/2 (because of the possibility of a draw). Dugwiki 16:51, 26 December 2006 (UTC)[reply]