Poisson race distribution

[ilmath]\newcommand{\P}[2][]{\mathbb{P}#1{\left[{#2}\right]} } \newcommand{\Pcond}[3][]{\mathbb{P}#1{\left[{#2}\!\ \middle\vert\!\ {#3}\right]} } \newcommand{\Plcond}[3][]{\Pcond[#1]{#2}{#3} } \newcommand{\Prcond}[3][]{\Pcond[#1]{#2}{#3} }[/ilmath]
[ilmath]\newcommand{\l}[1]{\lambda#1} [/ilmath]

Definition

Let [ilmath]X\sim[/ilmath][ilmath]\text{Poi} [/ilmath][ilmath](\lambda_1)[/ilmath] and [ilmath]Y\sim\text{Poi}(\lambda_2)[/ilmath] be given random variables (that are independent), then define a new random variable:

  • [ilmath]Z:\eq X-Y[/ilmath]

We[1] call this a "Poisson race" as, in some sense, the two variables are racing: if [ilmath]Z>0[/ilmath] then [ilmath]X[/ilmath] is winning, if [ilmath]Z\eq 0[/ilmath] they are neck and neck, and if [ilmath]Z<0[/ilmath] then [ilmath]Y[/ilmath] is winning.

  • Caveat: Remember that a Poisson distribution measures occurrences per unit of something. For example, suppose we are talking about "defects per mile of track", with [ilmath]\lambda_1[/ilmath] being the defects per mile of the left rail (defined somehow) and [ilmath]\lambda_2[/ilmath] the defects per mile of the right rail.
    • If there are 5 defects on the left and 3 on the right, then [ilmath]Z[/ilmath] for that mile was [ilmath]+2[/ilmath]. The next mile is independent and doesn't start off with this bias; that is, [ilmath]Z\eq 0[/ilmath] for the next mile does not mean the right rail caught up with 2 more than the left over that mile.
    • So this doesn't model an ongoing race; a simulation sketch illustrating this is given after this list.
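
The following is a minimal simulation sketch (not from the source) of the definition and the caveat: each mile is an independent draw of [ilmath]X[/ilmath] and [ilmath]Y[/ilmath], and [ilmath]Z[/ilmath] is their difference for that mile alone. The rates [ilmath]\lambda_1\eq 5[/ilmath] and [ilmath]\lambda_2\eq 3[/ilmath] are assumed, illustrative values.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

lambda_1, lambda_2 = 5.0, 3.0   # assumed example rates: defects per mile, left and right rail
miles = 10

# Each mile is an independent experiment: draw X and Y afresh, then Z = X - Y for that mile.
X = rng.poisson(lambda_1, size=miles)
Y = rng.poisson(lambda_2, size=miles)
Z = X - Y

for m, (x, y, z) in enumerate(zip(X, Y, Z), start=1):
    state = "X winning" if z > 0 else ("Y winning" if z < 0 else "neck and neck")
    print(f"mile {m}: X={x}, Y={y}, Z={z:+d} ({state})")
</syntaxhighlight>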

Descriptions

  • for [ilmath]k\ge 0[/ilmath] and [ilmath]k\in\mathbb{Z} [/ilmath] we have:
    • Claim 1: [math]\P{Z\eq k}\eq \lambda_1^k e^{-\lambda_1-\lambda_2} \sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\lambda_2)^i}{i!(i+k)!} [/math][Note 1], which may be written:
      • [math]\P{Z\eq k}\eq \lambda_1^k e^{-\lambda_1-\lambda_2}\sum_{i\in\mathbb{N}_0}\left( \frac{(\lambda_1\lambda_2)^i}{(i!)^2}\cdot\frac{i!}{(i+k)!}\right) [/math] - this has some terms that look a bit like a Poisson term (squared), which might perhaps lead to a closed form (a numerical check of Claim 1 follows this list).
  • for [ilmath]k\le 0[/ilmath] and [ilmath]k\in\mathbb{Z} [/ilmath] we can re-use the above result with [ilmath]X[/ilmath] and [ilmath]Y[/ilmath] flipped:
    • I can't find where I've written it down, so I will not guess it here; the method is outlined in the notes for [ilmath]k<0[/ilmath] below.
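
Below is a small numerical check of Claim 1 (a sketch, not from the source): it evaluates a truncation of the series using a term-ratio recurrence and compares it with a Monte Carlo estimate of [ilmath]\P{Z\eq k} [/ilmath]. The rates and the value of [ilmath]k[/ilmath] are assumed, illustrative choices.

<syntaxhighlight lang="python">
import math
import numpy as np

def claim1_pmf(k, lambda_1, lambda_2, terms=100):
    """Truncated evaluation of the Claim 1 series for P[Z = k], integer k >= 0."""
    assert k >= 0
    term = 1.0 / math.factorial(k)      # i = 0 term: (l1*l2)^0 / (0! * (0+k)!)
    total = term
    for i in range(1, terms):
        term *= (lambda_1 * lambda_2) / (i * (i + k))   # ratio of term i to term i-1
        total += term
    return lambda_1 ** k * math.exp(-lambda_1 - lambda_2) * total

# Assumed example rates and k; compare the series against a Monte Carlo estimate of P[Z = k].
lambda_1, lambda_2, k = 5.0, 3.0, 2
rng = np.random.default_rng(0)
samples = rng.poisson(lambda_1, 1_000_000) - rng.poisson(lambda_2, 1_000_000)

print("series     :", claim1_pmf(k, lambda_1, lambda_2))
print("monte carlo:", np.mean(samples == k))
</syntaxhighlight>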

Proof of claims

Grade: B
This page requires one or more proofs to be filled in, it is on a to-do list for being expanded with them.
Please note that this does not mean the content is unreliable. Unless there are any caveats mentioned below the statement comes from a reliable source. As always, Warnings and limitations will be clearly shown and possibly highlighted if very important (see template:Caution et al).
The message provided is:
Page 384 document spans consecutive numbers, squared, A4, no holes
  • Added proof, attention was all over the place so may be a bit disjoint Alec (talk) 02:08, 7 January 2018 (UTC)
    • Worth going through it again! Alec (talk) 02:08, 7 January 2018 (UTC)

STILL TODO:

  • The [ilmath]k<0[/ilmath] case! - Method outlined already. Alec (talk) 02:08, 7 January 2018 (UTC)
Pairs of values of [ilmath]X[/ilmath] and [ilmath]Y[/ilmath] summed over for each [ilmath]k[/ilmath] (referenced in the proof below):

  • [ilmath]k\eq 0[/ilmath]:  [ilmath](X\eq 0,\ Y\eq 0),\ (X\eq 1,\ Y\eq 1),\ (X\eq 2,\ Y\eq 2),\ \ldots,\ (X\eq i,\ Y\eq i),\ \ldots[/ilmath]
  • [ilmath]k\eq 1[/ilmath]:  [ilmath](X\eq 1,\ Y\eq 0),\ (X\eq 2,\ Y\eq 1),\ (X\eq 3,\ Y\eq 2),\ \ldots,\ (X\eq i+1,\ Y\eq i),\ \ldots[/ilmath]
  • [ilmath]k\eq 2[/ilmath]:  [ilmath](X\eq 2,\ Y\eq 0),\ (X\eq 3,\ Y\eq 1),\ (X\eq 4,\ Y\eq 2),\ \ldots,\ (X\eq i+2,\ Y\eq i),\ \ldots[/ilmath]
  • [ilmath]\vdots[/ilmath]
  • [ilmath]k\eq j[/ilmath]:  [ilmath](X\eq j,\ Y\eq 0),\ (X\eq j+1,\ Y\eq 1),\ (X\eq j+2,\ Y\eq 2),\ \ldots,\ (X\eq i+j,\ Y\eq i),\ \ldots[/ilmath]

Finding [ilmath]\P{Z\eq k} [/ilmath] for [ilmath]k\in\mathbb{N}_0[/ilmath]


TODO: Cover why we split the cases (from below: "Whereas: if we have [ilmath]k<0[/ilmath] then [ilmath]Y>X[/ilmath] so if we have [ilmath]Y\eq 0[/ilmath] then [ilmath]X< 0[/ilmath] follows, which we can't do, so we must sum the other way around") - good start


  • Let [ilmath]k\in\mathbb{N}_0[/ilmath] be given
    • Suppose [ilmath]k\ge 0[/ilmath]
      • Then the event [ilmath]Z\eq k[/ilmath] is the event [ilmath]X-Y\eq k[/ilmath], giving [ilmath]X\eq Y+k[/ilmath]
      • Specifically, [ilmath]k\ge 0[/ilmath] means [ilmath]X-Y\ge 0[/ilmath], so [ilmath]X\ge Y[/ilmath] and so forth, as shown for a handful of values in the table above
        • As Poisson is only defined for values in [ilmath]\mathbb{N}_0[/ilmath] we can start with [ilmath]Y\eq 0[/ilmath] and [ilmath]X[/ilmath] will be [ilmath]\ge 0[/ilmath] too
          • Whereas: if we have [ilmath]k<0[/ilmath] then [ilmath]Y>X[/ilmath] so if we have [ilmath]Y\eq 0[/ilmath] then [ilmath]X< 0[/ilmath] follows, which we can't do, so we must sum the other way around if [ilmath]k<0[/ilmath]. This is why we analyse [ilmath]\P{Z\eq k} [/ilmath] as two cases!
      • So we see that if [ilmath]Y[/ilmath] takes any value in [ilmath]\mathbb{N}_0[/ilmath] then [ilmath]X\eq Y+k[/ilmath] means [ilmath]X\in\mathbb{N}_0[/ilmath] too - this is the key point
      • Thus we want to sum over [ilmath]i\in\mathbb{N}_0[/ilmath] of [ilmath]\P{Y\eq i\wedge X\eq i+k} [/ilmath] as for these cases we see [ilmath]X-Y\eq i+k-i\eq k[/ilmath] as required
        • So: [math]\P{Z\eq k}:\eq\sum_{i\in\mathbb{N}_0} \P{Y\eq i}\cdot\Pcond{X\eq i+k}{Y\eq i} [/math]
          • As [ilmath]X[/ilmath] and [ilmath]Y[/ilmath] are independent random variables we know [ilmath]\Pcond{X\eq u}{Y\eq v}\eq\P{X\eq u} [/ilmath], so we see:
            • [math]\P{Z\eq k}\eq\sum_{i\in\mathbb{N}_0} \P{Y\eq i}\cdot\P{X\eq i+k} [/math]
          • Then [math]\P{Z\eq k}\eq\sum_{i\in\mathbb{N}_0}{\Bigg[ \underbrace{e^{-\l{_2} }\cdot\frac{\l{_2}^i}{i!} }_{\P{Y\eq i} }\cdot\underbrace{e^{-\l{_1} }\cdot\frac{\l{_1}^{i+k} }{(i+k)!} }_{\P{X\eq i+k} }\Bigg]} [/math] - by definition of the Poisson distribution
            [math]\eq e^{-\lambda_1-\l{_2} }\sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\cdot\lambda_2)^i}{i!}\cdot\frac{\lambda_1^k}{(i+k)!} [/math]
            • Notice that [ilmath](i+k)!\eq i!\big((i+1)(i+2)\cdots(i+k)\big)[/ilmath][math]\eq i!\frac{(i+k)!}{i!} [/math], so [math]\frac{1}{(i+k)!}\eq \frac{1}{i!\frac{(i+k)!}{i!} } \eq \frac{i!}{i!(i+k)!} [/math]
          • So [math]\P{Z\eq k} \eq e^{-\lambda_1-\l{_2} }\sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\cdot\lambda_2)^i\lambda_1^k}{i!}\cdot\frac{i!}{i!(i+k)!} [/math]
            • Finally giving:
              • [math]\P{Z\eq k} \eq e^{-\lambda_1-\l{_2} }\sum_{i\in\mathbb{N}_0}\lambda_1^k\cdot\frac{(\lambda_1\cdot\lambda_2)^i}{i!i!}\cdot\frac{i!}{(i+k)!} [/math] or
              • it is perhaps better written as (a numerical check of this rearrangement follows this list):
                [math]\P{Z\eq k} \eq e^{-\lambda_1-\l{_2} }\sum_{i\in\mathbb{N}_0}{\left[\lambda_1^k\cdot\frac{(\lambda_1\cdot\lambda_2)^i}{(i!)^2}\cdot\frac{1}{\prod_{j\eq 1}^{k}(i+j)}\right]} [/math]
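
As a quick sanity check of this rearrangement (a sketch with assumed example values, not part of the source derivation), the snippet below evaluates the first line of the derivation, [math]\sum_{i\in\mathbb{N}_0}\P{Y\eq i}\cdot\P{X\eq i+k} [/math], and the final product form, and confirms that the truncated sums agree.

<syntaxhighlight lang="python">
import math

lambda_1, lambda_2, k = 5.0, 3.0, 2          # assumed example values
N = 60                                       # truncation point for the infinite sums

def poi_pmf(n, lam):
    # Poisson pmf: e^{-lam} * lam^n / n!
    return math.exp(-lam) * lam ** n / math.factorial(n)

# First line of the derivation: sum over i of P[Y = i] * P[X = i + k]
direct = sum(poi_pmf(i, lambda_2) * poi_pmf(i + k, lambda_1) for i in range(N))

# Final product form: e^{-l1-l2} * sum_i [ l1^k * (l1*l2)^i / (i!)^2 * 1 / prod_{j=1}^{k}(i+j) ]
product_form = math.exp(-lambda_1 - lambda_2) * sum(
    lambda_1 ** k * (lambda_1 * lambda_2) ** i / math.factorial(i) ** 2
    / math.prod(i + j for j in range(1, k + 1))
    for i in range(N)
)

print(direct, product_form)   # the two truncated sums agree to floating-point precision
</syntaxhighlight>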

NOTES for [ilmath]k<0[/ilmath]

Let [ilmath]k':\eq -k[/ilmath], so that it is positive, and note that:

  • [ilmath]X-Y\eq k[/ilmath] means [ilmath]Y-X\eq -k\eq k'[/ilmath]
    • Importantly [ilmath]Y-X\eq k'[/ilmath] for [ilmath]k' >0[/ilmath] (as [ilmath]k<0[/ilmath] we see [ilmath]k':\eq -k > 0[/ilmath])
      • Let [ilmath]X':\eq Y[/ilmath] and [ilmath]Y':\eq X[/ilmath] then we have:
        • [ilmath]X'-Y'\eq k'[/ilmath] for [ilmath]k'>0[/ilmath] and both [ilmath]X'[/ilmath] and [ilmath]Y'[/ilmath] as Poisson distributions with given rates
          • We can use the [ilmath]k\ge 0[/ilmath] formula for this, as [ilmath]k'>0[/ilmath] so it is certainly [ilmath]\ge 0[/ilmath]; a sketch of the resulting formula is given below
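
Following this outline (not written up in the source, so the following should be checked against it): applying the [ilmath]k\ge 0[/ilmath] result to [ilmath]X':\eq Y\sim\text{Poi}(\lambda_2)[/ilmath] and [ilmath]Y':\eq X\sim\text{Poi}(\lambda_1)[/ilmath] with [ilmath]k':\eq -k>0[/ilmath] would give:

  • [math]\P{Z\eq k}\eq\P{X'-Y'\eq k'}\eq \lambda_2^{k'} e^{-\lambda_1-\lambda_2}\sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\lambda_2)^i}{i!(i+k')!} [/math] - that is, the [ilmath]k\ge 0[/ilmath] formula with [ilmath]\lambda_1[/ilmath] and [ilmath]\lambda_2[/ilmath] exchanged and [ilmath]k[/ilmath] replaced by [ilmath]k'\eq -k[/ilmath].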

Notes

  1. Note that [ilmath]\sum_{i\in\mathbb{N}_0}a_i[/ilmath] means [ilmath]\sum^\infty_{i\eq 0}a_i[/ilmath] - see Notes:Infinity notation

References

  1. I've heard this somewhere before but I can't remember where from; I've had a quick search and, while the name is not established, it is in the right area.